A Python script to overwrite a feature layer based on a Pandas DataFrame

08-10-2022 07:21 PM
Admin_D
New Contributor II

Hi,

I use a Python script in Jupyter to get data from an external web app through its API, convert it to a pandas DataFrame, and publish it as a feature collection on AGOL. Below is the part of the code where the data (df_data) is imported as a feature collection and then published on AGOL.

import json

# Import the DataFrame as an in-memory feature collection
df_data_fc = gis.content.import_data(df_data)

# Build a Python dictionary from the feature collection properties
df_data_fc_dict = dict(df_data_fc.properties)

# Use the dictionary in a list as the layers property of a JSON featureCollection
df_data_json = json.dumps({"featureCollection": {"layers": [df_data_fc_dict]}})

# Add the feature collection item to the portal
data_item_properties = {'title': 'Incidents Map',
                        'description': 'Example demonstrating the integration between GIS and 3rd party portal',
                        'tags': 'PCDS',
                        'text': df_data_json,
                        'type': 'Feature Collection'}
data_item = gis.content.add(data_item_properties)


I now need a Python script that overwrites the published feature layer daily instead of publishing a new layer with each update.
I have done some research, but unfortunately I could not find a similar situation where someone overwrites a layer with a pandas DataFrame.

Is there any Python script that uses a pandas DataFrame to overwrite the above-published layer?

 

8 Replies
by Anonymous User
Not applicable

Wonder if you can use the overwrite method of FeatureLayerCollectionManager?

from arcgis.features import FeatureLayerCollection
your_pandas_layer = FeatureLayerCollection.fromitem(old_item)

your_pandas_layer.manager.overwrite(data_item_properties)

 

Admin_D
New Contributor II

Thanks for the suggestions. I used the code below before, but I got this error:

TypeError: item must be a type of service, not Feature Collection

from arcgis.features import FeatureLayerCollection

newfeaturecollection = FeatureLayerCollection.fromitem(data_item)

Admin_D
New Contributor II

Hi 

I published the feature layer first as a feature service and then used the overwrite method to avoid the above error, but I ran into another error at the last line of the script, shown below:

Exception: Error while analyzing Feature Collection, Feature Collection JSON doesn't have 'layers'
(Error Code: 406)

Here is the latest version of the script. Any ideas about this error?

 

 

import json
from arcgis.features import FeatureLayerCollection

# Import the DataFrame as an in-memory feature collection
df_data_fc = gis.content.import_data(df_data)

# Build a Python dictionary from the feature collection properties
df_data_fc_dict = dict(df_data_fc.properties)

# Use the dictionary in a list as the layers property of a JSON featureCollection
df_data_json = json.dumps({"featureCollection": {"layers": [df_data_fc_dict]}})

data_item_properties = {'title': 'Incidents Map',
                        'description': 'Example demonstrating the integration between GIS and 3rd party portal',
                        'tags': 'PCDS',
                        'text': df_data_json,
                        'type': 'Feature Collection'}

# Get the already-published hosted feature layer and overwrite it
published_layer = gis.content.get('0b0bf5083fea423daa99472937749b4d')
newfeaturecollection = FeatureLayerCollection.fromitem(published_layer)
newfeaturecollection.manager.overwrite(data_item_properties)

 

by Anonymous User
Not applicable

Looks like the data parameter for the overwrite method is expecting a file path, not a variable holding the JSON. Try writing the json.dumps() output to a temp file and use that path as the input. Since it keeps all the metadata the same, maybe the definition isn't needed?
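
Something like this is what I mean (an untested sketch; the item id is a placeholder and gis is assumed to be an already-connected GIS object):

import json
import tempfile
from arcgis.features import FeatureLayerCollection

# df_data_json is the json.dumps() string built from the feature collection properties.
# Write it to a temporary .json file so overwrite() receives a file path instead of a variable.
with tempfile.NamedTemporaryFile(mode='w', suffix='.json', delete=False) as tmp:
    tmp.write(df_data_json)
    json_path = tmp.name

# Placeholder item id of the already-published hosted feature layer
published_item = gis.content.get('<item id>')
flc = FeatureLayerCollection.fromitem(published_item)
flc.manager.overwrite(json_path)

Whether the service accepts the JSON still depends on how the layer was originally published.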

Admin_D
New Contributor II

I used both a CSV file and a JSON file in the overwrite method in the script below:

csvfilepath = r'C:\Users\***\TelematticaIncidents.csv'
jsonfilepath = r'C:\Users\***\data.json'

from arcgis.features import FeatureLayerCollection
#get the hosted feature layer
dataitem = gis.content.get('eba42b0e69374cf098ce55bf1cd2d4a7')
newflayercol = FeatureLayerCollection.fromitem(dataitem)
newflayercol.manager.overwrite(csvfilepath)
#newflayercol.manager.overwrite(jsonfilepath)

But I got the error below:

Exception: Error while analyzing Feature Collection, Feature Collection JSON is empty
(Error Code: 406)

 I also attached the csv file and the json file to this reply.

Are there any thoughts on this?

Kara_Shindle
Occasional Contributor III

Every hour, I have a script that exports a pandas DataFrame to a CSV, and then uses that to overwrite a hosted table in my account.

 

# Import needed for FeatureLayerCollection.fromitem
from arcgis.features import FeatureLayerCollection

# This gets the hosted table in AGOL
table = gis.content.get('item number here')
projects_collection = FeatureLayerCollection.fromitem(table)

# pandas DataFrame exported to CSV
df4.to_csv(newCSV4, date_format='%m/%d/%Y')

# Overwrite the old table with the new CSV data
projects_collection.manager.overwrite(newCSV4)

 

Edit:

I do this because it keeps the CSV that is created as a backup. Every time the script runs, the previous CSV is renamed and stored, and the new CSV takes on the original name. I only ever keep the current table and the old CSV.
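
Roughly, the rename-and-overwrite step is something like this (paths are placeholders for my actual locations):

import os

newCSV4 = r'C:\data\projects.csv'         # current CSV (placeholder path)
backupCSV4 = r'C:\data\projects_old.csv'  # previous run kept as the backup

# Keep only the previous CSV as a backup before the new export replaces it
if os.path.exists(newCSV4):
    os.replace(newCSV4, backupCSV4)

# Export the DataFrame and overwrite the hosted table with the fresh CSV
df4.to_csv(newCSV4, date_format='%m/%d/%Y')
projects_collection.manager.overwrite(newCSV4)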

by Anonymous User
Not applicable

Not sure if it has anything to do with it, but in the data.json file the text key holds a Python dictionary rather than true JSON. Do they want GeoJSON, EsriJSON, or another standard? It seems that any standard that is tried results in an error...
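
Just to illustrate the difference (a minimal example, not specific to your data):

import json

layer = {'layerDefinition': {'geometryType': 'esriGeometryPoint'}}

print(str(layer))         # single quotes - Python repr, not valid JSON
print(json.dumps(layer))  # double quotes - valid JSON that a parser will accept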

I saw an old solution from 2018 that also uses this text: dictionary approach, but I keep getting the 406 error or other failures, so maybe they updated that process and it no longer works that way.

The CSV upload method seems more stable, if you could dump the df into a CSV instead of the JSON.
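
Something along these lines is what I mean (a rough, untested sketch; the item id is a placeholder, and the hosted layer would need to have been published from a CSV with the same name and schema):

from arcgis.features import FeatureLayerCollection

csv_path = r'C:\Users\***\TelematticaIncidents.csv'  # placeholder path

# Dump the DataFrame to the CSV the hosted layer was originally published from
df_data.to_csv(csv_path, index=False)

# Overwrite the hosted feature layer (placeholder item id) with the refreshed CSV
item = gis.content.get('<item id>')
flc = FeatureLayerCollection.fromitem(item)
flc.manager.overwrite(csv_path)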

Admin_D
New Contributor II

This time I published the primary file manually in CSV format instead of publishing it as a JSON file built from a dictionary. The script works without any problem.

Thanks Jeffk
