I know there are other options:
Thanks for any pointers.
Is this what you are looking for?
Or you can use edit_features(adds=None, updates=None,....)
There is an example for it
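As a rough sketch of the call shape for `edit_features` (the `make_updates` helper and the `OBJECTID`/`STATUS` field names below are my own illustration, not from the API):

```python
def make_updates(rows, id_field="OBJECTID"):
    # Build the `updates` payload for FeatureLayer.edit_features:
    # one dict per feature, attributes keyed by field name (geometry optional).
    return [{"attributes": {id_field: oid, **attrs}} for oid, attrs in rows]

# Against a connected arcgis.features.FeatureLayer this would then be:
#     layer.edit_features(updates=make_updates([(1, {"STATUS": "done"})]))
```

This works fine for small, row-level edits, but it is per-feature editing rather than a bulk load from a file geodatabase.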
Hope this information is useful for you.
Hey again, Simo.
No, it's not so much the actual appending; it's the uploading of a fGDB against the Hosted Feature Service itself (as opposed to an item in the portal) that I can't spot an easy way of doing without using the requests library and working directly with the REST endpoint.
It seems we have to do it in two steps:
Step 1: update the File Geodatabase item
Step 2: use the File Geodatabase item to update the Feature Service via the append function, which supports upsert.
I did a quick test here, and it seems to work fine.
It would be good if we could update the feature service using the zipped FGDB directly, and I don't know why the API doesn't support that. Hopefully someone from the development team can comment on this.
But we can work around it by deleting the zipped FGDB item after the update. I am just guessing; it really depends on your own workflow.
You can actually bypass the above approach of uploading directly to the portal as an item (still a very valid approach for appending) and upload your fGDB directly against a hosted feature service.
The benefit is that this data is cleaned up automatically by the back-end server, with no reliance on you having to delete it.
Appending definitely supports this via the appendUploadID parameter on the REST endpoint.
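Working directly against the REST endpoint, the flow I mean looks something like the sketch below (the service URL is a placeholder, and the endpoint paths and parameter names are my reading of the REST Append operation docs, so treat them as assumptions to verify against your service):

```python
import requests

def append_params(upload_id, match_field):
    # Form parameters for the layer's append operation (names per the REST docs).
    return {
        "f": "json",
        "appendUploadId": upload_id,
        "appendUploadFormat": "filegdb",
        "upsert": "true",
        "upsertMatchingField": match_field,
    }

def upload_and_append(service_url, token, zip_path, layer_index=0,
                      match_field="GlobalID"):
    # service_url e.g. "https://services.arcgis.com/<org>/arcgis/rest/services/My/FeatureServer"
    # Step 1: POST the zipped FGDB straight to the service's uploads endpoint;
    # the back-end server cleans these uploads up itself, so there is no
    # portal item to delete afterwards.
    with open(zip_path, "rb") as f:
        r = requests.post(
            f"{service_url}/uploads/upload",
            params={"f": "json", "token": token},
            files={"file": f},
        )
    upload_id = r.json()["item"]["itemID"]
    # Step 2: append with upsert, referencing the upload via appendUploadId.
    data = append_params(upload_id, match_field)
    data["token"] = token
    r = requests.post(f"{service_url}/{layer_index}/append", data=data)
    return r.json()
```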
But with the Python API:
I am OK working around this by using the requests module, but it might be worth including in a future release?
There is a method called upload on the FeatureLayerCollection class, but the append function for FeatureLayer does not have an upload_id parameter, and in the source code I can see that upload_id is intentionally set to None. So I assume upload_id was deliberately disabled for that function.