Hi all.
For some reason, I can only append one feature at a time into an existing AGOL feature service.
I bring the feature service layer into Pro, plus a local data source (typically a file geodatabase). I then run the Append GP tool, mapping the fields appropriately. It appends the first feature, then fails with the following message:
Start Time: Friday, 31 March 2017 9:41:20 AM
ERROR 999999: Error executing function.
General function failure [Cannot insert duplicate key row in object 'user_2456.BOAT_SURVEY_DEMO_FISHSPECIES' with unique index 'GlobalID_Index'. The duplicate key value is (9bc2a5e1-d2e2-42fd-9802-7caa78b1c739).
The statement has been terminated.]
Failed to execute (Append).
At first my local data had no Global IDs, so I tried adding them, but that made no difference. Ticking 'Preserve Global IDs' in the Append tool didn't help either (although that failed with a different message).
Has anyone else experienced this, know what is going on, and/or have a fix?
Cheers,
-Paul
Hi Paul,
This is a bug that was introduced in Pro 1.4. We are investigating the cause of it now and plan to have a fix for this in the next release. If our investigations reveal a workaround I will post it here.
- Russell
Thanks Russell.
Are there any suggested workarounds for this bug? We would like to add features to a hosted feature service used with Collector, and are now faced with this issue as well.
The append works if only one item is selected. I had approximately 10 records to append, so I just did them one at a time.
Since this is not a feasible option for large datasets, someone suggested that copy and paste might still work. I gather it just requires the source and target datasets to have exactly the same schema. I haven't tested this, though. To do it, I would download the source feature layer from AGOL as a file geodatabase, truncate it to remove the existing data, append my new features into the local layer, and then try a copy-paste into the AGOL feature layer.
Let us know if you get that working...??
I'm still hoping we get an actual fix asap!
I have come up with a way to run the Append on repeat via a model that keeps running until all items have been appended one by one. The only problem is that it takes about 4 minutes per append, and there are usually around 100 a week.
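For anyone wanting to script the same one-at-a-time loop instead of using a model, something like this should work (an untested sketch; the paths and field names are just examples, and the arcpy calls need ArcGIS Pro's Python environment):

```python
# Sketch of a one-feature-at-a-time Append loop, scripting the same idea as
# the model. The arcpy calls require ArcGIS Pro, so they live inside a
# function; the where-clause helper below is plain Python.
# Source/target paths and field names are hypothetical examples.

def single_feature_where(oid_field, oid):
    """Build a where clause that selects exactly one feature by ObjectID."""
    return f"{oid_field} = {oid}"

def append_one_by_one(source_fc, target_layer, oid_field="OBJECTID"):
    import arcpy  # only available inside ArcGIS Pro's Python environment
    # Collect all ObjectIDs from the local source feature class
    oids = [row[0] for row in arcpy.da.SearchCursor(source_fc, [oid_field])]
    lyr = arcpy.management.MakeFeatureLayer(source_fc, "src_lyr")
    for oid in oids:
        # Select a single feature, then append just that selection
        arcpy.management.SelectLayerByAttribute(
            lyr, "NEW_SELECTION", single_feature_where(oid_field, oid))
        arcpy.management.Append(lyr, target_layer, "NO_TEST")

print(single_feature_where("OBJECTID", 42))  # OBJECTID = 42
```

Same caveat as the model: one Append call per feature, so it will still be slow for large batches.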
I look forward to seeing this bug fixed. There is a layer I'd like many people to be able to update, but it's not reasonable to expect them to select and update individual records one at a time. I may consider the model approach that damoneisenach came up with. Thanks!
Collin,
This issue was resolved with Pro 2.0. There is also a new Append option on the item page in the latest release of ArcGIS Online. This is a very efficient way to load data into your hosted services. I would recommend looking into this if you are using AGOL hosted feature services.
Russell
Hi Collin and Russell
Yes, I can confirm the issue (at least in our testing) has been resolved in the latest Pro.
I note, though, that appending (or deleting) large datasets via Pro seems excruciatingly slow, to the point where I've found it impossible to update datasets with several thousand features. I gave up using Pro for that and instead used the ArcGIS API for Python. I put together a script that reads a source spreadsheet into a pandas DataFrame, converts it to the correct JSON format, then uploads it in chunks of about 2,000 features at a time. As a comparison: easily an hour or two to append via Pro, versus about 2 minutes via the script for a spreadsheet of about 20,000 records.
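A rough sketch of the approach (the field names and item ID below are made-up placeholders, and the actual upload call needs the arcgis package and a sign-in, so that part is commented out):

```python
# Sketch only: chunked upload of DataFrame rows to a hosted feature layer.
# The chunking/JSON-conversion part is plain pandas; the upload itself uses
# FeatureLayer.edit_features from the ArcGIS API for Python, shown commented
# out because it needs the arcgis package and valid credentials.
import pandas as pd

def df_to_chunks(df, chunk_size=2000):
    """Convert DataFrame rows to Esri feature JSON, yielded in chunks."""
    features = [{"attributes": dict(row._asdict())}
                for row in df.itertuples(index=False)]
    for i in range(0, len(features), chunk_size):
        yield features[i:i + chunk_size]

# Small synthetic example (real data would come from the spreadsheet):
df = pd.DataFrame({"SPECIES": ["cod", "snapper", "trevally"],
                   "COUNT": [4, 7, 1]})
chunks = list(df_to_chunks(df, chunk_size=2))
print(len(chunks))                            # 2 chunks for 3 rows
print(chunks[0][0]["attributes"]["SPECIES"])  # cod

# Upload (requires the arcgis package; item id is a placeholder):
# from arcgis.gis import GIS
# gis = GIS("https://www.arcgis.com", "username", "password")
# layer = gis.content.get("<item id>").layers[0]
# for chunk in df_to_chunks(df):
#     layer.edit_features(adds=chunk)
```

For point layers you would also add a "geometry" key to each feature dict; the chunk size of ~2,000 was just what worked well for me.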
Note that I don't think the new option to load data directly from the AGOL item page will work, as I believe it only works on particular layers, e.g. layers published directly from CSV or shapefiles?? Not sure on that, so it's something I will have to look into.
cheers,
-Paul.
p.s. See my other post which has had no replies as yet.
Why is ArcGIS Pro so slow to delete features or append features into ArcGIS Online?
Hi Paul, can you share some code snippets of using the Python API to load data into your feature service?