
Any way to run /append and then /calculate on ONLY those appended features?

04-04-2022 11:22 AM
ewagstaff
Occasional Contributor

Hi all,

We are building an app that supports uploads of arbitrary line data to our online map, but we've hit a snag: we want to accept an upload (appending the uploaded data to our layer) and then run /calculate to write some extra info onto those features. The trouble is that I don't see a way to target JUST the features added to the layer via the /append endpoint.

/append only returns the submissionTime and lastUpdatedTime on the job status URL -- frustratingly, neither of these appears to match the created_date or last_modified_date of the new features exactly. If we use a time range instead, I'm concerned about the case where two users upload within the same window and User 2's /calculate call writes values over User 1's upload.

This would not be a problem if /append allowed field data to be written to the features at the same time as the upload; however, /append only supports fieldMappings from the source data. Because this is arbitrary data, we can't be sure there will be anything identifying in the source.
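For concreteness, here's roughly what the flow looks like (a minimal Python sketch; the layer URL, token, field name, and time window are all placeholders, and error handling is omitted):

    import time
    import requests

    LAYER_URL = "https://services.arcgis.com/<org>/arcgis/rest/services/Lines/FeatureServer/0"
    TOKEN = "<token>"

    # 1. Kick off the append from an item we've already uploaded.
    job = requests.post(f"{LAYER_URL}/append", data={
        "appendItemId": "<uploaded item id>",
        "appendUploadFormat": "shapefile",
        "f": "json",
        "token": TOKEN,
    }).json()

    # 2. Poll the job's status URL. The status response only exposes
    #    submissionTime / lastUpdatedTime -- no handle on the new features.
    while True:
        status = requests.get(job["statusUrl"],
                              params={"f": "json", "token": TOKEN}).json()
        if status.get("status") in ("Completed", "Failed"):
            break
        time.sleep(2)

    # 3. Calculate on "the new features" -- but the only filter available is
    #    a time window, which could also catch another user's concurrent upload.
    requests.post(f"{LAYER_URL}/calculate", data={
        "where": "created_date BETWEEN TIMESTAMP '<window start>' AND TIMESTAMP '<window end>'",
        "calcExpression": '[{"field": "notes", "value": "<extra info>"}]',
        "f": "json",
        "token": TOKEN,
    })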

Is there a clear way to do this that I'm not seeing?

4 Replies
JakeSkinner
Esri Esteemed Contributor

@ewagstaff are you able to apply the calculation before the append?  

ewagstaff
Occasional Contributor

Can you run /calculate on an uploaded file (shp/gdb) directly? I was under the impression you could only run calculate on a feature layer, so we would have no choice but to run /calculate after /append to target those new features.

For context, we are using /addItem to pull in a file via dataUrl and then running /append using that new item's appendItemId. If there is a way to run /calculate beforehand, we would love to do that.
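Sketched in Python, the current sequence looks like this (the portal URL, layer URL, dataUrl, item type, and token are placeholders; /addItem with a dataUrl fetches the file asynchronously, so in practice we poll the item's /status endpoint before appending):

    import requests

    PORTAL = "https://www.arcgis.com/sharing/rest"
    LAYER_URL = "https://services.arcgis.com/<org>/arcgis/rest/services/Lines/FeatureServer/0"
    USERNAME = "<username>"
    TOKEN = "<token>"

    # 1. /addItem with a dataUrl pulls the user's file into the portal.
    item = requests.post(f"{PORTAL}/content/users/{USERNAME}/addItem", data={
        "dataUrl": "https://example.com/uploads/lines.zip",  # placeholder
        "type": "Shapefile",
        "title": "arbitrary user upload",
        "f": "json",
        "token": TOKEN,
    }).json()

    # 2. Hand the new item's id to /append on the target layer.
    requests.post(f"{LAYER_URL}/append", data={
        "appendItemId": item["id"],
        "appendUploadFormat": "shapefile",
        "f": "json",
        "token": TOKEN,
    })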

JakeSkinner
Esri Esteemed Contributor

Instead of performing an append, are you able to do the following (rough sketch below the list)?

  • Publish the uploaded file (shp/gdb) as a feature service
  • Perform your calculations on the published feature service
  • Add the features from the published feature service to the target service using /addFeatures
  • Delete the published feature service
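Roughly, in Python (a sketch with placeholder URLs, ids, and token; /publish runs asynchronously, so in practice you would poll its job status before step 2, and error handling is omitted):

    import json
    import requests

    PORTAL = "https://www.arcgis.com/sharing/rest"
    USERNAME = "<username>"
    TOKEN = "<token>"
    TARGET_LAYER = "https://services.arcgis.com/<org>/arcgis/rest/services/Target/FeatureServer/0"

    # 1. Publish the uploaded item (shp/gdb) as its own temporary feature service.
    pub = requests.post(f"{PORTAL}/content/users/{USERNAME}/publish", data={
        "itemID": "<uploaded item id>",
        "filetype": "shapefile",
        "publishParameters": json.dumps({"name": "temp_upload"}),
        "f": "json",
        "token": TOKEN,
    }).json()
    temp_service = pub["services"][0]
    temp_layer = temp_service["serviceurl"] + "/0"

    # 2. The calculation can safely target everything, since the temporary
    #    service holds only this one upload.
    requests.post(f"{temp_layer}/calculate", data={
        "where": "1=1",
        "calcExpression": '[{"field": "notes", "value": "<extra info>"}]',
        "f": "json",
        "token": TOKEN,
    })

    # 3. Query the temporary layer and push its features into the target layer.
    feats = requests.get(f"{temp_layer}/query", params={
        "where": "1=1", "outFields": "*", "f": "json", "token": TOKEN,
    }).json()["features"]
    requests.post(f"{TARGET_LAYER}/addFeatures", data={
        "features": json.dumps(feats),
        "f": "json",
        "token": TOKEN,
    })

    # 4. Delete the temporary service's item, which removes the service.
    requests.post(
        f"{PORTAL}/content/users/{USERNAME}/items/{temp_service['serviceItemId']}/delete",
        data={"f": "json", "token": TOKEN},
    )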
ewagstaff
Occasional Contributor

That API flow does work, and it's actually the system we have currently and are trying to transition away from. The /addFeatures API requires the full payload (attributes and geometry for every feature) to be sent in the request, which limits how much data can be sent at once. To accept an upload of 100k features, for example, the /addItem + /publish + /addFeatures flow takes about 15 minutes to complete. The calls need to be chunked in segments of about 500 features at a time to prevent timeouts, and even then the process is fairly error-prone and requires retry logic.
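For illustration, the chunked loop looks roughly like this (Python sketch; the layer URL and token are placeholders, and the batch size and retry count reflect what we settled on):

    import json
    import time
    import requests

    TARGET_LAYER = "https://services.arcgis.com/<org>/arcgis/rest/services/Target/FeatureServer/0"
    TOKEN = "<token>"
    BATCH = 500          # larger batches start timing out for us
    MAX_RETRIES = 3

    def add_in_chunks(features):
        for i in range(0, len(features), BATCH):
            chunk = features[i:i + BATCH]
            for attempt in range(MAX_RETRIES):
                try:
                    r = requests.post(f"{TARGET_LAYER}/addFeatures", data={
                        "features": json.dumps(chunk),
                        "rollbackOnFailure": "true",
                        "f": "json",
                        "token": TOKEN,
                    }, timeout=120)
                    r.raise_for_status()
                    if "error" not in r.json():
                        break  # chunk committed, move on to the next one
                except requests.RequestException:
                    pass  # network error or timeout; fall through to retry
                time.sleep(2 ** attempt)  # back off before retrying
            else:
                raise RuntimeError(f"chunk starting at index {i} failed after retries")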

If there is another way to add features that's similar to /append, where I can just connect a source layer to a destination layer and have the features port over, that would be ideal.

At present, the /append call takes about 3 minutes for a dataset of similar size, so we are very keen to use it if possible.
