
Best way to run geoprocessing with large feature services

09-24-2024 06:40 PM
by AntEsk, Emerging Contributor

I am trying to write a simple script to append a few large feature services into a local feature class.

The feature services have many more records than the server's maximum record count (maxRecordCount), so I am splitting them up into smaller chunks and processing each one.
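One common way to do that chunking (a sketch, not from the original post) is to page through the layer by OBJECTID ranges sized to the server's maxRecordCount. The helper below only computes the ranges; the arcpy calls it would feed are shown as comments, and `service_url` / `local_fc` are assumed names:

```python
def objectid_ranges(min_oid, max_oid, chunk_size):
    """Yield inclusive (start, end) OBJECTID ranges covering min_oid..max_oid."""
    start = min_oid
    while start <= max_oid:
        end = min(start + chunk_size - 1, max_oid)
        yield start, end
        start = end + 1

# Each range then becomes a definition query for one Append pass, e.g.:
#   where = f"OBJECTID >= {start} AND OBJECTID <= {end}"
#   arcpy.management.MakeFeatureLayer(service_url, "chunk_lyr", where)
#   arcpy.management.Append("chunk_lyr", local_fc, "NO_TEST")
```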

I keep running into issues with the Append failing. It works sometimes and not others, and I get the following errors: "500 error the server threw an exception", "Cannot read Table", and "Unknown error has occurred contact ESRI".

My scripts all run fine with small numbers of features.

Features that work when run in small batches will sometimes fail when run in the large batch.
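Intermittent 500s like this often clear on a second attempt, so one mitigation (a generic sketch, not anything from the post) is to wrap each chunk's geoprocessing call in a retry with exponential backoff:

```python
import time

def with_retries(operation, attempts=3, backoff=2.0):
    """Run operation(); on failure, sleep backoff**attempt seconds and retry."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == attempts:
                raise  # out of retries: surface the original error
            time.sleep(backoff ** attempt)

# Usage sketch (hypothetical names):
#   with_retries(lambda: arcpy.management.Append("chunk_lyr", local_fc, "NO_TEST"))
```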

I am using the URL of the feature service as the input for arcpy.Append_management(). Is this the best way to do it?
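Passing the service URL straight to Append leaves all the querying to the tool, which is where the timeouts tend to bite. A lower-level alternative (a sketch; `layer_url` and the counts are assumptions) is to page the layer's REST /query endpoint yourself using the documented resultOffset / resultRecordCount parameters:

```python
from urllib.parse import urlencode

def paged_query_urls(layer_url, total_count, page_size, where="1=1"):
    """Build /query URLs that page through a layer via resultOffset."""
    urls = []
    for offset in range(0, total_count, page_size):
        params = urlencode({
            "where": where,
            "outFields": "*",
            "f": "json",
            "resultOffset": offset,
            "resultRecordCount": page_size,
        })
        urls.append(f"{layer_url}/query?{params}")
    return urls
```

Each URL returns at most one page of features, so a failed page can be re-requested on its own instead of restarting the whole transfer.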

I found an old blog that talks about using a FeatureSet instead?

 

1 Reply
DuncanHornby
MVP Notable Contributor

I don't tend to process data drawn down from a feature service, but maybe you could explore the ArcGIS API for Python as an alternative way of grabbing that online data? Have a look here.
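That route might look like the sketch below (hypothetical `layer_url`; it needs the `arcgis` package, so the import is guarded). FeatureLayer.query() pages past the server's maxRecordCount automatically, which sidesteps the manual chunking:

```python
# Hedged sketch: requires the `arcgis` package (ArcGIS API for Python).
try:
    from arcgis.features import FeatureLayer

    def fetch_all_features(layer_url, where="1=1"):
        """Query every feature from a hosted layer; paging is automatic."""
        layer = FeatureLayer(layer_url)
        return layer.query(where=where, out_fields="*")  # returns a FeatureSet
except ImportError:
    FeatureLayer = None  # arcgis is not installed in this environment
```

The returned FeatureSet can then be written locally (e.g. via its save() method) and appended with standard geoprocessing.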
