
What is the best way to upload large feature layers?

MichaelBurbea1
Emerging Contributor

I have a set of feature layers that I upload monthly. Every month I create fresh cuts of the data, each as one feature layer with at most 154 fields. But I dread uploading the largest one: about 11 million records and 154 fields wide. The geometry is points stored in 4326, and most of the fields are ints and singles, plus one double, one datetime, and 7 strings, the majority of which are nullable with large numbers of nulls. This one gdb when zipped is about 2.3 GB, so pretty small. It takes about 6 hours to publish. Creating the zipped gdb is not the issue; that only takes about 12-15 minutes.

The next two largest layers are smaller in either field width (6 M rows by 36 columns) or length (1.6 M rows by 154 columns). Both finish much quicker, usually in about an hour.

It is not my network connection. The gdb uploads to ArcGIS very quickly; it's publishing the feature service that takes far too long. I am intimately aware that the ArcGIS servers are hosted on IIS, use Newtonsoft JSON, and commonly give me errors like a 503 because my token expired, or a 500 from an out-of-memory error. (The Python API hides these errors because it's buggy, and reports both as the same "unknown error".)

I use views so that the switch is seamless for the end users, but the runtime is terrible and the error rate is astronomical. It shouldn't take me three tries, as it usually does every month, and it fails with all sorts of anomalies: too few records, too many records, getting stuck. I use the ArcGIS API for Python (and I've used straight REST) to handle the upload, but I am happy to use other APIs.
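Until the server-side 500/503 flakiness goes away, one pragmatic band-aid is to wrap each publish or status call in a retry loop that refreshes the token on a 503 and backs off before retrying. A minimal sketch in plain Python; the `refresh_token` hook and the `.status_code` attribute on the raised exception are assumptions about how you wrap your own REST calls, not something the arcgis package exposes directly:

```python
import time

def call_with_retries(fn, *, retries=5, base_delay=30,
                      retryable=(500, 503), refresh_token=None):
    """Call fn(), retrying transient server errors with exponential backoff.

    fn is any zero-argument callable that raises an exception carrying a
    .status_code attribute on failure (a hypothetical wrapper around the
    underlying REST request).
    """
    for attempt in range(retries):
        try:
            return fn()
        except Exception as exc:
            status = getattr(exc, "status_code", None)
            if status not in retryable or attempt == retries - 1:
                raise  # non-retryable error, or out of attempts
            if status == 503 and refresh_token is not None:
                refresh_token()  # 503s here are often an expired token
            time.sleep(base_delay * 2 ** attempt)
```

Usage would look like `call_with_retries(lambda: item.publish(), refresh_token=relogin)`, where `relogin` is whatever re-authenticates your session. It doesn't fix the root cause, but it turns "3 manual tries" into one unattended run.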
Does anyone have a better suggestion that doesn't require arcgis for enterprise?
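For the 11-million-row layer specifically, a pattern that sometimes helps is to publish a small seed layer once, then push each month's data as several smaller zipped file GDBs through the layer's server-side `append` operation (filegdb format) instead of republishing everything in one shot. A sketch using the ArcGIS API for Python; the layer URL, zip paths, and `source_table_name` are placeholders, and whether this beats a single 6-hour publish is something you'd have to measure:

```python
def chunk_ranges(total_rows, chunk_size):
    """OID ranges [(start, stop), ...] for exporting the source in pieces."""
    return [(start, min(start + chunk_size, total_rows))
            for start in range(0, total_rows, chunk_size)]

def append_chunks(layer_url, chunk_zips, gis, table_name="my_points"):
    """Append each zipped file GDB chunk into an existing hosted layer."""
    # Deferred import so the sketch is readable without arcgis installed.
    from arcgis.features import FeatureLayer

    lyr = FeatureLayer(layer_url, gis)
    for zip_path in chunk_zips:
        # Upload the chunk as a temporary File Geodatabase item.
        item = gis.content.add(
            {"type": "File Geodatabase", "title": zip_path}, data=zip_path)
        try:
            # Server-side append; rollback=True keeps a failed chunk atomic.
            ok = lyr.append(item_id=item.id,
                            upload_format="filegdb",
                            source_table_name=table_name,
                            rollback=True)
            if not ok:
                raise RuntimeError(f"append failed for {zip_path}")
        finally:
            item.delete()  # don't leave temp items behind
```

For 11 M rows, `chunk_ranges(11_000_000, 2_000_000)` gives six export windows; each 2 M-row GDB zip then uploads and appends independently, so a 503 only costs you one chunk instead of the whole publish.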

1 Reply
RPGIS
MVP Regular Contributor

Hi @MichaelBurbea1,

One thing, if you have not considered it, is to publish your data as a vector tile service. That can greatly reduce publish time and the memory needed to render the layer.

The only other option is to modify the hosted service definition so that only a certain number of records are returned per query. Others have asked similar questions; this is one thread that I found:

Solved: maxRecordCount - Esri Community
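If you do go the service-definition route, the cap can be applied with `update_definition` on the layer collection's manager. A sketch, assuming the ArcGIS API for Python, where `item` would be your hosted feature layer item:

```python
def max_record_payload(count=2000):
    """Definition fragment capping the rows a single query may return."""
    return {"maxRecordCount": int(count)}

def apply_max_record_count(item, count=2000):
    # Deferred import so the sketch is readable without arcgis installed.
    from arcgis.features import FeatureLayerCollection

    flc = FeatureLayerCollection.fromitem(item)
    flc.manager.update_definition(max_record_payload(count))
```

Note this only limits what each query returns; it won't speed up the publish itself, but it can cut the memory the service burns when clients pull data.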

Aside from these two, I'm not aware of any other options.
