Extract large data from feature on ArcGIS Enterprise

01-04-2023 06:27 AM
arc_ticodex
New Contributor III

Hi everyone!

I have a feature layer with over 100,000 records, published to Enterprise from a PostgreSQL database, and I want to extract the data into a web application built with the ArcGIS API for JavaScript.
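For context, the client-side extraction boils down to paging through the layer's query operation. A minimal sketch (a hypothetical helper, not our production code — it assumes a `layer` object exposing `queryFeatures({ where, outFields, returnGeometry, num, start })` that resolves to `{ features, exceededTransferLimit }`, mirroring the JS API's `FeatureLayer.queryFeatures` with `Query.num`/`Query.start`, which map to `resultRecordCount`/`resultOffset` in the REST API):

```javascript
// Page through every feature of a layer whose query endpoint caps
// results per request (maxRecordCount). Each iteration fetches one
// page; exceededTransferLimit tells us whether more pages remain.
async function queryAllFeatures(layer, pageSize = 2000) {
  const all = [];
  let offset = 0;
  for (;;) {
    const page = await layer.queryFeatures({
      where: "1=1",
      outFields: ["*"],
      returnGeometry: true,
      num: pageSize,   // resultRecordCount in the REST API
      start: offset,   // resultOffset in the REST API
    });
    all.push(...page.features);
    if (!page.exceededTransferLimit || page.features.length === 0) break;
    offset += page.features.length;
  }
  return all;
}
```

Paging like this sidesteps any single-request record cap, though for a full export it means many round trips rather than one ExtractData job.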

Feature layers with fewer than 100,000 records can be downloaded as .shp, .csv, .geojson, and .kml. When trying to download a layer with more than 100,000 records, the error log says the following:

Error executing tool. ExtractData Job ID: jdc139b252808463c9b1c40d31b9e48e8 : {"messageCode": "GPEXT_018", "message": "The remote feature service https://codexremote2.eastus.cloudapp.azure.com/server/rest/services/teste_CAESB/Setores_teste/Featur... exceeds the limit of 100,000 features. Create a hosted feature service to analyze large data.", "params": {"url": "https://codexremote2.eastus.cloudapp.azure.com/server/rest/services/teste_CAESB/Setores_teste/Featur..."}} {"messageCode": "AO_100026", "message": "ExtractData failed."} Failed to execute (ExtractData).

We have tried a few alternatives. In the ArcGIS Server Administrator Directory, under services > System > SpatialAnalysisTools.GPServer, we set the maximumRecords property to a value above 100,000.
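For reference, the edit we made looked roughly like this fragment of the service's JSON in the Administrator Directory (a sketch only — the real service JSON carries many more keys, and the value 200000 is just the example number we used):

```json
{
  "properties": {
    "maximumRecords": "200000"
  }
}
```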

We also found this documentation -> Setting map service properties (https://enterprise.arcgis.com/en/server/10.3/publish-services/linux/setting-map-service-properties.h...) and edited the feature settings, raising the maxSampleSize field (an integer representing the maximum number of records sampled when computing a class breaks renderer; the default is 100,000) to over 100,000.

Still, neither change worked: the tool keeps throwing the same error.

Any thoughts?
