Error Creating Overlay

09-19-2019 02:08 PM
TimIfill
New Contributor

I have a large feature layer from the local planning agency with land use data for the region. I'm trying to extract the data for just one municipality and create a map by overlaying the two layers (land use and municipal boundary). I'm getting this error message:

GetLayers for parameter 1 failed. Error: {"code" : 0, "messageCode":"GPEXT_018","message": "Number of features in service https://arcgis.dvrpc.org/arcgis/rest/services/Planning/LandUse2015_Enhanced/FeatureServer/0 exceeds the limit of 100,000 features. Use a hosted feature service as input to analyze large dataset.","params":{"url" : "https://arcgis.dvrpc.org/arcgis/rest/services/Planning/LandUse2015_Enhanced/FeatureServer/0"}} Overlay Layers failed.

I tried creating a hosted feature service on the developer site, which worked fine. But when I tried to run the overlay with that layer, I got another error message that said only "Error."

Any thoughts?

2 Replies
by Anonymous User
Not applicable

Hi Tim Ifill,

ArcGIS Server feature services have a hard limit of 100,000 features for analysis. For hosted feature services, there isn't a hard limit, but "as the complexity of the features in the service increases, the number of features you can analyze decreases. For example, if the service contains polygon features that have thousands of vertices each, you might only be able to analyze a few hundred features. If the number or complexity of features exceeds what the tool can support, you will receive an error message."

Running the Overlay on a subset of the data should enable this to work as expected.
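
For example, with the ArcGIS API for Python, something along these lines (just a rough sketch; the URLs, credentials, field name, and output name are placeholders, not your actual services):

from arcgis.gis import GIS
from arcgis.features import FeatureLayer, FeatureCollection
from arcgis.features.manage_data import overlay_layers
from arcgis.geometry.filters import intersects

gis = GIS("https://www.arcgis.com", "username", "password")

# Placeholder URLs; substitute the real land-use and municipal boundary services.
land_use = FeatureLayer(
    "https://services.example.com/arcgis/rest/services/LandUse/FeatureServer/0")
boundaries = FeatureLayer(
    "https://services.example.com/arcgis/rest/services/Municipalities/FeatureServer/0")

# Pull the one municipality first, then only the land-use features that
# intersect it, so the overlay input stays well under the 100,000-feature limit.
muni = boundaries.query(where="MUN_NAME = 'Example Township'")  # placeholder field/value
subset = land_use.query(geometry_filter=intersects(muni.features[0].geometry))

# Overlay the two small inputs; output_name publishes the result as a hosted layer.
result = overlay_layers(
    FeatureCollection.from_featureset(subset),
    FeatureCollection.from_featureset(muni),
    overlay_type="Intersect",
    output_name="LandUse_ExampleTownship")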

I hope this helps,

-Peter

arc_ticodex
New Contributor III

Hi Peter!

I'm facing the same issue TimIfill reported, this time on ArcGIS Enterprise.

I have a feature layer with over 100,000 records from a PostgreSQL database, and I want to extract the data into a web application built with the ArcGIS API for JavaScript.

Feature layers with fewer than 100,000 records can be downloaded as .shp, .csv, .geojson, and .kml. When trying to download feature layers with more than 100,000 records, the error log says the following:

Error executing tool. ExtractData Job ID: jdc139b252808463c9b1c40d31b9e48e8 : {"messageCode": "GPEXT_018", "message": "The remote feature service https://codexremote2.eastus.cloudapp.azure.com/server/rest/services/teste_CAESB/Setores_teste/Featur... exceeds the limit of 100,000 features. Create a hosted feature service to analyze large data.", "params": {"url": "https://codexremote2.eastus.cloudapp.azure.com/server/rest/services/teste_CAESB/Setores_teste/Featur..."}} {"messageCode": "AO_100026", "message": "ExtractData failed."} Failed to execute (ExtractData).
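
As a possible fallback, we are also looking at paging the query ourselves instead of going through ExtractData. A rough sketch of that idea with the ArcGIS API for Python (the URL, page size, and output path are placeholders, not our real service):

import json
from arcgis.features import FeatureLayer

# Placeholder URL; the real service is the one named in the error above.
layer = FeatureLayer("https://example.com/server/rest/services/Sample/FeatureServer/0")

page_size = 2000        # keep each request under the service's maxRecordCount
offset = 0
features = []
while True:
    page = layer.query(where="1=1",
                       result_offset=offset,
                       result_record_count=page_size,
                       return_all_records=False)
    features.extend(f.as_dict for f in page.features)
    if len(page.features) < page_size:   # last (short) page reached
        break
    offset += page_size

# Raw Esri JSON features; converting to .geojson/.csv would be a further step.
with open("export.json", "w") as out:
    json.dump(features, out)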

We have tried some alternatives to solve the problem. In Server Admin, at services > System > SpatialAnalysisTools.GPServer, we set the maximumRecords property to a value above 100,000.

We found this documentation -> Setting map service properties (https://enterprise.arcgis.com/en/server/10.3/publish-services/linux/setting-map-service-properties.h...) and edited the feature layer settings, increasing the maxSampleSize field (an integer representing the maximum number of records that will be sampled when computing a class breaks renderer; the default is 100,000) to a value over 100,000.
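
For the record, those two edits boil down to roughly the following against the Server Admin REST API (our own sketch; the host, credentials, and values are placeholders, and we are assuming both settings sit under the service's properties JSON):

import json
import requests

admin = "https://myserver.example.com:6443/arcgis/admin"   # placeholder host

# Sign in to the Server Admin API.
token = requests.post(f"{admin}/generateToken",
                      data={"username": "siteadmin", "password": "secret",
                            "client": "requestip", "f": "json"},
                      verify=False).json()["token"]

def bump_property(service_path, prop, value):
    # The edit operation expects the FULL service JSON back, so read the
    # current definition, change one property, and post the whole thing again.
    svc_url = f"{admin}/services/{service_path}"
    svc = requests.post(svc_url, data={"token": token, "f": "json"},
                        verify=False).json()
    svc["properties"][prop] = value
    return requests.post(f"{svc_url}/edit",
                         data={"service": json.dumps(svc),
                               "token": token, "f": "json"},
                         verify=False).json()

# The two settings we raised (values illustrative; we are assuming the
# feature layer's maxSampleSize is exposed on the parent map service).
bump_property("System/SpatialAnalysisTools.GPServer", "maximumRecords", "500000")
bump_property("teste_CAESB/Setores_teste.MapServer", "maxSampleSize", "500000")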

Still, we were not successful, and it keeps reporting the same error.

Any thoughts?
