
Guidance Needed on Rendering Large Datasets with ArcGIS JS API

FactSetResearchSystems
Emerging Contributor

I'm working with @arcgis/core@4.21.2 in a Vue application and facing some performance challenges when trying to display a large dataset on a map.

We’ve uploaded a CSV file with around 15 million rows into a hosted feature layer. When applying a filter, the resulting dataset includes approximately 350,000 points. However, the data is being fetched in batches, and seems to be limited by a maxRecordCount setting. Even though I attempted to update this setting using the ArcGIS updateDefinition API (as described here: https://developers.arcgis.com/rest/services-reference/online/update-definition-feature-service-.htm), the value still shows as 1000, and the actual requests seem capped at 4000 records.
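For context, a minimal sketch of what that updateDefinition request looks like against the hosted layer's admin endpoint (the org ID, service name, layer index, token, and the 32,000 target below are placeholders, not our real values):

```typescript
// Sketch of the updateDefinition request described in the REST docs linked above.
// Org ID, service name, layer index, token, and the 32000 target are placeholders.
const adminUrl =
  "https://services.arcgis.com/<orgId>/arcgis/rest/admin/services/<serviceName>/FeatureServer/0/updateDefinition";

const body = new URLSearchParams({
  f: "json",
  token: "<access-token>",
  updateDefinition: JSON.stringify({ maxRecordCount: 32000 })
});

const response = await fetch(adminUrl, { method: "POST", body });
console.log(await response.json()); // expect { success: true } if the change applied
```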

In some cases, even fetching 4000 records takes up to 2 minutes, possibly due to how offset-based pagination is handled.
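To make the paging pattern concrete, here is a rough sketch of the offset-based loop I mean (the layer URL, where clause, and 4,000 page size are placeholders for illustration):

```typescript
import FeatureLayer from "@arcgis/core/layers/FeatureLayer";
import Graphic from "@arcgis/core/Graphic";

// Rough sketch of offset-based paging with queryFeatures.
// The layer URL, where clause, and 4000 page size are placeholders.
const layer = new FeatureLayer({
  url: "https://services.arcgis.com/<orgId>/arcgis/rest/services/<serviceName>/FeatureServer/0"
});

async function fetchAllPages(where: string): Promise<Graphic[]> {
  const features: Graphic[] = [];
  const num = 4000; // page size, currently capped by the service
  let start = 0;

  while (true) {
    const page = await layer.queryFeatures({
      where,
      outFields: ["*"],
      returnGeometry: true,
      start, // the server has to skip `start` rows on every page,
      num    // which is why later pages get progressively slower
    });
    features.push(...page.features);
    if (!page.exceededTransferLimit) break; // last page reached
    start += num;
  }
  return features;
}
```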

Given this situation:

  • What is the best approach for efficiently displaying such large datasets on the map?

  • Would it be better to use techniques like tiling, generalization, or server-side clustering?

Any recommendations or best practices would be greatly appreciated.

6 Replies
mgeorge
Esri Contributor

Hi @FactSetResearchSystems, are you using a FeatureLayer? We should be downloading pages of 8k features. I'd make sure caching is enabled. Here's an example of a map loading about 350k features: https://codepen.io/matt9222/pen/MYYpPRK?editors=1000
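The pattern in that pen is essentially just a FeatureLayer with the filter applied server-side and the API streaming its own pages of features; a minimal sketch (the URL and expression here are placeholders, not the service from the pen):

```typescript
import Map from "@arcgis/core/Map";
import MapView from "@arcgis/core/views/MapView";
import FeatureLayer from "@arcgis/core/layers/FeatureLayer";

// Let the FeatureLayer stream its own pages of features; no manual
// queryFeatures paging on the client. URL and filter are placeholders.
const layer = new FeatureLayer({
  url: "https://services.arcgis.com/<orgId>/arcgis/rest/services/<serviceName>/FeatureServer/0",
  definitionExpression: "status = 'active'" // push the filter to the server
});

const map = new Map({ basemap: "gray-vector", layers: [layer] });

const view = new MapView({
  container: "viewDiv", // assumes a <div id="viewDiv"> in the page
  map,
  center: [-98, 39],
  zoom: 4
});
```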

FactSetResearchSystems
Emerging Contributor

Yes, I am using a FeatureLayer. This is interesting: your map loads pretty fast, whereas mine loads very slowly. Maybe it's the query to fetch the 350k records out of the 15 million that is taking the time. How did you achieve 8k per page? My feature layer doesn't always return more than 4k. Is there any FeatureLayer or server setting you have changed?

mgeorge
Esri Contributor

Hmm, strange -- 8k should be the default for AGOL hosted layers. Is there any kind of proxy that your service is running through? Also, can you tell whether responses are getting cached? It might be slow the first time, but I'd expect it to speed up once things are cached (there's a feature tile cache on the server and a CDN cache).

FactSetResearchSystems
Emerging Contributor

I see that by default it is cached for 30 minutes, and I have increased it to 1 hour. I think it is getting cached on the client side, since subsequent requests load much faster.

BBarbs
Occasional Contributor

We also have a few Vue apps that consume data from large datasets (90 million records). We wound up creating an API endpoint connected to our database that the app sends queries to; that way, only what the client needs is sent to the app itself (in GeoJSON format). This usually winds up being only a few thousand points, with a max of about 15k, though.

I'm not sure if there's an easier way, but it's worked pretty well for us so far. 
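On the map side, one way to plug an endpoint like that into @arcgis/core is a GeoJSONLayer pointed at the query URL; a minimal sketch (the endpoint and its query parameters here are made-up placeholders, not our real service):

```typescript
import Map from "@arcgis/core/Map";
import GeoJSONLayer from "@arcgis/core/layers/GeoJSONLayer";

// Sketch of consuming a custom GeoJSON endpoint. The URL and its query
// parameters are hypothetical placeholders.
const layer = new GeoJSONLayer({
  url: "https://example.com/api/points?bbox=-105,35,-95,45&limit=15000"
});

const map = new Map({ basemap: "gray-vector", layers: [layer] });
```

Note that GeoJSONLayer downloads the whole response up front, so this stays reasonable as long as the result set is only a few thousand points.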

mgeorge
Esri Contributor

That's a valid option -- the only downside is that GeoJSON is not a very efficient format, though it should be fine for a few thousand points 👍