Best/fastest way to display very large datasets in the ArcGIS JS API?

03-12-2019 09:37 AM
TristanLopus1
New Contributor II

I have a very large set of polygons to display, and I cannot find a way to load them that is sufficiently fast and performant. I have tried GeoJSON, but loading is extremely slow.

10 Replies
Juan_ManuelAngel_Cuartas
New Contributor II

Hi Tristan, try this example, and make sure to enable WebGL by setting the "esri-featurelayer-webgl" flag to 1 in the dojo config:

var dojoConfig = {
    has: {
      "esri-featurelayer-webgl": 1
    }
  };

Display more data with FeatureLayer | ArcGIS API for JavaScript 3.27 

FeatureLayer | API Reference | ArcGIS API for JavaScript 3.27 

TristanLopus1
New Contributor II

Thank you very much for your reply Juan. I am using version 4.10 of the API and am not using Dojo. Does this option still exist in version 4.10? If so, how would I set it? Thanks!

RobertScheitlin__GISP
MVP Emeritus

Tristan,

  4.10 uses WebGL by default, so there is nothing you need to do. Using GeoJSON would definitely be slower than using a Map Service or Feature Service from ArcGIS Server or AGOL.

TristanLopus1
New Contributor II

Thanks, I thought WebGL was used by default, but I wasn't sure. As for your suggestions of Server or AGOL, I appreciate them, but unfortunately, I need to be able to serve the data myself and build the features client side, for cost and regulatory reasons. I'm finding that may not be possible to do in a satisfactory way though.
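For anyone in the same self-hosted situation, the 4.x API does let you build a FeatureLayer entirely client-side from an array of graphics via its source property, with no Server or AGOL hosting involved. This is only a rough sketch under assumptions: the fetch URL is a placeholder for data served from your own host, the single ObjectID field is my own, and it glosses over GeoJSON-to-Esri ring conversion details:

```javascript
// Hedged sketch: build a FeatureLayer client-side in the 4.x API from
// self-hosted GeoJSON, so no ArcGIS Server or AGOL is required.
require([
  "esri/layers/FeatureLayer",
  "esri/Graphic"
], function (FeatureLayer, Graphic) {
  fetch("/data/polygons.geojson") // placeholder: data served from your own host
    .then(function (response) { return response.json(); })
    .then(function (geojson) {
      // Convert each GeoJSON feature into a Graphic. Note: GeoJSON and Esri
      // polygons differ in ring winding order; this sketch ignores that.
      var graphics = geojson.features.map(function (feature, i) {
        return new Graphic({
          geometry: {
            type: "polygon",
            rings: feature.geometry.coordinates
          },
          attributes: { ObjectID: i } // hypothetical single OID field
        });
      });
      var layer = new FeatureLayer({
        source: graphics,
        objectIdField: "ObjectID",
        fields: [{ name: "ObjectID", type: "oid" }],
        geometryType: "polygon"
      });
      // add `layer` to a Map as usual
    });
});
```

Loading a very large file this way still pays the full download-and-parse cost up front, which is part of why the slowness above is hard to avoid with plain GeoJSON.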

ReneRubalcava
Frequent Contributor

How many features are you trying to load? In 4.11, we will be introducing a GeoJSONLayer that will be more performant than trying to load GeoJSON into a FeatureLayer or GraphicsLayer. That might help give you the results you are looking for. That release is coming later this month.
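For readers finding this later, usage of that GeoJSONLayer follows the same pattern as other 4.x layers. A minimal sketch, assuming the 4.11 module path and using a placeholder URL:

```javascript
// Hedged sketch: displaying GeoJSON via the GeoJSONLayer introduced in 4.11.
// The URL is a placeholder; module names follow the 4.x AMD pattern.
require([
  "esri/Map",
  "esri/views/MapView",
  "esri/layers/GeoJSONLayer"
], function (Map, MapView, GeoJSONLayer) {
  var layer = new GeoJSONLayer({
    url: "https://example.com/polygons.geojson" // placeholder URL
  });
  var map = new Map({
    basemap: "gray",
    layers: [layer]
  });
  var view = new MapView({
    container: "viewDiv", // id of the map <div> in your page
    map: map
  });
});
```

The layer fetches and parses the GeoJSON itself and renders it through the same WebGL pipeline as a FeatureLayer, which is where the performance gain over a GraphicsLayer comes from.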

TristanLopus1
New Contributor II

Thank you so much for your reply, Rene. You have given me renewed hope that the API will be able to satisfy our performance needs. Right now I am working with a dataset of about 13,000 polygons comprising 2.4 million points. I'm not sure if that sounds unreasonable, but I have found a few other map frameworks that can handle it no problem. I just don't like the others as much in general, and I am really hoping to make ArcGIS work because it is the overall best maps API I have found.

Does version 4.10 of the API use WebGL when rendering a FeatureLayer with GeoJSON features in a 2D MapView? This blog post, entitled FeatureLayer rendering: taking advantage of WebGL, from September 2017, indicates that, as of version 4.5, WebGL is used when rendering a FeatureLayer hosted on AGOL, and that WebGL support for features loaded on the client side was in the pipeline.

This new technology is currently only available when visualizing feature services hosted on ArcGIS Online. Support for client-side feature collections and non-hosted enterprise feature services will be supported at a later release.

Has the API since implemented a WebGL solution for client-side feature collections? If not, is WebGL part of the performance upgrade that you mentioned is coming in 4.11?

ReneRubalcava
Frequent Contributor

Starting in 4.10, all layers, including raster layers, are rendered in a single WebGL context. This includes all client-side data as well.

TristanLopus1
New Contributor II

Thank you so much for your help, Rene. Look forward to the release of 4.11!

Rhys-Donoghue
New Contributor III

I know this is an old post, but I'm adding my comments here in case they help someone. I often work with large GeoJSON datasets in Esri's JS API. The very first thing I do is try to reduce the size of my GeoJSON file.

The easiest way to do this is usually to reduce the number of vertices in the features. There are various ways to do this using Esri software. If you don't have Esri software, you can use mapshaper. This can reduce a 100MB GeoJSON file down to only 1MB without losing too much feature definition.

If the file is still not small enough, you can decrease the precision of the coordinates, e.g. 12.34522153234 becomes 12.3. This may make no visible difference to how the data looks but can easily halve the file size. To decrease the precision, I use Python, although this could be done easily via JavaScript if needed.

The other option I looked at when my data was too big was using Postgres and only requesting the data in the current map extent (the way normal ArcGIS Server map services work). This worked fine.
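Since the precision-reduction step above could also be done in JavaScript, here is a minimal sketch of it. The function names are my own, and three decimal places is just an example setting; pick a precision appropriate to your map scale:

```javascript
// Hedged sketch: shrink a GeoJSON FeatureCollection by rounding coordinates
// to a fixed number of decimal places.

// Round a single number to `digits` decimal places.
function round(value, digits) {
  var factor = Math.pow(10, digits);
  return Math.round(value * factor) / factor;
}

// Recursively walk nested coordinate arrays, so the same function handles
// Point, LineString, Polygon, and Multi* geometries.
function truncateCoords(coords, digits) {
  if (typeof coords[0] === "number") {
    return coords.map(function (n) { return round(n, digits); });
  }
  return coords.map(function (c) { return truncateCoords(c, digits); });
}

// Apply the rounding in place to every feature in a FeatureCollection.
function truncatePrecision(featureCollection, digits) {
  featureCollection.features.forEach(function (feature) {
    feature.geometry.coordinates =
      truncateCoords(feature.geometry.coordinates, digits);
  });
  return featureCollection;
}
```

For example, truncatePrecision(fc, 3) turns a point at [12.34522153234, -1.98765432] into [12.345, -1.988]; serializing the result with JSON.stringify then yields the smaller file.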