Advice Needed on Handling Large Feature Layers and Rendering Speed

10-09-2018 12:59 PM
New Contributor III

Hello all,

I have a JS API app that renders and queries an AGOL Feature Layer (created from a shapefile) hosted on my account.  That FL contains 5-7k small polygon features.  My application refreshes every five minutes, which redraws the features with symbology based on certain criteria.  Aside from the five-minute refresh, the user can also change those criteria, which yields different symbology on the next refresh/redraw.  None of that is really the point; I just want to stress how much feature drawing is taking place.

The issue is that I have noticed the redrawing can take quite a while at times, depending on the number of features returned and the network speed.

My question is this: is there a better way to do this?  I see that in newer JS API versions a feature collection can be generated from a shapefile.  The app runs on a local Apache server.  Could I load a shapefile on that server and use a local feature collection for rapid refreshing?  Or should I convert the shapefile (the native format) to JSON; would that be any faster (it seems like a lot of graphics to draw)?  I'd imagine hosting a local JSON file with all the features would be faster, since it essentially eliminates internet traffic.
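For reference, here is a rough sketch of what I mean by a local feature collection: wrapping already-converted features (geometry + attributes) into the featureCollection object shape the 3.x FeatureLayer constructor accepts. The single OBJECTID field and the polygon geometry type are placeholders, not my actual schema:

```javascript
// Wrap an array of pre-converted features into the featureCollection
// object shape accepted by the 3.x FeatureLayer constructor.
// The OBJECTID-only field list is a placeholder; a real layer would
// list every field carried over from the shapefile.
function toFeatureCollection(features) {
  return {
    layerDefinition: {
      geometryType: "esriGeometryPolygon",
      objectIdField: "OBJECTID",
      fields: [
        { name: "OBJECTID", type: "esriFieldTypeOID", alias: "OBJECTID" }
      ]
    },
    featureSet: {
      geometryType: "esriGeometryPolygon",
      features: features
    }
  };
}

// In the app (3.x):
//   var layer = new FeatureLayer(toFeatureCollection(myFeatures));
```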

Essentially what I need to be able to do is the following:

Query the feature attributes

Render the features individually based on those attributes (currently I call renderer.addValue for each feature that meets the criteria, then call setDefinitionExpression on the layer so it returns only the features I have set renderer values for)

Also, I believe there is a feature limit on that AGOL FL that I'd like to overcome
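To make the second point concrete, here is a simplified sketch of my current flow; the definition-expression helper is real code, while the renderer calls are shown as comments since `symbolFor` and the matching-ID logic are placeholders for my own criteria, and OBJECTID is assumed to be the object-ID field:

```javascript
// Given the IDs of features that met the criteria, build the SQL
// where-clause passed to setDefinitionExpression. "OBJECTID" is an
// assumed object-ID field name; substitute the layer's actual field.
function buildDefinitionExpression(idField, ids) {
  if (ids.length === 0) {
    return "1=0"; // match nothing rather than everything
  }
  return idField + " IN (" + ids.join(",") + ")";
}

// In the app (3.x), after deciding which features match:
//   matchingIds.forEach(function (id) {
//     renderer.addValue(id, symbolFor(id)); // symbolFor = my own logic
//   });
//   layer.setRenderer(renderer);
//   layer.setDefinitionExpression(
//       buildDefinitionExpression("OBJECTID", matchingIds));
//   layer.refresh();
```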
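On the feature limit: my understanding is that hosted services cap the number of features returned per query (maxRecordCount), but results can be paged with result offsets. A minimal sketch of the paging idea, written synchronously for clarity with a stand-in queryPage function (the real query calls return promises):

```javascript
// Page through a service that caps each query at maxRecordCount.
// queryPage(offset, pageSize) stands in for a query using result
// offset / count parameters; the real API calls are asynchronous,
// but the paging logic is the same.
function queryAllFeatures(queryPage, pageSize) {
  var all = [];
  var offset = 0;
  while (true) {
    var page = queryPage(offset, pageSize);
    all = all.concat(page);
    if (page.length < pageSize) {
      return all; // a short page means we have reached the end
    }
    offset += pageSize;
  }
}
```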

Any help would be greatly appreciated.
