I have an issue with drawing 400,000 graphic points on a map. The data for those points comes from a database query, which I can manage now.
The problem starts after I receive around 400K points as a JSON array (with lat/lon and around 10-15 attributes each). Drawing all 400K points at the same time consumes almost 600 MB of memory and crashes my website.
I'm looking for ideas on how to deal with this massive amount of data. I'm considering a ClusterLayer, but I'm still worried about whether loading 400K records into a ClusterLayer will really help. Is there any other approach I can use to manage my 400K points?
Any help or ideas would be very much appreciated.
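One alternative to pushing all 400K records into a client-side ClusterLayer is to aggregate the points into grid cells before they ever reach the browser, so the map only draws one graphic per occupied cell. A minimal sketch in plain JavaScript (the function name and cell-size choice are my own, not part of any library):

```javascript
// Aggregate raw points into grid cells so the client draws one
// centroid graphic per cell instead of one graphic per point.
// cellSizeDeg is the cell width/height in degrees (assumption:
// you would pick it from the current zoom level).
function gridAggregate(points, cellSizeDeg) {
  const cells = new Map();
  for (const p of points) {
    const key =
      Math.floor(p.lon / cellSizeDeg) + ":" + Math.floor(p.lat / cellSizeDeg);
    let cell = cells.get(key);
    if (!cell) {
      cell = { count: 0, sumLat: 0, sumLon: 0 };
      cells.set(key, cell);
    }
    cell.count += 1;
    cell.sumLat += p.lat;
    cell.sumLon += p.lon;
  }
  // Emit one centroid per occupied cell, carrying a count attribute.
  return Array.from(cells.values()).map((c) => ({
    lat: c.sumLat / c.count,
    lon: c.sumLon / c.count,
    count: c.count,
  }));
}
```

At typical zoom levels this collapses 400K points into a few thousand cells, which an ordinary GraphicsLayer can handle; you re-run the aggregation (or re-query) when the user zooms.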
Do you need to show all 400K points? That's an awful lot of points to put on a map... could you work with dynamically fetching something like the top 5K from the database instead?
As an aside, a simple frame-rate monitor is a neat little way to gauge client-side rendering performance while you experiment.
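A minimal sketch of such a monitor in plain JavaScript (the names here are my own; the math is just an average over the last N frame timestamps from requestAnimationFrame):

```javascript
// Rolling FPS estimate from frame timestamps in milliseconds.
// Call recordFrame(t) once per animation frame; fps() averages
// over the last `windowSize` frames.
function createFpsMeter(windowSize = 60) {
  const times = [];
  return {
    recordFrame(t) {
      times.push(t);
      if (times.length > windowSize) times.shift();
    },
    fps() {
      if (times.length < 2) return 0;
      const elapsedMs = times[times.length - 1] - times[0];
      return ((times.length - 1) * 1000) / elapsedMs;
    },
  };
}

// In the browser you would drive it from the render loop, e.g.:
//   const meter = createFpsMeter();
//   function loop(t) {
//     meter.recordFrame(t);
//     requestAnimationFrame(loop);
//   }
//   requestAnimationFrame(loop);
// and display meter.fps() somewhere on the page.
```

If the number drops badly while panning, the bottleneck is client-side drawing rather than the query.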
Since these aren't Esri graphics, they don't have popups or anything like that. If you wanted to try this route (rendering points with WebGL), you could use three.js's Raycaster to add a click event that detects when the user clicks a point and opens a custom popup or something like that.
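This isn't the three.js Raycaster API itself, but the underlying idea is a screen-space hit test: project each point to pixel coordinates and pick the nearest one within a click tolerance. A minimal sketch with hypothetical names:

```javascript
// Given projected screen positions for each point, return the index
// of the point nearest the click within `radiusPx`, or -1 if none.
// In a real three.js app, screenPoints would come from projecting
// each point's world position through the camera.
function pickPoint(screenPoints, clickX, clickY, radiusPx = 8) {
  let best = -1;
  let bestDist2 = radiusPx * radiusPx;
  for (let i = 0; i < screenPoints.length; i++) {
    const dx = screenPoints[i].x - clickX;
    const dy = screenPoints[i].y - clickY;
    const d2 = dx * dx + dy * dy;
    if (d2 <= bestDist2) {
      bestDist2 = d2;
      best = i;
    }
  }
  return best;
}
```

You would call this from a click handler and, on a hit, look up that point's attributes to populate your custom popup.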
In 3.x, I found a JSON layer created dynamically from SQL data was usable up to about 10K records, depending on the browser/machine. In the end, I created a map service sourced from SQL views using data registered to the server, pulling the top 2K records based on the map extent. This works pretty well for our purposes, although I'll definitely keep three.js in mind when v4 is ready for us to use. I'm impressed, although I'd be curious how lower-end machines, maybe with integrated graphics cards, handle the rendering.
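The "top 2K by extent" idea boils down to a bounding-box filter plus a limit. A sketch of that logic in plain JavaScript (field names and the `score` ordering are placeholders; in production this lives in a SQL view with a spatial index, not in the browser):

```javascript
// Keep only points inside the current map extent, then cap the
// result at `limit` records, highest `score` first (a stand-in
// for whatever ORDER BY the SQL view actually uses).
function topNInExtent(points, extent, limit) {
  return points
    .filter(
      (p) =>
        p.lon >= extent.xmin &&
        p.lon <= extent.xmax &&
        p.lat >= extent.ymin &&
        p.lat <= extent.ymax
    )
    .sort((a, b) => b.score - a.score)
    .slice(0, limit);
}
```

Re-running this on every extent change keeps the payload small no matter how many rows the table holds.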