At the company I work for, we have a national dataset in ArcGIS Online (AGOL) as a hosted feature layer. It is already quite large (15 million records), and we keep adding to it as we process new data, so it could eventually grow 20-30 times bigger. My question is: how big is too big for a hosted feature layer?
We expose the data to different groups as views of the main layer, and we have already seen performance issues. We rebuild the indexes whenever we notice a drop in performance, which seems to help, but is there anything else we can do? Also, is there a better way to store such large datasets in AGOL? Thanks!
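For anyone else doing this regularly: rebuilding an index on a hosted feature layer can be scripted against the layer's admin REST endpoint using the updateDefinition operation (re-submitting an existing index definition triggers a rebuild). Below is a minimal sketch that only assembles the request URL and payload; the service URL, index name, and field are placeholders, and you would still need to POST this with a valid token.

```python
import json

def rebuild_index_request(admin_layer_url: str, index_name: str, fields: str):
    """Build the URL and form payload for rebuilding an attribute index
    on a hosted feature layer via the admin updateDefinition operation.

    admin_layer_url: the layer's *admin* REST URL (note the /admin/ segment),
    index_name/fields: an index that already exists on the layer.
    """
    url = admin_layer_url.rstrip("/") + "/updateDefinition"
    payload = {
        # Re-submitting the index definition causes AGOL to rebuild it.
        "updateDefinition": json.dumps(
            {"indexes": [{"name": index_name, "fields": fields}]}
        ),
        "f": "json",
        # "token": "<your token>",  # placeholder: required when you POST this
    }
    return url, payload

# Example with placeholder org ID, service name, and index:
url, payload = rebuild_index_request(
    "https://services.arcgis.com/<orgid>/arcgis/rest/admin/services/"
    "MyLayer/FeatureServer/0",
    "Idx_Region",
    "Region",
)
```

The same operation is exposed in the ArcGIS API for Python through a feature layer collection's `manager.update_definition()`, which may be more convenient if you already script against AGOL that way.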
Hi Manuela Butuc,
Thanks for the post. We have seen much larger hosted feature layers perform without issue. The ArcGIS Online product managers would like to connect with you to learn more about the specific performance issues you're noticing, what the views are used for (e.g., visualization, editing, querying), and the data in the hosted feature layer itself (e.g., geometry type, attachments, relationships). We'll send a message your way to get this started.