I'm going to upload about 500k point features to AGOL.
Is it best to leave it in one file?
Or is it best to make 3 or 4 chunks based on what part of town (or something) to keep the table sizes down?
Does it matter?
One file is easier, for sure, and makes some later calculations easier. But if there were a good reason to break it up, I'd do it.
Anybody know? Got some experience or an opinion?
Thanks!
Doesn't matter for table size, you can have massive tables on here. The real issue is going to be how you're accessing it. Even if you split it into multiple chunks, viewing the entire town at once is going to try to load all of them.
I don't think there are any good reasons to split it up, it's just going to make your life harder. Stick with one file.
I ended up splitting the files by region; I can always join them later. But so far, no problems in the New Map Viewer doing styling or calculations. My recollection is that the math is always done correctly in the cloud, but the web viewer may not always display things correctly.
500k features are fine, particularly if they are points.
What you need to consider for larger datasets:
Splitting the same data into multiple layers doesn't really achieve much. If you needed to do this for other reasons, I'd recommend Hosted Feature Layer Views (one dataset, views to control what is accessible).
So most of your considerations will be performance and storage costs. More on the latter here:
https://esriaustraliatechblog.wordpress.com/2021/06/22/which-arcgis-online-items-are-consuming-the-m...
Thanks! It's all point features for now, but very good considerations. Plus, there's some dodging of the continual, notorious xlsx upload bug.
I'm not familiar with the xlsx bug you're referring to, but I am familiar with the generic reasons tables won't upload correctly.
If you're publishing, honestly it's better to convert to a file geodatabase and validate in Pro when sharing. End users might need some education on that, which can be hard.
If this is what you're seeing (and not some specific bug) happy to help further
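For what it's worth, here's a minimal stdlib sketch of the kind of pre-upload sanity checks that catch most of those generic table problems (blank or duplicate headers, bad coordinates). The `lat`/`lon` field names are my assumption, not anything from your data:

```python
import csv
import io

def check_table(csv_text):
    """Flag common issues that break hosted-layer uploads:
    blank/duplicate headers and missing or out-of-range coordinates."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    headers = reader.fieldnames or []
    if any(not h or not h.strip() for h in headers):
        problems.append("blank header name")
    if len(set(headers)) != len(headers):
        problems.append("duplicate header names")
    # Data rows start on line 2 of the file
    for i, row in enumerate(reader, start=2):
        try:
            lon, lat = float(row["lon"]), float(row["lat"])
        except (KeyError, TypeError, ValueError):
            problems.append(f"row {i}: missing/non-numeric lon/lat")
            continue
        if not (-180 <= lon <= 180 and -90 <= lat <= 90):
            problems.append(f"row {i}: coordinates out of range")
    return problems

# Example: one bad row (lat of 95) out of two
sample = "lat,lon,name\n41.7,-72.7,A\n95.0,-72.7,B\n"
print(check_table(sample))  # → ['row 3: coordinates out of range']
```

Running something like this before hitting the uploader at least rules out the boring failure modes before you blame the bug.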
yeah, it's been around for a while, and is scheduled for a fix.
1. user uploads table from xlsx file, creates hosted layer.
2. everything works fine! puts it in maps, styles it, etc.
3. does not use the layer for a few days. User goes into the map and runs some calculation like counting points; it doesn't work, and they get an error message like "no extent". The user can still style the layer in that map, but cannot open the layer via Content or the "description" options. It's like a zombie.
4. user waits a week or so...layer cures itself!
Sadly, what you end up doing is exporting it as an FGDB; loading it from that is an instant cure. I've had to do that 3 times now when there was work that HAD to be done by a deadline.
Tech support says don't use xlsx, use csv, but that's a crazy extra step to get into...and I'm not sure the behaviour is that different.
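In case it helps anyone scripting that extra step, here's a minimal pandas sketch of the xlsx-to-csv round trip (the filenames and sample data are made up, and `to_excel`/`read_excel` need openpyxl installed):

```python
import pandas as pd

# Build a tiny sample table and save it as xlsx,
# standing in for the real file you'd upload to AGOL
df = pd.DataFrame({"lat": [41.7, 41.8], "lon": [-72.7, -72.6],
                   "name": ["A", "B"]})
df.to_excel("points.xlsx", index=False)  # requires openpyxl

# The "crazy extra step": convert xlsx -> csv before uploading
pd.read_excel("points.xlsx").to_csv("points.csv", index=False)

with open("points.csv") as f:
    print(f.readline().strip())  # → lat,lon,name
```

At least as a batch script it's one line of real work rather than a manual Save As every time.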
Very frustrating!