I have a survey with over 10,000 submissions, and when users go to export data from the online dashboard, the site is very slow and occasionally crashes. Would this lag be caused by storing that many submissions? Is there a way to archive old data? Or would I have to delete and recreate the survey?
Thanks
Maybe.
I found the biggest drag on the website is repeats. The more you have, the worse your experience will be, and that degraded experience is irrespective of the number of records (e.g., I have a survey with ~50 repeats that lags with only a couple of records added).
On the other hand, I have a survey with several repeats (7) and around 45k records, and it loads relatively fast and reliably for me. My colleague would have a harder time with it. In that case, it all comes down to internet connection speed (I am on fiber and they are on Skynet Starlink).
Really, 10k records isn't that many and generally speaking should be fine to load.
It's entirely possible. Retrieving that much information, depending on the geometry type and the number / size of the fields included, could be taxing on a server.
The lag isn't caused by storing the data, though, just by retrieving it. If you don't want to see old data, you can filter the data being downloaded, or use a View Layer to prevent old data from coming through at all.
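As a rough sketch of the "filter the data being downloaded" idea: a hosted feature layer's REST `query` endpoint accepts a `where` clause, so you can request only recent records instead of everything. The field name `CreationDate` and the 90-day cutoff below are assumptions for illustration, not details from this thread — check your layer's actual editor-tracking field names.

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def recent_records_query(base_url: str, days: int = 90) -> str:
    """Build a feature layer REST query URL limited to recent records.

    `CreationDate` is an assumed editor-tracking field name; your layer
    may use a different one.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    params = {
        # ArcGIS where clauses accept TIMESTAMP 'YYYY-MM-DD HH:MI:SS' literals.
        "where": f"CreationDate >= TIMESTAMP '{cutoff:%Y-%m-%d %H:%M:%S}'",
        "outFields": "*",
        "f": "json",
    }
    return f"{base_url}/query?{urlencode(params)}"
```

For example, `recent_records_query("https://services.arcgis.com/<org>/arcgis/rest/services/<layer>/FeatureServer/0", days=7)` would fetch only the last week of submissions rather than the whole table.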
Generally, the single biggest contributor to slow exports is photos (attachments). If you've left the default image size on your device, that could be many GB of images, and it'll take a while to export.
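To put rough numbers on that — the ~3 MB per photo is an assumed figure for a default-resolution phone camera image, not something from this thread:

```python
submissions = 10_000
photos_per_submission = 1  # assumption; multiply up if your form captures several
avg_photo_mb = 3           # assumed size of a default-resolution phone photo

total_gb = submissions * photos_per_submission * avg_photo_mb / 1024
print(f"~{total_gb:.1f} GB of attachments")  # ~29.3 GB
```

Even at one photo per record, an export can easily run into the tens of gigabytes.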
Is there a way to archive old data?
Your Survey123 website should be using a Hosted Feature Layer View. You can apply a view filter to exclude older records from this layer.
But 10,000 point features isn't a lot and shouldn't cause issues.
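For reference, a view's filter is just a definition query on the view layer. Whether you set it through the view's settings page or the REST admin `updateDefinition` operation, the underlying payload looks roughly like this — the field name and date are placeholders, not values from this thread:

```json
{
  "viewDefinitionQuery": "CreationDate >= TIMESTAMP '2024-01-01 00:00:00'"
}
```

Records failing the query are hidden from everything built on the view (including the Survey123 website), while the source layer keeps the full history.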
If you go to the feature service item details page, what is the size of item and attachments?
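If you check those numbers programmatically instead (the ArcGIS API for Python exposes an item's `size` attribute — by my understanding a byte count, but verify against the API reference), a small helper makes the raw value readable. The arcgis call itself is left out here since it needs a live connection; this is just the formatting piece:

```python
def human_size(num_bytes: float) -> str:
    """Format a byte count the way the item details page displays sizes."""
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if num_bytes < 1024:
            return f"{num_bytes:.1f} {unit}"
        num_bytes /= 1024
    return f"{num_bytes:.1f} PB"
```

For example, `human_size(1536)` gives `"1.5 KB"`; feeding it the item and attachment sizes makes it obvious whether attachments dominate the service.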
There is a backend to ArcGIS Online. If you have editor tracking enabled, track edit changes, and have indexes on frequently edited fields, the backend tables will grow and performance will take a hit. I've seen services that are 1 GB with no features in them. That's probably not what you're seeing, since those tables won't be exported, but there's a chance something like indexing is impacting it.
There are also things like Content Delivery Networks (CDNs) and caching settings. For example, a public, non-editable service with the CDN enabled will be very fast for standard queries because it leverages the global network, whereas a relative-date query on an editable service will always go to the database. If you're interested in this, you're best off watching the Esri Dev Summit videos:
https://mediaspace.esri.com/media/t/1_2pkz2pju
Thanks for the replies. Here is some additional information about the survey in question.
The slow site has really only become a problem in the last month or so. The data is accessed by internal and external partners, so the issue is not limited to my office's site/internet.
I tried filtering submissions to the last 7 days (168 entries) and then adding another filter by estimator (58 entries); the page either gets stuck in an endless loading cycle, or clicking on an entry results in a 1-3 minute load time.
I've attached the layer details.
Nothing you have listed there is particularly alarming; it is all within scope for the Survey123 website. Generally speaking, this shouldn't cause the website to behave slowly. If you are seeing a slow website in this case, it is most likely one of two main causes: 1) internet connection quality, or 2) something may be wrong on Esri's end.
Two big performance boosts you can give yourself are:
1) Un-check "Form view". I find it creates unnecessary lag... it also likes to turn itself back on sometimes (so be prepared to turn it off a couple of times in a row).
2) Get rid of related tables. In your case, you don't have that many related tables, but it can still give a slight performance boost. For my large surveys with MANY related tables, I 'x' out of them and then bookmark that URL. I find it saves me a lot of loading time.