Hello,
We are creating a public web Survey123 form that pulls in a considerable amount of data for use in the survey. The survey asks hunters about their deer harvest numbers across several seasons, hunting units, and hunting tag types, and the goal is to keep this all in one survey. To ensure data quality, we want to limit choice lists, control the relevancy of questions, and apply input constraints that are unique to each hunter and the type of hunting they did. That way hunters aren't answering questions for seasons or hunting units they did not possess. We have all this information in our SQL database, and I export it to a CSV document to use in Survey123. We then hook up the pulldata functionality and the whole thing comes to a standstill, with excessively long load times; the Survey123 web page/Connect goes "not responding" and crashes. The CSV has about 50,000 rows and hundreds of columns. Does anyone have a better way of importing data from a database into Survey123 to create public surveys tailored to the user, or is this simply exceeding the capabilities of Survey123?
Thanks,
Nikholai
I have done up to 8,000 rows, used by 12 fields in a repeat of 50, and it has been OK. It's really device dependent: an older Samsung was taking 22 seconds to load, while a brand-new iPad Pro loads in 1.5 seconds.
It sounds like you are using a browser rather than the app, though? 50,000 rows is probably pushing it, but it could also be something in your design.
One thing I can think of: you probably do not need all of those columns. Keep the CSV to just the ones you really need for the lookup.
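To illustrate, here's a minimal sketch of trimming a CSV down to just the lookup columns before publishing. The column names are made up, and it assumes a simple CSV with no quoted fields containing commas:

```javascript
// Keep only the columns the survey actually looks up.
// Assumes a simple CSV: no quoted fields containing commas.
function trimColumns(csvText, keep) {
  var lines = csvText.trim().split("\n");
  var header = lines[0].split(",");
  // Map each wanted column name to its index in the header row.
  var idx = keep.map(function (name) { return header.indexOf(name); });
  return lines.map(function (line) {
    var cells = line.split(",");
    return idx.map(function (i) { return cells[i]; }).join(",");
  }).join("\n");
}

// Example with hypothetical column names:
var csv = "tag_number,hunter_name,address,hunt_unit,season\n" +
          "A123,Jane,1 Elm St,12A,archery";
console.log(trimColumns(csv, ["tag_number", "hunt_unit", "season"]));
// → tag_number,hunt_unit,season
//   A123,12A,archery
```

Dropping unused columns won't fix a 50,000-row file on its own, but it shrinks the file Survey123 has to parse on every load.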
Another idea, if they are always connected, is to use a JavaScript function for the lookup. Others may have more info on this and whether it is supported in a browser.
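For reference, a rough sketch of what such a lookup function could look like, querying a hosted feature layer instead of shipping the CSV with the form. The layer URL and field names here are placeholders, not your real service, and the synchronous XMLHttpRequest is an assumption based on the common pattern for pulldata("@javascript") functions, which must return a value synchronously:

```javascript
// Hypothetical sketch: look up a hunter's hunt units from a hosted
// feature layer. Called from the XLSForm as something like:
//   pulldata("@javascript", "lookup.js", "getHunterUnits", ${tag_number})
// The service URL and field names below are placeholders.

function buildQueryUrl(tagNumber) {
  var layerUrl = "https://services.arcgis.com/xxxx/arcgis/rest/services/Tags/FeatureServer/0";
  return layerUrl + "/query?where=tag_number='" + tagNumber + "'" +
         "&outFields=hunt_unit&f=json";
}

// Parse the feature service JSON response and return a comma-separated
// list of hunt units, or an empty string if nothing matched.
function parseUnits(responseText) {
  var data = JSON.parse(responseText);
  if (!data.features || data.features.length === 0) return "";
  var units = [];
  for (var i = 0; i < data.features.length; i++) {
    units.push(data.features[i].attributes.hunt_unit);
  }
  return units.join(",");
}

function getHunterUnits(tagNumber) {
  // Script functions return synchronously, hence the synchronous request.
  var xhr = new XMLHttpRequest();
  xhr.open("GET", buildQueryUrl(tagNumber), false); // false = synchronous
  xhr.send();
  if (xhr.status !== 200) return "";
  return parseUnits(xhr.responseText);
}
```

The returned string could then feed choice filters and relevancy expressions, instead of a 50,000-row pulldata CSV.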
Hope that helps some.
Hi Nikholai,
I've also had plenty of scenarios where a CSV file with 12,000-odd rows crashed the mobile app, yet others report including around 20,000 records and it worked fine, so I think this is OS- and device-dependent.
Regarding the JavaScript function in public forms, this is not supported, unfortunately.
My suggestion is to slowly introduce rows into the CSV to test at what point you hit the roof, and also to split your CSV up into different CSVs if at all possible.
I hope this helps.
As others in this post have mentioned, you can optimize the survey form and the external spreadsheet, but probably not enough for a CSV of your size. More discussion on GeoNet:
XLS form size, or number of rows limitation
1K, 5K,10K ... pushing the limit on External Choices in Survey123
JavaScript would be my go-to, but as Albertus indicated, it's not available for use with public surveys.
One alternative method would be to use the custom URL scheme to pre-fill the data based on another system.
For example, you could send out emails, driven by your database, that contain a URL to launch Survey123. The email would be tailored to each hunter and pre-populate the fields unique to them (which can be read-only or even hidden from the end user). This streamlines the survey and keeps control of updates in the system that maintains the records.
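To sketch what generating those links could look like (the item ID and field names are placeholders, while the field:name query parameters follow the Survey123 web app URL scheme):

```javascript
// Build a per-hunter Survey123 web link that pre-fills survey fields.
// The item ID and field names below are placeholders for illustration.
function buildSurveyUrl(itemId, fields) {
  var base = "https://survey123.arcgis.com/share/" + itemId;
  var params = [];
  for (var name in fields) {
    params.push("field:" + name + "=" + encodeURIComponent(fields[name]));
  }
  return base + "?" + params.join("&");
}

// One row pulled from the hunter database:
var url = buildSurveyUrl("0123456789abcdef", {
  hunter_id: "H-42",
  hunt_unit: "12A",
  season: "archery"
});
// url → https://survey123.arcgis.com/share/0123456789abcdef?field:hunter_id=H-42&field:hunt_unit=12A&field:season=archery
```

A mail-merge over the database rows would then drop each hunter's URL into their email, so the form opens already scoped to their seasons and units.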
At some point in this workflow, webhooks could be factored in, triggering events like notifications or emails.
Just to add - the prefilled data pulled in from the URL scheme could also contribute to relevancy filters on other questions.
I haven't got the whole Survey123 hooked up yet, but so far this is promising and seems like it will work! I'll let you know how it goes once everything is finished.
Thanks for the tip!
I'm having a similar issue. I have a form that is loading very slowly. I have 3 CSV files in my media folder, with 5,000, 6,000, and 5,000 rows each; I've already reduced them from over 11,000. I tried moving my relevant Transformer Search out of the repeat, but nothing I've tried (even removing the repeat altogether) seems to speed up the load time. Any suggestions?