Hello, community -
My colleagues and I have been struggling with the post-field side of the Survey123 workflow. We can create what appear to be well-designed, smart field forms that record field data in several related tables (themselves created through repeats and/or nested repeats). We hit significantly bumpier roads when we try to take that field data and format it into tables for our narrative reports. I can't escape the feeling that we're missing a solution somewhere.
Survey123's Report Templates would work if we were presenting tabular data on more of a per-resource/record basis, but we often present tables of data from multiple records at once. As an example, we have crews dig dozens of holes along a survey corridor and record their findings (soil stratigraphy, disturbance, artifacts, etc.) for each hole. In our reports, we present tables of all of the tests on a survey (or all of the tests in an area). Our field form has repeats for soil layer observations, soil inclusions (themselves nested within soil layers), observed artifacts (in a different set of repeats), etc. Because our Shovel Test form uses different repeats, we've been unable to make the Report Templates work.
Here's a simplified version of what the output table should look like (blank cells would typically be merged together for a given Test/Layer/etc.):
| Test (main table) | Observation(s) (main table) | Soil Layer (repeat) | Inclusion(s) (nested repeat) | Artifact(s) (different, independent repeat) |
|---|---|---|---|---|
| Test 1 | In a field | 1 (0-33 cm) Dark brown sand | Gravel | Ceramic sherd (0-20 cm [our artifacts must be presented in 20-cm levels in our reports]) |
| | | | Tree roots | Scrap metal (20-40 cm) |
| | | 2 (33-54 cm) Red clay | Carbonate nodules | |
| Test 2 | Next to a house | 1 (0-10 cm) brown sand | Tree roots | |
| | | 2 (10-43 cm) red clay | carbonate nodules | |
| Test 3 | Next to a different house | 1 (0-25 cm) dark brown sandy loam | -- | |
| | | 2 (25-30 cm) dark brown loam | tree roots | Stone flake (20-40 cm) |
| | | 3 (30-47 cm) pale yellow, dense clay | mottling | |
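Since the artifacts must be reported in 20-cm levels regardless of the depths recorded in the field, one small piece of the translation can be automated directly. A minimal sketch (the function name and interval parameter are my own, not anything in Survey123):

```python
def depth_to_level(depth_cm: float, interval: int = 20) -> str:
    """Return the reporting level (e.g. '20-40 cm') that contains depth_cm.

    A depth landing exactly on a boundary is assigned to the level below it.
    """
    top = int(depth_cm // interval) * interval
    return f"{top}-{top + interval} cm"


print(depth_to_level(15))   # 0-20 cm
print(depth_to_level(39))   # 20-40 cm
```

The same idea could live in a calculate field in the XLSForm itself, so the level is recorded at collection time rather than derived afterward.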
Our current workflow is to take the raw, tabular field data (the main table and the 4-5 subtables associated with repeats) and then rejoin them (to the extent possible) in Excel using various functions and queries over several steps. It is all very laborious and fairly error-prone, which we assume is the exact opposite of the intent of digital data collection platforms like this.
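For what it's worth, Survey123 typically links each repeat record to its parent via `globalid`/`parentglobalid` fields, so the rejoin being done by hand in Excel can be expressed as one left join per repeat table. A minimal pandas sketch, with hypothetical table and column names standing in for a real export:

```python
import pandas as pd

# Hypothetical stand-ins for an exported main table and one repeat table.
tests = pd.DataFrame({
    "globalid": ["t1", "t2"],
    "test_name": ["Test 1", "Test 2"],
    "observation": ["In a field", "Next to a house"],
})
layers = pd.DataFrame({
    "parentglobalid": ["t1", "t1", "t2"],
    "layer": ["1 (0-33 cm) dark brown sand",
              "2 (33-54 cm) red clay",
              "1 (0-10 cm) brown sand"],
})

# Left join rebuilds the parent/child relationship: each test row is
# repeated once per soil layer recorded under it.
report = tests.merge(layers, left_on="globalid",
                     right_on="parentglobalid", how="left")
print(report[["test_name", "observation", "layer"]])
```

Nested repeats chain the same way (layer `globalid` to inclusion `parentglobalid`), and an independent repeat like artifacts joins back to the main table separately.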
I fully expect that we're still just dealing with a learning curve on our end; people have dealt with this issue effectively in the past but we just aren't aware of what they did.
Are there database/query-focused functionalities within ArcGIS (Pro, presumably) that streamline the formatted tabular data output process (perhaps, fingers crossed, even semi-automatically)? Is there another tool that might be a better fit for what we are trying to do? Should we be designing our XLSForm in some other way?
I'm a bit reluctant to post proprietary examples on the forums, but I'd be happy to send examples through DM.
Thank you all very much in advance for any help you can provide.
I ran into this same thing with my first surveys. I thought they were great, until I looked at how they exported, and then they stunk. In the process of dealing with that mess of my own creation, I learned a few things, and while I know there are more options than those listed here, I hope they may help you.
1. It is well worth your time to use functions within Connect that help the export be a bit tidier: defaults, constraints, hidden fields, etc.
E.g. 1: Set bind::esri:fieldType to null, which allows you to prevent some of your questions, such as notes, from showing up at all in your feature class.
E.g. 2: Use bind::esri:fieldLength. It's tricky, because you don't want to make it too small, but don't accept the default, which is 255 or in some cases 1000. I ended up with a select_one yes/no that had a width of 255. Sort of overkill.
E.g. 3: Choose the values in your 'name' column in Connect well - those will be your field names in the feature class.
2. Do some trial exports of sample data you 'collect', and see just how messy the export is. That may show you some questions in particular that need attention in Connect. Identify the worst questions, dig around, and then if you can't figure out exactly how to improve it, ask the community. Try to provide as many details as possible and examples/screenshots and what you have tried.
3. Eliminate as many text boxes as is humanly possible. Force the users to use cascading selects, drop-downs, etc. If there are text boxes, make them brief when possible. I use a length of 255 for "comments", but for a person's name (for example) not in a list, limit it to 25 and force it to be first initial, last name, no punctuation.
4. My final solution was to work with a Python wizard who wrote a script to help clean up the survey data and export it into a feature class and into a CSV. That would have taken me years, but it took her a few days - so see if you can find a programmer to help clean up the data. It doesn't have to be Python... The programmer was also able to do such wonderful things as check for extra spaces after words, parse fields with comma-separated data into separate fields, etc.
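The kinds of cleanup steps mentioned in point 4 tend to be short in pandas. A hedged sketch with made-up column names, not the actual script described above:

```python
import pandas as pd

# Hypothetical messy export: stray whitespace and comma-separated answers.
df = pd.DataFrame({
    "comments": ["dark  brown  sand ", "red clay"],
    "inclusions": ["gravel,tree roots", "carbonate nodules"],
})

# Collapse runs of whitespace and trim leading/trailing spaces.
df["comments"] = df["comments"].str.split().str.join(" ")

# Parse a comma-separated multi-answer field into separate columns.
split_cols = df["inclusions"].str.split(",", expand=True)
split_cols.columns = [f"inclusion_{i + 1}" for i in split_cols.columns]
df = pd.concat([df.drop(columns="inclusions"), split_cols], axis=1)

df.to_csv("cleaned_survey.csv", index=False)
```

Each rule is one line, which is what makes a script like this so much faster than repeating the same fixes in Excel every season.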
Maybe those will save you a little time. It's definitely a learning curve, and I feel the information is scattered, so if you don't know the correct keyword, it can take a long time to find the help you need. If you come across any 'gems', please DM me or post!
Thank you so much, Peg, for your detailed reply and for all of those suggestions!
I'm well on the way on a couple of your suggestions already but it's good to get reinforcement that I'm on the right track.
Your e.g. 1 is one that I hadn't tried yet and it definitely might be a big help. So thank you for that one in particular.
We have very complicated forms as well, so it makes sense that any reporting will take some work. Repeats are always hard.
All of our programs store the data differently in the backend. The schema that makes the forms work well is not a great way to permanently store the data. At season's end we download the data, then use Python to translate it into something more usable and store it in SDE/SQL. We also create summary indicators that we serve out to the users in web services.
We then have web maps, Ops Dashboard, and Excel reports and PDFs, generated by Python mostly, with some Data Driven Pages.
The big thing I have noticed over the years is that companies modernize the technology but not the workflow. People are flat-out addicted to Excel and "paper" reports still, even in 2022. So we have users fill out a form, then go through a convoluted process to turn it into an Excel or paper report, then print that out and say this is not any easier. Well yes, correct. 🙂 But it is really hard to change that behavior for sure. Then if you are in government, sometimes you have to have a static report. I have still had luck, though, creating summary maps and services and showing them something better. I also make some summarized Excel files or link the Excel directly to SQL. The market has changed to "live time" reporting and so has Esri, but unfortunately many of us are stuck in the static world due to policies.
Not sure if this helps but if you have big complicated forms then your reporting will be a pain also and you will probably need a script of some sort. Excel is probably not the answer. Using Python to manipulate the data and create a summary Excel may be though.
Thank you for this information. I'm starting to see that I won't be able to come up with a solution entirely within S123 and Excel. I, personally, don't know Python, but we can look into finding someone to help with that.
Thank you for your help!