Append Data Error

09-18-2019 02:45 AM
ElizabethNaro
New Contributor II

We just started using Survey123 to collect wildlife count data. Previously we recorded all of our data manually in Excel. Now we want to merge (or append) our historical data into the feature layer holding the new data so that we can represent them together in a dashboard. The Survey123 data comes in through a related table, so we are appending our historical data to that related table.

I noticed early on that there was a limit to how much data could be appended at once, so I broke our data (8 years' worth) into smaller individual CSV files for appending. It had been working fine this way, and I was able to append more than 6 years of data, but now it is not working anymore. The remaining files are in the exact same format and of similar size to the ones I have already appended, but when I try to append them I just get "error. could not append data to layer". Why is this happening? Is there a limit to how much data you can append to a single layer? So far I have appended 7 CSV files with a total of 8,611 features, so the layer has 9,839 features in total.
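In case anyone wants to reproduce the splitting step, here is a minimal sketch of how one large export could be broken into smaller CSV files for appending. The file names, the wildlife_counts.csv path, and the chunk size are placeholders, not the actual files from our workflow.

import pandas as pd

CHUNK_SIZE = 1000  # rows per output file; placeholder value, adjust as needed

# read_csv's chunksize option yields DataFrames of at most CHUNK_SIZE rows,
# so each iteration writes one smaller file ready to append separately.
for i, chunk in enumerate(pd.read_csv("wildlife_counts.csv", chunksize=CHUNK_SIZE)):
    chunk.to_csv(f"wildlife_counts_part{i + 1}.csv", index=False)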

1 Reply
ElizabethNaro
New Contributor II

So I eventually got this to work, but I still don't really understand what happened. The issue seemed to be the number of features I was trying to append at once. It had been handling CSV files with 1,000+ features each just fine, but after my seventh file it stopped working. I had to drop all the way down to appending 50 or 100 features at a time, and was able to work back up to appending as many as 350 features at once. At that point I tried to append a file with 500 features and it stopped working again. I tested it by lowering the number of features by 50 each time, and it didn't append until I got back down to only 100 features. After it started working again, I appended files of fewer than 200 features each and that was fine. I don't really understand why this happened, and it was a huge pain when trying to append large amounts of historical data. If anyone has suggestions for how to avoid this issue in the future, that would be appreciated.
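This isn't an explanation for the failures, but one way to avoid babysitting manual appends is to script the upload in small batches with the ArcGIS API for Python instead of the web Append tool. Below is only a rough sketch: the item ID, table index, batch size, sign-in details, and CSV file name are all assumptions you would replace with your own.

import pandas as pd
from arcgis.gis import GIS

BATCH_SIZE = 150  # placeholder; small enough that individual appends kept succeeding

gis = GIS("https://www.arcgis.com", "username", "password")  # your credentials
item = gis.content.get("<feature-layer-item-id>")  # hypothetical item ID
related_table = item.tables[0]  # assumed position of the Survey123 related table

df = pd.read_csv("historical_counts.csv")  # placeholder file name
# Build one attribute dict per row; a related table has no geometry to set.
features = [{"attributes": row.to_dict()} for _, row in df.iterrows()]

# Push the rows in small batches and report any adds the service rejects.
for start in range(0, len(features), BATCH_SIZE):
    batch = features[start:start + BATCH_SIZE]
    result = related_table.edit_features(adds=batch)
    failed = [r for r in result["addResults"] if not r["success"]]
    if failed:
        print(f"Batch starting at row {start}: {len(failed)} adds failed")

If a batch fails, the loop tells you exactly where to resume, which beats re-checking seven CSV files by hand.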