Mobile Data Collection.
I am working on a building inspection mobile application using Collector for ArcGIS via ArcGIS Online. I am storing my feature class in a Microsoft SQL Server 2012 enterprise geodatabase and publishing it as a feature service shared with my ArcGIS Online organization. I have over 1,000,000 building footprints (polygons) that I would like to make available to Collector for ArcGIS data collectors, and I am seeking techniques to make this process efficient and manageable.
The Collector for ArcGIS users will not be updating geometries (digitizing), so I only need to provide them with the scheduled survey areas they have been assigned at the start of a shift, e.g. neighborhoods of 5,000-7,000 building footprints. I want my survey teams to write back to my SDE feature class, where I can analyze and report the results of survey activities.
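One way to think about the per-shift assignment step is as a simple grouping problem: map each footprint's neighborhood to a team, then hand each team its list of feature IDs. A minimal plain-Python sketch (the field names `id` and `neighborhood` and the team mapping are illustrative, not from the original post):

```python
from collections import defaultdict

def batch_by_neighborhood(footprints, team_for_neighborhood):
    """Group building footprints into per-team work lists for a shift.

    footprints: iterable of dicts with 'id' and 'neighborhood' keys
    team_for_neighborhood: dict mapping neighborhood -> team name
    Footprints in unassigned neighborhoods are skipped.
    """
    batches = defaultdict(list)
    for fp in footprints:
        team = team_for_neighborhood.get(fp["neighborhood"])
        if team is not None:
            batches[team].append(fp["id"])
    return dict(batches)

footprints = [
    {"id": 1, "neighborhood": "N5"},
    {"id": 2, "neighborhood": "N6"},
    {"id": 3, "neighborhood": "N5"},
]
assignments = batch_by_neighborhood(footprints, {"N5": "Team A", "N6": "Team B"})
print(assignments)  # {'Team A': [1, 3], 'Team B': [2]}
```

In practice the team assignments would drive whatever filtering mechanism you use (definition queries, separate maps, or offline areas), but the grouping logic stays the same.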
My first thought is to create an .mxd for each neighborhood and have numerous feature services writing back to a single feature class. I am not sure how this would work, and the setup seems too time-consuming.
I know ArcGIS Online applies a default 1,000-record limit (maxRecordCount) to queries against published feature layers.
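That 1,000-record ceiling applies per query, and the feature service REST API lets clients page past it with the `resultOffset` and `resultRecordCount` parameters. A sketch of just the offset arithmetic (the actual REST call is omitted; `total_features` would come from a `returnCountOnly` query):

```python
def query_pages(total_features, max_record_count=1000):
    """Yield (resultOffset, resultRecordCount) pairs that together cover
    every feature in the layer, respecting the service's maxRecordCount."""
    offset = 0
    while offset < total_features:
        count = min(max_record_count, total_features - offset)
        yield (offset, count)
        offset += count

pages = list(query_pages(2500))
print(pages)  # [(0, 1000), (1000, 1000), (2000, 500)]
```

Each (offset, count) pair would become one query request; this is how most client libraries work around the per-request record limit without raising maxRecordCount on the service itself.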
I would welcome solutions, suggestions, and/or workarounds for sharing large-volume datasets with ArcGIS Online organizations.
Did you manage to get any answers?
How about prepopulating a date field (select the buildings and specify the day they will be inspected), then using a definition query to show only the buildings to be inspected today?
For example, Datefield = CURRENT_DATE to show today's buildings, or Datefield >= CURRENT_DATE AND Datefield <= CURRENT_DATE + 30 to show only those due in the next 30 days.
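If the assignment tooling builds these definition queries programmatically, the windowed clause needs both a lower and an upper bound. A small sketch that assembles the WHERE string (the field name `InspectDate` is hypothetical; `CURRENT_DATE` is the keyword ArcGIS definition queries support for date comparisons):

```python
def due_window_clause(field="InspectDate", days=30):
    """Build a definition-query WHERE clause selecting features due
    between today and N days from now (field name is illustrative)."""
    return f"{field} >= CURRENT_DATE AND {field} <= CURRENT_DATE + {days}"

print(due_window_clause())
# InspectDate >= CURRENT_DATE AND InspectDate <= CURRENT_DATE + 30
```

The clause string would then be applied as the layer's definition query, so each crew only sees footprints inside its scheduled window.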
There are a number of ways you could approach this...
For starters, if you use an offline workflow you can have far more than 1,000 features. We've done data collection using Runtime apps that generated upwards of 500k features, synced them across multiple devices, and it worked.
If I were in your position I think I'd consider one of two options:
Either way, keep in mind that you will eventually end up with some pretty large archive/business tables in your SQL Server database. We've frequently run into issues where the archive becomes corrupt, or ArcGIS Server/AGOL hits other problems when syncing the data. The only option then is to clear and rebuild the archive, republish the service, and, in the case of Collector, remove and re-download the map on your devices.
If you have an advanced SQL DBA, you can push a lot of the work to the database side. We use partitioned tables, columnstore indexes, FILESTREAM, and other DB-side tricks to manage datasets that climb well over 1 million features. It takes some effort, and we push (or break) the edges of almost everything Esri says it supports out of the box, but in doing so we maintain front-end performance for users and actually make some things more manageable for us on the back end.
I'd agree with mikedmanak: we use an all-offline workflow for collections, and that's been a key piece of managing the system, especially since we could more easily push our statewide .tpk files for the high-resolution imagery we collect.