I don't want my field users to waste time or introduce typing errors by entering data that is already available in other layers in my enterprise geodatabase (GDB). I also want to process some if-then scenarios.
Agricultural use case workflow:
1. User digitizes a polygon and enters 12 attributes, all from drop-down lists (domains), except for acres, which can be hand-entered in special cases.
2. For each new polygon feature, I want to know the County containing the polygon's centroid (for reporting purposes). I can pull that from my Counties layer using a spatial join, or a select by location plus the arcpy.da module.
3. The polygon may or may not fall inside certain jurisdictional boundaries. If it is IN, run geoprocess 1; if it is OUT, run geoprocess 2.
4. The user may have entered a specific acreage as an attribute on the polygon. If they entered a value, use it; if they didn't, calculate acres from the polygon's shape area.
etc., etc. etc.
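The per-feature branching in steps 2-4 could be sketched as plain Python, with the arcpy-backed lookups stubbed out. All helper names here (`county_for`, `inside_jurisdiction`, `geoprocess_1`, `geoprocess_2`) are hypothetical placeholders for whatever spatial-join / select-by-location routines actually do the work:

```python
# Sketch of the per-feature decision logic in steps 2-4.
# The callables passed in are hypothetical stand-ins for the real
# arcpy-based county lookup, jurisdiction test, and geoprocesses.

SQ_FT_PER_ACRE = 43560.0  # assumes shape area is in square feet

def process_feature(feature, county_for, inside_jurisdiction,
                    geoprocess_1, geoprocess_2):
    # Step 2: tag the feature with the county containing its centroid.
    feature["county"] = county_for(feature["centroid"])

    # Step 3: branch on jurisdictional membership.
    if inside_jurisdiction(feature["geometry"]):
        geoprocess_1(feature)
    else:
        geoprocess_2(feature)

    # Step 4: prefer the hand-entered acreage; otherwise derive it
    # from the polygon's shape area.
    if feature.get("acres") is None:
        feature["acres"] = feature["shape_area_sqft"] / SQ_FT_PER_ACRE
    return feature
```

In the real script each callable would wrap an arcpy operation (e.g. a select-by-location against the Counties layer); keeping the dispatch logic separate from the geoprocessing makes the if-then scenarios easy to test outside the GDB.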
I currently have a Python script running every 10 minutes directly against the enterprise feature class in SQL Server, but I worry about race conditions if Collector activity scales up. (It currently takes about 6 minutes to process 3,400 features.)
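One common way to keep overlapping scheduled runs from double-processing rows is a "claim before process" pattern: each run atomically stamps the unprocessed rows with its own batch id, then only touches the rows it stamped. A minimal sketch, shown with sqlite3 purely for illustration (the real target would be the SQL Server table, and the table and column names `parcels`, `status`, `batch_id` are assumptions, not the actual schema):

```python
import sqlite3

def claim_new_rows(conn, batch_id):
    """Atomically claim all NEW rows for this run, then return them.

    Because the claim is a single UPDATE statement, two overlapping
    runs cannot both claim the same row: each row's batch_id ends up
    belonging to exactly one run.
    """
    conn.execute(
        "UPDATE parcels SET status = 'CLAIMED', batch_id = ? "
        "WHERE status = 'NEW'",
        (batch_id,),
    )
    conn.commit()
    return conn.execute(
        "SELECT id FROM parcels WHERE batch_id = ?",
        (batch_id,),
    ).fetchall()
```

Note that editing the business table directly in SQL bypasses geodatabase versioning and editor locks, so a claim column like this (or moving the processing into versioned edits / a geoprocessing service) would be worth weighing against how the feature class is registered.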