ArcGIS Pro is incredibly fast at drawing complex data. This can lead users to believe that their complex data is actually quite simple. As an example, I have a real dataset of around 200 polygon features. Some features are a few metres across and have 10 vertices; others are 50 km across and have 250,000 vertices.
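If you want to check whether your own data is like this, per-feature vertex counts are easy to tabulate with ArcPy. This is just a sketch, assuming an ArcGIS Pro Python environment; the geodatabase path and feature class name are placeholders.

```python
import arcpy

fc = r"C:\data\demo.gdb\complex_polygons"  # hypothetical path

counts = []
with arcpy.da.SearchCursor(fc, ["OID@", "SHAPE@"]) as cursor:
    for oid, shape in cursor:
        # pointCount is the total number of vertices in the geometry
        counts.append((oid, shape.pointCount))

# List the heaviest features first
counts.sort(key=lambda c: c[1], reverse=True)
print("Heaviest features (OID, vertex count):")
for oid, n in counts[:10]:
    print(oid, n)
```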
ArcGIS Pro renders this data very rapidly from a file geodatabase or from a Postgres geodatabase. In fact it's amazing. This gives me the confidence to go and share it as a feature service to ArcGIS Enterprise or ArcGIS Online. When I press Analyze, ArcGIS Pro has nothing to say about my data at all ("no errors or warnings found").
But the effect of publishing this as a feature service is somewhat disastrous. With all the improvements to feature services, both ArcGIS Enterprise and ArcGIS Online make a fair attempt at rendering it (feature tiles, PBF format, quantization, etc.). But each feature tile, if it succeeds in making it to my browser at all, takes 30 seconds or more to render. And this is true at most scales; scale thresholds don't help as much as you would think. The resource usage on ArcGIS Enterprise is enormous as it tries to generate the tiles.
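For anyone who wants to reproduce the slowness outside a web map, here is a rough sketch of the kind of feature-tile request a browser client issues against the REST query endpoint. The service URL is a placeholder, the layer would need to allow anonymous access for this to run as-is, and the tolerance/extent values are made-up stand-ins for what the map's current view would supply.

```python
import json
import requests

url = "https://services.example.com/arcgis/rest/services/MyLayer/FeatureServer/0/query"

params = {
    "f": "pbf",                 # compact protocol-buffer response
    "where": "1=1",
    "outFields": "*",
    "resultType": "tile",       # ask the server for a feature tile
    "quantizationParameters": json.dumps({
        "mode": "view",
        "originPosition": "upperLeft",
        # tolerance and extent would normally come from the current map view
        "tolerance": 38.2185,
        "extent": {"xmin": -1.30e7, "ymin": 4.0e6,
                   "xmax": -1.29e7, "ymax": 4.1e6,
                   "spatialReference": {"wkid": 102100}},
    }),
}

r = requests.get(url, params=params, timeout=120)
print(r.status_code, len(r.content), "bytes in", r.elapsed)
```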
So my idea is: why can't ArcGIS Pro's Analyze function provide more insight into how my data might behave as a map or feature service? I would argue this data is completely inappropriate for a feature service without some work to clean it, for example significant generalization. (The data is already generalized to 10 metres and still has up to 250,000 vertices on a single feature. I know this sounds implausible.)
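By way of illustration, one possible cleanup step before publishing would be something like the Simplify Polygon geoprocessing tool. This is only a sketch, not a recommendation for these exact values: the paths are placeholders, the 25-metre tolerance is an arbitrary example coarser than the existing 10-metre generalization, and tool availability depends on your license level.

```python
import arcpy

# Thin out vertices before publishing; POINT_REMOVE is the classic
# Douglas-Peucker-style vertex-removal algorithm.
arcpy.cartography.SimplifyPolygon(
    in_features=r"C:\data\demo.gdb\complex_polygons",
    out_feature_class=r"C:\data\demo.gdb\complex_polygons_simplified",
    algorithm="POINT_REMOVE",
    tolerance="25 Meters",          # hypothetical tolerance for this sketch
    error_option="RESOLVE_ERRORS",  # fix any topology errors simplification creates
)
```

Even a simple heuristic in Analyze, such as flagging features whose vertex count exceeds some threshold, would have warned me before I published.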