ArcGIS Pro "Analyze" when sharing layer should check the data better

09-10-2020 09:59 AM
Status: Open
AdamBranscomb1
Esri Contributor
ArcGIS Pro is incredibly fast at drawing complex data. This can lead users to believe that their complex data is actually quite simple. As an example I have a real data set which has around 200 polygon features. Some features are a few metres across and have 10 vertices. Some features are 50km across and have 250,000 vertices. 

ArcGIS Pro renders this data very rapidly from a File Geodatabase or from a Postgres Geodatabase. In fact it's amazing. This gives me confidence to go and share this as a feature service to ArcGIS Enterprise or ArcGIS Online. When I press Analyze, ArcGIS Pro has nothing to say about my data at all ("no errors or warnings found").

But the effect of publishing this as a feature service is somewhat disastrous. With all the improvements to feature services, both ArcGIS Enterprise and ArcGIS Online make a fair attempt at rendering it (feature tiles, PBF format, quantization, etc.). But each feature tile - if it succeeds in making it to my browser at all - takes 30 seconds or more to render. And this is true at most scales - scale thresholds don't help as much as you would think. The resource usage on ArcGIS Enterprise is enormous as it tries to generate the tiles.

So my idea is - why can't ArcGIS Pro's Analyze function provide more insight into how my data might behave as a map or feature service? I would argue this data is completely inappropriate to be used in a feature service without some work to clean it - for example significant generalization. (The data is already generalized to 10 metres and still has up to 250,000 vertices on a single feature. I know this sounds implausible.)
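To make the idea concrete, here is a minimal sketch (in arcpy/Python, and not anything Esri ships) of the kind of pre-publish check being suggested: walk the features and flag any whose vertex count is likely to cause feature service performance problems. The feature class path and the 10,000-vertex threshold below are placeholders, not Esri recommendations.

import arcpy

fc = r"C:\data\example.gdb\flood_zones"  # hypothetical feature class path
VERTEX_WARNING_THRESHOLD = 10000         # arbitrary placeholder threshold

with arcpy.da.SearchCursor(fc, ["OID@", "SHAPE@"]) as cursor:
    for oid, shape in cursor:
        if shape is None:
            continue
        # pointCount is the total number of vertices in the geometry
        if shape.pointCount > VERTEX_WARNING_THRESHOLD:
            print(f"WARNING: feature {oid} has {shape.pointCount} vertices; "
                  "consider generalizing before publishing as a feature service")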

2 Comments
RyanUthoff

I think this would be a very useful feature, but I can see how it might be hard to implement, or might give misleading results. Typically, it is up to the user who is publishing the data to be aware of feature service best practices. Of course there are several Esri documentation articles and videos regarding this.

The reason I think it might be hard to implement is that feature service behavior is dynamic: it can depend on the data. For example (assuming this is a traditional feature service and not hosted), you publish a feature service with only 5 records, but then add one million records in the EGDB. That has the potential to negatively affect performance, but the feature service has already been published (with no need to republish), so Analyze would not be useful in this case.
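To illustrate the point: for a referenced (non-hosted) service, the record count can drift well away from whatever Analyze saw at publish time, and the only reliable number is the one the live service reports. A rough sketch, using a hypothetical layer URL and the standard REST query endpoint with returnCountOnly:

import requests

# Placeholder URL - substitute your own feature service layer endpoint
layer_url = "https://example.com/server/rest/services/MyService/FeatureServer/0"

resp = requests.get(f"{layer_url}/query",
                    params={"where": "1=1", "returnCountOnly": "true", "f": "json"},
                    timeout=30)
count = resp.json().get("count")
print(f"Layer currently exposes {count} records")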

Also, the server environment is dynamic. Feature service performance can vary if the server is under high load or if many people are using the feature service you published, which the Analyze functionality wouldn't be able to replicate. In your case, your server is having issues drawing features with 250,000+ vertices, but it's possible that other servers may not have issues. Maybe it would be nice to have a "stress test" functionality within ArcGIS Pro that would test the feature service under different scenarios, but that would require the feature service to already be published. I could also see how the "stress test" functionality could be a little dangerous, especially in a production environment.
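As a sketch of what that "stress test" might look like from the client side (run only against a test or staging service, never production), something as crude as firing concurrent query requests and timing the responses would already expose the behaviour described above. The URL and request counts below are placeholders:

import time
from concurrent.futures import ThreadPoolExecutor
import requests

# Placeholder URL - point this at a TEST/STAGING layer only
layer_url = "https://example.com/server/rest/services/MyService/FeatureServer/0"

def timed_query(_):
    # Request a batch of features and measure how long the server takes
    start = time.perf_counter()
    r = requests.get(f"{layer_url}/query",
                     params={"where": "1=1", "outFields": "*",
                             "f": "pbf", "resultRecordCount": 1000},
                     timeout=120)
    return time.perf_counter() - start, r.status_code

# 50 requests with 10 in flight at a time - both numbers are arbitrary
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(timed_query, range(50)))

times = [t for t, _ in results]
print(f"50 requests, 10 concurrent: avg {sum(times)/len(times):.1f}s, max {max(times):.1f}s")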

At the end of the day, I think it would be great to have this feature in ArcGIS Pro. But it would still be up to the user to determine if it is truly a "problem" or not for them because as I said, every server environment is different.

ChrisHansenEUK

There is definitely a case for this.  Analyze warns me all the time about not having a feature layer template assigned, so it should be possible to give me a warning that my data is really detailed and may cause performance issues during and after publishing.  The checks don't have to be 100% accurate for every system or dataset after publishing, since it is only a warning.

Ideally, the same checks would be rolled up into Enterprise to provide warnings for users bringing shapefiles in through the front end or copying data up into the hosting data store.

If the checks were in Enterprise, then a permission could be added to choose whether users are allowed to bypass these warnings.  This would allow for different custom user roles to be set up in an org to differentiate a Creator (Data Curator) from a Creator (regular user who is creating and sharing content as intended).  In turn, this reduces the risk to the wider Enterprise by limiting the deployment of "likely candidates" for problems to a small number of actors who know what they are doing.

For example, I have a customer who uses frequently updated, detailed flood modelling data in a large org, and no amount of training and guidance will be able to stop the occasional user from trying to push that up into their corporate ArcGIS Enterprise (probably during a flood incident) and affecting the whole system.