The Avg Instances Used metric seems to be pretty low, rarely over 1, even though in Server Manager I will often see the instances in use jump up to 8 instances or more. The instance saturation is well over 50%.
I am assuming the actual time it is in use is very small, hence the small average.
So what is a good way of judging what these numbers actually mean?
I would think a number range that corresponded more closely to the Instances in Use data shown in Server Manager would be more useful for tuning and for setting the Min and Max Instances on a service. Is there a different metric for that? (Max Instances in Use??)
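Here's a quick sketch of what I think is happening (this is just my assumption — I don't know the actual sampling interval or formula the chart uses, so the numbers below are made up for illustration): if instances are only busy for brief bursts, a simple mean over the reporting window stays near 1 even when the peak hits 8 or 9.

```python
# Hypothetical per-minute samples of "instances in use" for one service.
# Mostly idle, with one short burst of heavy use.
samples = [0, 0, 8, 9, 1, 0, 0, 0, 0, 0]

# A simple mean over the window — roughly what an "average instances" chart would show.
avg_in_use = sum(samples) / len(samples)

# The peak — closer to what you see watching Server Manager live.
peak_in_use = max(samples)

print(avg_in_use)   # 1.8
print(peak_in_use)  # 9
```

So an average of 1.8 and a live peak of 9 can describe the same service, which is why the average alone feels misleading for sizing Min/Max Instances.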
Configure service instance settings—ArcGIS Server | Documentation for ArcGIS Enterprise
"This chart is another good way to identify usage trends on a per-service basis and can help answer questions such as “What ArcGIS Server services might be good candidates to move to the shared instances pool?” If a service uses few instances on average, you may want to move it to the shared pool. If a service uses many instances on average, it should probably have a set of dedicated instances to process incoming requests."
Appreciate any guidance you can give.