Can someone please let me know how many services can be published into a Shared Instance Pool? I know there is a general Esri recommendation that we should limit the maximum number of instances per machine to n + 1, where n is the number of server cores.
Now my questions are:
1- How many services can be published in each shared pool? For example, with the following setup of 4 shared instances, how many services can I have across all 4 instances, and in each of them?
2- Considering the following settings (4 shared instances), how many ArcSOC processes will run on the server?
3- Considering the following settings (4 shared instances), how many CPU cores will the server need to keep the processes under control?
BHK,
It's always a guessing game. The best you can do is examine your map services and the apps you will publish and estimate what you will need. Then, after your services are published, measure their performance with the free System Log Parser (SLP). SLP will give you stats on your services and can guide you in making changes to improve performance.
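If you also want to pull raw numbers yourself alongside SLP, the ArcGIS Server Admin API has a logs/query endpoint you can script against. Below is a minimal Python sketch; the server URL, credentials, and the service name in the filter are placeholders you would replace with your own, and exact parameter names can vary by version, so treat it as a starting point rather than a finished tool.

```python
import requests

# Placeholder values -- replace with your own site details.
SERVER = "https://gisserver.example.com:6443/arcgis"
USERNAME = "siteadmin"
PASSWORD = "changeme"

# Get an admin token; client=requestip ties the token to this machine's IP.
# verify=False is only for a self-signed certificate on a dev box.
token = requests.post(
    f"{SERVER}/admin/generateToken",
    data={"username": USERNAME, "password": PASSWORD,
          "client": "requestip", "f": "json"},
    verify=False,
).json()["token"]

# Query recent log records for one map service; adjust level and filter as needed.
logs = requests.post(
    f"{SERVER}/admin/logs/query",
    data={"level": "FINE",
          "filter": '{"services": ["SampleFolder/SampleService.MapServer"]}',
          "pageSize": 100, "token": token, "f": "json"},
    verify=False,
).json()

for record in logs.get("logMessages", []):
    print(record["time"], record["message"])
```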
From my experience, here are some tips:
Good luck,
Bernie.
1- How many services can be published in each shared pool? For example, with the following setup of 4 shared instances, how many services can I have across all 4 instances, and in each of them?
TH: The idea behind shared instances is that there is no limit. You can publish as many as you want. The main thing is how many of them remain cached, so that when a request comes in for a service, it can return the result immediately instead of going through all the initialization.
In the example below, you can have up to 50 map services cached per SOC process. If I happened to have 51 map services, then when a request comes in for the one service that is not cached in the shared instances, it will kick one service out of the cache to make room for the new one; the map service instance starts up and returns results to the client app.
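To make that eviction idea concrete, here is a small Python sketch of the least-recently-used behavior described above. It is only an illustration of the concept, not Esri's actual implementation, and the service names are made up.

```python
from collections import OrderedDict

class SharedInstanceCache:
    """Toy model of one shared-instance SOC process that keeps up to
    `cache_size` services warm and evicts the least recently used one."""

    def __init__(self, cache_size=50):
        self.cache_size = cache_size
        self.services = OrderedDict()  # service name -> loaded state

    def handle_request(self, service_name):
        if service_name in self.services:
            # Cache hit: respond immediately and mark as most recently used.
            self.services.move_to_end(service_name)
            return f"{service_name}: served from warm cache"
        # Cache miss: evict the oldest entry if full, then load the service.
        if len(self.services) >= self.cache_size:
            evicted, _ = self.services.popitem(last=False)
            print(f"evicted {evicted} to make room")
        self.services[service_name] = "loaded"
        return f"{service_name}: initialized, then served"

# 51 distinct services hitting a cache of 50 forces one eviction on the 51st request.
soc = SharedInstanceCache(cache_size=50)
for i in range(51):
    soc.handle_request(f"Map{i}")
```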
As per an ArcGIS Server doc:
The cache size setting controls how many services are cached by each instance in the shared instance pool. Unless you have a large amount of memory and a large number of services that are all regularly receiving requests, it's recommended that you keep the default value of 50 cached services per instance.
You may consider raising the cache size value if you have more than 50 services that are regularly receiving requests, you are experiencing performance problems, and you have available memory you wish to use for this purpose.
2- Considering the following settings (4 shared instances), how many ArcSOC processes will run on the server?
TH: There is one SOC process for each shared instance. Since you set '# of shared instances per machine' to 4, you will see 4 SOC processes. That also means that at any given time only 4 requests can be processed simultaneously; once a 5th request comes in, it waits until one of the 4 SOC processes is available to respond.
To view them, open the Task Manager app on your Windows machine >> switch to the Details tab >> if not done already, right-click on any column name, choose the Select columns command, check the Command line column, and click OK >> you will see 4 SOC processes with -Dservice=System.DynamicMappingHost.MapServer in their values under the Command line column.
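If you prefer to script that check instead of using Task Manager, here is a short Python sketch using the third-party psutil package. It assumes psutil is installed and that you run it on the server itself (reading other users' command lines may require running it as an administrator).

```python
import psutil

# List every ArcSOC process whose command line marks it as a shared-instance host.
for proc in psutil.process_iter(["name", "cmdline"]):
    name = (proc.info["name"] or "").lower()
    cmdline = " ".join(proc.info["cmdline"] or [])
    if name.startswith("arcsoc") and "DynamicMappingHost" in cmdline:
        print(proc.pid, name)
```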
3- Considering the following settings (4 shared instances), how many CPU cores will the server need to keep the processes under control?
TH: I can't answer this. I'd defer it to others who are experts in this area.
Here is an excerpt from an ArcGIS Server doc:
If most or all of your site's services use the shared instance pool, consider setting the number of service instances in the shared pool to twice the number of physical CPU cores on the individual machines in your ArcGIS Server site (for example, if you're using 4-core machines, consider setting the pool size to eight instances).
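As a quick illustration of that rule of thumb, a couple of lines of Python can compute the suggested pool size from the physical core count. This just mirrors the doc's "twice the physical cores" guidance, not a hard rule, and it uses the third-party psutil package to get physical (not logical) cores.

```python
import psutil

physical_cores = psutil.cpu_count(logical=False)  # physical cores only, not hyperthreads
suggested_pool_size = 2 * physical_cores          # doc's rule of thumb for the shared pool
print(f"{physical_cores} physical cores -> suggested shared pool size: {suggested_pool_size}")
```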
Also, I'm adding a link to this blog post, which a future reader might find helpful: https://www.esri.com/arcgis-blog/products/arcgis-enterprise/administration/shared-instances-arcgis-s...
I hope this helps.