We're looking to spin up lots of small enterprise geodatabases (on either SQL Server or PostgreSQL) to manage different projects across the business. Over time we may be looking at up to 100 of these, each of which would need to be registered as a data store. Is this to be discouraged? Currently our file server is a registered folder that already contains hundreds of FGDBs, but with those we're losing out on multi-user editing and on publishing feature services against registered data.
Does anyone know if there is a hard limit on the number of registered data stores, or whether there would be a performance penalty from having so many connections?
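For context, we'd expect to script the registration itself rather than click through Manager 100 times. Below is a minimal sketch against the ArcGIS Server Administrator API (generateToken and data/registerItem); the URL, credentials, and connection strings are placeholders, and the exact item payload should be checked against the registerItem documentation for your Enterprise version.

```python
import json
import requests

# Placeholders - substitute your own site URL and admin credentials.
ADMIN_URL = "https://gisserver.example.com:6443/arcgis/admin"
USERNAME = "siteadmin"
PASSWORD = "********"


def get_token():
    """Fetch a short-lived token from the ArcGIS Server Admin API."""
    resp = requests.post(
        f"{ADMIN_URL}/generateToken",
        data={"username": USERNAME, "password": PASSWORD,
              "client": "requestip", "f": "json"},
    )
    resp.raise_for_status()
    return resp.json()["token"]


def register_egdb(name, connection_string, token):
    """Register one enterprise geodatabase as a 'server only' data store item."""
    item = {
        "path": f"/enterpriseDatabases/{name}",
        "type": "egdb",
        "info": {
            "dataStoreConnectionType": "serverOnly",
            "isManaged": False,
            # Connection string taken from an .sde connection file / ArcGIS Pro.
            "connectionString": connection_string,
        },
    }
    resp = requests.post(
        f"{ADMIN_URL}/data/registerItem",
        data={"item": json.dumps(item), "f": "json", "token": token},
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    token = get_token()
    # One entry per project geodatabase; in practice read these from a config file.
    projects = {
        "projectA": "CONNECTION_STRING_FOR_PROJECT_A",
        "projectB": "CONNECTION_STRING_FOR_PROJECT_B",
    }
    for name, conn in projects.items():
        print(name, register_egdb(name, conn, token))
```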
Just anecdotally: we haven't encountered a limit on the number of registered data stores or enterprise geodatabases, but we do run multiple ArcGIS Server instances in our datacenter (Windows Server, SQL Server DBMS), and our rule of thumb is that a system starts to get much less stable once the service count exceeds 200. Core count doesn't actually seem to matter all that much; it's more a limitation of what Windows can handle.
I agree with Michael's comment above. You could have 100 SQL Server instances that are used intensively or only sporadically; the level of use drives CPU usage, so with lots of use, expect to need more cores.
Each data store is going to need at least one map service/map image layer published against it, and this will be the limiting factor. I typically work to 100 services per ArcGIS Server as an arbitrary rule of thumb, BUT you need RAM to support that number. Each SOC instance consumes roughly 100-200 MB, so multiply that by 100 services and you're going to need a fair amount of RAM if those services are published with dedicated instances.
You may want to use shared service instances instead of dedicated ones to give the server sufficient headroom to run the number of services you're suggesting (a rough comparison is sketched below), and be prepared to add more than one ArcGIS Server machine if there's a lot of usage.
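To put very rough numbers on it: the per-instance memory figure is the 100-200 MB quoted above, while the minimum-instance count and shared pool size are assumptions you'd tune for your own site (and a shared-pool process on the Pro runtime may be heavier than a classic SOC).

```python
# Back-of-the-envelope RAM estimate for 100 map services.
SERVICES = 100
MB_PER_INSTANCE_LOW, MB_PER_INSTANCE_HIGH = 100, 200

# Dedicated instances: every service keeps at least min_instances processes alive,
# even when idle. Assumed value; a minimum of 1 per service would halve these figures.
min_instances_per_service = 2
dedicated_low = SERVICES * min_instances_per_service * MB_PER_INSTANCE_LOW / 1024
dedicated_high = SERVICES * min_instances_per_service * MB_PER_INSTANCE_HIGH / 1024

# Shared instances: eligible services share one fixed pool of processes.
# Assumed pool size; size it to your peak concurrent requests.
shared_pool_size = 4
shared_low = shared_pool_size * MB_PER_INSTANCE_LOW / 1024
shared_high = shared_pool_size * MB_PER_INSTANCE_HIGH / 1024

print(f"Dedicated: ~{dedicated_low:.0f}-{dedicated_high:.0f} GB just for idle instances")
print(f"Shared:    ~{shared_low:.0f}-{shared_high:.1f} GB for the shared pool")
```

Under those assumptions the dedicated model needs on the order of 20-40 GB of RAM before anyone makes a request, whereas the shared pool stays under 1 GB; that difference is the headroom argument.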
Final thought: have you got the Enterprise portal, ArcGIS Server, and ArcGIS Data Store all running on one 4-CPU host? If so, that would be quite constrained.
That's a good point. In our example, an enterprise instance includes separate servers (VMs) for the DBMS, ArcGIS Server, and the Web Adaptor, so a 4-core server is only doing AGS duties.
I wouldn't recommend a single-box setup for anything but a light-duty test environment, or a single-purpose environment that only supports one application with a handful of services.
The usage patterns also matter quite a bit. A significant number of our services support mobile data collection, often in offline mode; in that case the services are used in short bursts a few times a day. Migrating services to the Pro runtime and shared instances has also let us increase service density. On the flip side, we've helped local governments configure their infrastructure and ended up with a much lower service density, because things like their parcel layers, snow plow status, etc. are being hammered constantly by a lot of visitors.
Really interesting points here. We have a four-machine solution, with Portal, ArcGIS Server, ArcGIS Data Store and SQL Server on separate machines, currently serving 400 services to roughly 400 users. So far we haven't found any issues with performance. Good to hear people's positivity around multiple EGDB setups; will look into this one further.