Is there a limit for data store?

09-03-2019 01:03 PM
MVP Frequent Contributor

What is the theoretical limit for the number of services the 10.6.1 data store can handle? What is the documented limit?

1 Solution

Accepted Solutions
Esri Contributor

Hi Thomas,

There is no specific documented limit on the number of services that can be published with their data copied into the relational data store. That said, ArcGIS Data Store is designed to let organizations publish thousands of services and store the underlying data for them. Hosted services have a smaller memory footprint than services published to a federated server that is not the hosting server, and therefore demand fewer hardware resources.

Please let me know if you have any questions.

-Jacob


6 Replies

MVP Frequent Contributor

Interesting. In a test environment, I'm seeing the data store fail at around 1000-ish services, hence the question. I'll take any further discussion to TS, thanks!

Occasional Contributor II

At a UC a few years ago I had someone at Esri tell me they had a Data Store hosting data for "tens of thousands" of services, but later someone else at Esri said that must have been a highly customized instance...

To Jacob's point, map services' ArcSOC processes eat up server memory (potentially improved at 10.7.x from what I've heard, but not yet directly observed) while hosted feature services don't. But the real problem, which has gone ignored, is that when publishing to ArcGIS Enterprise from ArcGIS Pro, the default target is the hosting server rather than another federated server (at least in my experience), and the default service type is a map service rather than the more scalable hosted feature service (again, in my experience). That leads the majority of ArcGIS-Enterprise-connected, Pro-publishing users to ultimately compromise the integrity of the hosting server.

Back to the original question: if you look up the specs on PostgreSQL (I'm not sure which version underpins ArcGIS Data Store), the stated limits are quite high...
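For anyone who wants to check how close their own portal is to the ~1000-service ballpark mentioned above, here's a minimal sketch of counting hosted services via the Portal sharing REST search endpoint. The query string and `typekeywords` value are my assumptions about how hosted layers are tagged; verify them against your portal before relying on the count.

```python
# Hedged sketch: counting hosted services through the Portal sharing REST
# search endpoint (GET <portal>/sharing/rest/search). Portal URL, token
# handling, and the exact search query are assumptions, not a specific
# deployment's values.
import json
from urllib.parse import urlencode

def search_params(query, start=1, num=100, token=None):
    """Build the query parameters for one page of a portal search."""
    params = {"q": query, "start": start, "num": num, "f": "json"}
    if token:
        params["token"] = token
    return params

def count_from_response(response_text):
    """Read the 'total' hit count out of a search response body."""
    return json.loads(response_text).get("total", 0)

# Example: a query string that (assumed) matches hosted feature layers.
qs = urlencode(
    search_params('type:"Feature Service" AND typekeywords:"Hosted Service"')
)
```

Paging through `start`/`num` and summing, or just reading `total` from the first page, gives a rough service inventory to compare against where the data store starts struggling.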

MVP Frequent Contributor

Funny you mention that Paul Hoeffler‌! We're determined not to allow "rogue publishing" to the hosting server, and are now testing a custom role with the "Publish server-based layers" privilege unchecked, making that the default role. In testing, with that role I am able to do everything a publisher would expect to do in PTL (publish hosted services), but I cannot register a data store, nor can I fire hose the hosting server with services.
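The "custom role minus server-based publishing" idea above can be sketched as a simple privilege-set operation. The `portal:publisher:*` identifier strings below, especially the server-based-layers one, are my assumptions about the portal's internal privilege names; confirm them in your portal's role editor before scripting role creation.

```python
# Hedged sketch: deriving a "hosted-only publisher" privilege set by removing
# server-based publishing from a baseline publisher role. The privilege
# identifiers are assumptions to verify against your portal, not confirmed
# API constants.
BASELINE_PUBLISHER = {
    "portal:publisher:publishFeatures",           # publish hosted feature layers
    "portal:publisher:publishTiles",              # publish hosted tile layers
    "portal:publisher:publishServerBasedLayers",  # assumed id for server-based publishing
}

def hosted_only(privileges):
    """Strip any privilege that allows publishing to a federated server."""
    return {p for p in privileges if not p.endswith("publishServerBasedLayers")}

HOSTED_ONLY_PUBLISHER = hosted_only(BASELINE_PUBLISHER)
```

The resulting set could then be fed to whatever role-creation mechanism you use (UI or scripted), matching the tested behavior: hosted publishing works, data store registration and server-side publishing don't.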

Occasional Contributor II

That's good information - let us know if your testing pans out or is complete, and we might move all of our ArcGIS Enterprise/Portal for ArcGIS users into a similarly-configured member role!

Occasional Contributor II

Since we've upgraded to 10.7.1, we've set the hosting server so that many map services published from Pro use shared instances by default - another approach to dramatically reduce the drain of map services on the hosting server, if one is not going to exclude them completely by restricting the privileges in a custom role. The number of instances dedicated to the shared pool is set in ArcGIS Server Manager under Site > Settings > Pooling.
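For reference, here's a minimal sketch of what flipping one service to the shared pool looks like at the service-definition level in the ArcGIS Server Admin API. My understanding is that at 10.7.x the `provider` property selects dedicated (`ArcObjects11`) versus shared (`DMaps`) instances, but treat those exact values as assumptions to confirm in your server's admin directory before scripting any edits.

```python
# Hedged sketch: switching a map service's definition from dedicated to
# shared instances by changing the "provider" property in its Admin API
# service JSON. Property names/values are assumed, not confirmed, constants.
def use_shared_instances(service_json):
    """Return a copy of a service definition switched to the shared pool."""
    svc = dict(service_json)
    svc["provider"] = "DMaps"  # shared-instance provider at 10.7+ (assumed value)
    return svc

dedicated = {"serviceName": "Parcels", "type": "MapServer",
             "provider": "ArcObjects11"}
shared = use_shared_instances(dedicated)
```

The edited JSON would then be POSTed back through the service's admin edit operation; the original dict is left untouched so you can diff before/after.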