Spatial Temporal Data Store - Resources

02-17-2020 10:32 PM
Tolo__AdamTolo
Occasional Contributor

Looking for users who have had experience implementing a spatial temporal data store into their environment. Specifically looking for the impacts on CPU/RAM and the disk space big data takes up.

Sizing

I was searching for information regarding how much disk space we should request for a Spatial Temporal Data Store.

I'm specifically looking to use location tracking to automatically capture where users have been, at (say) 1-minute intervals.

Let's say with the following variables:

  • a work day at 1-minute intervals is 60 points/hour x 8 hours = 480 points per user per day
  • 50 users
  • 52 weeks x 5 days = 260 days

That gives me around 6.2 million points in a year.
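
To sanity-check the arithmetic, here is a quick Python sketch of the same calculation (the interval, user count, and work-day figures are just the assumptions listed above):

```python
# Back-of-the-envelope point count for the scenario above.
POINTS_PER_MINUTE = 1          # one tracked location per minute
HOURS_PER_DAY = 8
USERS = 50
WORK_DAYS_PER_YEAR = 52 * 5    # 260 work days

points_per_user_per_day = POINTS_PER_MINUTE * 60 * HOURS_PER_DAY   # 480
points_per_year = points_per_user_per_day * USERS * WORK_DAYS_PER_YEAR
print(f"{points_per_year:,} points per year")   # 6,240,000
```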

Does anybody know approximately how much disk space this would take in a year?

We do have a point-based feature class in a file geodatabase with 18 million points and 30 attributes, and it is 5 GB in size. Would it be a reasonable estimate to say that the location tracking will take up approximately 2 GB a year?
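
Purely as a back-of-the-envelope, if the per-point footprint in the spatiotemporal big data store were similar to that file geodatabase (a big assumption, since the data store uses its own storage format and its indexing/replication settings will change the footprint), the extrapolation would look like this:

```python
# Rough disk-space extrapolation from the existing file geodatabase.
# Assumes the storage per point is comparable, which may not hold for the
# spatiotemporal big data store.
EXISTING_POINTS = 18_000_000
EXISTING_SIZE_GB = 5.0
TRACKED_POINTS_PER_YEAR = 6_240_000

gb_per_point = EXISTING_SIZE_GB / EXISTING_POINTS          # ~278 bytes/point
estimated_gb_per_year = gb_per_point * TRACKED_POINTS_PER_YEAR
print(f"~{estimated_gb_per_year:.1f} GB per year")          # ~1.7 GB
```

That lands in the same ballpark as the 2 GB/year guess, before accounting for indexes and replication.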

CPU & RAM

I have found that the minimum memory requirement is 16 GB to start.

ArcGIS Data Store 10.7.x system requirements—ArcGIS Enterprise system requirements | Documentation f... 

Let's say there are 50 users hitting the environment; that is an additional memory requirement.

I can't see any information on whether it has an impact on CPU, but I assume it would.

Has anybody had experience with the impact on CPU and memory?
