Spatio-temporal big data store uses far more storage than relational databases

08-10-2021 12:22 AM
WalterSimonazzi_VicPol
New Contributor III

We have three spatio-temporal big data store nodes in our ArcGIS Enterprise deployment.

We store a feed of device locations (around 16,000 devices); each device reports its position every 100 m of travel, or every 5 minutes when stationary.

The plan is to store the data in the spatio-temporal big data store (STDS) and use it for big data analysis in the future.

We started saving the data on the 1st of June this year, less than 2 months ago. As of today we have a table with 112 million records/points that takes around 44 GB of space on each node.

Is that expected? It seems to me that this is too much storage, and if that is the case we will need to abandon the plan of keeping historical location data, or set a limit.
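For a rough sanity check, here is a minimal sketch of the arithmetic, using only the figures quoted above and assuming growth stays roughly linear (treating "less than 2 months" as about 61 days, which is an assumption, not a measurement):

```python
# Back-of-envelope storage estimate from the figures in this post.
# Assumptions: ~61 elapsed days and roughly linear growth.

records = 112_000_000          # records accumulated so far
size_gb_per_node = 44          # observed size on each STDS node
days_elapsed = 61              # assumed: ~1 June to early August

bytes_per_record = size_gb_per_node * 1024**3 / records
records_per_year = records / days_elapsed * 365
gb_per_node_per_year = size_gb_per_node / days_elapsed * 365

print(f"~{bytes_per_record:.0f} bytes per record on disk")
print(f"~{records_per_year / 1e6:.0f} million records per year")
print(f"~{gb_per_node_per_year:.0f} GB per node per year")
```

That works out to roughly 420 bytes per record and on the order of 260 GB per node per year if the ingest rate does not change.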

1 Solution

Accepted Solutions
JakeSkinner
Esri Esteemed Contributor

@WalterSimonazzi_VicPol the Spatiotemporal Data Store will require a minimum of 13 GB of disk space for the installation alone.  The data will then be replicated to each node.  With 112 million points, this may be correct depending on the number of attributes.

How long do you plan to keep the historical records (e.g. 6 months, 1 year)?  There is a data retention option you can set for a Spatiotemporal Data Store.

JakeSkinner_0-1628594644839.png
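If you have a per-node disk budget in mind, a quick sketch like the one below (assuming the roughly linear growth observed so far) can help choose the retention period; the 200 GB budget is purely illustrative, not a recommendation:

```python
# Translate a per-node disk budget into an approximate retention window,
# assuming the ingest rate stays close to what has been observed so far.
# The 200 GB budget is a hypothetical example.

observed_gb = 44        # size per node so far (from the thread)
observed_days = 61      # assumed elapsed time (~2 months)
budget_gb = 200         # hypothetical per-node disk budget

gb_per_day = observed_gb / observed_days
retention_days = budget_gb / gb_per_day

print(f"~{gb_per_day:.2f} GB per node per day")
print(f"A {budget_gb} GB budget holds roughly {retention_days:.0f} days of history")
```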

 

