File/Record Limits/Restrictions for GeoJSON Files to GeoEvent/Big Data?

06-12-2018 12:21 PM
MarkMcCart
Occasional Contributor

I am trying to backload 1,000,000+ records into our Spatiotemporal Big Data Store in ArcGIS Enterprise 10.5.1 via the 'Directory Watch for GeoJSON file' input. However, I am only able to load ~500 records at a time with my .json file.
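For reference, each file is a standard GeoJSON FeatureCollection, and a quick check like the one below is how I'm confirming the record count per file (the path here is just illustrative, not my actual data):

```python
import json

# Illustrative path: count how many features one dropped file contains
with open("batch.json") as f:
    collection = json.load(f)

print(collection["type"], len(collection["features"]))
```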

See the attached screenshot for the errors I am getting.

Am I missing a setting on GeoEvent Server? Should I be using a different Input method? Or is GeoEvent not designed to 'bulk-load' data into the Spatiotemporal Big Data Store?

Thanks,

Mark

2 Replies
DanielCota1
Occasional Contributor

Hello Mark McCart,

I believe this would be expected for GeoEvent. Throughput is determined by many different factors (the size of the events/data, the machine resources, etc.). 500 sounds about right, but I would also be curious to know the nature of the data being written, along with the performance of the Spatiotemporal Big Data Store machine. The BDS is designed to handle large volumes of data, but not necessarily data arriving faster than the machine's resources allow it to ingest.

Either way, while we constantly strive to improve GeoEvent's performance, I think it will be a long time before writing millions of records takes the same time it currently takes to work with a few hundred or a few thousand. Again, there are a ton of moving parts here, but overall I would look into ensuring that the data is clean and the machine resources are sound.
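If this is a one-time backload rather than a live feed, one workaround worth trying is splitting the big file into smaller batches and dropping them into the watched folder at a controlled pace, so the data store never receives more than it can drain. A minimal sketch of that idea, assuming a standard FeatureCollection that fits in memory (the paths, batch size, and pause are placeholders to tune for your environment):

```python
import json
import time
from pathlib import Path

# Assumptions to adjust: source file, watched folder, batch size, pacing.
SOURCE = Path("all_records.json")         # the 1,000,000+ record FeatureCollection
WATCH_DIR = Path("/data/geoevent_watch")  # the folder your GeoEvent input is watching
BATCH_SIZE = 500                          # roughly what the input handled reliably
PAUSE_SECONDS = 5                         # give GeoEvent/BDS time to drain each batch

def main():
    collection = json.loads(SOURCE.read_text())
    features = collection["features"]

    # Write the features out as a series of small FeatureCollections,
    # pausing between files so ingestion is throttled, not flooded.
    for i in range(0, len(features), BATCH_SIZE):
        chunk = {
            "type": "FeatureCollection",
            "features": features[i : i + BATCH_SIZE],
        }
        out = WATCH_DIR / f"batch_{i // BATCH_SIZE:05d}.json"
        out.write_text(json.dumps(chunk))
        time.sleep(PAUSE_SECONDS)

if __name__ == "__main__":
    main()
```

The pause between batches is the key part: it trades total load time for an ingest rate the data store can actually sustain.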

-Daniel

JoshuaBixby
MVP Esteemed Contributor

"The BDS is designed to handle large volumes of data, but not necessarily data arriving faster than the machine's resources allow it to ingest."

Since 500 doesn't really seem like "big data," where is it documented what GeoEvent is supposed to handle? At some point there has to be some guidance for GeoEvent users; leaving it to "many different factors" seems a bit vague, if not disingenuous.
