Hi,
I have a project where I need to receive about 1 million requests from field devices and save them to a feature layer. What is the best way to implement this in GeoEvent to handle this number of requests? And where should the incoming points be stored? Is it best practice to store them in a feature class, or is there another way? Roughly 6 million features will be written to the feature class every hour.
Hi @YObaidat ,
I guess the spatiotemporal big data store might be a good solution for storing the features.
Spatiotemporal big data stores—GeoEvent Server | ArcGIS Enterprise documentation
A throughput of 16k or 100k events per second is a lot. I currently don't know what GeoEvent can handle. This may vary with the type of input connector and whether one or more nodes are handling the requests. Have you already tested the throughput?
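Before sizing the infrastructure, it helps to check the arithmetic and to dry-run the test harness itself. The sketch below is a minimal, hedged example: it computes the sustained rate implied by 6 million features per hour, then POSTs JSON batches to a stub HTTP receiver started in-process. The stub and the `/geoevent` path are stand-ins of my own, not a real GeoEvent endpoint; for a real test you would point the sender at your GeoEvent "Receive JSON on a REST Endpoint" input URL instead.

```python
# Back-of-the-envelope rate check plus a tiny local load-test dry run.
# The in-process stub below is a stand-in; replace `url` with your actual
# GeoEvent REST input endpoint to measure real throughput.
import json
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# 6 million features per hour works out to roughly 1,667 events per second.
FEATURES_PER_HOUR = 6_000_000
required_rate = FEATURES_PER_HOUR / 3600
print(f"required sustained rate: {required_rate:.0f} events/s")

received = 0

class StubReceiver(BaseHTTPRequestHandler):
    """Stand-in for a GeoEvent REST input: counts POSTed events."""
    def do_POST(self):
        global received
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received += len(json.loads(body))  # events arrive as a JSON array
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request console logging
        pass

server = HTTPServer(("127.0.0.1", 0), StubReceiver)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/geoevent"  # hypothetical path

# Send 1,000 fake device events in batches of 100 and measure the rate.
events = [{"deviceId": i, "x": 35.9, "y": 31.9} for i in range(1000)]
start = time.perf_counter()
for i in range(0, len(events), 100):
    batch = json.dumps(events[i:i + 100]).encode()
    urlopen(Request(url, data=batch,
                    headers={"Content-Type": "application/json"}))
elapsed = time.perf_counter() - start
server.shutdown()
print(f"received {received} events at ~{received / elapsed:.0f} events/s")
```

Batching matters here: 100 events per POST means 17 HTTP requests per second instead of 1,667, which is usually the difference between a feasible and an infeasible REST ingest.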
Best,
Stefan
Thanks Stefan for your reply. I still need to test the case. I have in mind that I will need solid infrastructure for this.
I send a lot of traffic to GeoEvent via REST, but nothing close to 100K per second. I would think you would be better off with a Velocity solution utilizing gRPC instead of REST for something like this --
See this post --
https://community.esri.com/t5/arcgis-velocity-blog/arcgis-velocity-extending-real-time-data-ingestio...
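Part of why gRPC helps at this volume is the wire format: protobuf-encoded messages are much smaller than the equivalent JSON, so the same bandwidth carries more events. The sketch below is only a rough local illustration of that size difference; the field layout (`deviceId`, `x`, `y`, `ts`) is made up for the example and is not Velocity's actual message schema.

```python
# Rough illustration of the wire-size gap between a JSON event (REST) and a
# packed binary encoding like the protobuf messages a gRPC feed carries.
# The field layout here is invented for the sketch, not Velocity's schema.
import json
import struct

event = {"deviceId": 123456, "x": 35.91234, "y": 31.94321, "ts": 1700000000}

json_bytes = json.dumps(event).encode()

# Pack the same values as fixed-width fields:
# <  little-endian, i = int32 id, d = float64 x, d = float64 y, q = int64 ts
binary_bytes = struct.pack("<iddq",
                           event["deviceId"], event["x"],
                           event["y"], event["ts"])

print(f"JSON: {len(json_bytes)} bytes, binary: {len(binary_bytes)} bytes")
```

At millions of events per hour, that per-event difference (plus HTTP header overhead on every REST request) adds up quickly, which is why the linked post steers high-volume ingest toward the gRPC feed.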
Velocity might be a solution if using ArcGIS Online.
Yes, and you may have to, to get that volume of data processed.