I have a situation where geoprocessing tools in ArcGIS Pro 2.9.3 aren't using all of the data from a spatiotemporal big data store feature service (GeoEvent Server & Enterprise 10.8.1).
While developing my process, I loaded 500,000 points into the BDS via a GeoEvent Server process. The BDS feature service is then accessible through our Enterprise portal.
In Pro I connect to the portal, add the BDS feature service to a map, view the points, and open the attribute table, which shows all 500,000 points.
However, if I then try to run geoprocessing tools that use the BDS feature service as an input, the tools will only process up to the "max record count" defined in the BDS feature service configuration, which is 10,000 by default.
The tools where I find this is the case are:
If I increase the "max record count" in the BDS feature service, then the tools will use more of the data but only up to that max value.
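For reference, the limit in play is the layer's maxRecordCount property, which is exposed in the service's REST endpoint JSON alongside the ability to ask for a total count via a returnCountOnly query. A minimal sketch of comparing the two, assuming the service JSON has already been fetched (the sample JSON below is illustrative, not my actual service):

```python
import json

def records_exceed_limit(service_json: str, total_count: int) -> bool:
    """Return True when the feature count exceeds the layer's
    maxRecordCount, i.e. a single query cannot return everything."""
    props = json.loads(service_json)
    max_records = props.get("maxRecordCount", 1000)
    return total_count > max_records

# Example: a layer reporting the 10,000 default against my 500,000 points
sample = '{"maxRecordCount": 10000}'
print(records_exceed_limit(sample, 500000))  # → True
```

This is essentially the check I would expect the geoprocessing tools to make before deciding whether one request is enough.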
Despite the attribute table showing the total number of features, none of the gp tool outputs gives the end user any indication that the tool has used only a smaller subset of the available data for its calculations rather than all of the features.
It is similar to these issues
I would have expected the ArcGIS Pro geoprocessing tools to query the feature service for the max record count and the total number of records, then loop through the required number of times to process all of the features for that operation.
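The looping I'd expect can be sketched as a simple pagination pattern. The ArcGIS REST API supports this via the resultOffset and resultRecordCount query parameters (when the layer reports supportsPagination); in this sketch, query_page stands in for a wrapped /query request, and a plain list mocks the service so the loop itself is testable:

```python
def fetch_all_features(query_page, page_size):
    """Page through a feature service query using resultOffset /
    resultRecordCount semantics. `query_page(offset, count)` is any
    callable returning one page of features as a list; a short page
    signals the end of the data."""
    features = []
    offset = 0
    while True:
        page = query_page(offset, page_size)
        features.extend(page)
        if len(page) < page_size:
            break
        offset += page_size
    return features

# Mock backend standing in for a real /query request: 25 features, pages of 10
fake_store = list(range(25))
page_fn = lambda offset, count: fake_store[offset:offset + count]
print(len(fetch_all_features(page_fn, 10)))  # → 25
```

Whether the spatiotemporal BDS layer actually advertises supportsPagination at 10.8.1 is part of what I'm trying to confirm.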
The data is going to grow significantly over time (it's at about 5 million points now), and I would prefer to store it in the spatiotemporal BDS rather than our enterprise geodatabase.
Is there anything that I am missing when trying to use the BDS with ArcGIS Pro?