We are collecting AIS (Automatic Identification System) ship data to create heat maps in Portal for ArcGIS, and for the most part this has been successful. However, we are also quickly running into an issue of too many points. We have considered resetting the AIS feed to only collect points every 60 seconds (or even less frequently); right now it's collecting points more often than that. At roughly one month's worth of collected data we are acquiring between 3.5 and 5.5 million points. I can get away with publishing that to our Portal, but publishing more than one month's worth of data becomes very restrictive and challenging.
So rather than alter the AIS feed to collect data only every minute (or longer), which can't really be done because our AIS feed is used by other services, the thinking is to statistically filter the points we have already collected using ArcGIS Pro, and then publish the result to Portal.
What would be the best tool for this? If I want to reduce the number of points from, say, 4 million to under 250,000, and still maintain enough data integrity to produce meaningful heat maps, is there an optimal way of doing this? I'm guessing that this might be something akin to generalization, but with points.
Ideally, we would like to be able to create heat maps on demand using Portal for ArcGIS, and keeping our point total for the year under 4 million would be fine. But is generalizing points possible?
SOLVED (for now)
Using the SQL MOD function in an expression within the Select By Attributes geoprocessing tool (using ArcGIS Pro 2.4):
MOD(OBJECTID+2n-x, n) = 0
• n is the interval, i.e. keep every nth record
• x is the OBJECTID to start from (adding 2n, a multiple of n, does not change the result of the MOD, but keeps the argument positive for typical values of x)
With n = 10, this selects every 10th record in a feature class, which reduces the point total by about 90%. There is some loss of detail where ships are moving faster than in other areas (this is AIS data, which records ship locations over a period of time), but for the most part this works fairly well.
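As a sanity check, the selection logic can be sketched in plain Python (the `select_every_nth` helper and the sequential OBJECTIDs are hypothetical; in practice this runs as a SQL expression inside Select By Attributes):

```python
def select_every_nth(objectids, n, x):
    """Return the OBJECTIDs kept by MOD(OBJECTID + 2n - x, n) = 0."""
    return [oid for oid in objectids if (oid + 2 * n - x) % n == 0]

# 100 sequential OBJECTIDs, thin to every 10th starting at OBJECTID 3
oids = list(range(1, 101))
kept = select_every_nth(oids, n=10, x=3)
print(kept[:3])              # [3, 13, 23]
print(len(kept) / len(oids)) # 0.1 -> roughly a 90% reduction
```

Note this assumes OBJECTIDs are roughly sequential; if many records have been deleted, the gaps make the thinning uneven, though for heat-map purposes that is usually acceptable.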