I agree with Eric Ironside: the problem described isn't a good fit for GeoEvent Server because every event record is generally considered atomic. I can't compute a sum or average value across several events without developing a custom processor that collects event records over a period of time and periodically dumps a cache to compute a statistic.
Taking a closer look at Simon Jackson's initial ideas, we could import the catchment polygons as geofences and then ingest and process the river gauge point features to get the name of the catchment AOI each point falls within (using a GeoTagger). But that only sets us up for some sort of post-processing operation external to GeoEvent Server; we've only used GeoEvent Server to enrich a feature record set of river gauges with an attribute identifying a catchment. Assuming neither the location nor the geometry of the catchment polygons / river gauge points changes frequently, this becomes an infrequent batch operation, not a recurring real-time operation.
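To make the "external post-processing" step concrete, here is a minimal sketch of what that batch operation might look like once GeoTagger has stamped each gauge reading with a catchment name. The record shape and class names are illustrative assumptions, not anything GeoEvent Server emits directly:

```java
import java.util.*;

// Hypothetical shape of a GeoTagged river gauge reading: the catchment
// attribute is what the GeoTagger added; field names are illustrative.
record GaugeReading(String sensorId, String catchment, double flowRate) {}

class CatchmentAverager {
    // Group the enriched readings by catchment and average the flow rates --
    // the post-processing GeoEvent Server itself cannot do across events.
    static Map<String, Double> averageByCatchment(List<GaugeReading> readings) {
        Map<String, double[]> sums = new HashMap<>(); // catchment -> [sum, count]
        for (GaugeReading r : readings) {
            double[] acc = sums.computeIfAbsent(r.catchment(), k -> new double[2]);
            acc[0] += r.flowRate();
            acc[1] += 1;
        }
        Map<String, Double> averages = new HashMap<>();
        sums.forEach((catchment, acc) -> averages.put(catchment, acc[0] / acc[1]));
        return averages;
    }
}
```

The whole point of the sketch is that the grouping happens outside GeoEvent Server, on a batch of already-enriched records.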
Turning the problem inside out, we could import the river gauge points as geofences and then ingest and process the catchment polygons. A GeoTagger could be used to enrich each polygon with a comma-separated list of sensor identifiers. But again, we only get identifying names from GeoTagger – if we split the delimited list in order to use a Field Enricher to pull actual flow rates into each record, we're no better off than having ingested the point records in the first place ... we cannot rejoin the split delimited list to obtain metrics from several event records to compute an average.
I suppose we could use a Field Calculator to append each river gauge's value to the gauge's name and use a stream service to broadcast these so that a geofence synchronization rule could actively update a set of geofences whose name provides both a sensorID and an observedValue ... then, as part of a different GeoEvent Service, ingest and process the catchment polygons using a GeoTagger to collect all the "enhanced" geofence names into a comma delimited list. A regular expression, like Adam Repsher suggests, could then (maybe) pull apart the delimited list, extract each sensor's observed value from the "geofence name" and compute an average ...
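Assuming the Field Calculator produced "enhanced" geofence names in a sensorID:observedValue format, the regex step might look something like the sketch below. The name format and class are assumptions for illustration; nothing here is a GeoEvent Server API:

```java
import java.util.regex.*;

class GeofenceNameParser {
    // Assumed "enhanced" geofence name entry: sensorID:observedValue,
    // collected by the GeoTagger into one comma-delimited string.
    private static final Pattern ENTRY = Pattern.compile("([^:,]+):([0-9.]+)");

    // Pull each observed value out of the delimited list and average them.
    static double averageObservedValues(String taggedList) {
        Matcher m = ENTRY.matcher(taggedList);
        double sum = 0;
        int count = 0;
        while (m.find()) {
            sum += Double.parseDouble(m.group(2));
            count++;
        }
        if (count == 0)
            throw new IllegalArgumentException("no sensor entries in: " + taggedList);
        return sum / count;
    }
}
```

Even granting that the parsing works, the average would have to be computed somewhere downstream of the GeoTagger, which is exactly the contortion being described.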
But this is hardly the elegant solution Simon is looking for. It's forced, brutish, and fragile as we now have two independent GeoEvent Services polling catchments and river gauges, updating geofences, and presenting us with inherent race conditions.
If you really wanted to do this using GeoEvent Server, you would want to develop a custom processor that incorporated a timer. The processor would catch and hold a series of GeoTagged points as long as its timer had not expired. As the processor collected data, it could enter metrics from each river gauge into some sort of key/value structure in memory, using the catchment identifier as the key. The processor's timer would reset as new gauge event records were received and its in-memory data structure updated. If the timer ever expired ... say, 30 seconds after not seeing any new data ... it would compute averages for all the catchments it had key values for and write out updates to the catchment feature records.
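The in-memory structure at the heart of such a processor might be sketched as below. This is plain Java, not GeoEvent SDK code: a real processor would extend the SDK's processor base class and wire something like this into its event loop, and all names here are illustrative. Timestamps are passed in explicitly so the expiry logic is easy to see (and test) without a real clock:

```java
import java.util.*;

// Sketch of the custom processor's timed key/value cache: catchment ID -> readings.
class CatchmentCache {
    private final long timeoutMillis;
    private long lastUpdateMillis;
    private final Map<String, List<Double>> readings = new HashMap<>();

    CatchmentCache(long timeoutMillis) { this.timeoutMillis = timeoutMillis; }

    // Called for each GeoTagged gauge event record; resets the timer.
    void accept(String catchmentId, double flowRate, long nowMillis) {
        readings.computeIfAbsent(catchmentId, k -> new ArrayList<>()).add(flowRate);
        lastUpdateMillis = nowMillis;
    }

    // If no new data arrived within the timeout, compute per-catchment
    // averages, clear the cache, and hand the results back so the caller
    // can write feature record updates. Empty while the timer is live.
    Optional<Map<String, Double>> flushIfExpired(long nowMillis) {
        if (readings.isEmpty() || nowMillis - lastUpdateMillis < timeoutMillis)
            return Optional.empty();
        Map<String, Double> averages = new HashMap<>();
        readings.forEach((id, vals) -> averages.put(
            id, vals.stream().mapToDouble(Double::doubleValue).average().orElse(0)));
        readings.clear();
        return Optional.of(averages);
    }
}
```

With a 30-second timeout, the cache keeps absorbing gauge readings and only releases the computed averages once the stream goes quiet.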
The burden here is that you have to use the GeoEvent Server's Java SDK to create a custom processor. I don't see anything elegant you can do with GeoEvent Server out-of-the-box to solve this problem.
Hope this information is helpful –
RJ