Hi All,
Don't have much experience with GeoEvent... I am trying to update a single field in one hosted feature layer with a field from another hosted feature layer using GeoEvent. The Input type is 'Poll an ArcGIS Server for Features' and the Output type is 'Update a Feature'. Both the input and output counts go up when a record meets the filter on the input (Complete = 'Yes'), though the value is not written to my output feature. Both hosted features have a 'Job ID' field, and I have set this field as the TRACK_ID in both the Input and Output Definitions – not sure if this is correct. I have used the Field Mapper to map the 'Complete' field from the Source to the 'Complete' field of the Target. This is the only value I am after.
I'm sure it is something simple I am missing or have not set up correctly. Do I need to add a filter for Job_ID = Job_ID so the right Output feature gets updated? I have assumed the TRACK_ID manages this. I'm hoping someone can point me in the right direction or share an example. Thanks.
Jamie.
Hi @JamieLambert,
Do you have the JOB_ID field set as the Unique Feature Identifier Field for the Update a Feature output? GeoEvent will use this field to determine whether a new record should be created or an existing record updated.
Also, are the geometries the same between the two feature services? If they are not, this may be causing the event to fail.
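Under the hood, the Update a Feature output is doing something roughly equivalent to the sketch below against the target layer's REST endpoint: query by the Unique Feature Identifier value, then update the match (or add a new record if there is none). This is just an illustration, not GeoEvent's actual code, and the layer URL, token, and field names are placeholders:

```python
import json
import requests

# Placeholders -- substitute your own target layer URL, token, and identifier field.
LAYER_URL = "https://example.com/arcgis/rest/services/Jobs/FeatureServer/0"
TOKEN = "<token>"

def upsert(job_id, attributes):
    """Update the feature whose JOB_ID matches; otherwise add a new one."""
    # 1. Look for an existing feature using the Unique Feature Identifier Field.
    query = requests.post(f"{LAYER_URL}/query", data={
        "where": f"JOB_ID = '{job_id}'",
        "outFields": "OBJECTID",
        "returnGeometry": "false",
        "f": "json",
        "token": TOKEN,
    }).json()
    matches = query.get("features", [])

    if matches:
        # 2a. Existing record found: apply an update keyed on its OBJECTID.
        feature = {"attributes": {"OBJECTID": matches[0]["attributes"]["OBJECTID"], **attributes}}
        payload = {"updates": json.dumps([feature]), "f": "json", "token": TOKEN}
    else:
        # 2b. No match: a new record is added instead.
        feature = {"attributes": {"JOB_ID": job_id, **attributes}}
        payload = {"adds": json.dumps([feature]), "f": "json", "token": TOKEN}

    return requests.post(f"{LAYER_URL}/applyEdits", data=payload).json()
```

If the identifier field isn't set, or isn't present in the event, that first query has nothing to match on, and the update never finds its target.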
Thanks @JakeSkinner.
A couple of updates after a meeting with the business line yesterday: 'job_id' has been replaced by 'sequence_id', as 'job_id' is not always unique, and 'Complete' is now 'ReportSent' (Input) and 'report_sent' (Output). The overall GeoEvent (10.8.1) setup is still the same and still not writing the value to the existing output record.
The Input is filtered to only records where a report has been sent (i.e. complete) and is set to Incremental Updates, as once the report has been sent the record is no longer edited.
The Output Unique Feature Identifier is 'sequence_id'.
And 'sequence_id' is the TRACK_ID field for the Input Definition (I am assuming it is not required for the Output Definition, as the Output Unique Feature Identifier is already set).
Field Mapper in the Service maps the Input 'ReportSent' field to the Output 'report_sent' field.
The geometries are the same - both polygon.
I'm assuming that with 'sequence_id' set as the Unique Feature Identifier and as the TRACK_ID on the Input, the Input and Output records will be matched, and the Field Mapper will then populate the Output 'report_sent' field with the Input 'ReportSent' value.
I think I have captured everything in the image below:
The 'ReportSent' value is being mapped, though it seems the corresponding 'sequence_id' record in the Output hosted feature is not being matched / written to. The Output feature is editable. Do I need to add a Filter or some other process? Thanks.
Jamie.
In your Field Mapper, include the sequence_id fields. These will be needed so it knows which features to update.
The sequence_id is already in both hosted features. I thought the Field Mapper was to copy a field from the Input and write it to the Output?
Or is this how the Unique Feature Identifier setting works? You assign that value then use Field Mapper to tie the Input and Output records based on how that value is mapped?
Happy to give it a crack anyway!
Thanks.
In your field mapper, you're only sending the field to update. It does not know which record to update without sending the sequence_id.
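Conceptually, the event the Field Mapper hands to the output needs to look like the second sketch below, not the first (the values are made up for illustration):

```python
# Only the value to write -- the output has no way to resolve which target
# feature this belongs to, so nothing is updated:
event_without_id = {"report_sent": "Yes"}

# Value plus the Unique Feature Identifier -- the output can now look up the
# matching feature in the target layer and update it:
event_with_id = {"sequence_id": 1042, "report_sent": "Yes"}
```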
If I only have sequence_id and report_sent in the Input and Output Definitions, and map these two fields, nothing happens.
With Input and Output Definitions containing all fields, but only sequence_id and report_sent mapped, those two fields are updated in the Output hosted feature and all other fields are overwritten to null.
I don't understand why fields not mapped are being updated. Is it even possible to update one field and leave the rest untouched? Is the Field Enricher (Feature Service) processor required?
Thanks.
Try with the attached datasets. Publish each as editable feature services (i.e. Airports_Source, Airports_Target). In the below example, I published the services as hosted ArcGIS Server services. Here is how I set up GeoEvent:
1. Create a Poll an ArcGIS Server For Features input for the Airports_Source feature service
2. Create an Update a Feature output for the Airports_Target using the name field as the Unique Feature Identifier Field:
3. Create a new GeoEvent Definition with only the name and fcc fields:
4. Create a GeoEvent Service. Add a Field Mapper Processor between the input and output:
I did not have to specify any fields as the TRACK_ID in this example. Let me know if you are able to get this to work on your end.
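If you want to double-check outside of GeoEvent that the target was written, a quick query against the Airports_Target layer will show whether the fcc values came across. A sketch only; swap in your own layer URL and token:

```python
import requests

# Placeholder target layer URL and token.
TARGET_URL = "https://example.com/arcgis/rest/services/Airports_Target/FeatureServer/0"

resp = requests.get(f"{TARGET_URL}/query", params={
    "where": "1=1",
    "outFields": "name,fcc",
    "returnGeometry": "false",
    "f": "json",
    "token": "<token>",
}).json()

for feat in resp.get("features", []):
    print(feat["attributes"]["name"], "->", feat["attributes"]["fcc"])
```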
Hey Jamie --
... those two fields are updated in the hosted feature [but] all other fields are overwritten to null. I don't understand why fields not mapped are being updated. Is it even possible to update one field and leave the rest untouched? Is the Field Enricher (Feature Service) processor required?
I'm not sure the question above was ever answered, so as a follow-up:
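One way to picture the null behaviour: the Field Mapper builds a brand-new event against the target GeoEvent Definition, so any target field you don't map travels as null, and the Update a Feature output writes every attribute carried in that event. Sketching the two edits side by side (sequence_id, report_sent, and job_id come from this thread; report_date, the OBJECTID, and the values are placeholders, and this is not GeoEvent's exact payload):

```python
# Definition contains every target field, but only sequence_id and report_sent
# are mapped -- the rest travel as null and overwrite the stored values:
update_all_fields = {
    "attributes": {
        "OBJECTID": 17,           # resolved from sequence_id by the output
        "sequence_id": 1042,
        "report_sent": "Yes",
        "job_id": None,           # unmapped -> null -> overwrites the stored value
        "report_date": None,      # unmapped -> null -> overwrites the stored value
    }
}

# Definition trimmed to just the identifier and the field being changed --
# attributes left out of the edit are untouched on the target feature:
update_two_fields = {
    "attributes": {
        "OBJECTID": 17,
        "sequence_id": 1042,
        "report_sent": "Yes",
    }
}
```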
Hope some of this helps clarify what the processors you are configuring are actually doing.
-- RJ
I've marked "In your Field Mapper, include the sequence_id fields. These will be needed so it knows which features to update" as the solution as I was missing this part, as well as the sample provided thats helped me validate my configuration.
I think the rest of my setup was fine, though I think it was jammed up with my output definition that I was playing around with while I was trying to make it work as i wanted (missing the sequence_id in the Field Mapper obviously didn't help!).
Using the example dataset provided below (Sample.gdb.zip) and creating the service as outlined, then adding complexity one step at a time (input filter, update only, etc.) I was able to validate those steps successfully.
Once I was happy the setup was correct, I deleted everything from my original service (input, output, definitions, service) and rebuild from scratch, and no issues! A frustrating lesson, though a good result.
Thanks for all the help @JakeSkinner!