I’m looking for advice on setting up an automated workflow to keep a hosted feature layer up to date based on new site information.
My hosted feature layer is currently linked to Survey123 responses, where our government surveyors either capture new sites or verify the status of sites they recorded previously (still active, gone, etc.). Contracted external providers also send us site addresses that need to be verified by the survey team; currently these arrive over email as Excel files.
I am trying to develop a workflow that automatically pushes unique additions into the hosted feature layer, with an added field to distinguish the data source (external vs. internal). I am currently exploring having a shared Excel workbook stored in SharePoint read by an ArcGIS Notebook that is scheduled to pull in the data and update the layer at a set frequency. Is this the right approach? Can someone share guidance on how to set it up?
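For context, here is a minimal sketch of the dedupe-and-tag step I have in mind. The column names (`site_id`, `data_source`) are placeholders for whatever the schema ends up being; in the actual notebook the incoming frame would come from `pd.read_excel()` on the SharePoint workbook, and the new rows would be pushed with the ArcGIS API for Python's `FeatureLayer.edit_features(adds=...)`.

```python
import pandas as pd

def prepare_adds(incoming: pd.DataFrame, existing_ids: set, source: str) -> pd.DataFrame:
    """Keep only rows whose site ID is not already in the hosted layer,
    and tag each new row with its data source ('external' or 'internal').
    Column names here are placeholders, not a fixed schema."""
    new_rows = incoming[~incoming["site_id"].isin(existing_ids)].copy()
    new_rows["data_source"] = source
    return new_rows

# Toy example standing in for the SharePoint workbook and the layer's current IDs
incoming = pd.DataFrame({
    "site_id": ["A1", "A2", "A3"],
    "address": ["1 Main St", "2 Oak Ave", "3 Elm Rd"],
})
existing_ids = {"A1"}  # would come from querying the hosted feature layer

adds = prepare_adds(incoming, existing_ids, "external")
print(adds["site_id"].tolist())  # → ['A2', 'A3']
```

The idea is that the scheduled notebook only ever appends rows it hasn't seen before, so re-running it against the same workbook is safe.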
We’re not in a position to require external providers to use Survey123, and I’d like to keep their workflow Excel-based since that’s what they’re familiar with.
Thanks!
Varun
I'd look at Data Pipelines over a Notebook, especially if you have Azure Storage and are used to the likes of FME, ModelBuilder, or the Data Interoperability extension.
Despite being fairly basic at the moment, it's powerful enough that I have a 50-step process running daily to update a reporting layer and table.