I was wondering if there was a way to trigger a Python script after a hosted feature layer is edited. Right now, the Python script runs every 15 minutes on Task Scheduler. This works fine, but seems excessive as the script really only needs to run after the hosted feature layer is edited. TIA.
ArcGIS Notebook Server allows for this using feature service webhooks and Python in an ArcGIS Notebook.
https://developers.arcgis.com/rest/enterprise-administration/notebook/execute-notebook/
Unfortunately, the Execute Notebook API is only available in ArcGIS Enterprise, not ArcGIS Online, so we are relegated to scheduled runs in AGOL hosted notebooks.
The same limitation applies to Task Scheduler: it offers scheduling options, but having an external process invoke a local Python script seems problematic.
Do your edits require ArcPy, or can they be done using the ArcGIS API for Python? If it's the latter you could create a serverless function (AWS Lambda, Azure Functions, etc.) and put the editing script in it, then have a webhook call it. If I remember correctly the webhook should pass a token that the serverless function can use for authentication when calling edit_features.
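To make the serverless idea above concrete, here is a minimal sketch of an AWS Lambda handler that receives a feature service webhook POST. The payload shape (a list of event objects carrying a `changesUrl`) and the API Gateway proxy event format are assumptions; verify them against what your webhook actually delivers before relying on these field names.

```python
import json


def parse_webhook_events(body: str) -> list:
    """Parse the JSON body a feature-service webhook POSTs.

    Assumes the payload is a JSON array of event objects, each with a
    'changesUrl' property pointing at the edits -- check your actual
    webhook payload, as this field name is an assumption.
    """
    events = json.loads(body)
    return [e for e in events if isinstance(e, dict) and "changesUrl" in e]


def lambda_handler(event, context):
    # With an API Gateway proxy integration, the raw POST body
    # arrives in event["body"].
    changes = parse_webhook_events(event.get("body") or "[]")
    for c in changes:
        # This is where you would use the ArcGIS API for Python, e.g.:
        #   from arcgis.gis import GIS
        #   gis = GIS("https://www.arcgis.com", "user", "password")
        #   ... fetch c["changesUrl"] and apply edit_features() ...
        pass
    return {"statusCode": 200, "body": json.dumps({"received": len(changes)})}
```

The editing logic itself is left as a comment since it depends on your layer and credentials.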
If you aren't already doing this, you could minimize the amount of editing your notebook is doing by using timestamps and edit tracking.
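The edit-tracking idea above can be sketched as a time-filtered query: build a where clause against the editor-tracking date field so each run only touches recently edited records. `EditDate` is the usual editor-tracking field name on hosted feature layers, but verify the field name on your own layer.

```python
from datetime import datetime, timedelta, timezone


def edits_since_clause(minutes: int, field: str = "EditDate") -> str:
    """Where clause selecting only records edited in the last N minutes.

    Hosted feature layers accept TIMESTAMP 'YYYY-MM-DD HH:MI:SS'
    literals (UTC) in query where clauses.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=minutes)
    return f"{field} >= TIMESTAMP '{cutoff:%Y-%m-%d %H:%M:%S}'"


# Usage with the ArcGIS API for Python (not executed here):
# recent = layer.query(where=edits_since_clause(15))
```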
This piques my interest; we are an AWS shop. My scripts run on a schedule on an EC2 instance. Currently, the script is very ArcPy-based (call hosted feature layer > Make Feature Layer > Export Features to shapefile), but perhaps there is a way to circumvent ArcPy altogether.
Like the others have said, there isn't a way to automatically trigger a Python script whenever certain edit criteria are met. However, with the release of Power Automate for ArcGIS, there could be potential for some kind of automated trigger event to execute the script. I would need to dig deeper to find out if there is such a thing.
It would be a nice option alright, and like yourself I have some Notebooks scheduled to run every 15 minutes or every hour. As @MobiusSnake mentions, I have time filters and data checks so I'm only accessing those records that have been edited within the last 15 mins or an hour rather than the entire dataset.
@RPGIS mentions Power Automate, and you can also use Make to create and maintain webhooks. FME Flow (formerly FME Server) is also excellent for these webhooks, monitoring changes and triggering updates when an edit is made, although FME comes with a cost.
It would seem that webhooks via third-party software are the only option outside of scheduled Notebooks.
Also, could you please explain in further detail what exactly you are trying to accomplish? Often people ask about a specific workflow, but when they go into further detail they find there is another route that gets them closer to their desired result.
What I mean is, if you are trying to, let's say, update a table or several tables whenever an edit is made, there are workflows using the Arcade language that will update those tables automatically whenever an edit occurs. If you are trying to run a script to update a report of sorts or send some kind of notification, then the Python script or Power Automate would make the most sense.
Sure, I'll explain further. I have a user who updates a hosted feature layer periodically. She'll add a feature or edit an existing one. After she edits, she needs to export the hosted feature layer to shapefile format for use in her Tibco Spotfire project. It's a very simple script: call the hosted feature layer, use Make Feature Layer to turn off unnecessary fields, then Export Features to shapefile. There seems to be limited connectivity between AGOL and Spotfire.
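For what it's worth, that workflow can likely be done without ArcPy at all using the ArcGIS API for Python. A rough sketch, where the item ID, credentials, and field list are all placeholders for your own layer, and the `query(...).sdf` / `to_featureclass` calls should be verified against your installed `arcgis` version:

```python
def out_fields_param(keep_fields) -> str:
    """Comma-separated out_fields string for FeatureLayer.query()."""
    return ",".join(keep_fields)


def export_shapefile(item_id, keep_fields, out_path):
    """Export a hosted feature layer to shapefile without ArcPy (sketch).

    Assumes the ArcGIS API for Python ('arcgis') is installed; the
    credentials and item ID below are placeholders.
    """
    from arcgis.gis import GIS

    gis = GIS("https://www.arcgis.com", "user", "password")  # or GIS("home") in a notebook
    layer = gis.content.get(item_id).layers[0]
    # out_fields replaces the Make Feature Layer field-toggling step
    sdf = layer.query(out_fields=out_fields_param(keep_fields)).sdf
    # to_featureclass writes a shapefile when given a .shp path
    sdf.spatial.to_featureclass(out_path)
```

The network-dependent part is not run here; the point is that limiting fields happens in the query itself rather than in a separate Make Feature Layer step.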
Does this Spotfire project have the ability to read APIs, or is it strictly upload-a-dataset? Most people don't realize that feature services are technically REST APIs, which can be shared with other applications and, if those applications can read them natively, would basically work in that manner.
Another thing: have you thought about creating a data view with export capabilities, so that she can download it whenever she needs to rather than running something constantly? If you enable the feature service to allow others to export, then create a view of the data, she can export it whenever needed. The view can also be set up with limited sharing.
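The enable-export-then-create-a-view steps above can be scripted with the ArcGIS API for Python. A sketch, assuming a connected `gis` object and a placeholder item ID; the "Extract" capability is what permits others to export the data:

```python
def add_capability(current: str, cap: str) -> str:
    """Append a capability to a service's comma-separated capability list."""
    caps = [c.strip() for c in current.split(",") if c.strip()]
    if cap not in caps:
        caps.append(cap)
    return ",".join(caps)


def enable_export_and_make_view(gis, item_id, view_name):
    """Enable export on a hosted feature service and create a view (sketch).

    Assumes the ArcGIS API for Python ('arcgis'); item_id and view_name
    are placeholders for your own content.
    """
    from arcgis.features import FeatureLayerCollection

    item = gis.content.get(item_id)
    flc = FeatureLayerCollection.fromitem(item)
    # 'Extract' is the capability that allows others to export the data
    caps = add_capability(flc.properties.capabilities, "Extract")
    flc.manager.update_definition({"capabilities": caps})
    # The view is its own item, so it can be shared more narrowly
    return flc.manager.create_view(name=view_name)
```

Because the view is a separate item, its sharing can be limited to just the one user who needs the export.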