Automatically Update Field when New Row is added?

06-17-2021 02:59 PM
MikeMacRae
Occasional Contributor III

Hey everyone, I am just wondering if there is a way to update a field in a hosted feature layer automatically when a new row is added.

For example, say I have a table in my hosted feature layer that contains 5 fields. To keep it simple, let's say one of them is a string field populated by concatenating 2 other fields. Currently, I have to manually run a field calculation to update that field each time a new row is added. I was hoping there was a way to store the field calculator formula in the field so that every time a new row is added, the field updates automatically, or possibly any other method that would automatically update a field. Is this doable?

Thanks

Accepted Solution: see @jcarlson's reply below.
6 Replies
DavidPike
MVP Frequent Contributor

Is it in a Web Map? You could just add an Arcade attribute expression.

BrittanyBurson
Occasional Contributor III

Hi @DavidPike -- from our use case's perspective, Arcade attribute expressions are great, but our users always need to be able to export the data to .csv or similar. Since map attribute expressions only calculate on the fly and only live within a Web Map/App, having the values stored properly in the data is ideal. Scheduled updates as recommended below get us close, but having attributes update/populate at the same time the end user edits the data would be the best-case scenario.

jcarlson
MVP Esteemed Contributor

As @DavidPike suggests, you can have it appear as an expression in the popup or the table in the "Classic" map. Was there a reason you needed to have a derived value as its own column?

Regardless, there is a way to automatically update it if you're comfortable with a bit of Python.

 

Simple Version: Frequent Intervals

So, you could just write a script like this in an AGOL Notebook:

 

from arcgis.gis import GIS

# Connect to the portal and get your layer's item
gis = GIS('your-org-url', 'user', 'password')
lyr = gis.content.get('hosted-layer-itemID')

# Calculate the field across all rows
lyr.layers[0].calculate(
    where = '1=1',
    calc_expression = {
        'field': 'concat-fieldname',
        'value': 'CONCAT(firstfield, secondfield)'
    }
)
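
Side note: if you run this inside a hosted ArcGIS Notebook, you can usually connect as the signed-in user instead of hard-coding credentials (assuming the notebook runs under an account that can edit the layer):

gis = GIS('home')  # connect as the user running the notebook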

 

Then use the Tasks tab in the notebook to schedule it to run often enough to keep things up to date. Scheduled runs do cost you credits, but such a short, simple script will likely cost very little, even running it frequently.

 

Fancy Version: Injected Parameters and Webhooks

This can get quite a bit more complex (perhaps more trouble than it's worth), but will only trigger when, say, a new feature is created. Getting this to work does require your layer to be set up a certain way, so maybe this option is out of reach. But it's worth mentioning.

First, set up a notebook like above, but include a where clause in your field calculation. Set the variable where to '1=1' earlier in the notebook, then use injected parameters to allow this to be overridden.

Create a task to run at regular intervals, but leave it disabled for now.

 

where = '1=1'

lyr.layers[0].calculate(
    where = where,
    calc_expression = {
        'field': 'concat-fieldname',
        'value': 'CONCAT(firstfield, secondfield)'
    }
)

 

 

Next, create a webhook on the hosted feature service. Define it to trigger whenever you feel it's necessary, maybe on adds and updates of features.

Now, here's the tricky part. You don't really have control over the contents of your webhook's payload, so you'll need some kind of intermediary here. Many of the Esri examples use Integromat; maybe you have Microsoft Power Automate (formerly Flow). You could also build a totally custom program to do this, if you know what you're doing.
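
Just to sketch what that middle layer could look like, here's a rough, untested outline in Python (Flask is my own choice here, not from any Esri example). The payload keys are placeholders; inspect your webhook's actual payload, or follow its changes URL, to find the edited OBJECTIDs.

# Minimal sketch of a webhook intermediary; assumes Flask is installed.
# The 'objectIds' key below is a placeholder, since the real payload
# structure varies and may only point you to a changes URL.
from flask import Flask, request

app = Flask(__name__)

@app.route('/fs-webhook', methods=['POST'])
def catch_webhook():
    payload = request.get_json(force=True, silent=True) or {}

    # Placeholder parsing: pull the edited OBJECTIDs however your
    # payload (or the service's changes endpoint) exposes them
    oids = payload.get('objectIds', [])

    for oid in oids:
        # For each edit, schedule a one-off notebook run
        # (see the tasks/update POST further down)
        print(f'Would schedule a notebook run for objectid={oid}')

    return '', 200

if __name__ == '__main__':
    app.run(port=5000)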

However you do it, have the intermediary catch the payload and get the objectID of the created feature. Then you can submit a POST request to https://kendall.maps.arcgis.com/sharing/rest/community/users/<your-username>/tasks/<your-task-ID>/update to schedule a run of your notebook.

The payload for your POST request will look something like this:

 

{
  "title": "Single Run",
  "itemId": "your notebook itemID",
  "startDate": "UNIX timestamp of current time",
  "type": "ExecuteNotebook",
  "parameters": "{\"updatePortalItem\":true,\"saveInjectedParameters\":false,\"notebookParameters\":\"{\\\"where\\\":\\\"objectid=THEOBJECTID\\\"}\"}",
  "minute": "*",
  "hour": "*",
  "dayOfMonth": "*",
  "month": "*",
  "dayOfWeek": "?",
  "maxOccurrences": "1",
  "f": "json"
}

 

You'll probably need to figure out the token portion, too.
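
If it helps, here's a rough, untested sketch of that token-plus-POST step in Python with the requests library, reusing the placeholder IDs from the payload above (generateToken is one common way to get a token; swap in your own org URL and credential handling):

import json
import time
import requests

# Get a token for the REST calls (adjust to however you manage credentials)
token = requests.post(
    'https://www.arcgis.com/sharing/rest/generateToken',
    data={
        'username': 'your-username',
        'password': 'your-password',
        'referer': 'https://www.arcgis.com',
        'f': 'json'
    }
).json()['token']

oid = 123  # OBJECTID pulled from the webhook payload

# Same structure as the payload above; json.dumps handles the nested quoting
parameters = json.dumps({
    'updatePortalItem': True,
    'saveInjectedParameters': False,
    'notebookParameters': json.dumps({'where': f'objectid={oid}'})
})

requests.post(
    'https://kendall.maps.arcgis.com/sharing/rest/community/users/'
    'your-username/tasks/your-task-ID/update',
    data={
        'title': 'Single Run',
        'itemId': 'your notebook itemID',
        'startDate': int(time.time() * 1000),  # current time; REST dates are usually epoch milliseconds
        'type': 'ExecuteNotebook',
        'parameters': parameters,
        'minute': '*',
        'hour': '*',
        'dayOfMonth': '*',
        'month': '*',
        'dayOfWeek': '?',
        'maxOccurrences': 1,
        'token': token,
        'f': 'json'
    }
)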

Worth noting: you can also manage scheduled notebook tasks through the ArcGIS API for Python, if that's easier for you than raw REST requests.

Anyway, that should set your task to run again at the current time, just once. By piping in the created/updated objectid from the webhook, the task's "where" parameter will filter your calculation to just those features that were added or updated.

 

I've gone on long enough. Talk about using a cannon to swat a mosquito! The simple option is probably all you'd want.

- Josh Carlson
Kendall County GIS
DavidPike
MVP Frequent Contributor

Bookmarked  🙂

BrittanyBurson
Occasional Contributor III

Same! Thank you @jcarlson 

MikeMacRae
Occasional Contributor III

Thanks @jcarlson for the effort you put into your response! Much appreciated. We are exploring the idea of a scheduled script; however, I was hoping there was a more automatic way. We tested using Attribute Rules on the table when building it in Pro, but they do not carry over when you publish the geodatabase as a hosted feature layer. Thanks to @BrittanyBurson for responding to my other question about this here. This may be functionality coming to ArcGIS Online, which would be great!