Power Automate Webhooks - Impacts on feature layer in another AGOL account.

02-15-2024 07:03 AM
ArturOliveiraSecto
New Contributor

When using webhooks on feature layers hosted in another organization's account, we understand there is no credit usage.

Regarding bandwidth, is there a significant impact? How can we calculate it? We want the webhook enabled on the other organization's feature layer to capture changes to only one attribute.

Let's say that in one feature layer shared by the other organization for us to edit, we have 500 updates to that attribute, distributed among 5 sublayers. Multiply this by 5, since we work in 5 feature layers simultaneously. That means 2,500 changes captured during the day. Note: we are overestimating these metrics to play it safe.
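As a rough way to frame the bandwidth question, here is a back-of-envelope sketch. The per-change payload size is a placeholder assumption, not a measured value; actual size depends on the layer schema and the webhook payload format:

```python
# Back-of-envelope bandwidth estimate for webhook traffic.
# All sizes are assumptions for illustration; measure real payloads to refine.

updates_per_layer = 500      # attribute updates per feature layer per day
layers = 5                   # feature layers edited simultaneously
changes_per_day = updates_per_layer * layers  # 2,500 changes/day

# Assumed contribution of each change to a webhook payload (~1 KB);
# replace with a measured figure from your own payloads.
bytes_per_change = 1024
daily_bytes = changes_per_day * bytes_per_change

print(changes_per_day)                    # 2500
print(round(daily_bytes / 1024 / 1024, 2))  # ~2.44 MB/day under these assumptions
```

Even at the overestimated volume, the traffic under these assumptions is small compared to typical feature service usage.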


Accepted Solutions
KevinHibma
Esri Regular Contributor

This is a difficult question to answer, but simply, you shouldn't see any impact on the feature layer because of the webhook traffic. After the edit is performed, the information for the webhook is handled in a separate process from the feature layer. The feature layer doesn't become locked or wait for the webhook to finish.

Is it possible this many edits could cause the webhook process to degrade? Maybe? Probably not? Again, a very difficult question to answer without actually benchmarking it (I do not have metrics like these). You can't control the ArcGIS Online processes that do this. What you do have control over is where the webhook is being sent to, which is either a commercial vendor or your own custom service -- does this process scale and handle a lot of webhooks? You might want to try to reduce the number of webhooks being dispatched, which would reduce load everywhere. Take a look at the scheduleInfo parameter when you create your webhook. By increasing the interval between dispatches, more edits will be grouped into a single webhook. However, as-is, you shouldn't really expect more than 2 webhooks dispatched every minute (based on the 30 second default). That means you could have 3 edits or 10,000 edits in a single 30 second time period, it'll only ever be 1 webhook.

{
  "name": "Every-30seconds",
  "startAt": 1478280677536,
  "state": "enabled",
  "recurrenceInfo": {
    "frequency": "second",
    "interval": 30
  }
}
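To make the batching point concrete, here is a small sketch. The 30-second window is the default scheduleInfo interval described above; the function name is illustrative:

```python
# With scheduled webhooks, edits within one interval are grouped into a
# single dispatch, so the dispatch count is capped by the schedule, not
# by the edit volume.

def max_dispatches(interval_seconds: int, period_seconds: int) -> int:
    """Upper bound on webhook dispatches in a period for a given interval."""
    return period_seconds // interval_seconds

# Default 30-second interval: at most 2 webhooks per minute...
print(max_dispatches(30, 60))            # 2
# ...and at most 2,880 per day, whether there are 3 edits or 10,000.
print(max_dispatches(30, 24 * 60 * 60))  # 2880
```

So even the overestimated 2,500 daily changes can never produce more dispatches than the schedule allows.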

 


2 Replies

ArturOliveiraSecto
New Contributor

Thank you Kevin. Much appreciated. 
