Hello, I have an Arcade script for a Dashboards project that's meant to return any points that fall within a fire zone or a weather event (both polygon layers), along with information about each point and the areas it falls within. The script works with small sets of points but crashes if there are more than about 100 points, which is a problem because the feature set I'm using has about 150k points.
var portal = Portal("https://www.arcgis.com");
var fire_poly = FeatureSetByPortalItem(
    portal,
    'd957997ccee7408287a963600a77f61f',
    1,
    ['IncidentName'],
    true
);
var weather_poly = FeatureSetByPortalItem(
    portal,
    'a6134ae01aad44c499d12feec782b386',
    6,
    ['Event', 'Severity'],
    true
);
var pt_fs = FeatureSetByPortalItem(
    portal,
    'item_id_removed_for_security',
    0,
    ['SAP_EQUIP_', 'SAP_STRUCT', 'TLINE_NM', 'SAP_FUNC_L', 'GIS_LAT', 'GIS_LONG'],
    true
);
var features = [];
var feat;
for (var pnt in pt_fs) {
    var in_fire = [];
    var in_event = [];
    if (Count(Within(pnt, fire_poly)) > 0 || Count(Within(pnt, weather_poly)) > 0) {
        for (var poly in fire_poly) {
            if (Within(pnt, poly)) {
                Push(in_fire, poly['IncidentName']);
            }
        }
        for (var poly in weather_poly) {
            if (Within(pnt, poly)) {
                Push(in_event, poly['Severity'] + " " + poly['Event']);
            }
        }
        feat = {
            'attributes': {
                'SAP_EQUIP': pnt['SAP_EQUIP_'],
                'STRUCT_NO': pnt['SAP_STRUCT'],
                'TLINE_NM': pnt['TLINE_NM'],
                'FUNC_LOC': pnt['SAP_FUNC_L'],
                'GIS_LAT': pnt['GIS_LAT'],
                'GIS_LONG': pnt['GIS_LONG'],
                'fire': IIf(IsEmpty(First(in_fire)), "N/A", Concatenate(in_fire, ", ")),
                'weather_event': IIf(IsEmpty(First(in_event)), "N/A", Concatenate(in_event, ", "))
            }
        };
        Push(features, feat);
    }
}
var out_dict = {
    'fields': [
        {'name': 'SAP_EQUIP', 'alias': 'SAP Equipment ID', 'type': 'esriFieldTypeString'},
        {'name': 'STRUCT_NO', 'alias': 'Structure Number', 'type': 'esriFieldTypeString'},
        {'name': 'TLINE_NM', 'alias': 'Transmission Line', 'type': 'esriFieldTypeString'},
        {'name': 'FUNC_LOC', 'alias': 'SAP Functional Location', 'type': 'esriFieldTypeString'},
        {'name': 'GIS_LAT', 'alias': 'Latitude', 'type': 'esriFieldTypeDouble'},
        {'name': 'GIS_LONG', 'alias': 'Longitude', 'type': 'esriFieldTypeDouble'},
        {'name': 'fire', 'alias': 'Name of Fire', 'type': 'esriFieldTypeString'},
        {'name': 'weather_event', 'alias': 'Weather Event', 'type': 'esriFieldTypeString'}
    ],
    'geometryType': '',
    'features': features
};
return FeatureSet(out_dict);
From testing it, along with help from @KenBuja in my previous post Execution Error in Arcade Script, it seems to usually crash at the conditional on line 46. I've been trying to find a way to create a subset of the points that fall within the two polygon layers before starting the main loop, eliminating the need for the conditional and letting the loop run over a smaller set of points, by using Union and Within, but it seems that Union only works with individual features and not with FeatureSets.
Is there some way of making this script more efficient that I'm overlooking? Is it possible to return an array of features from a FeatureSet to make my idea with Union work?
Sorry for the long post, but it's an interesting problem!
EDIT: My post kept getting flagged for "invalid HTML" no matter what I did, so I had to post a plain-text version. I'm trying to add the formatting back in.
The issue is probably related to how your for loops are structured. If you open your browser's developer tools and watch what's going on when an expression like this actually evaluates, you'll see that functions like Within submit a call to the REST endpoint of the service.
So, picking this apart:
For 150k points, with those other layers having 286 and 4,063 features in them at the time of writing, every hit on the initial point check kicks off roughly 4,300 spatial overlays (286 + 4,063 = 4,349). If every one of your points intersected, that's 150,000 × 4,349, well over 600 million operations. If I were your browser, I'd crash too!
Now, thankfully, those nested Within functions in the fire / weather loops are feature-to-feature, so there shouldn't be additional server calls made, otherwise we'd be talking hundreds of millions of server requests. But doing a spatial operation is still going to take some time, and any time your point has a hit in either of those forecast layers, it will result in a separate operation for every polygon in both services.
Here's a tip. When you make a spatial call against a FeatureSet, that happens on the server. Your intersecting point has to be included, but the other layer's geometry is actually not needed! So when you create the FeatureSets for your weather and fire polygons, go ahead and set those to false for returning the geometry. That will cut back considerably on the amount of data going back and forth.
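For example, the fire FeatureSet could be created like this (same portal item and field as above; the last argument to FeatureSetByPortalItem is the same geometry flag your original calls pass as true):
// Attributes only; the polygon geometry stays on the server, but spatial
// functions like Intersects still get evaluated server-side
var fire_poly = FeatureSetByPortalItem(
    portal,
    'd957997ccee7408287a963600a77f61f',
    1,
    ['IncidentName'],
    false  // don't pull the geometry down to the browser
);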
Also, when we include a filter based on the Count of the features, it modifies the request to return the count only. And I'm using "intersects", not "within". For point/polygon overlays, they're the same thing, unless the point is exactly on the boundary of the polygon, which would be pretty rare for unrelated layers.
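As a quick sketch of what that count-only check looks like on its own (pnt being the point variable in the loop below):
// Wrapping the intersection in Count() asks the service for a count only,
// rather than pulling the matching polygons back to the browser
if (Count(Intersects(pnt, fire_poly)) > 0) {
    // the point falls inside at least one fire polygon
}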
In your expression, you use Within(pnt, fire_poly) in your condition on line 46, but then when the condition is met, you sort of go backwards and recreate the same thing polygon by polygon. If you just assign Within(pnt, fire_poly) to a variable, what comes back is a FeatureSet of all the fire polygons intersecting your point. Reusing that FeatureSet in the nested loop means there's no need to loop through every fire polygon, and eliminating those nested feature-to-feature Within calls avoids the hundreds of millions of spatial operations.
Try replacing lines 42-58 with this:
for (var pnt in pt_fs) {
    var in_fire = [];
    var in_event = [];
    var xs_fire = Intersects(pnt, fire_poly);
    var xs_weather = Intersects(pnt, weather_poly);
    // populate fire array if any
    if (Count(xs_fire) > 0) {
        for (var fire in xs_fire) {
            Push(in_fire, fire['IncidentName']);
        }
    }
    // populate weather array if any
    if (Count(xs_weather) > 0) {
        for (var event in xs_weather) {
            Push(in_event, `${event['Severity']} ${event['Event']}`);
        }
    }
For such a big expression, even cutting down on the amount of data going back and forth won't matter much when you've still got to wait for 300,000+ separate GET requests to finish (two Intersects calls per point across 150k points).
I wrote about a custom function that could help you here. Given that your fire and weather layers are fairly small, we could push them entirely to your browser's memory, then run the intersections. In this way, we only make server calls on the initial load, and none at all in our "pnt in pt_fs" loop.
Stick this at the top of your expression, then wrap your weather/fire FeatureSetByPortalItem calls with it. One little caveat: since we're moving the data into RAM instead of querying the original REST endpoint, we've got to flip returnGeometry back to true.
function Memorize(fs) {
    var temp_dict = {
        fields: Schema(fs)['fields'],
        geometryType: '',
        features: []
    };
    for (var f in fs) {
        var attrs = {};
        for (var attr in f) {
            attrs[attr] = IIf(TypeOf(f[attr]) == 'Date', Number(f[attr]), f[attr]);
        }
        Push(
            temp_dict['features'],
            {attributes: attrs}
        );
    }
    return FeatureSet(Text(temp_dict));
}
var portal = Portal("https://www.arcgis.com");
var fire_poly = Memorize(FeatureSetByPortalItem(
    portal,
    'd957997ccee7408287a963600a77f61f',
    1,
    ['IncidentName'],
    true
));
var weather_poly = Memorize(FeatureSetByPortalItem(
    portal,
    'a6134ae01aad44c499d12feec782b386',
    6,
    ['Event', 'Severity'],
    true
));
None of the test data I have handy currently intersects any fire or weather areas, so I can't say for certain that this will solve the problem, but I can say that a version of your original expression was still running after 10 minutes, whereas with these changes it finished in under a minute.
Thank you for the response, I really appreciate the time and detail put into it! I tried implementing both solutions and got the script to run; however, it returned an empty FeatureSet. Currently, 310 of my points should be returned because they fall within a fire zone. I also tried my original script with just the Memorize function added and got the same result: it ran, but returned an empty FeatureSet. Is it possible the Memorize function is altering the data somehow?
Oh, you know what? I never actually used Memorize on a spatial layer before, and I left the geometry out of it entirely! 😅
function Memorize(fs) {
    var temp_dict = {
        fields: Schema(fs)['fields'],
        geometryType: 'esriGeometryPolygon',
        features: []
    };
    for (var f in fs) {
        var attrs = {};
        for (var attr in f) {
            attrs[attr] = IIf(TypeOf(f[attr]) == 'Date', Number(f[attr]), f[attr]);
        }
        Push(
            temp_dict['features'],
            {attributes: attrs, geometry: Geometry(f)}
        );
    }
    return FeatureSet(Text(temp_dict));
}
Try swapping this in for the Memorize function. Any difference?
Perfect, that worked! Thank you so much! Just to make sure I made everything as efficient as possible, here's how I implemented everything:
function Memorize(fs) {
    var temp_dict = {
        fields: Schema(fs)['fields'],
        geometryType: 'esriGeometryPolygon',
        features: []
    };
    for (var f in fs) {
        var attrs = {};
        for (var attr in f) {
            attrs[attr] = IIf(TypeOf(f[attr]) == 'Date', Number(f[attr]), f[attr]);
        }
        Push(
            temp_dict['features'],
            {attributes: attrs, geometry: Geometry(f)}
        );
    }
    return FeatureSet(Text(temp_dict));
}
var portal = Portal("https://www.arcgis.com");
var fire_poly = Memorize(FeatureSetByPortalItem(
    portal,
    'd957997ccee7408287a963600a77f61f',
    1,
    ['IncidentName'],
    true
));
var weather_poly = Memorize(FeatureSetByPortalItem(
    portal,
    'a6134ae01aad44c499d12feec782b386',
    6,
    ['Event', 'Severity'],
    true
));
var pt_fs = FeatureSetByPortalItem(
    portal,
    '5ca8049d03f1477fa25d5e36be72584a',
    0,
    ['SAP_EQUIP_', 'SAP_STRUCT', 'TLINE_NM', 'SAP_FUNC_L', 'GIS_LAT', 'GIS_LONG'],
    true
);
var features = [];
var feat;
for (var pnt in pt_fs) {
    var in_fire = [];
    var in_event = [];
    var xs_fire = Intersects(pnt, fire_poly);
    var xs_weather = Intersects(pnt, weather_poly);
    if (Count(xs_fire) > 0) {
        for (var fire in xs_fire) {
            Push(in_fire, fire['IncidentName']);
        }
    }
    if (Count(xs_weather) > 0) {
        for (var event in xs_weather) {
            Push(in_event, `${event['Severity']} ${event['Event']}`);
        }
    }
    if (Count(xs_fire) > 0 || Count(xs_weather) > 0) {
        feat = {
            'attributes': {
                'SAP_EQUIP': pnt['SAP_EQUIP_'],
                'STRUCT_NO': pnt['SAP_STRUCT'],
                'TLINE_NM': pnt['TLINE_NM'],
                'FUNC_LOC': pnt['SAP_FUNC_L'],
                'GIS_LAT': pnt['GIS_LAT'],
                'GIS_LONG': pnt['GIS_LONG'],
                'fire': IIf(IsEmpty(First(in_fire)), "N/A", Concatenate(in_fire, ", ")),
                'weather_event': IIf(IsEmpty(First(in_event)), "N/A", Concatenate(in_event, ", "))
            }
        };
        Push(features, feat);
    }
}
var out_dict = {
    'fields': [
        {'name': 'SAP_EQUIP', 'alias': 'SAP Equipment ID', 'type': 'esriFieldTypeString'},
        {'name': 'STRUCT_NO', 'alias': 'Structure Number', 'type': 'esriFieldTypeString'},
        {'name': 'TLINE_NM', 'alias': 'Transmission Line', 'type': 'esriFieldTypeString'},
        {'name': 'FUNC_LOC', 'alias': 'SAP Functional Location', 'type': 'esriFieldTypeString'},
        {'name': 'GIS_LAT', 'alias': 'Latitude', 'type': 'esriFieldTypeDouble'},
        {'name': 'GIS_LONG', 'alias': 'Longitude', 'type': 'esriFieldTypeDouble'},
        {'name': 'fire', 'alias': 'Name of Fire', 'type': 'esriFieldTypeString'},
        {'name': 'weather_event', 'alias': 'Weather Event', 'type': 'esriFieldTypeString'}
    ],
    'geometryType': '',
    'features': features
};
return FeatureSet(out_dict);
Glad to hear it!
Just a quick follow-up: do you know if it's at all possible to make the results table downloadable in Dashboards? So far it seems like it unfortunately isn't. With Arcade being unavailable in Experience Builder, could I work around this by building out a web app with the ArcGIS API for JavaScript?