BLOG
Split is an operation in ArcGIS that breaks a line feature into two features at the split location. It can be performed in ArcGIS Pro or via the REST API split payload. Split respects the line split policy, which is either update/insert or delete/insert/insert. However, you often want more control over how and which lines are split, how attributes are distributed between the two resulting lines, and what kind of feature causes the line to split. For that we have authored an attribute rule that lets you do exactly this. The rule was authored specifically for the Utility Network because it contains special logic; however, it will also work with line features outside the controller dataset. The rule allows you to configure a point feature class so that when a point is created on a line feature, the line is split and the attributes of the point are transferred to the resulting line features. This example configures a switch feature that splits the medium voltage line. You can add this Arcade expression as a new Calculation attribute rule on the point class you wish to use to split the line. The rule performs a spatial query, detects the line, updates its geometry, and creates a new line feature. The payload for both edits can be changed to include additional attributes and logic.
// NOTE: This is not tested on multipart lines
// NOTE: Intersect seems to fail on large lines in GCS; we could evaluate buffering the point for a better intersect
//Expects($feature, 'SplitInt', 'SplitText');
// Exit Early Code
// If you want to exit and not split an intersecting line based on a point value, fill out a dict
// with field name as the key and the values as an array to exit early
// Make sure to list these fields in the Expects
var exit_early_values = {}; // e.g. {'SplitInt': [200], 'SplitText': ['DontSplitTheLine']}
// Fields not to copy to the new feature; edit tracking fields need to be removed
// All fields listed here must be upper case; they are forced to upper in the logic below.
var remove_fields_from_new_feature = ['SHAPE_LENGTH', 'GLOBALID', 'OBJECTID'];
// The line class to split
var line_class_name = "ElectricDistributionLine";
// This is used to get non-editable fields; do not change the field list from '*'
var line_fs = FeatureSetByName($datastore, "main.ElectricDistributionLine", ['*'], true);
// Set this when using decimal degrees; it adjusts the tolerances
var decimal_degrees = false;
// How many decimals to round coordinates to when checking if they are identical
// GCS should use a large value, such as 9
// PCS should use a value such as 4
var compare_coordinate_round_value = 2;
// When walking the line to split, a line is created between pairs of vertices;
// this value is the distance tolerance used to determine if the created point is on that line
// GCS should use a small value, such as 0.0001
// PCS should use a larger value, such as 0.1
var point_on_line_tol = 0.1;
// For some scales, and large GCS lines, intersect does not return intersecting lines
// Setting this value uses a polygon buffer of the point for the intersect
// PCS should use a value such as 0.02
// GCS should use a value such as 0.00000002
var buffer_pnt_distance = 0;
if (decimal_degrees) {
compare_coordinate_round_value = 9;
point_on_line_tol = 0.00001;
if (buffer_pnt_distance != 0) {
buffer_pnt_distance = 0.00000002;
}
}
// Option to skip the logic that removes a duplicate vertex at the start or end of the line;
// this may be caused by splitting on a vertex
var remove_dup_vertex = true;
// When the point and line are UN (utility network) controlled, we cannot split a line when the point is midspan (not at a vertex)
var is_un_controlled = true;
// ************* End User Variables Section *************
function get_fields_by_type(feat, convert_string, param, value) {
var fields = Schema(feat).fields;
var return_fields = [];
var func = Decode(Lower(convert_string), "lower", Lower, "upper", Upper, Text);
for (var f in fields) {
if (fields[f][param] == value) {
var fld_name = fields[f].name;
if (!IsEmpty(convert_string)) {
fld_name = func(fld_name);
}
Push(return_fields, fld_name);
}
}
return return_fields;
}
function set_date_type(feat, dict) {
// Dates need to be set to date types for some platforms
var dt_keys = get_fields_by_type(feat, 'upper', 'type', 'esriFieldTypeDate');
for (var k in dict) {
if (IndexOf(dt_keys, Upper(k)) == -1) {
continue;
}
dict[k] = Date(dict[k]);
}
return dict;
}
function pDistance(x, y, x1, y1, x2, y2) {
// adapted from https://stackoverflow.com/a/6853926
var A = x - x1;
var B = y - y1;
var C = x2 - x1;
var D = y2 - y1;
var dot = A * C + B * D;
var len_sq = C * C + D * D;
var param = -1;
if (len_sq != 0) //in case of 0 length line
param = dot / len_sq;
var xx, yy;
var is_vertex = false;
// Do we want to use tolerances?
if (compare_coordinate(x, y, x1, y1)) {
is_vertex = true;
}
if (compare_coordinate(x, y, x2, y2)) {
is_vertex = true;
}
if (param < 0) {
//is_vertex = true;
xx = x1;
yy = y1;
}
else if (param > 1) {
//is_vertex = true;
xx = x2;
yy = y2;
}
else {
//is_vertex = false;
xx = x1 + param * C;
yy = y1 + param * D;
}
var dx = x - xx;
var dy = y - yy;
return [Sqrt(dx * dx + dy * dy), [xx, yy], is_vertex];
}
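The pDistance helper above is the standard point-to-segment distance computation. As an illustration only (not part of the rule), the same math in a minimal Python sketch — the Arcade version additionally reports whether the point coincides with a segment vertex:

```python
import math

def point_segment_distance(x, y, x1, y1, x2, y2):
    """Distance from point (x, y) to the segment (x1, y1)-(x2, y2),
    plus the closest point on the segment."""
    a, b = x - x1, y - y1
    c, d = x2 - x1, y2 - y1
    len_sq = c * c + d * d
    # Parameter of the projection onto the infinite line; -1 flags a
    # zero-length segment, and values outside [0, 1] clamp to an endpoint.
    t = (a * c + b * d) / len_sq if len_sq != 0 else -1
    if t < 0:
        xx, yy = x1, y1                  # closest to the start vertex
    elif t > 1:
        xx, yy = x2, y2                  # closest to the end vertex
    else:
        xx, yy = x1 + t * c, y1 + t * d  # interior of the segment
    return math.hypot(x - xx, y - yy), (xx, yy)
```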
function compare_coordinate(x, y, x1, y1) {
// TODO, probably move to Equals and compare the geometry
if ((Round(x1, compare_coordinate_round_value) != Round(x, compare_coordinate_round_value)) ||
(Round(y1, compare_coordinate_round_value) != Round(y, compare_coordinate_round_value)) ){
return false;
}
return true;
}
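compare_coordinate simply rounds both coordinate pairs before comparing, so the effective tolerance is governed by compare_coordinate_round_value. For illustration, the equivalent check in Python:

```python
def coords_equal(x1, y1, x2, y2, digits=2):
    # Two coordinates are treated as identical if both components
    # match after rounding to the configured number of decimals.
    return (round(x1, digits) == round(x2, digits)
            and round(y1, digits) == round(y2, digits))
```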
function pop_keys(dict, keys) {
var new_dict = {};
for (var k in dict) {
if (IndexOf(keys, Upper(k)) != -1) {
continue;
}
new_dict[k] = dict[k];
}
return new_dict;
}
function remove_vertex(path_array) {
if (!remove_dup_vertex){
return path_array;
}
var new_path = [];
var current_path = path_array[0];
var vertex_count = Count(current_path);
if (vertex_count > 2) {
if (compare_coordinate(current_path[0][0],current_path[0][1],current_path[1][0],current_path[1][1])) {
for (var i in current_path) {
if (i != 1) {
Push(new_path, current_path[i]);
}
}
current_path = new_path;
}
}
new_path = [];
path_array[0] = current_path;
current_path = path_array[-1];
vertex_count = Count(current_path);
if (Count(current_path) > 2) {
if (compare_coordinate(current_path[-1][0],current_path[-1][1],current_path[-2][0],current_path[-2][1])) {
for (var i in current_path) {
if (i != vertex_count - 2) {
Push(new_path, current_path[i]);
}
}
current_path = new_path;
}
}
path_array[-1] = current_path;
return path_array;
}
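remove_vertex only inspects the first and last vertex pairs of the line, which is where splitting on an existing vertex can leave a duplicate behind. A simplified Python sketch of the same idea for a single path (using exact equality instead of the rounded comparison above):

```python
def drop_duplicate_end_vertices(path):
    """Remove a duplicated vertex at the start or end of a vertex list."""
    if len(path) > 2 and path[0] == path[1]:
        path = [path[0]] + path[2:]    # drop the duplicated second vertex
    if len(path) > 2 and path[-1] == path[-2]:
        path = path[:-2] + [path[-1]]  # drop the duplicated second-to-last vertex
    return path
```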
function cut_line_at_point(line_geometry, point_geometry, exit_when_not_at_vertex) {
var point_coord = null;
var interpolate_z = false;
// Check if the line has already been converted to a dict
var line_shape = null;
if (TypeOf(line_geometry) == 'Dictionary') {
line_shape = line_geometry;
} else {
line_shape = Dictionary(Text(line_geometry));
}
// Get the Z info and determine if Zs should be returned/interpolated in the lines
// TODO: Handle M's
if (Count(line_shape['paths'][0][0]) >= 3 && IsEmpty(point_geometry.Z)) {
point_coord = [point_geometry.X, point_geometry.Y];
interpolate_z = true;
} else if (Count(line_shape['paths'][0][0]) >= 3 && IsEmpty(point_geometry.Z) == false) {
point_coord = [point_geometry.X, point_geometry.Y, point_geometry.Z];
} else {
point_coord = [point_geometry.X, point_geometry.Y];
}
// If the point is at the start or end, skip splitting line
if (compare_coordinate(point_coord[0], point_coord[1], line_shape['paths'][0][0][0], line_shape['paths'][0][0][1]) ||
compare_coordinate(point_coord[0], point_coord[1], line_shape['paths'][-1][-1][0], line_shape['paths'][-1][-1][1])) {
return [];
}
var min_distance = point_on_line_tol * 2;
var segment_id = [];
var line_path = line_shape['paths'];
for (var i in line_path) {
var current_path = line_path[i];
// Loop over vertices; exit at the last vertex
for (var j = 0 ; j < Count(current_path) - 1 ; j++) {
var from_coord = current_path[j];
var to_coord = current_path[j + 1];
var shortest = pDistance(point_coord[0], point_coord[1], from_coord[0], from_coord[1], to_coord[0],to_coord[1]);
var distance = shortest[0];
var coordinates = shortest[1];
var isVertex = shortest[2];
//push(segment_id, [i, j, coordinates, isVertex,distance*100000])
if (distance <= min_distance) {
segment_id = [i, j, coordinates, isVertex];
min_distance = distance;
}
}
}
if (IsEmptyButBetter(segment_id))
{
return [];
}
// Since we are not on a vertex, and Pro needs to insert one for UN classes, we cannot split the line
if (exit_when_not_at_vertex && segment_id[-1] == false){
return [];
}
var new_path_1 = Slice(line_path,0,segment_id[0]+1);
var new_path_2 = Slice(line_path,segment_id[0]);
var new_seg_1 = Slice(new_path_1[-1], 0, segment_id[1] + 1);
Push(new_seg_1, segment_id[2]);
new_path_1[-1] = new_seg_1;
var new_seg_2 = Slice(new_path_2[0], segment_id[1] + 1);
Insert(new_seg_2,0, point_coord);
new_path_2[0] = new_seg_2;
return [new_path_1, new_path_2];
}
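Once cut_line_at_point has found the closest segment and the snapped coordinate, the actual cut is just array slicing: everything up to and including the segment's start vertex plus the split point becomes the first path, and the split point plus the remaining vertices becomes the second. A simplified single-path Python sketch of what the Slice/Push/Insert calls above do:

```python
def cut_path(path, seg_index, split_xy):
    """Split a vertex list in two at split_xy, which lies on the
    segment that starts at index seg_index."""
    first = path[:seg_index + 1] + [split_xy]
    second = [split_xy] + path[seg_index + 1:]
    return first, second
```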
// Checks additional empty/null states beyond the core IsEmpty
function IsEmptyButBetter(data) {
if (IsEmpty(data)) return true;
for (var x in data) return false;
return true;
}
function check_exit_early(feat) {
if (IsEmptyButBetter(exit_early_values)) {
return false;
}
for (var k in exit_early_values) {
if (Includes(exit_early_values[k], feat[k])) {
return true;
}
}
return false;
}
if (check_exit_early($feature)) {
return;
}
var intersecting_lines;
if (buffer_pnt_distance == null || buffer_pnt_distance <= 0) {
intersecting_lines = Intersects($feature, line_fs);
} else {
intersecting_lines = Intersects(Buffer($feature, buffer_pnt_distance), line_fs);
}
var in_point_geometry = Geometry($feature);
var update_features = [];
var new_features = [];
var new_geoms = [];
// Loop through lines to split
for (var line_feature in intersecting_lines) {
var polyline_1 = null;
var polyline_2 = null;
new_geoms = cut_line_at_point(Geometry(line_feature), in_point_geometry, is_un_controlled);
// If a split was not found, do not modify the feature
//return text(new_geoms);
if (Count(new_geoms) != 2) {
continue;
}
var new_geom_1 = new_geoms[0];
var new_geom_2 = new_geoms[1];
if (Count(new_geom_2) == 0 || Count(new_geom_1) == 0) {
continue;
}
var line_spat_ref = Geometry(line_feature).spatialReference.wkid;
var new_geom_1 = remove_vertex(new_geom_1);
var new_geom_2 = remove_vertex(new_geom_2);
polyline_1 = Polyline({
"paths": new_geom_1,
"spatialReference": {
"wkid": line_spat_ref
}
});
polyline_2 = Polyline({
"paths": new_geom_2,
"spatialReference": {
"wkid": line_spat_ref
}
});
var polyline_1_length = Length(polyline_1);
var polyline_2_length = Length(polyline_2);
// Convert feature to dictionary to get all its attributes
var line_att = Dictionary(Text(line_feature))['attributes'];
var atts_to_remove = get_fields_by_type(line_feature, 'Upper', 'editable', false);
for (var i in remove_fields_from_new_feature) {
var fld = Upper(remove_fields_from_new_feature[i]);
if (IndexOf(atts_to_remove, fld) != -1) {
continue;
}
Push(atts_to_remove, fld);
}
line_att = set_date_type(line_feature, pop_keys(line_att, atts_to_remove));
// Check length of new shapes, adjust the current feature to the longest segment
if (polyline_1_length > polyline_2_length) {
Push(update_features, {
'globalID': line_feature.globalID,
'geometry': polyline_1
});
Push(new_features, {
//'globalID': GUID(),
'geometry': polyline_2,
'attributes': line_att
});
} else {
Push(update_features, {
'globalID': line_feature.globalID,
'geometry': polyline_2
});
Push(new_features, {
//'globalID': GUID(),
'geometry': polyline_1,
'attributes': line_att
});
}
}
// Only include edit info when a split was required
if (Count(update_features) > 0 && Count(new_features) > 0) {
var results = {};
results['edit'] = [{
'className': line_class_name,
'updates': update_features,
'adds': new_features
}]
return results;
}
return;
Credit to Mike Miller, original author of this rule; more details here. Download a mobile geodatabase with the split rule.
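The dictionary returned at the end follows the attribute rule dictionary keyword structure: an 'edit' array with the target class name, 'updates' carrying the shortened geometry for the existing feature, and 'adds' carrying the new segment plus the attributes copied from the original line. A schematic of the shape of that payload — the GlobalID, geometries, and attribute values here are placeholders, not real values:

```python
# Placeholder geometries stand in for the Polyline() objects built in the rule.
updated_polyline = {"paths": [[[0, 0], [5, 0]]]}
new_polyline = {"paths": [[[5, 0], [10, 0]]]}

results = {
    "edit": [{
        "className": "ElectricDistributionLine",
        # Update the existing feature with the longer half of the split.
        "updates": [{"globalID": "{PLACEHOLDER-GUID}", "geometry": updated_polyline}],
        # Insert the shorter half as a new feature, carrying the copied attributes.
        "adds": [{"geometry": new_polyline, "attributes": {"assetgroup": 3}}],
    }]
}
```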
Monday

POST
The easiest way to test attribute rules is on a mobile geodatabase. You can create a mobile geodatabase, add the classes and attribute rules, make the edit in Pro, and verify the rules work. Once you are sure, export the rules and import them into your enterprise geodatabase. I wrote a simple attribute rule to demonstrate what you are trying to do. When the pole has no inspection records, the symbology is grayed out; when we add an inspection record, it turns red. I'm essentially maintaining an attribute on the pole called inspectioncount: for every related record added or removed, the count is recalculated and the pole is updated. Hope that helps (attached)
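The inspectioncount pattern is easy to express outside Arcade too. As a sketch, assuming a hypothetical foreign key field (poleguid here, not from the original post) relating inspections to poles, the recalculation the rule performs amounts to:

```python
def recount_inspections(pole_globalid, inspection_rows):
    """Recompute a pole's inspectioncount from its related records.
    inspection_rows is a list of dicts; 'poleguid' is a hypothetical
    foreign key field relating each inspection to its pole."""
    count = sum(1 for row in inspection_rows if row["poleguid"] == pole_globalid)
    # The rule would return this as the new inspectioncount value;
    # symbology can then gray out poles where the count is zero.
    return {"result": {"attributes": {"inspectioncount": count}}}
```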
Monday

BLOG
A question I frequently get is how to temporarily disable attribute rules. This allows edits in the system even if they violate constraint rules, knowing that users will perform those checks at a later stage. I authored a blog demonstrating four different ways to do so; I hope you enjoy it: https://www.esri.com/arcgis-blog/products/arcgis-pro/data-management/dynamically-disable-all-attribute-rules -Hussein
2 weeks ago

BLOG
A very common ask with attribute rules is to capture merge events and perform custom logic. While attribute rules don't have an explicit on-merge event, we can simulate one with the Pro editing experience. I authored an example attribute rule to demonstrate how. We will author a rule that triggers when a user merges two lines using the Pro merge tool. When the user selects two line features and merges them, the rule picks the largest line segment, copies that feature's addresstext to the new merged feature, and then deletes the two old segments. The challenge with the merge tool's default option is that it deletes the original features being merged, which prevents attribute rules from easily querying the original features and performing custom logic. To allow for more flexible behavior, we will use the merge tool's option to keep the original features and let the attribute rule perform the delete instead. This gives us more control, as it turns the operation from a delete plus update (delete the original features and update the geometry of the merged feature) into an update only. We will write the rule to trigger on geometry updates, using the new triggering fields so the rule fires only on shape change. On the update we have the control to query the original features (since they won't be deleted) and copy any attribute we want off of them, and then the rule can delete the original features itself. Following is the attribute rule and a video demonstration.
// When merging with the option "new feature" the rule is executed on insert. If we enable keep original features,
// we get access to the original rows; this way we can query the merged lines, pick the largest segment,
// take the address from it, and copy it to the new merged line.
// After merging we issue a delete to remove the two original segments.
var gid = $feature.GlobalID;
var fs = Contains(Filter($featureset, "globalid <> @gid"), Geometry($feature));
var biggerAddress = null;
var biggerLine = null;
var tobeDeleted = [];
for (var f in fs) {
    Push(tobeDeleted, {"globalId": f.globalid});
    // Track the longest segment and its address
    if (biggerLine == null || Length(f) >= Length(biggerLine)) {
        biggerLine = f;
        biggerAddress = f.addresstext;
    }
}
if (biggerLine == null) return;
return {
"result": {"attributes": {"addresstext": biggerAddress}},
"edit": [{
"className": "main.theline",
"deletes": tobeDeleted
}]
}
Download the sample data with the rules in this post.
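Stripped of the Arcade specifics, the merge rule's core decision is: collect delete specs for every original segment, keep the attributes of the longest one, and return both in a single dictionary. A Python sketch of that logic, with field and class names following the example above:

```python
def build_merge_edit(candidates):
    """candidates: list of dicts with 'globalid', 'length', 'addresstext'
    describing the original (kept) segments intersecting the merged line."""
    if not candidates:
        return None  # nothing to merge with; make no edits
    deletes = [{"globalId": c["globalid"]} for c in candidates]
    longest = max(candidates, key=lambda c: c["length"])
    return {
        # Copy the longest segment's address onto the merged feature...
        "result": {"attributes": {"addresstext": longest["addresstext"]}},
        # ...and delete all original segments in the same edit payload.
        "edit": [{"className": "main.theline", "deletes": deletes}],
    }
```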
3 weeks ago

IDEA
Evaluate attribute rules only works with batch calculation and validation rules; it writes to the features being evaluated and may generate error features when failures are detected. Constraint and immediate calculation rules are triggered when a user edit is made. To prevent evaluate from changing data on DEFAULT, this Arcade condition must be added to all batch calculation and validation rules. It skips execution of the rule during evaluation and proceeds with no changes. Note that all the features within the extent/selection will still be read and processed, but because of this condition they will process faster.
if (GdbVersion($feature) == 'sde.DEFAULT') return; // quit evaluation
// process the rest
Another suggestion to minimize accidental edits/evaluations is to enable editing from the editing tab. This introduces an explicit step to start editing, which prevents accidental edits. Also, as Sean suggested, making DEFAULT protected and assigning the version-admin portal role to those users with higher privileges may solve the accidental issues. I do understand that it introduces a lot of friction in some cases.
3 weeks ago

POST
Yes, this error means the globalid you passed in the dictionary to update a feature doesn't exist. You can avoid this by querying the layer and verifying that the features exist before you return the dictionary. Add a Console() call and enable logging in the ArcGIS Pro diagnostic monitor to see the Arcade messages; if this is a service, enable debug logging in ArcGIS Server Manager. Alternatively, a trick I like to use is to return an error message to force the rule to fail and display the message: return {"errorMessage": Text(thedic)}
11-28-2025 02:17 AM

POST
Thanks Ofir for the feedback. We will update the documentation to include the version= option; it's a valid request. While you can do all of this today through the error inspector, we will work on a better user experience in the GP tool to allow users to select the version name and expose version-specific properties, such as evaluating only the features modified in the version.
11-05-2025 02:55 PM

POST
I authored an example attribute rule to demonstrate this on merge. When the user selects two line features and merges them, the attribute rule picks the largest line segment, copies the addresstext of that feature to the new merged feature, and then deletes the two old segments. Hope this helps.
// When merging with the option "new feature" the rule is executed on insert. If we enable keep original features,
// we get access to the original rows; this way we can query the merged lines, pick the largest segment,
// take the address from it, and copy it to the new merged line.
// After merging we issue a delete to remove the two original segments.
var gid = $feature.GlobalID;
var fs = Contains(Filter($featureset, "globalid <> @gid"), Geometry($feature));
var biggerAddress = null;
var biggerLine = null;
var tobeDeleted = [];
for (var f in fs) {
    Push(tobeDeleted, {"globalId": f.globalid});
    // Track the longest segment and its address
    if (biggerLine == null || Length(f) >= Length(biggerLine)) {
        biggerLine = f;
        biggerAddress = f.addresstext;
    }
}
if (biggerLine == null) return;
return {
"result": {"attributes": {"addresstext": biggerAddress}},
"edit": [{
"className": "main.theline",
"deletes": tobeDeleted
}]
}
11-03-2025 12:32 PM

POST
I'm not sure if the user workflow allows it, but I would suggest using the option to keep the original features and let the attribute rules do the delete instead of the merge tool. This gives you more control, as it turns the operation from a delete plus update into an update only. You will write the rule to trigger on geometry updates (use the triggering fields to set the rule to trigger only on shape change). On the update you will have the control to query the original features (since they won't be deleted) and copy any attribute you want off of them, and then you can insert the logic to actually delete the original features. My guess as to why you are losing your update is, as you suspected, that the order of the operations causes your update to be lost / not persisted.
10-30-2025 05:20 PM

POST
Hey Mike, if you use the Evaluate Rules GP tool and select the features you want to evaluate, the tool will only evaluate the selected features. This isn't currently available when evaluating through the error inspector.
10-23-2025 06:04 AM

POST
Make sure to pass the correct layerId to queryDataElements, the one that belongs to the utility network layer. If you don't know it, you can pass an empty string, which will pull the definitions of all layers, including the utility network layer. sourceId should be there; if it's not, the UN likely wasn't published. Go to the feature service's root JSON definition and look for utilityNetworkLayerId to see if it's published.
10-14-2025 06:11 AM

POST
Hey Garry, I'm going to assume you are trying to evaluate attribute rules. First make sure you have batch calculation or validation rules on the class you want to evaluate. Then add that class to a new map and right-click the layer in the map; you should see a new context menu item, Add Error Layers. If you don't, or it is disabled, it means you don't have batch rules.
09-08-2025 11:09 AM

BLOG
My colleague Koya has just published a blog detailing how you can perform snapping with attribute rules. Wherever you are (web, mobile, Pro), the edits you make can be made more accurate, and it is configurable. Amazing the stuff you can do with attribute rules. Give it a read! https://www.esri.com/arcgis-blog/products/utility-network/arcade/snapping-with-attribute-rules
09-03-2025 12:49 PM

POST
I can't tell what is going on, but let us verify a few things: 1) Change the server logs to debug and see if you have entries for attribute rules being evaluated. 2) Does full extent work? 3) Create new features and then evaluate them; do they get picked up? If that doesn't work, finally try the following: delete the validation rules, delete the error tables, then add the validation rules again (this will regenerate the error tables). Make sure you are using a branch-versioned workspace when you do that. Then delete the feature service, create a new map, add your layer and the error tables, publish with the validation server, and test that new feature service.
08-27-2025 10:23 AM

POST
I would recommend not writing to the association table directly, as it can be error prone. Instead, use the provided attribute rule dictionary APIs; I authored a blog about it: https://www.esri.com/arcgis-blog/products/utility-network/electric-gas/advanced-attribute-rules-creating-utility-network-associations-with-attribute-rules For more advanced cases, use the "^UN_Association" keyword. Read more about it here: https://pro.arcgis.com/en/pro-app/latest/help/data/geodatabases/overview/attribute-rule-dictionary-keywords.htm
08-12-2025 11:01 AM