POST
Glad it worked. The getUser call, which returns the current user's full JSON definition, is expensive, so we cache it in the SOC process that serviced the edit. If you then make a change to the user details (assign a group, add a role), the cached entry is out of date. Restarting the service picks up the new definition, roles, and groups. The reason it may have worked sometimes without restarting the service is that your new edit happened to land on a SOC process that had not cached the getUser call, which results in a fresh call that retrieves the new data.
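The staleness pattern described above can be sketched in plain Python (SocProcess and the user store are illustrative stand-ins, not the actual server internals):

```python
import copy

class SocProcess:
    """Simulates one SOC worker holding its own in-process cache."""
    def __init__(self, user_store):
        self.user_store = user_store   # shared backing store
        self._cache = {}               # per-process cache

    def get_user(self, name):
        # The expensive getUser call is cached; later edits are not seen here.
        if name not in self._cache:
            self._cache[name] = copy.deepcopy(self.user_store[name])
        return self._cache[name]

store = {"editor": {"roles": ["viewer"]}}
soc_a, soc_b = SocProcess(store), SocProcess(store)

soc_a.get_user("editor")                  # soc_a caches the pre-edit definition
store["editor"]["roles"].append("admin")  # an admin assigns a new role

stale = soc_a.get_user("editor")["roles"]   # served from soc_a's old cache
fresh = soc_b.get_user("editor")["roles"]   # soc_b has no cache entry yet
```

One process keeps serving the stale cached definition while the other, having nothing cached, picks up the new role, which is why results varied by which process handled the request.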
Posted 03-19-2026 09:52 AM

POST
Thanks for the case number; I read the case details. Can you try the following? It looks like you are assigning the group to the user while the service is running. After assigning the group, restart the feature service and try again to see if that fixes it.
Posted 03-19-2026 06:18 AM

POST
Thanks, I saw that and replied to the case; adding a summary here. The fact that you are seeing the scratch workspace is just a side effect of using GetFeatureSetInfo. GetFeatureSetInfo returns the metadata about the FeatureSet, not the actual data behind it. ScratchRecordSet is just the temporary workspace we use to store data after querying the associations table, which is why you got the full dictionary with default values. So nothing is wrong here. To see the returned association rows and the actual class names, write the following Arcade; your class names in a mobile geodatabase should be something like "main.classname", so use that in your filter. Then in Pro open ArcGIS Monitor (Ctrl+Alt+M), enable debug mode, and look for "ArcadeConsole"; you should see the actual JSON.

var connectivityAssociations = FeatureSetByAssociation($feature, 'connected')
for (var a in connectivityAssociations)
    Console(Text(a))

<Event time="Fri Feb 06 09:53:44.904" type="Debug" task#="1605" thread="745c: Main CIM worker thread" elapsed="0" function="Geodatabase.AttributeRule" code="ArcadeConsole"> Arcade{"geometry":null,"attributes":{"className":"main.ElectricDistributionJunction","globalId":"{F4013AD9-F4F0-4B6F-A212-F353ED90F2B1}","isContentVisible":0,"ObjectID":65,"OID":65,"percentAlong":0,"side":"","terminal":1}} </Event>

Hope that helps.
Posted 02-06-2026 10:01 AM

POST
Yes, you may open a case. If you can include the mobile geodatabase with the script, the GlobalIDs of the features reproducing the problem, and the Pro release used, that would be good. Send me the case number privately. Thanks.
Posted 02-06-2026 07:30 AM

POST
No, that shouldn't be expected. Can we have the Arcade script used (and also the Pro version)? FeatureSetByAssociation returns a FeatureSet, so you need to either loop over it or call First to get the association row. The best quick test is to use a popup and click one of the junctions that has connectivity associations.

var x = First(FeatureSetByAssociation($feature, "connected"))
if (x == null)
    return "No connected feature"
else
    return Text(x)

You should get something like this:

{"geometry":null,
 "attributes":
  {"className":"main.ElectricDistributionJunction",
   "globalId":"{F4013AD9-F4F0-4B6F-A212-F353ED90F2B1}",
   "isContentVisible":0,
   "ObjectID":65,
   "OID":65,
   "percentAlong":0,
   "side":"",
   "terminal":1}}
Posted 02-05-2026 04:30 PM

POST
Correct, you need HonorSequenceOfEdits set to true to respect the array order in applyEdits. Otherwise, if that parameter is false or not provided, we use the layerId order, which in your case will be non-deterministic since all of the edits target the same layer. More in the doc: https://developers.arcgis.com/rest/services-reference/enterprise/apply-edits-feature-service/
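As a sketch only (hypothetical helper; consult the linked doc for the authoritative parameter list), the parameter travels alongside the edits array in the form-encoded request:

```python
import json

def build_apply_edits_params(edits, honor_sequence=True):
    """Build form parameters for a Feature Service applyEdits request.

    honorSequenceOfEdits=true asks the server to honor the order of the
    edits array instead of applying edits in layerId order.
    """
    return {
        "f": "json",
        "edits": json.dumps(edits),
        "honorSequenceOfEdits": "true" if honor_sequence else "false",
    }

params = build_apply_edits_params(
    [{"id": 21, "updates": [{"attributes": {"objectid": 1}}]}]
)
# POST these params to <feature-service-url>/applyEdits
```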
Posted 01-28-2026 12:24 PM

POST
This is useful, thank you. I was able to reproduce it and I see what is going on. HonorSequenceOfEdits controls the order of the outer per-layer edits in the payload, not the individual edits within a layer. As you may have noticed, the array of updates within a single edit is ordered by objectid and executed in ascending order (lowest objectid first). So in your example, objectid 61818 is updated first, followed by 71416, which is made into a main address, resulting in the failure because 71816 was not updated yet. If you want to force the 3 updates to be executed in your order, change the payload from one edit with 3 updates to 3 edits, each with a single update. They will still be in a single transaction, but as 3 distinct edits to the same layer. Here is your applyEdits (simplified) before and after the change.

//one edit, 3 updates
[{
"id": 21,
"updates": [{
"attributes": {
"objectid": 71816,
"globalid": "{0539CB83-A63B-4ED2-9502-398A414DB666}",
"object_address_type": 0,
"object_id": "{9BE6E41A-8558-4268-B6B3-FB96A755B98D}"
}
}, {
"attributes": {
"objectid": 71416,
"globalid": "{528C37D9-A398-4EC2-9AB4-027754F456B3}",
"object_address_type": 1,
"object_id": "{9BE6E41A-8558-4268-B6B3-FB96A755B98D}"
}
}, {
"attributes": {
"objectid": 61818,
"globalid": "{F37ED8AD-3D2B-4B8A-BB93-9DAD47B512FC}",
"object_address_type": 0,
"object_id": "{9BE6E41A-8558-4268-B6B3-FB96A755B98D}"
}
}
]
}
]
Change it to:

//3 edits, one update in each
[{
"id": 21,
"updates": [{
"attributes": {
"objectid": 71816,
"globalid": "{0539CB83-A63B-4ED2-9502-398A414DB666}",
"object_address_type": 0,
"object_id": "{9BE6E41A-8558-4268-B6B3-FB96A755B98D}"
}
}
]
},
{
"id": 21,
"updates": [ {
"attributes": {
"objectid": 71416,
"globalid": "{528C37D9-A398-4EC2-9AB4-027754F456B3}",
"object_address_type": 1,
"object_id": "{9BE6E41A-8558-4268-B6B3-FB96A755B98D}"
}
}
]
},
{
"id": 21,
"updates": [ {
"attributes": {
"objectid": 61818,
"globalid": "{F37ED8AD-3D2B-4B8A-BB93-9DAD47B512FC}",
"object_address_type": 0,
"object_id": "{9BE6E41A-8558-4268-B6B3-FB96A755B98D}"
}
}
]
}
]

This payload should solve the issue. Here is my repro case before and after the applyEdits change; after the change, it works.
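If you generate these payloads programmatically, the one-edit-to-N-edits change above can be automated; here is a small sketch (plain Python, not an official API) that only handles the 'updates' array:

```python
def explode_updates(edits):
    """Turn each layer edit carrying N updates into N single-update edits,
    keeping the original array order so the server can honor it."""
    exploded = []
    for edit in edits:
        updates = edit.get("updates", [])
        if len(updates) <= 1:
            exploded.append(edit)
            continue
        for update in updates:
            exploded.append({"id": edit["id"], "updates": [update]})
    return exploded

before = [{"id": 21, "updates": [
    {"attributes": {"objectid": 71816}},
    {"attributes": {"objectid": 71416}},
    {"attributes": {"objectid": 61818}},
]}]
after = explode_updates(before)  # three edits, one update each
```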
Posted 01-26-2026 02:25 PM

BLOG
Split is an operation in ArcGIS that breaks a line feature into two features at the split location. The operation can be performed in ArcGIS Pro or via the REST API split payload. Split respects the line's split policy, which is either update/insert or delete/insert/insert. However, you often want more control over how and what lines are split, which attributes are distributed between the two split lines, and what kind of features cause the line to split. For that we have authored an attribute rule that allows you to do this. The rule was authored specially for the Utility Network because it has special logic; however, it will work with other line features outside the controller dataset. The rule allows you to configure a point feature such that, when that feature is created on a line feature, it splits the line and transfers attributes from the point feature to the line features. This example configures a switch feature that splits the medium voltage line. You can add this Arcade expression as a new calculation attribute rule on the point class you wish to use to split the line. The rule does a spatial query, detects the line, updates its geometry, and creates a new line feature. The payload for both edits can be changed to include additional attributes and logic.

// NOTE: This is not tested on multipart lines
// NOTE: The intersect seems to fail on large lines in GCS; we could evaluate buffering the point for a better intersect
//Expects($feature, 'SplitInt', 'SplitText');
// Exit Early Code
// If you want to exit and not split an intersecting line based on a point value, fill out a dict
// with field name as the key and the values as an array to exit early
// Make sure to list these fields in the Expects
var exit_early_values = {}; // e.g. Dictionary('SplitInt', [200], 'SplitText', ['DontSplitTheLine'])
// Fields not to copy to the new feature; edit tracking fields need to be removed
// All fields listed here need to be in upper case; they are forced to upper in the logic below
var remove_fields_from_new_feature = ['SHAPE_LENGTH', 'GLOBALID', 'OBJECTID'];
// The line class to split
var line_class_name = "ElectricDistributionLine";
// This is used to get Non Editable fields, do not change the fields from *
var line_fs = FeatureSetByName($datastore, "main.ElectricDistributionLine", ['*'], true);
// Set this when using decimal degrees; it adjusts the tolerances
var decimal_degrees = false;
// How many decimals to round coordinates to when checking if they are identical
// GCS should use a large value, such as 9
// PCS should use a value such as 4
var compare_coordinate_round_value = 2;
// When walking the line to split, a line is created between pairs of vertices
// this value is the distance tolerance used to determine if the created point is on that line
// GCS should use a small value, such as 0.0001
// PCS should use a larger value, such as 0.1
var point_on_line_tol = 0.1;
// For some scales, and large GCS lines, intersect does not return intersecting lines
// Setting this value uses a polygon buffer of the point for the intersect
// PCS should use a value such as 0.02
// GCS should use a value such as 0.00000002
var buffer_pnt_distance = 0;
if (decimal_degrees) {
compare_coordinate_round_value = 9;
point_on_line_tol = 0.00001;
if (buffer_pnt_distance != 0) {
buffer_pnt_distance = 0.00000002;
}
}
// option to skip logic to remove duplicate vertex at the start and end of line, this may be caused by splitting on
// a vertex
var remove_dup_vertex = true;
// When the point and line are UN controlled, we cannot split a line when the point is midspan (not at a vertex)
var is_un_controlled = true;
// ************* End User Variables Section *************
function get_fields_by_type(feat, convert_string, param, value) {
var fields = Schema(feat).fields;
var return_fields = [];
var func = Decode(Lower(convert_string), "lower", Lower, "upper", Upper, Text);
for (var f in fields) {
if (fields[f][param] == value) {
var fld_name = fields[f].name;
if (!IsEmpty(convert_string)) {
fld_name = func(fld_name);
}
Push(return_fields, fld_name);
}
}
return return_fields;
}
function set_date_type(feat, dict) {
// Dates need to be set to date types for some platforms
var dt_keys = get_fields_by_type(feat, 'upper', 'type', 'esriFieldTypeDate');
for (var k in dict) {
if (IndexOf(dt_keys, Upper(k)) == -1) {
continue;
}
dict[k] = Date(dict[k]);
}
return dict;
}
function pDistance(x, y, x1, y1, x2, y2) {
// adapted from https://stackoverflow.com/a/6853926
var A = x - x1;
var B = y - y1;
var C = x2 - x1;
var D = y2 - y1;
var dot = A * C + B * D;
var len_sq = C * C + D * D;
var param = -1;
if (len_sq != 0) //in case of 0 length line
param = dot / len_sq;
var xx, yy;
var is_vertex = false;
// Do we want to use tolerances
if (compare_coordinate(x, y, x1, y1)) {
is_vertex = true;
}
if (compare_coordinate(x, y, x2, y2)) {
is_vertex = true;
}
if (param < 0) {
//is_vertex = true;
xx = x1;
yy = y1;
}
else if (param > 1) {
//is_vertex = true;
xx = x2;
yy = y2;
}
else {
//is_vertex = false;
xx = x1 + param * C;
yy = y1 + param * D;
}
var dx = x - xx;
var dy = y - yy;
return [Sqrt(dx * dx + dy * dy), [xx, yy], is_vertex];
}
function compare_coordinate(x, y, x1, y1) {
// TODO, probably move to Equals and compare the geometry
if ((Round(x1, compare_coordinate_round_value) != Round(x, compare_coordinate_round_value)) ||
(Round(y1, compare_coordinate_round_value) != Round(y, compare_coordinate_round_value)) ){
return false;
}
return true;
}
function pop_keys(dict, keys) {
var new_dict = {};
for (var k in dict) {
if (IndexOf(keys, Upper(k)) != -1) {
continue;
}
new_dict[k] = dict[k];
}
return new_dict;
}
function remove_vertex(path_array) {
if (!remove_dup_vertex){
return path_array;
}
var new_path = [];
var current_path = path_array[0];
var vertex_count = Count(current_path);
if (vertex_count > 2) {
if (compare_coordinate(current_path[0][0],current_path[0][1],current_path[1][0],current_path[1][1])) {
for (var i in current_path) {
if (i != 1) {
Push(new_path, current_path[i]);
}
}
current_path = new_path;
}
}
new_path = [];
path_array[0] = current_path;
current_path = path_array[-1];
vertex_count = Count(current_path);
if (Count(current_path) > 2) {
if (compare_coordinate(current_path[-1][0],current_path[-1][1],current_path[-2][0],current_path[-2][1])) {
for (var i in current_path) {
if (i != vertex_count - 2) {
Push(new_path, current_path[i]);
}
}
current_path = new_path;
}
}
path_array[-1] = current_path;
return path_array;
}
function cut_line_at_point(line_geometry, point_geometry, exit_when_not_at_vertex) {
var point_coord = null;
var interpolate_z = false;
// Check if the line has already been converted to a dict
var line_shape = null;
if (TypeOf(line_geometry) == 'Dictionary') {
line_shape = line_geometry;
} else {
line_shape = Dictionary(Text(line_geometry));
}
// Get the Z info and determine if Zs should be returned/interpolated in lines
// TODO: Handle M's
if (Count(line_shape['paths'][0][0]) >= 3 && IsEmpty(point_geometry.Z)) {
point_coord = [point_geometry.X, point_geometry.Y];
interpolate_z = true;
} else if (Count(line_shape['paths'][0][0]) >= 3 && IsEmpty(point_geometry.Z) == false) {
point_coord = [point_geometry.X, point_geometry.Y, point_geometry.Z];
} else {
point_coord = [point_geometry.X, point_geometry.Y];
}
// If the point is at the start or end, skip splitting line
if (compare_coordinate(point_coord[0], point_coord[1], line_shape['paths'][0][0][0], line_shape['paths'][0][0][1]) ||
    compare_coordinate(point_coord[0], point_coord[1], line_shape['paths'][-1][-1][0], line_shape['paths'][-1][-1][1])) {
return [];
}
var min_distance = point_on_line_tol * 2;
var segment_id = [];
var line_path = line_shape['paths'];
for (var i in line_path) {
var current_path = line_path[i];
// Loop over vertex, exit when at last vertex
for (var j = 0 ; j < Count(current_path) - 1 ; j++) {
var from_coord = current_path[j];
var to_coord = current_path[j + 1];
var shortest = pDistance(point_coord[0], point_coord[1], from_coord[0], from_coord[1], to_coord[0],to_coord[1]);
var distance = shortest[0];
var coordinates = shortest[1];
var isVertex = shortest[2];
//push(segment_id, [i, j, coordinates, isVertex,distance*100000])
if (distance <= min_distance) {
segment_id = [i, j, coordinates, isVertex];
min_distance = distance;
}
}
}
if (IsEmptyButBetter(segment_id))
{
return [];
}
// Since we are not on a vertex, Pro (for UN classes) needs to insert one, so we cannot split the line
if (exit_when_not_at_vertex && segment_id[-1] == false){
return [];
}
var new_path_1 = Slice(line_path,0,segment_id[0]+1);
var new_path_2 = Slice(line_path,segment_id[0]);
var new_seg_1= slice(new_path_1[-1],0, segment_id[1]+1);
Push(new_seg_1, segment_id[2]);
new_path_1[-1] = new_seg_1;
var new_seg_2= slice(new_path_2[0],segment_id[1] + 1);
Insert(new_seg_2,0, point_coord);
new_path_2[0] = new_seg_2;
return [new_path_1, new_path_2];
}
// Used to check different empty null states, override of core IsEmpty
function IsEmptyButBetter(data) {
if (IsEmpty(data)) return true;
for (var x in data) return false;
return true;
}
function check_exit_early(feat) {
if (IsEmptyButBetter(exit_early_values)) {
return false;
}
for (var k in exit_early_values) {
if (Includes(exit_early_values[k], feat[k])) {
return true;
}
}
return false;
}
if (check_exit_early($feature)) {
return;
}
var intersecting_lines;
if (buffer_pnt_distance == null || buffer_pnt_distance <= 0) {
intersecting_lines = Intersects($feature, line_fs);
} else {
intersecting_lines = Intersects(Buffer($feature, buffer_pnt_distance), line_fs);
}
var in_point_geometry = Geometry($feature);
var update_features = [];
var new_features = [];
var new_geoms = [];
// Loop through lines to split
for (var line_feature in intersecting_lines) {
var polyline_1 = null;
var polyline_2 = null;
new_geoms = cut_line_at_point(Geometry(line_feature), in_point_geometry, is_un_controlled);
// If a split was not found, do not modify the feature
//return text(new_geoms);
if (Count(new_geoms) != 2) {
continue;
}
var new_geom_1 = new_geoms[0];
var new_geom_2 = new_geoms[1];
if (Count(new_geom_2) == 0 || Count(new_geom_1) == 0) {
continue;
}
var line_spat_ref = Geometry(line_feature).spatialReference.wkid;
var new_geom_1 = remove_vertex(new_geom_1);
var new_geom_2 = remove_vertex(new_geom_2);
polyline_1 = Polyline({
"paths": new_geom_1,
"spatialReference": {
"wkid": line_spat_ref
}
});
polyline_2 = Polyline({
"paths": new_geom_2,
"spatialReference": {
"wkid": line_spat_ref
}
});
var polyline_1_length = Length(polyline_1);
var polyline_2_length = Length(polyline_2);
// Convert feature to dictionary to get all its attributes
var line_att = Dictionary(Text(line_feature))['attributes'];
var atts_to_remove = get_fields_by_type(line_feature, 'Upper', 'editable', false);
for (var i in remove_fields_from_new_feature) {
var fld = Upper(remove_fields_from_new_feature[i]);
if (IndexOf(atts_to_remove, fld) != -1) {
continue;
}
Push(atts_to_remove, fld);
}
line_att = set_date_type(line_feature, pop_keys(line_att, atts_to_remove));
// Check length of new shapes, adjust the current feature to the longest segment
if (polyline_1_length > polyline_2_length) {
Push(update_features, {
'globalID': line_feature.globalID,
'geometry': polyline_1
});
Push(new_features, {
//'globalID': GUID(),
'geometry': polyline_2,
'attributes': line_att
});
} else {
Push(update_features, {
'globalID': line_feature.globalID,
'geometry': polyline_2
});
Push(new_features, {
//'globalID': GUID(),
'geometry': polyline_1,
'attributes': line_att
});
}
}
// Only include edit info when a split was required
if (Count(update_features) > 0 && Count(new_features) > 0) {
var results = {};
results['edit'] = [{
'className': line_class_name,
'updates': update_features,
'adds': new_features
}]
return results;
}
return;

Credit to Mike Miller, the original author of this rule; more details here. Download a mobile geodatabase with data and the split rule.
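For readers who want to verify the segment-projection math used by pDistance outside Arcade, here is an equivalent Python sketch (same point-to-segment formula, minus the vertex and tolerance bookkeeping in the rule):

```python
from math import sqrt

def point_segment_distance(x, y, x1, y1, x2, y2):
    """Distance from (x, y) to the segment (x1, y1)-(x2, y2),
    plus the closest point on the segment (the candidate split location)."""
    A, B = x - x1, y - y1
    C, D = x2 - x1, y2 - y1
    len_sq = C * C + D * D
    t = (A * C + B * D) / len_sq if len_sq != 0 else -1.0
    t = max(0.0, min(1.0, t))  # clamp so segment endpoints are honored
    xx, yy = x1 + t * C, y1 + t * D
    return sqrt((x - xx) ** 2 + (y - yy) ** 2), (xx, yy)

# Point hovering over the middle of a horizontal segment
d, foot = point_segment_distance(1.0, 1.0, 0.0, 0.0, 2.0, 0.0)
```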
Posted 01-05-2026 03:50 PM

POST
The easiest way to test attribute rules is in a mobile geodatabase. You can create a mobile geodatabase, add the classes and attribute rules, do the edit in Pro, and verify the rules work. Once you are sure, export the rules and import them into your enterprise geodatabase. I wrote a simple attribute rule to demonstrate what you are trying to do: when the pole has no inspection records the symbology is grayed out; when we add an inspection record, it turns red. I'm essentially maintaining an attribute on the pole called inspectioncount; for every related record added or removed, the count is recalculated and the pole is updated. Hope that helps (attached).
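The counting approach can be sketched like this (plain Python; field and key names such as poleguid and inspectioncount follow the description above, and the real logic lives in the attached Arcade rule):

```python
def inspection_count_edit(pole_global_id, inspection_records):
    """Recompute the pole's inspectioncount from its related records,
    as the rule does whenever a related record is added or removed."""
    count = sum(1 for rec in inspection_records
                if rec["poleguid"] == pole_global_id)
    # In Arcade this would be returned as an 'edit' updating the pole row,
    # letting symbology key off inspectioncount (0 = gray, >0 = red).
    return {"globalID": pole_global_id,
            "attributes": {"inspectioncount": count}}

records = [{"poleguid": "P1"}, {"poleguid": "P1"}, {"poleguid": "P2"}]
edit = inspection_count_edit("P1", records)
```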
Posted 01-05-2026 03:13 PM

BLOG
A question I frequently get is how to temporarily disable attribute rules. This is to allow edits in the system even if they violate constraint rules, knowing that users will perform those checks at a later stage. I authored a blog demonstrating four different ways to do so; I hope you enjoy it: https://www.esri.com/arcgis-blog/products/arcgis-pro/data-management/dynamically-disable-all-attribute-rules -Hussein
Posted 12-30-2025 10:25 AM

BLOG
A very common ask with attribute rules is to capture the merge event and perform some custom logic. While attribute rules don't have an explicit on-merge event, we can simulate one with the Pro editing experience. I authored an example attribute rule to demonstrate how to do this. We will author an attribute rule that triggers when a user merges two lines using the Pro merge tool. When the user selects two line features and merges them, the attribute rule will pick the largest line segment, copy the addresstext of that line feature to the new merged feature, and then delete the old two segments. The challenge with the default option in the merge tool is that it deletes the original features being merged, which prevents the attribute rule from easily querying the original features and performing custom logic. To allow for more flexible behavior, we need to modify some options: we will use the option to keep the original features in the merge tool and let the attribute rule do the delete instead. This gives us more control, as it turns the operation from a delete plus update (delete the original features and update the geometry of the merged feature) into an update only. We will write the rule to trigger on geometry update, using the new triggering fields to make the rule fire only on shape change. On the update we can query the original features (since they won't be deleted) and copy any attributes we want; later we insert the logic to actually delete the original features from the attribute rule. Following is the attribute rule and a video demonstration.

// When merging with the option "new feature", the rule is executed on insert. If we enable keep original features we get access to the original rows; this way we can query the merged lines, pick the largest segment, take the address from it, and copy it to the new merged line.
// After merging we issue a delete to remove the two original segments.
var g = $feature.GlobalID
var fs = contains(filter($featureset, "globalid <> @g"), geometry($feature))
var biggerAddress = null;
var biggerLine = null;
var tobeDeleted = []
for (var g in fs)
{
push(tobeDeleted, {"globalId": g.globalid} )
if (biggerLine == null){
biggerLine = g;
biggerAddress = g.addresstext;
}
if (length(g) >= length(biggerLine)){
biggerLine = g;
biggerAddress = g.addresstext;
}
}
if (biggerLine == null) return;
return {
"result": {"attributes": {"addresstext": biggerAddress}},
"edit": [{
"className": "main.theline",
"deletes": tobeDeleted
}]
}

Download the sample data with the rules in this post.
Posted 12-23-2025 09:32 AM

IDEA
Evaluating attribute rules only works with batch calculation and validation rules; it writes to the features being evaluated and may generate error features when failures are detected. Constraint and immediate calculation rules are triggered when a user edit is made. To prevent evaluation from changing data on DEFAULT, add this Arcade condition to all batch calculation and validation rules; it skips execution of the rule during evaluation and proceeds with no changes. Note that all the features within the extent/selection will still be read and processed, but because of this condition they will process faster.

if (GdbVersion($feature) == 'sde.DEFAULT') return; // quit evaluation
// process the rest

Another suggestion to minimize accidental edits/evaluation is to enable editing from the Editing tab. This introduces an explicit step to start editing, which prevents accidental edits. Also, as Sean suggested, making DEFAULT protected and assigning the version-admin portal role to users with higher privileges may solve the accidental issues; I do understand that it introduces a lot of friction in some cases.
Posted 12-19-2025 03:26 PM

POST
Yes, this error means the GlobalID you passed in the dictionary to update a feature doesn't exist. You can avoid this by querying the layer and verifying that the features exist. Before you return the dictionary, do a Console call and enable logging in Pro's ArcGIS Monitor to see the Arcade messages; if this is a service, enable debug on ArcGIS Server Manager. Alternatively, a trick I like to use is to return an error message to force the rule to fail and display the message:

return {"errorMessage": Text(thedic)}
Posted 11-28-2025 02:17 AM

POST
Thanks Ofir for the feedback. We will update the documentation to include the version= option; it's a valid request. While you can do all of this today through the error inspector, we will work on a better user experience in the gp tool to allow users to select the version name and expose version-specific properties, such as evaluating features modified in the version.
Posted 11-05-2025 02:55 PM

POST
I authored an example attribute rule to demonstrate this on merge. When the user selects two line features and merges them, the attribute rule will pick the largest line segment, copy the addresstext of that line feature to the new merged feature, and then delete the old two segments. Hope this helps.

// When merging with the option "new feature", the rule is executed on insert. If we enable keep original features we get access to the original rows; this way we can query the merged lines, pick the largest segment, take the address from it, and copy it to the new merged line. After merging we issue a delete to remove the two original segments.
var g = $feature.GlobalID
var fs = contains(filter($featureset, "globalid <> @g"), geometry($feature))
var biggerAddress = null;
var biggerLine = null;
var tobeDeleted = []
for (var g in fs)
{
push(tobeDeleted, {"globalId": g.globalid} )
if (biggerLine == null){
biggerLine = g;
biggerAddress = g.addresstext;
}
if (length(g) >= length(biggerLine)){
biggerLine = g;
biggerAddress = g.addresstext;
}
}
if (biggerLine == null) return;
return {
"result": {"attributes": {"addresstext": biggerAddress}},
"edit": [{
"className": "main.theline",
"deletes": tobeDeleted
}]
}
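The selection logic in the rule above reduces to: pick the longest original line, copy its addresstext, and queue every original segment for deletion. A plain Python sketch (hypothetical record shape, not the Arcade runtime):

```python
def merge_address(originals):
    """Mirror the merge rule: keep the longest line's addresstext and
    mark all original segments for deletion."""
    if not originals:
        return None
    longest = max(originals, key=lambda f: f["length"])
    deletes = [{"globalId": f["globalid"]} for f in originals]
    return {"addresstext": longest["addresstext"], "deletes": deletes}

rows = [
    {"globalid": "A", "length": 10.0, "addresstext": "Main St"},
    {"globalid": "B", "length": 25.0, "addresstext": "Oak Ave"},
]
merged = merge_address(rows)  # keeps "Oak Ave", deletes both originals
```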
Posted 11-03-2025 12:32 PM
| Title | Kudos | Posted |
|---|---|---|
|  | 1 | 03-19-2026 09:52 AM |
|  | 1 | 03-19-2026 06:18 AM |
|  | 3 | 01-26-2026 02:25 PM |
|  | 3 | 01-05-2026 03:50 PM |
|  | 5 | 12-30-2025 10:25 AM |
Online Status: Offline
Date Last Visited: 03-19-2026 06:14 AM