POST
I think a potential source of error is the "Calculate Field" tool you're using; that doesn't look like the interface of the tool in the Data Management toolbox. Try running that specific tool and see if the errors persist.
03-31-2023 08:40 AM
POST
If you leave the output parameter empty, it should output to the default geodatabase with the default name, just like the normal geoprocessing pane behaviour. If you want to chain this output directly into the next task step, I'm not sure if or how that's possible; I just ask the user to pick the data themselves in the instructions and hope for the best.
03-17-2023 06:17 AM
POST
The expanded method help makes this clearer. You pass in a list that contains dictionaries, one dictionary per folder connection. Each dictionary has three keys and associated values: "connectionString" (a local or UNC path to a system folder), "alias" (an optional alias for the path in the catalog view), and "isHomeFolder" (True if this entry is for the home folder, False otherwise). Basically you're passing in an anonymous config object like it's a JavaScript function, even though Python supports named objects. Guess whoever wrote that interface had a long day and didn't want to define a proper object 😁.
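A minimal sketch of that call shape, in case it helps. The paths and aliases here are made up, and the arcpy lines are commented out so the dictionary structure can be inspected without ArcGIS Pro installed:

    # One plain dictionary per folder connection, as the method help describes.
    connections = [
        {
            "connectionString": r"\\fileserver\gis\data",  # local or UNC path
            "alias": "Shared Data",                        # optional catalog alias
            "isHomeFolder": False,
        },
        {
            "connectionString": r"C:\Projects\Demo",
            "alias": "",
            "isHomeFolder": True,  # the project home folder
        },
    ]

    # In ArcGIS Pro you would then pass the list straight in:
    # import arcpy
    # aprx = arcpy.mp.ArcGISProject("CURRENT")
    # aprx.updateFolderConnections(connections, validate=True)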
03-16-2023 05:42 PM
POST
I've cleaned up the posted code and moved the counter increment to the right spot; let us know if this works. I also fixed the missing comma in the Filter call, unified the numToNode/numToNodes variable, and dropped the Text() conversions so the geometry JSON gets numeric coordinates.

    var pointGeometry = Geometry($feature);
    var pointID = $feature.assetID;
    var lineImport = FeatureSetByName($datastore, "LineLayer", ["*"], true);
    // Filter the line layer to just the To nodes that join to the moved point.
    // The join is based on Asset ID (point.assetID = line.assetNodeTo).
    var lineFilter = Filter(lineImport, "assetNodeTo = @pointID");
    var addList = [];
    var counter = 0;
    var numToNodes = Count(lineFilter);
    if (numToNodes > 0) {
        for (var line in lineFilter) {
            var linePaths = Geometry(line).paths;
            var fromNode = linePaths[0][0]; // From node's XY coordinates
            // Build the replacement polyline: new To node (the moved point
            // geometry), same From node.
            var polylineJSON = {
                "paths": [[
                    [fromNode.x, fromNode.y, 0],
                    [pointGeometry.x, pointGeometry.y, 0]
                ]],
                "spatialReference": {
                    "wkid": 2249
                }
            };
            addList[counter++] = {
                "OBJECTID": line.OBJECTID,
                "geometry": Polyline(polylineJSON)
            };
        }
        return {
            "result": pointID,
            "edit": [{
                "className": "LineLayer",
                "updates": addList
            }]
        };
    } else {
        return pointID;
    }
03-16-2023 02:08 PM
IDEA
My team is constantly asked to show a point's lat/lon as a standard attribute, this idea would simplify those requests and ease maintenance.
03-16-2023 10:53 AM
IDEA
A common request I get is to surface the latitude and longitude of spatial data as attributes, for ease of use in online data tables and pop-ups. The most expedient way to do this is an Arcade expression in either a pop-up or an attribute rule to calculate the values. This currently requires the user to compute the transformation themselves, which is error-prone and difficult to read in the context of an Arcade script. My request is a way to get lat/lon through built-in Arcade functionality, either through functions or properties on the Geometry object type. Supporting only Web Mercator data sources would be great as a first step, but supporting all projections would be ideal, as this increases the portability of all scripts. My use cases would require support in both the Attribute Rules and Popup profiles, but this could be expanded to many more. I'd also take implementing this projection idea as a suitable solution, although adding lat/lon access as a convenience on top of that idea would be even better.
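For context, the manual transformation in question is the spherical Web Mercator inverse. A minimal Python sketch (the function name is mine; in an Arcade expression you'd spell out the same arithmetic with Atan, Exp, and PI, which is exactly the boilerplate a built-in property would replace):

    import math

    R = 6378137.0  # Web Mercator sphere radius in metres (EPSG:3857)

    def merc_to_latlon(x, y):
        """Convert spherical Web Mercator x/y to WGS84 lat/lon in degrees."""
        lon = math.degrees(x / R)
        lat = math.degrees(2.0 * math.atan(math.exp(y / R)) - math.pi / 2.0)
        return lat, lon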
03-16-2023 10:43 AM
POST
When you create a form from a service, it always uses the first layer as a template. If you go to the "settings" tab in your XLSForm and check the "form_id" attribute, you can change it to the appropriate layer name, but you'll have to remap the fields themselves. Hit up the Ideas section if you keep running into this, as I can't think of a way to pick which service layer the form is built from.
03-13-2023 04:28 PM
POST
Ah, much better! I haven't looked too closely at the exact logic of the script, but there are a couple of big issues.

You're attempting to "eval" a SQL expression on line 24, which is generating invalid Python. If you want to apply a filter to a cursor, you should create the SQL string and pass it in as the third parameter to the cursor constructor. This lets you use SQL syntax features, and it filters the data in the database, which saves you some needless Python work. Alternatively, you can rewrite the conditional expression to use Python syntax, something like:

    [row for row in cursor if search_range[0] <= row[0] <= search_range[1]]

That said, a list comprehension like that burns through the entire cursor object, so the code after it won't update the correct row. Overall I think your best bet is to take a step back and rearrange the order you're running your statements in. One trick that might apply here is to use a dictionary comprehension to quickly turn a dataset into a lookup table. Something like:

    lookup = {x[0]: x[1] for x in arcpy.da.SearchCursor(lookup_table, (lookup_key_field, value_field))}

    with arcpy.da.UpdateCursor(update_table, (update_key_field, data_field)) as cursor:
        for key, current_value in cursor:
            new_value = lookup.get(key, current_value)
            if new_value != current_value:  # Avoid needless updates, saves costly db hits
                cursor.updateRow((key, new_value))

That way you do one pass through the lookup table and make matches in memory, saving a ton of database activity. Good luck!
02-23-2023 05:20 PM
POST
Please post your code with the code sample formatting and proper indentation so we can see how things should work on our end, thanks!
02-23-2023 04:29 PM
POST
The Field Calculator also loads datetime fields as native datetime objects, so it'll work there as well. If they need the year as a string in a different format, the strftime method and the format code of their choice should do what they need.
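As a quick illustration of the strftime approach (plain Python; in the Field Calculator the field value already arrives as a datetime, so no parsing is needed):

    from datetime import datetime

    dt = datetime(2023, 2, 23, 16, 23)

    year_int = dt.year                # native attribute, integer 2023
    year_str = dt.strftime("%Y")      # "2023" as a four-digit string
    month_day = dt.strftime("%m-%d")  # any other format codes work the same way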
02-23-2023 04:23 PM
POST
An ESRI product, using malformed JSON? Say it ain't so! Anyway, I did some digging and the demjson package comes up a bunch; I'd say that's your best bet short of building your own parser library.
02-23-2023 04:09 PM
IDEA
I'd expand this to make multi-choice options a full domain type, so that other parts of the platform can edit multiple-choice questions. Bonus points if the choices are stored as bit flags when the backing field is an integer; the Survey123 method of concatenating every choice into the backing field seems wasteful when you can pack 16 choices into a short field.
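To sketch the bit-flag idea (plain Python; the choice list and function names are made up for illustration), each choice maps to one bit, so any combination of up to 16 choices fits in a 16-bit short integer:

    CHOICES = ["water", "sewer", "storm", "electric"]  # up to 16 fit in a short field

    def encode_choices(selected):
        """Pack a multi-choice selection into an integer bit field."""
        value = 0
        for bit, choice in enumerate(CHOICES):
            if choice in selected:
                value |= 1 << bit
        return value

    def decode_choices(value):
        """Unpack a bit field back into the list of chosen options."""
        return [c for bit, c in enumerate(CHOICES) if value & (1 << bit)]

With this scheme a selection like ["water", "storm"] stores as the single integer 5 (bits 0 and 2 set) instead of a concatenated string.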
02-23-2023 11:54 AM
POST
The table structure for Branch Versioned data is much simpler than Traditional Versioning; it's basically a tweaked version of the format used by archive-only tables. If you want to make direct edits via SQL, start by reconciling/posting/deleting all your child versions as appropriate, then add records the way ArcGIS clients do. Off the top of my head, for SQL Server EGDBs:

Inserts add a new record with a new GDB_ARCHIVE_OID, OBJECTID, and GlobalID. The GDB_FROM_DATE is the UTC date of the transaction (IIRC this differs from classic archive tables, which use local time).

Updates add a new record with the same OBJECTID and GlobalID but a new GDB_ARCHIVE_OID. The same rules apply for GDB_FROM_DATE.

Deletes are the tricky one. In my initial testing it looked like you just grab the most recent record, set the delete flag, and fill out the appropriate delete fields. In a later testing session it looked like deletes added a whole new record that copies the final record, but with the delete info set and the GDB_FROM_DATE set to the delete time. Make sure you test thoroughly.

It almost goes without saying, but the state you set should always be 0. If you want to make edits against a specific version, good luck: basically, observe how all your clients adjust the table under a variety of circumstances and then mimic that in SQL. I understand why nobody from ESRI will confirm how the operations work from the DB side, but that would be a good resource. Pretty please?
02-02-2023 09:19 AM
POST
I'll leave some notes here for anybody who stumbles into this from a web search.

With Traditional Versioning and Archiving enabled, the current state of the feature class is tracked in the usual business + delta table combos. If you crack open your RDBMS there'll be an "_H" table which stores a full archive of every change that made it into DEFAULT. This is handy, as the usual methods of fully compressing a versioned table seem to fall apart if archiving is enabled.

If you disable archiving, there's an option to keep the archive. The table has the same "_H" name but it's logically detached from the main table. This seems to make full compression possible again. Disabling versioning from this point leaves behind the business table with the final state of DEFAULT. Enabling Branch Versioning creates the expected table schema and doesn't fiddle with the old archive table.

With this info the transitional workflow appears to be:

1. Kick everyone out of the DB, stop every relevant web service, etc.
2. Get the DEFAULT version to the desired state and blow out all child versions.
3. Disable archiving and confirm the presence of the "_H" table.
4. Disable versioning.
5. Truncate all data.
6. Enable Branch Versioning.
7. ETL the "_H" table into the new schema; this will include the previously truncated data.

I can't guarantee anything, so talk to your ESRI rep if you're migrating a critical dataset and keep a working backup on hand.
01-26-2023 09:08 AM