POST
I have a polygon feature class related to a table via a Relationship Class (one-to-many: GlobalID in the polygon related to a GUID field in the table called REL_ZONE_ID). I am currently updating this related table manually. The related table lists the segments that intersect the polygons (the polygon labels shown here are a shorthand version of the full GlobalIDs; the full versions are stored in the REL_ZONE_ID field). So basically it's a table that keeps track of the lines intersecting a given polygon. I am moving between Pro 3.1 and 3.2, so I can't yet reliably use "FeatureSetByRelationshipClass".

I would like to maintain this table automatically with a Calculation Attribute Rule that fires on INSERT, UPDATE, and DELETE. My approach:

1. Get a FeatureSet of the records that are already related to the original polygon, and mark them for deletion by adding them to a "deletes" array.
2. Review all line segments (time consuming?); if any intersect the newly updated polygon, append a segment ID record to an "adds" array.
3. Return an "edit" record that deletes the originally related records and adds the newly intersected records.

The code below does this, but it's a hack--I needed to add the if (editType != "DELETE") check because otherwise, when a polygon was deleted, the rule would remove the currently related records as expected, but then add the same records back in (except with a NULL related ID, since the deleted polygon no longer has an ID). I thought I wouldn't need the DELETE check because "Intersects()" is being called on $feature, not $originalFeature, so I assumed that if a feature was deleted there couldn't be any intersections with $feature, and therefore no records would be added to the "adds" array. To reiterate: the code below does exactly what I want, I just can't do it without the special DELETE caveat.

/*MANAGE RELATED SEGMENTS*/
//---------------------------------------------------------------------//
// INPUTS
var proVersion = 3.1
var editType = $editContext.editType
// Get original/current polygon feature.
var zone = $originalFeature
var newZone = $feature
// Get all polylines in SEGMENTS feature class. Geometry required for 'Intersects()'.
var segmentsFS = FeatureSetByName($datastore, "SEGMENT", ["SEG_ID"], true)
if (proVersion >= 3.2) {
// Get table records that are related to the current POLYGON feature.
var relRecordsFS = FeatureSetByRelationshipClass(zone, "ZONE_SEG_RELATION", ["*"], false)
} else {
// Get all records from the related table.
// Filter where the related ID matches the GlobalID of the original POLYGON feature.
var relRecsFS = FeatureSetByName($datastore, "SEG_TBL", ["*"], false)
var zoneGlobalID = $originalFeature.GlobalID
var relRecordsFS = Filter(relRecsFS, 'REL_ZONE_ID = @zoneGlobalID')
}
// MAIN ------------------------------------------------------------------------------- #
// Iterate over the currently related table records, adding each to the 'deletes' array.
var deletes = []
for (var relatedRecord in relRecordsFS) {
Console(`ALREADY RELATED / TO DELETE: ${relatedRecord.SEG_ID} | ${relatedRecord.GlobalID}`)
Push(deletes, {"globalID": relatedRecord.GlobalID})
}
// For every SEGMENT line, check if it intersects current ZONE. If so, add to 'adds' array.
var adds = []
if (editType != "DELETE") { // If a feature is deleted, there won't be anything getting added. "newZone" alone isn't working?
for (var seg in segmentsFS) {
if (Intersects(newZone, seg)) {
Console(`FOUND NEW INTERSECTION / TO ADD: ${seg.SEG_ID}`)
Push(adds, {"attributes": {"REL_ZONE_ID": newZone.GlobalID, "SEG_ID": seg.SEG_ID} })
}
}
}
Console(`\nDELETE RECORDS:\n${deletes}\n\nADD RECORDS:\n${adds}`)
return {
'edit': [{
'className': 'SEG_TBL', // This is the TABLE where the edits are occurring.
'deletes': deletes, // [{'globalID': XXX}, {'globalID': YYY}] <-- 2 delete examples
'adds': adds // [{"attributes": {"REL_ZONE_ID": newZone.GlobalID, "SEG_ID": seg.SEG_ID}}, {"attr...] <-- 2 adds
}]
}
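As an aside on the deletes/adds bookkeeping: the same result could be narrowed to only the records that actually changed, rather than delete-all/re-add. A minimal Python sketch of that set-difference idea (the segment IDs are hypothetical, and this is only the bookkeeping, not the Arcade rule itself):

```python
def diff_related(current_ids, intersecting_ids):
    """Return (deletes, adds) needed to sync the related table:
    records to drop vs. records to create."""
    current = set(current_ids)
    new = set(intersecting_ids)
    deletes = sorted(current - new)  # related, but no longer intersecting
    adds = sorted(new - current)     # intersecting, but not yet related
    return deletes, adds

# A deleted polygon intersects nothing, so 'new' is empty and every
# currently related record lands in 'deletes'.
deletes, adds = diff_related(["SEG-1", "SEG-2"], ["SEG-2", "SEG-3"])
# deletes == ["SEG-1"], adds == ["SEG-3"]
```

Records that are still valid are left untouched, which can matter if anything else references the related rows by GlobalID.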
Posted 2 weeks ago

POST
Hey @DrewFlater, a follow up to this. I just tried to use arcpy's AddField with a "BIGINTEGER" data type, and ran into a similar issue as above: ERROR 000800: The value is not a member of TEXT | FLOAT | DOUBLE | SHORT | LONG | DATE | BLOB | RASTER | GUID.
Failed to execute (AddField). I'm not going to open a Pro Idea because... this issue of new data types clearly hasn't been addressed in a lot of places, so the "idea" would be extremely general. I'll leave the prioritization decisions to you all. Thanks again for the initial reply.
Posted 3 weeks ago

POST
Fair enough, I appreciate the insights. I'll go ahead and play around with replacing the string in FeatureSetByName. Thanks again!
Posted 3 weeks ago

POST
Thanks for the inputs @RPGIS, but the code I have provided already "works"--even without specifying the line below. Maybe that's because the Arcade "environment" knows that all feature classes/tables have an ObjectID, so you don't need to specify it? Regardless, it seems inconsistent. But I don't have it listed in my version, and things work as expected.

var OID = $feature.OBJECTID

My issue is that I don't want to have to specify the field name three different times in the same code. I'm not saying Python is without problems, but coming from Python, there would never be a case where I would have to hard-code a string three different times in the same extremely small block of code. I would assign the field name string to a variable, and then use that variable in every instance where it is needed--if the field name changes, I change the single variable assignment, and it updates everywhere. In my example, and in both of your examples, we are specifying the same field name "Name_Text" three different times in the same tiny script. If this is a bug, fine, but if this is how the language was designed... why?
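For comparison, the Python pattern described above--define the field name once, reference it everywhere--looks something like this (all names here are illustrative):

```python
# The field name lives in exactly one place; everything else references it.
FIELD_NAME = "Name_Text"

def build_query(field: str, value: str) -> str:
    """Compose a simple attribute query from a field name and value."""
    return f"{field} = '{value}'"

row = {"Name_Text": "TEST", "OBJECTID": 1}
value = row[FIELD_NAME]                 # read by variable, not by literal
query = build_query(FIELD_NAME, value)  # reused without retyping the string
# query == "Name_Text = 'TEST'"
```

If the field is ever renamed, only the FIELD_NAME assignment changes.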
Posted 3 weeks ago

POST
I am trying to write a Calculation-style Attribute Rule to restrict entries in a field to unique values within that field. As an example, no more than one row in the given table can have a value of "TEST" in the given field--entries must be unique. The code below achieves this (it might not handle NULLs, but I can sort that out).

What is constantly irritating to me is that I cannot figure out how to use the variables defined on the first few lines in the "FeatureSetByName()" and "Expects()" function calls. Updating the FeatureSetByName() call with the fcName and fldName variables will work when I hit "OK" in the AR code editor, but then fail when I try to save the rule. Updating Expects() to use the fldName variable throws an error right in the editor. Why do these functions work this way? If I change the name of my field, I have to go in and retype it three different times in the code? Why can I not just use one single variable for those names? Is this some Arcade/JavaScript idiosyncrasy I just don't understand? Thanks in advance to anyone who can sort me out on this.

The point is to create a rule that can be used in different locations, with minimal updates to only the INPUTS section. Esri's example (link in the code below) hard-codes things, which is silly from a reusability standpoint.

// INPUTS
var fcName = "TEST";
var fldName = "Name_Text";
// These two lines must be written with the actual strings, not the variables above.
var features = FeatureSetByName($datastore, "TEST", ["Name_Text"], false);
Expects($feature, "Name_Text");
// MAIN
// If any values in the field match the current row's value (and the IDs are different),
// this is a duplicate, and an error.
// Largely copied from:
// https://support.esri.com/en-us/knowledge-base/how-to-identify-a-duplicate-field-value-using-an-attrib-000029088
for (var i in features) {
if ((i[fldName] == $feature[fldName]) && (i.OBJECTID != $feature.OBJECTID)) {
return {"errorMessage": "ERROR: value not unique within field!"}
}
}
// If error above not found, return results dict with value as-is.
var fldUpdates = Dictionary()
fldUpdates[fldName] = $feature[fldName]
Console({"result": {"attributes": fldUpdates}})
return {"result": {"attributes": fldUpdates}}
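The duplicate-check loop above translates to Python almost line for line; here is a hedged sketch with a plain list of dictionaries standing in for the FeatureSet (names are illustrative, not Arcade API):

```python
def is_duplicate(rows, fld_name, current_value, current_oid):
    """True if any *other* row already holds current_value in fld_name."""
    for row in rows:
        if row[fld_name] == current_value and row["OBJECTID"] != current_oid:
            return True
    return False

rows = [{"OBJECTID": 1, "Name_Text": "TEST"},
        {"OBJECTID": 2, "Name_Text": "OTHER"}]
is_duplicate(rows, "Name_Text", "TEST", 3)   # True: "TEST" is taken by OID 1
is_duplicate(rows, "Name_Text", "TEST", 1)   # False: same row, not a duplicate
```

Note the OID comparison, which keeps a row from flagging itself as its own duplicate.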
Posted 3 weeks ago

POST
Anyone have any idea why you can sort a table based on a DATE field, but not a DATE ONLY field? I have both field types in the example feature class below, but when it comes to the Sort tool, only the DATE field is an option. Simple oversight by Esri, or am I missing something?
Posted 4 weeks ago

POST
Appreciate the input, thanks! This option will have to do for now. Here's what my implementation looks like: HHMM is a TEXT field that the user types in, and HH is a FLOAT field that gets calculated. I figured reading HH would be more intuitive than MM, but the concept is the same. I try to use the same "template" for Attribute Rules, so this is probably overly verbose, but here's the automatic calculation from HH:MM text to HH float:

/*
Converts a text HH:MM format field and converts it to a float HH field.
*/
//---------------------------------------------------------------------//
// FUNCTIONS
function calculateHH(hhmm) {
// Set up results dict. If all checks pass, return this as-is.
var result = Dictionary("VALUE", null, "ERROR", null)
if (IsEmpty(hhmm)) {
return result
}
Console(hhmm)
var timeArr = Split(hhmm, ":")
var hh = Number(timeArr[0]) + (Number(timeArr[1]) / 60)
Console(`${hhmm} --> ${hh}`)
result["VALUE"] = Round(hh, 2)
return result
}
//---------------------------------------------------------------------//
// INPUTS
var hhmm = $feature.Hiking_HHMM
//---------------------------------------------------------------------//
// MAIN
var fldUpdates = Dictionary()
// Convert HH:MM text to HH float ----------------
var result = calculateHH(hhmm)
if (!IsEmpty(result["ERROR"]))
{return {"errorMessage": `ERROR: ${result}`}}
else
{fldUpdates["Hiking_HH"] = result["VALUE"]}
Console({"result": {"attributes": fldUpdates}})
return {"result": {"attributes": fldUpdates}}

And here's my solution for validating the initial HH:MM inputs:

/*
Script provides validation for a TEXT field that should contain a time duration
in the format H?H?:MM. There are several checks, since RegEx is not available.
1) Blank strings are not allowed; null values are fine
2) Format must contain a colon--as in 3:30, 12:04, :37
3) Values on either side of the colon must be numeric
4) Hours are optional, but must be 0 - 23
5) Minutes must always be two characters, and numeric values 0 - 59
This script is meant to be used as Calculation type AR, rather than a
Constraint AR.
*/
//---------------------------------------------------------------------//
// FUNCTIONS
function timeValidate (time) {
// Set up results dict. If all checks pass, return this as-is.
var result = Dictionary("VALUE", time, "ERROR", null)
// First check empty, then null. Can't do both simultaneously nicely.
if ((TypeOf(time) == "String") && Trim(time) == "") {
result["ERROR"] = "Blank input is not allowed."
return result
}
else if (IsEmpty(time)) {
return result
}
// Console(timeArray, Count(timeArray));
// Must contain one ":", yields two elements with Split().
if (Count(timeArray) != 2) {
result["ERROR"] = "Format must contain one colon."
return result
}
// Check if both values in the array are numbers.
for (var i in timeArray) {
var tPart = timeArray[i]
Console(`PART: ${i}, VAL: ${tPart}, ASNUM: ${Number(tPart)}, ISNAN: ${IsNan(Number(tPart))}`)
// If Number returns NaN, "Casting a non-numeric text or undefined to a number".
if (IsNan(Number(tPart))) {
result["ERROR"] = "At least one value is not numeric."
return result
}
// Hours validation, the first array item.
if (i == 0) {
Console(`VALIDATING HOURS: ${tPart}`)
// Comparison operators coerce Text to Number for comparison.
if (tPart > 23 || tPart < 0) {
result["ERROR"] = "Hours must be 0-23."
return result
}
}
// Minutes validation, the second array item.
else if (i == 1) {
Console(`VALIDATING MINUTES: ${tPart}`)
if (Count(tPart) != 2) {
result["ERROR"] = "Minutes must be two digits."
return result
}
else if (tPart > 59 || tPart < 0) {
result["ERROR"] = "Minutes must be 0-59."
return result
}
}
}
return result
}
//---------------------------------------------------------------------//
// INPUTS
// Expects($feature, "Shape_Length", "MPH", "Miles")
var time = $feature.Hiking_HHMM
// DERIVED
var timeArray = Split(time, ":")
Console(`TIME ARR: ${timeArray}`)
//---------------------------------------------------------------------//
// MAIN
var fldUpdates = Dictionary()
// result = {"VALUE": time, "ERROR": error string OR null}
var result = timeValidate(time)
// If an error is returned in results dict, show and make no changes.
if (!IsEmpty(result["ERROR"]))
{return {"errorMessage": `ERROR: ${result} Please use H?H?:MM format.`}}
// Else, add the field update information to the running updates log.
else
{fldUpdates["Hiking_HHMM"] = time}
Console({"result": {"attributes": fldUpdates}})
return {"result": {"attributes": fldUpdates}}
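For what it's worth, the same validate-and-convert logic ports to plain Python in a few lines. This sketch (function and variable names are mine, not taken from the rules above) folds both rules into one function:

```python
def hhmm_to_hours(hhmm):
    """Validate an 'H?H?:MM' string and convert it to decimal hours.
    Returns None for null input; raises ValueError on bad input."""
    if hhmm is None:
        return None
    parts = hhmm.split(":")
    if len(parts) != 2:
        raise ValueError("Format must contain one colon.")
    hh_txt, mm_txt = parts
    # Hours are optional (e.g. ':37'); non-numeric parts raise ValueError
    # from int() itself.
    hours = int(hh_txt) if hh_txt else 0
    minutes = int(mm_txt)
    if not 0 <= hours <= 23:
        raise ValueError("Hours must be 0-23.")
    if len(mm_txt) != 2 or not 0 <= minutes <= 59:
        raise ValueError("Minutes must be two digits, 0-59.")
    return round(hours + minutes / 60, 2)

hhmm_to_hours("3:30")   # 3.5
hhmm_to_hours(":37")    # 0.62
```

Arcade lacks exceptions, hence the "ERROR" slot in the result dictionary above; in Python, raising is the idiomatic equivalent.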
Posted 03-14-2024 09:45 PM

POST
For "data" reasons, a great solution--at the resolution I require, even minutes would work. My issue is that if a user is transcribing a record from elsewhere, say "7:37", they have to do the math to convert those hours into minutes (or hours and minutes into seconds, if seconds are my unit). Not a huge deal, but an extra step, and possibly error-prone. The other thing is quickly reading the table. I want to be able to read these records at a glance. Number of seconds (beyond a minute or two) means virtually nothing to me, and even minutes can get difficult: 407 minutes takes me a little while to process. So I'd really want that to show as HH:MM. And I don't know of a table "display" setting that would allow the math required to calculate that from seconds/minutes alone, and an Attribute Rule would... change the actual underlying data value? Plus, an Attribute Rule returning a value with the colon would be problematic with a DOUBLE field type. Storing as minutes is probably going to be the only solution here, but it's annoying that it can't be displayed in an easily readable way. My best bet might be to use an Attribute Rule to calculate the minutes into a SHORT/DOUBLE field from a TEXT field. So the user can type in 5:49 or whatever in the TEXT field, and then an Attribute Rule calculates the raw number of minutes into a different numeric field, and then valid sorting can take place on the MINUTES field. Appreciate the comment, thanks!
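The minutes-to-display math mentioned here is a one-liner in most languages. A Python sketch of turning a stored minute count back into a readable label (purely illustrative; this is not something Pro's table display can run natively):

```python
def minutes_to_hhmm(total_minutes):
    """Render a raw minute count as an at-a-glance H:MM label."""
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours}:{minutes:02d}"

minutes_to_hhmm(407)  # '6:47'
```

So 407 minutes, hard to process as a number, reads immediately as 6:47.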
Posted 03-14-2024 07:15 PM

POST
I would like to store elapsed time as an attribute in a feature class, displayed as HH:MM (drive time, for example). I have tried several approaches, and none of them are great.

"Text" Data Type - The user inputs elapsed time as HH:MM, and an Attribute Rule checks that the input fits this format. Works fine, but obviously won't sort in ascending/descending order, and I can't easily add/subtract time from the data. Visually good, but the "data" doesn't really mean anything.

"Time Only" Data Type - This isn't really appropriate either, as my "time" data isn't wall time, but elapsed time. This field type doesn't allow me to store 0:30, for example. Also, stored time can't be larger than 23:59.

"Short" Data Type - I have this set to display as ##:##, so if a user inputs 337, this shows as 3:37. This field type allows sorting, which is ideal. But the underlying data is now pretty meaningless once it's separated from its nice HH:MM display.

So I'm hoping to come up with a way of entering elapsed time as "HH:MM" in a field that also stores the underlying data in a meaningful way (quantifiable, sortable, etc.). I would be happy to use Attribute Rules to assist with this.
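To illustrate the "enter it readable, store it sortable" idea, here is a small Python sketch: parse the user's HH:MM text into a total-minute integer, which then sorts correctly (names are illustrative, and input validation is omitted for brevity):

```python
def hhmm_to_minutes(hhmm):
    """Convert an 'HH:MM' elapsed-time string to a sortable minute count."""
    hh, mm = hhmm.split(":")
    return int(hh) * 60 + int(mm)

# Sorting the text values by their numeric equivalent fixes the
# lexicographic-sort problem ("12:04" < "3:37" as text).
sorted(["3:37", "0:30", "12:04"], key=hhmm_to_minutes)
# ['0:30', '3:37', '12:04']
```

The stored integer is also meaningful data: durations can be added, subtracted, and averaged directly.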
Posted 03-14-2024 04:05 PM

POST
Your point about data types (String vs. NoneType) is an interesting one I hadn't considered, thanks for that. I can't replicate Pro (3.X) throwing an error regarding "nulling" a non-nullable field though, unless you're talking about using Field Calculator to "<Null>" a field, in which case it does yield an error. Thanks again for the detailed response.
Posted 03-04-2024 03:39 PM

POST
I am curious if/when other users are implementing "non-nullable" fields in their feature class/table design? If so, why? What does it achieve/what are you trying to achieve? I ask because I don't understand the utility of this setting.

If I have a TEXT field that is "non-nullable", ArcGIS Pro will just throw in an empty string (maybe a space " "; my point is the same) as I am digitizing features. If I have a non-nullable numeric field, ArcGIS Pro will populate it with a "0". Date field, 12/31/1899. So, while the field is technically "non-nullable", this doesn't really help me from a data-integrity standpoint. If Pro would throw an error during editing that would prevent saving a record (similar to an Attribute Rule that fails, or a correct domain value not being chosen), I would understand the purpose. But it doesn't throw an error--it just fills in a generic value. The worst offenders are numeric fields, where Pro will populate a "0", which to me is much harder to detect as an issue than simply allowing a <NULL> in that field, indicating the record should be reviewed. At least 12/31/1899 stands out as a weird value that might require further review.

Is the point simply to prevent NULL values because... they are not acceptable in a particular database for some reason (why not? why is "" and "0" filler data better?)? Does the nullability setting result in a different experience for Field Maps users, where it actually makes more sense?

For whatever it's worth, I am speaking mostly of File Geodatabases here, but I do work regularly in Enterprise Geodatabases (SQL Server). If the behavior is different or makes more sense in an EGDB, I would be happy to hear about that too. Thanks, genuinely curious how folks are using the "nullability" setting.
Posted 03-04-2024 11:37 AM

POST
I think I understand what you're after. I reworked some of the code internals to simplify things, so I'd try copying this version rather than updating anything you're currently using (unless you're comfortable doing that). The outputs should be the same, with the addition of the new label field.

I'm using a label expression to show the "point_id" + "quadrant" labels. This takes the angle of the line (say, 45 degrees) and then assigns the quadrant (NE). It then inverts that to get the opposing quadrant (SW). The code is set up so that if the line is perfectly north-south or east-west (90 degrees, 180 degrees), those labels will be "N" and "S" or "W" and "E", without the secondary direction. Let me know if something doesn't work! Here's a W-E example, which I suppose would only occur... ~1% of the time?

"""
RANDOM ANGLE TRANSECT LINES CENTERED ON SAMPLING POINTS
"""
#-----------------------------------------------------------------------------#
# IMPORTS
import math
import os
import random
import sys
from typing import Literal, Union, Tuple
import arcpy
#-----------------------------------------------------------------------------#
# INPUTS
# Full path to the geodatabase where the sampling points are, and where outputs will go.
workspace = r"your_full_path\geodatabase_name.gdb"
# Name of the point feature class in the above geodatabase that houses the
# sampling points.
in_pnt_fc = "_SAMPLE_POINT"
# Distance between sampling points (along x and y, not diagonals).
# Units match coordinate system.
point_spacing = 400 # feet, in my case
box_width = 80
# NORTH_BOX: squares will be oriented to grid-north
# NORTH_DIAMOND: diamonds will be oriented to grid-north
# ANGLE_BOX: square sides oriented to the transect lines
# ANGLE_DIAMOND: diamonds angled to the transect lines
box_rotation = 'ANGLE_BOX' # OPTIONS: (NORTH_BOX, NORTH_DIAMOND, ANGLE_BOX, ANGLE_DIAMOND)
# Name of the output feature class to be created.
fc_line = "TRANSECT_LINE"
fc_endpoints = "TRANSECT_LINE_ENDPOINTS"
fc_box = f"TRANSECT_{box_rotation}"
# -----------------------------------------------------------------------------#
# SETTINGS
# This just overwrites previous outputs to help with testing.
arcpy.env.overwriteOutput = True
#-----------------------------------------------------------------------------#
# FUNCTIONS
def rotate_point(coordinate: tuple[int, int],
pivot_point: tuple[int, int],
angle: Union[int, float],
angle_format: Literal['DEGREES', 'RADIANS']='DEGREES'
) -> tuple[float, float]:
"""Rotates coordinate values around another point.
:coordinate: (x, y) coordinate pair to be rotated
:pivot_point: (x, y) coordinate pair around which point will be rotated
:angle: angle to rotate the coordinate
:angle_format: angle format, RADIANS or DEGREES
|(x_prime, y_prime)| rotated point coordinates"""
if angle_format == 'DEGREES':
angle = math.radians(angle)
x, y = coordinate
X, Y = pivot_point
x_trans = x - X
y_trans = y - Y
sin = math.sin(angle)
cos = math.cos(angle)
# Counter-clockwise rotation:
x_transprime = cos * x_trans - sin * y_trans
y_transprime = sin * x_trans + cos * y_trans
x_prime = x_transprime + X
y_prime = y_transprime + Y
# print(f"XY TRANS: {x_trans:,.0f}, {y_trans:,.0f}\n"
# f"XY TRANSPRIME: {x_transprime:,.0f}, {y_transprime:,.0f}\n"
# f"XY PRIME: {x_prime:,.0f}, {y_prime:,.0f}\n")
return (x_prime, y_prime)
def create_box(method: Literal['NORTH_BOX', 'NORTH_DIAMOND', 'ANGLE_BOX', 'ANGLE_DIAMOND'],
center_pnt: arcpy.Point,
width: Union[int, float],
angle: Union[int, float],
spat_ref: int) -> arcpy.Polygon:
"""Create a square box around a given point. Optionally rotate it.
:center_pnt: centroid of the square box
:width: width of side of box
:angle: angle the box should be rotated
|box_geom| geometry object representing the box"""
angle_dict = {'NORTH_BOX': 45,
'NORTH_DIAMOND': 0,
'ANGLE_BOX': angle + 45,
'ANGLE_DIAMOND': angle}
# Distance to move point based on known hypotenuse (1/2 width of box).
offset = math.sqrt(2 * ((width / 2) ** 2))
x, y = center_pnt.X, center_pnt.Y
Nx, Ny = x, y + offset
Ex, Ey = x + offset, y
Sx, Sy = x, y - offset
Wx, Wy = x - offset, y
box_corners = []
for coordinate in [(Nx, Ny), (Ex, Ey), (Sx, Sy), (Wx, Wy), (Nx, Ny)]:
box_corners.append(rotate_point(coordinate=coordinate,
pivot_point=(x, y),
angle=angle_dict[method],
angle_format='DEGREES'))
pnt_array = arcpy.Array([arcpy.Point(*corner) for corner in box_corners])
box_geom = arcpy.Polygon(pnt_array, spatial_reference=spat_ref)
return box_geom
def get_angle_quadrant(a):
"""As previously defined, angle resctricted [1-360]."""
# Check for 90 degree cardinal directions.
if a == 90:
return "N"
elif a == 180:
return "W"
elif a == 270:
return "S"
elif a == 360:
return "E"
# Determine non-90 quadrants.
if 0 < a < 90:
return "NE"
elif 90 < a < 180:
return "NW"
elif 180 < a < 270:
return "SW"
elif 270 < a < 360:
return "SE"
else:
raise ValueError("Angle value must be numeric and between 1-360, inclusive.")
#-----------------------------------------------------------------------------#
# MAIN
in_points = os.path.join(workspace, in_pnt_fc)
transect_lines = []
sample_box_dict = {}
pnt_desc = arcpy.da.Describe(in_points)
pnt_oid_fld = pnt_desc['OIDFieldName']
pnt_spat_ref = pnt_desc['spatialReference'].factoryCode
with arcpy.da.SearchCursor(in_points, [pnt_oid_fld, 'SHAPE@XY']) as scurs:
for oid, (x, y) in scurs:
# Random angle in degrees, 1-360; convert to radians.
angle_deg = random.randint(1, 360)
angle_rad = math.radians(angle_deg)
print(f"POINT {oid}, ANGLE: {angle_deg}")
# Get x/y offset of random point on the imaginary circle around each point.
start_x = (point_spacing/2) * math.cos(angle_rad)
start_y = (point_spacing/2) * math.sin(angle_rad)
# Using the sample point coordinate as the starting point,
# calculate real-world x and y of both start and end of transect line.
start_point = arcpy.Point(x + start_x, y + start_y)
end_point = arcpy.Point(x - start_x, y - start_y)
# -S/-E suffixes differentiate start/end boxes. Not currently used, but
# documented just in case (also, unique keys are required).
sample_box_dict[f'{oid}-S'] = create_box(box_rotation, start_point,
box_width, angle_deg, pnt_spat_ref)
sample_box_dict[f'{oid}-E'] = create_box(box_rotation, end_point,
box_width, angle_deg, pnt_spat_ref)
# Create a Polyline geometry object (same spatial reference as the input
# points, matching how the boxes are built). Append dict entry to list.
new_line = arcpy.Polyline(arcpy.Array([start_point, end_point]),
spatial_reference=pnt_spat_ref)
transect_lines.append({'OID': oid, 'GEOM': new_line, 'ANGLE': angle_deg})
# Create three feature classes.
for fc, geom in [(fc_line, 'POLYLINE'), (fc_endpoints, 'POINT'), (fc_box, 'POLYGON')]:
arcpy.management.CreateFeatureclass(out_path=workspace, out_name=fc,
geometry_type=geom, spatial_reference=pnt_spat_ref)
print(f"CREATED FC {fc}")
# FC paths.
ln_fc = os.path.join(workspace, fc_line)
pt_fc = os.path.join(workspace, fc_endpoints)
box_fc = os.path.join(workspace, fc_box)
# Add an ID field to the line, point, and box FCs.
for fc in (ln_fc, pt_fc, box_fc):
arcpy.management.AddField(fc, 'POINT_OID', 'SHORT')
# Add angle field to the line FC.
arcpy.management.AddField(ln_fc, 'LINE_ANGLE', 'SHORT')
# Add the point type and the point angle to the point FC.
arcpy.management.AddField(pt_fc, 'POINT_TYPE', 'TEXT', field_length=5)
arcpy.management.AddField(pt_fc, 'POINT_QUAD', 'TEXT', field_length=5)
print("ALL FIELDS ADDED")
# Write transect line features----------------------------------#
with arcpy.da.InsertCursor(ln_fc, ['SHAPE@', 'POINT_OID', 'LINE_ANGLE']) as icurs:
for feature_dict in transect_lines:
icurs.insertRow([feature_dict['GEOM'],
feature_dict['OID'],
feature_dict['ANGLE']])
print("TRANSECTS WRITTEN")
# Write endpoint features----------------------------------#
invert_quad = {"NE": "SW", "SE": "NW", "SW": "NE", "NW": "SE",
"N": "S", "S": "N", "E": "W", "W": "E", }
with arcpy.da.InsertCursor(pt_fc, ['SHAPE@', 'POINT_OID', 'POINT_TYPE', 'POINT_QUAD']) as icurs:
for feature_dict in transect_lines:
start_pnt = feature_dict['GEOM'].firstPoint
end_pnt = feature_dict['GEOM'].lastPoint
angle = feature_dict['ANGLE']
quadrant_1 = get_angle_quadrant(angle)
quadrant_2 = invert_quad[quadrant_1]
icurs.insertRow([start_pnt, feature_dict['OID'], "START", quadrant_1])
icurs.insertRow([end_pnt, feature_dict['OID'], "END", quadrant_2])
print("ENDPOINTS WRITTEN")
# Create Sampling Zones----------------------------------#
# Transect Line Oriented
with arcpy.da.InsertCursor(box_fc, ['SHAPE@', 'POINT_OID']) as icurs:
for oid, geom in sample_box_dict.items():
icurs.insertRow([geom, oid.split('-')[0]])
print("BOXES WRITTEN")
Posted 11-27-2023 12:27 PM

POST
I tested several methods of adding fields to a blank feature class in an attempt to overcome the sluggish process that is "AddField" in a for loop. Posting all the code might make this unreadable, so here is a summary. All tests result in the same output**--field names are the same, domains are the same, etc. In this example, I am adding 500 fields to a newly created feature class. There is no data involved here; this is schema setup only.

Methods & times (500 fields):
1. Add fields in a for loop to a feature class already saved to file – 6:19
2. Add fields in a for loop to a memory feature class, then use CreateFeatureclass with the memory copy as the template – 0:56
3. Use AddFields (the batch version with the "s") – 4:56
4. FieldMappings() to create fields, ExportFeatures to file, AssignDomainToField in a loop – 2:09
5. FieldMappings() to create fields, ExportFeatures to memory, AssignDomainToField, CreateFeatureclass to file using memory as template – 0:19

So, in my experiments anyway, using a FieldMappings object and working in memory for as long as possible is the absolute fastest. BUT, simply adding fields to a memory feature class and using that as a template during the export to file is probably the simplest solution in terms of lines of code and maintenance.

Results from a more reasonable 40 fields, in the same order as above: 0:25, 0:02, 0:14, 0:09, 0:02. So even when generating a handful of feature classes with a more modest number of fields, it might be best to avoid using AddField in a for loop on a feature class that is already saved to file.

**AddFields does not allow the user to specify Scale and Precision, and I didn't see a quick way to modify those attributes after the fact. In a FGDB it is irrelevant, but this might matter in an enterprise environment.

Here's an example of the randomly generated fields for three of the five output methods (all three should be the same).
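For anyone re-running these comparisons on their own data, a tiny stopwatch helper might help keep the timings consistent. This is pure Python (no arcpy); the M:SS formatting matches the times quoted above, and the example workload is just a stand-in for a real AddField loop:

```python
import time

def stopwatch(func, *args, **kwargs):
    """Run func once and return (result, elapsed as 'M:SS')."""
    start = time.perf_counter()
    result = func(*args, **kwargs)
    elapsed = time.perf_counter() - start
    minutes, seconds = divmod(int(round(elapsed)), 60)
    return result, f"{minutes}:{seconds:02d}"

# Trivial stand-in workload; swap in e.g. a lambda wrapping an AddField loop.
result, elapsed = stopwatch(lambda: sum(range(1000)))
# result == 499500; elapsed will be something like '0:00' for this toy case
```

Wrapping each of the five methods in a function and passing them through the same helper keeps the comparison apples-to-apples.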
Posted 08-08-2023 04:12 PM