POST
Attribute Rules may be what you need, but without seeing an example of your field schema and an example of your intended inputs, it's hard to say for sure.
Posted 11-22-2022 08:34 AM

POST
See another version below. This one will save your points too, and it also creates your sampling boxes. There are four different methods you can use (see the "box_rotation" variable in the INPUTS section). Remember to update all of the "INPUTS" based on your needs. You should be able to paste the code into the Python window again. Only one version of the transect lines is shown (lines for the 'ANGLE_DIAMOND'-style boxes), but you can see all four styles of the sampling boxes. Some different sizes and distances from the origin points are also shown.

#-----------------------------------------------------------------------------#
# IMPORTS
import math
import os
import random
import sys
from typing import Literal, Union
import arcpy
#-----------------------------------------------------------------------------#
# INPUTS
# Full path to the geodatabase where the sampling points are, and where outputs will go.
workspace = r"C:\TEMP\Random_Sample_Lines.gdb"
# Name of the point feature class in the above geodatabase that houses the
# sampling points.
in_pnt_fc = "_SAMPLE_POINT"
# Distance between sampling points (along x and y, not diagonals).
# Units match coordinate system.
point_spacing = 400 # feet, in my case
box_width = 50  # Width of each side of the sampling boxes. Units match coordinate system.
box_rotation = 'ANGLE_BOX'  # One of: 'NORTH_BOX', 'NORTH_DIAMOND', 'ANGLE_BOX', 'ANGLE_DIAMOND'
# Name of the output feature class to be created.
fc_line = "TRANSECT_LINE"
fc_endpoints = "TRANSECT_LINE_ENDPOINTS"
# Leave this alone if you are not familiar with the f-string notation.
fc_box = f"TRANSECT_{box_rotation}"
# -----------------------------------------------------------------------------#
# SETTINGS
# This just overwrites previous outputs to help with testing.
arcpy.env.overwriteOutput = True
#-----------------------------------------------------------------------------#
# FUNCTIONS
def rotate_point(coordinate: tuple[int, int],
                 pivot_point: tuple[int, int],
                 angle: Union[int, float],
                 angle_format: Literal['DEGREES', 'RADIANS'] = 'DEGREES'
                 ) -> tuple[int, int]:
    """Rotates coordinate values around another point.
    :coordinate: (x, y) coordinate pair to be rotated
    :pivot_point: (x, y) coordinate pair around which point will be rotated
    :angle: angle to rotate the coordinate
    :angle_format: angle format, RADIANS or DEGREES
    |(x_prime, y_prime)| rotated point coordinates"""
    if angle_format == 'DEGREES':
        angle = math.radians(angle)
    x, y = coordinate
    X, Y = pivot_point
    x_trans = x - X
    y_trans = y - Y
    sin = math.sin(angle)
    cos = math.cos(angle)
    # Counter-clockwise rotation:
    x_transprime = cos * x_trans - sin * y_trans
    y_transprime = sin * x_trans + cos * y_trans
    x_prime = x_transprime + X
    y_prime = y_transprime + Y
    # print(f"XY TRANS: {x_trans:,.0f}, {y_trans:,.0f}\n"
    #       f"XY TRANSPRIME: {x_transprime:,.0f}, {y_transprime:,.0f}\n"
    #       f"XY PRIME: {x_prime:,.0f}, {y_prime:,.0f}\n")
    return (x_prime, y_prime)
def create_box(method: Literal['NORTH_BOX', 'NORTH_DIAMOND', 'ANGLE_BOX', 'ANGLE_DIAMOND'],
               center_pnt: arcpy.Point,
               width: Union[int, float],
               angle: Union[int, float],
               spat_ref: int) -> arcpy.Polygon:
    """Create a square box around a given point. Optionally rotate it.
    :method: box style; controls how the box is rotated
    :center_pnt: centroid of the square box
    :width: width of side of box
    :angle: angle the box should be rotated
    :spat_ref: spatial reference code for the output geometry
    |box_geom| geometry object representing the box"""
    angle_dict = {'NORTH_BOX': 45,
                  'NORTH_DIAMOND': 0,
                  'ANGLE_BOX': angle + 45,
                  'ANGLE_DIAMOND': angle}
    # Distance to move point based on known hypotenuse (1/2 width of box).
    offset = math.sqrt(2 * ((width / 2) ** 2))
    x, y = center_pnt.X, center_pnt.Y
    Nx, Ny = x, y + offset
    Ex, Ey = x + offset, y
    Sx, Sy = x, y - offset
    Wx, Wy = x - offset, y
    box_corners = []
    for coordinate in [(Nx, Ny), (Ex, Ey), (Sx, Sy), (Wx, Wy), (Nx, Ny)]:
        box_corners.append(rotate_point(coordinate=coordinate,
                                        pivot_point=(x, y),
                                        angle=angle_dict[method],
                                        angle_format='DEGREES'))
    pnt_array = arcpy.Array([arcpy.Point(*corner) for corner in box_corners])
    box_geom = arcpy.Polygon(pnt_array, spatial_reference=spat_ref)
    return box_geom
#-----------------------------------------------------------------------------#
# MAIN
in_points = os.path.join(workspace, in_pnt_fc)
transect_lines = []
line_endpoint_dict = {}
sample_box_dict = {}
pnt_desc = arcpy.da.Describe(in_points)
pnt_oid_fld = pnt_desc['OIDFieldName']
pnt_spat_ref = pnt_desc['spatialReference'].factoryCode
with arcpy.da.SearchCursor(in_points, [pnt_oid_fld, 'SHAPE@XY']) as scurs:
    for oid, (x, y) in scurs:
        # Random angle in degrees, 1-360; convert to radians.
        angle_deg = random.randint(1, 360)
        angle_rad = math.radians(angle_deg)
        print(f"POINT {oid}, ANGLE: {angle_deg}")
        # Get x/y offset of random point on the imaginary circle around each point.
        start_x = (point_spacing / 2) * math.cos(angle_rad)
        start_y = (point_spacing / 2) * math.sin(angle_rad)
        # Using the sample point coordinate as the starting point,
        # calculate real-world x and y of both start and end of transect line.
        start_point = arcpy.Point(x + start_x, y + start_y)
        end_point = arcpy.Point(x - start_x, y - start_y)
        # -S/-E suffixes differentiate start/end boxes. Not currently used, but
        # documented just in case (also, unique keys are required).
        sample_box_dict[f'{oid}-S'] = create_box(box_rotation, start_point,
                                                 box_width, angle_deg, pnt_spat_ref)
        sample_box_dict[f'{oid}-E'] = create_box(box_rotation, end_point,
                                                 box_width, angle_deg, pnt_spat_ref)
        # Create a Polyline Geometry Object. Append dict entry to list.
        new_line = arcpy.Polyline(arcpy.Array([start_point, end_point]))
        transect_lines.append({'OID': oid, 'GEOM': new_line, 'ANGLE': angle_deg})
        # Save start and end Point Geometry objects to another dict.
        line_endpoint_dict[oid] = (arcpy.PointGeometry(start_point),
                                   arcpy.PointGeometry(end_point))
# Create three feature classes.
for fc, geom in [(fc_line, 'POLYLINE'), (fc_endpoints, 'POINT'), (fc_box, 'POLYGON')]:
    arcpy.management.CreateFeatureclass(out_path=workspace, out_name=fc,
                                        geometry_type=geom, spatial_reference=pnt_spat_ref)
    print(f"CREATED FC {fc}")
# FC paths.
ln_fc = os.path.join(workspace, fc_line)
pt_fc = os.path.join(workspace, fc_endpoints)
box_fc = os.path.join(workspace, fc_box)
# Add fields.
for fc in (ln_fc, pt_fc, box_fc):
    arcpy.management.AddField(fc, 'POINT_OID', 'SHORT')
arcpy.management.AddField(ln_fc, 'LINE_ANGLE', 'SHORT')
arcpy.management.AddField(pt_fc, 'POINT_TYPE', 'TEXT', field_length=5)
print("ALL FIELDS ADDED")
# Write transect line features.
with arcpy.da.InsertCursor(ln_fc, ['SHAPE@', 'POINT_OID', 'LINE_ANGLE']) as icurs:
    for feature_dict in transect_lines:
        icurs.insertRow([feature_dict['GEOM'],
                         feature_dict['OID'],
                         feature_dict['ANGLE']])
print("TRANSECTS WRITTEN")
# Write line start and end points.
with arcpy.da.InsertCursor(pt_fc, ['SHAPE@', 'POINT_OID', 'POINT_TYPE']) as icurs:
    for oid, (start_geom, end_geom) in line_endpoint_dict.items():
        icurs.insertRow([start_geom, oid, 'START'])
        icurs.insertRow([end_geom, oid, 'END'])
print("ENDPOINTS WRITTEN")
# Create Sampling Zones----------------------------------#
# Transect Line Oriented
with arcpy.da.InsertCursor(box_fc, ['SHAPE@', 'POINT_OID']) as icurs:
    for oid, geom in sample_box_dict.items():
        icurs.insertRow([geom, int(oid.split('-')[0])])
print("BOXES WRITTEN")
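For anyone following along without ArcGIS handy, the rotation and half-diagonal offset used in create_box can be sanity-checked outside of arcpy. A minimal plain-Python sketch (a standalone copy of the same math; nothing here touches a geodatabase):

```python
import math

def rotate_point(coordinate, pivot_point, angle_deg):
    """Rotate (x, y) counter-clockwise around pivot_point by angle_deg degrees."""
    angle = math.radians(angle_deg)
    x, y = coordinate
    X, Y = pivot_point
    x_t, y_t = x - X, y - Y
    x_p = math.cos(angle) * x_t - math.sin(angle) * y_t
    y_p = math.sin(angle) * x_t + math.cos(angle) * y_t
    return (x_p + X, y_p + Y)

# Rotating (1, 0) by 90 degrees about the origin lands on (0, 1).
x, y = rotate_point((1, 0), (0, 0), 90)
print(round(x, 6), round(y, 6))  # 0.0 1.0

# The corner offset in create_box is the half-diagonal of the square:
# sqrt(2 * (width / 2) ** 2) == width / sqrt(2).
width = 50
offset = math.sqrt(2 * ((width / 2) ** 2))
print(round(offset, 3))  # 35.355
```

The 'NORTH_BOX' style then rotates those half-diagonal corner points by 45 degrees so the box sides end up running north-south.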
Posted 11-21-2022 12:04 PM

POST
Sure, I appreciate you taking a look. See zipped geodatabase attached. The example below was generated in Pro 3.0.2. My first post above was done in 2.9.2. The issue is the same. What I have attached here is slightly different from the example above--this time I'm using three fields in the field group. The issue is exactly the same, however. A user can calculate whatever values they want, which, to me, largely defeats the point of implementing CVs. Also, to clarify, the Attribute Rules I have implemented here don't really have anything to do with my issue (I don't think); my question is really just about Field Calculator breaking CVs. Here's a screenshot that shows how Field Calculator can be used to ignore CVs. The three records shown are all invalid, and would not be possible to enter manually (the red error boxes would show). With Field Calculator, however, they are all happily accepted by the database. If this is the expected behavior of CVs, please let me know--if that's the case, then that would be the answer to my question. I would also be curious whether other users have this issue, or what workarounds might exist.
Posted 11-20-2022 03:02 PM

POST
I have implemented a series of Contingent Values for a Field Group using a set of domains applied to those fields. Works as expected. I have also implemented an Attribute Rule (Constraint) to prevent Field Calculator from calculating any values for a given field that are outside of the allowable domain values. This gets around the long-standing issue of users breaking Domain limitations with Field Calculator. Great.

The problem is, a user can still calculate an invalid value in a given field based on the allowable contingent values. In other words, the calculated value is perfectly fine for the domain of the current field, but NOT acceptable considering the values in the other fields in the field group. Manually, this doesn't seem to be possible, since choosing an invalid set of values in the given field group will highlight the field group in red and not allow saving the edit--it will throw an error saying "Attribute does not satisfy a required Contingent Value", as expected. Using Field Calculator, a user can get around this.

I would like to prevent this, ideally without setting up a Validation-type Attribute Rule, since I believe a user could also get around that by simply not running those validations (although I'm not well-versed in the "Validation" flavor of Attribute Rules).

Images below:
- Fields Table showing fields and applied domains
- Domains Table showing the two domains (can only show the codes/descriptions of one domain...)
- CV Table showing the allowable contingent value sets
- Attribute Rule enforcing domains in a field, even with Field Calculator
- Attribute Table, with invalid 1-B sets of values, after the Field Calculation

With Field Calculator, it's possible to achieve invalid CVs, even when the Field Group is "Restrictive." Any thoughts or ways other designers get around this issue would be appreciated. Seems like the only way would be to create custom Attribute Rule Constraints for each allowable Contingent Value... which defeats the purpose of even using Contingent Values in the first place.
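For what it's worth, the check such a constraint would have to perform is simple to state: validate the whole combination, not each field on its own. A minimal sketch in plain Python (the field group and values here are hypothetical, not from the attached geodatabase):

```python
# Hypothetical two-field group; each value alone is domain-valid,
# but only these combinations are allowed as contingent values.
ALLOWED_SETS = {
    ("1", "A"),
    ("2", "B"),
}

def satisfies_contingent_values(row):
    """True only if the whole combination matches an allowed set.
    A per-field domain check alone cannot catch an invalid combination."""
    return tuple(row) in ALLOWED_SETS

print(satisfies_contingent_values(("1", "A")))  # True
print(satisfies_contingent_values(("1", "B")))  # False: both values pass their domains; the pair does not
```

This is essentially what a single Constraint rule would need to do if it could compare the whole field group against the CV table, rather than one custom rule per allowed combination.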
Posted 11-18-2022 04:29 PM

POST
Very helpful, thanks. Seems like I was missing the part where it is necessary to declare the attribute variables from the table that are to be used in the script...?

Expects($feature, 'COLOR')
//$feature.COLOR;
var fieldName = 'COLOR'
return $feature[fieldName]

The documentation, found at the link you provided: "You should explicitly list all field attributes that will be used at the top of the expression. This will ensure the profile will request the appropriate data from required fields so the expression evaluates properly."

$feature.COLLEGE;
$feature.POPULATION;
Round(($feature.COLLEGE / $feature.POPULATION) * 100, 2);

"Alternatively, you may use the Expects function for this purpose."

Expects($feature, "COLLEGE", "POPULATION");
Round(($feature.COLLEGE / $feature.POPULATION) * 100, 2);
Posted 11-18-2022 02:46 PM

POST
My understanding is that Pro has a pretty limited flavor/implementation of SQL, and the CASE statement is not available. The only things I can point to supporting this are a lack of mentions of "CASE" in the Esri documentation, as well as Manifold complaining about how Esri doesn't support much in the way of SQL, but Manifold DOES through their third-party extension. Take their marketing pitch with a grain of salt perhaps, I've never used their software. Quote from the Manifold docs: https://sql4arc.com/info/sql_compared.shtml "SQL for ArcGIS Pro provides an extensive choice of data types, including a variety of geometry and vector (tuple) types, and operators, including tuple operators, CASE, CASE WHEN, CAST, CASTV, and much more. There are very long listings in the various SQL topics in the SQL for ArcGIS Pro user manual."
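As a practical workaround, the usual substitute for CASE WHEN in Pro is a recode done outside of SQL, e.g. with the Python parser in Field Calculator (Arcade's Decode/When functions are another route). A hedged sketch of the dict-lookup pattern (the codes and labels are made up for illustration):

```python
# A dict lookup plays the role of CASE WHEN ... ELSE ... END.
LABELS = {1: "Low", 2: "Medium", 3: "High"}

def recode(value, default="Unknown"):
    """Return the label for a code; the default argument is the ELSE branch."""
    return LABELS.get(value, default)

print(recode(2))   # Medium
print(recode(99))  # Unknown
```

In Field Calculator this would go in the code block, with the expression set to something like recode(!MY_FIELD!) (field name hypothetical).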
Posted 11-18-2022 11:52 AM

POST
Need some help understanding Arcade's syntax. The top two lines throw an error, as shown. The bottom one works fine. I'd prefer to not hard-code field names, so using the bracket syntax with a variable would be preferable to the dot notation ($feature.COLOR). What am I missing here?
Posted 11-18-2022 11:23 AM

POST
I think your image makes sense, but let me know based on this output. Here's another random run: Don't know what your experience with Python is, but that's how I would accomplish this task. More than happy to clarify anything if you have questions. Couple things:

- You can run this in an IDE with Pro's built-in Python environment, or simply paste the whole thing in the Python window if that is easier for you.
- You need to update the five values in the "INPUTS" section so they are tailored to your situation.
- I would use projected coordinate systems for all of this.
- If two adjacent north-south points also had two perfectly north-south transect lines, you could have overlapping endpoints. There are no checks for this, but if it were me, I would just re-run the script to try again.
- Currently the full length of a line is the same as the length between any two adjacent (along x-y, not diagonal) grid points. If you have other specifications, feel free to add those.
- This script also creates the origin points for these lines (see small points). This was for testing (the "NOT NEEDED" comments in the code below show those parts). It does not create the endpoints. To create your endpoints at each end of the lines (since it sounds like you need those), I would just use the built-in Feature Vertices to Points tool.

#-----------------------------------------------------------------------------#
# IMPORTS
import math
import os
import random
import arcpy
#-----------------------------------------------------------------------------#
# INPUTS
# Full path to the geodatabase where the sampling points are, and where outputs will go.
workspace = r"your_geodatabase_path\GDB_NAME.gdb"
spat_ref_epsg = XXXX # This should be an integer code representing the EPSG number of a spatial reference.
# Name of the point feature class in the above geodatabase that houses the
# sampling points.
in_pnt_fc = "SAMPLE_POINT"
# Distance between sampling points (along x and y, not diagonals). Units match coordinate system.
point_spacing = 500 # feet, in my case
# Name of the output feature class to be created.
output_line_fc = "TRANSECT_LINE"
# -----------------------------------------------------------------------------#
# SETTINGS
# This just overwrites previous outputs to help with testing.
arcpy.env.overwriteOutput = True
#-----------------------------------------------------------------------------#
# MAIN
# Gets full path to the input sampling points feature class.
in_points = os.path.join(workspace, in_pnt_fc)
# NOT NEEDED. Compile the origin points here, for testing purposes mostly.
start_pts = []
"""
Using a SearchCursor, iterate over provided sample points.
At the given radius (half the sampling distance) create a random origin point for a line.
Get point on opposite side of the "circle" that is around the given sampling point.
Create a Polyline geometry object from those two points.
"""
transect_lines = []
with arcpy.da.SearchCursor(in_points, ['SHAPE@XY']) as scurs:
    for (x, y), in scurs:
        # Random angle in degrees, 1-360; convert to radians.
        angle_deg = random.randint(1, 360)
        angle_rad = math.radians(angle_deg)
        # Get the x and y offset of the random point on the imaginary circle around each
        # sampling point.
        start_x = (point_spacing / 2) * math.cos(angle_rad)
        start_y = (point_spacing / 2) * math.sin(angle_rad)
        # Using the sample point coordinate as the starting point,
        # calculate the real-world x and y of both the start and end of the transect line.
        start_point = arcpy.Point(x + start_x, y + start_y)
        end_point = arcpy.Point(x - start_x, y - start_y)
        # Create a Polyline Geometry Object.
        new_line = arcpy.Polyline(arcpy.Array([start_point, end_point]))
        # Append the line geometry for this sampling point to a list.
        transect_lines.append(new_line)
        # NOT NEEDED. Testing the line origin points only.
        start_pts.append(arcpy.PointGeometry(start_point))
# Create a blank polyline feature class to house the new transect lines.
ln_fc = arcpy.management.CreateFeatureclass(workspace, output_line_fc, 'POLYLINE',
                                            spatial_reference=spat_ref_epsg)
with arcpy.da.InsertCursor(ln_fc, ['SHAPE@']) as icurs:
    for geom in transect_lines:
        icurs.insertRow([geom])
# NOT NEEDED. This creates a feature class for the origin points of the line.
pt_fc = arcpy.management.CreateFeatureclass(workspace, 'ORIGIN_POINT', 'POINT',
                                            spatial_reference=spat_ref_epsg)
with arcpy.da.InsertCursor(pt_fc, ['SHAPE@']) as icurs:
    for geom in start_pts:
        icurs.insertRow([geom])
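The geometry of each transect reduces to a little trigonometry that can be verified without arcpy at all. A quick standalone sketch of the same start/end calculation the cursor loop performs:

```python
import math
import random

def transect_endpoints(x, y, angle_deg, spacing):
    """Start and end coordinates of a transect centered on (x, y).
    Each endpoint sits half the spacing away, on opposite sides of the point."""
    r = spacing / 2
    dx = r * math.cos(math.radians(angle_deg))
    dy = r * math.sin(math.radians(angle_deg))
    return (x + dx, y + dy), (x - dx, y - dy)

random.seed(0)  # seeded only so this demo repeats; the script above stays random
(ax, ay), (bx, by) = transect_endpoints(1000.0, 2000.0, random.randint(1, 360), 500)
print(round(math.hypot(bx - ax, by - ay), 6))  # 500.0 -- line length always equals the spacing
```

That last line is why the full transect length matches the grid spacing: the two endpoints are diametrically opposite points on a circle of radius spacing / 2.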
Posted 11-17-2022 08:21 PM

POST
Just a note, you are using arcpy.da.SearchCursor in your code here, when I think you intended to use an UpdateCursor.
Posted 11-17-2022 05:06 PM

POST
I don't have much to contribute, other than that I noticed similar issues in the way ArcGIS Pro seems to handle/flag/calculate <Null>/None in an attribute table at the 3.0 version level. When using the Field Calculator and Python as the parser, calculating a numeric or a text field back to "None" (without the quotes, of course) yields a new warning message that was not present in previous versions of the software (for this particular calculation back to "<Null>", anyway). The operation still works, but it seems to indicate to me that either: A) NULLs are being handled/read/recorded differently, or B) this is totally unrelated to your problem and it's just a friendly warning message they now provide for... some reason. Here is my posted question about this, in case the answer provided by @Robert_LeClair is of any help to you: Solved: Re: Calculating "None" Using Field Calculator Yiel... - Esri Community
Posted 11-17-2022 05:01 PM

POST
Any chance you can use MS Paint or something similar to add a picture of what you are looking for? It's not totally clear what you're after from your post. I'd be happy to take a crack at it if I get a better idea of what you are looking for.
Posted 11-17-2022 04:28 PM

POST
Does your input data have a defined projected coordinate system? Or are you only working with data that has a geographic coordinate system? What are the field types of your X and Y coordinates? Before you even try to calculate MGRS coordinates, do the input Lat/Long values look valid to you? Is this a feature class with geometries, or just a table? I just tried this with DD_2 as the input (a field for the Lats, a field for the Longs, like you said, both of which are "Double" type) and my results seem fine (minus the Excel export, that's the easy part). One thing to try might be writing the output to a "memory" location. If you are not familiar with this, in the "Output Feature Class" input line, write "memory\Output" (without the quotes). This might help diagnose whether this is a weird write privilege thing or a shapefile thing, etc. If you don't have data privacy concerns, it could be helpful to include a screenshot of your attribute table with those X/Y columns.
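One quick pre-flight check for the Lat/Long columns: decimal-degree values should fall in fixed ranges, and swapped columns are a common culprit. A small sketch (plain Python; the example coordinates are arbitrary):

```python
def plausible_lat_lon(lat, lon):
    """Rough sanity check for decimal-degree inputs before any MGRS conversion."""
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

print(plausible_lat_lon(44.06, -121.31))  # True
print(plausible_lat_lon(-121.31, 44.06))  # False: latitude/longitude columns likely swapped
```

This obviously won't catch every bad value, but it flags the swapped-column and degrees-vs-projected-units mistakes immediately.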
Posted 11-17-2022 04:21 PM

POST
Can you expand on what you mean by this, and what you are trying to accomplish? At face value, I'm not sure the request you've been given makes sense. For simplicity, let's say the following are the only two Contingent Value groups allowable for a feature class with four fields:

Now let's say I enter in a record for a Crosstrek, Premium, Standard, Red (which is valid). What if I decide I want a CVT instead of a Standard transmission? There would be no possible way to do that without the partial match option, because selecting CVT is NOT ALLOWABLE with the Premium trim or the color Red. BUT, the CVT is allowable for the car itself, so it's a "partial match." By selecting CVT, it will either surround the other fields in a red box when they become invalid, or, in this case, default to the ONLY allowable options for the other fields.

Basically, I suppose you could eliminate the "partial match" option, but if you got too far into a specific set of values, there would be no way to change anything without erasing the entire record and starting over, because there may ONLY be partial matches available (as in the case above). So, if my understanding of your situation and of the way Contingent Values are implemented is correct, it is not possible to do this, and it also would not make sense from a data editing/modifying standpoint. If you want to ensure a specific order of values is entered correctly the first time, and, if a user makes a mistake, force them to start the record over entirely, you may need to look into successive Attribute Rule Constraints.
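To make the Crosstrek example concrete, here is a plain-Python sketch of why the partial match path is needed (the two allowed sets are hypothetical stand-ins for a CV table):

```python
# Hypothetical contingent sets for (MODEL, TRIM, TRANSMISSION, COLOR).
ALLOWED_SETS = {
    ("Crosstrek", "Premium", "Standard", "Red"),
    ("Crosstrek", "Base", "CVT", "Blue"),
}

def full_matches(record):
    """Allowed sets consistent with every filled-in field (None = not chosen yet)."""
    return [s for s in ALLOWED_SETS
            if all(v is None or v == s[i] for i, v in enumerate(record))]

# Start from the valid record, then switch the transmission to CVT:
print(len(full_matches(("Crosstrek", "Premium", "CVT", "Red"))))  # 0 -- no full match exists
# Only a partial match (model + transmission alone) leads anywhere:
print(len(full_matches(("Crosstrek", None, "CVT", None))))        # 1 -- the Base/CVT/Blue set
```

Without the partial match, the edited record in the first call would be a dead end: every combination containing the old Trim or Color is invalid, so the editor could never reach the second, valid set.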
Posted 11-17-2022 03:33 PM

POST
Is there a means of programmatically (ArcPy) getting a list of all fields that are involved in Attribute Rule Calculations? It appears that it is possible to use arcpy.da.Describe with the ['attributeRules'] key to obtain the names of the fields that have calculations applied, but that will not include fields that might be part of the calculation. For example, you can get the name of Field3 where the calculation is applied, but I don't see a means of getting the names of Field1 and Field2 that are used to calculate a sum, for example, to be populated in Field3.
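In case it helps frame the question: I have not found a direct property for this. One assumption-heavy workaround would be to pull each rule's Arcade text out of the Describe dictionary (I believe the key is 'scriptExpression', but verify that against your Pro version) and regex out the $feature references. The parsing itself is plain Python; the expression string below is a made-up stand-in:

```python
import re

# Stand-in for what arcpy.da.Describe(fc)['attributeRules'][i] might return;
# the 'scriptExpression' key name is my assumption -- check your Pro version.
expr = "return $feature.Field1 + $feature['Field2']"

# Capture both $feature.NAME and $feature['NAME'] style references.
pairs = re.findall(r"\$feature(?:\.(\w+)|\[['\"](\w+)['\"]\])", expr)
fields = sorted({a or b for a, b in pairs})
print(fields)  # ['Field1', 'Field2']
```

This would still miss fields referenced indirectly (e.g. through a variable holding the field name, as in the bracket-notation question elsewhere on this page), so treat it as a best-effort inventory, not a guarantee.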
Posted 11-17-2022 01:35 PM

POST
Robert, I appreciate the clarification on when and why this was added. Very helpful, thanks!
Posted 11-17-2022 10:27 AM