POST
Want to thank Kiyoshi and David for the guidance above first off. I was brand new to this entire process. Combining the very helpful information/videos in the posts above with the existing getting-started documentation, I was able to successfully plot some airports in ICAO style based on an area of interest. If it's helpful to anyone, I've tossed together some notes as a "get started down this path" exercise. This exercise loads airport data provided by https://aip.dfs.de/datasets/

1. Open a blank project and save it. Download the project files from the URLs referenced in the opening paragraph, then load the sample layout into your project. I ended up adding this folder as a shortcut in my project (Folders -> Add Folder Connection -> Sample Files Folder).

2. Follow the instructions to build your aeronautical geodatabase. I converted this project's Default/Project database to the AIS database: https://pro.arcgis.com/en/pro-app/latest/help/production/aviation/set-up-an-aviation-geodatabase.htm

3. Hop back over to the Enroute template and follow the instructions from Step 6 onward to link your chart layout to your project's AIS database. At the top left of the Catalog tab, notice that Items is currently selected. Click Workspace to update all layers to your AIS database at once. Be sure to click Validate and Apply.

4. Import data. Now that the AIS database is set up, import the ADHP AIXM file. In your geoprocessing toolbox, go to Import AIXM 5.1 Message, select the downloaded sample file, and click Run: https://pro.arcgis.com/en/pro-app/latest/tool-reference/aviation/import-aixm-5-1-message.htm If you navigate to Main -> ADHP and open the attribute table, it should be populated.

5. Create an Area of Interest in Germany. I just drew a box around Frankfurt (I ended up making it much bigger than this example to get AD/HP and LS types), following these steps: https://pro.arcgis.com/en/pro-app/latest/help/production/aviation/define-areas-of-interest.htm Note: after configuring the Area of Interest, right-click your ICAO ENR map (right under Drawing Order) and click Properties. Under the Color Management option is Aviation AOI. Set it to the AOI shape you drew and click Apply.

6. The instructions point you to Generate Aviation Cartographic Features first, but I found you should create the annotation layer first instead, since generating new cartographic features populates the annotation feature class. Be deliberate in following the steps up until Step 21: https://pro.arcgis.com/en/pro-app/latest/help/production/aviation/create-annotation-feature-classes.htm Note that Step 14 asks you to link annotation to a feature class; choose ADHP_C, since our sample file uses Aerodromes/Heliports (ADHP_C). Step 21 has you add the label classes you want to use for the annotation features. If you've set everything up right (including Steps 14/15, so the label classes don't error out with a Main_Info field error), go ahead and copy the code from the adhp_c_label_arcade_icao.lxp file in the charting tools and add it to the Class 1 label. Then complete the "Configure feature-linked annotation feature classes" section, pasting the sample code provided in the instructions into the rules section.

7. Next, the instructions say to Generate Aviation Cartographic Features (https://pro.arcgis.com/en/pro-app/latest/help/production/aviation/cartographic-features.htm), set exceptions, and then prepare features. What this means is:

Generate: this fills out the ADHP_C table with data based on your AOI. This step is done first; the _C suffix just means Cartographic. In the geoprocessing tool, set your source to ADHP (mine was Main/ADHP) and the target to ADHP_C. I opted to keep it simple and only include features within the Area of Interest I drew.

Prepare Aviation Data (https://pro.arcgis.com/en/pro-app/latest/tool-reference/aviation/prepare-aviation-data.htm): this tool takes metadata from your ADHP table and adds it into the ADHP_C table in the format Aviation Charting uses to do its magic. Practically, it takes the <Null> Main_Info column currently in the ADHP_C table and fills in the airport name and type, which the charting symbology and label text parse via an Arcade script. The configuration file is part of the product files, located at ArcGIS Aviation Charting\Product Files\3.3\SampleConfigs\ICAO\Enroute\PrepareAviationData\adhp_c_config_icao.json

This should produce something akin to the below (my final product will differ a bit since I messed around with additional symbology and annotation classes for landing strips and helipads a bunch).

Note 1: I had a blank annotation feature class at first, since I ran Generate before the annotation steps. If you do things out of order, you can install the Aviation toolkit from \ArcGIS Aviation Charting\Product Files\3.3\Utilities\Scripts\Aviation Tools.pyt (just import it via right-click in Catalog -> Add Toolbox), then run the Create Aviation Feature Linked Annotation tool.

Note 2: In my first few failures I never could get the symbology to apply on the ADHP_C layer. I ended up having to use the Match Layer Symbology to a Style tool, set up as: Input Layer: ADHP_C; Match Values: Expression Builder -> copy and paste the Arcade script located at \ArcGIS Aviation Charting\Product Files\3.3\SampleConfigs\ICAO\Visual\ArcadeExpressions\Symbol\adhp_c_symbol_arcade_icao.lxp; Style: Aviation Charting ICAO (if this is not imported, you can add it from Insert -> Styles -> Add Style -> \ArcGIS Aviation Charting\Product Files\3.3\Styles\Aviation Charting ICAO.stylx. It should already be there if you've imported the ICAO sample map).

Note 3: If you want to annotate more things, like the HP and LS records, during the annotation feature class creation step (where you can click the pencil and update Class 1 to the adhp_c_label_arcade_icao expression), scroll down and just change the "AD" to "HP" or "LS" to create new classes. You can also add them to the array as one class if you want; I kind of liked having three individual classes for toggling.

Note 4: If you end up messing anything up, you can delete the annotation feature class from the gdb, then delete the records from the ADHP_C table (keep the feature class itself; obviously don't delete it). Build out your annotation class again via the steps, Generate Cartographic Features, and Prepare Aviation Data.

Best of luck. This tool is a beast, but it's amazing.
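If you'd rather script the data-loading half of this, the three tools above are regular geoprocessing tools and can be chained with arcpy. This is a rough sketch only: the tool names come straight from the doc links above, but the parameter names/order are my assumption from memory, and the paths and AOI name are placeholders, so double-check each tool reference page before running.

```python
import arcpy

ais_gdb = r"C:\Charting\AIS.gdb"          # placeholder: your AIS geodatabase
aixm_file = r"C:\Charting\DFS_ADHP.xml"   # placeholder: the downloaded AIXM 5.1 sample

# Import AIXM 5.1 Message -- loads the ADHP records into the AIS schema
arcpy.aviation.ImportAIXM51Message(aixm_file, ais_gdb)

# Generate Aviation Cartographic Features -- fills ADHP_C from ADHP for your AOI
# ("ENR_AOI" is a hypothetical AOI layer name; parameter order assumed)
arcpy.aviation.GenerateAviationCartographicFeatures(
    rf"{ais_gdb}\Main\ADHP", rf"{ais_gdb}\Main\ADHP_C", "ENR_AOI")

# Prepare Aviation Data -- populates Main_Info via the sample ICAO config
arcpy.aviation.PrepareAviationData(
    ais_gdb,
    r"C:\...\SampleConfigs\ICAO\Enroute\PrepareAviationData\adhp_c_config_icao.json")
```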
Posted 10-16-2024 07:32 AM

POST
Hi, I have to admit I'm at a loss trying to use this toolset. The output is incredible. There are some videos in the community from 2015 using ArcMap about importing AIXM data, and there's a 3-part YouTube video (Aviation Charting in ArcGIS Pro - YouTube) that demonstrates what the tool does. The data is pre-prepared and annotations already seem to be created for the demo, though, which makes this whole thing look more like magic. Are there videos or tutorials anywhere that demonstrate using this tool from start to finish? Or at least videos or walkthroughs that show AIXM data being imported and then stylized? Is going through the data dictionary the only way to know which pre-defined table the data goes into? I have been here (Get started with ArcGIS Aviation Charting—ArcGIS Pro | Documentation). I can complete setting up the aviation gdb, get the template, and then draw an AOI. From there I don't know what else to do. I have attempted to use the APT_AIXM file from FAA NASR data to load just airports and apply the charting style, which, based on the data dictionary (AIS schema data dictionary—ArcGIS Pro | Documentation), should load into the ADHP table? The table is empty on import. Thanks for reading through, and any assistance is appreciated.
Posted 08-21-2024 10:19 AM

POST
I just pasted that in and it worked out of the box. Did you also try adding media -> chart via the map?
Posted 07-15-2024 08:15 AM

POST
You may want to remove the password in def Portal_push(). Without a response message it's hard to tell anything. On this line, add in:

resp = agoLayer.edit_attributes(updates=[unit_dict])
print(resp)

I am going to guess the update you are sending is too large and needs to be split into multiple updates. [EDIT]: Eh, maybe not... you're only updating one feature at a time. Maybe it's a bad data type, so it rejects the update.
Posted 05-30-2024 05:28 AM

POST
It seems possible, albeit somewhat involved: Part 2 - Working with Geometries | ArcGIS API for Python. About halfway down is a "Creating geometries interactively using the map widget" section. It seems like you'll have to capture what the user draws inside the widget, copy the geometry, then append a new feature with that geometry to your layer. To fill out the metadata, you could import ipywidgets and make a few fields the user fills out: https://ipywidgets.readthedocs.io/en/stable/
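A minimal sketch under those assumptions: it leans on the MapView draw API shown in the linked guide (on_draw_end / draw) plus edit_features to append, and the item ID and "name" field are placeholders you'd swap for your own.

```python
from arcgis.gis import GIS
import ipywidgets as widgets
from IPython.display import display

gis = GIS("home")
m = gis.map()

# Placeholder target layer -- swap in your own editable feature layer
target_layer = gis.content.get("<your_item_id>").layers[0]

# Simple metadata field the user fills out before drawing
name_field = widgets.Text(description="Name:")
display(name_field)

def add_drawn_feature(map_widget, geometry):
    # Fires when the user finishes drawing; appends the sketch to the layer
    new_feature = {
        "attributes": {"name": name_field.value},  # 'name' is a placeholder field
        "geometry": geometry,
    }
    print(target_layer.edit_features(adds=[new_feature]))

m.on_draw_end(add_drawn_feature)  # register the callback
m.draw("polygon")                 # put the widget into draw mode
m                                 # display the map widget
```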
Posted 05-28-2024 09:01 AM

POST
I actually might? No idea if this will be of any use; I haven't looked at it since 2019, but I'll dump it in here and you can pick through it and see if anything jumps out at you as helpful. I'll try to pull out only the relevant bits. The notebook below was at one point more or less an email alert system that used Python pickling to store data based on a geospatial evaluation. The notebook ran on a schedule, and any changes detected between the stored/pickled dataframe and the new dataframe were emailed. I eventually abandoned the whole thing because it wasn't "great" and I jumped over to GeoEvent Server instead. I imagine your workflow could be something like: your notebook (if you're using a notebook) is set to run every 15 minutes; it "pickles" your FeatureLayer on first run; every subsequent run, it compares the latest FeatureLayer pull with the pickled one; changes are emailed and the pickle is overwritten with the latest. Your use case is probably a lot simpler than the one I have below.

import os
import csv
import pickle
import smtplib
import logging
import pandas as pd
from arcgis.gis import GIS
from arcgis.features import GeoAccessor, GeoSeriesAccessor
from IPython.core.display import display, HTML

# The original snippet referenced a 'log' object without defining it
log = logging.getLogger(__name__)
# Pandas columns to drop post-join
droppable_columns = ['index_right', 'objectid']
# Historic file checker
h_pickle = False
# Global run flag. Used in each block.
global_run_flag = True
# Who is getting these emails (two separate addresses)
to_email_list = ['test1@test.com', 'test2@test.com']
def send_email_smtp(recipients, message, subject="Test Notification from AGOL Notebook - Tsunami"):
    # Sends the `message` string to all of the emails in the `recipients` list
    # using the configured SMTP email server.
    try:
        # Set up server and credential variables
        smtp_server_url = "relay_server"
        smtp_server_port = 25
        sender = "alert_sample@test.com"
        # username = "your_username"
        # password = secrets["smtp_email_password"]
        # Instantiate our server, configure the necessary security
        server = smtplib.SMTP(smtp_server_url, smtp_server_port)
        server.ehlo()
        # server.starttls()  # Needed if TLS is required w/ SMTP server
        # server.login(username, password)
    except Exception as e:
        log.warning("Error setting up SMTP server, couldn't send " +
                    f"message to {recipients}")
        raise e
    # For each recipient, construct the message and attempt to send
    did_succeed = True
    for recipient in recipients:
        try:
            message_body = '\r\n'.join(['To: {}'.format(recipient),
                                        'From: {}'.format(sender),
                                        'MIME-Version: 1.0',
                                        'Content-type: text/html',
                                        'Subject: {}'.format(subject),
                                        '',
                                        '{}'.format(message)])
            message_body = message_body.encode("utf-8")
            server.sendmail(sender, [recipient], message_body)
            print(f"SMTP server returned success for sending email "
                  f"to {recipient}")
        except Exception as e:
            log.warning(f"Failed sending message to {recipient}")
            log.warning(e)
            did_succeed = False
    # Cleanup and return
    server.quit()
    return did_succeed

def create_empty_spatial():
    default_df = pd.DataFrame(columns=['ASSOC_CITY', 'ASSOC_ST', 'ActivePerimeter', 'CENTER',
                                       'FACID', 'FACTYPE', 'GEO_CAT', 'GeoTime', 'LATITUDE',
                                       'LG_NAME', 'LONGITUDE', 'OBJECTID', 'SHAPE', 'alertcode'])
    return default_df
def create_spatial_evaluation_df(geospatial_ww_area_frame):
    evaluation_df = tsunami_point_sdf.spatial.join(geospatial_ww_area_frame, how='left')
    evaluation_df = evaluation_df.drop(droppable_columns, axis=1)
    evaluation_df = evaluation_df.dropna(subset=['alertcode'])
    evaluation_df['ActivePerimeter'] = 'Y'
    return evaluation_df

def pickle_eval():
    default_df = create_empty_spatial()
    pickle_check = os.listdir('.')
    warning_pickle_file = 'tsunami_intersection_warning_previous.pkl' not in pickle_check
    watch_pickle_file = 'tsunami_intersection_watch_previous.pkl' not in pickle_check
    advisory_pickle_file = 'tsunami_intersection_advisory_previous.pkl' not in pickle_check
    if warning_pickle_file:
        default_df.to_pickle("tsunami_intersection_warning_previous.pkl")
    if watch_pickle_file:
        default_df.to_pickle("tsunami_intersection_watch_previous.pkl")
    if advisory_pickle_file:
        default_df.to_pickle("tsunami_intersection_advisory_previous.pkl")
    return True

def df_size_eval(frame):
    return frame.size

def display_frames(framelist):
    for f in framelist:
        display(f)

# Misc Notes
def Cleanup(*args):
    for filename in args:
        os.remove(filename)

The Evaluation stuff, if I've pulled it out correctly, is below.

# Get our information layers and create dataframes. Split by warning type
# Facility_Equipment_Points (an item ID) and select_features (a layer query)
# are defined elsewhere in the original notebook
tsunami_point_sdf = pd.DataFrame.spatial.from_layer(gis.content.get(Facility_Equipment_Points).layers[0])
tsunami_layer_sdf = select_features.sdf
# Query Watch/Warning/Advisory layers and pull polygons based on alertcode
tsunami_layer_advisory_df = tsunami_layer_sdf.query('alertcode == "TSY"')
tsunami_layer_watch_df = tsunami_layer_sdf.query('alertcode == "TSA"')
tsunami_layer_warning_df = tsunami_layer_sdf.query('alertcode == "TSW"')
# Display data
display_frames([tsunami_layer_advisory_df, tsunami_layer_watch_df, tsunami_layer_warning_df])
# If we have data, create a df for evaluation
if df_size_eval(tsunami_layer_advisory_df) == 0:
    tsunami_intersection_advisory_df = create_empty_spatial()
else:
    tsunami_intersection_advisory_df = create_spatial_evaluation_df(tsunami_layer_advisory_df)
if df_size_eval(tsunami_layer_watch_df) == 0:
    tsunami_intersection_watch_df = create_empty_spatial()
else:
    tsunami_intersection_watch_df = create_spatial_evaluation_df(tsunami_layer_watch_df)
if df_size_eval(tsunami_layer_warning_df) == 0:
    tsunami_intersection_warning_df = create_empty_spatial()
else:
    tsunami_intersection_warning_df = create_spatial_evaluation_df(tsunami_layer_warning_df)
### This is the pickle evaluation part ###
# Load our old data. We need to check whether the files exist. If they don't,
# we create empty files based on the current advisories format -- basically,
# push an empty DF with the right columns to the directory.
if global_run_flag is True:
    if h_pickle is False:
        h_pickle = pickle_eval()
    # Read our previous (or newly created) pickles
    previous_tsunami_intersection_advisory_df = pd.read_pickle("tsunami_intersection_advisory_previous.pkl")
    previous_tsunami_intersection_watch_df = pd.read_pickle("tsunami_intersection_watch_previous.pkl")
    previous_tsunami_intersection_warning_df = pd.read_pickle("tsunami_intersection_warning_previous.pkl")
    display_frames([previous_tsunami_intersection_advisory_df, previous_tsunami_intersection_watch_df, previous_tsunami_intersection_warning_df])
# Determine whether a facility is no longer under a watch/warning. Only retain
# previous records that remain in the new pull.
if global_run_flag is True:
    display(previous_tsunami_intersection_advisory_df['FACID'].isin(tsunami_intersection_advisory_df['FACID']))
    display(previous_tsunami_intersection_watch_df['FACID'].isin(tsunami_intersection_watch_df['FACID']))
    display(previous_tsunami_intersection_warning_df['FACID'].isin(tsunami_intersection_warning_df['FACID']))
    previous_tsunami_intersection_advisory_df = previous_tsunami_intersection_advisory_df[previous_tsunami_intersection_advisory_df.FACID.isin(tsunami_intersection_advisory_df.FACID)]
    previous_tsunami_intersection_watch_df = previous_tsunami_intersection_watch_df[previous_tsunami_intersection_watch_df.FACID.isin(tsunami_intersection_watch_df.FACID)]
    previous_tsunami_intersection_warning_df = previous_tsunami_intersection_warning_df[previous_tsunami_intersection_warning_df.FACID.isin(tsunami_intersection_warning_df.FACID)]
# Now we do the opposite: which facilities are newly under an alert?
if global_run_flag is True:
    display(tsunami_intersection_advisory_df['FACID'].isin(previous_tsunami_intersection_advisory_df['FACID']))
    display(tsunami_intersection_watch_df['FACID'].isin(previous_tsunami_intersection_watch_df['FACID']))
    display(tsunami_intersection_warning_df['FACID'].isin(previous_tsunami_intersection_warning_df['FACID']))
    email_tsunami_intersection_advisory_df = tsunami_intersection_advisory_df[~tsunami_intersection_advisory_df.FACID.isin(previous_tsunami_intersection_advisory_df.FACID)]
    email_tsunami_intersection_watch_df = tsunami_intersection_watch_df[~tsunami_intersection_watch_df.FACID.isin(previous_tsunami_intersection_watch_df.FACID)]
    email_tsunami_intersection_warning_df = tsunami_intersection_warning_df[~tsunami_intersection_warning_df.FACID.isin(previous_tsunami_intersection_warning_df.FACID)]
    display_frames([email_tsunami_intersection_advisory_df, email_tsunami_intersection_watch_df, email_tsunami_intersection_warning_df])
# We now want to concatenate our updated previous dataframe and our newest
# dataframe. Our previous dataframe only has facilities still under an advisory,
# while email_tsunami_intersection_advisory_df has only new ones.
if global_run_flag is True:
    tsunami_intersection_advisory_df = pd.concat([previous_tsunami_intersection_advisory_df, email_tsunami_intersection_advisory_df])
    tsunami_intersection_watch_df = pd.concat([previous_tsunami_intersection_watch_df, email_tsunami_intersection_watch_df])
    tsunami_intersection_warning_df = pd.concat([previous_tsunami_intersection_warning_df, email_tsunami_intersection_warning_df])
# Now we just need to pickle the current dataframes we've generated for each
# alert code and set them as our previous.
if global_run_flag is True:
    tsunami_intersection_advisory_df.to_pickle("tsunami_intersection_advisory_previous.pkl")
    tsunami_intersection_watch_df.to_pickle("tsunami_intersection_watch_previous.pkl")
    tsunami_intersection_warning_df.to_pickle("tsunami_intersection_warning_previous.pkl")
if global_run_flag is True:
    if email_tsunami_intersection_advisory_df.size > 0 or email_tsunami_intersection_watch_df.size > 0 or email_tsunami_intersection_warning_df.size > 0:
        send_email = True
    else:
        send_email = False
    print(send_email)
# Now we have a list of facilities already split out by warning type.
# We want to build this into one HTML email.
if global_run_flag is True:
    # build_notification_table comes from elsewhere in the original notebook;
    # see the stand-in sketch below. This sends only the additions/newest.
    tsy_notification = build_notification_table("Tsunami Advisory (TSY)", email_tsunami_intersection_advisory_df)
    tsa_notification = build_notification_table("Tsunami Watch (TSA)", email_tsunami_intersection_watch_df)
    tsw_notification = build_notification_table("Tsunami Warning (TSW)", email_tsunami_intersection_warning_df)
    # This variant would send all data, including new records:
    # tsw_notification = build_notification_table("Tsunami Warning (TSW)", tsunami_intersection_warning_df)
    # tsa_notification = build_notification_table("Tsunami Watch (TSA)", tsunami_intersection_watch_df)
    # tsy_notification = build_notification_table("Tsunami Advisory (TSY) TESTING USES FWW", tsunami_intersection_advisory_df)
my_message = f"""
<div style="width: 100%; font-family:Calibri, Arial, sans-serif">
  <p style="font-size: 16px">A Tsunami Advisory, Watch, or Warning has been issued with potential impact to the following facilities & equipment:<br></p>
</div>
<div style="font-family:Calibri, Arial, sans-serif">
  <p style="font-size: 12px; font-weight: bold">
  {tsw_notification}
  </p>
</div>
<br>
<div style="font-family:Calibri, Arial, sans-serif">
  <p style="font-size: 12px; font-weight: bold">
  {tsa_notification}
  </p>
</div>
<div style="font-family:Calibri, Arial, sans-serif">
  <p style="font-size: 12px; font-weight: bold">
  {tsy_notification}
  </p>
</div>
<p style="font-family:Calibri, Arial, sans-serif;font-size: 10.5px">Questions? Need removed? Email john.r.evans@faa.gov</p>
"""
if global_run_flag is True:
    if send_email is True:
        send_email_smtp(recipients=to_email_list, message=my_message)
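One gap in what I pulled out: build_notification_table never got defined above. A minimal, hypothetical stand-in that matches how it's called (the original version certainly styled the HTML more) could be:

```python
def build_notification_table(title, frame):
    # Render a titled HTML table of affected facilities; empty string if none.
    # Column names come from create_empty_spatial() above.
    if frame.empty:
        return ""
    cols = ['FACID', 'LG_NAME', 'ASSOC_CITY', 'ASSOC_ST']
    return f"<p>{title}</p>" + frame[cols].to_html(index=False)
```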
Posted 05-24-2024 07:09 AM

POST
In poking around for a few minutes I don't see a register_listener operation in the docs. Do you have a link to it anywhere? https://developers.arcgis.com/python/api-reference/arcgis.features.managers.html#featurelayermanager If I do a quick dir check, the operation doesn't exist:

dir(managers.FeatureLayerManager)
['__class__',
'__delattr__',
'__dict__',
'__dir__',
'__doc__',
'__eq__',
'__format__',
'__ge__',
'__getattribute__',
'__gt__',
'__hash__',
'__init__',
'__init_subclass__',
'__le__',
'__lt__',
'__module__',
'__ne__',
'__new__',
'__reduce__',
'__reduce_ex__',
'__repr__',
'__setattr__',
'__sizeof__',
'__str__',
'__subclasshook__',
'__weakref__',
'_check_status',
'_get_status',
'_hydrate',
'_invoke',
'_refresh',
'_refresh_callback',
'_token',
'add_to_definition',
'contingent_values',
'delete_from_definition',
'field_groups',
'fromitem',
'properties',
'refresh',
'truncate',
'update_definition']
Posted 05-24-2024 05:52 AM

POST
If you plop both spatial data frames on a map widget, does the data look correct? It's hard to do much without seeing the actual data.
Posted 05-13-2024 06:35 AM

POST
Awesome. Yeah, after a quick glance it's probably something like tweaking the format string in

df["date_str"] = pd.to_datetime(df['fluorometry_date']).dt.strftime('%m/%d/%Y')

(Note that %d and %Y are already the zero-padded two-digit day and four-digit year; '%dd' and '%YYYY' aren't valid strftime directives, so if the output still looks wrong, the culprit is likely elsewhere in the string.)
Posted 05-07-2024 11:55 AM

POST
Sure seems that way; you're awfully close. The quickest way to check would be to comment out the date and see if success: true comes back and your updates show:

# cyanopond_feature.attributes['Sample_Date'] = fluorsample_feature.attributes['fluorometry_date']

From there you can figure out what's going on; your pandas timestamp is likely not formatted the way AGOL wants it. https://doc.arcgis.com/en/arcgis-online/manage-data/work-with-date-fields.htm This link has a table of what your timestamp should look like. You'll probably have to mess with your ['Sample_Date'] column a little to get it right.
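If the date does turn out to be the culprit, one way to sidestep string formatting entirely (assuming fluorometry_date parses cleanly) is to send the value as Unix epoch milliseconds, which ArcGIS Online date fields accept on edits:

```python
import pandas as pd

# AGOL date fields accept Unix epoch time in milliseconds on applyEdits,
# so convert the pandas timestamp instead of fighting display formats
ts = pd.to_datetime(fluorsample_feature.attributes['fluorometry_date'])
cyanopond_feature.attributes['Sample_Date'] = int(ts.timestamp() * 1000)
```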
Posted 05-07-2024 11:19 AM

POST
Should be able to. I'm not super familiar with Field Maps. If I've got a good read on what you're asking for, I think it looks something like:

var poly1 = FeatureSetByName($map, "Plantation Areas")
var intersects = Intersects(Geometry($feature), poly1)
var distinct_planting = []
var display_output = ""
for (var f in intersects) {
    Push(distinct_planting, f['plantation_year_field'])
}
distinct_planting = Distinct(distinct_planting)
for (var pyear in distinct_planting) {
    display_output = display_output + distinct_planting[pyear] + ' '
}
display_output = Trim(display_output)
return {
    type : 'text',
    text : display_output
}
Posted 05-07-2024 10:03 AM

POST
What response are you getting when you run the update? Can you add in:

resp = cyanoponds_lyr.edit_features(updates=[cyanopond_feature])
print(resp)

It's sending out an update and getting a response back successfully, but the response is probably saying "hey, I can't find this record, so nothing was updated" or something; it's not throwing an actual exception. I've wrestled with this before, and how I fixed it was to create a template feature based on the fields I'm updating (in my example I'm only making things Active or Inactive), update the attributes I want to send along with the OBJECTID, and then fire them off. row is just the row of data in my dataframe.

import copy

features_to_update = []

def create_update_feature(row, objid):
    # Establish feature template
    template_feature = {"attributes": {"OBJECTID": '', "active": ''}}
    # Copy template to a new feature var
    update_feature = copy.deepcopy(template_feature)
    # Assign the updated values
    update_feature['attributes']["OBJECTID"] = int(objid)
    update_feature['attributes']["active"] = row["active"]
    # Return the update feature
    return update_feature

update_feature = create_update_feature(row, update_objid)
features_to_update.append(update_feature)

Then you would do something like cyanoponds_lyr.edit_features(updates=features_to_update) (note: features_to_update is already a list, so don't wrap it in another set of brackets). This helped me out a lot: https://developers.arcgis.com/python/guide/editing-features/
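A small follow-on sketch for reading the response once the batch goes out; each entry in the documented updateResults list carries a success flag, so rejected edits are easy to spot:

```python
resp = cyanoponds_lyr.edit_features(updates=features_to_update)
# Each result has the objectId, a success flag, and an error dict on failure
for result in resp["updateResults"]:
    if not result["success"]:
        print(result["objectId"], result.get("error"))
```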
Posted 05-07-2024 09:13 AM

POST
Is the water rights data a single feature service or multiple feature services? It kind of sounds like, instead of labeling the BLM Survey layer, you build a label expression that spits out one label from multiple columns in your water rights FS. This is some lazy pseudo-code, but something like:

var txt = ""
if ($feature.RIGHTS1 != "") {
    txt = txt + $feature.RIGHTS1 + TextFormatting.NewLine
}
if ($feature.RIGHTS2 != "") {
    txt = txt + $feature.RIGHTS2 + TextFormatting.NewLine
}
if ($feature.RIGHTS3 != "") {
    txt = txt + $feature.RIGHTS3 + TextFormatting.NewLine
}
return txt
Posted 03-25-2024 10:39 AM

POST
There's a more in-depth discussion here, with example code that seems like it's what you're looking to do: arcade-expressions/dashboard_data/SplitCategories(PieChart).md at master · Esri/arcade-expressions · GitHub
Posted 03-25-2024 10:09 AM

POST
Get rid of your first 3 variables and just use PoleFeatures. f is just a variable that holds the data for each feature within PoleFeatures. ${f.PoleID} should just be ${f.ID}, sorry. It's hard to work with data I can't see. If you put a Console(f) above the if statement, it will print out each feature within PoleFeatures as it iterates. You're sure the XYjoin field will match between the two data sets, yeah?
Posted 02-21-2024 03:23 PM