POST
Another way to do it would be:

sdf = sdf.assign(
    longitude=sdf.SHAPE.astype(str).apply(lambda x: Point(x).coordinates()).str[0],
    latitude=sdf.SHAPE.astype(str).apply(lambda x: Point(x).coordinates()).str[1],
)
05-27-2021 10:49 PM

POST
It basically depends on which lines it is skipping; sharing a code snippet would have been useful. I had a similar issue where I was generating centroids and wanting to use the centroid coordinates. What I did was to ensure the centroids were written to the portal, and on the subsequent run the code would search for the new write and utilize it. In other instances, I introduced sleep time to make sure latency was covered for. See the code snippet below.
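The latency-covering retry mentioned above can be sketched roughly like this. Everything here is a hypothetical stand-in, not the actual production code: `find_item` and `search` represent a portal query (e.g. wrapping `gis.content.search` for the newly written centroid layer), and the timings are illustrative.

```python
import time

def find_item(search, max_tries=5, wait_seconds=2):
    """Poll a search callable until it returns a result.

    `search` stands in for a portal query; the retry loop covers the
    latency between a write and the item becoming visible on the portal.
    """
    for attempt in range(max_tries):
        result = search()
        if result is not None:
            return result
        time.sleep(wait_seconds)  # give the portal time to index the new write
    raise TimeoutError(f"item not found after {max_tries} tries")
```

In the real script, `search` would be a small wrapper around the content search for the freshly published centroids, and the sleep interval would be tuned to the portal's observed indexing latency.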
05-27-2021 10:35 PM

POST
Hi Tory, I have had this question for a long time. We are an organization purely on AGOL, and there hasn't been a way to schedule notebooks to run. We adopted Azure as a cloud computing platform and things changed. Within Azure, I can create Functions, which are triggered by the occurrence of time or of new data. I integrated ArcGIS API for Python scripts, scipy.spatial and QGIS, and I can now directly read and update feature services on AGOL using on-premise SQL Servers and Azure SQL databases in the cloud. Importantly, I can use Databricks, which is scalable and can handle millions of rows of data. I know you may not have these facilities; however, it may not be possible to do this within ArcGIS. I believe cloud computing platforms other than Azure are capable of doing the same. I am finding it quite limiting to rely on ArcGIS Pro to integrate GIS datasets with non-GIS data, and I am beginning to ask myself whether, as a GIS specialist, I should look beyond conventional GIS software.
05-27-2021 10:20 PM

POST
Resolved this issue by restricting myself to the ArcGIS API. I used this notebook and hence avoided writing to disk.
01-17-2021 09:32 PM

POST
Thanks, logged the issue: https://github.com/Esri/arcgis-python-api/issues/890. Will uninstall and reinstall 2.6 and see how I go.
12-20-2020 04:48 PM

POST
Was quite excited to update to ArcGIS Pro 2.7 and now I am wailing. This code ran well on ArcGIS Pro 2.6 until I rolled on to 2.7:

from arcgis.gis import GIS
import arcgis
from arcgis import features
from arcgis.geoanalytics import manage_data
from arcgis.features.manage_data import overlay_layers
from arcgis.features import GeoAccessor, GeoSeriesAccessor, FeatureLayer
import arcpy
import sys, os
import pandas as pd
import datetime as dt
from arcgis.features import FeatureLayerCollection
import http.client
import mimetypes
import json
import requests
from pandas import json_normalize
from arcgis import geometry
from copy import deepcopy
from arcgis.features.manage_data import dissolve_boundaries

gis = GIS("url.maps.arcgis.com", "UserName", "Password")
item = gis.content.get('Feature_Layer Portal ID')
l = item.layers[0]
df = l.query().sdf

# Create temporary folders and geodatabase
if arcpy.Exists(r"C:\Centroid"):
    arcpy.Delete_management(r"C:\Centroid")
arcpy.CreateFolder_management(r"C:", "Centroid")
if arcpy.Exists(r"C:\Centroid\Centroid"):
    arcpy.Delete_management(r"C:\Centroid\Centroid")
arcpy.CreateFolder_management(r"C:\Centroid", "Centroid")
if arcpy.Exists(r"C:\Centroid\Centroid\KCentroids.gdb"):
    arcpy.Delete_management(r"C:\Centroid\Centroid\KCentroids.gdb")
arcpy.CreateFileGDB_management(r"C:\Centroid\Centroid", "KCentroids")

# Write spatially enabled dataframe to disk
df.spatial.to_featureclass(r'C:\Centroid\Centroid\KCentroids.gdb\jana')

For reasons I can't explain, df.spatial.to_featureclass can neither write a feature class to a geodatabase nor a shapefile to a folder. Has anyone experienced this? If not, what could it be, given that the update itself was successful? I get the error: ValueError: invalid JSON data
12-17-2020 09:24 PM

POST
@LarrySpear , @DanPatterson Saw this a tad late, but here is another way to go about it. The ArcGIS API for Python has elaborate methods to extract centroids; refer to this document if you have your content on AGOL:

from arcgis.gis import GIS
import arcgis
from arcgis import features
from arcgis.features import GeoAccessor, GeoSeriesAccessor, FeatureLayer
from arcgis import geometry
from arcgis.features.find_locations import find_centroids

# Sign in to the portal
gis = GIS("https://url.arcgis.com", "UserName", "Password")

# Feature service
item = gis.content.get('Feature_Service_ID')
l = item.layers[0]

# Generate centroids. PLEASE NOTE THEY WILL BE PUBLISHED ON PORTAL
poly_to_point = find_centroids(l, output_name="Centroids")

You can then generate and tabulate the geometry.
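The tabulation step can be sketched without any portal access: a point layer's features come back with Esri JSON geometries, so the coordinates can be pulled straight out of those dicts. The `tabulate_coordinates` helper and the sample values below are hypothetical illustrations, not output from an actual service.

```python
# Each feature's geometry from a point layer is Esri JSON,
# e.g. {"x": 174.76, "y": -36.85, "spatialReference": {...}}.
def tabulate_coordinates(features):
    """Return (longitude, latitude) pairs from point-geometry dicts."""
    return [(f["geometry"]["x"], f["geometry"]["y"]) for f in features]

# Hypothetical centroid features, shaped like a layer query result
centroids = [
    {"geometry": {"x": 174.76, "y": -36.85}},
    {"geometry": {"x": 175.28, "y": -37.79}},
]
print(tabulate_coordinates(centroids))
# [(174.76, -36.85), (175.28, -37.79)]
```

With the real layer, the feature dicts would come from querying the published `Centroids` result rather than being typed in by hand.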
12-10-2020 07:51 PM

POST
An option I found and now use is Azure's serverless compute, Functions. Key Vault secrets can be integrated into Functions: what is fed to the Function is the secret identifier, not credentials that could be used to retrieve the secrets conventionally. You can then add the relevant code for logging into AGOL in the Functions. That's the safest I got. If there are scripts required to run on schedules, use Azure TimerTriggers; if the scripts need to be initiated by feature class edits or updates, HttpTriggers can be used.
12-02-2020 11:15 PM

POST
Could you please provide the tables you are trying to edit/update? Spatially enabled dataframes could achieve the same, and I have found them much easier and, I think, the way of the future.
12-01-2020 06:47 PM

POST
I have a Python script (notebook) that runs on a timer trigger in Azure. The script is on a virtual machine (VM), and Azure Runbooks fires up the VM shortly before the timer trigger takes effect. The script, among other things, uses arcpy to compute a centroid for multiple polygons; I have installed ArcGIS Pro on the VM and that is what is used to run the script. Because of the costs and software updates associated with the VM, I have been asked to integrate this script into Azure HttpTrigger Functions. So far I have been able to set up a webhook which listens for edits on the dependent feature services and triggers the script to run. What is needed now is to integrate the centroid computation component into the Azure Functions. I am able to do this for everything except the centroid computation bit, which utilizes arcpy. This is because I can install every other package in Visual Studio Code and Azure except arcpy: whereas I am able to install arcgis in the Visual Studio Code virtual environment before I deploy the Function into Azure, I have challenges installing arcpy.

Questions:
1. Has someone done this before, and how do I go about it?
2. This post looks into the use of Docker to run ArcGIS Notebooks in Azure, but only for Enterprise. Can someone point me to documentation I can use for ArcGIS Online, since I am not on Enterprise?
12-01-2020 03:57 PM

POST
I had this resolved: I dropped duplicates. It being a large file, I hadn't noticed the duplicates.
11-23-2020 07:43 PM

POST
I have two feature services, an old one and a new one. They serve different purposes and are regularly updated by different teams. They have some common fields, but not all. I would like to regularly find common rows based on a key column and update one of them.

Layer to be updated:

item=gis.content.get('xxxxx')
l=item.layers[0]
df=l.query().sdf
df.head()

Layer to be used for the update:

item=gis.content.get('yyyyy')
l=item.layers[0]
df2=l.query().sdf
df2.head()

Using the ArcGIS API for Python, I tried the following:

g = l.query().features
features_for_update = []
for o in overlap_rows['key']:
    # get the feature to be updated
    original_feature = [f for f in g if f.attributes['key'] == o][0]
    print(original_feature)
    feature_to_be_updated = deepcopy(original_feature)
    print(o)
    matching_row = df2.where(df2.key == o).dropna()
    feature_to_be_updated.attributes['Long'] = float(matching_row['Long'])
    feature_to_be_updated.attributes['Lat'] = float(matching_row['Lat'])
    features_for_update.append(feature_to_be_updated)
features_for_update

This results in an error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
In [532]:
Line 9: feature_to_be_updated.attributes['Long'] = float(matching_row['Long'])
File C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3-clone1\lib\site-packages\pandas\core\series.py, in wrapper:
Line 112: raise TypeError(f"cannot convert the series to {converter}")
TypeError: cannot convert the series to <class 'float'>
---------------------------------------------------------------------------

On investigation, I found that for some reason the line below at times returns two rows and not one, despite the fact that the key is unique for each row:

matching_row = df2.where(df2.Ops_Code == o).dropna()

Any reasons why this is happening? Has someone done such an update before?
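One way to make this failure mode explicit is to guard the lookup so a duplicated "unique" key is detected before the float conversion runs. Here is a minimal pure-Python sketch of that matching step; the `match_unique` helper, the row dicts, and the field names are hypothetical illustrations, not the actual feature services.

```python
def match_unique(rows, key_field, key_value):
    """Return the single row matching key_value, or raise if the
    supposedly unique key actually appears more than once."""
    matches = [r for r in rows if r[key_field] == key_value]
    if len(matches) > 1:
        # This is the situation hit above: float(matching_row['Long'])
        # fails because the selection holds two rows, not one.
        raise ValueError(f"key {key_value!r} appears {len(matches)} times")
    return matches[0]

# Hypothetical stand-ins for df2's rows
rows = [
    {"key": "A1", "Long": 174.76, "Lat": -36.85},
    {"key": "B2", "Long": 175.28, "Lat": -37.79},
]
row = match_unique(rows, "key", "A1")
print(row["Long"], row["Lat"])  # 174.76 -36.85
```

In pandas terms, the equivalent guard is checking `len(matching_row)` before calling `float(...)`, or deduplicating up front with `df2.drop_duplicates(subset='key')` so only one row per key can ever match.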
11-23-2020 12:33 AM

POST
Not sure of the efficacy of the solution you are after, but this is how you go about it:

import pandas as pd

# Pivot df to send GrowthYr to columns and come up with a new frame
df1 = pd.pivot_table(df, values=['GrowYrAbr', 'AvgHt', 'MaxHt'], index=['Unit'], columns=['GrowthYr'])

# The step above results in a MultiIndex, so collapse it by renaming the
# columns using a list comprehension with the help of f-strings
df1.columns = [f'{a}_{b}' for a, b in df1.columns]

# Merge the new frame to the old
df2 = pd.merge(df, df1, how='left', on='Unit')
print(df2)

  Unit GrowthYr  GrowYrAbr  AvgHt  MaxHt  AvgHt_21  AvgHt_22  AvgHt_23  \
0          2021         21   0.59   0.94      9.58     1.495       NaN       NaN
1          2022         22   0.85   1.30     10.05       NaN     1.865       NaN
2          2023         23   1.12   1.68     10.52       NaN       NaN     2.245
3          2021         21   1.41   2.05      5.61     1.495       NaN       NaN
4          2022         22   1.71   2.43      6.14       NaN     1.865       NaN
5          2023         23   2.01   2.81      6.67       NaN       NaN     2.245

   GrowYrAbr_21  GrowYrAbr_22  GrowYrAbr_23  MaxHt_21  MaxHt_22  MaxHt_23
0           1.0           NaN           NaN     7.595       NaN       NaN
1           NaN          1.28           NaN       NaN     8.095       NaN
2           NaN           NaN         1.565       NaN       NaN     8.595
3           1.0           NaN           NaN     7.595       NaN       NaN
4           NaN          1.28           NaN       NaN     8.095       NaN
5           NaN           NaN         1.565       NaN       NaN     8.595
11-18-2020 10:44 PM

POST
The link tiles all my subscriptions, and I have to unlock each one and read it. What we had before was that questions on subscribed topics appeared in the notifications; you could read them and set them back to unread if you wanted to revisit them. It was therefore very easy to see developments on a specific question, or even reply to it and contribute on the fly, without going into each topic. Is that facility still available, and how do I set individual questions on a topic to be logged as single notification entries?
11-13-2020 05:07 PM

POST
This is not what I am after. That sends notifications to my personal email. I would like to set things up so that issues on topics of interest are logged in the platform's own notifications. That way, if I want to check past notifications, they are all in one place. Is that possible, and how do I get it done?
11-13-2020 04:51 PM