POST
Hello James, do you have a snippet or an example of what I could use? I am trying to see where to get started with this idea.
Posted 08-04-2020 07:40 AM

POST
Does anyone know if the Survey123 (S123) Report functionality can be included as an attachment with the same feature service? I am thinking there has to be something in the ArcGIS Python API to make this happen, but I am not entirely sure.
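A rough sketch of what this might look like with the ArcGIS API for Python, assuming the survey's form item ID, the feature service item ID, and the target OBJECTID are known (all IDs below are placeholders, and generate_report's return value has varied across arcgis releases, so verify it is a local file path before attaching):

from arcgis.gis import GIS
from arcgis.apps.survey123 import SurveyManager

gis = GIS("https://www.arcgis.com", "username", "password")

# Look up the survey by its form item ID (placeholder) and generate a report
survey = SurveyManager(gis).get("<survey form item id>")
template = survey.report_templates[0]                      # first available report template
report_file = survey.generate_report(template, where="objectid = 1")

# Attach the generated report to the matching feature in the same feature service
layer = gis.content.get("<feature service item id>").layers[0]
layer.attachments.add(1, report_file)                      # 1 = OBJECTID of the feature (placeholder)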
Posted 07-28-2020 08:06 AM

POST
Does anyone know if this issue has been resolved or has an alternate solution? I have a Python script that converts a JSON file to CSV and then overwrites a hosted feature service. The script recently threw an error indicating that the collection.manager.overwrite function is no longer working. I found this link that references the bug that has been reported: BUG-000115253: When using the overwrite method from arcgis.features.
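For reference, here is a minimal sketch of the overwrite workflow being described (the item ID and CSV path are placeholders); this is the call affected by BUG-000115253:

from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

gis = GIS("https://www.arcgis.com", "username", "password")

# Get the hosted feature layer item (placeholder ID) and overwrite it with the freshly converted CSV
item = gis.content.get("<hosted feature layer item id>")
collection = FeatureLayerCollection.fromitem(item)
collection.manager.overwrite(r"C:\data\converted.csv")  # placeholder path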
Posted 07-22-2020 08:41 AM

POST
Benjamin Mittler, how about adding the table through the dashboard?
Posted 05-01-2020 09:02 AM

POST
Benjamin Mittler, have you looked at creating a view layer? It gives you the ability to hide fields from the end user.
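If it helps, a view layer can also be created with the ArcGIS API for Python; a minimal sketch, assuming a hosted feature layer item ID (placeholder) and a hypothetical view name. Field visibility is then configured on the view's layer settings in ArcGIS Online:

from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

gis = GIS("https://www.arcgis.com", "username", "password")

# Create a view from an existing hosted feature layer (placeholder item ID)
source_item = gis.content.get("<hosted feature layer item id>")
flc = FeatureLayerCollection.fromitem(source_item)
view_item = flc.manager.create_view(name="MyLayer_public_view")  # hypothetical view name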
Posted 04-27-2020 05:32 PM

POST
Kevin, here are the two solutions that Esri recommended; the first one worked for me with no problem. Let me know if you do not get it up and running.

Workarounds

Create a feature class from the CSV:
1. Add the CSV file to the map.
2. Right-click the table and click Display XY Data.
3. Fill in all the required information.
4. Create a feature class.
5. Use this as the input feature layer for appending the data.

OR

Publish the CSV file as a hosted feature layer in ArcGIS Online:
1. Click Add Item > From your computer in ArcGIS Online.
2. Browse for the CSV file.
3. Make sure the box for "Publish as hosted feature service" is checked.
4. Fill in all the required fields.
5. Click Publish.
6. Add both published hosted feature layers into ArcMap.
7. Use the Append (Data Management) tool, with the hosted feature layer created from the CSV file as the input feature layer and the other hosted feature layer as the target layer.
8. Click Run.
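The same workflow can also be scripted; a rough sketch with arcpy, assuming you are already signed in to the portal and that the CSV path, geodatabase path, and service URL below are placeholders:

import arcpy

# Sign in so the hosted feature layer can be edited (placeholder credentials)
arcpy.SignInToPortal("https://www.arcgis.com", "username", "password")

# Convert the CSV to points (stand-in for the Display XY Data step)
arcpy.management.XYTableToPoint(r"C:\data\updates.csv",
                                r"C:\data\scratch.gdb\updates_points",
                                "longitude", "latitude")

# Append the new points into the existing hosted feature layer (NO_TEST skips strict schema matching)
arcpy.management.Append(r"C:\data\scratch.gdb\updates_points",
                        "https://services.arcgis.com/<org id>/arcgis/rest/services/<layer>/FeatureServer/0",
                        "NO_TEST")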
Posted 04-27-2020 05:30 PM

POST
Suzann Leininger, are you able to group the filter selection?
Posted 04-27-2020 11:24 AM

IDEA
Also, having a timestamp embedded in the export would be helpful.
Posted 04-16-2020 09:15 AM

POST
I am attempting to find a solution through Arcade to combine multiple values into one. I have an example below where Navajo County would represent 5 and Pima County would represent 135, instead of having individual records on the map.
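To illustrate the grouping being asked for (the actual solution would live in an Arcade expression that aggregates the layer by county), here is a rough sketch of the same aggregation in pandas; the field name and counts are assumptions taken from the example above:

import pandas as pd

# Hypothetical records: one row per incident, tagged with the county it falls in
df = pd.DataFrame({"county": ["Navajo"] * 5 + ["Pima"] * 135})

# Collapse the individual records into one count per county
counts = df.groupby("county").size().reset_index(name="count")
print(counts)  # Navajo -> 5, Pima -> 135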
Posted 04-11-2020 11:50 AM

POST
This is a script that was created in a Jupyter Notebook and exported as a Python script that runs on a server once a day. It parses a website, builds a data frame/CSV, and then appends the result to a point feature service. The arcpy.management.Append call near the end (line 134 in the original export) should work for your workflow. Hope that helps.

#!/usr/bin/env python
# coding: utf-8

import bs4 as bs
import urllib.request
import pandas as pd
import numpy as np
from arcgis.gis import GIS
from arcgis import features
from copy import deepcopy
import arcpy

# Scrape the burn table from the ADEQ smoke management page
source = urllib.request.urlopen('https://smoke.azdeq.gov/').read()
soup = bs.BeautifulSoup(source, 'lxml')

table = soup.find('table')
table_rows = table.find_all('tr')
for tr in table_rows:
    td = tr.find_all('td')
    row = [i.text for i in td]
    print(row)

# Read the same table into a pandas DataFrame
dfs = pd.read_html('https://smoke.azdeq.gov/', header=0)
for df in dfs:
    print(df)

df.head()

# Strip parentheses and quotes from the Location column
df['Location'] = df['Location'].astype(str).str.replace(r"[\(\)']", '', regex=True)

lat = []
lon = []

# For each row in the Location column,
for row in df['Location']:
    # Try to,
    try:
        # Split the row by comma and append everything before the comma to lat
        lat.append(row.split(',')[0])
        # Split the row by comma and append everything after the comma to lon
        lon.append(row.split(',')[1])
    # But if you get an error
    except:
        # append a missing value to lat and lon
        lat.append(np.NaN)
        lon.append(np.NaN)

# Create two new columns from lat and lon
df['latitude'] = lat
df['longitude'] = lon

# Drop the leading token from the latitude string
df['latitude'] = df['latitude'].apply(lambda x: ' '.join(x.split(' ')[1:]))

del df['Location']
df.head()

# Write the cleaned table to CSV
df.to_csv(r'C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\CSV\ADEQ_RX_Burns1.csv', encoding='utf-8', index=False)

# Sign in to ArcGIS Online
gis = GIS("https://www.arcgis.com", "Your Username", "Your Password")
print("Logged in as " + str(gis.properties.user.username))

# Convert the CSV to a point feature class in the staging geodatabase
arcpy.management.XYTableToPoint(
    r"C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\CSV\ADEQ_RX_Burns1.csv",
    r"C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1",
    "longitude", "latitude", None,
    "GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]];-400 -400 1000000000;-100000 10000;-100000 10000;8.98315284119521E-09;0.001;0.001;IsHighPrecision")

# Stamp the load date/time into the Ignition_Date field
arcpy.management.CalculateField(
    r"C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1",
    "Ignition_Date", "datetime.datetime.now()", "PYTHON3", '', "TEXT")

# Append the new points to the hosted point feature service
arcpy.management.Append(
    r"C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1",
    "https://services6.arcgis.com/l7uujk4hHifqabRB/arcgis/rest/services/ADEQ_RX_Burns1/FeatureServer/0",
    "NO_TEST",
    r'Smoke_Unit_Number "Smoke Unit Number" true true false 2147483647 Text 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,Smoke_Unit_Number,0,5000;Burn_Number "Burn Number" true true false 2147483647 Text 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,Burn_Number,0,5000;Burn_Name "Burn Name" true true false 2147483647 Text 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,Burn_Name,0,5000;Ignition_Date "Ignition Date" true true false 8 Date 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,Ignition_Date,-1,-1;Approved_Acres "Approved Acres" true true false 0 Long 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,Approved_Acres,-1,-1;Notes "Notes" true true false 2147483647 Text 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,Notes,0,5000;latitude "latitude" true true false 0 Double 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,latitude,-1,-1;longitude "longitude" true true false 0 Double 0 0,First,#,C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1,longitude,-1,-1',
    '', '')

# Clean up the staging feature class
arcpy.management.Delete(
    r"C:\Code\Python\Web_Scraping\ADEQ_RX_Burns\New File Geodatabase.gdb\ADEQ_RX_Burns1", '')
Posted 02-07-2020 12:08 PM
Online Status | Offline
Date Last Visited | 11-11-2020 02:24 AM