POST
I don't know exactly what happened. I have already added 7-Zip to my environment, but when I run the script it cannot find 7-Zip. Here is the error message:

(arcgispro-py3) C:\Users\User\Documents>python coral_reef_exercise_online.py https://downloads.esri.com/LearnArcGIS/update-real-time-data-with-python/vs_polygons.json C:\Temp\Work.gdb c7bdbfb02684459bae0e16e7bf3dbc5e C:\Temp\Coral_Reef_Watch.sd Coral_Reef_Watch
Starting workGDB...
Downloading data...
Creating feature classes...
Deploying...
Traceback (most recent call last):
  File "coral_reef_exercise_online.py", line 115, in <module>
    feedRoutine (url, workGDB, itemid, original_sd_file, service_name)
  File "coral_reef_exercise_online.py", line 63, in feedRoutine
    deployLogic(workGDB, itemid, original_sd_file, service_name)
  File "coral_reef_exercise_online.py", line 88, in deployLogic
    raise Exception('7-Zip could not be found in the PATH environment variable')
Exception: 7-Zip could not be found in the PATH environment variable

Yet 7-Zip is installed on my machine. I would be very thankful if anyone can help me solve this problem.
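For what it's worth, the exception says the script could not find 7-Zip on the PATH variable, which is a different condition from 7-Zip being installed. A small stdlib sketch (an assumption: the executable is named 7z.exe, as in a default 7-Zip install) shows what the Python session actually sees:

```python
import os
import shutil

# Look for the 7-Zip executable on the PATH that this Python process sees.
# "7z" is the usual executable name; adjust if your install differs.
found = shutil.which("7z")
if found is None:
    # Print the directories Python searched, to spot a missing entry
    # such as C:\Program Files\7-Zip.
    for d in os.environ.get("PATH", "").split(os.pathsep):
        print(d)
    print("7z not found - add its folder to PATH and reopen the prompt")
else:
    print("7z found at", found)
```

If `shutil.which("7z")` returns `None` in the same `arcgispro-py3` prompt, a common cause is that the 7-Zip folder was added to PATH after the console was opened, or only for another user, so reopening the prompt after editing the system PATH is worth trying.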
07-17-2021 08:12 PM

POST
Thank you for your help, now the problem is solved.
07-17-2021 06:54 PM

POST
Thank you for your suggestion; I have updated my post. Here is my script:

import sys, os, tempfile, json, logging, arcpy, shutil
import datetime as dt
from urllib import request
from urllib.error import URLError

def feedRoutine(url, workGDB, liveGDB):
    # Log file
    logging.basicConfig(filename="coral_reef_exercise.log", level=logging.INFO)
    log_format = "%Y-%m-%d %H:%M:%S"
    # Create workGDB and default workspace
    print("Starting workGDB...")
    logging.info("Starting workGDB... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.env.workspace = workGDB
    gdb_name = os.path.basename(workGDB)
    if arcpy.Exists(arcpy.env.workspace):
        for feat in arcpy.ListFeatureClasses("alert_*"):
            arcpy.management.Delete(feat)
    else:
        arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))
    # Download and split json file
    print("Downloading data...")
    logging.info("Downloading data... {0}".format(dt.datetime.now().strftime(log_format)))
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    try:
        response = request.urlretrieve(url, filename)
    except URLError:
        raise Exception("{0} not available. Check internet connection or url address".format(url))
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    logging.info("Creating feature classes... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
    arcpy.conversion.JSONToFeatures(areas_json_path, os.path.join(gdb_name, 'alert_areas'))
    # Add 'alert_level' field
    arcpy.management.AddField(os.path.join(gdb_name, "alert_stations"), "alert_level", "SHORT", field_alias="Alert Level")
    arcpy.management.AddField(os.path.join(gdb_name, "alert_areas"), "alert_level", "SHORT", field_alias="Alert Level")
    # Calculate 'alert_level' field
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_stations"), "alert_level", "int(!alert!)")
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_areas"), "alert_level", "int(!alert!)")
    # Deployment Logic
    print("Deploying...")
    logging.info("Deploying... {0}".format(dt.datetime.now().strftime(log_format)))
    deployLogic(workGDB, liveGDB)
    # Close Log File
    logging.shutdown()
    # Return
    print("Done!")
    logging.info("Done! {0}".format(dt.datetime.now().strftime(log_format)))
    return True

def deployLogic(workGDB, liveGDB):
    for root, dirs, files in os.walk(workGDB, topdown=False):
        files = [f for f in files if '.lock' not in f]
        for f in files:
            shutil.copy2(os.path.join(workGDB, f), os.path.join(liveGDB, f))

if __name__ == "__main__":
    [url, workGDB, liveGDB] = sys.argv[1:]
    feedRoutine(url, workGDB, liveGDB)

and here is the error message:

(arcgispro-py3) C:\Users\User\Documents>python coral_reef_exercise_local.py https://downloads.esri.com/LearnArcGIS/update-real-time-data-with-python/vs_polygons.json C:\Temp\Work.gdb C:\Temp\Live.gdb
Starting workGDB...
Downloading data...
Creating feature classes...
Traceback (most recent call last):
  File "coral_reef_exercise_local.py", line 80, in <module>
    feedRoutine (url, workGDB, liveGDB)
  File "coral_reef_exercise_local.py", line 50, in feedRoutine
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\conversion.py", line 576, in JSONToFeatures
    raise e
  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\conversion.py", line 573, in JSONToFeatures
    retval = convertArcObjectToPythonObject(gp.JSONToFeatures_conversion(*gp_fixargs((in_json_file, out_features, geometry_type), True)))
  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\geoprocessing\_base.py", line 512, in <lambda>
    return lambda *args: val(*gp_fixargs(args, True))
arcgisscripting.ExecuteError: ERROR 999999: Something unexpected caused the tool to fail. Contact Esri Technical Support (http://esriurl.com/support) to Report a Bug, and refer to the error help for potential solutions or workarounds.
CreateFeatureClassName: The workspace Work.gdb does not exist.
Failed to execute (JSONToFeatures).

I would be very happy if you take the time to help me solve this problem. Thank you.
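One possible reading of "The workspace Work.gdb does not exist" (an assumption from the message, not a confirmed diagnosis): `gdb_name = os.path.basename(workGDB)` strips the directory part, so the output path handed to JSONToFeatures is relative and resolves against the current working directory (C:\Users\User\Documents) rather than C:\Temp. A stdlib sketch of the difference, using `ntpath` so the Windows path rules apply on any OS:

```python
import ntpath  # Windows path semantics, so this sketch runs anywhere

work_gdb = r"C:\Temp\Work.gdb"         # full path given on the command line
gdb_name = ntpath.basename(work_gdb)   # 'Work.gdb' - the directory part is gone

# What the script hands to JSONToFeatures: a *relative* path, resolved
# against the current working directory, where no Work.gdb exists.
relative_out = ntpath.join(gdb_name, "alert_stations")
print(relative_out)    # Work.gdb\alert_stations

# Joining with the full geodatabase path instead keeps it absolute:
absolute_out = ntpath.join(work_gdb, "alert_stations")
print(absolute_out)    # C:\Temp\Work.gdb\alert_stations
```

So either running the script from C:\Temp, or joining outputs with `workGDB` instead of `gdb_name`, might be worth testing.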
07-16-2021 10:28 PM

POST
I'm sorry, I'm a beginner with programming languages, especially Python. When I follow your suggestion, the result is still an error.

import sys, os, tempfile, json, logging, arcpy, shutil
import datetime as dt
from urllib import request
from urllib.error import URLError

def feedRoutine(url, workGDB, liveGDB):
    # Log file
    logging.basicConfig(filename="coral_reef_exercise.log", level=logging.INFO)
    log_format = "%Y-%m-%d %H:%M:%S"
    # Create workGDB and default workspace
    print("Starting workGDB...")
    logging.info("Starting workGDB... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.env.workspace = workGDB
    gdb_name = os.path.basename(workGDB)
    if arcpy.Exists(arcpy.env.workspace):
        for feat in arcpy.ListFeatureClasses("alert_*"):
            arcpy.management.Delete(feat)
    else:
        arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))
    # Download and split json file
    print("Downloading data...")
    logging.info("Downloading data... {0}".format(dt.datetime.now().strftime(log_format)))
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    try:
        response = request.urlretrieve(url, filename)
    except URLError:
        raise Exception("{0} not available. Check internet connection or url address".format(url))
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    logging.info("Creating feature classes... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
    arcpy.conversion.JSONToFeatures(areas_json_path, os.path.join(gdb_name, 'alert_areas'))
    # Add 'alert_level' field
    arcpy.management.AddField(os.path.join(gdb_name, "alert_stations"), "alert_level", "SHORT", field_alias="Alert Level")
    arcpy.management.AddField(os.path.join(gdb_name, "alert_areas"), "alert_level", "SHORT", field_alias="Alert Level")
    # Calculate 'alert_level' field
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_stations"), "alert_level", "int(!alert!)")
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_areas"), "alert_level", "int(!alert!)")
    # Deployment Logic
    print("Deploying...")
    logging.info("Deploying... {0}".format(dt.datetime.now().strftime(log_format)))
    deployLogic(workGDB, liveGDB)
    # Close Log File
    logging.shutdown()
    # Return
    print("Done!")
    logging.info("Done! {0}".format(dt.datetime.now().strftime(log_format)))
    return True

def deployLogic(workGDB, liveGDB):
    for root, dirs, files in os.walk(workGDB, topdown=False):
        files = [f for f in files if '.lock' not in f]
        for f in files:
            shutil.copy2(os.path.join(workGDB, f), os.path.join(liveGDB, f))

if __name__ == "__main__":
    [url, workGDB, liveGDB] = sys.argv[1:]
    feedRoutine(url, workGDB, liveGDB)
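The download-and-split step in the script is plain Python, independent of arcpy. A minimal sketch of the same splitting logic, with a tiny inline FeatureCollection (hypothetical sample data) standing in for the downloaded file:

```python
import json

# Hypothetical stand-in for the downloaded GeoJSON file.
raw = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [120.0, -8.5]},
     "properties": {"alert": "2"}},
    {"type": "Feature",
     "geometry": {"type": "Polygon", "coordinates": [[[0,0],[1,0],[1,1],[0,0]]]},
     "properties": {"alert": "1"}}
  ]
}""")

# Same split as feedRoutine: Point features become stations,
# everything else becomes areas.
data_stations = dict(type=raw["type"], features=[])
data_areas = dict(type=raw["type"], features=[])
for feat in raw["features"]:
    if feat["geometry"]["type"] == "Point":
        data_stations["features"].append(feat)
    else:
        data_areas["features"].append(feat)

print(len(data_stations["features"]), len(data_areas["features"]))  # 1 1
```

Since this part runs without ArcGIS, it can be tested on its own; that helps confirm the failure is in the JSONToFeatures call rather than in the JSON handling.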
07-16-2021 07:35 PM

POST
But Work.gdb automatically appears when I run the previous script:

import sys, arcpy, os, tempfile, json
from urllib import request

def feedRoutine(url, workGDB):
    # workGDB and default workspace
    print("Creating workGDB...")
    arcpy.env.workspace = os.path.dirname(workGDB)
    gdb_name = os.path.basename(workGDB)
    arcpy.management.CreateFileGDB(arcpy.env.workspace, gdb_name)
    # Download and split json file
    print("Downloading data...")
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    response = request.urlretrieve(url, filename)
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
    arcpy.conversion.JSONToFeatures(areas_json_path, os.path.join(gdb_name, 'alert_areas'))
    # Add 'alert_level' field
    arcpy.management.AddField(os.path.join(gdb_name, "alert_stations"), "alert_level", "SHORT", field_alias="Alert Level")
    arcpy.management.AddField(os.path.join(gdb_name, "alert_areas"), "alert_level", "SHORT", field_alias="Alert Level")
    # Calculate 'alert_level' field
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_stations"), "alert_level", "int(!alert!)")
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_areas"), "alert_level", "int(!alert!)")
    # Deployment Logic
    print("Deploying...")
    deployLogic()
    # Return
    print("Done!")
    return True

def deployLogic():
    pass

if __name__ == "__main__":
    [url, workGDB] = sys.argv[1:]
    feedRoutine(url, workGDB)
07-16-2021 07:04 PM

POST
Here is my script:

import sys, os, tempfile, json, logging, arcpy, shutil
import datetime as dt
from urllib import request
from urllib.error import URLError

def feedRoutine(url, workGDB, liveGDB):
    # Log file
    logging.basicConfig(filename="coral_reef_exercise.log", level=logging.INFO)
    log_format = "%Y-%m-%d %H:%M:%S"
    # Create workGDB and default workspace
    print("Starting workGDB...")
    logging.info("Starting workGDB... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.env.workspace = workGDB
    gdb_name = os.path.basename(workGDB)
    if arcpy.Exists(arcpy.env.workspace):
        for feat in arcpy.ListFeatureClasses("alert_*"):
            arcpy.management.Delete(feat)
    else:
        arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))
    # Download and split json file
    print("Downloading data...")
    logging.info("Downloading data... {0}".format(dt.datetime.now().strftime(log_format)))
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    try:
        response = request.urlretrieve(url, filename)
    except URLError:
        raise Exception("{0} not available. Check internet connection or url address".format(url))
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    logging.info("Creating feature classes... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
    arcpy.conversion.JSONToFeatures(areas_json_path, os.path.join(gdb_name, 'alert_areas'))
    # Add 'alert_level' field
    arcpy.management.AddField(os.path.join(gdb_name, "alert_stations"), "alert_level", "SHORT", field_alias="Alert Level")
    arcpy.management.AddField(os.path.join(gdb_name, "alert_areas"), "alert_level", "SHORT", field_alias="Alert Level")
    # Calculate 'alert_level' field
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_stations"), "alert_level", "int(!alert!)")
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_areas"), "alert_level", "int(!alert!)")
    # Deployment Logic
    print("Deploying...")
    logging.info("Deploying... {0}".format(dt.datetime.now().strftime(log_format)))
    deployLogic(workGDB, liveGDB)
    # Close Log File
    logging.shutdown()
    # Return
    print("Done!")
    logging.info("Done! {0}".format(dt.datetime.now().strftime(log_format)))
    return True

def deployLogic(workGDB, liveGDB):
    for root, dirs, files in os.walk(workGDB, topdown=False):
        files = [f for f in files if '.lock' not in f]
        for f in files:
            shutil.copy2(os.path.join(workGDB, f), os.path.join(liveGDB, f))

if __name__ == "__main__":
    [url, workGDB, liveGDB] = sys.argv[1:]
    feedRoutine(url, workGDB, liveGDB)
07-16-2021 06:50 PM

POST
And when I continue to the next step, I run into this problem. I have attached the error and the script I made. I would be very happy if you can help me with this problem. Thank you.
07-16-2021 06:45 PM

POST
I want to learn automatic updating of real-time data with Python in the Esri training (https://learn.arcgis.com/en/projects/update-real-time-data-with-python/), but I have some trouble. I have attached the error and the script I made. Here is my script:

import sys, os, tempfile, json, logging, arcpy, shutil
import datetime as dt
from urllib import request
from urllib.error import URLError

def feedRoutine(url, workGDB, liveGDB):
    # Log file
    logging.basicConfig(filename="coral_reef_exercise.log", level=logging.INFO)
    log_format = "%Y-%m-%d %H:%M:%S"
    # Create workGDB and default workspace
    print("Starting workGDB...")
    logging.info("Starting workGDB... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.env.workspace = workGDB
    gdb_name = os.path.basename(workGDB)
    if arcpy.Exists(arcpy.env.workspace):
        for feat in arcpy.ListFeatureClasses("alert_*"):
            arcpy.management.Delete(feat)
    else:
        arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))
    # Download and split json file
    print("Downloading data...")
    logging.info("Downloading data... {0}".format(dt.datetime.now().strftime(log_format)))
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    try:
        response = request.urlretrieve(url, filename)
    except URLError:
        raise Exception("{0} not available. Check internet connection or url address".format(url))
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    logging.info("Creating feature classes... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
    arcpy.conversion.JSONToFeatures(areas_json_path, os.path.join(gdb_name, 'alert_areas'))
    # Add 'alert_level' field
    arcpy.management.AddField(os.path.join(gdb_name, "alert_stations"), "alert_level", "SHORT", field_alias="Alert Level")
    arcpy.management.AddField(os.path.join(gdb_name, "alert_areas"), "alert_level", "SHORT", field_alias="Alert Level")
    # Calculate 'alert_level' field
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_stations"), "alert_level", "int(!alert!)")
    arcpy.management.CalculateField(os.path.join(gdb_name, "alert_areas"), "alert_level", "int(!alert!)")
    # Deployment Logic
    print("Deploying...")
    logging.info("Deploying... {0}".format(dt.datetime.now().strftime(log_format)))
    deployLogic(workGDB, liveGDB)
    # Close Log File
    logging.shutdown()
    # Return
    print("Done!")
    logging.info("Done! {0}".format(dt.datetime.now().strftime(log_format)))
    return True

def deployLogic(workGDB, liveGDB):
    for root, dirs, files in os.walk(workGDB, topdown=False):
        files = [f for f in files if '.lock' not in f]
        for f in files:
            shutil.copy2(os.path.join(workGDB, f), os.path.join(liveGDB, f))

if __name__ == "__main__":
    [url, workGDB, liveGDB] = sys.argv[1:]
    feedRoutine(url, workGDB, liveGDB)

Here is the error message:

(arcgispro-py3) C:\Users\User\Documents>python coral_reef_exercise_local.py https://downloads.esri.com/LearnArcGIS/update-real-time-data-with-python/vs_polygons.json C:\Temp\Work.gdb C:\Temp\Live.gdb
Starting workGDB...
Downloading data...
Creating feature classes...
Traceback (most recent call last):
  File "coral_reef_exercise_local.py", line 80, in <module>
    feedRoutine (url, workGDB, liveGDB)
  File "coral_reef_exercise_local.py", line 50, in feedRoutine
    arcpy.conversion.JSONToFeatures(stations_json_path, os.path.join(gdb_name, 'alert_stations'))
  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\conversion.py", line 576, in JSONToFeatures
    raise e
  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\conversion.py", line 573, in JSONToFeatures
    retval = convertArcObjectToPythonObject(gp.JSONToFeatures_conversion(*gp_fixargs((in_json_file, out_features, geometry_type), True)))
  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\geoprocessing\_base.py", line 512, in <lambda>
    return lambda *args: val(*gp_fixargs(args, True))
arcgisscripting.ExecuteError: ERROR 999999: Something unexpected caused the tool to fail. Contact Esri Technical Support (http://esriurl.com/support) to Report a Bug, and refer to the error help for potential solutions or workarounds.
CreateFeatureClassName: The workspace Work.gdb does not exist.
Failed to execute (JSONToFeatures).

I would be very happy if someone can help me with this problem. Thank you.
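The deployLogic step in the script is also plain Python. A self-contained sketch of the same copy-everything-except-.lock-files behavior, using temporary folders as hypothetical stand-ins for Work.gdb and Live.gdb:

```python
import os
import shutil
import tempfile

# Hypothetical stand-ins for the work and live geodatabase folders.
work = tempfile.mkdtemp()
live = tempfile.mkdtemp()

# A file geodatabase folder holds data files plus transient .lock files.
for name in ("a00000001.gdbtable", "gdb", "something.lock"):
    with open(os.path.join(work, name), "w") as f:
        f.write("data")

# Same filter as deployLogic: copy every file whose name has no '.lock'.
for root, dirs, files in os.walk(work, topdown=False):
    for f in (f for f in files if ".lock" not in f):
        shutil.copy2(os.path.join(root, f), os.path.join(live, f))

print(sorted(os.listdir(live)))  # ['a00000001.gdbtable', 'gdb']
```

One small difference: this sketch joins `root` (the folder os.walk is currently visiting) rather than the top-level `workGDB`; that only matters if the geodatabase folder ever contains subdirectories.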
07-16-2021 06:43 PM

POST
I am trying to follow the tutorial in the Esri training (https://learn.arcgis.com/en/projects/update-real-time-data-with-python/), but I have a problem converting JSON to features. I think the error is in lines 38-42 of the script I made, but I don't know exactly where.
07-14-2021 10:58 PM

POST
I have never used this. I will use it for crime analysis. I can't find this tool in ArcGIS Pro. Can I find it in ArcMap?
04-09-2020 05:10 AM

POST
I have never used this. I will use it for crime analysis. I can't find this tool in ArcGIS Pro. Can I find it in ArcMap?
04-09-2020 04:42 AM

POST
Thank you for your reply. I looked in the toolbox, but I can't find the Extract Date Parts To Field tool.
04-09-2020 03:44 AM

POST
Thank you for your reply. I looked in the toolbox, but I can't find this.
04-09-2020 03:41 AM
Online Status: Offline
Date Last Visited: 07-21-2023 04:33 AM