Python Error Message: Workspace Work.gdb already exists. Failed to execute (CreateFileGDB)

03-27-2022 06:10 PM
Vjk62
New Contributor II

My question is similar to two previous posts.

1. Workspace work gdb does not exist Post 1 

2. Workspace work gdb does not exist Post 2

My question concerns the same subject, automatically updating real-time data with Python, from the Esri training project (Update Real Time Data With Python in ArcGIS Pro).

I also encountered similar errors when I ran the code, based on the project description linked above, in the Python Command Prompt.

The prompt showed the error message: "Work.gdb already exists. Failed to execute (CreateFileGDB)"

import sys, arcpy, os, tempfile, json
from urllib import request

def feedRoutine(url, workGDB):
    # workGDB and default workspace
    print("Creating workGDB...")
    arcpy.env.workspace = workGDB 
    arcpy.env.overwriteOutput = True  # in case the gdb is already created
    arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))
    
    # Download and split json file
    print("Downloading data...")
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    response = request.urlretrieve(url, filename)
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    arcpy.conversion.JSONToFeatures(stations_json_path, 'alert_stations') 
    arcpy.conversion.JSONToFeatures(areas_json_path, 'alert_areas')
    # Add 'alert_level' field
    arcpy.management.AddField('alert_stations', 'alert_level', 'SHORT', field_alias='Alert Level')
    arcpy.management.AddField('alert_areas', 'alert_level', 'SHORT', field_alias='Alert Level')
    # Calculate 'alert_level' field
    arcpy.management.CalculateField('alert_stations', 'alert_level', "int(!alert!)")
    arcpy.management.CalculateField('alert_areas', 'alert_level', "int(!alert!)")

    # Deployment Logic
    print("Deploying...")
    deployLogic()

    # Return
    print("Done!")
    return True

def deployLogic():
    pass

if __name__ == "__main__":
    [url, workGDB] = sys.argv[1:]
    feedRoutine(url, workGDB)

 

After running the code, I still have trouble getting my data in ArcGIS Pro to update in correspondence with the real-time NOAA data.

Please advise on what I could do to resolve the problem.

Thanks

7 Replies
AlfredBaldenweck
MVP Regular Contributor

Oof. First it doesn’t exist, and now you finally get it to exist and it’s causing problems.

The good news is that I think this is an easy problem.

Lines 5-7 (really line 7) create a work gdb according to your naming. What probably happened is that it worked the first time but not on any run after that, because the tool won't overwrite that initial gdb.

Throw line 7 into an if statement:

if not os.path.exists(workGDB):
    arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))

Basically, check to see if it's already there and make it if it isn't. You can also apply that same logic to whatever feature classes lived in there from the last time you ran the tool, in case those don't get overwritten.

I will say I'm a little confused as to why you want to overwrite it, though.

I suppose if you need an empty gdb, you could delete all the files in it or something, then go from there. Or delete the gdb first and then remake it? But again, I'm not sure why you would want that.

Hope this helps!

Vjk62
New Contributor II

Hi, thanks for your feedback. I revised the code accordingly, below.

 

import sys, os, tempfile, json, logging, arcpy, shutil
import datetime as dt
from urllib import request
from urllib.error import URLError

def feedRoutine(url, workGDB, liveGDB):
    # Log file
    logging.basicConfig(filename="coral_reef_exercise.log", level=logging.INFO)
    log_format = "%Y-%m-%d %H:%M:%S"
    # Create workGDB and default workspace
    print("Starting workGDB...")
    logging.info("Starting workGDB... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.env.workspace = workGDB
    if arcpy.Exists(arcpy.env.workspace):
        for feat in arcpy.ListFeatureClasses("alert_*"):
            arcpy.management.Delete(feat)
    else:
        arcpy.management.CreateFileGDB(os.path.dirname(workGDB), os.path.basename(workGDB))
    
    # Download and split json file
    print("Downloading data...")
    logging.info("Downloading data... {0}".format(dt.datetime.now().strftime(log_format)))
    temp_dir = tempfile.mkdtemp()
    filename = os.path.join(temp_dir, 'latest_data.json')
    try:
        response = request.urlretrieve(url, filename)
    except URLError:
        logging.exception("Failed on: request.urlretrieve(url, filename) {0}".format(
                          dt.datetime.now().strftime(log_format)))
        raise Exception("{0} not available. Check internet connection or url address".format(url))
    with open(filename) as json_file:
        data_raw = json.load(json_file)
        data_stations = dict(type=data_raw['type'], features=[])
        data_areas = dict(type=data_raw['type'], features=[])
    for feat in data_raw['features']:
        if feat['geometry']['type'] == 'Point':
            data_stations['features'].append(feat)
        else:
            data_areas['features'].append(feat)
    # Filenames of temp json files
    stations_json_path = os.path.join(temp_dir, 'points.json')
    areas_json_path = os.path.join(temp_dir, 'polygons.json')
    # Save dictionaries into json files
    with open(stations_json_path, 'w') as point_json_file:
        json.dump(data_stations, point_json_file, indent=4)
    with open(areas_json_path, 'w') as poly_json_file:
        json.dump(data_areas, poly_json_file, indent=4)
    # Convert json files to features
    print("Creating feature classes...")
    logging.info("Creating feature classes... {0}".format(dt.datetime.now().strftime(log_format)))
    arcpy.conversion.JSONToFeatures(stations_json_path, 'alert_stations') 
    arcpy.conversion.JSONToFeatures(areas_json_path, 'alert_areas')
    # Add 'alert_level' field
    arcpy.management.AddField('alert_stations', 'alert_level', 'SHORT', field_alias='Alert Level')
    arcpy.management.AddField('alert_areas', 'alert_level', 'SHORT', field_alias='Alert Level')
    # Calculate 'alert_level' field
    arcpy.management.CalculateField('alert_stations', 'alert_level', "int(!alert!)")
    arcpy.management.CalculateField('alert_areas', 'alert_level', "int(!alert!)")

    # Deployment Logic
    print("Deploying...")
    logging.info("Deploying... {0}".format(dt.datetime.now().strftime(log_format)))
    deployLogic(workGDB, liveGDB)
    
    # Close Log File
    logging.shutdown()

    # Return
    print("Done!")
    logging.info("Done! {0}".format(dt.datetime.now().strftime(log_format)))
    return True

def deployLogic(workGDB, liveGDB):
    for root, dirs, files in os.walk(workGDB, topdown=False):
        files = [f for f in files if '.lock' not in f]
        for f in files:
            shutil.copy2(os.path.join(workGDB, f), os.path.join(liveGDB, f))

if __name__ == "__main__":
    [url, workGDB, liveGDB] = sys.argv[1:]
    feedRoutine(url, workGDB, liveGDB)

 

 

Although I am no longer getting the error messages, I'm still having problems getting my data to update. The code is not working for the Python coral bleaching project, even with the sample-data URL recommended by Esri.

So, in essence, I'm stuck on the "Run the Stand-Alone Script" section of the project: my map still shows the NOAA data I downloaded last week, and it never updates, even after I changed the URL in the Python Command Prompt to the sample data provided by Esri for the project. I tried restarting ArcGIS Pro and refreshing the map; it doesn't work. I still think something is wrong with the code.

Thanks for your advice. If you have any recommendations for changing the code so that the data in ArcGIS Pro updates according to the URL given in the prompt, please let me know.

Thanks

AlfredBaldenweck
MVP Regular Contributor

I'd try debugging at various points.

  • The alert stations and areas don't appear in your liveGDB; do any of them appear in the work GDB correctly?
  • Try changing your temp_dir to a testing folder just to see what's happening in there: do those json files get populated?
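In the meantime, here's a minimal, arcpy-free sketch of the point/polygon split from the script, using made-up sample features, so you can sanity-check that part in any Python interpreter:

```python
# Hypothetical sample data standing in for NOAA's latest_data.json
data_raw = {"type": "FeatureCollection", "features": [
    {"geometry": {"type": "Point"}, "properties": {"alert": "2"}},
    {"geometry": {"type": "Polygon"}, "properties": {"alert": "1"}},
    {"geometry": {"type": "Point"}, "properties": {"alert": "0"}},
]}

# Same split the script performs: Point features vs. everything else
data_stations = {"type": data_raw["type"], "features": []}
data_areas = {"type": data_raw["type"], "features": []}
for feat in data_raw["features"]:
    if feat["geometry"]["type"] == "Point":
        data_stations["features"].append(feat)
    else:
        data_areas["features"].append(feat)

print(len(data_stations["features"]), len(data_areas["features"]))  # 2 1
```

If the split works here but your points.json and polygons.json come out empty, the problem is upstream in the download.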

That being said, I think your issue is probably Lines 26 and/or 31. You request the data as the variable "response", which you then never use again. What happens if you use "response" as the variable in Line 31, or just run request.urlretrieve() by itself, without naming it as a variable? (I think this is probably it.)
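One way to check how urlretrieve behaves, with or without the return value being kept, is a small stdlib-only sketch against a local file:// URL (no network, no arcpy; paths here are temp-dir placeholders):

```python
import os, tempfile
from urllib import request

# A small local file stands in for the remote JSON feed
src = os.path.join(tempfile.mkdtemp(), "src.json")
with open(src, "w") as f:
    f.write('{"type": "FeatureCollection", "features": []}')

# urlretrieve returns (local_path, headers); here the call is run
# by itself, without assigning the return value to anything
dest = os.path.join(tempfile.mkdtemp(), "latest_data.json")
request.urlretrieve("file://" + src, dest)

print(os.path.getsize(dest) > 0)  # True
```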

I can't test this myself, but I think that might be where the trouble is.

Hope this helps!

Vjk62
New Contributor II

Thanks for your suggestion! I tried debugging the code; however, an error now appears in the prompt for line 80, after I rewrote the script to run request.urlretrieve() by itself.

The error reads:

[url, workGDB, liveGDB] = sys.argv[1:]
ValueError: not enough values to unpack (expected 3, got 2)

What do you think it means?

Any advice would be greatly appreciated!

Thanks!

AlfredBaldenweck
MVP Regular Contributor

It's expecting three items, and you're giving it two.

I'd check that you're providing the URL and both GDB paths when you run the script.
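A minimal reproduction of that error, with hypothetical argument values:

```python
# Simulating sys.argv[1:] when only two arguments were passed on the
# command line (hypothetical values)
args = ["https://example.com/latest_data.json", "C:/Temp/Work.gdb"]

try:
    [url, workGDB, liveGDB] = args  # needs exactly three values
except ValueError as e:
    print(e)  # not enough values to unpack (expected 3, got 2)
```

So the fix is on the command line, not in the script: pass all three arguments, e.g. something like `python your_script.py <url> <workGDB> <liveGDB>` (script name hypothetical).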

DanPatterson
MVP Esteemed Contributor

Current Workspace (Environment setting)—ArcGIS Pro | Documentation

arcpy.env.overwriteOutput lets you overwrite feature classes inside a gdb (which is, in effect, a folder), not the gdb itself.

You have to check with arcpy.Exists, then use Delete from the management toolset:

Delete (Data Management)—ArcGIS Pro | Documentation

since it allows you to delete all kinds of data, including gdbs.
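In arcpy terms, that pattern is an arcpy.Exists check followed by arcpy.management.Delete before the CreateFileGDB call. Since arcpy isn't available outside Pro, here's a stdlib-only analogue of the delete-then-recreate pattern (a file gdb is just a folder on disk, so os.path.exists and shutil.rmtree stand in for the arcpy calls in this sketch; the path is a temp-dir placeholder):

```python
import os, shutil, tempfile

# Hypothetical work gdb path; pretend a previous run already created it
workGDB = os.path.join(tempfile.mkdtemp(), "Work.gdb")
os.makedirs(workGDB)

# Delete-then-recreate pattern (arcpy.Exists / arcpy.management.Delete /
# arcpy.management.CreateFileGDB in the real script)
if os.path.exists(workGDB):
    shutil.rmtree(workGDB)
os.makedirs(workGDB)

print(os.path.isdir(workGDB))  # True
```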


... sort of retired...
Vjk62
New Contributor II

Hi, thanks for your reply. Although I'm no longer receiving the error messages, I'm still having trouble getting the data in Pro to update according to the URL given in the Python Command Prompt.

Please let me know if you have any advice for that.

Thanks
