Hello
I have created a script in ArcGIS Pro, run from a Notebook, that creates an empty geodatabase (i.e. with no records).
The geodatabase design is read from a JSON file in which I define the structure: feature datasets, feature classes, domains, fields, field default values and subtypes.
I apply for loops over the arcpy.management methods for most of the operations; however, the process is very slow.
Is it possible to implement this procedure with a cursor, so that features and their fields are added in a single pass?
Or are there other ways to optimize it?
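A cursor will not help with this part: cursors insert rows, not schema, so they cannot create fields. What can be batched is field creation, since arcpy.management.AddFields accepts a whole list of field definitions in a single geoprocessing call instead of one AddField call per field. A minimal sketch of the idea, assuming field dicts shaped like the ones the script below unpacks into AddField (the key names are an assumption; adjust them to your JSON schema):

```python
def to_field_description(fields):
    """Convert a list of field dicts into the list-of-lists format that
    arcpy.management.AddFields expects:
    [field_name, field_type, field_alias, field_length, default_value, field_domain].
    Missing optional entries are left empty."""
    description = []
    for f in fields:
        description.append([
            f['field_name'],
            f['field_type'],
            f.get('field_alias', ''),
            f.get('field_length', ''),
            f.get('default_value', ''),
            f.get('field_domain', ''),
        ])
    return description

# Example with hypothetical field dicts:
fields = [
    {'field_name': 'code', 'field_type': 'TEXT', 'field_length': 10},
    {'field_name': 'height_m', 'field_type': 'DOUBLE'},
]
desc = to_field_description(fields)
# One geoprocessing call per feature class instead of one per field:
# arcpy.management.AddFields(feature_class, desc)
```

Each arcpy.management call carries fixed geoprocessing overhead, so collapsing N AddField calls into one AddFields call is usually where the time goes.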
Is the file geodatabase being created on a network shared folder?
Can you create it locally on the machine, then copy it over to the final location?
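The "build locally, then copy" idea can be sketched like this. A file geodatabase is just a folder of files on disk, so an ordinary directory copy works once no process holds locks on it; the paths here are placeholders:

```python
# Sketch: build the GDB locally, then publish it to the shared location.
import shutil
from pathlib import Path

local_gdb = Path(r'C:\temp\design.gdb')           # built here first (placeholder)
network_gdb = Path(r'\\server\share\design.gdb')  # final destination (placeholder)

def publish_gdb(src, dst):
    """Replace dst with a fresh copy of the locally built geodatabase.
    Assumes nothing has the geodatabase open (no .lock files in use)."""
    if dst.exists():
        shutil.rmtree(dst)
    shutil.copytree(src, dst)

# publish_gdb(local_gdb, network_gdb)
```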
I am working locally, and it is still a very slow process.
How does the speed compare between running the process in ArcGIS Pro (locally) and in Notebook Server?
Well, I would say running it in the Notebook is faster than the alternative of creating and filling all the domains one by one by hand, creating the fields, assigning the subtypes, and so on.
Is there a good reason why you're not simply creating the database in Pro itself, or in ArcCatalog?
We are still defining the most suitable database model, so we decided to write a script that makes it easy to create an empty geodatabase with everything configured.
That way we can test and iterate on the design until we arrive at the most suitable one.
Thank you, no problem
Here is most of the code. The JSON file consists of the definitions of the parts of the GDB, built from lists and dictionaries. The script creates an empty GDB.
# FILE GEODATABASE GENERATION
from os import path
from json import load
import arcpy
from arcpy import env, management, SpatialReference
# Path to the working folder
local_path = ----path folder----
# Load the JSON schema file
with open(
        local_path + 'chr_geodatabase_schema_gdbfile.json',
        encoding='utf-8') as schema_file:
    gdb_schema = load(schema_file)
# Geodatabase-level info from the JSON
metadata = gdb_schema['metadata']
# Subtype info from the JSON
subtypes = gdb_schema['subtypes']
# Set ArcPy workspace
env.workspace = local_path
# Name, path and spatial reference of the GDB
geodatabase_name = metadata['gdb_name']
spatial_reference = SpatialReference(metadata['spatial_reference'])
local_copy_path = local_path + '\\' + geodatabase_name + '.gdb'
# Delete the geodatabase if it already exists
if path.exists(local_copy_path):
    r = management.Delete(local_copy_path)
# Create an empty FILE geodatabase
geodb = management.CreateFileGDB(local_path, geodatabase_name, "9.2")
# Point the workspace at the new GDB for the edits below
env.workspace = geodb[0]
# Create the feature datasets
esquemas = gdb_schema['featuredatasets']
for i in esquemas:
    management.CreateFeatureDataset(geodb, i['name'], spatial_reference)
# Create the domains
for domain in metadata['domains']:
    management.CreateDomain(
        geodb,
        domain['name'],
        domain['description'],
        domain['field_type'],
        domain['domain_type']
    )
    # Add the coded values for coded domains
    if domain['domain_type'] == 'CODED' and "codes" in domain:
        # TODO: warn if no code config is found
        for code in domain['codes']:
            code_key, code_value = list(code.items())[0]
            management.AddCodedValueToDomain(
                geodb,
                domain['name'],
                code_key,
                code_value
            )
    else:
        # TODO: add range-domain config, e.g. for date columns
        pass
# Create feature classes, add their fields and assign domains
for dataset in gdb_schema['datasets']:
    # Find the feature dataset (if any) that contains this layer
    outp = ''
    for e in esquemas:
        if dataset['name'] in e['capas']:
            outp = e['name']
            break
    if outp == '':
        outp = geodb
    feature_class = management.CreateFeatureclass(
        out_path=outp,
        out_name=dataset['name'],
        geometry_type=dataset['geometry_type'],
        spatial_reference=spatial_reference,
        out_alias=dataset['alias']
    )
    # Add fields (and their domains) to the newly created feature class
    for field in dataset['fields']:
        # Exclude 'default_value': AddField does not accept that parameter
        field_args = {key: field[key] for key in field if key != 'default_value'}
        management.AddField(
            feature_class,
            **field_args  # unpack the field definition before passing it
        )