ArcGIS Pro 2.6.8; Oracle 18c 10.7.1 EGDB
I have 300 tables in a production enterprise geodatabase. When I say tables, I mean both feature classes and non-spatial tables.
Some (but not all) of the tables have fields that need to be deleted. Fields:
Question:
What is a simple, safe way to delete the fields programmatically? For example, a model or Python script that:
What mechanism do you recommend for that use case?
Thanks.
So caveat - I'm not a Python person, but today I have been playing with ChatGPT to see how I can use it in GIS technologies. Some examples show it can write arcpy scripts.
So I asked ChatGPT the following question - "Write an ArcGIS python script that loops through an enterprise geodatabase that deletes fields called ASSET_STATUS_DATE, PROP_DATE, ASSM_DATE, REMOV_DATE, BLOCK and MAP_ROTATION attribute fields common to any feature classes and non-spatial tables in the same enterprise geodatabase. If the feature class table or non-spatial table has one of those fields to delete, then create a copy of the feature class or non-spatial table and store it in a file geodatabase."
This is the code it provided in seconds...
import arcpy
import os

# Set the workspace to your enterprise geodatabase
arcpy.env.workspace = r"C:\Path\To\Your\EnterpriseGeodatabase.sde"  # Update with your actual workspace path

# List of common attribute fields to be deleted
fields_to_delete = ["ASSET_STATUS_DATE", "PROP_DATE", "ASSM_DATE", "REMOV_DATE", "BLOCK", "MAP_ROTATION"]

# Path to the file geodatabase where copies will be created
output_fgdb = r"C:\Path\To\Your\OutputFileGeodatabase.gdb"  # Update with your actual path

# Create a file geodatabase if it doesn't exist
if not arcpy.Exists(output_fgdb):
    arcpy.CreateFileGDB_management(os.path.dirname(output_fgdb), os.path.basename(output_fgdb))

# Loop through feature classes
for fc in arcpy.ListFeatureClasses():
    has_deleted_fields = False
    fields_deleted = []
    for field in fields_to_delete:
        if arcpy.ListFields(fc, field):
            arcpy.DeleteField_management(fc, field)
            fields_deleted.append(field)
            has_deleted_fields = True
    # Create a copy if fields were deleted
    if has_deleted_fields:
        output_fc = os.path.join(output_fgdb, fc)
        arcpy.Copy_management(fc, output_fc)
        print(f"Created copy of '{fc}' in '{output_fgdb}' with fields deleted: {fields_deleted}")

# Loop through tables
for table in arcpy.ListTables():
    has_deleted_fields = False
    fields_deleted = []
    for field in fields_to_delete:
        if arcpy.ListFields(table, field):
            arcpy.DeleteField_management(table, field)
            fields_deleted.append(field)
            has_deleted_fields = True
    # Create a copy if fields were deleted
    if has_deleted_fields:
        output_table = os.path.join(output_fgdb, table)
        arcpy.Copy_management(table, output_table)
        print(f"Created copy of '{table}' in '{output_fgdb}' with fields deleted: {fields_deleted}")

print("Script completed.")
It then adds - "This script will delete the specified fields from the feature classes and non-spatial tables in the enterprise geodatabase and then create copies of those datasets with the specified fields removed in the specified file geodatabase (output_fgdb). Make sure to update the paths (arcpy.env.workspace, output_fgdb) as needed for your specific use case."
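One thing I'm not sure about: as written, the fields are deleted from the enterprise geodatabase first and the copy is made afterwards, so the file geodatabase copy will not contain the removed fields. If the copy is supposed to be a backup, I'm assuming the order would need to be flipped. Here is a rough, untested sketch of the feature class loop with the copy done first (it reuses fields_to_delete, output_fgdb, and the workspace from the script above):

# Variant of the feature class loop: copy the dataset to the file geodatabase
# BEFORE deleting the fields, so the copy keeps the original schema
# (my assumption about the intent - untested)
for fc in arcpy.ListFeatureClasses():
    found = [f for f in fields_to_delete if arcpy.ListFields(fc, f)]
    if found:
        arcpy.Copy_management(fc, os.path.join(output_fgdb, fc))  # backup first
        arcpy.DeleteField_management(fc, found)  # DeleteField accepts a list of field names
        print(f"Backed up '{fc}' and then deleted: {found}")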
I'm curious whether this makes sense to the Python folks here. If you do try the code, let me know how it works (or does not work) for you.
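Before I run anything destructive in production, I'm also thinking of doing a read-only pass first that just reports which feature classes and tables contain those fields. A rough sketch of that, assuming arcpy.da.Walk is an appropriate way to crawl an .sde connection (the connection path is a placeholder):

import arcpy
import os

# Placeholder connection file - update to the real .sde path
sde = r"C:\Path\To\Your\EnterpriseGeodatabase.sde"

# Same candidate fields as in the script above
fields_to_check = {"ASSET_STATUS_DATE", "PROP_DATE", "ASSM_DATE",
                   "REMOV_DATE", "BLOCK", "MAP_ROTATION"}

# Walk the geodatabase (feature datasets included) and report matches only;
# nothing is modified here
for dirpath, dirnames, filenames in arcpy.da.Walk(sde, datatype=["FeatureClass", "Table"]):
    for name in filenames:
        dataset = os.path.join(dirpath, name)
        present = sorted(fields_to_check & {f.name.upper() for f in arcpy.ListFields(dataset)})
        if present:
            print(f"{dataset}: {present}")

If that inventory looks right, I would then point the delete script at the same connection.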