
Quick way to determine metadata standard of many feature classes?

01-11-2024 08:00 PM
ShareUser
Esri Community Manager

Is there a quick and easy way to list or determine which metadata standard, style, or format is being used by many feature classes at once? A tool, Python code, arcpy, etc.? I want a list of the metadata styles of all of my organization's 200+ feature classes so I can see which ones need to be converted to Esri's Item Description metadata format. For example: ALL_ROADS: Esri style, ADDRESS_POINTS: FGDC style, PRECINCTS: Esri style, and so forth.

I did some searching but couldn't find any good methods for doing this.  I have used the arcpy Metadata class but it doesn't seem to have a way to tell you the feature class's metadata format.
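
For context, here is roughly all I can get out of the Metadata class for a single feature class (the path below is just a placeholder); it exposes the descriptive elements and the raw metadata XML, but no property that names the style or standard:

from arcpy import metadata as md

# Placeholder path to one feature class; swap in a real connection/feature class.
m = md.Metadata(r"\\server\connections\gisdata.sde\GISDATA.DBO.ALL_ROADS")

# The descriptive elements are exposed as properties...
print(m.title, m.tags, m.summary, m.credits)
# ...and the full metadata XML is available, but nothing names the metadata style directly.
print((m.xml or "")[:200])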

My situation is that I've discovered that one old feature class in our enterprise database uses the FGDC metadata style, and it's a problem because it contains metadata fields that are not exposed when viewing and editing metadata in ArcGIS, and those fields contain outdated info. I want to re-do the metadata for any feature classes that have non-Esri-style metadata, so I need to identify which feature classes need this work done. I did read about the "upgrade" button for metadata, but it was grayed out for me, and it wouldn't solve the problem anyway, because the documentation says that parts of the old metadata will still be retained. I definitely do not want that.

Thank you for reading!

1 Solution

Accepted Solutions
ShareUser
Esri Community Manager

I figured out a way to do it!  I wanted to share my solution here, in case it can help anyone else in the future. 

I wrote a Python script that uses the Metadata class to access each feature class's metadata XML, and I read through the XML to find the element that indicates the metadata standard ("ArcGISProfile"). I did some string manipulation to grab the metadata standard (as well as the date the metadata was last updated), and I wrote all the feature class names and their metadata standard, date, and other metadata elements to a csv file for easy sorting and examination. Here is the script; the part that gets the metadata standard is the block in get_metadata that parses the XML for the ArcGISProfile element:

 

""" Writes enterprise databases' feature classes' metadata elements, including metadata standard, to a csv file for use in determining which feature classes' metadata needs to be updated or put 
into a different standard.
Python 3 / ArcGIS Pro
Allen Dailey
1/25/24
"""

import arcpy
from arcpy import metadata as md
import csv

# Output csv file path. If the file already exists, it will be overwritten.
filename = r"\\path\to\csv_file.csv"
# Column names to use in csv file
columns = ["Database", "Dataset", "FeatureClass", "DateUpdated", "Standard", "Title", "Tags", "Summary", "Description",
           "UseLimitations", "Credits"]

# List of enterprise database sde connections. Metadata elements will be obtained for each fc in each db.
db_sde_list = [r"List\of\my\ArcGISPro\database\sde\connections.sde"]


def get_metadata(db, dataset, fc_list, csv_writer):
    """ Gets metadata elements for each feature class in a list and writes the metadata elements to a csv.
    :param db: String, name of the database, derived from the sde file path
    :param dataset: String, name of a dataset within the db, from arcpy.ListDatasets()
    :param fc_list: List of feature classes, from arcpy.ListFeatureClasses()
    :param csv_writer: CSV writer
    :return: None
    """
    # For each feature class that has been passed into this function:
    for fc in fc_list:
        # Instantiate it as a metadata object
        m = md.Metadata(fc)
        # Get each of the usual metadata elements from the feature class's metadata
        tt = m.title
        tg = m.tags
        s = m.summary
        d = m.description
        u = m.accessConstraints
        c = m.credits
        # Get the metadata standard from the xml file. First, access the xml.
        x = m.xml
        # Break the xml into a list of parts, in order to find the part that says the metadata standard
        chunklist = x.split(">")
        # Default standard and date to empty strings in case the elements are not found in the xml
        standard = ""
        date = ""
        # Go through each item in the list of parts of the xml file to find the standard
        for i in range(len(chunklist)):
            # "ArcGISProfile" is the field name for the metadata standard
            if "ArcGISProfile" in chunklist[i]:
                # The standard comes right after "ArcGISProfile."
                standard = chunklist[i + 1].split("<")[0]
                # Exit the loop to keep the correct content for the "standard" variable.
                break
        # Go through the xml to get the date metadata was last updated
        for i in range(len(chunklist)):
            if "mdDateSt" in chunklist[i]:
                date = chunklist[i+1].split("<")[0]
                break
        # List containing all the data to be written to a row of the csv file
        row = [db, dataset, fc, date, standard, tt, tg, s, d, u, c]
        # Write the row to the file
        csv_writer.writerow(row)


# Open the csv file and create a csv writer; newline="" prevents blank rows in the csv on Windows
with open(filename, "w", encoding="utf-8", newline="") as csvfile:
    writer = csv.writer(csvfile)
    # Write the column names to the csv file
    writer.writerow(columns)
    # For each database in the list, write its metadata elements to the csv file
    for db in db_sde_list:
        # Establish the database as the workspace in order to next access its datasets
        arcpy.env.workspace = db
        dbname = (db.split("\\")[-1]).split(".")[0]
        datasets = arcpy.ListDatasets() or []  # "or []" guards against a None return
        # Get metadata for any standalone feature classes that are not inside a feature dataset
        try:
            lone_fcs = arcpy.ListFeatureClasses()
            get_metadata(dbname, "", lone_fcs, writer)
        except Exception as e:
            print(f"Could not get standalone feature classes for {db}: {e}")
        for dataset in datasets:
            ds_fcs = arcpy.ListFeatureClasses(feature_dataset=dataset)
            get_metadata(dbname, dataset, ds_fcs, writer)
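
If you'd rather not split strings, the same two values can be pulled out with Python's built-in XML parser instead. This is just an alternative sketch, not part of the script above; it assumes the same ArcGISProfile and mdDateSt element names that the script already searches for:

import xml.etree.ElementTree as ET
from arcpy import metadata as md


def get_standard_and_date(item_path):
    """Return (metadata standard, last-updated date) for one item, or empty strings."""
    m = md.Metadata(item_path)
    if not m.xml:
        return "", ""
    root = ET.fromstring(m.xml)
    # Look anywhere in the document for the two elements the main script searches for
    profile = root.find(".//ArcGISProfile")  # e.g. "ItemDescription" for Esri-style metadata
    date_stamp = root.find(".//mdDateSt")
    standard = profile.text if profile is not None and profile.text else ""
    date = date_stamp.text if date_stamp is not None and date_stamp.text else ""
    return standard, date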

 

 


7 Replies
ShareUser
Esri Community Manager

Allen, I shared your question to

Quick way to determine metadata standard of many f... - Esri Community

It is a relatively new community space, but hopefully the owners will have a better chance of seeing metadata-related questions there than having them buried in the Pro community.

AllenDailey1
Frequent Contributor

Dan, thank you for sharing my post here!

Allen

ShareUser
Esri Community Manager

Okay, thank you Dan!

ctalleygreenville
Occasional Contributor

Thank you for this post. It was the code I needed to identify feature classes without any sort of metadata.

 

AllenDailey1
Frequent Contributor

Hi, thanks for letting me know!  I'm very glad to hear that it was useful for you!  😊

FYI, I later discovered that some feature classes that showed "no" metadata in the spreadsheet my code wrote actually had very old metadata that had not been edited since a very early version of ArcMap, so the metadata was not even recognizable to ArcGIS Pro. For some of these feature classes I could upgrade the metadata using the Upgrade button in the Pro metadata editor, but for others that didn't work and I had to copy and paste the metadata from the metadata editor in ArcCatalog.

Basically, some feature classes with no metadata in Pro might have super old metadata that's only accessible in ArcMap/ArcCatalog.
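
If it helps, here is a quick sketch along the same lines as the script above for flagging those cases in bulk. The empty-XML check and the ArcGISProfile element name are assumptions about how Pro surfaces unreadable or pre-ArcGIS metadata, so treat it as a starting point rather than a definitive test:

import arcpy
from arcpy import metadata as md

# Placeholder connection; point this at one of your sde connection files.
arcpy.env.workspace = r"\\path\to\connection.sde"

for fc in arcpy.ListFeatureClasses() or []:
    xml = md.Metadata(fc).xml
    if not xml:
        print(f"{fc}: no metadata that Pro can read")
    elif "ArcGISProfile" not in xml:
        print(f"{fc}: metadata present, but no ArcGISProfile element (possibly a pre-ArcGIS format)")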

ctalleygreenville
Occasional Contributor

Thanks for the info about the old metadata. 
