I work in a large company that has thousands of spatial files. The metadata is all entered via ArcCatalog. We are doing a “health check” of all our metadata, e.g.:
- What fields are being left blank,
- Is what is being entered consistent with standards?
The ArcCatalog metadata records are XML files. It’s taken a great deal of effort, but we have been able to harvest a small subset of fields from the XML files into a CSV, which we then import into various data analysis tools (Tableau/Power BI).
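For the “what fields are being left blank” part of the health check, you don’t necessarily need Tableau/Power BI: once the fields are in a CSV, a short stdlib script can tally blank values per column. This is a minimal sketch, assuming one row per metadata record; the file path and column names are illustrative.

```python
import csv
from collections import Counter

def blank_counts(csv_path):
    """Count blank/whitespace-only values per column in a harvested metadata CSV.

    Returns (total_rows, {column: blank_count}); columns with no blanks
    are omitted from the dict.
    """
    blanks = Counter()
    total = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            for col, val in row.items():
                # Treat empty and whitespace-only cells as blank
                if not (val or "").strip():
                    blanks[col] += 1
    return total, dict(blanks)
```

Dividing each count by the total gives a percent-complete figure per field, which is often all a first-pass health check needs.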
While this has given us some insight, I can’t help but think someone else must have wanted to look at the overall “quality” of their metadata. Are there any “out of the box” tools that do this? Or is there an “easy” way of getting all your metadata records into a CSV?
We are exploring flattening the XML through various packages; FME looks like the best route. We have also looked at importing the XML into Excel, mapping fields to cells, but due to the complexity and nesting of the XML, Excel will only import a single record at a time.
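If FME turns out to be overkill, the flattening itself can be done with a short Python script: walk a folder of XML files, pull out a fixed set of element paths, and write one CSV row per record. This is only a sketch; the element paths below (`resTitle`, `idAbs`, `idCredit`) are my guess at the ArcGIS metadata schema and would need to be adjusted to match the elements your records actually use.

```python
import csv
import xml.etree.ElementTree as ET
from pathlib import Path

# Field name -> element path relative to the XML root.
# These paths are illustrative; replace them with the paths in your schema.
FIELDS = {
    "title": "dataIdInfo/idCitation/resTitle",
    "abstract": "dataIdInfo/idAbs",
    "credits": "dataIdInfo/idCredit",
}

def harvest(xml_dir, fields=FIELDS):
    """Flatten each XML metadata file in xml_dir into one dict (row).

    Missing elements come back as empty strings, so blank-field
    analysis works directly on the output.
    """
    rows = []
    for xml_file in sorted(Path(xml_dir).glob("*.xml")):
        root = ET.parse(xml_file).getroot()
        row = {"file": xml_file.name}
        for name, path in fields.items():
            el = root.find(path)
            row[name] = (el.text or "").strip() if el is not None else ""
        rows.append(row)
    return rows

def write_csv(rows, out_path, fields=FIELDS):
    """Write the harvested rows to a CSV ready for Tableau/Power BI."""
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["file", *fields])
        writer.writeheader()
        writer.writerows(rows)
```

Because `find()` just returns `None` for absent elements, adding more fields is only a matter of extending the `FIELDS` dict, which sidesteps Excel’s one-record-at-a-time limitation.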
Any thoughts or advice is welcome.
(Is this the best place for this question?)