As my organization transitions to ArcGIS Pro for all of our reports, we are bringing along a lot of legacy shapefiles and shapefile-based workflows. Many of our modelers and scientists use software or Python scripts that can only create shapefiles, so we will continue to reference these shapefiles in our ArcGIS Pro projects when creating our reports. Python in particular lets users create shapefiles with unsupported file and field names, with no check that the names are valid.
Pro will open and display most shapefiles even when their file names are too long or their file and field names contain unsupported characters. It is not until you run a particular query, symbology, editing, or geoprocessing operation that you hit problems, and the error message produced is often generic and gives no hint that bad shapefile naming is to blame. For instance, the Package Project tool, which zips up an APRX along with all of the files referenced in its maps ("Share outside of organization"), will fail with "General Function Failure" when the project references shapefiles with bad naming, even though the project itself was functioning normally.
Is there currently a way, possibly using Python, to scrape all of the data sources in an APRX and check whether any are shapefiles with unsupported file or field names?
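To make the question concrete, here is roughly what I have in mind, built on arcpy.mp. This is an untested sketch: the VALID_NAME pattern, the MAX_BASENAME limit, and the project path are placeholders and guesses on my part, not documented rules.

```python
import os
import re
import arcpy

# These rules are my best guess at what Pro tolerates -- tune them to the
# failures you actually see. The file-name length limit is a placeholder.
VALID_NAME = re.compile(r"^[A-Za-z][A-Za-z0-9_]*$")  # no leading digit, no spaces/specials
MAX_BASENAME = 13   # hypothetical file-name length limit
MAX_FIELDNAME = 10  # dBASE (.dbf) limit on field-name length

def check_shapefile(path):
    """Return a list of naming problems for one shapefile."""
    problems = []
    base = os.path.splitext(os.path.basename(path))[0]
    if not VALID_NAME.match(base):
        problems.append(f"file name has unsupported characters: {base!r}")
    if len(base) > MAX_BASENAME:
        problems.append(f"file name longer than {MAX_BASENAME} characters")
    try:
        fields = arcpy.ListFields(path)
    except Exception:
        return problems + ["could not read fields (broken source?)"]
    for fld in fields:
        if len(fld.name) > MAX_FIELDNAME:
            problems.append(f"field name too long: {fld.name!r}")
        elif not VALID_NAME.match(fld.name):
            problems.append(f"field name has unsupported characters: {fld.name!r}")
    return problems

aprx = arcpy.mp.ArcGISProject(r"C:\Projects\Report\Report.aprx")  # path is illustrative
seen = set()
for m in aprx.listMaps():
    # standalone .dbf tables could be checked the same way via m.listTables()
    for lyr in m.listLayers():
        if not lyr.supports("DATASOURCE"):
            continue
        src = lyr.dataSource
        if src.lower().endswith(".shp") and src not in seen:
            seen.add(src)
            for problem in check_shapefile(src):
                print(f"{src}: {problem}")
```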
I'd like this tool to produce a list of the offending files at the very least; ideally it would also tell the user what specifically is unsupported. The logical next tool would take these files and import them into the project's default geodatabase, hopefully correcting any unsupported naming in the process.
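For that follow-up tool, I'm imagining something like the snippet below. It rests on an assumption I'd like confirmed: that Feature Class To Geodatabase runs geodatabase name validation on import, so unsupported characters get replaced automatically.

```python
import arcpy

aprx = arcpy.mp.ArcGISProject(r"C:\Projects\Report\Report.aprx")  # illustrative path
bad_shapefiles = [r"C:\Data\2021 model-output.shp"]               # list produced by the scan above

# Assumption: the conversion tool validates output names on import, so
# unsupported characters in file and field names get corrected automatically.
arcpy.conversion.FeatureClassToGeodatabase(bad_shapefiles, aprx.defaultGeodatabase)
```

I assume the remaining piece would be repointing the affected layers at the new feature classes, perhaps with Layer.updateConnectionProperties, but I'd welcome a better pattern if one exists.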