I have a Python script that examines a very large table (a CSV imported into a geodatabase as a table), queries this table on each distinct value of a field, does a join based on a common field, and then exports the features to geodatabase feature classes. The feature classes can number in the hundreds.
The problem is that when I go in manually and try to merge the outputted feature classes in the geodatabase, the field map in the Merge geoprocessing tool lists literally every single field from all of the outputted feature classes. It turns out the aliases are the same across all the attribute tables, but the field names differ because of the join. For example, I want the field name to be Name, but the field name is actually DistinctFile1_Name with the alias Name. In the next file it is DistinctFile2_Name with the alias Name, and so on.
So it is basically impossible to work with any attribute data in the files.
Any guidance? Thanks.
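For reference, the fix reduces to building a rename map from the (name, alias) pairs on each exported feature class. A minimal sketch of that logic, where the field list below is a hypothetical stand-in for what `arcpy.ListFields(fc)` would report:

```python
# Hypothetical (name, alias) pairs, standing in for arcpy.ListFields output
# on one of the exported feature classes after the join.
fields = [
    ("OBJECTID", "OBJECTID"),
    ("DistinctFile1_Name", "Name"),
    ("DistinctFile1_Status", "Status"),
]

def rename_map(name_alias_pairs):
    """Map each join-qualified field name to its alias when they differ."""
    return {
        name: alias
        for name, alias in name_alias_pairs
        if name != alias  # only fields the join actually prefixed
    }

print(rename_map(fields))
# {'DistinctFile1_Name': 'Name', 'DistinctFile1_Status': 'Status'}
```

Once every file's fields share the same names, Merge's field map collapses to one entry per field.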
So I guess I would have to use this tool in the script to make the alias the actual field name across all the fields I am using, after the join but before exporting? Actually, it would have to be after exporting, since joins are temporary. I don't think running Alter Field manually after the script finishes would be efficient because of the sheer number of files.
Well, Clear Field Alias wouldn't help, since the alias is exactly the name I want the field to have.
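A batch rename at the end of the script is usually cheap: one pass of `arcpy.ListFields` per feature class, calling `AlterField` only where name and alias differ. A sketch under the assumption that all the exported feature classes live in one workspace (the workspace path is hypothetical, and the arcpy import is guarded here only so the pure renaming logic stays testable outside ArcGIS):

```python
# Sketch: rename every join-qualified field back to its alias, across all
# feature classes in a workspace. Assumes ArcPy is available in the real
# environment; the guard below just keeps the helper importable without it.
try:
    import arcpy
except ImportError:
    arcpy = None

def fields_to_rename(name_alias_pairs):
    """Yield (old_name, new_name) for fields whose join-qualified name
    differs from the alias the user actually wants."""
    for name, alias in name_alias_pairs:
        if alias and name != alias:
            yield name, alias

def rename_all(workspace):
    """Run AlterField on every feature class in the workspace (needs arcpy)."""
    arcpy.env.workspace = workspace
    for fc in arcpy.ListFeatureClasses():
        pairs = [
            (f.name, f.aliasName)
            for f in arcpy.ListFields(fc)
            if f.type not in ("OID", "Geometry")  # leave system fields alone
        ]
        for old, new in fields_to_rename(pairs):
            # Set both the new field name and the alias to the alias value.
            arcpy.management.AlterField(fc, old, new, new)

if arcpy is not None:
    rename_all(r"C:\data\output.gdb")  # hypothetical geodatabase path
```

Alternatively, the same loop could run inside the existing script right after each export, so the files come out merge-ready in the first place.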