Hello,
I'm looking for the most efficient way to convert a stand-alone table to a pandas data frame. This table will ultimately be geocoded and saved as a feature class, but I need to manipulate the data and fields quite a bit before that. I find pandas to be the easiest and most efficient way to do the data manipulation.
Currently, I am using arcpy.conversion.TableToTable() to first convert the table to a csv file, and then pandas.read_csv() to convert to a data frame. The table has roughly 63,000 records and it is taking over an hour to do the TableToTable portion of the conversion.
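Here is roughly what I'm doing now (just a sketch; the paths and names below are placeholders, not my actual data):

```python
import arcpy
import pandas as pd

# Step 1: export the stand-alone table to a CSV (this is the slow step,
# over an hour for ~63,000 records). Paths are placeholders.
arcpy.conversion.TableToTable(
    r"C:\data\mydata.gdb\my_table",   # input stand-alone table
    r"C:\data\temp",                  # output folder
    "my_table.csv"                    # output CSV name
)

# Step 2: read the exported CSV into pandas (this part is fast).
df = pd.read_csv(r"C:\data\temp\my_table.csv")
```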
Is there a better way to do this? Pandas is so quick to read the CSV, and I wish I could read the table directly into pandas without the intermediate CSV.
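Ideally I'd like something along these lines, if it's feasible (untested sketch; the path is a placeholder and I haven't checked how nulls or unusual field types behave):

```python
import arcpy
import pandas as pd

table = r"C:\data\mydata.gdb\my_table"  # placeholder path to the stand-alone table

# Read the table straight into a NumPy structured array, then wrap it in a
# DataFrame, skipping the CSV round trip entirely.
# (Nulls in numeric fields may need the skip_nulls / null_value arguments.)
fields = [f.name for f in arcpy.ListFields(table)]
arr = arcpy.da.TableToNumPyArray(table, fields)
df = pd.DataFrame(arr)
```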
Many Thanks
See your thread in the Python community.