Max Number of Records When Exporting Pandas Dataframe to Geodatabase Table?

04-19-2022 04:10 PM

JoshuaFlickinger
Occasional Contributor

I'm using the arcgis .to_table() method to convert a pandas DataFrame to a geodatabase table, but I'm losing records in the process. My DataFrame has 51,970 records; the resulting table in ArcGIS has only 45,314. If I export the exact same DataFrame to CSV, all the records are there, so I can use the CSV to construct my table in ArcGIS if necessary. Still, I'm curious what's going on. I already ran a function to convert the object data types to strings, and I tried resetting the index right before the export, but I end up with the same result. Is there a maximum number of records that .to_table() can handle? If not, there must be something in the data I can investigate more fully.
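For reference, the export pattern I'm using looks roughly like this (the file paths are placeholders):

```python
import pandas as pd
from arcgis.features import GeoAccessor  # importing this registers the df.spatial accessor

df = pd.read_csv(r"C:\data\source.csv")  # placeholder path

# convert object columns to strings, as mentioned above
for col in df.select_dtypes(include="object").columns:
    df[col] = df[col].astype(str)

df = df.reset_index(drop=True)

# export to a geodatabase table -- this is where records go missing
df.spatial.to_table(location=r"C:\data\scratch.gdb\my_table")

# exporting the same dataframe to csv keeps all 51,970 records
df.to_csv(r"C:\data\export.csv", index=False)
```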

 

 

2 Replies
DanPatterson
MVP Esteemed Contributor
(Accepted Solution)

to_table is ultimately a call to

C:\... your install folder ... \bin\Python\envs\arcgispro-py3\Lib\site-packages\arcgis\features\geo\_io\fileops.py

lines 335-366

In essence it uses arcpy.da.ExtendTable and an arcpy.da.InsertCursor.

During the InsertCursor operation, any row that fails to be converted to a proper list is skipped (lines 451-456).

The to_csv method doesn't do that checking, so perhaps you have nulls/blanks, etc. in rows that to_table doesn't explicitly handle.

In short:

  • there is no limit imposed by to_table on the number of rows
  • use to_csv since, for whatever reason, it doesn't care what is in the rows.
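If you want to see which rows are being dropped, here is a minimal sketch of doing the insert step yourself so failures get reported instead of silently skipped. The source CSV, geodatabase table, and field list are hypothetical, and the target table is assumed to already exist with fields matching the dataframe columns:

```python
import arcpy
import pandas as pd

df = pd.read_csv(r"C:\data\source.csv")  # hypothetical source
table = r"C:\data\scratch.gdb\my_table"  # existing table with matching fields
fields = list(df.columns)

bad_rows = []
with arcpy.da.InsertCursor(table, fields) as cursor:
    for idx, row in enumerate(df.itertuples(index=False, name=None)):
        try:
            cursor.insertRow(row)
        except Exception as err:
            # the fileops code skips these silently; here we record and report them
            bad_rows.append((idx, err))
            print(f"row {idx} skipped: {err}")

print(f"{len(bad_rows)} of {len(df)} rows failed to insert")
```

Whatever lands in bad_rows is what to inspect for nulls or odd values.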

... sort of retired...
JoshuaFlickinger
Occasional Contributor

Thanks Dan, you've both answered my question and given me exactly the reference I was looking for to dig deeper! It looks like there is a handy line in the insert cursor for explicitly printing which rows are not being appended. If I have time, I will manually re-engineer the process and use that print statement to figure out why certain rows weren't being inserted into my table. To get the job done in the meantime, I'll stick with to_csv.
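In the meantime, here's a quick pandas-side check I can run to hunt for the nulls/blanks and mixed types Dan mentioned (the source path is a placeholder):

```python
import pandas as pd

df = pd.read_csv(r"C:\data\source.csv")  # placeholder path

# nulls per column -- NaN in a text field is a common reason an insert fails
print(df.isna().sum())

# columns that hold more than one python type are another likely culprit
for col in df.columns:
    type_counts = df[col].map(type).value_counts()
    if len(type_counts) > 1:
        print(col, dict(type_counts))
```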
