<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Max Number of Records When Exporting Pandas Dataframe to Geodatabase Table? in Python Questions</title>
    <link>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165974#M64357</link>
    <description>&lt;P&gt;to_table is ultimately a call to&lt;/P&gt;&lt;P&gt;C:\... your install folder ... \bin\Python\envs\arcgispro-py3\Lib\site-packages\arcgis\features\geo\_io\fileops.py&lt;/P&gt;&lt;P&gt;lines 335-366&lt;/P&gt;&lt;P&gt;In essence it uses arcpy.da.ExtendTable and an arcpy.da.InsertCursor.&lt;/P&gt;&lt;P&gt;During the InsertCursor operation, any row that fails to be converted to a proper list is skipped (lines 451-456).&lt;/P&gt;&lt;P&gt;The to_csv method doesn't do that checking, so perhaps you have nulls/blanks etc. in rows that to_table doesn't explicitly handle.&lt;/P&gt;&lt;P&gt;In short,&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;there is no limit on the number of rows imposed by &lt;EM&gt;to_table&lt;/EM&gt;&lt;/LI&gt;&lt;LI&gt;use &lt;EM&gt;to_csv&lt;/EM&gt; since, for whatever reason, it doesn't care what is in the rows.&lt;/LI&gt;&lt;/UL&gt;</description>
    <pubDate>Tue, 19 Apr 2022 23:51:01 GMT</pubDate>
    <dc:creator>DanPatterson</dc:creator>
    <dc:date>2022-04-19T23:51:01Z</dc:date>
    <item>
      <title>Max Number of Records When Exporting Pandas Dataframe to Geodatabase Table?</title>
      <link>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165965#M64355</link>
      <description>&lt;P&gt;I'm using the arcgis &lt;A href="https://developers.arcgis.com/python/api-reference/arcgis.features.toc.html?highlight=from_feather#arcgis.features.GeoAccessor.to_table" target="_self"&gt;.to_table()&lt;/A&gt; method to convert a pandas dataframe to a GIS table. However, I'm losing records in the process. My dataframe has 51,970 records, but my table in GIS only has 45,314 records. If I export the exact same dataframe to csv, all the records are there - so I can use the csv if necessary to construct my table in ArcGIS. But I'm curious to find out what's going on. I already ran a function to convert the object data types to strings, and I tried resetting the index right before export; I end up with the same result. &lt;STRONG&gt;So, I'm wondering if there's a maximum number of records that the .to_table() method can handle?&lt;/STRONG&gt; If not, I must be missing something in the data that I can investigate more fully.&lt;/P&gt;</description>
      <pubDate>Tue, 19 Apr 2022 23:10:52 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165965#M64355</guid>
      <dc:creator>JoshuaFlickinger</dc:creator>
      <dc:date>2022-04-19T23:10:52Z</dc:date>
    </item>
    <item>
      <title>Re: Max Number of Records When Exporting Pandas Dataframe to Geodatabase Table?</title>
      <link>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165974#M64357</link>
      <description>&lt;P&gt;to_table is ultimately a call to&lt;/P&gt;&lt;P&gt;C:\... your install folder ... \bin\Python\envs\arcgispro-py3\Lib\site-packages\arcgis\features\geo\_io\fileops.py&lt;/P&gt;&lt;P&gt;lines 335-366&lt;/P&gt;&lt;P&gt;In essence it uses arcpy.da.ExtendTable and an arcpy.da.InsertCursor.&lt;/P&gt;&lt;P&gt;During the InsertCursor operation, any row that fails to be converted to a proper list is skipped (lines 451-456).&lt;/P&gt;&lt;P&gt;The to_csv method doesn't do that checking, so perhaps you have nulls/blanks etc. in rows that to_table doesn't explicitly handle.&lt;/P&gt;&lt;P&gt;In short,&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;there is no limit on the number of rows imposed by &lt;EM&gt;to_table&lt;/EM&gt;&lt;/LI&gt;&lt;LI&gt;use &lt;EM&gt;to_csv&lt;/EM&gt; since, for whatever reason, it doesn't care what is in the rows.&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Tue, 19 Apr 2022 23:51:01 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165974#M64357</guid>
      <dc:creator>DanPatterson</dc:creator>
      <dc:date>2022-04-19T23:51:01Z</dc:date>
    </item>
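    <!--
      The skip-on-failure behaviour described in the reply above can be illustrated with a
      short arcpy sketch. This is not the actual arcgis fileops.py source, only an
      approximation of the pattern; the function name, field list, and table path are
      hypothetical.

      import arcpy

      def insert_rows_skipping_failures(df, table, fields):
          """Insert dataframe rows into an existing geodatabase table.

          Mirrors the pattern described above: any row that cannot be turned
          into a clean Python list (and so fails insertRow) is skipped, which
          is why to_table can finish with fewer records than the dataframe has.
          """
          skipped = 0
          with arcpy.da.InsertCursor(table, fields) as cursor:
              for row in df[fields].itertuples(index=False, name=None):
                  try:
                      cursor.insertRow(list(row))
                  except Exception:  # conversion/insert failed; the row is dropped
                      skipped += 1
          return skipped
    -->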
    <item>
      <title>Re: Max Number of Records When Exporting Pandas Dataframe to Geodatabase Table?</title>
      <link>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165990#M64358</link>
      <description>&lt;P&gt;Thanks Dan, you've both answered my question and provided exactly the reference I was looking for to dig deeper! It looks like there is a handy line in the insert cursor that explicitly prints which rows are not being appended. If I have time, I will manually re-engineer the process and use that print statement to figure out why certain rows weren't being inserted into my table. To get the job done in the meantime, I'll stick with to_csv.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Apr 2022 03:01:04 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/max-number-of-records-when-exporting-pandas/m-p/1165990#M64358</guid>
      <dc:creator>JoshuaFlickinger</dc:creator>
      <dc:date>2022-04-20T03:01:04Z</dc:date>
    </item>
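    <!--
      A minimal diagnostic sketch along the lines the follow-up above proposes, assuming
      a pandas DataFrame "df" and an existing geodatabase table "out_table" whose fields
      match the dataframe columns (both names are hypothetical). It re-runs the insert row
      by row and prints every row the cursor rejects, so the offending values (nulls,
      blanks, mixed types, over-length strings) can be inspected. Converting NaN to None
      first, e.g. df.where(pd.notnull(df), None), is a common, though not guaranteed,
      remedy for insert failures on object columns.

      import arcpy
      import pandas as pd

      def report_failed_inserts(df: pd.DataFrame, out_table: str) -> pd.DataFrame:
          """Return the dataframe rows that the InsertCursor rejects."""
          fields = list(df.columns)
          failed = []
          with arcpy.da.InsertCursor(out_table, fields) as cursor:
              for idx, values in zip(df.index, df.itertuples(index=False, name=None)):
                  try:
                      cursor.insertRow(list(values))
                  except Exception as err:
                      print(f"row {idx} not inserted: {err}")
                      failed.append(idx)
          return df.loc[failed]
    -->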
  </channel>
</rss>

