How to export table with more than 1,048,576 rows of data

06-20-2019 11:31 AM
BaileyFredlund1
New Contributor III

Hello all,

I have a data table with more than 1,048,576 rows of data, which is Microsoft Excel's row limit, so I cannot use the Table To Excel function. I tried exporting to a .txt file and then opening it in Excel, but still no luck. Is there an alternative you have used to deal with these larger tables? There are about 10 million rows total. I am not opposed to working in Python.

Thanks in advance.

7 Replies
DEWright_CA
Occasional Contributor III

I export them as logical blocks: export a given ZIP code, county, state, watershed, etc.
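
A minimal sketch of that idea, assuming a text field such as COUNTY partitions the rows (the table path, field name, and output folder are placeholders):

import arcpy

table = r"C:\data\demo.gdb\big_table"   # placeholder path
split_field = "COUNTY"                  # any field that partitions the rows

field = arcpy.AddFieldDelimiters(table, split_field)
values = {row[0] for row in arcpy.da.SearchCursor(table, [split_field])}

for value in values:
    # Each block stays below Excel's 1,048,576-row ceiling.
    where = "{} = '{}'".format(field, value)  # quote only text fields
    arcpy.MakeTableView_management(table, "block_view", where)
    arcpy.TableToExcel_conversion("block_view", r"C:\out\{}.xlsx".format(value))
    arcpy.Delete_management("block_view")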

DanPatterson_Retired
MVP Emeritus

What is it you want to do that you need Excel for?

Perhaps the functionality can be implemented through tools or scripts within the Arc* environment.

forestknutsen1
MVP Regular Contributor

Pandas might be for you.

https://pandas.pydata.org/

http://geospatialtraining.com/tutorial-creating-a-pandas-dataframe-from-a-shapefile/

https://datatofish.com/export-dataframe-to-csv/

https://towardsdatascience.com/why-and-how-to-use-pandas-with-large-data-9594dda2ea4c

I only have ~140,000 rows to work with in my test... but I think you could scale this up.

https://stackoverflow.com/questions/14525344/whats-the-maximum-size-of-a-numpy-array

import arcpy
import pandas

# Path to the workspace that holds the table, e.g. an .sde connection
# file or a file geodatabase (placeholder -- point it at your own data).
connection = r"C:\data\connection.sde"
arcpy.env.workspace = connection

source = 'coninfo'
# Grab the field names; slice to the first field just for this test.
field_names = [f.name for f in arcpy.ListFields(source)]
field_names = field_names[:1]

# Structured array -> DataFrame -> CSV
ar = arcpy.da.TableToNumPyArray(source, field_names)
df = pandas.DataFrame(ar)
df.to_csv('test.csv')
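
For the full ~10 million rows, a variation that never holds the whole table in memory is to stream rows with a da.SearchCursor and write the CSV as you go (a sketch, assuming the Python 3 csv module that ships with ArcGIS Pro):

import csv
import arcpy

source = 'coninfo'
field_names = [f.name for f in arcpy.ListFields(source)]

# Stream rows straight to disk so all 10 million never sit in memory.
with open('test.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(field_names)
    with arcpy.da.SearchCursor(source, field_names) as cursor:
        for row in cursor:
            writer.writerow(row)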

DanPatterson_Retired
MVP Emeritus

You don't need pandas; a cleaner approach that also ensures Unicode support is to use numpy.

I built this little function, which I use instead of the standard versions that ship natively:

import numpy as np


def save_txt(a, name="arr.txt", sep=", ", dt_hdr=True):
    """Save a NumPy structured array or recarray to text.

    Parameters
    ----------
    a : array
        input array
    name : filename
        output filename and path, otherwise saves to the script folder
    sep : separator
        column separator, include a space if needed
    dt_hdr : boolean
        if True, add dtype names to the header of the file

    """
    a_names = ", ".join(i for i in a.dtype.names)
    hdr = ["", a_names][dt_hdr]  # use "" or names from the input array
    s = np.array(a.tolist(), dtype=np.unicode_)
    # Width of the widest entry in each column, for aligned output.
    widths = [max([len(i) for i in s[:, j]])
              for j in range(s.shape[1])]
    frmt = sep.join(["%{}s".format(i) for i in widths])
    np.savetxt(name, a, fmt=frmt, header=hdr, comments="")
    print("\nFile saved...")

Since you already have your array, you can use the above with any separator you want, not just comma-delimited:

save_txt(ar, "c:/temp/arr.txt", sep=", ", dt_hdr=True)
MarcoBoeringa
MVP Regular Contributor

I am not sure what you are trying to achieve. As you noted yourself, and as is also documented here:

https://support.office.com/en-us/article/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269...

Excel can only handle 1,048,576 rows maximum. There is no way to get past that limit by changing your import practices; it is, after all, the limit of the worksheet itself.

For this number of rows, you really should be looking at Microsoft Access. Databases can handle a far greater number of records. Access has basic functionality for summarizing and analyzing data too, so calculations you would like to do in Excel are likely possible in Access as well, although the process to get there will be different (defining SQL queries instead of formulas in Excel).
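
Access itself is point-and-click, but if you would rather see the database route from Python, the same idea can be sketched with SQLite as a stand-in (the table and file names here are made up):

import sqlite3
import pandas

# Load the oversized CSV into a database in manageable chunks.
con = sqlite3.connect('bigtable.db')
for chunk in pandas.read_csv('test.csv', chunksize=500000):
    chunk.to_sql('coninfo', con, if_exists='append', index=False)

# What would be a worksheet formula in Excel becomes a SQL query.
print(pandas.read_sql('SELECT COUNT(*) AS n FROM coninfo', con))
con.close()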

BaileyFredlund1 (Accepted Solution)
New Contributor III

Thanks for the replies. I ended up just copying the maximum number of rows from the attribute table and pasting them into Excel manually. It was time consuming, but the Python suggested was a bit too advanced for me! Thanks for y'all's help!

BrittanyBurson
Occasional Contributor III

For anyone else who was using the Table To Excel tool as a means to get to .csv but was hitting this record limit (me), you can follow the instructions here for exporting straight to .csv. This should bypass the row limit if .csv is acceptable for your needs.
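
One scripted way to do that, I believe, is the Table To Table tool, which writes a plain-text table when the output location is a folder and the output name ends in .csv (paths here are placeholders):

import arcpy

# Table To Table writes .csv when the output location is a folder
# and the output name carries a .csv extension.
arcpy.TableToTable_conversion(r"C:\data\demo.gdb\big_table",
                              r"C:\out", "big_table.csv")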
