How to export a table with > 9 million rows?

05-05-2020 12:40 PM
RyanHowell1
New Contributor III

I have a point feature class with 9.2 million rows (according to the counter on the attribute table) and ~50 columns. I am trying to get it into R to run some statistical analysis on it, but I'm having trouble exporting the whole table in a usable format (everything I try exports only around 3 million rows).

A few things I have tried:

1. The R-ArcGIS bridge (arcgisbinding); it only brings ~3 million rows into R.

2. The Copy Rows tool with a .csv extension (also only getting ~3 million rows).

3. The Subset Features tool with the size parameter set to 33.3%, thinking I could export three subsets and then merge them in R (this only gives me 1.2 million rows, i.e. one third of what I'm getting with the other two methods).

4. A Python script that selects 1 million rows and exports them to an Excel table, then selects the next 1 million rows, and so on (see the sketch after this list). It runs until the 3rd iteration and then stops, because I've hit this same strange ~3 million mark.

5. Exporting from ArcMap as a table (same result).
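
For reference, item 4 boils down to something like the following; a single pass with arcpy.da.SearchCursor streaming rows straight to CSV avoids both the repeated selections and Excel's row limit. This is only a minimal sketch, and the paths and names are placeholders:

```python
import csv
import arcpy

fc = r"C:\data\project.gdb\points"  # placeholder path to the feature class
# Export every non-geometry field
fields = [f.name for f in arcpy.ListFields(fc) if f.type != "Geometry"]

with open(r"C:\temp\points.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(fields)
    # The cursor streams rows one at a time, so the whole table never sits in RAM
    with arcpy.da.SearchCursor(fc, fields) as cursor:
        for row in cursor:
            writer.writerow(row)
```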

I'm a little confused, because the attribute table definitely says I have 9 million rows, but nothing else I try seems to see them all. I'm not aware of any feature limit in these methods (a .csv can certainly hold 9 million rows, and so can RStudio). Is there a limit I don't know about, or some known bug in the attribute table count, and there really are only 3 million rows in my feature class? What is the best way to get a table this size into .csv format?

8 Replies
JoshuaBixby
MVP Esteemed Contributor

Are you getting errors, or do the tools complete but with fewer records? I have definitely worked with more than 3 million records, so I don't think you are running into any tool limits.

RyanHowell1
New Contributor III

The tools execute fine, the results just don't have all of the rows.

I also noticed that if I open the attribute table and scroll to the bottom, it loads for about 30 seconds and then ArcGIS Pro crashes (without ever showing the last rows of the table). Could it be a RAM issue or something? I'm not sure insufficient RAM would affect the tool results without producing an error, though.

BruceHarold
Esri Regular Contributor
JoshuaBaker1
New Contributor

Go into the Symbology tab and get a record count under Unique Values with no value field set. What is that number?

Also, what is the source of the data? SDE, file GDB, etc.?
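
A quick way to cross-check the two counts in Python, in case the symbology pane is slow on a table this size; a minimal sketch, with the path as a placeholder:

```python
import arcpy

fc = r"C:\data\project.gdb\points"  # placeholder path

# The count the geodatabase reports for the table
print("GetCount:", arcpy.management.GetCount(fc)[0])

# The count a cursor can actually read back, row by row
with arcpy.da.SearchCursor(fc, ["OID@"]) as cur:
    print("Cursor:  ", sum(1 for _ in cur))
```

If GetCount says 9.2 million but the cursor stops near 3 million, the problem is in the data or its index, not in the export tools.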

JoeBorgione
MVP Emeritus

Interesting thread; 9 million records x 50 fields is a pretty hefty table. I wonder if it's a RAM problem too. As Joshua Bixby mentions, what type of GDB is this table in? Perhaps a Python/DB-connection approach would provide a solution if it's in SQL Server or some other flavor of RDBMS, something like the sketch below.
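
If it did turn out to live in SQL Server, a chunked pull along these lines would keep memory flat regardless of table size; the connection string and table name here are hypothetical:

```python
import pandas as pd
import pyodbc

# Hypothetical connection details -- adjust driver/server/database to suit
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=gisdb;Trusted_Connection=yes;"
)

# Stream the table out in 500k-row chunks so 9M x 50 never sits in RAM at once
chunks = pd.read_sql("SELECT * FROM dbo.points", conn, chunksize=500_000)
for i, chunk in enumerate(chunks):
    chunk.to_csv(r"C:\temp\points.csv", mode="a", index=False, header=(i == 0))
```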

That should just about do it....
RyanHowell1
New Contributor III

Updates since yesterday:

Bruce: I ran the tool you linked to and it finished. I set it to comma delimited and it gave me a .tsv file that I'm having trouble opening, but based on the file size it looks like the same number of rows I've been getting from the other methods.
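
A quick way to count the rows in the export without opening it anywhere, for more certainty than a file-size guess (the path is a placeholder):

```python
# Count data rows in the exported file without loading it into Excel or R
with open(r"C:\temp\points_export.tsv", encoding="utf-8") as f:
    rows = sum(1 for _ in f) - 1  # subtract the header line
print(f"{rows:,} data rows")
```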

The points are just a feature class in a file GDB.

I also realized that I need XY fields to load the points back into ArcGIS Pro after the work in RStudio if I go the .csv route, so I ran the Add Geometry Attributes tool. It ran for 25 minutes, got to 40%, then completed having calculated values for only the 3 million rows.

I then discovered I have access to a machine with 32 GB of RAM (compared with the 16 GB on mine), so I ran Add Geometry Attributes there. It processed for about 20 minutes and finished, but now my feature class has only the 3 million points and visually covers about a third of the area it did before. So it appears I somehow deleted the 6 million points the tool wasn't touching. Now to find my backup/rebuild my point file...
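
One way to sidestep Add Geometry Attributes entirely is to pull the coordinates at export time with the SHAPE@XY cursor token, which reads the source without editing it; a sketch, with placeholder paths:

```python
import csv
import arcpy

fc = r"C:\data\project.gdb\points"  # placeholder path
fields = [f.name for f in arcpy.ListFields(fc) if f.type != "Geometry"]

with open(r"C:\temp\points_xy.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(fields + ["X", "Y"])
    # SHAPE@XY yields an (x, y) tuple per point; the source is never modified
    with arcpy.da.SearchCursor(fc, fields + ["SHAPE@XY"]) as cur:
        for *attrs, (x, y) in cur:
            writer.writerow(attrs + [x, y])
```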

I'm thinking I'll abandon this approach and work with the raster directly in RStudio (the point file was generated using Raster to Point) and see if that solves my issues. But it does look like there may have been some sort of RAM issue?

JoshuaBixby
MVP Esteemed Contributor

How is the original data (9 million rows) getting into a feature class in the first place? Is the data being imported from another format or program? If so, what format or program, and how exactly is it being imported?

I have seen similar behavior when data is imported using certain tools that let the user specify the unique index. If the index isn't actually unique, rows will be dropped when the data is run through other geoprocessing tools. A quick check is sketched below.
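
A minimal uniqueness check along those lines; the field name is a hypothetical candidate key, and the path is a placeholder:

```python
from collections import Counter

import arcpy

fc = r"C:\data\project.gdb\points"  # placeholder path
key = "POINT_ID"                    # hypothetical field used as the unique index

counts = Counter()
with arcpy.da.SearchCursor(fc, [key]) as cur:
    for (value,) in cur:
        counts[value] += 1

dupes = sum(1 for n in counts.values() if n > 1)
print(f"{dupes:,} values of {key} appear more than once")
```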

LouiseWalker
New Contributor

Hi,

Just wondering if there could be anything wrong with the data? Perhaps some odd characters or spaces in one of the attributes?

Or could there be something wrong with the actual geometry?

What happens when you take a copy of the layer, or export the whole layer to a new feature class? What happens if, in ArcMap, you select all the records and then export to a new layer?

Are you still getting your 9 million rows, or only the 3 million?
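
Checking for broken geometry is scriptable too; a minimal sketch, with placeholder paths:

```python
import arcpy

fc = r"C:\data\project.gdb\points"           # placeholder path
report = r"C:\data\project.gdb\geom_issues"  # placeholder output table

# Writes one row per problem feature (null geometry, empty geometry, etc.)
arcpy.management.CheckGeometry(fc, report)
print(arcpy.management.GetCount(report)[0], "problem geometries found")
```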
