Efficiency of ArcPro when working with millions of polygons

07-08-2019 02:58 PM
MalcolmLittle
New Contributor III

The dataset I'm currently working with is a huge array of 12 million hexagonal polygons, essentially acting as bins for 70 variables of various types. I generally process it on a Windows 10 i7-7700 machine, yet even loading the attribute table takes several minutes. Even when I created a 250K slice of the hexes, viewing the table is still a long slog.

I do have a join in place, linking the hexagonal polygon feature class (which holds only 5 variables) to an aspatial .csv file containing the other 65 variables. I keep wondering whether exporting to a new feature class - thus having a complete, unjoined polygon feature class - would yield gains in browsing and processing efficiency.
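For context, this is roughly the export I have in mind (a minimal sketch; the geodatabase paths and the HEX_ID key field are placeholders for my actual data):

```python
import arcpy

gdb = r"C:\data\hexes.gdb"            # hypothetical file geodatabase
hex_fc = gdb + r"\hex_bins"           # polygon feature class (5 fields)
csv_table = r"C:\data\variables.csv"  # aspatial table (65 fields)

# Make a layer so the join stays temporary
lyr = arcpy.management.MakeFeatureLayer(hex_fc, "hex_lyr")

# Join the csv on a shared key field (hypothetical name "HEX_ID")
arcpy.management.AddJoin(lyr, "HEX_ID", csv_table, "HEX_ID", "KEEP_ALL")

# Copy to a new feature class; the joined fields are written into the output
arcpy.management.CopyFeatures(lyr, gdb + r"\hex_bins_flat")
```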

Thoughts?

1 Reply
DanPatterson_Retired
MVP Emeritus

Aren't the joins re-established behind the scenes when a project is re-opened?

You have a great data set to test with, even a portion of it. Since you would be testing on the same machine, it might be worth the testing time if it saves you time in the long run.

Also, if the csv is static, why not convert it to a geodatabase table? Arc* likes its own stuff better, and that rules out the extra processing needed to get the csv into a state it likes. I am sure that a csv isn't the native tabular format.
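Something along these lines would do the conversion (a minimal sketch with hypothetical paths; the output table name is arbitrary):

```python
import arcpy

csv_table = r"C:\data\variables.csv"  # the static csv (hypothetical path)
gdb = r"C:\data\hexes.gdb"            # target file geodatabase

# One-time conversion; later joins read the geodatabase table directly
arcpy.conversion.TableToTable(csv_table, gdb, "variables")
```

After that, point the join at the geodatabase table instead of the csv.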