Hi,
I am doing my Master's thesis with GIS data, and I have to churn through a large amount of it: more than 2 million rows that I must intersect with another map to get the common attribute table, which I then use in other software to get the required results. The processing itself is taking hours. Yesterday I started it in ArcMap and it ran for 18 hours but only said "drawing features 36,000", at which point I thought the operation would take months to complete, so I installed Pro after seeing this thread: Dissolving Large Data
But the problem is persisting and I ran into trouble again: I waited for 3 hours, at which point Pro returned an error and started processing over. Can you guys say how to get through this large-data intersect?
We pass 150m points through the 3m polygon-polygon intersection result (takes an hour or so in PostgreSQL, but only because we have to match to features nearby if not exactly within one). I use straight SQL; none of that GUI stuff for me. But first I defragment both tables spatially, and my polygons are regularly partitioned to max 25 sq km, so the identity operation flies.
- V
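V's actual workflow is straight SQL in PostgreSQL with spatially partitioned tables, which isn't shown in the post. As a rough illustration of the partitioning idea only, here is a pure-Python sketch: features are reduced to bounding boxes, binned into a regular grid of tiles, and intersection candidates are only compared when they share a tile. All names, and the tile size, are my assumptions, not from the post.

```python
from collections import defaultdict

# Illustrative tile size; V partitions polygons to max 25 sq km.
TILE = 5.0

def tiles_for(bbox):
    """Yield (col, row) grid tiles overlapped by bbox = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = bbox
    for col in range(int(xmin // TILE), int(xmax // TILE) + 1):
        for row in range(int(ymin // TILE), int(ymax // TILE) + 1):
            yield (col, row)

def bbox_intersects(a, b):
    """True if two bounding boxes overlap (touching edges count)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def candidate_pairs(layer_a, layer_b):
    """Return candidate intersecting (i, j) pairs between two lists of bboxes.

    Only features that share at least one grid tile are ever compared,
    which is what makes the full-table intersect tractable at scale.
    """
    grid = defaultdict(list)
    for j, bbox in enumerate(layer_b):
        for tile in tiles_for(bbox):
            grid[tile].append(j)
    pairs = set()
    for i, bbox in enumerate(layer_a):
        for tile in tiles_for(bbox):
            for j in grid[tile]:
                if bbox_intersects(bbox, layer_b[j]):
                    pairs.add((i, j))
    return pairs
```

For example, `candidate_pairs([(0, 0, 2, 2), (10, 10, 12, 12)], [(1, 1, 3, 3), (50, 50, 51, 51)])` returns only `{(0, 0)}`: the second features of each layer fall in distant tiles and are never compared. In a real workflow the surviving pairs would still go through an exact geometry intersection (e.g. PostGIS `ST_Intersection`); the grid just prunes the comparisons.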
Also from the PRO docs
(Tiled) Processing of Large Datasets
Summary of recommendations, from there and from others above... not all are needed, but here they are: