I am working on a senior project looking at fire occurrence in the United States. I have a polygon feature class with about 3.6 million fires that I want to union, dissolve, and rasterize for further analysis. I want the output raster to hold, for each pixel, the number of fires that occurred there (i.e., the count of overlapping polygons covering that pixel).
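To make the desired output concrete, here is a toy sketch of the per-pixel count I'm after. This is not ArcGIS code (in Pro the equivalent would come out of the Union/Dissolve or an overlap-counting tool); fires are simplified to axis-aligned bounding boxes and the grid, cell size, and fire coordinates are made up for illustration:

```python
# Toy illustration of the per-pixel fire-count raster I want.
# Real fires are polygons; here each fire is simplified to an
# axis-aligned bounding box (xmin, ymin, xmax, ymax) on a tiny grid.

def count_fires_per_pixel(fires, ncols, nrows, cell=1.0):
    """Return an nrows x ncols grid where each cell holds the number
    of fire footprints that cover it (overlaps accumulate)."""
    grid = [[0] * ncols for _ in range(nrows)]
    for xmin, ymin, xmax, ymax in fires:
        for r in range(nrows):
            for c in range(ncols):
                # Pixel (c, r) spans [c*cell, (c+1)*cell) x [r*cell, (r+1)*cell)
                px0, py0 = c * cell, r * cell
                px1, py1 = px0 + cell, py0 + cell
                # Count this fire if its box overlaps the pixel
                if xmin < px1 and xmax > px0 and ymin < py1 and ymax > py0:
                    grid[r][c] += 1
    return grid

# Two overlapping fires on a 3x3 grid: the shared pixel reads 2,
# pixels touched by only one fire read 1, untouched pixels read 0.
fires = [(0.0, 0.0, 2.0, 2.0), (1.0, 1.0, 3.0, 3.0)]
grid = count_fires_per_pixel(fires, ncols=3, nrows=3)
```

That "accumulating overlap count" is exactly the value I need in the final raster for all 3.6 million polygons.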
When running the Union tool on the wildfire polygons, I've monitored memory use with the ArcGIS Pro diagnostic monitor and found that it never climbs above 2 GB. I have read that personal geodatabases are limited to 2 GB (Tiled processing of large datasets—Appendices | Documentation), and the bottom of that page indicates that enterprise geodatabases can leverage more RAM. I looked into changing the tile size for the geoprocessing, but if the process is still capped at 2 GB of memory, I don't see the benefit of doing so.
I have an Advanced license for Pro and enough spare storage to set up an enterprise geodatabase on my machine, but I have no experience with enterprise geodatabases. I am definitely open to learning because I like to tinker. However, if there's a better way to get around the 2 GB memory threshold, please enlighten me.
Processor: i7-4790K (4 cores / 8 threads) @ 4.8 GHz
Memory: 32 GB DDR3-1600
Graphics card: GTX 1050 Ti 4 GB
OS drive: Samsung 850 EVO 250 GB SSD
Workspace drive: Samsung 860 EVO 1 TB SSD
I want to make full use of this hardware. Any suggestions would be greatly appreciated!