Every month or so, my organisation updates a large SQL database of wildlife records (approx. 4 million rows) and I create a geodatabase feature class point layer from it. I then use that layer as input to a model that carries out lots of basic processes (clipping it to a certain region, adding and populating lots of new fields, creating various summary tables, etc.) and takes about 2 hours to run. I have been running this model roughly every month for the past 18 months or so, and it hasn't been changed in that time.

About 2 months ago, ArcGIS Pro started crashing part way through every run of this model. It isn't that the model fails at any particular step; every time I run it using the latest updated layer, the whole program crashes with a message saying ArcGIS Pro ran into an unexpected error and asking if I want to send an error report. However, when I try to run the model using a subset of the data from the latest layer, it works fine.

The only thing I can think of that has changed in this time is that the machine I am working on has got a bit more clogged up: it has gone from having about 120 GB of free space to about 90 GB. Is it possible that the reduced free space on the hard drive could somehow be reducing Pro's performance and causing it to crash when attempting longer processes on bigger datasets? That doesn't sound likely to me, but it's all I can come up with at the moment. Any ideas most welcome! I am using Pro version 3.3.1.
Disk size shouldn't be impacting anything, but the cache that Pro builds might be. Go to the Options menu, find 'Display', and try clearing your cache; the screenshot below can help you navigate there. There's also some guidance about managing the cache in this documentation link. Also, .aprx files can sometimes get somewhat corrupted, so you could try running your model from a new .aprx and see if that helps.
Documentation: managing the cache, ArcGIS Pro
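If you want to see how much space the cache is taking up before you clear it, you can total the files under the cache folder with a few lines of Python. This is just a sketch: the default location (`%LOCALAPPDATA%\ESRI\Local Caches`) is an assumption, so confirm the actual path shown under Options > Display on your machine.

```python
import os

def folder_size_gb(path):
    """Sum the sizes of all files under `path`, in GB.

    Returns 0.0 if the folder does not exist or is empty.
    """
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file removed or locked mid-walk; skip it
    return total / 1024 ** 3

# Assumed default Pro cache location on Windows -- check Options > Display
# in Pro for the real path before relying on this.
cache_dir = os.path.join(os.environ.get("LOCALAPPDATA", ""),
                         "ESRI", "Local Caches")
print(f"Cache size: {folder_size_gb(cache_dir):.2f} GB")
```

Run it from any Python prompt (it only uses the standard library); if the number it prints is in the multi-GB range, clearing the cache is worth a try.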
I think it might be worthwhile to open up a ticket with ESRI about this.
Thanks, I will try these.
Clearing the cache worked, thanks! There was about 8 GB in there.
What type of geodatabase are you using? File or Enterprise?
It's a file geodatabase.
Okay, I was thinking that if it were an enterprise geodatabase using a traditional versioning workflow, I could see "state doubling" occurring with millions of records being appended to a feature class.