
LOCK-file insanity during iteration

Question asked by ledwith on May 13, 2020

I'm running an iteration of the results of a region group. There are tens of thousands of rows in the iteration. Each iteration does some stuff and outputs a file that eventually will be part of a much larger mosaic.

 

The script runs really fast in the beginning, but as the number of iterations increases, the processing time gets longer and longer. For example, here is the time to run through each block of one thousand iterations (the times are per 1000, not cumulative):

 

0 - 1000: 20 min
1000 - 2000: 23 min
2000 - 3000: 30 min
3000 - 4000: 46 min
4000 - 5000: 1 h 24 min
5000 - 6000: 1 h 49 min
6000 - 7000: 1 h 59 min
7000 - 8000: 3 h 30 min
8000 - 9000: 4 h 08 min
9000 - 10000: 5 h 50 min

 

So it takes roughly 17 times longer to iterate through rows 9000 - 10000 than it does through rows 0 - 1000.

 

The process ran over a weekend. I checked the directory where the files were being written to and there were tens of thousands of LOCK files. I shut down the IDLE shell, which deleted all of the LOCK files and started up the script again from where I left off. Swoosh! Back up to super fast.

 

I placed a "del row" statement at the end of each iteration, but this doesn't seem to release the LOCK file. Does anyone know how to get rid of it without stopping the iteration?
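For what it's worth, my understanding is that "del row" only drops the reference to the current row; the lock belongs to the cursor object itself, so it isn't released until the cursor is closed or deleted. The toy sketch below (a hypothetical stand-in class, not arcpy) illustrates the distinction, and the same "with" pattern should apply to arcpy.da cursors, which support the context-manager protocol:

```python
import os
import tempfile

class ToyCursor:
    """Hypothetical stand-in for a data-access cursor (not arcpy).

    Creates a .lock file when opened and removes it only when the
    cursor itself is closed -- deleting rows never touches the lock.
    """
    def __init__(self, workspace):
        self.lock_path = os.path.join(workspace, "table.lock")
        open(self.lock_path, "w").close()   # acquire the lock

    def rows(self):
        yield from range(3)                 # pretend rows

    def close(self):
        if os.path.exists(self.lock_path):
            os.remove(self.lock_path)       # release the lock

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()

workspace = tempfile.mkdtemp()

# "del row" alone: the lock file survives the loop.
cur = ToyCursor(workspace)
for row in cur.rows():
    del row                                 # drops the row reference, not the lock
lock_left_behind = os.path.exists(cur.lock_path)    # still locked
cur.close()

# Context manager: the lock is released as soon as the block exits.
with ToyCursor(workspace) as cur:
    for row in cur.rows():
        pass
lock_released = not os.path.exists(cur.lock_path)

print(lock_left_behind, lock_released)  # prints: True True
```

So inside the big iteration, wrapping each per-row cursor in a "with" block (or doing "del cursor" rather than "del row") would be the thing to try, assuming the LOCK files really do come from cursors rather than from the output writers.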

 

Or am I missing something else? Is there a way to iterate through tens of thousands of objects, creating separate files, without this massive slowdown? I should note that I'm running the script separately on two different computers, as well as on two different VD-instances. On each pair, one outputs to a directory (TIF) and the other outputs to a file geodatabase (GRID). No difference.
