Python processing running twice as slow each run

04-04-2012 09:44 AM
SarahOishi
New Contributor
I have a script that processes roads in different counties and builds a dictionary of road lengths within polygons. Because the datasets are large, I do the processing that builds the dictionary in the "in_memory" workspace. The first time I ran this, it processed fairly quickly. The second time I ran the same script on the same data, the processing time doubled. I ran it a third time (on different data of roughly the same size) and the time doubled again from the second run. I delete the "in_memory" workspace as one of the first steps of my script to ensure a clean workspace. Any ideas on why it is getting slower each run?
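For reference, here is a minimal sketch of the pattern being described (clear in_memory at the start, build the dictionary, delete everything at the end). The paths and field names are hypothetical, not the actual script:

```python
import arcpy

# Clear anything left over from a previous run before writing new data.
arcpy.Delete_management("in_memory")
arcpy.env.overwriteOutput = True

roads = r"C:\data\county.gdb\roads"              # hypothetical input
polygons = r"C:\data\county.gdb\summary_polys"   # hypothetical input
clipped = "in_memory/roads_in_polys"

# Intersect roads with polygons so each output row carries the ID of the
# polygon it falls inside.
arcpy.Intersect_analysis([roads, polygons], clipped)

# Sum road length per polygon into a Python dictionary, reading the geometry
# directly because in_memory feature classes do not maintain Shape_Length.
shape_field = arcpy.Describe(clipped).shapeFieldName
lengths = {}
rows = arcpy.SearchCursor(clipped)
for row in rows:
    poly_id = row.getValue("POLY_ID")  # hypothetical polygon ID field
    lengths[poly_id] = lengths.get(poly_id, 0) + row.getValue(shape_field).length
del rows

# Drop the in-memory data once the run is finished.
arcpy.Delete_management("in_memory")
```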
4 Replies
MathewCoyle
Frequent Contributor
Are you executing this through IDLE? If so, does killing the process solve the issue? Is memory not being released after each run?
SarahOishi
New Contributor
I'm using PyScripter. At the end of processing, I delete the individual items in the in_memory workspace as well as the workspace as a whole.
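A minimal sketch of that cleanup, assuming the intermediate data lives in the in_memory workspace (the listing calls cover feature classes and tables; adjust to whatever actually gets written there):

```python
import arcpy

arcpy.env.workspace = "in_memory"

# Delete each item individually...
for fc in arcpy.ListFeatureClasses():
    arcpy.Delete_management(fc)
for tbl in arcpy.ListTables():
    arcpy.Delete_management(tbl)

# ...and then drop the workspace itself.
arcpy.Delete_management("in_memory")
```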
MathewCoyle
Frequent Contributor
Does killing the PyScripter.exe process or rebooting solve the issue?

And is the memory held by the process released after you delete your in_memory workspace, or does it persist?
SarahOishi
New Contributor
Killing and rebooting does not solve the issue.

I'm not sure whether the memory is being released. The in_memory workspace size does remain constant throughout processing, though (before deleting, after deleting, and after building the dictionary).
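One way to check whether the PyScripter/Python process is actually releasing memory between runs is to print its resident set size at a few points in the script. This sketch assumes the third-party psutil package (a recent version) is installed, which does not ship with ArcGIS; Task Manager shows the same number for the process:

```python
import os
import psutil

# Resident memory of the current Python process, in megabytes.
proc = psutil.Process(os.getpid())
print("Resident memory: %.1f MB" % (proc.memory_info().rss / (1024.0 * 1024.0)))
```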