Wednesday, October 14, 2009

Long Running Scripts

Well, today I hit a major roadblock, but I have a workaround that should help many of you out there. Have you ever had a process that performs the same operation many times, but the operation sometimes fails, almost at random? Take my situation: I have a clipping function that clips hundreds of datasets, but it will randomly fail with a TopoEngine error. The help suggests checking the geometries of your features, but it turns out they are fine.

After much searching and questioning, I discovered that I need to delete the geoprocessing object and then re-create it, like so:

import arcgisscripting
from time import strftime

# Time the first creation of the geoprocessor -- this one is the slowest
print strftime("%H:%M:%S")
gp = arcgisscripting.create(9.3)
gp.overwriteoutput = True
print strftime("%H:%M:%S")

# Delete the object and re-create it; later creations are much faster
del gp
print strftime("%H:%M:%S")
gp = arcgisscripting.create(9.3)
gp.overwriteoutput = True
print strftime("%H:%M:%S")

I did this twice and measured the time. The first time you create the object takes the longest: on my machine it takes 3 seconds to initialize, but after that the process takes less than a second. Since the processes I am running already take 2-3 hours to complete, a 1-2 minute increase is not too hard to live with.

I found that if I delete the geoprocessing object after performing about three processes, the error is eliminated.
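The pattern above can be sketched generically. The loop below is plain Python: `create_gp()` is a placeholder standing in for `arcgisscripting.create(9.3)`, and `clip()` stands in for the actual geoprocessing call; only the "recycle every three runs" counter logic comes from the post.

```python
def run_with_recycling(datasets, create_gp, clip, recycle_every=3):
    """Run clip(gp, dataset) over all datasets, deleting and
    re-creating the gp object every `recycle_every` runs."""
    gp = create_gp()
    results = []
    for i, dataset in enumerate(datasets, start=1):
        results.append(clip(gp, dataset))
        if i % recycle_every == 0:
            del gp            # release the old geoprocessor
            gp = create_gp()  # pay the short re-creation cost
    del gp
    return results
```

With 3-second start-up and sub-second re-creation, recycling every third run over hundreds of datasets adds only a minute or two to a multi-hour job.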



George Silva said...

That is very weird, but somehow makes sense.

Sometimes when debugging and testing a script using the gp object, my machine will be slow as hell, and when I look at it I can see 5 or 6 Python processes still running, eating up all the memory.

I know I must close all gp objects after use, but have you come across this?


Andrew said...

It all depends on what you are doing. When I'm using subprocessing with my gp tools, I've encountered this many times.

What version of ArcGIS Desktop are you using? It might be a function of that.

Albert Grela said...

Hi, my script, which uses 3 gp functions, did not iterate even twice; I have to 'del gp' at each iteration. I suspect that loading a NetCDF file into a feature file of 576,000 records 'eats' a fair amount of memory. It took quite a while to figure out a solution, and your experience was invaluable. Thanks!

Shane said...

I usually build my scripts to run as tools in ArcToolbox, and mine also loop through a bunch of processes over and over.

My problem is that this causes ArcMap to eat so much memory that it kills my machine, and I have to restart the app to clear it.

So, am I understanding correctly that I can simply add del gp to the end of my script to clear the geoprocessor from memory?


Andrew said...

It will not clear up all the memory, but it will free up some.

I suggest you use the del statement on every object once you are done with it. Each del statement frees up memory on your machine.

Also, I assume you are using 9.3.1; note that deleting your gp object will require you to re-create it before you repeat your steps.
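That advice can be sketched as a cleanup discipline: delete each intermediate object as soon as you are finished with it, and delete the gp object last. Everything here is a hypothetical placeholder (`create_gp`, `clip_one` stand in for the real geoprocessor and tool call); only the `del` discipline itself is from the thread.

```python
def clip_all(feature_classes, create_gp, clip_one):
    """Clip every feature class, del-ing each result (and finally
    the gp object) so the memory can be reclaimed right away."""
    names = []
    gp = create_gp()
    for fc in feature_classes:
        result = clip_one(gp, fc)
        names.append(result)  # keep only the lightweight value we need
        del result            # drop the reference to the heavy object
    del gp                    # release the geoprocessor itself;
                              # re-create it before doing more work
    return names
```

In CPython, `del` removes a reference; the object is actually reclaimed once no other references remain, which is why dropping results inside the loop (rather than after it) matters for long runs.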