Problem updating broken table view data sources

03-21-2012 11:13 AM
MathewCoyle
Frequent Contributor
I have several mxds with broken data sources to SDE after a database move. I am running a script to update the broken data sources and keep getting an error on broken links to table views. All other layers update correctly.

Here is the error I get
Runtime error <type 'exceptions.ValueError'>: StandaloneTableObject: Unexpected error


Here's the relevant code where I am getting the error.
brokenlist = arcpy.mapping.ListBrokenDataSources(mxd)
for lyr in brokenlist:
    if ".sde" in lyr.dataSource:
        try:
            lyr.replaceDataSource(new_datasource, "SDE_WORKSPACE", lyr.name)
        except ValueError as e:
            # Fails with "StandaloneTableObject: Unexpected error" on table views.
            print('Could not repair %s: %s' % (lyr.name, e))


Has anyone encountered this or know of a workaround?

Edit: I've tried it with and without the lyr.name specified.
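
For anyone hitting the same wall, here is a minimal sketch of one possible workaround, not confirmed anywhere in this thread: instead of calling replaceDataSource on each broken item, repoint the whole document with MapDocument.findAndReplaceWorkspacePaths, which also updates standalone tables. The MXD and .sde connection paths below are placeholders.

import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\maps\example.mxd")   # hypothetical MXD path
# Swap every reference to the old SDE connection file for the new one,
# including references held by standalone tables.
mxd.findAndReplaceWorkspacePaths(r"C:\connections\old_server.sde",
                                 r"C:\connections\new_server.sde")
mxd.save()
del mxd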
26 Replies
MichaelVolz
Esteemed Contributor
Mathew:

Sorry for my ignorance in this particular area, but how does one make a "python process large address aware"?

Thank you.
MathewCoyle
Frequent Contributor
Here's the post that got me going. There are a few other memory-management tips in that thread, too.
http://forums.arcgis.com/threads/35655-Pool-from-multiprocessing-issues?p=121358&viewfull=1#post1213...
MichaelVolz
Esteemed Contributor
Mathew:

Thanks for the following information:

Here's the post that got me going. A few other memory management tips in the thread too.
http://forums.arcgis.com/threads/356...l=1#post121358

If I understand the instructions correctly, all I need to do is run the utility on the computer where the Python script runs, browse to the .exe file that I want to make "large address aware", and check the box to apply it. Then, when I run the Python script on that computer again, it will have access to more RAM? Is that it?

Also, if you run your script on another batch of files, can you turn off "large address aware" on the python.exe file to see if you have the memory leak issue that I am experiencing?

Your assistance and feedback are greatly appreciated.  Thanks.
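
For what it's worth, here is a small sketch (not from the linked thread) that reads the PE header of an executable and reports whether the large-address-aware flag is set, which can confirm whether the change actually took effect on python.exe or pythonw.exe:

import struct
import sys

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(exe_path):
    with open(exe_path, 'rb') as f:
        f.seek(0x3C)                        # DOS header field pointing to the PE signature
        pe_offset = struct.unpack('<I', f.read(4))[0]
        f.seek(pe_offset + 4 + 18)          # skip "PE\0\0" plus 18 bytes of the COFF header
        characteristics = struct.unpack('<H', f.read(2))[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

print('Large address aware: %s' % is_large_address_aware(sys.executable))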
MathewCoyle
Frequent Contributor
After more than 50 MXDs there hasn't been any noticeable memory increase. Usage fluctuates from 260 MB to 500 MB but seems to clear all right every iteration. It does seem to release more memory when it finishes a folder chain, if that helps. If there is a memory leak, it is in the KBs per MXD, at least in my environment.
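
In case it helps compare runs, here is a sketch of logging memory per iteration. It assumes a recent version of the third-party psutil package is installed (psutil is not part of arcpy); the label argument is just whatever you want printed, e.g. the current MXD path.

import os
import psutil

process = psutil.Process(os.getpid())

def log_memory(label):
    # Resident set size of the current python/pythonw process, in MB.
    rss_mb = process.memory_info().rss / (1024.0 * 1024.0)
    print('%s: %.1f MB' % (label, rss_mb))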
MichaelVolz
Esteemed Contributor
When you ran this test, were you running python as "large address aware"?

Maybe you could try running the same test with "large address aware" turned off for python?  Unless I read the instructions incorrectly, I thought all this involved was checking a box after browsing to the .exe file.
MathewCoyle
Frequent Contributor
Yes, it was with large address aware. I have a few other things running right now, so I won't be able to revert it back to its original state until tomorrow.

I did notice that at over 100 MXDs it isn't clearing down to the level it was at before. Memory seems to shoot up when it hits a bunch of errors with the tables that I originally posted about, but then it drops back down, just not all the way. Just a guess, but Arc may be trying to log errors from the GP tool in the background and not managing it very well.
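
If geoprocessing history logging really is the culprit, one speculative mitigation (an assumption on my part, not something confirmed in this thread) is to switch it off and release each MapDocument explicitly. The mxd_paths list is a hypothetical collection of MXD paths gathered beforehand.

import gc
import arcpy

arcpy.SetLogHistory(False)        # stop writing geoprocessing history entries

for mxd_path in mxd_paths:        # mxd_paths: hypothetical list built earlier
    mxd = arcpy.mapping.MapDocument(mxd_path)
    # ... repair data sources here ...
    mxd.save()
    del mxd                       # drop the reference to the MapDocument
    gc.collect()                  # and prompt Python to reclaim memory now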
MathewCoyle
Frequent Contributor
Failure!

At around 200 MXDs it crashed. The System process was using close to 3 GB, and the pythonw process was using less than 400 MB. My memory must have failed me; I must not have run it on all the MXDs I thought I had the first round. I must have broken it up into subdirectories and run it a couple of times instead of running against everything under the root directory. Of course, I was also using the computer while it was processing this time, whereas before I left it to run overnight, so maybe that had something to do with it.
lelaharrington
New Contributor III

Did you ever get closure on this? I have the same issue. I am just listing broken data sources in a file location using an os.walk script with ListBrokenDataSources.

I have tried to source the table with ListTableViews, but that is not working. This is the second script I have ever made, and I am not too savvy on how to approach this.
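
For reference, a bare-bones sketch of listing standalone tables with arcpy.mapping.ListTableViews; the MXD path below is a placeholder. Standalone tables show up here rather than in ListLayers.

import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\data\example.mxd")   # hypothetical path
for table in arcpy.mapping.ListTableViews(mxd):
    print('%s -> %s' % (table.name, table.dataSource))
del mxd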

MichaelVolz
Esteemed Contributor

Lela:

I don't believe Python was ever able to handle that situation, so I went through all the MXD files, flagged the broken table views in a log file, and then fixed them manually. (This was a very minor portion of the SDE connections, less than 0.1%, so it was a small job to do by hand.)
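
A sketch of that flag-and-fix-manually approach, assuming broken standalone tables can be picked out of ListBrokenDataSources by type; the log and MXD paths are placeholders.

import arcpy

log = open(r"C:\temp\broken_tables.log", "w")              # hypothetical log location
mxd = arcpy.mapping.MapDocument(r"C:\maps\example.mxd")    # hypothetical MXD path
for item in arcpy.mapping.ListBrokenDataSources(mxd):
    if isinstance(item, arcpy.mapping.TableView):          # standalone tables only
        log.write('%s\t%s\n' % (mxd.filePath, item.name))
del mxd
log.close()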

lelaharrington
New Contributor III

I see; whereas for me it is a huge deal. I am looking to use an except clause that just ignores them; that would be OK too. The broken sources actually stop my os.walk script when it hits a broken table. Is there a way to say "if you run into a broken table, ignore it and keep running"?

Thank you

Lela Harrington
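
A rough sketch of the "ignore it and keep running" idea: wrap the per-item work in try/except inside the os.walk loop so one broken table doesn't stop the script. The root folder is a placeholder.

import os
import arcpy

root_dir = r"C:\mxd_folder"                     # hypothetical root folder
for dirpath, dirnames, filenames in os.walk(root_dir):
    for filename in filenames:
        if not filename.lower().endswith(".mxd"):
            continue
        mxd = arcpy.mapping.MapDocument(os.path.join(dirpath, filename))
        for item in arcpy.mapping.ListBrokenDataSources(mxd):
            try:
                print('Broken: %s | %s | %s' % (mxd.filePath, item.name, item.dataSource))
            except Exception:
                # A standalone table raised an error; note it and keep walking.
                print('Broken (could not read source): %s | %s' % (mxd.filePath, item.name))
        del mxd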
