I'm facing an issue with the arcpy.env.overwriteOutput property.
I have a Python script that performs some geoprocessing operations and at the end generates a final table, let's say "CITY_NEW". It is created on the production server on each daily run of the script. I have set arcpy.env.overwriteOutput to True so that it always replaces the old table with the newly created one.
My issue is that the script sometimes ends with the error "Table name 'CITY_NEW' already exists", but at other times it completes without any error. Is it because the output table "CITY_NEW" may be open in some other application, or being accessed by some user, while the script is replacing it? I tested this by keeping the table open on one machine and running the script manually, and it ended with the same error. If so, why isn't the arcpy.env.overwriteOutput property working as expected here?
overwriteOutput will only overwrite if it can; it does not guarantee that the overwrite will succeed in all situations. A common pattern in scripting is to check whether the feature class exists, try to delete it (Delete_management) within a try-except block, and proceed only if the delete succeeds. Any feature class that is in use by ArcMap is automatically removed from the set of objects that can be deleted: if you create or copy a feature class or table and it is added to an ArcMap project, it cannot be deleted regardless of the overwrite setting, because it is in use by that project.
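A minimal sketch of that check-then-delete pattern might look like this. The decision logic is factored into a small function that takes the `exists`/`delete` callables as parameters (so it can be exercised without arcpy); in a real script you would pass `arcpy.Exists` and `arcpy.Delete_management`. The workspace path and table name are placeholders.

```python
def delete_if_possible(dataset, exists, delete):
    """Delete `dataset` if it exists. Return True on success (or if there was
    nothing to delete), False if the delete failed - typically because another
    session (e.g. ArcMap) holds a lock on it."""
    if not exists(dataset):
        return True  # nothing to remove
    try:
        delete(dataset)
        return True
    except Exception:
        # With arcpy this would usually be an arcpy.ExecuteError.
        return False

if __name__ == "__main__":
    import arcpy
    arcpy.env.workspace = r"C:\data\production.sde"  # hypothetical connection
    if not delete_if_possible("CITY_NEW", arcpy.Exists, arcpy.Delete_management):
        raise SystemExit("CITY_NEW is locked; aborting before geoprocessing.")
    # ... proceed with the geoprocessing that writes the new CITY_NEW ...
```

Failing fast like this at least turns the intermittent "already exists" error into a clear "table is locked" message at the start of the run.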
Is there any way to kill the session in which the output table is being used, whether it's an ArcMap session or something else?
I tried arcpy.ListUsers(), which lists all database-level connections, and those can be disconnected.
But I'm not able to find a way to get table- or feature-class-level locks.
If that is possible, could I first kill that session and then perform the replace?
This would help us a lot.
I also tried first truncating the table and then inserting all the new records into it, but I ran into two things. First, in one run I again kept the table open in ArcMap, and it again failed with a lock issue. Second, the truncate-and-insert approach takes far more time than the current one.
I want to know whether there is any way to disconnect a specific user who holds a lock on that table.
Thanks, Dan!
I have implemented this in another task, but it disconnects the user from the entire SDE workspace (which we are not supposed to do), not from the single table object.
I want to know whether there is any method to get object-level locks in a workspace, the way we can from ArcCatalog or ArcMap: disconnect all users from a specific table in SDE, and not from the entire SDE.
I hope I've been able to convey my question.
Thanks in advance.
SDE is a whole different issue, with its own rules. If it isn't covered in arcpy, then I can't help... perhaps ArcObjects, but someone with experience in that realm would have to weigh in.