Automated Post/Reconcile, Compress, Index, and Analyze Script Fails

08-07-2019 06:49 AM
DavidBuehler
Occasional Contributor III

I am trying to automate my administrative processes with Python for my enterprise geodatabase.  I basically took the script found at the end of Use Python scripting to batch reconcile and post versions—ArcGIS Help | ArcGIS Desktop and trimmed out the parts I do not need.  I cannot for the life of me figure out why it bombs when run via a scheduled task.

Using 10.6.1; sde is the admin-level user, and the gis user is the owner of the data.
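[Editor's note] When a script works interactively but bombs under Task Scheduler, the first thing to pin down is which account the task actually runs as, since the connection file path under C:\Users\GISAdmin\... only resolves for that one profile. A quick, standard-library-only sketch you could drop at the top of the script (the log-writing part is commented out and its path is a placeholder):

```python
import getpass
import socket
import sys

def run_context():
    """Identity details of the running process - handy for seeing which
    account a scheduled task actually executes the script as."""
    return {
        "user": getpass.getuser(),    # the .sde file lives under this user's profile
        "host": socket.gethostname(),
        "python": sys.executable,     # which interpreter the task resolved to
    }

# Example: write it somewhere the task account can reach, e.g.
# with open(r"C:\ArcGISscripts\task_context.txt", "w") as f:
#     f.write(repr(run_context()))
```

If the logged user is not GISAdmin, the connection-file path in the script will not resolve and the script dies before arcpy does anything.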

import arcpy  #, time, smtplib

# Set the workspace
arcpy.env.workspace = r'C:\Users\GISAdmin\AppData\Roaming\ESRI\Desktop10.6\ArcCatalog\SDE to GIS2.sde'
# sde is the user that can disconnect users
# gis is the owner of all the data


# Set a variable for the workspace
adminConn = arcpy.env.workspace

# Block new connections to the database.
print("The database is no longer accepting connections")
arcpy.AcceptConnections(adminConn, False)
print("Credit Card Declined")

# Wait 15 minutes
#time.sleep(900)

# Disconnect all users from the database.
print("Disconnecting all users")
arcpy.DisconnectUser(adminConn, "ALL")
print("Disconnected Users")

# Get a list of versions to pass into the ReconcileVersions tool.
# Only reconcile versions that are children of Default
print("Compiling a list of versions to reconcile")
verList = arcpy.ListVersions(adminConn)
#versionList = [ver.name for ver in verList if ver.parentVersionName == 'sde.DEFAULT']
print(verList)

# Execute the ReconcileVersions tool.
print("Reconciling all versions")
arcpy.ReconcileVersions_management(adminConn, "ALL_VERSIONS", "sde.DEFAULT", u'WEBSERVICES.WebEdits', "LOCK_ACQUIRED", "NO_ABORT", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "KEEP_VERSION", "c:/ArcGISscripts/ReconcileLogs/Temp/reconcilelog.txt")
print("Reconcile Complete")

# Run the compress tool.
print("Running compress")
arcpy.Compress_management(adminConn)
print("Compression Complete")

# Allow the database to begin accepting connections again
print("Allow users to connect to the database again")
arcpy.AcceptConnections(adminConn, True)
print("Communications Back On Line Captain")

# Update statistics and indexes for the system tables
# Note: to use the "SYSTEM" option the user must be a geodatabase or database administrator.
# Rebuild indexes on the system tables
print("Rebuilding indexes on the system tables")
arcpy.RebuildIndexes_management(adminConn, "SYSTEM")
print("Rebuilt Indexes")

# Update statistics on the system tables
print("Updating statistics on the system tables")
arcpy.AnalyzeDatasets_management(adminConn, "SYSTEM")
print("Analyzed Data sets")


print("YAY!!! We Finished For Once.")
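[Editor's note] One fragility in the script above: if ReconcileVersions or Compress raises, the final arcpy.AcceptConnections(adminConn, True) never runs and the geodatabase stays closed to users. A try/finally guarantees the re-open happens no matter what. This is a sketch of the pattern only, with plain callables standing in for the arcpy calls:

```python
def run_maintenance(block, work, unblock):
    """Run 'work' between 'block' and 'unblock', guaranteeing 'unblock'
    fires even if 'work' raises - mirrors AcceptConnections(False)/(True)."""
    block()
    try:
        work()
    finally:
        unblock()  # always re-open the geodatabase to users

# Demonstration with stand-ins for the arcpy calls:
events = []

def fake_block():
    events.append("blocked")       # stands in for AcceptConnections(conn, False)

def fake_work():
    raise RuntimeError("reconcile failed")  # simulate a tool blowing up

def fake_unblock():
    events.append("unblocked")     # stands in for AcceptConnections(conn, True)

try:
    run_maintenance(fake_block, fake_work, fake_unblock)
except RuntimeError:
    pass
# events is now ["blocked", "unblocked"] - the re-open ran despite the error
```

In the real script you would put the reconcile/compress calls in the try body and AcceptConnections(adminConn, True) in the finally clause.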
7 Replies
MichaelVolz
Esteemed Contributor

Are you sure the script is even getting fired from the Windows Scheduled Task?

Are you running this script from a desktop or server machine? Either way which specific OS are you using?

DavidBuehler
Occasional Contributor III

Hi Michael,

I would be running the script from a server machine, Windows Server 2012 R2.  The script does fire.  I think it has something to do with which user has permissions.

DougBrowning
MVP Frequent Contributor

For me it was that some things need to use the DBO user and some need to use the SDE user.  It depends not just on the tool but also on the options picked.  Notice I actually have to run Analyze twice.

# Setup Vars
# List of each DB to run maintenance on - use full path to SDE connection files
# Edt
edtSDEConn = r"\\somthing\EdtSDE.sde"
edtDBOConn = r"\\something\EdtDBO.sde"


# log file location
logfileDir = r"\\something\Logs"

failedFlag = 0

import traceback, time, arcpy, os, datetime

#----------------------Functions----------------------------------------------#
# just to make it easy to write to all 3 places at once
def prt(msg):
    print msg
    arcpy.AddMessage(msg)
    logFile.write(msg + "\n")

#----------------------Enable Log file-----------------------------------------------#

try:
    tmpName = time.asctime()
    # Cannot have : in the time so made it -
    tmpName = "Maintenance report made " + tmpName.replace(':', '-')
    logfileName = logfileDir + '/' + tmpName + '.txt'
    logFile = open(logfileName, 'w')

    startTime = time.time()
    prt(tmpName)
    prt("Start time is " + time.asctime())

    # First is Compress - which uses the SDE user
    # Run on Edt
    prt("\nCompress Started on Edt: " + time.strftime('%H:%M'))
    result = arcpy.Compress_management(edtSDEConn)
    if result.status != 4:
        prt("  Warning: Arc reported that Compress on Edt had an issue! Status code is " + str(result.status))
        prt("  Arc Result messages are ")
        prt(result.getMessages())
        failedFlag = 1

    # Need to get a list of all the feature classes and tables for the next two tools
    prt("\nGetting a list of tables\n")
    # For Edt
    arcpy.env.workspace = edtDBOConn
    # Get any at the root - which all of ours are
    edtfcList = arcpy.ListTables() + arcpy.ListFeatureClasses()
    # IF there was any Feature Datasets would use this code.  But there are none
##    fdList = arcpy.ListDatasets("*", "Feature")
##    for fd in fdList:
##        fcList += arcpy.ListFeatureClasses(feature_dataset=fd)

    # Then Rebuild Indexes
    # Run on Edt using DBO and NO_SYSTEM
    prt("Rebuild Indexes Started on Edt using DBO: " + time.strftime('%H:%M'))
    result = arcpy.RebuildIndexes_management(edtDBOConn, "NO_SYSTEM", edtfcList, "ALL")
    if result.status != 4:
        prt("  Warning: Arc reported that Rebuild on Edt using DBO had an issue! Status code is " + str(result.status))
        prt("  Arc Result messages are ")
        prt(result.getMessages())
        failedFlag = 1

    # Run on Edt using SDE and SYSTEM
    prt("Rebuild Indexes Started on Edt using SDE: " + time.strftime('%H:%M'))
    result = arcpy.RebuildIndexes_management(edtSDEConn, "SYSTEM", "", "ALL")
    if result.status != 4:
        prt("  Warning: Arc reported that Rebuild on Edt using SDE had an issue! Status code is " + str(result.status))
        prt("  Arc Result messages are ")
        prt(result.getMessages())
        failedFlag = 1

    # Finally Analyze Datasets
    # Run on Edt using DBO and NO_SYSTEM
    prt("Analyze Datasets Started on Edt using DBO: " + time.strftime('%H:%M'))
    result = arcpy.AnalyzeDatasets_management(edtDBOConn, "NO_SYSTEM", edtfcList, "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")
    if result.status != 4:
        prt("  Warning: Arc reported that Analyze on Edt using DBO had an issue! Status code is " + str(result.status))
        prt("  Arc Result messages are ")
        prt(result.getMessages())
        failedFlag = 1

    # Run on Edt using SDE and SYSTEM
    prt("Analyze Datasets Started on Edt using SDE: " + time.strftime('%H:%M'))
    result = arcpy.AnalyzeDatasets_management(edtSDEConn, "SYSTEM", "", "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")
    if result.status != 4:
        prt("  Warning: Arc reported that Analyze on Edt using SDE had an issue! Status code is " + str(result.status))
        prt("  Arc Result messages are ")
        prt(result.getMessages())
        failedFlag = 1

    if failedFlag == 0:
        prt("Maintenance completed successfully.\n")
    else:
        prt("Warning: Maintenance was NOT successful.\n")

    prt("\nEnded at " + time.asctime())
    prt("Run time is " + str((time.time() - startTime) / 60) + " minutes")
    prt("\n---End---\n")
    logFile.close()

    # Rename log if Arc thinks it failed (it can fail but Python is fine)
    if failedFlag == 1:
        os.rename(logfileName, logfileName[:-4] + "_Failed.txt")

except:
    # This part of the script executes if anything went wrong in the main script above

    #-----------------------------------------------
    # Use this section to prt all the errors
    prt("\nSomething went wrong.\n\n")
    prt("Arc Result messages are ")
    if 'result' in locals():  # result is undefined if the first tool never ran
        prt(result.getMessages())
    prt("Python Traceback Message below:")
    prt(traceback.format_exc())
    prt("\nArcMap Error Messages below:")
    prt(arcpy.GetMessages(2))
    prt("\nArcMap Warning Messages below:")
    prt(arcpy.GetMessages(1))

    prt("\n\nEnded at " + time.asctime() + '\n')
    prt("\n---End of Script---\n")

    try:
        if logFile:
            logFile.close()
            # Rename log file based on outcome
            os.rename(logfileName, logfileName[:-4] + "_Failed.txt")
    except:
        print "Log file never opened"
#-------------------------------------------------------------------
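[Editor's note] A small aside on the log-rename step: the [:-4] slice assumes a three-character extension. os.path.splitext does the same job without that assumption; a minimal sketch:

```python
import os

def failed_name(log_path):
    """Build the '_Failed' variant of a log file name, e.g.
    'Maintenance report.txt' -> 'Maintenance report_Failed.txt'.
    Works for any extension length, unlike slicing off 4 characters."""
    root, ext = os.path.splitext(log_path)
    return root + "_Failed" + ext
```

In the script above, `os.rename(logfileName, failed_name(logfileName))` would replace both rename lines.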


BruceLang
Occasional Contributor III

Many thanks for this script.  I know enough to be dangerous, so please feel free to dumb it down for me.  I have a question about the section beginning at line 46:

# Need to get a list of all the feature classes and tables for the next two tools

Specifically:

    # IF there was any Feature Datasets would use this code.  But there are none
##        fdList = arcpy.ListDatasets("*", "Feature")
##        for fd in fdList:
##            fcList += arcpy.ListFeatureClasses(feature_dataset=fd)

I think our data do contain Feature Datasets, but I get an error when activating this code by removing the double hash marks (##) and re-aligning the code to:

    # IF there was any Feature Datasets would use this code.  But there are none
    fdList = arcpy.ListDatasets("*", "Feature")
    for fd in fdList:
        fcList += arcpy.ListFeatureClasses(feature_dataset=fd)

The error is; "NameError: name 'fcList' is not defined"

So, shouldn't the variable "fcList" actually be changed to "edtfcList"?

The script completes after this change, but please confirm it.  Thank you very much.
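[Editor's note] For what it's worth, the NameError comes from using += on a name that was never initialized; since the script stored the root-level list in edtfcList, extending that name (or initializing fcList first) both work. A minimal sketch with plain lists standing in for the arcpy List* results (the dataset names here are made up):

```python
# Stand-ins for arcpy.ListTables() / arcpy.ListFeatureClasses() at the root:
tables = ["Parcels_tbl"]
root_fcs = ["Roads"]
fcList = tables + root_fcs          # initialize BEFORE any += (this is the fix)

# Stand-in for looping over feature datasets and listing each one's contents:
for fd_contents in (["Hydro_Lines"], ["Sewer_Mains"]):
    fcList += fd_contents           # now += extends an existing list, no NameError
```

Passing the combined list on to RebuildIndexes/AnalyzeDatasets then covers feature classes inside datasets as well.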

DougBrowning
MVP Frequent Contributor

I think that is correct.  I never actually used it, so I'm not sure.  Since we do not have any feature datasets, I never needed it.

The only weird part here is that you have to run some tools twice if you have SDE and DBO set up like we do.   

Glad it helped.  Please mark as helpful.  Thanks!

Arne_Gelfert
Regular Contributor

I was wrestling with some of these same issues recently.  Have you made sure that your account has "log on as a service" or "log on as a batch job" permissions on the server?  In my case, I had also migrated code to the Python 3.x that ships with Pro.  In that case, you need to make sure that your service account has a Pro license (i.e., is a named user in Portal).

When you say "bombs", is there an error message somewhere?
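[Editor's note] If the task runs with no console attached, the traceback simply vanishes. One way to capture it is to redirect output in the scheduled task's action itself; the interpreter path below is the typical ArcGIS Desktop 10.6 install location and the script/log paths are placeholders:

```
cmd.exe /c "C:\Python27\ArcGIS10.6\python.exe C:\ArcGISscripts\maintenance.py >> C:\ArcGISscripts\task_output.log 2>&1"
```

Anything the script prints, including the error it dies with, then lands in task_output.log for the next morning.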

BlakeTerhune
MVP Regular Contributor

Is the scheduled task set to run as the GISAdmin user?  We put our database connections in a common folder on the C: drive of the server instead of in a user profile folder.
