
SDE failing to compress to state 0, multiple admin connections are left despite disconnecting/blocking connections

10-04-2021 07:27 AM
CarlVricella
New Contributor III

As the title states, for some reason my SDE geodatabase does not get to state 0 during my weekly compress, which runs via a Python script. In our SDE the 'SDE' user is the admin. I connect as that user, disconnect everyone, block new connections, stop our geodatabase-manager services, delete all unposted versions, and then compress, yet I'm always left with something like this (from my logs):

 

2021-10-01 21:19:29,839 - DEBUG - Failed to compress to state 0, current def state 16140
2021-10-01 21:19:29,839 - DEBUG - The following states are still remaining:
2021-10-01 21:19:29,839 - DEBUG - (STATE_ID,OWNER,CREATION_TIME,CLOSING_TIME,PARENT_STATE_ID,LINEAGE_NAME)
2021-10-01 21:19:29,855 - DEBUG - (0, u'SDE', datetime.datetime(2013, 2, 4, 16, 1, 51), datetime.datetime(2013, 2, 4, 16, 23, 14), 0, 0)
2021-10-01 21:19:29,855 - DEBUG - (16140, u'SDE', datetime.datetime(2021, 10, 1, 16, 45, 33), datetime.datetime(2021, 10, 1, 16, 45, 34), 0, 16122)
2021-10-01 21:19:29,855 - DEBUG - (15687, u'SDE', datetime.datetime(2021, 10, 1, 14, 2, 15), datetime.datetime(2021, 10, 1, 14, 2, 35), 0, 15687)
2021-10-01 21:19:29,871 - DEBUG - (16144, u'SDE', datetime.datetime(2021, 10, 1, 16, 46, 40), datetime.datetime(2021, 10, 1, 16, 46, 40), 16140, 16122)
2021-10-01 21:19:29,871 - DEBUG - (16146, u'SDE', datetime.datetime(2021, 10, 1, 16, 46, 42), datetime.datetime(2021, 10, 1, 16, 46, 43), 16140, 16146)
2021-10-01 21:19:29,885 - DEBUG - (16148, u'SDE', datetime.datetime(2021, 10, 1, 16, 46, 45), datetime.datetime(2021, 10, 1, 16, 47, 7), 16140, 16148)
2021-10-01 21:19:29,885 - DEBUG - (16150, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 9), datetime.datetime(2021, 10, 1, 16, 47, 10), 16140, 16150)
2021-10-01 21:19:29,901 - DEBUG - (16152, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 11), datetime.datetime(2021, 10, 1, 16, 47, 13), 16140, 16152)
2021-10-01 21:19:29,901 - DEBUG - (16154, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 14), datetime.datetime(2021, 10, 1, 16, 47, 16), 16140, 16154)
2021-10-01 21:19:29,901 - DEBUG - (16156, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 17), datetime.datetime(2021, 10, 1, 16, 47, 17), 16140, 16156)
2021-10-01 21:19:29,917 - DEBUG - (16158, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 19), datetime.datetime(2021, 10, 1, 16, 47, 31), 16140, 16158)
2021-10-01 21:19:29,917 - DEBUG - (16160, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 33), datetime.datetime(2021, 10, 1, 16, 47, 34), 16140, 16160)
2021-10-01 21:19:29,933 - DEBUG - (16162, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 35), datetime.datetime(2021, 10, 1, 16, 47, 44), 16140, 16162)
2021-10-01 21:19:29,933 - DEBUG - (16164, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 46), datetime.datetime(2021, 10, 1, 16, 47, 47), 16140, 16164)
2021-10-01 21:19:29,933 - DEBUG - (16166, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 48), datetime.datetime(2021, 10, 1, 16, 47, 50), 16140, 16166)
2021-10-01 21:19:29,948 - DEBUG - (16168, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 51), datetime.datetime(2021, 10, 1, 16, 47, 52), 16140, 16168)
2021-10-01 21:19:29,948 - DEBUG - (16170, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 54), datetime.datetime(2021, 10, 1, 16, 47, 54), 16140, 16170)
2021-10-01 21:19:29,963 - DEBUG - (16172, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 56), datetime.datetime(2021, 10, 1, 16, 47, 56), 16140, 16172)
2021-10-01 21:19:29,963 - DEBUG - Attempting compress to state 0 again
2021-10-01 21:19:52,631 - DEBUG - Second attempt failed, current def state 16140
2021-10-01 21:19:52,631 - DEBUG - The following states are still remaining:
2021-10-01 21:19:52,647 - DEBUG - (STATE_ID,OWNER,CREATION_TIME,CLOSING_TIME,PARENT_STATE_ID,LINEAGE_NAME)
2021-10-01 21:19:52,647 - DEBUG - (0, u'SDE', datetime.datetime(2013, 2, 4, 16, 1, 51), datetime.datetime(2013, 2, 4, 16, 23, 14), 0, 0)
2021-10-01 21:19:52,647 - DEBUG - (16140, u'SDE', datetime.datetime(2021, 10, 1, 16, 45, 33), datetime.datetime(2021, 10, 1, 16, 45, 34), 0, 16122)
2021-10-01 21:19:52,661 - DEBUG - (15687, u'SDE', datetime.datetime(2021, 10, 1, 14, 2, 15), datetime.datetime(2021, 10, 1, 14, 2, 35), 0, 15687)
2021-10-01 21:19:52,661 - DEBUG - (16144, u'SDE', datetime.datetime(2021, 10, 1, 16, 46, 40), datetime.datetime(2021, 10, 1, 16, 46, 40), 16140, 16122)
2021-10-01 21:19:52,677 - DEBUG - (16146, u'SDE', datetime.datetime(2021, 10, 1, 16, 46, 42), datetime.datetime(2021, 10, 1, 16, 46, 43), 16140, 16146)
2021-10-01 21:19:52,677 - DEBUG - (16148, u'SDE', datetime.datetime(2021, 10, 1, 16, 46, 45), datetime.datetime(2021, 10, 1, 16, 47, 7), 16140, 16148)
2021-10-01 21:19:52,677 - DEBUG - (16150, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 9), datetime.datetime(2021, 10, 1, 16, 47, 10), 16140, 16150)
2021-10-01 21:19:52,694 - DEBUG - (16152, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 11), datetime.datetime(2021, 10, 1, 16, 47, 13), 16140, 16152)
2021-10-01 21:19:52,694 - DEBUG - (16154, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 14), datetime.datetime(2021, 10, 1, 16, 47, 16), 16140, 16154)
2021-10-01 21:19:52,709 - DEBUG - (16156, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 17), datetime.datetime(2021, 10, 1, 16, 47, 17), 16140, 16156)
2021-10-01 21:19:52,709 - DEBUG - (16158, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 19), datetime.datetime(2021, 10, 1, 16, 47, 31), 16140, 16158)
2021-10-01 21:19:52,709 - DEBUG - (16160, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 33), datetime.datetime(2021, 10, 1, 16, 47, 34), 16140, 16160)
2021-10-01 21:19:52,724 - DEBUG - (16162, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 35), datetime.datetime(2021, 10, 1, 16, 47, 44), 16140, 16162)
2021-10-01 21:19:52,724 - DEBUG - (16164, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 46), datetime.datetime(2021, 10, 1, 16, 47, 47), 16140, 16164)
2021-10-01 21:19:52,740 - DEBUG - (16166, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 48), datetime.datetime(2021, 10, 1, 16, 47, 50), 16140, 16166)
2021-10-01 21:19:52,740 - DEBUG - (16168, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 51), datetime.datetime(2021, 10, 1, 16, 47, 52), 16140, 16168)
2021-10-01 21:19:52,740 - DEBUG - (16170, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 54), datetime.datetime(2021, 10, 1, 16, 47, 54), 16140, 16170)
2021-10-01 21:19:52,756 - DEBUG - (16172, u'SDE', datetime.datetime(2021, 10, 1, 16, 47, 56), datetime.datetime(2021, 10, 1, 16, 47, 56), 16140, 16172)

 

I run a second compress if the first fails to reach state 0, as I've read that sometimes helps, but still no luck. How can I remedy this? I'm not sure where these states are coming from: at this point there are no versions (they've all been posted or deleted), so these states are unreferenced. The datetimes show they were opened around 4 PM, while this script runs at 9 PM, and the only things that use this schema/user are myself and the services that manage the geodatabase (which I turn off before running).
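One way to see why DEFAULT can't be trimmed back to state 0 is to rebuild the parent/child tree from the logged tuples. This is a plain-Python sketch (no arcpy required) over the (STATE_ID, PARENT_STATE_ID) pairs in the log above: every remaining state except 15687 hangs off state 16140, so whatever opened 16140 and its children between 16:45 and 16:47 is what blocks the compress.

```python
# Rebuild the state lineage tree from the (STATE_ID, PARENT_STATE_ID)
# pairs in the compress log. A state cannot be trimmed while it still
# has child states hanging off it.
def build_state_tree(states):
    """Map each parent_state_id to the list of its child state_ids."""
    children = {}
    for state_id, parent_id in states:
        children.setdefault(parent_id, []).append(state_id)
    return children

# (state_id, parent_state_id) pairs copied from the log, minus root state 0
remaining = [(16140, 0), (15687, 0), (16144, 16140), (16146, 16140),
             (16148, 16140), (16150, 16140), (16152, 16140), (16154, 16140),
             (16156, 16140), (16158, 16140), (16160, 16140), (16162, 16140),
             (16164, 16140), (16166, 16140), (16168, 16140), (16170, 16140),
             (16172, 16140)]

tree = build_state_tree(remaining)
print(sorted(tree))       # → [0, 16140]  (parents still holding children)
print(len(tree[16140]))   # → 15 child states pinned under state 16140
```

The 15 states created in quick succession at 16:46-16:47 all share parent 16140, which is the signature of a batch process (editing session, sync job, or service) touching DEFAULT after the versions were deleted.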

 

12 Replies
JakeSkinner
Esri Esteemed Contributor

I agree with @George_Thompson that you should not truncate these tables in the geodatabase. Always proceed with extreme caution before making any edits to the SDE repository tables. Below is a script I've had success with: it compresses a geodatabase and writes a delta report to a CSV file showing the before/after row counts of the A (adds) and D (deletes) tables.

 

import requests, json, arcpy, sys, linecache, time, csv
from arcpy import env
env.overwriteOutput = 1

# Disable warnings
requests.packages.urllib3.disable_warnings()

startTime = time.clock()

# Variables
adminConnection = r"C:\DB_Connections\GIS.sde"           # DBO connected user (or SDE user)

federatedAGS = "true"                                   # Specify true if AGS is federated, false if not

portalServer = "portal.esri.com"                        # Portal instance
portalUsername = "portaladmin"                          # Portal Admin account
portalPassword = "**********"                           # Portal admin account password

agsServer = "server.esri.com"                           # ArcGIS Server instance
serverPort = '6443'                                     # Use '6080' for HTTP (kept as a string; compared as a string below)
agsUsername = ''                                        # Do not specify if ArcGIS Server is federated
agsPassword = ''                                        # Do not specify if ArcGIS Server is federated

deleteVersions = "KEEP_VERSION"                         # DELETE_VERSION
abortConflicts = "NO_ABORT"                             # ABORT_CONFLICTS
resolveConflicts = "FAVOR_EDIT_VERSION"                 # FAVOR_TARGET_VERSION
reports = r"C:\TEMP"                                    # Directory to write Delta Report


# Split the services
serviceList = []

# Dictionary of all versioned feature classes
versionedFeatureClasses = {}


# Function to report errors
def PrintException():
    exc_type, exc_obj, tb = sys.exc_info()
    f = tb.tb_frame
    lineno = tb.tb_lineno
    filename = f.f_code.co_filename
    linecache.checkcache(filename)
    line = linecache.getline(filename, lineno, f.f_globals)
    arcpy.AddError('Error:  Line {} -- "{}": {}'.format(lineno, line.strip(), exc_obj))
    sys.exit()


# Function to get all services
def getServices():
    if serverPort == '6080':
        baseUrl = "http://{0}:{1}/arcgis/admin/services".format(agsServer, serverPort)
    else:
        baseUrl = "https://{0}:{1}/arcgis/admin/services".format(agsServer, serverPort)

    params = {'f': 'json', 'token': token}
    r = requests.post(baseUrl, data = params, verify=False)
    catalog = json.loads(r.content)
    services = catalog['services']
    for service in services:
        if service['type']!= 'StreamServer':
            serviceList.append(service['serviceName'] + '.' + service['type'])

    folders = catalog['folders']
    for folderName in folders:
        if str(folderName) not in ('System', 'Utilities', 'DataStoreCatalogs', 'Hosted'):
            r = requests.post(baseUrl + "/" + folderName, data = params, verify=False)
            catalog = json.loads(r.content)
            services = catalog['services']
            for service in services:
                serviceList.append(str(folderName) + "/" + service['serviceName'] + '.' + service['type'])

# Function to start and stop services
def startStopServices(START_STOP):
    for service in serviceList:
        if START_STOP == 'stop':
            print("Stopping {}".format(service))
        else:
            print("Starting {}".format(service))
        if serverPort == '6080':
            baseUrl = "http://{0}:{1}/arcgis/admin/services".format(agsServer, serverPort)
        else:
            baseUrl = "https://{0}:{1}/arcgis/admin/services".format(agsServer, serverPort)
        params = {'f': 'json', 'token': token}

        r = requests.post(baseUrl + "/" + service + "/" + START_STOP, data = params, verify=False)
        response = json.loads(r.content)
        print("\t" + str(response))


# Function to get delta table counts
def getDeltaTableCount(egdb_conn):
    arcpy.env.workspace = adminConnection

    fcList = []
    for dataset in arcpy.ListDatasets("*"):
        for fc in arcpy.ListFeatureClasses("*", "", dataset):
            if arcpy.Describe(fc).isVersioned == True:
                fcList.append(fc)

    for fc in arcpy.ListFeatureClasses("*"):
        if arcpy.Describe(fc).isVersioned == True:
            fcList.append(fc)

    for fc in fcList:
        egdb_conn = arcpy.ArcSDESQLExecute(adminConnection)

        try:
            sqlDBO = "SELECT registration_id FROM dbo.sde_table_registry where table_name = '{0}' and owner = '{1}'".format(fc.split(".")[-1], fc.split(".")[-2])
            registrationId = egdb_conn.execute(sqlDBO)
        except:
            pass
        try:
            sqlDBO = "SELECT registration_id FROM sde.sde_table_registry where table_name = '{0}' and owner = '{1}'".format(fc.split(".")[-1], fc.split(".")[-2])
            registrationId = egdb_conn.execute(sqlDBO)
        except:
            pass
        try:
            sqlDBO = "SELECT registration_id FROM sde.table_registry where table_name = '{0}' and owner = '{1}'".format(fc.split(".")[-1].upper(), fc.split(".")[-2].upper())
            registrationId = egdb_conn.execute(sqlDBO)
        except:
            pass


        sqlDBO = "SELECT COUNT(*) FROM {0}.a{1}".format(fc.split(".")[-2], registrationId)
        aCount = egdb_conn.execute(sqlDBO)
        sqlDBO = "SELECT COUNT(*) FROM {0}.d{1}".format(fc.split(".")[-2], registrationId)
        dCount = egdb_conn.execute(sqlDBO)
        try:
            versionedFeatureClasses[str(fc)].append(aCount)
            versionedFeatureClasses[str(fc)].append(dCount)
        except:
            versionedFeatureClasses[str(fc)] = [aCount, dCount]

    return versionedFeatureClasses



# Function to get current State ID
def getStateID(adminConnection):
    egdb_conn = arcpy.ArcSDESQLExecute(adminConnection)

    sqlDBO = '''SELECT state_id FROM dbo.sde_states'''
    sqlSDE = '''SELECT state_id FROM sde.sde_states'''
    sqlORCL = '''SELECT state_id FROM sde.states'''

    try:
        state_id = egdb_conn.execute(sqlDBO)
        deltaCounts = getDeltaTableCount(egdb_conn)
    except:
        pass
    try:
        state_id = egdb_conn.execute(sqlSDE)
        deltaCounts = getDeltaTableCount(egdb_conn)
    except:
        pass
    try:
        state_id = egdb_conn.execute(sqlORCL)
        deltaCounts = getDeltaTableCount(egdb_conn)
    except:
        pass

    try:
        state_id = int(state_id)
    except:
        state_id = int(state_id[-1][0])

    print("Current State ID:  " + str(state_id))
    return state_id, deltaCounts


# If federated
if federatedAGS == 'true':
    if portalUsername and portalPassword:
        try:
            # Generate token for Portal
            tokenURL = 'https://{}:7443/arcgis/sharing/rest/generateToken/'.format(portalServer)
            params = {'f': 'json', 'username': portalUsername, 'password': portalPassword, 'referer': 'https://' + portalServer, 'expiration': str(1440)}
            r = requests.post(tokenURL, data = params, verify = False)
            response = json.loads(r.content)
            token = response['token']
        except:
            PrintException()
    else:
        token = ''

# If not federated
if federatedAGS == 'false':
    if agsUsername and agsPassword:
        try:
            if serverPort == '6080':
                tokenURL = 'http://{}:6080/arcgis/tokens/'.format(agsServer)
            else:
                tokenURL = 'https://{}:6443/arcgis/tokens/'.format(agsServer)
            params = {'username': agsUsername, 'password': agsPassword, 'client': 'requestip', 'f': 'pjson', 'expiration': str(1440)}
            r = requests.post(tokenURL, data = params, verify = False)
            response = json.loads(r.content)
            token = response['token']
        except:
            PrintException()
    else:
        token = ''

# Get Current State ID
current_state_id, deltaCounts = getStateID(adminConnection)

# Get All Services
getServices()

# Stop AGS services
if '' in serviceList:
    serviceList.remove('')
if len(serviceList) > 0:
    startStopServices("stop")

# Block new connections to the database.
print("Blocking new connections to the database")
arcpy.AcceptConnections(adminConnection, False)

# Disconnect all users
print("Disconnecting all users from database")
arcpy.DisconnectUser(adminConnection, "ALL")

# Get a list of versions to pass into the ReconcileVersions tool.
print("Getting list of all versions")
versionList = arcpy.ListVersions(adminConnection)

for version in versionList:
    if 'default' in version.lower():
        if 'dbo' in version.lower():
            defaultVersion = 'dbo.DEFAULT'
        elif 'sde' in version.lower():
            defaultVersion = 'sde.DEFAULT'


# Execute the ReconcileVersions tool.
print("Reconciling/posting/deleting all versions")
try:
    arcpy.ReconcileVersions_management(adminConnection, "ALL_VERSIONS", defaultVersion, versionList, "LOCK_ACQUIRED", abortConflicts, "BY_OBJECT", resolveConflicts, "POST", deleteVersions, reports + "\\CompressLog.txt")
except:
    reconcileMessage = "Reconcile failed:  " + arcpy.GetMessages() + ".  Check the CompressLog.txt file in " + reports
    arcpy.AddWarning(reconcileMessage)

# Run the compress tool.
print("Running compress")
try:
    arcpy.Compress_management(adminConnection)
    print("Compress was successful")
except Exception:
    compressMessage = 'Compress failed:  ' + arcpy.GetMessages() + ".  Please check the sde.COMPRESS_LOG table within the geodatabase"
    arcpy.AddWarning(compressMessage)

# Allow the database to begin accepting connections again
print("Allow users to connect to the database again")
arcpy.AcceptConnections(adminConnection, True)

# Clear workspace cache
arcpy.ClearWorkspaceCache_management()

# Start AGS services
if len(serviceList) > 0:
    startStopServices("start")

# Get New State ID
new_state_id, deltaCounts = getStateID(adminConnection)

# Write delta table counts to CSV
print("Writing delta counts to CSV")
with open(reports + "\\DeltaCounts.csv", "w") as csvFile:   # renamed so it no longer shadows the csv module
    columnTitleRow = "Feature Class,A table count (Before Compress),A table count (After Compress),D table count (Before Compress),D table count (After Compress)\n"
    csvFile.write(columnTitleRow)
    for val in deltaCounts:
        row = val + "," + str(deltaCounts[val][0]) + "," + str(deltaCounts[val][2]) + "," + str(deltaCounts[val][1]) + "," + str(deltaCounts[val][3]) + "\n"
        csvFile.write(row)


endTime = time.clock()
elapsedTime = round((endTime - startTime) / 60, 2)
print("Compress completed in " + str(elapsedTime) + " mins. \nNew State ID:  " + str(new_state_id))
CarlVricella
New Contributor III

Can you give your reasoning and state its relevance to the above? We assume everything of importance has been posted by the time this script runs, and any existing lock would interfere with getting to state 0; that's why those tables are truncated. PROCESS_INFORMATION is truncated to remove any lingering connections, and I then reconnect afterward to get the SDE connection back.

 

Regardless, the real question is where are these excess states coming from? They shouldn't exist because, again, I delete all versions (except DEFAULT, of course) and turn all our services off (and by services I mean Windows services connecting to the SDE that manage our post and QA queues). No map service uses the SDE connection.
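If the suspicion is a lingering connection, it may help to log exactly who is still attached immediately before the compress rather than truncating PROCESS_INFORMATION. `arcpy.ListUsers(sde)` returns one row per connection (each row includes an ID and a user Name); the filter below is a minimal plain-Python sketch, with invented sample tuples standing in for those rows, that separates the script's own session from everything else.

```python
# Hedged sketch: separate the admin script's own session from any other
# connection still attached before compressing. Plain tuples stand in for
# the rows arcpy.ListUsers would return; the sample data is invented.
def unexpected_connections(conns, admin_user="SDE", own_id=None):
    """Return connections that are neither this script's own session
    (matched by connection ID) nor owned by the admin user."""
    return [c for c in conns
            if c[0] != own_id and c[1].upper() != admin_user.upper()]

# Hypothetical rows shaped like (ID, Name, Machine)
sample = [(101, "SDE", "gis-admin-box"),
          (102, "sde", "qa-queue-svc"),
          (103, "EDITOR1", "workstation7")]

print(unexpected_connections(sample, own_id=101))
# → [(103, 'EDITOR1', 'workstation7')]
```

Writing that list to the same log as the state dump would show whether the 16:45-16:47 states line up with a connection you didn't expect to still be open.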

RogerDunnGIS
Occasional Contributor II

I thought the smallest number of states you could get to with a compress was 1, not 0.
