Python Script - Updating Feature Class in SDE daily

10-01-2015 05:17 PM
timdunlevie
Occasional Contributor

Hi all,

I am going to implement a nightly script which will remove all records in a Feature Class in our spatial DB, and append new records.

Apart from compressing the database, is there anything else I should manage? Analyze? Should I remove any locks from connected users?

This process will run every night.

Will compressing the entire DB have any effect on child/parent DBs?

Also, are you able to compress a single Feature Class rather than an entire DB?

We run 10.2.

Thanks,

Tim

15 Replies
AsrujitSengupta
Regular Contributor III

Compress, Rebuild Indexes, and Analyze Datasets are the main maintenance tasks.

Disconnecting all users before doing the compress will help. Considering that the script will run nightly, this shouldn't be a problem I guess.

No, compressing the sde geodatabase will not affect replica GDBs.

No, the compress tool runs for the entire geodatabase.

timdunlevie
Occasional Contributor

Brilliant!

Many thanks….just what I needed to know.

Tim

WesMiller
Regular Contributor III

Here's what I use:

# Import system modules
import arcpy
import os

# Set the workspace
workspace = r"Your File here"
arcpy.env.workspace = workspace

# Block new connections to the geodatabase
arcpy.AcceptConnections(workspace, False)

# Disconnect all connected users
arcpy.DisconnectUser(workspace, "ALL")

# Compress the geodatabase
arcpy.Compress_management(workspace)
print('Compression Complete')

# Get a list of stand-alone feature classes
dataList = arcpy.ListFeatureClasses()

# Add feature classes inside feature datasets
for dataset in arcpy.ListDatasets("", "Feature"):
    arcpy.env.workspace = os.path.join(workspace, dataset)
    dataList += arcpy.ListFeatureClasses()

# Reset the workspace
arcpy.env.workspace = workspace

# Rebuild indexes on all the feature classes
arcpy.RebuildIndexes_management(workspace, "NO_SYSTEM", dataList, "ALL")
print('Rebuild Complete')

# Allow connections again
arcpy.AcceptConnections(workspace, True)
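Asrujit also mentioned Analyze Datasets, which the script above doesn't run. A minimal sketch of bolting it on, assuming the same `workspace` and `dataList` variables; the helper just assembles the arguments for `arcpy.AnalyzeDatasets_management` (a standard arcpy tool, but verify the keywords against your 10.2 install):

```python
def analyze_args(workspace, datasets, include_system=False):
    # Assemble the argument tuple for arcpy.AnalyzeDatasets_management:
    # update DBMS statistics on the base, delta, and archive tables.
    return (workspace,
            "SYSTEM" if include_system else "NO_SYSTEM",
            datasets,
            "ANALYZE_BASE",
            "ANALYZE_DELTA",
            "ANALYZE_ARCHIVE")

# After the rebuild step in the script above, while users are still
# disconnected:
# arcpy.AnalyzeDatasets_management(*analyze_args(workspace, dataList))
```

Running Analyze while everyone is still disconnected keeps it in the same maintenance window as the compress and rebuild.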
BlakeTerhune
MVP Regular Contributor
WesMiller
Regular Contributor III

I thought I had put that in; apparently I didn't. Thanks!

BlakeTerhune
MVP Regular Contributor

Are all of your feature classes in the same schema?

WesMiller
Regular Contributor III

Yes they are all in the same schema.

JoshuaBixby
MVP Esteemed Contributor

Is this feature class large?  Are all the records being altered/modified each day?  I ask because completely deleting all records in a feature class only to replace them with many of the same records creates a lot of overhead, especially if the feature class is versioned, and even more so if there are replicas based off the geodatabase.  If most of the records don't change, it might be more efficient to implement a record comparison workflow and only purge and replace records that are different.
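A rough sketch of the compare-then-update idea: read both tables into dicts keyed on a stable ID (e.g. with `arcpy.da.SearchCursor`), diff them in plain Python, and only touch the keys that changed. The diff itself is ordinary set logic; the field names in the commented usage are hypothetical:

```python
def diff_records(existing, incoming):
    """Compare two {key: attribute-tuple} mappings and return the
    key sets to delete from, insert into, and update in the target."""
    to_delete = set(existing) - set(incoming)
    to_insert = set(incoming) - set(existing)
    to_update = {k for k in set(existing) & set(incoming)
                 if existing[k] != incoming[k]}
    return to_delete, to_insert, to_update

# Hypothetical usage with arcpy (PARCEL_ID/STATUS are made-up fields):
# with arcpy.da.SearchCursor(target_fc, ["PARCEL_ID", "STATUS"]) as cur:
#     existing = {row[0]: row[1:] for row in cur}
# ...build 'incoming' the same way from the source table, then use
# UpdateCursor/InsertCursor only on the keys diff_records() reports.
```

With ~8000 rows and only a handful changing, this turns a full purge-and-append into a few targeted edits per night.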

timdunlevie
Occasional Contributor

Thanks Joshua,

Yeah, the FC is ~8,000 records, of which maybe only a handful are added/modified.

So a record comparison is the way to go….thanks.

Do you run a script to perform this comparison?

Thanks,

Tim
