Automatically compress File Geodatabases

12-10-2020 05:51 AM
MarkVolz
Occasional Contributor III

Hello,

I would like to know if there is a geoprocessing script or another way to crawl through a directory and automatically compress any File Geodatabases within its subfolders.  This would save me the time of manually searching for all of the geodatabases in the directory.

Thanks!

P.S.  I did come across this script, but it appears to apply to SQL databases rather than File Geodatabases.  https://community.esri.com/t5/data-management-documents/compress-geodatabase-tool/ta-p/908944

1 Solution

Accepted Solutions
JoeBorgione
MVP Emeritus

It would not be a tough script to write; pseudocode might look like this:

 

import arcpy

arcpy.env.workspace = r"C:\path\to\folder"  # directory of geodatabases

fgdbList = arcpy.ListWorkspaces("*", "FileGDB")
for fgdb in fgdbList:
    arcpy.CompressFileGeodatabaseData_management(fgdb)

 

The question I have, though, is: what is the purpose of compressing a file geodatabase on a regular basis?

That should just about do it....
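
A note on nesting: arcpy.ListWorkspaces only looks in the folder set as the current workspace, so file geodatabases sitting in deeper subfolders would be missed. An untested sketch that walks the subfolders as well (the root path below is a placeholder) might look like this:

import os
import arcpy

root_dir = r"C:\path\to\folder"  # placeholder: top-level folder to crawl

for folder, subfolders, files in os.walk(root_dir):
    # don't descend into the .gdb folders themselves
    subfolders[:] = [d for d in subfolders if not d.endswith(".gdb")]
    arcpy.env.workspace = folder
    for fgdb in arcpy.ListWorkspaces("*", "FileGDB") or []:
        arcpy.CompressFileGeodatabaseData_management(fgdb)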


9 Replies
JohannesBierer
Occasional Contributor III

Not tested but could work?

It's from here: "https://gis.stackexchange.com/questions/129454/how-to-compact-geodatabases-in-multiple-directories-u..."

import arcpy, os

# Root folder to search; a raw string cannot end with a single backslash,
# so r"C:\" is invalid Python -- use "C:\\" (or a specific folder) instead
arcpy.env.workspace = "C:\\"

for path, dirs, files in os.walk(arcpy.env.workspace):
    for dir in dirs:
        if dir.endswith(".gdb"):
            workspace = os.path.join(path, dir)
            arcpy.Compact_management(workspace)
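
When run against a large directory tree, a variant with basic error handling keeps the loop going if one geodatabase fails to compact, for example because of a lock file. An untested sketch (the root folder is a placeholder):

import arcpy, os

root = r"C:\path\to\folder"  # placeholder root folder

for path, dirs, files in os.walk(root):
    for d in dirs:
        if d.endswith(".gdb"):
            gdb = os.path.join(path, d)
            try:
                arcpy.Compact_management(gdb)
                print("Compacted: " + gdb)
            except arcpy.ExecuteError:
                # e.g. a .lock file from an open ArcGIS session
                print("Skipped " + gdb + ": " + arcpy.GetMessages(2))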
MarkVolz
Occasional Contributor III

This and Joe's code worked.

Please note that there is a difference between arcpy.Compact_management(workspace) and arcpy.CompressFileGeodatabaseData_management(workspace); the latter compresses the data into a read-only format.

Thank You!
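
For later readers, a minimal side-by-side sketch of that difference, using the standard Data Management tools (the path is a placeholder):

import arcpy

gdb = r"C:\path\to\archive.gdb"  # placeholder

arcpy.Compact_management(gdb)                        # defragments; data stays editable
arcpy.CompressFileGeodatabaseData_management(gdb)    # stores the data in a compressed, read-only format
arcpy.UncompressFileGeodatabaseData_management(gdb)  # returns the data to its editable format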

 

MarkVolz
Occasional Contributor III
Joe,

I will check out your script to compress the geodatabases. To answer your question, the purpose of compressing the File Geodatabases is to reduce the storage footprint of older geodatabases that are infrequently used, such as those in archive folders. In theory this should be a one-time script. All that being said, I hope I am not confusing compressing a database with compacting a database. Please let me know if you disagree.


JoeBorgione
MVP Emeritus

I'm pretty sure I've never compressed a file geodatabase. When I read about it in the help page, I got the impression that it's a one-and-done operation. Compressing an enterprise geodatabase (eGDB), on the other hand, is an ongoing maintenance operation, and I've written and executed scripts for those.

That should just about do it....
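
The enterprise case Joe describes uses a different tool, Compress (Data Management), run against a connection file. A hedged sketch (the .sde path is a placeholder):

import arcpy

sde_connection = r"C:\path\to\admin_connection.sde"  # placeholder connection file

# Compress is the recurring maintenance operation for versioned enterprise
# geodatabases, unlike the one-time file geodatabase operations above
arcpy.Compress_management(sde_connection)
print(arcpy.GetMessages())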
Robert_LeClair
Esri Notable Contributor

Mark - I think what you're looking for is Compact (Data Management) to reduce file size on disk. It keeps the contents editable, whereas Compress File Geodatabase Data makes the contents read-only.

JoeBorgione
MVP Emeritus

From the compact help page:

[Screenshot of the Compact (Data Management) help page]

Swap out the following in my code:

#replace:
arcpy.CompressFileGeodatabaseData_management(fgdb)
#with
arcpy.Compact_management(fgdb)

That should do the trick for you....

That should just about do it....