Have you distributed this across your organization? I can see right away it will fail in a Citrix environment. Too bad because this type of programmatic access is pretty good to have!
Hi James,
No I haven't. I developed this solution as part of some R&D on a redesign of an SDE data loading process we have. We need to detect changes in datasets in a range of formats so we can determine which of them have been updated and need to be loaded to our SDE geodatabases. Our current app was set up by my supervisor in C# ArcObjects. It works well but has a few shortcomings and has proven tricky to maintain over the years.
Out of curiosity, what indicates that it will fail in a Citrix environment?
It will fail on your import comtypes statement(s).
Hi Micah, thanks for this clear start-to-finish example of using Python ArcObjects. I hope you don't mind me taking the liberty of folding it into my (very slowly) growing py-arcobjects module?
Hi Matt,
I don't mind at all. Thanks for the shout-out. Happy scripting!
Micah
Great, thanks Micah!
Things are slow right now at work, so I found an answer to the second part of this question: how to get the size of a file geodatabase table or feature class. In Snippets.py, underneath the GetModifiedDate function, insert:
def GetFileSize(gdb, tableName, featureDataset):

    # Define a function which will convert bytes to a meaningful unit
    def convert_bytes(bytes):
        bytes = float(bytes)
        if bytes >= 1099511627776:
            terabytes = bytes / 1099511627776
            size = '%.2f TB' % terabytes
        elif bytes >= 1073741824:
            gigabytes = bytes / 1073741824
            size = '%.2f GB' % gigabytes
        elif bytes >= 1048576:
            megabytes = bytes / 1048576
            size = '%.2f MB' % megabytes
        elif bytes >= 1024:
            kilobytes = bytes / 1024
            size = '%.2f KB' % kilobytes
        else:
            size = '%.2f bytes' % bytes
        return size

    # Setup
    GetStandaloneModules()
    InitStandalone()
    import comtypes.gen.esriSystem as esriSystem
    import comtypes.gen.esriGeoDatabase as esriGeoDatabase
    import comtypes.gen.esriDataSourcesGDB as esriDataSourcesGDB

    # Open the FGDB
    pWS = Standalone_OpenFileGDB(gdb)

    # Create an empty property set
    pPropSet = NewObj(esriSystem.PropertySet, esriSystem.IPropertySet)
    pPropSet.SetProperty("database", gdb)

    # Cast the FGDB as IFeatureWorkspace
    pFW = CType(pWS, esriGeoDatabase.IFeatureWorkspace)

    if featureDataset == "standalone":
        # Open the stand-alone table
        pTab = pFW.OpenTable(tableName)
    else:
        # Open the feature class
        pTab = pFW.OpenFeatureClass(tableName)

    # Cast the dataset to an IDatasetFileStat object and return its size
    pDFS = CType(pTab, esriGeoDatabase.IDatasetFileStat)
    return convert_bytes(pDFS.StatSize)
It's very similar to the GetModifiedDate function, but uses a different property of the IDatasetFileStat object.
...Really though, this information ought to be accessible via something like
arcpy.Describe().modifiedDate
arcpy.Describe().fileSize
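Until something like that is exposed, one workaround that needs no ArcObjects at all is to remember that a file geodatabase is just a folder of files, so the standard library can total it up. This is only a sketch, and it gives the size of the whole .gdb, not of an individual table or feature class (mapping a table to its files is exactly what IDatasetFileStat handles for you); the function name is my own:

```python
import os

def get_gdb_size(gdb_path):
    """Sum the sizes of all files inside a file geodatabase folder.

    A file geodatabase is a directory of files, so walking it with
    os.walk gives the total footprint on disk in bytes. Note this is
    the whole .gdb, not a per-table size.
    """
    total = 0
    for dirpath, dirnames, filenames in os.walk(gdb_path):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total
```

Handy as a sanity check against what IDatasetFileStat reports per dataset.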
Very nice, thanks again! FYI, here's a good thread on producing human-readable byte sizes: python - Reusable library to get human readable version of file size? - Stack Overflow
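The gist of that thread is that the chain of elif thresholds in convert_bytes can be collapsed into a loop that repeatedly divides by 1024. A compact variant along those lines (function name and unit labels are my own choice):

```python
def human_readable(num_bytes, suffix="B"):
    """Format a byte count as a human-readable string.

    Divides by 1024 until the value drops below one unit,
    e.g. 1048576 -> '1.00 MB'.
    """
    size = float(num_bytes)
    for unit in ("", "K", "M", "G", "T"):
        if size < 1024.0:
            return "%.2f %s%s" % (size, unit, suffix)
        size /= 1024.0
    return "%.2f P%s" % (size, suffix)
```

Drop-in replacement for the nested convert_bytes helper if you prefer fewer branches.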
You may want to request this functionality (arcpy.Describe().modifiedDate, arcpy.Describe().fileSize) on the Esri Ideas site.
Hi Lance,
Thanks for mentioning the Esri Idea related to this. I've promoted it and hope others will as well.