I would like to batch export my photo attachments to my local network. I need an efficient way to have multiple photos associated with a single survey named so that I know which survey they're tied to. I have found a couple of scripts that batch export attachments to a local folder, but they only handle one attachment per survey. I believe this happens because the scripts assume the attachment table is one-to-one with the feature layer. I typically have 5 photos per structure, and my surveys cover hundreds of structures. I need to expedite getting these photos off of Portal and onto my local network, and so far I have found little guidance on the best way to do this.
This is the script that I have been using:
# imports the necessary modules to run
import arcpy
from arcpy import da
import os

attachTable = arcpy.GetParameterAsText(0)   # table in GDB holding attachments
origTable = arcpy.GetParameterAsText(1)     # layer in GDB holding features to which attachments belong
nameField = arcpy.GetParameterAsText(2)     # field in origTable that contains a more appropriate name for the attachment
fileLocation = arcpy.GetParameterAsText(3)  # folder where you want to save the photos

# create the cursor to search through the attachment table; only the three needed fields
attachCursor = da.SearchCursor(attachTable, ['DATA', 'ATT_NAME', 'sc_portal_map_ld_globalid_2'])

# walk the table and write out the actual images, movies, etc.
for attRow in attachCursor:
    binaryRep = attRow[0]
    fileName = attRow[1]
    relID = attRow[2]  # relationship ID used to join back to the features (origTable)

    # create a cursor over the features to find the match for relID
    originCursor = da.SearchCursor(origTable, ['GlobalID', nameField])
    for origRow in originCursor:
        origID = origRow[0]    # GlobalID (matches relID for the correct attachment)
        origName = origRow[1]  # the unique name used to save the attachment
        if origID == relID:    # stop the search once the matching record is found
            break

    # save a file in the specified location under the name chosen by the user
    with open(fileLocation + os.sep + origName + ".jpg", 'wb') as f:
        f.write(binaryRep.tobytes())

    # delete the feature cursor so it restarts for the next attachment
    del originCursor

# If you are creating the script tool from scratch in ArcGIS:
# Parameter(0) Type = Table
# Parameter(1) Type = Feature Layer
# Parameter(2) Type = Field (obtained from Parameter(1))
# Parameter(3) Type = Folder (make sure it is an input)
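A likely reason a script like this only yields one photo per structure: every attachment belonging to the same feature is written to the same `origName + ".jpg"` path, so each write silently overwrites the previous one. A minimal sketch (plain Python, hypothetical names) of one way to de-duplicate filenames by appending a counter:

```python
def unique_name(base, ext, used):
    """Return base.ext, or base_2.ext, base_3.ext, ... if already taken."""
    name = "{}.{}".format(base, ext)
    n = 1
    while name in used:
        n += 1
        name = "{}_{}.{}".format(base, n, ext)
    used.add(name)
    return name

used = set()
for _ in range(3):  # e.g. three photos attached to the same structure
    print(unique_name("Pole_101", "jpg", used))
# Pole_101.jpg, Pole_101_2.jpg, Pole_101_3.jpg
```

Inside the attachment loop you would call this with `origName` as the base instead of writing to `origName + ".jpg"` directly; the `used` set persists across iterations so repeated structures get distinct files.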
Tagging Python. You might get more responses. Formatting your code will also help, see: Code Formatting... the basics++
I have been experimenting with some code that may help you. It exports attachments and saves them using a unique filename that includes the object ID from the feature layer, the object ID of the attachment, and text from a field in the feature layer.
import arcpy
import os
import re

origTable = r"C:\Path\To\Your.gdb\Layer"
attachTable = "{}__ATTACH".format(origTable)  # if no attachTable given, append __ATTACH to origTable
nameField = "dataField"  # appropriate name field in origTable
fileLocation = r"C:\Path\To\Save\Directory"
origFieldsList = ["GlobalID", "OBJECTID", nameField]  # GlobalID for linking; OBJECTID and nameField for renaming

# Use a dictionary comprehension to build a lookup from a da.SearchCursor
valueDict = {r[0]: (r[1:]) for r in arcpy.da.SearchCursor(origTable, origFieldsList)}

# 'REL_GLOBALID' (here 'sc_portal_map_ld_globalid_2') is the GlobalID that links to origTable
with arcpy.da.SearchCursor(attachTable, ['DATA', 'ATT_NAME', 'ATTACHMENTID', 'REL_GLOBALID']) as cursor:
    for item in cursor:
        attachment = item[0]  # attachment data
        filenum = "ATT" + str(item[2]) + "_"
        filename = filenum + str(item[1])  # this will be the filename if linking fails
        # store the join value of the row being updated in a keyValue variable
        keyValue = item[3]  # REL_GLOBALID
        # verify that the keyValue is in the dictionary
        if keyValue in valueDict:
            # transfer the values stored under the keyValue from the dictionary
            obID = valueDict[keyValue][0]
            # remove invalid filename characters, replace spaces and periods, limit length
            namefield = re.sub('[^0-9a-zA-Z]+', '_', valueDict[keyValue][1])[:18]
            # create a unique filename: ObjectID_AttachmentID_namefield.ext
            ext = filename.rsplit('.', 1)[-1]  # keep the extension of the original file
            filename = "{}_{}_{}.{}".format(obID, item[2], namefield, ext)
        print "Writing: {}{}{}".format(fileLocation, os.sep, filename)
        open(fileLocation + os.sep + filename, 'wb').write(attachment.tobytes())

del item, filenum, filename, attachment
del valueDict
print "Done"
Since you are using Portal, you may need to make some adjustments; I was working with a geodatabase downloaded from AGOL. It is not set up as a script tool, but it could be.
Hope this helps.
References:
How To: Batch export attachments from a feature class
Turbo Charging Data Manipulation with Python Cursors and Dictionaries (Example 2 method is used here)
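For anyone unfamiliar with the dictionary method from the second reference: it replaces the nested-cursor search with a single pass over the feature layer, loading it into a dict keyed on GlobalID so each attachment's REL_GLOBALID can be looked up in constant time. A stripped-down sketch, with hypothetical in-memory rows standing in for the two SearchCursors:

```python
# hypothetical rows standing in for the two SearchCursors
orig_rows = [                      # (GlobalID, OBJECTID, nameField)
    ("{GUID-1}", 1, "Pole 101"),
    ("{GUID-2}", 2, "Pole 102"),
]
attach_rows = [                    # (ATT_NAME, ATTACHMENTID, REL_GLOBALID)
    ("photo-a.jpg", 10, "{GUID-1}"),
    ("photo-b.jpg", 11, "{GUID-1}"),  # second photo on the same structure
]

value_dict = {r[0]: r[1:] for r in orig_rows}   # one pass over the features

names = []
for att_name, att_id, rel_id in attach_rows:
    oid, label = value_dict[rel_id]             # constant-time lookup, no nested cursor
    names.append("{}_{}_{}".format(oid, att_id, label.replace(" ", "_")))

print(names)  # ['1_10_Pole_101', '1_11_Pole_101']
```

Because the attachment ID is part of each name, multiple photos on the same structure no longer collide, which addresses the original one-photo-per-survey problem.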
Thank you Randy! I'll be heading into work tonight to give this a try. Stay tuned for my results!
We keep getting an error in line 17, any insight?
My initial guess would be that you are starting with an original feature table called "Field_Point" and that the linked attachment table is expected to be "Field_Point__ATTACH". Depending on your geodatabase type, it may be using a different naming system than what the code expects. Changes to lines 5-6 may be required. I would use Catalog to explore your database and verify the names of the feature and attachment tables.
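One way to check that guess programmatically is to probe a few candidate names and keep the first one that exists. This is only a sketch: the suffix list is an assumption (file geodatabases use `__ATTACH`; other stores may differ), and the `exists` callable would be `arcpy.Exists` in a real geodatabase. Here it is exercised against a fake table listing so it can run anywhere:

```python
def find_attach_table(base, exists):
    """Return the first candidate attachment-table name for which
    exists(name) is truthy, or None. Pass arcpy.Exists for a real GDB;
    the suffix list below is a guess, not an exhaustive catalogue."""
    for suffix in ("__ATTACH", "_ATTACH"):
        candidate = base + suffix
        if exists(candidate):
            return candidate
    return None

# quick check against a fake table listing
tables = {"Field_Point", "Field_Point__ATTACH"}
print(find_attach_table("Field_Point", tables.__contains__))  # Field_Point__ATTACH
```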
Great work Randy.
Just in case anyone needs this:
If you already have the nameField you want your download files to be named after in the attached table, and don't need the feature class for the naming:
import arcpy
from arcpy import da
import os
import re

inTable = arcpy.GetParameterAsText(0)
fileLocation = arcpy.GetParameterAsText(1)
nameField = "ADDRESS"

with da.SearchCursor(inTable, ['DATA', 'ATT_NAME', nameField, 'OBJECTID']) as cursor:
    for item in cursor:
        attachment = item[0]
        newfilename = re.sub('[^0-9a-zA-Z]+', '_', str(item[2]))
        oid = 'OID' + str(item[3])
        oldfilename = str(item[1])
        ext = oldfilename.rsplit('.', 1)[-1]  # keep the extension of the original file
        newwinfilename = "{}_{}.{}".format(newfilename, oid, ext)
        open(fileLocation + os.sep + newwinfilename, 'wb').write(attachment.tobytes())

del item, newfilename, oid, oldfilename, newwinfilename, attachment
Also, if the attachments are not related using the GlobalID/GUID (for example, related using OBJECTID/REL_OBJECTID instead), you can make them as such using these methods discussed here:
Preserving a GlobalID while moving data between Feature Classes
Other references:
How To: Batch export attachments from a feature class in ArcGIS Pro
It looks like you might be using python 3. Try:
print(f"Writing: {fileLocation}{os.sep}{filename}")
If this doesn't solve the issue, confirm which version of Python you are using.
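A quick way to confirm which interpreter a script tool is running under (ArcMap ships Python 2, ArcGIS Pro ships Python 3):

```python
import sys

major = sys.version_info[0]
print("Running under Python {}".format(major))
# Under Python 2 (ArcMap), `print "text"` is valid; under Python 3
# (ArcGIS Pro), print is a function and the statement form is a SyntaxError.
```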
Hi Randy
I am very new to ArcMap and I tried this script, but got the following error.
I am guessing there is something I am doing wrong, or do I need to change something in the script?
Traceback (most recent call last):
  File "C:\Users\rugum\Documents\ArcGIS\My ArcGIS Files\My ToolBoxes\Renaming_Export_Attachments.py", line 13, in <module>
    valueDict = {r[0]:(r[1:]) for r in arcpy.da.SearchCursor(origTable, origFieldsList)}
RuntimeError: cannot open 'C:\Path\To\Your.gdb\Layer'
Failed to execute (RenamingAttachments).
Failed at Thu Jul 2 09:09:07 2020 (Elapsed Time: 0.24 seconds)