Please help with batch exporting photo attachments in Collector

09-15-2017 01:56 PM
andystiver
New Contributor

I would like to batch export my photo attachments to my local network. I need an efficient way for the multiple photos associated with a single survey to be named so I can tell which survey they are tied to. I have found a couple of scripts that batch export attachments to a local folder, but they only address one attachment per survey; I believe this is because they are written as if the attachment table were one-to-one with the feature layer. I typically have five photos per structure, and my surveys cover hundreds of structures, so I need to expedite getting these photos off the Portal and onto my local network. So far I have found little guidance on the best way to do this.

 

This is the script that I have been using:

# imports the necessary modules to run
import arcpy
from arcpy import da
import os

attachTable = arcpy.GetParameterAsText(0)   # table in GDB holding attachments
origTable = arcpy.GetParameterAsText(1)     # layer in GDB holding features to which attachments belong
nameField = arcpy.GetParameterAsText(2)     # field in origTable that contains a more appropriate name for the attachment
fileLocation = arcpy.GetParameterAsText(3)  # folder where you want to save the photos

# create the cursor to search through the attachment table; only the three fields needed
attachCursor = da.SearchCursor(attachTable, ['DATA', 'ATT_NAME', 'sc_portal_map_ld_globalid_2'])

# search the table and store the actual images, movies, etc.
for attRow in attachCursor:
    binaryRep = attRow[0]
    fileName = attRow[1]
    relID = attRow[2]  # the relationship ID used to join back to the features (origTable)

    # create a cursor over the features to find a match for the relID above
    originCursor = da.SearchCursor(origTable, ['GlobalID', nameField])
    for origRow in originCursor:
        origID = origRow[0]    # the GlobalID (will match relID for the correct attachment)
        origName = origRow[1]  # the unique name used to save the attachment
        if origID == relID:    # stop once the record matching the attachment's ID is found
            break

    # save a file in the specified location using the name chosen for that attachment
    with open(fileLocation + os.sep + origName + ".jpg", 'wb') as f:
        f.write(binaryRep.tobytes())

    # delete the search cursor so it can restart for the next attachment
    del originCursor

# If you are creating the script tool from scratch in ArcGIS
# Parameter(0) Type = Table
# Parameter(1) Type = Feature Layer
# Parameter(2) Type = Field (obtained from Parameter(1))
# Parameter(3) Type = Folder (make sure it is an input)
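Because the script above writes every attachment for a feature to the same `<name>.jpg`, features with multiple photos end up with only the last one saved. One way to keep all of them is to append a running counter per feature name. A minimal sketch of that naming logic in plain Python (no arcpy required; the `Pole17`/`Pole18` names are hypothetical):

```python
from collections import defaultdict

def build_filenames(attachment_rows):
    """Given (feature_name, attachment_data) pairs, return one unique
    filename per attachment: name_1.jpg, name_2.jpg, ..."""
    counts = defaultdict(int)
    names = []
    for feature_name, _data in attachment_rows:
        counts[feature_name] += 1  # per-feature counter keeps names unique
        names.append("{}_{}.jpg".format(feature_name, counts[feature_name]))
    return names

# hypothetical rows: structure "Pole17" has three photos, "Pole18" has one
rows = [("Pole17", b""), ("Pole17", b""), ("Pole18", b""), ("Pole17", b"")]
print(build_filenames(rows))
# -> ['Pole17_1.jpg', 'Pole17_2.jpg', 'Pole18_1.jpg', 'Pole17_3.jpg']
```

In the posted script, the same idea would mean incrementing a counter keyed on `origName` and writing `origName + "_" + str(n) + ".jpg"` instead of a fixed name.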

12 Replies
DanPatterson
MVP Esteemed Contributor

C:\Path\To\Your.gdb\Layer

Did you change the above path to match your situation?

In other words, specify the path to your geodatabase and provide the layer name


... sort of retired...
MichaelKelly3
New Contributor III

Have you come across this script which downloads directly from a hosted Feature Layer?
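For anyone without that script handy: a hosted feature layer exposes its attachments through the ArcGIS REST API, where each feature's attachments live under `{layer_url}/{objectid}/attachments`. A minimal sketch of building those request URLs (the service URL, object ID, and attachment ID below are hypothetical; downloading still requires an HTTP client and, for secured services, a token):

```python
def attachment_url(layer_url, oid, attachment_id=None, token=None):
    """Build the REST URL for a feature's attachments resource.

    Without attachment_id:  {layer_url}/{oid}/attachments  (JSON list of infos)
    With attachment_id:     {layer_url}/{oid}/attachments/{id}  (the binary file)
    """
    url = "{}/{}/attachments".format(layer_url.rstrip("/"), oid)
    if attachment_id is not None:
        url += "/{}".format(attachment_id)
    if token:
        url += "?token={}".format(token)
    return url

print(attachment_url("https://services.arcgis.com/xyz/FeatureServer/0", 42, 7))
# -> https://services.arcgis.com/xyz/FeatureServer/0/42/attachments/7
```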

NelsonSterner
New Contributor

Thanks Randy! The script worked for me on some feature classes but not others. I've been getting this error:

Runtime error
Traceback (most recent call last):
File "<string>", line 31, in <module>
File "C:\Python27\ArcGIS10.5\Lib\re.py", line 155, in sub
return _compile(pattern, flags).sub(repl, string, count)
TypeError: expected string or buffer
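That traceback means `re.sub` was handed something other than a string, most often a `None` from a feature whose name field is empty, which would explain why it works on some feature classes and not others. A small sketch of a guard (the sanitizing pattern and `safe_name` helper are illustrative, not from the script in question):

```python
import re

def safe_name(value, default="unnamed"):
    """Coerce a possibly-None field value to a string before
    passing it to re.sub, which raises TypeError on None."""
    if value is None:
        return default
    # replace anything that is not filename-safe with an underscore
    return re.sub(r'[^0-9A-Za-z_-]', '_', str(value))

print(safe_name("Pole #17"))  # -> Pole__17
print(safe_name(None))        # -> unnamed
```

Checking the name field for nulls (or running `safe_name` on it before the `re.sub` call) should get the failing feature classes through.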

Any idea?
