Having attachments from Collector be saved in folder on server

12-22-2016 11:26 AM
ChristopherHolewinski1
New Contributor III

I'm currently attempting to create a map to be used with Collector to help our assessors take pictures of properties.  I've created a map that contains the parcel layer, enabled attachments, and can successfully attach pictures to each parcel.  The problem is that I would like those pictures to be downloaded/saved to a folder so I can import them into our assessor software.  Or does Collector just save those pictures to the cloud with no way to access them outside of AGOL?

Any information would be greatly appreciated.

Thank you,

Chris


Accepted Solutions
MichaelDavis3
Occasional Contributor III

If the photos are saved as attachments in AGOL, they should be included when you download your data as a file geodatabase.  You could then use a Python script to export the attachments as image files in a folder.  We have variations on this that create a folder for each feature and then export the attachments, so the resulting files are somewhat organized.
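For reference, the per-feature folder organization described here can be sketched in plain Python.  The `rows` argument stands in for what an `arcpy.da.SearchCursor` over the layer's attachment (`__ATTACH`) table would yield; the function itself is a hypothetical helper, not Esri code:

```python
import os

def export_attachments(rows, out_dir):
    """Write (feature_id, filename, blob) triples into one folder per feature."""
    written = []
    for feature_id, filename, blob in rows:
        # One subfolder per feature keeps the exported images organized
        folder = os.path.join(out_dir, str(feature_id))
        os.makedirs(folder, exist_ok=True)
        path = os.path.join(folder, filename)
        with open(path, 'wb') as f:
            f.write(blob)
        written.append(path)
    return written
```

In practice you would feed it cursor rows over `['GLOBALID', 'ATT_NAME', 'DATA']` (or similar) from the downloaded file geodatabase.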

4 Replies
GeorgeKatsambas
Occasional Contributor III

I am using Collector and have a field crew that is attaching photos. Everything works fine, except that the attach and attach-rel tables in SDE only allow exporting one photo at a time. I am using a Python script I got from GeoNet:

import arcpy
from arcpy import da
import os

inTable = arcpy.GetParameterAsText(0)
fileLocation = arcpy.GetParameterAsText(1)

# Read each attachment BLOB from the attachment table and write it out as a file
with da.SearchCursor(inTable, ['DATA', 'ATT_NAME', 'ATTACHMENTID']) as cursor:
    for item in cursor:
        attachment = item[0]
        filenum = "ATT" + str(item[2]) + "_"
        filename = filenum + str(item[1])
        with open(os.path.join(fileLocation, filename), 'wb') as f:
            f.write(attachment.tobytes())

MichaelDavis3
Occasional Contributor III

This is the exact same code we use - maybe check your spacing and indentation to make sure everything is lined up correctly?

If SDE is the issue you might also try to copy/paste everything into a file geodatabase and then run the export script on it.

ScottFierro2
Occasional Contributor III

We use this; you will just have to create the local FGDB with a feature class and a table, then follow step 7 for modifying the script to fit your use:

1) Create local file geodatabase to hold data and attachments you want to download from ArcGIS Online (called data.gdb in script)

2) Create a feature class (called myLayer in the script), enable attachments, and add GlobalIDs

3) Add the following field to the feature class:
    - GlobalID_str, text, length: 50

4) Create a table (called MatchTable in the script).

5) Add the following fields to the MatchTable table:
    - GlobalID_Str, text, length: 50
    - PhotoPath, text, length: 255

6) Enable "sync" on hosted feature service (http://resources.arcgis.com/en/help/arcgisonline/index.html#//010q000000n0000000)

7) Open AGOL_pullFeatures script in text editor and modify the following:
    -ArcGIS Online username/password
    -REST url to feature service to pull from
    -path to and name of local file geodatabase
    -fields to pull from the hosted feature service (must match local feature class)
    -name of local feature class (this will hold the data from the hosted service and the attachments)
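As a side note, the script below is Python 2 (urllib/urllib2).  Under Python 3 the generateToken request at the top can be built with urllib.request instead; this is a minimal sketch with placeholder credentials (building the request makes no network call — only get_token does, and it needs a valid AGOL login):

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

GT_URL = 'https://www.arcgis.com/sharing/rest/generateToken'

def build_token_request(username, password):
    # POST body mirrors the gtValues dict in the Python 2 script
    data = urlencode({'username': username,
                      'password': password,
                      'referer': 'https://www.arcgis.com',
                      'f': 'json'}).encode('utf-8')
    return Request(GT_URL, data)

def get_token(username, password):
    # Performs the actual network call; requires valid AGOL credentials
    with urlopen(build_token_request(username, password)) as resp:
        return json.load(resp)['token']
```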

import os, urllib, urllib2, datetime, arcpy, json


## ============================================================================== ##
## function to update a field - basically converts longs to dates for date fields ##
## since json has dates as a long (milliseconds since unix epoch) and geodb wants ##
## a proper date, not a long.
## ============================================================================== ##
def updateValue(row,field_to_update,value):
    outputfield=next((f for f in fields if f.name ==field_to_update),None)  #find the output field
    if outputfield == None or value == None:        #exit if no field found or empty (null) value passed in
        return
    if outputfield.type == 'Date':
        if value > 0 :                                            # filter "zero" dates
            value = datetime.datetime.fromtimestamp(value/1000)   # convert to date - this is local time, to use utc time
            row.setValue(field_to_update,value)                   # change "fromtimestamp" to "utcfromtimestamp"
    else:
        row.setValue(field_to_update,value)
    return
## ============================================================================== ##

### Generate Token ###
gtUrl = 'https://www.arcgis.com/sharing/rest/generateToken'
gtValues = {'username' : '<YOUR_USERNAME>',
'password' : '<YOUR_PASSWORD>',
'referer' : 'http://www.arcgis.com',
'f' : 'json' }
gtData = urllib.urlencode(gtValues)
gtRequest = urllib2.Request(gtUrl, gtData)
gtResponse = urllib2.urlopen(gtRequest)
gtJson = json.load(gtResponse)
token = gtJson['token']

### Create Replica ###
### Update service url HERE ###
crUrl = 'http://services1.arcgis.com/1AlElnGrgBM62OSj/arcgis/rest/services/<YOUR_FC>/FeatureServer/CreateReplica'

crValues = {'f' : 'json',
'layers' : '0',
'returnAttachments' : 'true',
'token' : token }
crData = urllib.urlencode(crValues)
crRequest = urllib2.Request(crUrl, crData)
crResponse = urllib2.urlopen(crRequest)
crJson = json.load(crResponse)
replicaUrl = crJson['URL']
urllib.urlretrieve(replicaUrl, 'myLayer.json')

### Get Attachment ###
cwd = os.getcwd()
with open('myLayer.json') as data_file:
    data = json.load(data_file)

for x in data['layers'][0]['attachments']:
    gaUrl = x['url']
    gaFolder = cwd + '\\photos\\' + x['parentGlobalId']
    if not os.path.exists(gaFolder):
        os.makedirs(gaFolder)
    gaName = x['name']
    gaValues = {'token' : token }
    gaData = urllib.urlencode(gaValues)
    urllib.urlretrieve(url=gaUrl + '/' + gaName, filename=os.path.join(gaFolder, gaName),data=gaData)

### Create Features ###
rows = arcpy.InsertCursor(cwd + '/data.gdb/myLayer')
fields = arcpy.ListFields(cwd + '/data.gdb/myLayer')

for cfX in data['layers'][0]['features']:
    pnt = arcpy.Point()
    pnt.X = cfX['geometry']['x']
    pnt.Y = cfX['geometry']['y']
    row = rows.newRow()
    row.shape = pnt

    ### Set Attribute columns HERE ###
    ## makes use of the "updatevalue function to deal with dates ##

    updateValue(row,'OBJECTID', cfX['attributes']['OBJECTID'])
    updateValue(row,'Route_Code', cfX['attributes']['Route_Code'])
    updateValue(row,'Primary_Route', cfX['attributes']['Primary_Route'])
    updateValue(row,'Street_Name', cfX['attributes']['Street_Name'])
    updateValue(row,'Structure_Type', cfX['attributes']['Structure_Type'])
    updateValue(row,'Pole_Designation_Number', cfX['attributes']['Pole_Designation_Number'])
    updateValue(row,'Visible_Cracks', cfX['attributes']['Visible_Cracks'])
    updateValue(row,'Number_Of_Cracks', cfX['attributes']['Number_Of_Cracks'])
    updateValue(row,'Longest_Crack_Length', cfX['attributes']['Longest_Crack_Length'])
    updateValue(row,'Pole_Multi_Sided', cfX['attributes']['Pole_Multi_Sided'])
    updateValue(row,'Number_Of_Sides', cfX['attributes']['Number_Of_Sides'])
    updateValue(row,'Number_Of_Anchor_Bolts', cfX['attributes']['Number_Of_Anchor_Bolts'])
    updateValue(row,'Anchor_Bolt_Cracked', cfX['attributes']['Anchor_Bolt_Cracked'])
    updateValue(row,'Nuts_Loose', cfX['attributes']['Nuts_Loose'])
    updateValue(row,'Lock_Washers_OK', cfX['attributes']['Lock_Washers_OK'])
    updateValue(row,'FollowUp_Status', cfX['attributes']['FollowUp_Status'])
    updateValue(row,'FollowUp_Comments', cfX['attributes']['FollowUp_Comments'])
    updateValue(row,'FollowUp_Complete', cfX['attributes']['FollowUp_Complete'])
    updateValue(row,'District', cfX['attributes']['District'])
    updateValue(row,'County', cfX['attributes']['County'])
    updateValue(row,'Latitude', cfX['attributes']['Latitude'])
    updateValue(row,'Longitude', cfX['attributes']['Longitude'])
    updateValue(row,'NLFID', cfX['attributes']['NLFID'])
      
    # leave GlobalID out - you cannot edit this field in the destination geodb

    #comment out below fields if you don't have them in your online or destination geodb (editor tracking)
    updateValue(row,'CreationDate', cfX['attributes']['CreationDate'])
    updateValue(row,'Creator', cfX['attributes']['Creator'])
    updateValue(row,'EditDate', cfX['attributes']['EditDate'])
    updateValue(row,'Editor', cfX['attributes']['Editor'])

    updateValue(row,'GlobalID_str', cfX['attributes']['GlobalID'])

    rows.insertRow(row)

del row
del rows

### Add Attachments ###
### Create Match Table ###
rows = arcpy.InsertCursor(cwd + '/data.gdb/MatchTable')

for cmtX in data['layers'][0]['attachments']:
    row = rows.newRow()

    row.setValue('GlobalID_Str', cmtX['parentGlobalId'])
    row.setValue('PhotoPath', cwd + '\\photos\\' + cmtX['parentGlobalId'] + '\\' + cmtX['name'])

    rows.insertRow(row)

del row
del rows

### Add Attachments ###
arcpy.AddAttachments_management(cwd + '/data.gdb/myLayer', 'GlobalID_Str', cwd + '/data.gdb/MatchTable', 'GlobalID_Str', 'PhotoPath')
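If the replica JSON is no longer around, the (GlobalID_Str, PhotoPath) pairs for the match table can also be rebuilt from the photos/&lt;globalid&gt;/&lt;name&gt; folder layout the script creates.  A small pure-Python sketch (build_match_rows is a hypothetical helper, not part of the script above):

```python
import os

def build_match_rows(photo_root):
    """Return (GlobalID_Str, PhotoPath) pairs from photo_root/<globalid>/<file>."""
    rows = []
    for gid in sorted(os.listdir(photo_root)):
        folder = os.path.join(photo_root, gid)
        if not os.path.isdir(folder):
            continue  # skip stray files at the top level
        for name in sorted(os.listdir(folder)):
            rows.append((gid, os.path.join(folder, name)))
    return rows
```

The resulting pairs are exactly what AddAttachments_management expects in the match table.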