Is there a way to store photos other than BLOB format?

01-31-2017 10:05 AM
LeeBrannon
New Contributor III

When I take photos in the Collector app and then export my data from AGOL as a file geodatabase, the photos come down in BLOB format, and I have found no way to access them or to see them attached to the features they were collected with outside of AGOL. The BLOB format seems to have my photos locked up with no way to view them except in AGOL. The exported (from AGOL) FGDB contains the feature class with my collected features, plus an attachment table that lists my JPEG photos, plus the relationship class - see attached image.

What am I doing wrong?  I'm using a hosted feature layer in AGOL.  Would this work better if I used a feature service from my own ArcGIS Server?
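For readers who arrive here with only the exported FGDB in hand: an attachment table stores each file's raw bytes in a DATA (BLOB) field alongside an ATT_NAME field, so the photos can be written back out with a short arcpy script. A minimal sketch, assuming a standard attachment table layout; the table and folder paths shown are hypothetical:

```python
import os

def save_blob(blob_bytes, folder, name):
    """Write one attachment's raw bytes to disk and return the file path."""
    os.makedirs(folder, exist_ok=True)
    path = os.path.join(folder, name)
    with open(path, "wb") as f:
        f.write(blob_bytes)
    return path

def export_attachments(attach_table, out_root):
    """Dump every BLOB in a FGDB attachment table to out_root (needs arcpy)."""
    import arcpy  # requires an ArcGIS Python installation
    with arcpy.da.SearchCursor(attach_table, ["ATT_NAME", "DATA"]) as cursor:
        for att_name, data in cursor:
            # DATA arrives as a memoryview; tobytes() yields the raw JPEG
            save_blob(data.tobytes(), out_root, att_name)

# e.g. export_attachments(r"C:\data\export.gdb\Poles__ATTACH", r"C:\data\photos")
```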

1 Solution

Accepted Solutions
ScottFierro2
Occasional Contributor III

We ran into this too. If you are comfortable with some Python, the script below will work. It brings everything down to your FGDB and creates folders named by GUID, with the appropriate JPEGs extracted inside them.

Here are the instructions for the Python tool:

1) Create a local file geodatabase to hold the data and attachments you want to download from ArcGIS Online (called data.gdb in the script).

2) Create a feature class (called myLayer in the script), enable attachments, and add GlobalIDs.

3) Add the following field to the feature class: GlobalID_str, text, length: 50.

4) Create a table called MatchTable (called MatchTable in the script).

5) Add the following fields to the MatchTable table:
    - GlobalID_Str, text, length: 50
    - PhotoPath, text, length: 255

6) Enable "Sync" on the hosted feature service (http://resources.arcgis.com/en/help/arcgisonline/index.html#//010q000000n0000000)

7) Open the AGOL_pullFeatures script in a text editor and modify the following:
    - ArcGIS Online username/password
    - REST URL of the feature service to pull from
    - path to and name of the local file geodatabase
    - fields to pull from the hosted feature service (must match the local feature class)
    - name of the local feature class (this will hold the data from the hosted service and the attachments)
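The posted script targets Python 2 (urllib/urllib2). For reference, the token step from item 7 can be sketched in Python 3's standard library as follows; the XXX credentials are placeholders, exactly as in the original:

```python
import urllib.parse
import urllib.request

def build_token_request(username, password, referer="https://www.arcgis.com"):
    """Return (url, POST body bytes) for an AGOL generateToken call."""
    url = "https://www.arcgis.com/sharing/rest/generateToken"
    body = urllib.parse.urlencode({
        "username": username,
        "password": password,
        "referer": referer,
        "f": "json",
    }).encode("utf-8")
    return url, body

# Usage (performs a live request):
# import json
# url, body = build_token_request("XXX", "XXX")
# with urllib.request.urlopen(urllib.request.Request(url, body)) as resp:
#     token = json.load(resp)["token"]
```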

import os, urllib, urllib2, datetime, arcpy, json


## ============================================================================== ##
## function to update a field - basically converts longs to dates for date fields ##
## since json has dates as a long (milliseconds since unix epoch) and geodb wants ##
## a proper date, not a long.
## ============================================================================== ##
def updateValue(row,field_to_update,value):
    outputfield = next((f for f in fields if f.name == field_to_update), None)  # find the output field
    if outputfield is None or value is None:        # exit if no field found or a null value was passed in
        return
    if outputfield.type == 'Date':
        if value > 0 :                                            # filter "zero" dates
            value = datetime.datetime.fromtimestamp(value/1000)   # convert to date - this is local time, to use utc time
            row.setValue(field_to_update,value)                   # change "fromtimestamp" to "utcfromtimestamp"
    else:
        row.setValue(field_to_update,value)
    return
## ============================================================================== ##

### Generate Token ###
gtUrl = 'https://www.arcgis.com/sharing/rest/generateToken'
gtValues = {'username' : 'XXX',
'password' : 'XXX',
'referer' : 'http://www.arcgis.com',
'f' : 'json' }
gtData = urllib.urlencode(gtValues)
gtRequest = urllib2.Request(gtUrl, gtData)
gtResponse = urllib2.urlopen(gtRequest)
gtJson = json.load(gtResponse)
token = gtJson['token']

### Create Replica ###
### Update service url HERE ###
crUrl = 'http://services1.arcgis.com/XXX/arcgis/rest/services/FeatureLayerName/FeatureServer/CreateReplica'

crValues = {'f' : 'json',
'layers' : '0',
'returnAttachments' : 'true',
'token' : token }
crData = urllib.urlencode(crValues)
crRequest = urllib2.Request(crUrl, crData)
crResponse = urllib2.urlopen(crRequest)
crJson = json.load(crResponse)
replicaUrl = crJson['URL']
urllib.urlretrieve(replicaUrl, 'myLayer.json')

### Get Attachment ###
cwd = os.getcwd()
with open('myLayer.json') as data_file:
    data = json.load(data_file)

for x in data['layers'][0]['attachments']:
    gaUrl = x['url']
    gaFolder = cwd + '\\photos\\' + x['parentGlobalId']
    if not os.path.exists(gaFolder):
        os.makedirs(gaFolder)
    gaName = x['name']
    gaValues = {'token' : token }
    gaData = urllib.urlencode(gaValues)
    urllib.urlretrieve(url=gaUrl + '/' + gaName, filename=os.path.join(gaFolder, gaName),data=gaData)

### Create Features ###
rows = arcpy.InsertCursor(cwd + '/data.gdb/myLayer')
fields = arcpy.ListFields(cwd + '/data.gdb/myLayer')

for cfX in data['layers'][0]['features']:
    pnt = arcpy.Point()
    pnt.X = cfX['geometry']['x']
    pnt.Y = cfX['geometry']['y']
    row = rows.newRow()
    row.shape = pnt

    ### Set Attribute columns HERE ###
    ## uses the updateValue function to deal with dates ##

    updateValue(row,'OBJECTID', cfX['attributes']['OBJECTID'])
    updateValue(row,'Route_Code', cfX['attributes']['Route_Code'])
    updateValue(row,'Primary_Route', cfX['attributes']['Primary_Route'])
    updateValue(row,'Street_Name', cfX['attributes']['Street_Name'])
    updateValue(row,'Structure_Type', cfX['attributes']['Structure_Type'])
    updateValue(row,'Pole_Designation_Number', cfX['attributes']['Pole_Designation_Number'])
    updateValue(row,'Visible_Cracks', cfX['attributes']['Visible_Cracks'])
    updateValue(row,'Number_Of_Cracks', cfX['attributes']['Number_Of_Cracks'])
    updateValue(row,'Longest_Crack_Length', cfX['attributes']['Longest_Crack_Length'])
    updateValue(row,'Pole_Multi_Sided', cfX['attributes']['Pole_Multi_Sided'])
    updateValue(row,'Number_Of_Sides', cfX['attributes']['Number_Of_Sides'])
    updateValue(row,'Number_Of_Anchor_Bolts', cfX['attributes']['Number_Of_Anchor_Bolts'])
    updateValue(row,'Anchor_Bolt_Cracked', cfX['attributes']['Anchor_Bolt_Cracked'])
    updateValue(row,'Nuts_Loose', cfX['attributes']['Nuts_Loose'])
    updateValue(row,'Lock_Washers_OK', cfX['attributes']['Lock_Washers_OK'])
    updateValue(row,'FollowUp_Status', cfX['attributes']['FollowUp_Status'])
    updateValue(row,'FollowUp_Comments', cfX['attributes']['FollowUp_Comments'])
    updateValue(row,'FollowUp_Complete', cfX['attributes']['FollowUp_Complete'])
    updateValue(row,'District', cfX['attributes']['District'])
    updateValue(row,'County', cfX['attributes']['County'])
    updateValue(row,'Latitude', cfX['attributes']['Latitude'])
    updateValue(row,'Longitude', cfX['attributes']['Longitude'])
    updateValue(row,'NLFID', cfX['attributes']['NLFID'])
    # leave GlobalID out - you cannot edit this field in the destination geodb

    #comment out below fields if you don't have them in your online or destination geodb (editor tracking)
    updateValue(row,'CreationDate', cfX['attributes']['CreationDate'])
    updateValue(row,'Creator', cfX['attributes']['Creator'])
    updateValue(row,'EditDate', cfX['attributes']['EditDate'])
    updateValue(row,'Editor', cfX['attributes']['Editor'])

    updateValue(row,'GlobalID_str', cfX['attributes']['GlobalID'])

    rows.insertRow(row)

del row
del rows

### Add Attachments ###
### Create Match Table ###
rows = arcpy.InsertCursor(cwd + '/data.gdb/MatchTable')

for cmtX in data['layers'][0]['attachments']:
    row = rows.newRow()

    row.setValue('GlobalID_Str', cmtX['parentGlobalId'])
    row.setValue('PhotoPath', cwd + '\\photos\\' + cmtX['parentGlobalId'] + '\\' + cmtX['name'])

    rows.insertRow(row)

del row
del rows

### Add Attachments ###
arcpy.AddAttachments_management(cwd + '/data.gdb/myLayer', 'GlobalID_Str', cwd + '/data.gdb/MatchTable', 'GlobalID_Str', 'PhotoPath')
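The date handling in updateValue above can be sketched as a standalone Python 3 helper. Using fromtimestamp with an explicit UTC tzinfo sidesteps the local-time caveat noted in the script's comments:

```python
import datetime

def ms_to_datetime(ms):
    """Convert an Esri JSON date (milliseconds since the Unix epoch, with
    None/0 meaning empty) to an aware UTC datetime, or None."""
    if ms is None or ms <= 0:
        return None
    return datetime.datetime.fromtimestamp(ms / 1000.0, tz=datetime.timezone.utc)
```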


6 Replies
LeeBrannon
New Contributor III

Thanks, Scott.  I'm still trying to get the script to work without error.  Will the script work as-is with polygons?  Polygons are what I'm testing it on, but I see a reference in the ### Create Features ### section to "row.shape = pnt", i.e. arcpy.Point().

zkovacs
Occasional Contributor III

Lee,

I've seen quite a few Python scripts that try to download content from AGOL, but I like this one best. It creates and downloads a replica of your hosted data (all features and photos) as an FGDB, with much less code than the above, for example. You can then use another script (or combine them if needed) to extract the photos from the attachment table, and perhaps write the extracted photos' paths back to the feature table (as one of my past projects required).

I hope this helps.

Zoltan
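For the last step Zoltan mentions (writing photo paths back to the feature table), one approach is to first build a lookup keyed by GlobalID from the replica JSON's attachments list, which an UpdateCursor could then consult. A sketch, assuming the parentGlobalId/name keys shown in the accepted solution's CreateReplica JSON:

```python
import os

def photo_paths(attachments, root):
    """Group extracted photo paths by parentGlobalId, mirroring the
    GUID-named folder layout the accepted solution's script creates."""
    lookup = {}
    for att in attachments:
        path = os.path.join(root, att["parentGlobalId"], att["name"])
        lookup.setdefault(att["parentGlobalId"], []).append(path)
    return lookup
```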

LeeBrannon
New Contributor III

Thanks for the other option, Zoltan, but I finally got Scott's script working just this morning, so I will focus on that solution.

ScottFierro2
Occasional Contributor III

Ours was a point data set, and that piece builds the records. I can't quote it off the top of my head, but somewhat similar logic should generate and build polygon records instead of point records.

A quick Google search turned up this page, which at the bottom shows the looping logic for building polygons:

http://pro.arcgis.com/en/pro-app/arcpy/get-started/reading-geometries.htm
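For polygons, the replica JSON carries a 'rings' array instead of a single x/y pair, so the point-building part of the script would change along these lines. A sketch only; the arcpy half requires an ArcGIS install, and ring structure is assumed to follow the standard Esri JSON geometry format:

```python
def rings_to_points(geometry):
    """Flatten an Esri JSON polygon geometry ({'rings': [...]}) into lists
    of (x, y) tuples, one list per ring; any z/m values are dropped."""
    return [[(pt[0], pt[1]) for pt in ring] for ring in geometry["rings"]]

def json_to_polygon(geometry):
    """Build an arcpy.Polygon from Esri JSON rings (needs arcpy)."""
    import arcpy  # requires an ArcGIS Python installation
    parts = arcpy.Array([arcpy.Array([arcpy.Point(x, y) for x, y in ring])
                         for ring in rings_to_points(geometry)])
    return arcpy.Polygon(parts)  # would replace "row.shape = pnt"
```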

LeeBrannon
New Contributor III

Scott, I just got the pullFeatures script working on point data, which is great and will probably suffice for our photo-point requirement.  Thanks again!