Append to Hosted Feature Layer in Python API

10-03-2022 12:06 PM
New Contributor II

Hey all, longtime lurker finally encountering enough resistance from the Python implementation to come shouting for help.  I'm using Python 3.6 with the latest ArcPy and ArcGIS Pro.

My company keeps a hosted feature layer of previous work (telecom construction) in our ArcGIS Online organization; my task is to build an automation script to take CAD .dwg files, extract certain polylines, check whether there are duplicates in our existing layer, replace them if so, and finally upload the new polylines to the existing online feature layer.

Everything besides comparing the two layers has gone fine, but I've been stuck for a week on getting the hosted feature layer to cooperate properly.

From my understanding, the proper syntax to pull the layer into Python is as such:

dataitem = gis.content.get(itemid)
flayer = FeatureLayer.fromitem(dataitem)

I then have to convert the CAD feature class to a .gdb, zip it, and use

item_cad = gis.content.add(data, item_properties)

in order for them to be able to append to one another.

I was able to run that once with success, now I get thrown "Exception: item (data) already exists" and the script breaks.
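One workaround for the "item already exists" error is to delete the stale uploaded item first, or to give each upload a unique title. A minimal sketch of the unique-title approach (only the `unique_title` helper runs locally; the search/delete/add lines are commented out and assume an authenticated `gis` connection and the titles used later in the script):

```python
import uuid

def unique_title(prefix):
    # append a short random hex suffix so repeated runs never collide
    return prefix + "_" + uuid.uuid4().hex[:7]

# Hypothetical usage against a live GIS connection:
# for stale in'title:Updated Works',
#                                 item_type='File Geodatabase'):
#     stale.delete()  # clear last run's upload
# item_cad = gis.content.add(
#     data=geo_zip,
#     item_properties={'title': unique_title('Updated_Works'),
#                      'type': 'File Geodatabase', 'tags': 'tag,tag'})
```

Either approach avoids the name clash; the unique title also means a failed run never blocks the next one.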

How can I append the CAD feature class I'm creating to the online feature layer in a way that will allow me to continue and find the duplicates of a unique field?  I've made sure all of the fields are the same, and when I've tried using

arcpy.Append_management(flayer, cad_fc, "NO_TEST")  # I get a runtime error

flayer.append(cad_fc) # I get "Exception: Unable to append data. Append is not enabled. (Error Code: 405)" although the layer is editable in our organization content.

I'll add the relevant code below, please give a shout if you have any ideas!

import os, sys, traceback, datetime, arcpy, shutil
from arcgis.gis import GIS
from arcgis.features import FeatureLayer
arcpy.FeatureClassToGeodatabase_conversion(cad_fc, gdb)

#  Download hosted feature layer
print('Downloading online layer...')
dataitem = gis.content.get(itemid)              
flayer = FeatureLayer.fromitem(dataitem)        
geo_zip = r'path/to/'

#  Compress local gdb with update data to zip
make_archive(gdb, geo_zip)
gdb_properties = {'title': 'Updated Works',
                  'type': 'File Geodatabase',
                  'tags' : 'tag,tag'}
item_cad = gis.content.add(data=geo_zip, item_properties=gdb_properties)

#  Combine then Compare via Job Name (Unique) and delete older Job Name feature if there's a match
print('Comparing to online layer...')

flayer.append(cad_fc) #throws Error(stated above)
arcpy.Append_management(flayer, cad_fc, "NO_TEST") #throws Exception (above)
flayer.append(, upload_format='filegdb', source_table_name='Scratch') #throws a runtime error


5 Replies
New Contributor III

Hi Matthew,

Can I see the complete code you're working with?


New Contributor II



It's a bit long, and I've been making changes up until the Append error to try to fix it, so Line 166 down is untested.  Let me know if anything jumps out at you! I'm still relatively new to arcpy.

# Import packages
print("Loading packages...")
import os, sys, traceback, datetime, arcpy, shutil
from arcgis.gis import GIS, Layer
from arcgis.features import FeatureLayer
print("Packages loaded.")

# Global Variables
itemid = r'73a9b37fcfb048c3b524217dd5dda916' #tell script to update Layer 'Previous Designs'
dwnpath = r'C:/Users/mattl/Desktop/Test_Folder/AGO_Upload/'
gdb = r"C:/Users/mattl/Desktop/Test_Folder/Scratch.gdb"
dir_path = r'C:/Users/mattl/Desktop/Test_Folder/CAD_to_ArcOnline/'

def make_archive(source, destination):
        base = os.path.basename(destination)
        name = base.split('.')[0]
        format = base.split('.')[1]
        archive_from = os.path.dirname(source)
        archive_to = os.path.basename(source.strip(os.sep))
        print(source, destination, archive_from, archive_to)
        shutil.make_archive(name, format, archive_from, archive_to)
        shutil.move('%s.%s'%(name,format), destination)

# Check to see if connected to Network Drive
print('Connecting to network...')
if not os.path.exists('W:/'):
	print('Could not connect to network drive W:/')
	print('Exiting script...')
	raise SystemExit()

# Log into ArcGIS Online
print('Logging into ArcGIS Online...')
user = 'nunya'
password = 'biznis'
try:
	gis = GIS('url', user, password)  #not specifying password will ask for user input
	un =
	print('Logged in as: {}'.format(un))
except Exception as error:
	print('Login Failed')
	print('Exiting script...')
	raise SystemExit()

# Gather CAD .dwg's to Process
print('Finding CAD drawings...')
ext = '.dwg'
dwg_list = []

# 		Tell Arc where to look for files
arcpy.env.workspace = dir_path
#		Put files in list
for file in os.listdir(dir_path):
	if file.endswith(ext):
		counter = file.count('_') # Perform check for two underscores
		if counter == 2:
			dwg_list.append(file) # Add to list of files

print('Drawings compiled.')

# Perform Geoprocessing for each CAD Drawing
arcpy.env.workspace = gdb
#		Clear gdb
fc_list = arcpy.ListFeatureClasses()
for fc in fc_list:
	arcpy.Delete_management(fc)

#		Create a feature layer to hold input feature classes
arcpy.CreateFeatureclass_management(gdb, 'CAD_FC', 'POLYLINE')
arcpy.MakeFeatureLayer_management(gdb + '/CAD_FC', 'cad_fc')

fd = { #field dictionary
	'Date_Uploaded' : ['Date_Upld', 'DATE', 'Date Uploaded', None, None], 
	'OBJECTID' : ['OBJECTID', 'SHORT', 'OID', None, None],
	'GlobalID' : ['GlobalID', 'GUID', 'GlobalID', 38, None],
	'Shape__Length' : ['Shp__Len', 'DOUBLE', 'Shape_Length', None, None],
	'CreationDate' : ['CreateDate', 'DATE', 'CreationDate', 8, None],
	'Creator' : ['Creator', 'TEXT', 'Creator', 128, None], 
	'EditDate' : ['EditDate', 'DATE', 'EditDate', 8, None],
	'Editor' : ['Editor', 'TEXT', 'Editor', 128, None], 
	'Customer' : ['Customer', 'TEXT', 'Customer', 100, None], 
	'Job_Name' : ['Job_Name', 'TEXT', 'Job Name', 256, None], 
	'Design_Stage' : ['Dsgn_Stg', 'TEXT', 'Design Stage', 255, None], 
	'State' : ['State', 'TEXT', 'State', 2, None], 
	'GD_Folder' : ['GD_Folder', 'TEXT', 'GD Folder', 256, None]
}

arcpy.AddFields_management('cad_fc', [fd['CreationDate'], fd['Creator'], fd['EditDate'], fd['Editor'], fd['Customer'], fd['Job_Name'], fd['Design_Stage'], fd['State'], fd['GD_Folder']])

# Iterate through dwg list
arcpy.env.workspace = dir_path
arcpy.env.overwriteOutput = True
query = "Layer LIKE '%PROPOSED%' OR Layer = 'u_CENTURYLINK'"
for file in dwg_list:
	print(file + '...')
	f_name = os.path.splitext(file)[0]
	full_path = dir_path + '/' + f_name.replace(" ", "")
	arcpy.Delete_management(['file_pl', 'file_dissolved'])
	#Parse name for customer, job name, design stage
	new_fields = f_name.split("_")
	customer = str(new_fields[0])
	job = str(new_fields[1])
	design = str(new_fields[2])

	#Extract PROPOSED Polylines
	arcpy.MakeFeatureLayer_management(file + "/Polyline", 'file_pl', query) 

	#merge (dissolve) polylines to one feature
	arcpy.Dissolve_management('file_pl', 'file_dissolved', multi_part="MULTI_PART", unsplit_lines="DISSOLVE_LINES")
	#delete fields from attribute table
	arcpy.DeleteField_management('file_dissolved', ['Entity', 'Handle', 'Layer', 'LyrFrzn', 'LyrOn', 'Color', 'Linetype', 'Elevation', 'LineWt', 'RefName', 'DocUpdate', 'DocId', 'X3', 'X2', 'X1'], "DELETE_FIELDS")

	#add and populate new fields	
	arcpy.AddFields_management('file_dissolved', [fd['CreationDate'], fd['Creator'], fd['EditDate'], fd['Editor'], fd['Customer'], fd['Job_Name'], fd['Design_Stage'], fd['State'], fd['GD_Folder']])
	arcpy.CalculateField_management('file_dissolved', "Customer", "'"+customer+"'")
	arcpy.CalculateField_management('file_dissolved', "Job_Name", "'"+job+"'")
	arcpy.CalculateField_management('file_dissolved', "Dsgn_Stg", "'"+design+"'")
	#add to the new layer
	arcpy.Append_management('file_dissolved', 'cad_fc', "NO_TEST")

	#clear variables and gdb
	arcpy.Delete_management(['file_pl', 'file_dissolved'])
	del full_path, new_fields, customer, job, design#, file_pl, file_dissolved, file_clean
	print(file + " done.")
print('Processing complete.')

#  Reset workspace to .gdb
arcpy.env.workspace = gdb

# Impart Spatial Reference/Coordinate system 
sr = arcpy.SpatialReference(4326) #WGS 1984
#arcpy.FeatureClassToGeodatabase_conversion('cad_fc', gdb)

#  Download hosted feature layer
print('Downloading online layer...')
dataitem = gis.content.get(itemid)
#raise SystemExit  #debug stop
#flayer = dataitem.layers[0].url
flayer = FeatureLayer.fromitem(dataitem)
print('Download complete.')						 #
geo_zip = r'C:/Users/mattl/Desktop/Test_Folder/AGO_Upload/Archive/'

#  Update online filegdb
old_gdb ='title:xyz, owner:' +, item_type='File Geodatabase')
print(old_gdb, type(old_gdb))
make_archive(gdb, geo_zip)
gdb_properties = {'title': 'Updated Works',
                  'type': 'File Geodatabase', 
                  'tags' : 'tag, tag'}
item_cad = gis.content.add(data=geo_zip, item_properties=gdb_properties)

#  Combine then Compare via Job Name (Unique) and delete older Job Name feature if there's a match
print('Comparing to online layer...')
#arcpy.Append_management(flayer, 'cad_fc', "NO_TEST") #online layer added to end of new layer's features           ###### Keeps throwing an error at me. 
status = flayer.append(, upload_format='filegdb', source_table_name='CAD_FC')

compare_fc = gdb + '/CAD_COPY'
arcpy.Delete_management(compare_fc) # clears last test run, remove from final script
arcpy.CopyFeatures_management('cad_fc', compare_fc)
arcpy.CalculateField_management(compare_fc, 'Job_Name', '!Job_Name!.replace(" ", "").lower()', "PYTHON3")
duplicates = gdb + '/duplicates'
arcpy.FindIdentical_management(compare_fc, duplicates, ['Job_Name'], output_record_option="ONLY_DUPLICATES")

if int(arcpy.GetCount_management(duplicates).getOutput(0)) > 0:
	#create index of duplicate features, 'IN_FID' being the OID of duplicate in online layer
	duplicate_index = []
	with arcpy.da.SearchCursor(duplicates, ['IN_FID']) as index:
		for x in index:
			duplicate_index.append(x[0])
	#remove duplicate features by index list
	with arcpy.da.UpdateCursor('cad_fc', ['OID@']) as features:
		for x in features:
			if x[0] in duplicate_index:
				features.deleteRow()
final_fc = 'cad_fc'


# Overwrite hosted feature layer
#		save to upload folder
print('Saving to upload folder...')
outLayerFile = dwnpath + 'Previous_Designs_' + str( + '.lyrx'
arcpy.SaveToLayerFile_management(final_fc, outLayerFile, "ABSOLUTE")

print('Updating online layer...')


# Move and Zip Processed .dwg's to 'Processed' Folder with date in name
print('Cleaning folders...')
zip_name = '_Processed_' + str(
dest = dir_path + zip_name
for file in dwg_list:
	full_path = dir_path + file
	shutil.move(full_path, dest)
shutil.make_archive(zip_name, 'zip', dir_path)

del dest, full_path

dest = '_Previous_Designs_' + str(
full_path = dwnpath + dest
shutil.move(outLayerFile, full_path)
shutil.make_archive(full_path, 'zip', dwnpath)


print('Emptying .gdb...')
arcpy.Delete_management(in_data=['cad_fc', final_fc, compare_fc])
print('Script successful.  Exiting...')
raise SystemExit()


Occasional Contributor III

Appending with the ArcGIS API for Python is hit and miss, more miss when I try! The community boards are littered with issues. You can use the arcpy Append geoprocessing tool as shown below.

import arcpy
from arcgis import GIS

cad_fc = "PATH/TO/FeatureClass"

## connect to AGOL
agol = GIS("home")

## get the feature service that contains the layer
item = agol.content.get("ITEM_ID")

## get the specific layer you want to append to
## NOTE: you need the url of a layer to add to most GP tools
lyr = item.layers[0].url

## append data
## cad_fc is your input data, the lyr is the target dataset to append into
arcpy.Append_management(cad_fc, lyr, "NO_TEST")
~ Mapping my way to retirement
New Contributor II

Thanks Club!

I hadn't found that method of getting the layer, interesting.  The documentation is dense and the other Append issues I found on the boards didn't seem to apply, though there are quite a few. 

I tried lyr = item.layers[0].url, and was greeted with another surprise exception! How nice.

Investigating my cad_fc type I realized it was a 'Result' instead of a FeatureLayer, which could have been an issue, but changing that hasn't fixed my append issue.

This time the error is 'Dataset [url] does not exist or is not supported' although I'm sure it exists, as I'm looking right at it.  Must mean it isn't supported for some reason.

I did learn in further digging that I'll need to contact the admin and request they "Allow Append" through the REST API in order to open the online layer up to alterations, so maybe that continues to be the issue.  Will report back once/if that gets done!
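For reference, hosted feature layers advertise that capability as `supportsAppend` in the service definition, and an owner/admin can flip it with `update_definition`. A sketch (only the pure `append_disabled` check runs locally; the live calls are commented out and assume an authenticated `flayer` plus owner/admin rights):

```python
def append_disabled(layer_props):
    # True when the layer's service definition does not advertise append support
    return not layer_props.get("supportsAppend", False)

# Hypothetical usage against a live FeatureLayer:
# props = dict(
# if append_disabled(props):
#     flayer.manager.update_definition({"supportsAppend": True})  # owner/admin only
```

That would at least confirm whether the 405 is the service definition rather than the payload.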

New Contributor III

My guess is that the problem is that when you upload an item using gis.content.add(), you're not deleting the zipped geodatabase item when you're done.  If I recall correctly, the AGOL portal doesn't let you have multiple items with the same name.  So when I use append() I'm always sure to give the uploaded item a randomly generated name, and to make sure I delete that item once the append is finished.

Here's an example, slightly different than what you're doing, I used a shapefile instead of geodatabase, but hopefully the logic makes sense.  Line 38 specifically is what I'm suggesting you try:



# make a bunch of edits to features in the feature layer @ {url}
# save these edited features to a shapefile
# zip shapefile and upload to AGOL
# do the append
# cleanup temp files + item we uploaded

import arcgis
import os
import tempfile
import shutil
import uuid

username = ""
password = ""
gis = arcgis.gis.GIS("", username, password)

url = ""
f_lyr = arcgis.features.FeatureLayer(url, gis)
f_set = f_lyr.query()
features = f_set.features

# placeholder logic, just doing some editing here
for feature in features:
    feature.set_value("some_field", "some value")

# use "with" so temp directories delete themselves
with tempfile.TemporaryDirectory() as datadir:
    # Save the arcgis.features.FeatureSet to a shapefile in datadir, "shapefile")

    with tempfile.TemporaryDirectory() as zipdir:
        # Zip the shapefile
        zipped_path = os.path.join(zipdir, "shapefilezip")
        zipped_path_full = zipped_path + ".zip"
        shutil.make_archive(zipped_path, "zip", datadir)

        # Upload the shapefile with unique title to avoid conflicts
        item_properties = {
            "title": "temp_shp_" + uuid.uuid4().hex[:7],
            "tags": "temp",
            "description": "temp",
            "type": "Shapefile",
        }
        upload_item = gis.content.add(item_properties, zipped_path_full)

        # upsert the shapefile item to f_lyr
        result = f_lyr.append(
  , upload_format="shapefile", rollback=True
        )

        # delete the zipped shapefile we uploaded
        upload_item.delete()


Edit: on second thought, I may have been barking up the wrong tree, since your code would be failing at the add() if it was a naming issue, not the append().  So maybe the problem is that there's a field mapping conflict between your source table and destination layer.  I would experiment with the append_fields and/or field_mappings parameters; see if you can get it to work with just geometry and no fields.

Also based on the error you're getting "append is not enabled," I'm guessing either a) you need to disable sync, or b) if sync is already disabled, you need to enable append manually in the layer's service definition.  See the documentation:

Another piece of advice is that you can use FeatureLayer.edit_features() instead of append() to insert features.  I've found that edit_features() is faster than my implementation of append() until about 1500-2000 features, but the documentation correctly notes that it's less reliable over ~250 features.  I do generally prefer using edit_features because it's much easier to write and you don't have to worry about the append/sync stuff, but it's worth breaking up larger requests into 250 features at a time.
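The ~250-feature batching can be done with a small chunking helper (a sketch; the `edit_features` call is commented out and assumes a live `f_lyr` and a list of arcgis `Feature` objects):

```python
def batched(seq, size=250):
    # yield successive slices of at most `size` items
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# Hypothetical usage with a live FeatureLayer:
# for chunk in batched(features, 250):
#     result = f_lyr.edit_features(adds=chunk)
```

Keeping each request small also makes retries cheap when one batch fails.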
