Create a duplicate hosted feature layer with Python

03-14-2023 06:31 AM
RichardHowe
Occasional Contributor III

Hi, I would like to create an exact replica (with a different name) of a hosted feature service (with multiple feature layers) within the same ArcGIS Online portal via Python. It doesn't seem like a massive ask, and it's something I can do manually fairly easily.

The documentation here seemingly gives me two options, but...

When I use the item.copy() method on a hosted feature layer, then what I am left with is just a feature layer (it loses its "hosted" status and I can't enable editing on it etc.) which is no use to me.

When I use the item.copy_feature_layer_collection method, I can only get the script to work when I specify the first layer within it (i.e. layers=[0]); anything else throws an error. In reality I might not know how many layers I wish to copy each time I run the script, so I'd like to specify "all". The documentation implies that the layers parameter is optional (not true; it fails without it) and also that it accepts a comma-separated list, e.g. (as per the documentation) layers="1,4,5,8" (also not true).

Please can anyone suggest the way to do this simply?

8 Replies
PeterKnoop
MVP Regular Contributor

Assuming you want to copy the hosted feature layer's data too, I would suggest using clone_items to duplicate a hosted feature layer when you want a full copy.

If you want an exact replica, and your data has globalIDs, then make sure to set the optional parameter copy_global_ids to True.

(I think -- the documentation is not clear -- that item.copy_feature_layer_collection is intended for copying the schema, but not the data, so you end up with an exact replica of a hosted feature layer, minus its data. I have also found that layers and tables are not optional parameters; however, you can create those lists programmatically from the layers and tables properties of the original hosted feature layer, so you don't need to know what they are ahead of time.)
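For reference, a minimal sketch of the clone_items approach (the portal URL, credentials, and item ID are placeholders; search_existing_items=False is my assumption here, to force a fresh copy when cloning within the same portal):

```python
from arcgis.gis import GIS

# Connect to the portal (credentials are placeholders).
gis = GIS("https://www.arcgis.com", "username", "password")

# Placeholder item ID for the hosted feature layer to duplicate.
source_item = gis.content.get("item_id_of_your_hosted_feature_layer")

# Clone within the same portal. copy_data=True copies the features as well
# as the schema; copy_global_ids=True preserves existing GlobalIDs.
cloned = gis.content.clone_items(
    items=[source_item],
    copy_data=True,
    copy_global_ids=True,
    search_existing_items=False,
)
```

This is a sketch rather than a tested recipe; it cannot run without a live portal connection, so check the parameter defaults against your version of the ArcGIS API for Python.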

RichardHowe
Occasional Contributor III

@PeterKnoop Thanks for the help. I saw clone_items referenced, but only in the process of copying data from one portal to another. I couldn't see a way to change the name of the new item, so my guess was that it would fail if I used the same portal as both input and destination.

I don't actually need the data copied; these are effectively empty templates, so it seems I was correct in my theory of using copy_feature_layer_collection. I am glad you too found issues with the documentation (it's not just me!). Any assistance showing how I would pass the layer and table ID info of the template feature layer to the parameters of this tool would be much appreciated.

PeterKnoop
MVP Regular Contributor

@RichardHowe you can get lists of the hosted feature layer's feature layers' and tables' integer indexes by doing something like the following:

# Get your hosted feature layer as an item using its itemID.
item = gis.content.get('item_id_of_your_hosted_feature_layer')

# Generate lists of the layers' and tables' index integers for the hosted feature layer
# you are copying. A hosted feature layer item has properties for the lists of URLs
# for its feature layers and tables (e.g., item.layers and item.tables).
# Each entry in those lists has an ID property that reflects its integer index.
layer_ids = [layer.properties.id for layer in item.layers]
table_ids = [table.properties.id for table in item.tables]

# Create a copy of your hosted feature layer's schema, inclusive of all layers and tables.
# (If the hosted feature layer does not contain any feature layers or tables, then it
# is okay to pass the empty lists calculated above to those parameters.)
copy = item.copy_feature_layer_collection(
    service_name="copy of my hosted feature layer",
    layers=layer_ids,
    tables=table_ids
)

Note that clone_items() will generate a second item with the same item name as the original hosted feature layer, as there is no rule against having two items with the same name. The underlying feature service for the cloned item, however, will have a different URL, derived from the original, with a unique identifier appended, as there is a rule against having two feature services with the same name.

item.copy_feature_layer_collection(), however, will use the specified service_name for both the copy's item name and the feature service name, as long as there isn't already a feature service using that name.
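If you want to fail fast when the desired name is already taken, the ContentManager offers a check. A sketch, assuming the `gis`, `item`, `layer_ids`, and `table_ids` variables from the snippet above, with a placeholder service name:

```python
service_name = "copy of my hosted feature layer"  # placeholder

# True only if no feature service in the org already uses this name.
if gis.content.is_service_name_available(service_name, "featureService"):
    copy = item.copy_feature_layer_collection(
        service_name=service_name,
        layers=layer_ids,
        tables=table_ids
    )
else:
    print(f"A feature service named '{service_name}' already exists.")
```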


RichardHowe
Occasional Contributor III

@PeterKnoop That's amazing! Thank you. You've saved me going around the houses figuring that part out for myself.

Intriguingly, the script started throwing a "list index out of range" error when I added your code, and I think (no matter what the layer IDs are in the source feature layer) it resets them to count sequentially from zero in the output. Using a len() calculation on your layer_ids and table_ids and creating a range based on them worked though 👌🏻
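For what it's worth, the range-based workaround described above can be sketched like this. A plain-Python stub stands in for the real arcgis Item (whose layers and tables properties need a live portal), so the index logic can be seen on its own:

```python
# Stand-in for an arcgis Item; with a real item you would use
# item.layers and item.tables directly.
class StubItem:
    layers = ["layer0", "layer1", "layer2"]  # three feature layers
    tables = ["table0"]                      # one table

item = StubItem()

# Sequential indices counting from zero, regardless of the IDs
# the layers carry in the source feature layer.
layer_ids = list(range(len(item.layers)))  # [0, 1, 2]
table_ids = list(range(len(item.tables)))  # [0]
```

Note the caution in the next reply: layer and table indices in a hosted feature layer are not guaranteed to be sequential, so this workaround may not hold for every service.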

PeterKnoop
MVP Regular Contributor

@RichardHowe I would be careful calculating the list indices for the hosted feature layer to be copied, as the layer or table indices are not guaranteed to always be sequential like that, and are likely not sequential when it has both layers and tables.

The snippet I provided works fine across a number of scripts and different hosted feature layers, so I would suggest taking a closer look to figure out what is causing that error. There may be something amiss elsewhere in your code.

CassKalinski
Occasional Contributor

Getting the same IndexError when I run the code snippet, with no other code than logging into the GIS. I put a couple of print statements in for the IDs and they came back with the expected IDs: 3 feature layers and a table, numbered 0-3.
LYR IDS: [0, 1, 2]
TBL IDS: [3]

If I change the table ID list to [0], I get a different error:
Exception: Unable to add feature service definition.
Invalid definition for System.Collections.Generic.List`1[ESRI.ArcGIS.SDS.Metadata.LayerCoreInfo]
Exception has been thrown by the target of an invocation.
(Error Code: 400)

RichardHowe
Occasional Contributor III

@CassKalinski I never got around this issue and have now been using clone_items() quite happily instead for several months. It might be worth your while having a play with that instead.

CassKalinski
Occasional Contributor

@RichardHowe thanks! Yes, clone_items() worked well. Other team members using clone for system migrations have had issues with data integrity, so I was skeptical of using it at first. Works fine in this context though.