I have a script tool using the Python API that receives a list of URLs to zip files on a server, each containing multiple shapefiles. Each zip file contains exactly the same shapefiles, with the same names and schemas. I would like to publish all the shapefiles to a single Hosted Feature Service in Portal, but I cannot find any way to publish multiple files together, or to merge the shapefiles or Hosted Feature Services once created. Is there a supported workflow for doing this?
At the moment I am doing this to end up with one Hosted Feature Service per shapefile:
for zip_Url in zip_Url_List:
    shp_file = gis.content.add(item_Properties, zip_Url)
    published_service = shp_file.publish()
    shp_file.delete()
I found the append() function on a Feature Layer in the API documentation, but it seems this is only available for AGOL.
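For reference, append() works against an item that has already been uploaded to the GIS; you pass its item ID and tell the service what format to parse. A minimal sketch of how it might be wired up, assuming an environment where append() is supported (the helper name and arguments shown are illustrative, not a confirmed recipe):

```python
# Hedged sketch: append features from an already-uploaded zipped-shapefile
# item into an existing hosted FeatureLayer via FeatureLayer.append()
# (ArcGIS API for Python). The helper name is mine; the layer and item
# come from your own Portal/AGOL connection.

def append_shapefile_item(layer, item_id):
    """Append features from the uploaded shapefile item `item_id`
    into `layer` (an arcgis.features.FeatureLayer)."""
    return layer.append(
        item_id=item_id,            # ID of the uploaded zipped-shapefile item
        upload_format="shapefile",  # how the service should parse the item
        upsert=False,               # plain inserts, no matching on existing rows
    )
```

In the loop from the question, this would mean publishing the first zip normally, then uploading each remaining zip with gis.content.add() and appending it into the first service's layer instead of publishing it.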
The only workaround I can see is to download all the zip files locally, unzip them, merge the shapefiles with the same names using arcpy, zip the results again, add the zipped shapefiles, and publish the service. However, this could add very long processing times, since some of the shapefiles can be quite big. This tool is meant to run on a button click in a web application, so performance is important: the user shouldn't be kept waiting any longer than necessary.
Thank you in advance for any recommendations
I recently created a tool that will append features from a shapefile/feature class to a hosted feature service (AGOL or Portal):
Not sure if it will be helpful, but you may be able to incorporate this functionality into your workflow.
Thanks for your suggestion and for providing your code! It looks well implemented, and I'm sure I'll be able to use some of the logic in future projects. In this case, though, since we receive a zip with multiple shapefiles from the file server, we would have to download and unzip the files anyway before running them through your script. So we've decided to simply run arcpy.Merge_management() on the shapefiles before zipping them again and using the API, as in my code above, to publish a single HFS. That workflow runs very fast, and setting up cleanup was easy. It may be a little old school, but it works fine!