A number of organisations have an isolated internal ArcGIS Server deployment and wish to set up an automated workflow for overwriting a subset of their datasets to hosted feature services in their ArcGIS Online account, which then flow through to their ArcGIS Open Data site.
I have seen a number of scripts out there, but is there a preferred approach or a recommended set of scripts? This would be a one-way push of data, and simply overwriting the service should suffice. I believe overwriting does not change the item's ID, so it should not have any undesirable knock-on effects such as dataset URLs changing in the Open Data site.
Ideally I'm looking for base scripts that can be easily adapted and reused across different organisations.
Semi-related, but I would also like to know the pros and cons of publishing each layer as a separate hosted feature service versus grouping layers into categories and publishing each group as one hosted feature service.
Some related links:
Hi, this seems like a perfect use case for our new Python API, and in fact it's a very similar scenario to some of the examples shared as notebooks in its documentation. If you're not familiar with them, Jupyter notebooks are environments where you write your Python code, run it, and view the results all on the same screen. You can think of a notebook as your script, text editor, and terminal in one place. Another benefit of the notebook workflow is that notebooks can easily be modified as needed and shared as a single file.
Having said that, the Jupyter Notebook integration with the Python API is completely optional. You could export your notebook to a .py file, since it's just standard Python code, or write it in your text editor of choice from the start. That would let you automate running the script while still benefiting from the perks of the modern API.
You linked to the guide for the API in your post. Here is one of the notebooks in the documentation that you should also take a look at.
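To give a feel for it, here is a minimal sketch of the overwrite step using the ArcGIS API for Python (the `arcgis` package). The item ID, credentials, and data path are placeholders, and `overwrite()` assumes the uploaded file matches the schema and file name of the originally published source; treat this as a starting point rather than a finished script.

```python
"""Sketch: one-way push from an internal data export to an ArcGIS Online
hosted feature layer. All IDs, paths, and credentials are placeholders."""


def overwrite_hosted_layer(gis, item_id, data_path):
    """Overwrite the data behind a hosted feature layer item.

    overwrite() replaces the service's data in place, so the item ID --
    and therefore any Open Data dataset URLs -- stays the same.
    """
    # Deferred import so this module can be read/imported even where the
    # arcgis package is not installed.
    from arcgis.features import FeatureLayerCollection

    item = gis.content.get(item_id)
    flc = FeatureLayerCollection.fromitem(item)
    # data_path points at e.g. a zipped file geodatabase exported from the
    # internal ArcGIS Server environment.
    return flc.manager.overwrite(data_path)


# Usage (placeholder values):
#   from arcgis.gis import GIS
#   gis = GIS("https://www.arcgis.com", "username", "password")
#   overwrite_hosted_layer(gis, "<hosted-layer-item-id>",
#                          r"C:\exports\parcels.gdb.zip")
```

Once exported to a .py file, a script like this can be run on a schedule (Windows Task Scheduler, cron, etc.) to automate the one-way push.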