Hi,
I am using ArcGIS API for Python version 2.4.0 in Databricks and was trying to publish a large number of records from our Databricks environment to ArcGIS Enterprise (10.9.1). We created a ticket for this and got it working with a workaround.
I tried these methods to publish:
Overwriting feature layers | ArcGIS API for Python
Appending Features | ArcGIS API for Python
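For context, the overwrite workflow from that documentation looks roughly like the sketch below; the portal URL, credentials, item id, and file path are placeholders, not values from this thread:

```python
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

# Connect to the ArcGIS Enterprise portal (placeholder URL/credentials)
gis = GIS("https://portal.example.com/portal", "username", "password")

# The hosted feature layer item originally published from the CSV
item = gis.content.get("<feature_layer_item_id>")  # placeholder item id

# Overwrite the layer with the updated CSV; the new file must keep the
# same schema and file name as the one originally published
flc = FeatureLayerCollection.fromitem(item)
flc.manager.overwrite("/path/to/updated_data.csv")
```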
The workaround that we tried: use Add Item with the updated or new CSV file to add it to your portal. Once the updated CSV file is in the portal, remove the original published CSV layer from the portal, then overwrite the old layer by using the publish method, forcing the name and item_id to be the same.
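A minimal sketch of that workaround, assuming a Databricks managed-volume path and placeholder credentials, titles, and item ids (the exact values will differ in your portal):

```python
from arcgis.gis import GIS

gis = GIS("https://portal.example.com/portal", "username", "password")

# 1) Add the updated CSV from the Databricks managed volume as a new portal item
csv_path = "/Volumes/my_catalog/my_schema/my_volume/updated_data.csv"  # example path
csv_item = gis.content.add(
    item_properties={"title": "my_layer", "type": "CSV"},
    data=csv_path,
)

# 2) Remove the previously published feature layer item
old_item = gis.content.get("<old_feature_layer_item_id>")  # placeholder id
if old_item is not None:
    old_item.delete()

# 3) Publish the new CSV, reusing the original service name so clients
#    keep resolving to the same layer name
new_item = csv_item.publish(publish_parameters={"name": "my_layer"})
print(new_item.id)
```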
But the Esri consultant also told us that, even though installation is possible, the ArcGIS API for Python is not intended for cloud environments. I got the same impression from the documentation above: the workflows seem to have been tested only locally, not against data stored in a cloud environment. In my case, the dataset is stored in managed volumes in Databricks.
It would be great if we had more support for the ArcGIS API for Python in cloud environments. Any recommendations/comments would be welcome.
Thanks
Prasandeep
A/IT Analyst, Transport Canada
Hi, we are currently working on adding this in the next release of Python API.
Please stay tuned!
-Nick Giner
-nginer@esri.com
@NicholasGiner1 - any updates to share on this?
Hi all - we recently released a blog post on using Python API 2.4.1 in Databricks Notebooks:
Thanks @NicholasGiner1. I can confirm this workflow does work in Azure Databricks. Interestingly, before today I had only ever successfully installed Python API <= 1.9 on Azure Databricks (on any of the LTS DBRs), due to some issues with the Kerberos packages/auth flow introduced in 2.0 (I think).
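For anyone else trying this, the install itself is typically just a %pip cell in the notebook; this is a sketch with the version pinned per the blog, and the exact steps (extra dependencies, cluster config) may differ on your DBR:

```python
# In a Databricks notebook cell: install the ArcGIS API for Python
%pip install arcgis==2.4.1

# Restart the Python process so the new package is picked up
dbutils.library.restartPython()

# Then, in a later cell, verify the install
import arcgis
print(arcgis.__version__)  # expect "2.4.1"
```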