I'm creating a script to analyse land cover within an AOI, which I'll turn into a geoprocessing tool. I want to use the ESA 2022 land cover data and download it from within the tool, using the code provided on the ESA data access page (script below).
I'm aware I need to install geopandas into the conda environment; most recommendations seem to suggest doing this in the Python Command Prompt. However, this tool eventually needs to stand alone and be used by non-technical staff to carry out the land cover analysis, i.e. just run from the geoprocessing pane.
Is there a way I can include the installation of geopandas WITHIN the script for the geoprocessing tool I'm creating, so I can download the relevant land cover data with no additional steps required from those using the tool?
import geopandas as gpd
import requests
from tqdm.auto import tqdm  # provides a progress bar

s3_url_prefix = "https://esa-worldcover.s3.eu-central-1.amazonaws.com"

# load Natural Earth low-res shapefile
ne = gpd.read_file(gpd.datasets.get_path("naturalearth_lowres"))

# get AOI geometry (Italy in this case)
country = 'Italy'
geom = ne[ne.name == country].iloc[0].geometry

# load the WorldCover grid
url = f'{s3_url_prefix}/v100/2020/esa_worldcover_2020_grid.geojson'
grid = gpd.read_file(url)

# get grid tiles intersecting the AOI
tiles = grid[grid.intersects(geom)]

# use the requests library to download them
for tile in tqdm(tiles.ll_tile):
    url = f"{s3_url_prefix}/v100/2020/map/ESA_WorldCover_10m_2020_v100_{tile}_Map.tif"
    r = requests.get(url, allow_redirects=True)
    out_fn = f"ESA_WorldCover_10m_2020_v100_{tile}_Map.tif"
    with open(out_fn, 'wb') as f:
        f.write(r.content)
I'm using ArcGIS Pro 3.0.
geopandas would have to be installed in the conda environment first.
How can I do this from within the script?
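For what an in-script install could look like, here is a minimal sketch of shelling out to conda from Python. Everything here is an assumption to verify against your own install: the function names are hypothetical, `conda_exe` would typically be the full path to ArcGIS Pro's bundled conda (often under the Pro installation's `bin\Python\Scripts` folder, but check yours), and the target environment must be a writable clone, not the read-only default `arcgispro-py3`.

```python
import subprocess


def build_conda_install_command(package, env_path, conda_exe="conda"):
    """Build the conda command line that would install *package* into the
    environment at *env_path*.

    conda_exe is assumed to be on PATH here; in practice you would pass
    the full path to ArcGIS Pro's conda executable (an assumption to
    verify for your installation).
    """
    return [conda_exe, "install", "--yes", "--prefix", str(env_path), package]


def install_package(package, env_path, conda_exe="conda"):
    """Run the install; raises CalledProcessError if conda fails."""
    subprocess.check_call(build_conda_install_command(package, env_path, conda_exe))
```

Note that even if this runs, it only helps when the active environment is one the user can write to, which is exactly the problem described in the replies below.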
You could give some of the examples on the web a try.
For example
installation - Using conda install within a python script - Stack Overflow
The problem you are going to run into is that each person will have a different active conda environment, and most(?) will have the default environment (which does not allow you to install packages) as their current one. You'll have to clone the environment for them as well if the default is active.
I've been dealing with this a lot for custom add-in deployments and have submitted a couple of Ideas posts for this process. 99 out of 100 times a 'custom' tool will require a custom, cloned environment with custom packages, and there is little to no documentation on how to deploy this without touching every single PC the add-in is used on...
I resorted to creating a bat file, attached to a separate add-in button, that clones the env, installs all of the third-party packages, and sets the active environment to the one it needs. When users install the add-in, they can click this button and the environment is created for them. Or, after an upgrade that breaks the previous environments, it will create a new clone. But this doesn't always work: the included Jupyter Notebook package installs successfully only about half the time, and a failure rolls back the whole process, which is very annoying! After this bat file is executed, the user's environment will remain this new one until they change it.
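The clone → install → swap sequence the bat file performs could equally be sketched from Python. In this rough sketch the function names, environment names, package list, and the `proswap` utility (Esri's command-line tool for switching the active Pro environment) are all assumptions to adjust for your deployment:

```python
import subprocess


def build_env_setup_commands(conda="conda", proswap="proswap",
                             base_env="arcgispro-py3",
                             new_env="arcgispro-custom",
                             packages=("geopandas", "tqdm")):
    """Return the command sequence the bat file would run:
    1. clone the default environment,
    2. install the third-party packages into the clone,
    3. switch ArcGIS Pro's active environment to it.

    All names here are placeholders; conda and proswap would normally be
    full paths into the ArcGIS Pro installation.
    """
    return [
        [conda, "create", "--yes", "--name", new_env, "--clone", base_env],
        [conda, "install", "--yes", "--name", new_env, *packages],
        [proswap, new_env],
    ]


def run_env_setup(**kwargs):
    """Execute the sequence, stopping at the first failure."""
    for cmd in build_env_setup_commands(**kwargs):
        subprocess.check_call(cmd)
```

As noted above, a mid-sequence failure (e.g. one package refusing to install) still leaves you with the rollback problem, so this is a convenience wrapper, not a fix for that.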
I remember one post on GeoNet trying to do the install in a script, but they had issues with subsequent executions of the script once the package was already installed.
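One way to sidestep that already-installed problem is to attempt the import first and only fall back to running an install when it fails. A minimal sketch, where `ensure_package` and the install command are hypothetical names rather than any ArcGIS API:

```python
import importlib
import subprocess


def ensure_package(module_name, install_cmd=None):
    """Import *module_name* if it is already available; otherwise run
    *install_cmd* (a list of command-line arguments, e.g. a conda or pip
    invocation) and retry the import once. Returns the imported module.

    If the module is missing and no install command is supplied, the
    original ImportError is re-raised.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError:
        if install_cmd is None:
            raise
        subprocess.check_call(install_cmd)
        return importlib.import_module(module_name)
```

On a second run the import succeeds immediately, so the install step is skipped and the script doesn't trip over the package already being present.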