
Reusing Python scripts and modules across projects

10-12-2023 12:27 PM
MattWilkie1
Frequent Contributor

Where do you save python scripts and modules for re-using across multiple projects?
That you yourself work on?
That you share with colleagues?

I'm not asking about how to share Python modules or modify PYTHONPATH and so on. I'm interested in what conventions you have settled on as practical for your general operations in ArcGIS Pro. For example, I save my batch and PowerShell scripts that are used for general system maintenance in C:\bin. The corollary for tools that I and co-workers share is Z:\Tools\bin. Both of these are in PATH.

Having settled on a location, how do you make them available across projects?
To yourself? to co-workers?

As above, my question isn't about the technical how-to of packaging scripts, but the practical day-to-day use of them. ArcGIS Pro defaults to extracting under "C:\Users\USERNAME\Office365_Onedrive_ORG_NAME\Documents\ArcGIS\Packages", which is a horrible path to refer to repeatedly. What's your alternative?

High-value scripts can be added to a project toolbox, marked "Add to New Projects", and be always at your fingertips. What about things that are less polished? Something that does useful work but doesn't warrant the overhead of adding parameters, tool validation, etc. to make it a 'tool'.

MattWilkie1_0-1697137341012.png

Lastly, how do you work with other Python projects that are not ArcGIS? Do you resign yourself to using Pro's built-in Package Manager and cloning? Or just stay inside what's given and not reach for the cool new things in Python 3.11 and 3.12?

I've tried installing Anaconda/Miniconda alongside Pro (the built-in conda is dreadfully slow). I made a mess of things that wasn't straightened out completely until I got a new machine. Maybe that was an isolated experience, though; perhaps there is a way for the two to dance together cleanly and I just fumbled.

Thanks in advance for your thoughts and experience.

2 Replies
jcarlson
MVP Esteemed Contributor

Over here, we just use Git repos for organizational tools. Sensitive information such as credentials is handled by untracked config files to keep the repo "clean". Users can clone the repos to wherever they need them and easily pull in changes made to the scripts. If a tool has particular requirements, a simple requirements.txt file and a venv can help keep things consistent across machines.

For a while, we fiddled around with actually installing custom modules locally, but that ended up being more work than it was worth.

I personally use miniconda for Python envs, and haven't had any issues with it.

- Josh Carlson
Kendall County GIS
DavidSolari
Frequent Contributor

Currently we have one big Git repo that holds all the scripts and supporting files, a copy of which is cloned to our shared network drive and referenced by toolboxes on said drive. Clunky, but it hasn't collapsed so far. Our next move is to switch to geoprocessing modules that depend on a "master" package, so that we don't have to reimplement common utilities in every project. This has its own set of challenges (teaching users how to manage environments, potential dependency conflicts, keeping the servers in sync for web tools, etc.), but it's a bit easier to manage than one fat folder full of junk.
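A minimal sketch of the "master package" idea: a toolbox script on the shared drive makes the common package importable at run time, without a per-machine install. The folder layout, the `gis_common` package name, and the `import_common` helper are all hypothetical stand-ins for whatever the real repo uses.

```python
import sys
from pathlib import Path

# Hypothetical shared-drive layout (names are illustrative):
#   Z:\Tools\toolboxes\...     <- toolboxes referenced by each project
#   Z:\Tools\lib\gis_common\   <- the shared "master" utilities package
SHARED_LIB = Path(r"Z:\Tools\lib")

def import_common(lib_dir=SHARED_LIB):
    """Make the shared package importable from the network drive.

    Prepending the lib folder to sys.path lets every toolbox script
    pick up the same copy of the utilities without installing it
    into each user's (or server's) Python environment.
    """
    if str(lib_dir) not in sys.path:
        sys.path.insert(0, str(lib_dir))
    import gis_common  # the shared utilities package
    return gis_common
```

The trade-off is the one the post names: every consumer silently tracks whatever is currently on the drive, so publishing a breaking change to `gis_common` affects all toolboxes (and web-tool servers) at once.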