I have a python tool with the following structure:
The main_script.py imports module.py, and module.py imports module_helper.py. Importantly, main_script.py does not directly import module_helper.py.
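For concreteness, the import chain looks roughly like this (the function names are made up for illustration; in the real tool each definition lives in its own .py file):

```python
# module_helper.py -- only ever imported by module.py
def clean(values):
    """Placeholder helper; stands in for the real helper logic."""
    return [v for v in values if v]

# module.py -- does `import module_helper` and calls into it
def process(values):
    return clean(values)  # in the real file: module_helper.clean(values)

# main_script.py -- the script the GP tool points to; imports module only,
# never module_helper
result = process(["a", "", "b"])  # in the real file: module.process(...)
```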
When I publish via Pro's Share as Web Tool, main_script.py and module.py upload, but module_helper.py does not. Based on testing, it seems like module_helper.py is not uploading because it is not directly imported by the "main" script that the GP tool points to.
Is there a way to get all scripts in the folder to upload, even if not directly called by the main_script?
Note - there are clunky workarounds. One is to manually copy module_helper.py over after publishing, but that's a cumbersome extra step, especially since I have to do it frequently. Another is to import the helper modules directly in the main script even though it never uses them, which would ensure they upload, but it's poor practice to import modules you don't use (wastes memory and has potential to create confusion later).
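For anyone else stuck with the manual-copy workaround in the meantime, it can at least be scripted. A minimal sketch, where the directory paths and the `*_helper.py` pattern are assumptions to adapt to your own project layout:

```python
import shutil
from pathlib import Path

def copy_helpers(src_dir, dst_dir, pattern="*_helper.py"):
    """Copy helper modules the publisher missed into the tool folder.

    src_dir, dst_dir, and pattern are placeholders for the real layout.
    Returns the names of the files copied.
    """
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(Path(src_dir).glob(pattern)):
        shutil.copy2(f, dst / f.name)  # copy2 preserves timestamps
        copied.append(f.name)
    return copied
```

Run it after each publish, pointing `dst_dir` at wherever the published copies land.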
Thanks,
PS - I feel like @DanPatterson or @Luke_Pinner might have some useful insights on this.
Importing module_helper.py isn't going to increase memory usage; it just imports the names into the Python namespace.
import module as m  # you can then import specific functions from `m`
import module_helper as mh  # just assigns `mh` into the namespace
Perhaps try that to see if it gets the helper uploaded... I don't do web tools, so I will defer to others.
An `.atbx` toolbox offers an alternative, since you can put all the other *.py files into a folder.
Thanks for clarifying, Dan.
I tried adding `import module_helper` in the main script without using it anywhere, but it still didn't copy over during publishing. So it seems like the module not only has to be imported but actually used in the main script?