Some tips for keeping a PYT (Python toolbox) from breaking:
Use a wrapper/fallback class around each tool import, and make sure to reload the modules during development; otherwise you'll need to restart the kernel constantly to get code changes loaded in. With a reload call in place, the toolbox refresh option in ArcGIS Pro picks up your edits immediately:
Wrapper:
def build_dev_error(label: str, desc: str):
    class Development(object):
        def __init__(self):
            """Placeholder stub shown while the real tool fails to import"""
            self.category = "Tools in Development"
            self.label = label
            self.alias = self.label.replace(" ", "")
            self.description = desc
    return Development
Reload Logic:
from importlib import reload
from traceback import format_exc

try:
    import tools.ToolModule
    reload(tools.ToolModule)  # pick up edits on toolbox refresh
    from tools.ToolModule import ToolClass
except Exception:
    ToolClass = build_dev_error("Tool Class", format_exc())
class Toolbox(object):
    def __init__(self):
        """Define the toolbox (the name of the toolbox is the name of the
        .pyt file)."""
        self.label = "My Toolbox"
        self.alias = "MyToolbox"
        # List of tool classes associated with this toolbox
        self.tools = [
            ToolClass,
        ]
This flow lets your PYT be just a list of imports, so you can throw a bunch of tools in and handle one failing without breaking the others. The `build_dev_error` function returns a stub class with a dynamically set description (in this case the traceback, for debugging purposes).
I can't go back to single-file PYTs anymore; this method really does wonders for maintainability and keeps tools that may or may not error out on import from gumming up the whole toolbox.
In my use case I keep only one tool per file so it's easier to track changes and plug and play between branches, but you could have multiple tool classes in one tool module file.
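Once there are several tool modules, the try/reload/except boilerplate repeats for each one. A minimal sketch of factoring it into a helper (the `safe_import_tool` name is my own, not part of the original; it repeats `build_dev_error` so the snippet is self-contained):

```python
from importlib import import_module, reload
from traceback import format_exc


def build_dev_error(label: str, desc: str):
    """Same fallback stub as above: a placeholder tool carrying the traceback."""
    class Development(object):
        def __init__(self):
            self.category = "Tools in Development"
            self.label = label
            self.alias = label.replace(" ", "")
            self.description = desc
    return Development


def safe_import_tool(module_name: str, class_name: str):
    """Import (and reload) one tool class, falling back to a stub on any error."""
    try:
        module = import_module(module_name)
        reload(module)  # pick up edits without restarting ArcGIS Pro
        return getattr(module, class_name)
    except Exception:
        return build_dev_error(class_name, format_exc())
```

With this, the PYT's import section collapses to one line per tool, e.g. `ToolClass = safe_import_tool("tools.ToolModule", "ToolClass")`, and a broken module still yields a visible stub instead of killing the toolbox.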
Another important thing is to create a utility module for the generic reuse code, so you don't have to patch the same logic in multiple places; you make the change in one module and get the update everywhere.
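A minimal sketch of what that could look like (the `tools/utils.py` path and the helper are hypothetical, just to illustrate the idea):

```python
# tools/utils.py -- one place for generic reuse code shared by every tool module
# (hypothetical file name and helper; adapt to your project)

def format_messages(label, messages):
    """Prefix tool messages consistently so every tool logs the same way."""
    return ["[{}] {}".format(label, m) for m in messages]

# In the .pyt, reload the utility module *before* the tool modules --
# importlib.reload() is not recursive, so an edited utils.py would
# otherwise stay cached:
#   import tools.utils
#   reload(tools.utils)
```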