I have a large number of scripts that we want to add error logging to, and I wanted to see if there are any recommendations for creating a single error-handling Python script that can be called from multiple scripts. I have seen a number of examples that add error checking within a single script, but in an effort to reduce duplicate code across the numerous scripts, I would like to know if anyone has any recommendations to streamline this process.
And this looks interesting too: Using Python logging in multiple modules - Stack Overflow
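The core pattern from that Stack Overflow thread is to configure logging once in the entry-point script and have every other module just request a named logger. A minimal sketch (module and function names here are illustrative, not from the thread):

```python
import logging

# --- in each module (e.g. my_module.py) ---
# Child loggers propagate records up to the root logger's handlers,
# so modules never configure handlers themselves.
logger = logging.getLogger(__name__)

def do_work():
    logger.info("doing work")

# --- in the entry-point script only ---
if __name__ == "__main__":
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
    )
    do_work()
```

Because configuration lives in one place, swapping a console handler for a file or database handler changes every script's output at once.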
We wrote our own library to handle this and import it across all of our production Python scripts, writing errors and log messages to a SQL Server table designated for holding results.
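A shared library like that might look something like the sketch below. The table and function names are hypothetical, and `sqlite3` stands in for the SQL Server connection (a production version would use something like `pyodbc` instead):

```python
import sqlite3
import traceback
from datetime import datetime, timezone

def get_connection(db_path="logs.db"):
    """Open the results database, creating the (hypothetical) log table if needed."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS script_log (
               logged_at TEXT,
               script    TEXT,
               level     TEXT,
               message   TEXT)"""
    )
    return conn

def log_event(conn, script, level, message):
    """Write one timestamped row to the shared log table."""
    conn.execute(
        "INSERT INTO script_log VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), script, level, message),
    )
    conn.commit()

def run_logged(conn, script, func):
    """Run func, recording success or the full traceback on failure."""
    try:
        func()
        log_event(conn, script, "INFO", "completed successfully")
    except Exception:
        log_event(conn, script, "ERROR", traceback.format_exc())
        raise  # re-raise so the scheduler still sees the failure
```

Each production script then only needs `run_logged(conn, "my_script", main)` instead of its own try/except boilerplate.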
Our organization has chosen to create two related tables in our enterprise geodatabase: Scheduled Task Index and Scheduled Task Log
The index table has a record for each script we run with a unique task id. The log table gets many records, each with a task id to identify what script wrote the log message; it also records the date/time. Every scheduled task script has a bit of code to write the success or failure (with error message) of the script to the log table. It also sends an email with the traceback information for debugging. We then have an email report that queries the log table for entries in the past 24 hours so we can easily see if anything failed.
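That two-table design can be sketched roughly as follows. The table and column names are my guesses at the structure described, not the organization's actual schema, and `sqlite3` stands in for the enterprise geodatabase:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE scheduled_task_index (
        task_id   INTEGER PRIMARY KEY,   -- unique id per scheduled script
        task_name TEXT
    );
    CREATE TABLE scheduled_task_log (
        task_id   INTEGER REFERENCES scheduled_task_index(task_id),
        logged_at TEXT,                  -- date/time of the log entry
        status    TEXT,                  -- e.g. 'SUCCESS' or 'FAILURE'
        message   TEXT                   -- error message on failure
    );
""")

# One index row per script; many log rows per task_id.
conn.execute("INSERT INTO scheduled_task_index VALUES (1, 'nightly_sync')")
conn.execute(
    "INSERT INTO scheduled_task_log VALUES (1, datetime('now'), 'FAILURE', 'Timeout')"
)

# Daily email report: anything that failed in the past 24 hours.
failures = conn.execute("""
    SELECT i.task_name, l.logged_at, l.message
    FROM scheduled_task_log l
    JOIN scheduled_task_index i USING (task_id)
    WHERE l.status = 'FAILURE'
      AND l.logged_at >= datetime('now', '-1 day')
""").fetchall()
```

The join keyed on `task_id` is what makes the 24-hour report cheap: one query over the log table tells you which scripts failed without opening any of them.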