Hello! I am currently supporting a custom Python geoprocessing tool that performs a number of tasks, such as intersecting data and reporting on features that intersect an AOI. All of the data is warehoused in an Oracle RDBMS that I access using an .sde file (one .sde file per instance of the tool, which is then disposed of). I do not have admin privileges in the Oracle DB, and neither do any of the clients using the tool. The script accesses many feature classes (hundreds) in the .sde file throughout various functions and modules, each of which generates a connection in the Oracle DB.
The issue is that our organization only allows X number of connections, and it appears each tool iteration uses up to 10 connections per tool instance, so failures occur on the third instance of the tool. Some connections will time out and be discarded, but when the tool finishes executing, it will not drop its connections until I close ArcGIS Pro. Only then are those connections dropped on the DB end. The issue is noted HERE.
Our users want to run concurrent iterations of the tool in multiple instances of ArcGIS Pro (APRXs) to improve productivity, but that is proving to be a challenge given the sheer number of connections being made and the fact that those connections are held until ArcGIS Pro closes.
I want to be able to efficiently use and reuse connections throughout the script, but I cannot find any 'best practices' on the topic of database connections to an RDBMS. I do not understand why the tool is using so many connections; if I could force disposal, or reuse of the already established connections, that would free up room for more instances of the tool.
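To illustrate the kind of reuse I am hoping for, here is a minimal sketch in plain Python (not arcpy — `open_connection`, `OPENED`, and the function names are hypothetical stand-ins for whatever actually opens a session against the .sde workspace): instead of each function opening its own connection, a cached factory hands every function the same handle for the lifetime of the tool run.

```python
from functools import lru_cache

# Hypothetical stand-in for whatever actually opens a DB session
# against the .sde workspace. It records every real connection
# opened so the two patterns below can be compared.
OPENED = []

def open_connection(sde_path):
    OPENED.append(sde_path)
    return f"conn:{sde_path}:{len(OPENED)}"

# Anti-pattern (what I suspect the tool is doing now):
# every function opens its own connection.
def intersect_naive(sde_path):
    conn = open_connection(sde_path)
    return conn

# Reuse pattern: one cached connection per .sde path, shared by
# every function in the script until the cache is cleared.
@lru_cache(maxsize=None)
def get_connection(sde_path):
    return open_connection(sde_path)

def intersect_shared(sde_path):
    conn = get_connection(sde_path)
    return conn

# Five calls the naive way -> five real connections.
OPENED.clear()
for _ in range(5):
    intersect_naive("tool.sde")
naive_count = len(OPENED)

# Five calls through the cache -> one real connection, reused.
OPENED.clear()
get_connection.cache_clear()
for _ in range(5):
    intersect_shared("tool.sde")
shared_count = len(OPENED)

print(naive_count, shared_count)  # 5 1
```

In the real tool, the shared object would presumably be the single .sde workspace (or connection handle) passed into every function and module, rather than each one establishing its own; whether arcpy actually lets me control this at that level is part of what I'm asking.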
I am leaving out a tonne of detail, but I am hoping this is enough to get some feedback or insight. Thank you.
Steve