I am attempting to create a tool that queries a large geodatabase table based on user input, converts the resulting table to a shapefile, and adds the shapefile to the Table of Contents.
The tool itself works, and for small geodatabases it runs fine. The problem is that the temporary output table produced by the SQL query responds more and more slowly to every subsequent operation, in proportion to the size of the original geodatabase table. The query executes quickly, but the table it produces becomes unusable when it is derived from a large source (millions of rows). The output table itself may contain only a handful of rows, yet if the source table was large, every operation on the output crawls.
This happens no matter which tool I use (Make Query Table, Table To Table, etc.), as long as that tool includes an SQL query, and the issue is not specific to whatever subsequent tool acts on the output table. Even if I simply add the temporary SQL output table to my TOC, it cannot be opened (it just spins indefinitely). The behavior is the same whether I use ModelBuilder, a Python script, or run the tools individually.
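For reference, the core of my Python version looks roughly like this (the paths, field names, and where clause below are placeholders, not my real data):

```python
import arcpy

# Multi-million-row source table in a file geodatabase (placeholder path).
src = r"C:\data\big.gdb\parcels"
where = "COUNTY = 'Example'"  # built from user input at run time

# Query the source and materialize the result as a shapefile.
arcpy.FeatureClassToFeatureClass_conversion(src, r"C:\temp", "result.shp", where)

# Add the shapefile to the TOC of the current map document (ArcMap 10.x).
mxd = arcpy.mapping.MapDocument("CURRENT")
df = arcpy.mapping.ListDataFrames(mxd)[0]
lyr = arcpy.mapping.Layer(r"C:\temp\result.shp")
arcpy.mapping.AddLayer(df, lyr)
```

The slowdown appears regardless of whether the query step uses Feature Class To Feature Class, Table To Table, or Make Query Table.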
My instinct is that this is some sort of memory allocation issue, but it's not obvious to me how exactly to diagnose the problem. Any help would be sincerely appreciated.