The solutions presented here and in other resources and discussions on this topic (here and here) all rely on local network layers and do not address the core inquiry of this question:
I am looking for a way to increase the limit beyond 1,000.
Local network layers may work fine, but there is substantial convenience in using ESRI's built-in network dataset and circumventing the need to build your own. Isn't that the core principle behind many of the credit-consuming functions ESRI offers: a paid option for executing workflows and geoprocessing that would be laborious for the user to scale on their own? I would like to be able to tap into ESRI's network dataset because it is exactly what I'm looking for and better than anything I could create independently in a short amount of time.
I understand wanting to caution the user against running large datasets through the OD Cost Matrix function. However, the artificial, immovable 1,000-record limit on both origins and destinations doesn't make sense to me. I have purchased credits and am willing to spend a good number of them on ESRI's built-in network dataset. If I understand the risks and costs of executing the ODCM function with large datasets, why am I not allowed to do so?
My suggestion would be to add a user-warning checkbox in the GUI and a corresponding argument in the Python functions, with defaults of unchecked and "False", respectively. Under the defaults, a large solve would be stopped with a warning that running the ODCM would take a long time and consume a high volume of credits. If the user understands and wants to continue, they check the box or set the argument to "True" and re-execute.
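To make the proposal concrete, here is a minimal sketch of what such an opt-in flag could look like in Python. Everything here is hypothetical: the function name `solve_od_cost_matrix`, the `acknowledge_cost_warning` parameter, and the placeholder solve are illustrative assumptions, not part of any existing ESRI API.

```python
# Hypothetical sketch of the proposed warning-flag pattern. The names and
# thresholds below are illustrative only; this is not ESRI's actual API.

RECORD_LIMIT = 1000  # the current hard cap on origins and destinations


def solve_od_cost_matrix(origins, destinations, acknowledge_cost_warning=False):
    """Simulate an OD cost matrix solve with an opt-in override.

    If either input exceeds RECORD_LIMIT and the caller has not set
    acknowledge_cost_warning=True, refuse to run instead of silently
    starting a long, credit-consuming solve.
    """
    n_pairs = len(origins) * len(destinations)
    over_limit = len(origins) > RECORD_LIMIT or len(destinations) > RECORD_LIMIT
    if over_limit and not acknowledge_cost_warning:
        raise ValueError(
            f"This solve would compute {n_pairs:,} origin-destination pairs, "
            "which may take a long time and consume a high volume of credits. "
            "Re-run with acknowledge_cost_warning=True to continue."
        )
    # Placeholder for the real solve; here we just report the pair count.
    return n_pairs
```

With this shape, small runs behave exactly as before, and oversized runs fail fast with an explicit cost warning until the user deliberately opts in, which mirrors the GUI checkbox.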