I'm looking to migrate from one enterprise geodatabase to another enterprise geodatabase (both are PostgreSQL). I've been using ArcCatalog to do the exports, exporting a small set of tables at a time since ArcCatalog tends to freeze on large sets of data. This process has been getting tedious, and I was wondering if there is a way to export geodatabases programmatically. I've checked the ArcPy documentation, but there don't seem to be any functions that can do this.
You may want to look into creating a backup and then restoring it to the other PostgreSQL instance:
PostgreSQL backups—Help | ArcGIS Desktop
Restore a geodatabase to PostgreSQL—Help | ArcGIS Desktop
If the data is versioned, this is the best option for keeping all the tables in their correct states.
I've tried doing a pg_dumpall from a PostgreSQL 9.0 database and restoring it to a PostgreSQL 9.5 database, but I get many errors related to parsing the geometries. I came across this thread where the poster had a similar issue to mine, and the workaround of using ArcCatalog to copy data from the old database to the new one seems to work without any issues.
I ran into the same issue following the instructions in the links that you've posted.
If the PG databases are in the same server, you might be able to INSERT INTO ... SELECT across them in SQL (via dblink or postgres_fdw, since PostgreSQL doesn't support plain cross-database queries).
It's not at all difficult to write ArcPy code with a da.SearchCursor fetching rows from one database and a da.InsertCursor writing them into another. I moved several dozen tables spanning 60 million rows this way last summer without any ArcGIS-related errors (using both 32-bit and 64-bit Python).
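The cursor approach above can be sketched roughly like this. This is a minimal, hedged illustration, not the poster's actual script: the `.sde` connection-file paths and the `copy_table`/`copy_rows` helper names are my own, and it assumes the destination table already exists with a matching schema.

```python
# Hypothetical sketch of a cursor-based table copy between two
# enterprise geodatabases using arcpy.da cursors.

def copy_rows(source_rows, insert_cursor):
    """Stream rows from any iterable into a cursor exposing insertRow()."""
    copied = 0
    for row in source_rows:
        insert_cursor.insertRow(row)
        copied += 1
    return copied

def copy_table(src_table, dst_table):
    # Deferred import so copy_rows stays usable without an ArcGIS install.
    import arcpy
    # Take every attribute field except the OID and raw geometry columns,
    # and fetch the geometry through the SHAPE@ token instead.
    fields = [f.name for f in arcpy.ListFields(src_table)
              if f.type not in ("OID", "Geometry")] + ["SHAPE@"]
    with arcpy.da.SearchCursor(src_table, fields) as s_cur:
        with arcpy.da.InsertCursor(dst_table, fields) as i_cur:
            return copy_rows(s_cur, i_cur)

# Usage with placeholder connection-file paths:
# n = copy_table(r"C:\conns\old_sde.sde\owner.parcels",
#                r"C:\conns\new_sde.sde\owner.parcels")
```

Because the rows are streamed one at a time rather than materialized, this keeps memory flat even on large tables, and you can loop it over a list of table names to batch the whole migration.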