We have a PostgreSQL geodatabase into which we replicate a subset of our production GIS data every day. Production GIS data is extracted into a file geodatabase; we then use the arcpy.Delete_management() tool to delete all existing sde objects in the target schema (called repl) in the PostgreSQL database, and then use the arcpy.Copy_management() tool to copy feature classes and tables from the file geodatabase into the repl schema. We use the Delete/Copy tools rather than Truncate/Append because we need to preserve the OBJECTID values of all feature classes and tables when loading data from the file geodatabase into PostgreSQL.
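For reference, a minimal sketch of the nightly refresh as described above. The connection file path, staging file geodatabase path, and schema-qualified naming are placeholders, not our actual production values:

```python
import os
import arcpy

SDE_CONN = r"C:\connections\repl.sde"      # hypothetical .sde connection, connecting as the repl user
STAGING_FGDB = r"C:\staging\extract.gdb"   # hypothetical file geodatabase extracted from production

# 1) Delete all existing sde objects in the repl schema.
#    In PostgreSQL this is the step that also drops all dependent views.
arcpy.env.workspace = SDE_CONN
for name in arcpy.ListFeatureClasses() + arcpy.ListTables():
    arcpy.Delete_management(name)

# 2) Copy each feature class/table from the file geodatabase into repl.
#    Copy (rather than Truncate/Append) preserves the OBJECTID values.
arcpy.env.workspace = STAGING_FGDB
for name in arcpy.ListFeatureClasses() + arcpy.ListTables():
    arcpy.Copy_management(name, os.path.join(SDE_CONN, name))
```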
We have many views built against the feature classes/tables in the repl schema. Deleting all the sde objects every night also drops all dependent views. I understand that this is expected behavior in a PostgreSQL database. We use the same workflow against a SQL Server geodatabase, where the Delete tool does NOT drop dependent views.
Is there a way to delete all the sde objects in a PostgreSQL schema without also dropping dependent views? If not, are there any suggestions for a daily workflow that:
1) uses a file geodatabase as the source,
2) uses a PostgreSQL geodatabase as the target,
3) maintains the OBJECTID values when loading feature classes/tables from the file geodatabase into PostgreSQL, and
4) doesn't drop views that depend on the feature classes/tables being refreshed?