How do you avoid single points of failure with an enterprise geodatabase? We have a production geodatabase and a publication geodatabase, and we have a solid backup and recovery plan in place that we know how to use. How do you keep your publication geodatabase constantly updated? Ours gets updated every night, but in some cases that is not frequent enough.
What is the RDBMS being used for the enterprise GDB side?
You can write and schedule Python scripts to update your publication database. The schedule would depend on the data volume and how often the records need to be updated.
SQL Server, with the geodatabase at 10.6.1. Can these updates occur while people are connected to the database?
Depending on the data and locks, yes. I am sure that many people here use this workflow. You would need to look at what is needed, build a script, and test it in your environment.
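As a rough illustration of the scripted approach, here is a minimal sketch of a truncate-and-append refresh. All connection paths and dataset names are placeholders, and the staleness-check helper is just one hypothetical way to keep the update window short; test any variant of this in a non-production environment first.

```python
# Hypothetical scheduled refresh of a publication geodatabase:
# truncate each publication table, then append the current production rows.
# PROD_SDE, PUB_SDE, and the dataset names below are placeholders.

PROD_SDE = r"C:\connections\production.sde"   # placeholder connection file
PUB_SDE = r"C:\connections\publication.sde"   # placeholder connection file

def datasets_to_refresh(last_sync, edit_times):
    """Return dataset names whose production edit time is newer than the
    last successful sync; skipping unchanged data keeps the window short."""
    return sorted(name for name, edited in edit_times.items() if edited > last_sync)

def refresh(dataset_names):
    """Truncate-and-append each dataset from production into publication."""
    import arcpy  # imported lazily so the helper above runs without ArcGIS
    for name in dataset_names:
        src = f"{PROD_SDE}\\{name}"
        dst = f"{PUB_SDE}\\{name}"
        # Truncate is fast but not undoable; rely on your backup plan.
        arcpy.management.TruncateTable(dst)
        # Schemas must match; "TEST" raises on mismatch instead of loading bad rows.
        arcpy.management.Append(src, dst, schema_type="TEST")

if __name__ == "__main__":
    stale = datasets_to_refresh(
        last_sync=100,
        edit_times={"Parcels": 150, "Roads": 90, "Hydrants": 120},
    )
    print(stale)  # → ['Hydrants', 'Parcels']
    # refresh(stale)  # uncomment where arcpy and valid .sde files are available
```

A script like this can then be run on whatever interval the data requires (e.g., via Windows Task Scheduler). Note that Truncate cannot run on versioned or replicated data, which is one reason the replication options discussed below may be a better fit.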
You may also want to work with your DBA team on better redundancy for the EGDB itself: Connections to highly available SQL Server databases—Help | ArcGIS Desktop
Great, thank you for this.
Sounds like you want your publication geodatabase updated in near real-time based on edits committed to your production geodatabase. I'd recommend reading through this section from Esri System Design Strategies, if you haven't already: GIS Data Administration - GIS Wiki | The GIS Encyclopedia.
Based on your description, it sounds like either geodatabase replication or DBMS-tier replication would be your best bet: Scenarios using distributed data—ArcGIS Help | ArcGIS Desktop. I haven't set this up myself, but it seems to serve what you're looking for. This blog might also help you in your journey: /blogs/HackingArcSDE/2015/01/01/geodatabase-replicationwithout-the-replication-part-1
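For the geodatabase-replication route, the basic shape is a one-time replica creation followed by scheduled synchronizations. The sketch below is an assumption-heavy outline, not a drop-in script: connection paths, the replica name, and the dataset list are all placeholders, and the source data must meet the replication prerequisites (versioned or archive-enabled for one-way replicas) before any of this works.

```python
# Hypothetical one-way geodatabase replication from production to publication.
# Create the replica once, then synchronize on a schedule (e.g., hourly).
# All paths and names below are placeholders.

PROD_SDE = r"C:\connections\production.sde"   # placeholder connection file
PUB_SDE = r"C:\connections\publication.sde"   # placeholder connection file
REPLICA = "ProdToPub"                          # placeholder replica name

def create_replica():
    """One-time setup: one-way replica pushing production data to publication."""
    import arcpy  # lazy import so this module loads without ArcGIS installed
    arcpy.management.CreateReplica(
        in_data=[f"{PROD_SDE}\\Parcels", f"{PROD_SDE}\\Roads"],  # placeholder datasets
        in_type="ONE_WAY_REPLICA",
        out_geodatabase=PUB_SDE,
        out_name=REPLICA,
        access_type="FULL",
        initial_data_sender="PARENT_DATA_SENDER",
    )

def synchronize():
    """Push production edits to publication; run as often as freshness requires."""
    import arcpy
    arcpy.management.SynchronizeChanges(
        geodatabase_1=PROD_SDE,
        in_replica=REPLICA,
        geodatabase_2=PUB_SDE,
        in_direction="FROM_GEODATABASE1_TO_2",
        conflict_policy="IN_FAVOR_OF_GDB1",  # production wins on conflicts
    )
```

Because synchronization only moves changes (not full copies), it tends to scale much better than nightly truncate-and-append as data volumes grow.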
Good luck! Let us know how it turns out.