Building a more versatile server architecture

08-17-2015 01:10 PM
BenDrury
Occasional Contributor

Below is a graphic of our server and database structure. All databases are SQL Server, running mostly one-way replications, shown by the arrows. This has been a work in progress for about two years, and the system has been running in this configuration for about that long. It has worked well, but there is one problem: schema changes are a nightmare to get from point A to point B. If a change is made at the top, I may have to compare schemas and import schema changes 7 to 8 times. The boxes represent off-site servers that I replicate to. They are all on the same network, but pushing these changes over the network takes hours per task. We are in the process of getting fiber connected to all facilities, but that is a few years down the road. Other than improving the network, are there any options to redesign this structure to improve the system and make it handle schema changes more gracefully?
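Right now each hop means a manual schema compare and import; below is a minimal sketch of scripting that loop with arcpy, assuming the arrows are geodatabase replicas. The connection file paths and replica names are placeholders, not the real ones.

```python
# Hypothetical sketch: push a schema change from the parent geodatabase out to
# several child replica geodatabases with the replica schema geoprocessing tools.
# Connection files and replica names are placeholders.
import os
import arcpy

parent_sde = r"C:\connections\Tobler_LGM.sde"
children = {
    "Tobler.LGM_to_E911":  r"C:\connections\E911_LGM.sde",
    "Tobler.LGM_to_Eagle": r"C:\connections\Eagle_LGM.sde",
    "Tobler.LGM_to_CED":   r"C:\connections\CED_LGM.sde",
}
work_dir = r"C:\temp\schema_sync"

for replica_name, child_sde in children.items():
    # 1. Export the replica's schema from the parent to an XML file.
    schema_file = os.path.join(work_dir, replica_name + "_schema.xml")
    arcpy.ExportReplicaSchema_management(parent_sde, schema_file, replica_name)

    # 2. Compare it against the child and write out only the differences.
    changes_file = os.path.join(work_dir, replica_name + "_changes.xml")
    arcpy.CompareReplicaSchema_management(child_sde, schema_file, changes_file)

    # 3. Apply the schema changes to the child replica geodatabase.
    arcpy.ImportReplicaSchema_management(child_sde, changes_file)
    print("Schema changes applied to {}".format(replica_name))
```

Scripted this way, the seven or eight compare/import passes become one unattended run, although it does nothing to speed up the network itself.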

GRAPH EXPLAINED:

E911, Eagle, and CED are all off-site facilities. Tobler is my server, with six databases. I pull data in from E911, Eagle, and CED through one-way replications, except for a two-way replication between Tobler - Local Government Model and E911 - E911 Local Government Model. Getting all the data onto my server "Tobler" gives me an off-site backup for every facility. From there I replicate data out to our publication database and to our replication database. The publication database feeds all map services and viewer programs for staff. From the replication database I push replications back out to the off-site facilities so they receive the most current data possible for all organization data.
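For reference, one of those scheduled outbound pushes looks roughly like the sketch below when scripted, again assuming geodatabase replicas, with the replica name and connection files made up.

```python
# Hypothetical sketch: synchronize one outbound replica from the replication
# database back out to an off-site facility. Names and paths are placeholders.
import arcpy

replication_db = r"C:\connections\Tobler_Replication.sde"
e911_db = r"C:\connections\E911_LGM.sde"

arcpy.SynchronizeChanges_management(
    replication_db,                 # geodatabase_1 (parent)
    "Tobler.Replication_to_E911",   # replica name (placeholder)
    e911_db,                        # geodatabase_2 (child)
    "FROM_GEODATABASE1_TO_2",       # one-way push, matching the arrows
    "IN_FAVOR_OF_GDB1",             # parent wins on conflicts
    "BY_OBJECT",
    "DO_NOT_RECONCILE")
```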

[Attached graphic: Server and Data Structure.PNG]

1 Reply
PaulDavidson1
Occasional Contributor III

Have you looked at the Data Interoperability extension?

I have not used it yet, but it's supposed to be a powerful ETL, and most ETLs allow for pushing schema changes around. From some Esri docs: "If the output require schema changes then the Spatial ETL Tool should be used."
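If the Spatial ETL route works out, the authored tool ends up as an ordinary geoprocessing tool in a custom toolbox, so it can be scripted. A rough sketch follows; the toolbox path, alias, and tool name are entirely made up, and the parameters would depend on how the tool is built in Workbench.

```python
# Hypothetical sketch: run a Spatial ETL tool (built with Data Interoperability /
# Workbench) from Python. Toolbox path, alias, and tool name are made up.
import arcpy

arcpy.ImportToolbox(r"C:\tools\SchemaSync.tbx", "schemasync")

# "PushSchemaChanges" is a placeholder for whatever the ETL tool is named;
# its parameters depend entirely on how it was authored in Workbench.
arcpy.PushSchemaChanges_schemasync(r"C:\connections\Tobler_LGM.sde",
                                   r"C:\connections\E911_LGM.sde")
```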

Other tools you might consider are the ones from Red Gate:

All Redgate's products for SQL Server, .NET, and Oracle

I see they have a new Database Lifecycle Management suite that might be very useful for you.

Now of course, you're working with a Geodatabase so things can get a bit tricky here.

I used to use their SQL Developer Suite to do exactly what you're talking about.

BUT it was not with a Geodatabase.

I had a web app with a fairly large database of CIP projects.

I'd make schema changes to my local development copy, modify web code, etc....

Then I'd use the SQL Compare tool to migrate the schema changes up to the database in the Test Server environment. After testing by various folks, I'd repeat the migration with SQL Compare to push the schema changes up to the Production environment.
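SQL Compare also has a command line, so repetitive hops like that can be scripted. Here's a rough sketch wrapping it in Python; the install path, server names, and databases are placeholders, and the switch names are from memory, so check them against your SQL Compare version's documentation.

```python
# Hypothetical sketch: run SQL Compare from the command line to push schema
# changes from a dev copy up to the test database. Paths, server names, and
# databases are placeholders; verify the switch names for your version.
import subprocess

SQLCOMPARE = r"C:\Program Files (x86)\Red Gate\SQL Compare 11\sqlcompare.exe"

subprocess.check_call([
    SQLCOMPARE,
    "/Server1:DevServer",  "/Database1:CIP_Dev",   # source schema
    "/Server2:TestServer", "/Database2:CIP_Test",  # target to update
    "/Synchronize",                                # apply the differences
])
```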

I would also use the SQL Compare and SQL Data Compare tools to help verify schemas were consistent and to move data around as needed. For example, I could build a new table for something like a drop-down list (DDL) (a.k.a. a Domain in Arc speak) and populate it on my desktop, then just push the schema change and the data up to Test and Prod.
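On the geodatabase side, the closest equivalent of that lookup table is a coded value domain, which can also be created and populated in a script. A small sketch, with the workspace path, domain name, and values all made up:

```python
# Hypothetical sketch: create a coded value domain (the geodatabase version of a
# drop-down list) and populate it. Workspace, names, and codes are placeholders.
import arcpy

workspace = r"C:\connections\Tobler_LGM.sde"

arcpy.CreateDomain_management(workspace, "RoadSurface",
                              "Road surface types", "TEXT", "CODED")

for code, description in [("ASP", "Asphalt"),
                          ("CON", "Concrete"),
                          ("GRV", "Gravel")]:
    arcpy.AddCodedValueToDomain_management(workspace, "RoadSurface",
                                           code, description)
```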

I guess I'd first look at the Data Interop tool.

Since it comes from Esri (yes, I know, it's a 3rd party tool but with Esri's blessing), it's more likely to carry all the required hidden SDE stuff down at the database level.

OTOH, since SDE is basically an abstraction layer (I think that's fair to say?), the Red Gate (and similar) tools should pick up any and all changes down in the dB schema and take care of everything.

Best of luck.
