Esri provides a concept paper on architecture best practices. One of its architectural recommendations is to deploy ArcGIS Enterprise with “Environment Isolation”. In the white paper, the concept is defined as: “Isolating computing environments is an approach to maintaining system reliability and availability. This approach involves creating separate systems for production, testing, and development activities. Environment isolation reduces risk and protects operational systems from unintentional changes that negatively impact the business.”
In architectural conversations with customers, especially those with large-scale deployments who manage systems of record and systems of engagement, the topic of environment isolation comes up quite often. The questions include:
We need to address all of these concerns and questions by designing an automation workflow. Generally, the best practices (and security/infrastructure constraints) around environment isolation are:
Now let’s consider a simple use case scenario in which an application needs to be promoted in an automated way:
Let’s decompose this application into all of its subcomponents, as shown in Figure 1:
Figure 1: Application subcomponents
To promote this application successfully from the development to the production environment, all subcomponents (from all tiers) should be taken into account. If an organization wants to automate this process, it needs to automate every tier. Let’s go through the implementation workflow:
Figure 2: Automated Content promotion scenario
Once all requirements are implemented in the lower environment, the data model can be delivered in file geodatabase format, which then becomes the data exchange format between all environments. ArcPy can be used to inject the data model from this single file geodatabase source into the enterprise geodatabases of the other environments, ensuring that the data stays in sync and that there is no discrepancy due to manual intervention. The Data Management toolbox in ArcPy can be used for this scenario. Some command examples are listed below, followed by a sketch of how they might be wired together:
arcpy.management.CopyFeatures
arcpy.management.AddIndex
arcpy.management.AddRelate
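As a minimal sketch of how these tools might be wired together (the geodatabase path, connection file, feature class name, and field name are hypothetical):
import arcpy
import os
# Hypothetical paths: the exchange file geodatabase carrying the data model
# and an .sde connection file for the target enterprise geodatabase.
source_gdb = r"C:\install\data_model.gdb"
target_sde = r"C:\install\staging_data_owner.sde"
# Copy every feature class from the file geodatabase into the target
# environment's enterprise geodatabase.
arcpy.env.workspace = source_gdb
for fc in arcpy.ListFeatureClasses():
    arcpy.management.CopyFeatures(fc, os.path.join(target_sde, fc))
# Recreate an attribute index the same way in every environment
# (feature class "Infrastructure" and field "FacilityID" are hypothetical).
arcpy.management.AddIndex(os.path.join(target_sde, "Infrastructure"),
                          ["FacilityID"], "FacilityID_IDX", "UNIQUE")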
The ArcGIS Server sites should be configured in the following way:
A staging database named staging_database, owned by the database user staging_data_owner.
An SD file is created from ArcGIS Pro using the same database connection as above.
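The staging connection file itself can also be created by script. A minimal sketch, assuming a SQL Server database on a hypothetical host (the platform, host name, and password are assumptions):
import arcpy
# Hypothetical host and password; the user matches the staging data owner above.
arcpy.management.CreateDatabaseConnection(
    out_folder_path=r"C:\install",
    out_name="staging_data_owner.sde",
    database_platform="SQL_SERVER",
    instance="staging-db.example.com",
    account_authentication="DATABASE_AUTH",
    username="staging_data_owner",
    password="********",
    save_user_pass="SAVE_USERNAME",
    database="staging_database")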
The following ArcPy command can be used for publishing:
import arcpy
# Service definition file created in ArcGIS Pro
inSdFile = r"C:\install\socal_fire_infra.sd"
# ArcGIS Server connection file for the staging site
inServer = r"C:\install\staging_connection.ags"
# Run UploadServiceDefinition
arcpy.server.UploadServiceDefinition(inSdFile, inServer)
UploadServiceDefinition takes two input parameters: the service definition file created with ArcGIS Pro and the ArcGIS Server connection file.
At the end, it publishes the service to the root folder of the ArcGIS Server site in the staging environment:
The service is published with its data referenced from the staging database:
On the production site, the publisher database connection should be configured with the credentials of the staging data owner, and the server database connection should be configured with the credentials of the production data owner.
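Registering the database with the production ArcGIS Server site can also be scripted. A minimal sketch using arcpy.AddDataStoreItem (the .ags and .sde file names are hypothetical):
import arcpy
# Server connection uses the production data owner, publisher connection uses
# the staging data owner, so datasets get replaced at publishing time.
arcpy.AddDataStoreItem(r"C:\install\prod_connection.ags",     # ArcGIS Server connection file
                       "DATABASE",                            # data store type
                       "fire_infra_db",                       # registered data store name
                       r"C:\install\prod_data_owner.sde",     # server database connection
                       r"C:\install\staging_data_owner.sde")  # publisher database connection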
Now, by executing the same ArcPy code against the production server connection, the service is published in the production environment with its dataset replaced: the staging credentials are swapped for the production ones at publishing time. The next step is to promote the web map that consumes these services from the staging portal to the production portal, using the clone_items method of the ArcGIS API for Python:
from arcgis.gis import GIS

# Staging (source) and production (target) portal connections
sourcePortal='https://staging.esri.com/portal'
sourceUserName='portaladmin'
sourcePassWord='Source2018'
targetPortal='https://prod.esri.com/portal'
targetUserName='portaladmin'
targetPassWord='Target2018'
source = GIS(sourcePortal, sourceUserName, sourcePassWord, verify_cert=False)
print("Connected to Source Portal " + sourcePortal + " as " + sourceUserName)
target = GIS(targetPortal, targetUserName, targetPassWord, verify_cert=False)
print("Connected to Target Portal " + targetPortal + " as " + targetUserName)

# Map the dependent service item ID from staging to its production counterpart
item_mapping = {}
item_mapping['d4038035da9c4850980d0eec723d5fc1'] = '6481d82b0597403ab31c2a68487e473d'

# Clone the staging web map into the production portal, keeping its item ID
source_item = source.content.get('d5de71d8fc69496c8359fde9b21b0a3f')
target_item = target.content.clone_items(items=[source_item],
                                         owner=targetUserName,
                                         item_mapping=item_mapping,
                                         copy_data=False,
                                         search_existing_items=True,
                                         preserve_item_id=True)
print('Item is cloned with ID: ' + target_item[0].id)
We need three item IDs to make this happen: the item ID of the staging web map (passed to content.get) and the staging and production item IDs of the dependent service (used in item_mapping).
With the option preserve_item_id=True, the web map in the target portal is created with the same item ID as in the source environment (preserving item IDs is supported when the target is an ArcGIS Enterprise portal, as it is here). The same clone_items approach is then used to promote the web mapping application itself:
from arcgis.gis import GIS

# Staging (source) and production (target) portal connections
sourcePortal='https://staging.esri.com/portal'
sourceUserName='portaladmin'
sourcePassWord='Source2018'
targetPortal='https://prod.esri.com/portal'
targetUserName='portaladmin'
targetPassWord='Target2018'
source = GIS(sourcePortal, sourceUserName, sourcePassWord, verify_cert=False)
print("Connected to Source Portal " + sourcePortal + " as " + sourceUserName)
target = GIS(targetPortal, targetUserName, targetPassWord, verify_cert=False)
print("Connected to Target Portal " + targetPortal + " as " + targetUserName)

# Map dependent item IDs from staging to their production counterparts
item_mapping = {}
item_mapping['d4038035da9c4850980d0eec723d5fc1'] = '6481d82b0597403ab31c2a68487e473d'

# Clone the staging web mapping application into the production portal
source_item = source.content.get('1e2d62518d8940dcaf5a077cf8f4e6db')
target_item = target.content.clone_items(items=[source_item],
                                         owner=targetUserName,
                                         item_mapping=item_mapping,
                                         copy_data=False,
                                         search_existing_items=True)
print('Item is cloned with ID: ' + target_item[0].id)
Again, we need three item IDs to make this happen: the item ID of the staging web mapping application (passed to content.get) and the item_mapping entries for its dependent items.
Congratulations, you have successfully promoted the web mapping application from staging to production using an automation script, and the application is fully working, with all of its widgets, in the production environment:
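Putting it all together, the individual steps above could be chained into a single promotion script. A minimal sketch that reuses the hypothetical paths, URLs, and item IDs from the examples above:
import arcpy
from arcgis.gis import GIS

def promote(sd_file, server_connection, source_portal, target_portal,
            source_login, target_login, webmap_id, app_id, item_mapping):
    # 1. Publish the service definition against the target ArcGIS Server site.
    arcpy.server.UploadServiceDefinition(sd_file, server_connection)
    # 2. Clone the web map (preserving its item ID) and then the application.
    source = GIS(source_portal, *source_login, verify_cert=False)
    target = GIS(target_portal, *target_login, verify_cert=False)
    for item_id, keep_id in [(webmap_id, True), (app_id, False)]:
        item = source.content.get(item_id)
        target.content.clone_items(items=[item],
                                   item_mapping=item_mapping,
                                   copy_data=False,
                                   search_existing_items=True,
                                   preserve_item_id=keep_id)

promote(r"C:\install\socal_fire_infra.sd", r"C:\install\prod_connection.ags",
        'https://staging.esri.com/portal', 'https://prod.esri.com/portal',
        ('portaladmin', 'Source2018'), ('portaladmin', 'Target2018'),
        'd5de71d8fc69496c8359fde9b21b0a3f', '1e2d62518d8940dcaf5a077cf8f4e6db',
        {'d4038035da9c4850980d0eec723d5fc1': '6481d82b0597403ab31c2a68487e473d'})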
Takeaways:
In this blog, the focus is on web mapping applications. As we know, the future is around ArcGIS Experience Builder, so the next blog in this automation series will focus on Experience Builder.