
Comparing ArcGIS Enterprise Systems

08-26-2022 11:03 AM
DannyKrouk
Esri Contributor

Introduction

GIS Enterprise Reporter now has a companion application “er_compare.exe” that allows you to compare two ArcGIS Enterprise systems.

This is the fourth article in the GIS Enterprise Reporter series.  If you are not familiar with the series, this is the first article: Introducing the GIS Enterprise Reporter

Why is Comparison Useful?

There are many reasons why you might want to compare two ArcGIS Enterprise systems.  GIS Enterprise Reporter envisions the following motivations:

  1. You are setting up a Disaster Recovery site.  In this case, you may wish to confirm that the two deployments and their configurations are sufficiently compatible to allow you to use the webgisdr tool to synchronize the content.
  2. You have a Production environment and one or more lower environments.  In this case, it may be useful to understand the nature and extent of the differences between the Production system and the other(s).

How to Compare

Comparison is a three-step process:

  1. Use GIS Enterprise Reporter to generate outputs (“admin” and/or “content” at the initial release of this tool) for the two systems you wish to compare.
  2. Run “er_compare.exe” against those output files.
  3. Open the resulting Excel document and review its comparison notes.

Step #1, the use of the GIS Enterprise Reporter tool, has been described in other blog articles in this series.  Please refer to those if you would like more guidance on how to generate outputs for an ArcGIS Enterprise system.

The remainder of this blog article will focus on steps #2 and #3.

Run "er_compare.exe"

The application can be run by double-clicking on the er_compare.exe executable.  When run in this way, you may briefly see a command window appear.  It will be replaced with a graphical user interface as shown here.

DannyKrouk_0-1661536544198.png

Navigate to the two output files that you wish to compare from the two systems; those will be your “File One” and “File Two”.  You can compare the “admin” outputs of two systems to see how the administrative configurations and services inventories compare.  Or, you can compare the “content” outputs of two systems to see the differences/similarities between the Portal content.

“File One” is typically the “source” system, the one that you consider to be the standard against which you want to measure the other system.  For example, if you have a primary data center and a stand-by data center, the primary data center would be “File One”.  Or, if you have a Production system and a Pre-Production system, Production would be “File One”.

The application will suggest an output file location and name based on your inputs.  But, you may name it and place it wherever you like.  Then, click the “Compare” button.

The tool will report its progress in the lower status pane of the application.  When it is complete, it will change the “Compare” button into a “Dismiss” button.

Note that you can also run the tool on the command line.  If you do so, provide three space-delimited arguments for “File One”, “File Two”, and “Output File”, and the application will do the same thing as it does from the graphical user interface.
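For example, a comparison run could be scripted as part of a larger workflow.  The sketch below shows one way to call the executable from Python; the file paths are hypothetical and simply stand in for your own GIS Enterprise Reporter outputs and your chosen output location.

```python
# A minimal sketch: run er_compare.exe with the three space-delimited
# arguments described above.  All paths are hypothetical.
import subprocess

subprocess.run(
    [
        r"C:\tools\er_compare.exe",          # path to the executable (adjust to your install)
        r"C:\reports\prod_admin.xlsx",       # "File One" - the source/standard system
        r"C:\reports\standby_admin.xlsx",    # "File Two" - the system compared against it
        r"C:\reports\prod_vs_standby.xlsx",  # "Output File" - where the comparison is written
    ],
    check=True,  # raise an error if the tool exits with a non-zero code
)
```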

Review the Comparison Excel Document

The application accepts Excel inputs and creates a new Excel output.  The output Excel will have one sheet for each pair of input file sheets that it attempted to compare.  If the input files do not have the same number or kind of sheets, the output will note the “missing” sheets from “File Two” relative to “File One”.  Note that the application does a logical comparison of sheets based on what they represent, not their literal names.  It uses the GIS Enterprise Reporter “site number” to know, in the case of multiple federated ArcGIS Server Sites, which sites are logically comparable.
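If you want a quick look at which sheets the two inputs contain before running a comparison, you can list them with a few lines of Python.  This sketch assumes the outputs are ordinary .xlsx files and uses hypothetical file names.

```python
# A minimal sketch: list the sheet names in the two GIS Enterprise Reporter
# outputs you plan to compare.  File names are hypothetical.
from openpyxl import load_workbook

for path in (r"C:\reports\prod_admin.xlsx", r"C:\reports\standby_admin.xlsx"):
    wb = load_workbook(path, read_only=True)
    print(path, "->", wb.sheetnames)
```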

The first sheet is always “Comparison_Metadata”.  It has the structure shown below:

DannyKrouk_1-1661536622712.png

An "Admin" Comparison Example

The remaining sheets will be named for the input sheets that they compare.  For example, “portalAdmin” compares the Admin API sheet from each Portal.  An example of a comparison output for it appears here:

DannyKrouk_2-1661536659045.png

The column layout is the same for all sheets.  The columns are:

DifferenceSeverity: The likely relative importance of the difference.

DifferenceComment: A brief description of what is found to be different.

TypeOfSource: Somewhat redundant with the name of the sheet.

RecordOne: The information found in “File One”.

RecordTwo: The compared information found in “File Two”.

ProcessingErrorInformation: If the application is not able to make the comparison it intends for some reason, this column may contain additional information about why.

The portalAdmin example above shows that the tool has identified one “Error” and three “Warnings” (the rows in all sheets are color-coded by these classifications). 

The Error shown here is that there are a different number of redirectURIs for the “arcgisonline” appId.  These appIds are used as part of the ArcGIS Enterprise OAuth security implementation.  Note that the application is not comparing the values; the values across two systems can be different.  Instead, it is comparing the number of redirectURIs.  It does not know how many there should be.  But, it assumes that the “source system” (“File One”) has been configured according to your intent and that the other system should match it in the number of redirectURIs.

The first Warning is reporting on a difference in the number of licensed members between the two systems.  The second system may or may not have “enough” licensed members; that is for you to decide.  All this tool can do is note that there are different numbers.

The second and third Warnings are noting that the source system is configured for Windows users whereas the second system is configured for “Built-In” users.

These are just examples.  The tool does not report on any comparisons for which it does not find a difference of concern.  In other words, in perfectly matched systems, the comparison sheets will be empty. 
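If an output workbook has many sheets, it can be handy to summarize it programmatically.  The sketch below is not part of the tool; it simply reads the comparison output with pandas and counts rows by DifferenceSeverity, assuming the column headers match the names listed above and using the same hypothetical output path as earlier.

```python
# A minimal sketch: tally comparison rows by severity for each sheet in the
# er_compare.exe output.  The file name is hypothetical.
import pandas as pd

sheets = pd.read_excel(r"C:\reports\prod_vs_standby.xlsx", sheet_name=None)

for name, df in sheets.items():
    if name == "Comparison_Metadata" or df.empty:
        continue  # skip the metadata sheet and any sheet with no differences
    print(name)
    print(df["DifferenceSeverity"].value_counts().to_string(), "\n")
```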

The screen capture below shows another example: a comparison of the Server Admin API for matching federated Server Sites between the two ArcGIS Enterprise systems.

DannyKrouk_3-1661536716420.png

The first Warning is that the number of platform services between the two ArcGIS Server Sites is different.  It is not easy to see what the difference is in the “RecordOne” and “RecordTwo” cells.  But, if you were to study the JSON there carefully, you would see that the “source” system has a “Message Bus” platform service, in addition to a “Synchronization” and a “Compute Platform”.  The other system has only “Synchronization” and “Compute Platform”.  Likely, this indicates some difference in what has been installed or licensed for this Site between the compared systems.

The second Warning is that the “source” system supports HTTP_and_HTTPS whereas the other system supports only HTTPS.  This might be because the “source” system was upgraded from a prior release whereas the other system was a fresh installation.  Or, it might be an indication that the “source” system has the preferred configuration and the other one should be matched to it.

The third and fourth Warnings aggregate the memory and processor resources across all of the machines in the Sites and report differences.  Again, these differences may be intentional or not; the application cannot know.  But, a common source of problems is an inadvertent difference in machine resources.  Note that the tool sums the memory and CPU from all of the machines in the Site.  So, if you have different numbers of machines, these aggregates will have different values even if the per-machine resources are identical.

In the final row, there is an example of an “Information” difference.  The log settings (the “logLevel”) are different between the two systems.  This should not cause a problem for most use cases.  But, it may be relevant to know.
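To illustrate the resource aggregation behind the third and fourth Warnings, the sketch below sums memory and cores across hypothetical machine lists.  This is an illustration of the idea, not the tool's actual code; real values would come from the Server Admin API.

```python
# A minimal sketch of summing machine resources per Site (illustrative only;
# the machine lists and field names here are hypothetical).
site_one_machines = [{"memoryMB": 32768, "cores": 8}, {"memoryMB": 32768, "cores": 8}]
site_two_machines = [{"memoryMB": 32768, "cores": 8}]

def totals(machines):
    return sum(m["memoryMB"] for m in machines), sum(m["cores"] for m in machines)

# Identical per-machine resources, but different machine counts, so the
# aggregates differ: (65536, 16) versus (32768, 8).
print(totals(site_one_machines))
print(totals(site_two_machines))
```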

The final example here shows the “Information” comparisons of the service “analytics”.

DannyKrouk_4-1661536744132.png

In this case, the application is reporting that the numbers of services of various kinds, their instancing, and so on match between the compared systems. 

A "Content" Comparison Example

The sheets in the “content” comparison output are likely easier to understand.  Below is an example which shows the sheets you can expect and an example of one of the sheets.

DannyKrouk_5-1661536799672.png

As you might expect, the “Users” sheet will report on any users in “File One” that are not in “File Two” … and vice versa.  The same is true for “Groups” and the other sheets.
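Conceptually, this is a set difference.  The sketch below illustrates the idea with hypothetical user names; it is not the tool's code.

```python
# A minimal sketch of the "in File One but not File Two ... and vice versa"
# comparison, using hypothetical user names.
file_one_users = {"jsmith", "asanchez", "gis_admin"}
file_two_users = {"jsmith", "gis_admin"}

print("In File One but not File Two:", sorted(file_one_users - file_two_users))
print("In File Two but not File One:", sorted(file_two_users - file_one_users))
```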

The InventoryAnalytics detail shown here is comparing the aggregate counts and sums of the content in each system.  In the case of Production and Pre-Production environments, this can be useful to get a sense of “how different” the content is.  And, in the case of a Disaster Recovery content synchronization, this can be useful to confirm that all of the content is the same between the two systems after the synchronization.

Understanding Comparison

The preceding examples of the sheets and comparison rows are a small sample of what the application does.  You should bear in mind that the application does not compare everything.  It makes only those comparisons whose differences are likely to be meaningful or interesting in some way. 

The application does not, for example, compare the names of the machines between systems.  However, it will report if the count of machines is different.  And, for similar reasons, it does not compare the subjects of self-signed certificates when comparing certificates.  Rather, it matches the serial numbers of all the non-self-signed certificates on the assumption that the certificates that have been imported in one system should also be in the other system.

You should not expect the application to catch all meaningful differences.  The comparison logic is based upon the real-life experiences of Esri Professional Services in helping customers with “Enterprise Migration”, “Disaster Recovery”, “Content Promotion”, “Lower Environment Usage Protocols”, etc.  So, the body of knowledge changes over time based on experience.  At this time the comparison logic is not documented.  If you have an interest in a particular type of comparison, feel free to send an email inquiry to dkrouk@esri.com.  I can report on the nature of the comparison that is currently implemented in the application.  And, if an additional comparison is appropriate, I can plan a new comparison feature.

You should also expect “false positives”: differences that are reported but are not important to you.  You should think of all differences, even “Error” differences, as potentially significant, but not necessarily significant.  You must use your judgement to determine whether a difference is meaningful to your situation.

The output rows have classifications like “Error”, “Warning”, and “Information” in the normal cases.  However, you may also see classifications like “ProcessingConcern”, “ProcessingWarning”, or “ProcessingError”.  These indicate either insufficient input information for the tool to carry out its intended comparison or an internal error in executing its comparison logic.  In the former case, this typically indicates a difference between your systems or differences in the parameters used when you ran GIS Enterprise Reporter.  In the latter case, an internal error, the ProcessingErrorInformation column will have information about the internal error.  This is, essentially, a log of sorts that can be used to help me troubleshoot the internal error.  So, when you see this kind of issue, please bring it to my attention so I can troubleshoot: dkrouk@esri.com.

What to do with the Comparison

The best practice pattern with comparison is to do it at least twice:

  1. The first comparison is to discover the differences and decide whether and what to do about them.
  2. The second comparison is to confirm that the differences you chose to address were, in fact, successfully addressed.

Final Thought

Hopefully, this tool helps you in your ArcGIS Enterprise administration work.  If you have questions or ideas for improvement, please be in contact: dkrouk@esri.com.