Implementing ArcGIS Blog


Latest Activity (149 Posts)
New Contributor III

In this entry, we will look at what a deployment looks like from the infrastructure as code (IaC) perspective with Terraform, as well as from the configuration management side with PowerShell DSC (Desired State Configuration). Both play important roles in automating ArcGIS Enterprise deployments, so let's jump in.

This deployment will follow a single machine model as described in the ArcGIS Enterprise documentation. It will consist of the following.

  • Portal for ArcGIS 10.7.1

  • ArcGIS Server 10.7.1 (Set as Hosting Server)

    • Services Directory is disabled

  • ArcGIS Data Store 10.7.1 (Relational Storage)

  • Two (2) Web Adaptors within IIS

    • Portal context: portal

    • Server context: hosted

    • Self-Signed certificate matching the public DNS

Additional Configurations

  • WebGISDR is configured to perform weekly full backups that are stored within Azure Blob Storage via Task Scheduler (a sample task registration is sketched below)

  • Virtual machine is configured for nightly backups to an Azure Recovery Services Vault

  • RDP (3389) access is restricted via Network Security Group to the public IP of the machine from which Terraform is run

  • Internet access (80, 443) is configured for ArcGIS Enterprise via Network Security Group

  • Azure Anti-malware is configured for the virtual machine

The complete code and configurations can be found attached below. However, you will need to provide your own ArcGIS Enterprise licenses.
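As a point of reference, the weekly WebGISDR task mentioned above could be registered with PowerShell along these lines. This is a minimal sketch only: the install path is the Portal for ArcGIS default, the properties file path, schedule, and credentials are illustrative placeholders, and the properties file is assumed to already point at the webgisdr blob container created by the template.

# Minimal sketch: register a weekly WebGISDR full export as a scheduled task.
$action  = New-ScheduledTaskAction `
    -Execute "C:\Program Files\ArcGIS\Portal\tools\webgisdr\webgisdr.bat" `
    -Argument "--export --file C:\arcgis\webgisdr.properties"

# Run the full export once a week, early Sunday morning
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 2am

# Placeholder service account; use the account configured for your deployment
Register-ScheduledTask -TaskName "WebGISDR Weekly Full Backup" `
    -Action $action -Trigger $trigger `
    -User "DOMAIN\serviceaccount" -Password "placeholder"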

Note:   This is post two (2) in a series on engineering with ArcGIS.

Infrastructure Deployment

If you are not already familiar with Terraform and how it can be used to efficiently handle the lifecycle of your infrastructure, I would recommend taking the time to read through the first entry in this series which can be found here. The Terraform code in that first entry will be used as the basis for the work that will be done in this posting.

As discussed in the first entry, Terraform is a tool designed to help manage the lifecycle of your infrastructure. Rather than rehash the benefits of Terraform, however, we will jump straight into the code and review what is being done. As mentioned above, the template from the first entry in this series is used again here, with additional code added to perform the specific actions needed for configuring ArcGIS Enterprise. Let's take a look at those additions.

These additions create two blob containers: one ("artifacts") used for uploading deployment resources, and an empty one ("webgisdr") used when configuring WebGISDR backups. They also upload the license files, the PowerShell DSC archive, and lastly, the web adaptor installer.

resource "azurerm_storage_container" "artifacts" {
name = "${var.deployInfo["projectName"]}${var.deployInfo["environment"]}-deployment"
resource_group_name = "${azurerm_resource_group.rg.name}"
storage_account_name = "${azurerm_storage_account.storage.name}"
container_access_type = "private"
}

resource "azurerm_storage_container" "webgisdr" {
name = "webgisdr"
resource_group_name = "${azurerm_resource_group.rg.name}"
storage_account_name = "${azurerm_storage_account.storage.name}"
container_access_type = "private"
}

resource "azurerm_storage_blob" "serverLicense" {
name = "${var.deployInfo["serverLicenseFileName"]}"
resource_group_name = "${azurerm_resource_group.rg.name}"
storage_account_name = "${azurerm_storage_account.storage.name}"
storage_container_name = "${azurerm_storage_container.artifacts.name}"
type = "block"
source = "./${var.deployInfo["serverLicenseFileName"]}"
}

resource "azurerm_storage_blob" "portalLicense" {
name = "${var.deployInfo["portalLicenseFileName"]}"
resource_group_name = "${azurerm_resource_group.rg.name}"
storage_account_name = "${azurerm_storage_account.storage.name}"
storage_container_name = "${azurerm_storage_container.artifacts.name}"
type = "block"
source = "./${var.deployInfo["portalLicenseFileName"]}"
}

resource "azurerm_storage_blob" "dscResources" {
name = "dsc.zip"
resource_group_name = "${azurerm_resource_group.rg.name}"
storage_account_name = "${azurerm_storage_account.storage.name}"
storage_container_name = "${azurerm_storage_container.artifacts.name}"
type = "block"
source = "./dsc.zip"
}

resource "azurerm_storage_blob" "webAdaptorInstaller" {
name = "${var.deployInfo["marketplaceImageVersion"]}-iiswebadaptor.exe"
resource_group_name = "${azurerm_resource_group.rg.name}"
storage_account_name = "${azurerm_storage_account.storage.name}"
storage_container_name = "${azurerm_storage_container.artifacts.name}"
type = "block"
source = "./${var.deployInfo["marketplaceImageVersion"]}-iiswebadaptor.exe"
}
‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍

This addition generates a short-lived SAS token from the storage account, which is then used during the configuration management portion to securely pull the needed files from storage. We could simplify the deployment by marking our containers as public and forgoing a token, but that is not recommended.

data "azurerm_storage_account_sas" "token" {
connection_string = "${azurerm_storage_account.storage.primary_connection_string}"
https_only = true
start = "${timestamp()}"
expiry = "${timeadd(timestamp(), "5h")}"

resource_types {
service = false
container = false
object = true
}

services {
blob = true
queue = false
table = false
file = false
}

permissions {
read = true
write = true
delete = true
list = true
add = true
create = true
update = true
process = true
}
}
‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍

The final change is the addition of an extension to the virtual machine that will handle the configuration management task using PowerShell DSC. Instead of reviewing this in-depth here, just know that the data under the settings and protected_settings JSON blocks is passed to PowerShell DSC as parameters for use as needed by the configuration file.

resource "azurerm_virtual_machine_extension" "arcgisEnterprise-dsc" {
name = "dsc"
location = "${azurerm_resource_group.rg.location}"
resource_group_name = "${azurerm_resource_group.rg.name}"
virtual_machine_name = "${element(azurerm_virtual_machine.arcgisEnterprise.*.name, count.index)}"
publisher = "Microsoft.Powershell"
type = "DSC"
type_handler_version = "2.9"
auto_upgrade_minor_version = true
count = "${var.arcgisEnterpriseSpecs["count"]}"

settings = <<SETTINGS
{
"configuration": {
"url": "${azurerm_storage_blob.dscResources.url}${data.azurerm_storage_account_sas.token.sas}",
"function": "enterprise",
"script": "enterprise.ps1"
},
"configurationArguments": {
"webAdaptorUrl": "${azurerm_storage_blob.webAdaptorInstaller.url}${data.azurerm_storage_account_sas.token.sas}",
"serverLicenseUrl": "${azurerm_storage_blob.serverLicense.url}${data.azurerm_storage_account_sas.token.sas}",
"portalLicenseUrl": "${azurerm_storage_blob.portalLicense.url}${data.azurerm_storage_account_sas.token.sas}",
"externalDNS": "${azurerm_public_ip.arcgisEnterprise.fqdn}",
"arcgisVersion" : "${var.deployInfo["marketplaceImageVersion"]}",
"BlobStorageAccountName": "${azurerm_storage_account.storage.name}",
"BlobContainerName": "${azurerm_storage_container.webgisdr.name}",
"BlobStorageKey": "${azurerm_storage_account.storage.primary_access_key}"
}
}
SETTINGS
protected_settings = <<PROTECTED_SETTINGS
{
"configurationArguments": {
"serviceAccountCredential": {
"username": "${var.deployInfo["serviceAccountUsername"]}",
"password": "${var.deployInfo["serviceAccountPassword"]}"
},
"arcgisAdminCredential": {
"username": "${var.deployInfo["arcgisAdminUsername"]}",
"password": "${var.deployInfo["arcgisAdminPassword"]}"
}
}
}
PROTECTED_SETTINGS
}‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍‍

Configuration Management

As was touched on above, we are utilizing PowerShell DSC (Desired State Configuration) to handle the configuration of ArcGIS Enterprise as well as a few other tasks on the instance. To simplify things, I have included v2.1 of the ArcGIS module within the archive, but the public repo can be found here. The ArcGIS module provides a means with which to interact with ArcGIS Enterprise in a controlled manner by providing various "resources" that perform specific tasks. One of the major benefits of PowerShell DSC is that it is idempotent: we can continually run our configuration, and nothing will be modified if the system already matches our code. This gives administrators the ability to push changes and updates without altering existing resources, as well as to detect configuration drift over time.
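To make that flow concrete, below is a minimal skeleton of what such a configuration looks like. This is not the full enterprise.ps1 from the attached archive: the parameters shown mirror only a subset of the configurationArguments passed by the Terraform extension above (note that the configuration name matches the "function" value in its settings), and the node body is just a placeholder.

Configuration enterprise
{
    param(
        [Parameter(Mandatory = $true)][System.String]$webAdaptorUrl,
        [Parameter(Mandatory = $true)][System.String]$serverLicenseUrl,
        [Parameter(Mandatory = $true)][System.String]$portalLicenseUrl,
        [Parameter(Mandatory = $true)][System.String]$externalDNS,
        [Parameter(Mandatory = $true)][System.Management.Automation.PSCredential]$serviceAccountCredential,
        [Parameter(Mandatory = $true)][System.Management.Automation.PSCredential]$arcgisAdminCredential
    )

    # v2.1 of the ArcGIS module is bundled in the attached archive
    Import-DscResource -ModuleName ArcGIS

    Node localhost
    {
        # ArcGIS_Install, ArcGIS_Server, ArcGIS_Portal, ArcGIS_DataStore, etc.
        # are declared here; because DSC is idempotent, each resource acts only
        # when the machine's state deviates from what is declared.
    }
}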

To highlight the use of one of these resources, let's take a quick look at the ArcGIS_Portal resource, which is designed to configure a new site without having to do so manually through the typical browser-based workflow. In this deployment, our ArcGIS_Portal resource looks exactly like the code below. The resource specifies the parameters that must be provided to successfully configure the portal site and will error out if any required parameter is missing.

ArcGIS_Portal arcgisPortal {
    PortalEndPoint             = (Get-FQDN $env:COMPUTERNAME)
    PortalContext              = 'portal'
    ExternalDNSName            = $externalDNS
    Ensure                     = 'Present'
    PortalAdministrator        = $arcgisAdminCredential
    AdminEMail                 = 'example@esri.com'
    AdminSecurityQuestionIndex = '12'
    AdminSecurityAnswer        = 'none'
    ContentDirectoryLocation   = $portalContentLocation
    LicenseFilePath            = (Join-Path $(Get-Location).Path (Get-FileNameFromUrl $portalLicenseUrl))
    DependsOn                  = $Depends
}
$Depends += '[ArcGIS_Portal]arcgisPortal'

Because of the scope of what is being done within the configuration script here, we will not be doing a deep dive. This will come in a later article.

Putting it together

With the changes to the Terraform template in place, and with a very high-level overview of PowerShell DSC and its purpose, we can deploy the environment using the same commands mentioned in the first entry in the series. Within the terminal of your choosing, navigate into the extracted archive that contains your licenses, template, and DSC archive, and start by initializing Terraform with the following command.

terraform init
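
If you would like to preview the changes Terraform will make before applying them, you can optionally run the following first.

terraform plan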

Next, you can run the following to start the deployment process. Keep in mind that we are not only deploying the infrastructure but also configuring ArcGIS Enterprise, so the time to completion will vary. When it completes, it will output the public-facing URL used to access your ArcGIS Enterprise portal.

terraform apply
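
Should you need the deployment's outputs again later (assuming the template exposes the URL as an output value), Terraform can re-display them at any time without re-running the deployment.

terraform output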


Summary

As you can quickly see, by removing the manual aspects of deploying infrastructure and configuring software, a large portion of problems can be mitigated by moving toward IaC and configuration management. There are many solutions out there that handle both aspects, and these are just two options. Explore what works for you and start moving toward a more DevOps-centric approach.

I hope you find this helpful; do not hesitate to post your questions here: Engineering ArcGIS Series: Tools of an Engineer

 

Note: The contents presented above are examples and should be reviewed and modified as needed for each specific environment.

Esri Regular Contributor

If you're headed to the 2019 Esri International User Conference and are interested in sessions for GIS Managers, here is a link to the GIS Manager Track:

https://userconference2019.schedule.esri.com/schedule?filters=1964269831

Esri Regular Contributor

Are you a GIS manager, leader or other executive headed to the 2019 Esri International User Conference (UC)?  I know it can be a challenge creating your personal agenda for the world's largest GIS conference, so I created this flier to assist.  It covers suggested events and activities you should consider when deciding how to spend your valuable time at UC.  I hope you have a productive UC experience, and I hope to see you there!

UPDATED 6/24/2019 - Added GIS Manager Track and Get Advice from Esri Services section.

UPDATED 6/13/2019 - Corrected the name of the Implementing ArcGIS area in the Expo to “Guiding your Geospatial Journey”

FYI there are other Esri UC fliers here: https://community.esri.com/community/events/user-conference/content?filterID=contentstatus%5Bpublish...

Esri Contributor

What is System Log Parser?

System Log Parser is an ArcGIS for Server (10.1+) log query and analysis tool that helps you quickly quantify the "GIS" in your deployment. When run, it connects to an ArcGIS for Server instance on port 6080/6443/443 as a publisher (or an administrator), retrieves the logs for a time duration (specified as an input), analyzes the information, and then produces a spreadsheet version of the data that summarizes the service statistics. The command-line version of System Log Parser (slp.exe) is used by ArcGIS Monitor for data capture.

System Log Parser supports the following service types:

  • Feature Services
  • Geoprocessing Services
  • Network Analyst Services
  • Geocode Services
  • KML Services
  • Stream Services
  • GeoData Services
  • Map Services
  • Workflow Manager Services
  • Geometry Services
  • Image Services
  • Globe Services
  • Mobile Services

System Log Parser (https://arcg.is/0XLnfb), a free-standing application or Add-on for ArcGIS Monitor, is an effective tool for diagnosing and reviewing infrastructure functionality.

Getting Started

In this section, we'll configure ArcGIS Server to collect logs at the level needed for the tool and set up System Log Parser to generate a report (MS Excel).

1.   Ensure the following conditions are met on the machine you’ll be running System Log Parser from:

  1. 64-bit operating system: Windows 7 (64-bit), Windows 8.x, or Windows 10; Windows Server 2008 (64-bit), Windows Server 2012, or Windows Server 2016
  2. RAM: 4 GB
  3. Microsoft .NET Framework 4.5 or 4.6
  4. Microsoft Excel 2010 or newer (or an appropriate .xlsx viewer)

2.   Set your ArcGIS Server logs to Fine on EACH server you'd like to get metrics on. Complete instructions on how to change ArcGIS Server log levels can be found here: Specify Server Log Settings

Note:   I recommend running the logging at FINE for AT LEAST one week prior to running System Log Parser. This should give you a fairly clear picture of a typical week's load.
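
If you have several servers to update, the log level can also be scripted against the ArcGIS Server Administrator API rather than set through Manager. Below is a minimal PowerShell sketch, assuming a site reachable directly on port 6443; the host name and credentials are placeholders.

# Minimal sketch: raise the ArcGIS Server log level to FINE via the Admin API
$server = "https://gisserver.example.com:6443"

# Generate an administrative token (placeholder credentials)
$token = (Invoke-RestMethod -Method Post -Uri "$server/arcgis/admin/generateToken" `
    -Body @{ username = "siteadmin"; password = "placeholder"; client = "requestip"; f = "json" }).token

# Read the current log settings so the other values can be passed back unchanged
$settings = (Invoke-RestMethod -Method Post -Uri "$server/arcgis/admin/logs/settings" `
    -Body @{ token = $token; f = "json" }).settings

# Push the settings back with logLevel set to FINE
Invoke-RestMethod -Method Post -Uri "$server/arcgis/admin/logs/settings/edit" `
    -Body @{ logLevel = "FINE"; logDir = $settings.logDir;
             maxErrorReportsCount = $settings.maxErrorReportsCount;
             maxLogFileAge = $settings.maxLogFileAge;
             token = $token; f = "json" }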

3.   Download System Log Parser here: https://arcg.is/0XLnfb

4.   Extract the .zip file.

Note:   This is BOTH the user interface and the Add-on for ArcGIS Monitor. We will be focused on the user interface version for this exercise.

5.   Launch System Log Parser:

  1. Browse to the location where you extracted System Log Parser.
  2. In the System Log Parser for ArcGIS folder, locate and launch SystemLogsGUI.exe.

System Log Parser GUI

Note:   You may be prompted that Windows has protected your PC. If you do get this prompt, please click More info and then click Run Anyway.


Configuring System Log Parser

The following outlines the configuration required to set up System Log Parser to analyze a week's worth of logs.

Note:   System Log Parser will automatically access logging for all clusters that are part of an ArcGIS Server Site. If you have multiple ArcGIS Server Sites configured, run System Log Parser against each Site separately.

Click the ArcGIS Server (Web) button to display the connection form, then fill it out as indicated below:

1.   Enter the Server URL.

  1. The typical syntax with ArcGIS Server 10.2 or higher is: https://<host_name>:<port_number>/arcgis  
  2. The typical syntax with ArcGIS Server 10.1 is: https://<host_name>:<port_number>/ArcGIS
Note:    If your URL structure is different, enter it.

2.   Enter the ArcGIS Server Manager user name with publisher or better permissions. 

3.   Enter the user's password.

4.   Check this box if you are accessing a Site federated to Portal for ArcGIS

Note:   Consider using a web adapter address for the Server URL:  https://<webadaptor_name>/server
Note:   If accessing over the internet, this assumes that the web adapter was registered with administrative access to ArcGIS Server

5.   Check this box if you use IWA (Integrated Windows Authentication).

6.   If needed, specify a token (advanced option).

7.   Select an End Time (Now)

8.   Select Start Time (1 week)

9.   Select Analysis Type (Complete)

  1. Simple: Provides only the Service Summary page data. 

    Note: This mode will also generate a list of the underlying data source by service and layer in the service. 

  2. WithOverviewCharts: Provides the Service Summary page plus charts of Request Count, Average Request Response Time, Instance Creation Time, Wait Time (Queue Time), and Max Request Response Time.

  3. Complete: Provides a Service Summary page plus all data and charts in separate tabs for all services.

  4. ErrorsOnly: Provides a report of just the errors.

  5. VerboseMode: Provides full verbose log analysis (limited to 12 hours).

10.   Select Report Type (Spreadsheet)

11.   Specify where to output the report (Default is your My Documents location)

 

Click Analyze Logs.

This process can take a few minutes or longer, depending on the number of transactions logged.

Review the System Log Parser report

 

When System Log Parser finishes running, it will open the report in Excel, if present. If you ran this from a machine without Microsoft Excel, move the report to a computer with Excel and open it there.

 

You will note that there is a Summary tab and several other tabs listed across the bottom of the spreadsheet. We'll cover each in further detail below, tab by tab.

 

Summary

When the Excel report opens, you will see the Summary tab, which shows which server the report was run against along with some summary statistics.

 


 

Statistics

On the bottom of the Excel report, select the Statistics tab to view a table of all services by layer and service type. This is where we'll spend most of our time. Please read the rest of this post, then click here.

 

Resources

On the bottom of the Excel report select the Resources tab to view several charts:

  • Top 20 Resources by Count
  • Top 20 Resources by Average Response Time
  • Top 20 Resources by Maximum Response Time

 

Methods

On the bottom of the Excel report select the Methods tab to view several charts:

  • Top 20 Methods by Count
  • Top 20 Methods by Average Response Time
  • Top 20 Methods by Maximum Response Time

 

Queue Time

On the bottom of the Excel report, select the Queue Time tab to view any services that had to wait for an ArcSOC to return a result. In an ideal setting, the values look like the sample below:

 

Queue Time Stats

 

Users

On the bottom of the Excel report select the Users tab to view a chart of the top 20 users by request count.

 

Time

On the bottom of the Excel report select the Time tab to view a chart of requests by day.

 

Throughput per Minute

On the bottom of the Excel report, select the Throughput per Minute tab to view a minute-by-minute breakdown of requests.

Below is a sample of what information can be found on the tab:

 

Throughput Per Minute

 

Elapsed Time of All Resources

On the bottom of the Excel report, select the Elapsed Time of All Resources tab to view a chronological listing of all requests from the time period covered by the System Log Parser report.

I'd also like to thank Aaron Lopez for his help and continued development of this invaluable tool.

Note: The contents presented above are recommendations that will typically improve performance for many scenarios. However, in some cases, these recommendations may not produce better performance results, in which case, additional performance testing and system configuration modifications may be needed.

I hope you find this helpful; do not hesitate to post your questions here: ArcGIS Architecture Series: Tools of an Architect

New Contributor III

In your organization there are likely different people, working in a variety of roles, with varying skills and responsibilities. It can be overwhelming to deliver the right content in the right format to these different people in a well-performing, reliable, and secure manner.

Your geospatial content publication strategy serves as a guide to help accomplish this. While any two organizations can have vastly different publication strategies, an effective content delivery strategy will always address performance, reliability, and security.

Performance

Think of performance as how long it takes an application to load: is it lightning fast, or crawling along? One way to address performance strategically is to separate internal and external activities. In practice, this could mean external public applications like StoryMaps live in a scalable environment such as ArcGIS Online, while internal dashboards, analytics, and editing work stays on your own infrastructure in ArcGIS Enterprise. This way, if one of those public-facing apps suddenly becomes popular, your internal workloads won't have to compete with it for resources.

Reliability

Reliability is expressed in a service level agreement (SLA), and is an expectation of when the system will be available, such as during work hours, or 99% of the time (a 99% SLA allows roughly 3.65 days of downtime per year, while 99.9% allows under nine hours). There are many ways in which organizations address reliability, such as following other best practices like high availability, load balancing, workload separation, and security. You could also address reliability by leveraging cloud capabilities.

Security

Within the context of a publication strategy, security is about exposing the right content and capabilities to the right people. You certainly don't want non-experts editing your asset information, or your sensitive data exposed publicly. This content should be properly maintained in a secure system of record. Security isn't just about keeping your internal content within your organization; it can also pertain to information or capabilities that are sensitive even between departments or teams within your organization. Depending on the level of risk and sensitivity of this content, it may be appropriate to have a separate, internal publication environment.

While your organization’s individual content publication strategy will likely encompass many other considerations that are relevant to your work, goals, and mission, it should always address the needs and expectations of the people in your organization and protect your internal system.

Download the PPT for this presentation from the 2018 Esri User's Conference: https://community.esri.com/docs/DOC-12080-content-publication-strategypdf  

Esri Contributor

Geodata engineering is focused on making your data work for your mission. Did you know that The Living Atlas is a way for you to use authoritative information others have created, including official data sources such as the US Census, NOAA, and USGS?  The experts in Esri's Geodata Services support the production of The Living Atlas imagery and community maps content, and can help you improve the quality of your data, too.  To learn more about the work behind The Living Atlas, see this article in XYHT, and meet members of #Geodata and Living Atlas at #EsriUC2018. 

New Contributor III

I'll be moderating a User Paper Session, Session 2250: "Your Decisions are Only as Good as Your Data", on Wed, Jul 11 - 8:30am - 9:30am  SDCC - Room 29 B. 

We'll have two interesting papers presented by BLM and San Jose Water. Nick Hall and Jason Frels of the BLM National Operations Center in Denver Colorado, will discuss the continued maturation of BLM's Data Quality and the Enterprise (eGIS) System. Mary McMahon of San Jose Water California will discuss how the Esri Water Utility Tools are used to improve their data quality checks workflows. I hope to see you there!

Esri Contributor

It really was my pleasure working with Cliff Sullivan and Mark Dickman at SACWD.  These two guys really knew how to make finding errors fun. With their positive attitudes and get-it-done mentality, they quickly gained a good grasp of using the data QC extension, ArcGIS Data Reviewer.

Look where their success got them featured – in the latest issue of ArcNews!

https://www.esri.com/about/newsroom/arcnews/performing-data-checks-keeps-water-running/ 

If you are interested in a quick review of your data with Data Reviewer, schedule a Data Health Check at this year's UC: go.esri.com/dhc-uc18

Esri Contributor

I just returned from a trip to Tucson Water where I helped Terri Bunting (GIS Supervisor) and Lorena Baltierrez (QA/QC Lead) implement ArcGIS Data Reviewer as part of their daily editing and quality control workflows.  This was probably one of the most successful business trips I've made during my 25+ years at Esri.

 

It all started out with a quick Data Health Check that I conducted on their water utility data a couple of years ago at the Esri UC. I used the ArcGIS Data Reviewer extension to configure several data checks and validated their water data. As with most utility users whose data is in a geometric network, I found the typical errors: duplicate features, disconnected lines and points, and required fields not being populated. These types of data issues affect network tracing results and the connection back to any third-party applications, like an asset management system.

After completing the data health check, Terri and Lorena were excited to take the recommendations I provided and implement Data Reviewer when they got back to the office. Their feedback regarding the session was, “This is a great addition to the user conference, thank you!”

 

After a couple of false starts due to existing staff workloads and not having extra time to ramp up and implement Data Reviewer, this year they looked into doing a 3-day Data Reviewer jumpstart workshop. How lucky was I that I got assigned to do this jumpstart with them?! Terri and Lorena were very excited too!

 

While onsite, I helped them configure the quality control checks that were appropriate for their data. One of the main goals for their GIS system is to be able to perform valve isolation tracing on their water utility data by the end of the year.  To achieve this goal, we prioritized the data checks that focused on feature connectivity.

 

What helped me most during this whole process was how well organized and prepared they were by providing a data editing guidelines document.  We went through the specifications to identify checks that needed to be created.  By doing so, we also found areas where their editing guidelines needed to be updated.  Besides doing QC on their data, they got the added bonus of QC’ing their guidelines too! In fact, Terri shared that Lorena is already teaching others how to include the new functionality to their editing workflows and feels their editors are confident in using Data Reviewer right away!

 

Not only was the implementation successful, these two wonderful ladies were also so very hospitable and really appreciative of my visit.  They made my job enjoyable and I felt like we bonded instantly which made my trip so great. I am excited and I can’t wait to hear about their progress as they begin utilizing the tools effectively to clean up their entire water data.

 

We, at Esri, strive hard to enable our customers to successfully implement and efficiently use their GIS. That’s why we are offering complimentary Data Health Checks at the upcoming 2018 Esri UC. Watch this video of me inviting you to sign up for a session.

Michelle Johnson

GIS Data QA Lead, Geodata Services

Esri Contributor

If you maintain any of the following datasets, you should take advantage of the GIS Data Health Check activity at the 2018 Esri User Conference!

  • Water, wastewater, sewer, or stormwater
  • Electric or gas
  • Roads and highways
  • Utility pipelines
  • Land records or addressing
  • 3D data

Meet with me or one of my esteemed colleagues, who will sit with you to review a sample of your data, explain the types of issues to look for, and identify what errors you may have in your data. Find out more here

Be sure to make your reservation as soon as you can; appointments fill up fast!  To reserve your spot, go here.  

See you at the UC!

Michelle Johnson

GIS Data QA Lead, Geodata Services
