Implementing ArcGIS Blog


Latest Activity

(180 Posts)
AaronLopez
Esri Contributor

One of the most popular ways to consume resources from an ArcGIS Enterprise Site is through hosted feature services. Their data source is managed by ArcGIS, they scale very well, and visualization is performed on the client.

This article contains a walk-through for load testing a hosted feature layer service with an Apache JMeter Test Plan. The examples use a publicly accessible dataset for the data generation and test configuration process, but any source using the WGS 1984 Web Mercator Auxiliary Sphere projected coordinate system (WKID: 3857) should also work. The article assumes you are familiar with Apache JMeter and with some of the Test Plan strategies we have been using in other discussions.
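For orientation, the requests such a Test Plan ultimately issues are standard query operations against the layer's REST endpoint. The PowerShell sketch below sends one such query using a Web Mercator (WKID 3857) envelope; the service URL and coordinates are placeholders, not the dataset used later in the article.

# Minimal sketch of a single feature layer query (placeholder URL and extent)
$layerUrl = "https://services.example.com/arcgis/rest/services/Roads/FeatureServer/0/query"

$params = @{
    f              = "json"
    geometry       = "-12468000,3997000,-12458000,4007000"   # xmin,ymin,xmax,ymax in meters
    geometryType   = "esriGeometryEnvelope"
    inSR           = "3857"
    spatialRel     = "esriSpatialRelIntersects"
    outFields      = "*"
    returnGeometry = "true"
}

$response = Invoke-RestMethod -Uri $layerUrl -Method Get -Body $params
Write-Host ("Features returned: " + $response.features.Count)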

Table Of Contents

  • Why Test a Hosted Feature Service?
  • Hosted Feature Service Testing Challenges
  • How to Test a Hosted Feature Service? 
    • The USGS Motor Vehicle Use Roads Dataset 
    • Test Data Generation 
      • Making the Tools Available from ArcGIS Pro 
      • Select an Area of Interest 
      • Run the Generate Query Extents Tool 
      • Adjust the Inputs for the Generate Query Extents Tool 
      • Validating the Generated Test Data 
  • The Hosted Feature Service Query Test Plan 
    • Components of the Test Plan 
      • Data Reader Logic 
      • Operation ID Selection Logic 
      • Operation Loop and Parameter Population 
      • HTTP Request 
  • The Thread Group Configuration 
  • Validating the Test Plan 
    • Transactions 
    • Requests 
  • Test Execution 
  • JMeter Report 
    • Throughput Curves 
    • Performance Curves 
  • Final Thoughts 

MichelleJohnson
Esri Contributor

We are excited to offer Data Health Check appointments at this year's User Conference. It's been a popular offering at UC for the past 10 years. Users get to meet with an Esri industry expert, who will assess the quality of their GIS data (geometry and attributes) for use cases such as preparing to migrate to the Utility Network or Parcel Fabric, diagnosing connectivity issues in a geometric network, reporting and submitting data quality issues, understanding inherited data, assessing readiness to migrate to the ArcGIS Indoors data model, and more.

Customers with water, wastewater, stormwater, electric, gas, linear referenced pipelines, addressing, land records, street centerlines, airports (airfield operations), and Indoors datasets can sign up for a free, 45-min virtual appointment. 

Learn more about this awesome offering here.

If you are ready to sign up, click here.

MichaelHatcher
New Contributor III

In this entry, we will be looking at what a deployment looks like from the infrastructure as code (IaC) perspective with Terraform as well as the configuration management side with PowerShell DSC (Desired State Configuration). Both play important roles in automating ArcGIS Enterprise deployments, so let's jump in.

This deployment will follow a single machine model as described in the ArcGIS Enterprise documentation. It will consist of the following.

  • Portal for ArcGIS 10.7.1

  • ArcGIS Server 10.7.1 (Set as Hosting Server)

    • Services Directory is disabled

  • ArcGIS Data Store 10.7.1 (Relational Storage)

  • Two (2) Web Adaptors within IIS

    • Portal context: portal

    • Server context: hosted

    • Self-Signed certificate matching the public DNS

Additional Configurations

  • WebGISDR is configured to perform weekly full backups that are stored within Azure Blob Storage via Task Scheduler

  • Virtual machine is configured for nightly backups to an Azure Recovery Services Vault

  • RDP (3389) access is restricted via Network Security Group to the public IP of the machine from which Terraform is run.

  • Internet access (80, 443) is configured for ArcGIS Enterprise via Network Security Group

  • Azure Anti-malware is configured for the virtual machine

The complete code and configurations can be found attached below; however, you will need to provide your own ArcGIS Enterprise licenses.

Note:   This is post two (2) in a series on engineering with ArcGIS.

Infrastructure Deployment

If you are not already familiar with Terraform and how it can be used to efficiently handle the lifecycle of your infrastructure, I would recommend taking the time to read through the first entry in this series which can be found here. The Terraform code in that first entry will be used as the basis for the work that will be done in this posting.

As discussed in the first entry, Terraform is a tool designed to help manage the lifecycle of your infrastructure. Instead of rehashing the benefits of Terraform this time however, we will jump straight into the code and review what is being done. As mentioned above, the template from the first entry in this series is used again here with additional code added to perform specific actions needed for configuring ArcGIS Enterprise. Let's take a look at those additions.

These additions handle the creation of two blob containers: one ("artifacts") used for uploading deployment resources, and an empty one ("webgisdr") that will be used when configuring WebGISDR backups. They also handle the uploading of the license files, the PowerShell DSC archive, and, lastly, the web adaptor installer.

resource "azurerm_storage_container" "artifacts" {
  name                  = "${var.deployInfo["projectName"]}${var.deployInfo["environment"]}-deployment"
  resource_group_name   = "${azurerm_resource_group.rg.name}"
  storage_account_name  = "${azurerm_storage_account.storage.name}"
  container_access_type = "private"
}

resource "azurerm_storage_container" "webgisdr" {
  name                  = "webgisdr"
  resource_group_name   = "${azurerm_resource_group.rg.name}"
  storage_account_name  = "${azurerm_storage_account.storage.name}"
  container_access_type = "private"
}

resource "azurerm_storage_blob" "serverLicense" {
  name                   = "${var.deployInfo["serverLicenseFileName"]}"
  resource_group_name    = "${azurerm_resource_group.rg.name}"
  storage_account_name   = "${azurerm_storage_account.storage.name}"
  storage_container_name = "${azurerm_storage_container.artifacts.name}"
  type                   = "block"
  source                 = "./${var.deployInfo["serverLicenseFileName"]}"
}

resource "azurerm_storage_blob" "portalLicense" {
  name                   = "${var.deployInfo["portalLicenseFileName"]}"
  resource_group_name    = "${azurerm_resource_group.rg.name}"
  storage_account_name   = "${azurerm_storage_account.storage.name}"
  storage_container_name = "${azurerm_storage_container.artifacts.name}"
  type                   = "block"
  source                 = "./${var.deployInfo["portalLicenseFileName"]}"
}

resource "azurerm_storage_blob" "dscResources" {
  name                   = "dsc.zip"
  resource_group_name    = "${azurerm_resource_group.rg.name}"
  storage_account_name   = "${azurerm_storage_account.storage.name}"
  storage_container_name = "${azurerm_storage_container.artifacts.name}"
  type                   = "block"
  source                 = "./dsc.zip"
}

resource "azurerm_storage_blob" "webAdaptorInstaller" {
  name                   = "${var.deployInfo["marketplaceImageVersion"]}-iiswebadaptor.exe"
  resource_group_name    = "${azurerm_resource_group.rg.name}"
  storage_account_name   = "${azurerm_storage_account.storage.name}"
  storage_container_name = "${azurerm_storage_container.artifacts.name}"
  type                   = "block"
  source                 = "./${var.deployInfo["marketplaceImageVersion"]}-iiswebadaptor.exe"
}

This addition handles the generation of a short-lived SAS token from the storage account, which is then used during the configuration management portion to securely grab the needed files from storage. In this situation, we could simplify the deployment by marking our containers as public and not requiring a token, but that is not recommended.

data "azurerm_storage_account_sas" "token" {
  connection_string = "${azurerm_storage_account.storage.primary_connection_string}"
  https_only        = true
  start             = "${timestamp()}"
  expiry            = "${timeadd(timestamp(), "5h")}"

  resource_types {
    service   = false
    container = false
    object    = true
  }

  services {
    blob  = true
    queue = false
    table = false
    file  = false
  }

  permissions {
    read    = true
    write   = true
    delete  = true
    list    = true
    add     = true
    create  = true
    update  = true
    process = true
  }
}

The final change is the addition of an extension to the virtual machine that will handle the configuration management task using PowerShell DSC. Instead of reviewing this in depth here, just know that the data under the settings and protected_settings JSON blocks will be passed to PowerShell DSC as parameters for use as needed by the configuration file.

resource "azurerm_virtual_machine_extension" "arcgisEnterprise-dsc" {
  name                       = "dsc"
  location                   = "${azurerm_resource_group.rg.location}"
  resource_group_name        = "${azurerm_resource_group.rg.name}"
  virtual_machine_name       = "${element(azurerm_virtual_machine.arcgisEnterprise.*.name, count.index)}"
  publisher                  = "Microsoft.Powershell"
  type                       = "DSC"
  type_handler_version       = "2.9"
  auto_upgrade_minor_version = true
  count                      = "${var.arcgisEnterpriseSpecs["count"]}"

  settings = <<SETTINGS
	{
		"configuration": {
          "url": "${azurerm_storage_blob.dscResources.url}${data.azurerm_storage_account_sas.token.sas}",
          "function": "enterprise",
		  "script": "enterprise.ps1"
		},
		"configurationArguments": {
          "webAdaptorUrl": "${azurerm_storage_blob.webAdaptorInstaller.url}${data.azurerm_storage_account_sas.token.sas}",
          "serverLicenseUrl": "${azurerm_storage_blob.serverLicense.url}${data.azurerm_storage_account_sas.token.sas}",
          "portalLicenseUrl": "${azurerm_storage_blob.portalLicense.url}${data.azurerm_storage_account_sas.token.sas}",
          "externalDNS": "${azurerm_public_ip.arcgisEnterprise.fqdn}",
		  "arcgisVersion" : "${var.deployInfo["marketplaceImageVersion"]}",
          "BlobStorageAccountName": "${azurerm_storage_account.storage.name}",
          "BlobContainerName": "${azurerm_storage_container.webgisdr.name}",
          "BlobStorageKey": "${azurerm_storage_account.storage.primary_access_key}"
      }
	}
	SETTINGS
  protected_settings = <<PROTECTED_SETTINGS
	{
		"configurationArguments": {
          "serviceAccountCredential": {
            "username": "${var.deployInfo["serviceAccountUsername"]}",
            "password": "${var.deployInfo["serviceAccountPassword"]}"
      },
		"arcgisAdminCredential": {
			"username": "${var.deployInfo["arcgisAdminUsername"]}",
			"password": "${var.deployInfo["arcgisAdminPassword"]}"
			}
		}
	}
	PROTECTED_SETTINGS
}

Configuration Management

As was touched on above, we are utilizing PowerShell DSC (Desired State Configuration) to handle the configuration of ArcGIS Enterprise as well as a few other tasks on the instance. To simplify things, I have included v2.1 of the ArcGIS module within the archive, but the public repo can be found here. The ArcGIS module provides a means to interact with ArcGIS Enterprise in a controlled manner by providing various "resources" that perform specific tasks. One of the major benefits of PowerShell DSC is that it is idempotent. This means that we can continually run our configuration and nothing will be modified if the system matches our code. This gives administrators the ability to push changes and updates without altering existing resources, as well as to detect configuration drift over time.
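In this deployment the Azure DSC extension compiles and applies the configuration automatically, but the idempotent behavior is easy to demonstrate when a configuration is run by hand with the standard DSC cmdlets. A rough sketch, assuming the enterprise configuration from enterprise.ps1 mentioned above (its configuration arguments are omitted here for brevity):

# Dot-source the configuration script, compile it to a MOF, then apply it.
# Re-running Start-DscConfiguration against a machine that already matches
# the desired state makes no changes.
. .\enterprise.ps1
enterprise -OutputPath .\enterpriseMof     # real runs would also pass the configuration arguments

Start-DscConfiguration -Path .\enterpriseMof -Wait -Verbose
Test-DscConfiguration                       # returns True while the machine still matches the applied configuration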

To highlight the use of one of these resources, let's take a quick look at the ArcGIS_Portal resource, which is designed to configure a new site without having to manually do so through the typical browser-based workflow. In this deployment, our ArcGIS_Portal resource looks exactly like the code below. The resource specifies the parameters that must be provided to successfully configure the portal site and will error out if any required parameter is missing.

ArcGIS_Portal arcgisPortal {
    PortalEndPoint = (Get-FQDN $env:COMPUTERNAME)
    PortalContext = 'portal'
    ExternalDNSName = $externalDNS
    Ensure = 'Present'
    PortalAdministrator = $arcgisAdminCredential
    AdminEMail = 'example@esri.com' 
    AdminSecurityQuestionIndex = '12'
    AdminSecurityAnswer = 'none'
    ContentDirectoryLocation = $portalContentLocation
    LicenseFilePath = (Join-Path $(Get-Location).Path (Get-FileNameFromUrl $portalLicenseUrl))
    DependsOn = $Depends
}
$Depends += '[ArcGIS_Portal]arcgisPortal'

Because of the scope of what is being done within the configuration script here, we will not be doing a deep dive. This will come in a later article.

Putting it together

With the changes to the Terraform template in place, as well as a very high-level overview of PowerShell DSC and its purpose, we can deploy the environment using the same commands mentioned in the first entry in the series. Within the terminal of your choosing, navigate into the extracted archive that contains your licenses, template, and DSC archive, and start by initializing Terraform with the following command.

terraform init

Next, you can run the following to start the deployment process. Keep in mind that we are not only deploying the infrastructure but also configuring ArcGIS Enterprise, so the time to completion will vary. When it completes, it will output the public-facing URL for accessing your ArcGIS Enterprise portal.

terraform apply
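
If you need that URL again later, or want to script against it, Terraform can re-display the stored output values from the state at any time:

terraform output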


Summary

As you can quickly see, by removing the manual aspects of both software configuration and infrastructure deployment, a large portion of problems can be mitigated by moving toward IaC and configuration management. There are many solutions out there to handle both aspects, and these are just two options. Explore what works for you and start moving toward a more DevOps-centric approach.

I hope you find this helpful. Do not hesitate to post your questions here: Engineering ArcGIS Series: Tools of an Engineer

 

Note: The contents presented above are examples and should be reviewed and modified as needed for each specific environment.

AdamCarnow
Esri Regular Contributor

If you're headed to the 2019 Esri International User Conference and are interested in sessions for GIS Managers, here is a link to the GIS Manager Track:

https://userconference2019.schedule.esri.com/schedule?filters=1964269831

more
0 0 393
AdamCarnow
Esri Regular Contributor

Are you a GIS manager, leader or other executive headed to the 2019 Esri International User Conference (UC)?  I know it can be a challenge creating your personal agenda for the world's largest GIS conference, so I created this flier to assist.  It covers suggested events and activities you should consider when deciding how to spend your valuable time at UC.  I hope you have a productive UC experience, and I hope to see you there!

UPDATED 6/24/2019 - Added GIS Manager Track and Get Advice from Esri Services section.

UPDATED 6/13/2019 - Corrected the name of the Implementing ArcGIS area in the Expo to “Guiding your Geospatial Journey”

FYI there are other Esri UC fliers here: https://community.esri.com/community/events/user-conference/content?filterID=contentstatus%5Bpublish...

JacobBoyle412
Esri Contributor

What is System Log Parser?

System Log Parser is an ArcGIS for Server (10.1+) log query and analysis tool that helps you quickly quantify the "GIS" in your deployment. When run, it connects to an ArcGIS for Server instance on port 6080/6443/443 as a publisher (or an administrator), retrieves the logs from a time duration (specified as an input), analyzes the information, and then produces a spreadsheet version of the data that summarizes the service statistics. The command-line version of System Log Parser (slp.exe) is used by ArcGIS Monitor for data capture.
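For context, the records System Log Parser analyzes are the same ones exposed by the ArcGIS Server Administrator API logs endpoint. The sketch below is illustrative only (placeholder host name, token acquisition not shown) and is not how the tool itself is invoked; it simply pulls a page of log messages the way a publisher or administrator account could.

# Pull a page of recent log records from the ArcGIS Server Administrator API.
# Replace the host and supply a valid publisher or administrator token.
$server = "https://gisserver.example.com:6443/arcgis"
$token  = "<publisher-or-admin-token>"

$body = @{
    f        = "json"
    token    = $token
    level    = "FINE"     # per-request timing messages are logged at the FINE level
    pageSize = 1000
}

$logs = Invoke-RestMethod -Uri "$server/admin/logs/query" -Method Post -Body $body
Write-Host ("Log messages returned: " + $logs.logMessages.Count)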

 

Note: This post is the second in a series on System Log Parser. Please see ArcGIS Server Tuning and Optimization with System Log Parser to learn how to set up your server for System Log Parser and for an overview of the report.

 

Introduction to Statistics Used In System Log Parser

 

There are several statistical categories you should be familiar with when using System Log Parser. (definitions from Wikipedia)

 

Percentile (P) - a measure used in statistics indicating the value below which a given percentage of observations in a group of observations falls. For example, the 20th percentile is the value (or score) below which 20% of the observations may be found. 

 

Average (avg) - a single number taken as representative of a list of numbers. Different concepts of average are used in different contexts. Often "average" refers to the arithmetic mean: the sum of the numbers divided by how many numbers are being averaged. In statistics, mean, median, and mode are all known as measures of central tendency, and in colloquial usage, any of these might be called an average value.

 

Maximum (Max) - the largest value of the function within a given range.

 

Minimum (Min) - the smallest value of the function within a given range.

 

Standard Deviation (Stdev) - a measure that is used to quantify the amount of variation or dispersion of a set of data values. A l...

 

Fields of the Statistics Collected

 

Field | Definition
Resource | Requested resource or service (the service REST endpoint)
Capability | The ArcGIS capability of the resource
Method | The function performed by the resource (what was accessed)
Count | The number of requests for this resource
Count Pct | Count percentage based on total service requests
Avg | The average time (in seconds) spent processing a request
Min | The time (in seconds) of the shortest request
P5, P25, P50, P75 | The percentile grouping of the time (in seconds)
P95 | 95% of all responses occur between 0 seconds and the value displayed in this column, per service
P99 | 99% of all responses occur between 0 seconds and the value displayed in this column, per service
Max | The time (in seconds) of the longest request
Stdev | The standard deviation of time (in seconds)
Sum | The total time (in seconds) spent processing requests, per resource
Sum Pct | The percentage of total processing time attributable to this resource

 

We're going to focus on 2 key statistics, P95 and Max. As we learned above, P95 signifies the response time within which the fastest 95% of all requests complete, and Max signifies the longest draw time per request, per service and method.
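To make the P95 figure concrete, here is a small sketch that computes it from a set of made-up response times using the nearest-rank method:

# Hypothetical response times (in seconds) for one service
$times = 0.21, 0.18, 0.35, 0.42, 0.19, 0.25, 0.31, 0.95, 0.22, 0.28,
         0.24, 0.33, 0.27, 0.45, 0.20, 0.38, 0.26, 0.30, 1.40, 0.23

$sorted = $times | Sort-Object
$rank   = [math]::Ceiling(0.95 * $sorted.Count)   # nearest-rank P95
$p95    = $sorted[$rank - 1]
$max    = $sorted[-1]

Write-Host "P95 = $p95 s, Max = $max s"   # here: P95 = 0.95 s, Max = 1.4 s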

 

Identifying Opportunities to Tune Service Performance

 

In the example below, I've sorted the P95 and Max values, looking for anything over 1/2 second. User experience drops the longer your draw time takes.

 

I've highlighted any Max draw time over 1/2 second in red and any P95 draw time over 1/2 second in yellow. These are the services and layers I'd focus on cleaning up, starting by getting the P95 value below 1/2 second.

In the next section you'll find starting points to tune and optimize your services.

 

Another column worth reviewing is the Sum Pct. This column factors in the number of requests for each service and the respective average time, then weighs that against all the other services.

 

Sum Pct

 

For example:   

  1. One service may have thousands more requests than all the others, but with fast times (its Sum Pct should be low).
  2. Another service may have just a small handful of requests but very slow times (its Sum Pct should be high). In this case, this service would be a good candidate for tuning (see the worked example below).
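
To put made-up numbers behind the second case: a service answering 10,000 requests at an average of 0.05 seconds contributes 10,000 x 0.05 = 500 seconds of processing time, while a service answering only 50 requests at an average of 12 seconds contributes 50 x 12 = 600 seconds. Despite its tiny request count, the second service accounts for the larger share of total processing time, so its Sum Pct is higher and it is the better tuning candidate.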

 

Best Practices for Services

 

Below are some links to get you started on service tuning and SOC management.

         

In addition to the above, data source performance should be examined if adjustments to the service do not help enough. You can look at:

 

I hope you find this helpful. Do not hesitate to post your questions here: https://community.esri.com/thread/231451-arcgis-architecture-series-tools-of-an-architect

 

Note: The contents presented above are recommendations that will typically improve performance for many scenarios. However, in some cases, these recommendations may not produce better performance results, in which case, additional performance testing and system configuration modifications may be needed.

SarahScher
Esri Contributor

In your organization there are likely different people, working in a variety of roles, with varying skills and responsibilities. It can be overwhelming to deliver the right content in the right format to these different people in a well-performing, reliable, and secure manner.

Your geospatial content publication strategy serves as a guide to help accomplish this. While any two organizations can have vastly different publication strategies, an effective content delivery strategy will always address performance, reliability, and security.

Performance

Think of performance as how long it takes an application to load: is it lightning fast, or crawling along? One way to address performance strategically is to consider separating internal and external activities. In practice, this could mean that external public applications like StoryMaps live in a scalable environment such as ArcGIS Online, while internal dashboards, analytics, and editing work stays on your own infrastructure in ArcGIS Enterprise. This way, if one of those public-facing apps suddenly becomes popular, your internal workflows won't have to compete for resources.

Reliability

Reliability is expressed in a service level agreement (SLA) and is an expectation of when the system will be available, such as during work hours or 99% of the time. There are many ways in which organizations address reliability, such as following best practices like high availability, load balancing, workload separation, and security. You could also address reliability by leveraging cloud capabilities.
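
For a sense of scale, a 99% availability target still allows roughly 0.01 x 8,760 hours, or about 88 hours (3.7 days), of downtime per year, which is one reason organizations with stricter expectations pair the SLA with practices like high availability and load balancing.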

Security

Within the context of a publication strategy, security is about exposing the right content and capabilities to the right people. You certainly don't want non-experts editing your asset information, or your sensitive data exposed publicly. This content should be properly maintained in a secure system of record. Security isn't just about keeping your internal content within your organization; it can also pertain to information or capabilities that are sensitive even between departments or teams within your organization. Depending on the level of risk and sensitivity of this content, it may be appropriate to have a separate, internal publication environment.

While your organization’s individual content publication strategy will likely encompass many other considerations that are relevant to your work, goals, and mission, it should always address the needs and expectations of the people in your organization and protect your internal system.

Download the PPT for this presentation from the 2018 Esri User's Conference: https://community.esri.com/docs/DOC-12080-content-publication-strategypdf  

GuyNoll
Esri Contributor

Geodata engineering is focused on making your data work for your mission. Did you know that The Living Atlas is a way for you to use authoritative information others have created, including official data sources such as the US Census, NOAA, and USGS?  The experts in Esri's Geodata Services support the production of The Living Atlas imagery and community maps content, and can help you improve the quality of your data, too.  To learn more about the work behind The Living Atlas, see this article in XYHT, and meet members of #Geodata and Living Atlas at #EsriUC2018. 

MattBottenberg
New Contributor III

I'll be moderating a User Paper Session, Session 2250: "Your Decisions are Only as Good as Your Data", on Wednesday, July 11, from 8:30am to 9:30am in SDCC Room 29 B.

We'll have two interesting papers presented by BLM and San Jose Water. Nick Hall and Jason Frels of the BLM National Operations Center in Denver, Colorado, will discuss the continued maturation of BLM's Data Quality and the Enterprise (eGIS) System. Mary McMahon of San Jose Water in California will discuss how the Esri Water Utility Tools are used to improve their data quality check workflows. I hope to see you there!

MichelleJohnson
Esri Contributor

It really was my pleasure working with Cliff Sullivan and Mark Dickman at SACWD.  These two guys really knew how to make finding errors fun. With their positive attitudes and get-it-done mentality, they quickly gained a good grasp of using the data QC extension, ArcGIS Data Reviewer.

Look where their success got them featured – in the latest issue of ArcNews!

https://www.esri.com/about/newsroom/arcnews/performing-data-checks-keeps-water-running/ 

If you are interested in a quick review of your data with Data Reviewer, schedule a Data Health Check at this year's UC: go.esri.com/dhc-uc18
