
Benchmark ArcGIS Enterprise Without a Dataset (Intermediate)

09-03-2024 11:52 AM
AaronLopez
Esri Contributor

Benchmark ArcGIS Enterprise...The Original Approach

A while ago, I discussed using the Natural Earth dataset with a preconfigured Apache JMeter test to benchmark an ArcGIS Enterprise deployment. The results from that test could then be compared to runs from other deployments to get a comparative idea of the underlying hardware's performance and scalability characteristics. This approach had some benefits:

  • Natural Earth is a free GIS dataset
    • Available for public use
  • Low-to-moderate data complexity (easy to work with)
  • Test Plan featured a step load for observing scalability capabilities

While useful and a good measuring stick, the scalability component meant the test would typically run for a long time (which also added some complication). I had wondered if there was an easier way to benchmark just the processing hardware (e.g., the CPU), but still through ArcGIS Enterprise:

  • Was it possible to use JMeter from a performance-only perspective?
  • Could I create a test to benchmark ArcGIS Enterprise without an underlying FGDB or enterprise geodatabase dataset (which should simplify the overall effort)?

It turns out the answers were yes!

Benchmark ArcGIS Enterprise...An Alternative Approach

Okay...I am speaking in half-truths. The new benchmark test does not depend on a service backed by an FGDB or enterprise geodatabase dataset, but it does need some data. To help keep things simple, the data (e.g., pre-generated geometries) is simply passed through the JMeter sampler elements to an ArcGIS resource that does not have a referenced dataset behind the scenes.

So, how is this done?

Through the tried-and-true Geometry service. ArcGIS Server's geometry service is a built-in resource that provides access to many functions for performing geometric operations. The calculations behind these operations (like buffer or generalize) can be simple or complex, depending on what you ask of it. From a performance analyst's perspective, it provides a fantastic means for benchmarking the CPU hardware of the machine running ArcGIS Server.
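To make this concrete, below is a minimal sketch (in Python, not the article's JMeter plan) of a buffer request similar to the ones the Test Plan issues. The geometry travels inside the request parameters themselves, so no backing dataset is involved. The host name, web adaptor name, and unsecured access are assumptions; a secured site would also need a token parameter.

import requests

# Placeholder URL -- adjust the host and web adaptor name for your deployment
GEOMETRY_URL = "https://myserver.domain.com/server/rest/services/Utilities/Geometry/GeometryServer"

params = {
    # The "data": a point geometry embedded directly in the request
    "geometries": '{"geometryType":"esriGeometryPoint","geometries":[{"x":-117.196,"y":34.057}]}',
    "inSR": 4326,            # spatial reference of the input point
    "bufferSR": 3857,        # buffer in a projected coordinate system
    "distances": "1000",     # buffer distance
    "unit": 9001,            # esriSRUnit_Meter
    "unionResults": "false",
    "f": "json",
}

resp = requests.get(f"{GEOMETRY_URL}/buffer", params=params, timeout=60)
resp.raise_for_status()
print(resp.json()["geometries"][0].keys())   # the buffered polygon's rings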

Note: Although the term ArcGIS Enterprise includes ArcGIS Server, this benchmark primarily exercises the latter (e.g., ArcGIS Server). Some traffic may go through the ArcGIS Web Adaptor and there would be a small amount of Portal for ArcGIS authentication taking place, but by design, the bulk of the work will be performed by ArcGIS Server.

Benefits of Using the Geometry service

The Geometry service has been around in ArcGIS Server since version 9.3, so it's ubiquitous. That makes a test utilizing it easy and reliable. And since the data driving the test is placed inside the key/value pairs of the requests, the test is also portable (e.g., no dataset to lug around).

Note: While the Geometry service has been included with ArcGIS Server for some time, by default it is off and not running. The service would need to be started and shared to the appropriate Portal for ArcGIS members before running the test.
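For reference, the service can be started from ArcGIS Server Manager, or through the ArcGIS Server Administrator Directory as in the hedged sketch below (the host, port, and credentials are placeholders, and self-signed certificates may need extra handling). Sharing the service with the appropriate members is then done through the portal.

import requests

ADMIN = "https://myserver.domain.com:6443/arcgis/admin"   # placeholder admin URL

# 1. Request an administrative token
token = requests.post(
    f"{ADMIN}/generateToken",
    data={"username": "siteadmin", "password": "changeme",
          "client": "requestip", "f": "json"},
    timeout=30,
).json()["token"]

# 2. Start the Utilities/Geometry service (stopped by default)
start = requests.post(
    f"{ADMIN}/services/Utilities/Geometry.GeometryServer/start",
    data={"token": token, "f": "json"},
    timeout=60,
)
print(start.json())   # expect {"status": "success"} once the service is running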

The Geometry_Functions_Benchmark Test Plan

  • Downloading and opening the Test Plan in Apache JMeter should look similar to the following:
    • Adjust the User Defined Variables to fit your environment

jmeter_geometry_functions_benchmark1.png

What Types of Functions Should be Tested?

For a benchmark, the short answer is only a few. This particular Test Plan calls just a few different operations...as well as the same operations in different ways (e.g., changing request parameters to purposely get a variant response). This provides variability so the test is not just doing the same thing over and over.
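As a hedged illustration of that variation idea (the actual operations and parameters in the shipped plan are shown in the screenshot below), the same generalize operation could be issued with different maxDeviation values so that successive requests are not identical:

import requests

# Placeholder URL -- adjust for your deployment
GEOMETRY_URL = "https://myserver.domain.com/server/rest/services/Utilities/Geometry/GeometryServer"

line = ('{"geometryType":"esriGeometryPolyline","geometries":'
        '[{"paths":[[[-117.0,34.0],[-116.9,34.1],[-116.8,34.0],[-116.7,34.2]]]}]}')

for max_dev in (0.001, 0.01, 0.05):   # purposely vary a request parameter
    resp = requests.get(
        f"{GEOMETRY_URL}/generalize",
        params={"sr": 4326, "geometries": line, "maxDeviation": max_dev, "f": "json"},
        timeout=60,
    )
    vertices = resp.json()["geometries"][0]["paths"][0]
    print(f"maxDeviation={max_dev}: {len(vertices)} vertices returned")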

Below is a look at the operations used in this benchmark:

jmeter_geometry_functions_benchmark_operations1.png

Expected Test and Operation Performance

This test has some operations that may perform fast and others that will take more time. The speed will vary based on the hardware. Ultimately, we just want ArcGIS Enterprise (e.g., Server) to work for just a few minutes so we can get an idea of the processing performance. If each operation took 10 minutes (with the full test taking many times longer), the benchmark itself would become too time-consuming and less practical to use.

Deployment Architecture Example

This benchmark test was run in a lab against two different servers (e.g., run once per server):

  • ArcGIS Enterprise -- Machine #1 (older hardware)
    • Intel Xeon E5-4650, 2.70 GHz
      • SPECint_base2006 score: 50.5
    • 32 processing cores
    • Hyper-Threading disabled
    • 64 GB RAM
    • 10 Gbps network
  • ArcGIS Enterprise -- Machine #2 (newer hardware)
    • Intel Xeon Gold 6126, 2.60 GHz
      • SPECint_base2006 score: 71.9
    • 24 processing cores
    • Hyper-Threading disabled
    • 128 GB RAM
    • 10 Gbps network

Note: Since this testing effort was more focused on speed instead of throughput, SPECint_base numbers were used instead of SPECint_rate_base.

Benchmark Test Execution

For long-running tests, it is not recommended to run the Test Plan within the GUI. However, since this is a relatively short test, the impact is nominal.
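For longer runs, non-GUI mode is the usual choice. Below is a minimal sketch of the equivalent invocation from Python (the .jmx and results file names are placeholders for whatever you saved the plan and log as):

import subprocess

# Equivalent to: jmeter -n -t geometry_functions_benchmark.jmx -l results.jtl
subprocess.run(
    ["jmeter", "-n",                             # non-GUI mode
     "-t", "geometry_functions_benchmark.jmx",   # the Test Plan to execute
     "-l", "results.jtl"],                       # sample results are written here
    check=True,
)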

Note: When running any test, it is always recommended to coordinate the start time and expected duration with the appropriate personnel. This ensures minimal impact to users and other colleagues that may also need to use the ArcGIS Enterprise Site of interest (e.g., the production deployment). Additionally, this helps prevent system noise from other activity and use which may "pollute" the test results.

Results

After adjusting the User Defined Variables to point to the appropriate environment (Machine #1…devlab05), the benchmark was run right in the JMeter GUI. The results can be observed from the View Results in Table element:

  • For convenience, the Test Plan automatically calculates the overall test run duration, right in the name of the last operation
    • This makes benchmark time easy to observe from the table

jmeter_geometry_functions_benchmark_results_server1.png

  • The Test Plan was adjusted to point to a server on newer hardware (Machine #2…eistsrv05) and the benchmark was rerun
    • In the table, its results appear after those from the first run:

jmeter_geometry_functions_benchmark_results_server2.png

As expected, the first machine required more time to complete the same operations, resulting in a measurable difference in performance between the two machines.

  • Machine #1…devlab05
    • Benchmark duration: 259946 ms
  • Machine #2…eistsrv05
    • Benchmark duration: 181441 ms

Calculate Percentage Change

Since the response times were lower (e.g., faster) on the newer hardware than in the first run on the older hardware, we'll calculate a percentage decrease:

  • First, original server time − newer server time = the decrease
  • Then, the decrease ÷ original server time × 100 = the % decrease

(259946 ms - 181441 ms) / 259946 ms = 0.302

0.302 x 100 = 30.2% 

The benchmark time on the newer hardware was about 30% lower than on the older hardware (our starting point). This percentage change indicates a measurable improvement when using the newer hardware.

Percentage Change Estimate Based on SPEC

Let's use the SPEC ratio with the benchmark time from the original run to predict the target time (the benchmark time on the newer machine). This helps us see whether roughly the same percentage change could have been estimated.

(Baseline_SPEC x Baseline_Time) = (Target_SPEC x Target_Time)

((Baseline_SPEC x Baseline_Time) / Target_SPEC) = Target_Time

(36.875 x 259946 ms) / 53.75 = 178335 ms (after rounding down to the nearest millisecond)

(259946 ms - 178335 ms) / 259946 ms = 0.314

0.314 x 100 = 31.4%

From this prediction, the benchmark time on the newer hardware was estimated to be about 31% lower than on the older hardware. This is very close to the percentage change calculated from the observed benchmark times.
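For convenience, both calculations can be reproduced with a few lines of Python (the times and SPEC values are the ones reported above):

baseline_time = 259946        # ms, Machine #1 (devlab05)
target_time = 181441          # ms, Machine #2 (eistsrv05)
baseline_spec, target_spec = 36.875, 53.75   # SPEC values used in the prediction

observed = (baseline_time - target_time) / baseline_time
print(f"Observed decrease:  {observed:.1%}")            # ~30.2%

predicted_time = (baseline_spec * baseline_time) / target_spec
predicted = (baseline_time - predicted_time) / baseline_time
print(f"Predicted time:     {predicted_time:.0f} ms")   # ~178335 ms
print(f"Predicted decrease: {predicted:.1%}")           # ~31.4%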

Future Hardware

640px-Wikimedia_Foundation_Servers-8055_43.jpg

Processor architectures and CPU speeds are always improving. Eventually, a benchmark test like this one (as currently built) may take only a minute or tens of seconds to run (what a great problem to have). At that point, complexity could be added to the test to increase its run duration to better match the new technology.

You may have noticed the last transaction in the test was disabled. This 1000 Point Buffer request with a distance of 10000 meters and a unit of 9035 (International Meter Distance) takes some time to calculate (even on decent hardware). It was disabled to shorten the run time to a reasonable duration. However, if helpful, it can be enabled as an additional calculation, depending on the CPU speed of the deployment of interest.

Final Thoughts 

As mentioned in other community articles, there is no one service or function that can cover the entire breadth and depth of ArcGIS. However, the Geometry service is a resource that represents a portion of the amazing field of GIS that is easy to work with. This makes it a good option to use for benchmark testing efforts.

A Fast Response Time Is All About CPU Speed, Right?

For this Geometry benchmark test, yes. However, for real-world services, processing speed is not the only factor.

Server hardware components like disk speed, available memory, and network speed are other resources that can improve response times (in addition to CPU speed). Together, they all have a positive effect on the user experience.

This benchmark focused on CPU performance because it is a large part of the client request/server response process, but as just mentioned, it is not the only server resource that matters once other ArcGIS services are taken into account.

What About Other CPU Comparison Tools?

Cpu-processor.jpg

There are many utilities out there that can profile and test the various pieces of server hardware using a whole battery of exercises. These tests are great and certainly add value for understanding the hardware. Again, there is no one test that can represent all things GIS. But hopefully, this Geometry Benchmark Test Plan can be a useful tool in the analyst's tool chest.



To download the Apache JMeter Test Plan used in this Article see: geometry_functions_benchmark1.zip 

Attribution

Resource: File:Wikimedia_Foundation_Servers-8055_43.jpg

Description: Rack-mounted 11th-generation PowerEdge servers

Author: Victorgrigas - Own work

Created: 16 July 2012

Uploaded: 20 July 2012

License: CC BY-SA 3.0

 

Resource: File:Cpu-processor.jpg


Author: Fx Mehdi - Own work

Uploaded: 30 May 2019

License: CC BY-SA 4.0