BLOG
Hi @DeanHowell1, Are you referring to the testing of a single fused image cache? If so, a possible solution would be to take generated bounding boxes of interest and convert them on-the-fly to the appropriate set of tiles. This conversion process would be based on the GetLayerTile logic (there might be some older resources out on the internet which still list these steps in various coding languages). Of course, the newer developer APIs from Esri do this for you with a simple function call, but in Apache JMeter's case, this logic would need to be added to the test (e.g. using something like Groovy). I would recommend this strategy over converting a HAR file into a load test. Although technically valid, with the HAR file approach the requests are quickly cached and the load tests then typically show high network utilization. With the first approach (conversion of extents to underlying tiles), the requests are more realistic as the test can spatially cover a lot more area. This topic was also recently discussed as a potential future Community Article. If one is put together I will definitely send you the link. Thanks again for the feedback. Aaron.
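The extent-to-tile conversion described above could be sketched as follows. This is a Python illustration (the JMeter version would be Groovy in a JSR223 Sampler), and it assumes the standard Web Mercator tiling scheme; the function names are illustrative, not part of any Esri API.

```python
import math

def lonlat_to_tile(lon, lat, level):
    """Convert a WGS84 coordinate to a tile (column, row) at a cache level,
    using the standard Web Mercator tiling scheme."""
    n = 2 ** level
    col = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    row = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return col, row

def bbox_to_tiles(xmin, ymin, xmax, ymax, level):
    """Enumerate every cache tile a bounding box intersects at the given level."""
    c1, r1 = lonlat_to_tile(xmin, ymax, level)  # top-left corner
    c2, r2 = lonlat_to_tile(xmax, ymin, level)  # bottom-right corner
    return [(c, r) for r in range(r1, r2 + 1) for c in range(c1, c2 + 1)]
```

Each (column, row) pair from `bbox_to_tiles` would then be plugged into the tile request URL inside the test.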
11-15-2021 08:50 PM

BLOG
Hi @DeanHowell1, The Performance Engineering team recently released a Community Article on Creating a Load Test in Apache JMeter Against a Hosted Feature Layer Service! The discussion covers how to generate (feature service) test data and how to plug this data (query extents) into an Apache JMeter Test Plan for load testing. We took a programmatic approach to tackling this challenge; however, this part of the test logic remains largely hidden from the tester. Happy testing! Aaron
10-26-2021 02:11 PM

BLOG
Why Test a Hosted Feature Layer Service?

Previous Community Articles on performance testing with Apache JMeter focused on exercising Map Services through the export function. However, Hosted (feature) layers are also a popular capability of ArcGIS Enterprise and are widely used in deployments. Additionally, querying these layers is based on a "repeated" grid design which can help provide a higher degree of scalability over other visualization technologies. Couple this with client-side rendering of the returned data and it's a win-win. Given that hosted feature services are a proven and favorite service technology, it makes sense to test feature queries under load to observe their scalability firsthand.

Hosted Feature Layer Service Testing Challenges

Compared to testing the export map function, testing Hosted Feature Layer Service queries is a challenge as the requests are more complex to construct programmatically. A navigational "pan" or "zoom" in the web browser produces a handful of different queries, each with its own geometry. To repeat this behavior, the constructed load test will not have just one request to issue but many, and a varying amount. Couple this with the fact that each query request in the transaction will have a unique geometry and a changing maxAllowableOffset (depending on the map scale), and it's a lot of moving parts to keep track of.

How to Test a Hosted Feature Service? The USGS Motor Vehicle Use Roads Dataset

The process in this Article is easiest to understand if the steps can be reproduced, but this repeatability requires access to the same set of data. The spatial size of the data source also needs to be large enough to generate decent test data but not so big that it is cumbersome to download. Enter the Motor Vehicle Use Map: Roads feature layer dataset on hub.arcgis.com. The 179K polyline records of USGS Roads data in WGS 1984 Web Mercator (Auxiliary Sphere) equate to about 200MB when zipped.
It is provided through the Creative Commons (CC0) license. View of Roads data from ArcGIS Pro: Large scale view with labeling enabled:

This data will be published from ArcGIS Pro to a hosted feature service in ArcGIS Enterprise or loaded directly through Portal for ArcGIS. To create a service from this data, see Publish hosted feature layers in ArcGIS Enterprise.

Test Data Generation

This test will require some good test data to use within the JMeter test. To tackle such a task, it is highly recommended to use the excellent Load Testing Tools. Version 1.2.2 adds new capabilities like the "Generate Query Extents" tool which is a great help for generating feature service test data. This data utilizes the grid-based design, which is what we want. With the grid-based approach, envelopes for the desired area are created behind the scenes. Then, these envelopes are converted to the appropriate 512x512 query extents. The number of queries (for each initial envelope) will vary based on where it lands on the grid...this mimics the service behavior in a web browser.

Making the Tools Available from ArcGIS Pro

Once the load-testing-tools project has been downloaded to your machine, place the unzipped folder in a directory that is accessible or made accessible by ArcGIS Pro. If you have a previous version of the Load Testing Tools already installed, this updated version can be installed alongside it (although with a different folder name) or completely replace the existing folder. For example: Place the load-testing-tools folder in C:\Users\[username]\Documents\ArcGIS. Use the Add Folder Connection from Catalog in ArcGIS Pro to list the contents of this directory:

The "Generate Query Extents" tool can work off the hosted feature service, a local copy of the data, or the data within an enterprise geodatabase.
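The envelope-to-grid conversion described above can be sketched in a few lines. This is an illustration of the idea, not the tool's actual code; it assumes WKID 3857 map units and a fixed resolution (meters per pixel) per scale level, and snaps each envelope to the fixed 512x512 grid the feature service queries against.

```python
def envelope_to_query_extents(xmin, ymin, xmax, ymax, resolution, tile_px=512):
    """Snap an area-of-interest envelope to a fixed 512x512 query grid,
    returning one extent per underlying grid tile the envelope touches."""
    tile_m = tile_px * resolution  # tile width/height in map units
    col0, col1 = int(xmin // tile_m), int(xmax // tile_m)
    row0, row1 = int(ymin // tile_m), int(ymax // tile_m)
    extents = []
    for r in range(row0, row1 + 1):
        for c in range(col0, col1 + 1):
            extents.append((c * tile_m, r * tile_m,
                            (c + 1) * tile_m, (r + 1) * tile_m))
    return extents

# A 1000x1000 unit envelope at 1 m/px straddles four 512x512 grid tiles.
extents = envelope_to_query_extents(0, 0, 1000, 1000, resolution=1.0)
```

Because the grid is fixed, two slightly offset envelopes can resolve to some of the same query extents, which is exactly the repeatability the Article highlights.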
Note: the tool should generate query extents from any data but it does require the Projected Coordinate System to be WGS 1984 Web Mercator Auxiliary Sphere (WKID: 3857).

Select an Area of Interest

Select an area of interest from the map in which to generate test data. In this example, the Roads data is being viewed from the Northwestern United States (near the state borders of Idaho and Montana). The selected map scale is 1:1,000,000.

Run the Generate Query Extents Tool

Running the Generate Query Extents tool should present inputs similar to the following:

Adjust the Inputs for the Generate Query Extents Tool

The default inputs were adjusted to reflect the following:
- Several smaller and larger scale levels were removed
- The remaining scale levels are 12, 13, and 14 which correspond to the map scales 144448, 72224, and 36112, respectively
- The Number of Records for these scales was increased
- Scale Level 14 may be omitted depending on the release of Load Testing Tools (if absent, please add this Scale Level manually)
- The File Output Location should be something similar to: C:\Users\username\Documents\ArcGIS\Projects\Catalog2\query_extents.csv
- Click Run to execute the tool

Note: The time needed to generate the test data depends on several factors such as the number of different Scale Levels, the Number of Records (per Scale Level) and the current map scale of the Project.

Note: Generating test data using other datasets may dictate the need to use different Scale Levels based on level of detail and feature density.

Validating the Generated Test Data

It is a good practice to visually verify generated test data. This lets the tester know what the load test will be spatially requesting from the feature service.
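The scale-level-to-map-scale pairing used in the tool inputs above (level 12 to 144448, 13 to 72224, 14 to 36112) follows from the Web Mercator tiling scheme, where each level doubles the resolution. Assuming a 96 DPI display, the relationship can be checked with a one-line formula:

```python
def resolution_m_per_px(map_scale, dpi=96):
    """Approximate on-screen resolution in meters per pixel for a map scale:
    scale * (0.0254 meters per inch) / dpi."""
    return map_scale * 0.0254 / dpi
```

For example, map scale 36112 works out to roughly 9.55 m/px, the well-known resolution of Web Mercator level 14, and each coarser level quadruples the scale over two steps.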
Once the tool has completed successfully it will generate 3 primary sets of data that are of interest:
- Bounding box feature classes: contain randomly generated areas of interest; one feature class for each requested Scale Level
- Query Extent feature classes: contain the (512x512) tile grid that each feature query will be based on; one feature class for each requested Scale Level
- Query Extent CSV files: contain the generated test data; each line is composed of the dynamic components of a feature service request; one file for each requested Scale Level

From the Catalog panel, load the bbox_36112 feature class onto the current map in ArcGIS Pro. This output is very similar to the data from the Generate Bounding Boxes tool. In this example, the randomly generated boxes are in pink. These areas represent the screen resolution of a user requesting data from the feature service.

Now, from the Catalog panel, load the query_extents_36112 feature class onto the current map but behind (underneath) the bbox_36112 data. In this example, the query tile grid boxes are in green. These tiles correspond to the areas on the map that the bboxes are asking for data from.

Zooming in to the map can yield a better understanding of the relationship between these two datasets. As seen in the map below, some bboxes are slightly offset from each other but still share a common query tile from the grid beneath them. The coordinates of these query tiles (e.g. from the query_extent feature class) are what will go into the CSV files and ultimately the JMeter load test.

Looking closer at the bboxes reveals details on their respective query composition. For example, some bboxes might require 12 "underlying" tiles to fulfill, others 15 or 20. As seen in the map below, the bbox highlighted in black requires 12 specific query tiles colored in red.

Note: The tile grid design of the feature service is one of its key strengths as it lends itself to repeatability.
This repeatability can be leveraged with caching in a deployment for improved scalability. This is not possible with export map.

Examining the generated CSV files will reveal the end result of this transformation. Viewing the query_extents_36112.csv file in a text editor should show something similar to the following. Depending on the release of Load Testing Tools, the CSV files might be sorted by the operationid column, and the lines may or may not be grouped by that column. Understanding the operationid is an important testing concept here, as each operation represents a navigation action (e.g. a pan or zoom). From JMeter's point of view, an operation is the same as a transaction: all the lines with a matching operationid will become feature service query request geometries under the same transaction controller.

The Hosted Feature Service Query Test Plan

To download the Apache JMeter Test Plan used in this Article see: roads_hfs1.zip. Opening the Test Plan in Apache JMeter should look similar to the following. Adjust the User Defined Variables to fit your environment. The 3 CSV files generated from the tool are referenced through the JMeter variables DataFile_A, DataFile_B, and DataFile_C by just the file name (the file system path is not included here).

Components of the Test Plan

Data Reader Logic

The roads_hfs test is a bit of a different beast than other Apache JMeter test examples used in previous articles. The primary difference is that while it is still a data-driven test (e.g. CSV files are used for request input), it is not using the typical "CSV Data Set Config" Config Element to read in the data. Instead, this logic is performed through JSR223 Samplers that execute Groovy code. The reason Groovy is utilized is due to the nature of interacting with a feature service mentioned earlier.
Recall that some transactions will have 12 requests and others may have 15 or 20 (depending on where the overall area of interest lands on the tile grid). This difference in the number of requests requires the test to use a more flexible mechanism for reading and using the data from the CSV files since the count will not be constant.
- There is one JSR223 Sampler for each CSV file (e.g. each map scale)
- All JSR223 Samplers for reading data are put into a Once Only Controller to minimize overhead
- The CSV file read will only be carried out once, at the beginning of each test thread

Shown below is "JSR223 Sample A1" which will be reading in the file query_extents_72224.csv. Experience coding in Groovy is not required for running this test; in fact, these JSR223 Samplers do not need to be edited to run the test, but it is helpful to understand what logic is responsible for reading in the CSV data.

Operation ID Selection Logic

Once the CSV data has been read in, the test will need to select an operation id for each scale with every test iteration. To accomplish this, a second set of JSR223 Samplers is used to pick from each list of operations.
- There is one JSR223 Sampler for each map scale that randomly selects an operation id
- All JSR223 Samplers for generating this operation id are put into a Transaction Controller called Operation Generator
- This is executed with every test thread iteration
- These JSR223 Samplers do not need to be edited to run the test

Note: JSR223 Samplers using Groovy are generally executed quickly and add very little overhead to the test.

Operation Loop and Parameter Population

With an operation id chosen, the focus becomes the loop logic where the test will look up the number of feature service queries that make up the transaction. From there it will use a third set of JSR223 Samplers to populate the request parameters associated with the previously selected operation id with each iteration in the loop.
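Taken together, the read-once, pick-an-operation, loop-over-its-rows flow described above amounts to something like the following. This is a Python sketch of what the Groovy samplers do, not the actual test code; the geometry column name is illustrative, while operationid comes from the generated CSVs.

```python
import csv
import random
from collections import defaultdict

def read_operations(lines):
    """Read a query_extents CSV once (as in the Once Only Controller) and
    group its rows by the operationid column."""
    ops = defaultdict(list)
    for row in csv.DictReader(lines):
        ops[row["operationid"]].append(row)
    return ops

def pick_operation(ops, rng=random):
    """Randomly select one operation id per iteration (Operation Generator)."""
    return rng.choice(sorted(ops))

def request_params(ops, op_id):
    """Yield the key/value pairs for each query in the chosen operation;
    the loop issues one HTTP request per yielded dict."""
    for row in ops[op_id]:
        yield {"geometry": row["geometry"], "f": "pbf"}

# Tiny illustrative CSV: operation "1" needs two queries, operation "2" one.
sample = ["operationid,geometry", "1,extentA", "1,extentB", "2,extentC"]
ops = read_operations(sample)
```

Because the number of rows per operationid varies, the loop count varies per transaction, which is exactly why the rigid CSV Data Set Config was not a good fit here.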
- There is one JSR223 Sampler for each map scale that populates the associated JMeter variables based on the operation id and iteration value
- These items then become key/value pairs which are picked up by the HTTP Request
- The iteration values are tracked by a Counter Config Element
- These JSR223 Samplers do not need to be edited to run the test
- The Loop Controller, Counter, JSR223 Sampler and HTTP Request objects are all placed inside a corresponding Transaction Controller to logically separate the items for each map scale

HTTP Request

Essentially, all of the test logic above exists just for this component of the test. Here, the JMeter HTTP Request object can read in the JMeter variables for the specific key/value parameters that have been populated by the JSR223 Sampler immediately before it. Since this approach is highly programmatic, there is only one HTTP Request per map scale! Such a design favors maintainability.

Note: This test approach would also work for traditional, non-hosted feature layer services. However, these feature services do not have the same request parameter optimizations that hosted services do, such as maxAllowableOffset and quantizationParameters. These options would just need to be deleted from the HTTP Request.

The Thread Group Configuration

The JMeter Test Plan is currently configured for a relatively short test of 10 minutes. Generally speaking, hosted feature services perform well, so a lot of throughput will be taking place within each step (1 minute per step) as well as from the test overall. Different environments and data may require alternative settings to achieve the desired test results; adjust as needed.

Validating the Test Plan

As a best practice, it is always a good idea to validate the results coming back before executing the actual load test.
Use the View Results Tree listener to assist with the validation. The Test Plan includes a View Results Tree Listener but it is disabled by default; enable it to view the results. From the GUI, Start the test.

Transactions

Select one of the "HFS" Transactions. The results should resemble the following. In this example, the transactions listed above, HFS (mapscale: 72224), HFS (mapscale: 36112), and HFS (mapscale: 144448), all completed successfully. The Sampler result lists some more details. Although each Transaction sent one HTTP request per feature query extent, the JMeter test is counting the Sampler as part of the operation. The JSR223 Samplers add very little overhead to the Transaction although they do double the number of samples; this is just a detail to be aware of. Take a quick glance at the Size in bytes. In this example, the Transaction Size was almost 65KB which suggests some data was being returned and the responses were not "empty".

Requests

Expand one of the "HFS" Transactions and select one of the https requests. The results should resemble the following. In this example, the selected request completed successfully. Take a quick glance at the Size in bytes. In this example, the Request Size was about 5KB which suggests some data was being returned and the response was not "empty" (e.g. 1500 bytes). The ContentType is also important. Per the parameters in the Test Plan, the requested format is pbf which returns application/x-protobuf. Requesting protocol buffers is a best practice as it optimizes the payload. The resulting format is binary and cannot easily be viewed without additional help that is not covered in this Article.

Note: Feature services (including hosted feature services) are rendered on the client (not on the server like export map). Although Apache JMeter is a (test) client, it does not render the server responses through JavaScript like a web browser.

Test Execution

The load test should be run in the same manner as a typical JMeter Test Plan.
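The eyeball checks above (successful response, non-trivial size, protobuf content type) can be summed up in a small helper. This is a Python sketch of the idea; within JMeter itself, elements such as a Response Assertion or Size Assertion could perform equivalent checks, and the byte threshold below is illustrative.

```python
def looks_valid(status_code, content_type, body, min_bytes=1500):
    """Mirror the manual View Results Tree checks: the request succeeded,
    the payload came back as protocol buffers, and it is not 'empty'."""
    return (status_code == 200
            and content_type.startswith("application/x-protobuf")
            and len(body) >= min_bytes)
```

A response that returns HTTP 200 but with a tiny body or an HTML error page would fail these checks, which is precisely what a quick glance at Size and ContentType is meant to catch.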
See the runMe.bat script included with the roads_hfs1.zip project for an example of how to run a test as recommended by the Apache JMeter team. The runMe.bat script contains a jmeterbin variable that will need to be set to the appropriate value for your environment.

Note: It is always recommended to coordinate the load test start time and duration with the appropriate personnel. This ensures minimal impact to users and other colleagues that may also need to use the ArcGIS Enterprise Site. Additionally, this helps prevent system noise from other activity and use which may "pollute" the test results.

JMeter Report

Throughput Curves

The auto-generated JMeter Report can provide insight into the throughput of the HFS transactions under load (non-HFS Transactions have been manually filtered out). In this case, the peak throughput for each HFS operation was about 16.5 transactions/second. Since there were 3 HFS transactions, this equates to almost 50 transactions/second overall (or 178,200 transactions/hour).

Note: Each of the HFS Transactions will naturally have a similar throughput as their respective execution in the test was weighted the same.

Performance Curves

The auto-generated JMeter Report can also provide insight into the performance of the HFS transactions under load (non-HFS Transactions have been manually filtered out). In this case, HFS transactions for all scales were sub-second (under 1 second). Even toward the end of the test, under the heaviest load, the average response time was under 225 ms (0.225 seconds).

Final Thoughts

There are other ways to test hosted feature layer service queries, such as through captured traffic from a web browser while interacting with the endpoint or application. This would produce a list of the service URLs which could be translated into a test.
However, a programmatic approach such as the one described in this Article offers a strategy for testing a wide spatial area of the service, covering many more extents than can practically be done with the captured traffic approach. The programmatic approach is also easier to maintain as the size of the Test Plan is much smaller. To put this into perspective, the JMeter test in this Article contained only 3 HTTP Requests (one for each map scale).

To download the Apache JMeter Test Plan used in this Article see: roads_hfs1.zip
To download the subset of USGS Roads data used in this Article see: Motor Vehicle Use Map: Roads (Feature Layer)

Apache JMeter is released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
10-26-2021 01:21 PM

BLOG
Hello @RDSpire, I am very happy to hear that you found these articles useful. Yes...our team definitely has more planned, including one that covers our take on interpreting load test results. Your suggested topic, "How to test a web application published from ArcGIS Enterprise", also sounds like a good subject to socialize on Community. Thank you for your feedback! Aaron
08-26-2021 10:15 AM

BLOG
What is Test Data?

Simply put, test data is used to drive a performance or load test by requesting different areas of interest from an ArcGIS Enterprise map service. The spatial part of the data usually takes the form of points or bounding boxes (bboxes) and is typically stored in a plain text file or, in some cases, a database.

Previous Community Articles on Load Testing ArcGIS Enterprise with Apache JMeter focused on strategies for building test logic and running the test. The sample projects provided with those blogs included test data in the form of plain text comma separated value (CSV) files that plugged right into the requests. These CSV files contained items like bounding boxes and a corresponding spatial reference to provide the HTTP requests in the test with parameter information. With each iteration of the test, the next line of data is read in and populated into the request.

For demonstration purposes, this test data worked well for requesting different map scales against services like SampleWorldCities and NaturalEarth. However, those sample test datasets are of limited use with other map services as the pre-generated bounding boxes were created to only ask for areas of interest around the world at a high level. If your organization is working with data at the state, county or city level, you'll want to have test data that focuses on those areas to maximize load test value. In other words, you want test data at a larger map scale that covers a specific area of interest. Generating such data that is specific to your services or your spatial data becomes a critical piece of the process for making a good load test. While composing a few geometries by hand for a simple test is certainly doable, the request signatures are quickly repeated, resulting in scalability patterns that are skewed and not realistic. A better test is one that utilizes a large amount of random geometries to push the map service and hardware resources more effectively.
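The read-a-line-per-iteration mechanic described above can be sketched as follows. The bbox values are illustrative test data, and the parameter names mirror the export request; this is a Python illustration of what JMeter's CSV Data Set Config does for you.

```python
import itertools

# Illustrative test data: one line per request, like the sample CSV files.
rows = [
    {"bbox": "-88.1,37.8,-84.8,41.8", "sr": "4326"},
    {"bbox": "-87.5,39.0,-86.2,40.5", "sr": "4326"},
]
feed = itertools.cycle(rows)  # each iteration reads the next line, wrapping around

def next_export_params(feed):
    """Populate an export request's parameters from the next CSV line."""
    row = next(feed)
    return {"bbox": row["bbox"], "bboxSR": row["sr"], "f": "image"}

first = next_export_params(feed)
second = next_export_params(feed)
third = next_export_params(feed)  # wraps back to the first line
```

With only two distinct lines, every third request repeats, which is exactly the "repeated request signature" problem that a large pool of random geometries avoids.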
Tools for Creating Custom Load Data

Thankfully, there is a set of recently released testing tools for ArcGIS Pro on GitHub that makes the task of data generation extraordinarily easy. The utility is called Load Testing Tools and is available at: https://www.arcgis.com/home/item.html?id=b06ef175665a45d68f5796f321b56e61. The examples in this Article were based on version 1.1 of the toolset.

One of my favorite tools in the group is "Generate Bounding Boxes" which can quickly generate bounding boxes by either the map's current extent or a selected polygon. Having the ability to pass in a specific polygon is a very powerful feature as the geometries that are created can be filtered to just your area of interest (e.g. Country, State, County or City). The generated data can be validated visually (via separate feature classes that are created) and plugged right into a JMeter Test Plan (via CSV files that are also created). Again, very easy...very powerful.

Creating Custom Test Data

Making the Tools Available from ArcGIS Pro

Once the load-testing-tools project has been downloaded to your machine, place the folder in a directory that is accessible or made accessible by ArcGIS Pro. For example: Place the load-testing-tools folder in C:\Users\[username]\Documents\ArcGIS. Use the Add Folder Connection from Catalog in ArcGIS Pro to list the contents of this directory:

Using A Polygon to Outline the Area of Interest

In this ArcGIS Pro project, a polygon feature class (U.S. State of Indiana in pink) has been added to the Map to define a boundary around the area where the bounding boxes for the requests in the test will be generated.
The Projected Coordinate System of the Indiana State feature class is WGS 1984 Web Mercator (auxiliary sphere); its WKID is 3857. For a point of reference, the default Basemap (World Topographic Map) is left in the map. The Projected Coordinate System of the Basemap is also WGS 1984 Web Mercator (auxiliary sphere).

Generate Bounding Boxes Tool Inputs

You can launch the Generate Bounding Boxes tool by navigating to the load-testing-tools folder from the ArcGIS Pro Catalog screen. Expand the Load Testing Tools.tbx and double-click on Generate Bounding Boxes. The Geoprocessing screen should populate and look similar to the following:

One of the convenient features of the Generate Bounding Boxes tool is that it is technically ready to go just by clicking Run! With the default options, it will randomly generate bounding boxes using the current extent of the ArcGIS Pro map.

Note: The default map scales of the Generate Bounding Boxes tool are similar to those of ArcGIS Online but, for brevity, only every other scale is listed. If additional map scales are needed, they can be manually added from within the tool.

While this makes the data generation really easy, in this example we are interested in generating boxes inside a particular polygon (State of Indiana). We also want to be very specific on the map scales our test will be using, so we'll want to remove some scales and add others from the tool's interface.
From the Generate Bounding Boxes tool:
- Click the red X in front of 73957191, 18489298, 4622324, 1155581, 288895, 282, and 70 to remove these map scales
- From the empty text box under the Scale column, add 36112 and use 100 for the Number of Records column
- From the empty text box under the Scale column, add 9028 and use 1000 for the Number of Records column
- From the empty text box under the Scale column, add 2257 and use 3000 for the Number of Records column
- Increase the Number of Records for 4514 to 1000 records
- Increase the Number of Records for 1128 to 3000 records
- Click the drop down under Polygon Layer and select the feature class of interest within the Map, in this case, Indiana
- Expand Output Options and note the location of the bounding boxes csv file (separate csv files per map scale will also be created at this location)
- Select the "Output Separate Feature Class Per Scale" option

After the customization, the Generate Bounding Boxes tool input should look like the following ([username] would reflect your Windows username). Click Run. Tool execution may take a few moments; the Table of Contents screen will start to populate with feature classes added to the Map (one per scale).

Visualizing the Generated Data from the Individual Feature Classes

Once complete, the output within ArcGIS Pro should look similar to the following. The individual feature classes make quality checking a breeze as it's easy to see the areas of interest that the test will be requesting from the generated data.

Note: Some of the generated bounding boxes may have portions of their geometry that fall outside the polygon of interest. This is okay.
Thanks to the visualization of the data, it is also easy to see why fewer bounding boxes were created for smaller map scales like 1:72,224 and 1:36,112. Similarly, this is why more bounding boxes were created for larger map scales like 1:2,257 and 1:1,128.

Note: Depending on your data and its density at the larger scales, it could be advantageous to generate more than 3000 bounding boxes (per scale) in order to "cover more ground". Keep in mind that some load test frameworks may read CSV data into memory, so creating extremely large datasets may require more memory from the test client.

Visualizing the Generated Data from the Individual CSV Files

Using the file system explorer, navigate to the ArcGIS Pro project used for generating the data: C:\Users\[username]\Documents\ArcGIS\Projects\MyProject1. The folder contents should look similar to the following. Opening the contents of bounding_boxes_2257.csv should resemble the following. This data will work with most load testing tools that allow the parameterization of HTTP requests from CSV files.

Note: The feature class to use as a Polygon Layer for spatial filtering can utilize a Projected Coordinate System other than WGS 1984 Web Mercator (auxiliary sphere). However, the generated CSV data will still be projected into bounding boxes that have a WKID of 3857.

Using the Generated Data in an Apache JMeter Test Plan

With a procedure for generating spatially customized data, you can take the CSV files and import them into an Apache JMeter Test Plan to use in a load test. The previous testing Articles, Using Apache JMeter to Load Test an ArcGIS Enterprise Authenticated Service (Intermediate/Advanced) and Using Public Domain Data to Benchmark an ArcGIS Enterprise Map Service (Intermediate), provided Apache JMeter sample tests that would make good templates to use with your new data and against your map services.
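Because a separate CSV is produced per map scale, the column names need a per-scale suffix once several files are wired into one Test Plan. A quick sketch of that naming scheme (the column names come from the sample CSVs; the helper function is illustrative):

```python
columns = ["bbox", "width", "height", "mapUnits", "sr", "scale"]

def variable_names(scale):
    """Build the Variable Names value for one map scale's CSV file,
    e.g. bbox_288895,width_288895,... so columns stay unique across scales."""
    return ",".join(f"{c}_{scale}" for c in columns)

names = variable_names(288895)
```

The resulting string is what would be pasted into the Variable Names field, so a request can reference ${bbox_288895} without ambiguity.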
CSV Data Set Config

Using the CSV Data Set Config element in JMeter, the newly generated test data can be referenced from its path on the file system. The Filename path value refers to the location of the CSV file on disk, for example: C:/JMeter Tests/naturalearth1/datasets/bounding_boxes_288895.csv. Sample test projects from previous Articles used variables for the path: ${ProjectFolder}/datasets/bounding_boxes_288895.csv. The Variable Names field denotes the column headers in the CSV file: bbox,width,height,mapUnits,sr,scale would then become bbox_288895,width_288895,height_288895,mapUnits_288895,sr_288895,scale_288895, as the test may be using other map scales where just "bbox" would be ambiguous. The HTTP Request elements pointing to your map service can then be adjusted to utilize variables such as ${bbox_288895} that reference your generated test data.

Apache JMeter is released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
07-29-2021 11:17 AM

BLOG
Updates to the following sections: Testing Framework, Bottleneck, Interactive Response Time Law. Additions of the following sections: Testing Framework Architecture.
07-18-2021 04:07 PM

BLOG
Request

Also known as a sampler. An HTTP request is the "smallest" unit of work you can define a test to perform. Generally, when testing ArcGIS Enterprise, it can be a URL for a resource like a map service, feature service or route solve, but it can also be a call for a static object like a *.css or *.js file. The protocol can be HTTP (plain text) or HTTPS (secured) and the method can be one of many, although GET, POST and HEAD are typically the most common.

A dynamic map service request would resemble the following form:
https://yourwebadaptor.domain.com/server/rest/services/NaturalEarth/MapServer/export?bbox=-130.9656801129776%2C18.608785315857112%2C-57.52504741730332%2C52.34557596043248&bboxSR=4326&imageSR=4326&size=1920%2C882&dpi=96&format=png32&transparent=true&layers=show%3A15%2C16%2C17%2C19%2C20%2C21%2C22%2C23%2C24%2C25%2C26%2C27%2C28%2C29%2C30%2C31%2C32%2C33%2C34%2C35&f=image

The same URL as an Apache JMeter HTTP request:

What a static request would look like:
https://yourwebadaptor.domain.com/portal/home/10.9.0/js/jsapi/dojo/dojo.js

Apache JMeter also makes a distinction between a request and a sampler, though both define an action to perform. A sampler might be the execution of a process at the Operating System level that performs some type of action, like running a geoprocessing tool to create a file geodatabase or creating a new SDE version in an enterprise geodatabase. Every test has to have at least one request or sampler. Another type of sampler is a web socket. While it is like an HTTP request in that it makes a call over the "web" and can be secured, it uses a different protocol for communicating with the remote server as well as different options for specifying its parameters.

Transaction

A transaction is a logical grouping of one or more HTTP requests. The requests can be dynamic and/or static.
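The sample export URL above is just a base endpoint plus URL-encoded key/value parameters; assembling it programmatically makes that structure easier to see. A Python sketch using the same parameter values (the hostname is the Article's placeholder):

```python
from urllib.parse import urlencode

base = ("https://yourwebadaptor.domain.com/server/rest/services/"
        "NaturalEarth/MapServer/export")
params = {
    "bbox": "-130.9656801129776,18.608785315857112,"
            "-57.52504741730332,52.34557596043248",
    "bboxSR": 4326,
    "imageSR": 4326,
    "size": "1920,882",
    "dpi": 96,
    "format": "png32",
    "transparent": "true",
    "layers": "show:15,16,17,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35",
    "f": "image",
}
# urlencode percent-encodes commas (%2C) and colons (%3A), producing the
# escaped form seen in the sample URL.
url = base + "?" + urlencode(params)
```

In a JMeter HTTP Request element these same key/value pairs are entered individually, and any of the values can be swapped for a variable such as ${bbox} fed from test data.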
Together, these requests typically make up one user operation, for example:
- The loading of a web app
- A navigation action like a pan or zoom
- A search function
- Creation of a new SDE Version within an Enterprise GeoDatabase

It is not a technical requirement to use transactions in a test, but doing so can greatly enhance the analysis as individual operations (e.g. transactions) can then be isolated to show their respective performance behaviors throughout the run. This can be very informative. Understanding that only requests for map scale 1:72,224 had performance problems is very useful from a tuning perspective as you would know exactly what areas of the map document or project would need to be adjusted...transactions can help you accomplish this.

Apache JMeter Transaction containing three requests from one operation:

Test

Also known as a test plan or test project. The term "test" is rather generic and is often used as both a noun (I created a test to call the resource) and a verb (I am going to test the service). Transactions and requests are usually defined in a test. The test will have additional options to configure, such as: how long the test will run for, where the results go, and whether metrics on the remote servers should be collected. Different frameworks use slightly different terminology for describing a test. In Apache JMeter's case, a test or test project is called a Test Plan and is designated with a *.jmx file extension.

Step Load

Also known as load. The step load is a characteristic that defines how long and how many concurrent test threads to apply during the test through even, incrementing pressure (e.g. similar to a staircase). Configuring the test for a step load is helpful for understanding how a map service performs or scales, or how deployment resources behave, as more and more requests are thrown at it. The defined pressure can also decrease (toward the end of the test) but does not have to.
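The staircase shape just described can be expressed as a simple schedule. This is an illustrative sketch with made-up numbers; in practice the bzm - Concurrency Thread Group builds this profile for you.

```python
def step_load(start_threads, step_threads, steps, hold_seconds):
    """Return a (thread_count, hold_seconds) pair per step of an even,
    incrementing, staircase-shaped load profile."""
    return [(start_threads + i * step_threads, hold_seconds)
            for i in range(steps)]

# 10 steps of 1 minute each, climbing from 5 to 50 threads.
schedule = step_load(5, 5, 10, 60)
```

Plotting the first element of each pair over time produces the familiar staircase seen in the Thread Group's preview graph.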
Apache JMeter Thread Group (bzm - Concurrency) specifying and visualizing a specific step load:

Constant Load

A constant load also defines how long and how many test threads to apply, but is usually set at a steady rate over a long period of time. Instead of focusing on performance and scalability, this configuration is typically for understanding durability and stability.

Apache JMeter Thread Group (bzm - Concurrency) specifying and visualizing a specific constant load:

Test Threads

Also known as threads.

This is the mechanism responsible for applying load by taking the defined work to be done in the test, such as the transactions and/or requests, and executing it repeatedly. Test threads typically behave in a serial fashion where each thread starts by reading the first request defined in the test, sends it to the server, then awaits its response. The next request in the test will not be issued until a response comes back from the server or a timeout has elapsed. Once one of these conditions is met, the thread moves to the next request. Most tests are configured to have each test thread repeat this process continuously for the duration of the run.

Various technologies often refer to test threads as virtual users, but this can be misleading. The test threads of a test are just the means (pressure) to an end (delivered throughput). In other words, the execution of a test that is configured with a step load that reaches 100 test threads does not mean the environment is supporting 100 concurrent, virtual users. In this case, determining users would be calculated from the test's throughput; transactions/sec, for example.

Apache JMeter Thread Group (bzm - Concurrency) defining the step load via (test) threads:

Users

Also known as virtual users.

The number of supported users is one of the most requested items to determine from a load test and usually takes the form of: How many users will this specific service or application support?
Will a particular service or application support at least X users?

The calculation of users is closely tied to think time as well as measured test artifacts such as throughput and response time. Using Little's Law with these inputs can provide a theoretical estimate of the number of users an environment can support.

Think Time

Also known as workflow pacing.

Think time is a duration (defined in seconds or milliseconds) that is added to a test to simulate the delays of human behavior that would occur from a person naturally interacting with the map service or web application. Think time delays can be added to transactions (e.g. an operation) or requests, or even to the test itself (which is then referred to as workflow pacing). How they are added can vary based on the testing framework involved. In Apache JMeter's case, there are several different timers available that can be added to the test to simulate various types of delays.

Key Performance Indicators (KPIs)

KPIs are test metrics that assist with the analysis of a load test. Some of the most popular ones are associated with measuring the response time and throughput of the test. However, they also extend to items that count the number of failed requests, measure the average content length (per request) or collect information on hardware utilization (such as CPU, memory, network and disk). Although the ability to capture hardware utilization often requires additional test configuration and permissions within the environment, this information is one of the most important artifacts captured from a load test.

Note: Captured hardware utilization is one of the most important artifacts captured from a load test.

Response Time

Response time is a common metric used to measure the performance of a request, transaction or test. Simply put, it provides an understanding of how fast an operation is behaving. The value is typically presented in seconds or milliseconds.
Faster performance means lower response times, which translates to a more favorable user experience. Response time can be plotted over the duration of the test to understand how performance scaled, or listed together with throughput for a particular point in the test (e.g. where throughput peaked).

Note: Response times are one of the most important artifacts captured from a load test.

Ideally, the performance of the item being tested will take on the following curve, where the response times climb more quickly around the point of peak throughput. In the following example, the average request response time at peak throughput was about 0.4 seconds.

Throughput

Throughput is a common metric used to measure the scalability of a map service, web application or hardware infrastructure. Essentially, it provides an understanding of the rate at which an operation can be conducted over a duration of time. The value is usually captured as requests/sec, transactions/sec (e.g. operations/sec) or tests/sec, though it is often expressed over the duration of an hour (the rate in seconds multiplied by 3600). Higher scalability means more throughput, which translates to support for more users. Some test analyses will focus on the average throughput of all transactions for a test, while others might examine the average throughput for each individual operation.

Note: Throughput is one of the most important artifacts captured from a load test.

Ideally, the throughput of the item being tested will resemble the following curve, where it reaches a peak then plateaus. When throughput peaks and/or plateaus, it suggests that the test has encountered some form of a bottleneck. In the following example, the average request throughput at peak was about 24 requests/second (or 86,400 requests/hour).

Bottleneck

A bottleneck is a condition of a deployment where one of its components or tiers is limiting the rate at which it can respond to incoming requests.
A bottleneck can take the form of:

Hardware examples:
- All of the CPU cores of ArcGIS Server are fully utilized
- Available memory is exhausted
- Storage disk I/O of the database server is fully utilized
- The network card is saturated due to send or receive traffic

Software examples:
- The database was configured to only allow 25 concurrent connections despite having ample hardware resources available
- Throughput for consuming a map service plateaus but ArcGIS Server CPU utilization does not increase above 25%

A bottleneck will always exist in a deployment, and determining which component restricts first is part of the analysis. It will often take a load test to expose where the first bottleneck occurs, since it may only be observed under a large amount of pressure. While server resources and settings are typically the focus of bottleneck analysis, test client resources (CPU, memory, network, disk and, in some cases, the testing license) can also be a factor.

Reaching a bottleneck is not necessarily a problem; it just lets you know where the first weakness or limitation is within the system. Sometimes a bottleneck is even considered a "good thing". For example, when running a large ArcGIS caching process, it is desirable that the CPU becomes the first bottleneck because it is doing the work to create the map tiles. If the CPU can only reach 50% because of another bottleneck (e.g., disk I/O), the job will take twice as long to finish relative to 100% CPU utilization.

Note: A bottleneck always exists in a deployment.

Test Type

Also known as a performance test, load test, stress test, endurance test, benchmark test.

Many organizations use different categories to classify the testing being carried out. A performance test is typically utilized to troubleshoot issues with a service or application when it is behaving slowly or producing longer-than-expected response times.
Performance tests do not need to involve a step load and could be conveniently executed as a single user directly interacting from a web browser with the endpoint of interest.

A load test often describes a step load test with the goal of meeting a particular throughput and response time target, for example, X transactions/sec with a response time under Y seconds and no failures. This might result in the exhaustion of one of the server hardware resources, but that is usually not the goal. A load test can also be referred to as a scalability test.

A stress test is a similar test but is frequently focused on reaching a pressure that is a multiple of the load test's goal. In other words, if the load test was trying to reach X transactions/sec, the stress test might try to reach X * 5 transactions/sec without encountering a significant number of failures.

An endurance test has the distinction of trying to break components of the system. Its applied load can be a multiple of the stress test's, where the goal is to encounter significant errors and observe the throughput and response time when they occur. An endurance test can also be referred to as a durability test, where the applied load is constant for a very long duration and hardware utilization and reclamation patterns are observed.

Test Plan

In the general sense, a test plan is a document, table or list which defines the specific tests that will be executed as well as their respective goals. These goals are the reason and purpose of each test. The analysis of the results (by hand or from generated test reports) should help you determine whether or not the goals of each test were achieved.

Testing Framework

The testing framework is the tool or technology, in the form of libraries, APIs and a graphical user interface (GUI), used for assembling requests and the test, as well as defining the load to be applied. There are many great testing frameworks out there and Apache JMeter is just one of them.
While they are all similar in purpose, many of them take different approaches to the vocabulary of certain components and how they create a test and apply load. Some put the definition of the requests and transactions into their own files, with the step load configuration in another. With Apache JMeter, all of the test objects are defined in the Test Plan and are logically separated within the tree.

Some load testing framework examples:
- Apache JMeter
- LoadRunner
- Silk Performer

Some performance testing framework examples:
- wget -- a command line tool for retrieving one or more URLs; can provide a high level of detail on each request and response
- curl -- a command line tool for retrieving one or more URLs; can provide a high level of detail on each request and response
- Fiddler -- a GUI-based HTTP debugger that can be used alone or with a web browser; can provide a high level of detail on each request and response

Testing Framework Architecture

When testing ArcGIS Enterprise, most of the architectural attention centers around scalability of the deployment tiers: Load Balancer, Web Adaptor, Portal for ArcGIS, ArcGIS Data Store, ArcGIS Server, enterprise geodatabase and network storage. While one 8-core test machine can usually send enough requests to satisfy the typical test, sometimes multiple machines are needed if the load to apply requires serious horsepower. Depending on the testing framework involved, several of the testing components can be separated out to different machines to improve the scalability of the test client.
Common components to scale out are:

Test Controller
- As the name implies, the main focus of the controller is to stop and start the test as well as coordinate the collection of test metrics from one or more Test Agents
- In Apache JMeter's case, the controller is integrated right into the GUI but is also running when the test is executed from the command line
- Other testing frameworks may have a web-based Test Controller frontend
- Typically, only one Test Controller is needed for any given test environment, but it can run on dedicated hardware that is separate from the Test Agents

Test Agent
- The primary job of the Test Agent is to send requests to and receive responses from the server
- This component performs most of the work and requires the most CPU resources
- For big jobs, multiple Test Agent machines might be needed
- In Apache JMeter's case, by default, the Test Agent runs on the same machine as the Test Controller

Test Repository
- A machine dedicated to storing the load test results
- This can include test metrics like response time, throughput and hardware utilization
- In Apache JMeter's case, the results are stored on the controller in text (*.jtl) files
- It is possible to send the results to a database, but this is not the default

Test Visualization
- A machine used to visualize the test metrics and hardware utilization in real time
- In Apache JMeter's case, the GUI is not recommended for the data visualization of a production test run, but the command line is
- If results are sent to a database, additional software can connect to the Test Repository to visualize the information

Interactive Response Time Law

The Interactive Response Time Law is a formula that defines the relationship between key performance factors, namely users, throughput, response time and user think time. The calculation can be arranged to determine the parameter of interest as long as you know the other three.
For example, if we know the number of users utilizing the system, the average response time of requests, and the average user think time, we can derive the estimated throughput demand on the system. This law is very useful when attempting to convert users to throughput (and throughput to users), among other use cases, and is foundational to testing-related areas such as capacity planning.

Given the following formula:

N = X * (R + Z)

N = Number of jobs or concurrent users
X = Throughput per second in the system
R = Response time, or the average time a job spends in the system
Z = Think time

For more information on the Interactive Response Time Law see:

http://downloads.esri.com/Support/downloads/other_/ArcGIS%20Enterprise%20deployment%20guide_Scene%20layer%20benchmark%20testing.pdf
https://homepages.inf.ed.ac.uk/jeh/biss2013/Note2.pdf

Apache JMeter released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
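The Interactive Response Time Law defined above can be sanity-checked with a short calculation. This sketch reuses the example figures from the article (about 24 requests/sec at roughly 0.4 s response time); the 10-second think time is an assumed value for illustration only.

```python
# Worked example of the Interactive Response Time Law: N = X * (R + Z).
# Throughput (24 req/sec) and response time (0.4 s) come from the article's
# example curves; the 10 s think time is an assumption for illustration.

def concurrent_users(throughput_per_sec, response_time_sec, think_time_sec):
    """N = X * (R + Z): users supported at a given throughput."""
    return throughput_per_sec * (response_time_sec + think_time_sec)

def throughput_demand(users, response_time_sec, think_time_sec):
    """Rearranged form X = N / (R + Z): throughput a user population generates."""
    return users / (response_time_sec + think_time_sec)

print(concurrent_users(24, 0.4, 10))    # 249.6 -> roughly 250 users
print(throughput_demand(250, 0.4, 10))  # about 24 requests/sec
```

Note how the same formula converts in both directions: a measured throughput and response time yield an estimated user count, while a target user count yields the throughput demand the test must generate.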
07-16-2021 04:14 PM
Hi @DeanHowell1, This is a good question. We are planning to address strategies for testing feature services in future Community Articles, as well as ways to generate test data (bounding boxes and points) from ArcGIS Pro. In the short term, the quickest way forward may be to use the HTTP recorder that is built into Apache JMeter. The recorder captures the requests as you interact with the feature service from a web application and puts them right into the Test Plan. Ideally, each pan or zoom from the application would be its own transaction in the test. If you have pre-recorded HTTP Archive (*.har) files of captured feature service traffic, there is a free utility called HAR2JMX (har to jmx) that can convert them right into an Apache JMeter Test Plan.
06-30-2021 12:00 PM
Actual Performance Curve chart adjusted to show the response time point that corresponds more accurately to the maximum throughput.
06-30-2021 09:54 AM
Choosing a Capability of ArcGIS Enterprise to Benchmark

As the foundational software system for GIS, ArcGIS Enterprise performs many duties, such as mapping, visualization and analytics. Across this wide range of capabilities and functions, there is no single test that can represent all of its abilities. However, if one function were to be used as a benchmark for testing an ArcGIS Enterprise deployment, a strong case can be made for the map service export function. Export map can be called easily and programmatically in an Apache JMeter Test Plan by varying the spatial extents of the requests from CSV data files across several map scales. This translates to just one request for each map scale transaction, which helps keep the test from becoming complicated and difficult to maintain. Couple this with the fact that the export function has been available since version 9.3, and it makes for a proven and reliable operation to benchmark.

What is a Benchmark of a Map Service?

GIS testers and administrators are often tasked with understanding the differences in throughput between two systems, or the same system after some form of environment modification. In such scenarios, a benchmark is the process of carrying out a load test to act as a standard against which multiple things can be compared to one another. With respect to GIS, this load test would be an Apache JMeter Test Plan executing a step load test against an ArcGIS Enterprise map service to understand the highest rate of throughput (transactions/sec or requests/sec) that can be achieved from the deployment given a particular state or configuration. This rate is also known as the peak throughput. At peak throughput, understanding the performance (transaction or request response time) is also critical to measure.

Benchmark Dataset

Any dataset can be used for a benchmark as long as it is kept constant, where changes like feature class additions, updates, deletes and versions are not being made.
This consistency helps create a dependable "standard" since it is a non-moving target. The test data can be private (e.g. proprietary) or public domain based.

What is Public Domain Data?

Generally speaking, public domain data is any raster or vector dataset that is free to download and use. There are many public domain datasets out there (and potentially different licenses that define them). The data used in this Article is Made with Natural Earth and provided through the Creative Commons (CC0) license.

Why Use Public Domain Data?

One of the characteristics of a good benchmark is constructing the test so that others are able to repeat the same test that you did. Public domain data is a good choice in this regard as it promotes a testing standard and a dependable measuring stick for performance and scalability.

SampleWorldCities vs Natural Earth

While ArcGIS Server's inclusion of SampleWorldCities through its installation helps make the dataset ubiquitous and good for test examples and walkthroughs, its extremely small size does not make it ideal for benchmarking a map service. The Natural Earth datasets, on the other hand, provide some decent map detail (at smaller scales) covering the whole world. Additionally, this comes in an easily accommodated disk size footprint, which helps make it more practical to share, download and use.

The Benchmark Natural Earth Dataset

Download the benchmark dataset here

The data is a subset of the Natural_Earth_quick_start.zip and includes a modified MXD for ArcMap 10.8.1 and an ArcGIS Pro 2.8 project. Either can be used to publish a map service to ArcGIS Enterprise. The Natural Earth subset of data should look similar to the following when opened in ArcGIS Pro (or ArcMap).

Deployment Architecture

Architecture is an important detail of a benchmark. The following are all important components of benchmark architecture that have an impact on the test:

Does a Web Adaptor exist?
Was authentication involved or was the service made available to everyone?
- Portal for ArcGIS authentication
- ArcGIS Server token authentication
- Available to everyone
How many machines took part in the ArcGIS Site?
Processor details
- Processor model and architecture
- Number of CPU cores for each server (including the testing client workstation)
- Physical, virtual or cloud
Physical memory details
- Total amount of system memory
Network speed
ArcGIS Enterprise version
Operating system version

Note: It is recommended to take note of the deployment architecture details. Saving this information with the test results can help give proper context and meaning to the analysis or conclusions.

The results listed for this benchmark test were run against the following environment architecture:

ArcGIS Server (10.9 Final)
- Dell PowerEdge R640
- SPECint_rate_base2006
- HyperThreading disabled
- 128GB RAM
- Windows Server 2019
- 10G network

ArcGIS Web Adaptor (10.9 Final)
- Dell PowerEdge R440
- SPECint_base2006
- HyperThreading disabled
- 64GB RAM
- Windows Server 2019
- 10G network

Test Client
- Apache JMeter 5.4.1
- Dell PowerEdge R640
- SPECint_rate_base2006
- 6 virtual CPUs
- 16GB RAM
- Windows Server 2019
- 10G network

Data Source Type and Location

Using either a file geodatabase or an enterprise geodatabase to store the benchmark data is fine. Regardless of which is used, the data source detail is an important property of the environment and should be noted.

Note: It is recommended to take note of the data source type. Saving this information with the test results can help give proper context and meaning to the analysis or conclusions.

As for location, using a remote file geodatabase instead of a local file geodatabase might be necessary if the deployment has multiple servers that make up the ArcGIS Enterprise Site. In either case, remote or local, the data source location is also an important detail of the test environment that should be noted.
Note: It is recommended to take note of the data source location. Saving this information with the test results can help give proper context and meaning to the analysis or conclusions.

Service Type and Number of Instances

For the most widely used ArcGIS map services in a Site, it is recommended to publish the resource as a Dedicated instance instead of Shared. Although both types can scale to fully utilize the available hardware, a Dedicated service instance has resources behind the scenes that are devoted to it, which makes it an ideal choice for a benchmark test. For predictable performance, it is recommended to set the Minimum and Maximum number of instances for the Dedicated instance type equal to the number of CPU cores of the ArcGIS Server machine.

Note: It is recommended to take note of the service type and number of instances. Saving this information with the test results can help give proper context and meaning to the analysis or conclusions.

Do the Request Options in a Benchmark Test Matter?

Absolutely! Using a common dataset and the export map function is not enough to establish a dependable benchmark. The export operation is extremely versatile, but through this flexibility an image can be generated with a variety of different input options. A load test that sends requests to the map service consistently is important for establishing a reliable benchmark. Can the test request a BMP image format instead of a PNG, or ask for data in a different spatial reference other than the default of 4326? Yes, but changing such options may impact the performance and scalability of the test, so it is recommended to leave these Test Plan settings as they are.
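The consistency point above can be sketched programmatically: a benchmark driver should vary only the spatial extent while holding every other export option fixed, so that runs remain comparable. The following Python sketch does exactly that; the service URL, the extents CSV layout (one minx,miny,maxx,maxy row per request) and the fixed option values are illustrative assumptions to adapt to your environment, not part of the provided Test Plan.

```python
# Sketch: generate export map requests that vary ONLY the bounding box,
# keeping every other option constant so benchmark runs stay comparable.
# SERVICE and the extents CSV path are placeholders for your environment.
import csv
from urllib.parse import urlencode

SERVICE = "https://yourwebadaptor.domain.com/server/rest/services/NaturalEarth/MapServer/export"

# Held constant across all benchmark requests (PNG32 output, WGS84, fixed size)
FIXED_OPTIONS = {
    "bboxSR": 4326,
    "imageSR": 4326,
    "size": "1920,882",
    "dpi": 96,
    "format": "png32",
    "transparent": "true",
    "f": "image",
}

def export_urls(extents_csv_path):
    """Yield one export URL per minx,miny,maxx,maxy row in the CSV file."""
    with open(extents_csv_path, newline="") as fh:
        for minx, miny, maxx, maxy in csv.reader(fh):
            # Only the bbox changes; everything else comes from FIXED_OPTIONS.
            params = dict(FIXED_OPTIONS, bbox=f"{minx},{miny},{maxx},{maxy}")
            yield f"{SERVICE}?{urlencode(params)}"
```

In an Apache JMeter Test Plan, the same idea is expressed with a CSV Data Set Config feeding the bbox variable while the remaining parameters stay hard-coded in the HTTP request.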
The Map Service Benchmark Test Plan

To download the Apache JMeter Test Plan used in this Article see: naturalearth1.zip

This Test Plan is largely based on the SampleWorldCities test project from a previous Article.

Downloading and opening the Test Plan in Apache JMeter should look similar to the following:

Adjust the User Defined Variables to fit your environment.

The request composition (one for each of the 5 tested map scales) should look similar to the following:

The Thread Group Configuration

The Thread Group defines the step load characteristics of the test and plays an important role. For an export map, the maximum Number of Threads for the test has a close relationship with the maximum number of ArcGIS Server CPU cores (and similarly, the maximum number of service instances). Configuring the test threads to exceed the number of cores helps ensure enough pressure is applied to fully utilize the server CPU resources. From there, peak throughput should be observed, which is a primary goal of a benchmark test.

Note: Not all tested datasets may show the respective service fully utilizing the CPU of the ArcGIS Server tier. In such cases, additional troubleshooting is needed to understand where the bottleneck exists that is limiting the scalability of the given workflow.

As a general rule of thumb, configure the maximum step load to be 25% -- 60% higher than the number of server CPU cores.

As seen below, the Test Plan is configured to run for 1 hour and reach a maximum step load of 40 concurrent test threads. This starts the benchmark at 1 test thread and adds an additional thread every 90 seconds. This benchmark was designed to test an ArcGIS Server deployment running on 24 physical CPU cores. Adjust accordingly; not every ArcGIS Server will run on 24 physical cores and the maximum step values may be too high for your deployment.

Note: It is recommended to take note of the step load configuration details.
Saving this information with the test results can help give proper context and meaning to the analysis or conclusions.

Benchmark Test Execution

The benchmark should be run in the same manner as a typical JMeter Test Plan. See the runMe.bat script included with the naturalearth1.zip project for an example of how to run a test as recommended by the Apache JMeter team.

Note: It is always recommended to coordinate the load test start time and duration with the appropriate personnel. This ensures minimal impact to users and other colleagues who may also need to use the ArcGIS Enterprise Site. Additionally, this helps prevent system noise from other activity and use which may "pollute" the test results.

Results and Analysis

Once the load test has completed, the runMe.bat instructs Apache JMeter to automatically generate a report to assist with the analysis of the results. Entire Articles and internet resources could be devoted exclusively to analyzing the components of load test results. So, in the interest of keeping things simple, our focus will be on the request throughput (requests/sec) and request performance (seconds) metrics from the report. The diagrams below illustrate the ideal trends of these two items over the course of the test.

The Ideal Throughput Curve

Ideally, the throughput curve will have the form of the orange line above. The point where the curve peaks and begins to flatten is an indication that the system has reached its highest level of throughput (due to a hardware or software bottleneck). The area of the graph where the curve bends is referred to as the knee, and the value for maximum throughput is at this point. The blue line represents the increasing step load of the test.

The Ideal Performance Curve

Ideally, the response time curve will have the form of the green line above. It is taken at the same point in the test as maximum throughput. The blue line represents the increasing step load of the test.
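Beyond the generated dashboard, the throughput curve described above can also be derived directly from the raw results file. A minimal Python sketch follows, assuming the results were saved in JMeter's default CSV (*.jtl) format with a header row and a timeStamp column in epoch milliseconds; verify the column layout against your own jmeter.properties before relying on it.

```python
# Sketch: compute observed requests/sec from a JMeter results (*.jtl) file
# saved in the default CSV format. Assumes a header row with a 'timeStamp'
# column holding each sample's start time in epoch milliseconds.
import csv
from collections import Counter

def hits_per_second(jtl_path):
    """Bucket completed samples by wall-clock second."""
    buckets = Counter()
    with open(jtl_path, newline="") as fh:
        for row in csv.DictReader(fh):
            buckets[int(row["timeStamp"]) // 1000] += 1
    return buckets

def peak_throughput(buckets):
    """The busiest one-second bucket approximates the observed peak requests/sec."""
    return max(buckets.values()) if buckets else 0
```

Plotting the per-second buckets over the test duration reproduces the shape of the Hits Per Second chart in the JMeter report, making it easy to spot the knee where throughput plateaus.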
JMeter Report

Included with the naturalearth1.zip project is an Apache JMeter report called naturalearth1_run1 within the reports folder. Opening the index.html will reveal multiple charts and tables to assist with the analysis.

Actual Throughput Curve

From the report: under Charts --> Throughput, the Hits Per Second chart can be found, where the request throughput from the test is plotted. Since the test was constructed with each transaction containing only one request, "hits per second" is equivalent to both transactions/sec and requests/sec. The system achieved a maximum throughput of about 80 transactions/sec (or 80 requests/sec).

Actual Performance Curve

From the report: under Charts --> Response Times, the Time Vs Threads chart can be found, where the request performance from the test is plotted. All items except "/pvtserver/rest/services/NaturalEarth/MapServer/export" are filtered out (by clicking on them within the legend). Since the test was constructed with each transaction containing only one request, the "export request" also represents the average transaction performance. At the point of maximum throughput, the system delivered a transaction performance of about 314ms (0.3 seconds).

Note: A different approach to the analysis will need to be taken for load tests containing transactions with more than one request.

Comparing the Results

After you have completed the test of your system with the provided data and Test Plan, you can compare the results with those listed in this Article. This can provide an approximate measuring stick for equating two systems.

To download the Apache JMeter Test Plan used in this Article see: naturalearth1.zip
To download the Natural Earth subset of data used in this Article see: Natural_Earth_Test_Data

Apache JMeter released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
06-29-2021 12:55 AM
Hi @DeanHowell1, Thanks for reading our Article on Creating a Load Test in Apache JMeter. I agree with you...it appears your deployment is requiring a token in order to consume the SampleWorldCities map service. We recently added a walkthrough on Using Apache JMeter to Load Test an ArcGIS Enterprise Authenticated Service (Intermediate/Advanced) which may help. At the end of that Article, there is a Test Plan you can download which includes all of the pieces listed in the discussion. Hope this helps! Aaron
06-21-2021 05:25 PM
Performance Engineering: Load Testing ArcGIS Enterprise

What is Performance Engineering?

Performance Engineering is the practice of proactively testing, monitoring and analyzing an ArcGIS Enterprise deployment or application from the perspective of performance and/or scalability. It can encompass both hardware (e.g. CPU and memory utilization) and software components (e.g. map service composition) of a Site. Performance Engineering efforts typically involve multiple tools to carry out the testing and monitoring functions.

Why are Performance and Scalability Important?

System performance and scalability are critical factors in the successful adoption, operation, and long-term use of an ArcGIS Enterprise deployment. They are often key determinants of end-user satisfaction. The Performance Engineering team in Professional Services provides resources in the form of Community Articles to help achieve those results through the implementation of modern performance and scalability testing and troubleshooting best practices using ArcGIS Enterprise.

What Tools are Recommended for Load Testing and Analysis?

There are a tremendous number of high-quality testing tools for troubleshooting, analyzing and monitoring the performance of web applications and map services. Unfortunately, it is impossible to cover all of them and discuss how they can be used. Instead, Performance Engineering Articles will focus heavily on using Apache JMeter for our performance and load testing tutorials with ArcGIS Enterprise. For many of our Articles, we provide the Apache JMeter Test Plan that was built specifically for each walkthrough.
Performance Engineering Articles

Strategies, Integration & Configuration and Operational Support

Recommended Strategies for Load Testing an ArcGIS Server Deployment (Beginner/Intermediate) -- General strategies for load testing an ArcGIS Server setup; not specific to any testing tool
Testing Fundamentals, Meanings and How They Are Used (Beginner) -- Vocabulary definitions for common testing items and phrases
ArcGIS Enterprise Analysis with System Log Parser's Optimized Analysis Type (Beginner) -- Improve your log parsing experience by taking advantage of the Optimized Analysis Type in System Log Parser
Automating System Log Parser from the Windows Command Line (Beginner/Intermediate) -- Several helpful tips and tricks for automating the parsing of ArcGIS Enterprise logs with the command-line version of System Log Parser
ArcGIS Enterprise Analysis with System Log Parser's ServiceDetails Analysis Type (Beginner) -- Use the ServiceDetails Analysis Type in System Log Parser to summarize important information from your services
Optimizing ArcSOC Availability and Utilization (Beginner/Intermediate) -- How to observe and match the Instance configuration of dedicated services to the incoming demand
ArcGIS Server Performance Strategies (Beginner/Intermediate) -- Common performance challenges and strategies for overcoming them
ArcGIS Enterprise Analysis with System Log Parser: Understanding Anonymous Entries for the User Name (Beginner) -- Understanding why the value of "anonymous" can be seen in the System Log Parser report's "Statistics By User" worksheet
ArcGIS Enterprise: Is It a Good Idea to Load Test Shared Services? (Beginner) -- A quick discussion of items to consider before load testing shared services
Benefits of Analyzing ArcGIS Server Log Entries of Level Info -- Taking advantage of ArcGIS Server's Info log entries and System Log Parser to get an enhanced statistical service analysis of your Site (** Added July 2025 **)

Performance and Load Testing Walkthroughs

Performance Testing with Apache JMeter (An Introduction) -- An introduction to performance testing with Apache JMeter; set up a very simple load test
Creating a Load Test in Apache JMeter against the SampleWorldCities Map Service (Beginner/Intermediate) -- A detailed walkthrough for building a dynamic load test in Apache JMeter against a map service
Running an Apache JMeter Load Test from Command-line mode (Beginner/Intermediate) -- Procedures and strategies for running an Apache JMeter load test from the command line
Using Apache JMeter to Load Test an ArcGIS Enterprise Authenticated Service (Intermediate/Advanced) -- A discussion on how authentication can be used in an Apache JMeter test to apply load to a secured map service
Using Public Domain Data to Benchmark an ArcGIS Enterprise Map Service (Intermediate) -- A discussion on running an export map test with public data to act as a benchmark of a map service; benchmark results from an Esri lab environment included
Using ArcGIS Pro to Generate Test Data for Use with Map Services (Beginner/Intermediate) -- A walkthrough on using the new Load Test Tools utility to generate test data that can be spatially customized
Creating a Load Test in Apache JMeter Against a Hosted Feature Layer Service (Intermediate/Advanced) -- A walkthrough on using an update to the Load Test Tools utility to generate test data for a hosted feature layer service and how to utilize this programmatically with an Apache JMeter Test Plan
Creating a Load Test in Apache JMeter Against a Network Analyst Route Service (Intermediate/Advanced) -- A walkthrough on using an update to the Load Test Tools utility to generate test data for a Network Analyst route service and how to utilize this programmatically with an Apache JMeter Test Plan
Creating a Load Test in Apache JMeter Against a Cached Map Service (Advanced) -- A walkthrough on using the latest Load Test Tools utility to generate test data for a cached map service and how to utilize this programmatically with an Apache JMeter Test Plan
Load Test an Asynchronous Geoprocessing Service Using Apache JMeter (Advanced) -- A walkthrough on how to load test an asynchronous geoprocessing service; includes an Apache JMeter Test Plan and a link to a full-featured GP model
Capturing Hardware Utilization During an Apache JMeter Load Test (Intermediate) -- A discussion of several common scenarios for capturing the hardware usage of machines in an ArcGIS Enterprise deployment
Using a Branch Versioning Editing Load Test with Apache JMeter (Advanced) -- A discussion of strategies for using a load test to conduct branch versioning editing
Benchmark ArcGIS Enterprise Without a Dataset (Intermediate) -- Use the built-in Geometry service to easily benchmark the underlying hardware of the ArcGIS Server machine

Administration Automation

ArcGIS Enterprise User Administration Automation with Apache JMeter (Intermediate) -- A walkthrough on how to use Apache JMeter to automate some common administrative user tasks in ArcGIS Enterprise; includes several Apache JMeter Test Plans

Related Boards

Implementing ArcGIS | ArcGIS Enterprise

Attribution

File:Wikimedia_Foundation_Servers-8055_17.jpg; Victorgrigas, CC BY-SA 3.0 <https://creativecommons.org/licenses/by-sa/3.0>, via Wikimedia Commons, Created: 16 July 2012
File:Blumfield_V-twin_motorcycle_engine.jpg, Public Domain, Created: 1 January 1912

Apache JMeter released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
06-21-2021
03:41 PM
Why Add Authentication to a Test Plan?

Previous Articles covered walkthroughs for building an Apache JMeter Test Plan to apply load to an ArcGIS Enterprise map service such as SampleWorldCities. These were great to use as a primer for constructing a Test Plan to dynamically test a service; however, they all assumed the remote endpoint was anonymously/publicly accessible. This is often not the case, as many deployments will have some form of authentication in place. While one Article will not be enough to cover every ArcGIS Enterprise authentication scenario, it will discuss a common one: the composition of a Test Plan for a secured service that requires a token from a built-in portal member or domain member.

Note: While this Test Plan will be used against a service that requires user authentication from the portal, it does not cover the similar but technically different scenario of utilizing Single Sign On (Integrated Windows Authentication) to automatically log in as the member running the testing software.

Getting Started

For simplicity, this Article will build directly off the Test Plan used in Running an Apache JMeter Load Test from Command-line mode (Beginner/Intermediate), called sampleworldcities3.zip. This Test Plan dynamically applied load to a publicly accessible ArcGIS Enterprise service called SampleWorldCities across four different map scales, each of which was an HTTP Request placed in its own Transaction Controller. Many of the components of the test (e.g. Project Folder, Web Server Name, Service Name) were conveniently put into variables to make sharing and portability easy. 
Creating the sampleworldcities4 Test Plan

Simply put, sampleworldcities4 is based directly on the sampleworldcities3 Test Plan:

Download the sampleworldcities3.zip Test Plan
Unzip sampleworldcities3.zip
From the file system, make a copy of the sampleworldcities3 folder and name it sampleworldcities4
Open the sampleworldcities4 folder and rename the sampleworldcities3.jmx file to sampleworldcities4.jmx (this procedure assumes the Test Plan will be stored in C:\JMeter Tests\)
From within JMeter, open the "new" sampleworldcities4.jmx test

Extending the Test Plan to Add Authentication Support

Authentication will be added in three parts: more User Defined Variables, Authentication Requests to Acquire a Token, and Token Support in the Map Request Headers.

User Defined Variables

Add additional User Defined Variables. Click on the Test Plan (e.g. sampleworldcities4); by default, this should be selected when a new Test Plan is opened. At the bottom of the User Defined Variables section, click Add:

For Name enter: Username; for Value enter the username of a member within the portal that has been authorized to consume the SampleWorldCities map service (if the user is a domain member, enter domain\username)
For Name enter: Password; for Value enter the password associated with the username above
For Name enter: TokenExpirationMinutes; for Value enter: 240. This will generate tokens with a 4 hour expiration, which can be adjusted as needed, but a typical test does not exceed this duration

Note: Once the Test Plan is saved, Apache JMeter stores the username and password in plain text within the JMX file.

Rename the value for the ProjectFolder User Defined Variable from C:\JMeter Tests\sampleworldcities3 to C:\JMeter Tests\sampleworldcities4. If the Test Plan resides in another location, please adjust as needed.

Rename the values for the WebServerName, PortalInstanceName, and ServerInstanceName User Defined Variables 
Rename the Value contents as needed.

Authentication Requests to Acquire a Token

When you authenticate to an ArcGIS Enterprise Site from a web browser with a typical JavaScript application using built-in or domain member credentials, there are several items sent back and forth between the client and the server in the process. These items appear in several HTTP requests and responses. While this traffic is made up of many requests, the majority of them are calls for static content that are only needed for presentation (within the browser). The core authentication pieces, captured by logging into the REST endpoint of a deployment, come from three requests:

OAuthState
AccessToken
Token

In the Test Plan, these three HTTP requests will be put into one Transaction Controller that makes up the authentication logic. Additionally, a Regular Expression Extractor will be used with each request to pull specific information out of the server's responses. Lastly, this will all be put into another logical container called the Once Only Controller, which becomes the first item in the test. As the name implies, anything grouped into this controller is executed only once per the lifetime of each test thread.

Note: The Once Only Controller is used because, in this test case, it is not desired to authenticate and create a new token with every iteration of each test thread (this would carry too much overhead). But since the authentication is not reissued during the run, the test duration needs to be less than the token expiration (4 hours as defined above). This approach is just one type of test design; there can be variations which purposely simulate a heavier load from token generation. For example, the defined username and password could optionally come from a CSV file to simulate different user credentials being used. 
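As a quick sanity check of the constraint above, the relationship between the token lifetime and the run length can be expressed in a couple of lines (the test duration value here is a hypothetical example, not taken from the Test Plan):

```python
# TokenExpirationMinutes from the Test Plan's User Defined Variables
token_expiration_minutes = 240

# Hypothetical planned run length; the Once Only Controller never
# re-authenticates, so the whole run must finish before tokens expire
test_duration_minutes = 20

run_fits_token_lifetime = test_duration_minutes < token_expiration_minutes
```

If the planned run were longer than the token lifetime, either TokenExpirationMinutes would need to be raised or the authentication logic moved out of the Once Only Controller.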
Despite the light-weight design of this Test Plan's token generation, a test that increases steps too quickly or increases pressure in large increments could still exert a decent amount of load, as every new thread will request a token. In this case, a recommendation is to start small and increase the step load in small increments until a value is found that works for your system and your workflow.

Looking at the authentication logic in this Test Plan example, there are variables but not much in the way of customization. Therefore, screenshots are shown instead of textually listing out each step to build each HTTP Request and Regular Expression Extractor. The complete Test Plan can be downloaded at the end of this Article.

OAuthState HTTP Request -- Responsible for generating an OAuthState response (implicit grant) from the server.
OAuthState Regular Expression Extractor -- Used to capture the state into a variable called: oauthState
AccessToken HTTP Request -- Responsible for passing in the credentials and oauth_state variables. If valid, these will generate an AccessToken response from the server.
AccessToken Regular Expression Extractor -- Used to capture the item into a variable called: accessToken
Token HTTP Request -- Responsible for generating a token from the server.
Token Regular Expression Extractor -- Used to capture the item into a variable called: token. This variable is passed as an HTTP header to authenticate each request against the secured service.
Token Response Assertion -- Added to the token generation request to check for a specific string in the response. If the string "expires" is not found, then a token was not generated and the entire transaction is marked as failed (having a failed request). 
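The extractor-and-assertion pair can be sketched outside of JMeter. The following is a minimal Python illustration assuming a hypothetical token response body; the regex and the sample values mirror the variables described above but are not the exact expressions from the Test Plan:

```python
import re

# Hypothetical token endpoint response body (ArcGIS-style JSON as a string);
# in the real test this comes back from the Token HTTP Request
response_body = '{"token":"AbCdEf123456","expires":1624315200000,"ssl":true}'

# Regular Expression Extractor equivalent: capture the token value
match = re.search(r'"token"\s*:\s*"([^"]+)"', response_body)
token = match.group(1) if match else None

# Response Assertion equivalent: if "expires" is absent, no token was
# generated and the transaction would be marked as failed
assertion_passed = "expires" in response_body

# The captured value is then attached to each secured map request
# as an HTTP header (the agstoken cookie)
headers = {"Cookie": f"agstoken={token}"}
```

JMeter performs the same steps declaratively; the sketch only shows the flow of data from response to extractor to header.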
Token Support in the Map Request Headers

With the token variable populated by the authentication logic, the HTTP headers for each secured map request can be expanded to include it under the name Cookie with the value: agstoken=${token}

Note: The HTTP Cookie Manager is not used in this Test Plan example.

From a technical point of view, the token could have been passed in the request as a key/value pair, but for improved security, it is passed as an HTTP header.

Validating the Test Plan

Ensure the View Results Tree is enabled and start the test (e.g. the green arrow) to validate the responses. The green check marks for all parts of the authentication indicate that a valid token was obtained from the provided credentials. The image from the export map calls indicates that a request was successfully issued against the secured service.

To download the Apache JMeter Test Plan used in this Article see: sampleworldcities4.zip

Apache JMeter released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
06-14-2021
05:44 PM
Why Run Test Plans from Command-line mode?

The primary reason for running a load test from Command-line (CLI) mode and not through the GUI mode is that the latter can decrease JMeter's capabilities. Using the GUI to run the test can consume additional CPU and memory, which can negatively impact the test results. CLI mode is the optimal choice for test execution and is the methodology recommended by the Apache JMeter team. Running a test through a command window might be slightly old school, but it is still very effective.

Verifying the Pre-Test Checklist

It is a good testing practice to utilize a pre-test checklist to help get the most effective use of your test time. Recommended checklist items include:

Coordinate the load test start time and duration with administrative staff. This ensures minimal impact of the test on users and other colleagues of the ArcGIS Enterprise Site, and helps prevent system noise from other activity and use which may "pollute" the test results.

When testing a traditional (e.g. 
dedicated) ArcGIS Service, ensure the minimum and maximum instances are set appropriately. This is necessary if your test goal is to understand the maximum achievable throughput of the service. For max throughput, a general rule of thumb is to set the maximum instances to the number of CPU cores; for predictable performance, set the minimum instances equal to the maximum instances. Adjusting instances will cause a service to restart (plan accordingly), and increasing the minimum instances of a service will require additional physical memory on the ArcGIS Server machine.

If the Test Plan contains a Listener like the View Results Tree, ensure it is disabled from the GUI. Running Listeners can increase the consumption of test client resources and potentially impact the test.

Validate the Thread Group for step load logic and test duration. If the test is scheduled for a specific window, ensure the test is set to run the expected length of time.

Ensure the ArcGIS Server Log Level is not set to DEBUG or VERBOSE. While these log levels can be helpful for troubleshooting issues, they can impact the performance and scalability of the service(s) being tested.

Batch Script Walkthrough

Since the command line is utilized for running the Apache JMeter load test, it is advantageous to assemble the actions and environment prep into a batch script (e.g. a *.bat file in Windows). This favors repeatability and maintenance.

Variables

Using variables is a handy way to improve the maintainability of a script. While hard-coding values is technically fine, it can hinder readability if paths for certain items are very long. The following sections cover the primary components where our batch script will utilize variables.

Memory

By default, Apache JMeter (version 5.4.1) runs with a minimum and maximum of 1GB of memory. 
This is adequate for simple tests running on older machine hardware, but some tests require more complex logic or multiple data files, and more memory might be needed to ensure the test results are not impacted.

Note: Many load tests focus on the resource utilization of the server hardware. However, the test client's capabilities can also be a bottleneck that limits the ability to accomplish the test goals.

Increasing the default memory can be done easily with the following:

set heap=-Xms4g -Xmx4g -XX:MaxMetaspaceSize=256m

JMeter and Project Paths

We want to tell the batch script where to find Apache JMeter and our Test Plan. Setting the path to the JMeter bin folder is straightforward:

set jmeterbin=C:\apache-jmeter-5.4.1\bin

As is the (manually created) project folder where the jmx file (e.g. the Test Plan) will reside:

set projectdir=C:\JMeter Tests\sampleworldcities3

For some extensibility, a variable is created for the name of the jmx file separate from the project folder (without the jmx extension, even if they utilize the same name):

set testname=sampleworldcities3

For more convenience, a variable is created that will be appended to each run. For multiple reasons, a typical load test can be run several times, each with a slightly different option enabled within the Test Plan. This variable helps keep track of the different runs of the same test:

set runname=run1

Test Execution and Switches

Thanks to the utilization of the variables mentioned above, the execution of JMeter with a Test Plan passed in as a command-line switch can easily be reviewed in just a few lines:

%jmeterbin%\jmeter -n -f -t "%projectdir%\%testname%.jmx" ^
-l "%projectdir%\results\results_%testname%_%runname%.jtl" ^
-j "%projectdir%\logs\debug_%testname%_%runname%.log" ^
-e -o "%projectdir%\reports\%testname%_%runname%"

The -n switch runs JMeter in command-line mode.
The -t "path_to_jmx" switch tells JMeter where on the file system to find the jmx file. 
The -l "path_to_jtl" switch tells JMeter where to log samples; this is the results file and the most important artifact of the test run.
The -j "path_to_log" switch tells JMeter where to store the JMeter run log (e.g. environment information).
The -e switch tells JMeter to generate a test report (dashboard) once the test has completed.
The -o "path_to_report_folder" switch tells JMeter where to generate the report (dashboard).
The -f switch tells JMeter to force delete existing results files and the report folder if they are present before starting the test. Altering the "runname" variable in the script is the easiest way to ensure a new results file and report are created with each test run.

The caret character (^) is an "escape" character added to the Windows batch script so that the next character (the newline) is interpreted as an ordinary character, letting one long command span multiple lines for readability. Note that the last line of the command should not end with a caret, or the following line of the script would be treated as part of the command.

Putting It All Together

The complete batch script is as follows:

echo off
rem Scripted JMeter Test Plan execution
rem Utilizing jmeter.bat for invocation
rem
rem 2021/06/07.1
rem
rem ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rem *** Variables ***
rem Set JMeter memory to min/max of 4GB (adjust based on your test client resources)
set heap=-Xms4g -Xmx4g -XX:MaxMetaspaceSize=256m
rem Location of %JAVA_HOME%\bin (currently not used)
rem set javadir=C:\jdk-16.0.1\bin
rem Location of Apache JMeter bin
set jmeterbin=C:\apache-jmeter-5.4.1\bin
rem Location of JMeter Test Plan root folder (e.g. the folder where the Test Plan resides)
set projectdir=C:\JMeter Tests\sampleworldcities3
rem Name of the JMeter Test Plan (without the JMX file extension)
set testname=sampleworldcities3
rem String appended to results file of each test run
set runname=run1
rem ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
echo on
rem *** Test started ***
%jmeterbin%\jmeter -n -f -t "%projectdir%\%testname%.jmx" ^
-l "%projectdir%\results\results_%testname%_%runname%.jtl" ^
-j "%projectdir%\logs\debug_%testname%_%runname%.log" ^
-e -o "%projectdir%\reports\%testname%_%runname%"
rem *** Test completed ***
echo off

Running the Load Test

For the first few runs of your test, it is recommended to execute it from a command window that is already open. This way, if there are immediate issues invoking any of the commands in the batch script, they will persist on the screen so they can be corrected more easily (e.g. a typo). From Windows, you can invoke a Command Prompt window by clicking Start, then typing cmd. Once this window appears, enter the path to your batch script: "C:\JMeter Tests\sampleworldcities3\runMe.bat", then hit Enter. The script will run for the configured duration and then return to the command prompt.

Running from command-line mode does not provide the console with real-time statistics of the test execution. Such functionality can be obtained but is not covered in this Article.

Test Artifacts

Test artifacts are items created by the load test that can be used for analysis. Items like the report and the JMeter Text Log (JTL) files, which contain the raw test results, are typically the most important to retain. The run log can also be useful, as its listing can be used to confirm the configured memory of the environment for the test run as well as updates on which step was being executed and when. After multiple runs of the same test, where a different runname was used in the script, it is easy to wind up with many jtl files and report folders. This is where following the testing strategy of utilizing a project folder for management is key.

Note: The report generated by JMeter is HTML and JavaScript based. If some components of the report are blank or do not appear to render, try adjusting the security settings of the browser or open it in another browser, if available.

The initial page and Response Times Over Time section of a JMeter generated report

To download the Apache JMeter Test Plan used in this Article see: sampleworldcities3.zip. This test requires the Custom Thread Groups plugin to be installed in JMeter. 
Preparing for Analysis

The generated report is one of the best places to start when conducting analysis to determine if the test goals were achieved. Since the analysis of a load test can be an involved process worth discussing in detail, it is a procedure that deserves its own Article.

For a set of additional testing tutorials (of increasing complexity), see: Performance Engineering: Load Testing ArcGIS Enterprise

Apache JMeter released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
06-07-2021
05:34 PM
Prerequisite

This tutorial contains strategies and procedures for creating and running a dynamic load test against an ArcGIS Server service and assumes you are familiar with some of the basics of Apache JMeter. If you are new to Apache JMeter, please see our Performance Testing with Apache JMeter (An Introduction).

JMeter Testing Strategies

Before diving into the details of constructing and validating a data driven load test, it’s important to first discuss a few testing strategies specific to JMeter. These tips are different than the items listed in our Recommended Strategies for Load Testing an ArcGIS Server Deployment Article, which is intended for use with any testing tool (but is also applicable to JMeter).

A Good Test Plan Starts with a Proper Folder Structure

One tactic that can help with managing multiple tests and multiple reports in JMeter is to implement a simple but consistent folder structure for storing the various elements of each test. The primary driver for a JMeter test is the JMX file…this is where all the testing logic resides. But as you continue to add features to your test and run it multiple times, you will expand beyond the management of just the single JMX file. A little organization goes a long way…manually creating a few key directories can help manage the various JMeter files that will populate the hard drive over time.

Note: This approach is done from the Operating System’s file system and not directly from Apache JMeter. 
A recommended folder structure:

name_of_project (the JMX file should be stored within it using the same name)
  datasets -- will contain data files that make our requests dynamic with each test iteration
  logs -- will contain debug logs of the JMeter test environment
  reports -- will contain JMeter generated reports
  results -- will contain JMeter test result files; these are the RAW results files and the most important artifact of the test
  uploads -- will contain files that are to be uploaded (optional)

Within the "project" folder, 5 empty directories would be created: datasets, logs, reports, results and uploads.

User Defined Variables

Like the folder structure mentioned above, utilizing User Defined Variables is a test construction strategy that favors portability, reusability and Test Plan sharing. Some common variables used with an ArcGIS service test are:

ProjectFolder
WebServerName
ServerInstanceName
SecurePort
ServiceName

In the JMeter Test Plan, these variables can easily be referenced with notation like ${ServiceName}, which is automatically replaced with the name of the service at execution time. Using variables is less error prone than using hard-coded values if you ever plan to use the test in another environment. When the test is then run against another deployment, the tester only needs to adjust the variable definition. For Test Plans with many defined HTTP Request samplers, this can be a real time saver.

Choosing the Right Thread Group

There are several choices when it comes to defining your step load logic. They are all similar and offer the same basic functionality, but do have some subtle differences. Thread Group is a good, safe choice that offers maximum test portability as it is included with the core JMeter product. That said, using the JMeter Plugins Manager (installed as part of the Performance Testing with Apache JMeter (An Introduction)) to add the “Custom Thread Groups” can provide the test environment with some other useful options in this arena. 
Adding Support for Custom Thread Groups into JMeter

From the Plugins Manager within JMeter (Options-->Plugins Manager):

Select Available Plugins
Find and select Custom Thread Groups
Select Apply Changes and Restart JMeter

Once the “Custom Thread Groups” plugin is installed, a recommended alternative to Thread Group is the bzm – Concurrency Thread Group, as it allows for a straightforward step configuration that visually renders the defined pressure, which is a great validation feature from a testing point of view.

Using Transactions to Group Requests

Adding Transactions to your test can be a helpful way of grouping one or more similar requests that belong to the same “operation”; for example, the loading of an application, a navigational map zoom or a form search. Using a transaction like this can greatly aid analysis as it isolates the operation (e.g. form search) into its own logical “container” with only the requests responsible for its function. Throwing many requests for many different functions or services into one large group, on the other hand, can make the post-test study very difficult. With transactions, analysis can be conducted to understand the performance of each operation, respectively.

Note: Map applications and workflows vary; some operations will have transactions composed of many requests, others just one request.

Note: The performance of different operations in a test does not always scale in the same way; some may do better than others. Being able to identify poor performing operations (by a unique transaction name) is good analysis.

Validate the Response Is Being Returned

Ensuring the expected response is coming back from the remote server is just as important as sending the request. For various reasons, when testing an ArcGIS resource like a service, relying on just the HTTP status code (e.g. HTTP 200) is insufficient. One way to validate the response is to look for key words in the Header or Body. 
If a PNG image is requested but a textual error message is returned instead, a Response Assertion rule looking for “image/png” in the Header will mark the attempt as failed. If it is present in the Header, it is considered a passed request.

Note: For every HTTP Request that needs to be examined in the JMeter Test Plan, a Response Assertion rule must be added.

Creating a Data Driven, Export Map Test in JMeter

Using the strategies above, a flexible and versatile JMeter Test Plan can be created that calls for a different area of interest from an ArcGIS map service with every request. A “dynamic test” is ideal as it can make the server resources work harder than if the same data were requested every time. Such a test is more representative of real-world conditions. As performance testers, we want to take something like the export map request signature below and turn it into a dynamic JMeter load test:

https://yourwebadaptor.domain.com/server/rest/services/SampleWorldCities/MapServer/export?dpi=96&transparent=true&format=png32&layers=show%3A0%2C1%2C2&bbox=-108.76228873935%2C31.0409016308382%2C-88.8526618315487%2C46.9686031570791&bboxSR=4326&imageSR=4326&size=1477%2C827&f=image

JMeter offers several methodologies for constructing HTTP payloads in a Test Plan, such as:

Creating the requests by hand (e.g. manually)
Using the built-in recorder to directly capture browser requests
Importing from another source

This Article will focus on the first method, showing the manual practice for creating a JMeter test, how to assemble the components of the request and how to make its behavior “dynamic”. This data driven test will call the SampleWorldCities map service (running on your local deployment) through the export function.

Note: By many measures, SampleWorldCities is considered a small and very light-weight map service, but given that it is ubiquitous with ArcGIS Server deployments, it is ideal for walking through the creation of a load test. 
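To make the “dynamic” idea concrete, here is a sketch (in Python, outside of JMeter) of the export query string where only the bbox changes between iterations. The fixed values come from the request signature above; the function name `build_export_query` is illustrative:

```python
from urllib.parse import urlencode

def build_export_query(bbox: str, sr: int = 4326, width: int = 1280, height: int = 1024) -> str:
    """Assemble the export map query string with a per-iteration bounding box."""
    params = {
        "dpi": 96,
        "transparent": "true",
        "format": "png32",
        "layers": "show:0,1,2",
        "bbox": bbox,       # the only value that varies per request
        "bboxSR": sr,
        "imageSR": sr,
        "size": f"{width},{height}",
        "f": "image",
    }
    return urlencode(params)

query = build_export_query("-108.76228873935,31.0409016308382,-88.8526618315487,46.9686031570791")
```

In the JMeter Test Plan itself, the same effect is achieved declaratively with ${...} variables fed by a CSV Data Set Config rather than with code.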
Create and Save a New Test Plan

From the file system, create the folder structure mentioned above:

The name of the project folder will be sampleworldcities1; assume this will exist in: C:\JMeter Tests\
Create the supporting project directories, for example: datasets, logs, reports, results, uploads
Start JMeter
Save the new Test Plan that JMeter creates into the sampleworldcities1 folder that was just created; call the Test Plan: sampleworldcities1.jmx
Assume the full path of the jmx Test Plan is: C:\JMeter Tests\sampleworldcities1\sampleworldcities1.jmx

From the file system, the project folder would look like the following:

Adding Variables to the Test Plan

From within JMeter, click on the Test Plan (e.g. sampleworldcities1). At the bottom of the User Defined Variables section, click Add:

For Name enter: ProjectFolder; for Value enter: C:\JMeter Tests\sampleworldcities1
For Name enter: WebServerName; for Value enter: yourwebadaptor.domain.com (the hostname of your Web Adaptor or ArcGIS Server)
For Name enter: ServerInstanceName; for Value enter: server (some deployments use a different value, e.g. arcgis)
For Name enter: SecurePort; for Value enter: 443 (if pointing to an ArcGIS Server instance, use 6443)
For Name enter: ServiceName; for Value enter: SampleWorldCities

Adding the bzm – Concurrency Thread Group to the Test Plan

Right click on the sampleworldcities1 Test Plan and under Add-->Threads (Users), select bzm – Concurrency Thread Group. Under the sampleworldcities1 Test Plan, click on bzm – Concurrency Thread Group and configure the step load logic with the following:

Target Concurrency: 10
Ramp Up Time (min): 20
Ramp-Up Steps Counter: 10

With the bzm – Concurrency Thread Group plugin, the configured step load logic can be easily seen and visually confirmed in the chart graphic above. 
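Assuming the ramp-up is linear (as the chart for the bzm – Concurrency Thread Group renders it), the three configured values translate into a simple step schedule. This is a simplified model, not the plugin's actual implementation:

```python
def step_schedule(target_concurrency: int, ramp_up_minutes: int, steps: int):
    """Return (minute, active_threads) pairs for a linear step load."""
    threads_per_step = target_concurrency // steps
    step_length = ramp_up_minutes / steps
    return [(round(i * step_length), i * threads_per_step) for i in range(1, steps + 1)]

schedule = step_schedule(10, 20, 10)
# With the values above: one new thread every 2 minutes, reaching 10 threads
```

Computing the schedule up front is a quick way to double-check that the configured pressure matches the intended test window before starting a run.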
This test will be set to run for 20 minutes, will increase pressure using 10 steps (adding 1 concurrent test thread every 2 minutes), will hit a maximum of 10 concurrent test threads, and will then stop.

Adding a Transaction to the Test Plan

Right-click on bzm – Concurrency Thread Group and under Add-->Logic Controller, select Transaction Controller. Once added, give the Transaction Controller (e.g. the workflow operation) a meaningful name to distinguish it from others that may exist in the test. For the name enter: MapScale_9M. The 9M represents the map scale of 9 million (e.g. 1:9,244,649). One or more HTTP Requests can now be added to this Transaction Controller.

Select “Generate parent sample” from the Transaction Controller section. This option can assist with validation as it displays the respective requests under the Transaction object.

Adding an HTTP Request

Now we want to create a data driven HTTP Request that will ask for a different spatial extent with each request. This is done with a regular JMeter HTTP Request. Right click on the Transaction Controller called MapScale_9M, and under Add-->Sampler, select HTTP Request. The new, empty HTTP Request should resemble the following:

Making the HTTP Request Dynamic

A request signature for export map such as:

https://yourwebadaptor.domain.com/server/rest/services/SampleWorldCities/MapServer/export?dpi=96&transparent=true&format=png32&layers=show%3A0%2C1%2C2&bbox=-108.76228873935%2C31.0409016308382%2C-88.8526618315487%2C46.9686031570791&bboxSR=4326&imageSR=4326&size=1477%2C827&f=image

contains several URL components that need to go to different places on the HTTP Request page (to maximize flexibility and maintenance). Let’s start with the key/value pairs of the request by separating the URL parameters. Below, the parameters section has been modified to contain several JMeter variables instead of the original values. 
It is okay that some parts will reference JMeter variable names that have not yet been created in the test. Copy the following and paste it into the HTTP Request by clicking Add from Clipboard at the bottom:
dpi=96&transparent=true&format=png32&layers=show%3A0%2C1%2C2&bbox=${bbox_9244649}&bboxSR=${sr_9244649}&imageSR=${sr_9244649}&size=${width_9244649}%2C${height_9244649}&f=image
This will put in place several of the pieces necessary to make certain parts of the HTTP Request dynamic.

Under the Web Server section:
- For Protocol enter: https
- For Server Name or IP enter: ${WebServerName}
- For Port Number enter: ${SecurePort}

Under the HTTP Request section:
- For Path enter: /${ServerInstanceName}/rest/services/${ServiceName}/MapServer/export

Adding a Response Assertion

Right-click on the HTTP Request and, under Add-->Assertions, select Response Assertion. Change several of the Response Assertion parameters:
- Under Field to Test, select: Response Headers
- Under Pattern Matching Rules, select: Contains
- Under Patterns to Test, click Add and enter: image/png

Adding a CSV Data Set Config

Right-click on the Test Plan (e.g. sampleworldcities1) and, under Add-->Config Element, select CSV Data Set Config. Select the CSV Data Set Config and rename it to: CSV Data Set Config -- 9M. Providing a unique name can assist with maintenance/management if multiple data files are used within a test (e.g.
one file for the bounding boxes of each map scale).

Using a text editor, populate an empty CSV file with the following 16 lines of data:

bbox,width,height,mapUnits,sr,scale
"-108.76228873935,31.0409016308382,-88.8526618315487,46.9686031570791",1280,1024,esriDecimalDegrees,4326,9244649
"23.1282001284589,34.337748761293,43.03782703626,50.2654502875339",1280,1024,esriDecimalDegrees,4326,9244649
"-117.052786808947,35.3545948648786,-97.1431599011459,51.2822963911194",1280,1024,esriDecimalDegrees,4326,9244649
"-100.83549484975,36.0637839078502,-80.9258679419492,51.991485434091",1280,1024,esriDecimalDegrees,4326,9244649
"17.7030587822399,34.3302705716379,37.6126856900409,50.2579720978788",1280,1024,esriDecimalDegrees,4326,9244649
"32.2827904735659,37.2792197897781,52.192417381367,53.2069213160189",1280,1024,esriDecimalDegrees,4326,9244649
"0.222469446585188,35.0562794764081,20.1320963543862,50.9839810026489",1280,1024,esriDecimalDegrees,4326,9244649
"100.262489726575,20.8660021309943,120.172116634376,36.7937036572352",1280,1024,esriDecimalDegrees,4326,9244649
"72.4186084471192,22.1748804724666,92.3282353549203,38.1025819987075",1280,1024,esriDecimalDegrees,4326,9244649
"-61.2094273153097,-26.7402777397378,-41.2998004075086,-10.812576213497",1280,1024,esriDecimalDegrees,4326,9244649
"-72.5908131920005,-26.6270515481087,-52.6811862841995,-10.6993500218679",1280,1024,esriDecimalDegrees,4326,9244649
"99.7474167083529,20.7088254145549,119.657043616154,36.6365269407958",1280,1024,esriDecimalDegrees,4326,9244649
"102.59837658124,20.384726879328,122.508003489041,36.3124284055689",1280,1024,esriDecimalDegrees,4326,9244649
"80.8239612278688,23.4521454643548,100.73358813567,39.3798469905956",1280,1024,esriDecimalDegrees,4326,9244649
"-105.812716128284,31.0536390717042,-85.9030892204829,46.9813405979451",1280,1024,esriDecimalDegrees,4326,9244649

Save the CSV file to: C:\JMeter Tests\sampleworldcities1\datasets\bbox_9244649.csv

From the CSV Data Set Config window, click Browse and navigate
to the bbox_9244649.csv file that was just saved, then configure several items of the data source:
- Change Filename from: C:/JMeter Tests/sampleworldcities1/datasets/bbox_9244649.csv
  to: ${ProjectFolder}/datasets/bbox_9244649.csv
- For Variable Names add: bbox_9244649,width_9244649,height_9244649,sr_9244649
- Change Ignore first line to: True
- Change Allow quoted data? to: True

Validating the Test Plan

Use the View Results Tree listener to help validate, in the GUI, the responses from the remote server:
- Right-click on the Test Plan (sampleworldcities1) and, under Add-->Listener, select View Results Tree
- Select View Results Tree, then click on the green triangle at the top of the Apache JMeter GUI

Following the configured step load, JMeter will start by repeatedly sending requests, initially one at a time. A green shield with a checkmark is a sign of a successful request; if this is seen, wait several seconds for a handful of requests to be sent and responses to be returned.

Clicking on one of the MapScale_9M green shields shows the Sampler result tab, which displays information such as:
- Load time (response time)
- Size in bytes (Header and Body)
- Error Count
- Response code
- ContentType

Note: This information is great for validating the test playback, but it is also useful for performance troubleshooting.

Since this test exercises the export map function, a visualization of the requested image also assists with Test Plan validation. Clicking the Response data tab and expanding the MapScale_9M transaction should render the requested image. Repeat as needed for other MapScale_9M transactions. Toggling through several of the requests should show different images, which validates the creation of a data-driven export map load test in JMeter.

Saving Recent Changes

Many changes have been made since the last save.
Save the Test Plan: File-->Save

To download the Apache JMeter Test Plan used in this Article, see: sampleworldcities1.zip
To download a version of this Apache JMeter Test Plan with more detail covering additional map scales, see: sampleworldcities2.zip

Preparing to Run the Load Test

The Apache JMeter team recommends not running load tests from the GUI, but instead invoking the test from the command line for optimal performance and resource utilization. Since this process can involve several switches, parameter adjustments, and checks that are worth discussing in detail, it is a procedure that deserves its own Article: Running an Apache JMeter Load Test from Command-line mode (Beginner/Intermediate).

Note: Please coordinate with your GIS team if your Apache JMeter test will be sending requests to a server that might impact other users. A load test should be scheduled to run during non-peak business hours.

For a set of additional testing tutorials (of increasing complexity), see: Performance Engineering: Load Testing ArcGIS Enterprise

Apache JMeter is released under the Apache License 2.0. Apache, Apache JMeter, JMeter, the Apache feather, and the Apache JMeter logo are trademarks of the Apache Software Foundation.
Posted 06-02-2021 11:22 PM