Browser performance issues using query widget

10-10-2017 12:49 PM
CarlTownsend2
New Contributor III

Has anyone noticed the differences in browser performance using AGOL, specifically with Firefox, Chrome, Internet Explorer, and Edge?

I set up a query to grab ~10,500 records using the query widget. (I changed maxRecordCount in the service definition from 2,000 to 20,000, because otherwise not all the records are displayed, and it's not obvious to the user that you have to keep scrolling down the list to load more records; if you hit export to .csv, it only grabs the first 2,000 records.)
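
A workaround sketch: rather than raising maxRecordCount, you can page the query through the layer's REST endpoint and build the .csv yourself. Below is a minimal Python sketch, assuming a hypothetical layer URL and that the service supports pagination via resultOffset/resultRecordCount (hosted layers and ArcGIS Server 10.3+):

```python
import csv
import requests

# Hypothetical layer URL -- substitute your own feature service layer.
LAYER_URL = "https://services.example.com/arcgis/rest/services/Assets/FeatureServer/0"
PAGE_SIZE = 2000  # stay at or below the service's maxRecordCount

def fetch_all_records(where="1=1"):
    """Page through the query endpoint until every matching record is returned."""
    records, offset = [], 0
    while True:
        resp = requests.get(f"{LAYER_URL}/query", params={
            "where": where,
            "outFields": "*",
            "returnGeometry": "false",
            "resultOffset": offset,
            "resultRecordCount": PAGE_SIZE,
            "f": "json",
        })
        data = resp.json()
        features = data.get("features", [])
        records.extend(f["attributes"] for f in features)
        # The server sets exceededTransferLimit while more pages remain.
        if not features or not data.get("exceededTransferLimit"):
            return records
        offset += len(features)

rows = fetch_all_records()
if rows:
    with open("export.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```

Since each request stays within the 2,000-record limit, no single response is big enough to choke the browser, and nothing depends on scrolling a list to completion.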

Anyway, it took Chrome (v61.0.3163.100, 64-bit) 42 seconds and Firefox (v56, 32-bit) 54 seconds to grab the ~10,500 records and highlight them. Both browsers popped up a message saying a script was taking too long and gave me the choice of killing it or waiting.

IE (v11.1593) and Edge (v38.14393.1066.0) both completely choked and died with a message saying 'ArcGIS.com is not responding'. There was no chance to click wait; they both just stopped responding and froze. I gave up after 5 minutes of watching a blank white screen. During one test Edge crashed completely and disappeared, and I had to restart it.

Has anyone else noticed poor performance with IE and Edge on AGOL queries, and if so, what did you do to resolve it?

5 Replies
KellyGerrow
Esri Frequent Contributor

Hi Carl,

Are you using feature layers hosted in ArcGIS Online or ArcGIS Server?

If you are using ArcGIS Online, increasing the max record count isn't recommended, as it can result in a load that is too large for some browsers to handle without crashing. Check out these two blogs for more information about how features are requested (queried) and strategies for displaying large datasets:

https://community.esri.com/community/gis/web-gis/arcgisonline/blog/2017/10/17/so-you-want-to-display...

Strategies to Effectively Display Large Amounts of Data in Web Apps | ArcGIS Blog 

If you want to create a layer with a lot of feature data that performs in all browsers, consider using a tile service.

-Kelly

JohnSmith45
New Contributor II

Having similar issues: 15 seconds in Chrome to query and render 4,893 records, which I think is reasonable for both the number of records and the time taken to respond, but about 90 seconds in IE11. It's the same map service running from the same ArcGIS Enterprise server, so I think we can rule out the server end; it must be how each browser runs the JavaScript code. A shame really, given our corporate standard browser is IE.

CarlTownsend2
New Contributor III

Fortunately, we are rolling out Chrome as our default browser, and this seems to be the best solution. I created a tiled service (which comes with its own set of issues) and performance was marginally better: Chrome's best time to query the records dropped from 42 seconds to 28 seconds, and Firefox's from 54 seconds to 40 seconds. The best IE managed was 5 minutes 3 seconds, while Edge took 2 minutes 35 seconds. So not all browsers are created equal! Note these were the best times; running the identical query several times, I noticed in some instances there wasn't any improvement at all. Also note we aren't necessarily interested in viewing the data, just running a simple query and saving to .csv.

Note that I've run the exact same query on the exact same data, but against a .gdb stored on a local network drive using ArcMap, and it took 1.5 seconds. I can completely understand and accept some latency in querying data across the web, but given the immense power of cloud computing that we have in 2017, I'm wondering why it takes so long to run a simple, basic query. Esri suggests not increasing maxRecordCount beyond 2,000, so they are well aware of the latency/performance issue. I hope they will direct resources to making it a lot faster; if not, then at least make it more obvious to the end user that only 2,000 records have been selected.
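
For reference, the local baseline above can be reproduced roughly as follows; a minimal arcpy sketch, with hypothetical paths and a hypothetical Yes/No field name:

```python
import time
import arcpy

# Hypothetical geodatabase path and field name -- adjust to your own data.
table = r"\\fileserver\gisdata\assets.gdb\Assets"
where = "FLAGGED = 'Yes'"  # single Yes/No field, as in the query above

start = time.perf_counter()
count = 0
# Stream matching rows straight off the network drive and count them.
with arcpy.da.SearchCursor(table, ["OBJECTID"], where_clause=where) as cursor:
    for _ in cursor:
        count += 1

print(f"{count} records matched in {time.perf_counter() - start:.2f} s")
```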

KellyGerrow
Esri Frequent Contributor

Hi Carl,

The layers should be performing quickly. If the layers are slow due to the amount of resources on a locally hosted service, you'll need to look into the resources allocated to the server. If the data is hosted in ArcGIS Online and is significantly slow, I'd suggest looking into the specific layer that you are querying and what is causing the issue. 

Is this a polygon or line geometry? 

How many attributes are you displaying in the pop up?

I'd suggest getting in contact with technical support or sharing your web map in this post so we can look into what is causing the latency. My usual standard for layer performance is that data should draw within a couple of seconds in a map; otherwise users may not wait for the layers to draw. Ensuring that the data performs at an acceptable speed is critical in web mapping.

As you mentioned, different browsers can have performance differences, so when displaying data on the web, it's easier to control the complexity of the data than the performance of the browsers your users access it from. For this reason, we recommend using tile layers when applicable, generalizing complex geometry, and planning which attributes to display in order to improve performance for complex data.

Best Practices for using Tile Layers as Operational Layers | ArcGIS Blog 

Limiting the max record count to 2,000 records does not prevent more than 2,000 records in total from being requested from the service; it just keeps each individual request small for the sake of server performance. Check out this blog for more details on how ArcGIS Online makes requests to the data.
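
To see what page size a given layer actually enforces, and whether it supports paged queries at all, you can inspect the layer's JSON metadata. A minimal sketch, assuming a hypothetical layer URL:

```python
import requests

# Hypothetical layer URL -- substitute your own feature service layer.
LAYER_URL = "https://services.example.com/arcgis/rest/services/Assets/FeatureServer/0"

# The layer's metadata reports the per-request cap and pagination support.
info = requests.get(LAYER_URL, params={"f": "json"}).json()
print("maxRecordCount:", info.get("maxRecordCount"))  # e.g. 2000
print("supportsPagination:",
      info.get("advancedQueryCapabilities", {}).get("supportsPagination"))
```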

If your data is fine to share publicly, try sharing your web map and we can also try and determine the issues with your service.

-Kelly

CarlTownsend2
New Contributor III

"Limiting the max record count to 2000 records does no prevent more than a total of 2000 records from being requested from the service, but does keep requests more concise for server performance. Check out this blog for more details on how ArcGIS Online makes requests to the data."

Yes, that is correct, but... if you run your query, click on those dot thingies, and export to .csv, it will only export 2,000 records. To get all the records, you have to scroll down, down, down, right to the end of all the records that appear in the search pane, and then export. This isn't obvious, and that is my point. I thought there was a problem with the map, and kept refreshing it, adding layers, removing layers, and rebuilding it to see if it would bring up all the records for export.

I spent a lot of time troubleshooting why it would only bring up 2,000 records for export, even after I had increased maxRecordCount to 20,000, while it was saying in small print '10,756 records found'. (Hmm, okay, so it has 'found' 10,756 records, so it knows they're there; maxRecordCount = 20,000; the correct layer is being queried; so why is it exporting only 2,000 to .csv?) It was purely by accident that I happened to scroll deep down through the records and more of them started magically appearing when I exported. (Again, I thought this was a problem with the map, or with the way the browsers were rendering things.)

If you don't scroll all the way down, you'll only get a few thousand extra, so you have to be careful and check the .csv against the number of records found to make sure you've got them all. If the user sees that it has found 10,756 records, then it should grab all of them for export; there should be no scrolling required. The exact same problem applies to saving features to My Content. Moreover, this isn't a complicated query; it just runs a single query against a single Yes/No field in a table containing about 15 fields.
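
Until that changes, a sanity check along these lines can catch a short export; a minimal sketch, assuming a hypothetical layer URL and field name, which compares the count the service reports against the rows that actually landed in the .csv:

```python
import csv
import requests

# Hypothetical layer URL and field name -- substitute your own.
LAYER_URL = "https://services.example.com/arcgis/rest/services/Assets/FeatureServer/0"

# Ask the service how many records the query matches...
resp = requests.get(f"{LAYER_URL}/query", params={
    "where": "FLAGGED = 'Yes'",
    "returnCountOnly": "true",
    "f": "json",
})
expected = resp.json()["count"]  # e.g. the '10,756 records found' figure

# ...and compare with what actually made it into the exported file.
with open("export.csv", newline="") as fh:
    exported = sum(1 for _ in csv.DictReader(fh))

print(f"service reports {expected} records, CSV contains {exported}")
if exported < expected:
    print("Export is incomplete -- scroll the full list (or page the query) first.")
```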

Again, performance-wise, this is where we have found that Firefox and Chrome stand out way above IE and Edge, because those last two browsers get completely bogged down and stop responding when trying to scroll down through the (tiled service) records.
