ArcGIS Enterprise performance impact due to Meltdown & Spectre

01-05-2018 02:09 AM
by Anonymous User

Hello all,

We're running one of our ArcGIS Enterprise ecosystems in Azure (2 servers, 1 portal, 1 GeoEvent Server; version 10.5.1, all on Windows). With all the recent news about Meltdown and Spectre, which obviously affected us as well, we are noticing some performance decreases (the entire ecosystem isn't patched yet). In synthetic tests we can reproduce the widely reported ~30% performance decrease in PostgreSQL, which is used heavily as part of ArcGIS Data Store. Because the Data Store is secondary for us (our enterprise data is kept in Oracle), we didn't run a (hosted) feature service benchmark prior to the patches, as Azure kind of got updated overnight...
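For anyone who wants a quick before/after latency check of a hosted feature service without setting up a full JMeter plan, here is a minimal sketch using only the Python standard library. The service URL and query parameters are hypothetical placeholders, not from this thread; substitute your own endpoint.

```python
import statistics
import time
import urllib.parse
import urllib.request

def time_requests(url, params, runs=20):
    """Issue the same query repeatedly and return per-request latencies in ms."""
    query = urllib.parse.urlencode(params)
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(f"{url}?{query}") as resp:
            resp.read()  # drain the body so the full response is timed
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies

def summarize(latencies):
    """Median and rough 95th-percentile latency, for comparing runs
    taken before and after the OS/hypervisor patches."""
    ordered = sorted(latencies)
    p95_index = min(len(ordered) - 1, int(round(0.95 * (len(ordered) - 1))))
    return {"median_ms": statistics.median(ordered), "p95_ms": ordered[p95_index]}

# Example usage (hypothetical hosted feature layer -- replace with your own):
# url = "https://myserver.example.com/arcgis/rest/services/Hosted/Parcels/FeatureServer/0/query"
# lat = time_requests(url, {"where": "1=1", "returnCountOnly": "true", "f": "json"})
# print(summarize(lat))
```

Running the same script against the same layer before and after patching gives a crude but repeatable delta, even if it is no substitute for a proper load test.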

At the moment I'm a bit hesitant about patching the rest of our ecosystem, which is managed by our IT partner rather than by Azure (and as such is more within our sphere of influence).

I'm wondering if anyone, Esri or otherwise, has noticed practical impacts on their Enterprise GIS performance, Oracle databases, etc...

At the moment I'm scrambling to redo our baseline benchmarks as much as possible (by means of both JMeter and the Esri system tester), but I'm worried that this could really mess with our capacity planning.
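One thing worth keeping in mind for the capacity planning: a sustained throughput drop compounds, because serving the same load after a relative drop requires 1/(1 - drop) times the original capacity. A small illustration (the 30% figure here is the generic number circulating for PostgreSQL-style workloads, not a measured ArcGIS value):

```python
def extra_capacity_needed(throughput_drop):
    """Fractional extra capacity required to serve the same load
    after a relative throughput drop in [0, 1)."""
    if not 0.0 <= throughput_drop < 1.0:
        raise ValueError("throughput_drop must be in [0, 1)")
    return 1.0 / (1.0 - throughput_drop) - 1.0

# A 30% throughput drop means roughly 43% more capacity, not 30%:
print(f"{extra_capacity_needed(0.30):.0%}")  # -> 43%
```

So a "30% hit" quietly turns into needing almost half again as many cores or instances if the load stays constant, which is exactly why the old baselines matter.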

And in regards to that: any update to the capacity planning tool?

2 Replies
by Anonymous User

For info purposes: Esri posted a blog post/advisory today:

Meltdown and Spectre Processor Vulnerabilities | ArcGIS Blog 

Frequent Contributor

These are all good questions Guus, especially since we are getting ready to stand up Enterprise at 10.6 in a similar configuration to yours. The exception for us is that we will be using an internal virtualized environment (a hypervisor, I think) as opposed to AWS or Azure.

As a simple test, I can report that within our current AGOL standard storage environment, I just exported a hosted 1.8 GB service definition file containing polylines and photo attachments to a file geodatabase. The export took 25 minutes to perform the read/write from/to Esri's data store, I think. To me, that is about 10 minutes longer than it should be.
