We are newly federated here, running ArcGIS Enterprise 10.7.1. Yesterday, much of the morning was spent on a support call with Esri to get our ArcGIS Server map services back online. In ArcGIS Server Manager, all services were stuck in "Stopping" mode, and in ArcCatalog none of the GIS Server connections could connect. The solution came from viewing and confirming the Web Adaptor settings. The suspected cause of the outage was the automated Windows updates that had been applied hours before, with no GIS resources stopped first.
In hopes of preventing this on the next round of Windows updates: in another thread I found earlier, there was mention of the best practice ahead of Windows updates being this:
Is there anything more to it than that when applying Windows updates?
Portal is on a different server than ArcGIS Server, so what items ought to be stopped on each of those?
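For what it's worth, here is a rough sketch of the stop/start order I've pieced together so far. The Windows service names are the installer defaults, and the ordering (Portal first on the way down, reversed on the way back up) is my own assumption, so please correct me if it's wrong. It only prints the commands for review rather than running them:

```python
# Dry-run checklist for a federated deployment split across two machines.
# Service names are the default Windows service names; the hostnames are
# placeholders for our two machines. If an ArcGIS Data Store were deployed,
# it would presumably join this list as well.

# (machine, Windows service name), in the assumed stop order
STOP_ORDER = [
    ("portal-host", "Portal for ArcGIS"),
    ("server-host", "ArcGIS Server"),
]

def pre_update_commands(order=STOP_ORDER):
    """Commands to run (elevated) before applying Windows updates."""
    return [f'net stop "{svc}"  (run on {host})' for host, svc in order]

def post_update_commands(order=STOP_ORDER):
    """Commands to run after the updates/reboot, in reverse order."""
    return [f'net start "{svc}"  (run on {host})' for host, svc in reversed(order)]

for cmd in pre_update_commands() + post_update_commands():
    print(cmd)
```

If nothing else, having the order written down means the next maintenance window doesn't depend on someone remembering it at 6 a.m.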
That's unusual. The web tier (IIS) isn't really dependent on the Enterprise GIS; the Web Adaptor is basically a proxy and only talks to ArcGIS Enterprise over HTTP(S).
How was this issue addressed? Did you reinstall, or did you reregister the Web Adaptor with the GIS Server? It would be helpful to better understand the issue before offering advice. A Windows update is more likely to impact IIS (specifically ASP.NET) than ArcGIS Enterprise.
Thank you for your reply, Randall. To your question: no, we didn't reinstall, but we did reregister the Web Adaptor with the GIS Server, after having created a new test Web Adaptor and finding that one worked. The very next day the same problem happened again, though I was not on the support call that time. The only thing I heard come out of it was the question of why we have "1.5gb map services," and further that "These services are too large by ESRI standards and should be reduced."
I am to follow up with support on that part, but I don't know if that relates to this post at all.
Sounds like unfortunate phrasing... full disclosure, I was in support for 11 years, and a technical lead for Enterprise teams for 6.
Service data is NEVER too large. Taken straight up, that statement is incorrect. However, there may be optimization to be done. For instance, there may be scale dependencies to be set or feature generalization to be applied that will help.
For instance, it's pretty frequent for users to attempt to draw every point and vertex at a small scale. It doesn't make sense to show every vertex in a complicated geometry at a state-level scale, nor would you want to attempt to visualize every house as a single point. Unless the map is cached, that'll cause real performance issues as all that geometry is sent from the server to the client. That's especially true with editable layers. I'd use scale-dependent rendering, symbolize with heat maps at small scales, and generalize those vertices. I also tend to separate my maps into thematic layers instead of one big uber service, and mash them together client side. I find that more scalable for maintenance, and it helps rendering speed (one godzilla feature won't slow your whole app down). At larger scales you can get more granular. A broad statement like "you have too much data" is patently incorrect. I have highly performant customers with terabytes of data whom I speak to regularly.
Thank you for your support and advice, Randall. The wording I quoted was a surprise to me as well.
I do already use scale dependencies and caching, and have generalized vertices on the most vertex-dense feature classes viewed at small scales. Furthermore, my map services are already thematic as you suggest, so I have dozens of them mashed together client side.