I'm hoping to write a Python script that can tell me if services are running slow on my server site. Would the easiest thing be to run HTTP requests against Export Map and Identify on each service and compare return time against previous times, or is there a better way? In the Admin page of the site, perhaps? I'm not familiar enough with the Server API to know offhand. I know I can run queries directly against my SQL databases to keep an eye on their indexes, query times, load, etc., all of which affect performance of the server site. However, any query run directly against the database is going to be faster than anything run against the server site, so that's not a great marker for what's happening in my deployed webmaps/apps.
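For the timing side, something like this sketch is what I had in mind (the URL is just a placeholder, not a real service endpoint; I'm using `urllib.request`, the Python 3 successor to `urllib2`):

```python
import time
import urllib.request

def time_request(url, timeout=10):
    """GET the url and return elapsed seconds, or None if the request fails.

    monotonic() avoids glitches from system clock adjustments mid-request.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # include download time, not just time-to-first-byte
    except Exception:
        return None  # treat any HTTP/network error as a failed sample
    return time.monotonic() - start

# Hypothetical usage -- substitute a real Export Map request URL:
# elapsed = time_request("http://myserver/arcgis/rest/services/MyMap/MapServer/export?...")
```

Comparing each sample against a rolling baseline of previous times would then flag a slowdown.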
I am going to make the script show a graph of the last couple hours of data that updates periodically, so I can visually see if there is a performance dip. Ultimately I'd like to install the script on a Raspberry Pi with a screen attached and hang it on my cube wall as a constant server site monitor. R-Pis come with a Python deployment, so as long as I stick to mostly standard libraries like urllib2 or pure-Python libraries like pypyodbc I shouldn't have any problems (the R-Pi uses an ARM processor, so any C code behind a library has to be compiled for that architecture).
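For the "last couple hours" window, a `collections.deque` with `maxlen` keeps the buffer bounded with no extra bookkeeping, and it's pure stdlib so it's R-Pi-safe. A minimal sketch (the poll interval and window size are just assumptions):

```python
import time
from collections import deque

# Assuming one poll per minute, ~2 hours is 120 samples.
WINDOW = 120
samples = deque(maxlen=WINDOW)  # oldest samples drop off automatically

def record(elapsed):
    """Append a (timestamp, elapsed-seconds) sample; None marks a failed poll."""
    samples.append((time.time(), elapsed))
```

The graphing loop would then just redraw from `samples` every cycle, and failed polls (None) can be drawn as gaps.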
Edit to add: I'll share anything I write, obviously. I'm sure this sort of metric is something others would enjoy seeing as well.