We've got a lovely on-prem (soon cloud...) ArcGIS Enterprise platform, with a boatload of layers from reliable "curated" sources, such as our own asset data warehouse and several other well-managed databases. One feature we really like and hate at the same time is ArcGIS Data Store. We like it for the functionality it provides (hosted content, self-service "BI", and analytics), but hate it for the black box that it is. Even with its own backup tools, it's notoriously unreliable, and it has no per-item restore mechanism. For that reason, and several others, we're looking to limit it.
Ideally I'd turn it off outright, but that would kill a lot of functionality for the GIS platform. A shame. The reason I'd turn it off, though, is our rogue user group (I know, embrace... but we're an 8-person team with 1,700 out-of-control users...). Some of them have/had a lot of rights, and basically felt they could work 'quicker' than IT in building apps, layers, and functionality. Despite all our warnings, we eventually did have a major issue with the Data Store, forcing us to roll back to a much earlier point... and invoking the wrath of users who had built layers in said Data Store that went way, way too far. While I'm sympathetic to their inconvenience, we really need to limit the reach this kind of content has.
While I could write a bunch of Python tools that run periodically (I already have some that check metadata, extents, tags, categories, etc.), ideally I'd be able to limit the "reach" of those hosted layers. Something like: max 50 users reached, or X requests/hour. Above that? Contact the dev team, who'll help you move it to an enterprise store and bring it to a proper production state. I've found, however, that outside of building it from scratch or making some form of compromise, this is hard to implement.
A rate limit, such as on the resource proxy, would be ideal!
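If one did put a custom proxy in front of the hosted services, the limiting itself is the easy part. A minimal token-bucket sketch, per (user, layer) pair; the capacity and refill rate are illustrative assumptions, and nothing here is an Esri setting:

```python
import time

class TokenBucket:
    """Per-client rate limiter, as one might bolt onto a custom resource
    proxy in front of hosted services. Allows bursts up to `capacity`,
    then throttles to `refill_per_sec` requests per second."""

    def __init__(self, capacity=100, refill_per_sec=100 / 3600.0):
        self.capacity = capacity       # max burst size (requests)
        self.refill = refill_per_sec   # steady-state requests/second
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        # Top the bucket up in proportion to elapsed time, then spend
        # one token if available.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # over the limit: reject and point them at the dev team

# One bucket per (user, layer) pair:
buckets = {}

def check(user, layer):
    return buckets.setdefault((user, layer), TokenBucket()).allow()
```

An off-the-shelf reverse proxy (nginx `limit_req`, for example) could do the same without custom code, though tying limits to named portal users rather than IPs would take extra work.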