Hi,
I am looking for clarity on HTTP response caching for Hosted Feature Services.
From what I can see, the object store can be configured either through ArcGIS Data Store on its own VM/instance or as a cloud object store.
Is response caching of hosted feature layer queries only available when the object store is configured through the ArcGIS Data Store VM/instance, or is this functionality also available with a cloud object store (Amazon S3 or an Azure blob container)?
Also, does it simply store a given query response and later delete it? I am wondering how this operation works.
Thanks,
rokharris
Hi @rokharris - Thank you for your question!
Response caching is not limited to ArcGIS Enterprise deployments that are using the ArcGIS Data Store object store. That said, there was a defect recently logged specific to ArcGIS Enterprise 11.5 that prevents query response caching from working as expected when the object store is configured using Amazon S3 / Azure Blob Storage. Here is the defect information (along with a workaround noted): BUG-000177268 - Enabling response caching on a hosted feature layer fails with an error, "Object Sto....
For more information on response caching, please refer to this ArcGIS Blog: Use response caching in ArcGIS Enterprise. If you have further questions or feedback, please don't hesitate to follow up. Thanks!
Hi @Sarah_Hanson,
Can you help us on how to implement the workaround please?
Thank you very much!
Hi @PatriceLabbé - As the defect details note, the workaround involves updating the JSON for the registered data store through the REST Admin API so that the provider property (Amazon / Azure) appears twice: once at the root level, as shown in the current REST API example for adding an object store, and a second time (duplicated) within the info property. For example...
{
  "path": "/cloudStores/<value>",
  "type": "objectStore",
  "provider": "amazon",
  "info": {
    "provider": "amazon",
    "systemManaged": false,
    ...
  }
}
Technical support will be the best resource to guide you through implementing the workaround.
Hi @Sarah_Hanson,
Since we're upgrading from 11.3 (no object store) to 11.5 and still working on migrating our dev environment, we simply deleted all hosted scene layers / packages, unregistered the object store, and then ran our Azure DevOps pipeline to re-register it with the workaround in place. Caching now works as expected.
Thanks!
I am glad you got it working @PatriceLabbé.
For others reading this: deleting layers and unregistering the object store is not a required step. Instead, you can update the existing data store item via the REST Admin API to insert the additional info parameter and then save the changes.
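If it helps, a quick way to double-check the saved item is to fetch it back from the Admin API and confirm both provider entries are present. Again, the URL, item path, and token below are placeholders.

    # Rough verification sketch: confirm the edited data item now carries the
    # duplicated provider property. All values below are placeholders.
    import requests

    ADMIN_URL = "https://gisserver.domain.com:6443/arcgis/admin"   # hosting server Admin API (placeholder)
    ITEM_PATH = "/cloudStores/<value>"                             # registered object store item path
    TOKEN = "<admin token>"                                        # e.g. generated via generateToken

    item = requests.post(
        f"{ADMIN_URL}/data/items{ITEM_PATH}",
        data={"f": "json", "token": TOKEN},
    ).json()

    print(item.get("provider"), item.get("info", {}).get("provider"))
    # Both should report the same provider (for example "amazon amazon")
    # before response caching is enabled on the hosted feature layer again.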