I'm working on caching some services for consumption entirely within ArcMap, several of which are going to be extremely large, and I'm getting the warning about cache size being larger than 500MB. We have the space and staging server to handle this, but the default warning is leading me to believe that I'm not understanding caching best practices.
Since we can't cache down to a certain scale and draw dynamically beyond it (right? This was possible in 10.0, but no longer?), it seems we either cache to the fullest extent or don't cache at all. Can anyone speak to their experience caching or otherwise serving up very large datasets for use in ArcMap? ESRI suggested I provide two services, one cached and the other not, each with the scale visibility I'd like. That'd be fine for use in web mapping, but is pretty nonsensical in ArcMap.
Thanks for the ideas, folks. What we went with was:
Create tile package up to the zoom level we wanted cached,
Add it as a layer,
Add the source data for the tpk as a layer,
Set scale visibility to be mutually exclusive,
Publish.
This gives us the functionality we wanted: a map service that's cached to a set zoom level and draws dynamically thereafter.
I'd love to know if there's a good reason not to go this route. Using ESRI's System Test tool on this strategy showed great results.
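For anyone scripting this setup, the "mutually exclusive" step just means the tpk layer's closest-in scale and the dynamic layer's farthest-out scale meet at one cutoff with no overlap. A minimal sketch of that bookkeeping, assuming a 1:5,000 cutoff as an example value (not from the thread), and using ArcMap's convention that 0 means "no scale limit":

```python
def split_scale_ranges(cutoff_scale, outermost_scale=0):
    """Given a cutoff scale denominator (e.g. 5000 for 1:5,000),
    return mutually exclusive (min_scale, max_scale) ranges for the
    cached tpk layer and the dynamic source layer.

    min_scale is the farthest-out scale at which a layer still draws
    (larger denominator); max_scale is the closest-in; 0 = no limit.
    """
    # tpk layer: draws from the farthest extent in to the cutoff
    tpk_range = (outermost_scale, cutoff_scale)
    # dynamic layer: takes over at the cutoff, in to full zoom
    dynamic_range = (cutoff_scale, 0)
    return tpk_range, dynamic_range

tpk, dynamic = split_scale_ranges(5000)
print(tpk, dynamic)  # (0, 5000) (5000, 0)
```

Setting both layers' visibility from one cutoff value keeps the handoff seamless if you later change the cached depth.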
Hi Cole,
You may want to look at creating a cache for a defined area and then using the option "Create tiles on demand" for other areas. See the following help document:
Strategies for creating map cache tiles—Documentation | ArcGIS for Server
Thanks, Jake - Should've mentioned that I'm using a feature class for the area of interest to cache only those areas we think will be in high demand and that we're trying the 'Create Tiles on Demand'. This option is too slow to be usable.
As I understand it, creating tiles on demand writes those new tiles to the cache. Is there any way to generate the tiles but not write them? Or am I still misunderstanding? If that option is off and our max extent to cache is greater than the currently cached extent, what happens? It appears that it just zooms in on the last cached tile, but I don't know for sure.
And if there's a warning at 500MB, is it ESRI's contention that a cache should probably not be that big? Or is it just a general warning that the cache is going to take time and resources to manage, so be careful?
If you have your cache built to a max scale of 5,000, and you zoom in further, you will just get a zoomed in version of the data at scale 5,000. It does not render the data at the new scale, unless you have the option 'Create Tiles On Demand' checked. In that case, it will create and write the new tile.
500 MB is pretty small when dealing with caches. I've worked with customers whose caches are well over 100 GB. I would recommend using the 'Calculate Cache Size' option with the 'Best' setting to get the most accurate estimate of what the cache size would be. If performance is important and you have the storage space, it may be best to create the entire cache.
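To sanity-check what 'Calculate Cache Size' reports, the tile math itself is simple: at each level, the tile count is the extent divided by the ground footprint of one tile. A rough back-of-the-envelope sketch, where the tile size, resolutions, and average bytes-per-tile are assumptions you'd swap for your tiling scheme's actual values:

```python
import math

def estimate_cache_size(extent_width_m, extent_height_m, resolutions_m,
                        tile_px=256, avg_tile_bytes=15_000):
    """Rough cache-size estimate: tiles per level times an assumed
    average compressed tile size. resolutions_m is the list of
    meters-per-pixel values from your tiling scheme."""
    total_tiles = 0
    for res in resolutions_m:
        tile_ground = tile_px * res            # ground width of one tile
        cols = math.ceil(extent_width_m / tile_ground)
        rows = math.ceil(extent_height_m / tile_ground)
        total_tiles += cols * rows
    return total_tiles, total_tiles * avg_tile_bytes

# e.g. a 100 km x 100 km extent at three example resolutions
tiles, size_bytes = estimate_cache_size(100_000, 100_000, [38.2, 19.1, 9.55])
print(tiles, size_bytes / 1e9)  # tile count and rough size in GB
```

Note how each additional level roughly quadruples the tile count, which is why the deepest one or two levels dominate a full cache.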
You could also host the cache in ArcGIS Online if storage is an issue. See the following document.
Thanks Jake. This was helpful.
There is another strategy you can consider (since you have a 10.1+ server). It solves the problem of drawing from a cached map service at smaller scales and drawing dynamically at larger scales.
This involves making some changes on both client and server side.
Server: enable the dynamicLayers capability on the cached map service.
Client app: add both a tiled layer and an ArcGISDynamicMapServiceLayer, with the dynamic layer visible only at the larger scales,
both of them pointing to the same cached map service.
That should do the trick…
What will happen is that as you zoom in to larger scales and the ArcGISDynamicMapServiceLayer becomes visible, your web app will send 'export' calls (instead of pulling tiles) to the map service. Since the map service is dynamicLayers enabled, it will draw the image from scratch by reading features from the data and rendering them (as if it were a dynamic map service) instead of merging and cropping pre-cooked tiles.
Hope this helps.
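To make the tile-vs-export distinction concrete, here is what the two request shapes look like against the same service endpoint. The service URL is a made-up placeholder, and the parameter lists are trimmed to the essentials:

```python
# Hypothetical cached map service endpoint (placeholder URL)
SERVICE = "https://example.com/arcgis/rest/services/MyMap/MapServer"

def tile_url(level, row, col):
    """Smaller scales: the tiled layer pulls pre-cooked tiles from the cache."""
    return f"{SERVICE}/tile/{level}/{row}/{col}"

def export_url(xmin, ymin, xmax, ymax, width=400, height=400):
    """Larger scales: the dynamic layer issues 'export' calls, so the
    server renders the image from the source data on each request."""
    return (f"{SERVICE}/export?bbox={xmin},{ymin},{xmax},{ymax}"
            f"&size={width},{height}&f=image")

print(tile_url(12, 1205, 2488))
print(export_url(-118.5, 33.7, -118.1, 34.1))
```

Same service, two very different costs per request: tile pulls are static file reads, export calls hit the data and renderer every time.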
You can build the cache just for some areas and check 'Create Tiles On Demand'.
Then create a layer with a polygon covering the entire area, with holes over the areas of interest.
You can run the Manage Map Server Cache Tiles tool with the option to delete tiles, using that polygon, to remove the extra cache that was created.
The best way is to create a simple Python script and let it run automatically every night to clean unneeded cache from your disk.
As far as I know, if you build your cache down to 1:5,000, it will not build cache for 1:1,000 even if you zoom there with 'Create Tiles On Demand' checked.
It will just resample the 1:5,000 cache. On-demand caching only builds tiles for uncached areas, not for new scales.
Have Fun
Mody