Caching stops at 46.1%,

05-30-2013 11:28 PM
JamalNUMAN
Legendary Contributor

I'm wondering why caching stopped when the percentage completed is only 46.1%.

[ATTACH=CONFIG]24847[/ATTACH]

What about the rest?

Why is the caching no longer in progress?


Thank you


Best

Jamal
----------------------------------------
Jamal Numan
Geomolg Geoportal for Spatial Information
Ramallah, West Bank, Palestine
TimDine
Occasional Contributor II
Did you generate your cache based on the purple feature class?  It wouldn't have generated tiles for the white space which could make up a significant number of tiles for the whole cache.  The full cache would be rectangular.
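Tim's point can be sanity-checked with simple geometry: if tiles are generated only inside the polygon area of interest, the reported percentage should stall near the ratio of the polygon's area to the area of its full bounding rectangle. A minimal sketch in plain Python (not ArcGIS code; the shape and numbers are made up for illustration):

```python
# Plain-Python sanity check (not ArcGIS code): when tiles are generated only
# inside a polygon area of interest, the reported percentage stalls near the
# ratio of the polygon's area to the area of its full bounding rectangle.

def polygon_area(points):
    """Area of a simple polygon via the shoelace formula."""
    total = 0.0
    for i, (x1, y1) in enumerate(points):
        x2, y2 = points[(i + 1) % len(points)]
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

def expected_percent(points):
    """Approximate percentage of the rectangular cache the polygon covers."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    bbox_area = (max(xs) - min(xs)) * (max(ys) - min(ys))
    return 100.0 * polygon_area(points) / bbox_area

# A right triangle covers exactly half its bounding box, so caching an area
# of interest shaped like this would be "complete" at roughly 50%.
triangle = [(0, 0), (10, 0), (0, 10)]
print(round(expected_percent(triangle), 1))  # 50.0
```

An irregular governorate boundary covering about 46% of its bounding rectangle would produce exactly the kind of 46.1% plateau described above.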
JamalNUMAN
Legendary Contributor


Thank you very much Tim for the help,

I'm attaching the feature class that defines the area of interest for the cache.

[ATTACH=CONFIG]24865[/ATTACH]

I wanted the scale to reach 1:1,000, but there seems to be no way to do so. I tried it with 1:2,500, but it still has issues.

Considering the time and size needed, the cache is like a nightmare!

So what might be the best practice to cache my attached map up to a scale of 1:1,000 in reasonable time and size?


Best

Jamal
MichaelRobb
Occasional Contributor III
The percentages are somewhat meaningless.
A completed cache service could sit at even 8%, like some I have seen.
Especially if you send a geoprocessing request at a later time, or the cache is moved to an existing service, it will never reach 100%.
Go into the cache folders under arcgisserver and look in the geodatabase; all spatial and tabular status information about the cache is there (very detailed).
You can also check cache status with ArcCatalog against the cached service, which is finer grained than checking through ArcGIS Server Manager.

There are several ways to reduce the time/size of caches:
1. Use compressed MIXED format, down to 50% quality (these are .bundle files).
2. Use larger tile sizes.
3. Add more instances/processors.
4. Use a cluster dedicated purely to caching.
5. Cache based on a feature polygon rather than the map extent or full extent.
6. Cache on demand in certain cases.

Just a few I could think of off the top of my head.

We have done scales down to 1:286 in some maps, and massive-area maps down to level 18 (L18).
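As a rough illustration of why tile size (item 2 above) matters so much, here is some back-of-envelope tile arithmetic in plain Python. The 96 DPI figure and the area dimensions are assumptions for illustration, not output from any Esri tool:

```python
import math

# Back-of-envelope tile arithmetic (96 DPI and the area size are assumptions
# for illustration; this is not how ArcGIS itself estimates tile counts).

def tiles_needed(width_m, height_m, scale, tile_px):
    """Tiles required to cover a width_m x height_m area at 1:scale."""
    meters_per_pixel = scale * 0.0254 / 96        # inch-to-meter at 96 DPI
    tile_span_m = tile_px * meters_per_pixel      # ground distance per tile
    return math.ceil(width_m / tile_span_m) * math.ceil(height_m / tile_span_m)

# A roughly 60 km x 120 km area at the 1:1,000 level:
small_tiles = tiles_needed(60_000, 120_000, 1000, 128)
big_tiles = tiles_needed(60_000, 120_000, 1000, 512)

# Going from 128x128 to 512x512 tiles cuts the file count roughly 16x.
print(small_tiles, big_tiles, round(small_tiles / big_tiles, 1))
```

Millions of tiny files at one scale level is exactly why a small area can still take a long time to cache, and why larger tiles and the compact bundle format help so much.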
JamalNUMAN
Legendary Contributor




Thank you Michael for the answer,


The issue is that caching rarely succeeds! In the screenshot below I got the same problem: only three of the scales are 100% completed.

[ATTACH=CONFIG]24960[/ATTACH]

What might be the issue? Why is the "generation not in progress"?

Best

Jamal
MichaelRobb
Occasional Contributor III
Define "caching succeeds"?
"Generation not in progress" could mean the caching is complete.
As I mentioned above, this does NOT mean you will see 100%; I have completed caches showing 18%. There are a lot of variables determining the percentages.


Did you do what I mentioned and look at the geodatabase status? My bet is it will say SUCCESS for all records.
JamalNUMAN
Legendary Contributor



Many thanks Michael,

I never succeed in caching, even with a scale as low as 1:1,000.

The unfinished cache does not work! I checked this by adding the service link to the Silverlight viewer.
[ATTACH=CONFIG]25767[/ATTACH]

Facts:
1. Caching takes massive time even when caching a small area (like the West Bank). In contrast, how is a map like the one in the link below cached and working fine?

http://www.govmap.gov.il/


2. Caching never ends successfully.

3. Caching takes up huge space on the hard drive.

[ATTACH=CONFIG]25768[/ATTACH], [ATTACH=CONFIG]25769[/ATTACH]

If the cache never works, then there is no point in publishing the data, because the speed is very slow.


What might be the solution?
MarcoBoeringa
MVP Regular Contributor
Jamal,

I can't comment on why things might be failing at your site, Michael is in a better position to do that, but in general, you should not consider caching as the only alternative for hosting data.

If you just want to serve non-complex data, like the small-scale polygons of provinces you showed in this thread, with few vertices per polygon and simple outline symbology, you are probably much better off serving the data straight from the feature source (or a copied file geodatabase) than creating a huge cache, which will bloat storage requirements compared to the original feature data.

Caches are best used for complex, large-scale data that may take up (tens of) gigabytes of disk space using complex symbology: detailed cadastral land records, high-quality topographic data, etc.

E.g. say you have an MXD with 20 layers of large-scale topographic data, symbolized using layered line symbology for roads, complex hatch symbols for polygons representing 50 different land-use types, dynamic labelling, and so on, with a redraw time of (a dozen) seconds per screen in ArcMap. If this MXD needs to serve as a basemap layer, e.g. a background to other, more dynamic data, then that type of data is best served through cached services, as you would never be able to serve it dynamically to hundreds, or thousands, of users at a time. Just imagine the server needing 10 seconds to render a single image for every hit to your website before it can send the rendered data to the user...

Which method you choose to host the data really depends on the type of data and the complexity of the symbology.
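To put numbers on the rendering argument above, here is the arithmetic as a tiny Python sketch. Every figure in it (render time, instance count, tile serve time) is an assumed example value, not a benchmark:

```python
# Illustrative arithmetic only: every number below (render time, instance
# count, tile serve time) is an assumed example value, not a benchmark.

render_seconds = 10                # assumed redraw time of the complex MXD
instances = 4                      # assumed number of service instances

# A dynamic service must render every request from scratch:
images_per_minute = instances * 60 / render_seconds
print(images_per_minute)           # 24.0 map images/minute for ALL users

# A cached tile is a static file the web tier can serve in milliseconds:
tile_ms = 5                        # assumed time to serve one cached tile
tiles_per_minute = instances * 60_000 / tile_ms
print(int(tiles_per_minute))       # 48000 tiles/minute on the same hardware
```

The exact numbers do not matter; the point is that caching turns a per-request rendering bottleneck into plain static file serving, which scales orders of magnitude better.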
JamalNUMAN
Legendary Contributor




Many thanks Marco for the very informative answer,

In my case, 14 layers need to be published, with a satellite image as the background. What might be the best scenario to do this?

[ATTACH=CONFIG]25797[/ATTACH], [ATTACH=CONFIG]25799[/ATTACH], [ATTACH=CONFIG]25800[/ATTACH]


For the time being, the published MXD (service) that contains these layers and the satellite image is quite slow, even before being accessed by the web mapping application. The service itself is quite slow.

[ATTACH=CONFIG]25801[/ATTACH]

Therefore, all users are complaining about the very slow panning and zooming of the maps in the web application.

[ATTACH=CONFIG]25802[/ATTACH]

By the way, how does Google provide massive maps at relatively high speed? I think the only way is to cache them. Is that true?
MichaelRobb
Occasional Contributor III
There are a vast number of variables here.

First off, you say "huge"; I laugh, as we have caches 10x that size even for small areas, though mind you, we go down to 1:256 for some serious detail.
Even with a tiny area, you are going down to the 1:1,000 level, so depending on the project area, the size is to be expected. Using the exploded storage format (which creates many folder directories as well as JPEGs, though its advantage is that caches merge quite easily) exaggerates the size problem.
It is further exaggerated by using 128x128 tiles, and again by using PNG without compression.

Try re-running the cache using the EXTENT of the map (after, of course, setting the area you want). I have had strange problems using features as the area of interest (though I think that was due to custom projections being used).

Use MIXED tile format at 75 compression; you won't notice a difference!
Use 512x512 tiles.
Use the bundle (compressed) storage format.

As for the estimates shown, I've yet to see one that is accurate or even close; it's a big "guess" based on samples and algorithms.

As for the "caching never ends successfully":
what do the geodatabase results show you? Which failures, if any, are in the attribute table?

Another thing you can do is create the service with caching set to MANUAL, and then send the GP request through the cache manager in ArcCatalog to the map service once it is published.
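The effect of the format advice above can be roughed out in a couple of lines of Python. The per-tile sizes below are assumed ballpark figures for a 512x512 imagery tile, not measurements from any real cache:

```python
# Rough cache-size estimate; the per-tile sizes are assumed ballpark figures
# for a 512x512 imagery tile, not measurements from any real cache.

def cache_size_gb(tile_count, avg_tile_kb):
    """Total size in GB for tile_count tiles averaging avg_tile_kb each."""
    return tile_count * avg_tile_kb / 1024 / 1024

tiles = 400_000                            # e.g. one large-scale level
png_gb = cache_size_gb(tiles, 180)         # assumed ~180 KB/tile, 24-bit PNG
mixed_gb = cache_size_gb(tiles, 35)        # assumed ~35 KB/tile, JPEG at 75

# MIXED/JPEG at 75 shrinks this example level from ~69 GB to ~13 GB.
print(round(png_gb, 1), round(mixed_gb, 1))
```

With assumed per-tile sizes like these, switching from uncompressed PNG to MIXED at 75 cuts the example level roughly fivefold, which is why the format choice dominates the disk-space question.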