Large imagery map service help! Also, future support for Raster Catalog?

02-12-2015 11:49 AM
JacquelinePursell
Occasional Contributor

I am unable to purchase the imagery extension because the cost doesn't fit in our budget.  I already have a mosaic dataset for our users to use in their MXDs, but I need to publish this for use in our internal and public web maps.  I never had a raster problem before; I have mosaicked and georeferenced large imagery for years, but 10.x seems to have made things so difficult.  I would like ERDAS IMAGINE, but that is also well beyond our budget.

 

This imagery is comprised of 4000+ individual TIFF images totaling about 1.25 TB.  I am running this on beefed-up server machines using ArcCatalog, Python, or ArcGIS Pro.  I have also been running everything locally, all contained on the same server.

 

Here is what I have tried so far (each multiple times with various changes):

1. Mosaicking into one TIFF image: the result is 0.99 TB before pyramids, and with pyramids it climbs to 8 TB, so I scrapped that.  That is just absurd!

 

2. I tried to mosaic into a file geodatabase, but my configuration keyword never stuck (Esri says there is no way to tell whether a keyword was applied).  The image stopped processing at exactly 1 TB.

 

3.  I tried to mosaic into an Oracle SDE geodatabase; this crashed as well (I tried this various ways using various tools and environment settings).

 

4.  I tried to mosaic into the Oracle SDE geodatabase again, this time with JPEG compression, but JPEG doesn't support 4 bands, so I used a layer file that drops a band.  This worked with one image, but when I try it with the 4000+ it crashes about halfway through.

 

5. On advice from Esri Tech Support, I tried using ArcGIS Pro; this failed the same as every other process.

 

6.  I tried multiple variations of the above in Python scripts; they crash after a week or two, with about half the state finished.  (A rough sketch of one of these scripts is below.)

 
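For reference, here is roughly what one variation of my mosaicking script looks like; the folder, geodatabase path, compression, and pixel settings below are placeholders rather than my exact values:

```python
import arcpy

# Environment settings control output compression and pyramid building.
arcpy.env.workspace = r"D:\Imagery\Tiles"        # placeholder folder holding the source TIFFs
arcpy.env.compression = "JPEG 75"                # placeholder output compression
arcpy.env.pyramid = "PYRAMIDS -1 BILINEAR"       # full pyramids with bilinear resampling

tiffs = arcpy.ListRasters("*", "TIF")            # collect every TIFF in the workspace

# Create the output file geodatabase if it does not exist yet (placeholder path).
if not arcpy.Exists(r"D:\Imagery\Statewide.gdb"):
    arcpy.CreateFileGDB_management(r"D:\Imagery", "Statewide.gdb")

# Mosaic all tiles into a single new raster dataset inside the file geodatabase.
arcpy.MosaicToNewRaster_management(
    tiffs,                                       # input rasters
    r"D:\Imagery\Statewide.gdb",                 # output location
    "Statewide",                                 # output raster dataset name
    "",                                          # coordinate system (inherit from inputs)
    "8_BIT_UNSIGNED",                            # placeholder pixel type
    "",                                          # cell size (inherit)
    4)                                           # number of bands
```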

I have spent almost a year now (constantly running something) trying to mosaic this imagery and have failed time and time again.  Each run sits for two months minimum in ArcCatalog, a few weeks in ArcGIS Pro, and a week or so in Python.  I have tried everything short of a raster catalog.  I am going to try the free, open-source QGIS today or tomorrow, and then I am throwing my hands up and using a raster catalog.  Esri Support (multiple calls) has not offered me anything useful other than a sales pitch for the imagery extension, which I have repeatedly said is beyond our budget.

 

So what is the future for Raster Catalogs?  Will they be supported in future versions?  Would it be wise to go this route?   At the moment it may be my only option.  I need to do this every 2-3 years.

 

Suggestions? Comments?

5 Replies
larryzhang
Occasional Contributor III

Jacqueline,

For your case (you want to serve raster images over the network/Internet but don't have the budget for ArcGIS Server with the Image extension or ERDAS Apollo), and with SDE/Oracle available, it is advisable

  1. to manage the 4000 raster images in SDE/Oracle (on SAN storage) via a raster catalog (unmanaged, i.e. referencing the files in their folder), and then
  2. to make them accessible to your audiences/applications through an SDE/Oracle connection.

Regarding use of the raster catalog, rather than loading all raster images from the catalog at once, it is better to write code in the client applications that retrieves/queries individual rasters by footprint ID…
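A rough arcpy sketch of that approach (the SDE connection file, folder path, catalog name, and footprint ID below are placeholders only):

```python
import arcpy

sde = r"C:\connections\oracle_raster.sde"          # placeholder SDE connection file
source_folder = r"\\imgserver\statewide_tiffs"     # placeholder folder of source TIFFs

# Create an unmanaged raster catalog: footprints live in Oracle, pixels stay in the folder.
arcpy.CreateRasterCatalog_management(
    sde, "IMAGERY_CATALOG", raster_management_type="UNMANAGED")

# Register every raster in the folder (and subfolders) with the catalog.
arcpy.WorkspaceToRasterCatalog_management(
    source_folder, sde + r"\IMAGERY_CATALOG", "INCLUDE_SUBDIRECTORIES")

# In a client application, pull back a single footprint by ID instead of the whole catalog.
arcpy.MakeRasterCatalogLayer_management(
    sde + r"\IMAGERY_CATALOG", "single_tile", "OBJECTID = 42")   # 42 is a placeholder ID
```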

JeffreySwain
Esri Regular Contributor

Your specific use case is exactly what an image service is built to handle, but I understand the budget frustration.  I am not sure about the practicality of creating multi-TB mosaics to publish, or for that matter even creating them in your database.  The raster catalog in Oracle is actually a managed raster catalog, where the data is ingested into the database rather than linked to as with a file geodatabase, so anything created there will still be huge (the imagery is imported into the database first).  The question to answer is whether it is cheaper to come up with a beefy database (a SAN, as Larry recommended) or to try to increase the budget for Image Server.

If you only have internal customers, then you could potentially build the mosaic dataset on a network drive and use UNC paths to link to the data. All of your users will need access to the mosaic dataset, the source data, and the overviews, but that will work internally.  If you want to share it over the web, then you are back to Image Server.
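For internal users, that setup could be scripted along these lines (the UNC share, geodatabase path, and spatial reference here are placeholders only):

```python
import arcpy

gdb = r"\\fileserver\gis\Imagery.gdb"        # placeholder file geodatabase on a network share
source = r"\\fileserver\gis\statewide_tiffs" # placeholder UNC folder holding the source TIFFs

# The mosaic dataset only stores references, so users need read access to both paths.
sr = arcpy.SpatialReference(26915)           # placeholder: NAD 1983 UTM Zone 15N
arcpy.CreateMosaicDataset_management(gdb, "Statewide", sr, num_bands=4)

# Add the source rasters and build overviews so the small scales draw quickly.
arcpy.AddRastersToMosaicDataset_management(
    gdb + r"\Statewide", "Raster Dataset", source,
    update_overviews="UPDATE_OVERVIEWS")
```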

Sorry for the hassle and I wish you luck with the budgeting discussion if that is the route you go.

GordonSumerling
Esri Contributor

Hello Jacqueline,

Have you considered using a map cache of the mosaic dataset? If you do not need to analyse the imagery, it could be a smart option.

Creating an image cache is a relatively easy exercise.
1. In a file geodatabase, create a mosaic dataset and add all your imagery to it. This gives us a seamless mosaic of the imagery.
2. Make sure the boundaries of the individual images and of the entire mosaic have been clipped to remove all the padding (NoData regions).
3. Decide on the scales you would like to build the cache to. Generally speaking, a 50 cm image is cached to 1:512. The smallest scale that consumer mapping applications go to is 1:1128, which is roughly 1 m resolution.
4. Create a tiling scheme file that reflects the scales you decided upon.
5. Using the tiling scheme file as a template, export the mosaic dataset as a managed tile cache (a rough arcpy sketch of steps 4 and 5 follows this list). When caches for imagery are created in ArcGIS, they work from the largest scale first and progressively get smaller.
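A rough arcpy sketch of steps 4 and 5 might look like this; the mosaic dataset path, output locations, cache name, and scale list are placeholders that you would swap for the scales chosen in step 3:

```python
import arcpy

md = r"C:\data\Imagery.gdb\Statewide"          # placeholder mosaic dataset
scheme = r"C:\caching\imagery_scheme.xml"      # placeholder tiling scheme file
scale_list = [4513.988705, 2256.994353, 1128.497176, 564.248588]  # placeholder scales

# Step 4: build a tiling scheme file with the chosen scales and a JPEG tile format.
arcpy.GenerateTileCacheTilingScheme_management(
    md, scheme, "NEW", len(scale_list),
    scales=scale_list, tile_format="JPEG")

# Step 5: export the mosaic dataset as a managed tile cache using that scheme as the template.
arcpy.ManageTileCache_management(
    in_cache_location=r"C:\caching",
    manage_mode="RECREATE_ALL_TILES",
    in_cache_name="StatewideImagery",
    in_datasource=md,
    tiling_scheme="IMPORT_SCHEME",
    import_tiling_scheme=scheme,
    scales=scale_list)
```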

Now here comes the smart bit: we trick ArcGIS into serving a cache of imagery while only publishing a vector layer.
1. In ArcGIS for Desktop, add a vector layer that matches the minimum bounding area of the imagery.
2. Share this as a map service to ArcGIS for Server.
3. In the configuration options of the service, enable a cache but do not create one. Include the tiling scheme file from step 4 above; this ensures the correct scales and image formats are used.
4. Once the service is up and running, simply copy the cache you created in step 5 above into the service (see the sketch after this list). The service will then ignore the vector layer and only return the cached imagery when requests are made against it.
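For the final copy step, something like the following would do it; the server's cache directory location and the service name below are only assumptions about a default ArcGIS Server install, so adjust them to your environment:

```python
from distutils.dir_util import copy_tree

# Placeholder paths: the tile cache built in step 5 and the empty cache folder of the
# published map service (assuming a default ArcGIS Server cache directory layout).
built_cache = r"C:\caching\StatewideImagery"
service_cache = r"\\gisserver\arcgisserver\directories\arcgiscache\ImageryBoundary"

# Merge the pre-built tiles into the service's cache directory.
copy_tree(built_cache, service_cache)
```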

You can now access this imagery from any web application, desktop tool or mobile tool with ease.

This process is outlined in the Image Management guide book. I find it very useful for many of my imagery workflows.

Gordon

JedFehrenbach
New Contributor

Perhaps looking into other compression types to get your file geodatabase below 1 TB could also be a solution.  I once had to load many thousands of TIFF images (covering the entire state of Texas) into a raster catalog in an Oracle database and spent time learning about compression types, sampling methods, quality, and pyramid settings.  I have found JPEG2000 (not JPEG; they are quite different behind the scenes) at 80 to 85 compression quality to be a great balance between disk space and perceived quality.  Also, for aerial photos use bilinear sampling.
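As a rough illustration (the tool choice, paths, and exact values here are examples rather than a prescribed recipe), those settings map onto the geoprocessing environments like this:

```python
import arcpy

# These environments apply to rasters written by subsequent geoprocessing tools.
arcpy.env.compression = "JPEG2000 85"        # JPEG2000 at 85% quality (80-85 is a good range)
arcpy.env.resamplingMethod = "BILINEAR"      # bilinear sampling suits aerial photography
arcpy.env.pyramid = "PYRAMIDS -1 BILINEAR"   # full pyramids, also resampled bilinearly

# Placeholder paths: copy one source TIFF into a file geodatabase with those settings applied.
arcpy.CopyRaster_management(
    r"D:\Imagery\Tiles\tile_0001.tif",
    r"D:\Imagery\Statewide.gdb\tile_0001")
```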

DanPatterson_Retired
MVP Emeritus

A general discussion of raster compression (the compression types, how compressed rasters are stored, whether the data will be used for analysis, and so on) is given in the basic ArcGIS Desktop help, and I am sure it is transferable to any method of deployment:

Raster compression—Help | ArcGIS for Desktop
