I am currently working with a large dataset of hundreds of GeoTIFFs (interferograms) that have been added to several .mxd projects over the past few years (total data size >1 TB). The satellite data, in GeoTIFF format, is stored on a server in different subfolders and organised according to the layer structure sketched out below.
This approach worked for a few years; recently, however, the .mxd projects have become slow in ArcGIS and more prone to crashes. An efficient restructuring of the data storage and of the .mxd projects is therefore necessary, and I am wondering whether any best practices or suggestions exist that would improve performance and storage.
The data is not stored in geodatabases at the moment, but as .tif files in folders. Raster catalogs are not an option: the data is temporal and often has large overlaps, and we need the ability to toggle individual layers on and off for comparison (rather than accessing layers only via the attribute table, as is the case with a raster catalog). Storing the data in a mosaic dataset is also problematic because of the overlaps and the inability to toggle individual layers on and off.
Additional data will be added as new deliveries come in every month. Would saving the data in a geodatabase be an advantage? Or do you have other suggestions for storing new data efficiently? The various .mxd projects (100 MB+) can still be opened, but adding new data to them is very slow and is currently done manually. Would it be possible to add new data to an .mxd project in an automated process using arcpy? Another option might be to compress the data (lossy JPEG, or a lossless scheme such as LZW or DEFLATE?).
I would be grateful for any suggestions, experiences, or ideas.