
Reading large NetCDF file

12-03-2012 09:00 PM
by Deactivated User
I'm trying to create a large NetCDF file that I can load into ArcMap 10.0.
For smaller files (e.g. 100 MB) this works, but for larger files (e.g. 5 GB) the Make NetCDF Raster Layer tool either takes hours or hangs indefinitely (possibly because I give up after a few hours).
Are there any limitations on how large a NetCDF file can be?
Are there any particular requirements for ArcMap to load the file in a reasonable time?
The 5 GB file I'm trying to load at the moment has this header (from ncdump -h):

netcdf largeFillValue { // format variant: 64bit
dimensions:
        x = 39393 ;
        y = 17766 ;
variables:
        double x(x) ;
        double y(y) ;
        double z(x, y) ;
                z:_FillValue = 9.96920996838687e+036 ;
}

Do I need to add any specific parameters to the file?
The values are all eastings/northings, not lon/lat.
5 Replies
JeffreySwain
Esri Regular Contributor
Remember you are creating a layer, and it may be subject to the memory limits of your system. While you may have a lot of RAM to spare, ArcMap is a 32-bit process in the foreground; at 10.1, background geoprocessing is 64-bit. Can you indicate whether the process fails due to memory? While I have never tried to create a NetCDF of that size, I would think that the limitations of a layer file on your system would be the issue.
by Deactivated User
There's nothing to indicate that lack of memory is the problem. According to Task Manager, ArcMap, ArcSOCP, and ArcSOMP use about 500 MB between them - nowhere near the 2 GB per-process limit - after I have loaded the previously described 5 GB file. (I don't know how long it took to load - I left it running overnight after waiting an hour and then going home.) I get similar results on Windows XP with 3 GB RAM + 3 GB page file and on Windows 7 32-bit with 12 GB RAM.
by Deactivated User
I've found the problem: z must be stored in row-major order, i.e. z(y, x), not z(x, y).
The latter works but is unusably slow, especially with larger data sets (presumably once the size of z exceeds the cache).
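The effect of that dimension order can be sketched with plain numpy (an illustration of memory layout, not of ArcMap internals; the sizes are scaled-down stand-ins). In a classic NetCDF file the last dimension varies fastest, so z(y, x) stores each raster row contiguously, while z(x, y) forces a strided read for every row:

```python
import numpy as np

nx, ny = 3939, 1776                # scaled-down stand-ins for the real grid

z_xy = np.zeros((nx, ny))          # z(x, y): y varies fastest, like the original file
z_yx = np.zeros((ny, nx))          # z(y, x): x varies fastest (the fix)

# One raster row = all x values at a fixed y.
row_from_xy = z_xy[:, 0]           # strided: one element picked from each ny-long run
row_from_yx = z_yx[0, :]           # contiguous: a single sequential run

assert not row_from_xy.flags["C_CONTIGUOUS"]
assert row_from_yx.flags["C_CONTIGUOUS"]
# Step between consecutive elements, in bytes: ny * 8 vs. 8
assert row_from_xy.strides == (ny * 8,)
assert row_from_yx.strides == (8,)
```

Reading a full-width row from the z(x, y) layout therefore touches the entire array's extent on disk, which matches the thrashing once z no longer fits in cache.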
by Deactivated User
Well, that fix works in some cases....
Specifically, on Windows 7 64-bit with 32 GB RAM, ArcGIS Desktop 10, ArcMap 10.0 build 2414, it works, loading the 5 GB NetCDF file in about 2 minutes.
However, on Windows XP 32-bit with 4 GB RAM, ArcGIS Desktop 10 SP2, ArcMap 10.0 build 3200, the same file thrashes the hard disk continuously, takes 15 minutes or longer to load, and is then unusably slow.
JeffreySwain
Esri Regular Contributor
If you are creating the NetCDF yourself, I always recommend using a format checker to find any errors in the formatting. Out of curiosity, how is the performance of a 1 GB NetCDF file on your system? Most of the NetCDFs I have used are nowhere near that size and still generate fairly good-sized rasters when exported. Have you looked at how large the 5 GB NetCDF raster output will be? Based on your description, I would still consider using a smaller file.