Create Space Time Cube By Aggregating Points - HDF error

05-18-2018 02:09 AM
HaydenWilson2
New Contributor

Hi!

I am trying to run the space time pattern mining tool on MODIS active fire product data in order to identify regions in South Africa where fires are increasing or decreasing in frequency from 01/01/2001 to 01/01/2018.

The data is projected and is formatted as follows:

FID (Object ID); Shape; LATITUDE (numeric); LONGITUDE (numeric); BRIGHT_TI4 (numeric); SCAN (numeric); TRACK (numeric); DATE (date); TIME (text); SATELLITE (text); INSTRUMENT (text); CONFIDENCE (text); VERSION (text); BRIGHT_TI5 (double); FRP (double).

I am using the DATE field as the time field, with a time step interval of 1 month, time step alignment from the start time, and a hexagon grid for aggregation.
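
For reference, here is roughly what I am running from Python. The paths are placeholders and the parameter keywords are from memory of the arcpy.stpm tool reference, so please check them against your version of Pro:

import arcpy

arcpy.env.overwriteOutput = True

# Rough sketch of my tool call -- paths and keyword names are placeholders/assumptions.
arcpy.stpm.CreateSpaceTimeCube(
    in_features="C:/Data/Fires/modis_fires_projected.shp",  # placeholder path
    output_cube="C:/Data/Fires/fires_cube.nc",              # placeholder path
    time_field="DATE",
    time_step_interval="1 Months",
    time_step_alignment="START_TIME",
    aggregation_shape_type="HEXAGON_GRID",
    # distance_interval (hexagon size) can also be supplied; I let the tool pick a default.
)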

ArcGIS Pro seems to have no problem reading the data. However, as soon as it starts the actual process of creating the space time cube, the analysis fails and I get the following error:

Traceback (most recent call last):
File "<string>", line 1158, in execute
File "C:\Program Files\ArcGIS\Pro\Resources\ArcToolbox\Scripts\SSCube.py", line 90, in __init__
self.__initialize(cubeObj)
File "C:\Program Files\ArcGIS\Pro\Resources\ArcToolbox\Scripts\SSCube.py", line 163, in __initialize
self.createVariable('time_step_ID', timeIDValue, dType = 'i4')
File "C:\Program Files\ArcGIS\Pro\Resources\ArcToolbox\Scripts\SSCube.py", line 964, in createVariable
var[:] = varValue
File "netCDF4\_netCDF4.pyx", line 4053, in netCDF4._netCDF4.Variable.__setitem__ (netCDF4\_netCDF4.c:44186)
File "netCDF4\_netCDF4.pyx", line 4258, in netCDF4._netCDF4.Variable._put (netCDF4\_netCDF4.c:45707)
RuntimeError: NetCDF: HDF error
Failed to execute (CreateSpaceTimeCube).

It looks to me like it is struggling to write a header file for some reason.
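
In case it helps anyone debugging the same thing, here is a small stand-alone sanity check I would try outside the tool, to see whether the netCDF4/HDF5 stack can even write a NETCDF4 file to the target location (the path below is just a placeholder; point it at the same drive/folder as your output cube):

import numpy as np
from netCDF4 import Dataset

test_path = r"C:\Temp\hdf_write_test.nc"  # placeholder path

ds = Dataset(test_path, "w", format="NETCDF4")  # same underlying format as the space time cube
ds.createDimension("time", 204)                 # ~204 monthly steps between 01/01/2001 and 01/01/2018
var = ds.createVariable("time_step_ID", "i4", ("time",))
var[:] = np.arange(204, dtype="i4")             # mirrors the assignment that fails in SSCube.py
ds.close()
print("NetCDF4/HDF5 write succeeded")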

Has anyone else had this error, and if so, how did you resolve it?

Many thanks!

2 Replies
HECAdmin
New Contributor III

Did you ever solve this issue? I'm running into the same error.

Thanks!

HaydenWilson2
New Contributor

Sadly, no. I wound up having to work around it by building a virtual cube in Postgres and performing all my analyses there.
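
In case it is useful, the workaround boiled down to something like the sketch below. The connection string, table, and column names are placeholders for my own schema, and instead of hexagons I simply binned the points by rounded lat/lon and month:

import psycopg2

# Placeholder connection string and table/column names -- adjust to your own database.
conn = psycopg2.connect("dbname=fires user=postgres")
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS monthly_fire_counts AS
        SELECT date_trunc('month', "DATE")    AS month_start,
               round("LATITUDE"::numeric, 1)  AS lat_bin,   -- ~0.1 degree bins instead of hexagons
               round("LONGITUDE"::numeric, 1) AS lon_bin,
               count(*)                       AS fire_count
        FROM modis_fires
        GROUP BY 1, 2, 3
    """)
conn.close()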
