We would like to start using MISB-compliant video data in relation to our global pipe-lay projects. We performed tests using both the KLV Injector and Esri's own multiplexer. We experienced some compatibility issues with the KLV Injector data, which we hope will be resolved soon as this has worked before (CSA). Based on the successful multiplexer trials we have the following queries and suggestions:
- Restrictions in the MISB definitions (probably outside Esri's control, but it would be convenient if these limitations could be overridden).
- Support for additional fields associated with any project using linear referencing.
- Synchronization of multiple video streams and snapping to (interpolated) cursor position.
MISB Field Definition Restrictions
We found that the following fields have unnecessary restrictions, due to the original UAV scope, or ambiguities when used in combination with GIS:
Field #12 Image Coordinate System
Typically a local system would be used, but it seems that ArcGIS does not reference this field to assess on which spheroid the embedded latitude and longitude have been defined. Would it be possible to link this field to a WKID, as is done with all projections included in ArcGIS? In combination with Fields 13 and 14 stating WGS84 only, it suggests this field is not being used as a geodetic reference.
Field #13 & 14 Sensor Latitude and Longitude
Considering the existence of Field 12, it would seem logical if this could be any local latitude or longitude instead of a WGS84 one only. How does ArcMap use these latitudes and longitudes? Interpreting them as defined in the map itself, even if this is a local system, would actually be the most sensible solution, but the definition deviates from this.
Field #15 Sensor True Altitude
Considering we would like to use the system in a subsea setting, the lower limit of -900 should be extended to -12000 to cater for the deepest trench. This is important as we also gather local bathymetry that we need to intersect to obtain the video camera footprints on the map. Would it be possible to support lower altitudes without becoming non-compliant?
Field #21 Slant Range
We would like to derive this value from the DEM acquired at the same time. The multiplexer interface is somewhat confusing, however, as it initially allows for a fixed altitude only and only in the next phase allows the use of a DEM. Since the Z value of the cameras and the seabed depth are known, it would be more convenient to calculate the slant range from the camera position(s), camera altitude and DEM in one single step from Fields 13 through 20.
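To illustrate the single-step calculation we have in mind, a minimal sketch (the function name and coordinate conventions are our own assumptions, not an existing multiplexer API): the slant range is simply the 3-D distance between the camera position from Fields 13 through 15 and the seabed point sampled from the DEM.

```python
import math

def slant_range(cam_e, cam_n, cam_z, dem_e, dem_n, dem_z):
    """Slant range as the straight 3-D distance between the camera
    (easting, northing, altitude) and a seabed point sampled from the
    DEM, with both positions on the same grid and vertical datum
    (metres, negative below the reference surface)."""
    return math.dist((cam_e, cam_n, cam_z), (dem_e, dem_n, dem_z))

# Camera at -1000 m; seabed point 5 m away horizontally and 12 m deeper:
print(slant_range(0.0, 0.0, -1000.0, 3.0, 4.0, -1012.0))  # 13.0
```

This is exactly the quantity we would expect Field 21 to carry once depths below -900 m are permitted.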
Field #75 Sensor Ellipsoid Height
As for Field #15, the limit of -900 should be extended to -12000 to cater for the deepest trench. Would it be possible to support smaller heights without becoming non-compliant?
Fields That Seem Sensible to Add in Combination with Linear Referencing
Typically, surveys are performed using a local grid such as UTM and a route as the reference for measurements and distances (linear referencing). Since we also consider, and are requested, to support data models like the SSDM for seabed features and PODS for pipeline features, support for the fields below seems necessary. Could this be resolved by adding a second (configurable) table stream that could also be used to generate the project-specific video overlay? Overlay parameters are shown in blue.
Field 101 – Easting (X grid coordinate)
Field 102 – Northing (Y grid coordinate)
Field 103 – Water Depth (Z grid coordinate)
Field 104 – Elevation of the camera (Field 15 without depth restrictions)
Field 105 – Altitude (height above seabed for ROV/AUV)
Field 106 – Station or Kilometer Point (measurement in linear referencing)
Field 107 – Off-track (distance in linear referencing (negative to the left, positive to the right))
Field 108 – Task, based on a configurable string as Field 77 is restricted; examples would be pre-lay survey, as-laid survey, etc.
Field 109 – Location identifier based on a configurable string
Field 110 – Run-line identifier, the route used for linear referencing
Field 111 – ROV/AUV Dive Number (incremental number per project and per vehicle)
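To clarify how the proposed Fields 106 and 107 relate, a minimal sketch for a single straight run-line segment (a real route would be a polyline, and all names here are illustrative): the station is the along-track projection of the vehicle position onto the run-line, and the off-track value is the signed cross-track distance, negative to the left and positive to the right of the direction of travel.

```python
import math

def station_offset(a, b, p):
    """Station and signed off-track of point p relative to the
    run-line segment a -> b (grid coordinates, metres).
    Negative offset = left of the direction of travel, positive =
    right, matching the proposed Field 107 convention."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length            # unit vector along the run-line
    station = (px - ax) * ux + (py - ay) * uy    # along-track projection
    offset = (px - ax) * uy - (py - ay) * ux     # cross-track: +right, -left
    return station, offset

# Run-line heading east; vehicle 5 m along and 2 m to the right (south):
print(station_offset((0.0, 0.0), (10.0, 0.0), (5.0, -2.0)))  # (5.0, 2.0)
```

In practice the station would be accumulated over the preceding route segments to yield a Kilometer Point, but the per-segment arithmetic is as above.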
Stream Synchronization and Feature Snapping
Please refer to the attachment “Synchronise_FMV_Streams.png”.
Although we have a third-party program for the eventing, it would be very convenient if the same functionality were available in ArcGIS, in particular if we need to deliver data in the SSDM or PODS data models.
We typically run 3 cameras simultaneously that capture the pipeline from the top and from both sides. A green laser is used to align the cameras to capture the same area. Hence the geo-referencing would not be a necessity for us and, as can be seen from the example, we cover very small areas with each of the 3 cameras.
What is however crucial is the following:
- Rather than 3 independent streams, we would like to synchronize these streams using the embedded time. We have not located a button for stream synchronization yet.
- When events are generated there are 2 options:
- For the SSDM model we would digitize from the DTM and very rarely from the video.
- For PODS we would like to use the cursor position AND snap to the as-built line (a ZM polyline).
- At the moment we see the cursor jump based on the 1 s update interval provided. If we are eventing, we would like to interpolate between updates using time as the interpolator. I have enclosed a grab, but can also provide the test project raster and event data if that is of any use for testing purposes. This is real data from one of our clients, so it cannot be used online.
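The interpolation requested in the last point is plain linear interpolation of the position between two consecutive metadata updates, using the embedded time as the interpolator. A minimal sketch (function name and tuple layout are illustrative, not an existing ArcGIS API):

```python
def interp_position(t, t0, p0, t1, p1):
    """Linearly interpolate a position at time t between two metadata
    samples: (t0, p0) and (t1, p1), with t0 <= t <= t1.
    Timestamps in seconds; p0/p1 are (easting, northing) fixes."""
    f = (t - t0) / (t1 - t0)  # fraction of the interval elapsed at t
    return tuple(c0 + f * (c1 - c0) for c0, c1 in zip(p0, p1))

# Halfway between two 1 s samples, the cursor should sit halfway
# between the two fixes instead of jumping at the next update:
print(interp_position(0.5, 0.0, (0.0, 0.0), 1.0, (10.0, 20.0)))  # (5.0, 10.0)
```

Applied at the video frame rate, this would make the cursor glide along the track between metadata updates rather than jump once per second.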
Should any queries remain, please do not hesitate to contact the undersigned.
Robert van der Velden
Reporting and GIS Supervisor