
Do you have imagery from an aerial photography camera (whether a modern digital camera or scanned film), along with orientation data from either direct georeferencing or the results of aerial triangulation? If so, you’ll want to work with a mosaic dataset and load the imagery with the proper raster type.

 

The mosaic dataset provides the foundation for many different use cases, including:

  • On-the-fly orthorectification of images in a dynamic mosaic, for direct use in ArcGIS Pro or sharing through ArcGIS Image Server
  • Production of custom basemaps from source imagery
  • Managing and viewing aerial frame imagery in stereo
  • Accessing images in their Image Coordinate System (ICS)


There are different raster types that support the photogrammetric model for frame imagery.  If you have existing orientation data from ISAT or Match-AT, you can use the raster types with those names to load the data directly (see the Help for details).

 

For a general frame camera, you’ll want to know how to use the Frame Camera raster type, and we have recently updated some helpful resources (a scripted sketch of the basic loading steps also appears below):

[Image: UI for automated script]
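
If you prefer to script the workflow rather than use the UI, the core steps can be automated with ArcPy. The snippet below is only a minimal sketch: it assumes an ArcGIS Pro Python environment and a frames table prepared per the Frame Camera best practices document, the paths and table names are placeholders, the cameras table is supplied through the raster type properties, and parameter details may differ by release, so verify against the tool documentation before running.

    import arcpy

    # Placeholder paths: substitute your own geodatabase and frames table.
    gdb = r"C:\data\aerial.gdb"
    md_name = "FrameImagery"
    frames_table = r"C:\data\frames.gdb\Frames"   # exterior orientation per image
    srs = arcpy.SpatialReference(32611)           # projected CRS of the project area

    # Create an empty mosaic dataset, then load the frames with the
    # Frame Camera raster type so the photogrammetric model is applied.
    arcpy.management.CreateMosaicDataset(gdb, md_name, srs)
    arcpy.management.AddRastersToMosaicDataset(
        gdb + "\\" + md_name,
        "Frame Camera",        # raster type
        frames_table)          # input path: the frames table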

 

Further information:

  • Note that if your imagery is oblique, the Frame Camera raster type supports multi-sensor oblique images. Refer to http://esriurl.com/FrameCameraBestPractices for configuration advice.
  • If you want to extract a digital terrain model (DTM) from the imagery, or improve the accuracy of the aerial triangulation, see the Ortho Mapping capabilities of ArcGIS Pro (Advanced license): http://esriurl.com/OrthoMapping
  • If you are seeking additional detail on the photogrammetric model used within the Frame Camera raster type, see this supplemental document: http://esriurl.com/FrameCameraDetailDoc

For FMV in ArcGIS (ArcGIS Pro 2.2 with the Image Analyst extension, or ArcMap 10.x with the FMV add-in) to display videos and place the footprint in the proper location on the map, the video must include georeferencing metadata multiplexed into the video stream.  The metadata must be in MISB (Motion Imagery Standards Board) format, originally designed for military systems.  Information is available at http://www.gwg.nga.mil/misb/index.html, but drone users do not need to study this specification.  For non-MISB datasets, Esri has created a geoprocessing tool called the “Video Multiplexer” that processes a video file together with a separate metadata text file to create a MISB-compatible video.  This is described more completely (e.g., the format of the metadata for camera location, orientation, field of view, etc.) in the FMV Manual at http://esriurl.com/FMVmanual.
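
Purely as an illustration of what that metadata text file carries (the FMV Manual linked above is the authoritative reference for the exact column names, order, and units the multiplexer expects), conceptually one metadata record pairs a timestamp with sensor position, platform and sensor orientation, and field of view, along these lines (field names follow MISB conventions; the values are invented):

    Precision Time Stamp              1537200000000000
    Sensor Latitude                   34.05712
    Sensor Longitude                  -117.19345
    Sensor True Altitude              412.6
    Platform Heading Angle            270.0
    Platform Pitch Angle              2.1
    Platform Roll Angle               -0.5
    Sensor Relative Azimuth Angle     0.0
    Sensor Relative Elevation Angle   -45.0
    Sensor Horizontal Field of View   62.0
    Sensor Vertical Field of View     38.0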

 

For those with DJI drones, the challenge then becomes “where is the required metadata?”  DJI drones write a binary-formatted metadata file with the extension *.dat (or possibly *.srt, depending on the drone and firmware) for every flight.  There is a free utility called “DatCon” (https://datfile.net/DatCon/downloads.html) which will reportedly convert the DJI files to ASCII format.

 

Key points:

  • Esri has not tested and cannot endorse this free utility. If you choose to use it, as with any download from the internet, you should check it for viruses and other risks.
  • DJI has changed the format of the metadata in this file on multiple occasions, so depending on your drone and the date of its firmware, you will find differences in the metadata content. Esri does not have a specification for this metadata at any version, so we cannot advise you on what to expect to be included in (or missing from) this file.
  • The DJI *.dat file was created for troubleshooting purposes; it was not created to support geospatial professionals seeking a complete metadata record for the drone, gimbal, and camera.  Users will therefore typically find temporal gaps in the metadata, and processing it through the FMV Multiplexer will likely generate inaccurate results unless you are willing to apply manual effort (requiring trial and error and substantial time) to identify the gaps and fill in your own estimated or interpolated values for the missing times and fields (a rough sketch of such interpolation appears after this list).
  • IMPORTANT: This blog was written in September 2018, and it is very possible that future DJI firmware changes will alter the readability and completeness of their metadata.
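
For those who do attempt the manual route, the sketch below shows one possible way to fill temporal gaps by interpolation, assuming you have already reduced the DatCon output to a CSV with a numeric time column. The file and column names are hypothetical, and note that simple linear interpolation cannot recover information that was never recorded (and will mishandle heading angles that wrap through 0/360 degrees).

    import numpy as np
    import pandas as pd

    # Hypothetical file and column names: adjust to match your DatCon export.
    df = pd.read_csv("dji_metadata.csv").set_index("time_s").sort_index()

    # Resample onto a regular 0.04 s (25 fps) timeline and interpolate the
    # numeric fields (latitude, longitude, altitude, ...) across the gaps.
    timeline = np.arange(df.index.min(), df.index.max(), 0.04)
    filled = df.reindex(df.index.union(timeline)).interpolate(method="index")
    filled = filled.loc[timeline]

    filled.to_csv("dji_metadata_filled.csv")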

 

There is an alternative to this, but it is not an Esri solution.  CompassDrone, an Esri business partner and DJI authorized distributor, has built a flight planning and flight control application called CIRRUAS using the DJI API.  This application has access to the DJI metadata in flight, and (among other features) is explicitly designed to capture complete metadata as defined by Esri for FMV support.  If you are using the CIRRUAS app, a metadata file will be captured and exported from the drone, and this will feed directly into the FMV multiplexer. 

 

The CIRRUAS app is available here https://compassdrone.com/software/dji2fmv-cirruas-app/.  For further discussion, please refer to the blog on this topic written by CompassDrone:  https://compassdrone.com/dat-srt-vs-cirruas/

 

A few final notes:

  • Our testing of the CIRRUAS app has yielded good results, but Esri does not provide technical support for the app.
  • Note that the CIRRUAS app must be used to plan and fly the mission, and this will initiate the recording of complete metadata. It cannot be applied to video that was previously recorded, since the metadata records will not be complete.
  • We are not aware of other alternatives that provide a solution for processing video from DJI drones for ArcGIS FMV.

 

Check back in this blog for updates as more capabilities are developed.

#EsriFMVDJI

With the Image Analyst extension in ArcGIS Pro 2.1 (or later), non-orthorectified and suitably overlapping images with appropriate metadata can be viewed in stereo!  This stereoscopic viewing experience can enable 3D feature extraction.  See more information at http://esriurl.com/stereo.

 

If your organization has a collection of images and you’d like to use the stereo viewing capability in ArcGIS Pro, where do you start?   The key questions are: 

  1. What type of sensor collected the data, and
  2. What orientation data do you have along with the images?

 

In order to display images as stereo pairs, ArcGIS must have detailed information about the location of the sensor (x,y,z) as well as its orientation – and this is unique information for every image.  Information about the sensor (typically called a camera model or sensor model) is also required. 

[Graphic showing the geometry of one stereo image pair]
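
To make that concrete, the exterior orientation typically arrives as one record per image, something like the hypothetical rows below (the exact field names required by the Frame Camera raster type are listed in the Frame Camera best practices document).  X, Y, and Z are the ground coordinates of the exposure station, omega/phi/kappa are the rotation angles, and the camera ID links each frame to its interior orientation (focal length, principal point, pixel size, distortion).

    Raster          PerspectiveX   PerspectiveY   PerspectiveZ   Omega   Phi     Kappa   CameraID
    img_0001.tif    456120.3       3812540.7      1850.2          0.21   -0.35   89.94   RGB_01
    img_0002.tif    456310.8       3812541.1      1851.0          0.18   -0.29   90.02   RGB_01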

 

There are a few conceptually simple cases, although each has important details to follow within its own workflow and documentation.

 

  • If you have two overlapping satellite images, you can go directly to stereo viewing.
  • If you have a collection of satellite images, you can build a mosaic dataset and ingest the images using the specific raster type for that satellite, run the Build Stereo Model geoprocessing tool, and then proceed to the stereo view (a scripted sketch of these steps appears after this list).  The raster type for the satellite reads the required orientation data.
  • If your imagery came from a professional aerial camera system:
    • If you have an output project file from aerotriangulation (AT) software (e.g. Match-AT or ISAT), ArcGIS includes raster types which ingest the orientation data for you, so this is similar to the satellite case: build a mosaic dataset with the proper raster type, Build Stereo Model, and proceed to stereo viewing.
    • If you have a project file from AT software that is not currently supported, Python raster types are under development for additional sensors, e.g. the Vexcel UltraCam. For more information, watch for announcements on GeoNet or on http://esriurl.com/ImageryWorkflows.  Alternatively, if you have a table of camera and frame orientation values, see the next bullet.
    • If you have a table of data values representing the exterior orientation as well as a camera model (interior orientation), you will build a mosaic dataset and ingest the images using the “Frame Camera” raster type.
    • If you have scanned film but not the results of AT software, refer to the Frame Camera best practices document (http://esriurl.com/FrameCameraBestPractices). With ArcGIS Pro 2.1, some values may have to be estimated, and the positional accuracy may not be optimal.  ArcGIS Pro 2.2 (and later versions) support fiducial measurement.
  • If your imagery was captured using a drone, you will need to use photogrammetric software to generate the camera model and orientation data.   
    • If you process your drone imagery using Ortho Mapping in ArcGIS Pro Advanced (see http://esriurl.com/OrthoMappingHelp), after the Adjust step is completed, the Image Collection mosaic dataset will be ready for viewing in stereo (after Build Stereo Model).
    • If you are using Drone2Map, please see this item on ArcGIS Online (http://esriurl.com/D2Mmanagement) to download a geoprocessing tool which can ingest the images into a mosaic dataset.
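
A scripted version of the satellite case above (build a mosaic dataset, add the images with the matching raster type, then Build Stereo Model) might look like the following minimal sketch.  It assumes an ArcGIS Pro Python environment; the raster type name and paths are only examples, and parameter details should be verified against the current tool documentation.

    import arcpy

    gdb = r"C:\data\stereo.gdb"
    md = "SatStereo"

    # Create the mosaic dataset and ingest the scenes with the raster type
    # that matches the sensor, so the orientation data is read automatically.
    arcpy.management.CreateMosaicDataset(gdb, md, arcpy.SpatialReference(4326))
    arcpy.management.AddRastersToMosaicDataset(
        gdb + "\\" + md,
        "WorldView-2",            # example raster type; use the one for your sensor
        r"C:\data\wv2_scenes")    # folder containing the source scenes

    # Identify the overlapping pairs that can be viewed in stereo.
    arcpy.management.BuildStereoModel(gdb + "\\" + md)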

 

For those interested in trying an example, a downloadable sample is available in this item on ArcGIS Online: http://esriurl.com/FrameCameraSample

Esri and Garmin are pleased to announce that Garmin’s VIRB Action Cameras (VIRB Ultra 30, VIRB X, VIRB XE, and VIRB Elite) have full support for Full Motion Video (FMV) for ArcGIS!  

 

To leverage this feature, users can download the latest version of VIRB Edit software (version 5.1.1 and above) from http://www8.garmin.com/support/download_details.jsp?id=6591 or simply search for “Virb edit software” at http://www.garmin.com.

 

The Full Motion Video add-in is a free download for ArcMap 10.3 through 10.5, and will be coming in ArcGIS Pro version 2.1 by the end of 2017.  Current users of ArcMap can find information on FMV at http://esri.com/FMV.  The Full Motion Video add-in allows users to manage, display, and analyze geospatially enabled videos within their GIS.  Feature data can be digitized from video frames, and GIS features can be overlaid onto the video during playback.  The video search tool provides a powerful data management capability, enabling users to quickly find archived videos based on attribute data or a simple geographic search. 

 

The Garmin VIRB cameras record GPS and camera orientation data with the video.  This position and orientation metadata enables FMV for ArcGIS to locate the sensor on the map, and if the camera footprint (field of view) is aimed toward the ground, the moving video footprint can also be displayed in ArcGIS. 

 

Instructions for extracting the VIRB position and orientation metadata and then processing with the Full Motion Video Multiplexer are available in this document:  http://esriurl.com/GarminVirbFMV

Esri has released its latest version of the Full Motion Video (FMV) add-in, version 1.3.2, with a series of new features and updated support for the current version of ArcMap (10.5, also compatible with 10.4.x and 10.3.x).

 

For video captured with appropriate metadata (regarding camera location and orientation), FMV enables viewing and processing of video in a mapping environment.  The FMV add-in is compatible with a variety of common video formats, whether captured from fixed vantage points, airborne platforms, or drones. 

 

Highlights of the new features and improvements include:

  • Increased performance, with support for 2.7K, 4K, and higher resolutions of digital video (performance may vary depending on CPU/GPU).
  • A new workflow for digitizing GIS features directly in the video that automatically populates custom metadata fields: Video_Date, Video_Time, and Video_Source.
  • A new video search workflow, improved search algorithm, and an updated UI make searching for archived videos faster and more intuitive.
  • The Capture Groups of Images tool in the FMV add-in allows users to extract individual frames while streaming live video.  Images can be easily added to a mosaic dataset using the Mosaic Video Frames GP tool.
  • The Mosaic Video GP tool now supports JPG, JP2, PNG, NITF, and TIFF file types. Georeferencing of each frame now uses a projective transformation to increase overall accuracy and eliminate unnecessary resampling of the imagery.
  • A new GP tool called Extract Video Frames for Orthomosaic has been added.  This tool can be used to extract individual images and associated MISB metadata for input to the Ortho Mapping tools in ArcGIS or the Esri Drone2Map application, where the video frames can be processed to generate an orthorectified mosaic image and other derived products.  Metadata captured with this tool is stored in external CSV files or embedded as EXIF headers.

 

As with prior versions, the FMV add-in remains free of charge to customers current on ArcGIS Desktop maintenance.  

Refer to the landing page at http://esri.com/FMV for further information, including a link to my.esri.com to download the software.