
Use lidar data in Reality Studio

Iro_Boutsi, Esri Contributor
06-16-2025

ArcGIS Reality Studio 2025.1 introduces lidar data integration to improve the quality and completeness of geospatial products. Lidar data complements imagery, especially where the surface geometry is difficult to reconstruct from images alone, such as highly reflective surfaces, dense vegetation, low-texture areas like concrete or snow, or complex urban environments with occlusions.

In this blog post, we walk you through data preparation and requirements, best practices, and the essential steps to integrate your lidar data into the reconstruction workflow.

Input data preparation

Supported formats

Reality Studio requires the following lidar data types:

  • Trajectories: One or more files that contain the position and time information of the lidar sensor during data acquisition.
    • Formats:
      • SBET (.out): Smoothed Best Estimate of Trajectory, from Applanix
      • SOL (.sol): Solution File, from HxMap
    • Required information:
      • Position: Latitude, longitude, altitude (typically in the WGS84 reference frame)
      • Time format: GPS Week Seconds
  • Point clouds: A set of point clouds postprocessed after the data acquisition survey.
    • Formats:
      • LAS (.las): up to version 1.4
      • LAZ (.laz)
  • Classification:
    • Reality Studio uses the first-return points (first echo).
    • Points classified as noise are automatically filtered out.
    • Classification into other classes, such as Ground or Vegetation, is not required (a way to preview this filtering is sketched after this list).
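
To preview which points will actually be used, you can replicate this filtering with the open-source laspy package (not part of Reality Studio). A minimal sketch; the file name is illustrative, and the noise codes assume the standard ASPRS LAS 1.4 classes:

    # pip install laspy  (add lazrs or laszip for .laz support)
    import laspy

    las = laspy.read("strip_01.las")  # hypothetical LAS/LAZ file, up to version 1.4

    # Reality Studio keeps first-return points and drops classified noise.
    # ASPRS LAS 1.4 class codes: 7 = low noise, 18 = high noise.
    first_return = las.return_number == 1
    is_noise = (las.classification == 7) | (las.classification == 18)
    usable = first_return & ~is_noise

    print(f"{int(usable.sum())} of {len(las.points)} points would be used")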

Technical requirements

Before importing your lidar data into Reality Studio, prepare the data according to the following requirements:

1. Cloud-to-trajectory time synchronization.

Each lidar point cloud must belong to exactly one of the given trajectories.

  • The time ranges of the lidar points and trajectories must overlap:
    • Point cloud timestamps (the time at which each point was observed) must fall within the trajectory’s logged time range.
    • No lidar point cloud may overlap with more than one trajectory.
  • The time metadata contained in the lidar point clouds must be set correctly:
    • The supported time format is GPS Week Seconds (seconds since the start of the current GPS week).
    • The LAS file’s Global Encoding header bit field must be set to 0.

Any mismatch in time range or format will result in an error during the surface reconstruction step and interrupt the process; a quick pre-import check is sketched below.
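
Since these checks happen only at reconstruction time, it can pay off to validate the files up front. Below is a minimal Python sketch (not part of Reality Studio); the file name and time ranges are illustrative. It reads the Global Encoding field directly from the LAS public header (a little-endian uint16 at byte offset 6 per the LAS specification, whose bit 0 is the GPS Time Type flag) and checks that a point cloud’s time range falls within exactly one trajectory:

    import struct

    def global_encoding(las_path):
        """Return the Global Encoding field from a LAS public header.

        The field is a little-endian uint16 at byte offset 6; its bit 0 is the
        GPS Time Type flag (0 = GPS Week Seconds, 1 = Adjusted Standard GPS Time).
        """
        with open(las_path, "rb") as f:
            header = f.read(8)
        if header[:4] != b"LASF":
            raise ValueError(f"{las_path} is not a LAS file")
        return struct.unpack_from("<H", header, 6)[0]

    def belongs_to_one_trajectory(cloud_range, trajectory_ranges):
        """True if the cloud's time range overlaps exactly one trajectory
        and is fully contained in it (all times in GPS Week Seconds)."""
        start, end = cloud_range
        overlaps = [(t0, t1) for (t0, t1) in trajectory_ranges
                    if t0 <= end and start <= t1]
        return len(overlaps) == 1 and overlaps[0][0] <= start <= end <= overlaps[0][1]

    # Illustrative values: one point cloud, two trajectory time ranges.
    assert global_encoding("strip_01.las") == 0, "Global Encoding must be 0"
    assert belongs_to_one_trajectory((120100.0, 121900.0),
                                     [(120000.0, 124500.0), (130000.0, 133800.0)])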

2. Coordinate system consistency between lidar and imagery data.

Reality Studio performs no adjustment or transformation between the two data sources:

  • Lidar point clouds and imagery data must share the same coordinate system:
    • As detailed in the following section, you can define the spatial reference system in several ways during capture session creation.
  • Lidar data must be pre-aligned through strip adjustment before integration with imagery data:
    • The surface described by the lidar point clouds should "fit" well with the surface reconstructed from images.
    • Inconsistencies in surface features (e.g., features missing in one source) or misalignments can lead to unwanted artifacts in the results.
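
If you want to check consistency before import, the CRS stored in the LAS files can be compared with the imagery’s spatial reference, for instance with the open-source laspy and pyproj packages (not part of Reality Studio). A sketch, assuming the files carry CRS VLRs that laspy’s parse_crs() can read and a hypothetical imagery CRS of EPSG:25832:

    # pip install laspy pyproj
    import laspy
    from pyproj import CRS

    imagery_crs = CRS.from_epsg(25832)  # hypothetical imagery spatial reference

    for path in ["strip_01.las", "strip_02.las"]:    # illustrative file names
        with laspy.open(path) as reader:
            crs = reader.header.parse_crs()          # None if no CRS VLR is stored
        if crs is None:
            print(f"{path}: no CRS stored; set the spatial reference manually at import")
        elif not crs.equals(imagery_crs):
            print(f"{path}: CRS differs from imagery: {crs.to_epsg()} vs 25832")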

Best practices for hybrid aerial systems

For optimal results, lidar and imagery data should be acquired at the same time. This can be achieved either by using a hybrid mapping system, such as Leica Geosystems (CityMapper-2S, TerrainMapper-3), RIEGL (VQ-1560 III-S series), or Vexcel/RIEGL (UltraCam Dragon), or by operating both sensor types during the same acquisition flight. Be sure to complete the following steps and checks:

  • Group point clouds and trajectory files in directories per flight, especially for systems that export multiple trajectory files (see the sketch after this list).
  • If flights cannot be separated by time, processing them as a single mission/flight is not technically possible.
  • Always verify time synchronization and spatial coverage between the imagery and lidar sensors before export.
  • Use your sensor vendor’s software to inspect, merge, or split trajectory and point cloud data prior to import, based on the requirements described above.
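
For the grouping step above, one approach is to assign each point cloud to the trajectory whose time range contains it and copy the pair into a per-flight directory. A sketch with illustrative file names and GPS Week Seconds ranges (in practice, take the ranges from your vendor software):

    import shutil
    from pathlib import Path

    # Illustrative time ranges in GPS Week Seconds, e.g. read from vendor software.
    trajectories = {"flight_a.out": (120000.0, 124500.0),
                    "flight_b.out": (130000.0, 133800.0)}
    point_clouds = {"strip_01.las": (120100.0, 121900.0),
                    "strip_02.las": (130050.0, 131200.0)}

    for cloud, (start, end) in point_clouds.items():
        owners = [traj for traj, (t0, t1) in trajectories.items()
                  if t0 <= start and end <= t1]
        if len(owners) != 1:
            print(f"{cloud}: matches {len(owners)} trajectories; fix before import")
            continue
        flight_dir = Path(owners[0]).with_suffix("")   # one directory per flight
        flight_dir.mkdir(exist_ok=True)
        shutil.copy(owners[0], flight_dir)             # trajectory file
        shutil.copy(cloud, flight_dir)                 # its point cloud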

 

Workflow

Import lidar data: capture session creation

To import your lidar data into Reality Studio, you need to create a capture session. A capture session contains all data from a single acquisition mission, which can include:

  • one or more camera sessions (imagery orientation data), and/or
  • one lidar session (trajectory file(s) and point clouds).
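
To make this structure concrete, here is a small conceptual sketch; this is not Reality Studio’s API, and all class and field names are purely illustrative:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class CameraSession:
        orientation_file: str            # imagery orientation data

    @dataclass
    class LidarSession:
        trajectories: list[str]          # SBET (.out) or SOL (.sol) files
        point_clouds: list[str]          # LAS/LAZ files

    @dataclass
    class CaptureSession:
        """All data from a single acquisition mission."""
        camera_sessions: list[CameraSession] = field(default_factory=list)
        lidar_session: Optional[LidarSession] = None   # at most one per capture session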

Imagery and lidar data captured from different missions should be imported into separate capture sessions, based on the scenarios and requirements described below.

Scenario 1: Single Flight/Mission

Description: Create one capture session with:

  • One or more camera sessions
  • One lidar session

Requirements:

  • Same flight/mission date
  • Same spatial reference system for all data
  • Time-synchronized sensors
  • Overlapping coverage area

Scenario 2: Multiple Flights/Missions

Description: Create separate capture sessions if any of the following applies:

  • Different flight dates or acquisition times
  • Different sensor configurations or mounting setups
  • Different spatial reference systems
  • Temporal gaps in the data (non-continuous trajectories)
  • Unaligned sensor timing

Requirements:

  • Separate capture sessions to prevent processing errors

Based on your scenario and dataset, a capture session can contain imagery data, lidar data, or a combination of both. These data types are organized into separate session types within the capture session and must share the same spatial reference system for correct processing. The table below shows how to correctly set the spatial reference for each data type and session configuration.

Data type: Imagery data
Session types: Camera session(s)
Spatial reference configuration: The spatial reference can be provided:
  • indirectly, through the orientation data of the imagery, or
  • directly, by manually setting the horizontal and vertical coordinate systems, based on these guidelines.
Capture session selection: The camera session(s) are automatically added to the capture session.

Data type: Imagery and lidar data
Session types: Camera session(s) and lidar session
Spatial reference configuration: Reality Studio checks whether the horizontal spatial reference systems of the imagery and lidar point clouds match:
  • If they differ, it suggests applying the coordinate system of the newly added data type.
  • If the suggested spatial reference for the lidar data is incorrect, ignore the suggestion.
  • If the data is pre-aligned, it is recommended to use the imagery’s spatial reference for the capture session.
Capture session selection: The lidar session is displayed alongside the camera sessions of the capture session.

Data type: Lidar data
Session types: Lidar session
Spatial reference configuration:
  • The spatial reference is set automatically if it can be parsed from the lidar data; otherwise, it can be specified manually.
  • If corresponding imagery is already in the project, you can select their shared coordinate system from the drop-down menu.
Capture session selection: The lidar session is automatically added to the capture session.
 

Upon capture session creation, the lidar session appears under the Data tab of the Project Tree pane, as part of its capture session. Note that visualization of lidar point clouds and trajectories on the globe, as well as alignment of imagery and lidar data, are currently not supported.

Process lidar data: reconstruction creation

During reconstruction, you can process your lidar data along with imagery data to generate the desired geospatial products: digital surface models (DSMs), True Orthos, DSM meshes, and 3D meshes. Note that 3D point clouds are generated from imagery data only. Before continuing, please consider the following requirements:

  • Processing a lidar session alone is not supported; lidar sessions can be used in a reconstruction only in addition to camera sessions, not stand-alone.
  • All inputs must share the same spatial reference system.
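
As a rough mental model of these two rules, a tiny illustrative check (again, not Reality Studio code):

    def valid_reconstruction_input(num_camera_sessions: int,
                                   num_lidar_sessions: int,
                                   spatial_refs: set[str]) -> bool:
        """Lidar needs imagery alongside it, and all inputs must share one CRS."""
        if num_lidar_sessions > 0 and num_camera_sessions == 0:
            return False                      # lidar-only is not supported
        return len(spatial_refs) == 1         # a single shared spatial reference

    assert not valid_reconstruction_input(0, 1, {"EPSG:25832"})
    assert valid_reconstruction_input(2, 1, {"EPSG:25832"})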

The lidar data can be used in Nadir and Oblique scenarios, combined with any type of geometry or configuration setting. After specifying the scenario, choose the camera and lidar sessions to use as input to the reconstruction. In the Sessions section, you can check a capture session to automatically check all of its camera sessions and its lidar session. The following session combinations are valid as reconstruction inputs:

  1. Imagery-only data: One or more camera sessions of aligned or non-aligned capture sessions.
  2. Imagery and lidar data: One or more lidar sessions together with at least one camera session of aligned or non-aligned capture sessions. Please note:

When using aligned capture sessions, you must select the lidar data from the original non-aligned version of that session. To do this:

  • Disable the “Only show aligned capture sessions” filter to display the corresponding non-aligned capture sessions.
  • Select the lidar session from the non-aligned capture session.
  • Select the camera session(s) from the aligned capture session.

Use the “Only show aligned capture sessions” filter to switch between the different capture sessions and select the lidar session along with the corresponding aligned imagery.

Finally, keep in mind that when editing reconstruction settings, any change made to lidar sessions triggers a full reset of the reconstruction rather than a reprocessing.

 

Summary

Bring your geospatial products to the next level: ArcGIS Reality Studio now supports lidar data integration to significantly enhance the quality of DSMs, True Orthos, and 3D meshes. By combining imagery with lidar from hybrid aerial systems, you can overcome photogrammetry limitations in areas that are reflective, transparent, or occluded.

This blog post walked you through everything you need to prepare and use your lidar data seamlessly in Reality Studio. From supported formats to export tips and integration guidelines, it covered the key steps to unlock the full potential of your hybrid datasets for more precise and visually complete results.

3D Mesh without and with lidar data.
