
Tightly-coupled camera/LiDAR integration for point cloud generation from GNSS/INS-assisted UAV mapping systems

Journal Article · ISPRS Journal of Photogrammetry and Remote Sensing
[1]; [2]; [2]
  1. Purdue Univ., West Lafayette, IN (United States)
  2. Purdue Univ., West Lafayette, IN (United States)

Unmanned aerial vehicles (UAVs) equipped with integrated global navigation satellite systems/inertial navigation systems (GNSS/INS) together with cameras and/or LiDAR sensors are widely used for topographic mapping in applications such as precision agriculture, coastal monitoring, and archaeological documentation. Integrating image-based and LiDAR point clouds can provide a comprehensive 3D model of the area of interest, and for such integration, good alignment between the data from the different sources is critical. Although much work has been done on this topic, there is still a need for a rigorous integration approach that minimizes the discrepancy between camera and LiDAR data caused by inaccurate system calibration parameters and/or trajectory artifacts. This study proposes an automated, tightly-coupled camera/LiDAR integration workflow for GNSS/INS-assisted UAV systems. The proposed strategy proceeds in three main steps. First, an image-based point cloud is generated using a LiDAR/GNSS/INS-assisted structure from motion (SfM) strategy. Then, feature correspondences between the image-based and LiDAR point clouds are automatically identified. Finally, an integrated bundle adjustment including image points, raw LiDAR measurements, and GNSS/INS information is carried out to minimize the discrepancy between point clouds from the different sensors while estimating the system calibration parameters and refining the trajectory information. The proposed SfM strategy and integration framework are evaluated using five datasets. The SfM results show that using LiDAR data facilitates feature matching and increases the number of reconstructed 3D points. The experimental results also show that the developed automated camera/LiDAR integration strategy accurately estimates system calibration parameters and achieves good alignment among camera and LiDAR data from single or multiple systems. Finally, an absolute accuracy in the range of 3–5 cm is achieved for the image/LiDAR point clouds after the integration process.
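
To make the three-step workflow above concrete, the following is a minimal, illustrative Python sketch of the correspondence idea only: pairing points from an image-based (SfM) cloud with their nearest LiDAR neighbors and fitting a single rigid transform to quantify and reduce the inter-cloud discrepancy. This is a simplified stand-in, not the paper's integrated bundle adjustment (which operates on image points, raw LiDAR measurements, and GNSS/INS data); the array names, the 0.5 m correspondence threshold, and the synthetic data are assumptions made for illustration.

# Illustrative sketch only: a simplified stand-in for the correspondence and
# alignment steps described in the abstract, NOT the paper's method. A plain
# rigid (Kabsch/SVD) fit between an image-based and a LiDAR point cloud is
# used to show how inter-cloud discrepancy can be quantified and reduced.
import numpy as np
from scipy.spatial import cKDTree


def nearest_neighbor_pairs(image_pts, lidar_pts, max_dist=0.5):
    """Pair each image-based point with its closest LiDAR point (metres)."""
    tree = cKDTree(lidar_pts)
    dist, idx = tree.query(image_pts)
    keep = dist < max_dist                      # reject weak correspondences
    return image_pts[keep], lidar_pts[idx[keep]], dist[keep]


def rigid_fit(src, dst):
    """Least-squares rotation R and translation t with R @ src + t ≈ dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lidar = rng.uniform(0, 50, size=(5000, 3))        # synthetic LiDAR cloud
    image = lidar + np.array([0.10, -0.05, 0.03])     # image cloud with offset

    src, dst, d = nearest_neighbor_pairs(image, lidar)
    print(f"RMS discrepancy before alignment: {np.sqrt(np.mean(d**2)):.3f} m")

    R, t = rigid_fit(src, dst)
    aligned = (R @ image.T).T + t
    _, _, d2 = nearest_neighbor_pairs(aligned, lidar)
    print(f"RMS discrepancy after alignment:  {np.sqrt(np.mean(d2**2)):.3f} m")

A single rigid fit like this can only absorb a constant offset between the two clouds; residual, time-varying discrepancies are what motivate re-estimating the system calibration parameters and the trajectory inside the integrated bundle adjustment described in the abstract.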

Research Organization:
Purdue Univ., West Lafayette, IN (United States)
Sponsoring Organization:
USDOE Advanced Research Projects Agency - Energy (ARPA-E)
Grant/Contract Number:
AR0001135
OSTI ID:
1977227
Alternate ID(s):
OSTI ID: 1819466
Journal Information:
ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 180, Issue C; ISSN 0924-2716
Publisher:
Elsevier
Country of Publication:
United States
Language:
English
