Automatic targetless camera–LIDAR calibration by aligning edge with Gaussian mixture model

Jaehyeon Kang, Nakju Doh

Research output: Contribution to journal › Article

Abstract

This paper presents a calibration algorithm that does not require an artificial target object to precisely estimate a rigid-body transformation between a camera and a light detection and ranging (LIDAR) sensor. The proposed algorithm estimates calibration parameters by minimizing a cost function that evaluates the edge alignment between two sensor measurements. In particular, the proposed cost function is constructed using a projection model-based many-to-many correspondence of the edges to fully exploit measurements with different densities (dense photometry and sparse geometry). The alignment of the many-to-many correspondence is represented using the Gaussian mixture model (GMM) framework. Here, each component of the GMM, including weight, displacement, and standard deviation, is derived to suitably capture the intensity, location, and influential range of the edge measurements, respectively. The derived cost function is optimized by the gradient descent method with an analytical derivative. A coarse-to-fine scheme is also applied by gradually decreasing the standard deviation of the GMM to enhance the robustness of the algorithm. Extensive indoor and outdoor experiments validate the claim that the proposed GMM strategy improves the performance of the proposed algorithm. The experimental results also show that the proposed algorithm outperforms previous methods in terms of precision and accuracy by providing calibration parameters of standard deviations less than 0.6° and 2.1 cm with a reprojection error of 1.78 for a 2.1-megapixel image (2,048 × 1,024) in the best case.
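The core idea in the abstract — score the projected LIDAR edge points under a Gaussian mixture whose components sit on image edge pixels (weight = edge intensity, mean = pixel location, standard deviation = influence range), then anneal the standard deviation coarse-to-fine — can be sketched as follows. This is a minimal illustrative sketch on synthetic data, not the authors' implementation; all names and the data are hypothetical.

```python
import numpy as np

def gmm_edge_score(lidar_px, edge_px, edge_w, sigma):
    """Many-to-many edge alignment score under a Gaussian mixture.

    lidar_px : (N, 2) projected LIDAR edge points (pixel coords)
    edge_px  : (M, 2) image edge pixel locations (GMM means)
    edge_w   : (M,)   image edge intensities, used as mixture weights
    sigma    : float  shared standard deviation (influence range)
    """
    w = edge_w / edge_w.sum()            # normalize mixture weights
    # Squared pixel distance between every LIDAR point and every edge pixel.
    d2 = ((lidar_px[:, None, :] - edge_px[None, :, :]) ** 2).sum(-1)  # (N, M)
    comp = w[None, :] * np.exp(-d2 / (2.0 * sigma ** 2))
    return comp.sum(axis=1).mean()       # higher = better edge alignment

# Synthetic data: image edge pixels with intensities, and LIDAR edge points
# projected near a subset of them (i.e., a roughly correct calibration).
rng = np.random.default_rng(0)
edge_px = rng.uniform(0, 100, size=(50, 2))
edge_w = rng.uniform(0.5, 1.0, size=50)
lidar_px = edge_px[:20] + rng.normal(0, 1.0, size=(20, 2))

# Coarse-to-fine: a wide sigma gives a smooth, wide basin of attraction;
# shrinking it sharpens the score around the true alignment.
for sigma in (16.0, 8.0, 4.0, 2.0):
    print(f"sigma={sigma:4.1f}  score={gmm_edge_score(lidar_px, edge_px, edge_w, sigma):.4f}")
```

In the paper this score is turned into a cost minimized by gradient descent with an analytical derivative; the sketch only shows the mixture evaluation and the sigma schedule.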

Original language: English
Journal: Journal of Field Robotics
DOI: 10.1002/rob.21893
Publication status: Published - 2019 Jan 1

Fingerprint

  • Calibration
  • Cost functions
  • Photometry
  • Sensors
  • Cameras
  • Derivatives
  • Geometry
  • Experiments

Keywords

  • calibration
  • perception
  • sensor networks
  • sensors

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications

Cite this

Automatic targetless camera–LIDAR calibration by aligning edge with Gaussian mixture model. / Kang, Jaehyeon; Doh, Nakju.

In: Journal of Field Robotics, 01.01.2019.

Research output: Contribution to journal › Article

@article{2818b2f7e1774b0caeeef5074a63a849,
title = "Automatic targetless camera–LIDAR calibration by aligning edge with Gaussian mixture model",
abstract = "This paper presents a calibration algorithm that does not require an artificial target object to precisely estimate a rigid-body transformation between a camera and a light detection and ranging (LIDAR) sensor. The proposed algorithm estimates calibration parameters by minimizing a cost function that evaluates the edge alignment between two sensor measurements. In particular, the proposed cost function is constructed using a projection model-based many-to-many correspondence of the edges to fully exploit measurements with different densities (dense photometry and sparse geometry). The alignment of the many-to-many correspondence is represented using the Gaussian mixture model (GMM) framework. Here, each component of the GMM, including weight, displacement, and standard deviation, is derived to suitably capture the intensity, location, and influential range of the edge measurements, respectively. The derived cost function is optimized by the gradient descent method with an analytical derivative. A coarse-to-fine scheme is also applied by gradually decreasing the standard deviation of the GMM to enhance the robustness of the algorithm. Extensive indoor and outdoor experiments validate the claim that the proposed GMM strategy improves the performance of the proposed algorithm. The experimental results also show that the proposed algorithm outperforms previous methods in terms of precision and accuracy by providing calibration parameters of standard deviations less than 0.6° and 2.1 cm with a reprojection error of 1.78 for a 2.1-megapixel image (2,048 × 1,024) in the best case.",
keywords = "calibration, perception, sensor networks, sensors",
author = "Jaehyeon Kang and Nakju Doh",
year = "2019",
month = "1",
day = "1",
doi = "10.1002/rob.21893",
language = "English",
journal = "Journal of Field Robotics",
issn = "1556-4959",
publisher = "John Wiley and Sons Inc.",

}

TY - JOUR

T1 - Automatic targetless camera–LIDAR calibration by aligning edge with Gaussian mixture model

AU - Kang, Jaehyeon

AU - Doh, Nakju

PY - 2019/1/1

Y1 - 2019/1/1

AB - This paper presents a calibration algorithm that does not require an artificial target object to precisely estimate a rigid-body transformation between a camera and a light detection and ranging (LIDAR) sensor. The proposed algorithm estimates calibration parameters by minimizing a cost function that evaluates the edge alignment between two sensor measurements. In particular, the proposed cost function is constructed using a projection model-based many-to-many correspondence of the edges to fully exploit measurements with different densities (dense photometry and sparse geometry). The alignment of the many-to-many correspondence is represented using the Gaussian mixture model (GMM) framework. Here, each component of the GMM, including weight, displacement, and standard deviation, is derived to suitably capture the intensity, location, and influential range of the edge measurements, respectively. The derived cost function is optimized by the gradient descent method with an analytical derivative. A coarse-to-fine scheme is also applied by gradually decreasing the standard deviation of the GMM to enhance the robustness of the algorithm. Extensive indoor and outdoor experiments validate the claim that the proposed GMM strategy improves the performance of the proposed algorithm. The experimental results also show that the proposed algorithm outperforms previous methods in terms of precision and accuracy by providing calibration parameters of standard deviations less than 0.6° and 2.1 cm with a reprojection error of 1.78 for a 2.1-megapixel image (2,048 × 1,024) in the best case.

KW - calibration

KW - perception

KW - sensor networks

KW - sensors

UR - http://www.scopus.com/inward/record.url?scp=85070297675&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85070297675&partnerID=8YFLogxK

U2 - 10.1002/rob.21893

DO - 10.1002/rob.21893

M3 - Article

JO - Journal of Field Robotics

JF - Journal of Field Robotics

SN - 1556-4959

ER -