Enhancing Object Estimation by Camera-LiDAR Sensor Fusion Using IMM-KF With Error Characteristics in Autonomous Robot Systems

Bibliographic Details
Main Authors: Sun Ho Lee, Woo Young Choi
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10813161/
Description
Summary: In autonomous robot systems, accurate object recognition and estimation are crucial for reliable performance, and sensor fusion techniques that combine complementary sensors have proven effective in achieving this. This paper proposes a Light Detection and Ranging (LiDAR) and camera sensor fusion method to improve object recognition and estimation performance in autonomous robot systems. We first calibrate the camera and LiDAR sensors to implement the proposed method. Data association is then performed between the LiDAR sensor's measurements and the object bounding boxes detected by the camera sensor with a deep learning algorithm. To improve recognition and estimation performance, we identify the limitations of single-sensor recognition and measurement and establish the measurement noise covariance of each sensor through an analysis of its distance measurement errors. We then apply an Interacting Multiple Model (IMM)-Kalman Filter (KF) that incorporates the pre-analyzed error characteristics. The usefulness of the proposed method is validated through scenario-based experiments. Experimental results show that the proposed sensor fusion method significantly improves the accuracy of object estimation and extends the sensors' Field of View (FoV) compared with conventional methods.
ISSN: 2169-3536
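
The summary above describes a pipeline in which per-sensor distance-error analysis yields distinct measurement noise covariances for the camera and the LiDAR, and these covariances are then used inside an IMM-KF. The following is a minimal, hypothetical Python sketch of a single IMM-KF cycle under assumed models; it is not the authors' implementation. The two motion models, the model-switching matrix PI, the range-only measurement matrix H, and the covariance values R_CAMERA and R_LIDAR are illustrative placeholders standing in for the paper's pre-analyzed error characteristics.

# Illustrative IMM-KF sketch (not the paper's actual code or parameters).
import numpy as np

DT = 0.1  # assumed sampling period [s]

# Model 1: constant velocity, state [range, range rate] (assumed)
F_CV = np.array([[1.0, DT],
                 [0.0, 1.0]])
Q_CV = 0.01 * np.eye(2)

# Model 2: nearly static model with larger process noise (assumed)
F_ST = np.eye(2)
Q_ST = 0.1 * np.eye(2)

H = np.array([[1.0, 0.0]])      # both sensors treated as range-only (assumed)
R_CAMERA = np.array([[0.25]])   # placeholder camera distance-error variance
R_LIDAR = np.array([[0.01]])    # placeholder LiDAR distance-error variance
PI = np.array([[0.95, 0.05],    # assumed Markov model-switching probabilities
               [0.05, 0.95]])

def kf_step(x, P, z, F, Q, R):
    """One linear Kalman filter predict/update cycle; returns the posterior
    state, covariance, and the Gaussian likelihood of the measurement."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    lik = float(np.exp(-0.5 * y.T @ np.linalg.inv(S) @ y)
                / np.sqrt(np.linalg.det(2.0 * np.pi * S)))
    return x_new, P_new, lik

def imm_step(xs, Ps, mu, z, R):
    """One IMM cycle: mix model-conditioned estimates, run each model's KF
    with the sensor-specific R, update model probabilities, and combine."""
    models = [(F_CV, Q_CV), (F_ST, Q_ST)]
    c = PI.T @ mu                           # predicted model probabilities
    xs_mix, Ps_mix = [], []
    for j in range(2):
        w = PI[:, j] * mu / c[j]
        x_mix = sum(w[i] * xs[i] for i in range(2))
        P_mix = sum(w[i] * (Ps[i] + np.outer(xs[i] - x_mix, xs[i] - x_mix))
                    for i in range(2))
        xs_mix.append(x_mix)
        Ps_mix.append(P_mix)
    liks = np.zeros(2)
    for j, (F, Q) in enumerate(models):
        xs[j], Ps[j], liks[j] = kf_step(xs_mix[j], Ps_mix[j], z, F, Q, R)
    mu = c * liks
    mu /= mu.sum()                          # posterior model probabilities
    x_comb = sum(mu[j] * xs[j] for j in range(2))
    P_comb = sum(mu[j] * (Ps[j] + np.outer(xs[j] - x_comb, xs[j] - x_comb))
                 for j in range(2))
    return xs, Ps, mu, x_comb, P_comb

# Usage: fuse one LiDAR range measurement of 5.2 m, then print the fused
# estimate and the updated model probabilities. A camera measurement would
# be processed the same way, but with R_CAMERA instead of R_LIDAR.
xs = [np.array([5.0, 0.0]), np.array([5.0, 0.0])]
Ps = [np.eye(2), np.eye(2)]
mu = np.array([0.5, 0.5])
xs, Ps, mu, x_est, P_est = imm_step(xs, Ps, mu, np.array([5.2]), R_LIDAR)
print(x_est, mu)

Passing a sensor-specific R into each update is the point of the sketch: because the LiDAR's placeholder variance is much smaller than the camera's, LiDAR measurements pull the fused estimate more strongly, which mirrors the paper's idea of weighting sensors by their analyzed distance-error characteristics.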