A method based on visual saliency for vehicle-mounted monocular camera ego-motion estimation and vehicle scale estimation


Bibliographic Details
Main Authors: Mingxin AI, Tie LIU, Jing WANG, Jiali DING, Zejian YUAN, Yuanyuan SHANG
Format: Article
Language: Chinese (zho)
Published: POSTS&TELECOM PRESS Co., LTD 2021-09-01
Series: 智能科学与技术学报 (Chinese Journal of Intelligent Science and Technology)
Online Access:http://www.cjist.com.cn/thesisDetails#10.11959/j.issn.2096-6652.202129
Summary: A method based on visual saliency was proposed for ego-motion estimation of a vehicle-mounted monocular camera and scale estimation of the vehicle ahead. First, for camera ego-motion estimation, a visual saliency computation was used to detect and remove moving objects in a noisy monocular image sequence. Accounting for both textured and smooth image regions, a weighted saliency map was used to retain useful feature points, improving the robustness of the ego-motion estimate. Second, the distance to the vehicle ahead was recast as a vehicle scale estimation problem, obtained by minimizing a loss function that combines descriptor matching with a Lie-algebra regularization term on the matching. A visual attention mechanism was used to select unoccluded, textured image blocks, and the pixels within each block were weighted to mitigate the influence of noise-corrupted pixels, yielding robust and accurate scale estimation. Finally, the proposed method was analyzed and verified on several challenging datasets. The results show that the monocular ego-motion estimation method reaches the accuracy of stereo-camera-based methods, and that the vehicle scale estimation method maintains prediction accuracy while offering strong robustness.
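The first step of the abstract (removing moving objects and keeping saliency-weighted feature points) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the threshold parameter, and the specific weighting scheme (one minus saliency) are assumptions introduced here for clarity.

```python
import numpy as np

def filter_keypoints_by_saliency(keypoints, saliency, motion_mask, thresh=0.3):
    """Retain keypoints on static, low-saliency regions for ego-motion estimation.

    keypoints   : (N, 2) integer array of (row, col) pixel coordinates
    saliency    : (H, W) float map in [0, 1]; high values mark likely moving objects
    motion_mask : (H, W) bool map, True where a moving object was detected
    thresh      : saliency level above which a point is treated as dynamic (assumed)
    """
    rows, cols = keypoints[:, 0], keypoints[:, 1]
    on_moving = motion_mask[rows, cols]          # drop points on detected movers
    too_salient = saliency[rows, cols] > thresh  # drop points in highly salient areas
    keep = ~(on_moving | too_salient)
    # Remaining points get a weight that down-weights residual saliency,
    # so smooth static background still contributes to the motion estimate.
    weights = 1.0 - saliency[rows, cols]
    return keypoints[keep], weights[keep]
```

The retained points and weights would then feed a weighted pose solver; the weighting lets low-texture but static regions still constrain the camera motion, which is the robustness property the abstract claims.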
ISSN:2096-6652