Hybrid adaptive method for lane detection of degraded road surface condition
| Main Author: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Springer, 2022-09-01 |
| Series: | Journal of King Saud University: Computer and Information Sciences |
| Subjects: | |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S1319157822002038 |
| Summary: | Lane detection on roads is essential for autonomous vehicles. Most previous studies detected the area of the road and all possible lanes built on it, whereas only the specific lane that the car is currently traveling on should be detected. In addition, these methods are complex, slow, fail under degraded road conditions, and cannot be generalized to different scenarios. Moreover, they require costly hardware and give insufficient consideration to road conditions. In this study, the ego lane, which is the lane a car is currently traveling on, is detected. This study proposes an adaptive hybrid lane detection method that adopts the advantages of traditional vision-based and machine-learning-based approaches. The proposed method applies a set of preprocessing steps to obtain the candidate lane borders from an image in any degraded state. Subsequently, numerical features describing the candidate lane boundaries are extracted and used by a model combining the k-nearest neighbor algorithm and a Gaussian process for final lane discovery. A set of experiments was conducted on the KITTI dataset to evaluate the performance. The results show that the proposed method overcomes various challenges: it is relatively simple and fast, requires only low-cost hardware and processing, and can be generalized without major modifications. |
| ISSN: | 1319-1578 |
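
The summary above describes the method only at a high level: preprocessing to obtain candidate lane borders, numerical features over those borders, and a k-nearest-neighbor / Gaussian-process model for the final ego-lane decision. The sketch below is a minimal, assumption-laden illustration of such a pipeline in Python with OpenCV and scikit-learn; the edge-and-Hough preprocessing, the specific feature set, and every function name here are placeholders chosen for illustration, not the authors' implementation.

```python
# Hedged sketch of the pipeline the summary describes: candidate lane borders
# from preprocessing, hand-crafted numerical features, and a k-NN or
# Gaussian-process classifier that keeps only the ego-lane borders.
# All names and feature choices are illustrative assumptions.
import numpy as np
import cv2
from sklearn.neighbors import KNeighborsClassifier
from sklearn.gaussian_process import GaussianProcessClassifier


def candidate_borders(bgr_image):
    """Return candidate lane-border segments as an (N, 4) array of
    (x1, y1, x2, y2), using a generic edge + Hough pipeline."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=40, minLineLength=40, maxLineGap=20)
    if segments is None:
        return np.empty((0, 4))
    return segments.reshape(-1, 4).astype(float)


def border_features(segment, image_width):
    """Simple numerical features for one candidate border (an assumed set):
    slope, segment length, and horizontal offset from the image centre."""
    x1, y1, x2, y2 = segment
    slope = (y2 - y1) / (x2 - x1 + 1e-6)
    length = np.hypot(x2 - x1, y2 - y1)
    offset = ((x1 + x2) / 2.0 - image_width / 2.0) / image_width
    return np.array([slope, length, offset])


def fit_ego_lane_model(feature_rows, labels, use_gp=False):
    """Fit a classifier that marks candidates as ego-lane borders (label 1)
    or not (label 0); training data is assumed to come from annotated
    frames, e.g. KITTI."""
    model = (GaussianProcessClassifier() if use_gp
             else KNeighborsClassifier(n_neighbors=5))
    model.fit(np.asarray(feature_rows), np.asarray(labels))
    return model


def detect_ego_lane(bgr_image, model):
    """Keep only the candidate borders the model classifies as ego-lane."""
    segments = candidate_borders(bgr_image)
    if len(segments) == 0:
        return segments
    feats = np.array([border_features(s, bgr_image.shape[1]) for s in segments])
    keep = model.predict(feats) == 1
    return segments[keep]
```

In use, the classifier would first be fit on features extracted from annotated frames (the summary mentions the KITTI dataset) before `detect_ego_lane` is applied to new images.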