Robust Corner Detection Using Local Extrema Differences


Bibliographic Details
Main Authors: Reza Yazdi, Hassan Khotanlou, Hosna Khademfar
Format: Article
Language: English
Published: University of Science and Culture, 2024-01-01
Series: International Journal of Web Research
Subjects:
Online Access: https://ijwr.usc.ac.ir/article_200313_e8e2ef03ef36eb3261087bf4d7811d3f.pdf
Description
Summary: Corner detection, crucial for many computer vision tasks due to corners' distinct structural properties, often relies on traditional intensity-based detectors developed before 2000. This paper introduces a novel intensity-based corner detector that surpasses existing methods by analyzing only the pixel intensities within a 3×3 neighborhood. Our approach leverages a corner response function derived from intensity sorting and difference calculations. We conduct a comprehensive evaluation comparing our detector to seven established algorithms on five benchmark images with ground-truth corner locations, measuring detection accuracy and localization error under varying noise levels; repeatability under transformations and degradations is assessed on a further 28 diverse images without ground-truth data. Experimental results show a 3% improvement in overall performance over the compared methods: the proposed detector localizes corners more accurately and reduces both missed detections and false positives. Furthermore, requiring only one tunable parameter, it is computationally efficient and suited to real-time processing. The generated corner response map also holds promise for integration with deep learning architectures, opening possibilities for further exploration.
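To illustrate the general idea of a corner response built from sorting the intensities of a 3×3 neighborhood and taking differences, here is a minimal sketch in Python/NumPy. The response function below (absolute difference between the center pixel and the median of its eight sorted neighbors) is a hypothetical stand-in chosen for illustration; it is not the function proposed in the paper.

```python
import numpy as np

def corner_response(img):
    """Toy corner-response map from a 3x3 neighborhood.

    Hypothetical response: |centre - median of its 8 sorted neighbours|.
    On a flat region or a straight edge the centre agrees with the
    neighbourhood median, so the response is zero; at a convex corner
    the centre disagrees with the majority, so the response is large.
    (This is only an illustration of the sorting-and-differences idea,
    not the paper's actual corner response function.)
    """
    img = img.astype(float)
    h, w = img.shape
    resp = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2].ravel()
            nbrs = np.sort(np.delete(patch, 4))      # 8 sorted neighbours
            median = 0.5 * (nbrs[3] + nbrs[4])       # median of 8 values
            resp[y, x] = abs(patch[4] - median)      # large jump -> corner-like
    return resp

# 9x9 test image: a white 5x5 square in the top-left corner
img = np.zeros((9, 9))
img[:5, :5] = 255
r = corner_response(img)
# The inner corner pixel (4, 4) fires; edge and flat pixels do not.
```

Note that such a single-pixel criterion also fires on isolated noise pixels, which is why practical detectors of this kind typically add a threshold and non-maximum suppression over the response map.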
ISSN:2645-4343