An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models
Drones are extensively utilized in both military and civilian domains. Eliminating the reliance of drone positioning systems on GNSS while enhancing positioning accuracy is of significant research value. This paper presents a novel approach that employs a real-scene 3D m...
Main Authors: | Yongqiang Cui, Xue Gao, Rui Yu, Xi Chen, Dingwen Wang, Di Bai |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2025-01-01 |
Series: | Sensors |
Subjects: | real-scene 3D model; autonomous positioning; image matching; three-dimensional reconstruction |
Online Access: | https://www.mdpi.com/1424-8220/25/1/209 |
_version_ | 1841548916075528192 |
author | Yongqiang Cui, Xue Gao, Rui Yu, Xi Chen, Dingwen Wang, Di Bai |
author_facet | Yongqiang Cui, Xue Gao, Rui Yu, Xi Chen, Dingwen Wang, Di Bai |
author_sort | Yongqiang Cui |
collection | DOAJ |
description | Drones are extensively utilized in both military and civilian domains. Eliminating the reliance of drone positioning systems on GNSS while enhancing positioning accuracy is of significant research value. This paper presents a novel approach that employs a real-scene 3D model and image-based point cloud reconstruction for the autonomous positioning of drones, attaining high positioning accuracy. First, the real-scene 3D model constructed in this paper is segmented according to a predetermined format to obtain an image dataset and a 3D point cloud dataset. Subsequently, real-time images are captured by the monocular camera mounted on the drone; a preliminary position estimate is obtained through image matching, and a 3D point cloud is reconstructed from the acquired images. Next, the corresponding real-scene 3D point cloud data are extracted from the point cloud dataset according to the image-matching results. Finally, the reconstructed point cloud is registered against the real-scene 3D point cloud, and the drone's positioning coordinates are obtained by applying a pose estimation algorithm. Experimental results demonstrate that the proposed approach enables precise autonomous positioning of drones in complex urban environments, achieving a positioning accuracy of up to 0.4 m. |
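The final step of the pipeline summarized above, registering the point cloud reconstructed from the drone's images against the real-scene 3D point cloud and reading the drone's coordinates from the resulting pose, can be illustrated with a minimal sketch. The paper does not state which libraries or algorithms it uses; the sketch below assumes the Open3D library, hypothetical input files, and point-to-point ICP as a stand-in for the registration and pose-estimation stage.

```python
# Minimal sketch of the registration/pose-estimation step described in the
# abstract: align a point cloud reconstructed from drone imagery with the
# georeferenced real-scene 3D point cloud, then read the drone position from
# the resulting rigid transform. Open3D, the file names, and the ICP settings
# are assumptions, not the authors' implementation.
import numpy as np
import open3d as o3d

# Hypothetical inputs: the cloud reconstructed from the drone's monocular
# images and the matching tile of the real-scene 3D point cloud dataset.
reconstructed = o3d.io.read_point_cloud("reconstructed_from_images.ply")
real_scene = o3d.io.read_point_cloud("real_scene_tile.ply")

# Coarse initial guess, e.g. derived from the image-matching stage
# (identity used here only as a placeholder).
init_transform = np.eye(4)

# Point-to-point ICP refines the rigid transform that maps the reconstructed
# cloud into the real-scene (georeferenced) coordinate frame.
result = o3d.pipelines.registration.registration_icp(
    reconstructed,
    real_scene,
    max_correspondence_distance=1.0,  # metres; assumed tolerance
    init=init_transform,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

T = result.transformation  # 4x4 transform: reconstructed frame -> real scene

# If the reconstructed cloud is expressed in the drone camera's frame, the
# camera (drone) position in scene coordinates is the transformed origin.
drone_position = T[:3, 3]
print("Estimated drone position (scene coordinates):", drone_position)
print("Registration fitness:", result.fitness)
```

Because monocular reconstruction is only defined up to scale, a real pipeline would likely need a scaled or globally initialized registration before this refinement; the sketch only shows where the drone coordinates come from once the two clouds are aligned.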
format | Article |
id | doaj-art-4b75c57a4e8a4b2a804985417a4322e5 |
institution | Kabale University |
issn | 1424-8220 |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj-art-4b75c57a4e8a4b2a804985417a4322e52025-01-10T13:21:14ZengMDPI AGSensors1424-82202025-01-0125120910.3390/s25010209An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D ModelsYongqiang Cui0Xue Gao1Rui Yu2Xi Chen3Dingwen Wang4Di Bai5College of Electronics and Information Engineering, South-Central Minzu University, Wuhan 430074, ChinaCollege of Electronics and Information Engineering, South-Central Minzu University, Wuhan 430074, ChinaCollege of Electronics and Information Engineering, South-Central Minzu University, Wuhan 430074, ChinaSchool of Computer Science, Wuhan University, Wuhan 430074, ChinaSchool of Computer Science, Wuhan University, Wuhan 430074, ChinaCollege of Electronics and Information Engineering, South-Central Minzu University, Wuhan 430074, ChinaDrones are extensively utilized in both military and social development processes. Eliminating the reliance of drone positioning systems on GNSS and enhancing the accuracy of the positioning systems is of significant research value. This paper presents a novel approach that employs a real-scene 3D model and image point cloud reconstruction technology for the autonomous positioning of drones and attains high positioning accuracy. Firstly, the real-scene 3D model constructed in this paper is segmented in accordance with the predetermined format to obtain the image dataset and the 3D point cloud dataset. Subsequently, real-time image capture is performed using the monocular camera mounted on the drone, followed by a preliminary position estimation conducted through image matching algorithms and subsequent 3D point cloud reconstruction utilizing the acquired images. Next, the corresponding real-scene 3D point cloud data within the point cloud dataset is extracted in accordance with the image-matching results. Finally, the point cloud data obtained through image reconstruction is matched with the 3D point cloud of the real scene, and the positioning coordinates of the drone are acquired by applying the pose estimation algorithm. The experimental results demonstrate that the proposed approach in this paper enables precise autonomous positioning of drones in complex urban environments, achieving a remarkable positioning accuracy of up to 0.4 m.https://www.mdpi.com/1424-8220/25/1/209real-scene 3D modelautonomous positioningimage matchingthree-dimensional reconstruction |
spellingShingle | Yongqiang Cui Xue Gao Rui Yu Xi Chen Dingwen Wang Di Bai An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models Sensors real-scene 3D model autonomous positioning image matching three-dimensional reconstruction |
title | An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models |
title_full | An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models |
title_fullStr | An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models |
title_full_unstemmed | An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models |
title_short | An Autonomous Positioning Method for Drones in GNSS Denial Scenarios Driven by Real-Scene 3D Models |
title_sort | autonomous positioning method for drones in gnss denial scenarios driven by real scene 3d models |
topic | real-scene 3D model; autonomous positioning; image matching; three-dimensional reconstruction |
url | https://www.mdpi.com/1424-8220/25/1/209 |
work_keys_str_mv | AT yongqiangcui anautonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT xuegao anautonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT ruiyu anautonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT xichen anautonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT dingwenwang anautonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT dibai anautonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT yongqiangcui autonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT xuegao autonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT ruiyu autonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT xichen autonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT dingwenwang autonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels AT dibai autonomouspositioningmethodfordronesingnssdenialscenariosdrivenbyrealscene3dmodels |