Enhancing drone-based fire detection with flame-specific attention and optimized feature fusion
Recent advancements in drone-based fire early warning technologies have significantly improved fire detection, particularly in remote and forested areas where drones are widely utilized. However, the constrained battery life and limited computational resources of drones present challenges for real-time fire detection. To address this, the authors propose a Flame-Specific Attention (FSA) mechanism combined with a Focal Modulation feature-fusion Neck and a variable multi-attention detection head, reporting a mean Average Precision (mAP@0.5) of 87.7% at 182 FPS on their Comprehensive Fire Scene Dataset.
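The headline figure above, mAP@0.5, is the mean Average Precision computed with detections matched to ground truth at an Intersection-over-Union (IoU) threshold of 0.5. For readers unfamiliar with the metric, below is a minimal, self-contained Python sketch of a single-class AP@0.5 computation; it is not the authors' evaluation code, and the function names, data layout, and 101-point interpolation are illustrative assumptions.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-Union of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def ap_at_05(predictions, ground_truths):
    """AP@0.5 for a single class.

    predictions   : list of (image_id, confidence, box), box = [x1, y1, x2, y2]
    ground_truths : dict mapping image_id -> list of ground-truth boxes
    """
    n_gt = sum(len(boxes) for boxes in ground_truths.values())
    matched = {img: [False] * len(boxes) for img, boxes in ground_truths.items()}
    tp, fp = [], []
    # Greedily match detections to ground truth in order of decreasing confidence.
    for img_id, _, box in sorted(predictions, key=lambda p: -p[1]):
        best_iou, best_j = 0.0, -1
        for j, gt in enumerate(ground_truths.get(img_id, [])):
            overlap = iou(box, gt)
            if overlap > best_iou:
                best_iou, best_j = overlap, j
        if best_iou >= 0.5 and not matched[img_id][best_j]:
            matched[img_id][best_j] = True          # true positive
            tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)              # duplicate or low-IoU detection
    tp, fp = np.cumsum(tp), np.cumsum(fp)
    recall = tp / max(n_gt, 1)
    precision = tp / np.maximum(tp + fp, 1e-9)
    # 101-point interpolated average precision (COCO-style).
    return float(np.mean([precision[recall >= r].max() if np.any(recall >= r) else 0.0
                          for r in np.linspace(0.0, 1.0, 101)]))
```

For a multi-class detector, the same procedure is repeated per class and the per-class APs are averaged to obtain mAP@0.5; stricter variants such as mAP@0.5:0.95 additionally average the result over a range of IoU thresholds.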
| Main Authors: | Qiang Wang, Shiyu Guan, Shuchang Lyu, Guangliang Cheng |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-08-01 |
| Series: | International Journal of Applied Earth Observations and Geoinformation |
| Subjects: | Fire detection; Flame-specific attention mechanism; Comprehensive Fire Scene Dataset |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S1569843225003024 |
| Field | Value |
|---|---|
| author | Qiang Wang, Shiyu Guan, Shuchang Lyu, Guangliang Cheng |
| collection | DOAJ |
| description | Recent advancements in drone-based fire early warning technologies have significantly improved fire detection, particularly in remote and forested areas where drones are widely utilized. However, the constrained battery life and limited computational resources of drones present challenges for real-time fire detection. Existing methods primarily focus on fire target identification without considering the distinct color and thermal characteristics of flames, leading to suboptimal detection accuracy. To address these issues, we propose a Flame-Specific Attention (FSA) mechanism, which integrates heat conduction principles and flame shape features to enhance receptive field expansion while maintaining computational efficiency. Furthermore, the Neck of the model is optimized with a Focal Modulation module to improve feature fusion, and a variable multi-attention detection head is introduced to refine detection precision. Experimental results on our Comprehensive Fire Scene Dataset (containing 3,905 images) demonstrate that our model achieves a mean Average Precision (mAP@0.5) of 87.7%, surpassing both Vision Transformers (ViTs) and traditional CNN approaches. Compared to the YOLOv10 baseline, our approach improves precision by 5.7% while maintaining an inference speed of 182 FPS, enabling real-time deployment in edge-computing scenarios such as drone-based fire detection. Additionally, the model effectively detects small- and medium-sized flames, reducing false positives in challenging lighting conditions (e.g., sunset and urban illumination). These enhancements make our approach highly suitable for early fire warning applications in forest and urban environments. |
| format | Article |
| id | doaj-art-f80d8f6544f94be1881f91a6e5f2d7d4 |
| institution | Kabale University |
| issn | 1569-8432 |
| language | English |
| publishDate | 2025-08-01 |
| publisher | Elsevier |
| record_format | Article |
| series | International Journal of Applied Earth Observations and Geoinformation |
| doi | 10.1016/j.jag.2025.104655 |
| citation | International Journal of Applied Earth Observations and Geoinformation 142 (2025) 104655 |
| author affiliations | Qiang Wang: Department of Electrics and Information Engineering, Beihang University, 37 Xueyuan Road, Haidian District, Beijing, 100191, China; UAV Industry Academy, Chengdu Aeronautic Polytechnic, No. 699, East 7th Road, Checheng, Chengdu, 610100, Sichuan, China. Shiyu Guan: College of Electronic Information and Artificial Intelligence, Shaanxi University of Science and Technology, 6 Xuefu Zhonglu, Weiyang District, Xian, 710016, Shaanxi, China. Shuchang Lyu (corresponding author): Department of Electrics and Information Engineering, Beihang University, 37 Xueyuan Road, Haidian District, Beijing, 100191, China. Guangliang Cheng: Department of Computer Science, University of Liverpool, Liverpool, L69 3BX, UK |
| title | Enhancing drone-based fire detection with flame-specific attention and optimized feature fusion |
| topic | Fire detection; Flame-specific attention mechanism; Comprehensive Fire Scene Dataset |
| url | http://www.sciencedirect.com/science/article/pii/S1569843225003024 |