Enhancing drone-based fire detection with flame-specific attention and optimized feature fusion
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-08-01 |
| Series: | International Journal of Applied Earth Observations and Geoinformation |
| Subjects: | |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S1569843225003024 |
| Summary: | Recent advancements in drone-based fire early warning technologies have significantly improved fire detection, particularly in remote and forested areas where drones are widely utilized. However, the constrained battery life and limited computational resources of drones present challenges for real-time fire detection. Existing methods primarily focus on fire target identification without considering the distinct color and thermal characteristics of flames, leading to suboptimal detection accuracy. To address these issues, we propose a Flame-Specific Attention (FSA) mechanism, which integrates heat conduction principles and flame shape features to enhance receptive field expansion while maintaining computational efficiency. Furthermore, the Neck of the model is optimized with a Focal Modulation module to improve feature fusion, and a variable multi-attention detection head is introduced to refine detection precision. Experimental results on our Comprehensive Fire Scene Dataset (containing 3,905 images) demonstrate that our model achieves a mean Average Precision (mAP@0.5) of 87.7%, surpassing both Vision Transformers (ViTs) and traditional CNN approaches. Compared to the YOLOv10 baseline, our approach improves precision by 5.7% while maintaining an inference speed of 182 FPS, enabling real-time deployment in edge-computing scenarios such as drone-based fire detection. Additionally, the model effectively detects small- and medium-sized flames, reducing false positives in challenging lighting conditions (e.g., sunset and urban illumination). These enhancements make our approach highly suitable for early fire warning applications in forest and urban environments. |
| ISSN: | 1569-8432 |
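The summary reports detection quality as mAP@0.5, i.e. average precision where a predicted box counts as a true positive only if its Intersection-over-Union (IoU) with a ground-truth box is at least 0.5. As a minimal sketch of that matching criterion only (not the paper's implementation; box format assumed to be axis-aligned `(x1, y1, x2, y2)` tuples):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Intersection rectangle (may be empty).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def is_true_positive(pred, ground_truths, threshold=0.5):
    """A prediction is a TP at mAP@0.5 if it overlaps some ground truth with IoU >= 0.5."""
    return any(iou(pred, gt) >= threshold for gt in ground_truths)
```

For example, a predicted box shifted by half its width against a single ground-truth box yields IoU = 50 / 150 = 1/3, so it would be scored as a false positive at the 0.5 threshold.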