EMHANet: Lightweight Salient Object Detection for Remote Sensing Images via Edge-Aware Multiscale Feature Fusion
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10980003/ |
| Summary: | Salient object detection in remote sensing images (RSI-SOD) aims to identify visually prominent objects by mimicking human visual perception. While convolutional neural networks (CNNs) have significantly improved detection accuracy, most RSI-SOD methods suffer from high computational costs and large model sizes, limiting their applicability in resource-constrained environments. Additionally, the complex backgrounds and diverse object scales of RSIs further challenge existing methods. To address these issues, we propose EMHANet, a lightweight network that integrates edge texture detail extraction, multi-scale feature fusion, and a hybrid attention mechanism. EMHANet consists of a MobileNetV3 backbone for feature extraction, an Edge Feature Integration Module (EFIM) for low-level edge details, a Multi-scale Contextual Information Enhancement Module (MCIEM) for high-level feature refinement, and a lightweight decoder for saliency prediction. The network employs a coarse-to-fine strategy to accurately detect salient objects while maintaining efficiency. Experiments on the ORSSD and EORSSD datasets demonstrate EMHANet's superior performance over 31 state-of-the-art methods. It achieves high accuracy with an inference speed of 143 FPS, 0.257M parameters, and 0.92G FLOPs, making it suitable for resource-limited applications. The source code and dataset will be available at https://github.com/darkseid-arch/EMHANet. |
| ISSN: | 2169-3536 |
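
The summary above outlines the overall design (a MobileNetV3 encoder, an edge module for low-level detail, a multi-scale context module for high-level features, and a lightweight decoder) but gives no implementation details. Below is a minimal, hypothetical PyTorch sketch of that general kind of coarse-to-fine lightweight SOD pipeline; the class name `LightweightSOD`, the module internals, and the channel sizes are illustrative assumptions and do not reproduce EMHANet's actual EFIM/MCIEM modules.

```python
# Hypothetical sketch of a coarse-to-fine, lightweight RSI-SOD pipeline of the
# kind described in the abstract (MobileNetV3 backbone, an edge branch on an
# early high-resolution stage, a multi-scale context branch on the deepest
# stage, and a light decoder head). Placeholders only, not EMHANet's code.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import mobilenet_v3_small


class LightweightSOD(nn.Module):
    def __init__(self):
        super().__init__()
        # MobileNetV3-Small features as a lightweight encoder
        # (random init here; pretrained weights can be loaded in practice).
        self.encoder = mobilenet_v3_small(weights=None).features
        # Placeholder "edge" branch on an early, high-resolution stage.
        self.edge_branch = nn.Conv2d(16, 16, kernel_size=3, padding=1)
        # Placeholder multi-scale context branch: parallel dilated convolutions
        # over the deepest features, later fused with the edge features.
        self.context_branch = nn.ModuleList(
            [nn.Conv2d(576, 32, 3, padding=d, dilation=d) for d in (1, 2, 4)]
        )
        self.fuse = nn.Conv2d(32 * 3 + 16, 32, kernel_size=3, padding=1)
        # Lightweight decoder head producing a single-channel saliency map.
        self.head = nn.Conv2d(32, 1, kernel_size=1)

    def forward(self, x):
        size = x.shape[-2:]
        low, feat = None, x
        for i, layer in enumerate(self.encoder):
            feat = layer(feat)
            if i == 1:  # keep an early stage for edge details
                low = feat
        edge = self.edge_branch(low)
        ctx = torch.cat([branch(feat) for branch in self.context_branch], dim=1)
        ctx = F.interpolate(ctx, size=edge.shape[-2:], mode="bilinear",
                            align_corners=False)
        fused = self.fuse(torch.cat([ctx, edge], dim=1))
        coarse = self.head(fused)
        # Coarse prediction upsampled to the input size; a true coarse-to-fine
        # design would add a refinement stage on top of this.
        return torch.sigmoid(F.interpolate(coarse, size=size, mode="bilinear",
                                           align_corners=False))


# Example: saliency map for a 3x256x256 remote sensing image.
model = LightweightSOD().eval()
with torch.no_grad():
    saliency = model(torch.randn(1, 3, 256, 256))
print(saliency.shape)  # torch.Size([1, 1, 256, 256])
```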