Axial-UNet++ Power Line Detection Network Based on Gated Axial Attention Mechanism

Bibliographic Details
Main Authors: Ding Hu, Zihao Zheng, Yafei Liu, Chengkang Liu, Xiaoguo Zhang
Format: Article
Language: English
Published: MDPI AG, 2024-12-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/16/23/4585
Description
Summary: The segmentation and recognition of power lines are crucial for UAV-based inspection of overhead power lines. To address class imbalance, limited training data, and long-range dependencies in images, a specialized semantic segmentation network for power line segmentation, Axial-UNet++, is proposed. First, to handle long-range dependencies and small sample sizes, a gated axial attention mechanism is introduced to expand the receptive field and better capture relative positional biases on small datasets, yielding a novel feature extraction module termed the axial-channel local normalization module. Second, to address the imbalance in training samples, a new loss function is developed by combining traditional binary cross-entropy loss with focal loss, improving the precision of image semantic segmentation. Lastly, ablation and comparative experiments on the PLDU and Mendeley datasets show that the proposed model achieves 54.7% IoU and 80.1% recall on PLDU, and 79.3% IoU and 93.1% recall on Mendeley, outperforming the other models listed. Robustness experiments further demonstrate the adaptability of Axial-UNet++ under extreme conditions, and the augmented image dataset used in the study has been open-sourced.
ISSN: 2072-4292
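
For orientation, the gated axial attention mechanism named in the abstract factorizes 2D self-attention into two 1D passes (one along image height, one along width) and places learnable gates on the positional terms, so the network can discount positional cues that are unreliable on small datasets. The sketch below is a simplified PyTorch rendering under those assumptions; the class name, scalar gates, and per-position embeddings are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GatedAxialAttention1D(nn.Module):
    """Self-attention along a single spatial axis with learnable gates on
    the positional terms. A simplified sketch of gated axial attention;
    names and shapes here are illustrative assumptions."""

    def __init__(self, dim: int, heads: int = 4, axis_len: int = 64):
        super().__init__()
        assert dim % heads == 0
        self.heads, self.dh = heads, dim // heads
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        # Per-position embeddings for the query/key/value terms along the
        # axis (a simplification of relative positional encodings).
        self.pos = nn.Parameter(torch.randn(3, axis_len, self.dh) * 0.02)
        # Scalar gates let the network down-weight positional terms that
        # are poorly estimated when training data are scarce.
        self.g_q = nn.Parameter(torch.ones(1))
        self.g_k = nn.Parameter(torch.ones(1))
        self.g_v = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, dim); attention runs over `length` (one axis).
        b, n, d = x.shape
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        split = lambda t: t.view(b, n, self.heads, self.dh).transpose(1, 2)
        q, k, v = split(q), split(k), split(v)        # (b, heads, n, dh)
        r_q, r_k, r_v = self.pos[0, :n], self.pos[1, :n], self.pos[2, :n]
        logits = q @ k.transpose(-2, -1)              # content-content term
        logits = logits + self.g_q * (q @ r_q.T)      # gated query-position
        logits = logits + self.g_k * (k @ r_k.T)      # gated key-position
        attn = logits.softmax(dim=-1)
        out = attn @ v + self.g_v * (attn @ r_v)      # gated value-position
        return out.transpose(1, 2).reshape(b, n, d)
```

Applied once along the height axis and once along the width axis of a 2D feature map, such a module keeps a full-image receptive field while reducing attention cost from quadratic in H*W to roughly H*W*(H+W).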
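
The abstract also describes a loss that combines binary cross-entropy with focal loss to counter the heavy foreground/background imbalance of power line masks. A minimal sketch of one such combination follows; the weights alpha, gamma, and lam are placeholder assumptions, not the paper's reported settings.

```python
import torch
import torch.nn.functional as F

def combined_bce_focal_loss(logits: torch.Tensor,
                            targets: torch.Tensor,
                            alpha: float = 0.25,
                            gamma: float = 2.0,
                            lam: float = 0.5) -> torch.Tensor:
    """Weighted sum of binary cross-entropy and focal loss for a binary
    segmentation mask. alpha, gamma, and lam are illustrative values."""
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="mean")
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = torch.where(targets > 0.5, p, 1.0 - p)      # prob. of the true class
    a_t = torch.where(targets > 0.5,
                      torch.full_like(p, alpha),
                      torch.full_like(p, 1.0 - alpha))
    focal = (a_t * (1.0 - p_t) ** gamma * ce).mean()  # down-weights easy pixels
    return lam * bce + (1.0 - lam) * focal

# Usage sketch: power line masks have very few foreground pixels, so the
# focal term concentrates gradient on the rare, hard line pixels.
logits = torch.randn(2, 1, 64, 64)                    # raw model outputs
targets = (torch.rand(2, 1, 64, 64) > 0.95).float()   # sparse synthetic mask
loss = combined_bce_focal_loss(logits, targets)
```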