Noise Robust Underwater Fishing Net Recognition Based on Range Gated Imaging

Bibliographic Details
Main Authors: Zhensong Xu, Xinwei Wang, Liang Sun, Bo Song, Yue Zhang, Pingshun Lei, Jianan Chen, Jun He, Yan Zhou, Yuliang Liu
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10772403/
Description
Summary: Underwater fishing net recognition plays an indispensable role in applications such as the safe navigation of unmanned underwater vehicles, the protection of marine ecology, and marine ranching. However, the performance of underwater fishing net recognition usually degrades severely due to noise interference in underwater environments. In this paper, we use range gated imaging as the detection device and propose a semantic fishing net recognition network (SFNR-Net) for underwater fishing net recognition at long distances. The proposed SFNR-Net incorporates an auxiliary semantic segmentation module (ASSM) that introduces extra semantic information and enhances feature representation under noisy conditions. In addition, to address the problem of unbalanced training data, we employ a semantic-regulated cycle-consistent generative adversarial network (CycleGAN) as a data augmentation approach, and we propose a semantic loss that regulates the training of CycleGAN to improve the quality of the generated data. Comprehensive experiments on the test data show that SFNR-Net effectively mitigates noise interference and achieves the best recognition accuracy of 96.28% compared with existing methods. Field experiments in underwater environments with different turbidity levels further validate the advantages of our method.
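
The record does not include the authors' code, but the idea of an auxiliary semantic segmentation branch described in the abstract can be illustrated with a minimal PyTorch sketch: a shared encoder feeds both a classification head (fishing net present or not) and a coarse segmentation head, and the two losses are combined during training. All module names, layer sizes, and the loss weighting below are assumptions for illustration, not the authors' SFNR-Net architecture.

# Hypothetical sketch (not the authors' SFNR-Net): a classifier with an
# auxiliary segmentation head, so the shared encoder also learns
# pixel-level semantic structure of the net under noisy conditions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AuxSegClassifier(nn.Module):
    def __init__(self, num_classes=2, seg_classes=2):
        super().__init__()
        # Shared convolutional encoder (channel sizes are illustrative assumptions).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Classification head: is a fishing net present in the gated image?
        self.cls_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, num_classes)
        )
        # Auxiliary segmentation head: coarse net/background mask.
        self.seg_head = nn.Conv2d(128, seg_classes, kernel_size=1)

    def forward(self, x):
        feats = self.encoder(x)
        logits_cls = self.cls_head(feats)
        # Upsample the coarse mask back to the input resolution.
        logits_seg = F.interpolate(self.seg_head(feats), size=x.shape[-2:],
                                   mode="bilinear", align_corners=False)
        return logits_cls, logits_seg

def combined_loss(logits_cls, logits_seg, label, mask, seg_weight=0.5):
    # Classification loss plus a weighted auxiliary segmentation loss.
    # The 0.5 weight is an arbitrary placeholder, not a value from the paper.
    return (F.cross_entropy(logits_cls, label)
            + seg_weight * F.cross_entropy(logits_seg, mask))

# Toy usage with random tensors standing in for gated images and masks.
model = AuxSegClassifier()
imgs = torch.randn(4, 1, 128, 128)            # single-channel gated images
labels = torch.randint(0, 2, (4,))            # net present / absent
masks = torch.randint(0, 2, (4, 128, 128))    # per-pixel net/background labels
cls_out, seg_out = model(imgs)
loss = combined_loss(cls_out, seg_out, labels, masks)
loss.backward()

The same kind of auxiliary supervision is what the abstract's semantic loss applies to CycleGAN training; in that setting the segmentation-style term would constrain the generator's outputs rather than a classifier's encoder.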
ISSN: 2169-3536