A real-time detection method for multi-scale ships in complex scenes



Bibliographic Details
Main Authors: Weina ZHOU, Lu LIU
Format: Article
Language: Chinese (zho)
Published: Beijing Xintong Media Co., Ltd 2022-10-01
Series: Dianxin kexue (Telecommunications Science)
Online Access: http://www.telecomsci.com/zh/article/doi/10.11959/j.issn.1000-0801.2022258/
Description
Summary: Ship detection plays an important role in tasks such as military reconnaissance, maritime target tracking, and maritime traffic control. However, owing to the variable sizes of ships and the complex sea-surface background, detecting multi-scale ships in complex scenes remains a challenge. To solve this problem, an improved YOLOv4 method based on multi-layer information interactive fusion and an attention mechanism was proposed. Multi-layer information interactive fusion (MLIF) and a multi-attention receptive field (MARF) module were combined to build a bidirectional fine-grained feature pyramid. MLIF fuses features of different scales: it not only concatenates high-level semantic features from deep layers but also reshapes richer features from shallower layers. MARF consists of a receptive field block (RFB) and an attention-mechanism module, which together emphasize important features and suppress unnecessary ones. To further evaluate the performance of the proposed method, experiments were carried out on the Singapore Maritime Dataset (SMD). The results show that the proposed method effectively addresses the difficulty of detecting multi-scale ships in complex marine environments while meeting real-time requirements.
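The two ideas in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the shapes, weights, and function names below are illustrative assumptions. `fuse_scales` mimics the MLIF idea (upsample a deep, semantically rich map and concatenate it with a shallow, higher-resolution map), and `channel_attention` mimics the gating role of MARF's attention module (per-channel weights in (0, 1) that emphasize some features and suppress others).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def upsample2x(feat):
    """Nearest-neighbor 2x upsampling of a (C, H, W) feature map."""
    return feat.repeat(2, axis=1).repeat(2, axis=2)

def fuse_scales(shallow, deep):
    """MLIF-style fusion sketch: bring the deep (semantic) map up to the
    shallow map's resolution and concatenate along the channel axis."""
    return np.concatenate([shallow, upsample2x(deep)], axis=0)

def channel_attention(feat, w1, w2):
    """Attention-gating sketch (squeeze-and-excitation style): compute a
    per-channel gate in (0, 1) and rescale the feature map with it."""
    squeezed = feat.mean(axis=(1, 2))          # global average pool -> (C,)
    hidden = np.maximum(0.0, w1 @ squeezed)    # ReLU bottleneck
    gates = sigmoid(w2 @ hidden)               # per-channel gates in (0, 1)
    return feat * gates[:, None, None]

# toy feature maps: shallow (16, 8, 8), deep (32, 4, 4)
rng = np.random.default_rng(0)
shallow = rng.standard_normal((16, 8, 8))
deep = rng.standard_normal((32, 4, 4))

fused = fuse_scales(shallow, deep)             # (48, 8, 8)
c = fused.shape[0]
w1 = rng.standard_normal((c // 4, c)) * 0.1    # illustrative random weights
w2 = rng.standard_normal((c, c // 4)) * 0.1
out = channel_attention(fused, w1, w2)
print(out.shape)                               # (48, 8, 8)
```

In a real detector these operations would be learned convolutional layers inside the feature pyramid; the sketch only shows the data flow the abstract describes, and since every gate lies in (0, 1), the attention step can only attenuate channels, never amplify them.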
ISSN:1000-0801