Research on noise prediction methods for sound barriers based on the integration of conditional generative adversarial networks and numerical methods

Bibliographic Details
Main Authors: Qian Hu, Ziyu Cui, Hongxue Liu, Senhao Zhong
Format: Article
Language: English
Published: Frontiers Media S.A., 2025-03-01
Series: Frontiers in Physics
Online Access: https://www.frontiersin.org/articles/10.3389/fphy.2025.1539545/full
Description
Summary: This study proposes a novel approach that uses Conditional Generative Adversarial Networks (CGANs) to accelerate wideband acoustic state analysis, addressing the computational challenges of the traditional Boundary Element Method (BEM). Conventional BEM-based acoustic analysis requires repeated assembly and solution of frequency-dependent system matrices at each frequency of interest, which incurs significant computational cost; because the BEM coefficient matrices are non-symmetric and fully populated, these costs grow rapidly in large-scale problems. To overcome these challenges, this paper introduces a CGAN-based modeling framework that substantially reduces computation time while maintaining high predictive accuracy. The framework adapts well to datasets with varying characteristics, effectively capturing the underlying patterns in the data. Numerical experiments validate the proposed method and highlight its advantages in both accuracy and computational efficiency, making the CGAN-based approach a promising alternative for efficient wideband acoustic analysis.
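
To make the surrogate idea concrete, below is a minimal, hypothetical PyTorch sketch of a frequency-conditioned GAN of the kind the abstract describes. The paper does not publish its architecture or training code; the layer widths, the latent dimension, the 64-point pressure-field output, and the training loop here are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: a CGAN surrogate conditioned on excitation frequency.
# All dimensions and hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn

LATENT_DIM = 32   # assumed size of the noise vector z
COND_DIM = 1      # condition: normalized excitation frequency
FIELD_DIM = 64    # assumed number of receiver points in the predicted field

class Generator(nn.Module):
    """Maps (noise z, frequency f) to a sampled sound-pressure field."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + COND_DIM, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, FIELD_DIM),
        )

    def forward(self, z, f):
        return self.net(torch.cat([z, f], dim=1))

class Discriminator(nn.Module):
    """Scores whether a (field, frequency) pair looks like a real BEM solution."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FIELD_DIM + COND_DIM, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1),  # raw logit; paired with BCEWithLogitsLoss
        )

    def forward(self, x, f):
        return self.net(torch.cat([x, f], dim=1))

def train_step(gen, disc, opt_g, opt_d, real_field, freq, bce):
    """One CGAN update: discriminator on real/fake pairs, then generator."""
    batch = real_field.size(0)
    z = torch.randn(batch, LATENT_DIM)
    fake_field = gen(z, freq)

    # Discriminator: push real pairs toward 1, generated pairs toward 0.
    opt_d.zero_grad()
    loss_d = bce(disc(real_field, freq), torch.ones(batch, 1)) + \
             bce(disc(fake_field.detach(), freq), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Generator: fool the discriminator under the same frequency condition.
    opt_g.zero_grad()
    loss_g = bce(disc(fake_field, freq), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()
    # Stand-in for precomputed BEM training data: (field, frequency) pairs.
    real_field = torch.randn(16, FIELD_DIM)
    freq = torch.rand(16, COND_DIM)
    print(train_step(gen, disc, opt_g, opt_d, real_field, freq, bce))

Once trained on (field, frequency) pairs precomputed with BEM, such a generator can be queried at new frequencies almost instantly, which is the source of the speed-up the abstract claims over re-assembling and solving the dense BEM system at every frequency.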
ISSN: 2296-424X