Achieving High-Quality Image Stylization Using Hierarchical Sensibility Boost, Visual Style Attention, and Covariance Evolution Blend
Main Author:
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10819392/
Summary: In recent years, image stylization has gained significant attention for its ability to transfer visual styles from reference images to content images, turning ordinary photographs into expressive creations. Traditional methods, however, struggle to retain fine-grained details and often exhibit issues such as discordant textures and noticeable artifacts. This paper proposes a novel approach to high-quality image stylization that integrates Hierarchical Sensibility Boost (HSB), Visual Style Attention (VSA), and Covariance Evolution Blend (CovEB) blocks. The proposed method addresses the challenge of preserving content structure while enhancing stylistic representations to achieve visually pleasing and realistic results. Specifically, the HSB block captures and amplifies style representations at multiple scales, the VSA block uses channel attention mechanisms to dynamically boost style representations, and the CovEB block aligns second-order statistics between style and content representations to ensure coherent stylization. Comprehensive experiments on the MS-COCO and WikiArt datasets demonstrate that the proposed method produces stylized images with superior visual sensibility, minimal artifacts, and improved efficiency compared with existing techniques. Quantitative evaluations confirm that the method excels in style similarity, global effects, and local pattern retention, while maintaining competitive performance in content fidelity and runtime efficiency.
ISSN: 2169-3536
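The summary's description of the CovEB block, aligning second-order statistics between style and content features, is reminiscent of whitening-and-coloring style transfer. The paper's exact formulation is not given in this record, so the following is only an illustrative sketch of that general idea: the content features are whitened to remove their own covariance, then "colored" with the style covariance, so the output matches the style's second-order statistics. The function name and the eigendecomposition-based approach are assumptions, not the authors' implementation.

```python
import numpy as np

def covariance_align(content_feat, style_feat, eps=1e-5):
    """Illustrative covariance alignment (whitening-and-coloring style),
    sketching the kind of second-order statistic matching the CovEB block
    is described as performing. Inputs are (C, H*W) feature arrays.
    """
    # Center both feature maps channel-wise
    c_mean = content_feat.mean(axis=1, keepdims=True)
    s_mean = style_feat.mean(axis=1, keepdims=True)
    c = content_feat - c_mean
    s = style_feat - s_mean

    # Channel covariance matrices (C x C), lightly regularized
    n_c, n_s = c.shape[1], s.shape[1]
    cov_c = c @ c.T / (n_c - 1) + eps * np.eye(c.shape[0])
    cov_s = s @ s.T / (n_s - 1) + eps * np.eye(s.shape[0])

    # Whitening transform: cov_c^{-1/2} via symmetric eigendecomposition
    ec, vc = np.linalg.eigh(cov_c)
    whiten = vc @ np.diag(ec ** -0.5) @ vc.T

    # Coloring transform: cov_s^{1/2} imposes the style covariance
    es, vs = np.linalg.eigh(cov_s)
    color = vs @ np.diag(es ** 0.5) @ vs.T

    # Whitened-then-colored content features, shifted to the style mean
    return color @ (whiten @ c) + s_mean
```

After alignment, the output's channel covariance approximately equals the style's, while the spatial arrangement still comes from the content features; this is the sense in which second-order alignment yields coherent stylization without destroying content layout.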