FGMFN: Fine-Grained Multiscale Cross-Modal Sentiment Analysis in Advertisements
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11007133/ |
| Summary: | Cross-modal sentiment analysis in advertising has gained significant attention due to its potential in brand communication and consumer behavior analysis. However, traditional methods struggle to handle the multi-scale features and redundant objects in advertising images effectively, resulting in limited emotion recognition accuracy. To address the challenges of insufficient multi-scale features and target redundancy in multi-modal sentiment analysis of advertisements, we introduce a novel framework, the Fine-Grained Multiscale Cross-Modal Feature Network (FGMFN). The model is designed to process multi-scale feature inputs and to facilitate efficient sentiment fusion between images and text. FGMFN employs a multi-scale network to extract key features from advertising images and uses visual features to guide the textual data representation. Additionally, to reduce textual ambiguity caused by strong intra-class similarity in advertising contexts, we introduce a multi-task learning approach combining an image-text matching loss with an image-text mutual information loss (an illustrative sketch of such a combined objective appears after this record). This strategy narrows the gap between visual features and sentiment semantics, improving the model’s generalization capabilities. Finally, we construct a fine-grained image-text sentiment analysis dataset (YTB-ADS), which, in contrast to traditional coarse-grained datasets with high intra-class similarity, better serves the specific needs of advertising sentiment analysis. Experimental results show that FGMFN outperforms existing methods on the YTB-ADS dataset, as well as on the publicly available Twitter-2015 and Twitter-2017 datasets, confirming the model’s superior performance in advertising sentiment analysis tasks. |
|---|---|
| ISSN: | 2169-3536 |
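
The abstract describes a multi-task objective that combines a sentiment loss with an image-text matching loss and an image-text mutual information loss. The following is a minimal sketch of what such a weighted multi-task loss could look like, assuming a PyTorch-style setup; the function names, the InfoNCE formulation used as a stand-in for the mutual-information term, and the loss weights are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a multi-task objective: sentiment classification loss
# combined with an image-text matching (ITM) loss and an image-text
# mutual-information loss (approximated here by an InfoNCE contrastive loss).
# Names and weightings are illustrative, not taken from the FGMFN paper.
import torch
import torch.nn.functional as F

def info_nce_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE loss as a stand-in for the mutual-information term."""
    img_emb = F.normalize(img_emb, dim=-1)
    txt_emb = F.normalize(txt_emb, dim=-1)
    logits = img_emb @ txt_emb.t() / temperature           # (B, B) similarity matrix
    targets = torch.arange(img_emb.size(0), device=img_emb.device)
    # Cross-entropy over both image->text and text->image directions.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

def multitask_loss(sentiment_logits, sentiment_labels,
                   itm_logits, itm_labels,
                   img_emb, txt_emb,
                   w_itm=0.5, w_mi=0.5):
    """Weighted sum of sentiment, image-text matching, and mutual-information losses."""
    l_sent = F.cross_entropy(sentiment_logits, sentiment_labels)
    l_itm = F.cross_entropy(itm_logits, itm_labels)         # matched vs. mismatched pairs
    l_mi = info_nce_loss(img_emb, txt_emb)
    return l_sent + w_itm * l_itm + w_mi * l_mi
```

In this kind of setup, the matching loss pushes the model to distinguish paired from unpaired image-text inputs, while the contrastive term aligns the two embedding spaces, which is one plausible reading of how the abstract's combined objective narrows the gap between visual features and sentiment semantics.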