A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels
Abstract This study proposes a novel text classification model, MBConv-CapsNet, to address large-scale text data classification in the Internet era. Integrating the advantages of Mobile Inverted Bottleneck Convolutional Networks and Capsule Networks, this model comprehensively considers text sequence information, word embeddings, and contextual dependencies…
Main Authors: Tao Jin, Jiaming Liu
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Scientific Reports
Subjects: Text classification; MBConv; Capsule networks; SKNet; Dynamic routing
Online Access: https://doi.org/10.1038/s41598-025-85237-2
_version_ | 1841559761193009152 |
author | Tao Jin; Jiaming Liu |
author_facet | Tao Jin; Jiaming Liu |
author_sort | Tao Jin |
collection | DOAJ |
description | Abstract This study proposes a novel text classification model, MBConv-CapsNet, to address large-scale text data classification in the Internet era. Integrating the advantages of Mobile Inverted Bottleneck Convolutional Networks and Capsule Networks, the model comprehensively considers text sequence information, word embeddings, and contextual dependencies to capture both local and global information about the text effectively, transforming the original text matrix into a more compact and representative feature representation. A Capsule Network comprising N-gram convolutional layers, selective kernel network (SKNet) layers, primary capsule layers, convolutional capsule layers, and fully connected capsule layers is designed to adaptively adjust the importance of different feature channels, enhancing the model’s ability to capture the semantic information of text across those channels. Using the sparsemax function instead of the softmax function for dynamic routing within the Capsule Network focuses the network on capsules that contribute significantly to the final output, reducing the impact of noise and redundant information and further improving classification performance. Experimental validation on multiple publicly available text classification datasets demonstrates significant performance improvements of the proposed method on binary, multi-class, and multi-label text classification tasks, along with better generalization capability and robustness. |
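The key routing change the abstract describes — replacing softmax with sparsemax so that low-contribution capsules receive exactly zero routing weight — can be sketched in plain Python. This is an illustrative reimplementation of the standard sparsemax projection (Euclidean projection onto the probability simplex), not the authors' code; the function name and inputs are assumptions for illustration only.

```python
def sparsemax(z):
    """Sparsemax: project the score vector z onto the probability simplex.

    Unlike softmax, which always assigns strictly positive probability to
    every entry, sparsemax can output exact zeros for low-scoring entries —
    the property the model exploits to suppress noisy or redundant capsules
    during dynamic routing.
    """
    z_sorted = sorted(z, reverse=True)          # scores in descending order
    cumsum = 0.0
    tau = 0.0
    for k, zk in enumerate(z_sorted, start=1):
        cumsum += zk
        # The support condition 1 + k*z_(k) > sum of top-k scores holds for a
        # contiguous prefix; keep the threshold tau for the largest such k.
        if 1.0 + k * zk > cumsum:
            tau = (cumsum - 1.0) / k
    # Shift by tau and clip at zero; the result sums to 1.
    return [max(zi - tau, 0.0) for zi in z]
```

For example, `sparsemax([2.0, 1.0, 0.1])` puts all mass on the first entry and exact zeros on the rest, whereas softmax over the same scores would keep every weight strictly positive.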
format | Article |
id | doaj-art-b1f68e6eac854d97bc3db2b9e9e0ee46 |
institution | Kabale University |
issn | 2045-2322 |
language | English |
publishDate | 2025-01-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Scientific Reports |
spelling | doaj-art-b1f68e6eac854d97bc3db2b9e9e0ee46 | 2025-01-05T12:14:38Z | eng | Nature Portfolio | Scientific Reports | 2045-2322 | 2025-01-01 | vol. 15, no. 1, pp. 1–14 | 10.1038/s41598-025-85237-2 | A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels | Tao Jin; Jiaming Liu (College of Computer and Control Engineering, Qiqihar University) | https://doi.org/10.1038/s41598-025-85237-2 | Text classification; MBConv; Capsule networks; SKNet; Dynamic routing |
spellingShingle | Tao Jin; Jiaming Liu; A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels; Scientific Reports; Text classification; MBConv; Capsule networks; SKNet; Dynamic routing |
title | A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels |
title_full | A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels |
title_fullStr | A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels |
title_full_unstemmed | A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels |
title_short | A text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels |
title_sort | text classification method by integrating mobile inverted residual bottleneck convolution networks and capsule networks with adaptive feature channels |
topic | Text classification; MBConv; Capsule networks; SKNet; Dynamic routing |
url | https://doi.org/10.1038/s41598-025-85237-2 |
work_keys_str_mv | AT taojin atextclassificationmethodbyintegratingmobileinvertedresidualbottleneckconvolutionnetworksandcapsulenetworkswithadaptivefeaturechannels AT jiamingliu atextclassificationmethodbyintegratingmobileinvertedresidualbottleneckconvolutionnetworksandcapsulenetworkswithadaptivefeaturechannels AT taojin textclassificationmethodbyintegratingmobileinvertedresidualbottleneckconvolutionnetworksandcapsulenetworkswithadaptivefeaturechannels AT jiamingliu textclassificationmethodbyintegratingmobileinvertedresidualbottleneckconvolutionnetworksandcapsulenetworkswithadaptivefeaturechannels |