EASRec: External Attentive Efficient Sequential Recommender
The sequential recommendation task aims to model users’ preferences from the sequential dependencies in their historical item-interaction sequences. The self-attention mechanism, which reallocates input sequence features, reduces inductive bias, and refines th...
Saved in:
| Main Authors: | Wu Qiao, Xingliang Zhang, Chao Wu, Bing Jia, Funing Yang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Access |
| Subjects: | Sequential recommendation; attention mechanism; deep learning |
| Online Access: | https://ieeexplore.ieee.org/document/10155488/ |
| _version_ | 1846129661638082560 |
|---|---|
| author | Wu Qiao; Xingliang Zhang; Chao Wu; Bing Jia; Funing Yang |
| author_facet | Wu Qiao; Xingliang Zhang; Chao Wu; Bing Jia; Funing Yang |
| author_sort | Wu Qiao |
| collection | DOAJ |
| description | The sequential recommendation task aims to model users’ preferences from the sequential dependencies in their historical item-interaction sequences. The self-attention mechanism, which reallocates input sequence features, reduces inductive bias, and refines the output representations, has become the go-to underlying method for this task. However, since its computational complexity grows quadratically with the sequence length, self-attention-based sequential recommenders can be impractical for long sequences. We propose EASRec, an efficient sequential recommender based on a multi-head external attention mechanism, to avoid this issue. Specifically, EASRec mines sequential dependencies via two learnable memories coupled with a double normalization strategy, reducing the computational complexity from quadratic to linear. Going a step further, since the memories are globally shared, EASRec can implicitly model the potential correlations among all sequences, i.e., the common preference, which the self-attention mechanism lacks. We conduct extensive experiments on three public datasets, and the results show that EASRec provides a 3.63% average improvement over several state-of-the-art baselines at minimal computational cost. |
| format | Article |
| id | doaj-art-1a8f339524f64981bf480691bc3eb547 |
| institution | Kabale University |
| issn | 2169-3536 |
| language | English |
| publishDate | 2024-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | doaj-art-1a8f339524f64981bf480691bc3eb547; 2024-12-10T00:01:33Z; eng; IEEE; IEEE Access; 2169-3536; 2024-01-01; vol. 12, pp. 180738-180746; DOI 10.1109/ACCESS.2023.3287640; document 10155488. EASRec: External Attentive Efficient Sequential Recommender. Wu Qiao (College of Artificial Intelligence Industry, Changchun University of Architecture and Civil Engineering, Changchun, Jilin, China); Xingliang Zhang (China Mobile Group Jilin Company Ltd., Changchun, Jilin, China); Chao Wu (China Mobile Communications Group Company Ltd., Beijing, China); Bing Jia (College of Computer Science, Inner Mongolia University, Hohhot, China; https://orcid.org/0000-0003-3294-6303); Funing Yang (College of Computer Science and Technology, Jilin University, Changchun, Jilin, China). https://ieeexplore.ieee.org/document/10155488/ Sequential recommendation; attention mechanism; deep learning |
| spellingShingle | Wu Qiao Xingliang Zhang Chao Wu Bing Jia Funing Yang EASRec: External Attentive Efficient Sequential Recommender IEEE Access Sequential recommendation attention mechanism deep learning |
| title | EASRec: External Attentive Efficient Sequential Recommender |
| title_full | EASRec: External Attentive Efficient Sequential Recommender |
| title_fullStr | EASRec: External Attentive Efficient Sequential Recommender |
| title_full_unstemmed | EASRec: External Attentive Efficient Sequential Recommender |
| title_short | EASRec: External Attentive Efficient Sequential Recommender |
| title_sort | easrec external attentive efficient sequential recommender |
| topic | Sequential recommendation; attention mechanism; deep learning |
| url | https://ieeexplore.ieee.org/document/10155488/ |
| work_keys_str_mv | AT wuqiao easrecexternalattentiveefficientsequentialrecommender AT xingliangzhang easrecexternalattentiveefficientsequentialrecommender AT chaowu easrecexternalattentiveefficientsequentialrecommender AT bingjia easrecexternalattentiveefficientsequentialrecommender AT funingyang easrecexternalattentiveefficientsequentialrecommender |
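The abstract describes attending over two globally shared learnable memories with a double normalization, which makes the cost linear in the sequence length. As a rough single-head illustration of that general external-attention idea (a sketch only, not the paper's implementation; the names `external_attention`, `Mk`, and `Mv` are placeholders introduced here):

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def external_attention(x, Mk, Mv, eps=1e-9):
    """Single-head external attention sketch.

    x:  (N, d) item-embedding sequence of length N.
    Mk: (S, d) learnable key memory, shared across all sequences.
    Mv: (S, d) learnable value memory, shared across all sequences.
    Cost is O(N * S * d), i.e. linear in N, versus O(N^2 * d) for self-attention.
    """
    attn = x @ Mk.T                                        # (N, S) similarity to memory slots
    attn = softmax(attn, axis=0)                           # first normalization: over positions
    attn = attn / (attn.sum(axis=1, keepdims=True) + eps)  # second: l1-normalize over memory slots
    return attn @ Mv                                       # (N, d) refined representations

rng = np.random.default_rng(0)
x = rng.normal(size=(10, 16))    # sequence of 10 items, embedding dim 16
Mk = rng.normal(size=(8, 16))    # 8 memory slots
Mv = rng.normal(size=(8, 16))
out = external_attention(x, Mk, Mv)
print(out.shape)                 # (10, 16)
```

Because `Mk` and `Mv` are parameters rather than projections of the current sequence, every sequence attends to the same memories, which is one way to read the paper's claim that cross-sequence "common preference" is captured implicitly.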