EASRec: External Attentive Efficient Sequential Recommender


Bibliographic Details
Main Authors: Wu Qiao, Xingliang Zhang, Chao Wu, Bing Jia, Funing Yang
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10155488/
Description
Summary: The sequential recommendation task aims at modeling users' preferences according to the sequential dependencies contained in their historical item-interaction sequences. The self-attention mechanism, which reallocates input sequence features, reduces inductive bias, and refines output representations, has become the go-to underlying method for this task. However, since its computational complexity grows quadratically with the sequence length, self-attention-based sequential recommenders can be inflexible when dealing with long sequences. To avoid this issue, we propose EASRec, an efficient sequential recommender based on the multi-head external attention mechanism. Specifically, EASRec mines sequential dependencies via two learnable memories coupled with a double-normalization strategy, reducing the computational complexity from quadratic to linear. Going a step further, since the memories are globally shared, EASRec can implicitly model the potential correlations among all sequences, i.e., the common preference, which the self-attention mechanism lacks. We conduct extensive experiments on three public datasets, and the results show that EASRec provides a 3.63% improvement on average over several state-of-the-art baselines with minimal computational cost.
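To make the linear-complexity claim concrete, below is a minimal single-head sketch of external attention with double normalization, in the spirit the abstract describes. This is an illustrative NumPy reconstruction of the general external attention operation, not the authors' implementation: the memory size `s`, function names, and the softmax/L1 normalization order are assumptions based on the standard formulation. Because the input interacts only with two fixed-size memories (`s × d`), the cost is O(n·s·d), i.e., linear in the sequence length n rather than quadratic.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def external_attention(x, m_k, m_v):
    """Single-head external attention with double normalization (sketch).

    x   : (n, d) sequence of n item embeddings
    m_k : (s, d) learnable key memory, shared across all sequences
    m_v : (s, d) learnable value memory, shared across all sequences
    Returns an (n, d) refined representation; cost is O(n*s*d).
    """
    attn = x @ m_k.T                                # (n, s) similarity to memory slots
    attn = softmax(attn, axis=0)                    # normalize over the sequence dimension
    attn = attn / attn.sum(axis=1, keepdims=True)   # L1-normalize over memory slots
    return attn @ m_v                               # (n, d) output representation

# Hypothetical usage with random embeddings (n=5 items, d=8 dims, s=4 memory slots)
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
m_k = rng.normal(size=(4, 8))
m_v = rng.normal(size=(4, 8))
out = external_attention(x, m_k, m_v)
print(out.shape)  # (5, 8)
```

Because `m_k` and `m_v` are shared parameters rather than per-sequence projections of the input, every sequence attends to the same memory slots, which is how the model can pick up correlations common to all users.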
ISSN: 2169-3536