Abstract
Capturing dynamic preference features from users' historical behavioral data is widely used to improve recommendation accuracy in sequential recommendation tasks. However, existing deep neural network-based sequential recommendation methods often ignore the noise in user behavioral data that degrades recommendation effectiveness, leaving recommendation models sensitive to noisy data. In addition, these deep learning-based models typically require a large number of parameters to capture user behavior patterns, which increases time complexity and makes them prone to overfitting to noise fluctuations. To address these issues, we apply the moving average concept from time-series analysis to sequential recommendation. Smoothing the sequence data removes noise, makes the data more stable and reliable, reduces the model's sensitivity to noise, and helps capture the evolution of user interests and behavioral patterns. In our experiments, the moving average algorithm effectively denoises sequential data with low time complexity. We therefore propose EMARec, a sequential recommendation model that combines an exponential moving average with an MLP architecture, making the model more competitive in terms of time complexity. Experimental results on two datasets demonstrate that EMARec outperforms state-of-the-art sequential recommendation methods across common evaluation metrics.
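The smoothing idea can be illustrated with a short, hypothetical PyTorch sketch (not the authors' EMARec implementation): an exponential moving average s_t = alpha * x_t + (1 - alpha) * s_{t-1} applied along the time axis of an item-embedding sequence. The function name ema_smooth and the smoothing factor alpha are illustrative assumptions, not names from the paper.

import torch

def ema_smooth(seq_emb: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    # Exponential moving average over the time axis of an item-embedding
    # sequence of shape (seq_len, hidden_dim); illustrative sketch only.
    smoothed = torch.empty_like(seq_emb)
    smoothed[0] = seq_emb[0]
    for t in range(1, seq_emb.size(0)):
        # s_t = alpha * x_t + (1 - alpha) * s_{t-1}
        smoothed[t] = alpha * seq_emb[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

# Example: smooth a toy sequence of 10 item embeddings of dimension 64
x = torch.randn(10, 64)
print(ema_smooth(x, alpha=0.3).shape)  # torch.Size([10, 64])

Because each smoothed embedding depends only on the current input and the previous smoothed value, the operation runs in linear time over the sequence and adds no learnable parameters, which is consistent with the low-time-complexity motivation stated in the abstract.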
Data availability
The data used in this study are publicly available at http://jmcauley.ucsd.edu/data/amazon/index_2014.html.
Acknowledgements
We thank Prof. Ivan Lee for his valuable feedback on the revision of this paper. This work was supported in part by the National Natural Science Foundation of China under Grants 62072416, 61902361, 61672471, 61975187, and 61802352, in part by the Key Research and Development Program of Shaanxi Province under Grant 2024GX-YBXM-545, in part by the Henan Key Research Project of Higher Education Institutions under Grant 22B520046, in part by the Key Research and Development Special Project of Henan Province under Grants 221111210500, 232102211053, 222102210170, 222102110045, and 232102321069, in part by the Natural Science Foundation Project of Henan Province under Grant 222300420582, in part by the Doctoral Fund Project of Zhengzhou University of Light Industry under Grants 2020BSJJ030 and 2020BSJJ031, in part by the Mass Innovation Space Incubation Project under Grant 2023ZCKJ216, in part by the Data Science and Knowledge Engineering Team of Zhengzhou University of Light Industry, and in part by the innovation team of data science and knowledge engineering of Zhengzhou University of Light Industry under Grant 13606000032.
Ethics declarations
Conflict of interest
We declare that we do not have any commercial or associative interest that represents a conflict of interest in connection with the work submitted.
About this article
Cite this article
Chen, R., Wang, Z., Tang, C. et al. EMARec: a sequential recommendation with exponential moving average. Neural Comput & Applic 36, 12917–12933 (2024). https://doi.org/10.1007/s00521-024-09718-7