Computer science
Transformer
Scalability
Benchmark
Recommender system
Machine learning
Benchmarking
Artificial intelligence
Data mining
Computer engineering
Database
Physics
Business
Marketing
Voltage
Quantum mechanics
Geography
Geodesy
Authors
Wenqi Sun,Zheng Liu,Xinyan Fan,Ji-Rong Wen,Wayne Xin Zhao
Identifier
DOI:10.1007/978-3-031-30672-3_23
Abstract
Transformer and its variants have been intensively applied to sequential recommender systems, as they exploit the self-attention mechanism, the feed-forward network (FFN), and parallel computing capability to generate high-quality sequence representations. Recently, a wide range of fast and efficient Transformers have been proposed to facilitate sequence modeling; however, the lack of a well-established benchmark may lead to non-reproducible and even inconsistent results across different works, making rigorous assessment difficult. In this paper, we provide a benchmark for reproducibility and present a comprehensive empirical study of various Transformer-based recommendation approaches and of the key techniques and components in Transformers. Based on this study, we propose a hybrid effective and Efficient Transformer variant for sequential Recommendation (ETRec), which incorporates scalable long- and short-term preference learning, aggregation of item blocks into interests, and a parameter-efficient cross-layer shared FFN. Extensive experiments on six public benchmark datasets demonstrate the efficacy of the proposed approach.
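The abstract mentions a parameter-efficient FFN shared across Transformer layers. The paper's actual architecture is not given here, so the following is only an illustrative NumPy sketch of the general idea: a single-head attention stack in which every layer keeps its own attention weights but reuses one shared set of FFN parameters, so the FFN parameter count is independent of depth. All names and dimensions are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention over the sequence.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def ffn(x, W1, b1, W2, b2):
    # Position-wise feed-forward network with ReLU activation.
    return np.maximum(0.0, x @ W1 + b1) @ W2 + b2

rng = np.random.default_rng(0)
d, d_ff, seq_len, n_layers = 16, 64, 5, 4

# One FFN parameter set shared by every layer (cross-layer sharing);
# the attention projections stay layer-specific.
shared = (rng.normal(size=(d, d_ff)) * 0.1, np.zeros(d_ff),
          rng.normal(size=(d_ff, d)) * 0.1, np.zeros(d))
attn = [tuple(rng.normal(size=(d, d)) * 0.1 for _ in range(3))
        for _ in range(n_layers)]

x = rng.normal(size=(seq_len, d))  # stand-in for item embeddings
for Wq, Wk, Wv in attn:
    x = x + self_attention(x, Wq, Wk, Wv)  # residual connection
    x = x + ffn(x, *shared)                # same FFN weights in every layer

# One FFN's parameters serve all n_layers layers instead of n_layers copies.
ffn_params = sum(w.size for w in shared)
print(x.shape, ffn_params)
```

In a full model the FFN typically dominates the per-layer parameter budget (here 2·d·d_ff weights versus 3·d² for attention), which is why sharing it across layers is the natural target for parameter efficiency.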