This is the PyTorch implementation for our SIGIR 2025 paper.
Bohao Wang, Feng Liu, Jiawei Chen, Xingyu Lou, Changwang Zhang, Jun Wang, Yuegang Sun, Yan Feng, Chun Chen, Can Wang. 2025. MSL: Not All Tokens Are What You Need for Tuning LLM as a Recommender. https://arxiv.org/abs/2504.04178
- torch==2.6.0
- transformers==4.49.0
- marisa-trie==1.2.1
- genre
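The `genre` and `marisa-trie` dependencies are typically used for prefix-constrained decoding, i.e. restricting generation to valid item titles. As a rough illustration of the idea (a pure-Python stand-in, not the repo's actual trie-based implementation), the allowed next tokens are those that extend the generated prefix along some valid title:

```python
def allowed_next_tokens(prefix, valid_sequences):
    """Given the tokens generated so far, return the set of next tokens that
    keep the output on a valid item-title sequence. This is a pure-Python
    sketch of a marisa-trie / GENRE-style prefix constraint."""
    allowed = set()
    for seq in valid_sequences:
        # A sequence contributes a candidate token if it strictly extends the prefix.
        if len(seq) > len(prefix) and list(seq[:len(prefix)]) == list(prefix):
            allowed.add(seq[len(prefix)])
    return allowed

# Two valid titles, tokenized into words (hypothetical example data)
titles = [("Toy", "Story"), ("Toy", "Train", "Set")]
allowed_next_tokens(("Toy",), titles)  # {"Story", "Train"}
```

In practice such a function is plugged into the decoder's token filter so that beam search can only produce titles from the item catalog.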
We follow the sliding-window data processing method used in BIGRec. In the paper, for LLM-based recommendation we sample 10,000 training instances (train_10000.csv) due to limited computational resources, while for LLM-enhanced recommendation and traditional recommendation models we use the full dataset (train.csv).
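The sliding-window construction can be sketched as follows (a minimal illustration of BIGRec-style preprocessing; the window size and data layout here are assumptions, not the repo's exact settings):

```python
def sliding_windows(seq, window=10):
    """Split one user's item sequence into (history, target) training pairs
    using a sliding window: each position's preceding items (up to `window`
    of them) form the input, and the item at that position is the target."""
    samples = []
    for end in range(1, len(seq)):
        history = seq[max(0, end - window):end]  # up to `window` previous items
        target = seq[end]                        # next item to predict
        samples.append((history, target))
    return samples

# Example: a 5-item interaction sequence with window size 3
pairs = sliding_windows(["i1", "i2", "i3", "i4", "i5"], window=3)
# pairs[0] == (["i1"], "i2"); pairs[-1] == (["i2", "i3", "i4"], "i5")
```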
```bash
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 accelerate launch train_lml.py --dataset_name Toy --sample 10000 --num_epochs 10 --tau 1
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 accelerate launch train_msl.py --dataset_name Toy --sample 10000 --num_epochs 10 --tau 4.5
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 accelerate launch train_msl_ats.py --dataset_name Toy --sample 10000 --num_epochs 10
```
Replace `lora_weights_path` with the path to the model checkpoint, for example `./save_lora_model/Toy/sample10000_epoch10_CT_tau4.5/0/checkpoint-79`. We test the checkpoint from each epoch and report the result with the highest NDCG@5.
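The checkpoint-selection criterion can be illustrated with a small NDCG@5 helper (a sketch; the repo's evaluation script may compute the metric differently). With a single relevant item per query, the ideal DCG is 1, so NDCG@k reduces to the discounted gain of the target's rank:

```python
import math

def ndcg_at_k(ranked_items, target, k=5):
    """NDCG@k for a single query with one relevant item: 1/log2(rank + 1)
    if the target appears in the top-k recommendations, else 0."""
    for rank, item in enumerate(ranked_items[:k], start=1):
        if item == target:
            return 1.0 / math.log2(rank + 1)
    return 0.0

# Target ranked 2nd among the top-5 recommendations
score = ndcg_at_k(["a", "b", "c", "d", "e"], target="b", k=5)
# score == 1 / log2(3) ≈ 0.6309
```

Checkpoint selection then amounts to averaging this score over the test set for each epoch's checkpoint and keeping the epoch with the highest mean.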
```bash
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 accelerate launch inference.py --lora_weights_path "xxx" --dataset Toy
```
Replace `lora_weights_father_path` with the path to the parent directory of the model checkpoint, for example `./save_lora_model/Toy/sample10000_epoch10_CT_tau4.5/0/`.
```bash
python evaluate_batch_match.py --lora_weights_father_path "xxx"
```
If you find the paper useful in your research, please consider citing:
```bibtex
@misc{wang2025msltokensneedtuning,
  title={MSL: Not All Tokens Are What You Need for Tuning LLM as a Recommender},
  author={Bohao Wang and Feng Liu and Jiawei Chen and Xingyu Lou and Changwang Zhang and Jun Wang and Yuegang Sun and Yan Feng and Chun Chen and Can Wang},
  year={2025},
  eprint={2504.04178},
  archivePrefix={arXiv},
  primaryClass={cs.IR},
  url={https://arxiv.org/abs/2504.04178},
}
```