
Large language models make sample-efficient recommender systems

  • Letter
  • Published in: Frontiers of Computer Science

Conclusion

This letter investigates the sample-efficiency property of recommender systems enhanced by large language models. We propose a simple yet effective framework (i.e., Laser) to validate the core viewpoint that large language models make sample-efficient recommender systems, from two aspects: (1) LLMs themselves are sample-efficient recommenders; and (2) LLMs make conventional recommender systems more sample-efficient. For future work, we aim to improve the sample efficiency of LLM-based recommender systems in the following two directions: (1) exploring effective strategies to select the few-shot training samples instead of sampling them uniformly, and (2) applying Laser to downstream applications such as code snippet recommendation.
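The first future-work direction contrasts uniform sampling of few-shot training examples with smarter selection strategies. As a minimal sketch of the uniform baseline, the helper below draws a fixed-size few-shot subset from an interaction log; the function name and the toy log are illustrative assumptions, not part of Laser:

```python
import random

def uniform_few_shot_sample(train_set, k, seed=0):
    """Uniformly sample k few-shot training examples without replacement.

    This is the uniform baseline the letter mentions; replacing it with a
    smarter selection strategy is left as future work.
    """
    rng = random.Random(seed)  # fixed seed for reproducible subsets
    return rng.sample(train_set, k)

# Hypothetical interaction log: (user_id, item_id, clicked) tuples.
train_set = [(u, i, (u + i) % 2) for u in range(100) for i in range(10)]
few_shot = uniform_few_shot_sample(train_set, k=64)
```

A selection strategy would replace `rng.sample` with, e.g., a diversity- or uncertainty-based criterion while keeping the same interface.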


References

  1. Zhang J, Bao K, Zhang Y, Wang W, Feng F, He X. Large language models for recommendation: progresses and future directions. In: Proceedings of the ACM on Web Conference 2024. 2024, 1268–1271


  2. Pan X, Wu L, Long F, Ma A. Exploiting user behavior learning for personalized trajectory recommendations. Frontiers of Computer Science, 2022, 16(3): 163610


  3. MindSpore, 2020


Acknowledgements

The Shanghai Jiao Tong University team was partially supported by the National Natural Science Foundation of China (Grant No. 62177033). Jianghao Lin is supported by the Wu Wen Jun Honorary Doctoral Scholarship. The work was sponsored by the Huawei Innovation Research Program. We thank MindSpore [3], a new deep learning computing framework, for the partial support of this work.

Author information


Corresponding author

Correspondence to Weinan Zhang.

Ethics declarations

Competing interests The authors declare that they have no competing interests or financial conflicts to disclose.



Cite this article

Lin, J., Dai, X., Shan, R. et al. Large language models make sample-efficient recommender systems. Front. Comput. Sci. 19, 194328 (2025). https://doi.org/10.1007/s11704-024-40039-z
