Abstract
If you have even a passing familiarity with the advances in machine learning and artificial intelligence since 2018, you have almost certainly noticed the tremendous strides made in natural language processing (NLP). Most of this progress can be attributed to large language models (LLMs), which are built on the transformer encoder-decoder architecture discussed in Chapter 2.
Copyright information
© 2022 The Author(s), under exclusive license to APress Media, LLC, part of Springer Nature
Cite this chapter
Jain, S.M. (2022). Hugging Face. In: Introduction to Transformers for NLP. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-8844-3_4
Publisher Name: Apress, Berkeley, CA
Print ISBN: 978-1-4842-8843-6
Online ISBN: 978-1-4842-8844-3
eBook Packages: Professional and Applied Computing; Apress Access Books; Professional and Applied Computing (R0)