Chatbots have gained a lot of popularity in recent years. As business interest in chatbots has grown, researchers have also made significant progress in conversational AI.
In this tutorial, we'll use the Hugging Face transformers library to employ the pre-trained DialoGPT model for conversational response generation.
DialoGPT is a large-scale, tunable neural conversational response generation model trained on 147M conversations extracted from Reddit. A major advantage is that you can fine-tune it on your own dataset to achieve better performance than training from scratch.
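To give you a feel for what we'll build, here is a minimal sketch of loading DialoGPT with transformers and generating a single reply. It assumes the `microsoft/DialoGPT-medium` checkpoint; the `small` and `large` variants work the same way:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# load the pre-trained DialoGPT tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# encode the user's message, appending the end-of-string token
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token, return_tensors="pt")

# generate a response (greedy decoding here, for simplicity)
output_ids = model.generate(input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)

# decode only the newly generated tokens, skipping the prompt
reply = tokenizer.decode(output_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)
```

Later in the tutorial we'll extend this with multi-turn chat history and better decoding strategies such as sampling and beam search.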
This tutorial focuses on generating conversational responses for chatbots rather than general text. If you want open-ended text generation, see this tutorial, where I show you how to use the GPT-2 and GPT-J models to generate impressive text.