An LLM-based NLP chatbot built with Flask, featuring custom context handling.
The chatbot performs explicit context management to improve the performance of models with small context windows, using strategies such as dynamic proportion-based allocation and summarization plus proportional allocation.
The application handles user and model interactions through a RESTful API.
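To make the context-management idea concrete, here is a minimal sketch of what a proportion-based budget split could look like. The function name, signature, and clamping logic are assumptions for illustration, not the project's actual implementation; only the proportion parameters come from the README's `config` options.

```python
def allocate_budget(context_size, user_tokens, max_tot_input_prop=0.75,
                    max_hist_input_prop=0.6, min_hist_input_prop=0.2):
    """Hypothetical 'dynamic proportion-based allocation' sketch.

    Splits the model's context window between conversation history and the
    current user input: history gets whatever the current input leaves over,
    clamped to the configured [min, max] proportion band.
    """
    # Only a fixed proportion of the context window is spent on input at all;
    # the rest is reserved for the model's generated reply.
    total_budget = int(context_size * max_tot_input_prop)

    # History receives the remainder after the current input is accommodated,
    # clamped between the minimum and maximum history proportions.
    hist_budget = total_budget - user_tokens
    hist_budget = max(int(total_budget * min_hist_input_prop),
                      min(hist_budget, int(total_budget * max_hist_input_prop)))

    input_budget = total_budget - hist_budget
    return hist_budget, input_budget


# Example: a 1024-token context leaves a 768-token input budget; a short
# 100-token message lets history use its full 60% share (460 tokens).
print(allocate_budget(1024, 100))  # (460, 308)
# A long 700-token message squeezes history down to its 20% floor (153 tokens).
print(allocate_budget(1024, 700))  # (153, 615)
```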
- Python 3.x
- Flask
- Other dependencies listed in `requirements.txt`
- Clone the repository:

  ```shell
  git clone https://github.com/yourusername/nlp-chatbot-flask.git
  cd nlp-chatbot-flask
  ```
- Create a virtual environment:

  ```shell
  python -m venv venv
  source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  ```
- Install dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- Run the Flask application:

  ```shell
  cd src
  python app.py
  ```

  or:

  ```shell
  flask run --host=0.0.0.0 --port=5100
  ```

  or, using Docker:

  ```shell
  docker-compose up --build
  ```
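Once the server is running, clients talk to it over the RESTful API. The sketch below shows the general shape of such an endpoint and exercises it with Flask's built-in test client; the route name (`/chat`), JSON payload, and echo-style reply are assumptions for illustration, not the project's actual API.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/chat", methods=["POST"])
def chat():
    # Hypothetical endpoint: the real app would run its context-allocation
    # logic and query the Hugging Face model here instead of echoing.
    user_message = request.get_json().get("message", "")
    return jsonify({"reply": f"You said: {user_message}"})

# Exercise the endpoint in-process, without starting a server.
client = app.test_client()
response = client.post("/chat", json={"message": "hello"})
print(response.get_json())  # {'reply': 'You said: hello'}
```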
- [Optional] Customize the chatbot behavior by updating the `config` dictionary:
  - `model_name`: Any Hugging Face conversational model name.
  - `max_tot_input_prop`: Upper ceiling on the total input (history + current input) as a proportion of the model's context window.
  - `max_hist_input_prop`: Maximum proportion of history tokens within the total input.
  - `min_hist_input_prop`: Minimum proportion of history tokens within the total input.
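A possible `config` might look like the following. The keys come from the options above; the model name and numeric values are illustrative assumptions, and the arithmetic shows how the proportions translate into token budgets for one context size.

```python
# Hypothetical config values; keys match the documented options,
# the chosen numbers and model are assumptions.
config = {
    "model_name": "facebook/blenderbot-400M-distill",  # any HF conversational model
    "max_tot_input_prop": 0.75,   # total input may use up to 75% of the context
    "max_hist_input_prop": 0.60,  # history fills at most 60% of the total input
    "min_hist_input_prop": 0.20,  # history is guaranteed at least 20% of it
}

# With a 1024-token context window, these proportions yield:
context_size = 1024
total_budget = int(context_size * config["max_tot_input_prop"])  # 768 tokens
max_history = int(total_budget * config["max_hist_input_prop"])  # 460 tokens
min_history = int(total_budget * config["min_hist_input_prop"])  # 153 tokens
print(total_budget, max_history, min_history)
```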
This project is licensed under the MIT License.