panic80/mmrag
Mattermost RAG Chatbot

A Retrieval-Augmented Generation (RAG) chatbot integrated with Mattermost, using LlamaIndex, Qdrant, OpenAI embeddings, and Anthropic's Claude for response generation.

Features

  • Mattermost Integration: Seamlessly integrates with Mattermost via slash commands and bot interactions
  • RAG Architecture: Uses state-of-the-art RAG techniques to provide context-aware responses
  • Incremental Updates: Supports both bulk ingestion and incremental updates of channel messages
  • Vector Search: Utilizes Qdrant for efficient similarity search of message embeddings
  • Modern Tech Stack:
    • OpenAI's text-embedding-3-large for embeddings
    • Anthropic's Claude for response generation
    • LlamaIndex for RAG orchestration
    • FastAPI for the web server
    • Docker for containerization
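At its core, the vector-search step ranks stored message embeddings by similarity to the query embedding. A minimal stdlib sketch of that ranking idea (the `cosine_similarity` and `retrieve` helpers and the in-memory corpus are illustrative only; the real project delegates this to Qdrant and LlamaIndex):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus, top_k=2):
    # corpus: list of (text, embedding) pairs; return the top_k most
    # similar texts, analogous to a vector-store similarity query.
    scored = sorted(
        corpus,
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [text for text, _ in scored[:top_k]]
```

The retrieved messages are then passed to Claude as context for answer generation.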

Prerequisites

  • Python 3.10+
  • Docker and Docker Compose
  • Mattermost server with bot account and access token
  • OpenAI API key
  • Anthropic API key

Environment Setup

  1. Clone the repository:
     git clone <repository-url>
     cd mmrag2
  2. Create and configure the .env file:
     cp .env.example .env
     # Edit .env with your configuration

Required environment variables:

  • MATTERMOST_URL: Your Mattermost server URL
  • MATTERMOST_BOT_TOKEN: Bot account access token
  • OPENAI_API_KEY: OpenAI API key for embeddings
  • ANTHROPIC_API_KEY: Anthropic API key for Claude
  • See .env file for additional configuration options
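One way to fail fast when a required variable is missing is to validate them at startup. A hedged sketch under that assumption (the `load_config` helper is illustrative, not part of this codebase):

```python
import os

REQUIRED = [
    "MATTERMOST_URL",
    "MATTERMOST_BOT_TOKEN",
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
]

def load_config(env=os.environ):
    # Collect required variables; raise one clear error listing all gaps.
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(
            "Missing required environment variables: " + ", ".join(missing)
        )
    return {k: env[k] for k in REQUIRED}
```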

Installation

Using Docker (Recommended)

  1. Build and start the containers:
     docker-compose up -d --build
  2. Check the logs:
     docker-compose logs -f

Local Development

  1. Create a virtual environment:
     python -m venv venv
     source venv/bin/activate  # On Windows: venv\Scripts\activate
  2. Install dependencies:
     pip install -r requirements.txt
  3. Run the development server:
     python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 5000

Usage

Slash Commands

The bot responds to the following slash commands in Mattermost:

  • /rag <query>: Ask a question to the RAG system
  • Example: /rag What was discussed about the project timeline?
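Mattermost delivers slash commands to the server as an `application/x-www-form-urlencoded` POST body containing fields such as `text` and `channel_id`. A minimal sketch of extracting what the bot needs (the `parse_slash_command` helper is illustrative, not the project's actual parser):

```python
from urllib.parse import parse_qs

def parse_slash_command(body: str) -> dict:
    # parse_qs returns lists of values; take the first of each field.
    params = parse_qs(body)
    return {
        "text": params.get("text", [""])[0],
        "channel_id": params.get("channel_id", [""])[0],
    }
```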

API Endpoints

  • POST /api/slash_command: Handles Mattermost slash commands
  • POST /api/ingest: Triggers message ingestion from a channel
  • GET /health: Health check endpoint
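A slash-command handler replies with a JSON payload whose `response_type` is either `ephemeral` (visible only to the caller) or `in_channel` (posted publicly). A small illustrative helper for building that payload (not part of this codebase):

```python
import json

def slash_response(text: str, in_channel: bool = False) -> str:
    # Build the JSON body Mattermost expects back from a slash command.
    payload = {
        "response_type": "in_channel" if in_channel else "ephemeral",
        "text": text,
    }
    return json.dumps(payload)
```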

Data Ingestion

  1. Initial bulk ingestion:
     curl -X POST "http://localhost:5000/api/ingest" \
          -H "Content-Type: application/x-www-form-urlencoded" \
          -d "channel_id=your_channel_id"
  2. Incremental update (since_time is an epoch timestamp in milliseconds):
     curl -X POST "http://localhost:5000/api/ingest" \
          -H "Content-Type: application/x-www-form-urlencoded" \
          -d "channel_id=your_channel_id&since_time=1234567890000"
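An incremental update only needs posts newer than `since_time`. Since Mattermost post timestamps (`create_at`) are epoch milliseconds, the filtering step can reduce to a simple comparison; the `filter_new_messages` helper below is an illustrative sketch, not the project's actual ingestion code:

```python
def filter_new_messages(messages, since_time_ms):
    # messages: list of dicts with a "create_at" epoch-milliseconds field.
    # Keep only posts created strictly after the last ingestion time.
    return [m for m in messages if m["create_at"] > since_time_ms]
```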

Development

Project Structure

mmrag2/
├── app/
│   ├── __init__.py
│   ├── main.py           # FastAPI application
│   ├── rag_engine.py     # RAG implementation
│   ├── processing.py     # Text processing and chunking
│   ├── mattermost_client.py  # Mattermost API client
│   └── utils.py          # Utility functions
├── tests/
│   ├── __init__.py
│   ├── test_api.py
│   ├── test_processing.py
│   ├── test_rag_engine.py
│   └── test_mattermost_client.py
├── scripts/              # Utility scripts
├── docker-compose.yml    # Docker Compose configuration
├── Dockerfile           # Docker build configuration
├── requirements.txt     # Python dependencies
└── README.md           # This file

Running Tests

pytest

Configuration

The application is configured through environment variables. See the .env file for available options.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

License

[Your License Here]

Acknowledgments

  • LlamaIndex for the RAG framework
  • Qdrant for the vector database
  • OpenAI for embeddings
  • Anthropic for Claude
  • Mattermost for the chat platform
