LLM Log Pipeline πŸš€


Welcome to the LLM Log Pipeline repository! This project is a containerized Go service that analyzes backend logs using large language models (LLMs) such as LLaMA 3.1 Instruct. It integrates with RabbitMQ for message queuing and PostgreSQL for structured data storage. This README guides you through setup, usage, and contribution.

Table of Contents

  1. Features
  2. Technologies Used
  3. Getting Started
  4. Usage
  5. Configuration
  6. Contributing
  7. License
  8. Contact
  9. Releases

Features

  • Containerized Service: Easily deployable using Docker.
  • LLM Integration: Leverages state-of-the-art LLMs for log analysis.
  • Asynchronous Processing: Uses RabbitMQ for efficient message handling.
  • Data Storage: Stores structured insights in PostgreSQL.
  • Microservices Architecture: Supports scalable and maintainable applications.

Technologies Used

  • Go (Golang): The primary programming language for building the service.
  • Docker: Containerization for easy deployment.
  • Docker Compose: Simplifies multi-container Docker applications.
  • RabbitMQ: Message broker for handling log messages.
  • PostgreSQL: Database for structured data storage.
  • LLMs: Models such as LLaMA 3.1 Instruct for log analysis.

Getting Started

To get started with the LLM Log Pipeline, follow these steps:

Prerequisites

Before you begin, ensure you have the following installed:

  • Docker
  • Docker Compose
  • Go (version 1.16 or higher)

Installation

  1. Clone the repository:

    git clone https://github.com/Xnk0314/llm_log_pipeline.git
    cd llm_log_pipeline
  2. Build the Docker images:

    docker-compose build
  3. Run the services:

    docker-compose up

This will start all the necessary services, including RabbitMQ and PostgreSQL.

Usage

Once the services are running, you can start sending logs to the RabbitMQ queue. The LLM Log Pipeline will process these logs and store structured insights in PostgreSQL.

  1. Send logs to RabbitMQ: Use a simple script or tool to send logs to the RabbitMQ queue.
  2. View insights in PostgreSQL: Connect to the PostgreSQL database to view the structured insights generated by the LLM.

Configuration

The configuration files are located in the config directory. You can modify the following settings:

  • RabbitMQ settings: Change the connection details for RabbitMQ.
  • PostgreSQL settings: Update the database connection parameters.
  • LLM settings: Adjust the model and parameters used for log analysis.

Contributing

We welcome contributions to the LLM Log Pipeline! If you want to contribute, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes and commit them.
  4. Push your branch to your forked repository.
  5. Open a pull request with a clear description of your changes.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Contact

For any questions or feedback, please reach out to the repository maintainer at [your-email@example.com].

Releases

For the latest releases, please visit the Releases section. Download the published builds there to get the latest features and fixes.


Feel free to explore and contribute to the LLM Log Pipeline. Your feedback and contributions are valuable in making this project better!
