LLMProxy is an intelligent large language model backend routing proxy service.
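The core idea behind a routing proxy like this is to inspect each incoming request and forward it to an appropriate upstream endpoint. Below is a minimal Python sketch with a hypothetical prefix-to-backend mapping; LLMProxy itself is a C# service, so its actual routing logic will differ.

```python
# Minimal backend-routing sketch: pick an upstream LLM endpoint
# based on the requested model name and forward the chat request.
import requests

# Hypothetical mapping from model-name prefix to upstream base URL.
BACKENDS = {
    "gpt-": "https://api.openai.com/v1",
    "llama-": "https://api.groq.com/openai/v1",
}

def route(model: str) -> str:
    for prefix, base_url in BACKENDS.items():
        if model.startswith(prefix):
            return base_url
    raise ValueError(f"No backend configured for model {model!r}")

def chat(model: str, prompt: str, api_key: str) -> str:
    resp = requests.post(
        f"{route(model)}/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```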
This Streamlit application showcases the Mixture of Agents (MoA) architecture proposed by Together AI, powered by Groq LLMs. It allows users to interact with a configurable multi-agent system for enhanced AI-driven conversations.
A simplified agentic workflow based on the Mixture of Agents (MoA) approach for Large Language Models (LLMs).
A mixture-of-agents implementation that lets you combine responses from different AI models.
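The pattern these projects share is the same: several "proposer" models answer the same prompt independently, and an "aggregator" model synthesizes their drafts into one reply. Here is a minimal sketch, assuming the OpenAI Python SDK and placeholder model names; any OpenAI-compatible endpoint, including Groq's, works the same way.

```python
# Minimal mixture-of-agents sketch: fan a prompt out to several
# "proposer" models, then ask an "aggregator" model to synthesize
# a single answer from their drafts.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROPOSERS = ["model-a", "model-b", "model-c"]  # placeholder model names
AGGREGATOR = "model-agg"                       # placeholder aggregator model

def ask(model: str, prompt: str) -> str:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def mixture_of_agents(prompt: str) -> str:
    # Layer 1: collect independent draft answers from each proposer.
    drafts = [ask(m, prompt) for m in PROPOSERS]
    # Layer 2: the aggregator sees all drafts and writes the final reply.
    numbered = "\n\n".join(f"Draft {i + 1}:\n{d}" for i, d in enumerate(drafts))
    synthesis_prompt = (
        "You are given several draft answers to the same question. "
        "Combine their strengths into one accurate, concise answer.\n\n"
        f"Question: {prompt}\n\n{numbered}"
    )
    return ask(AGGREGATOR, synthesis_prompt)
```

Production MoA systems typically stack more than two layers, feeding each layer's outputs back in as reference drafts for the next, but the propose-then-aggregate step above is the basic building block.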
This is the official repository for the paper "This is your Doge: Exploring Deception and Robustness in Mixture-of-LLMs".
An intelligent document Q&A system leveraging LangChain, Retrieval-Augmented Generation (RAG), and a custom "Mixture of Idiots Agents" (MoIA) approach with OpenAI models. Ask questions about your documents and receive intelligently synthesized answers.
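The RAG half of that pipeline is worth spelling out: embed the document chunks, retrieve the ones most similar to the question, and answer from that context. A minimal sketch follows, assuming the OpenAI Python SDK and placeholder model names; the repo itself uses LangChain, which wraps these same steps behind its own abstractions.

```python
# Minimal RAG sketch: embed document chunks, retrieve the most
# similar chunks for a question, and answer from that context.
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def answer(question: str, chunks: list[str], k: int = 3) -> str:
    chunk_vecs = embed(chunks)
    q_vec = embed([question])[0]
    # Cosine similarity between the question and every chunk.
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec)
    )
    # Keep the k most similar chunks as context for the LLM.
    context = "\n\n".join(chunks[i] for i in np.argsort(sims)[-k:])
    resp = client.chat.completions.create(
        model="chat-model",  # placeholder model name
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```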