A Model Context Protocol (MCP) server that implements a "wisdom of crowds" approach to AI reasoning by consulting multiple state-of-the-art language models in parallel and synthesizing their responses.
```bash
# Run directly with npx (no installation needed)
npx mcp-cognition-wheel

# Or install globally
npm install -g mcp-cognition-wheel
mcp-cognition-wheel
```
- Clone the repository
- Install dependencies:

  ```bash
  pnpm install
  ```

- Copy `.env.example` to `.env` and add your API keys
- Build the project:

  ```bash
  pnpm run build
  ```
The Cognition Wheel follows a three-phase process:

1. Parallel Consultation: Simultaneously queries three different AI models:
   - Claude-4-Opus (Anthropic)
   - Gemini-2.5-Pro (Google)
   - o3 (OpenAI)
2. Anonymous Analysis: Uses code names (Alpha, Beta, Gamma) to eliminate bias during the synthesis phase
3. Smart Synthesis: Randomly selects one of the models to act as a synthesizer, which analyzes all responses and produces a final, comprehensive answer
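The three phases above can be sketched in TypeScript. This is an illustrative sketch only, with stubbed model calls in place of real API clients; the names `ModelCall` and `cognitionWheel` are hypothetical and not taken from the server's actual source:

```typescript
// Hypothetical sketch of the three-phase flow; stubs stand in for real API clients.
type ModelCall = (prompt: string) => Promise<string>;

// Code names (Alpha, Beta, Gamma) hide vendor identity during synthesis.
const models: Record<string, ModelCall> = {
  Alpha: async (p) => `Alpha answers: ${p}`,
  Beta: async (p) => `Beta answers: ${p}`,
  Gamma: async (p) => `Gamma answers: ${p}`,
};

async function cognitionWheel(question: string): Promise<string> {
  const names = Object.keys(models);

  // Phase 1: parallel consultation; allSettled tolerates individual model failures.
  const settled = await Promise.allSettled(names.map((n) => models[n](question)));

  // Phase 2: anonymous analysis — collect surviving answers under code names only.
  const answers: string[] = [];
  settled.forEach((result, i) => {
    if (result.status === "fulfilled") answers.push(`[${names[i]}] ${result.value}`);
  });

  // Phase 3: smart synthesis — a randomly chosen model merges all the answers.
  const synthesizer = names[Math.floor(Math.random() * names.length)];
  return models[synthesizer](`Synthesize these answers:\n${answers.join("\n")}`);
}
```

Because the consultation uses `Promise.allSettled` rather than `Promise.all`, one failing model does not abort the run; the synthesizer simply sees fewer answers.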
- Parallel Processing: All models are queried simultaneously for faster results
- Bias Reduction: Anonymous code names prevent synthesizer bias toward specific models
- Internet Search: Optional web search capabilities for all models
- Detailed Logging: Comprehensive debug logs for transparency and troubleshooting
- Robust Error Handling: Graceful degradation when individual models fail
This is an MCP server designed to be used with MCP-compatible clients like Claude Desktop or other MCP tools.
The server requires the following environment variables:

- `ANTHROPIC_API_KEY`: Your Anthropic API key
- `GOOGLE_GENERATIVE_AI_API_KEY`: Your Google AI API key
- `OPENAI_API_KEY`: Your OpenAI API key
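Assuming `.env.example` mirrors these three variables, a filled-in `.env` would look like (placeholder values shown):

```
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_key
OPENAI_API_KEY=your_openai_key
```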
Based on the guide from this dev.to article, here's how to integrate with Cursor:
- Open Cursor Settings:
  - Go to Settings → MCP
  - Click "Add new MCP server"
- Configure the server:
  - Name: `cognition-wheel`
  - Command: `npx`
  - Args: `["-y", "mcp-cognition-wheel"]`

Example configuration:

```json
{
  "cognition-wheel": {
    "command": "npx",
    "args": ["-y", "mcp-cognition-wheel"],
    "env": {
      "ANTHROPIC_API_KEY": "your_anthropic_key",
      "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_key",
      "OPENAI_API_KEY": "your_openai_key"
    }
  }
}
```
Alternatively, to run from a local build instead of npx:

- Build the project (if not already done):

  ```bash
  pnpm run build
  ```

- Configure the server:
  - Name: `cognition-wheel`
  - Command: `node`
  - Args: `["/absolute/path/to/your/cognition-wheel/dist/app.js"]`

Example configuration:

```json
{
  "cognition-wheel": {
    "command": "node",
    "args": ["/Users/yourname/path/to/cognition-wheel/dist/app.js"],
    "env": {
      "ANTHROPIC_API_KEY": "your_anthropic_key",
      "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_key",
      "OPENAI_API_KEY": "your_openai_key"
    }
  }
}
```
- Test the integration:
  - Enter Agent mode in Cursor
  - Ask a complex question that would benefit from multiple AI perspectives
  - The `cognition_wheel` tool should be triggered automatically
The server provides a single tool called `cognition_wheel` with the following parameters:

- `context`: Background information and context for the problem
- `question`: The specific question you want answered
- `enable_internet_search`: Boolean flag to enable web search capabilities
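For reference, an MCP client passes these parameters in the `arguments` field of a `tools/call` request. The values below are purely illustrative:

```json
{
  "name": "cognition_wheel",
  "arguments": {
    "context": "We are debugging a race condition in a Node.js worker pool.",
    "question": "What are the most likely causes and how should we fix them?",
    "enable_internet_search": false
  }
}
```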
Development scripts:

- `pnpm run dev`: Watch mode for development
- `pnpm run build`: Build the TypeScript code
- `pnpm run start`: Run the server directly with tsx
Build and run with Docker:

```bash
# Build the image
docker build -t cognition-wheel .

# Run with environment variables
docker run --rm \
  -e ANTHROPIC_API_KEY=your_key \
  -e GOOGLE_GENERATIVE_AI_API_KEY=your_key \
  -e OPENAI_API_KEY=your_key \
  cognition-wheel
```
MIT License