
Cobolt

Cobolt Logo

License Platform Version Downloads Build Status Release Status

📥 Download Latest Release

macOS Download Windows Download

This is an early release that is expected to be unstable and change significantly over time. For other platforms and previous versions, visit our Releases page.

🎯 Overview

Cobolt is a cross-platform desktop application that enables you to get answers and perform actions on the data that matters to you. Cobolt only stores data on your device and uses locally running AI models. Cobolt can also remember important details about you and use them to give you personalized responses. And yes, your memories are stored on your device too. You can connect to your favourite tools and data sources using the Model Context Protocol (MCP).

Feel like every query to a big tech AI is an automatic, non-consensual donation to their 'Make Our AI Smarter' fund, with zero transparency on how your 'donation' is used on some distant server farm? 💸🤷

We believe that the AI assistants of the future will run on your device and will not send your data or queries to tech companies to be used for profit. Small language models are closing the gap with their larger counterparts, and our devices are becoming more powerful. Cobolt is our effort to bring us closer to that future.

Cobolt enables you to get answers based on your data, with a model of your choosing.

Cobolt Home Screen Cobolt Chat Interface
Cobolt Menu Cobolt Integrations

Key Differentiators

  • Local Models: Ensures that your data never leaves your device. We are powered by Ollama, which enables you to use the open-source model of your choosing.
  • Model Context Protocol Integration: Enables you to connect to the data sources or tools that matter the most to you using MCP. This gives your model access to relevant tools and data, providing more useful, context-aware responses.
  • Native Memory Support: Cobolt remembers the most important things about you and uses them to give you more relevant responses.

Getting Started

We use Homebrew to install dependencies. If you are on macOS, ensure that the Homebrew package manager is installed by running:

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

How to?

How to change the model?

By default, we use llama3.1:8b for inference and nomic-embed-text for embedding.

You can use any Ollama model that supports tool calls (listed here). To download a new model for inference, install it with Ollama:

ollama ls # to view models
ollama pull qwen3:8b # to download qwen3:8b

The downloaded model can be selected from the settings section on the app.

Note: If you want additional customization, you can update the models used for tool calls, inference, and embedding individually:

On macOS: Edit ~/Library/Application Support/cobolt/config.json

On Windows: Edit %APPDATA%\cobolt\config.json

After editing, restart Cobolt for changes to take effect.
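As a rough illustration, the config file might look like the sketch below. The field names shown (chatModel, toolCallingModel, embeddingModel) are assumptions rather than the guaranteed schema; open the config.json on your machine to see the actual keys before editing.

{
  "chatModel": "qwen3:8b",
  "toolCallingModel": "qwen3:8b",
  "embeddingModel": "nomic-embed-text"
}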

How to add new integrations?

You can find the most useful MCP-backed integrations here. Add new MCP servers through the application's integrations section. The application will direct you to a JSON file where you can add your MCP server. We use the same format as Claude Desktop to make it easier for you to add new servers.

Some integrations that we recommend for new users are available at sample-mcp-server.json.

Restart the application or reload the integrations after you have added the required servers.
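For reference, a Claude Desktop-style MCP configuration looks like the sketch below. The filesystem server and the path shown are only examples (the @modelcontextprotocol/server-filesystem package is a commonly used reference server); replace them with the servers you actually want to connect.

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"]
    }
  }
}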

🤝 Contributing

Contributions are welcome! Whether it's reporting a bug, suggesting a feature, or submitting a pull request, your help is appreciated.

Please read our Contributing Guidelines for details on how to setup your development environment and contribute to Cobolt.


📄 License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.

Acknowledgements

Cobolt builds upon several amazing open-source projects and technologies:

  • Ollama - The powerful framework for running large language models locally
  • Model Context Protocol - The protocol specification by Anthropic for model context management
  • Mem0 - The memory management system that inspired our implementation
  • Electron - The framework that powers our cross-platform desktop application

We're grateful to all the contributors and maintainers of these projects for their incredible work.


Built with ❤️ by the Cobolt team
