VSCode Ollama is a powerful Visual Studio Code extension that seamlessly integrates Ollama's local LLM capabilities into your development environment.
🤖 Local LLM Support
- Local model execution powered by Ollama
- Support for switching between multiple models
- Low-latency responses
🔍 Web Search
- Real-time web information integration
- Smart search results synthesis
- Accurate information citation
💡 Intelligent Chat
- Streaming response output
- Thought process visualization
- Chat history preservation
⚙️ Flexible Configuration
- Custom server address
- Adjustable performance modes
- Model parameter configuration
Install Ollama
```bash
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh
```
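After installing, you will typically want to pull at least one model and confirm the server is running. A minimal sketch; `llama3.2` is only an example model name, substitute any model from the Ollama library:

```bash
# Start the Ollama server if it is not already running as a background service
ollama serve &

# Pull an example model (llama3.2 is an assumption; use any model you prefer)
ollama pull llama3.2

# Confirm the server responds on its default port and lists the pulled model
curl http://localhost:11434/api/tags
```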
Install Extension
- Open Extensions in VS Code
- Search for "VSCode Ollama"
- Click Install
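Alternatively, the VS Code CLI can install extensions by identifier. The identifier below is a placeholder; check the extension's Marketplace page for the actual publisher and ID:

```bash
# Install from the command line (extension ID is hypothetical; verify on the Marketplace)
code --install-extension <publisher>.vscode-ollama
```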
Configure Extension
- Open Command Palette (Ctrl+Shift+P / Cmd+Shift+P)
- Type "Ollama: Settings"
- Configure server address and default model
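If you point the extension at a non-default or remote server, it can help to confirm that address is reachable before saving the setting. The host and port below are examples only:

```bash
# Replace host/port with the Ollama server address you plan to configure
curl http://192.168.1.100:11434/api/tags
```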
Start Using
- Use the "Ollama: Open Chat" command to start a conversation
- Select a model in the chat interface
- Toggle web search on or off as needed
- Send a message to interact (see the API sketch below)
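For reference, chat responses stream from the local Ollama server's chat API, the same backend the extension talks to. You can reproduce the streaming behavior outside the extension with a direct request (the model name is an example):

```bash
# Stream a chat completion straight from the local Ollama server
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{ "role": "user", "content": "Hello!" }],
  "stream": true
}'
```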
- `Ollama: Open Chat` - Open the chat interface
- `Ollama: Settings` - Open the settings page
- `Shift + Enter` - New line in chat input
- `Enter` - Send message
If you find this extension helpful, you can support the developer by:
- ⭐ Star the GitHub repository
- 📝 Submit issues or feedback
- 🚀 Contribute to the codebase
- 💬 Share with your friends

💰 Donation Methods
- WeChat Pay
- Alipay
- Bitcoin (Native Segwit): bc1qskds324wteq5kfmxh63g624htzwd34gky0f0q5
- Bitcoin (Taproot): bc1pk0zud9csztjrkqew54v2nv7g3kq0xc2n80jatkmz9axkve4trfcqp0aksf
- Ethereum: 0xB0DA3bbC5e9f8C4b4A12d493A72c33dBDf1A9803
- Solana: AMvPLymJm4TZZgvrYU7DCVn4uuzh6gfJiHWNK35gmUzd

Your support helps maintain and improve this extension! Thank you! ❤️
See CHANGELOG.md for release notes.
This extension is licensed under the MIT License.