
VSCode Ollama Extension

VSCode Ollama Logo


English | 中文

VSCode Ollama is a powerful Visual Studio Code extension that seamlessly integrates Ollama's local LLM capabilities into your development environment.

✨ Features

  • 🤖 Local LLM Support

    • Local model execution based on Ollama
    • Multiple model switching support
    • Low-latency responses
  • 🔍 Web Search

    • Real-time web information integration
    • Smart search results synthesis
    • Accurate information citation
  • 💡 Intelligent Chat

    • Streaming response output
    • Thought process visualization
    • Chat history preservation
  • ⚙️ Flexible Configuration

    • Custom server address
    • Adjustable performance modes
    • Model parameter configuration
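
These configuration options presumably sit on top of the standard Ollama HTTP API, so the kind of model parameters involved can be illustrated with a direct request to a local server. A minimal sketch, assuming the default address http://localhost:11434; the model name and option values below are placeholders, not settings the extension is known to use:

    # Hypothetical example: one-off request with explicit model parameters
    curl http://localhost:11434/api/generate -d '{
      "model": "llama3.2",
      "prompt": "Explain what a mutex is in one sentence.",
      "stream": false,
      "options": { "temperature": 0.2, "num_ctx": 4096 }
    }'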

🚀 Quick Start

📺 Tutorial

  1. Install Ollama

    # macOS
    brew install ollama
    
    # Linux
    curl -fsSL https://ollama.com/install.sh | sh
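    
    # Optional, not part of the official steps: start the server and pull at
    # least one model so the extension has something to run (the model name
    # below is only an example)
    ollama serve          # skip if Ollama is already running as a service
    ollama pull llama3.2
    ollama list           # confirm the model is available locally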
  2. Install Extension

    • Open Extensions in VS Code
    • Search for "VSCode Ollama"
    • Click Install
  3. Configure Extension

    • Open Command Palette (Ctrl+Shift+P / Cmd+Shift+P)
    • Type "Ollama: Settings"
    • Configure server address and default model

VSCode Ollama Settings Interface
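
If the chat later reports connection problems, it can help to verify the configured server address outside VS Code first. Assuming the default local address, the following should return a JSON list of the models Ollama can serve:

    # Lists the models available on the configured Ollama server
    curl http://localhost:11434/api/tags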

  4. Start Using

    • Run the command "Ollama: Open Chat" to start a conversation
    • Select a model in the chat interface
    • Toggle web search on or off
    • Send a message to interact

    VSCode Ollama Chat Interface
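
Under the hood the chat talks to your Ollama server, so a conversation can also be reproduced against the API directly when debugging model or server problems. A minimal sketch, assuming the default address and an example model name; the response streams back as newline-delimited JSON chunks:

    curl http://localhost:11434/api/chat -d '{
      "model": "llama3.2",
      "messages": [{ "role": "user", "content": "Hello from the terminal" }]
    }'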

📝 Usage

Commands

  • Ollama: Open Chat - Open chat interface
  • Ollama: Settings - Open settings page

Shortcuts

  • Shift + Enter - New line in chat input
  • Enter - Send message

❤️ Support & Donation

If you find this extension helpful, you can support the developer by:

💰 Donation Methods

Support the developer via:

  • WeChat Pay
  • Alipay

🪙 Cryptocurrency

  • Bitcoin (Native SegWit): bc1qskds324wteq5kfmxh63g624htzwd34gky0f0q5
  • Bitcoin (Taproot): bc1pk0zud9csztjrkqew54v2nv7g3kq0xc2n80jatkmz9axkve4trfcqp0aksf
  • Ethereum: 0xB0DA3bbC5e9f8C4b4A12d493A72c33dBDf1A9803
  • Solana: AMvPLymJm4TZZgvrYU7DCVn4uuzh6gfJiHWNK35gmUzd

Your support helps maintain and improve this extension! Thank you! ❤️

You can also:

  • ⭐ Star the GitHub repository
  • 📝 Submit issues or feedback
  • 🚀 Contribute to the codebase
  • 💬 Share with your friends

📝 Release Notes

See CHANGELOG.md for release notes.

📝 License

This extension is licensed under the MIT License.

Star History

Star History Chart
