🤖 Tiny Ollama Chat


A lightweight and efficient UI for interacting with Ollama models locally. This application provides a simple yet powerful interface for chatting with AI models through Ollama.

✨ Features

  • 📱 Real-time message streaming
  • 🧠 View AI thinking process
  • 💬 Conversation history
  • 🚀 Multiple model support
  • 🔗 Custom Ollama URL configuration
  • 💾 Persistent storage with SQLite

📸 Screenshots & Demo

[Screenshot: Tiny Ollama Chat interface]

[Demo: Tiny Ollama Chat]

🚦 Prerequisites

  • Ollama running locally or on a network-accessible machine
  • Docker (for container method)
  • Go and Node.js (for local build method)

⚙️ Ollama Configuration

By default, Ollama only listens on localhost (127.0.0.1), which makes it inaccessible from Docker containers. To allow connections from containers or other machines, you need to configure Ollama to listen on all interfaces:

OLLAMA_HOST=0.0.0.0:11434 ollama serve

Or add it to your .bashrc or .zshrc so the setting persists across sessions:

export OLLAMA_HOST=0.0.0.0:11434

This makes Ollama accessible from other machines and containers by binding to all network interfaces instead of just localhost.
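
On Linux, if Ollama was installed via the official install script it usually runs as a systemd service, so the export above won't reach it. A sketch of the systemd route, assuming the standard ollama unit name:

sudo systemctl edit ollama
# in the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl restart ollama

You can then confirm Ollama is reachable from another machine with a quick API call (replace <host-ip> with the host's address):

curl http://<host-ip>:11434/api/tags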

🐳 Docker

Option 1: Pull from GitHub Container Registry

The easiest way to get started is to pull the pre-built image from the GitHub Container Registry:

docker run -p 8080:8080 -v chat-data:/app/data ghcr.io/anishgowda21/tiny-ollama-chat:latest
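
If you want the container to run in the background and come back up after restarts, a detached variant looks like this (the container name here is just an example):

docker run -d --name tiny-ollama-chat --restart unless-stopped \
  -p 8080:8080 \
  -v chat-data:/app/data \
  ghcr.io/anishgowda21/tiny-ollama-chat:latest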

Option 2: Build Docker Image Locally

Alternatively, you can build the Docker image locally:

docker build -t tiny-ollama-chat .

Running the Docker Container

Run the container with:

docker run -p 8080:8080 -v chat-data:/app/data tiny-ollama-chat
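
Once the container is up, open http://localhost:8080 in your browser (or whichever host port you mapped with -p).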

Environment Variables

The Docker container supports configuration through the following environment variables:

  • PORT: Port the server listens on inside the container (default: 8080)
  • OLLAMA_URL: URL of the Ollama API (default: http://host.docker.internal:11434)
  • DB_PATH: Path to the SQLite database file; keep it under /app/data so it lands on the mounted volume

Example with custom settings:

docker run -p 9000:9000 \
  -e PORT=9000 \
  -e OLLAMA_URL=http://host.docker.internal:11434 \
  -e DB_PATH=/app/data/custom.db \
  -v chat-data:/app/data \
  tiny-ollama-chat
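
Note that host.docker.internal is not defined by default on Linux. On reasonably recent Docker versions you can map it to the host's gateway yourself so the default OLLAMA_URL works unchanged; a sketch:

docker run -p 8080:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v chat-data:/app/data \
  tiny-ollama-chat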

Connecting to Ollama

Options for connecting the Docker container to Ollama:

  1. Use the Docker host's IP address:

    • On Linux: -e OLLAMA_URL=http://172.17.0.1:11434 (Docker's default bridge gateway)
    • On macOS/Windows: -e OLLAMA_URL=http://host.docker.internal:11434 (this is the default, so you don't need to pass it)
  2. Use the host network:

    docker run --network=host tiny-ollama-chat
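
With host networking the container shares the host's network stack, so -p is not needed and Ollama is reachable on localhost. A fuller sketch of that variant (host networking behaves this way on Linux; Docker Desktop handles it differently):

docker run --network=host \
  -e OLLAMA_URL=http://localhost:11434 \
  -v chat-data:/app/data \
  tiny-ollama-chat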

🏃‍♂️ Building Locally

If you prefer to build and run the application directly:

Using the Build Script

The repository includes a build script that handles the entire build process:

# Make the script executable
chmod +x buildlocal.sh

# Run the build script
./buildlocal.sh

This script:

  1. Creates a build directory
  2. Builds the client with npm
  3. Builds the server with Go
  4. Places everything in the build directory
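
If you would rather run the steps by hand, the script boils down to roughly the following. The client directory name and output paths are assumptions, not the script's exact contents, so treat buildlocal.sh as the source of truth:

# assumed layout; check buildlocal.sh for the exact paths
mkdir -p build
(cd client && npm install && npm run build)   # build the frontend
go build -o build/tiny-ollama-chat .          # build the Go server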

Running the Application

After building:

cd build
./tiny-ollama-chat

Command Line Options

The server supports several command line flags:

  • -port=8080: Set the port for the server to listen on (default: 8080)
  • -ollama-url=http://localhost:11434: Set the URL for the Ollama API (default: http://localhost:11434)
  • -db-path=chat.db: Set the path to the SQLite database file (default: chat.db)

Example with custom settings:

./tiny-ollama-chat -port=9000 -ollama-url=http://192.168.1.100:11434 -db-path=/path/to/database.db

💡 Troubleshooting

Ollama Connection Issues

If the application cannot connect to Ollama:

  1. Verify Ollama is running: ps aux | grep ollama
  2. Check that the Ollama URL is correct in your configuration
  3. Ensure network connectivity between the container and Ollama
  4. If using Docker, make sure you've configured Ollama to be accessible as described above
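
A quick way to cover checks 1-3 is to hit Ollama's API directly; /api/tags lists the installed models and should return JSON when the connection works:

# from the Docker host (use whatever URL you pass as OLLAMA_URL)
curl http://localhost:11434/api/tags

# from inside the container, if its image includes wget (replace <container-id>)
docker exec <container-id> wget -qO- http://host.docker.internal:11434/api/tags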

📖 Usage

  1. Open the application in your browser
  2. Select a model from the sidebar to start a new conversation
  3. Type your message and press Enter or click the send button
  4. Browse previous conversations in the sidebar
