A wrapper around the GitHub Copilot API that exposes an OpenAI-compatible interface, so it can be used with other tools.
Demo video: copilot-api-demo.mp4
- Bun (>= 1.2.x)
- GitHub account with Copilot Individual subscription
To install dependencies, run:

```sh
bun install
```
You can run the project directly using npx:

```sh
npx copilot-api@latest
```

With options:

```sh
npx copilot-api --port 8080
```
The project can be run from source in several ways:
```sh
bun run dev
```

Starts the server with hot reloading enabled, which automatically restarts it when code changes are detected. This is ideal for development.
```sh
bun run start
```

Runs the server in production mode with hot reloading disabled. Use this for deployment or production environments.
The server accepts several command line options:
| Option | Description | Default |
| --- | --- | --- |
| `--port`, `-p` | Port to listen on | 4141 |
| `--verbose`, `-v` | Enable verbose logging | false |
| `--log-file` | File to log request/response details | - |
Note: The `--help`, `-h` option is automatically available through the underlying command-line framework.
Example with options:

```sh
bun run start --port 8080 --verbose
```
In all cases, the server will start and listen for API requests on the specified port.
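Once the server is up, you can sanity-check it with a plain `curl` request. This is a hypothetical example: the port matches the default listed above, the endpoint path follows the OpenAI-compatible `/chat/completions` convention mentioned later, and the model name `gpt-4o` is only a placeholder — substitute whatever the `/models` endpoint actually reports.

```shell
# Send a minimal chat completion request to the local server.
# Assumes the server is running on the default port 4141; the model
# name below is a placeholder, not a guaranteed identifier.
curl http://localhost:4141/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

If the request succeeds, the response follows the standard OpenAI chat completion JSON shape.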
| Tool | Status | Notes |
| --- | --- | --- |
| Aider | Full | Fully compatible |
| bolt.diy | Full | Fully compatible; use any random API key in the UI if models fail to load |
| Page Assist | Full | Fully compatible |
| Kobold AI Lite | Full | Fully compatible |
Note: In general, any application that uses the standard OpenAI-compatible `/chat/completions` and `/models` endpoints should work with this API.
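For clients built on the official OpenAI SDKs, pointing them at this server usually only requires changing the base URL. The sketch below assumes the default port 4141; `OPENAI_BASE_URL` and `OPENAI_API_KEY` are the environment variables honored by the official OpenAI SDKs — other tools may use a different variable or a settings field instead.

```shell
# Point an OpenAI-compatible client at the local copilot-api server.
export OPENAI_BASE_URL="http://localhost:4141"

# Many clients insist on an API key being set even though this proxy
# does not validate it; any placeholder value works.
export OPENAI_API_KEY="dummy"

# List the models the server exposes:
curl "$OPENAI_BASE_URL/models"
```

Tools without SDK-level configuration (such as the ones in the table above) typically offer an equivalent "base URL" or "API endpoint" field in their UI.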