Add support for `OPENAI_BASE_URL` env for self-hosted models using OpenAI compatible endpoint · Issue #20 · ymichael/open-codex
I'm hosting Llama on an OpenAI-compatible endpoint using vLLM. open-codex uses the default OpenAI endpoint for the `openai` provider. Adding support for an `OPENAI_BASE_URL` env variable will enable open-codex to be used with hosted models. Upstream already supports this: https://github.com/openai/codex/blob/main/codex-cli/src/utils/config.ts#L37
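A minimal sketch of what this could look like on the open-codex side, assuming the `openai` provider is backed by the official `openai` Node SDK (which accepts a `baseURL` option in its constructor). The helper names below are hypothetical; only the `OPENAI_BASE_URL` variable itself is what the issue asks for:

```ts
import OpenAI from "openai";

// Hypothetical helper: read the override from the environment and fall back
// to the SDK default (https://api.openai.com/v1) when it is unset or blank.
function getBaseUrl(): string | undefined {
  const baseUrl = process.env["OPENAI_BASE_URL"];
  return baseUrl && baseUrl.trim() !== "" ? baseUrl : undefined;
}

// Hypothetical factory for the `openai` provider: pointing `baseURL` at a
// vLLM server's OpenAI-compatible endpoint routes all requests there.
function createOpenAIClient(): OpenAI {
  return new OpenAI({
    apiKey: process.env["OPENAI_API_KEY"] ?? "EMPTY", // vLLM accepts a placeholder key
    baseURL: getBaseUrl(),
  });
}
```

With something like that in place, using a self-hosted model would be as simple as `OPENAI_BASE_URL=http://localhost:8000/v1 open-codex` (the exact port and path depend on how the vLLM server is launched).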
SatyKrish changed the title from "Add support for OPENAI_BASE_URL env for locally hosted models using OpenAI compatible endpoint" to "Add support for OPENAI_BASE_URL env for self-hosted models using OpenAI compatible endpoint" on Apr 22, 2025.