Not able to run open-codex with ollama running locally on my computer · Issue #30 · ymichael/open-codex
HW config:
My dev env: Dell XPS 9550, i9, Ubuntu 20
Issue:
Installed open-codex with `npm install -g open-codex`
Tried to run the setup as described in the documentation, but it does not work; it just takes me to the chat CLI (see the video attached below)
Screencast.from.05-02-2025.01.05.32.PM.webm
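For reference, this is the configuration I was expecting to work, pieced together from the README. The `--provider` flag, the `~/.codex/config.json` path, and the key names here are my assumptions from the docs rather than anything I have verified, and the model name is just a placeholder:

```json
{
  "provider": "ollama",
  "model": "llama3"
}
```

With that in place I expected `open-codex --provider ollama` (or plain `open-codex` reading the config file) to talk to the local Ollama server on its default port, but I never get that far.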
I have used codex with OpenAI and with Ollama before. It worked great with OpenAI (as expected), but with Ollama performance was very slow: it worked, but codex was taking about a minute just to send the prompt/request to Ollama. An alternative Python CLI script I wrote to chat with Ollama directly worked great, but there was huge latency with codex in the middle.
That's why I was looking for alternatives, and I stumbled upon this project, but I have had no real success with it so far.
I am able to chat with the Ollama model directly, but not via the codex CLI.
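For comparison, here is a stripped-down sketch of the kind of direct Ollama chat that works for me. It only hits Ollama's standard /api/generate endpoint on the default port; the model name is a placeholder:

```python
# Minimal direct chat with a local Ollama server, using only the standard library.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port


def ask(prompt: str, model: str = "llama3") -> str:
    """Send one prompt to Ollama and return the full response text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # request a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask("Say hello in one short sentence."))
```

Responses from this come back quickly, so the Ollama server itself seems healthy; whatever is going wrong appears to sit between codex and Ollama.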
Please advise on next steps, and let me know if there are any other details you need from me.