Can't get remote Ollama instance to work #27
Comments
Hey @itskenny0! Can you try this:
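The snippet originally posted here isn't shown, so the following is only a rough sketch of the override commonly suggested for exposing Ollama to other machines, assuming the standard `OLLAMA_HOST` / `OLLAMA_ORIGINS` variables (values are illustrative):

```sh
# Let Ollama listen on all interfaces and accept cross-origin requests,
# then reload systemd and restart the service.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf >/dev/null <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_ORIGINS=*"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```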
I think the Codex error ->
After CORS has been resolved in the ollama.service file and the service restarted, you have to run open-codex using this command:
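The exact command isn't shown above; a minimal sketch, assuming open-codex follows the Codex CLI convention of a `--provider` flag and a provider base-URL environment variable (the flag, variable, and model names below are assumptions, not confirmed in this thread):

```sh
# Assumed names -- adjust to whatever open-codex actually expects.
export OLLAMA_BASE_URL="http://192.168.69.3:11434/v1"
open-codex --provider ollama --model llama3   # model name is only an example
```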
Does it work on your system? Could it be some configuration mismatch? Is there an option to run this in Docker and still have it index the code and retain all the original functionality?
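For the Docker question, a hypothetical sketch of running the CLI in a throwaway Node container with the project mounted so it can still see and index the code (the image, package name, and variable below are assumptions, not something confirmed in this thread):

```sh
# Hypothetical: mount the current project and point the CLI at the remote Ollama.
docker run --rm -it \
  -v "$(pwd)":/workspace -w /workspace \
  -e OLLAMA_BASE_URL="http://192.168.69.3:11434/v1" \
  node:22 \
  bash -c "npm install -g open-codex && open-codex --provider ollama"
```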
You can try using ngrok to expose your Ollama service and then open the link in a browser; it should print `Ollama is running`. If all works fine, you can use this link instead of http://192.168.69.3:11434/v1 as the base URL.
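A minimal sketch of that check (the forwarded URL below is only a placeholder; use whatever ngrok prints):

```sh
# Forward the default Ollama port through ngrok
ngrok http 11434
# In another terminal, hit the forwarded URL's root path;
# it should answer with "Ollama is running"
curl https://example.ngrok-free.app/
```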
Can you add these and retry?
@dviresh93, are you running it in Docker?
Thanks a lot for this project! I'm trying to get it running with my local Ollama instance, but it refuses the request:
https://asciinema.org/a/C3xvq5V69tvdo2X2ndiaZTAG2
Am I missing something obvious here? Thanks!