I have Ollama running on Ubuntu Server.
I also have VSCode with Cody on the Ubuntu Server, and after making the necessary config updates the Ollama functionality works there.
I also have a Windows computer running VSCode and Cody, with the same Ollama config updates; however, it does not provide the Ollama functionality.
Even if I disconnect the internet, I do not get the option to use Ollama in offline mode; I just get an error saying the internet is unavailable.
The custom Cody extension config is practically identical to the working setup in VSCode with Cody on Ubuntu, but that functionality is not presented at all in the Windows case.
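For reference, the Ollama-related part of the Cody settings on both machines looks roughly like this (paraphrased from memory, and the IP address is a placeholder for my Ubuntu server's actual address):

```json
{
  // Point Cody's experimental Ollama autocomplete at the remote Ollama server.
  // 192.168.1.50 is a placeholder for the Ubuntu server's real IP.
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://192.168.1.50:11434",
    "model": "codestral:latest"
  }
}
```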
The Windows machine can browse to Ollama fine in a browser, it can curl a generated response from Ollama, and it can access OpenWebUI running on the Ubuntu server without any problems at all and with no changes to any network or firewall settings whatsoever. Everything works except for one thing: the Ollama options in Cody on the Windows machine.
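For example, a quick sanity check like the following (run from the Windows machine, again with a placeholder IP for the Ubuntu server) returns a generated response with no trouble:

```bash
# Ask the remote Ollama instance for a completion directly over HTTP.
curl http://192.168.1.50:11434/api/generate -d '{"model": "codestral:latest", "prompt": "Say hello", "stream": false}'
```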
The Sourcegraph output even says it is using experimental Ollama for autocomplete: CodyCompletionProvider:initialized: experimental-ollama/codestral:latest
But it does not give me any ability to use Chat. I can't chat with my code in VSCode Cody on Windows, but I can in VSCode Cody on Ubuntu.
DETAILS - UBUNTU
The Ollama version is 0.1.45
DETAILS - WINDOWS:
VSCode:
Version: 1.91.1 (system setup)
Commit: f1e16e1e6214d7c44d078b1f0607b2388f29d729
Date: 2024-07-09T22:06:49.809Z
Electron: 29.4.0
ElectronBuildId: 9728852
Chromium: 122.0.6261.156
Node.js: 20.9.0
V8: 12.2.281.27-electron.0
OS: Windows_NT x64 10.0.22631
Cody:
Name: Cody: AI Coding Assistant with Autocomplete & Chat
Id: sourcegraph.cody-ai
Version: 1.27.1721229491
Publisher: Sourcegraph
Ollama is perfectly accessible from the Windows PC.
Some Settings / Configs
Experimental Features Enabled
Cody Chat Model Selection doesn’t show the Ollama experimental options.
CodeCompletion Enabled:
You can see in the image below that code completion against the Ollama server is working properly in Windows VSCode Cody.
So I have managed to prove that code completion is using Ollama properly; it is the Cody chat functionality for experimental Ollama that is missing.
This is good because it rules out firewall rules and network settings as the cause, since the firewall and network are already allowing Ollama to handle code completion just fine. I'm just not able to get the experimental Ollama chat options to show in the model selection drop-down of Cody's chat.