Error: Local autocomplete with Ollama works (up and running in 10 minutes). Local chat doesn’t, and can’t: the necessary settings entries aren’t present.
Hey there @RoryBlyth. I recently learned that this checkbox appears only if you are running a pre-release/insider build of the Cody VS Code Extension. But if you are on the latest stable version, you can still enable Chat by just adding the following property to your VS Code settings.json file.
"cody.experimental.ollamaChat": true
You’ll still need to restart VS Code, but after adding this you should see the local Ollama models. I’ll update the blog post as well to reflect this change.
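To make the edit concrete, here’s a rough sketch of what settings.json might look like with that property added. The ollamaChat flag is the only key taken from this thread; the autocomplete entry is just an assumed example of what an existing local-autocomplete setup might already contain, and your keys or values may differ:

```jsonc
{
  // Assumed existing entry from the local autocomplete setup;
  // shown only for illustration.
  "cody.autocomplete.advanced.provider": "experimental-ollama",

  // The property from this thread: enables local chat with Ollama
  // on the stable Cody extension. Restart VS Code after adding it.
  "cody.experimental.ollamaChat": true
}
```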
So you’re the genius whose posts I’ve been reading and whose work I’ve been noticing and enjoying and benefitting from and all that.
Thank you so much for all the crazy things you all do. $9/mo. I don’t get it.
I’ll make the change, and then I’ll call everybody I know to tell them to do this immediately.
Being able to turn VSC into a local chatbot platform just by installing Cody (free) and getting through the Ollama install is fantastic.