Local chat with Ollama is broken: Sourcegraph blog post instructions are incorrect

Sourcegraph post with instructions to wire Cody up for local chat: Local chat with Ollama and Cody

The problem: Local autocomplete with Ollama works (I was up and running in 10 minutes). Local chat doesn’t, and can’t: the necessary settings entries aren’t present.

Environment: macOS 14.4.1 + VSC 1.88.1 + Cody 1.14.0

Context: Instructions taken directly from the blog post:

  1. Scroll down to find the “Cody › Experimental: Ollama Chat” section.
  2. Ensure the checkbox titled Enable local Ollama models for chat and commands is checked.

These instructions don’t work. There isn’t a “Cody › Experimental: Ollama Chat” setting.

The screencap in the post shows 35 Cody settings options; mine has only 22, and none of them is “Experimental: Ollama Chat”.

Thank you for any help you can provide.

Hey there @RoryBlyth. I recently learned that this checkbox appears only if you are running a pre-release/insider build of the Cody VS Code extension. If you are on the latest stable version, you can still enable chat by adding the following property to your VS Code settings.json file:

"cody.experimental.ollamaChat": true

You’ll still need to restart VS Code, but after adding this you should see the local Ollama models. I will also update the blog post to reflect this change.
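In case it helps, here is a minimal sketch of where that property goes. Open your user settings.json (the Preferences: Open User Settings (JSON) command in the Command Palette) and add the line to the top-level object. VS Code parses this file as JSONC, so the comment is allowed; the snippet assumes you have no other settings yet, otherwise just add the one line to your existing object and mind the commas:

{
  // Show locally running Ollama models in Cody chat on the stable extension build
  "cody.experimental.ollamaChat": true
}

Save, restart VS Code, and the models you have pulled with Ollama should then show up for chat.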

Thanks,
Ado


So you’re the genius whose posts I’ve been reading and whose work I’ve been noticing and enjoying and benefitting from and all that.

Thank you so much for all the crazy things you all do. $9/mo. I don’t get it.

I’ll make the change, and then I’ll call everybody I know to tell them to do this immediately.

Being able to turn VSC into a local chatbot platform just by installing Cody (free) and getting through the Ollama install is fantastic.

I can’t wait to make the updates to my config 🤘🥳


Hey. Hey, Ado. Hey… What does it feel like to be my hero today?

It worked. And it’s. so. cool. 😮
