Configure Cody to use an Ollama model hosted in an Azure VM

I have installed the Cody for Visual Studio extension in my Visual Studio 2022 instance on my laptop. I would like to configure it to use the Ollama models that I have running in a Virtual Machine (VM) in Azure, instead of the local Ollama instance installed on my laptop.

Would you please let me know whether it's possible to do that, and if so, how I might be able to do it? I am currently testing with the free version of Cody, with the intention of upgrading once I know this can work.

Hey @jomo

The Cody extension for Visual Studio is at a very early, experimental stage and does not yet have feature parity with the Visual Studio Code version.
Configuring Ollama is currently not supported in Visual Studio and is only available with VS Code.
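For reference, in VS Code the experimental Ollama integration is configured through Cody's autocomplete settings in `settings.json`. The sketch below is an assumption based on Cody's documented experimental options; the exact setting names and the model name (`deepseek-coder:6.7b`) may differ in your version, and pointing the `url` at a remote host (such as an Azure VM) would also require the VM's Ollama server to be reachable from your laptop (e.g. firewall rules and `OLLAMA_HOST` bound to a non-localhost address):

```json
{
  // Use the experimental Ollama provider for autocomplete (assumed setting names)
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    // Replace with your Azure VM's public IP or DNS name; 11434 is Ollama's default port
    "url": "http://<your-azure-vm-address>:11434",
    "model": "deepseek-coder:6.7b"
  }
}
```

Note this applies to VS Code only; there is no equivalent configuration surface in the Visual Studio extension today.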

Sorry for the inconvenience.