Where and how in Visual Studio can I configure Cody to use my local ollama?
Hey @thekim1
Using custom models in Visual Studio isn't currently supported, since the Visual Studio extension is still at an early stage of development. If you don't mind, you can leave a feedback request here and the team will take a closer look at it.
Thank you.
Bummer. I got Ollama up and running with Code Llama, got Cody installed in VS, went hunting for how to connect the two, and found your reply. Was a feedback request made, or is there something I can do to indicate interest in this functionality? In the meantime I've at least confirmed the local side is working, as shown below.
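For anyone else landing here while waiting on the Visual Studio side: a minimal sketch for checking that Ollama itself is serving locally and that Code Llama is pulled, assuming Ollama's default port 11434 and its /api/tags endpoint (this only verifies the Ollama half, not any Cody integration):

```python
# Sanity check: is a local Ollama server reachable, and is Code Llama pulled?
# Assumes Ollama's default port (11434) and the /api/tags listing endpoint.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/tags"

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        data = json.load(resp)
except OSError as exc:
    raise SystemExit(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")

# /api/tags returns the locally available models.
models = [m["name"] for m in data.get("models", [])]
print("Local models:", ", ".join(models) or "(none)")

if not any(name.startswith("codellama") for name in models):
    print("Code Llama not found locally; try: ollama pull codellama")
```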
Many thanks.
This forum is dedicated to feature requests, so posting your interest in this one here helps raise awareness of it with the team.
Thank you