So, in the VS Code chat window you can choose between different LLM models. Are these also used for autocomplete?
No, the LLMs used for chat and autocomplete can be different.
For chat, the models include:
- Anthropic: Claude Instant, Claude 2.0, Claude 2.1, Claude 3 Sonnet, Claude 3 Opus
- OpenAI: GPT-3.5 Turbo, GPT-4 Turbo Preview
- Mistral AI: Mixtral 8x7B
For autocomplete, the provider options are (see the settings sketch after this list):
- anthropic
- fireworks
- unstable-openai
- experimental-ollama
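
The autocomplete provider is picked in the extension's settings rather than in the chat model dropdown. Here is a minimal settings.json sketch, assuming this is the Sourcegraph Cody extension and that it exposes a `cody.autocomplete.advanced.provider` key; the exact key name and accepted values can vary between extension versions, so treat them as assumptions and check your installed version.

```jsonc
// VS Code settings.json — a sketch, not a definitive configuration.
// Assumes the Cody extension exposes an autocomplete provider setting;
// verify the key name against your installed extension version.
{
  // Autocomplete provider, chosen independently of the chat model:
  // "anthropic", "fireworks", "unstable-openai", or "experimental-ollama".
  "cody.autocomplete.advanced.provider": "fireworks"
}
```

The chat model, by contrast, is selected from the dropdown in the chat panel itself, so the two choices don't have to match.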