I’d love to see Grok 3, built by xAI, added as an LLM option in Cody for chat and autocomplete. It’s a powerful, truth-seeking model with a newly launched API (April 2025) that mirrors OpenAI/Anthropic compatibility, making integration straightforward. Supporting Grok 3 could enhance Cody’s flexibility and appeal for developers like me who value its unique reasoning capabilities.
I think you can already use Grok 3 via the BYOK functionality under the `cody.dev.models` setting. I doubt Grok 3 will be added to the core model list in the near future; maybe in the Enterprise Starter plan, but surely not for free users, and even for Pro users it would be debatable.
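For reference, here's a minimal sketch of what such a BYOK entry might look like in your VS Code `settings.json`. The field names follow the experimental `cody.dev.models` format as I understand it, and the `grok-3` model identifier, endpoint URL, and token limits are assumptions, so check Cody's docs and xAI's API reference for your versions:

```json
{
  "cody.dev.models": [
    {
      "provider": "openai",
      "model": "grok-3",
      "apiKey": "YOUR_XAI_API_KEY",
      "apiEndpoint": "https://api.x.ai/v1",
      "inputTokens": 131072,
      "outputTokens": 8192
    }
  ]
}
```

Since xAI's API mirrors the OpenAI chat-completions shape, pointing an OpenAI-style provider entry at the xAI endpoint is the usual approach; comments aren't valid in strict JSON, but VS Code's `settings.json` accepts JSONC if you want to annotate the entry.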
It also probably won't be used for autocomplete, because autocomplete models need very low latency and have to be specially trained to handle FIM (fill-in-the-middle) and other completion cases. Maybe Grok 3 could be used for some auto-edit functionality, but that's a very big if.