Cody autocomplete is getting worse

While the chat function is very good, the autocomplete function is poor. It frequently returns empty completions, as if it doesn’t know anything.

Half the time it produces useless suggestions, and the other half it returns empty completions: it shows the loading indicator but nothing appears.

I tried changing the autocomplete LLM from null to Anthropic, but then it didn’t send me anything, and autocomplete stopped working entirely. The same happened with all the other autocomplete LLMs - only the null setting worked, and even that wasn’t good. By the way, which LLM does the null setting use? Anthropic or Fireworks?

I don’t know if you’re aware of this, but I use autocomplete daily, and it seems to be getting worse for some reason.

Are there any settings I can adjust to get better results?


Yeah, Anthropic hasn’t worked for a while as a setting for autocomplete

I raised it on another thread here

I also scoured GitHub issues for some sign of this problem

As of a couple weeks ago it didn’t seem to be on their radar at all from what I could tell

The only option I believe works is null

Thanks for reporting. I will make sure the product team sees this in the weekly community feedback report (going out today).

Thanks for your feedback!

The Anthropic provider issue will be fixed in the upcoming stable release. The nightly version with the solution should be released later today.

What LLM is used for the null setting?

The null value should be considered the “default” setting. Currently, it uses Fireworks behind the scenes. We will improve the dropdown menu for better clarity.
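For anyone looking for where this is configured: the provider is chosen in the Cody extension settings (in VS Code, `settings.json`). A minimal sketch, assuming the setting key is `cody.autocomplete.advanced.provider` - the exact key name may differ depending on your extension version:

```json
{
  // Hypothetical key name; check your Cody extension version's settings.
  // null = the "default" provider (currently Fireworks, per the reply above).
  "cody.autocomplete.advanced.provider": null
}
```

VS Code’s `settings.json` accepts JSONC, so the comments above are valid there; strip them if your editor requires strict JSON.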


Thanks @valerybugakov !