I updated to version 5.5.10 of the JetBrains Cody plugin in GoLand and I can’t use Cody anymore.
The chat says
Cody encountered an error when processing your message:
⚠ org.eclipse.lsp4j.jsonrpc.ResponseErrorException: Request chat/restore failed with message: No default chat model found
The My Account tab hangs at “Current tier: Loading...”.
@aggregat4 for now, can you please roll back to 5.5.9 if you haven’t already?
Could you provide some logs?
JetBrains logs can be accessed via the Output panel. To access them, you must first enable Cody logging from the Settings menu. To do so:
- Open the Settings panel (⌘, on macOS, Ctrl+Alt+S on Windows)
- Go to Sourcegraph & Cody
- Click on Cody
- Check the box to Enable debug
- Optionally, check the box to enable Verbose debug
- Click Apply
- To access the logs, go to Help > Show Log in Finder
- Open the idea.log file (see the sketch after these steps if you want to pull out just the Cody lines)
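
If it’s easier, something like the rough sketch below can filter just the Cody agent entries out of idea.log before you paste them here. The log path is only my assumption of the default GoLand 2024.1 location on Linux, and the "com.sourcegraph.cody" marker is simply the class prefix that shows up in the Cody log lines; adjust both for your OS and setup.

```go
// filter_cody_log.go - print only the Cody-related lines from idea.log.
// Assumption: default GoLand 2024.1 log location on Linux; adjust as needed.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	home, err := os.UserHomeDir()
	if err != nil {
		log.Fatal(err)
	}
	// Assumed default log path; change this if your IDE/version/OS differs.
	logPath := filepath.Join(home, ".cache", "JetBrains", "GoLand2024.1", "log", "idea.log")

	f, err := os.Open(logPath)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	// Stack-trace lines can be long, so raise the scanner's buffer limit.
	scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for scanner.Scan() {
		line := scanner.Text()
		// Keep only lines emitted by the Cody agent classes seen in the log.
		if strings.Contains(line, "com.sourcegraph.cody") {
			fmt.Println(line)
		}
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}
```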
Thanks for the reply.
I enabled verbose debugging and noticed that it was complaining about a token. I removed my account from the Cody settings, re-added it, and reauthorized through your website. After restarting GoLand I could use Cody again.
Thanks for the help.
I take it back. I can now do chats, but no autocompletes. The log says:
2024-05-17 09:00:14,037 [2538794] WARN - #com.sourcegraph.cody.agent.CodyAgent - autocomplete failed NetworkError: Request to https://cody-gateway.sourcegraph.com/v1/completions/fireworks failed with 400 Bad Request: {"error":"model \"fireworks/accounts/sourcegraph/models/codecompletion-m-mixtral-rb-rs-m-go-400k-25e\" is not allowed, allowed: [fireworks/starcoder, fireworks/accounts/sourcegraph/models/starcoder2-15b, fireworks/accounts/sourcegraph/models/starcoder2-7b, fireworks/accounts/fireworks/models/llama-v2-13b-code, fireworks/accounts/sourcegraph/models/codecompletion-mixtral-rust-152k-005e]"}
2024-05-17 09:00:14,037 [2538794] WARN - #com.sourcegraph.cody.agent.CodyAgent -
2024-05-17 09:00:14,037 [2538794] WARN - #com.sourcegraph.cody.agent.CodyAgent - at /home/boris/.local/share/JetBrains/GoLand2024.1/Sourcegraph/agent/index.js:183144:17
2024-05-17 09:00:14,037 [2538794] WARN - #com.sourcegraph.cody.agent.CodyAgentClient - Cody by Sourcegraph: █ getInlineCompletions:error: Request to https://cody-gateway.sourcegraph.com/v1/completions/fireworks failed with 400 Bad Request: {"error":"model \"fireworks/accounts/sourcegraph/models/codecompletion-m-mixtral-rb-rs-m-go-400k-25e\" is not allowed, allowed: [fireworks/starcoder, fireworks/accounts/sourcegraph/models/starcoder2-15b, fireworks/accounts/sourcegraph/models/starcoder2-7b, fireworks/accounts/fireworks/models/llama-v2-13b-code, fireworks/accounts/sourcegraph/models/codecompletion-mixtral-rust-152k-005e]"}
Error: Request to https://cody-gateway.sourcegraph.com/v1/completions/fireworks failed with 400 Bad Request: {"error":"model \"fireworks/accounts/sourcegraph/models/codecompletion-m-mixtral-rb-rs-m-go-400k-25e\" is not allowed, allowed: [fireworks/starcoder, fireworks/accounts/sourcegraph/models/starcoder2-15b, fireworks/accounts/sourcegraph/models/starcoder2-7b, fireworks/accounts/fireworks/models/llama-v2-13b-code, fireworks/accounts/sourcegraph/models/codecompletion-mixtral-rust-152k-005e]"}
at /home/REDACTEd/.local/share/JetBrains/GoLand2024.1/Sourcegraph/agent/index.js:183144:17
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async generatorWithTimeout (/home/boris/.local/share/JetBrains/GoLand2024.1/Sourcegraph/agent/index.js:124082:31)
at async fetchAndProcessDynamicMultilineCompletions (/home/boris/.local/share/JetBrains/GoLand2024.1/Sourcegraph/agent/index.js:181047:45)
at async Promise.all (index 0)
at async zipGenerators (/home/boris/.local/share/JetBrains/GoLand2024.1/Sourcegraph/agent/index.js:124048:17)
at async generateCompletions (/home/boris/.local/share/JetBrains/GoLand2024.1/Sourcegraph/agent/index.js:181524:30)
@aggregat4 thanks for the logs, they really help drill down on the issue. And for redacting!
So this seems to be a bug.
Temporary solution: log out and log back in. If it’s still giving you issues, please let me know and I’ll escalate this.