I’m running into persistent ETIMEDOUT errors with Cody Chat on VS Code 1.96.0. This happens across all models, every version of Cody I’ve tried, and on both macOS 14.5 and Windows.
Chat works for 1-2 responses, then stops responding. Restarting the extension fixes it temporarily, but again only for 1-2 responses.
Error Logs:
ClientConfigSingleton failed to refresh client config:
Error: accessing Sourcegraph HTTP API: ETIMEDOUT: timed out after 20000ms
(https://sourcegraph.com/.api/client-config)
ModelsService new models API enabled:
codebaseRootFromMentions Failed to get repo IDs from Sourcegraph:
Error: accessing Sourcegraph HTTP API: ETIMEDOUT: timed out after 20000ms
(https://sourcegraph.com/.api/graphql?Repositories)
ChatController resolveContext > symf /Users/Dropbox/Projects/delivery-crew (start):
telemetry-v2 recordEvent: cody.fixup.persistence/present
The issue has been happening since approximately 12/12. Is this a known issue, and are there any fixes or workarounds? I appreciate any help!
The length of the prompts and context varies significantly: sometimes they reach the maximum token limit, other times they are just a few words or lines. The issue seems to occur more frequently after I submit a request that exceeds the limit. When that happens, the chat window appears to lose its connection to the LLM, producing the errors from my original post (see screenshot), and I typically have to restart Cody to restore functionality.
I am not using a VPN or a slow internet connection; I’m usually on a fiber network. I’ve also tested the Sourcegraph URL mentioned in the timeout messages directly without any issues, and a traceroute doesn’t indicate any problems either.
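For anyone else who wants to reproduce that connectivity check outside the extension, here is a rough sketch in Python. The extension itself is TypeScript, the 20000ms budget is taken from the error message in the logs above, and the helper names (`check_endpoint`, `describe_result`) are mine, not part of Cody:

```python
import socket
import time
import urllib.request

# URL and timeout taken from the extension's error log above
CLIENT_CONFIG_URL = "https://sourcegraph.com/.api/client-config"
TIMEOUT_MS = 20000

def describe_result(url: str, elapsed_ms: float, timed_out: bool) -> str:
    """Format the outcome roughly the way the extension's log does."""
    if timed_out:
        return f"ETIMEDOUT: timed out after {TIMEOUT_MS}ms ({url})"
    return f"ok: {url} responded in {elapsed_ms:.0f}ms"

def check_endpoint(url: str = CLIENT_CONFIG_URL) -> str:
    """Fetch the URL with the same 20s budget the extension reports using."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_MS / 1000) as resp:
            resp.read()
        elapsed = (time.monotonic() - start) * 1000
        return describe_result(url, elapsed, timed_out=False)
    except socket.timeout:
        return describe_result(url, 0, timed_out=True)
    except OSError as exc:  # DNS failure, refused connection, HTTP error, etc.
        return f"error: {exc} ({url})"

if __name__ == "__main__":
    print(check_endpoint())
```

In my case this check always comes back well under the 20s budget, which is why I don’t think the problem is on my network’s side.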
Cody’s web version doesn’t seem to experience the same issues, although I haven’t tested it extensively. I’ve also reinstalled the extension as part of my troubleshooting efforts.
Today I’ve noticed improved performance, with only a few instances of the tool becoming unusable and requiring a restart. I updated to pre-release version 1.51.1734469056 this morning, which may have helped; other versions have not been as reliable for me since 12/11/24.