[BUG] Chat Window Content Disappears While LLM is Responding - 1.32.4

Hi :wave: The chat window content disappears and never reappears. I am using the latest version of Cody, but I also tested several earlier versions, and the issue persists. I reindexed the entire project, but it didn’t help. The files are located in different folders (repos) within the same VSCode workspace. The content does not disappear when adding a single file to context. I have reset both the extension and VSCode multiple times. No errors in the logs.

Video: Cody Error 1 (Streamable)

Hey @Marc

Thank you for reporting the issue.
Did you update to the latest Cody version 1.32.4?

What are your specs, specifically the RAM size?

Hi!

Yes, I’m using version 1.32.4 and have tried several earlier versions. My Windows machine has an i5 CPU and 16 GB of RAM, of which 5 GB is still available.

It seems that at some point, while I am receiving a response from the LLM, the entire content disappears. I have a feeling that the content is still there but just isn’t displaying for some reason, and the Cody window stays that way until I disable and re-enable the extension. There are no error messages. This happens when I select two files from different repositories (folders). As I mentioned, I’ve tried resetting the VSCode extension multiple times, but that hasn’t helped.

You were right; it was an issue with the RAM or processor. When I restarted the laptop, it started working properly again. Thank you!

You are welcome.

The team is working to reduce Cody’s RAM requirements.
