Wonder if anyone else is having the following issues or has any thoughts on them.
Typically I have one chat going and it gets quite long. Usually this is because I'm doing something where it doesn't make sense to start fresh. I'm sure there are some performance impacts because of this, as well as impacts from the context window being full all the time or something similar.
But this is what I’m seeing overall:
- Copying code and then clicking into the chat box freezes Cody, sometimes for minutes. Then the cursor appears and I can paste. Overall, copying and pasting into the chat box is not performant at all.
- The entire Cody window goes grey and nothing shows. I've restarted VS Code and everything else loads, just not the Cody window. It does seem to load minutes later.
- Clicking to load a past chat takes a long time. Again, sometimes my chats get really long, but if the chats are stored locally (as I would assume), why do they take so long to load?
- Several times in the past few days I've had the response hang after submitting. Sometimes it never responds; other times it takes minutes. Obviously it depends on the model, but I wouldn't expect using o4 through Cody to be much less performant than using the ChatGPT UI with o4.
- Responses are generally slow enough that I'll send a chat, switch to a different window like Chrome, and do something, then head back to VS Code to see if it's done responding. Sometimes it will have frozen; other times it's fine and the complete response is sitting there.
Overall, the experience would be much better if performance and speed were improved. I may try removing some other extensions, like Docker, and see if that helps.
Anyone else seeing similar issues?
Edit: Other VS Code functionality and UI elements are not frozen or affected at all. It's just the Cody window.