Web interface slooooow to respond

Please give us a button to turn off the slow response. Perplexity gives me responses almost instantly, but I'm finding Cody a little better at coding.

Been using the web interface only. As my code gets longer (around 130 lines), it's very frustrating to wait so long while Cody slowly types out its response. Trying to do a bunch of rewrites quickly becomes a chore.

Cody told me that it generates its responses almost instantly behind the scenes, but that its creators purposely throttle how fast they are presented to make it look like someone is typing back at me. It said they do that to make it feel more like I'm interacting with a human.

I couldn't care less about it feeling like a human is typing back at me. I need these code changes fast. I can't be brewing a pot of coffee every time I wait for Cody to sloooowly type out its response.

Going to have to go back to Perplexity for most of my coding and have Cody only do the final rewrite to get things done faster.

Hello @CodyCrumbs,
replies slow down like that in long chat sessions.

Or do you experience that with a fresh conversation too?

Hi PriNova. I see. I don't remember the exact conversation length, but I recall it was open for more than two hours, so thank you for that info. I wasn't paying attention, so I only noticed how slow it was getting toward the end of my coding session, since the slowdown happened so gradually.

I tried Cody again today, just a quick session, and it was very fast to respond. It appears to learn what I am doing during a session, so it's a bit of a shame to have it slow down to where I have to restart it and lose all that knowledge it has. Still very useful though, and I am liking it more and more as I learn about it. It's the best coding AI I have tried so far. Cheers to the team for their continued good work. Thanks again for the response.

Thank you for your detailed answer.

The dev team loves to hear that users are happy.

Sure thing. I wouldn't be surprised if that session was closer to five hours than two.

One last suggestion regarding speed: if possible, it may help to throttle only Cody's explanations, but not the actual code snippets it posts. Even though its throttling in a new conversation is still pretty fast, it's tedious to watch a long code block appear as Cody types it out line by line. Posting code suggestions instantly while keeping the intermediate explanations throttled might strike a nice balance between speed and user comfort.

Keep up the good work.

To be honest, the response is not throttled at all. This is the current speed of the LLM, plus some background processing like code recognition, syntax highlighting, markdown formatting, and so on.
The raw response would be faster without all that processing, but for a good UX it needs that kind of formatting.
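For illustration only (this is not Cody's actual code, and every name here is hypothetical): one common way to keep that per-chunk formatting cost down is to batch streamed chunks, so the expensive formatting/highlighting pass runs once per batch instead of once per chunk. A minimal sketch:

```javascript
// Hypothetical sketch: buffer streamed chunks and run the expensive
// formatting pass (markdown, highlighting, ...) once per flush.
function makeBatcher(flushEvery, onFlush) {
  let buffer = [];
  return {
    push(chunk) {
      buffer.push(chunk);
      if (buffer.length >= flushEvery) this.flush();
    },
    flush() {
      if (buffer.length === 0) return;
      onFlush(buffer.join("")); // one formatting pass for the whole batch
      buffer = [];
    },
  };
}

// Usage: 10 chunks arrive, but the formatting callback runs only 4 times
// (3 full batches of 3, plus a final flush of the leftover chunk).
let passes = 0;
const batcher = makeBatcher(3, () => { passes += 1; });
for (let i = 0; i < 10; i++) batcher.push("token ");
batcher.flush();
// passes === 4
```

The trade-off is a slightly chunkier animation in exchange for far fewer formatting passes on long responses.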

Ok, that sounds reasonable. I appreciate the response. Keep up the good work.


The performance is painfully slow. Every time the bot types, it adds more elements to the page, which in turn becomes more sluggish and heavy.

If Cody gives a long initial response, the very next reply becomes a struggle. Keep the conversation going a bit longer, and it hits a point where it’s spitting out one letter per second. I’m forced to close the chat and reopen it just to see the full response without waiting 10+ minutes.

Many websites have already solved this issue; in fact, Cody is the only one where I even have it, so a developer should definitely look into it.

Or at least give us an option to display the message only once it's done, because seeing the request finish in 10 seconds and then waiting 5 minutes for it to appear on screen, letter by letter, is really bad.
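The "show it only when it's done" option described above is cheap to sketch: consume the stream in the background and fire the render callback once, with the full text, instead of once per chunk. A hypothetical illustration (function and stream names are made up, not Cody's API):

```javascript
// Hypothetical sketch of a "show when done" mode: buffer the whole
// streamed response, then do a single render at the end.
async function collectThenRender(stream, render) {
  let full = "";
  for await (const chunk of stream) full += chunk;
  render(full); // single UI update when the response is complete
}

// Usage with a fake stream of three chunks:
async function* fakeStream() {
  yield "Hello, ";
  yield "world";
  yield "!";
}
collectThenRender(fakeStream(), (text) => console.log(text));
// logs "Hello, world!"
```

With a mode like this, the page does one DOM update per response instead of hundreds, which would also sidestep the element-by-element slowdown described above.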

Like I said, I'm constantly forced to send the prompt, wait 10 seconds for the request to finish, reload the page, and reopen the chat, because otherwise it's unusable unless you stick to new chats.