Hi,
Is anyone else experiencing issues today with Sonnet 3.7 not being able to access the chat for context on the Sourcegraph website?
Hey @johnsbf
Can you please explain exactly what didn’t work? This would be helpful information to narrow down the issue.
Thanks.
I’ll do my best to explain.
I use Cody on the web, specifically Sonnet 3.7. For whatever reason all day yesterday it was treating every interaction in the same chat as the first interaction, so problem solving became impossible because it couldn’t access the chat for context. I was trying to debug a section of code and it kept asking for the code. I’d provide the section of the code and it would ask what I wanted to do with the code, so I’d provide the errors from the compiler and it would ask to see the code the errors were for, and round and round we went. I shut down, rebooted, tried new chats, waited a few hours, tried again. Same results.
I just tried again this morning and had this response:
Can you see the above chat for context?
Context
— none fetched
No, I still cannot access the previous chat context. Each of my responses is based only on the most recent message you’ve sent me. This appears to be a technical limitation or issue with how the chat history is being made available to me. If you need me to work with information from previous messages, you’ll need to include that information in your current message.
May I ask what plan you are on? Is it Free, Pro or Enterprise Starter?
I’m on Pro.
I’ve tried other AI this morning also, and the issue is the same.
Could it be a firewall or security setting that has somehow changed with a computer update or something? Could that affect the AI's ability to access the chat for context?
I just disabled all firewalls temporarily to check if maybe that was preventing it, and it didn’t help. Please see attached.
It might be that the conversation is overly long. At some point, especially when the context window limit is exceeded, the LLM "forgets" the previous messages.
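As a rough illustration of that "forgetting" behaviour (a minimal sketch, not Cody's actual implementation), chat clients typically trim history to a token budget before sending it to the model, dropping the oldest turns first. The token estimate and budget below are assumptions for demonstration only:

```javascript
// Sketch: keep only the most recent messages that fit a fixed "token"
// budget. Older turns are dropped, so the model behaves as if it
// never saw them -- which looks like "it can't access the chat".
function trimToContextWindow(messages, maxTokens) {
  // Crude token estimate: ~1 token per 4 characters (assumption).
  const estimateTokens = (text) => Math.ceil(text.length / 4);
  const kept = [];
  let used = 0;
  // Walk from newest to oldest, stopping once the budget is spent.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i]);
    if (used + cost > maxTokens) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}

const history = [
  "Here is the code I want to debug ...",        // oldest
  "What would you like to do with this code?",
  "These are the compiler errors: ...",
  "Can you see the above chat for context?",     // newest
];
// With a tiny budget, only the newest turns survive.
console.log(trimToContextWindow(history, 20));
```

The usual symptom would be gradual loss of the oldest context in a very long chat, though, rather than every reply being treated as the first message.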
Not sure what that image refers to regarding the @.
I’ve never had to use it in the past on Sourcegraph or via any other platform. It’s always been able to continue a chat through to conclusion; even if I open a chat from history days later, it’s always been able to pick the chat back up for context.
How would I use the @ if I want it to use the previous 5, 6, 7 responses in a chat for context?
On other platforms/apps including Claude, I’m not having an issue with AI using the chat thread for context. This is specific to Sourcegraph and only started yesterday.
It happens with new chats also. It responds to the initial query, then treats my response as a new message.
I’ve fixed it, I think. I deleted all chat history.
I’m guessing combined they exceeded a max storage/cache allowance or something, which somehow prevented the AI from even accessing a thread in a new chat.
It seems to be working fine
Interesting, I tried it currently and can’t reproduce the issue.
Do you know how to use the DevTool console in your browser? Maybe there might be some logs to share showing the issue in detail.
I don’t know how to use dev tools I’m afraid, but happy for you to talk me through it if you think it would be valuable.
I had a lot of chats in my sourcegraph history, and when I say a lot, I mean like months of very long chats.
Depending on which browser you have, you typically open the browser menu and, in the settings or tools menu, find an entry called “Developer Tools”. When you click on it, a side-by-side panel opens with a pane called ‘Console’. There you see all the issues, warnings and errors that occur while the page is rendering or during network traffic.
It looks like the attached image.
And every time you send a message, it might log the issues or errors there.
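If copying individual Console entries is fiddly, one option (a generic sketch, not a Sourcegraph feature) is to paste a small wrapper into the Console before reproducing the issue, so every error printed is also collected into an array you can dump as JSON afterwards:

```javascript
// Sketch: wrap console.error so errors printed while reproducing the
// issue are also kept in an array for easy copy-out later.
const collectedErrors = [];
const originalError = console.error;
console.error = (...args) => {
  collectedErrors.push(args.map(String).join(" ")); // keep a plain-text copy
  originalError.apply(console, args);               // still print normally
};

// Reproduce the problem in the page, then dump what was captured:
console.error("example: context fetch failed");
console.log(JSON.stringify(collectedErrors, null, 2));
```

Anything the page reports while you send a message would then show up in `collectedErrors`, ready to paste into the thread.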