Hi, why did you shorten the context window for the o1 and o1-mini models? They're giving really short answers, just a few lines, and then stopping mid-sentence. These models usually produce much longer responses, but the output has been cut down a lot.