Hi, why did you shorten the context window for the o1 and o1-mini models? They're now giving really short answers, just a few lines, and then stopping mid-sentence. These models used to produce much longer responses, but the output has been cut down a lot.
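If it helps narrow this down, here is a minimal sketch of how I would check for this when calling the model directly through the OpenAI Python SDK (the model name, prompt, and token cap are just placeholders, and this may not match how your client calls the API): if the response comes back with `finish_reason == "length"`, the reply was cut off by the output-token cap rather than by the model choosing to stop. Is a similar cap being applied on your side?

```python
# Minimal sketch: check whether a low output-token cap is causing the
# mid-sentence cutoff, assuming a direct call through the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="o1-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Explain how TCP congestion control works."}],
    # o1-series models take max_completion_tokens (not max_tokens); a generous
    # cap leaves room for both the hidden reasoning and the visible answer.
    max_completion_tokens=8000,
)

choice = resp.choices[0]
print(choice.finish_reason)    # "length" means the reply was truncated by the token cap
print(choice.message.content)
```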