I wish gemini-2-pro was free

I wish gemini-2-pro was free… It’s already free from Google, so why is it paid over here?

Hey @blazgocompany

Thank you for your feedback. Inference on LLMs cost money and only Google can provide such a tier on their own platform or over the Gemini API. If you are on VS Code you can set Cody to user your own API key too. Have a read here: Installing Cody in VS Code - Sourcegraph docs

Does that mean that Cody doesn’t use the Gemini API?

I guess Cody does not use the Gemini API directly but rather Vertex AI or other Gemini providers, since Sourcegraph has a lot of users and needs to scale inference at peak times to thousands of requests per minute.

I think Gemini models are free for testing/experimental purposes, but heavily rate limited. Imagine if Sourcegraph/Cody used one API key for all those user requests - the free limit would be hit in no time…

Anyways, it’s always possible to configure BYOK (bring your own key) in Cody’s settings, so you can access Gemini models for free under your own API key and also get larger input and output token context windows.
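
For reference, a BYOK setup in VS Code `settings.json` might look like the sketch below. The `cody.dev.models` key and its fields follow the Sourcegraph docs linked above, but treat this as an assumption: the exact schema can change between Cody versions, and the model name and token limits here are illustrative, not authoritative:

```jsonc
{
  // Experimental: register a Gemini model under your own API key (BYOK).
  // Field names follow the Cody docs; verify against your installed version.
  "cody.dev.models": [
    {
      "provider": "google",              // assumed provider identifier
      "model": "gemini-1.5-pro-latest",  // illustrative model name
      "apiKey": "YOUR_GEMINI_API_KEY",   // your own key from Google AI Studio
      "inputTokens": 1000000,            // illustrative context-window sizes
      "outputTokens": 8192
    }
  ]
}
```

After reloading VS Code, the model should appear in Cody’s model picker, and requests go out under your own key (and your own rate limits) rather than Sourcegraph’s shared capacity.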
