OpenRouter/OpenAI custom models not working

"cody.dev.models": [
    {
      "provider": "openai",
      "model": "google/gemini-2.0-pro-exp-02-05:free",
      "name": "Gemini 2.0 Pro",
      "inputTokens": 1048576,
      "outputTokens": 8000,
      "apiKey": "sk-or-v1-xxxxx",
      "options": {
        "temperature": 0.0
      },
      "apiEndpoint": "https://openrouter.ai/api/v1"
    }
  ],

I tried to use Gemini 2.0 Pro with OpenRouter but get:

Request Failed:
Request to https://sourcegraph.com/.api/completions/stream?client-name=vscode&client-version=1.64.0 failed with 400 Bad Request: status 400, reason model “openai/google/gemini-2.0-pro-exp-02-05:free” is not available on sourcegraph.com

I’m not sure why, because OpenRouter is OpenAI-compatible and the model ID is correct; the combined ID openai/google/gemini-2.0-pro-exp-02-05:free seems to be the problem.

It’s a known issue (Openrouter does not work with Cody · Issue #6109 · sourcegraph/cody · GitHub): the “/” in the model name gets incorrectly processed.

As a workaround you can try the PriNova community build (GitHub - PriNova/cody: AI that knows your entire codebase. Custom build edition), which has already fixed this problem, or wait for the upstream fix.

A side note - your configuration looks a bit off too:

  • Use "provider": "groq" as provider if it’s OpenAI compatible API;
  • For apiEndpoint I would suggest using https://api.openai.com/v1/chat/completions but changes are that maybe your end-point is alsow working.
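
In other words, in your entry you would change just these two fields and leave everything else as-is (a rough sketch on my part, not verified against OpenRouter):

      "provider": "groq",
      "apiEndpoint": "https://api.openai.com/v1/chat/completions"  // or keep your OpenRouter endpoint if it already works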

There is a problem with OpenRouter API calls in Cody, as I mentioned in my previous comment, so it would not work anyway (for now).

How would I use Gemini on OpenAI?

Sorry, my bad… I was looking at my config and accidentally mixed up the OpenRouter and OpenAI endpoints. Your endpoint looks good.

So the primary reason it’s not working is the GitHub issue I mentioned. Also, give the PriNova Cody build a try if you want to use this feature without waiting for the upstream fix.
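
Putting it together, a sketch of the entry with only the provider field changed and your original OpenRouter endpoint kept (it will still hit the slash bug until issue #6109 is fixed upstream):

"cody.dev.models": [
    {
      "provider": "groq",  // generic OpenAI-compatible provider, per the note above
      "model": "google/gemini-2.0-pro-exp-02-05:free",
      "name": "Gemini 2.0 Pro",
      "inputTokens": 1048576,
      "outputTokens": 8000,
      "apiKey": "sk-or-v1-xxxxx",
      "options": {
        "temperature": 0.0
      },
      "apiEndpoint": "https://openrouter.ai/api/v1"
    }
  ],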
