Providers

OpenRouter

Configure the OpenRouter provider to access hundreds of models through a single API key.

Setup

pnpm add @openrouter/ai-sdk-provider

import { AiSdkProvider } from "noumen";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
});

const provider = new AiSdkProvider({
  model: openrouter.chat("anthropic/claude-opus-4.6"),
});

Options

All connection-level options come from createOpenRouter.

  • apiKey (createOpenRouter): OpenRouter API key.
  • baseURL (createOpenRouter): Override the API base URL. Defaults to https://openrouter.ai/api/v1.
  • headers (createOpenRouter): Custom headers. Use this for OpenRouter's ranking headers: HTTP-Referer (your app URL) and X-Title (your app name).
  • extraBody (createOpenRouter): Extra fields merged into every request body, e.g. OpenRouter's provider preferences or transforms.
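As a sketch, extraBody can carry OpenRouter's provider-routing preferences on every request. The exact shape of the provider object follows OpenRouter's request format; the values below are illustrative:

```typescript
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
  // Merged into every request body: prefer these upstream providers,
  // in order, and allow falling back if the first is unavailable.
  extraBody: {
    provider: {
      order: ["anthropic", "google-vertex"],
      allow_fallbacks: true,
    },
  },
});
```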

App identification

OpenRouter optionally accepts headers to identify your app on their leaderboards:

import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
  headers: {
    "HTTP-Referer": "https://myapp.com",
    "X-Title": "My Coding Agent",
  },
});

Models

OpenRouter gives you access to models from every major provider through a single API. Pass any model ID listed on openrouter.ai/models:

  • anthropic/claude-opus-4.6 — Claude Opus
  • anthropic/claude-sonnet-4 — Claude Sonnet
  • openai/gpt-5 — GPT-5
  • openai/gpt-4o — GPT-4o
  • google/gemini-2.5-pro — Gemini 2.5 Pro
  • deepseek/deepseek-r1 — DeepSeek R1
  • meta-llama/llama-4-maverick — Llama 4 Maverick
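Any of these IDs drops into the same openrouter.chat(...) call from the setup above; only the model ID changes. For example, to target DeepSeek R1 instead:

```typescript
import { AiSdkProvider } from "noumen";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
});

// Identical configuration; only the model ID differs.
const provider = new AiSdkProvider({
  model: openrouter.chat("deepseek/deepseek-r1"),
});
```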

How it works

@openrouter/ai-sdk-provider implements the AI SDK LanguageModelV2 protocol on top of OpenRouter's OpenAI-compatible API. AiSdkProvider therefore sees a generic OpenAI-family model and forwards reasoningEffort via providerOptions.openai.reasoningEffort. If you're calling Claude models through OpenRouter and want Anthropic-specific cache breakpoints, pass providerFamily: "anthropic" and cacheConfig: { enabled: true } explicitly; OpenRouter accepts Anthropic-style cache_control markers for supported models.
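Putting the above together, opting a Claude model into Anthropic-style caching looks like this (a sketch, using the providerFamily and cacheConfig options described above):

```typescript
import { AiSdkProvider } from "noumen";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
});

// Declare the model as Anthropic-family so cache breakpoints are
// emitted as Anthropic-style cache_control markers rather than the
// default OpenAI-family handling.
const provider = new AiSdkProvider({
  model: openrouter.chat("anthropic/claude-opus-4.6"),
  providerFamily: "anthropic",
  cacheConfig: { enabled: true },
});
```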

Streaming

Streaming works identically to any other AI SDK provider. No additional configuration is needed.
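For illustration, you can also stream from the OpenRouter model directly with the AI SDK's streamText, bypassing AiSdkProvider entirely (a minimal sketch; the prompt is a placeholder):

```typescript
import { streamText } from "ai";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY!,
});

// Tokens arrive incrementally on result.textStream.
const result = streamText({
  model: openrouter.chat("anthropic/claude-opus-4.6"),
  prompt: "Explain prompt caching in one paragraph.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```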