pydantic_ai.models.openrouter

Setup

For details on how to set up authentication with this model, see model configuration for OpenRouter.

OpenRouterProviderConfig

Bases: TypedDict

Represents the 'Provider' object from the OpenRouter API.

Attributes

order

List of provider slugs to try in order (e.g. ["anthropic", "openai"]). See details

Type: list[OpenRouterProviderName]

allow_fallbacks

Whether to allow backup providers when the primary is unavailable. See details

Type: bool

require_parameters

Only use providers that support all parameters in your request.

Type: bool

data_collection

Control whether to use providers that may store data. See details

Type: Literal['allow', 'deny']

zdr

Restrict routing to only ZDR (Zero Data Retention) endpoints. See details

Type: bool

only

List of provider slugs to allow for this request. See details

Type: list[OpenRouterProviderName]

ignore

List of provider slugs to skip for this request. See details

Type: list[str]

quantizations

List of quantization levels to filter by (e.g. ["int4", "int8"]). See details

Type: list[Literal['int4', 'int8', 'fp4', 'fp6', 'fp8', 'fp16', 'bf16', 'fp32', 'unknown']]

sort

Sort providers by price, throughput, or latency (e.g. "price" or "throughput"). See details

Type: Literal['price', 'throughput', 'latency']

max_price

The maximum pricing you want to pay for this request. See details

Type: _OpenRouterMaxPrice
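Since `OpenRouterProviderConfig` is a `TypedDict`, a routing preference can be sketched as a plain dict using the attributes above. This is only a sketch of the shape the OpenRouter 'provider' object expects; the provider slugs and values are illustrative:

```python
# A sketch of an OpenRouterProviderConfig payload as a plain dict.
# TypedDicts impose no runtime validation, so this simply mirrors the
# shape of the 'provider' object documented above.
provider_config = {
    "order": ["anthropic", "openai"],   # try Anthropic first, then OpenAI
    "allow_fallbacks": True,            # permit backup providers
    "require_parameters": True,         # skip providers missing any request parameter
    "data_collection": "deny",          # avoid providers that may store data
    "quantizations": ["fp16", "bf16"],  # restrict to half-precision endpoints
    "sort": "throughput",               # prefer the fastest provider
}
print(sorted(provider_config))
```

This dict would be passed as the `openrouter_provider` entry of the model settings described below.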

OpenRouterReasoning

Bases: TypedDict

Configuration for reasoning tokens in OpenRouter requests.

Reasoning tokens allow models to show their step-by-step thinking process. You can configure this using either OpenAI-style effort levels or Anthropic-style token limits, but not both simultaneously.

Attributes

effort

OpenAI-style reasoning effort level. Cannot be used with max_tokens.

Type: Literal['xhigh', 'high', 'medium', 'low', 'minimal', 'none']

max_tokens

Anthropic-style specific token limit for reasoning. Cannot be used with effort.

Type: int

exclude

Whether to exclude reasoning tokens from the response. Default is False. All models support this.

Type: bool

enabled

Whether to enable reasoning with default parameters. Default is inferred from effort or max_tokens.

Type: bool
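The mutual exclusion between `effort` and `max_tokens` can be sketched with plain dicts (the `is_valid` helper below is hypothetical, not part of the library):

```python
# OpenRouterReasoning is a TypedDict: use either an OpenAI-style 'effort'
# level or an Anthropic-style 'max_tokens' limit, never both at once.
effort_style = {"effort": "high", "exclude": False}
token_style = {"max_tokens": 2048, "exclude": False}

def is_valid(reasoning: dict) -> bool:
    """Hypothetical check rejecting configs that mix the two styles."""
    return not ("effort" in reasoning and "max_tokens" in reasoning)

print(is_valid(effort_style), is_valid(token_style))
print(is_valid({"effort": "low", "max_tokens": 512}))
```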

OpenRouterUsageConfig

Bases: TypedDict

Configuration for OpenRouter usage.

OpenRouterModelSettings

Bases: ModelSettings

Settings used for an OpenRouter model request.

Attributes

openrouter_models

A list of fallback models.

These models will be tried, in order, if the main model returns an error. See details

Type: list[str]

openrouter_provider

OpenRouter routes requests to the best available providers for your model. By default, requests are load balanced across the top providers to maximize uptime.

You can customize how your requests are routed using the provider object. See more

Type: OpenRouterProviderConfig

openrouter_preset

Presets allow you to separate your LLM configuration from your code.

Create and manage presets through the OpenRouter web application to control provider routing, model selection, system prompts, and other parameters, then reference them in OpenRouter API requests. See more

Type: str

openrouter_transforms

Transforms help with prompts that exceed a model's maximum context size.

They work by removing or truncating messages from the middle of the prompt until it fits within the model's context window. See more

Type: list[OpenRouterTransforms]

openrouter_reasoning

Controls the reasoning tokens in the request.

The reasoning config object consolidates settings for controlling reasoning strength across different models. See more

Type: OpenRouterReasoning

openrouter_usage

Controls how usage information is reported for the request.

The usage config object consolidates settings for enabling detailed usage information. See more

Type: OpenRouterUsageConfig
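Because `OpenRouterModelSettings` extends `ModelSettings`, base fields and the `openrouter_*` extras can be combined in one mapping. A sketch with illustrative model slugs (not recommendations):

```python
# A sketch of an OpenRouterModelSettings mapping. As a TypedDict it is a
# plain dict at runtime; keys below follow the attributes documented above.
settings = {
    "temperature": 0.2,  # a base ModelSettings field
    "openrouter_models": [  # fallback models, tried in order on error
        "anthropic/claude-3.5-sonnet",  # illustrative slug
        "openai/gpt-4o",                # illustrative slug
    ],
    "openrouter_provider": {"order": ["anthropic"], "allow_fallbacks": True},
    "openrouter_transforms": ["middle-out"],
    "openrouter_reasoning": {"effort": "medium"},
}
print(len(settings["openrouter_models"]))
```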

OpenRouterModel

Bases: OpenAIChatModel

Extends OpenAIChatModel to capture extra metadata for OpenRouter.

Methods

__init__
def __init__(
    model_name: str,
    provider: Literal['openrouter'] | Provider[AsyncOpenAI] = 'openrouter',
    profile: ModelProfileSpec | None = None,
    settings: ModelSettings | None = None,
)

Initialize an OpenRouter model.

Parameters

model_name : str

The name of the model to use.

provider : Literal['openrouter'] | Provider[AsyncOpenAI] Default: 'openrouter'

The provider to use for authentication and API access. If not provided, a new provider will be created with the default settings.

profile : ModelProfileSpec | None Default: None

The model profile to use. Defaults to a profile picked by the provider based on the model name.

settings : ModelSettings | None Default: None

Model-specific settings that will be used as defaults for this model.

supported_builtin_tools

@classmethod
def supported_builtin_tools(cls) -> frozenset[type[AbstractBuiltinTool]]

Return the set of builtin tool types this model can handle.

OpenRouter supports web search via its plugins system.

Returns

frozenset[type[AbstractBuiltinTool]]

OpenRouterStreamedResponse

Bases: OpenAIStreamedResponse

Implementation of StreamedResponse for OpenRouter models.

KnownOpenRouterProviders

Known providers in the OpenRouter marketplace.

Default: Literal['z-ai', 'cerebras', 'venice', 'moonshotai', 'morph', 'stealth', 'wandb', 'klusterai', 'openai', 'sambanova', 'amazon-bedrock', 'mistral', 'nextbit', 'atoma', 'ai21', 'minimax', 'baseten', 'anthropic', 'featherless', 'groq', 'lambda', 'azure', 'ncompass', 'deepseek', 'hyperbolic', 'crusoe', 'cohere', 'mancer', 'avian', 'perplexity', 'novita', 'siliconflow', 'switchpoint', 'xai', 'inflection', 'fireworks', 'deepinfra', 'inference-net', 'inception', 'atlas-cloud', 'nvidia', 'alibaba', 'friendli', 'infermatic', 'targon', 'ubicloud', 'aion-labs', 'liquid', 'nineteen', 'cloudflare', 'nebius', 'chutes', 'enfer', 'crofai', 'open-inference', 'phala', 'gmicloud', 'meta', 'relace', 'parasail', 'together', 'google-ai-studio', 'google-vertex']

OpenRouterProviderName

Possible OpenRouter provider names.

Since OpenRouter is constantly updating their list of providers, we explicitly list some known providers but allow any name in the type hints. See the OpenRouter API for a full list.

Default: str | KnownOpenRouterProviders

OpenRouterTransforms

Available messages transforms for OpenRouter models with limited token windows.

Currently only 'middle-out' is supported, but the list is expected to grow in the future.

Default: Literal['middle-out']