pydantic_ai.models.anthropic
For details on how to set up authentication with this model, see model configuration for Anthropic.
Bases: ModelSettings
Settings used for an Anthropic model request.
An object describing metadata about the request.
Contains user_id, an external identifier for the user who is associated with the request.
Type: BetaMetadataParam
Determines whether the model should generate a thinking block.
See the Anthropic docs for more information.
Type: BetaThinkingConfigParam
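The thinking configuration is a plain dict whose shape follows Anthropic's BetaThinkingConfigParam: {'type': 'enabled', 'budget_tokens': N} to enable thinking, or {'type': 'disabled'}. A minimal sketch (the setting key 'anthropic_thinking' is used for illustration; check the settings class for the exact field name):

```python
# Sketch of a thinking configuration carried in a settings dict.
# The key 'anthropic_thinking' is an assumption for illustration.
settings = {
    'anthropic_thinking': {'type': 'enabled', 'budget_tokens': 1024},
}

def thinking_enabled(settings: dict) -> bool:
    """Return True if these settings request a thinking block."""
    thinking = settings.get('anthropic_thinking')
    return bool(thinking) and thinking.get('type') == 'enabled'

print(thinking_enabled(settings))  # → True
```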
Whether to add cache_control to the last tool definition.
When enabled, the last tool in the tools array will have cache_control set,
allowing Anthropic to cache tool definitions and reduce costs.
If True, uses TTL='5m'. You can also specify '5m' or '1h' directly.
See https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching for more information.
Type: bool | Literal['5m', '1h']
Whether to add cache_control to the last system prompt block.
When enabled, the last system prompt will have cache_control set,
allowing Anthropic to cache system instructions and reduce costs.
If True, uses TTL='5m'. You can also specify '5m' or '1h' directly.
See https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching for more information.
Type: bool | Literal['5m', '1h']
Convenience setting to enable caching for the last user message.
When enabled, this automatically adds a cache point to the last content block
in the final user message, which is useful for caching conversation history
or context in multi-turn conversations.
If True, uses TTL='5m'. You can also specify '5m' or '1h' directly.
Note: Uses 1 of Anthropic's 4 available cache points per request. Any additional CachePoint markers in messages will be automatically limited to respect the 4-cache-point maximum.
See https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching for more information.
Type: bool | Literal['5m', '1h']
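All three caching settings share the bool | Literal['5m', '1h'] shape. A minimal sketch of how such a value can be normalized to a TTL, following the rules above (the setting names in the example dict are hypothetical, for illustration only):

```python
from typing import Literal, Optional, Union

CacheSetting = Union[bool, Literal['5m', '1h']]

def resolve_cache_ttl(value: CacheSetting) -> Optional[str]:
    """Normalize a cache setting: True selects the default 5-minute TTL,
    False disables caching, and '5m' / '1h' are passed through."""
    if value is True:
        return '5m'
    if value is False:
        return None
    if value in ('5m', '1h'):
        return value
    raise ValueError(f'invalid cache setting: {value!r}')

# Hypothetical setting names, for illustration only.
settings = {
    'cache_tool_definitions': True,   # default TTL: '5m'
    'cache_instructions': '1h',
    'cache_messages': False,          # caching disabled
}
ttls = {name: resolve_cache_ttl(value) for name, value in settings.items()}
print(ttls)
```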
The effort level for the model to use when generating a response.
See the Anthropic docs for more information.
Type: Literal['low', 'medium', 'high', 'max'] | None
Container configuration for multi-turn conversations.
By default, if previous messages contain a container_id (from a prior response), it will be reused automatically.
Set to False to force a fresh container (ignore any container_id from history).
Set to a dict (e.g. {'id': 'container_xxx'}) to explicitly specify a container.
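The resolution rules above can be sketched as a small helper. This is an illustration of the documented behaviour, not the library's implementation:

```python
from typing import Literal, Optional, Union

def resolve_container(
    setting: Union[dict, Literal[False], None],
    history_container_id: Optional[str],
) -> Optional[dict]:
    """Sketch of the container resolution rules described above:
    - None (default): reuse the container_id from prior messages, if any
    - False: force a fresh container, ignoring history
    - dict: use the explicitly specified container
    """
    if setting is False:
        return None               # fresh container
    if isinstance(setting, dict):
        return setting            # explicit container
    if history_container_id is not None:
        return {'id': history_container_id}
    return None

print(resolve_container(None, 'container_abc'))   # → {'id': 'container_abc'}
print(resolve_container(False, 'container_abc'))  # → None
```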
Type: BetaContainerParams | Literal[False]
Whether to enable eager input streaming on tool definitions.
When enabled, all tool definitions will have eager_input_streaming set to True,
allowing Anthropic to stream tool call arguments incrementally instead of buffering
the entire JSON before streaming. This reduces latency for tool calls with large inputs.
See https://platform.claude.com/docs/en/agents-and-tools/tool-use/fine-grained-tool-streaming for more information.
Type: bool
List of Anthropic beta features to enable for API requests.
Each item can be a known beta name (e.g. 'interleaved-thinking-2025-05-14') or a custom string. Merged with auto-added betas (e.g. structured-outputs, builtin tools) and any betas from extra_headers['anthropic-beta'].
See the Anthropic docs for available beta features.
Type: list[AnthropicBetaParam]
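A sketch of how such a merge could work: explicit betas, auto-added betas, and any existing extra_headers['anthropic-beta'] value are combined into one de-duplicated header value. This illustrates the merge rule, not the library's code:

```python
def merge_betas(requested: list, auto_added: list, extra_header: str = '') -> str:
    """Combine beta feature names into one 'anthropic-beta' header value,
    de-duplicating while preserving first-seen order."""
    from_header = [b.strip() for b in extra_header.split(',') if b.strip()]
    merged = []
    for beta in [*requested, *auto_added, *from_header]:
        if beta not in merged:
            merged.append(beta)
    return ','.join(merged)

print(merge_betas(
    ['interleaved-thinking-2025-05-14'],
    ['structured-outputs'],
    'structured-outputs, prompt-caching-2024-07-31',
))
```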
Bases: Model
A model that uses the Anthropic API.
Internally, this uses the Anthropic Python client to interact with the API.
Apart from __init__, all methods are private or match those of the base class.
The model name.
Type: AnthropicModelName
The model provider.
Type: str
def __init__(
model_name: AnthropicModelName,
provider: Literal['anthropic', 'gateway'] | Provider[AsyncAnthropicClient] = 'anthropic',
profile: ModelProfileSpec | None = None,
settings: ModelSettings | None = None,
)
Initialize an Anthropic model.
The name of the Anthropic model to use. See the Anthropic docs for a list of available model names.
provider : Literal['anthropic', 'gateway'] | Provider[AsyncAnthropicClient] Default: 'anthropic'
The provider to use for the Anthropic API. Can be either the string 'anthropic' or an
instance of Provider[AsyncAnthropicClient]. Defaults to 'anthropic'.
profile : ModelProfileSpec | None Default: None
The model profile to use. Defaults to a profile picked by the provider based on the model name.
The default 'anthropic' provider will use the default ..profiles.anthropic.anthropic_model_profile.
settings : ModelSettings | None Default: None
Default model settings for this model instance.
@classmethod
def supported_builtin_tools(cls) -> frozenset[type[AbstractBuiltinTool]]
The set of builtin tool types this model can handle.
frozenset[type[AbstractBuiltinTool]]
Bases: StreamedResponse
Implementation of StreamedResponse for Anthropic models.
Get the model name of the response.
Type: AnthropicModelName
Get the provider name.
Type: str
Get the provider base URL.
Type: str
Get the timestamp of the response.
Type: datetime
Latest Anthropic models.
Default: ModelParam
Possible Anthropic model names.
Since Anthropic supports a variety of date-stamped models, we explicitly list the latest models but allow any name in the type hints. See the Anthropic docs for a full list.
Default: str | LatestAnthropicModelNames