pydantic_ai.models.function
A model controlled by a local function.
FunctionModel is similar to TestModel,
but allows greater control over the model’s behavior.
Its primary use case is for more advanced unit testing than is possible with TestModel.
Here’s a minimal example:
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage, ModelResponse, TextPart
from pydantic_ai.models.function import FunctionModel, AgentInfo

my_agent = Agent('openai:gpt-5.2')


async def model_function(
    messages: list[ModelMessage], info: AgentInfo
) -> ModelResponse:
    print(messages)
    """
    [
        ModelRequest(
            parts=[
                UserPromptPart(
                    content='Testing my agent...',
                    timestamp=datetime.datetime(...),
                )
            ],
            timestamp=datetime.datetime(...),
            run_id='...',
        )
    ]
    """
    print(info)
    """
    AgentInfo(
        function_tools=[],
        allow_text_output=True,
        output_tools=[],
        model_settings=None,
        model_request_parameters=ModelRequestParameters(
            function_tools=[], builtin_tools=[], output_tools=[]
        ),
        instructions=None,
    )
    """
    return ModelResponse(parts=[TextPart('hello world')])


async def test_my_agent():
    """Unit test for my_agent, to be run by pytest."""
    with my_agent.override(model=FunctionModel(model_function)):
        result = await my_agent.run('Testing my agent...')
        assert result.output == 'hello world'
See Unit testing with FunctionModel for detailed documentation.
Bases: Model
A model controlled by a local function.
Apart from __init__, all methods are private or match those of the base class.
model_name
The model name.
Type: str
system
The system / model provider.
Type: str
def __init__(
    function: FunctionDef,
    model_name: str | None = None,
    profile: ModelProfileSpec | None = None,
    settings: ModelSettings | None = None,
) -> None

def __init__(
    stream_function: StreamFunctionDef,
    model_name: str | None = None,
    profile: ModelProfileSpec | None = None,
    settings: ModelSettings | None = None,
) -> None

def __init__(
    function: FunctionDef,
    stream_function: StreamFunctionDef,
    model_name: str | None = None,
    profile: ModelProfileSpec | None = None,
    settings: ModelSettings | None = None,
) -> None
Initialize a FunctionModel.
Either function or stream_function must be provided; providing both is allowed.
function : FunctionDef | None    Default: None
    The function to call for non-streamed requests.
stream_function : StreamFunctionDef | None    Default: None
    The function to call for streamed requests.
model_name : str | None    Default: None
    The name of the model. If not provided, a name is generated from the function names.
profile : ModelProfileSpec | None    Default: None
    The model profile to use.
settings : ModelSettings | None    Default: None
    Model-specific settings that will be used as defaults for this model.
@classmethod
def supported_builtin_tools(cls) -> frozenset[type[AbstractBuiltinTool]]
FunctionModel supports all builtin tools for testing flexibility.
frozenset[type[AbstractBuiltinTool]]
Information about an agent.
This is passed as the second argument to functions used within FunctionModel.
function_tools
The function tools available on this agent.
These are the tools registered via the tool and tool_plain decorators.
Type: list[ToolDefinition]
allow_text_output
Whether a plain text output is allowed.
Type: bool
output_tools
The tools that can be called to produce the final output of the run.
Type: list[ToolDefinition]
model_settings
The model settings passed to the run call.
Type: ModelSettings | None
model_request_parameters
The model request parameters passed to the run call.
Type: ModelRequestParameters
instructions
The instructions passed to the model.
Type: str | None
Incremental change to a tool call.
Used to describe a chunk when streaming structured responses.
name
Incremental change to the name of the tool.
Type: str | None Default: None
json_args
Incremental change to the arguments as JSON.
Type: str | None Default: None
tool_call_id
Incremental change to the tool call ID.
Type: str | None Default: None
Incremental change to a thinking part.
Used to describe a chunk when streaming thinking responses.
content
Incremental change to the thinking content.
Type: str | None Default: None
signature
Incremental change to the thinking signature.
Type: str | None Default: None
Bases: StreamedResponse
Implementation of StreamedResponse for FunctionModel.
Get the model name of the response.
Type: str
Get the provider name.
Type: None
Get the provider base URL.
Type: None
Get the timestamp of the response.
Type: datetime
DeltaToolCalls
A mapping of tool call IDs to incremental changes.
Type: TypeAlias Default: dict[int, DeltaToolCall]
DeltaThinkingCalls
A mapping of thinking call IDs to incremental changes.
Type: TypeAlias Default: dict[int, DeltaThinkingPart]
FunctionDef
A function used to generate a non-streamed response.
Type: TypeAlias Default: Callable[[list[ModelMessage], AgentInfo], ModelResponse | Awaitable[ModelResponse]]
StreamFunctionDef
A function used to generate a streamed response.
While this is defined as having a return type of AsyncIterator[str | DeltaToolCalls | DeltaThinkingCalls | BuiltinToolCallsReturns], it should
really be considered as AsyncIterator[str] | AsyncIterator[DeltaToolCalls] | AsyncIterator[DeltaThinkingCalls] | AsyncIterator[BuiltinToolCallsReturns].
E.g. you need to yield all text, all DeltaToolCalls, all DeltaThinkingCalls, or all BuiltinToolCallsReturns, not mix them.
Type: TypeAlias Default: Callable[[list[ModelMessage], AgentInfo], AsyncIterator[str | DeltaToolCalls | DeltaThinkingCalls | BuiltinToolCallsReturns]]