
TypeAdapter

You may have types that are not BaseModel subclasses that you want to validate data against, or you may want to validate a List[SomeModel] or dump it to JSON.

For use cases like this, Pydantic provides TypeAdapter, which can be used for type validation, serialization, and JSON schema generation without creating a BaseModel.

A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more):

from typing import List

from typing_extensions import TypedDict

from pydantic import TypeAdapter, ValidationError

class User(TypedDict):
    name: str
    id: int

UserListValidator = TypeAdapter(List[User])
print(repr(UserListValidator.validate_python([{'name': 'Fred', 'id': '3'}])))
#> [{'name': 'Fred', 'id': 3}]

try:
    UserListValidator.validate_python(
        [{'name': 'Fred', 'id': 'wrong', 'other': 'no'}]
    )
except ValidationError as e:
    print(e)
    '''
    1 validation error for list[typed-dict]
    0.id
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='wrong', input_type=str]
    '''

Parsing data into a specified type

TypeAdapter can be used to apply the parsing logic to populate Pydantic models in a more ad-hoc way. This function behaves similarly to BaseModel.model_validate, but works with arbitrary Pydantic-compatible types.

This is especially useful when you want to parse results into a type that is not a direct subclass of BaseModel. For example:

from typing import List

from pydantic import BaseModel, TypeAdapter

class Item(BaseModel):
    id: int
    name: str

# `item_data` could come from an API call, e.g., via something like:
# item_data = requests.get('https://my-api.com/items').json()
item_data = [{'id': 1, 'name': 'My Item'}]

items = TypeAdapter(List[Item]).validate_python(item_data)
print(items)
#> [Item(id=1, name='My Item')]

TypeAdapter is capable of parsing data into any of the types Pydantic can handle as fields of a BaseModel.
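Beyond validation, a TypeAdapter can also serialize the adapted type and generate a JSON schema for it. A minimal sketch (the `List[int]` type here is purely illustrative):

```python
from typing import List

from pydantic import TypeAdapter

adapter = TypeAdapter(List[int])

# Validation coerces compatible inputs, just as a BaseModel field would
validated = adapter.validate_python(['1', '2'])

# Serialization: dump_json returns bytes
json_bytes = adapter.dump_json(validated)

# JSON schema generation without a BaseModel
schema = adapter.json_schema()
```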

PydanticUserError

Bases: PydanticErrorMixin, TypeError

An error raised due to incorrect use of Pydantic.


BaseModel

Usage docs: https://docs.pydantic.dev/2.4/concepts/models/

A base class for creating Pydantic models.

Attributes

model_config

Configuration for the model, should be a dictionary conforming to ConfigDict.

Type: ConfigDict Default: ConfigDict()

model_fields

Metadata about the fields defined on the model, mapping of field names to FieldInfo.

This replaces Model.__fields__ from Pydantic V1.

Type: dict[str, FieldInfo]

model_computed_fields

Get the computed fields of this model instance.

Type: dict[str, ComputedFieldInfo]

model_extra

Get extra fields set during validation.

Type: dict[str, Any] | None

model_fields_set

Returns the set of fields that have been set on this model instance.

Type: set[str]

Methods

__init__

def __init__(__pydantic_self__, **data: Any) -> None

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be validated to form a valid model.

__init__ uses __pydantic_self__ instead of the more common self for the first arg to allow self as a field name.

Returns

None

model_construct

@classmethod

def model_construct(
    cls: type[Model],
    _fields_set: set[str] | None = None,
    **values: Any,
) -> Model

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.

Returns

Model — A new instance of the Model class with validated data.

Parameters

_fields_set : set[str] | None Default: None

The set of field names accepted for the Model instance.

**values : Any

Trusted or pre-validated data dictionary.
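As a sketch of the behaviour above (the `User` model is illustrative): defaults are applied, but no validation or coercion is performed on the values you pass.

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str = 'Jane Doe'


# Defaults are respected; only `id` ends up in model_fields_set
user = User.model_construct(id=1)

# No validation: a value of the wrong type is stored as-is
bad = User.model_construct(id='not an int')
```

Only use model_construct with data you trust; the second instance above silently holds an invalid `id`.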

model_copy

def model_copy(update: dict[str, Any] | None = None, deep: bool = False) -> Model

Usage docs: https://docs.pydantic.dev/2.4/concepts/serialization/#model_copy

Returns a copy of the model.

Returns

Model — New model instance.

Parameters

update : dict[str, Any] | None Default: None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

deep : bool Default: False

Set to True to make a deep copy of the model.
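A short sketch of the two parameters (the `Point` model is illustrative); note that `update` values bypass validation, as stated above.

```python
from pydantic import BaseModel


class Point(BaseModel):
    x: int
    y: int


p1 = Point(x=1, y=2)

# `update` values are merged in without validation
p2 = p1.model_copy(update={'y': 99})

# deep=True recursively copies nested models and containers
p3 = p1.model_copy(deep=True)
```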

model_dump

def model_dump(
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx = None,
    exclude: IncEx = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> dict[str, Any]

Usage docs: https://docs.pydantic.dev/2.4/concepts/serialization/#modelmodel_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Returns

dict[str, Any] — A dictionary representation of the model.

Parameters

mode : Literal['json', 'python'] | str Default: 'python'

The mode in which to_python should run. If mode is ‘json’, the dictionary will only contain JSON serializable types. If mode is ‘python’, the dictionary may contain any Python objects.

include : IncEx Default: None

A list of fields to include in the output.

exclude : IncEx Default: None

A list of fields to exclude from the output.

by_alias : bool Default: False

Whether to use the field’s alias in the dictionary key if defined.

exclude_unset : bool Default: False

Whether to exclude fields that have not been explicitly set.

exclude_defaults : bool Default: False

Whether to exclude fields that are set to their default value from the output.

exclude_none : bool Default: False

Whether to exclude fields that have a value of None from the output.

round_trip : bool Default: False

Whether to enable serialization and deserialization round-trip support.

warnings : bool Default: True

Whether to log warnings when invalid fields are encountered.
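The effect of `mode` and of field selection can be sketched as follows (the `Event` model is illustrative):

```python
import datetime

from pydantic import BaseModel


class Event(BaseModel):
    name: str
    when: datetime.date


event = Event(name='launch', when=datetime.date(2024, 1, 1))

# mode='python' (the default) may contain arbitrary Python objects
py = event.model_dump()

# mode='json' restricts values to JSON-serializable types
js = event.model_dump(mode='json')

# include/exclude select which fields appear in the output
partial = event.model_dump(include={'name'})
```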

model_dump_json

def model_dump_json(
    indent: int | None = None,
    include: IncEx = None,
    exclude: IncEx = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> str

Usage docs: https://docs.pydantic.dev/2.4/concepts/serialization/#modelmodel_dump_json

Generates a JSON representation of the model using Pydantic’s to_json method.

Returns

str — A JSON string representation of the model.

Parameters

indent : int | None Default: None

Indentation to use in the JSON output. If None is passed, the output will be compact.

include : IncEx Default: None

Field(s) to include in the JSON output. Can take either a string or set of strings.

exclude : IncEx Default: None

Field(s) to exclude from the JSON output. Can take either a string or set of strings.

by_alias : bool Default: False

Whether to serialize using field aliases.

exclude_unset : bool Default: False

Whether to exclude fields that have not been explicitly set.

exclude_defaults : bool Default: False

Whether to exclude fields that have the default value.

exclude_none : bool Default: False

Whether to exclude fields that have a value of None.

round_trip : bool Default: False

Whether to use serialization/deserialization between JSON and class instance.

warnings : bool Default: True

Whether to show any warnings that occurred during serialization.
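A small sketch of the JSON output options (the `Item` model is illustrative):

```python
from typing import List

from pydantic import BaseModel


class Item(BaseModel):
    id: int
    tags: List[str] = []


item = Item(id=1)

# Compact output by default
compact = item.model_dump_json()

# indent pretty-prints; exclude_defaults drops fields left at their default
pretty = item.model_dump_json(indent=2, exclude_defaults=True)
```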

model_json_schema

@classmethod

def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]

Generates a JSON schema for a model class.

Returns

dict[str, Any] — The JSON schema for the given model class.

Parameters

by_alias : bool Default: True

Whether to use attribute aliases or not.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The reference template.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications

mode : JsonSchemaMode Default: 'validation'

The mode in which to generate the schema.
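The `by_alias` parameter (which defaults to True here, unlike in model_dump) can be sketched like this; the `Person` model is illustrative:

```python
from pydantic import BaseModel, Field


class Person(BaseModel):
    name: str = Field(alias='fullName')


# by_alias defaults to True, so the schema keys use 'fullName'
aliased = Person.model_json_schema()

# by_alias=False falls back to the attribute names
by_name = Person.model_json_schema(by_alias=False)
```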

model_parametrized_name

@classmethod

def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Returns

str — String representing the new class where params are passed to cls as type variables.

Parameters

params : tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

Raises
  • TypeError — Raised when trying to generate concrete names for non-generic models.

model_post_init

def model_post_init(__context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Returns

None
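A minimal sketch of overriding this hook (the `Box` model and its derived `area` field are illustrative):

```python
from typing import Any

from pydantic import BaseModel


class Box(BaseModel):
    width: int
    height: int
    area: int = 0

    def model_post_init(self, __context: Any) -> None:
        # All fields are validated and available at this point
        self.area = self.width * self.height


box = Box(width=3, height=4)
```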

model_rebuild

@classmethod

def model_rebuild(
    cls,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: dict[str, Any] | None = None,
) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Returns

bool | None — Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Parameters

force : bool Default: False

Whether to force the rebuilding of the model schema, defaults to False.

raise_errors : bool Default: True

Whether to raise errors, defaults to True.

_parent_namespace_depth : int Default: 2

The depth level of the parent namespace, defaults to 2.

_types_namespace : dict[str, Any] | None Default: None

The types namespace, defaults to None.
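A sketch of the ForwardRef scenario described above (the `Foo`/`Bar` models are illustrative):

```python
from pydantic import BaseModel


class Foo(BaseModel):
    x: 'Bar'  # ForwardRef: Bar is not defined yet


class Bar(BaseModel):
    y: int


# Now that Bar exists, rebuild Foo's schema so the ForwardRef resolves
Foo.model_rebuild()

foo = Foo(x={'y': 1})
```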

model_validate

@classmethod

def model_validate(
    cls: type[Model],
    obj: Any,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Model

Validate the given object against the Pydantic model.

Returns

Model — The validated model instance.

Parameters

obj : Any

The object to validate.

strict : bool | None Default: None

Whether to enforce types strictly.

from_attributes : bool | None Default: None

Whether to extract data from object attributes.

context : dict[str, Any] | None Default: None

Additional context to pass to the validator.

Raises
  • ValidationError — If the object could not be validated.
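Both plain dict input and `from_attributes` input can be sketched as follows (the `User` model and `Row` class are illustrative):

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str


# Dict input; '1' is coerced to 1 in the default lax mode
user = User.model_validate({'id': '1', 'name': 'Ada'})


# from_attributes=True reads data from arbitrary object attributes
class Row:
    id = 2
    name = 'Bob'


row_user = User.model_validate(Row(), from_attributes=True)
```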

model_validate_json

@classmethod

def model_validate_json(
    cls: type[Model],
    json_data: str | bytes | bytearray,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Model

Validate the given JSON data against the Pydantic model.

Returns

Model — The validated Pydantic model.

Parameters

json_data : str | bytes | bytearray

The JSON data to validate.

strict : bool | None Default: None

Whether to enforce types strictly.

context : dict[str, Any] | None Default: None

Extra variables to pass to the validator.

Raises
  • ValueError — If json_data is not a JSON string.
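A minimal sketch, including the failure path for malformed JSON (the `User` model is illustrative):

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int


user = User.model_validate_json('{"id": 3}')

invalid = False
try:
    User.model_validate_json('not valid json')
except ValueError:  # ValidationError is a subclass of ValueError
    invalid = True
```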

model_validate_strings

@classmethod

def model_validate_strings(
    cls: type[Model],
    obj: Any,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Model

Validate the given object, whose values are expected to be strings, against the Pydantic model.

Returns

Model — The validated Pydantic model.

Parameters

obj : Any

The object containing string data to validate.

strict : bool | None Default: None

Whether to enforce types strictly.

context : dict[str, Any] | None Default: None

Extra variables to pass to the validator.
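A sketch of validating string-only data, as you might get from environment variables or query parameters (the `Task` model is illustrative):

```python
from pydantic import BaseModel


class Task(BaseModel):
    id: int
    done: bool


# All leaf values are strings; they are parsed into the annotated types
task = Task.model_validate_strings({'id': '7', 'done': 'true'})
```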

__get_pydantic_core_schema__

@classmethod

def __get_pydantic_core_schema__(
    cls,
    __source: type[BaseModel],
    __handler: GetCoreSchemaHandler,
) -> CoreSchema

Hook into generating the model’s CoreSchema.

Returns

CoreSchema — A pydantic-core CoreSchema.

Parameters

__source : type[BaseModel]

The class we are generating a schema for. This will generally be the same as the cls argument if this is a classmethod.

__handler : GetCoreSchemaHandler

A callable that calls into Pydantic’s internal CoreSchema generation logic.

__get_pydantic_json_schema__

@classmethod

def __get_pydantic_json_schema__(
    cls,
    __core_schema: CoreSchema,
    __handler: GetJsonSchemaHandler,
) -> JsonSchemaValue

Hook into generating the model’s JSON schema.

Returns

JsonSchemaValue — A JSON schema, as a Python object.

Parameters

__core_schema : CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

__handler : GetJsonSchemaHandler

Call into Pydantic’s internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

__pydantic_init_subclass__

@classmethod

def __pydantic_init_subclass__(cls, **kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after the class is actually fully initialized. In particular, attributes like model_fields will be present when this is called.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren’t used internally by pydantic.

Returns

None

Parameters

**kwargs : Any

Any keyword arguments passed to the class definition that aren’t used internally by pydantic.

__copy__

def __copy__() -> Model

Returns a shallow copy of the model.

Returns

Model

__deepcopy__

def __deepcopy__(memo: dict[int, Any] | None = None) -> Model

Returns a deep copy of the model.

Returns

Model

__init_subclass__

def __init_subclass__(cls, **kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'):
    ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters

**kwargs : Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

__iter__

def __iter__() -> TupleGenerator

So dict(model) works.

Returns

TupleGenerator

dict

def dict(
    include: IncEx = None,
    exclude: IncEx = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> typing.Dict[str, Any]
Returns

typing.Dict[str, Any]

json

def json(
    include: IncEx = None,
    exclude: IncEx = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: typing.Callable[[Any], Any] | None = PydanticUndefined,
    models_as_dict: bool = PydanticUndefined,
    **dumps_kwargs: Any,
) -> str
Returns

str

parse_obj

@classmethod

def parse_obj(cls: type[Model], obj: Any) -> Model
Returns

Model

parse_raw

@classmethod

def parse_raw(
    cls: type[Model],
    b: str | bytes,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Model
Returns

Model

parse_file

@classmethod

def parse_file(
    cls: type[Model],
    path: str | Path,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Model
Returns

Model

from_orm

@classmethod

def from_orm(cls: type[Model], obj: Any) -> Model
Returns

Model

construct

@classmethod

def construct(
    cls: type[Model],
    _fields_set: set[str] | None = None,
    **values: Any,
) -> Model
Returns

Model

copy

def copy(
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: typing.Dict[str, Any] | None = None,
    deep: bool = False,
) -> Model

Returns a copy of the model.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)
Returns

Model — A copy of the model with included, excluded and updated fields as specified.

Parameters

include : AbstractSetIntStr | MappingIntStrAny | None Default: None

Optional set or mapping specifying which fields to include in the copied model.

exclude : AbstractSetIntStr | MappingIntStrAny | None Default: None

Optional set or mapping specifying which fields to exclude in the copied model.

update : typing.Dict[str, Any] | None Default: None

Optional dictionary of field-value pairs to override field values in the copied model.

deep : bool Default: False

If True, the values of fields that are Pydantic models will be deep copied.

schema

@classmethod

def schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
) -> typing.Dict[str, Any]
Returns

typing.Dict[str, Any]

schema_json

@classmethod

def schema_json(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    **dumps_kwargs: Any,
) -> str
Returns

str

validate

@classmethod

def validate(cls: type[Model], value: Any) -> Model
Returns

Model

update_forward_refs

@classmethod

def update_forward_refs(cls, **localns: Any) -> None
Returns

None


ConfigDict

Bases: TypedDict

A TypedDict for configuring Pydantic behaviour.

Attributes

title

The title for the generated JSON schema, defaults to the model’s name.

Type: str | None

str_to_lower

Whether to convert all characters to lowercase for str types. Defaults to False.

Type: bool

str_to_upper

Whether to convert all characters to uppercase for str types. Defaults to False.

Type: bool

str_strip_whitespace

Whether to strip leading and trailing whitespace for str types.

Type: bool

str_min_length

The minimum length for str types. Defaults to None.

Type: int

str_max_length

The maximum length for str types. Defaults to None.

Type: int | None

extra

Whether to ignore, allow, or forbid extra attributes during model initialization. Defaults to 'ignore'.

You can configure how pydantic handles the attributes that are not defined in the model:

  • allow - Allow any extra attributes.
  • forbid - Forbid any extra attributes.
  • ignore - Ignore any extra attributes.
from pydantic import BaseModel, ConfigDict


class User(BaseModel):
    model_config = ConfigDict(extra='ignore')  # this is the default behaviour

    name: str


user = User(name='John Doe', age=20)  # the `age` argument is ignored
print(user)
#> name='John Doe'

Instead, with extra='allow', the age argument is included:

from pydantic import BaseModel, ConfigDict


class User(BaseModel):
    model_config = ConfigDict(extra='allow')

    name: str


user = User(name='John Doe', age=20)  # the `age` argument is included
print(user)
#> name='John Doe' age=20

With extra='forbid', an error is raised:

from pydantic import BaseModel, ConfigDict, ValidationError


class User(BaseModel):
    model_config = ConfigDict(extra='forbid')

    name: str


try:
    User(name='John Doe', age=20)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    age
    Extra inputs are not permitted [type=extra_forbidden, input_value=20, input_type=int]
    '''

Type: ExtraValues | None

frozen

Whether models are faux-immutable, i.e. whether __setattr__ is allowed. If set, a __hash__() method is also generated for the model, making instances potentially hashable if all their attributes are hashable. Defaults to False.

Type: bool
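A sketch of both effects, immutability and hashability (the `FrozenUser` model is illustrative):

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class FrozenUser(BaseModel):
    model_config = ConfigDict(frozen=True)

    name: str


user = FrozenUser(name='Ada')

failed = False
try:
    user.name = 'Bob'  # assignment on a frozen model raises
except ValidationError:
    failed = True

# frozen also makes the instance hashable (all fields here are hashable)
h = hash(user)
```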

populate_by_name

Whether an aliased field may be populated by its name as given by the model attribute, as well as the alias. Defaults to False.

from pydantic import BaseModel, ConfigDict, Field


class User(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    name: str = Field(alias='full_name')  # the field 'name' has the alias 'full_name'
    age: int


user = User(full_name='John Doe', age=20)  # populated by the alias 'full_name'
print(user)
#> name='John Doe' age=20
user = User(name='John Doe', age=20)  # populated by the field name 'name'
print(user)
#> name='John Doe' age=20

Type: bool

use_enum_values

Whether to populate models with the value property of enums, rather than the raw enum. This may be useful if you want to serialize model.model_dump() later. Defaults to False.

Type: bool
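A minimal sketch (the `Color` enum and `Car` model are illustrative): the field stores the enum's value rather than the enum member itself.

```python
from enum import Enum

from pydantic import BaseModel, ConfigDict


class Color(Enum):
    RED = 'red'


class Car(BaseModel):
    model_config = ConfigDict(use_enum_values=True)

    color: Color


car = Car(color=Color.RED)  # car.color is the plain string 'red'
```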

validate_assignment

Whether to validate the data when the model is changed. Defaults to False.

The default behavior of Pydantic is to validate the data when the model is created.

In case the user changes the data after the model is created, the model is not revalidated.

from pydantic import BaseModel

class User(BaseModel):
    name: str

user = User(name='John Doe')  # validation happens only when the model is created
print(user)
#> name='John Doe'
user.name = 123  # validation does not happen when the data is changed
print(user)
#> name=123

In case you want to revalidate the model when the data is changed, you can use validate_assignment=True:

from pydantic import BaseModel, ValidationError

# validate_assignment can be set via class keyword arguments or model_config
class User(BaseModel, validate_assignment=True):
    name: str

user = User(name='John Doe')  # validation happens when the model is created
print(user)
#> name='John Doe'
try:
    user.name = 123  # validation also happens when the data is changed
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    name
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''

Type: bool

arbitrary_types_allowed

Whether arbitrary types are allowed for field types. Defaults to False.

from pydantic import BaseModel, ConfigDict, ValidationError

# This is not a pydantic model, it's an arbitrary class
class Pet:
    def __init__(self, name: str):
        self.name = name

class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    pet: Pet
    owner: str

pet = Pet(name='Hedwig')
# A simple check of instance type is used to validate the data
model = Model(owner='Harry', pet=pet)
print(model)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model.pet.name)
#> Hedwig
print(type(model.pet))
#> <class '__main__.Pet'>
try:
    # If the value is not an instance of the type, it's invalid
    Model(owner='Harry', pet='Hedwig')
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    pet
      Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str]
    '''

# Nothing in the instance of the arbitrary type is checked
# Here name probably should have been a str, but it's not validated
pet2 = Pet(name=42)
model2 = Model(owner='Harry', pet=pet2)
print(model2)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model2.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model2.pet.name)
#> 42
print(type(model2.pet))
#> <class '__main__.Pet'>

Type: bool

from_attributes

Whether to build models and look up discriminators of tagged unions using python object attributes.

Type: bool

loc_by_alias

Whether to use the actual key provided in the data (e.g. alias) for error locs rather than the field’s name. Defaults to True.

Type: bool

alias_generator

A callable that takes a field name and returns an alias for it.

If data source field names do not match your code style (e.g. CamelCase fields), you can automatically generate aliases using alias_generator:

from pydantic import BaseModel, ConfigDict
from pydantic.alias_generators import to_pascal

class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_pascal)

    name: str
    language_code: str

voice = Voice(Name='Filiz', LanguageCode='tr-TR')
print(voice.language_code)
#> tr-TR
print(voice.model_dump(by_alias=True))
#> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'}

Type: Callable[[str], str] | None

ignored_types

A tuple of types that may occur as values of class attributes without annotations. This is typically used for custom descriptors (classes that behave like property). If an attribute is set on a class without an annotation and has a type that is not in this tuple (or otherwise recognized by pydantic), an error will be raised. Defaults to ().

Type: tuple[type, ...]

allow_inf_nan

Whether to allow infinity (+inf and -inf) and NaN values for float fields. Defaults to True.

Type: bool

json_schema_extra

A dict or callable to provide extra JSON schema properties. Defaults to None.

Type: dict[str, object] | JsonSchemaExtraCallable | None

json_encoders

A dict of custom JSON encoders for specific types. Defaults to None.

Type: dict[type[object], JsonEncoder] | None

strict

(new in V2) If True, strict validation is applied to all fields on the model.

By default, Pydantic attempts to coerce values to the correct type, when possible.

There are situations in which you may want to disable this behavior, and instead raise an error if a value’s type does not match the field’s type annotation.

To configure strict mode for all fields on a model, you can set strict=True on the model.

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(strict=True)

    name: str
    age: int

See Strict Mode for more details.

See the Conversion Table for more details on how Pydantic converts data in both strict and lax modes.

Type: bool

revalidate_instances

When and how to revalidate models and dataclasses during validation. Accepts the string values of 'never', 'always' and 'subclass-instances'. Defaults to 'never'.

  • 'never' will not revalidate models and dataclasses during validation
  • 'always' will revalidate models and dataclasses during validation
  • 'subclass-instances' will revalidate models and dataclasses during validation if the instance is a subclass of the model or dataclass

By default, model and dataclass instances are not revalidated during validation.

from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='never'):  # 'never' is the default
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

# The assignment is not validated, unless validate_assignment is set in the config
my_user.hobbies = [1]
# Since revalidate_instances is 'never', this is not revalidated either
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)
#> user=SubUser(hobbies=['scuba diving'], sins=['lying'])

If you want to revalidate instances during validation, you can set revalidate_instances to 'always' in the model’s config.

from typing import List

from pydantic import BaseModel, ValidationError

class User(BaseModel, revalidate_instances='always'):
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
try:
    # The model is revalidated, since revalidate_instances is 'always'
    t = Transaction(user=my_user)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Transaction
    user.hobbies.0
      Input should be a valid string [type=string_type, input_value=1, input_type=int]
    '''

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
# With 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying'])
print(t)
#> user=User(hobbies=['scuba diving'])

It’s also possible to set revalidate_instances to 'subclass-instances' to only revalidate instances of subclasses of the model.

from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='subclass-instances'):
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
# Not revalidated: my_user is an instance of User itself, not of a subclass
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
# With 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying'])
print(t)
#> user=User(hobbies=['scuba diving'])

Type: Literal['always', 'never', 'subclass-instances']

ser_json_timedelta

The format of JSON serialized timedeltas. Accepts the string values of 'iso8601' and 'float'. Defaults to 'iso8601'.

  • 'iso8601' will serialize timedeltas to ISO 8601 durations.
  • 'float' will serialize timedeltas to the total number of seconds.

Type: Literal['iso8601', 'float']

ser_json_bytes

The encoding of JSON serialized bytes. Accepts the string values of 'utf8' and 'base64'. Defaults to 'utf8'.

  • 'utf8' will serialize bytes to UTF-8 strings.
  • 'base64' will serialize bytes to URL safe base64 strings.

Type: Literal['utf8', 'base64']
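Both serialization settings can be sketched together (the `Job` model is illustrative):

```python
import datetime

from pydantic import BaseModel, ConfigDict


class Job(BaseModel):
    model_config = ConfigDict(
        ser_json_timedelta='float', ser_json_bytes='base64'
    )

    duration: datetime.timedelta
    payload: bytes


job = Job(duration=datetime.timedelta(minutes=1, seconds=30), payload=b'hi')

# 'float' serializes the timedelta as total seconds; 'base64' encodes the bytes
dumped = job.model_dump(mode='json')
```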

validate_default

Whether to validate default values during validation. Defaults to False.

Type: bool
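With validate_default=True, the default value goes through validation (and lax-mode coercion) when it is used. A sketch (the `Settings` model is illustrative):

```python
from pydantic import BaseModel, ConfigDict


class Settings(BaseModel):
    model_config = ConfigDict(validate_default=True)

    port: int = '8080'  # the default is validated and coerced when used


settings = Settings()
```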

validate_return

Whether to validate the return value from call validators. Defaults to False.

Type: bool

protected_namespaces

A tuple of strings that prevent models from having fields whose names conflict with them. Defaults to ('model_',).

Pydantic prevents collisions between model attributes and BaseModel’s own methods by namespacing them with the prefix model_.

import warnings

from pydantic import BaseModel

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_prefixed_field: str

except UserWarning as e:
    print(e)
    '''
    Field "model_prefixed_field" has conflict with protected namespace "model_".

    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
    '''

You can customize this behavior using the protected_namespaces setting:

import warnings

from pydantic import BaseModel, ConfigDict

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_prefixed_field: str
        also_protect_field: str

        model_config = ConfigDict(
            protected_namespaces=('protect_me_', 'also_protect_')
        )

except UserWarning as e:
    print(e)
    '''
    Field "also_protect_field" has conflict with protected namespace "also_protect_".

    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_',)`.
    '''

While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error is raised if there is an actual collision with an existing attribute:

from pydantic import BaseModel

try:

    class Model(BaseModel):
        model_validate: str

except NameError as e:
    print(e)
    '''
    Field "model_validate" conflicts with member <bound method BaseModel.model_validate of <class 'pydantic.main.BaseModel'>> of protected namespace "model_".
    '''

Type: tuple[str, ...]

hide_input_in_errors

Whether to hide inputs when printing errors. Defaults to False.

Pydantic shows the input value and type when it raises ValidationError during the validation.

from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    a: str

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''

You can hide the input value and type by setting the hide_input_in_errors config to True.

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    a: str
    model_config = ConfigDict(hide_input_in_errors=True)

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type]
    '''

Type: bool

defer_build

Whether to defer model validator and serializer construction until the first model validation.

This can be useful to avoid the overhead of building models which are only used nested within other models, or when you want to manually define type namespace via Model.model_rebuild(_types_namespace=...). Defaults to False.

Type: bool
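A minimal sketch (the model name is illustrative) of opting into deferred building; nothing visibly changes at the call site, but validator and serializer construction is postponed until the model is first used:

```python
from pydantic import BaseModel, ConfigDict

class Heavy(BaseModel):
    # Validator/serializer construction is deferred until first use
    model_config = ConfigDict(defer_build=True)

    x: int

# The validator is built lazily on this first validation call
h = Heavy(x=1)
print(h.x)
#> 1
```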

plugin_settings

A dict of settings for plugins. Defaults to None.

See Pydantic Plugins for details.

Type: dict[str, object] | None

schema_generator

A custom core schema generator class to use when generating JSON schemas. Useful if you want to change the way types are validated across an entire model/schema. Defaults to None.

The GenerateSchema interface is subject to change, currently only the string_schema method is public.

See #6737 for details.

Type: type[_GenerateSchema] | None

json_schema_serialization_defaults_required

Whether fields with default values should be marked as required in the serialization schema. Defaults to False.

This ensures that the serialization schema will reflect the fact a field with a default will always be present when serializing the model, even though it is not required for validation.

However, there are scenarios where this may be undesirable — in particular, if you want to share the schema between validation and serialization, and don’t mind fields with defaults being marked as not required during serialization. See #7209 for more details.

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    a: str = 'a'

    model_config = ConfigDict(json_schema_serialization_defaults_required=True)

print(Model.model_json_schema(mode='validation'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'title': 'Model',
    'type': 'object',
}
'''
print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

Type: bool

json_schema_mode_override

If not None, the specified mode will be used to generate the JSON schema regardless of what mode was passed to the function call. Defaults to None.

This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema.

It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation and serialization that must both be referenced from the same schema; when this happens, we automatically append -Input to the definition reference for the validation schema and -Output to the definition reference for the serialization schema. By specifying a json_schema_mode_override though, this prevents the conflict between the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes from being added to the definition references.

from pydantic import BaseModel, ConfigDict, Json

class Model(BaseModel):
    a: Json[int]  # requires a string to validate, but will dump an int

print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'title': 'A', 'type': 'integer'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

class ForceInputModel(Model):
    # the following ensures that even with mode='serialization', we
    # will get the schema that would be generated for validation.
    model_config = ConfigDict(json_schema_mode_override='validation')

print(ForceInputModel.model_json_schema(mode='serialization'))
'''
{
    'properties': {
        'a': {
            'contentMediaType': 'application/json',
            'contentSchema': {'type': 'integer'},
            'title': 'A',
            'type': 'string',
        }
    },
    'required': ['a'],
    'title': 'ForceInputModel',
    'type': 'object',
}
'''

Type: Literal['validation', 'serialization', None]

coerce_numbers_to_str

If True, enables automatic coercion of any Number type to str in “lax” (non-strict) mode. Defaults to False.

Pydantic doesn’t allow number types (int, float, Decimal) to be coerced as type str by default.

from decimal import Decimal

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    value: str

try:
    print(Model(value=42))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid string [type=string_type, input_value=42, input_type=int]
    '''

class Model(BaseModel):
    model_config = ConfigDict(coerce_numbers_to_str=True)

    value: str

repr(Model(value=42).value)
#> "42"
repr(Model(value=42.13).value)
#> "42.13"
repr(Model(value=Decimal('42.13')).value)
#> "42.13"

Type: bool


GenerateJsonSchema

A class for generating JSON schemas.

This class generates JSON schemas based on configured parameters. The default schema dialect is https://json-schema.org/draft/2020-12/schema. The class uses by_alias to configure how fields with multiple names are handled and ref_template to format reference names.

Constructor Parameters

by_alias : bool Default: True

Whether to use field aliases when generating the schema.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The format string to use when generating reference names.
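As a sketch of how this class is typically customized (the subclass and model names are illustrative), a GenerateJsonSchema subclass can be passed to model_json_schema via the schema_generator argument:

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema

class DialectStampingSchema(GenerateJsonSchema):
    def generate(self, schema, mode='validation'):
        json_schema = super().generate(schema, mode=mode)
        # Stamp the dialect so consumers know which draft is targeted
        json_schema['$schema'] = self.schema_dialect
        return json_schema

class Person(BaseModel):
    name: str

schema = Person.model_json_schema(schema_generator=DialectStampingSchema)
print(schema['$schema'])
#> https://json-schema.org/draft/2020-12/schema
```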

Attributes

schema_dialect

Default: 'https://json-schema.org/draft/2020-12/schema'

ignored_warning_kinds

Type: set[JsonSchemaWarningKind] Default: {'skipped-choice'}

by_alias

Default: by_alias

ref_template

Default: ref_template

core_to_json_refs

Type: dict[CoreModeRef, JsonRef] Default: {}

core_to_defs_refs

Type: dict[CoreModeRef, DefsRef] Default: {}

defs_to_core_refs

Type: dict[DefsRef, CoreModeRef] Default: {}

json_to_defs_refs

Type: dict[JsonRef, DefsRef] Default: {}

definitions

Type: dict[DefsRef, JsonSchemaValue] Default: {}

mode

Type: JsonSchemaMode

Methods

build_schema_type_to_method

def build_schema_type_to_method() -> dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]

Builds a dictionary mapping fields to methods for generating JSON schemas.

Returns

dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]] — A dictionary containing the mapping of CoreSchemaOrFieldType to a handler method.

Raises
  • TypeError — If no method has been defined for generating a JSON schema for a given pydantic core schema type.

generate_definitions

def generate_definitions(
    inputs: Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]],
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]

Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references.

Returns

tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]] — A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions.
Parameters

inputs : Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]]

A sequence of tuples, where:

  • The first element is a JSON schema key type.
  • The second element is the JSON mode: either ‘validation’ or ‘serialization’.
  • The third element is a core schema.
Raises
  • PydanticUserError — Raised if the JSON schema generator has already been used to generate a JSON schema.

generate

def generate(schema: CoreSchema, mode: JsonSchemaMode = 'validation') -> JsonSchemaValue

Generates a JSON schema for a specified schema in a specified mode.

Returns

JsonSchemaValue — A JSON schema representing the specified schema.

Parameters

schema : CoreSchema

The core schema to generate the JSON schema for.

mode : JsonSchemaMode Default: 'validation'

The mode in which to generate the schema. Defaults to ‘validation’.

Raises
  • PydanticUserError — If the JSON schema generator has already been used to generate a JSON schema.
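As a sketch of direct usage (pydantic v2 assumed), a core schema can be obtained from a TypeAdapter and fed to a fresh generator instance; note that each instance may only be used once:

```python
from typing import List

from pydantic import TypeAdapter
from pydantic.json_schema import GenerateJsonSchema

# TypeAdapter exposes the underlying core schema to feed into generate()
adapter = TypeAdapter(List[int])
json_schema = GenerateJsonSchema().generate(adapter.core_schema, mode='validation')
print(json_schema)
```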

generate_inner

def generate_inner(schema: CoreSchemaOrField) -> JsonSchemaValue

Generates a JSON schema for a given core schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : CoreSchemaOrField

The given core schema.

any_schema

def any_schema(schema: core_schema.AnySchema) -> JsonSchemaValue

Generates a JSON schema that matches any value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.AnySchema

The core schema.

none_schema

def none_schema(schema: core_schema.NoneSchema) -> JsonSchemaValue

Generates a JSON schema that matches a None value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.NoneSchema

The core schema.

bool_schema

def bool_schema(schema: core_schema.BoolSchema) -> JsonSchemaValue

Generates a JSON schema that matches a bool value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.BoolSchema

The core schema.

int_schema

def int_schema(schema: core_schema.IntSchema) -> JsonSchemaValue

Generates a JSON schema that matches an int value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.IntSchema

The core schema.

float_schema

def float_schema(schema: core_schema.FloatSchema) -> JsonSchemaValue

Generates a JSON schema that matches a float value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.FloatSchema

The core schema.

decimal_schema

def decimal_schema(schema: core_schema.DecimalSchema) -> JsonSchemaValue

Generates a JSON schema that matches a decimal value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DecimalSchema

The core schema.

str_schema

def str_schema(schema: core_schema.StringSchema) -> JsonSchemaValue

Generates a JSON schema that matches a string value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.StringSchema

The core schema.

bytes_schema

def bytes_schema(schema: core_schema.BytesSchema) -> JsonSchemaValue

Generates a JSON schema that matches a bytes value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.BytesSchema

The core schema.

date_schema

def date_schema(schema: core_schema.DateSchema) -> JsonSchemaValue

Generates a JSON schema that matches a date value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DateSchema

The core schema.

time_schema

def time_schema(schema: core_schema.TimeSchema) -> JsonSchemaValue

Generates a JSON schema that matches a time value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TimeSchema

The core schema.

datetime_schema

def datetime_schema(schema: core_schema.DatetimeSchema) -> JsonSchemaValue

Generates a JSON schema that matches a datetime value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DatetimeSchema

The core schema.

timedelta_schema

def timedelta_schema(schema: core_schema.TimedeltaSchema) -> JsonSchemaValue

Generates a JSON schema that matches a timedelta value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TimedeltaSchema

The core schema.

literal_schema

def literal_schema(schema: core_schema.LiteralSchema) -> JsonSchemaValue

Generates a JSON schema that matches a literal value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.LiteralSchema

The core schema.

is_instance_schema

def is_instance_schema(schema: core_schema.IsInstanceSchema) -> JsonSchemaValue

Generates a JSON schema that checks if a value is an instance of a class, equivalent to Python’s isinstance function.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.IsInstanceSchema

The core schema.

is_subclass_schema

def is_subclass_schema(schema: core_schema.IsSubclassSchema) -> JsonSchemaValue

Generates a JSON schema that checks if a value is a subclass of a class, equivalent to Python’s issubclass function.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.IsSubclassSchema

The core schema.

callable_schema

def callable_schema(schema: core_schema.CallableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a callable value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.CallableSchema

The core schema.

list_schema

def list_schema(schema: core_schema.ListSchema) -> JsonSchemaValue

Generates a JSON schema that matches a list schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ListSchema

The core schema.

tuple_positional_schema

def tuple_positional_schema(
    schema: core_schema.TuplePositionalSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a positional tuple schema e.g. Tuple[int, str, bool].

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TuplePositionalSchema

The core schema.

tuple_variable_schema

def tuple_variable_schema(schema: core_schema.TupleVariableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a variable tuple schema e.g. Tuple[int, ...].

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TupleVariableSchema

The core schema.

set_schema

def set_schema(schema: core_schema.SetSchema) -> JsonSchemaValue

Generates a JSON schema that matches a set schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.SetSchema

The core schema.

frozenset_schema

def frozenset_schema(schema: core_schema.FrozenSetSchema) -> JsonSchemaValue

Generates a JSON schema that matches a frozenset schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.FrozenSetSchema

The core schema.

generator_schema

def generator_schema(schema: core_schema.GeneratorSchema) -> JsonSchemaValue

Returns a JSON schema that represents the provided GeneratorSchema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.GeneratorSchema

The schema.

dict_schema

def dict_schema(schema: core_schema.DictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a dict schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DictSchema

The core schema.

function_before_schema

def function_before_schema(
    schema: core_schema.BeforeValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-before schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.BeforeValidatorFunctionSchema

The core schema.

function_after_schema

def function_after_schema(
    schema: core_schema.AfterValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-after schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.AfterValidatorFunctionSchema

The core schema.

function_plain_schema

def function_plain_schema(
    schema: core_schema.PlainValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-plain schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.PlainValidatorFunctionSchema

The core schema.

function_wrap_schema

def function_wrap_schema(
    schema: core_schema.WrapValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-wrap schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.WrapValidatorFunctionSchema

The core schema.

default_schema

def default_schema(schema: core_schema.WithDefaultSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema with a default value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.WithDefaultSchema

The core schema.

nullable_schema

def nullable_schema(schema: core_schema.NullableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows null values.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.NullableSchema

The core schema.

union_schema

def union_schema(schema: core_schema.UnionSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching any of the given schemas.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.UnionSchema

The core schema.

tagged_union_schema

def tagged_union_schema(schema: core_schema.TaggedUnionSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TaggedUnionSchema

The core schema.

chain_schema

def chain_schema(schema: core_schema.ChainSchema) -> JsonSchemaValue

Generates a JSON schema that matches a core_schema.ChainSchema.

When generating a schema for validation, we return the validation JSON schema for the first step in the chain. For serialization, we return the serialization JSON schema for the last step in the chain.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ChainSchema

The core schema.

lax_or_strict_schema

def lax_or_strict_schema(schema: core_schema.LaxOrStrictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching either the lax schema or the strict schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.LaxOrStrictSchema

The core schema.

json_or_python_schema

def json_or_python_schema(schema: core_schema.JsonOrPythonSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the Python schema.

The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override this method.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.JsonOrPythonSchema

The core schema.

typed_dict_schema

def typed_dict_schema(schema: core_schema.TypedDictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a typed dict.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TypedDictSchema

The core schema.

typed_dict_field_schema

def typed_dict_field_schema(schema: core_schema.TypedDictField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a typed dict field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TypedDictField

The core schema.

dataclass_field_schema

def dataclass_field_schema(schema: core_schema.DataclassField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DataclassField

The core schema.

model_field_schema

def model_field_schema(schema: core_schema.ModelField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ModelField

The core schema.

computed_field_schema

def computed_field_schema(schema: core_schema.ComputedField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a computed field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ComputedField

The core schema.

model_schema

def model_schema(schema: core_schema.ModelSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ModelSchema

The core schema.

resolve_schema_to_update

def resolve_schema_to_update(json_schema: JsonSchemaValue) -> JsonSchemaValue

Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema.

Returns

JsonSchemaValue — The resolved schema.

Parameters

json_schema : JsonSchemaValue

The schema to resolve.

model_fields_schema

def model_fields_schema(schema: core_schema.ModelFieldsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model’s fields.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ModelFieldsSchema

The core schema.

field_is_present

def field_is_present(field: CoreSchemaField) -> bool

Whether the field should be included in the generated JSON schema.

Returns

bool — True if the field should be included in the generated JSON schema, False otherwise.

Parameters

field : CoreSchemaField

The schema for the field itself.

field_is_required

def field_is_required(
    field: core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField,
    total: bool,
) -> bool

Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.)

Returns

bool — True if the field should be marked as required in the generated JSON schema, False otherwise.

Parameters

field : core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField

The schema for the field itself.

total : bool

Only applies to TypedDictFields. Indicates if the TypedDict this field belongs to is total, in which case any fields that don’t explicitly specify required=False are required.

dataclass_args_schema

def dataclass_args_schema(schema: core_schema.DataclassArgsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass’s constructor arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DataclassArgsSchema

The core schema.

dataclass_schema

def dataclass_schema(schema: core_schema.DataclassSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DataclassSchema

The core schema.

arguments_schema

def arguments_schema(schema: core_schema.ArgumentsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function’s arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ArgumentsSchema

The core schema.

kw_arguments_schema

def kw_arguments_schema(
    arguments: list[core_schema.ArgumentsParameter],
    var_kwargs_schema: CoreSchema | None,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function’s keyword arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

arguments : list[core_schema.ArgumentsParameter]

The schemas for the keyword-argument parameters.

var_kwargs_schema : CoreSchema | None

The schema for any variable keyword arguments (**kwargs), if present.

p_arguments_schema

def p_arguments_schema(
    arguments: list[core_schema.ArgumentsParameter],
    var_args_schema: CoreSchema | None,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function’s positional arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

arguments : list[core_schema.ArgumentsParameter]

The schemas for the positional-argument parameters.

var_args_schema : CoreSchema | None

The schema for any variable positional arguments (*args), if present.

get_argument_name

def get_argument_name(argument: core_schema.ArgumentsParameter) -> str

Retrieves the name of an argument.

Returns

str — The name of the argument.

Parameters

argument : core_schema.ArgumentsParameter

The core schema.

call_schema

def call_schema(schema: core_schema.CallSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function call.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.CallSchema

The core schema.

custom_error_schema

def custom_error_schema(schema: core_schema.CustomErrorSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a custom error.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.CustomErrorSchema

The core schema.

json_schema

def json_schema(schema: core_schema.JsonSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a JSON object.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.JsonSchema

The core schema.

url_schema

def url_schema(schema: core_schema.UrlSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a URL.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.UrlSchema

The core schema.

multi_host_url_schema

def multi_host_url_schema(schema: core_schema.MultiHostUrlSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.MultiHostUrlSchema

The core schema.

uuid_schema

def uuid_schema(schema: core_schema.UuidSchema) -> JsonSchemaValue

Generates a JSON schema that matches a UUID.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.UuidSchema

The core schema.

definitions_schema

def definitions_schema(schema: core_schema.DefinitionsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a JSON object with definitions.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DefinitionsSchema

The core schema.

definition_ref_schema

def definition_ref_schema(
    schema: core_schema.DefinitionReferenceSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that references a definition.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DefinitionReferenceSchema

The core schema.

ser_schema

def ser_schema(
    schema: core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema,
) -> JsonSchemaValue | None

Generates a JSON schema that matches a schema that defines a serialized object.

Returns

JsonSchemaValue | None — The generated JSON schema.

Parameters

schema : core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema

The core schema.

get_title_from_name

def get_title_from_name(name: str) -> str

Retrieves a title from a name.

Returns

str — The title.

Parameters

name : str

The name to retrieve a title from.

field_title_should_be_set

def field_title_should_be_set(schema: CoreSchemaOrField) -> bool

Returns true if a field with the given schema should have a title set based on the field name.

Intuitively, we want this to return true for schemas that wouldn’t otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses).

Returns

bool — True if the field should have a title set, False otherwise.

Parameters

schema : CoreSchemaOrField

The schema to check.

normalize_name

def normalize_name(name: str) -> str

Normalizes a name to be used as a key in a dictionary.

Returns

str — The normalized name.

Parameters

name : str

The name to normalize.

get_defs_ref

def get_defs_ref(core_mode_ref: CoreModeRef) -> DefsRef

Override this method to change the way that definitions keys are generated from a core reference.

Returns

DefsRef — The definitions key.

Parameters

core_mode_ref : CoreModeRef

The core reference.

get_cache_defs_ref_schema

def get_cache_defs_ref_schema(core_ref: CoreRef) -> tuple[DefsRef, JsonSchemaValue]

This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition.

Returns

tuple[DefsRef, JsonSchemaValue] — A tuple of the definitions reference and the JSON schema that will refer to it.

Parameters

core_ref : CoreRef

The core reference to get the definitions reference for.

handle_ref_overrides

def handle_ref_overrides(json_schema: JsonSchemaValue) -> JsonSchemaValue

It is not valid for a schema with a top-level $ref to have sibling keys.

During our own schema generation, we treat sibling keys as overrides to the referenced schema, but this is not how the official JSON schema spec works.

Because of this, we first remove any sibling keys that are redundant with the referenced schema, then if any remain, we transform the schema from a top-level ‘$ref’ to use allOf to move the $ref out of the top level. (See bottom of https://swagger.io/docs/specification/using-ref/ for a reference about this behavior)

Returns

JsonSchemaValue

get_schema_from_definitions

def get_schema_from_definitions(json_ref: JsonRef) -> JsonSchemaValue | None
Returns

JsonSchemaValue | None

encode_default

def encode_default(dft: Any) -> Any

Encode a default value to a JSON-serializable value.

This is used to encode default values for fields in the generated JSON schema.

Returns

Any — The encoded default value.

Parameters

dft : Any

The default value to encode.

update_with_validations

def update_with_validations(
    json_schema: JsonSchemaValue,
    core_schema: CoreSchema,
    mapping: dict[str, str],
) -> None

Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema.

Returns

None

Parameters

json_schema : JsonSchemaValue

The JSON schema to update.

core_schema : CoreSchema

The core schema to get the validations from.

mapping : dict[str, str]

A mapping from core_schema attribute names to the corresponding JSON schema attribute names.

get_flattened_anyof

def get_flattened_anyof(schemas: list[JsonSchemaValue]) -> JsonSchemaValue

Returns

JsonSchemaValue

get_json_ref_counts

def get_json_ref_counts(json_schema: JsonSchemaValue) -> dict[JsonRef, int]

Get all values corresponding to the key ‘$ref’ anywhere in the json_schema.

Returns

dict[JsonRef, int]

handle_invalid_for_json_schema

def handle_invalid_for_json_schema(
    schema: CoreSchemaOrField,
    error_info: str,
) -> JsonSchemaValue

Returns

JsonSchemaValue

emit_warning

def emit_warning(kind: JsonSchemaWarningKind, detail: str) -> None

This method simply emits PydanticJsonSchemaWarnings based on handling in the render_warning_message method.

Returns

None

render_warning_message

def render_warning_message(kind: JsonSchemaWarningKind, detail: str) -> str | None

This method is responsible for ignoring warnings as desired, and for formatting the warning messages.

You can override the value of ignored_warning_kinds in a subclass of GenerateJsonSchema to modify what warnings are generated. If you want more control, you can override this method; just return None in situations where you don’t want warnings to be emitted.

Returns

str | None — The formatted warning message, or None if no warning should be emitted.

Parameters

kind : JsonSchemaWarningKind

The kind of warning to render. It can be one of the following:

  • ‘skipped-choice’: A choice field was skipped because it had no valid choices.
  • ‘non-serializable-default’: A default value was skipped because it was not JSON-serializable.

detail : str

A string with additional details about the warning.


TypeAdapter

Bases: Generic[T]

Type adapters provide a flexible way to perform validation and serialization based on a Python type.

A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more).

Note that TypeAdapter is not an actual type, so you cannot use it in type annotations.
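
For instance, a plain dataclass can be validated through an adapter:

```python
from dataclasses import dataclass

from pydantic import TypeAdapter


@dataclass
class Point:
    x: int
    y: int


adapter = TypeAdapter(Point)

# Input values are coerced just as they would be for a BaseModel field.
point = adapter.validate_python({'x': '1', 'y': 2})
print(point)
#> Point(x=1, y=2)
```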

Attributes

core_schema

Default: core_schema

validator

Default: validator

serializer

Default: serializer

Methods

__new__

def __new__(cls, __type: type[T], config: ConfigDict | None = ...) -> TypeAdapter[T]
def __new__(cls, __type: T, config: ConfigDict | None = ...) -> TypeAdapter[T]

Create a new TypeAdapter instance for the given type.

Returns

TypeAdapter[T]

__init__

def __init__(
    type: type[T],
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
) -> None
def __init__(type: T, config: ConfigDict | None = None, _parent_depth: int = 2) -> None

Initializes the TypeAdapter object.

Returns

None

validate_python

def validate_python(
    __object: Any,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T

Validate a Python object against the model.

Returns

T — The validated object.

Parameters

__object : Any

The Python object to validate against the model.

strict : bool | None Default: None

Whether to strictly check types.

from_attributes : bool | None Default: None

Whether to extract data from object attributes.

context : dict[str, Any] | None Default: None

Additional context to pass to the validator.
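
For example, strict mode can be toggled per call:

```python
from pydantic import TypeAdapter, ValidationError

adapter = TypeAdapter(bool)

# Lax mode (the default) coerces common boolean strings.
print(adapter.validate_python('yes'))
#> True

try:
    # Strict mode rejects the string coercion.
    adapter.validate_python('yes', strict=True)
except ValidationError as e:
    print(e.errors()[0]['type'])
```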

validate_json

def validate_json(
    __data: str | bytes,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T

Validate a JSON string or bytes against the model.

Returns

T — The validated object.

Parameters

__data : str | bytes

The JSON data to validate against the model.

strict : bool | None Default: None

Whether to strictly check types.

context : dict[str, Any] | None Default: None

Additional context to use during validation.
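
For example, JSON can be parsed and validated in one step, with no intermediate json.loads call:

```python
from datetime import date
from typing import Dict

from pydantic import TypeAdapter

adapter = TypeAdapter(Dict[str, date])

# Parse straight from a JSON string; values are coerced to the target type.
result = adapter.validate_json('{"launch": "2023-09-01"}')
print(result)
#> {'launch': datetime.date(2023, 9, 1)}
```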

validate_strings

def validate_strings(
    __obj: Any,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T

Validate an object containing string data against the model.

Returns

T — The validated object.

Parameters

__obj : Any

The object containing string data to validate.

strict : bool | None Default: None

Whether to strictly check types.

context : dict[str, Any] | None Default: None

Additional context to use during validation.

get_default_value

def get_default_value(
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[T] | None

Get the default value for the wrapped type.

Returns

Some[T] | None — The default value wrapped in a Some if there is one or None if not.

Parameters

strict : bool | None Default: None

Whether to strictly check types.

context : dict[str, Any] | None Default: None

Additional context to pass to the validator.
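
For example, a default attached via Annotated metadata can be read back (the Field default here is an illustrative assumption, not the only way to set one):

```python
from typing_extensions import Annotated

from pydantic import Field, TypeAdapter

adapter = TypeAdapter(Annotated[int, Field(default=42)])

default = adapter.get_default_value()
print(default.value if default is not None else 'no default')
#> 42

# A type with no default returns None rather than raising.
print(TypeAdapter(int).get_default_value())
#> None
```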

dump_python

def dump_python(
    __instance: T,
    mode: Literal['json', 'python'] = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> Any

Dump an instance of the adapted type to a Python object.

Returns

Any — The serialized object.

Parameters

__instance : T

The Python object to serialize.

mode : Literal['json', 'python'] Default: 'python'

The output format.

include : IncEx | None Default: None

Fields to include in the output.

exclude : IncEx | None Default: None

Fields to exclude from the output.

by_alias : bool Default: False

Whether to use alias names for field names.

exclude_unset : bool Default: False

Whether to exclude unset fields.

exclude_defaults : bool Default: False

Whether to exclude fields with default values.

exclude_none : bool Default: False

Whether to exclude fields with None values.

round_trip : bool Default: False

Whether to output the serialized data in a way that is compatible with deserialization.

warnings : bool Default: True

Whether to display serialization warnings.
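
For example, the mode parameter controls whether rich Python objects are preserved or converted to JSON-compatible types:

```python
from datetime import date
from typing import List

from pydantic import TypeAdapter

adapter = TypeAdapter(List[date])
dates = [date(2023, 1, 1)]

# mode='python' (the default) keeps Python objects as-is.
python_out = adapter.dump_python(dates)
print(python_out)
#> [datetime.date(2023, 1, 1)]

# mode='json' produces only JSON-compatible types.
json_out = adapter.dump_python(dates, mode='json')
print(json_out)
#> ['2023-01-01']
```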

dump_json

def dump_json(
    __instance: T,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> bytes

Serialize an instance of the adapted type to JSON.

Returns

bytes — The JSON representation of the given instance as bytes.

Parameters

__instance : T

The instance to be serialized.

indent : int | None Default: None

Number of spaces for JSON indentation.

include : IncEx | None Default: None

Fields to include.

exclude : IncEx | None Default: None

Fields to exclude.

by_alias : bool Default: False

Whether to use alias names for field names.

exclude_unset : bool Default: False

Whether to exclude unset fields.

exclude_defaults : bool Default: False

Whether to exclude fields with default values.

exclude_none : bool Default: False

Whether to exclude fields with a value of None.

round_trip : bool Default: False

Whether to serialize and deserialize the instance to ensure round-tripping.

warnings : bool Default: True

Whether to emit serialization warnings.
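
For example:

```python
from typing import List

from typing_extensions import TypedDict

from pydantic import TypeAdapter


class User(TypedDict):
    name: str
    id: int


adapter = TypeAdapter(List[User])
users = [{'name': 'Fred', 'id': 3}]

# Note the return type is bytes, not str.
payload = adapter.dump_json(users)
print(payload)
#> b'[{"name":"Fred","id":3}]'
```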

json_schema

def json_schema(
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]

Generate a JSON schema for the adapted type.

Returns

dict[str, Any] — The JSON schema for the model as a dictionary.

Parameters

by_alias : bool Default: True

Whether to use alias names for field names.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The format string used for generating $ref strings.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

The generator class used for creating the schema.

mode : JsonSchemaMode Default: 'validation'

The mode to use for schema generation.
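
For example:

```python
from typing import List

from pydantic import TypeAdapter

# Generate a JSON schema for a type that has no model_json_schema method.
schema = TypeAdapter(List[int]).json_schema()
print(schema)
#> {'items': {'type': 'integer'}, 'type': 'array'}
```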

json_schemas

@staticmethod
def json_schemas(
    __inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]

Generate a JSON schema including definitions from multiple type adapters.

Returns

tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue] — A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.
Parameters

__inputs : Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]]

Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema.

by_alias : bool Default: True

Whether to use alias names.

title : str | None Default: None

The title for the schema.

description : str | None Default: None

The description for the schema.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The format string used for generating $ref strings.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

The generator class used for creating the schema.
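
For example, schemas from several adapters can be generated together (the 'ints' and 'name' keys here are arbitrary, for illustration only):

```python
from typing import List

from pydantic import TypeAdapter

inputs = [
    ('ints', 'validation', TypeAdapter(List[int])),
    ('name', 'validation', TypeAdapter(str)),
]

# The first element maps (key, mode) pairs to their schemas; the second
# holds shared definitions plus the optional title/description.
schemas, defs = TypeAdapter.json_schemas(inputs, title='My Schemas')
print(schemas[('ints', 'validation')])
#> {'items': {'type': 'integer'}, 'type': 'array'}
```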


create_schema_validator

def create_schema_validator(
    schema: CoreSchema,
    config: CoreConfig | None = None,
    plugin_settings: dict[str, Any] | None = None,
) -> SchemaValidator

Create a SchemaValidator or PluggableSchemaValidator if plugins are installed.

Returns

SchemaValidator — If plugins are installed then return PluggableSchemaValidator, otherwise return SchemaValidator.


DEFAULT_REF_TEMPLATE

The default format string used to generate reference names.

Default: '#/$defs/{model}'

JsonSchemaKeyT

Default: TypeVar('JsonSchemaKeyT', bound=Hashable)

JsonSchemaMode

A type alias that represents the mode of a JSON schema; either ‘validation’ or ‘serialization’.

For some types, the inputs to validation differ from the outputs of serialization. For example, computed fields will only be present when serializing, and should not be provided when validating. This flag provides a way to indicate whether you want the JSON schema required for validation inputs, or that will be matched by serialization outputs.

Default: Literal['validation', 'serialization']
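
For example, a computed field appears only in the serialization schema:

```python
from pydantic import BaseModel, TypeAdapter, computed_field


class Square(BaseModel):
    side: float

    @computed_field
    @property
    def area(self) -> float:
        return self.side**2


ta = TypeAdapter(Square)
validation_schema = ta.json_schema(mode='validation')
serialization_schema = ta.json_schema(mode='serialization')

# The computed field is an output of serialization, not an input to validation.
print('area' in validation_schema['properties'])
#> False
print('area' in serialization_schema['properties'])
#> True
```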

JsonSchemaValue

A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary values.

Default: Dict[str, Any]

T

Default: TypeVar('T')

IncEx

Default: Union[Set[int], Set[str], Dict[int, Any], Dict[str, Any]]