
JSON Schema

Usage docs: https://docs.pydantic.dev/2.5/concepts/json_schema/

The json_schema module contains classes and functions that allow customizing how JSON Schema is generated.

In general you shouldn’t need to use this module directly; instead, you can use BaseModel.model_json_schema and TypeAdapter.json_schema.

PydanticDeprecatedSince26

Bases: PydanticDeprecationWarning

A specific PydanticDeprecationWarning subclass defining functionality deprecated since Pydantic 2.6.


GetJsonSchemaHandler

Handler to call into the next JSON schema generation function.

Attributes

mode

Type: JsonSchemaMode

Methods

call

def __call__(core_schema: CoreSchemaOrField) -> JsonSchemaValue

Call the inner handler and get the JsonSchemaValue it returns. This will call the next JSON schema modifying function up until it calls into pydantic.json_schema.GenerateJsonSchema, which will raise a pydantic.errors.PydanticInvalidForJsonSchema error if it cannot generate a JSON schema.

Returns

JsonSchemaValue — The JSON schema generated by the inner JSON schema modifying functions.

Parameters

core_schema : CoreSchemaOrField

A pydantic_core.core_schema.CoreSchema.

resolve_ref_schema

def resolve_ref_schema(maybe_ref_json_schema: JsonSchemaValue) -> JsonSchemaValue

Get the real schema for a {"$ref": ...} schema. If the schema given is not a $ref schema, it will be returned as is. This means you don’t have to check before calling this function.

Returns

JsonSchemaValue — A JsonSchemaValue that has no $ref.

Parameters

maybe_ref_json_schema : JsonSchemaValue

A JsonSchemaValue which may be a $ref schema.

Raises
  • LookupError — If the ref is not found.
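As an illustration of how these handler methods are typically used, a custom type can hook into JSON schema generation via __get_pydantic_json_schema__, call the handler to obtain the inner schema, resolve any $ref, and then post-process the result. This is a minimal sketch; the Username class and the 'examples' value it injects are invented for the example:

```python
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic.json_schema import JsonSchemaValue
from pydantic_core import core_schema


class Username(str):
    @classmethod
    def __get_pydantic_core_schema__(cls, source_type, handler):
        return core_schema.str_schema()

    @classmethod
    def __get_pydantic_json_schema__(
        cls, schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        # Call the next JSON schema generation function in the chain.
        json_schema = handler(schema)
        # Safe to call even when the schema is not a $ref schema.
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema['examples'] = ['alice']
        return json_schema


class Model(BaseModel):
    name: Username


print(Model.model_json_schema()['properties']['name'])
```

The returned property schema contains both the base 'type': 'string' produced by the inner handler and the 'examples' entry added on top of it.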

PydanticInvalidForJsonSchema

Bases: PydanticUserError

An error raised during failures to generate a JSON schema for some CoreSchema.


PydanticSchemaGenerationError

Bases: PydanticUserError

An error raised during failures to generate a CoreSchema for some type.


PydanticUserError

Bases: PydanticErrorMixin, TypeError

An error raised due to incorrect use of Pydantic.


ConfigDict

Bases: TypedDict

A TypedDict for configuring Pydantic behaviour.

Attributes

title

The title for the generated JSON schema, defaults to the model’s name

Type: str | None

model_title_generator

A callable that takes a model class and returns the title for it. Defaults to None.

Type: Callable[[type], str] | None

field_title_generator

A callable that takes a field’s name and info and returns a title for it. Defaults to None.

Type: Callable[[str, FieldInfo | ComputedFieldInfo], str] | None

str_to_lower

Whether to convert all characters to lowercase for str types. Defaults to False.

Type: bool

str_to_upper

Whether to convert all characters to uppercase for str types. Defaults to False.

Type: bool

str_strip_whitespace

Whether to strip leading and trailing whitespace for str types.

Type: bool

str_min_length

The minimum length for str types. Defaults to None.

Type: int

str_max_length

The maximum length for str types. Defaults to None.

Type: int | None
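The str_* settings above compose: a sketch combining them on one model (the Tag class and values are illustrative):

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Tag(BaseModel):
    model_config = ConfigDict(
        str_strip_whitespace=True,  # strip leading/trailing whitespace
        str_to_lower=True,          # lowercase all str field values
        str_min_length=2,
        str_max_length=10,
    )

    name: str


tag = Tag(name='  PYTHON  ')
print(tag.name)
#> python

try:
    Tag(name='x')  # too short: violates str_min_length=2
except ValidationError as e:
    print(e.errors()[0]['type'])
```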

extra

Whether to ignore, allow, or forbid extra attributes during model initialization. Defaults to 'ignore'.

You can configure how pydantic handles the attributes that are not defined in the model:

  • allow - Allow any extra attributes.
  • forbid - Forbid any extra attributes.
  • ignore - Ignore any extra attributes.
from pydantic import BaseModel, ConfigDict


class User(BaseModel):
  model_config = ConfigDict(extra='ignore')  # (1)

  name: str


user = User(name='John Doe', age=20)  # (2)
print(user)
#> name='John Doe'

  1. This is the default behaviour.
  2. The age argument is ignored.

Instead, with extra='allow', the age argument is included:

from pydantic import BaseModel, ConfigDict


class User(BaseModel):
  model_config = ConfigDict(extra='allow')

  name: str


user = User(name='John Doe', age=20)  # (1)
print(user)
#> name='John Doe' age=20

  1. The age argument is included.

With extra='forbid', an error is raised:

from pydantic import BaseModel, ConfigDict, ValidationError


class User(BaseModel):
    model_config = ConfigDict(extra='forbid')

    name: str


try:
    User(name='John Doe', age=20)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    age
    Extra inputs are not permitted [type=extra_forbidden, input_value=20, input_type=int]
    '''

Type: ExtraValues | None

frozen

Whether models are faux-immutable, i.e. whether __setattr__ is allowed, and also generates a __hash__() method for the model. This makes instances of the model potentially hashable if all the attributes are hashable. Defaults to False.

Type: bool
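As a sketch of the frozen behaviour (class and field names are illustrative): assignment raises a ValidationError, and instances whose fields are all hashable can be used as dict keys.

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Point(BaseModel):
    model_config = ConfigDict(frozen=True)

    x: int
    y: int


p = Point(x=1, y=2)
try:
    p.x = 3  # assignment on a frozen model raises a ValidationError
except ValidationError:
    print('cannot mutate a frozen model')

# frozen also generates __hash__, so equal instances can key a dict
print({p: 'origin'}[Point(x=1, y=2)])
#> origin
```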

populate_by_name

Whether an aliased field may be populated by its name as given by the model attribute, as well as the alias. Defaults to False.

from pydantic import BaseModel, ConfigDict, Field


class User(BaseModel):
  model_config = ConfigDict(populate_by_name=True)

  name: str = Field(alias='full_name')  # (1)
  age: int


user = User(full_name='John Doe', age=20)  # (2)
print(user)
#> name='John Doe' age=20
user = User(name='John Doe', age=20)  # (3)
print(user)
#> name='John Doe' age=20

  1. The field 'name' has an alias 'full_name'.
  2. The model is populated by the alias 'full_name'.
  3. The model is populated by the field name 'name'.

Type: bool

use_enum_values

Whether to populate models with the value property of enums, rather than the raw enum. This may be useful if you want to serialize model.model_dump() later. Defaults to False.

from enum import Enum
from typing import Optional

from pydantic import BaseModel, ConfigDict, Field


class SomeEnum(Enum):
    FOO = 'foo'
    BAR = 'bar'
    BAZ = 'baz'


class SomeModel(BaseModel):
    model_config = ConfigDict(use_enum_values=True)

    some_enum: SomeEnum
    another_enum: Optional[SomeEnum] = Field(default=SomeEnum.FOO, validate_default=True)


model1 = SomeModel(some_enum=SomeEnum.BAR)
print(model1.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'foo'}

model2 = SomeModel(some_enum=SomeEnum.BAR, another_enum=SomeEnum.BAZ)
print(model2.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'baz'}

Type: bool

validate_assignment

Whether to validate the data when the model is changed. Defaults to False.

The default behavior of Pydantic is to validate the data when the model is created.

In case the user changes the data after the model is created, the model is not revalidated.

from pydantic import BaseModel

class User(BaseModel):
  name: str

user = User(name='John Doe')  # (1)
print(user)
#> name='John Doe'
user.name = 123  # (2)
print(user)
#> name=123

  1. The validation happens only when the model is created.
  2. The validation does not happen when the data is changed.

In case you want to revalidate the model when the data is changed, you can use validate_assignment=True:

from pydantic import BaseModel, ValidationError

class User(BaseModel, validate_assignment=True):  # (1)
  name: str

user = User(name='John Doe')  # (2)
print(user)
#> name='John Doe'
try:
  user.name = 123  # (3)
except ValidationError as e:
  print(e)
  '''
  1 validation error for User
  name
    Input should be a valid string [type=string_type, input_value=123, input_type=int]
  '''

  1. You can either use class keyword arguments, or model_config to set validate_assignment=True.
  2. The validation happens when the model is created.
  3. The validation also happens when the data is changed.

Type: bool

arbitrary_types_allowed

Whether arbitrary types are allowed for field types. Defaults to False.

from pydantic import BaseModel, ConfigDict, ValidationError

# This is not a pydantic model, it's an arbitrary class
class Pet:
    def __init__(self, name: str):
        self.name = name

class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    pet: Pet
    owner: str

pet = Pet(name='Hedwig')
# A simple check of instance type is used to validate the data
model = Model(owner='Harry', pet=pet)
print(model)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model.pet.name)
#> Hedwig
print(type(model.pet))
#> <class '__main__.Pet'>
try:
    # If the value is not an instance of the type, it's invalid
    Model(owner='Harry', pet='Hedwig')
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    pet
      Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str]
    '''

# Nothing in the instance of the arbitrary type is checked
# Here name probably should have been a str, but it's not validated
pet2 = Pet(name=42)
model2 = Model(owner='Harry', pet=pet2)
print(model2)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model2.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model2.pet.name)
#> 42
print(type(model2.pet))
#> <class '__main__.Pet'>

Type: bool

from_attributes

Whether to build models and look up discriminators of tagged unions using python object attributes.

Type: bool
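A minimal sketch of from_attributes, validating a plain Python object (such as an ORM row) by reading its attributes; the PersonRow class is invented for the example:

```python
from pydantic import BaseModel, ConfigDict


class PersonRow:
    # A plain object, e.g. a database row returned by an ORM.
    def __init__(self):
        self.name = 'Ada'
        self.age = 36


class Person(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    name: str
    age: int


# Attributes of the object are looked up by field name.
p = Person.model_validate(PersonRow())
print(p)
#> name='Ada' age=36
```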

loc_by_alias

Whether to use the actual key provided in the data (e.g. alias) for error locs rather than the field’s name. Defaults to True.

Type: bool
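A short sketch of the difference loc_by_alias makes (field and alias names are illustrative): with the default True, error locations use the key that was actually provided (the alias); with False, they use the field name.

```python
from pydantic import BaseModel, ConfigDict, Field, ValidationError


class User(BaseModel):
    model_config = ConfigDict(loc_by_alias=False)

    name: str = Field(alias='FullName')


try:
    User(FullName=123)  # int is not coerced to str in lax mode
except ValidationError as e:
    # With loc_by_alias=False the error loc uses the field name,
    # not the 'FullName' alias that was actually provided.
    print(e.errors()[0]['loc'])
    #> ('name',)
```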

alias_generator

A callable that takes a field name and returns an alias for it or an instance of AliasGenerator. Defaults to None.

When using a callable, the alias generator is used for both validation and serialization. If you want to use different alias generators for validation and serialization, you can use AliasGenerator instead.

If data source field names do not match your code style (e.g. CamelCase fields), you can automatically generate aliases using alias_generator. Here’s an example with a basic callable:

from pydantic import BaseModel, ConfigDict
from pydantic.alias_generators import to_pascal

class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_pascal)

    name: str
    language_code: str

voice = Voice(Name='Filiz', LanguageCode='tr-TR')
print(voice.language_code)
#> tr-TR
print(voice.model_dump(by_alias=True))
#> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'}

If you want to use different alias generators for validation and serialization, you can use AliasGenerator.

from pydantic import AliasGenerator, BaseModel, ConfigDict
from pydantic.alias_generators import to_camel, to_pascal

class Athlete(BaseModel):
    first_name: str
    last_name: str
    sport: str

    model_config = ConfigDict(
        alias_generator=AliasGenerator(
            validation_alias=to_camel,
            serialization_alias=to_pascal,
        )
    )

athlete = Athlete(firstName='John', lastName='Doe', sport='track')
print(athlete.model_dump(by_alias=True))
#> {'FirstName': 'John', 'LastName': 'Doe', 'Sport': 'track'}

Type: Callable[[str], str] | AliasGenerator | None

ignored_types

A tuple of types that may occur as values of class attributes without annotations. This is typically used for custom descriptors (classes that behave like property). If an attribute is set on a class without an annotation and has a type that is not in this tuple (or otherwise recognized by pydantic), an error will be raised. Defaults to ().

Type: tuple[type, ...]
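A sketch of ignored_types with an invented helper class: without the setting, assigning an unannotated class attribute of an unrecognized type raises an error at class-definition time.

```python
from pydantic import BaseModel, ConfigDict


class UnitConverter:
    """A descriptor-like helper class that pydantic doesn't know about."""

    def __init__(self, factor: float):
        self.factor = factor


class Model(BaseModel):
    # Without ignored_types, the unannotated class attribute below
    # would raise an error when the class is defined.
    model_config = ConfigDict(ignored_types=(UnitConverter,))

    to_km = UnitConverter(1.609)  # left as a plain class attribute
    miles: float


m = Model(miles=3.0)
print(type(m.to_km).__name__)
#> UnitConverter
```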

allow_inf_nan

Whether to allow infinity (+inf and -inf) and NaN values for float fields. Defaults to True.

Type: bool

json_schema_extra

A dict or callable to provide extra JSON schema properties. Defaults to None.

Type: JsonDict | JsonSchemaExtraCallable | None
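A minimal sketch of the dict form: the extra keys are merged into the generated JSON schema at the model level (the 'examples' entry here is illustrative).

```python
from pydantic import BaseModel, ConfigDict


class Model(BaseModel):
    model_config = ConfigDict(
        json_schema_extra={'examples': [{'a': 'hello'}]}
    )

    a: str


schema = Model.model_json_schema()
print(schema['examples'])
#> [{'a': 'hello'}]
```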

json_encoders

A dict of custom JSON encoders for specific types. Defaults to None.

Type: dict[type[object], JsonEncoder] | None

strict

(new in V2) If True, strict validation is applied to all fields on the model.

By default, Pydantic attempts to coerce values to the correct type, when possible.

There are situations in which you may want to disable this behavior, and instead raise an error if a value’s type does not match the field’s type annotation.

To configure strict mode for all fields on a model, you can set strict=True on the model.

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(strict=True)

    name: str
    age: int

See Strict Mode for more details.

See the Conversion Table for more details on how Pydantic converts data in both strict and lax modes.

Type: bool

revalidate_instances

When and how to revalidate models and dataclasses during validation. Accepts the string values of 'never', 'always' and 'subclass-instances'. Defaults to 'never'.

  • 'never' will not revalidate models and dataclasses during validation
  • 'always' will revalidate models and dataclasses during validation
  • 'subclass-instances' will revalidate models and dataclasses during validation if the instance is a subclass of the model or dataclass

By default, model and dataclass instances are not revalidated during validation.

from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='never'):  # (1)
  hobbies: List[str]

class SubUser(User):
  sins: List[str]

class Transaction(BaseModel):
  user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]  # (2)
t = Transaction(user=my_user)  # (3)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)
#> user=SubUser(hobbies=['scuba diving'], sins=['lying'])

  1. revalidate_instances is set to 'never' by default.
  2. The assignment is not validated, unless you set validate_assignment to True in the model's config.
  3. Since revalidate_instances is set to 'never', this is not revalidated.

If you want to revalidate instances during validation, you can set revalidate_instances to 'always' in the model’s config.

from typing import List

from pydantic import BaseModel, ValidationError

class User(BaseModel, revalidate_instances='always'):  # (1)
  hobbies: List[str]

class SubUser(User):
  sins: List[str]

class Transaction(BaseModel):
  user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
try:
  t = Transaction(user=my_user)  # (2)
except ValidationError as e:
  print(e)
  '''
  1 validation error for Transaction
  user.hobbies.0
    Input should be a valid string [type=string_type, input_value=1, input_type=int]
  '''

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)
#> user=User(hobbies=['scuba diving'])

  1. revalidate_instances is set to 'always'.
  2. The model is revalidated, since revalidate_instances is set to 'always'.
  3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).

It’s also possible to set revalidate_instances to 'subclass-instances' to only revalidate instances of subclasses of the model.

from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='subclass-instances'):  # (1)
  hobbies: List[str]

class SubUser(User):
  sins: List[str]

class Transaction(BaseModel):
  user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
t = Transaction(user=my_user)  # (2)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)
#> user=User(hobbies=['scuba diving'])

  1. revalidate_instances is set to 'subclass-instances'.
  2. This is not revalidated, since my_user is an instance of User itself, not of a subclass of User.
  3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).

Type: Literal['always', 'never', 'subclass-instances']

ser_json_timedelta

The format of JSON serialized timedeltas. Accepts the string values of 'iso8601' and 'float'. Defaults to 'iso8601'.

  • 'iso8601' will serialize timedeltas to ISO 8601 durations.
  • 'float' will serialize timedeltas to the total number of seconds.

Type: Literal['iso8601', 'float']
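A short sketch of the 'float' mode (field name is illustrative): the timedelta is serialized as its total number of seconds.

```python
from datetime import timedelta

from pydantic import BaseModel, ConfigDict


class Model(BaseModel):
    model_config = ConfigDict(ser_json_timedelta='float')

    wait: timedelta


m = Model(wait=timedelta(minutes=1, seconds=30))
# 1 minute 30 seconds serializes as 90.0 total seconds
print(m.model_dump_json())
#> {"wait":90.0}
```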

ser_json_bytes

The encoding of JSON serialized bytes. Accepts the string values of 'utf8' and 'base64'. Defaults to 'utf8'.

  • 'utf8' will serialize bytes to UTF-8 strings.
  • 'base64' will serialize bytes to URL safe base64 strings.

Type: Literal['utf8', 'base64']
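A sketch of the 'base64' mode (field name is illustrative): bytes are emitted as a base64 string in the JSON output rather than a UTF-8 string.

```python
from pydantic import BaseModel, ConfigDict


class Model(BaseModel):
    model_config = ConfigDict(ser_json_bytes='base64')

    data: bytes


m = Model(data=b'hello world')
# The bytes value is emitted as a base64-encoded JSON string.
print(m.model_dump_json())
```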

ser_json_inf_nan

The encoding of JSON serialized infinity and NaN float values. Defaults to 'null'.

  • 'null' will serialize infinity and NaN values as null.
  • 'constants' will serialize infinity and NaN values as Infinity and NaN.
  • 'strings' will serialize infinity as string "Infinity" and NaN as string "NaN".

Type: Literal['null', 'constants', 'strings']

validate_default

Whether to validate default values during validation. Defaults to False.

Type: bool
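A minimal sketch (the mismatched default is deliberate): without validate_default, a bad default value goes unnoticed until something downstream breaks; with it, the default is validated the first time it is used.

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Model(BaseModel):
    model_config = ConfigDict(validate_default=True)

    # The default is an int but the annotation is str; with
    # validate_default=True this is caught when the default is used.
    name: str = 123


try:
    Model()
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> string_type
```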

validate_return

Whether to validate the return value from call validators. Defaults to False.

Type: bool

protected_namespaces

A tuple of strings that prevent models from having fields whose names conflict with them. Defaults to ('model_',).

Pydantic prevents collisions between model attributes and BaseModel’s own methods by namespacing them with the prefix model_.

import warnings

from pydantic import BaseModel

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_prefixed_field: str

except UserWarning as e:
    print(e)
    '''
    Field "model_prefixed_field" has conflict with protected namespace "model_".

    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
    '''

You can customize this behavior using the protected_namespaces setting:

import warnings

from pydantic import BaseModel, ConfigDict

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_prefixed_field: str
        also_protect_field: str

        model_config = ConfigDict(
            protected_namespaces=('protect_me_', 'also_protect_')
        )

except UserWarning as e:
    print(e)
    '''
    Field "also_protect_field" has conflict with protected namespace "also_protect_".

    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_',)`.
    '''

While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error is raised if there is an actual collision with an existing attribute:

from pydantic import BaseModel

try:

    class Model(BaseModel):
        model_validate: str

except NameError as e:
    print(e)
    '''
    Field "model_validate" conflicts with member <bound method BaseModel.model_validate of <class 'pydantic.main.BaseModel'>> of protected namespace "model_".
    '''

Type: tuple[str, ...]

hide_input_in_errors

Whether to hide inputs when printing errors. Defaults to False.

Pydantic shows the input value and type when it raises ValidationError during the validation.

from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    a: str

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''

You can hide the input value and type by setting the hide_input_in_errors config to True.

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    a: str
    model_config = ConfigDict(hide_input_in_errors=True)

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type]
    '''

Type: bool

defer_build

Whether to defer model validator and serializer construction until the first model validation. Defaults to False.

This can be useful to avoid the overhead of building models which are only used nested within other models, or when you want to manually define type namespace via Model.model_rebuild(_types_namespace=...).

See also experimental_defer_build_mode.

Type: bool

experimental_defer_build_mode

Controls when defer_build is applicable. Defaults to ('model',).

For backwards compatibility reasons, TypeAdapter does not respect defer_build by default. This means that when defer_build is True and experimental_defer_build_mode is the default ('model',), TypeAdapter immediately constructs its validator and serializer instead of postponing that construction until the first validation. Set this to ('model', 'type_adapter') to make TypeAdapter respect defer_build, postponing validator and serializer construction until the first validation or serialization.

Type: tuple[Literal['model', 'type_adapter'], ...]

plugin_settings

A dict of settings for plugins. Defaults to None.

See Pydantic Plugins for details.

Type: dict[str, object] | None

schema_generator

A custom core schema generator class to use when generating JSON schemas. Useful if you want to change the way types are validated across an entire model/schema. Defaults to None.

The GenerateSchema interface is subject to change, currently only the string_schema method is public.

See #6737 for details.

Type: type[_GenerateSchema] | None

json_schema_serialization_defaults_required

Whether fields with default values should be marked as required in the serialization schema. Defaults to False.

This ensures that the serialization schema will reflect the fact a field with a default will always be present when serializing the model, even though it is not required for validation.

However, there are scenarios where this may be undesirable — in particular, if you want to share the schema between validation and serialization, and don’t mind fields with defaults being marked as not required during serialization. See #7209 for more details.

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    a: str = 'a'

    model_config = ConfigDict(json_schema_serialization_defaults_required=True)

print(Model.model_json_schema(mode='validation'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'title': 'Model',
    'type': 'object',
}
'''
print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

Type: bool

json_schema_mode_override

If not None, the specified mode will be used to generate the JSON schema regardless of what mode was passed to the function call. Defaults to None.

This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema.

It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation and serialization that must both be referenced from the same schema; when this happens, we automatically append -Input to the definition reference for the validation schema and -Output to the definition reference for the serialization schema. By specifying a json_schema_mode_override though, this prevents the conflict between the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes from being added to the definition references.

from pydantic import BaseModel, ConfigDict, Json

class Model(BaseModel):
    a: Json[int]  # requires a string to validate, but will dump an int

print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'title': 'A', 'type': 'integer'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

class ForceInputModel(Model):
    # the following ensures that even with mode='serialization', we
    # will get the schema that would be generated for validation.
    model_config = ConfigDict(json_schema_mode_override='validation')

print(ForceInputModel.model_json_schema(mode='serialization'))
'''
{
    'properties': {
        'a': {
            'contentMediaType': 'application/json',
            'contentSchema': {'type': 'integer'},
            'title': 'A',
            'type': 'string',
        }
    },
    'required': ['a'],
    'title': 'ForceInputModel',
    'type': 'object',
}
'''

Type: Literal['validation', 'serialization', None]

coerce_numbers_to_str

If True, enables automatic coercion of any Number type to str in “lax” (non-strict) mode. Defaults to False.

Pydantic doesn’t allow number types (int, float, Decimal) to be coerced as type str by default.

from decimal import Decimal

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    value: str

try:
    print(Model(value=42))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid string [type=string_type, input_value=42, input_type=int]
    '''

class Model(BaseModel):
    model_config = ConfigDict(coerce_numbers_to_str=True)

    value: str

print(repr(Model(value=42).value))
#> '42'
print(repr(Model(value=42.13).value))
#> '42.13'
print(repr(Model(value=Decimal('42.13')).value))
#> '42.13'

Type: bool

regex_engine

The regex engine to be used for pattern validation. Defaults to 'rust-regex'.

  • rust-regex uses the regex Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features.
  • python-re uses the re module, which supports all regex features, but may be slower.

from pydantic import BaseModel, ConfigDict, Field, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(regex_engine='python-re')

    value: str = Field(pattern=r'^abc(?=def)')

print(Model(value='abcdef').value)
#> abcdef

try:
    print(Model(value='abxyzcdef'))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      String should match pattern '^abc(?=def)' [type=string_pattern_mismatch, input_value='abxyzcdef', input_type=str]
    '''

Type: Literal['rust-regex', 'python-re']

validation_error_cause

If True, Python exceptions that were part of a validation failure will be shown as an exception group as a cause. Can be useful for debugging. Defaults to False.

Type: bool

use_attribute_docstrings

Whether docstrings of attributes (bare string literals immediately following the attribute declaration) should be used for field descriptions. Defaults to False.

Available in Pydantic v2.7+.

from pydantic import BaseModel, ConfigDict, Field


class Model(BaseModel):
    model_config = ConfigDict(use_attribute_docstrings=True)

    x: str
    """
    Example of an attribute docstring
    """

    y: int = Field(description="Description in Field")
    """
    Description in Field overrides attribute docstring
    """


print(Model.model_fields["x"].description)
#> Example of an attribute docstring
print(Model.model_fields["y"].description)
#> Description in Field

This requires the source code of the class to be available at runtime.

Type: bool

cache_strings

Whether to cache strings to avoid constructing new Python objects. Defaults to True.

Enabling this setting should significantly improve validation performance while increasing memory usage slightly.

  • True or 'all' (the default): cache all strings
  • 'keys': cache only dictionary keys
  • False or 'none': no caching

Type: bool | Literal['all', 'keys', 'none']


PydanticDataclass

Bases: StandardDataclass, Protocol

A protocol containing attributes only available once a class has been decorated as a Pydantic dataclass.


BaseModel

Usage docs: https://docs.pydantic.dev/2.8/concepts/models/

A base class for creating Pydantic models.

Attributes

model_config

Configuration for the model, should be a dictionary conforming to ConfigDict.

Type: ConfigDict Default: ConfigDict()

model_fields

Metadata about the fields defined on the model, mapping of field names to FieldInfo.

This replaces Model.__fields__ from Pydantic V1.

Type: dict[str, FieldInfo]

model_computed_fields

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

Type: dict[str, ComputedFieldInfo]

model_extra

Get extra fields set during validation.

Type: dict[str, Any] | None

model_fields_set

Returns the set of fields that have been explicitly set on this model instance.

Type: set[str]
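A short sketch tying model_extra and model_fields_set together (class and field names are illustrative): extra keys accepted under extra='allow' land in model_extra, and fields left at their defaults are absent from model_fields_set.

```python
from pydantic import BaseModel, ConfigDict


class Model(BaseModel):
    model_config = ConfigDict(extra='allow')

    a: int
    b: int = 0


m = Model(a=1, note='hi')
print(m.model_extra)
#> {'note': 'hi'}
# 'b' was not provided explicitly, so it is not in model_fields_set
print('b' in m.model_fields_set)
#> False
```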

Methods

init

def __init__(self, /, **data: Any) -> None

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Returns

None

model_construct

@classmethod

def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Returns

Self — A new instance of the Model class with validated data.

Parameters

_fields_set : set[str] | None Default: None

The set of field names accepted for the Model instance.

values : Any Default: {}

Trusted or pre-validated data dictionary.
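A sketch of what "no validation is performed" means in practice (class and values are illustrative):

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int = 0


# model_construct skips validation entirely: nothing checks that the
# values match the annotations, so only pass trusted data.
u = User.model_construct(name='Ada', age=36)
print(u)
#> name='Ada' age=36

# An invalid value passes straight through; defaults are still applied.
bad = User.model_construct(name=123)
print(bad.name, bad.age)
#> 123 0
```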

model_copy

def model_copy(update: dict[str, Any] | None = None, deep: bool = False) -> Self

Usage docs: https://docs.pydantic.dev/2.8/concepts/serialization/#model_copy

Returns a copy of the model.

Returns

Self — New model instance.

Parameters

update : dict[str, Any] | None Default: None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

deep : bool Default: False

Set to True to make a deep copy of the model.
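A short sketch of both parameters (class and values are illustrative): update replaces field values without re-validating them, and deep=True copies nested mutable values so the copies do not share state.

```python
from typing import List

from pydantic import BaseModel


class User(BaseModel):
    name: str
    tags: List[str]


u1 = User(name='Ada', tags=['math'])

# update replaces fields without re-validating them
u2 = u1.model_copy(update={'name': 'Grace'})
print(u2.name)
#> Grace

# deep=True also copies nested mutable values
u3 = u1.model_copy(deep=True)
u3.tags.append('code')
print(u1.tags)  # the original is unaffected
#> ['math']
```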

model_dump

def model_dump(
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx = None,
    exclude: IncEx = None,
    context: Any | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    serialize_as_any: bool = False,
) -> dict[str, Any]

Usage docs: https://docs.pydantic.dev/2.8/concepts/serialization/#modelmodel_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Returns

dict[str, Any] — A dictionary representation of the model.

Parameters

mode : Literal['json', 'python'] | str Default: 'python'

The mode in which to_python should run. If mode is ‘json’, the output will only contain JSON serializable types. If mode is ‘python’, the output may contain non-JSON-serializable Python objects.

include : IncEx Default: None

A set of fields to include in the output.

exclude : IncEx Default: None

A set of fields to exclude from the output.

context : Any | None Default: None

Additional context to pass to the serializer.

by_alias : bool Default: False

Whether to use the field’s alias in the dictionary key if defined.

exclude_unset : bool Default: False

Whether to exclude fields that have not been explicitly set.

exclude_defaults : bool Default: False

Whether to exclude fields that are set to their default value.

exclude_none : bool Default: False

Whether to exclude fields that have a value of None.

round_trip : bool Default: False

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

warnings : bool | Literal['none', 'warn', 'error'] Default: True

How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors, “error” raises a PydanticSerializationError.

serialize_as_any : bool Default: False

Whether to serialize fields with duck-typing serialization behavior.
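The most commonly combined parameters can be sketched on one small model (class and field names are illustrative):

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(alias='FullName')
    age: int = 0


u = User(FullName='Ada')

print(u.model_dump())
#> {'name': 'Ada', 'age': 0}
print(u.model_dump(by_alias=True))  # use aliases as keys where defined
#> {'FullName': 'Ada', 'age': 0}
print(u.model_dump(exclude_unset=True))  # drop fields left at defaults
#> {'name': 'Ada'}
print(u.model_dump(exclude={'age'}))  # drop explicitly listed fields
#> {'name': 'Ada'}
```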

model_dump_json

def model_dump_json(
    indent: int | None = None,
    include: IncEx = None,
    exclude: IncEx = None,
    context: Any | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    serialize_as_any: bool = False,
) -> str

Usage docs: https://docs.pydantic.dev/2.8/concepts/serialization/#modelmodel_dump_json

Generates a JSON representation of the model using Pydantic’s to_json method.

Returns

str — A JSON string representation of the model.

Parameters

indent : int | None Default: None

Indentation to use in the JSON output. If None is passed, the output will be compact.

include : IncEx Default: None

Field(s) to include in the JSON output.

exclude : IncEx Default: None

Field(s) to exclude from the JSON output.

context : Any | None Default: None

Additional context to pass to the serializer.

by_alias : bool Default: False

Whether to serialize using field aliases.

exclude_unset : bool Default: False

Whether to exclude fields that have not been explicitly set.

exclude_defaults : bool Default: False

Whether to exclude fields that are set to their default value.

exclude_none : bool Default: False

Whether to exclude fields that have a value of None.

round_trip : bool Default: False

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

warnings : bool | Literal['none', 'warn', 'error'] Default: True

How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors, “error” raises a PydanticSerializationError.

serialize_as_any : bool Default: False

Whether to serialize fields with duck-typing serialization behavior.
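
For example, with a hypothetical `Point` model (output is compact unless indent is given):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
# compact JSON by default
assert p.model_dump_json() == '{"x":1,"y":2}'
# indent=2 pretty-prints across multiple lines
assert '\n' in p.model_dump_json(indent=2)
```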

model_json_schema

@classmethod

def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]

Generates a JSON schema for a model class.

Returns

dict[str, Any] — The JSON schema for the given model class.

Parameters

by_alias : bool Default: True

Whether to use attribute aliases or not.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The reference template.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications

mode : JsonSchemaMode Default: 'validation'

The mode in which to generate the schema.
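
A short sketch of the by_alias behavior (the `Item` model is illustrative):

```python
from pydantic import BaseModel, Field

class Item(BaseModel):
    name: str = Field(alias='itemName')

# by_alias defaults to True, so the alias is the property key
schema = Item.model_json_schema()
assert 'itemName' in schema['properties']

# switch to attribute names instead
schema = Item.model_json_schema(by_alias=False)
assert 'name' in schema['properties']
```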

model_parametrized_name

@classmethod

def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Returns

str — String representing the new class where params are passed to cls as type variables.

Parameters

params : tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

Raises
  • TypeError — Raised when trying to generate concrete names for non-generic models.
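
A sketch of a custom naming scheme (the `Box` model is illustrative):

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar('T')

class Box(BaseModel, Generic[T]):
    content: T

    @classmethod
    def model_parametrized_name(cls, params):
        # params is the tuple of concrete type arguments, e.g. (int,)
        return f'{params[0].__name__.title()}Box'

assert Box[int].__name__ == 'IntBox'
```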

model_post_init

def model_post_init(__context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Returns

None
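
A sketch of post-init computation that needs the fully validated model (the `Rect` model is illustrative):

```python
from typing import Any
from pydantic import BaseModel

class Rect(BaseModel):
    width: int
    height: int
    area: int = 0

    def model_post_init(self, __context: Any) -> None:
        # runs after validation, once the whole model is initialized
        self.area = self.width * self.height

assert Rect(width=3, height=4).area == 12
```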

model_rebuild

@classmethod

def model_rebuild(
    cls,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: dict[str, Any] | None = None,
) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Returns

bool | None — Returns None if the schema is already “complete” and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Parameters

force : bool Default: False

Whether to force the rebuilding of the model schema, defaults to False.

raise_errors : bool Default: True

Whether to raise errors, defaults to True.

_parent_namespace_depth : int Default: 2

The depth level of the parent namespace, defaults to 2.

_types_namespace : dict[str, Any] | None Default: None

The types namespace, defaults to None.
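
A sketch of the typical ForwardRef scenario at module level (`Foo`/`Bar` are illustrative):

```python
from pydantic import BaseModel

class Foo(BaseModel):
    x: 'Bar'  # 'Bar' does not exist yet, so the schema cannot be built

class Bar(BaseModel):
    y: int = 0

# the schema was incomplete, so rebuilding is performed and returns True
assert Foo.model_rebuild() is True
# already complete now: rebuilding is skipped and None is returned
assert Foo.model_rebuild() is None
assert Foo(x={'y': 1}).x.y == 1
```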

model_validate

@classmethod

def model_validate(
    cls,
    obj: Any,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
) -> Self

Validate a pydantic model instance.

Returns

Self — The validated model instance.

Parameters

obj : Any

The object to validate.

strict : bool | None Default: None

Whether to enforce types strictly.

from_attributes : bool | None Default: None

Whether to extract data from object attributes.

context : Any | None Default: None

Additional context to pass to the validator.

Raises
  • ValidationError — If the object could not be validated.

model_validate_json

@classmethod

def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    strict: bool | None = None,
    context: Any | None = None,
) -> Self

Usage docs: https://docs.pydantic.dev/2.8/concepts/json/#json-parsing

Validate the given JSON data against the Pydantic model.

Returns

Self — The validated Pydantic model.

Parameters

json_data : str | bytes | bytearray

The JSON data to validate.

strict : bool | None Default: None

Whether to enforce types strictly.

context : Any | None Default: None

Extra variables to pass to the validator.

Raises
  • ValueError — If json_data is not a JSON string.
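
For example (the `User` model is illustrative):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str

# parses the JSON and validates it in a single step
u = User.model_validate_json('{"id": 1, "name": "Ann"}')
assert u == User(id=1, name='Ann')
```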

model_validate_strings

@classmethod

def model_validate_strings(
    cls,
    obj: Any,
    strict: bool | None = None,
    context: Any | None = None,
) -> Self

Validate the given object with string data against the Pydantic model.

Returns

Self — The validated Pydantic model.

Parameters

obj : Any

The object containing string data to validate.

strict : bool | None Default: None

Whether to enforce types strictly.

context : Any | None Default: None

Extra variables to pass to the validator.
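
A sketch with all-string leaf values, e.g. data sourced from environment variables (the `Event` model is illustrative):

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    when: date
    count: int

e = Event.model_validate_strings({'when': '2024-01-01', 'count': '3'})
assert e.when == date(2024, 1, 1) and e.count == 3
```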

__get_pydantic_core_schema__

@classmethod

def __get_pydantic_core_schema__(
    cls,
    source: type[BaseModel],
    handler: GetCoreSchemaHandler,
) -> CoreSchema

Hook into generating the model’s CoreSchema.

Returns

CoreSchema — A pydantic-core CoreSchema.

Parameters

source : type[BaseModel]

The class we are generating a schema for. This will generally be the same as the cls argument if this is a classmethod.

handler : GetCoreSchemaHandler

A callable that calls into Pydantic’s internal CoreSchema generation logic.

__get_pydantic_json_schema__

@classmethod

def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
) -> JsonSchemaValue

Hook into generating the model’s JSON schema.

Returns

JsonSchemaValue — A JSON schema, as a Python object.

Parameters

core_schema : CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

handler : GetJsonSchemaHandler

Call into Pydantic’s internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.
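
A sketch of the usual pattern: call the handler, resolve any $ref, then tweak the result (the `Person` model and the added `examples` key are illustrative):

```python
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic.json_schema import JsonSchemaValue
from pydantic_core import CoreSchema

class Person(BaseModel):
    name: str

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        json_schema = handler(core_schema)  # call the next handler in the chain
        json_schema = handler.resolve_ref_schema(json_schema)  # follow any $ref
        json_schema['examples'] = [{'name': 'Ada'}]
        return json_schema

assert Person.model_json_schema()['examples'] == [{'name': 'Ada'}]
```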

__pydantic_init_subclass__

@classmethod

def __pydantic_init_subclass__(cls, **kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after the class is actually fully initialized. In particular, attributes like model_fields will be present when this is called.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren’t used internally by pydantic.

Returns

None

Parameters

**kwargs : Any Default: {}

Any keyword arguments passed to the class definition that aren’t used internally by pydantic.

__copy__

def __copy__() -> Self

Returns a shallow copy of the model.

Returns

Self

__deepcopy__

def __deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Returns

Self

__init_subclass__

def __init_subclass__(cls, **kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'):
    ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters

**kwargs : Unpack[ConfigDict] Default: {}

Keyword arguments passed to the class definition, which set model_config

__iter__

def __iter__() -> TupleGenerator

So dict(model) works.

Returns

TupleGenerator

dict

def dict(
    include: IncEx = None,
    exclude: IncEx = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]
Returns

Dict[str, Any]

json

def json(
    include: IncEx = None,
    exclude: IncEx = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,
    models_as_dict: bool = PydanticUndefined,
    dumps_kwargs: Any = {},
) -> str
Returns

str

parse_obj

@classmethod

def parse_obj(cls, obj: Any) -> Self
Returns

Self

parse_raw

@classmethod

def parse_raw(
    cls,
    b: str | bytes,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self
Returns

Self

parse_file

@classmethod

def parse_file(
    cls,
    path: str | Path,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self
Returns

Self

from_orm

@classmethod

def from_orm(cls, obj: Any) -> Self
Returns

Self

construct

@classmethod

def construct(cls, _fields_set: set[str] | None = None, values: Any = {}) -> Self
Returns

Self

copy

def copy(
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,
    deep: bool = False,
) -> Self

Returns a copy of the model.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)
Returns

Self — A copy of the model with included, excluded and updated fields as specified.

Parameters

include : AbstractSetIntStr | MappingIntStrAny | None Default: None

Optional set or mapping specifying which fields to include in the copied model.

exclude : AbstractSetIntStr | MappingIntStrAny | None Default: None

Optional set or mapping specifying which fields to exclude in the copied model.

update : Dict[str, Any] | None Default: None

Optional dictionary of field-value pairs to override field values in the copied model.

deep : bool Default: False

If True, the values of fields that are Pydantic models will be deep-copied.

schema

@classmethod

def schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
) -> Dict[str, Any]
Returns

Dict[str, Any]

schema_json

@classmethod

def schema_json(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    dumps_kwargs: Any = {},
) -> str
Returns

str

validate

@classmethod

def validate(cls, value: Any) -> Self
Returns

Self

update_forward_refs

@classmethod

def update_forward_refs(cls, localns: Any = {}) -> None
Returns

None


PydanticJsonSchemaWarning

Bases: UserWarning

This class is used to emit warnings produced during JSON schema generation. See the GenerateJsonSchema.emit_warning and GenerateJsonSchema.render_warning_message methods for more details; these can be overridden to control warning behavior.


GenerateJsonSchema

Usage docs: https://docs.pydantic.dev/2.8/concepts/json_schema/#customizing-the-json-schema-generation-process

A class for generating JSON schemas.

This class generates JSON schemas based on configured parameters. The default schema dialect is https://json-schema.org/draft/2020-12/schema. The class uses by_alias to configure how fields with multiple names are handled and ref_template to format reference names.

Constructor Parameters

by_alias : bool Default: True

Whether to use field aliases in the generated schemas.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The format string to use when generating reference names.

Attributes

schema_dialect

Default: 'https://json-schema.org/draft/2020-12/schema'

ignored_warning_kinds

Type: set[JsonSchemaWarningKind] Default: {'skipped-choice'}

by_alias

Default: by_alias

ref_template

Default: ref_template

core_to_json_refs

Type: dict[CoreModeRef, JsonRef] Default: {}

core_to_defs_refs

Type: dict[CoreModeRef, DefsRef] Default: {}

defs_to_core_refs

Type: dict[DefsRef, CoreModeRef] Default: {}

json_to_defs_refs

Type: dict[JsonRef, DefsRef] Default: {}

definitions

Type: dict[DefsRef, JsonSchemaValue] Default: {}

mode

Type: JsonSchemaMode

Methods

build_schema_type_to_method

def build_schema_type_to_method() -> dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]

Builds a dictionary mapping fields to methods for generating JSON schemas.

Returns

dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]] — A dictionary containing the mapping of CoreSchemaOrFieldType to a handler method.

Raises
  • TypeError — If no method has been defined for generating a JSON schema for a given pydantic core schema type.

generate_definitions

def generate_definitions(
    inputs: Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]],
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]

Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references.

Returns

tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]] — A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions.
Parameters

inputs : Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]]

A sequence of tuples, where:

  • The first element is a JSON schema key type.
  • The second element is the JSON mode: either ‘validation’ or ‘serialization’.
  • The third element is a core schema.
Raises
  • PydanticUserError — Raised if the JSON schema generator has already been used to generate a JSON schema.
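
A sketch of generating shared definitions for several models at once (`A`/`B` and the string keys are illustrative):

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema

class A(BaseModel):
    x: int

class B(BaseModel):
    a: A

gen = GenerateJsonSchema()
keyed, definitions = gen.generate_definitions([
    ('A', 'validation', A.__pydantic_core_schema__),
    ('B', 'validation', B.__pydantic_core_schema__),
])
# the first mapping links each (key, mode) pair to a (possibly $ref) schema
assert ('A', 'validation') in keyed and ('B', 'validation') in keyed
# the second mapping holds the definitions those $refs point into
assert len(definitions) == 2
```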

generate

def generate(schema: CoreSchema, mode: JsonSchemaMode = 'validation') -> JsonSchemaValue

Generates a JSON schema for a specified schema in a specified mode.

Returns

JsonSchemaValue — A JSON schema representing the specified schema.

Parameters

schema : CoreSchema

The pydantic-core schema to generate a JSON schema for.

mode : JsonSchemaMode Default: 'validation'

The mode in which to generate the schema. Defaults to ‘validation’.

Raises
  • PydanticUserError — If the JSON schema generator has already been used to generate a JSON schema.
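
A sketch of customizing generation by subclassing and overriding generate (the `StampedGenerateJsonSchema` name and the `$schema` stamping are illustrative):

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema

class StampedGenerateJsonSchema(GenerateJsonSchema):
    """Hypothetical subclass that stamps the dialect onto every schema."""

    def generate(self, schema, mode='validation'):
        json_schema = super().generate(schema, mode=mode)
        json_schema['$schema'] = self.schema_dialect
        return json_schema

class Model(BaseModel):
    a: int

schema = Model.model_json_schema(schema_generator=StampedGenerateJsonSchema)
assert schema['$schema'] == 'https://json-schema.org/draft/2020-12/schema'
```

Note that each generator instance may only be used once, hence passing the class (not an instance) to model_json_schema.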

generate_inner

def generate_inner(schema: CoreSchemaOrField) -> JsonSchemaValue

Generates a JSON schema for a given core schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : CoreSchemaOrField

The given core schema.

any_schema

def any_schema(schema: core_schema.AnySchema) -> JsonSchemaValue

Generates a JSON schema that matches any value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.AnySchema

The core schema.

none_schema

def none_schema(schema: core_schema.NoneSchema) -> JsonSchemaValue

Generates a JSON schema that matches None.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.NoneSchema

The core schema.

bool_schema

def bool_schema(schema: core_schema.BoolSchema) -> JsonSchemaValue

Generates a JSON schema that matches a bool value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.BoolSchema

The core schema.

int_schema

def int_schema(schema: core_schema.IntSchema) -> JsonSchemaValue

Generates a JSON schema that matches an int value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.IntSchema

The core schema.

float_schema

def float_schema(schema: core_schema.FloatSchema) -> JsonSchemaValue

Generates a JSON schema that matches a float value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.FloatSchema

The core schema.

decimal_schema

def decimal_schema(schema: core_schema.DecimalSchema) -> JsonSchemaValue

Generates a JSON schema that matches a decimal value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DecimalSchema

The core schema.

str_schema

def str_schema(schema: core_schema.StringSchema) -> JsonSchemaValue

Generates a JSON schema that matches a string value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.StringSchema

The core schema.

bytes_schema

def bytes_schema(schema: core_schema.BytesSchema) -> JsonSchemaValue

Generates a JSON schema that matches a bytes value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.BytesSchema

The core schema.

date_schema

def date_schema(schema: core_schema.DateSchema) -> JsonSchemaValue

Generates a JSON schema that matches a date value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DateSchema

The core schema.

time_schema

def time_schema(schema: core_schema.TimeSchema) -> JsonSchemaValue

Generates a JSON schema that matches a time value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TimeSchema

The core schema.

datetime_schema

def datetime_schema(schema: core_schema.DatetimeSchema) -> JsonSchemaValue

Generates a JSON schema that matches a datetime value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DatetimeSchema

The core schema.

timedelta_schema

def timedelta_schema(schema: core_schema.TimedeltaSchema) -> JsonSchemaValue

Generates a JSON schema that matches a timedelta value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TimedeltaSchema

The core schema.

literal_schema

def literal_schema(schema: core_schema.LiteralSchema) -> JsonSchemaValue

Generates a JSON schema that matches a literal value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.LiteralSchema

The core schema.

enum_schema

def enum_schema(schema: core_schema.EnumSchema) -> JsonSchemaValue

Generates a JSON schema that matches an Enum value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.EnumSchema

The core schema.

is_instance_schema

def is_instance_schema(schema: core_schema.IsInstanceSchema) -> JsonSchemaValue

Handles JSON schema generation for a core schema that checks if a value is an instance of a class.

Unless overridden in a subclass, this raises an error.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.IsInstanceSchema

The core schema.

is_subclass_schema

def is_subclass_schema(schema: core_schema.IsSubclassSchema) -> JsonSchemaValue

Handles JSON schema generation for a core schema that checks if a value is a subclass of a class.

For backwards compatibility with v1, this does not raise an error, but can be overridden to change this.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.IsSubclassSchema

The core schema.

callable_schema

def callable_schema(schema: core_schema.CallableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a callable value.

Unless overridden in a subclass, this raises an error.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.CallableSchema

The core schema.

list_schema

def list_schema(schema: core_schema.ListSchema) -> JsonSchemaValue

Returns a schema that matches a list schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ListSchema

The core schema.

tuple_positional_schema

def tuple_positional_schema(schema: core_schema.TupleSchema) -> JsonSchemaValue

Replaced by tuple_schema.

Returns

JsonSchemaValue

tuple_variable_schema

def tuple_variable_schema(schema: core_schema.TupleSchema) -> JsonSchemaValue

Replaced by tuple_schema.

Returns

JsonSchemaValue

tuple_schema

def tuple_schema(schema: core_schema.TupleSchema) -> JsonSchemaValue

Generates a JSON schema that matches a tuple schema e.g. Tuple[int, str, bool] or Tuple[int, ...].

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TupleSchema

The core schema.

set_schema

def set_schema(schema: core_schema.SetSchema) -> JsonSchemaValue

Generates a JSON schema that matches a set schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.SetSchema

The core schema.

frozenset_schema

def frozenset_schema(schema: core_schema.FrozenSetSchema) -> JsonSchemaValue

Generates a JSON schema that matches a frozenset schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.FrozenSetSchema

The core schema.

generator_schema

def generator_schema(schema: core_schema.GeneratorSchema) -> JsonSchemaValue

Returns a JSON schema that represents the provided GeneratorSchema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.GeneratorSchema

The schema.

dict_schema

def dict_schema(schema: core_schema.DictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a dict schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DictSchema

The core schema.

function_before_schema

def function_before_schema(
    schema: core_schema.BeforeValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-before schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.BeforeValidatorFunctionSchema

The core schema.

function_after_schema

def function_after_schema(
    schema: core_schema.AfterValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-after schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.AfterValidatorFunctionSchema

The core schema.

function_plain_schema

def function_plain_schema(
    schema: core_schema.PlainValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-plain schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.PlainValidatorFunctionSchema

The core schema.

function_wrap_schema

def function_wrap_schema(
    schema: core_schema.WrapValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-wrap schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.WrapValidatorFunctionSchema

The core schema.

default_schema

def default_schema(schema: core_schema.WithDefaultSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema with a default value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.WithDefaultSchema

The core schema.

nullable_schema

def nullable_schema(schema: core_schema.NullableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows null values.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.NullableSchema

The core schema.

union_schema

def union_schema(schema: core_schema.UnionSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching any of the given schemas.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.UnionSchema

The core schema.

tagged_union_schema

def tagged_union_schema(schema: core_schema.TaggedUnionSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TaggedUnionSchema

The core schema.
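
A sketch of the schema this produces for a discriminated union (`Cat`/`Dog`/`Pet` are illustrative):

```python
from typing import Annotated, Literal, Union
from pydantic import BaseModel, Field, TypeAdapter

class Cat(BaseModel):
    kind: Literal['cat'] = 'cat'

class Dog(BaseModel):
    kind: Literal['dog'] = 'dog'

Pet = Annotated[Union[Cat, Dog], Field(discriminator='kind')]
schema = TypeAdapter(Pet).json_schema()
# tagged unions carry an OpenAPI-style discriminator alongside oneOf
assert schema['discriminator']['propertyName'] == 'kind'
assert len(schema['oneOf']) == 2
```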

chain_schema

def chain_schema(schema: core_schema.ChainSchema) -> JsonSchemaValue

Generates a JSON schema that matches a core_schema.ChainSchema.

When generating a schema for validation, we return the validation JSON schema for the first step in the chain. For serialization, we return the serialization JSON schema for the last step in the chain.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ChainSchema

The core schema.

lax_or_strict_schema

def lax_or_strict_schema(schema: core_schema.LaxOrStrictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching either the lax schema or the strict schema.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.LaxOrStrictSchema

The core schema.

json_or_python_schema

def json_or_python_schema(schema: core_schema.JsonOrPythonSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the Python schema.

The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override this method.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.JsonOrPythonSchema

The core schema.

typed_dict_schema

def typed_dict_schema(schema: core_schema.TypedDictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a typed dict.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TypedDictSchema

The core schema.

typed_dict_field_schema

def typed_dict_field_schema(schema: core_schema.TypedDictField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a typed dict field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.TypedDictField

The core schema.

dataclass_field_schema

def dataclass_field_schema(schema: core_schema.DataclassField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DataclassField

The core schema.

model_field_schema

def model_field_schema(schema: core_schema.ModelField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ModelField

The core schema.

computed_field_schema

def computed_field_schema(schema: core_schema.ComputedField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a computed field.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ComputedField

The core schema.

model_schema

def model_schema(schema: core_schema.ModelSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ModelSchema

The core schema.

resolve_schema_to_update

def resolve_schema_to_update(json_schema: JsonSchemaValue) -> JsonSchemaValue

Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema.

Returns

JsonSchemaValue — The resolved schema.

Parameters

json_schema : JsonSchemaValue

The schema to resolve.

model_fields_schema

def model_fields_schema(schema: core_schema.ModelFieldsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model’s fields.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ModelFieldsSchema

The core schema.

field_is_present

def field_is_present(field: CoreSchemaField) -> bool

Whether the field should be included in the generated JSON schema.

Returns

bool — True if the field should be included in the generated JSON schema, False otherwise.

Parameters

field : CoreSchemaField

The schema for the field itself.

field_is_required

def field_is_required(
    field: core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField,
    total: bool,
) -> bool

Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.).

Returns

bool — True if the field should be marked as required in the generated JSON schema, False otherwise.

Parameters

field : core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField

The schema for the field itself.

total : bool

Only applies to TypedDictFields. Indicates if the TypedDict this field belongs to is total, in which case any fields that don’t explicitly specify required=False are required.

dataclass_args_schema

def dataclass_args_schema(schema: core_schema.DataclassArgsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass’s constructor arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DataclassArgsSchema

The core schema.

dataclass_schema

def dataclass_schema(schema: core_schema.DataclassSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DataclassSchema

The core schema.

arguments_schema

def arguments_schema(schema: core_schema.ArgumentsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function’s arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.ArgumentsSchema

The core schema.

kw_arguments_schema

def kw_arguments_schema(
    arguments: list[core_schema.ArgumentsParameter],
    var_kwargs_schema: CoreSchema | None,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function’s keyword arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

arguments : list[core_schema.ArgumentsParameter]

The core schemas for the keyword-argument parameters.

var_kwargs_schema : CoreSchema | None

The core schema for variadic keyword arguments (**kwargs), or None if there are none.

p_arguments_schema

def p_arguments_schema(
    arguments: list[core_schema.ArgumentsParameter],
    var_args_schema: CoreSchema | None,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function’s positional arguments.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

arguments : list[core_schema.ArgumentsParameter]

The core schemas for the positional-argument parameters.

var_args_schema : CoreSchema | None

The core schema for variadic positional arguments (*args), or None if there are none.

get_argument_name

def get_argument_name(argument: core_schema.ArgumentsParameter) -> str

Retrieves the name of an argument.

Returns

str — The name of the argument.

Parameters

argument : core_schema.ArgumentsParameter

The core schema.

call_schema

def call_schema(schema: core_schema.CallSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function call.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.CallSchema

The core schema.

custom_error_schema

def custom_error_schema(schema: core_schema.CustomErrorSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a custom error.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.CustomErrorSchema

The core schema.

json_schema

def json_schema(schema: core_schema.JsonSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a JSON object.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.JsonSchema

The core schema.

url_schema

def url_schema(schema: core_schema.UrlSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a URL.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.UrlSchema

The core schema.

multi_host_url_schema

def multi_host_url_schema(schema: core_schema.MultiHostUrlSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.MultiHostUrlSchema

The core schema.

uuid_schema

def uuid_schema(schema: core_schema.UuidSchema) -> JsonSchemaValue

Generates a JSON schema that matches a UUID.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.UuidSchema

The core schema.

definitions_schema

def definitions_schema(schema: core_schema.DefinitionsSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a JSON object with definitions.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DefinitionsSchema

The core schema.

definition_ref_schema

def definition_ref_schema(
    schema: core_schema.DefinitionReferenceSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that references a definition.

Returns

JsonSchemaValue — The generated JSON schema.

Parameters

schema : core_schema.DefinitionReferenceSchema

The core schema.

ser_schema

def ser_schema(
    schema: core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema,
) -> JsonSchemaValue | None

Generates a JSON schema that matches a schema that defines a serialized object.

Returns

JsonSchemaValue | None — The generated JSON schema.

Parameters

schema : core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema

The core schema.

get_title_from_name

def get_title_from_name(name: str) -> str

Retrieves a title from a name.

Returns

str — The title.

Parameters

name : str

The name to retrieve a title from.
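As a quick sketch of this method's behavior, it can be called directly on a GenerateJsonSchema instance:

```python
from pydantic.json_schema import GenerateJsonSchema

gen = GenerateJsonSchema()
# snake_case names are turned into title-cased, space-separated titles
print(gen.get_title_from_name('user_name'))  # 'User Name'
```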

field_title_should_be_set

def field_title_should_be_set(schema: CoreSchemaOrField) -> bool

Returns true if a field with the given schema should have a title set based on the field name.

Intuitively, we want this to return true for schemas that wouldn’t otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses).

Returns

bool — True if the field should have a title set, False otherwise.

Parameters

schema : CoreSchemaOrField

The schema to check.

normalize_name

def normalize_name(name: str) -> str

Normalizes a name to be used as a key in a dictionary.

Returns

str — The normalized name.

Parameters

name : str

The name to normalize.

get_defs_ref

def get_defs_ref(core_mode_ref: CoreModeRef) -> DefsRef

Override this method to change the way that definitions keys are generated from a core reference.

Returns

DefsRef — The definitions key.

Parameters

core_mode_ref : CoreModeRef

The core reference.

get_cache_defs_ref_schema

def get_cache_defs_ref_schema(core_ref: CoreRef) -> tuple[DefsRef, JsonSchemaValue]

This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition.

Returns

tuple[DefsRef, JsonSchemaValue] — A tuple of the definitions reference and the JSON schema that will refer to it.

Parameters

core_ref : CoreRef

The core reference to get the definitions reference for.

handle_ref_overrides

def handle_ref_overrides(json_schema: JsonSchemaValue) -> JsonSchemaValue

It is not valid for a schema with a top-level $ref to have sibling keys.

During our own schema generation, we treat sibling keys as overrides to the referenced schema, but this is not how the official JSON schema spec works.

Because of this, we first remove any sibling keys that are redundant with the referenced schema, then if any remain, we transform the schema from a top-level ‘$ref’ to use allOf to move the $ref out of the top level. (See bottom of https://swagger.io/docs/specification/using-ref/ for a reference about this behavior)
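The behavior described above can be observed by putting sibling keys on a field that references another model (the class names here are invented for the example; exactly how the $ref is wrapped may vary between pydantic versions, but in v2.8 non-redundant siblings cause the $ref to be moved under allOf):

```python
from pydantic import BaseModel, Field


class Inner(BaseModel):
    x: int


class Outer(BaseModel):
    # 'description' is a sibling key on the $ref to Inner
    inner: Inner = Field(description='a nested model')


prop = Outer.model_json_schema()['properties']['inner']
print(prop)
```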

Returns

JsonSchemaValue — The schema with any $ref overrides handled.

Parameters

json_schema : JsonSchemaValue

The schema to update.

get_schema_from_definitions

def get_schema_from_definitions(json_ref: JsonRef) -> JsonSchemaValue | None

Retrieves the schema stored in the definitions for the given JSON reference, or None if it is not found.

Returns

JsonSchemaValue | None

encode_default

def encode_default(dft: Any) -> Any

Encode a default value to a JSON-serializable value.

This is used to encode default values for fields in the generated JSON schema.

Returns

Any — The encoded default value.

Parameters

dft : Any

The default value to encode.

update_with_validations

def update_with_validations(
    json_schema: JsonSchemaValue,
    core_schema: CoreSchema,
    mapping: dict[str, str],
) -> None

Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema.

Returns

None

Parameters

json_schema : JsonSchemaValue

The JSON schema to update.

core_schema : CoreSchema

The core schema to get the validations from.

mapping : dict[str, str]

A mapping from core_schema attribute names to the corresponding JSON schema attribute names.

get_flattened_anyof

def get_flattened_anyof(schemas: list[JsonSchemaValue]) -> JsonSchemaValue

Combines the given schemas into a single anyOf schema, flattening members that are themselves anyOf schemas.

Returns

JsonSchemaValue — The combined anyOf schema.

get_json_ref_counts

def get_json_ref_counts(json_schema: JsonSchemaValue) -> dict[JsonRef, int]

Count the occurrences of each value corresponding to the key '$ref' anywhere in the json_schema.

Returns

dict[JsonRef, int] — A mapping from each JSON reference to the number of times it occurs.

handle_invalid_for_json_schema

def handle_invalid_for_json_schema(
    schema: CoreSchemaOrField,
    error_info: str,
) -> JsonSchemaValue

Handles the case where a JSON schema cannot be generated for the given schema, raising a PydanticInvalidForJsonSchema error.

Returns

JsonSchemaValue

emit_warning

def emit_warning(kind: JsonSchemaWarningKind, detail: str) -> None

This method simply emits PydanticJsonSchemaWarnings based on handling in the render_warning_message method.

Returns

None

render_warning_message

def render_warning_message(kind: JsonSchemaWarningKind, detail: str) -> str | None

This method is responsible for ignoring warnings as desired, and for formatting the warning messages.

You can override the value of ignored_warning_kinds in a subclass of GenerateJsonSchema to modify what warnings are generated. If you want more control, you can override this method; just return None in situations where you don’t want warnings to be emitted.

Returns

str | None — The formatted warning message, or None if no warning should be emitted.

Parameters

kind : JsonSchemaWarningKind

The kind of warning to render. It can be one of the following:

  • ‘skipped-choice’: A choice field was skipped because it had no valid choices.
  • ‘non-serializable-default’: A default value was skipped because it was not JSON-serializable.

detail : str

A string with additional details about the warning.
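One way to use the override hook described above is a subclass that sets ignored_warning_kinds; the sketch below (class names are invented for the example) suppresses the warning for non-JSON-serializable defaults, which are then silently omitted from the schema:

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema


class SilentGenerator(GenerateJsonSchema):
    # Suppress warnings about defaults that cannot be serialized to JSON
    ignored_warning_kinds = {'non-serializable-default'}


class Model(BaseModel):
    marker: object = object()  # default value is not JSON-serializable


schema = Model.model_json_schema(schema_generator=SilentGenerator)
print(schema['properties']['marker'])
```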


WithJsonSchema

Usage docs: https://docs.pydantic.dev/2.8/concepts/json_schema/#withjsonschema-annotation

Add this as an annotation on a field to override the (base) JSON schema that would be generated for that field. This provides a way to set a JSON schema for types that would otherwise raise errors when producing a JSON schema, such as Callable, or types that have an is-instance core schema, without needing to go so far as creating a custom subclass of pydantic.json_schema.GenerateJsonSchema. Note that any modifications to the schema that would normally be made (such as setting the title for model fields) will still be performed.

If mode is set, this will only apply to that schema generation mode, allowing you to set different JSON schemas for validation and serialization.

Attributes

json_schema

Type: JsonSchemaValue | None

mode

Type: Literal['validation', 'serialization'] | None Default: None
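As a sketch of the usage described above (class and field names are invented for the example), WithJsonSchema supplies a schema for a Callable field that would otherwise raise when generating a JSON schema:

```python
from typing import Annotated, Callable

from pydantic import BaseModel, WithJsonSchema


class Task(BaseModel):
    # Callable has no natural JSON schema; WithJsonSchema provides one
    callback: Annotated[
        Callable[[int], int],
        WithJsonSchema({'type': 'string', 'description': 'dotted path to a callable'}),
    ]


schema = Task.model_json_schema()
print(schema['properties']['callback'])
```

Note that the usual per-field modifications (such as setting a title) are still applied on top of the provided schema.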


Examples

Add examples to a JSON schema.

Examples should be a map of example names (strings) to example values (any valid JSON).

If mode is set, this will only apply to that schema generation mode, allowing you to add different examples for validation and serialization.

Attributes

examples

Type: dict[str, Any]

mode

Type: Literal['validation', 'serialization'] | None Default: None
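A minimal sketch of the annotation described above (class and field names are invented for the example), attaching named examples to a single field:

```python
from typing import Annotated

from pydantic import BaseModel
from pydantic.json_schema import Examples


class User(BaseModel):
    name: Annotated[str, Examples({'short': 'Al', 'long': 'Alexandra'})]


schema = User.model_json_schema()
print(schema['properties']['name']['examples'])
```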


SkipJsonSchema

Usage docs: https://docs.pydantic.dev/2.8/concepts/json_schema/#skipjsonschema-annotation

Add this as an annotation on a field to skip generating a JSON schema for that field.
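For example (class and field names here are invented for the sketch), a field wrapped in SkipJsonSchema remains a normal model field but is omitted from the generated schema:

```python
from pydantic import BaseModel
from pydantic.json_schema import SkipJsonSchema


class Config(BaseModel):
    host: str
    # Still validated and serialized, but excluded from the JSON schema
    debug_token: SkipJsonSchema[str] = ''


schema = Config.model_json_schema()
print(sorted(schema['properties']))  # ['host']
```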


update_json_schema

def update_json_schema(
    schema: JsonSchemaValue,
    updates: dict[str, Any],
) -> JsonSchemaValue

Update a JSON schema in-place by providing a dictionary of updates.

This function sets the provided key-value pairs in the schema and returns the updated schema.

Returns

JsonSchemaValue — The updated JSON schema.

Parameters

schema : JsonSchemaValue

The JSON schema to update.

updates : dict[str, Any]

A dictionary of key-value pairs to set in the schema.
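A minimal sketch of the in-place behavior described above:

```python
from pydantic.json_schema import update_json_schema

schema = {'type': 'string'}
result = update_json_schema(schema, {'title': 'Name', 'minLength': 1})

# The same dict is mutated and returned
print(result)
```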


model_json_schema

def model_json_schema(
    cls: type[BaseModel] | type[PydanticDataclass],
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]

Utility function to generate a JSON Schema for a model.

Returns

dict[str, Any] — The generated JSON Schema.

Parameters

cls : type[BaseModel] | type[PydanticDataclass]

The model class to generate a JSON Schema for.

by_alias : bool Default: True

If True (the default), fields will be serialized according to their alias. If False, fields will be serialized according to their attribute name.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The template to use for generating JSON Schema references.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

The class to use for generating the JSON Schema.

mode : JsonSchemaMode Default: 'validation'

The mode to use for generating the JSON Schema. It can be one of the following:

  • ‘validation’: Generate a JSON Schema for validating data.
  • ‘serialization’: Generate a JSON Schema for serializing data.
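As a sketch of the by_alias parameter described above (class and field names are invented for the example):

```python
from pydantic import BaseModel, Field
from pydantic.json_schema import model_json_schema


class User(BaseModel):
    name: str = Field(alias='userName')


by_alias = model_json_schema(User)                 # default: alias keys
by_attr = model_json_schema(User, by_alias=False)  # attribute-name keys

print(sorted(by_alias['properties']))  # ['userName']
print(sorted(by_attr['properties']))   # ['name']
```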

models_json_schema

def models_json_schema(
    models: Sequence[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode]],
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]

Utility function to generate a JSON Schema for multiple models.

Returns

tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue] — A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.

Parameters

models : Sequence[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode]]

A sequence of tuples of the form (model, mode).

by_alias : bool Default: True

Whether field aliases should be used as keys in the generated JSON Schema.

title : str | None Default: None

The title of the generated JSON Schema.

description : str | None Default: None

The description of the generated JSON Schema.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The reference template to use for generating JSON Schema references.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

The schema generator to use for generating the JSON Schema.
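A sketch of the two-part return value described above (class names are invented for the example): each per-model schema refers into the shared definitions returned as the second element.

```python
from pydantic import BaseModel
from pydantic.json_schema import models_json_schema


class Pet(BaseModel):
    name: str


class Owner(BaseModel):
    pet: Pet


schemas, definitions = models_json_schema(
    [(Pet, 'validation'), (Owner, 'validation')],
    title='Pet Store',
)

# schemas maps (model, mode) pairs to schemas that $ref into definitions
print(schemas[(Pet, 'validation')])
print(definitions['title'])
```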


JsonDict

Type: TypeAlias Default: Dict[str, JsonValue]

JsonSchemaExtraCallable

Type: TypeAlias Default: Union[Callable[[JsonDict], None], Callable[[JsonDict, Type[Any]], None]]

JsonValue

Type: TypeAlias Default: Union[int, float, str, bool, None, List['JsonValue'], 'JsonDict']

CoreSchemaField

Default: Union[core_schema.ModelField, core_schema.DataclassField, core_schema.TypedDictField, core_schema.ComputedField]

CoreSchemaOrField

Default: Union[core_schema.CoreSchema, CoreSchemaField]

GetJsonSchemaFunction

Default: Callable[[CoreSchemaOrField, GetJsonSchemaHandler], JsonSchemaValue]

CoreSchemaOrFieldType

A type alias for defined schema types that represents a union of core_schema.CoreSchemaType and core_schema.CoreSchemaFieldType.

Default: Literal[core_schema.CoreSchemaType, core_schema.CoreSchemaFieldType]

JsonSchemaValue

A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary JSON values.

Default: Dict[str, Any]

JsonSchemaMode

A type alias that represents the mode of a JSON schema; either ‘validation’ or ‘serialization’.

For some types, the inputs to validation differ from the outputs of serialization. For example, computed fields will only be present when serializing, and should not be provided when validating. This flag provides a way to indicate whether you want the JSON schema required for validation inputs, or that will be matched by serialization outputs.

Default: Literal['validation', 'serialization']
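The computed-field case described above can be sketched as follows (class and field names are invented for the example): a computed field appears only in the serialization-mode schema.

```python
from pydantic import BaseModel, computed_field


class Rect(BaseModel):
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.height


validation = Rect.model_json_schema(mode='validation')
serialization = Rect.model_json_schema(mode='serialization')

print('area' in validation['properties'])     # False: not a validation input
print('area' in serialization['properties'])  # True: present in outputs
```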

JsonSchemaWarningKind

A type alias representing the kinds of warnings that can be emitted during JSON schema generation.

See GenerateJsonSchema.render_warning_message for more details.

Default: Literal['skipped-choice', 'non-serializable-default']

DEFAULT_REF_TEMPLATE

The default format string used to generate reference names.

Default: '#/$defs/{model}'
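A custom ref_template can redirect references, e.g. for OpenAPI-style components (class names here are invented for the example):

```python
from pydantic import BaseModel


class Inner(BaseModel):
    x: int


class Outer(BaseModel):
    inner: Inner


schema = Outer.model_json_schema(ref_template='#/components/schemas/{model}')
print(schema['properties']['inner']['$ref'])  # '#/components/schemas/Inner'
```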

CoreRef

Default: NewType('CoreRef', str)

DefsRef

Default: NewType('DefsRef', str)

JsonRef

Default: NewType('JsonRef', str)

CoreModeRef

Default: Tuple[CoreRef, JsonSchemaMode]

JsonSchemaKeyT

Default: TypeVar('JsonSchemaKeyT', bound=Hashable)

AnyType

Default: TypeVar('AnyType')