JSON Schema
Usage docs: https://docs.pydantic.dev/2.5/concepts/json_schema/
The json_schema module contains classes and functions for customizing how JSON Schema is generated.
In general you shouldn’t need to use this module directly; instead, you can use
BaseModel.model_json_schema and
TypeAdapter.json_schema.
Bases: PydanticDeprecationWarning
A specific PydanticDeprecationWarning subclass defining functionality deprecated since Pydantic 2.6.
Bases: PydanticDeprecationWarning
A specific PydanticDeprecationWarning subclass defining functionality deprecated since Pydantic 2.9.
Handler to call into the next JSON schema generation function.
Type: JsonSchemaMode
def __call__(core_schema: CoreSchemaOrField) -> JsonSchemaValue
Call the inner handler and get the JsonSchemaValue it returns.
This will call the next JSON schema modifying function up until it calls
into pydantic.json_schema.GenerateJsonSchema, which will raise a
pydantic.errors.PydanticInvalidForJsonSchema error if it cannot generate
a JSON schema.
JsonSchemaValue — The JSON schema generated by the inner JSON schema modifying functions.
A pydantic_core.core_schema.CoreSchema.
def resolve_ref_schema(maybe_ref_json_schema: JsonSchemaValue) -> JsonSchemaValue
Get the real schema for a {"$ref": ...} schema.
If the schema given is not a $ref schema, it will be returned as is.
This means you don’t have to check before calling this function.
JsonSchemaValue — A JsonSchemaValue that has no $ref.
A JsonSchemaValue which may be a $ref schema.
LookupError — If the ref is not found.
Bases: PydanticUserError
An error raised during failures to generate a JSON schema for some CoreSchema.
Bases: PydanticUserError
An error raised during failures to generate a CoreSchema for some type.
Bases: PydanticErrorMixin, TypeError
An error raised due to incorrect use of Pydantic.
Bases: TypedDict
A TypedDict for configuring Pydantic behaviour.
The title for the generated JSON schema, defaults to the model’s name
Type: str | None
A callable that takes a model class and returns the title for it. Defaults to None.
Type: Callable[[type], str] | None
A callable that takes a field’s name and info and returns title for it. Defaults to None.
Type: Callable[[str, FieldInfo | ComputedFieldInfo], str] | None
Whether to convert all characters to lowercase for str types. Defaults to False.
Type: bool
Whether to convert all characters to uppercase for str types. Defaults to False.
Type: bool
Whether to strip leading and trailing whitespace for str types.
Type: bool
The minimum length for str types. Defaults to None.
Type: int
The maximum length for str types. Defaults to None.
Type: int | None
Whether to ignore, allow, or forbid extra attributes during model initialization. Defaults to 'ignore'.
You can configure how pydantic handles the attributes that are not defined in the model:
allow - Allow any extra attributes.
forbid - Forbid any extra attributes.
ignore - Ignore any extra attributes.
from pydantic import BaseModel, ConfigDict

class User(BaseModel):
    model_config = ConfigDict(extra='ignore')  # (1)

    name: str

user = User(name='John Doe', age=20)  # (2)
print(user)
#> name='John Doe'

1. This is the default behaviour.
2. The age argument is ignored.
Instead, with extra='allow', the age argument is included:
from pydantic import BaseModel, ConfigDict

class User(BaseModel):
    model_config = ConfigDict(extra='allow')

    name: str

user = User(name='John Doe', age=20)  # (1)
print(user)
#> name='John Doe' age=20

1. The age argument is included.
With extra='forbid', an error is raised:
from pydantic import BaseModel, ConfigDict, ValidationError

class User(BaseModel):
    model_config = ConfigDict(extra='forbid')

    name: str

try:
    User(name='John Doe', age=20)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    age
      Extra inputs are not permitted [type=extra_forbidden, input_value=20, input_type=int]
    '''
Type: ExtraValues | None
Whether models are faux-immutable, i.e. whether __setattr__ is allowed, and also whether a
__hash__() method is generated for the model. This makes instances of the model potentially
hashable if all the attributes are hashable. Defaults to False.
Type: bool
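A short sketch of both effects — assignment is rejected and instances become hashable. The Point model name is illustrative:

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Point(BaseModel):
    model_config = ConfigDict(frozen=True)

    x: int
    y: int

p = Point(x=1, y=2)
try:
    p.x = 3  # rejected: the model is frozen
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> frozen_instance

# All fields are hashable, so the instance can be used as a dict key:
print({p: 'origin'}[Point(x=1, y=2)])
#> origin
```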
Whether an aliased field may be populated by its name as given by the model
attribute, as well as the alias. Defaults to False.
from pydantic import BaseModel, ConfigDict, Field

class User(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    name: str = Field(alias='full_name')  # (1)
    age: int

user = User(full_name='John Doe', age=20)  # (2)
print(user)
#> name='John Doe' age=20
user = User(name='John Doe', age=20)  # (3)
print(user)
#> name='John Doe' age=20

1. The field 'name' has an alias 'full_name'.
2. The model is populated by the alias 'full_name'.
3. The model is populated by the field name 'name'.
Type: bool
Whether to populate models with the value property of enums, rather than the raw enum.
This may be useful if you want to serialize model.model_dump() later. Defaults to False.
from enum import Enum
from typing import Optional

from pydantic import BaseModel, ConfigDict, Field

class SomeEnum(Enum):
    FOO = 'foo'
    BAR = 'bar'
    BAZ = 'baz'

class SomeModel(BaseModel):
    model_config = ConfigDict(use_enum_values=True)

    some_enum: SomeEnum
    another_enum: Optional[SomeEnum] = Field(
        default=SomeEnum.FOO, validate_default=True
    )

model1 = SomeModel(some_enum=SomeEnum.BAR)
print(model1.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'foo'}

model2 = SomeModel(some_enum=SomeEnum.BAR, another_enum=SomeEnum.BAZ)
print(model2.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'baz'}
Type: bool
Whether to validate the data when the model is changed. Defaults to False.
The default behavior of Pydantic is to validate the data when the model is created.
In case the user changes the data after the model is created, the model is not revalidated.
from pydantic import BaseModel

class User(BaseModel):
    name: str

user = User(name='John Doe')  # (1)
print(user)
#> name='John Doe'
user.name = 123  # (2)
print(user)
#> name=123

1. The validation happens only when the model is created.
2. The validation does not happen when the data is changed.
In case you want to revalidate the model when the data is changed, you can use validate_assignment=True:
from pydantic import BaseModel, ValidationError

class User(BaseModel, validate_assignment=True):  # (1)
    name: str

user = User(name='John Doe')  # (2)
print(user)
#> name='John Doe'
try:
    user.name = 123  # (3)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    name
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''

1. You can either use class keyword arguments, or model_config to set validate_assignment=True.
2. The validation happens when the model is created.
3. The validation also happens when the data is changed.
Type: bool
Whether arbitrary types are allowed for field types. Defaults to False.
from pydantic import BaseModel, ConfigDict, ValidationError

# This is not a pydantic model, it's an arbitrary class
class Pet:
    def __init__(self, name: str):
        self.name = name

class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    pet: Pet
    owner: str

pet = Pet(name='Hedwig')
# A simple check of instance type is used to validate the data
model = Model(owner='Harry', pet=pet)
print(model)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model.pet.name)
#> Hedwig
print(type(model.pet))
#> <class '__main__.Pet'>

try:
    # If the value is not an instance of the type, it's invalid
    Model(owner='Harry', pet='Hedwig')
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    pet
      Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str]
    '''

# Nothing in the instance of the arbitrary type is checked
# Here name probably should have been a str, but it's not validated
pet2 = Pet(name=42)
model2 = Model(owner='Harry', pet=pet2)
print(model2)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model2.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model2.pet.name)
#> 42
print(type(model2.pet))
#> <class '__main__.Pet'>
Type: bool
Whether to build models and look up discriminators of tagged unions using Python object attributes.
Type: bool
Whether to use the actual key provided in the data (e.g. alias) for error locs rather than the field’s name. Defaults to True.
Type: bool
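A sketch contrasting the default with loc_by_alias=False; the model and alias names are illustrative:

```python
from pydantic import BaseModel, ConfigDict, Field, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(loc_by_alias=False)  # report field names in error locs

    x: int = Field(alias='x_alias')

try:
    Model(x_alias='not an int')
except ValidationError as e:
    # With the default (True), the loc would be ('x_alias',)
    print(e.errors()[0]['loc'])
    #> ('x',)
```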
A callable that takes a field name and returns an alias for it
or an instance of AliasGenerator. Defaults to None.
When using a callable, the alias generator is used for both validation and serialization.
If you want to use different alias generators for validation and serialization, you can use
AliasGenerator instead.
If data source field names do not match your code style (e.g. CamelCase fields),
you can automatically generate aliases using alias_generator. Here's an example with
a basic callable:
from pydantic import BaseModel, ConfigDict
from pydantic.alias_generators import to_pascal

class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_pascal)

    name: str
    language_code: str

voice = Voice(Name='Filiz', LanguageCode='tr-TR')
print(voice.language_code)
#> tr-TR
print(voice.model_dump(by_alias=True))
#> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'}
If you want to use different alias generators for validation and serialization, you can use
AliasGenerator.
from pydantic import AliasGenerator, BaseModel, ConfigDict
from pydantic.alias_generators import to_camel, to_pascal

class Athlete(BaseModel):
    first_name: str
    last_name: str
    sport: str

    model_config = ConfigDict(
        alias_generator=AliasGenerator(
            validation_alias=to_camel,
            serialization_alias=to_pascal,
        )
    )

athlete = Athlete(firstName='John', lastName='Doe', sport='track')
print(athlete.model_dump(by_alias=True))
#> {'FirstName': 'John', 'LastName': 'Doe', 'Sport': 'track'}
Type: Callable[[str], str] | AliasGenerator | None
A tuple of types that may occur as values of class attributes without annotations. This is
typically used for custom descriptors (classes that behave like property). If an attribute is set on a
class without an annotation and has a type that is not in this tuple (or otherwise recognized by
pydantic), an error will be raised. Defaults to ().
Type: tuple[type, ...]
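A minimal sketch with a hypothetical descriptor-like helper class:

```python
from pydantic import BaseModel, ConfigDict

class Keeper:
    # A hypothetical helper class, not a pydantic type
    def __init__(self, value):
        self.value = value

class Model(BaseModel):
    model_config = ConfigDict(ignored_types=(Keeper,))

    # Unannotated and of an ignored type, so pydantic leaves it alone;
    # without ignored_types this assignment would raise an error.
    marker = Keeper('class-level helper')

    x: int

print(Model(x=1).x)
#> 1
```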
Whether to allow infinity (+inf and -inf) and NaN values for float and decimal fields. Defaults to True.
Type: bool
A dict or callable to provide extra JSON schema properties. Defaults to None.
Type: JsonDict | JsonSchemaExtraCallable | None
A dict of custom JSON encoders for specific types. Defaults to None.
Type: dict[type[object], JsonEncoder] | None
(new in V2) If True, strict validation is applied to all fields on the model.
By default, Pydantic attempts to coerce values to the correct type, when possible.
There are situations in which you may want to disable this behavior, and instead raise an error if a value’s type does not match the field’s type annotation.
To configure strict mode for all fields on a model, you can set strict=True on the model.
from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(strict=True)

    name: str
    age: int
See Strict Mode for more details.
See the Conversion Table for more details on how Pydantic converts data in both strict and lax modes.
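In lax mode the string '42' would be coerced to the int 42; with strict=True it is rejected. A sketch (redefining the model above so the example is self-contained):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(strict=True)

    name: str
    age: int

try:
    Model(name='Jane', age='42')  # a str, not an int
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> int_type
```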
Type: bool
When and how to revalidate models and dataclasses during validation. Accepts the string
values of 'never', 'always' and 'subclass-instances'. Defaults to 'never'.
'never' - will not revalidate models and dataclasses during validation
'always' - will revalidate models and dataclasses during validation
'subclass-instances' - will revalidate models and dataclasses during validation if the instance is a subclass of the model or dataclass
By default, model and dataclass instances are not revalidated during validation.
from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='never'):  # (1)
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])
my_user.hobbies = [1]  # (2)
t = Transaction(user=my_user)  # (3)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)
#> user=SubUser(hobbies=['scuba diving'], sins=['lying'])

1. revalidate_instances is set to 'never' by default.
2. The assignment is not validated, unless you set validate_assignment to True in the model's config.
3. Since revalidate_instances is set to 'never', this is not revalidated.
If you want to revalidate instances during validation, you can set revalidate_instances to 'always'
in the model’s config.
from typing import List

from pydantic import BaseModel, ValidationError

class User(BaseModel, revalidate_instances='always'):  # (1)
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])
my_user.hobbies = [1]
try:
    t = Transaction(user=my_user)  # (2)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Transaction
    user.hobbies.0
      Input should be a valid string [type=string_type, input_value=1, input_type=int]
    '''

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)
#> user=User(hobbies=['scuba diving'])

1. revalidate_instances is set to 'always'.
2. The model is revalidated, since revalidate_instances is set to 'always'.
3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).
It’s also possible to set revalidate_instances to 'subclass-instances' to only revalidate instances
of subclasses of the model.
from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='subclass-instances'):  # (1)
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])
my_user.hobbies = [1]
t = Transaction(user=my_user)  # (2)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)
#> user=User(hobbies=['scuba diving'])

1. revalidate_instances is set to 'subclass-instances'.
2. This is not revalidated, since my_user is an instance of User, not an instance of a subclass of User.
3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).
Type: Literal['always', 'never', 'subclass-instances']
The format of JSON serialized timedeltas. Accepts the string values of 'iso8601' and
'float'. Defaults to 'iso8601'.
'iso8601' - will serialize timedeltas to ISO 8601 durations.
'float' - will serialize timedeltas to the total number of seconds.
Type: Literal['iso8601', 'float']
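A sketch of the 'float' mode (the model name is illustrative):

```python
from datetime import timedelta

from pydantic import BaseModel, ConfigDict

class AsFloat(BaseModel):
    model_config = ConfigDict(ser_json_timedelta='float')

    td: timedelta

# With the default 'iso8601', this would serialize as an ISO 8601 duration string.
print(AsFloat(td=timedelta(minutes=1, seconds=30)).model_dump_json())
#> {"td":90.0}
```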
The encoding of JSON serialized bytes. Defaults to 'utf8'.
Set equal to val_json_bytes to get back an equal value after serialization round trip.
'utf8' - will serialize bytes to UTF-8 strings.
'base64' - will serialize bytes to URL safe base64 strings.
'hex' - will serialize bytes to hexadecimal strings.
Type: Literal['utf8', 'base64', 'hex']
The encoding of JSON serialized bytes to decode. Defaults to 'utf8'.
Set equal to ser_json_bytes to get back an equal value after serialization round trip.
'utf8' - will deserialize UTF-8 strings to bytes.
'base64' - will deserialize URL safe base64 strings to bytes.
'hex' - will deserialize hexadecimal strings to bytes.
Type: Literal['utf8', 'base64', 'hex']
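A sketch of a lossless round trip with matching ser_json_bytes and val_json_bytes (the model name is illustrative):

```python
from pydantic import BaseModel, ConfigDict

class Blob(BaseModel):
    model_config = ConfigDict(ser_json_bytes='base64', val_json_bytes='base64')

    data: bytes

original = Blob(data=b'hello world')
json_str = original.model_dump_json()
restored = Blob.model_validate_json(json_str)
# Matching settings give an equal value back after the round trip
assert restored.data == original.data
print(json_str)
```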
The encoding of JSON serialized infinity and NaN float values. Defaults to 'null'.
'null' - will serialize infinity and NaN values as null.
'constants' - will serialize infinity and NaN values as Infinity and NaN.
'strings' - will serialize infinity as string "Infinity" and NaN as string "NaN".
Type: Literal['null', 'constants', 'strings']
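A sketch contrasting the default with 'constants' (the model names are illustrative):

```python
import math

from pydantic import BaseModel, ConfigDict

class Default(BaseModel):
    value: float

class Constants(BaseModel):
    model_config = ConfigDict(ser_json_inf_nan='constants')

    value: float

print(Default(value=math.inf).model_dump_json())    # infinity becomes null
print(Constants(value=math.inf).model_dump_json())  # infinity becomes Infinity
```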
Whether to validate default values during validation. Defaults to False.
Type: bool
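By default, default values are not validated; a sketch of what changes with validate_default=True (the invalid default here is deliberate):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(validate_default=True)

    x: int = 'not an int'  # without validate_default, this would pass silently

try:
    Model()
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> int_parsing
```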
Whether to validate the return value from call validators. Defaults to False.
Type: bool
A tuple of strings and/or patterns that prevent models from having fields with names that conflict with them.
For strings, we match on a prefix basis. Ex, if ‘dog’ is in the protected namespace, ‘dog_name’ will be protected.
For patterns, we match on the entire field name. Ex, if re.compile(r'^dog$') is in the protected namespace, 'dog' will be protected, but 'dog_name' will not be.
Defaults to ('model_validate', 'model_dump',).
The reason we’ve selected these is to prevent collisions with other validation / dumping formats
in the future - ex, model_validate_{some_newly_supported_format}.
Before v2.10, Pydantic used ('model_',) as the default value for this setting to
prevent collisions between model attributes and BaseModel’s own methods. This was changed
in v2.10 given feedback that this restriction was limiting in AI and data science contexts,
where it is common to have fields with names like model_id, model_input, model_output, etc.
For more details, see https://github.com/pydantic/pydantic/issues/10315.
import warnings

from pydantic import BaseModel

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_dump_something: str

except UserWarning as e:
    print(e)
    '''
    Field "model_dump_something" in Model has conflict with protected namespace "model_dump".
    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('model_validate',)`.
    '''
You can customize this behavior using the protected_namespaces setting:
import re
import warnings

from pydantic import BaseModel, ConfigDict

with warnings.catch_warnings(record=True) as caught_warnings:
    warnings.simplefilter('always')  # Catch all warnings

    class Model(BaseModel):
        safe_field: str
        also_protect_field: str
        protect_this: str

        model_config = ConfigDict(
            protected_namespaces=(
                'protect_me_',
                'also_protect_',
                re.compile('^protect_this$'),
            )
        )

for warning in caught_warnings:
    print(f'{warning.message}')
    '''
    Field "also_protect_field" in Model has conflict with protected namespace "also_protect_".
    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', re.compile('^protect_this$'))`.
    Field "protect_this" in Model has conflict with protected namespace "re.compile('^protect_this$')".
    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', 'also_protect_')`.
    '''
While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error is raised if there is an actual collision with an existing attribute:
from pydantic import BaseModel, ConfigDict

try:

    class Model(BaseModel):
        model_validate: str

        model_config = ConfigDict(protected_namespaces=('model_',))

except NameError as e:
    print(e)
    '''
    Field "model_validate" conflicts with member <bound method BaseModel.model_validate of <class 'pydantic.main.BaseModel'>> of protected namespace "model_".
    '''
Type: tuple[str | Pattern[str], ...]
Whether to hide inputs when printing errors. Defaults to False.
Pydantic shows the input value and type when it raises ValidationError during the validation.
from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    a: str

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''
You can hide the input value and type by setting the hide_input_in_errors config to True.
from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    a: str

    model_config = ConfigDict(hide_input_in_errors=True)

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type]
    '''
Type: bool
Whether to defer model validator and serializer construction until the first model validation. Defaults to False.
This can be useful to avoid the overhead of building models which are only
used nested within other models, or when you want to manually define type namespace via
Model.model_rebuild(_types_namespace=...).
Since v2.10, this setting also applies to pydantic dataclasses and TypeAdapter instances.
Type: bool
A dict of settings for plugins. Defaults to None.
Type: dict[str, object] | None
Type: type[_GenerateSchema] | None
Whether fields with default values should be marked as required in the serialization schema. Defaults to False.
This ensures that the serialization schema will reflect the fact a field with a default will always be present when serializing the model, even though it is not required for validation.
However, there are scenarios where this may be undesirable — in particular, if you want to share the schema between validation and serialization, and don’t mind fields with defaults being marked as not required during serialization. See #7209 for more details.
from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    a: str = 'a'

    model_config = ConfigDict(json_schema_serialization_defaults_required=True)

print(Model.model_json_schema(mode='validation'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'title': 'Model',
    'type': 'object',
}
'''
print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''
Type: bool
If not None, the specified mode will be used to generate the JSON schema regardless of what mode was passed to
the function call. Defaults to None.
This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema.
It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation
and serialization that must both be referenced from the same schema; when this happens, we automatically append
-Input to the definition reference for the validation schema and -Output to the definition reference for the
serialization schema. By specifying a json_schema_mode_override though, this prevents the conflict between
the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes
from being added to the definition references.
from pydantic import BaseModel, ConfigDict, Json

class Model(BaseModel):
    a: Json[int]  # requires a string to validate, but will dump an int

print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'title': 'A', 'type': 'integer'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

class ForceInputModel(Model):
    # the following ensures that even with mode='serialization', we
    # will get the schema that would be generated for validation.
    model_config = ConfigDict(json_schema_mode_override='validation')

print(ForceInputModel.model_json_schema(mode='serialization'))
'''
{
    'properties': {
        'a': {
            'contentMediaType': 'application/json',
            'contentSchema': {'type': 'integer'},
            'title': 'A',
            'type': 'string',
        }
    },
    'required': ['a'],
    'title': 'ForceInputModel',
    'type': 'object',
}
'''
Type: Literal['validation', 'serialization', None]
If True, enables automatic coercion of any Number type to str in “lax” (non-strict) mode. Defaults to False.
Pydantic doesn’t allow number types (int, float, Decimal) to be coerced as type str by default.
from decimal import Decimal

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    value: str

try:
    print(Model(value=42))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid string [type=string_type, input_value=42, input_type=int]
    '''

class Model(BaseModel):
    model_config = ConfigDict(coerce_numbers_to_str=True)

    value: str

repr(Model(value=42).value)
#> "42"
repr(Model(value=42.13).value)
#> "42.13"
repr(Model(value=Decimal('42.13')).value)
#> "42.13"
Type: bool
The regex engine to be used for pattern validation.
Defaults to 'rust-regex'.
'rust-regex' - uses the regex Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features.
'python-re' - uses the re module, which supports all regex features, but may be slower.
from pydantic import BaseModel, ConfigDict, Field, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(regex_engine='python-re')

    value: str = Field(pattern=r'^abc(?=def)')

print(Model(value='abcdef').value)
#> abcdef

try:
    print(Model(value='abxyzcdef'))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      String should match pattern '^abc(?=def)' [type=string_pattern_mismatch, input_value='abxyzcdef', input_type=str]
    '''
Type: Literal['rust-regex', 'python-re']
If True, Python exceptions that were part of a validation failure will be shown as an exception group as a cause. Can be useful for debugging. Defaults to False.
Type: bool
Whether docstrings of attributes (bare string literals immediately following the attribute declaration)
should be used for field descriptions. Defaults to False.
Available in Pydantic v2.7+.
from pydantic import BaseModel, ConfigDict, Field

class Model(BaseModel):
    model_config = ConfigDict(use_attribute_docstrings=True)

    x: str
    """
    Example of an attribute docstring
    """

    y: int = Field(description="Description in Field")
    """
    Description in Field overrides attribute docstring
    """

print(Model.model_fields["x"].description)
#> Example of an attribute docstring
print(Model.model_fields["y"].description)
#> Description in Field
This requires the source code of the class to be available at runtime.
Type: bool
Whether to cache strings to avoid constructing new Python objects. Defaults to True.
Enabling this setting should significantly improve validation performance while increasing memory usage slightly.
True or 'all' (the default) - cache all strings
'keys' - cache only dictionary keys
False or 'none' - no caching
Type: bool | Literal['all', 'keys', 'none']
Bases: StandardDataclass, Protocol
A protocol containing attributes only available once a class has been decorated as a Pydantic dataclass.
Usage docs: https://docs.pydantic.dev/2.10/concepts/models/
A base class for creating Pydantic models.
Configuration for the model, should be a dictionary conforming to ConfigDict.
Type: ConfigDict Default: ConfigDict()
The names of the class variables defined on the model.
Type: set[str]
Metadata about the private attributes of the model.
Type: Dict[str, ModelPrivateAttr]
The synthesized __init__ Signature of the model.
Type: Signature
Whether model building is completed, or if there are still undefined fields.
Type: bool Default: False
The core schema of the model.
Type: CoreSchema
Whether the model has a custom __init__ method.
Type: bool
Metadata containing the decorators defined on the model.
This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.
Type: _decorators.DecoratorInfos Default: _decorators.DecoratorInfos()
Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.
Type: _generics.PydanticGenericMetadata
Parent namespace of the model, used for automatic rebuilding of models.
Type: Dict[str, Any] | None Default: None
The name of the post-init method for the model, if defined.
Type: None | Literal['model_post_init']
Whether the model is a RootModel.
Type: bool Default: False
The pydantic-core SchemaSerializer used to dump instances of the model.
Type: SchemaSerializer
The pydantic-core SchemaValidator used to validate instances of the model.
Type: SchemaValidator | PluggableSchemaValidator
A dictionary of field names and their corresponding FieldInfo objects.
This replaces Model.__fields__ from Pydantic V1.
Type: Dict[str, FieldInfo]
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
Type: Dict[str, ComputedFieldInfo]
A dictionary containing extra values, if extra is set to 'allow'.
Type: dict[str, Any] | None Default: _model_construction.NoInitField(init=False)
The names of fields explicitly set during instantiation.
Type: set[str] Default: _model_construction.NoInitField(init=False)
Values of private attributes set on the model instance.
Type: dict[str, Any] | None Default: _model_construction.NoInitField(init=False)
Get metadata about the fields defined on the model.
Deprecation warning: you should be getting this information from the model class, not from an instance.
In V3, this property will be removed from the BaseModel class.
Type: dict[str, FieldInfo]
Get metadata about the computed fields defined on the model.
Deprecation warning: you should be getting this information from the model class, not from an instance.
In V3, this property will be removed from the BaseModel class.
Type: dict[str, ComputedFieldInfo]
Get extra fields set during validation.
Type: dict[str, Any] | None
Returns the set of fields that have been explicitly set on this model instance.
Type: set[str]
def __init__(**data: Any) -> None
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be
validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
None
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self
Creates a new instance of the Model class with validated data.
Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Self — A new instance of the Model class with validated data.
A set of field names that were originally explicitly set during instantiation. If provided,
this is directly used for the model_fields_set attribute.
Otherwise, the field names from the values argument will be used.
Trusted or pre-validated data dictionary.
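A sketch of typical usage (the model name is illustrative):

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int = 0

# The data is trusted: no validation runs, but defaults are applied.
u = User.model_construct(name='John Doe')
print(u.age)
#> 0
print(u.model_fields_set)
#> {'name'}
```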
def model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self
Usage docs: https://docs.pydantic.dev/2.10/concepts/serialization/#model_copy
Returns a copy of the model.
Self — New model instance.
Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.
Set to True to make a deep copy of the model.
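A sketch showing the difference between a shallow (default) and a deep copy; the names are illustrative:

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    tags: list

u = User(name='John', tags=['a'])
shallow = u.model_copy(update={'name': 'Jane'})
deep = u.model_copy(deep=True)

u.tags.append('b')
print(shallow.tags)  # the shallow copy shares the original list
#> ['a', 'b']
print(deep.tags)     # the deep copy does not
#> ['a']
```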
def model_dump(
mode: Literal['json', 'python'] | str = 'python',
include: IncEx | None = None,
exclude: IncEx | None = None,
context: Any | None = None,
by_alias: bool = False,
exclude_unset: bool = False,
exclude_defaults: bool = False,
exclude_none: bool = False,
round_trip: bool = False,
warnings: bool | Literal['none', 'warn', 'error'] = True,
serialize_as_any: bool = False,
) -> dict[str, Any]
Usage docs: https://docs.pydantic.dev/2.10/concepts/serialization/#modelmodel_dump
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
dict[str, Any] — A dictionary representation of the model.
The mode in which to_python should run.
If mode is ‘json’, the output will only contain JSON serializable types.
If mode is ‘python’, the output may contain non-JSON-serializable Python objects.
A set of fields to include in the output.
A set of fields to exclude from the output.
Additional context to pass to the serializer.
Whether to use the field’s alias in the dictionary key if defined.
Whether to exclude fields that have not been explicitly set.
Whether to exclude fields that are set to their default value.
Whether to exclude fields that have a value of None.
If True, dumped values should be valid as input for non-idempotent types such as Json[T].
How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors,
“error” raises a PydanticSerializationError.
Whether to serialize fields with duck-typing serialization behavior.
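A sketch of the exclusion flags (the model is illustrative):

```python
from typing import Optional

from pydantic import BaseModel

class User(BaseModel):
    name: str
    nickname: Optional[str] = None

u = User(name='John')
print(u.model_dump())
#> {'name': 'John', 'nickname': None}
print(u.model_dump(exclude_none=True))
#> {'name': 'John'}
print(u.model_dump(exclude_unset=True))
#> {'name': 'John'}
```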
def model_dump_json(
indent: int | None = None,
include: IncEx | None = None,
exclude: IncEx | None = None,
context: Any | None = None,
by_alias: bool = False,
exclude_unset: bool = False,
exclude_defaults: bool = False,
exclude_none: bool = False,
round_trip: bool = False,
warnings: bool | Literal['none', 'warn', 'error'] = True,
serialize_as_any: bool = False,
) -> str
Usage docs: https://docs.pydantic.dev/2.10/concepts/serialization/#modelmodel_dump_json
Generates a JSON representation of the model using Pydantic’s to_json method.
str — A JSON string representation of the model.
Indentation to use in the JSON output. If None is passed, the output will be compact.
Field(s) to include in the JSON output.
Field(s) to exclude from the JSON output.
Additional context to pass to the serializer.
Whether to serialize using field aliases.
Whether to exclude fields that have not been explicitly set.
Whether to exclude fields that are set to their default value.
Whether to exclude fields that have a value of None.
If True, dumped values should be valid as input for non-idempotent types such as Json[T].
How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors,
“error” raises a PydanticSerializationError.
Whether to serialize fields with duck-typing serialization behavior.
@classmethod
def model_json_schema(
cls,
by_alias: bool = True,
ref_template: str = DEFAULT_REF_TEMPLATE,
schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]
Generates a JSON schema for a model class.
dict[str, Any] — The JSON schema for the given model class.
Whether to use attribute aliases or not.
The reference template.
To override the logic used to generate the JSON schema, as a subclass of
GenerateJsonSchema with your desired modifications
The mode in which to generate the schema.
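A quick sketch of the `by_alias` behavior (note it defaults to True here, unlike model_dump; the model is illustrative):

```python
from pydantic import BaseModel, Field

class Item(BaseModel):
    name: str = Field(alias="itemName")

by_alias_schema = Item.model_json_schema()               # keys use aliases
attr_schema = Item.model_json_schema(by_alias=False)     # keys use attribute names
```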
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str
Compute the class name for parametrizations of generic classes.
This method can be overridden to achieve a custom naming scheme for generic BaseModels.
str — String representing the new class where params are passed to cls as type variables.
Tuple of types of the class. Given a generic class
Model with 2 type variables and a concrete model Model[str, int],
the value (str, int) would be passed to params.
TypeError— Raised when trying to generate concrete names for non-generic models.
def model_post_init(__context: Any) -> None
Override this method to perform additional initialization after __init__ and model_construct.
This is useful if you want to do some validation that requires the entire model to be initialized.
None
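A sketch of a typical override, computing a derived value once all fields are populated (the model is illustrative):

```python
from typing import Any
from pydantic import BaseModel

class Rect(BaseModel):
    width: float
    height: float
    area: float = 0.0

    def model_post_init(self, __context: Any) -> None:
        # all fields exist by the time this runs, after __init__/model_construct
        self.area = self.width * self.height
```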
@classmethod
def model_rebuild(
cls,
force: bool = False,
raise_errors: bool = True,
_parent_namespace_depth: int = 2,
_types_namespace: MappingNamespace | None = None,
) -> bool | None
Try to rebuild the pydantic-core schema for the model.
This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.
bool | None — Returns None if the schema is already “complete” and rebuilding was not required.
bool | None — If rebuilding was required, returns True if rebuilding was successful, otherwise False.
Whether to force the rebuilding of the model schema, defaults to False.
Whether to raise errors, defaults to True.
The depth level of the parent namespace, defaults to 2.
The types namespace, defaults to None.
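A sketch of the ForwardRef scenario this addresses (class names are illustrative):

```python
from pydantic import BaseModel, PydanticUserError

class Foo(BaseModel):
    x: 'Bar'  # forward reference: Bar does not exist yet

try:
    Foo(x={'y': 1})
except PydanticUserError:
    pass  # the schema is incomplete until Bar is defined

class Bar(BaseModel):
    y: int

Foo.model_rebuild()     # the schema can now be built
foo = Foo(x={'y': 1})   # validation works after the rebuild
```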
@classmethod
def model_validate(
cls,
obj: Any,
strict: bool | None = None,
from_attributes: bool | None = None,
context: Any | None = None,
) -> Self
Validate a pydantic model instance.
Self — The validated model instance.
The object to validate.
Whether to enforce types strictly.
Whether to extract data from object attributes.
Additional context to pass to the validator.
ValidationError— If the object could not be validated.
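A sketch of the `strict` and `from_attributes` parameters (model and class names are illustrative):

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int
    name: str

# lax mode coerces the numeric string
u = User.model_validate({"id": "1", "name": "Ann"})

# strict mode rejects it
try:
    User.model_validate({"id": "1", "name": "Ann"}, strict=True)
except ValidationError:
    pass

# from_attributes reads plain object attributes (e.g. an ORM row)
class Row:
    id = 2
    name = "Bob"

u2 = User.model_validate(Row(), from_attributes=True)
```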
@classmethod
def model_validate_json(
cls,
json_data: str | bytes | bytearray,
strict: bool | None = None,
context: Any | None = None,
) -> Self
Usage docs: https://docs.pydantic.dev/2.10/concepts/json/#json-parsing
Validate the given JSON data against the Pydantic model.
Self — The validated Pydantic model.
The JSON data to validate.
Whether to enforce types strictly.
Extra variables to pass to the validator.
ValidationError— If json_data is not a JSON string or the object could not be validated.
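For example (an illustrative model; note that malformed JSON also surfaces as a ValidationError):

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int

u = User.model_validate_json('{"id": 3}')

try:
    User.model_validate_json('not valid json')
except ValidationError:
    pass  # invalid JSON input is reported as a validation error
```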
@classmethod
def model_validate_strings(
cls,
obj: Any,
strict: bool | None = None,
context: Any | None = None,
) -> Self
Validate the given object with string data against the Pydantic model.
Self — The validated Pydantic model.
The object containing string data to validate.
Whether to enforce types strictly.
Extra variables to pass to the validator.
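A sketch of string-only input, as might come from environment variables or form data (the model is illustrative):

```python
import datetime
from pydantic import BaseModel

class Meeting(BaseModel):
    when: datetime.date
    attendees: int

# every leaf value is a string; coercion follows each field's type
m = Meeting.model_validate_strings({"when": "2024-06-01", "attendees": "5"})
```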
@classmethod
def __get_pydantic_core_schema__(
cls,
source: type[BaseModel],
handler: GetCoreSchemaHandler,
) -> CoreSchema
Hook into generating the model’s CoreSchema.
CoreSchema — A pydantic-core CoreSchema.
The class we are generating a schema for.
This will generally be the same as the cls argument if this is a classmethod.
A callable that calls into Pydantic’s internal CoreSchema generation logic.
@classmethod
def __get_pydantic_json_schema__(
cls,
core_schema: CoreSchema,
handler: GetJsonSchemaHandler,
) -> JsonSchemaValue
Hook into generating the model’s JSON schema.
JsonSchemaValue — A JSON schema, as a Python object.
A pydantic-core CoreSchema.
You can ignore this argument and call the handler with a new CoreSchema,
wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}),
or just call the handler with the original schema.
Call into Pydantic’s internal JSON schema generation.
This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema
generation fails.
Since this gets called by BaseModel.model_json_schema you can override the
schema_generator argument to that function to change JSON schema generation globally
for a type.
@classmethod
def __pydantic_init_subclass__(cls, kwargs: Any = {}) -> None
This is intended to behave just like __init_subclass__, but is called by ModelMetaclass
only after the class is actually fully initialized. In particular, attributes like model_fields will
be present when this is called.
This is necessary because __init_subclass__ will always be called by type.__new__,
and it would require a prohibitively large refactor to the ModelMetaclass to ensure that
type.__new__ was called in such a manner that the class would already be sufficiently initialized.
This will receive the same kwargs that would be passed to the standard __init_subclass__, namely,
any kwargs passed to the class definition that aren’t used internally by pydantic.
None
Any keyword arguments passed to the class definition that aren’t used internally by pydantic.
def __copy__() -> Self
Returns a shallow copy of the model.
Self
def __deepcopy__(memo: dict[int, Any] | None = None) -> Self
Returns a deep copy of the model.
Self
def __init_subclass__(cls, kwargs: Unpack[ConfigDict] = {})
This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.
from pydantic import BaseModel
class MyModel(BaseModel, extra='allow'): ...
However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any
of the config arguments, and will only receive any keyword arguments passed during class initialization
that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)
Keyword arguments passed to the class definition, which set model_config
def __iter__() -> TupleGenerator
So dict(model) works.
TupleGenerator
def dict(
include: IncEx | None = None,
exclude: IncEx | None = None,
by_alias: bool = False,
exclude_unset: bool = False,
exclude_defaults: bool = False,
exclude_none: bool = False,
) -> Dict[str, Any]
Dict[str, Any]
def json(
include: IncEx | None = None,
exclude: IncEx | None = None,
by_alias: bool = False,
exclude_unset: bool = False,
exclude_defaults: bool = False,
exclude_none: bool = False,
encoder: Callable[[Any], Any] | None = PydanticUndefined,
models_as_dict: bool = PydanticUndefined,
dumps_kwargs: Any = {},
) -> str
str
@classmethod
def parse_obj(cls, obj: Any) -> Self
Self
@classmethod
def parse_raw(
cls,
b: str | bytes,
content_type: str | None = None,
encoding: str = 'utf8',
proto: DeprecatedParseProtocol | None = None,
allow_pickle: bool = False,
) -> Self
Self
@classmethod
def parse_file(
cls,
path: str | Path,
content_type: str | None = None,
encoding: str = 'utf8',
proto: DeprecatedParseProtocol | None = None,
allow_pickle: bool = False,
) -> Self
Self
@classmethod
def from_orm(cls, obj: Any) -> Self
Self
@classmethod
def construct(cls, _fields_set: set[str] | None = None, values: Any = {}) -> Self
Self
def copy(
include: AbstractSetIntStr | MappingIntStrAny | None = None,
exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
update: Dict[str, Any] | None = None,
deep: bool = False,
) -> Self
Returns a copy of the model.
If you need include or exclude, use:
data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)
Self — A copy of the model with included, excluded and updated fields as specified.
Optional set or mapping specifying which fields to include in the copied model.
Optional set or mapping specifying which fields to exclude in the copied model.
Optional dictionary of field-value pairs to override field values in the copied model.
If True, the values of fields that are Pydantic models will be deep-copied.
@classmethod
def schema(
cls,
by_alias: bool = True,
ref_template: str = DEFAULT_REF_TEMPLATE,
) -> Dict[str, Any]
Dict[str, Any]
@classmethod
def schema_json(
cls,
by_alias: bool = True,
ref_template: str = DEFAULT_REF_TEMPLATE,
dumps_kwargs: Any = {},
) -> str
str
@classmethod
def validate(cls, value: Any) -> Self
Self
@classmethod
def update_forward_refs(cls, localns: Any = {}) -> None
None
Bases: UserWarning
This class is used to emit warnings produced during JSON schema generation.
See the GenerateJsonSchema.emit_warning and
GenerateJsonSchema.render_warning_message
methods for more details; these can be overridden to control warning behavior.
Usage docs: https://docs.pydantic.dev/2.10/concepts/json_schema/#customizing-the-json-schema-generation-process
A class for generating JSON schemas.
This class generates JSON schemas based on configured parameters. The default schema dialect
is https://json-schema.org/draft/2020-12/schema.
The class uses by_alias to configure how fields with
multiple names are handled and ref_template to format reference names.
Whether to use field aliases in the generated schemas.
The format string to use when generating reference names.
Default: 'https://json-schema.org/draft/2020-12/schema'
Type: set[JsonSchemaWarningKind] Default: {'skipped-choice'}
Default: by_alias
Default: ref_template
Type: dict[CoreModeRef, JsonRef] Default: {}
Type: dict[CoreModeRef, DefsRef] Default: {}
Type: dict[DefsRef, CoreModeRef] Default: {}
Type: dict[JsonRef, DefsRef] Default: {}
Type: dict[DefsRef, JsonSchemaValue] Default: {}
Type: JsonSchemaMode
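A minimal subclass sketch, wired in via the schema_generator parameter of model_json_schema (the added `$schema` key is one possible customization):

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema

class MyGenerateJsonSchema(GenerateJsonSchema):
    def generate(self, schema, mode='validation'):
        json_schema = super().generate(schema, mode=mode)
        json_schema['$schema'] = self.schema_dialect  # advertise the dialect
        return json_schema

class Model(BaseModel):
    a: int

schema = Model.model_json_schema(schema_generator=MyGenerateJsonSchema)
```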
def build_schema_type_to_method(
) -> dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]
Builds a dictionary mapping fields to methods for generating JSON schemas.
dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]] — A dictionary containing the mapping of CoreSchemaOrFieldType to a handler method.
TypeError— If no method has been defined for generating a JSON schema for a given pydantic core schema type.
def generate_definitions(
inputs: Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]],
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]
Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references.
tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]] — A tuple where:
- The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
- The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions.
A sequence of tuples, where:
- The first element is a JSON schema key type.
- The second element is the JSON mode: either ‘validation’ or ‘serialization’.
- The third element is a core schema.
PydanticUserError— Raised if the JSON schema generator has already been used to generate a JSON schema.
def generate(schema: CoreSchema, mode: JsonSchemaMode = 'validation') -> JsonSchemaValue
Generates a JSON schema for a specified schema in a specified mode.
JsonSchemaValue — A JSON schema representing the specified schema.
A pydantic-core CoreSchema to generate a JSON schema for.
The mode in which to generate the schema. Defaults to ‘validation’.
PydanticUserError— If the JSON schema generator has already been used to generate a JSON schema.
def generate_inner(schema: CoreSchemaOrField) -> JsonSchemaValue
Generates a JSON schema for a given core schema.
TODO: the nested function definitions here seem like bad practice, I’d like to unpack these in a future PR. It’d be great if we could shorten the call stack a bit for JSON schema generation, and I think there’s potential for that here.
JsonSchemaValue — The generated JSON schema.
The given core schema.
def sort(value: JsonSchemaValue, parent_key: str | None = None) -> JsonSchemaValue
Override this method to customize the sorting of the JSON schema (e.g., don’t sort at all, sort all keys unconditionally, etc.)
By default, alphabetically sort the keys in the JSON schema, skipping the ‘properties’ and ‘default’ keys to preserve field definition order. This sort is recursive, so it will sort all nested dictionaries as well.
JsonSchemaValue
def invalid_schema(schema: core_schema.InvalidSchema) -> JsonSchemaValue
Placeholder - should never be called.
JsonSchemaValue
def any_schema(schema: core_schema.AnySchema) -> JsonSchemaValue
Generates a JSON schema that matches any value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def none_schema(schema: core_schema.NoneSchema) -> JsonSchemaValue
Generates a JSON schema that matches None.
JsonSchemaValue — The generated JSON schema.
The core schema.
def bool_schema(schema: core_schema.BoolSchema) -> JsonSchemaValue
Generates a JSON schema that matches a bool value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def int_schema(schema: core_schema.IntSchema) -> JsonSchemaValue
Generates a JSON schema that matches an int value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def float_schema(schema: core_schema.FloatSchema) -> JsonSchemaValue
Generates a JSON schema that matches a float value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def decimal_schema(schema: core_schema.DecimalSchema) -> JsonSchemaValue
Generates a JSON schema that matches a decimal value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def str_schema(schema: core_schema.StringSchema) -> JsonSchemaValue
Generates a JSON schema that matches a string value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def bytes_schema(schema: core_schema.BytesSchema) -> JsonSchemaValue
Generates a JSON schema that matches a bytes value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def date_schema(schema: core_schema.DateSchema) -> JsonSchemaValue
Generates a JSON schema that matches a date value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def time_schema(schema: core_schema.TimeSchema) -> JsonSchemaValue
Generates a JSON schema that matches a time value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def datetime_schema(schema: core_schema.DatetimeSchema) -> JsonSchemaValue
Generates a JSON schema that matches a datetime value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def timedelta_schema(schema: core_schema.TimedeltaSchema) -> JsonSchemaValue
Generates a JSON schema that matches a timedelta value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def literal_schema(schema: core_schema.LiteralSchema) -> JsonSchemaValue
Generates a JSON schema that matches a literal value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def enum_schema(schema: core_schema.EnumSchema) -> JsonSchemaValue
Generates a JSON schema that matches an Enum value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def is_instance_schema(schema: core_schema.IsInstanceSchema) -> JsonSchemaValue
Handles JSON schema generation for a core schema that checks if a value is an instance of a class.
Unless overridden in a subclass, this raises an error.
JsonSchemaValue — The generated JSON schema.
The core schema.
def is_subclass_schema(schema: core_schema.IsSubclassSchema) -> JsonSchemaValue
Handles JSON schema generation for a core schema that checks if a value is a subclass of a class.
For backwards compatibility with v1, this does not raise an error, but can be overridden to change this.
JsonSchemaValue — The generated JSON schema.
The core schema.
def callable_schema(schema: core_schema.CallableSchema) -> JsonSchemaValue
Generates a JSON schema that matches a callable value.
Unless overridden in a subclass, this raises an error.
JsonSchemaValue — The generated JSON schema.
The core schema.
def list_schema(schema: core_schema.ListSchema) -> JsonSchemaValue
Returns a schema that matches a list schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def tuple_positional_schema(schema: core_schema.TupleSchema) -> JsonSchemaValue
Replaced by tuple_schema.
JsonSchemaValue
def tuple_variable_schema(schema: core_schema.TupleSchema) -> JsonSchemaValue
Replaced by tuple_schema.
JsonSchemaValue
def tuple_schema(schema: core_schema.TupleSchema) -> JsonSchemaValue
Generates a JSON schema that matches a tuple schema e.g. Tuple[int, str, bool] or Tuple[int, ...].
JsonSchemaValue — The generated JSON schema.
The core schema.
def set_schema(schema: core_schema.SetSchema) -> JsonSchemaValue
Generates a JSON schema that matches a set schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def frozenset_schema(schema: core_schema.FrozenSetSchema) -> JsonSchemaValue
Generates a JSON schema that matches a frozenset schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def generator_schema(schema: core_schema.GeneratorSchema) -> JsonSchemaValue
Returns a JSON schema that represents the provided GeneratorSchema.
JsonSchemaValue — The generated JSON schema.
The schema.
def dict_schema(schema: core_schema.DictSchema) -> JsonSchemaValue
Generates a JSON schema that matches a dict schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def function_before_schema(
schema: core_schema.BeforeValidatorFunctionSchema,
) -> JsonSchemaValue
Generates a JSON schema that matches a function-before schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def function_after_schema(
schema: core_schema.AfterValidatorFunctionSchema,
) -> JsonSchemaValue
Generates a JSON schema that matches a function-after schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def function_plain_schema(
schema: core_schema.PlainValidatorFunctionSchema,
) -> JsonSchemaValue
Generates a JSON schema that matches a function-plain schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def function_wrap_schema(
schema: core_schema.WrapValidatorFunctionSchema,
) -> JsonSchemaValue
Generates a JSON schema that matches a function-wrap schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def default_schema(schema: core_schema.WithDefaultSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema with a default value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def nullable_schema(schema: core_schema.NullableSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that allows null values.
JsonSchemaValue — The generated JSON schema.
The core schema.
def union_schema(schema: core_schema.UnionSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that allows values matching any of the given schemas.
JsonSchemaValue — The generated JSON schema.
The core schema.
def tagged_union_schema(schema: core_schema.TaggedUnionSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value.
JsonSchemaValue — The generated JSON schema.
The core schema.
def chain_schema(schema: core_schema.ChainSchema) -> JsonSchemaValue
Generates a JSON schema that matches a core_schema.ChainSchema.
When generating a schema for validation, we return the validation JSON schema for the first step in the chain. For serialization, we return the serialization JSON schema for the last step in the chain.
JsonSchemaValue — The generated JSON schema.
The core schema.
def lax_or_strict_schema(schema: core_schema.LaxOrStrictSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that allows values matching either the lax schema or the strict schema.
JsonSchemaValue — The generated JSON schema.
The core schema.
def json_or_python_schema(schema: core_schema.JsonOrPythonSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the Python schema.
The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override this method.
JsonSchemaValue — The generated JSON schema.
The core schema.
def typed_dict_schema(schema: core_schema.TypedDictSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a typed dict.
JsonSchemaValue — The generated JSON schema.
The core schema.
def typed_dict_field_schema(schema: core_schema.TypedDictField) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a typed dict field.
JsonSchemaValue — The generated JSON schema.
The core schema.
def dataclass_field_schema(schema: core_schema.DataclassField) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a dataclass field.
JsonSchemaValue — The generated JSON schema.
The core schema.
def model_field_schema(schema: core_schema.ModelField) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a model field.
JsonSchemaValue — The generated JSON schema.
The core schema.
def computed_field_schema(schema: core_schema.ComputedField) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a computed field.
JsonSchemaValue — The generated JSON schema.
The core schema.
def model_schema(schema: core_schema.ModelSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a model.
JsonSchemaValue — The generated JSON schema.
The core schema.
def resolve_ref_schema(json_schema: JsonSchemaValue) -> JsonSchemaValue
Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema.
JsonSchemaValue — The resolved schema.
The schema to resolve.
RuntimeError— If the schema reference can’t be found in definitions.
def model_fields_schema(schema: core_schema.ModelFieldsSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a model’s fields.
JsonSchemaValue — The generated JSON schema.
The core schema.
def field_is_present(field: CoreSchemaField) -> bool
Whether the field should be included in the generated JSON schema.
bool — True if the field should be included in the generated JSON schema, False otherwise.
The schema for the field itself.
def field_is_required(
field: core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField,
total: bool,
) -> bool
Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.)
bool — True if the field should be marked as required in the generated JSON schema, False otherwise.
The schema for the field itself.
Only applies to TypedDictFields.
Indicates if the TypedDict this field belongs to is total, in which case any fields that don’t
explicitly specify required=False are required.
def dataclass_args_schema(schema: core_schema.DataclassArgsSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a dataclass’s constructor arguments.
JsonSchemaValue — The generated JSON schema.
The core schema.
def dataclass_schema(schema: core_schema.DataclassSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a dataclass.
JsonSchemaValue — The generated JSON schema.
The core schema.
def arguments_schema(schema: core_schema.ArgumentsSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a function’s arguments.
JsonSchemaValue — The generated JSON schema.
The core schema.
def kw_arguments_schema(
arguments: list[core_schema.ArgumentsParameter],
var_kwargs_schema: CoreSchema | None,
) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a function’s keyword arguments.
JsonSchemaValue — The generated JSON schema.
The core schema.
def p_arguments_schema(
arguments: list[core_schema.ArgumentsParameter],
var_args_schema: CoreSchema | None,
) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a function’s positional arguments.
JsonSchemaValue — The generated JSON schema.
The core schema.
def get_argument_name(argument: core_schema.ArgumentsParameter) -> str
Retrieves the name of an argument.
str — The name of the argument.
The core schema.
def call_schema(schema: core_schema.CallSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a function call.
JsonSchemaValue — The generated JSON schema.
The core schema.
def custom_error_schema(schema: core_schema.CustomErrorSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a custom error.
JsonSchemaValue — The generated JSON schema.
The core schema.
def json_schema(schema: core_schema.JsonSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a JSON object.
JsonSchemaValue — The generated JSON schema.
The core schema.
def url_schema(schema: core_schema.UrlSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a URL.
JsonSchemaValue — The generated JSON schema.
The core schema.
def multi_host_url_schema(schema: core_schema.MultiHostUrlSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts.
JsonSchemaValue — The generated JSON schema.
The core schema.
def uuid_schema(schema: core_schema.UuidSchema) -> JsonSchemaValue
Generates a JSON schema that matches a UUID.
JsonSchemaValue — The generated JSON schema.
The core schema.
def definitions_schema(schema: core_schema.DefinitionsSchema) -> JsonSchemaValue
Generates a JSON schema that matches a schema that defines a JSON object with definitions.
JsonSchemaValue — The generated JSON schema.
The core schema.
def definition_ref_schema(
schema: core_schema.DefinitionReferenceSchema,
) -> JsonSchemaValue
Generates a JSON schema that matches a schema that references a definition.
JsonSchemaValue — The generated JSON schema.
The core schema.
def ser_schema(
schema: core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema,
) -> JsonSchemaValue | None
Generates a JSON schema that matches a schema that defines a serialized object.
JsonSchemaValue | None — The generated JSON schema.
The core schema.
def complex_schema(schema: core_schema.ComplexSchema) -> JsonSchemaValue
Generates a JSON schema that matches a complex number.
JSON has no standard way to represent complex numbers, and a complex number is not a JSON numeric
type. Here we represent a complex number as a string, following the rules Python uses.
For instance, ‘1+2j’ is an accepted complex string. Details can be found in
Python’s complex documentation.
JsonSchemaValue — The generated JSON schema.
The core schema.
def get_title_from_name(name: str) -> str
Retrieves a title from a name.
str — The title.
The name to retrieve a title from.
def field_title_should_be_set(schema: CoreSchemaOrField) -> bool
Returns true if a field with the given schema should have a title set based on the field name.
Intuitively, we want this to return true for schemas that wouldn’t otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses).
bool — True if the field should have a title set, False otherwise.
The schema to check.
def normalize_name(name: str) -> str
Normalizes a name to be used as a key in a dictionary.
str — The normalized name.
The name to normalize.
def get_defs_ref(core_mode_ref: CoreModeRef) -> DefsRef
Override this method to change the way that definitions keys are generated from a core reference.
DefsRef — The definitions key.
The core reference.
def get_cache_defs_ref_schema(core_ref: CoreRef) -> tuple[DefsRef, JsonSchemaValue]
This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition.
tuple[DefsRef, JsonSchemaValue] — A tuple of the definitions reference and the JSON schema that will refer to it.
The core reference to get the definitions reference for.
def handle_ref_overrides(json_schema: JsonSchemaValue) -> JsonSchemaValue
Remove any sibling keys that are redundant with the referenced schema.
JsonSchemaValue — The schema with redundant sibling keys removed.
The schema to remove redundant sibling keys from.
def get_schema_from_definitions(json_ref: JsonRef) -> JsonSchemaValue | None
JsonSchemaValue | None
def encode_default(dft: Any) -> Any
Encode a default value to a JSON-serializable value.
This is used to encode default values for fields in the generated JSON schema.
Any — The encoded default value.
The default value to encode.
def update_with_validations(
json_schema: JsonSchemaValue,
core_schema: CoreSchema,
mapping: dict[str, str],
) -> None
Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema.
None
The JSON schema to update.
The core schema to get the validations from.
A mapping from core_schema attribute names to the corresponding JSON schema attribute names.
def get_flattened_anyof(schemas: list[JsonSchemaValue]) -> JsonSchemaValue
JsonSchemaValue
def get_json_ref_counts(json_schema: JsonSchemaValue) -> dict[JsonRef, int]
Get all values corresponding to the key ‘$ref’ anywhere in the json_schema.
dict[JsonRef, int]
def handle_invalid_for_json_schema(
schema: CoreSchemaOrField,
error_info: str,
) -> JsonSchemaValue
JsonSchemaValue
def emit_warning(kind: JsonSchemaWarningKind, detail: str) -> None
This method simply emits PydanticJsonSchemaWarnings based on handling in the render_warning_message method.
None
def render_warning_message(kind: JsonSchemaWarningKind, detail: str) -> str | None
This method is responsible for ignoring warnings as desired, and for formatting the warning messages.
You can override the value of ignored_warning_kinds in a subclass of GenerateJsonSchema
to modify what warnings are generated. If you want more control, you can override this method;
just return None in situations where you don't want warnings to be emitted.
str | None — The formatted warning message, or None if no warning should be emitted.
The kind of warning to render. It can be one of the following:
- 'skipped-choice': A choice field was skipped because it had no valid choices.
- 'non-serializable-default': A default value was skipped because it was not JSON-serializable.
A string with additional details about the warning.
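A subclass can silence additional warning kinds by extending ignored_warning_kinds and passing the subclass via schema_generator. A minimal sketch, assuming the default ignore set contains 'skipped-choice' as in recent Pydantic versions; the class and model names are illustrative:

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema


class QuietJsonSchema(GenerateJsonSchema):
    # Extend the default ignore set so non-serializable defaults
    # no longer emit PydanticJsonSchemaWarnings either.
    ignored_warning_kinds = (
        GenerateJsonSchema.ignored_warning_kinds | {'non-serializable-default'}
    )


class Model(BaseModel):
    x: int = 1


schema = Model.model_json_schema(schema_generator=QuietJsonSchema)
```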
Usage docs: https://docs.pydantic.dev/2.10/concepts/json_schema/#withjsonschema-annotation
Add this as an annotation on a field to override the (base) JSON schema that would be generated for that field. This provides a way to set a JSON schema for types that would otherwise raise errors when producing a JSON schema, such as Callable, or types that have an is-instance core schema, without needing to go so far as creating a custom subclass of pydantic.json_schema.GenerateJsonSchema. Note that any modifications to the schema that would normally be made (such as setting the title for model fields) will still be performed.
If mode is set, this will only apply to that schema generation mode, allowing you
to set different JSON schemas for validation and serialization.
Type: JsonSchemaValue | None
Type: Literal['validation', 'serialization'] | None Default: None
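For example, a Callable field would normally raise PydanticInvalidForJsonSchema during schema generation; annotating it with WithJsonSchema supplies a hand-written schema instead (the Task model and the 'dotted import path' description are illustrative):

```python
from typing import Annotated, Callable

from pydantic import BaseModel, WithJsonSchema


class Task(BaseModel):
    # Callable has no natural JSON schema; override it with one
    # describing how the value is expected to be written down.
    callback: Annotated[
        Callable[[int], int],
        WithJsonSchema({'type': 'string', 'description': 'dotted import path'}),
    ]


schema = Task.model_json_schema()
# The override is used, and normal modifications (e.g. the
# field title) are still applied on top of it.
```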
Add examples to a JSON schema.
If the JSON Schema already contains examples, the provided examples will be appended.
If mode is set, this will only apply to that schema generation mode,
allowing you to add different examples for validation and serialization.
Default: examples
Default: mode
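A minimal sketch of the annotation, assuming the list form of Examples available in recent Pydantic versions (the User model and example values are illustrative):

```python
from typing import Annotated

from pydantic import BaseModel
from pydantic.json_schema import Examples


class User(BaseModel):
    # The examples are attached to this field's JSON schema.
    name: Annotated[str, Examples(['Alice', 'Bob'])]


schema = User.model_json_schema()
```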
Usage docs: https://docs.pydantic.dev/2.10/concepts/json_schema/#skipjsonschema-annotation
Add this as an annotation on a field to skip generating a JSON schema for that field.
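A minimal sketch: the annotated field is still validated as usual, but it is omitted from the generated schema entirely (the Config model and field names are illustrative):

```python
from typing import Annotated

from pydantic import BaseModel
from pydantic.json_schema import SkipJsonSchema


class Config(BaseModel):
    host: str = 'localhost'
    # Validated normally, but excluded from the JSON schema.
    debug_token: Annotated[str, SkipJsonSchema()] = ''


schema = Config.model_json_schema()
```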
def model_json_schema(
cls: type[BaseModel] | type[PydanticDataclass],
by_alias: bool = True,
ref_template: str = DEFAULT_REF_TEMPLATE,
schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]
Utility function to generate a JSON Schema for a model.
dict[str, Any] — The generated JSON Schema.
The model class to generate a JSON Schema for.
If True (the default), fields will be serialized according to their alias.
If False, fields will be serialized according to their attribute name.
The template to use for generating JSON Schema references.
The class to use for generating the JSON Schema.
The mode to use for generating the JSON Schema. It can be one of the following:
- 'validation': Generate a JSON Schema for validating data.
- 'serialization': Generate a JSON Schema for serializing data.
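A minimal sketch showing the by_alias switch (the User model and alias are illustrative):

```python
from pydantic import BaseModel, Field
from pydantic.json_schema import model_json_schema


class User(BaseModel):
    full_name: str = Field(alias='fullName')


# Property keys follow the alias by default...
by_alias = model_json_schema(User)
# ...or the attribute name when by_alias=False.
by_name = model_json_schema(User, by_alias=False)
```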
def models_json_schema(
models: Sequence[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode]],
by_alias: bool = True,
title: str | None = None,
description: str | None = None,
ref_template: str = DEFAULT_REF_TEMPLATE,
schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]
Utility function to generate a JSON Schema for multiple models.
tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue] — A tuple where:
- The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
- The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.
A sequence of tuples of the form (model, mode).
Whether field aliases should be used as keys in the generated JSON Schema.
The title of the generated JSON Schema.
The description of the generated JSON Schema.
The reference template to use for generating JSON Schema references.
The schema generator to use for generating the JSON Schema.
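A minimal sketch of generating schemas for two related models at once (the Item and Order models and the title are illustrative); per-model entries reference shared definitions collected in the second returned element:

```python
from pydantic import BaseModel
from pydantic.json_schema import models_json_schema


class Item(BaseModel):
    name: str


class Order(BaseModel):
    items: list[Item]


schemas, definitions = models_json_schema(
    [(Item, 'validation'), (Order, 'validation')],
    title='Shop API',
)
# schemas is keyed by (model, mode) pairs; definitions holds the
# shared '$defs' plus the optional title/description.
```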
Type: TypeAlias Default: Dict[str, JsonValue]
Type: TypeAlias Default: Union[int, float, str, bool, None, List['JsonValue'], 'JsonDict']
Default: Union[core_schema.ModelField, core_schema.DataclassField, core_schema.TypedDictField, core_schema.ComputedField]
Default: Union[core_schema.CoreSchema, CoreSchemaField]
Default: Callable[[CoreSchemaOrField, GetJsonSchemaHandler], JsonSchemaValue]
A type alias for defined schema types that represents a union of
core_schema.CoreSchemaType and
core_schema.CoreSchemaFieldType.
Default: Literal[core_schema.CoreSchemaType, core_schema.CoreSchemaFieldType]
A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary JSON values.
Default: Dict[str, Any]
A type alias that represents the mode of a JSON schema; either 'validation' or 'serialization'.
For some types, the inputs to validation differ from the outputs of serialization. For example, computed fields will only be present when serializing, and should not be provided when validating. This flag provides a way to indicate whether you want the JSON schema required for validation inputs, or that will be matched by serialization outputs.
Default: Literal['validation', 'serialization']
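The computed-field case described above can be demonstrated directly: the computed field appears only in the serialization schema (the Rect model is illustrative):

```python
from pydantic import BaseModel, computed_field


class Rect(BaseModel):
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        # Derived on serialization; never accepted as input.
        return self.width * self.height


validation = Rect.model_json_schema(mode='validation')
serialization = Rect.model_json_schema(mode='serialization')
```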
A type alias representing the kinds of warnings that can be emitted during JSON schema generation.
See GenerateJsonSchema.render_warning_message
for more details.
Default: Literal['skipped-choice', 'non-serializable-default', 'skipped-discriminator']
The default format string used to generate reference names.
Default: '#/$defs/{model}'
Default: NewType('CoreRef', str)
Default: NewType('DefsRef', str)
Default: NewType('JsonRef', str)
Default: Tuple[CoreRef, JsonSchemaMode]
Default: TypeVar('JsonSchemaKeyT', bound=Hashable)
Default: TypeVar('AnyType')