Configuration
Configuration for Pydantic models.
Bases: TypedDict
A TypedDict for configuring Pydantic behaviour.
The title for the generated JSON schema, defaults to the model’s name
Type: str | None
A callable that takes a model class and returns the title for it. Defaults to None.
Type: Callable[[type], str] | None
A callable that takes a field’s name and info and returns the title for it. Defaults to None.
Type: Callable[[str, FieldInfo | ComputedFieldInfo], str] | None
Whether to convert all characters to lowercase for str types. Defaults to False.
Type: bool
Whether to convert all characters to uppercase for str types. Defaults to False.
Type: bool
Whether to strip leading and trailing whitespace for str types.
Type: bool
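A minimal sketch combining the string-handling settings above (the corresponding ConfigDict keys are str_to_lower, str_to_upper and str_strip_whitespace; the Login model name is illustrative):

```python
from pydantic import BaseModel, ConfigDict

class Login(BaseModel):
    # Lowercase every str field and strip surrounding whitespace
    model_config = ConfigDict(str_to_lower=True, str_strip_whitespace=True)

    username: str

user = Login(username='  John.DOE  ')
print(user.username)
#> john.doe
```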
The minimum length for str types. Defaults to None.
Type: int
The maximum length for str types. Defaults to None.
Type: int | None
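A short sketch of the length constraints above, which apply to every str field on the model (the ConfigDict keys are str_min_length and str_max_length; the Tag model is illustrative):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Tag(BaseModel):
    # Applies to every str field on the model
    model_config = ConfigDict(str_min_length=2, str_max_length=10)

    label: str

print(Tag(label='ok').label)
#> ok

try:
    Tag(label='x')  # too short
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> string_too_short
```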
Whether to ignore, allow, or forbid extra attributes during model initialization. Defaults to 'ignore'.
You can configure how pydantic handles the attributes that are not defined in the model:
'allow' - Allow any extra attributes.
'forbid' - Forbid any extra attributes.
'ignore' - Ignore any extra attributes.
from pydantic import BaseModel, ConfigDict

class User(BaseModel):
    model_config = ConfigDict(extra='ignore')  # (1)

    name: str

user = User(name='John Doe', age=20)  # (2)
print(user)
#> name='John Doe'

1. This is the default behaviour.
2. The age argument is ignored.
Instead, with extra='allow', the age argument is included:
from pydantic import BaseModel, ConfigDict

class User(BaseModel):
    model_config = ConfigDict(extra='allow')

    name: str

user = User(name='John Doe', age=20)  # (1)
print(user)
#> name='John Doe' age=20

1. The age argument is included.
With extra='forbid', an error is raised:
from pydantic import BaseModel, ConfigDict, ValidationError

class User(BaseModel):
    model_config = ConfigDict(extra='forbid')

    name: str

try:
    User(name='John Doe', age=20)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    age
      Extra inputs are not permitted [type=extra_forbidden, input_value=20, input_type=int]
    '''
Type: ExtraValues | None
Whether models are faux-immutable, i.e. whether __setattr__ is allowed, and also generates
a __hash__() method for the model. This makes instances of the model potentially hashable if all the
attributes are hashable. Defaults to False.
Type: bool
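A brief sketch of the frozen setting (the Point model is illustrative); hashing works here because both fields are hashable, and assignment raises a ValidationError:

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Point(BaseModel):
    model_config = ConfigDict(frozen=True)

    x: int
    y: int

p = Point(x=1, y=2)
# Instances are hashable because all the fields are hashable
print(hash(p) == hash(Point(x=1, y=2)))
#> True

try:
    p.x = 3  # assignment is rejected on a frozen model
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> frozen_instance
```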
Whether an aliased field may be populated by its name as given by the model
attribute, as well as the alias. Defaults to False.
from pydantic import BaseModel, ConfigDict, Field

class User(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    name: str = Field(alias='full_name')  # (1)
    age: int

user = User(full_name='John Doe', age=20)  # (2)
print(user)
#> name='John Doe' age=20

user = User(name='John Doe', age=20)  # (3)
print(user)
#> name='John Doe' age=20

1. The field 'name' has an alias 'full_name'.
2. The model is populated by the alias 'full_name'.
3. The model is populated by the field name 'name'.
Type: bool
Whether to populate models with the value property of enums, rather than the raw enum.
This may be useful if you want to serialize model.model_dump() later. Defaults to False.
from enum import Enum
from typing import Optional

from pydantic import BaseModel, ConfigDict, Field

class SomeEnum(Enum):
    FOO = 'foo'
    BAR = 'bar'
    BAZ = 'baz'

class SomeModel(BaseModel):
    model_config = ConfigDict(use_enum_values=True)

    some_enum: SomeEnum
    another_enum: Optional[SomeEnum] = Field(
        default=SomeEnum.FOO, validate_default=True
    )

model1 = SomeModel(some_enum=SomeEnum.BAR)
print(model1.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'foo'}

model2 = SomeModel(some_enum=SomeEnum.BAR, another_enum=SomeEnum.BAZ)
print(model2.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'baz'}
Type: bool
Whether to validate the data when the model is changed. Defaults to False.
The default behavior of Pydantic is to validate the data when the model is created.
In case the user changes the data after the model is created, the model is not revalidated.
from pydantic import BaseModel

class User(BaseModel):
    name: str

user = User(name='John Doe')  # (1)
print(user)
#> name='John Doe'

user.name = 123  # (2)
print(user)
#> name=123

1. The validation happens only when the model is created.
2. The validation does not happen when the data is changed.
In case you want to revalidate the model when the data is changed, you can use validate_assignment=True:
from pydantic import BaseModel, ValidationError

class User(BaseModel, validate_assignment=True):  # (1)
    name: str

user = User(name='John Doe')  # (2)
print(user)
#> name='John Doe'

try:
    user.name = 123  # (3)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    name
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''

1. You can either use class keyword arguments, or model_config to set validate_assignment=True.
2. The validation happens when the model is created.
3. The validation also happens when the data is changed.
Type: bool
Whether arbitrary types are allowed for field types. Defaults to False.
from pydantic import BaseModel, ConfigDict, ValidationError

# This is not a pydantic model, it's an arbitrary class
class Pet:
    def __init__(self, name: str):
        self.name = name

class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    pet: Pet
    owner: str

pet = Pet(name='Hedwig')
# A simple check of instance type is used to validate the data
model = Model(owner='Harry', pet=pet)
print(model)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model.pet.name)
#> Hedwig
print(type(model.pet))
#> <class '__main__.Pet'>

try:
    # If the value is not an instance of the type, it's invalid
    Model(owner='Harry', pet='Hedwig')
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    pet
      Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str]
    '''

# Nothing in the instance of the arbitrary type is checked
# Here name probably should have been a str, but it's not validated
pet2 = Pet(name=42)
model2 = Model(owner='Harry', pet=pet2)
print(model2)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model2.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model2.pet.name)
#> 42
print(type(model2.pet))
#> <class '__main__.Pet'>
Type: bool
Whether to build models and look up discriminators of tagged unions using python object attributes.
Type: bool
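A brief sketch of validating from object attributes via this setting (the ConfigDict key is from_attributes; the PersonRow class is an illustrative stand-in for e.g. an ORM row):

```python
from pydantic import BaseModel, ConfigDict

class PersonRow:
    # A plain class standing in for e.g. an ORM row object
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

class Person(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    name: str
    age: int

person = Person.model_validate(PersonRow(name='Ada', age=36))
print(person)
#> name='Ada' age=36
```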
Whether to use the actual key provided in the data (e.g. alias) for error locs rather than the field’s name. Defaults to True.
Type: bool
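A short sketch of disabling this behaviour so error locations use the field name rather than the alias (the ConfigDict key is loc_by_alias; the User model is illustrative):

```python
from pydantic import BaseModel, ConfigDict, Field, ValidationError

class User(BaseModel):
    # Report error locations by field name instead of the provided alias
    model_config = ConfigDict(loc_by_alias=False)

    name: str = Field(alias='full_name')

try:
    User(full_name=123)
except ValidationError as e:
    print(e.errors()[0]['loc'])
    #> ('name',)
```

With the default loc_by_alias=True, the location would instead be ('full_name',).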
A callable that takes a field name and returns an alias for it
or an instance of AliasGenerator. Defaults to None.
When using a callable, the alias generator is used for both validation and serialization.
If you want to use different alias generators for validation and serialization, you can use
AliasGenerator instead.
If data source field names do not match your code style (e.g. CamelCase fields),
you can automatically generate aliases using alias_generator. Here’s an example with
a basic callable:
from pydantic import BaseModel, ConfigDict
from pydantic.alias_generators import to_pascal

class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_pascal)

    name: str
    language_code: str

voice = Voice(Name='Filiz', LanguageCode='tr-TR')
print(voice.language_code)
#> tr-TR
print(voice.model_dump(by_alias=True))
#> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'}
If you want to use different alias generators for validation and serialization, you can use
AliasGenerator.
from pydantic import AliasGenerator, BaseModel, ConfigDict
from pydantic.alias_generators import to_camel, to_pascal

class Athlete(BaseModel):
    first_name: str
    last_name: str
    sport: str

    model_config = ConfigDict(
        alias_generator=AliasGenerator(
            validation_alias=to_camel,
            serialization_alias=to_pascal,
        )
    )

athlete = Athlete(firstName='John', lastName='Doe', sport='track')
print(athlete.model_dump(by_alias=True))
#> {'FirstName': 'John', 'LastName': 'Doe', 'Sport': 'track'}
Type: Callable[[str], str] | AliasGenerator | None
A tuple of types that may occur as values of class attributes without annotations. This is
typically used for custom descriptors (classes that behave like property). If an attribute is set on a
class without an annotation and has a type that is not in this tuple (or otherwise recognized by
pydantic), an error will be raised. Defaults to ().
Type: tuple[type, ...]
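A minimal sketch of the ignored-types mechanism (the ConfigDict key is ignored_types; the Metadata helper class is illustrative). Without the config entry, the unannotated class attribute of an unrecognized type would raise an error at class definition time:

```python
from pydantic import BaseModel, ConfigDict

class Metadata:
    # An arbitrary helper class, not something pydantic recognises
    def __init__(self, version: str):
        self.version = version

class Model(BaseModel):
    model_config = ConfigDict(ignored_types=(Metadata,))

    # Unannotated class attribute of an ignored type: left as a plain
    # class attribute rather than treated as a field
    meta = Metadata('1.0')

    name: str

print(Model(name='x').meta.version)
#> 1.0
```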
Whether to allow infinity (+inf and -inf) and NaN values for float and decimal fields. Defaults to True.
Type: bool
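A brief sketch of disabling non-finite values (the ConfigDict key is allow_inf_nan; the Measurement model is illustrative):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Measurement(BaseModel):
    model_config = ConfigDict(allow_inf_nan=False)

    value: float

try:
    Measurement(value=float('nan'))
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> finite_number
```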
A dict or callable to provide extra JSON schema properties. Defaults to None.
Type: JsonDict | JsonSchemaExtraCallable | None
A dict of custom JSON encoders for specific types. Defaults to None.
Type: dict[type[object], JsonEncoder] | None
(new in V2) If True, strict validation is applied to all fields on the model.
By default, Pydantic attempts to coerce values to the correct type, when possible.
There are situations in which you may want to disable this behavior, and instead raise an error if a value’s type does not match the field’s type annotation.
To configure strict mode for all fields on a model, you can set strict=True on the model.
from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(strict=True)

    name: str
    age: int
See Strict Mode for more details.
See the Conversion Table for more details on how Pydantic converts data in both strict and lax modes.
Type: bool
When and how to revalidate models and dataclasses during validation. Accepts the string
values of 'never', 'always' and 'subclass-instances'. Defaults to 'never'.
'never' - will not revalidate models and dataclasses during validation
'always' - will revalidate models and dataclasses during validation
'subclass-instances' - will revalidate models and dataclasses during validation if the instance is a subclass of the model or dataclass
By default, model and dataclass instances are not revalidated during validation.
from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='never'):  # (1)
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]  # (2)
t = Transaction(user=my_user)  # (3)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)
#> user=SubUser(hobbies=['scuba diving'], sins=['lying'])

1. revalidate_instances is set to 'never' by default.
2. The assignment is not validated, unless you set validate_assignment to True in the model's config.
3. Since revalidate_instances is set to 'never', this is not revalidated.
If you want to revalidate instances during validation, you can set revalidate_instances to 'always'
in the model’s config.
from typing import List

from pydantic import BaseModel, ValidationError

class User(BaseModel, revalidate_instances='always'):  # (1)
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
try:
    t = Transaction(user=my_user)  # (2)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Transaction
    user.hobbies.0
      Input should be a valid string [type=string_type, input_value=1, input_type=int]
    '''

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)
#> user=User(hobbies=['scuba diving'])

1. revalidate_instances is set to 'always'.
2. The model is revalidated, since revalidate_instances is set to 'always'.
3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).
It’s also possible to set revalidate_instances to 'subclass-instances' to only revalidate instances
of subclasses of the model.
from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='subclass-instances'):  # (1)
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
t = Transaction(user=my_user)  # (2)
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)
#> user=User(hobbies=['scuba diving'])

1. revalidate_instances is set to 'subclass-instances'.
2. This is not revalidated, since my_user is not a subclass of User.
3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).
Type: Literal['always', 'never', 'subclass-instances']
The format of JSON serialized timedeltas. Accepts the string values of 'iso8601' and
'float'. Defaults to 'iso8601'.
'iso8601' - will serialize timedeltas to ISO 8601 durations.
'float' - will serialize timedeltas to the total number of seconds.
Type: Literal['iso8601', 'float']
The encoding of JSON serialized bytes. Defaults to 'utf8'.
Set equal to val_json_bytes to get back an equal value after serialization round trip.
'utf8' - will serialize bytes to UTF-8 strings.
'base64' - will serialize bytes to URL safe base64 strings.
'hex' - will serialize bytes to hexadecimal strings.
Type: Literal['utf8', 'base64', 'hex']
The encoding of JSON serialized bytes to decode. Defaults to 'utf8'.
Set equal to ser_json_bytes to get back an equal value after serialization round trip.
'utf8' - will deserialize UTF-8 strings to bytes.
'base64' - will deserialize URL safe base64 strings to bytes.
'hex' - will deserialize hexadecimal strings to bytes.
Type: Literal['utf8', 'base64', 'hex']
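A brief sketch of pairing the two bytes settings so that serializing and re-validating round-trips (the ConfigDict keys are ser_json_bytes and val_json_bytes; the Blob model is illustrative):

```python
from pydantic import BaseModel, ConfigDict

class Blob(BaseModel):
    # Use base64 on both sides so serialization round-trips cleanly
    model_config = ConfigDict(ser_json_bytes='base64', val_json_bytes='base64')

    data: bytes

blob = Blob(data=b'hello world')
restored = Blob.model_validate_json(blob.model_dump_json())
print(restored.data == blob.data)
#> True
```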
The encoding of JSON serialized infinity and NaN float values. Defaults to 'null'.
'null' - will serialize infinity and NaN values as null.
'constants' - will serialize infinity and NaN values as Infinity and NaN.
'strings' - will serialize infinity as string "Infinity" and NaN as string "NaN".
Type: Literal['null', 'constants', 'strings']
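A minimal sketch of the 'constants' mode (the ConfigDict key is ser_json_inf_nan; the Model name is illustrative); note the Infinity token is not strictly valid JSON:

```python
from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    # Write the bare Infinity/NaN tokens rather than null
    model_config = ConfigDict(ser_json_inf_nan='constants')

    value: float

print(Model(value=float('inf')).model_dump_json())
#> {"value":Infinity}
```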
Whether to validate default values during validation. Defaults to False.
Type: bool
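A short sketch of the validate_default setting (the Settings model is illustrative):

```python
from pydantic import BaseModel, ConfigDict

class Settings(BaseModel):
    model_config = ConfigDict(validate_default=True)

    # Without validate_default, this str default would be stored as-is;
    # with it, the default is coerced to int like any other input
    port: int = '8080'

print(Settings().port)
#> 8080
```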
Whether to validate the return value from call validators. Defaults to False.
Type: bool
A tuple of strings and/or patterns that prevent models from having fields with names that conflict with them.
For strings, we match on a prefix basis. For example, if 'dog' is in the protected namespace, 'dog_name' will be protected.
For patterns, we match on the entire field name. For example, if re.compile(r'^dog$') is in the protected namespace, 'dog' will be protected, but 'dog_name' will not be.
Defaults to ('model_validate', 'model_dump',).
The reason we’ve selected these is to prevent collisions with other validation / dumping formats
in the future, e.g. model_validate_{some_newly_supported_format}.
Before v2.10, Pydantic used ('model_',) as the default value for this setting to
prevent collisions between model attributes and BaseModel’s own methods. This was changed
in v2.10 given feedback that this restriction was limiting in AI and data science contexts,
where it is common to have fields with names like model_id, model_input, model_output, etc.
For more details, see https://github.com/pydantic/pydantic/issues/10315.
import warnings

from pydantic import BaseModel

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_dump_something: str

except UserWarning as e:
    print(e)
    '''
    Field "model_dump_something" in Model has conflict with protected namespace "model_dump".
    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('model_validate',)`.
    '''
You can customize this behavior using the protected_namespaces setting:
import re
import warnings

from pydantic import BaseModel, ConfigDict

with warnings.catch_warnings(record=True) as caught_warnings:
    warnings.simplefilter('always')  # Catch all warnings

    class Model(BaseModel):
        safe_field: str
        also_protect_field: str
        protect_this: str

        model_config = ConfigDict(
            protected_namespaces=(
                'protect_me_',
                'also_protect_',
                re.compile('^protect_this$'),
            )
        )

for warning in caught_warnings:
    print(f'{warning.message}')
    '''
    Field "also_protect_field" in Model has conflict with protected namespace "also_protect_".
    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', re.compile('^protect_this$'))`.
    Field "protect_this" in Model has conflict with protected namespace "re.compile('^protect_this$')".
    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', 'also_protect_')`.
    '''
While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error is raised if there is an actual collision with an existing attribute:
from pydantic import BaseModel, ConfigDict

try:

    class Model(BaseModel):
        model_validate: str

        model_config = ConfigDict(protected_namespaces=('model_',))

except NameError as e:
    print(e)
    '''
    Field "model_validate" conflicts with member <bound method BaseModel.model_validate of <class 'pydantic.main.BaseModel'>> of protected namespace "model_".
    '''
Type: tuple[str | Pattern[str], ...]
Whether to hide inputs when printing errors. Defaults to False.
Pydantic shows the input value and type when it raises ValidationError during the validation.
from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    a: str

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''
You can hide the input value and type by setting the hide_input_in_errors config to True.
from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    a: str

    model_config = ConfigDict(hide_input_in_errors=True)

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type]
    '''
Type: bool
Whether to defer model validator and serializer construction until the first model validation. Defaults to False.
This can be useful to avoid the overhead of building models which are only
used nested within other models, or when you want to manually define type namespace via
Model.model_rebuild(_types_namespace=...).
Since v2.10, this setting also applies to pydantic dataclasses and TypeAdapter instances.
Type: bool
A dict of settings for plugins. Defaults to None.
Type: dict[str, object] | None
Type: type[_GenerateSchema] | None
Whether fields with default values should be marked as required in the serialization schema. Defaults to False.
This ensures that the serialization schema will reflect the fact a field with a default will always be present when serializing the model, even though it is not required for validation.
However, there are scenarios where this may be undesirable — in particular, if you want to share the schema between validation and serialization, and don’t mind fields with defaults being marked as not required during serialization. See #7209 for more details.
from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    a: str = 'a'

    model_config = ConfigDict(json_schema_serialization_defaults_required=True)

print(Model.model_json_schema(mode='validation'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'title': 'Model',
    'type': 'object',
}
'''
print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''
Type: bool
If not None, the specified mode will be used to generate the JSON schema regardless of what mode was passed to
the function call. Defaults to None.
This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema.
It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation
and serialization that must both be referenced from the same schema; when this happens, we automatically append
-Input to the definition reference for the validation schema and -Output to the definition reference for the
serialization schema. By specifying a json_schema_mode_override though, this prevents the conflict between
the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes
from being added to the definition references.
from pydantic import BaseModel, ConfigDict, Json

class Model(BaseModel):
    a: Json[int]  # requires a string to validate, but will dump an int

print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'title': 'A', 'type': 'integer'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

class ForceInputModel(Model):
    # the following ensures that even with mode='serialization', we
    # will get the schema that would be generated for validation.
    model_config = ConfigDict(json_schema_mode_override='validation')

print(ForceInputModel.model_json_schema(mode='serialization'))
'''
{
    'properties': {
        'a': {
            'contentMediaType': 'application/json',
            'contentSchema': {'type': 'integer'},
            'title': 'A',
            'type': 'string',
        }
    },
    'required': ['a'],
    'title': 'ForceInputModel',
    'type': 'object',
}
'''
Type: Literal['validation', 'serialization', None]
If True, enables automatic coercion of any Number type to str in “lax” (non-strict) mode. Defaults to False.
Pydantic doesn’t allow number types (int, float, Decimal) to be coerced as type str by default.
from decimal import Decimal

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    value: str

try:
    print(Model(value=42))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid string [type=string_type, input_value=42, input_type=int]
    '''

class Model(BaseModel):
    model_config = ConfigDict(coerce_numbers_to_str=True)

    value: str

repr(Model(value=42).value)
#> "42"
repr(Model(value=42.13).value)
#> "42.13"
repr(Model(value=Decimal('42.13')).value)
#> "42.13"
Type: bool
The regex engine to be used for pattern validation.
Defaults to 'rust-regex'.
'rust-regex' - uses the regex Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features.
'python-re' - uses the re module, which supports all regex features, but may be slower.
from pydantic import BaseModel, ConfigDict, Field, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(regex_engine='python-re')

    value: str = Field(pattern=r'^abc(?=def)')

print(Model(value='abcdef').value)
#> abcdef

try:
    print(Model(value='abxyzcdef'))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      String should match pattern '^abc(?=def)' [type=string_pattern_mismatch, input_value='abxyzcdef', input_type=str]
    '''
Type: Literal['rust-regex', 'python-re']
If True, Python exceptions that were part of a validation failure will be shown as an exception group as a cause. Can be useful for debugging. Defaults to False.
Type: bool
Whether docstrings of attributes (bare string literals immediately following the attribute declaration)
should be used for field descriptions. Defaults to False.
Available in Pydantic v2.7+.
from pydantic import BaseModel, ConfigDict, Field

class Model(BaseModel):
    model_config = ConfigDict(use_attribute_docstrings=True)

    x: str
    """
    Example of an attribute docstring
    """

    y: int = Field(description="Description in Field")
    """
    Description in Field overrides attribute docstring
    """

print(Model.model_fields["x"].description)
#> Example of an attribute docstring
print(Model.model_fields["y"].description)
#> Description in Field
This requires the source code of the class to be available at runtime.
Type: bool
Whether to cache strings to avoid constructing new Python objects. Defaults to True.
Enabling this setting should significantly improve validation performance while increasing memory usage slightly.
True or 'all' (the default): cache all strings
'keys': cache only dictionary keys
False or 'none': no caching
Type: bool | Literal['all', 'keys', 'none']
def with_config(config: ConfigDict) -> Callable[[_TypeT], _TypeT]
A convenience decorator to set a Pydantic configuration on a TypedDict or a dataclass from the standard library.
Although the configuration can be set using the __pydantic_config__ attribute, it does not play well with type checkers,
especially with TypedDict.
Returns: Callable[[_TypeT], _TypeT]
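A brief sketch of with_config applied to a TypedDict and validated through a TypeAdapter (the Model name is illustrative):

```python
from typing_extensions import TypedDict

from pydantic import ConfigDict, TypeAdapter, with_config

@with_config(ConfigDict(str_to_lower=True))
class Model(TypedDict):
    x: str

ta = TypeAdapter(Model)
print(ta.validate_python({'x': 'ABC'}))
#> {'x': 'abc'}
```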
ExtraValues = Literal['allow', 'ignore', 'forbid']
Alias generators for converting between different capitalization conventions.
def to_pascal(snake: str) -> str
Convert a snake_case string to PascalCase.
Parameters: snake (str) — the string to convert.
Returns: str — the PascalCase string.
def to_camel(snake: str) -> str
Convert a snake_case string to camelCase.
Parameters: snake (str) — the string to convert.
Returns: str — the converted camelCase string.
def to_snake(camel: str) -> str
Convert a PascalCase, camelCase, or kebab-case string to snake_case.
Parameters: camel (str) — the string to convert.
Returns: str — the converted string in snake_case.
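A quick sketch of the three generators on simple inputs:

```python
from pydantic.alias_generators import to_camel, to_pascal, to_snake

print(to_pascal('snake_case_string'))
#> SnakeCaseString
print(to_camel('snake_case_string'))
#> snakeCaseString
print(to_snake('PascalCaseString'))
#> pascal_case_string
```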