Network Types
The networks module contains types for common network-related fields.
Bases: PydanticErrorMixin, TypeError
An error raised due to incorrect use of Pydantic.
Handler to call into the next CoreSchema schema generation function.
Get the name of the closest field to this validator.
Type: str | None
def __call__(source_type: Any) -> core_schema.CoreSchema
Call the inner handler and get the CoreSchema it returns.
This will call the next CoreSchema modifying function up until it calls
into Pydantic’s internal schema generation machinery, which will raise a
pydantic.errors.PydanticSchemaGenerationError error if it cannot generate
a CoreSchema for the given source type.
core_schema.CoreSchema — The pydantic-core CoreSchema generated.
The input type.
def generate_schema(source_type: Any) -> core_schema.CoreSchema
Generate a schema unrelated to the current context.
Use this function if e.g. you are handling schema generation for a sequence
and want to generate a schema for its items.
Otherwise, you may end up doing something like applying a min_length constraint
that was intended for the sequence itself to its items!
core_schema.CoreSchema — The pydantic-core CoreSchema generated.
The input type.
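As a minimal sketch of why this matters, a hypothetical custom type (MyIntList, invented here for illustration) can call handler.generate_schema so that metadata aimed at the container is not applied to its items:

```python
from typing import Any

from pydantic import GetCoreSchemaHandler, TypeAdapter
from pydantic_core import core_schema


class MyIntList:
    """Hypothetical marker type validated as a plain list of ints."""

    @classmethod
    def __get_pydantic_core_schema__(
        cls, source_type: Any, handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        # generate_schema builds the item schema in a fresh context, so
        # constraints aimed at MyIntList itself are not applied to the items
        items_schema = handler.generate_schema(int)
        return core_schema.list_schema(items_schema)


ta = TypeAdapter(MyIntList)
print(ta.validate_python(['1', 2]))
#> [1, 2]
```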
def resolve_ref_schema(
maybe_ref_schema: core_schema.CoreSchema,
) -> core_schema.CoreSchema
Get the real schema for a definition-ref schema.
If the schema given is not a definition-ref schema, it will be returned as is.
This means you don’t have to check before calling this function.
core_schema.CoreSchema — A concrete CoreSchema.
A CoreSchema, ref-based or not.
LookupError — If the ref is not found.
Bases: Generic[T]
Type adapters provide a flexible way to perform validation and serialization based on a Python type.
A TypeAdapter instance exposes some of the functionality from BaseModel instance methods
for types that do not have such methods (such as dataclasses, primitive types, and more).
Note: TypeAdapter instances are not types, and cannot be used as type annotations for fields.
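As a quick sketch (the list[int] adapter is an arbitrary choice), wrapping a plain Python type in a TypeAdapter gives it validation and serialization methods:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])

# validation coerces compatible input to the adapted type
print(ta.validate_python(('1', 2)))
#> [1, 2]

# serialization methods are available as well
print(ta.dump_json([1, 2]))
#> b'[1,2]'
```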
The type associated with the TypeAdapter.
Configuration for the TypeAdapter, should be a dictionary conforming to
ConfigDict.
Depth at which to search for the parent frame. This frame is used when
resolving forward annotations during schema building, by looking for the globals and locals of this
frame. Defaults to 2, which will result in the frame where the TypeAdapter was instantiated.
The module passed to plugins, if provided.
Compatibility with mypy
Depending on the type used, mypy might raise an error when instantiating a TypeAdapter. As a workaround, you can explicitly
annotate your variable:
```python
from typing import Union

from pydantic import TypeAdapter

ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int])  # type: ignore[arg-type]
```
Namespace management nuances and implementation details
Here, we collect some notes on namespace management, and subtle differences from BaseModel:
BaseModel uses its own __module__ to find out where it was defined
and then looks for symbols to resolve forward references in those globals.
On the other hand, TypeAdapter can be initialized with arbitrary objects,
which may not be types and thus do not have a __module__ available.
So instead we look at the globals in our parent stack frame.
It is expected that the ns_resolver passed to this function will have the correct
namespace for the type we’re adapting. See the source code for TypeAdapter.__init__
and TypeAdapter.rebuild for various ways to construct this namespace.
This works for the case where this function is called in a module that has the target of forward references in its scope, but does not always work for more complex cases.
For example, take the following:
```python
# a.py
IntList = list[int]
OuterDict = dict[str, 'IntList']
```

```python
# b.py
from a import OuterDict

from pydantic import TypeAdapter

IntList = int  # replaces the symbol the forward reference is looking for

v = TypeAdapter(OuterDict)
v.validate_python({'x': 1})  # should fail but doesn't
```
If OuterDict were a BaseModel, this would work because it would resolve
the forward reference within the a.py namespace.
But TypeAdapter(OuterDict) can’t determine what module OuterDict came from.
In other words, the assumption that all forward references exist in the
module we are being called from is not technically always true.
Although most of the time it is and it works fine for recursive models and such,
BaseModel’s behavior isn’t perfect either and can break in similar ways,
so there is no right or wrong between the two.
But at the very least this behavior is subtly different from BaseModel’s.
Type: CoreSchema
Type: SchemaValidator | PluggableSchemaValidator
Type: SchemaSerializer
Type: bool Default: False
def rebuild(
force: bool = False,
raise_errors: bool = True,
_parent_namespace_depth: int = 2,
_types_namespace: _namespace_utils.MappingNamespace | None = None,
) -> bool | None
Try to rebuild the pydantic-core schema for the adapter’s type.
This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.
bool | None — Returns None if the schema is already “complete” and rebuilding was not required.
bool | None — If rebuilding was required, returns True if rebuilding was successful, otherwise False.
Whether to force the rebuilding of the type adapter’s schema, defaults to False.
Whether to raise errors, defaults to True.
Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called.
An explicit types namespace to use, instead of using the local namespace
from the parent frame. Defaults to None.
def validate_python(
object: Any,
strict: bool | None = None,
from_attributes: bool | None = None,
context: dict[str, Any] | None = None,
experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
by_alias: bool | None = None,
by_name: bool | None = None,
) -> T
Validate a Python object against the model.
T — The validated object.
The Python object to validate against the model.
Whether to strictly check types.
Whether to extract data from object attributes.
Additional context to pass to the validator.
Experimental: whether to enable partial validation, e.g. to process streams.
- False / ‘off’: Default behavior, no partial validation.
- True / ‘on’: Enable partial validation.
- ‘trailing-strings’: Enable partial validation and allow trailing strings in the input.
Whether to use the field’s alias when validating against the provided input data.
Whether to use the field’s name when validating against the provided input data.
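An illustrative sketch of the strict flag (the list[int] adapter is an arbitrary choice):

```python
from pydantic import TypeAdapter, ValidationError

ta = TypeAdapter(list[int])

# lax (default) validation coerces compatible values
print(ta.validate_python(['1', 2]))
#> [1, 2]

# strict=True disables coercion
try:
    ta.validate_python(['1', 2], strict=True)
except ValidationError as e:
    print(e.error_count())
    #> 1
```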
def validate_json(
data: str | bytes | bytearray,
strict: bool | None = None,
context: dict[str, Any] | None = None,
experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
by_alias: bool | None = None,
by_name: bool | None = None,
) -> T
Validate a JSON string or bytes against the model.
T — The validated object.
The JSON data to validate against the model.
Whether to strictly check types.
Additional context to use during validation.
Experimental: whether to enable partial validation, e.g. to process streams.
- False / ‘off’: Default behavior, no partial validation.
- True / ‘on’: Enable partial validation.
- ‘trailing-strings’: Enable partial validation and allow trailing strings in the input.
Whether to use the field’s alias when validating against the provided input data.
Whether to use the field’s name when validating against the provided input data.
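A sketch showing lax versus strict JSON validation (the dict[str, int] adapter and error-type check are illustrative):

```python
from pydantic import TypeAdapter, ValidationError

ta = TypeAdapter(dict[str, int])

# lax (default) mode coerces numeric strings
print(ta.validate_json('{"a": "1"}'))
#> {'a': 1}

# strict mode rejects the coercion
try:
    ta.validate_json('{"a": "1"}', strict=True)
except ValidationError as e:
    print(e.errors()[0]['type'])
```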
def validate_strings(
obj: Any,
strict: bool | None = None,
context: dict[str, Any] | None = None,
experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
by_alias: bool | None = None,
by_name: bool | None = None,
) -> T
Validate an object containing string data against the model.
T — The validated object.
The object containing string data to validate.
Whether to strictly check types.
Additional context to use during validation.
Experimental: whether to enable partial validation, e.g. to process streams.
- False / ‘off’: Default behavior, no partial validation.
- True / ‘on’: Enable partial validation.
- ‘trailing-strings’: Enable partial validation and allow trailing strings in the input.
Whether to use the field’s alias when validating against the provided input data.
Whether to use the field’s name when validating against the provided input data.
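A sketch of string-data validation (the dict[str, date] adapter is an arbitrary choice); leaf values arrive as strings and are validated as if they came from JSON:

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, date])

print(ta.validate_strings({'launch': '2024-01-01'}))
#> {'launch': datetime.date(2024, 1, 1)}
```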
def get_default_value(
strict: bool | None = None,
context: dict[str, Any] | None = None,
) -> Some[T] | None
Get the default value for the wrapped type.
Some[T] | None — The default value wrapped in a Some if there is one or None if not.
Whether to strictly check types.
Additional context to pass to the validator.
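A sketch under the assumption that a default is attached via Field metadata (WithDefault is a hypothetical alias for illustration):

```python
from typing import Annotated

from pydantic import Field, TypeAdapter

# hypothetical alias carrying a default via Field metadata
WithDefault = Annotated[int, Field(default=42)]

some = TypeAdapter(WithDefault).get_default_value()
assert some is not None and some.value == 42

# a type with no default returns None rather than a Some instance
assert TypeAdapter(int).get_default_value() is None
```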
def dump_python(
instance: T,
mode: Literal['json', 'python'] = 'python',
include: IncEx | None = None,
exclude: IncEx | None = None,
by_alias: bool | None = None,
exclude_unset: bool = False,
exclude_defaults: bool = False,
exclude_none: bool = False,
round_trip: bool = False,
warnings: bool | Literal['none', 'warn', 'error'] = True,
fallback: Callable[[Any], Any] | None = None,
serialize_as_any: bool = False,
context: dict[str, Any] | None = None,
) -> Any
Dump an instance of the adapted type to a Python object.
Any — The serialized object.
The Python object to serialize.
The output format.
Fields to include in the output.
Fields to exclude from the output.
Whether to use alias names for field names.
Whether to exclude unset fields.
Whether to exclude fields with default values.
Whether to exclude fields with None values.
Whether to output the serialized data in a way that is compatible with deserialization.
How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors,
“error” raises a PydanticSerializationError.
A function to call when an unknown value is encountered. If not provided,
a PydanticSerializationError error is raised.
Whether to serialize fields with duck-typing serialization behavior.
Additional context to pass to the serializer.
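A sketch contrasting the two output modes (the dict[str, date] adapter is an arbitrary choice):

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, date])
data = {'start': date(2024, 1, 1)}

# 'python' mode keeps rich Python objects
print(ta.dump_python(data))
#> {'start': datetime.date(2024, 1, 1)}

# 'json' mode produces JSON-compatible values
print(ta.dump_python(data, mode='json'))
#> {'start': '2024-01-01'}
```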
def dump_json(
instance: T,
indent: int | None = None,
include: IncEx | None = None,
exclude: IncEx | None = None,
by_alias: bool | None = None,
exclude_unset: bool = False,
exclude_defaults: bool = False,
exclude_none: bool = False,
round_trip: bool = False,
warnings: bool | Literal['none', 'warn', 'error'] = True,
fallback: Callable[[Any], Any] | None = None,
serialize_as_any: bool = False,
context: dict[str, Any] | None = None,
) -> bytes
Serialize an instance of the adapted type to JSON.
bytes — The JSON representation of the given instance as bytes.
The instance to be serialized.
Number of spaces for JSON indentation.
Fields to include.
Fields to exclude.
Whether to use alias names for field names.
Whether to exclude unset fields.
Whether to exclude fields with default values.
Whether to exclude fields with a value of None.
Whether to serialize and deserialize the instance to ensure round-tripping.
How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors,
“error” raises a PydanticSerializationError.
A function to call when an unknown value is encountered. If not provided,
a PydanticSerializationError error is raised.
Whether to serialize fields with duck-typing serialization behavior.
Additional context to pass to the serializer.
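A minimal sketch (the list[int] adapter is an arbitrary choice); note the output is bytes, and compact by default:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])

print(ta.dump_json([1, 2, 3]))
#> b'[1,2,3]'

# indent produces pretty-printed JSON bytes
print(ta.dump_json([1, 2, 3], indent=2).decode())
```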
def json_schema(
by_alias: bool = True,
ref_template: str = DEFAULT_REF_TEMPLATE,
schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]
Generate a JSON schema for the adapted type.
dict[str, Any] — The JSON schema for the model as a dictionary.
Whether to use alias names for field names.
The format string used for generating $ref strings.
The generator class used for creating the schema.
The mode to use for schema generation.
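A minimal sketch (the list[int] adapter is an arbitrary choice):

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])
assert ta.json_schema() == {'type': 'array', 'items': {'type': 'integer'}}
```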
@staticmethod
def json_schemas(
inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
by_alias: bool = True,
title: str | None = None,
description: str | None = None,
ref_template: str = DEFAULT_REF_TEMPLATE,
schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]
Generate a JSON schema including definitions from multiple type adapters.
tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue] — A tuple where:
- The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
- The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.
Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema.
Whether to use alias names.
The title for the schema.
The description for the schema.
The format string used for generating $ref strings.
The generator class used for creating the schema.
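A sketch combining several adapters into one schema document (the keys, types, and 'Example API' title are arbitrary choices for illustration):

```python
from pydantic import TypeAdapter

# each input is (key, mode, adapter); (key, mode) pairs key the output mapping
inputs = [
    ('numbers', 'validation', TypeAdapter(list[int])),
    ('numbers', 'serialization', TypeAdapter(list[int])),
    ('flags', 'validation', TypeAdapter(dict[str, bool])),
]
schemas, definitions = TypeAdapter.json_schemas(inputs, title='Example API')

assert ('numbers', 'validation') in schemas
assert definitions['title'] == 'Example API'
```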
Url constraints.
Type: int | None Default: None
Type: list[str] | None Default: None
Type: bool | None Default: None
Type: str | None Default: None
Type: int | None Default: None
Type: str | None Default: None
Fetch a key / value mapping of constraints to values that are not None. Used for core schema updates.
Type: dict[str, Any]
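A sketch of applying these constraints through Annotated metadata (HttpsUrl is a hypothetical alias invented for illustration):

```python
from typing import Annotated

from pydantic import AnyUrl, TypeAdapter, ValidationError
from pydantic.networks import UrlConstraints

# hypothetical alias: restrict AnyUrl to the https scheme only
HttpsUrl = Annotated[AnyUrl, UrlConstraints(allowed_schemes=['https'])]

ta = TypeAdapter(HttpsUrl)
print(ta.validate_python('https://example.com'))
#> https://example.com/

try:
    ta.validate_python('http://example.com')
except ValidationError as e:
    print(e.errors()[0]['type'])
```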
Bases: _BaseUrl
Base type for all URLs.
- Any scheme allowed
- Top-level domain (TLD) not required
- Host not required
Assuming an input URL of http://samuel:pass@example.com:8000/the/path/?query=here#fragment=is;this=bit,
the types export the following properties:
- scheme: the URL scheme (http), always set.
- host: the URL host (example.com).
- username: optional username if included (samuel).
- password: optional password if included (pass).
- port: optional port (8000).
- path: optional path (/the/path/).
- query: optional URL query (for example, GET arguments or “search string”, such as query=here).
- fragment: optional fragment (fragment=is;this=bit).
Bases: AnyUrl
A type that will accept any http or https URL.
- TLD not required
- Host not required
Bases: AnyUrl
A type that will accept any http or https URL.
- TLD not required
- Host not required
- Max length 2083
```python
from pydantic import BaseModel, HttpUrl, ValidationError

class MyModel(BaseModel):
    url: HttpUrl

m = MyModel(url='http://www.example.com')  # (1)
print(m.url)
#> http://www.example.com/

try:
    MyModel(url='ftp://invalid.url')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyModel
    url
      URL scheme should be 'http' or 'https' [type=url_scheme, input_value='ftp://invalid.url', input_type=str]
    '''

try:
    MyModel(url='not a url')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyModel
    url
      Input should be a valid URL, relative URL without a base [type=url_parsing, input_value='not a url', input_type=str]
    '''
```

1. Note: mypy would prefer m = MyModel(url=HttpUrl('http://www.example.com')), but Pydantic will convert the string to an HttpUrl instance anyway.
“International domains” (e.g. a URL where the host or TLD includes non-ascii characters) will be encoded via punycode:
```python
from pydantic import BaseModel, HttpUrl

class MyModel(BaseModel):
    url: HttpUrl

m1 = MyModel(url='http://puny£code.com')
print(m1.url)
#> http://xn--punycode-eja.com/
m2 = MyModel(url='https://www.аррӏе.com/')
print(m2.url)
#> https://www.xn--80ak6aa92e.com/
m3 = MyModel(url='https://www.example.珠宝/')
print(m3.url)
#> https://www.example.xn--pbt977c/
```
Bases: AnyUrl
A type that will accept any ws or wss URL.
- TLD not required
- Host not required
Bases: AnyUrl
A type that will accept any ws or wss URL.
- TLD not required
- Host not required
- Max length 2083
Bases: AnyUrl
A type that will accept any file URL.
- Host not required
Bases: AnyUrl
A type that will accept ftp URL.
- TLD not required
- Host not required
Bases: _BaseMultiHostUrl
A type that will accept any Postgres DSN.
- User info required
- TLD not required
- Host required
- Supports multiple hosts
If further validation is required, these properties can be used by validators to enforce specific behaviour:
```python
from pydantic import (
    BaseModel,
    HttpUrl,
    PostgresDsn,
    ValidationError,
    field_validator,
)

class MyModel(BaseModel):
    url: HttpUrl

m = MyModel(url='http://www.example.com')
# the repr() method for a url will display all properties of the url
print(repr(m.url))
#> HttpUrl('http://www.example.com/')
print(m.url.scheme)
#> http
print(m.url.host)
#> www.example.com
print(m.url.port)
#> 80

class MyDatabaseModel(BaseModel):
    db: PostgresDsn

    @field_validator('db')
    def check_db_name(cls, v):
        assert v.path and len(v.path) > 1, 'database must be provided'
        return v

m = MyDatabaseModel(db='postgres://user:pass@localhost:5432/foobar')
print(m.db)
#> postgres://user:pass@localhost:5432/foobar

try:
    MyDatabaseModel(db='postgres://user:pass@localhost:5432')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyDatabaseModel
    db
      Assertion failed, database must be provided
    assert (None)
     +  where None = PostgresDsn('postgres://user:pass@localhost:5432').path [type=assertion_error, input_value='postgres://user:pass@localhost:5432', input_type=str]
    '''
```
The required URL host.
Type: str
Bases: AnyUrl
A type that will accept any Cockroach DSN.
- User info required
- TLD not required
- Host required
The required URL host.
Type: str
Bases: AnyUrl
A type that will accept any AMQP DSN.
- User info required
- TLD not required
- Host not required
Bases: AnyUrl
A type that will accept any Redis DSN.
- User info required
- TLD not required
- Host required (e.g., rediss://:pass@localhost)
The required URL host.
Type: str
Bases: _BaseMultiHostUrl
A type that will accept any MongoDB DSN.
- User info not required
- Database name not required
- Port not required
- User info may be passed without user part (e.g., mongodb://mongodb0.example.com:27017).
Bases: AnyUrl
A type that will accept any Kafka DSN.
- User info required
- TLD not required
- Host not required
Bases: _BaseMultiHostUrl
A type that will accept any NATS DSN.
NATS is a connective technology built for the ever increasingly hyper-connected world. It is a single technology that enables applications to securely communicate across any combination of cloud vendors, on-premise, edge, web and mobile, and devices. More: https://nats.io
Bases: AnyUrl
A type that will accept any MySQL DSN.
- User info required
- TLD not required
- Host not required
Bases: AnyUrl
A type that will accept any MariaDB DSN.
- User info required
- TLD not required
- Host not required
Bases: AnyUrl
A type that will accept any ClickHouse DSN.
- User info required
- TLD not required
- Host not required
Bases: AnyUrl
A type that will accept any Snowflake DSN.
- User info required
- TLD not required
- Host required
The required URL host.
Type: str
Validate email addresses.
```python
from pydantic import BaseModel, EmailStr

class Model(BaseModel):
    email: EmailStr

print(Model(email='contact@mail.com'))
#> email='contact@mail.com'
```
Bases: Representation
Validate a name and email address combination, as specified by RFC 5322.
The NameEmail has two properties: name and email.
In case the name is not provided, it’s inferred from the email address.
```python
from pydantic import BaseModel, NameEmail

class User(BaseModel):
    email: NameEmail

user = User(email='Fred Bloggs <fred.bloggs@example.com>')
print(user.email)
#> Fred Bloggs <fred.bloggs@example.com>
print(user.email.name)
#> Fred Bloggs

user = User(email='fred.bloggs@example.com')
print(user.email)
#> fred.bloggs <fred.bloggs@example.com>
print(user.email.name)
#> fred.bloggs
```
Default: name
Default: email
Validate an IPv4 or IPv6 address.
```python
from pydantic import BaseModel
from pydantic.networks import IPvAnyAddress

class IpModel(BaseModel):
    ip: IPvAnyAddress

print(IpModel(ip='127.0.0.1'))
#> ip=IPv4Address('127.0.0.1')

try:
    IpModel(ip='http://www.example.com')
except ValueError as e:
    print(e.errors())
    '''
    [
        {
            'type': 'ip_any_address',
            'loc': ('ip',),
            'msg': 'value is not a valid IPv4 or IPv6 address',
            'input': 'http://www.example.com',
        }
    ]
    '''
```
def __new__(cls, value: Any) -> IPvAnyAddressType
Validate an IPv4 or IPv6 address.
IPvAnyAddressType
Validate an IPv4 or IPv6 interface.
def __new__(cls, value: NetworkType) -> IPvAnyInterfaceType
Validate an IPv4 or IPv6 interface.
IPvAnyInterfaceType
Validate an IPv4 or IPv6 network.
def __new__(cls, value: NetworkType) -> IPvAnyNetworkType
Validate an IPv4 or IPv6 network.
IPvAnyNetworkType
def getattr_migration(module: str) -> Callable[[str], Any]
Implement PEP 562 for objects that were either moved or removed on the migration to V2.
Callable[[str], Any] — A callable that will raise an error if the object is not found.
The module name.
def import_email_validator() -> None
None
def validate_email(value: str) -> tuple[str, str]
Email address validation using email-validator.
tuple[str, str] — A tuple containing the local part of the email (or the name for “pretty” email addresses)
and the normalized email.
PydanticCustomError— If the email is invalid.
A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary JSON values.
Default: dict[str, Any]
Type: TypeAlias Default: 'str | bytes | int | tuple[str | bytes | int, str | int]'
Type: TypeAlias Default: 'IPv4Address | IPv6Address'
Type: TypeAlias Default: 'IPv4Interface | IPv6Interface'
Type: TypeAlias Default: 'IPv4Network | IPv6Network'
Default: _build_pretty_email_regex()
Maximum length for an email. A somewhat arbitrary but very generous number compared to what is allowed by most implementations.
Default: 2048