Pydantic v2.11 documentation

Network Types

The networks module contains types for common network-related fields.

PydanticUserError

Bases: PydanticErrorMixin, TypeError

An error raised due to incorrect use of Pydantic.


GetCoreSchemaHandler

Handler to call into the next CoreSchema schema generation function.

Attributes

field_name

Get the name of the closest field to this validator.

Type: str | None

Methods

call

def __call__(source_type: Any) -> core_schema.CoreSchema

Call the inner handler and get the CoreSchema it returns. This will call the next CoreSchema modifying function up until it calls into Pydantic’s internal schema generation machinery, which will raise a pydantic.errors.PydanticSchemaGenerationError error if it cannot generate a CoreSchema for the given source type.

Returns

core_schema.CoreSchema — The pydantic-core CoreSchema generated.

Parameters

source_type : Any

The input type.
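For instance, a custom annotation can call the handler to obtain the schema of the annotated type and then wrap it. A minimal sketch (the `Lowercase` annotation is hypothetical, not part of Pydantic):

```python
from typing import Annotated, Any

from pydantic import GetCoreSchemaHandler, TypeAdapter
from pydantic_core import core_schema


class Lowercase:
    """Hypothetical annotation that lowercases validated strings."""

    def __get_pydantic_core_schema__(
        self, source_type: Any, handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        # handler(source_type) calls the next schema generation function,
        # returning the CoreSchema for the annotated type (str here).
        return core_schema.no_info_after_validator_function(
            str.lower, handler(source_type)
        )


ta = TypeAdapter(Annotated[str, Lowercase()])
print(ta.validate_python('HeLLo'))
#> hello
```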

generate_schema

def generate_schema(source_type: Any) -> core_schema.CoreSchema

Generate a schema unrelated to the current context. Use this function if e.g. you are handling schema generation for a sequence and want to generate a schema for its items. Otherwise, you may end up doing something like applying a min_length constraint that was intended for the sequence itself to its items!

Returns

core_schema.CoreSchema — The pydantic-core CoreSchema generated.

Parameters

source_type : Any

The input type.
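The caveat above can be sketched with a hypothetical `ItemsOnly` annotation for list types that builds its item schema in a fresh context:

```python
from typing import Annotated, Any, get_args

from pydantic import GetCoreSchemaHandler, TypeAdapter
from pydantic_core import core_schema


class ItemsOnly:
    """Hypothetical annotation for list types: builds the item schema
    independently of the current context via generate_schema()."""

    def __get_pydantic_core_schema__(
        self, source_type: Any, handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        (item_type,) = get_args(source_type)
        # generate_schema() is unrelated to the current context, so
        # constraints aimed at the list are not applied to its items.
        return core_schema.list_schema(handler.generate_schema(item_type))


ta = TypeAdapter(Annotated[list[int], ItemsOnly()])
print(ta.validate_python(['1', 2]))
#> [1, 2]
```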

resolve_ref_schema

def resolve_ref_schema(
    maybe_ref_schema: core_schema.CoreSchema,
) -> core_schema.CoreSchema

Get the real schema for a definition-ref schema. If the schema given is not a definition-ref schema, it will be returned as is. This means you don’t have to check before calling this function.

Returns

core_schema.CoreSchema — A concrete CoreSchema.

Parameters

maybe_ref_schema : core_schema.CoreSchema

A CoreSchema, ref-based or not.

Raises
  • LookupError — If the ref is not found.

TypeAdapter

Bases: Generic[T]

Type adapters provide a flexible way to perform validation and serialization based on a Python type.

A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more).

Note: TypeAdapter instances are not types, and cannot be used as type annotations for fields.
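A short usage sketch:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])
print(ta.validate_python(['1', 2]))
#> [1, 2]
print(ta.validate_json('[3, "4"]'))
#> [3, 4]
print(ta.dump_json([1, 2]))
#> b'[1,2]'
```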

Constructor Parameters

type : Any

The type associated with the TypeAdapter.

config : ConfigDict | None Default: None

Configuration for the TypeAdapter, should be a dictionary conforming to ConfigDict.

_parent_depth : int Default: 2

Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema building, by looking for the globals and locals of this frame. Defaults to 2, which will result in the frame where the TypeAdapter was instantiated.

module : str | None Default: None

The module passed to the plugin, if provided.

Compatibility with mypy

Depending on the type used, mypy might raise an error when instantiating a TypeAdapter. As a workaround, you can explicitly annotate your variable:

from typing import Union

from pydantic import TypeAdapter

ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int])  # type: ignore[arg-type]

Namespace management nuances and implementation details

Here, we collect some notes on namespace management, and subtle differences from BaseModel:

BaseModel uses its own __module__ to find out where it was defined and then looks for symbols to resolve forward references in those globals. On the other hand, TypeAdapter can be initialized with arbitrary objects, which may not be types and thus do not have a __module__ available. So instead we look at the globals in our parent stack frame.

It is expected that the ns_resolver passed to this function will have the correct namespace for the type we’re adapting. See the source code for TypeAdapter.__init__ and TypeAdapter.rebuild for various ways to construct this namespace.

This works for the case where this function is called in a module that has the target of forward references in its scope, but does not always work for more complex cases.

For example, take the following:

a.py
IntList = list[int]
OuterDict = dict[str, 'IntList']
b.py
from a import OuterDict

from pydantic import TypeAdapter

IntList = int  # replaces the symbol the forward reference is looking for
v = TypeAdapter(OuterDict)
v.validate_python({'x': 1})  # should fail but doesn't

If OuterDict were a BaseModel, this would work because it would resolve the forward reference within the a.py namespace. But TypeAdapter(OuterDict) can’t determine what module OuterDict came from.

In other words, the assumption that all forward references exist in the module we are being called from is not technically always true. Although most of the time it is and it works fine for recursive models and such, BaseModel’s behavior isn’t perfect either and can break in similar ways, so there is no right or wrong between the two.

But at the very least this behavior is subtly different from BaseModel’s.

Attributes

core_schema

Type: CoreSchema

validator

Type: SchemaValidator | PluggableSchemaValidator

serializer

Type: SchemaSerializer

pydantic_complete

Type: bool Default: False

Methods

rebuild

def rebuild(
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: _namespace_utils.MappingNamespace | None = None,
) -> bool | None

Try to rebuild the pydantic-core schema for the adapter’s type.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Returns

bool | None — Returns None if the schema is already “complete” and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Parameters

force : bool Default: False

Whether to force the rebuilding of the type adapter’s schema, defaults to False.

raise_errors : bool Default: True

Whether to raise errors, defaults to True.

_parent_namespace_depth : int Default: 2

Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called.

_types_namespace : _namespace_utils.MappingNamespace | None Default: None

An explicit types namespace to use, instead of using the local namespace from the parent frame. Defaults to None.

validate_python

def validate_python(
    object: Any,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T

Validate a Python object against the model.

Returns

T — The validated object.

Parameters

object : Any

The Python object to validate against the model.

strict : bool | None Default: None

Whether to strictly check types.

from_attributes : bool | None Default: None

Whether to extract data from object attributes.

context : dict[str, Any] | None Default: None

Additional context to pass to the validator.

experimental_allow_partial : bool | Literal['off', 'on', 'trailing-strings'] Default: False

Experimental: whether to enable partial validation, e.g. to process streams.

  • False / ‘off’: Default behavior, no partial validation.
  • True / ‘on’: Enable partial validation.
  • ‘trailing-strings’: Enable partial validation and allow trailing strings in the input.

by_alias : bool | None Default: None

Whether to use the field’s alias when validating against the provided input data.

by_name : bool | None Default: None

Whether to use the field’s name when validating against the provided input data.
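A short sketch contrasting lax (default) and strict validation:

```python
from pydantic import TypeAdapter, ValidationError

ta = TypeAdapter(int)
print(ta.validate_python('5'))  # lax mode coerces numeric strings
#> 5

try:
    ta.validate_python('5', strict=True)
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> int_type
```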

validate_json

def validate_json(
    data: str | bytes | bytearray,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T

Validate a JSON string or bytes against the model.

Returns

T — The validated object.

Parameters

data : str | bytes | bytearray

The JSON data to validate against the model.

strict : bool | None Default: None

Whether to strictly check types.

context : dict[str, Any] | None Default: None

Additional context to use during validation.

experimental_allow_partial : bool | Literal['off', 'on', 'trailing-strings'] Default: False

Experimental: whether to enable partial validation, e.g. to process streams.

  • False / ‘off’: Default behavior, no partial validation.
  • True / ‘on’: Enable partial validation.
  • ‘trailing-strings’: Enable partial validation and allow trailing strings in the input.

by_alias : bool | None Default: None

Whether to use the field’s alias when validating against the provided input data.

by_name : bool | None Default: None

Whether to use the field’s name when validating against the provided input data.

validate_strings

def validate_strings(
    obj: Any,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T

Validate an object containing string data against the model.

Returns

T — The validated object.

Parameters

obj : Any

The object containing string data to validate.

strict : bool | None Default: None

Whether to strictly check types.

context : dict[str, Any] | None Default: None

Additional context to use during validation.

experimental_allow_partial : bool | Literal['off', 'on', 'trailing-strings'] Default: False

Experimental: whether to enable partial validation, e.g. to process streams.

  • False / ‘off’: Default behavior, no partial validation.
  • True / ‘on’: Enable partial validation.
  • ‘trailing-strings’: Enable partial validation and allow trailing strings in the input.

by_alias : bool | None Default: None

Whether to use the field’s alias when validating against the provided input data.

by_name : bool | None Default: None

Whether to use the field’s name when validating against the provided input data.
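A short sketch, using a `dict[int, date]` adapter so both keys and values arrive as strings:

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[int, date])
print(ta.validate_strings({'1': '2024-01-01'}))
#> {1: datetime.date(2024, 1, 1)}
```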

get_default_value

def get_default_value(
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[T] | None

Get the default value for the wrapped type.

Returns

Some[T] | None — The default value wrapped in a Some if there is one or None if not.

Parameters

strict : bool | None Default: None

Whether to strictly check types.

context : dict[str, Any] | None Default: None

Additional context to pass to the validator.
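A short sketch, attaching a default to the adapted type via `Field(default=...)`:

```python
from typing import Annotated

from pydantic import Field, TypeAdapter

ta = TypeAdapter(Annotated[int, Field(default=42)])
default = ta.get_default_value()
print(None if default is None else default.value)
#> 42

print(TypeAdapter(int).get_default_value())  # no default defined
#> None
```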

dump_python

def dump_python(
    instance: T,
    mode: Literal['json', 'python'] = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> Any

Dump an instance of the adapted type to a Python object.

Returns

Any — The serialized object.

Parameters

instance : T

The Python object to serialize.

mode : Literal['json', 'python'] Default: 'python'

The output format.

include : IncEx | None Default: None

Fields to include in the output.

exclude : IncEx | None Default: None

Fields to exclude from the output.

by_alias : bool | None Default: None

Whether to use alias names for field names.

exclude_unset : bool Default: False

Whether to exclude unset fields.

exclude_defaults : bool Default: False

Whether to exclude fields with default values.

exclude_none : bool Default: False

Whether to exclude fields with None values.

round_trip : bool Default: False

Whether to output the serialized data in a way that is compatible with deserialization.

warnings : bool | Literal['none', 'warn', 'error'] Default: True

How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors, “error” raises a PydanticSerializationError.

fallback : Callable[[Any], Any] | None Default: None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

serialize_as_any : bool Default: False

Whether to serialize fields with duck-typing serialization behavior.

context : dict[str, Any] | None Default: None

Additional context to pass to the serializer.
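A short sketch contrasting the two output modes:

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, date])
print(ta.dump_python({'when': date(2024, 1, 1)}))
#> {'when': datetime.date(2024, 1, 1)}
print(ta.dump_python({'when': date(2024, 1, 1)}, mode='json'))
#> {'when': '2024-01-01'}
```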

dump_json

def dump_json(
    instance: T,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> bytes

Serialize an instance of the adapted type to JSON.

Returns

bytes — The JSON representation of the given instance as bytes.

Parameters

instance : T

The instance to be serialized.

indent : int | None Default: None

Number of spaces for JSON indentation.

include : IncEx | None Default: None

Fields to include.

exclude : IncEx | None Default: None

Fields to exclude.

by_alias : bool | None Default: None

Whether to use alias names for field names.

exclude_unset : bool Default: False

Whether to exclude unset fields.

exclude_defaults : bool Default: False

Whether to exclude fields with default values.

exclude_none : bool Default: False

Whether to exclude fields with a value of None.

round_trip : bool Default: False

Whether to serialize and deserialize the instance to ensure round-tripping.

warnings : bool | Literal['none', 'warn', 'error'] Default: True

How to handle serialization errors. False/“none” ignores them, True/“warn” logs errors, “error” raises a PydanticSerializationError.

fallback : Callable[[Any], Any] | None Default: None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

serialize_as_any : bool Default: False

Whether to serialize fields with duck-typing serialization behavior.

context : dict[str, Any] | None Default: None

Additional context to pass to the serializer.
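A short sketch:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])
print(ta.dump_json([1, 2, 3]))
#> b'[1,2,3]'
print(ta.dump_json([1, 2, 3], indent=2).decode())  # pretty-printed
```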

json_schema

def json_schema(
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]

Generate a JSON schema for the adapted type.

Returns

dict[str, Any] — The JSON schema for the model as a dictionary.

Parameters

by_alias : bool Default: True

Whether to use alias names for field names.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The format string used for generating $ref strings.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

The generator class used for creating the schema.

mode : JsonSchemaMode Default: 'validation'

The mode to use for schema generation.
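A short sketch:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])
print(ta.json_schema())
#> {'items': {'type': 'integer'}, 'type': 'array'}
```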

json_schemas

@staticmethod

def json_schemas(
    inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]

Generate a JSON schema including definitions from multiple type adapters.

Returns

tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue] — A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.
Parameters

inputs : Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]]

Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema.

by_alias : bool Default: True

Whether to use alias names.

title : str | None Default: None

The title for the schema.

description : str | None Default: None

The description for the schema.

ref_template : str Default: DEFAULT_REF_TEMPLATE

The format string used for generating $ref strings.

schema_generator : type[GenerateJsonSchema] Default: GenerateJsonSchema

The generator class used for creating the schema.
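A short sketch (the keys and title are illustrative):

```python
from pydantic import TypeAdapter

schemas, definitions = TypeAdapter.json_schemas(
    [
        ('ints', 'validation', TypeAdapter(list[int])),
        ('strs', 'serialization', TypeAdapter(list[str])),
    ],
    title='My API types',
)
print(schemas[('ints', 'validation')])
#> {'items': {'type': 'integer'}, 'type': 'array'}
print(definitions['title'])
#> My API types
```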


UrlConstraints

Url constraints.

Attributes

max_length

Type: int | None Default: None

allowed_schemes

Type: list[str] | None Default: None

host_required

Type: bool | None Default: None

default_host

Type: str | None Default: None

default_port

Type: int | None Default: None

default_path

Type: str | None Default: None

defined_constraints

Fetch a key / value mapping of constraints to values that are not None. Used for core schema updates.

Type: dict[str, Any]


AnyUrl

Bases: _BaseUrl

Base type for all URLs.

  • Any scheme allowed
  • Top-level domain (TLD) not required
  • Host not required

Assuming an input URL of http://samuel:pass@example.com:8000/the/path/?query=here#fragment=is;this=bit, the types export the following properties:

  • scheme: the URL scheme (http), always set.
  • host: the URL host (example.com).
  • username: optional username if included (samuel).
  • password: optional password if included (pass).
  • port: optional port (8000).
  • path: optional path (/the/path/).
  • query: optional URL query (for example, GET arguments or “search string”, such as query=here).
  • fragment: optional fragment (fragment=is;this=bit).
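These properties can be inspected directly on a validated URL:

```python
from pydantic import AnyUrl, TypeAdapter

url = TypeAdapter(AnyUrl).validate_python(
    'http://samuel:pass@example.com:8000/the/path/?query=here#fragment=is;this=bit'
)
print(url.scheme)    # 'http'
print(url.host)      # 'example.com'
print(url.username)  # 'samuel'
print(url.password)  # 'pass'
print(url.port)      # 8000
print(url.path)      # '/the/path/'
print(url.query)     # 'query=here'
print(url.fragment)  # 'fragment=is;this=bit'
```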

AnyHttpUrl

Bases: AnyUrl

A type that will accept any http or https URL.

  • TLD not required
  • Host not required

HttpUrl

Bases: AnyUrl

A type that will accept any http or https URL.

  • TLD not required
  • Host not required
  • Max length 2083
from pydantic import BaseModel, HttpUrl, ValidationError

class MyModel(BaseModel):
    url: HttpUrl

m = MyModel(url='http://www.example.com')
print(m.url)
#> http://www.example.com/

try:
    MyModel(url='ftp://invalid.url')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyModel
    url
      URL scheme should be 'http' or 'https' [type=url_scheme, input_value='ftp://invalid.url', input_type=str]
    '''

try:
    MyModel(url='not a url')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyModel
    url
      Input should be a valid URL, relative URL without a base [type=url_parsing, input_value='not a url', input_type=str]
    '''

Note: mypy would prefer m = MyModel(url=HttpUrl('http://www.example.com')), but Pydantic will convert the string to an HttpUrl instance anyway.

“International domains” (e.g. a URL where the host or TLD includes non-ASCII characters) will be encoded via punycode; this matters because punycode encoding makes homograph lookalike domains visible:

from pydantic import BaseModel, HttpUrl

class MyModel(BaseModel):
    url: HttpUrl

m1 = MyModel(url='http://puny£code.com')
print(m1.url)
#> http://xn--punycode-eja.com/
m2 = MyModel(url='https://www.аррӏе.com/')
print(m2.url)
#> https://www.xn--80ak6aa92e.com/
m3 = MyModel(url='https://www.example.珠宝/')
print(m3.url)
#> https://www.example.xn--pbt977c/

AnyWebsocketUrl

Bases: AnyUrl

A type that will accept any ws or wss URL.

  • TLD not required
  • Host not required

WebsocketUrl

Bases: AnyUrl

A type that will accept any ws or wss URL.

  • TLD not required
  • Host not required
  • Max length 2083

FileUrl

Bases: AnyUrl

A type that will accept any file URL.

  • Host not required

FtpUrl

Bases: AnyUrl

A type that will accept any ftp URL.

  • TLD not required
  • Host not required

PostgresDsn

Bases: _BaseMultiHostUrl

A type that will accept any Postgres DSN.

  • User info required
  • TLD not required
  • Host required
  • Supports multiple hosts

If further validation is required, these properties can be used by validators to enforce specific behaviour:

from pydantic import (
    BaseModel,
    HttpUrl,
    PostgresDsn,
    ValidationError,
    field_validator,
)

class MyModel(BaseModel):
    url: HttpUrl

m = MyModel(url='http://www.example.com')

# the repr() method for a url will display all properties of the url
print(repr(m.url))
#> HttpUrl('http://www.example.com/')
print(m.url.scheme)
#> http
print(m.url.host)
#> www.example.com
print(m.url.port)
#> 80

class MyDatabaseModel(BaseModel):
    db: PostgresDsn

    @field_validator('db')
    def check_db_name(cls, v):
        assert v.path and len(v.path) > 1, 'database must be provided'
        return v

m = MyDatabaseModel(db='postgres://user:pass@localhost:5432/foobar')
print(m.db)
#> postgres://user:pass@localhost:5432/foobar

try:
    MyDatabaseModel(db='postgres://user:pass@localhost:5432')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyDatabaseModel
    db
      Assertion failed, database must be provided
    assert (None)
     +  where None = PostgresDsn('postgres://user:pass@localhost:5432').path [type=assertion_error, input_value='postgres://user:pass@localhost:5432', input_type=str]
    '''

Attributes

host

The required URL host.

Type: str


CockroachDsn

Bases: AnyUrl

A type that will accept any Cockroach DSN.

  • User info required
  • TLD not required
  • Host required

Attributes

host

The required URL host.

Type: str


AmqpDsn

Bases: AnyUrl

A type that will accept any AMQP DSN.

  • User info required
  • TLD not required
  • Host not required

RedisDsn

Bases: AnyUrl

A type that will accept any Redis DSN.

  • User info required
  • TLD not required
  • Host required (e.g., rediss://:pass@localhost)

Attributes

host

The required URL host.

Type: str


MongoDsn

Bases: _BaseMultiHostUrl

A type that will accept any MongoDB DSN.

  • User info not required
  • Database name not required
  • Port not required
  • User info may be passed without user part (e.g., mongodb://mongodb0.example.com:27017).

KafkaDsn

Bases: AnyUrl

A type that will accept any Kafka DSN.

  • User info required
  • TLD not required
  • Host not required

NatsDsn

Bases: _BaseMultiHostUrl

A type that will accept any NATS DSN.

NATS is a connective technology built for the ever increasingly hyper-connected world. It is a single technology that enables applications to securely communicate across any combination of cloud vendors, on-premise, edge, web and mobile, and devices. More: https://nats.io


MySQLDsn

Bases: AnyUrl

A type that will accept any MySQL DSN.

  • User info required
  • TLD not required
  • Host not required

MariaDBDsn

Bases: AnyUrl

A type that will accept any MariaDB DSN.

  • User info required
  • TLD not required
  • Host not required

ClickHouseDsn

Bases: AnyUrl

A type that will accept any ClickHouse DSN.

  • User info required
  • TLD not required
  • Host not required

SnowflakeDsn

Bases: AnyUrl

A type that will accept any Snowflake DSN.

  • User info required
  • TLD not required
  • Host required

Attributes

host

The required URL host.

Type: str


EmailStr

Validate email addresses.

from pydantic import BaseModel, EmailStr

class Model(BaseModel):
    email: EmailStr

print(Model(email='fred.bloggs@example.com'))
#> email='fred.bloggs@example.com'

NameEmail

Bases: Representation

Validate a name and email address combination, as specified by RFC 5322.

The NameEmail has two properties: name and email. In case the name is not provided, it’s inferred from the email address.

from pydantic import BaseModel, NameEmail

class User(BaseModel):
    email: NameEmail

user = User(email='Fred Bloggs <fred.bloggs@example.com>')
print(user.email)
#> Fred Bloggs <fred.bloggs@example.com>
print(user.email.name)
#> Fred Bloggs

user = User(email='fred.bloggs@example.com')
print(user.email)
#> fred.bloggs <fred.bloggs@example.com>
print(user.email.name)
#> fred.bloggs

Attributes

name

Type: str

email

Type: str


IPvAnyAddress

Validate an IPv4 or IPv6 address.

from pydantic import BaseModel
from pydantic.networks import IPvAnyAddress

class IpModel(BaseModel):
    ip: IPvAnyAddress

print(IpModel(ip='127.0.0.1'))
#> ip=IPv4Address('127.0.0.1')

try:
    IpModel(ip='http://www.example.com')
except ValueError as e:
    print(e.errors())
    '''
    [
        {
            'type': 'ip_any_address',
            'loc': ('ip',),
            'msg': 'value is not a valid IPv4 or IPv6 address',
            'input': 'http://www.example.com',
        }
    ]
    '''

Methods

new

def __new__(cls, value: Any) -> IPvAnyAddressType

Validate an IPv4 or IPv6 address.

Returns

IPvAnyAddressType


IPvAnyInterface

Validate an IPv4 or IPv6 interface.

Methods

new

def __new__(cls, value: NetworkType) -> IPvAnyInterfaceType

Validate an IPv4 or IPv6 interface.

Returns

IPvAnyInterfaceType


IPvAnyNetwork

Validate an IPv4 or IPv6 network.

Methods

new

def __new__(cls, value: NetworkType) -> IPvAnyNetworkType

Validate an IPv4 or IPv6 network.

Returns

IPvAnyNetworkType
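A short sketch covering both IPvAnyInterface and IPvAnyNetwork:

```python
from pydantic import TypeAdapter
from pydantic.networks import IPvAnyInterface, IPvAnyNetwork

iface = TypeAdapter(IPvAnyInterface).validate_python('192.168.0.1/24')
print(iface)  # an IPv4Interface
#> 192.168.0.1/24

net = TypeAdapter(IPvAnyNetwork).validate_python('2001:db8::/32')
print(net)  # an IPv6Network
#> 2001:db8::/32
```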


getattr_migration

def getattr_migration(module: str) -> Callable[[str], Any]

Implement PEP 562 for objects that were either moved or removed during the migration to V2.

Returns

Callable[[str], Any] — A callable that will raise an error if the object is not found.

Parameters

module : str

The module name.


import_email_validator

def import_email_validator() -> None

Import the email-validator package, raising an error if it is not installed.

Returns

None


validate_email

def validate_email(value: str) -> tuple[str, str]

Email address validation using email-validator.

Returns

tuple[str, str] — A tuple containing the local part of the email (or the name for “pretty” email addresses) and the normalized email.

Raises

  • PydanticCustomError — If the email is invalid.

JsonSchemaValue

A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary JSON values.

Default: dict[str, Any]

NetworkType

Type: TypeAlias Default: 'str | bytes | int | tuple[str | bytes | int, str | int]'

IPvAnyAddressType

Type: TypeAlias Default: 'IPv4Address | IPv6Address'

IPvAnyInterfaceType

Type: TypeAlias Default: 'IPv4Interface | IPv6Interface'

IPvAnyNetworkType

Type: TypeAlias Default: 'IPv4Network | IPv6Network'

pretty_email_regex

Default: _build_pretty_email_regex()

MAX_EMAIL_LENGTH

Maximum length for an email. A somewhat arbitrary but very generous number compared to what is allowed by most implementations.

Default: 2048