ReadTimeoutError doesn't pickle / maintain the str parsing #3567

@csm10495

Description

Subject

When using urllib3 inside of multiprocessing, Python often uses pickle to send exceptions and other data from the child process back to the parent. If ReadTimeoutError is raised inside a worker process, it gets pickled, but loses most of its context en route back to the parent process.
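
For concreteness, here is a minimal sketch of that multiprocessing round trip. The fetch worker function is hypothetical and exists only to raise the error so the pool has to pickle it back to the parent:

import multiprocessing

import urllib3

# Hypothetical worker: it only raises ReadTimeoutError so that
# multiprocessing has to pickle the exception back to the parent.
def fetch(url):
    raise urllib3.exceptions.ReadTimeoutError(
        "connection pool", url, "the message with more context"
    )

if __name__ == "__main__":
    with multiprocessing.Pool(1) as pool:
        try:
            pool.apply(fetch, ("my url",))
        except urllib3.exceptions.ReadTimeoutError as ex:
            # After the pickle round trip, this prints "None: None"
            # instead of the original message.
            print(ex)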

Environment

Describe your environment.
At a minimum, paste the output of:

>>> import platform
>>> import ssl
>>> import urllib3
>>>
>>> print("OS", platform.platform())
OS macOS-15.3.1-x86_64-i386-64bit
>>> print("Python", platform.python_version())
Python 3.10.14
>>> print(ssl.OPENSSL_VERSION)
OpenSSL 1.1.1w  11 Sep 2023
>>> print("urllib3", urllib3.__version__)
urllib3 2.3.0

Steps to Reproduce

A simple and isolated way to reproduce the issue. A code snippet would be great.

In [1]: import urllib3
   ...: import pickle

In [2]: ex = urllib3.exceptions.ReadTimeoutError('connection pool', 'my url', 'the message with more context')

In [3]: print(ex)
connection pool: the message with more context

In [4]: print(pickle.loads(pickle.dumps(ex)))
None: None

Expected Behavior

I understand not pickling the connection pool itself, but ideally the error message should at least include the originally given message after pickling.
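
As an illustration of one possible direction (a sketch with a stand-in class, not urllib3's actual implementation), an exception that keeps its url and message as attributes can drop the unpicklable pool in __reduce__ and still reconstruct a readable message:

import pickle

# Stand-in class mirroring the ReadTimeoutError(pool, url, message)
# signature; purely illustrative.
class PicklableReadTimeoutError(Exception):
    def __init__(self, pool, url, message):
        self.pool = pool
        self.url = url
        self.message = message
        super().__init__(f"{pool}: {message}")

    def __reduce__(self):
        # Drop the pool (it may hold sockets/locks), but keep the url
        # and message so the unpickled error is still readable.
        return self.__class__, (None, self.url, self.message)

ex = PicklableReadTimeoutError("connection pool", "my url", "the message with more context")
print(pickle.loads(pickle.dumps(ex)))
# -> "None: the message with more context" instead of "None: None"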

Actual Behavior

See the steps to reproduce. The string version of the exception winds up being a confusing None: None.
