.. module:: aifc
:synopsis: Read and write audio files in AIFF or AIFC format.
+ :deprecated:
**Source code:** :source:`Lib/aifc.py`
single: AIFF
single: AIFF-C
+
+.. deprecated:: 3.11
+ The :mod:`aifc` module is deprecated (see :pep:`594` for details).
+
--------------
This module provides support for reading and writing AIFF and AIFF-C files.
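A minimal sketch of reading an AIFF file with this module (the filename here is
hypothetical)::

    import aifc

    f = aifc.open('example.aiff', 'rb')
    try:
        # Basic stream parameters, then the raw sample data.
        print(f.getnchannels(), f.getsampwidth(), f.getframerate())
        frames = f.readframes(f.getnframes())
        print(len(frames), 'bytes of raw sample data')
    finally:
        f.close()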
.. module:: asynchat
:synopsis: Support for asynchronous command/response protocols.
+ :deprecated:
.. moduleauthor:: Sam Rushing <rushing@nightmare.com>
.. sectionauthor:: Steve Holden <sholden@holdenweb.com>
**Source code:** :source:`Lib/asynchat.py`
.. deprecated:: 3.6
+ :mod:`asynchat` will be removed in Python 3.12 (:pep:`594`).
Please use :mod:`asyncio` instead.
--------------
.. module:: asyncore
:synopsis: A base class for developing asynchronous socket handling
services.
+ :deprecated:
.. moduleauthor:: Sam Rushing <rushing@nightmare.com>
.. sectionauthor:: Christopher Petrilli <petrilli@amber.org>
**Source code:** :source:`Lib/asyncore.py`
.. deprecated:: 3.6
+ :mod:`asyncore` will be removed in Python 3.12 (:pep:`594`).
Please use :mod:`asyncio` instead.
--------------
.. module:: audioop
:synopsis: Manipulate raw audio data.
+ :deprecated:
+
+.. deprecated:: 3.11
+ The :mod:`audioop` module is deprecated (see :pep:`594` for details).
--------------
.. function:: crc32(data[, value])
- Compute CRC-32, the 32-bit checksum of *data*, starting with an
+ Compute CRC-32, the unsigned 32-bit checksum of *data*, starting with an
initial CRC of *value*. The default initial CRC is zero. The algorithm
is consistent with the ZIP file checksum. Since the algorithm is designed for
use as a checksum algorithm, it is not suitable for use as a general hash algorithm.
.. versionchanged:: 3.0
The result is always unsigned.
- To generate the same numeric value across all Python versions and
- platforms, use ``crc32(data) & 0xffffffff``.
-
+ To generate the same numeric value when using Python 2 or earlier,
+ use ``crc32(data) & 0xffffffff``.
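A short sketch of typical usage, computing the checksum incrementally via the
*value* argument (the data here is illustrative)::

    import binascii

    # Feeding the previous result back in as *value* gives the same checksum
    # as a single call over the concatenated data.
    crc = binascii.crc32(b'hello ')
    crc = binascii.crc32(b'world', crc)
    assert crc == binascii.crc32(b'hello world')

    # Mask to 32 bits when comparing with values produced by Python 2,
    # where crc32 could return a signed (negative) result.
    print(crc & 0xffffffff)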
.. function:: b2a_hex(data[, sep[, bytes_per_sep=1]])
hexlify(data[, sep[, bytes_per_sep=1]])
.. module:: cgi
:synopsis: Helpers for running Python scripts via the Common Gateway Interface.
+ :deprecated:
**Source code:** :source:`Lib/cgi.py`
single: URL
single: Common Gateway Interface
+.. deprecated:: 3.11
+ The :mod:`cgi` module is deprecated (see :pep:`594` for details).
+
--------------
Support module for Common Gateway Interface (CGI) scripts.
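A minimal CGI script sketch using this module's form parsing (the field name
``name`` is illustrative)::

    #!/usr/bin/env python3
    import cgi

    form = cgi.FieldStorage()               # parse the query string / POST body
    name = form.getvalue('name', 'world')

    print('Content-Type: text/html')        # header, blank line, then the body
    print()
    print(f'<p>Hello, {name}!</p>')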
.. module:: cgitb
:synopsis: Configurable traceback handler for CGI scripts.
+ :deprecated:
.. moduleauthor:: Ka-Ping Yee <ping@lfw.org>
.. sectionauthor:: Fred L. Drake, Jr. <fdrake@acm.org>
single: exceptions; in CGI scripts
single: tracebacks; in CGI scripts
+.. deprecated:: 3.11
+ The :mod:`cgitb` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`cgitb` module provides a special exception handler for Python scripts.
.. module:: chunk
:synopsis: Module to read IFF chunks.
+ :deprecated:
.. moduleauthor:: Sjoerd Mullender <sjoerd@acm.org>
.. sectionauthor:: Sjoerd Mullender <sjoerd@acm.org>
single: Real Media File Format
single: RMFF
+.. deprecated:: 3.11
+ The :mod:`chunk` module is deprecated (see :pep:`594` for details).
+
--------------
This module provides an interface for reading files that use EA IFF 85 chunks.
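A small sketch of reading the top-level chunk of an IFF-style file (the path is
hypothetical)::

    import chunk

    with open('example.aiff', 'rb') as f:
        top = chunk.Chunk(f)
        # The 4-byte chunk ID (as bytes) and the size of its payload.
        print(top.getname(), top.getsize())
        top.close()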
my_pictures: %(my_dir)s/Pictures
[Escape]
- gain: 80%% # use a %% to escape the % sign (% is the only character that needs to be escaped)
+ # use a %% to escape the % sign (% is the only character that needs to be escaped):
+ gain: 80%%
In the example above, :class:`ConfigParser` with *interpolation* set to
``BasicInterpolation()`` would resolve ``%(home_dir)s`` to the value of
my_pictures: ${my_dir}/Pictures
[Escape]
- cost: $$80 # use a $$ to escape the $ sign ($ is the only character that needs to be escaped)
+ # use a $$ to escape the $ sign ($ is the only character that needs to be escaped):
+ cost: $$80
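A brief sketch showing how the two escapes above resolve, assuming the standard
:mod:`configparser` interpolation classes::

    import configparser

    basic = configparser.ConfigParser()     # BasicInterpolation is the default
    basic.read_string('[Escape]\ngain: 80%%\n')
    print(basic['Escape']['gain'])           # -> 80%

    extended = configparser.ConfigParser(
        interpolation=configparser.ExtendedInterpolation())
    extended.read_string('[Escape]\ncost: $$80\n')
    print(extended['Escape']['cost'])        # -> $80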
Values from other sections can be fetched as well:
.. module:: crypt
:platform: Unix
:synopsis: The crypt() function used to check Unix passwords.
+ :deprecated:
.. moduleauthor:: Steven D. Majewski <sdm7g@virginia.edu>
.. sectionauthor:: Steven D. Majewski <sdm7g@virginia.edu>
single: crypt(3)
pair: cipher; DES
+.. deprecated:: 3.11
+ The :mod:`crypt` module is deprecated (see :pep:`594` for details).
+
--------------
This module implements an interface to the :manpage:`crypt(3)` routine, which is
csv.rst
configparser.rst
netrc.rst
- xdrlib.rst
plistlib.rst
.. module:: imghdr
:synopsis: Determine the type of image contained in a file or byte stream.
+ :deprecated:
**Source code:** :source:`Lib/imghdr.py`
+.. deprecated:: 3.11
+ The :mod:`imghdr` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`imghdr` module determines the type of image contained in a file or
.. toctree::
webbrowser.rst
- cgi.rst
- cgitb.rst
wsgiref.rst
urllib.rst
urllib.request.rst
ftplib.rst
poplib.rst
imaplib.rst
- nntplib.rst
smtplib.rst
- smtpd.rst
- telnetlib.rst
uuid.rst
socketserver.rst
http.server.rst
ssl.rst
select.rst
selectors.rst
- asyncore.rst
- asynchat.rst
signal.rst
mmap.rst
.. toctree::
- audioop.rst
- aifc.rst
- sunau.rst
wave.rst
- chunk.rst
colorsys.rst
- imghdr.rst
- sndhdr.rst
- ossaudiodev.rst
.. module:: msilib
:platform: Windows
:synopsis: Creation of Microsoft Installer files, and CAB files.
+ :deprecated:
.. moduleauthor:: Martin v. Löwis <martin@v.loewis.de>
.. sectionauthor:: Martin v. Löwis <martin@v.loewis.de>
.. index:: single: msi
+.. deprecated:: 3.11
+ The :mod:`msilib` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`msilib` module supports the creation of Microsoft Installer (``.msi``) files.
binhex.rst
binascii.rst
quopri.rst
- uu.rst
.. module:: nis
:platform: Unix
:synopsis: Interface to Sun's NIS (Yellow Pages) library.
+ :deprecated:
.. moduleauthor:: Fred Gansevles <Fred.Gansevles@cs.utwente.nl>
.. sectionauthor:: Moshe Zadka <moshez@zadka.site.co.il>
+.. deprecated:: 3.11
+ The :mod:`nis` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`nis` module gives a thin wrapper around the NIS library, useful for
.. module:: nntplib
:synopsis: NNTP protocol client (requires sockets).
+ :deprecated:
**Source code:** :source:`Lib/nntplib.py`
pair: NNTP; protocol
single: Network News Transfer Protocol
+.. deprecated:: 3.11
+ The :mod:`nntplib` module is deprecated (see :pep:`594` for details).
+
--------------
This module defines the class :class:`NNTP` which implements the client side of
.. module:: ossaudiodev
:platform: Linux, FreeBSD
:synopsis: Access to OSS-compatible audio devices.
+ :deprecated:
+
+.. deprecated:: 3.11
+ The :mod:`ossaudiodev` module is deprecated (see :pep:`594` for details).
--------------
The children are yielded in arbitrary order, and the special entries
``'.'`` and ``'..'`` are not included. If a file is removed from or added
- to the directory after creating the iterator, whether an path object for
+ to the directory after creating the iterator, whether a path object for
that file will be included is unspecified.
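A minimal sketch of iterating a directory with :meth:`Path.iterdir` (the path is
hypothetical)::

    from pathlib import Path

    # Entries come back in arbitrary order; '.' and '..' are never yielded.
    for entry in Path('/tmp').iterdir():
        print(entry.name, entry.is_dir())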
.. method:: Path.lchmod(mode)
.. module:: pipes
:platform: Unix
:synopsis: A Python interface to Unix shell pipelines.
+ :deprecated:
.. sectionauthor:: Moshe Zadka <moshez@zadka.site.co.il>
**Source code:** :source:`Lib/pipes.py`
+.. deprecated:: 3.11
+ The :mod:`pipes` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`pipes` module defines a class to abstract the concept of a *pipeline*
.. module:: smtpd
:synopsis: An SMTP server implementation in Python.
+ :deprecated:
.. moduleauthor:: Barry Warsaw <barry@python.org>
.. sectionauthor:: Moshe Zadka <moshez@moshez.org>
This module offers several classes to implement SMTP (email) servers.
.. deprecated:: 3.6
+ :mod:`smtpd` will be removed in Python 3.12 (:pep:`594`).
The `aiosmtpd <https://aiosmtpd.readthedocs.io/>`_ package is a recommended
replacement for this module. It is based on :mod:`asyncio` and provides a
more straightforward API.
.. module:: sndhdr
:synopsis: Determine type of a sound file.
+ :deprecated:
.. sectionauthor:: Fred L. Drake, Jr. <fdrake@acm.org>
.. Based on comments in the module source file.
single: A-LAW
single: u-LAW
+.. deprecated:: 3.11
+ The :mod:`sndhdr` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`sndhdr` module provides utility functions which attempt to determine the type
.. module:: spwd
:platform: Unix
:synopsis: The shadow password database (getspnam() and friends).
+ :deprecated:
+
+.. deprecated:: 3.11
+ The :mod:`spwd` module is deprecated (see :pep:`594` for details).
--------------
.. module:: sunau
:synopsis: Provide an interface to the Sun AU sound format.
+ :deprecated:
.. sectionauthor:: Moshe Zadka <moshez@zadka.site.co.il>
**Source code:** :source:`Lib/sunau.py`
+.. deprecated:: 3.11
+ The :mod:`sunau` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`sunau` module provides a convenient interface to the Sun AU sound
.. toctree::
- optparse.rst
+ aifc.rst
+ asynchat.rst
+ asyncore.rst
+ audioop.rst
+ cgi.rst
+ cgitb.rst
+ chunk.rst
+ crypt.rst
+ imghdr.rst
imp.rst
+ msilib.rst
+ nntplib.rst
+ nis.rst
+ optparse.rst
+ ossaudiodev.rst
+ pipes.rst
+ smtpd.rst
+ sndhdr.rst
+ spwd.rst
+ sunau.rst
+ telnetlib.rst
+ uu.rst
+ xdrlib.rst
.. module:: telnetlib
:synopsis: Telnet client class.
+ :deprecated:
.. sectionauthor:: Skip Montanaro <skip@pobox.com>
.. index:: single: protocol; Telnet
+.. deprecated:: 3.11
+ The :mod:`telnetlib` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`telnetlib` module provides a :class:`Telnet` class that implements the
Added *errors* parameter.
-.. function:: SpooledTemporaryFile(max_size=0, mode='w+b', buffering=-1, encoding=None, newline=None, suffix=None, prefix=None, dir=None, *, errors=None)
+.. class:: SpooledTemporaryFile(max_size=0, mode='w+b', buffering=-1, encoding=None, newline=None, suffix=None, prefix=None, dir=None, *, errors=None)
- This function operates exactly as :func:`TemporaryFile` does, except that
+ This class operates exactly as :func:`TemporaryFile` does, except that
data is spooled in memory until the file size exceeds *max_size*, or
until the file's :func:`fileno` method is called, at which point the
contents are written to disk and operation proceeds as with
Added *errors* parameter.
-.. function:: TemporaryDirectory(suffix=None, prefix=None, dir=None, ignore_cleanup_errors=False)
+.. class:: TemporaryDirectory(suffix=None, prefix=None, dir=None, ignore_cleanup_errors=False)
- This function securely creates a temporary directory using the same rules as :func:`mkdtemp`.
+ This class securely creates a temporary directory using the same rules as :func:`mkdtemp`.
The resulting object can be used as a context manager (see
:ref:`tempfile-examples`). On completion of the context or destruction
of the temporary directory object, the newly created temporary directory
subsequent_indent="", expand_tabs=True, \
replace_whitespace=True, fix_sentence_endings=False, \
break_long_words=True, drop_whitespace=True, \
- break_on_hyphens=True, tabsize=8, max_lines=None)
+ break_on_hyphens=True, tabsize=8, max_lines=None, \
+ placeholder=' [...]')
Wraps the single paragraph in *text* (a string) so every line is at most
*width* characters long. Returns a list of output lines, without final
replace_whitespace=True, fix_sentence_endings=False, \
break_long_words=True, drop_whitespace=True, \
break_on_hyphens=True, tabsize=8, \
- max_lines=None)
+ max_lines=None, placeholder=' [...]')
Wraps the single paragraph in *text*, and returns a single string containing the
wrapped paragraph. :func:`fill` is shorthand for ::
def notify_by_email(employees: Sequence[Employee],
overrides: Mapping[str, str]) -> None: ...
-Generics can be parameterized by using a new factory available in typing
+Generics can be parameterized by using a factory available in typing
called :class:`TypeVar`.
::
for var in vars:
var.set(0)
-A generic type can have any number of type variables, and type variables may
-be constrained::
+A generic type can have any number of type variables. All varieties of
+:class:`TypeVar` are permissible as parameters for a generic type::
- from typing import TypeVar, Generic
- ...
+ from typing import TypeVar, Generic, Sequence
- T = TypeVar('T')
+ T = TypeVar('T', contravariant=True)
+ B = TypeVar('B', bound=Sequence[bytes], covariant=True)
S = TypeVar('S', int, str)
- class StrangePair(Generic[T, S]):
+ class WeirdTrio(Generic[T, B, S]):
...
Each type variable argument to :class:`Generic` must be distinct.
Usage::
T = TypeVar('T') # Can be anything
- A = TypeVar('A', str, bytes) # Must be str or bytes
+ S = TypeVar('S', bound=str) # Can be any subtype of str
+ A = TypeVar('A', str, bytes) # Must be exactly str or bytes
Type variables exist primarily for the benefit of static type
checkers. They serve as the parameters for generic types as well
"""Return a list containing n references to x."""
return [x]*n
- def longest(x: A, y: A) -> A:
- """Return the longest of two strings."""
- return x if len(x) >= len(y) else y
- The latter example's signature is essentially the overloading
- of ``(str, str) -> str`` and ``(bytes, bytes) -> bytes``. Also note
- that if the arguments are instances of some subclass of :class:`str`,
- the return type is still plain :class:`str`.
+ def print_capitalized(x: S) -> S:
+ """Print x capitalized, and return x."""
+ print(x.capitalize())
+ return x
+
+
+ def concatenate(x: A, y: A) -> A:
+ """Add two strings or bytes objects together."""
+ return x + y
+
+ Note that type variables can be *bound*, *constrained*, or neither, but
+ cannot be both bound *and* constrained.
+
+ Constrained type variables and bound type variables have different
+ semantics in several important ways. Using a *constrained* type variable
+ means that the ``TypeVar`` can only ever be solved as being exactly one of
+ the constraints given::
+
+ a = concatenate('one', 'two') # Ok, variable 'a' has type 'str'
+ b = concatenate(StringSubclass('one'), StringSubclass('two')) # Inferred type of variable 'b' is 'str',
+ # despite 'StringSubclass' being passed in
+ c = concatenate('one', b'two') # error: type variable 'A' can be either 'str' or 'bytes' in a function call, but not both
+
+ Using a *bound* type variable, however, means that the ``TypeVar`` will be
+ solved using the most specific type possible::
+
+ print_capitalized('a string') # Ok, output has type 'str'
+
+ class StringSubclass(str):
+ pass
+
+ print_capitalized(StringSubclass('another string')) # Ok, output has type 'StringSubclass'
+ print_capitalized(45) # error: int is not a subtype of str
+
+ Type variables can be bound to concrete types, abstract types (ABCs or
+ protocols), and even unions of types::
+
+ U = TypeVar('U', bound=str|bytes) # Can be any subtype of the union str|bytes
+ V = TypeVar('V', bound=SupportsAbs) # Can be anything with an __abs__ method
+
+ Bound type variables are particularly useful for annotating
+ :func:`classmethods <classmethod>` that serve as alternative constructors.
+ In the following example (©
+ `Raymond Hettinger <https://www.youtube.com/watch?v=HTLu2DFOdTg>`_), the
+ type variable ``C`` is bound to the ``Circle`` class through the use of a
+ forward reference. Using this type variable to annotate the
+ ``with_circumference`` classmethod, rather than hardcoding the return type
+ as ``Circle``, means that a type checker can correctly infer the return
+ type even if the method is called on a subclass::
+
+ import math
+
+ C = TypeVar('C', bound='Circle')
+
+ class Circle:
+ """An abstract circle"""
+
+ def __init__(self, radius: float) -> None:
+ self.radius = radius
+
+ # Use a type variable to show that the return type
+ # will always be an instance of whatever `cls` is
+ @classmethod
+ def with_circumference(cls: type[C], circumference: float) -> C:
+ """Create a circle with the specified circumference"""
+ radius = circumference / (math.pi * 2)
+ return cls(radius)
+
+
+ class Tire(Circle):
+ """A specialised circle (made out of rubber)"""
+
+ MATERIAL = 'rubber'
+
+
+ c = Circle.with_circumference(3) # Ok, variable 'c' has type 'Circle'
+ t = Tire.with_circumference(4) # Ok, variable 't' has type 'Tire' (not 'Circle')
At runtime, ``isinstance(x, T)`` will raise :exc:`TypeError`. In general,
:func:`isinstance` and :func:`issubclass` should not be used with types.
Type variables may be marked covariant or contravariant by passing
``covariant=True`` or ``contravariant=True``. See :pep:`484` for more
- details. By default type variables are invariant. Alternatively,
- a type variable may specify an upper bound using ``bound=<type>``.
- This means that an actual type substituted (explicitly or implicitly)
- for the type variable must be a subclass of the boundary type,
- see :pep:`484`.
+ details. By default, type variables are invariant.
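A brief illustrative sketch of variance markers (the class names here are
hypothetical)::

    from typing import TypeVar, Generic

    T_co = TypeVar('T_co', covariant=True)              # producer position only
    T_contra = TypeVar('T_contra', contravariant=True)  # consumer position only

    class Producer(Generic[T_co]):
        def get(self) -> T_co: ...

    class Consumer(Generic[T_contra]):
        def put(self, item: T_contra) -> None: ...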
.. class:: ParamSpec(name, *, bound=None, covariant=False, contravariant=False)
.. data:: AnyStr
- ``AnyStr`` is a type variable defined as
+ ``AnyStr`` is a :class:`constrained type variable <TypeVar>` defined as
``AnyStr = TypeVar('AnyStr', str, bytes)``.
It is meant to be used for functions that may accept any kind of string
posix.rst
pwd.rst
- spwd.rst
grp.rst
- crypt.rst
termios.rst
tty.rst
pty.rst
fcntl.rst
- pipes.rst
resource.rst
- nis.rst
syslog.rst
.. module:: uu
:synopsis: Encode and decode files in uuencode format.
+ :deprecated:
.. moduleauthor:: Lance Ellinghouse
**Source code:** :source:`Lib/uu.py`
+.. deprecated:: 3.11
+ The :mod:`uu` module is deprecated (see :pep:`594` for details).
+
--------------
This module encodes and decodes files in uuencode format, allowing arbitrary
.. toctree::
- msilib.rst
msvcrt.rst
winreg.rst
winsound.rst
.. module:: xdrlib
:synopsis: Encoders and decoders for the External Data Representation (XDR).
+ :deprecated:
**Source code:** :source:`Lib/xdrlib.py`
single: XDR
single: External Data Representation
+.. deprecated:: 3.11
+ The :mod:`xdrlib` module is deprecated (see :pep:`594` for details).
+
--------------
The :mod:`xdrlib` module supports the External Data Representation Standard as
for use as a general hash algorithm.
.. versionchanged:: 3.0
- Always returns an unsigned value.
- To generate the same numeric value across all Python versions and
- platforms, use ``adler32(data) & 0xffffffff``.
-
+ The result is always unsigned.
+ To generate the same numeric value when using Python 2 or earlier,
+ use ``adler32(data) & 0xffffffff``.
.. function:: compress(data, /, level=-1)
for use as a general hash algorithm.
.. versionchanged:: 3.0
- Always returns an unsigned value.
- To generate the same numeric value across all Python versions and
- platforms, use ``crc32(data) & 0xffffffff``.
-
+ The result is always unsigned.
+ To generate the same numeric value when using Python 2 or earlier,
+ use ``crc32(data) & 0xffffffff``.
.. function:: decompress(data, /, wbits=MAX_WBITS, bufsize=DEF_BUF_SIZE)
present, must be last; it matches any exception. For an except clause with an
expression, that expression is evaluated, and the clause matches the exception
if the resulting object is "compatible" with the exception. An object is
-compatible with an exception if it is the class or a base class of the exception
-object, or a tuple containing an item that is the class or a base class of
-the exception object.
+compatible with an exception if the object is the class or a
+:term:`non-virtual base class <abstract base class>` of the exception object,
+or a tuple containing an item that is the class or a non-virtual base class
+of the exception object.
If no except clause matches the exception, the search for an exception handler
continues in the surrounding code and on the invocation stack. [#]_
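A small illustrative sketch of this matching rule (the class names are
hypothetical)::

    class AppError(Exception):
        pass

    class ConfigError(AppError):
        pass

    try:
        raise ConfigError('bad setting')
    except (KeyError, AppError) as exc:
        # Matches: AppError, a non-virtual base class of the raised
        # ConfigError, appears in the tuple.
        print(type(exc).__name__)           # -> ConfigError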
Called by built-in function :func:`hash` and for operations on members of
hashed collections including :class:`set`, :class:`frozenset`, and
- :class:`dict`. :meth:`__hash__` should return an integer. The only required
+ :class:`dict`. The ``__hash__()`` method should return an integer. The only required
property is that objects which compare equal have the same hash value; it is
advised to mix together the hash values of the components of the object that
also play a part in comparison of objects by packing them into a tuple and
Exceptions are identified by class instances. The :keyword:`except` clause is
selected depending on the class of the instance: it must reference the class of
-the instance or a base class thereof. The instance can be received by the
-handler and can carry additional information about the exceptional condition.
+the instance or a :term:`non-virtual base class <abstract base class>` thereof.
+The instance can be received by the handler and can carry additional information
+about the exceptional condition.
.. note::
library/typing,,`,"# Else, type of ``val`` is narrowed to ``float``."
library/typing,,`,# Type of ``val`` is narrowed to ``List[str]``.
library/typing,,`,# Type of ``val`` remains as ``List[object]``.
+library/typing,,`, # will always be an instance of whatever `cls` is
/*--start constants--*/
#define PY_MAJOR_VERSION 3
#define PY_MINOR_VERSION 10
-#define PY_MICRO_VERSION 3
+#define PY_MICRO_VERSION 4
#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL
#define PY_RELEASE_SERIAL 0
/* Version as a string */
-#define PY_VERSION "3.10.3"
+#define PY_VERSION "3.10.4"
/*--end constants--*/
/* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2.
from warnings import warn
warn(
- 'The asynchat module is deprecated. '
+ 'The asynchat module is deprecated and will be removed in Python 3.12. '
'The recommended replacement is asyncio',
DeprecationWarning,
stacklevel=2)
from . import exceptions
from . import mixins
+from . import tasks
class _ContextManagerMixin:
raise ValueError("Semaphore initial value must be >= 0")
self._value = value
self._waiters = collections.deque()
+ self._wakeup_scheduled = False
def __repr__(self):
res = super().__repr__()
waiter = self._waiters.popleft()
if not waiter.done():
waiter.set_result(None)
+ self._wakeup_scheduled = True
return
def locked(self):
called release() to make it larger than 0, and then return
True.
"""
- while self._value <= 0:
+ # _wakeup_scheduled is set if *another* task is scheduled to wakeup
+ # but its acquire() is not resumed yet
+ while self._wakeup_scheduled or self._value <= 0:
fut = self._get_loop().create_future()
self._waiters.append(fut)
try:
await fut
- except:
- # See the similar code in Queue.get.
- fut.cancel()
- if self._value > 0 and not fut.cancelled():
- self._wake_up_next()
+ # reset _wakeup_scheduled *after* waiting for a future
+ self._wakeup_scheduled = False
+ except exceptions.CancelledError:
+ self._wake_up_next()
raise
self._value -= 1
return True
errorcode
warnings.warn(
- 'The asyncore module is deprecated. '
+ 'The asyncore module is deprecated and will be removed in Python 3.12. '
'The recommended replacement is asyncio',
DeprecationWarning,
stacklevel=2)
unittest.TestCase.__init__(self)
self._dt_optionflags = optionflags
self._dt_checker = checker
+ self._dt_globs = test.globs.copy()
self._dt_test = test
self._dt_setUp = setUp
self._dt_tearDown = tearDown
if self._dt_tearDown is not None:
self._dt_tearDown(test)
+ # restore the original globs
test.globs.clear()
+ test.globs.update(self._dt_globs)
def runTest(self):
test = self._dt_test
import sysconfig
import time
import tokenize
+import types
import urllib.parse
import warnings
from collections import deque
normdirs.append(normdir)
return dirs
+def _isclass(object):
+ return inspect.isclass(object) and not isinstance(object, types.GenericAlias)
+
def _findclass(func):
cls = sys.modules.get(func.__module__)
if cls is None:
return None
for name in func.__qualname__.split('.')[:-1]:
cls = getattr(cls, name)
- if not inspect.isclass(cls):
+ if not _isclass(cls):
return None
return cls
if inspect.ismethod(obj):
name = obj.__func__.__name__
self = obj.__self__
- if (inspect.isclass(self) and
+ if (_isclass(self) and
getattr(getattr(self, name, None), '__func__') is obj.__func__):
# classmethod
cls = self
elif inspect.isbuiltin(obj):
name = obj.__name__
self = obj.__self__
- if (inspect.isclass(self) and
+ if (_isclass(self) and
self.__qualname__ + '.' + name == obj.__qualname__):
# classmethod
cls = self
def isdata(object):
"""Check if an object is of a type that probably means it's data."""
- return not (inspect.ismodule(object) or inspect.isclass(object) or
+ return not (inspect.ismodule(object) or _isclass(object) or
inspect.isroutine(object) or inspect.isframe(object) or
inspect.istraceback(object) or inspect.iscode(object))
# by lacking a __name__ attribute) and an instance.
try:
if inspect.ismodule(object): return self.docmodule(*args)
- if inspect.isclass(object): return self.docclass(*args)
+ if _isclass(object): return self.docclass(*args)
if inspect.isroutine(object): return self.docroutine(*args)
except AttributeError:
pass
modules = inspect.getmembers(object, inspect.ismodule)
classes, cdict = [], {}
- for key, value in inspect.getmembers(object, inspect.isclass):
+ for key, value in inspect.getmembers(object, _isclass):
# if __all__ exists, believe it. Otherwise use old heuristic.
if (all is not None or
(inspect.getmodule(value) or object) is object):
result = result + self.section('DESCRIPTION', desc)
classes = []
- for key, value in inspect.getmembers(object, inspect.isclass):
+ for key, value in inspect.getmembers(object, _isclass):
# if __all__ exists, believe it. Otherwise use old heuristic.
if (all is not None
or (inspect.getmodule(value) or object) is object):
return 'member descriptor %s.%s.%s' % (
thing.__objclass__.__module__, thing.__objclass__.__name__,
thing.__name__)
- if inspect.isclass(thing):
+ if _isclass(thing):
return 'class ' + thing.__name__
if inspect.isfunction(thing):
return 'function ' + thing.__name__
desc += ' in module ' + module.__name__
if not (inspect.ismodule(object) or
- inspect.isclass(object) or
+ _isclass(object) or
inspect.isroutine(object) or
inspect.isdatadescriptor(object) or
_getdoc(object)):
# -*- coding: utf-8 -*-
-# Autogenerated by Sphinx on Wed Mar 16 11:26:55 2022
+# Autogenerated by Sphinx on Wed Mar 23 20:11:40 2022
topics = {'assert': 'The "assert" statement\n'
'**********************\n'
'\n'
'resulting\n'
'object is “compatible” with the exception. An object is '
'compatible\n'
- 'with an exception if it is the class or a base class of the '
- 'exception\n'
- 'object, or a tuple containing an item that is the class or a '
+ 'with an exception if the object is the class or a *non-virtual '
'base\n'
- 'class of the exception object.\n'
+ 'class* of the exception object, or a tuple containing an item '
+ 'that is\n'
+ 'the class or a non-virtual base class of the exception object.\n'
'\n'
'If no except clause matches the exception, the search for an '
'exception\n'
'on members\n'
' of hashed collections including "set", "frozenset", and '
'"dict".\n'
- ' "__hash__()" should return an integer. The only required '
- 'property\n'
- ' is that objects which compare equal have the same hash '
- 'value; it is\n'
- ' advised to mix together the hash values of the '
- 'components of the\n'
- ' object that also play a part in comparison of objects by '
- 'packing\n'
- ' them into a tuple and hashing the tuple. Example:\n'
+ ' The "__hash__()" method should return an integer. The '
+ 'only required\n'
+ ' property is that objects which compare equal have the '
+ 'same hash\n'
+ ' value; it is advised to mix together the hash values of '
+ 'the\n'
+ ' components of the object that also play a part in '
+ 'comparison of\n'
+ ' objects by packing them into a tuple and hashing the '
+ 'tuple.\n'
+ ' Example:\n'
'\n'
' def __hash__(self):\n'
' return hash((self.name, self.nick, self.color))\n'
'clause is\n'
'selected depending on the class of the instance: it must '
'reference the\n'
- 'class of the instance or a base class thereof. The instance '
- 'can be\n'
- 'received by the handler and can carry additional information '
- 'about the\n'
- 'exceptional condition.\n'
+ 'class of the instance or a *non-virtual base class* thereof. '
+ 'The\n'
+ 'instance can be received by the handler and can carry '
+ 'additional\n'
+ 'information about the exceptional condition.\n'
'\n'
'Note:\n'
'\n'
'clause is\n'
'selected depending on the class of the instance: it must '
'reference the\n'
- 'class of the instance or a base class thereof. The instance '
- 'can be\n'
- 'received by the handler and can carry additional information '
- 'about the\n'
- 'exceptional condition.\n'
+ 'class of the instance or a *non-virtual base class* thereof. '
+ 'The\n'
+ 'instance can be received by the handler and can carry '
+ 'additional\n'
+ 'information about the exceptional condition.\n'
'\n'
'Note:\n'
'\n'
'on members\n'
' of hashed collections including "set", "frozenset", and '
'"dict".\n'
- ' "__hash__()" should return an integer. The only required '
- 'property\n'
- ' is that objects which compare equal have the same hash '
- 'value; it is\n'
- ' advised to mix together the hash values of the components '
- 'of the\n'
- ' object that also play a part in comparison of objects by '
- 'packing\n'
- ' them into a tuple and hashing the tuple. Example:\n'
+ ' The "__hash__()" method should return an integer. The '
+ 'only required\n'
+ ' property is that objects which compare equal have the '
+ 'same hash\n'
+ ' value; it is advised to mix together the hash values of '
+ 'the\n'
+ ' components of the object that also play a part in '
+ 'comparison of\n'
+ ' objects by packing them into a tuple and hashing the '
+ 'tuple.\n'
+ ' Example:\n'
'\n'
' def __hash__(self):\n'
' return hash((self.name, self.nick, self.color))\n'
'exception. For an except clause with an expression, that expression\n'
'is evaluated, and the clause matches the exception if the resulting\n'
'object is “compatible” with the exception. An object is compatible\n'
- 'with an exception if it is the class or a base class of the '
- 'exception\n'
- 'object, or a tuple containing an item that is the class or a base\n'
- 'class of the exception object.\n'
+ 'with an exception if the object is the class or a *non-virtual base\n'
+ 'class* of the exception object, or a tuple containing an item that '
+ 'is\n'
+ 'the class or a non-virtual base class of the exception object.\n'
'\n'
'If no except clause matches the exception, the search for an '
'exception\n'
]
warn(
- 'The smtpd module is deprecated and unmaintained. Please see aiosmtpd '
+ 'The smtpd module is deprecated and unmaintained and will be removed '
+ 'in Python 3.12. Please see aiosmtpd '
'(https://aiosmtpd.readthedocs.io/) for the recommended replacement.',
DeprecationWarning,
stacklevel=2)
if not first or subpattern:
import warnings
warnings.warn(
- 'Flags not at the start of the expression %r%s' % (
+ 'Flags not at the start of the expression %r%s'
+ ' but at position %d' % (
source.string[:20], # truncate long regexes
' (truncated)' if len(source.string) > 20 else '',
+ start,
),
DeprecationWarning, stacklevel=nested + 6
)
if self.ns.use_mp:
from test.libregrtest.runtest_mp import run_tests_multiprocess
- run_tests_multiprocess(self)
+ # If we're on windows and this is the parent runner (not a worker),
+ # track the load average.
+ if sys.platform == 'win32' and self.worker_test_name is None:
+ from test.libregrtest.win_utils import WindowsLoadTracker
+
+ try:
+ self.win_load_tracker = WindowsLoadTracker()
+ except PermissionError as error:
+ # Standard accounts may not have access to the performance
+ # counters.
+ print(f'Failed to create WindowsLoadTracker: {error}')
+
+ try:
+ run_tests_multiprocess(self)
+ finally:
+ if self.win_load_tracker is not None:
+ self.win_load_tracker.close()
+ self.win_load_tracker = None
else:
self.run_tests_sequential()
self.list_cases()
sys.exit(0)
- # If we're on windows and this is the parent runner (not a worker),
- # track the load average.
- if sys.platform == 'win32' and self.worker_test_name is None:
- from test.libregrtest.win_utils import WindowsLoadTracker
-
- try:
- self.win_load_tracker = WindowsLoadTracker()
- except FileNotFoundError as error:
- # Windows IoT Core and Windows Nano Server do not provide
- # typeperf.exe for x64, x86 or ARM
- print(f'Failed to create WindowsLoadTracker: {error}')
+ self.run_tests()
+ self.display_result()
- try:
- self.run_tests()
- self.display_result()
-
- if self.ns.verbose2 and self.bad:
- self.rerun_failed_tests()
- finally:
- if self.win_load_tracker is not None:
- self.win_load_tracker.close()
- self.win_load_tracker = None
+ if self.ns.verbose2 and self.bad:
+ self.rerun_failed_tests()
self.finalize()
+import _overlapped
+import _thread
import _winapi
import math
-import msvcrt
-import os
-import subprocess
-import uuid
+import struct
import winreg
-from test.support import os_helper
-from test.libregrtest.utils import print_warning
-# Max size of asynchronous reads
-BUFSIZE = 8192
# Seconds per measurement
SAMPLING_INTERVAL = 1
# Exponential damping factor to compute exponentially weighted moving average
# Initialize the load using the arithmetic mean of the first NVALUE values
# of the Processor Queue Length
NVALUE = 5
-# Windows registry subkey of HKEY_LOCAL_MACHINE where the counter names
-# of typeperf are registered
-COUNTER_REGISTRY_KEY = (r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
- r"\Perflib\CurrentLanguage")
class WindowsLoadTracker():
"""
- This class asynchronously interacts with the `typeperf` command to read
- the system load on Windows. Multiprocessing and threads can't be used
- here because they interfere with the test suite's cases for those
- modules.
+ This class asynchronously reads the performance counters to calculate
+ the system load on Windows. A "raw" thread is used here to prevent
+ interference with the test suite's cases for the threading module.
"""
def __init__(self):
+ # Pre-flight test for access to the performance data;
+ # `PermissionError` will be raised if not allowed
+ winreg.QueryInfoKey(winreg.HKEY_PERFORMANCE_DATA)
+
self._values = []
self._load = None
- self._buffer = ''
- self._popen = None
- self.start()
-
- def start(self):
- # Create a named pipe which allows for asynchronous IO in Windows
- pipe_name = r'\\.\pipe\typeperf_output_' + str(uuid.uuid4())
-
- open_mode = _winapi.PIPE_ACCESS_INBOUND
- open_mode |= _winapi.FILE_FLAG_FIRST_PIPE_INSTANCE
- open_mode |= _winapi.FILE_FLAG_OVERLAPPED
-
- # This is the read end of the pipe, where we will be grabbing output
- self.pipe = _winapi.CreateNamedPipe(
- pipe_name, open_mode, _winapi.PIPE_WAIT,
- 1, BUFSIZE, BUFSIZE, _winapi.NMPWAIT_WAIT_FOREVER, _winapi.NULL
- )
- # The write end of the pipe which is passed to the created process
- pipe_write_end = _winapi.CreateFile(
- pipe_name, _winapi.GENERIC_WRITE, 0, _winapi.NULL,
- _winapi.OPEN_EXISTING, 0, _winapi.NULL
- )
- # Open up the handle as a python file object so we can pass it to
- # subprocess
- command_stdout = msvcrt.open_osfhandle(pipe_write_end, 0)
-
- # Connect to the read end of the pipe in overlap/async mode
- overlap = _winapi.ConnectNamedPipe(self.pipe, overlapped=True)
- overlap.GetOverlappedResult(True)
-
- # Spawn off the load monitor
- counter_name = self._get_counter_name()
- command = ['typeperf', counter_name, '-si', str(SAMPLING_INTERVAL)]
- self._popen = subprocess.Popen(' '.join(command),
- stdout=command_stdout,
- cwd=os_helper.SAVEDCWD)
-
- # Close our copy of the write end of the pipe
- os.close(command_stdout)
-
- def _get_counter_name(self):
- # accessing the registry to get the counter localization name
- with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, COUNTER_REGISTRY_KEY) as perfkey:
- counters = winreg.QueryValueEx(perfkey, 'Counter')[0]
-
- # Convert [key1, value1, key2, value2, ...] list
- # to {key1: value1, key2: value2, ...} dict
- counters = iter(counters)
- counters_dict = dict(zip(counters, counters))
-
- # System counter has key '2' and Processor Queue Length has key '44'
- system = counters_dict['2']
- process_queue_length = counters_dict['44']
- return f'"\\{system}\\{process_queue_length}"'
-
- def close(self, kill=True):
- if self._popen is None:
+ self._running = _overlapped.CreateEvent(None, True, False, None)
+ self._stopped = _overlapped.CreateEvent(None, True, False, None)
+
+ _thread.start_new_thread(self._update_load, (), {})
+
+ def _update_load(self,
+ # localize module access to prevent shutdown errors
+ _wait=_winapi.WaitForSingleObject,
+ _signal=_overlapped.SetEvent):
+ # run until signaled to stop
+ while _wait(self._running, 1000):
+ self._calculate_load()
+ # notify stopped
+ _signal(self._stopped)
+
+ def _calculate_load(self,
+ # localize module access to prevent shutdown errors
+ _query=winreg.QueryValueEx,
+ _hkey=winreg.HKEY_PERFORMANCE_DATA,
+ _unpack=struct.unpack_from):
+ # get the 'System' object
+ data, _ = _query(_hkey, '2')
+ # PERF_DATA_BLOCK {
+ # WCHAR Signature[4] 8 +
+ # DWORD LittleEndian 4 +
+ # DWORD Version 4 +
+ # DWORD Revision 4 +
+ # DWORD TotalByteLength 4 +
+ # DWORD HeaderLength = 24 byte offset
+ # ...
+ # }
+ obj_start, = _unpack('L', data, 24)
+ # PERF_OBJECT_TYPE {
+ # DWORD TotalByteLength
+ # DWORD DefinitionLength
+ # DWORD HeaderLength
+ # ...
+ # }
+ data_start, defn_start = _unpack('4xLL', data, obj_start)
+ data_base = obj_start + data_start
+ defn_base = obj_start + defn_start
+ # find the 'Processor Queue Length' counter (index=44)
+ while defn_base < data_base:
+ # PERF_COUNTER_DEFINITION {
+ # DWORD ByteLength
+ # DWORD CounterNameTitleIndex
+ # ... [7 DWORDs/28 bytes]
+ # DWORD CounterOffset
+ # }
+ size, idx, offset = _unpack('LL28xL', data, defn_base)
+ defn_base += size
+ if idx == 44:
+ counter_offset = data_base + offset
+ # the counter is known to be PERF_COUNTER_RAWCOUNT (DWORD)
+ processor_queue_length, = _unpack('L', data, counter_offset)
+ break
+ else:
return
- self._load = None
-
- if kill:
- self._popen.kill()
- self._popen.wait()
- self._popen = None
-
- def __del__(self):
- self.close()
-
- def _parse_line(self, line):
- # typeperf outputs in a CSV format like this:
- # "07/19/2018 01:32:26.605","3.000000"
- # (date, process queue length)
- tokens = line.split(',')
- if len(tokens) != 2:
- raise ValueError
-
- value = tokens[1]
- if not value.startswith('"') or not value.endswith('"'):
- raise ValueError
- value = value[1:-1]
- return float(value)
-
- def _read_lines(self):
- overlapped, _ = _winapi.ReadFile(self.pipe, BUFSIZE, True)
- bytes_read, res = overlapped.GetOverlappedResult(False)
- if res != 0:
- return ()
-
- output = overlapped.getbuffer()
- output = output.decode('oem', 'replace')
- output = self._buffer + output
- lines = output.splitlines(True)
-
- # bpo-36670: typeperf only writes a newline *before* writing a value,
- # not after. Sometimes, the written line in incomplete (ex: only
- # timestamp, without the process queue length). Only pass the last line
- # to the parser if it's a valid value, otherwise store it in
- # self._buffer.
- try:
- self._parse_line(lines[-1])
- except ValueError:
- self._buffer = lines.pop(-1)
+ # We use an exponentially weighted moving average, imitating the
+ # load calculation on Unix systems.
+ # https://en.wikipedia.org/wiki/Load_(computing)#Unix-style_load_calculation
+ # https://en.wikipedia.org/wiki/Moving_average#Exponential_moving_average
+ if self._load is not None:
+ self._load = (self._load * LOAD_FACTOR_1
+ + processor_queue_length * (1.0 - LOAD_FACTOR_1))
+ elif len(self._values) < NVALUE:
+ self._values.append(processor_queue_length)
else:
- self._buffer = ''
+ self._load = sum(self._values) / len(self._values)
- return lines
+ def close(self, kill=True):
+ self.__del__()
+ return
+
+ def __del__(self,
+ # localize module access to prevent shutdown errors
+ _wait=_winapi.WaitForSingleObject,
+ _close=_winapi.CloseHandle,
+ _signal=_overlapped.SetEvent):
+ if self._running is not None:
+ # tell the update thread to quit
+ _signal(self._running)
+ # wait for the update thread to signal done
+ _wait(self._stopped, -1)
+ # cleanup events
+ _close(self._running)
+ _close(self._stopped)
+ self._running = self._stopped = None
def getloadavg(self):
- if self._popen is None:
- return None
-
- returncode = self._popen.poll()
- if returncode is not None:
- self.close(kill=False)
- return None
-
- try:
- lines = self._read_lines()
- except BrokenPipeError:
- self.close()
- return None
-
- for line in lines:
- line = line.rstrip()
-
- # Ignore the initial header:
- # "(PDH-CSV 4.0)","\\\\WIN\\System\\Processor Queue Length"
- if 'PDH-CSV' in line:
- continue
-
- # Ignore blank lines
- if not line:
- continue
-
- try:
- processor_queue_length = self._parse_line(line)
- except ValueError:
- print_warning("Failed to parse typeperf output: %a" % line)
- continue
-
- # We use an exponentially weighted moving average, imitating the
- # load calculation on Unix systems.
- # https://en.wikipedia.org/wiki/Load_(computing)#Unix-style_load_calculation
- # https://en.wikipedia.org/wiki/Moving_average#Exponential_moving_average
- if self._load is not None:
- self._load = (self._load * LOAD_FACTOR_1
- + processor_queue_length * (1.0 - LOAD_FACTOR_1))
- elif len(self._values) < NVALUE:
- self._values.append(processor_queue_length)
- else:
- self._load = sum(self._values) / len(self._values)
-
return self._load
"""This is a test module for test_pydoc"""
+import types
+import typing
+
__author__ = "Benjamin Peterson"
__credits__ = "Nobody"
__version__ = "1.2.3.4"
def is_it_true(self):
""" Return self.get_answer() """
return self.get_answer()
+ def __class_getitem__(self, item):
+ return types.GenericAlias(self, item)
def doc_func():
"""
def nodoc_func():
pass
+
+
+list_alias1 = typing.List[int]
+list_alias2 = list[int]
+c_alias = C[int]
+type_union1 = typing.Union[int, str]
+type_union2 = int | str
+++ /dev/null
-import asyncio
-import unittest
-import time
-
-def tearDownModule():
- asyncio.set_event_loop_policy(None)
-
-
-class SlowTask:
- """ Task will run for this defined time, ignoring cancel requests """
- TASK_TIMEOUT = 0.2
-
- def __init__(self):
- self.exited = False
-
- async def run(self):
- exitat = time.monotonic() + self.TASK_TIMEOUT
-
- while True:
- tosleep = exitat - time.monotonic()
- if tosleep <= 0:
- break
-
- try:
- await asyncio.sleep(tosleep)
- except asyncio.CancelledError:
- pass
-
- self.exited = True
-
-class AsyncioWaitForTest(unittest.TestCase):
-
- async def atest_asyncio_wait_for_cancelled(self):
- t = SlowTask()
-
- waitfortask = asyncio.create_task(asyncio.wait_for(t.run(), t.TASK_TIMEOUT * 2))
- await asyncio.sleep(0)
- waitfortask.cancel()
- await asyncio.wait({waitfortask})
-
- self.assertTrue(t.exited)
-
- def test_asyncio_wait_for_cancelled(self):
- asyncio.run(self.atest_asyncio_wait_for_cancelled())
-
- async def atest_asyncio_wait_for_timeout(self):
- t = SlowTask()
-
- try:
- await asyncio.wait_for(t.run(), t.TASK_TIMEOUT / 2)
- except asyncio.TimeoutError:
- pass
-
- self.assertTrue(t.exited)
-
- def test_asyncio_wait_for_timeout(self):
- asyncio.run(self.atest_asyncio_wait_for_timeout())
-
-
-if __name__ == '__main__':
- unittest.main()
sem.release()
self.assertFalse(sem.locked())
+ async def test_acquire_fifo_order(self):
+ sem = asyncio.Semaphore(1)
+ result = []
+
+ async def coro(tag):
+ await sem.acquire()
+ result.append(f'{tag}_1')
+ await asyncio.sleep(0.01)
+ sem.release()
+
+ await sem.acquire()
+ result.append(f'{tag}_2')
+ await asyncio.sleep(0.01)
+ sem.release()
+
+ t1 = asyncio.create_task(coro('c1'))
+ t2 = asyncio.create_task(coro('c2'))
+ t3 = asyncio.create_task(coro('c3'))
+
+ await asyncio.gather(t1, t2, t3)
+
+ self.assertEqual(
+ ['c1_1', 'c2_1', 'c3_1', 'c1_2', 'c2_2', 'c3_2'],
+ result
+ )
+
if __name__ == '__main__':
unittest.main()
return self
-# The following value can be used as a very small timeout:
-# it passes check "timeout > 0", but has almost
-# no effect on the test performance
-_EPSILON = 0.0001
-
-
class BaseTaskTests:
Task = None
self.loop.set_task_factory(self.new_task)
self.loop.create_future = lambda: self.new_future(self.loop)
-
def test_generic_alias(self):
task = self.__class__.Task[str]
self.assertEqual(task.__args__, (str,))
task._log_traceback = True
self.loop.run_until_complete(task)
- def test_wait_for_timeout_less_then_0_or_0_future_done(self):
- def gen():
- when = yield
- self.assertAlmostEqual(0, when)
-
- loop = self.new_test_loop(gen)
-
- fut = self.new_future(loop)
- fut.set_result('done')
-
- ret = loop.run_until_complete(asyncio.wait_for(fut, 0))
-
- self.assertEqual(ret, 'done')
- self.assertTrue(fut.done())
- self.assertAlmostEqual(0, loop.time())
-
- def test_wait_for_timeout_less_then_0_or_0_coroutine_do_not_started(self):
- def gen():
- when = yield
- self.assertAlmostEqual(0, when)
-
- loop = self.new_test_loop(gen)
-
- foo_started = False
-
- async def foo():
- nonlocal foo_started
- foo_started = True
-
- with self.assertRaises(asyncio.TimeoutError):
- loop.run_until_complete(asyncio.wait_for(foo(), 0))
-
- self.assertAlmostEqual(0, loop.time())
- self.assertEqual(foo_started, False)
-
- def test_wait_for_timeout_less_then_0_or_0(self):
- def gen():
- when = yield
- self.assertAlmostEqual(0.2, when)
- when = yield 0
- self.assertAlmostEqual(0, when)
-
- for timeout in [0, -1]:
- with self.subTest(timeout=timeout):
- loop = self.new_test_loop(gen)
-
- foo_running = None
-
- async def foo():
- nonlocal foo_running
- foo_running = True
- try:
- await asyncio.sleep(0.2)
- finally:
- foo_running = False
- return 'done'
-
- fut = self.new_task(loop, foo())
-
- with self.assertRaises(asyncio.TimeoutError):
- loop.run_until_complete(asyncio.wait_for(fut, timeout))
- self.assertTrue(fut.done())
- # it should have been cancelled due to the timeout
- self.assertTrue(fut.cancelled())
- self.assertAlmostEqual(0, loop.time())
- self.assertEqual(foo_running, False)
-
- def test_wait_for(self):
-
- def gen():
- when = yield
- self.assertAlmostEqual(0.2, when)
- when = yield 0
- self.assertAlmostEqual(0.1, when)
- when = yield 0.1
-
- loop = self.new_test_loop(gen)
-
- foo_running = None
-
- async def foo():
- nonlocal foo_running
- foo_running = True
- try:
- await asyncio.sleep(0.2)
- finally:
- foo_running = False
- return 'done'
-
- fut = self.new_task(loop, foo())
-
- with self.assertRaises(asyncio.TimeoutError):
- loop.run_until_complete(asyncio.wait_for(fut, 0.1))
- self.assertTrue(fut.done())
- # it should have been cancelled due to the timeout
- self.assertTrue(fut.cancelled())
- self.assertAlmostEqual(0.1, loop.time())
- self.assertEqual(foo_running, False)
-
- def test_wait_for_blocking(self):
- loop = self.new_test_loop()
-
- async def coro():
- return 'done'
-
- res = loop.run_until_complete(asyncio.wait_for(coro(), timeout=None))
- self.assertEqual(res, 'done')
-
- def test_wait_for_race_condition(self):
-
- def gen():
- yield 0.1
- yield 0.1
- yield 0.1
-
- loop = self.new_test_loop(gen)
-
- fut = self.new_future(loop)
- task = asyncio.wait_for(fut, timeout=0.2)
- loop.call_later(0.1, fut.set_result, "ok")
- res = loop.run_until_complete(task)
- self.assertEqual(res, "ok")
-
- def test_wait_for_cancellation_race_condition(self):
- async def inner():
- with contextlib.suppress(asyncio.CancelledError):
- await asyncio.sleep(1)
- return 1
-
- async def main():
- result = await asyncio.wait_for(inner(), timeout=.01)
- self.assertEqual(result, 1)
-
- asyncio.run(main())
-
- def test_wait_for_waits_for_task_cancellation(self):
- loop = asyncio.new_event_loop()
- self.addCleanup(loop.close)
-
- task_done = False
-
- async def foo():
- async def inner():
- nonlocal task_done
- try:
- await asyncio.sleep(0.2)
- except asyncio.CancelledError:
- await asyncio.sleep(_EPSILON)
- raise
- finally:
- task_done = True
-
- inner_task = self.new_task(loop, inner())
-
- await asyncio.wait_for(inner_task, timeout=_EPSILON)
-
- with self.assertRaises(asyncio.TimeoutError) as cm:
- loop.run_until_complete(foo())
-
- self.assertTrue(task_done)
- chained = cm.exception.__context__
- self.assertEqual(type(chained), asyncio.CancelledError)
-
- def test_wait_for_waits_for_task_cancellation_w_timeout_0(self):
- loop = asyncio.new_event_loop()
- self.addCleanup(loop.close)
-
- task_done = False
-
- async def foo():
- async def inner():
- nonlocal task_done
- try:
- await asyncio.sleep(10)
- except asyncio.CancelledError:
- await asyncio.sleep(_EPSILON)
- raise
- finally:
- task_done = True
-
- inner_task = self.new_task(loop, inner())
- await asyncio.sleep(_EPSILON)
- await asyncio.wait_for(inner_task, timeout=0)
-
- with self.assertRaises(asyncio.TimeoutError) as cm:
- loop.run_until_complete(foo())
-
- self.assertTrue(task_done)
- chained = cm.exception.__context__
- self.assertEqual(type(chained), asyncio.CancelledError)
-
- def test_wait_for_reraises_exception_during_cancellation(self):
- loop = asyncio.new_event_loop()
- self.addCleanup(loop.close)
-
- class FooException(Exception):
- pass
-
- async def foo():
- async def inner():
- try:
- await asyncio.sleep(0.2)
- finally:
- raise FooException
-
- inner_task = self.new_task(loop, inner())
-
- await asyncio.wait_for(inner_task, timeout=_EPSILON)
-
- with self.assertRaises(FooException):
- loop.run_until_complete(foo())
-
- def test_wait_for_self_cancellation(self):
- loop = asyncio.new_event_loop()
- self.addCleanup(loop.close)
-
- async def foo():
- async def inner():
- try:
- await asyncio.sleep(0.3)
- except asyncio.CancelledError:
- try:
- await asyncio.sleep(0.3)
- except asyncio.CancelledError:
- await asyncio.sleep(0.3)
-
- return 42
-
- inner_task = self.new_task(loop, inner())
-
- wait = asyncio.wait_for(inner_task, timeout=0.1)
-
- # Test that wait_for itself is properly cancellable
- # even when the initial task holds up the initial cancellation.
- task = self.new_task(loop, wait)
- await asyncio.sleep(0.2)
- task.cancel()
-
- with self.assertRaises(asyncio.CancelledError):
- await task
-
- self.assertEqual(await inner_task, 42)
-
- loop.run_until_complete(foo())
-
def test_wait(self):
def gen():
'test_task_source_traceback'))
self.loop.run_until_complete(task)
- def _test_cancel_wait_for(self, timeout):
- loop = asyncio.new_event_loop()
- self.addCleanup(loop.close)
-
- async def blocking_coroutine():
- fut = self.new_future(loop)
- # Block: fut result is never set
- await fut
-
- task = loop.create_task(blocking_coroutine())
-
- wait = loop.create_task(asyncio.wait_for(task, timeout))
- loop.call_soon(wait.cancel)
-
- self.assertRaises(asyncio.CancelledError,
- loop.run_until_complete, wait)
-
- # Python issue #23219: cancelling the wait must also cancel the task
- self.assertTrue(task.cancelled())
-
- def test_cancel_blocking_wait_for(self):
- self._test_cancel_wait_for(None)
-
- def test_cancel_wait_for(self):
- self._test_cancel_wait_for(60.0)
-
def test_cancel_gather_1(self):
"""Ensure that a gathering future refuses to be cancelled once all
children are done"""
--- /dev/null
+import asyncio
+import unittest
+import time
+
+
+def tearDownModule():
+ asyncio.set_event_loop_policy(None)
+
+
+# The following value can be used as a very small timeout:
+# it passes check "timeout > 0", but has almost
+# no effect on the test performance
+_EPSILON = 0.0001
+
+
+class SlowTask:
+ """ Task will run for this defined time, ignoring cancel requests """
+ TASK_TIMEOUT = 0.2
+
+ def __init__(self):
+ self.exited = False
+
+ async def run(self):
+ exitat = time.monotonic() + self.TASK_TIMEOUT
+
+ while True:
+ tosleep = exitat - time.monotonic()
+ if tosleep <= 0:
+ break
+
+ try:
+ await asyncio.sleep(tosleep)
+ except asyncio.CancelledError:
+ pass
+
+ self.exited = True
+
+
+class AsyncioWaitForTest(unittest.IsolatedAsyncioTestCase):
+
+ async def test_asyncio_wait_for_cancelled(self):
+ t = SlowTask()
+
+ waitfortask = asyncio.create_task(
+ asyncio.wait_for(t.run(), t.TASK_TIMEOUT * 2))
+ await asyncio.sleep(0)
+ waitfortask.cancel()
+ await asyncio.wait({waitfortask})
+
+ self.assertTrue(t.exited)
+
+ async def test_asyncio_wait_for_timeout(self):
+ t = SlowTask()
+
+ try:
+ await asyncio.wait_for(t.run(), t.TASK_TIMEOUT / 2)
+ except asyncio.TimeoutError:
+ pass
+
+ self.assertTrue(t.exited)
+
+ async def test_wait_for_timeout_less_then_0_or_0_future_done(self):
+ loop = asyncio.get_running_loop()
+
+ fut = loop.create_future()
+ fut.set_result('done')
+
+ t0 = loop.time()
+ ret = await asyncio.wait_for(fut, 0)
+ t1 = loop.time()
+
+ self.assertEqual(ret, 'done')
+ self.assertTrue(fut.done())
+ self.assertLess(t1 - t0, 0.1)
+
+ async def test_wait_for_timeout_less_then_0_or_0_coroutine_do_not_started(self):
+ loop = asyncio.get_running_loop()
+
+ foo_started = False
+
+ async def foo():
+ nonlocal foo_started
+ foo_started = True
+
+ with self.assertRaises(asyncio.TimeoutError):
+ t0 = loop.time()
+ await asyncio.wait_for(foo(), 0)
+ t1 = loop.time()
+
+ self.assertEqual(foo_started, False)
+ self.assertLess(t1 - t0, 0.1)
+
+ async def test_wait_for_timeout_less_then_0_or_0(self):
+ loop = asyncio.get_running_loop()
+
+ for timeout in [0, -1]:
+ with self.subTest(timeout=timeout):
+ foo_running = None
+ started = loop.create_future()
+
+ async def foo():
+ nonlocal foo_running
+ foo_running = True
+ started.set_result(None)
+ try:
+ await asyncio.sleep(10)
+ finally:
+ foo_running = False
+ return 'done'
+
+ fut = asyncio.create_task(foo())
+ await started
+
+ with self.assertRaises(asyncio.TimeoutError):
+ t0 = loop.time()
+ await asyncio.wait_for(fut, timeout)
+ t1 = loop.time()
+
+ self.assertTrue(fut.done())
+ # it should have been cancelled due to the timeout
+ self.assertTrue(fut.cancelled())
+ self.assertEqual(foo_running, False)
+ self.assertLess(t1 - t0, 0.1)
+
+ async def test_wait_for(self):
+ loop = asyncio.get_running_loop()
+ foo_running = None
+
+ async def foo():
+ nonlocal foo_running
+ foo_running = True
+ try:
+ await asyncio.sleep(10)
+ finally:
+ foo_running = False
+ return 'done'
+
+ fut = asyncio.create_task(foo())
+
+ with self.assertRaises(asyncio.TimeoutError):
+ t0 = loop.time()
+ await asyncio.wait_for(fut, 0.1)
+ t1 = loop.time()
+ self.assertTrue(fut.done())
+ # it should have been cancelled due to the timeout
+ self.assertTrue(fut.cancelled())
+ self.assertLess(t1 - t0, 0.5)
+ self.assertEqual(foo_running, False)
+
+ async def test_wait_for_blocking(self):
+ async def coro():
+ return 'done'
+
+ res = await asyncio.wait_for(coro(), timeout=None)
+ self.assertEqual(res, 'done')
+
+ async def test_wait_for_race_condition(self):
+ loop = asyncio.get_running_loop()
+
+ fut = loop.create_future()
+ task = asyncio.wait_for(fut, timeout=0.2)
+ loop.call_later(0.1, fut.set_result, "ok")
+ res = await task
+ self.assertEqual(res, "ok")
+
+ async def test_wait_for_cancellation_race_condition(self):
+ async def inner():
+ with self.assertRaises(asyncio.CancelledError):
+ await asyncio.sleep(1)
+ return 1
+
+ result = await asyncio.wait_for(inner(), timeout=.01)
+ self.assertEqual(result, 1)
+
+ async def test_wait_for_waits_for_task_cancellation(self):
+ task_done = False
+
+ async def inner():
+ nonlocal task_done
+ try:
+ await asyncio.sleep(10)
+ except asyncio.CancelledError:
+ await asyncio.sleep(_EPSILON)
+ raise
+ finally:
+ task_done = True
+
+ inner_task = asyncio.create_task(inner())
+
+ with self.assertRaises(asyncio.TimeoutError) as cm:
+ await asyncio.wait_for(inner_task, timeout=_EPSILON)
+
+ self.assertTrue(task_done)
+ chained = cm.exception.__context__
+ self.assertEqual(type(chained), asyncio.CancelledError)
+
+ async def test_wait_for_waits_for_task_cancellation_w_timeout_0(self):
+ task_done = False
+
+ async def foo():
+ async def inner():
+ nonlocal task_done
+ try:
+ await asyncio.sleep(10)
+ except asyncio.CancelledError:
+ await asyncio.sleep(_EPSILON)
+ raise
+ finally:
+ task_done = True
+
+ inner_task = asyncio.create_task(inner())
+ await asyncio.sleep(_EPSILON)
+ await asyncio.wait_for(inner_task, timeout=0)
+
+ with self.assertRaises(asyncio.TimeoutError) as cm:
+ await foo()
+
+ self.assertTrue(task_done)
+ chained = cm.exception.__context__
+ self.assertEqual(type(chained), asyncio.CancelledError)
+
+ async def test_wait_for_reraises_exception_during_cancellation(self):
+ class FooException(Exception):
+ pass
+
+ async def foo():
+ async def inner():
+ try:
+ await asyncio.sleep(0.2)
+ finally:
+ raise FooException
+
+ inner_task = asyncio.create_task(inner())
+
+ await asyncio.wait_for(inner_task, timeout=_EPSILON)
+
+ with self.assertRaises(FooException):
+ await foo()
+
+ async def test_wait_for_self_cancellation(self):
+ async def inner():
+ try:
+ await asyncio.sleep(0.3)
+ except asyncio.CancelledError:
+ try:
+ await asyncio.sleep(0.3)
+ except asyncio.CancelledError:
+ await asyncio.sleep(0.3)
+
+ return 42
+
+ inner_task = asyncio.create_task(inner())
+
+ wait = asyncio.wait_for(inner_task, timeout=0.1)
+
+ # Test that wait_for itself is properly cancellable
+ # even when the initial task holds up the initial cancellation.
+ task = asyncio.create_task(wait)
+ await asyncio.sleep(0.2)
+ task.cancel()
+
+ with self.assertRaises(asyncio.CancelledError):
+ await task
+
+ self.assertEqual(await inner_task, 42)
+
+ async def _test_cancel_wait_for(self, timeout):
+ loop = asyncio.get_running_loop()
+
+ async def blocking_coroutine():
+ fut = loop.create_future()
+ # Block: fut result is never set
+ await fut
+
+ task = asyncio.create_task(blocking_coroutine())
+
+ wait = asyncio.create_task(asyncio.wait_for(task, timeout))
+ loop.call_soon(wait.cancel)
+
+ with self.assertRaises(asyncio.CancelledError):
+ await wait
+
+ # Python issue #23219: cancelling the wait must also cancel the task
+ self.assertTrue(task.cancelled())
+
+ async def test_cancel_blocking_wait_for(self):
+ await self._test_cancel_wait_for(None)
+
+ async def test_cancel_wait_for(self):
+ await self._test_cancel_wait_for(60.0)
+
+
+if __name__ == '__main__':
+ unittest.main()
import binascii
import array
import re
-from test.support import warnings_helper
+from test.support import bigmemtest, _1G, _4G, warnings_helper
# Note: "*_hex" functions are aliases for "(un)hexlify"
class MemoryviewBinASCIITest(BinASCIITest):
type2test = memoryview
+class ChecksumBigBufferTestCase(unittest.TestCase):
+ """bpo-38256 - check that inputs >=4 GiB are handled correctly."""
+
+ @bigmemtest(size=_4G + 4, memuse=1, dry_run=False)
+ def test_big_buffer(self, size):
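+        # 4 GiB + 4 bytes of input exercises the length > UINT_MAX path
+        # added to binascii.crc32() for bpo-38256.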
+ data = b"nyan" * (_1G + 1)
+ self.assertEqual(binascii.crc32(data), 1044521549)
+
if __name__ == "__main__":
unittest.main()
self.assertTrue(data.find(b'1 loop') != -1)
self.assertTrue(data.find(b'__main__.Timer') != -1)
+    def test_relativedir_bug46421(self):
+        # Test `python -m unittest` with a relative directory beginning with ./
+        # Note: we have to switch to the project's top-level directory, as per
+        # the Python unittest wiki; os_helper.change_cwd() switches back on exit.
+        projectlibpath = os.path.dirname(__file__).removesuffix("test")
+        with os_helper.change_cwd(projectlibpath):
+            # Testing with and without ./
+            assert_python_ok('-m', 'unittest', "test/test_longexp.py")
+            assert_python_ok('-m', 'unittest', "./test/test_longexp.py")
+
def test_run_code(self):
# Test expected operation of the '-c' switch
# Switch needs an argument
"""
+def test_run_doctestsuite_multiple_times():
+ """
+ It was not possible to run the same DocTestSuite multiple times
+ http://bugs.python.org/issue2604
+ http://bugs.python.org/issue9736
+
+ >>> import unittest
+ >>> import test.sample_doctest
+ >>> suite = doctest.DocTestSuite(test.sample_doctest)
+ >>> suite.run(unittest.TestResult())
+ <unittest.result.TestResult run=9 errors=0 failures=4>
+ >>> suite.run(unittest.TestResult())
+ <unittest.result.TestResult run=9 errors=0 failures=4>
+ """
+
+
def load_tests(loader, tests, pattern):
tests.addTest(doctest.DocTestSuite(doctest))
tests.addTest(doctest.DocTestSuite())
| say_no(self)
|\x20\x20
| ----------------------------------------------------------------------
+ | Class methods defined here:
+ |\x20\x20
+ | __class_getitem__(item) from builtins.type
+ |\x20\x20
+ | ----------------------------------------------------------------------
| Data descriptors defined here:
|\x20\x20
| __dict__
DATA
__xyz__ = 'X, Y and Z'
+ c_alias = test.pydoc_mod.C[int]
+ list_alias1 = typing.List[int]
+ list_alias2 = list[int]
+ type_union1 = typing.Union[int, str]
+ type_union2 = int | str
VERSION
1.2.3.4
<p><tt>This is a test module for test_pydoc</tt></p>
<p>
<table width="100%%" cellspacing=0 cellpadding=2 border=0 summary="section">
+<tr bgcolor="#aa55cc">
+<td colspan=3 valign=bottom> <br>
+<font color="#ffffff" face="helvetica, arial"><big><strong>Modules</strong></big></font></td></tr>
+\x20\x20\x20\x20
+<tr><td bgcolor="#aa55cc"><tt> </tt></td><td> </td>
+<td width="100%%"><table width="100%%" summary="list"><tr><td width="25%%" valign=top><a href="types.html">types</a><br>
+</td><td width="25%%" valign=top><a href="typing.html">typing</a><br>
+</td><td width="25%%" valign=top></td><td width="25%%" valign=top></td></tr></table></td></tr></table><p>
+<table width="100%%" cellspacing=0 cellpadding=2 border=0 summary="section">
<tr bgcolor="#ee77aa">
<td colspan=3 valign=bottom> <br>
<font color="#ffffff" face="helvetica, arial"><big><strong>Classes</strong></big></font></td></tr>
<dl><dt><a name="C-say_no"><strong>say_no</strong></a>(self)</dt></dl>
+<hr>
+Class methods defined here:<br>
+<dl><dt><a name="C-__class_getitem__"><strong>__class_getitem__</strong></a>(item)<font color="#909090"><font face="helvetica, arial"> from <a href="builtins.html#type">builtins.type</a></font></font></dt></dl>
+
<hr>
Data descriptors defined here:<br>
<dl><dt><strong>__dict__</strong></dt>
<font color="#ffffff" face="helvetica, arial"><big><strong>Data</strong></big></font></td></tr>
\x20\x20\x20\x20
<tr><td bgcolor="#55aa55"><tt> </tt></td><td> </td>
-<td width="100%%"><strong>__xyz__</strong> = 'X, Y and Z'</td></tr></table><p>
+<td width="100%%"><strong>__xyz__</strong> = 'X, Y and Z'<br>
+<strong>c_alias</strong> = test.pydoc_mod.C[int]<br>
+<strong>list_alias1</strong> = typing.List[int]<br>
+<strong>list_alias2</strong> = list[int]<br>
+<strong>type_union1</strong> = typing.Union[int, str]<br>
+<strong>type_union2</strong> = int | str</td></tr></table><p>
<table width="100%%" cellspacing=0 cellpadding=2 border=0 summary="section">
<tr bgcolor="#7799ee">
<td colspan=3 valign=bottom> <br>
expected = 'C in module %s object' % __name__
self.assertIn(expected, pydoc.render_doc(c))
+ def test_generic_alias(self):
+ self.assertEqual(pydoc.describe(typing.List[int]), '_GenericAlias')
+ doc = pydoc.render_doc(typing.List[int], renderer=pydoc.plaintext)
+ self.assertIn('_GenericAlias in module typing', doc)
+ self.assertIn('List = class list(object)', doc)
+ self.assertIn(list.__doc__.strip().splitlines()[0], doc)
+
+ self.assertEqual(pydoc.describe(list[int]), 'GenericAlias')
+ doc = pydoc.render_doc(list[int], renderer=pydoc.plaintext)
+ self.assertIn('GenericAlias in module builtins', doc)
+ self.assertIn('\nclass list(object)', doc)
+ self.assertIn(list.__doc__.strip().splitlines()[0], doc)
+
+ def test_union_type(self):
+ self.assertEqual(pydoc.describe(typing.Union[int, str]), '_UnionGenericAlias')
+ doc = pydoc.render_doc(typing.Union[int, str], renderer=pydoc.plaintext)
+ self.assertIn('_UnionGenericAlias in module typing', doc)
+ self.assertIn('Union = typing.Union', doc)
+ if typing.Union.__doc__:
+ self.assertIn(typing.Union.__doc__.strip().splitlines()[0], doc)
+
+ self.assertEqual(pydoc.describe(int | str), 'UnionType')
+ doc = pydoc.render_doc(int | str, renderer=pydoc.plaintext)
+ self.assertIn('UnionType in module types object', doc)
+ self.assertIn('\nclass UnionType(builtins.object)', doc)
+ self.assertIn(types.UnionType.__doc__.strip().splitlines()[0], doc)
+
+ def test_special_form(self):
+ self.assertEqual(pydoc.describe(typing.Any), '_SpecialForm')
+ doc = pydoc.render_doc(typing.Any, renderer=pydoc.plaintext)
+ self.assertIn('_SpecialForm in module typing', doc)
+ if typing.Any.__doc__:
+ self.assertIn('Any = typing.Any', doc)
+ self.assertIn(typing.Any.__doc__.strip().splitlines()[0], doc)
+ else:
+ self.assertIn('Any = class _SpecialForm(_Final)', doc)
+
def test_typing_pydoc(self):
def foo(data: typing.List[typing.Any],
x: int) -> typing.Iterator[typing.Tuple[int, typing.Any]]:
self.assertTrue(re.match(p, lower_char))
self.assertEqual(
str(warns.warnings[0].message),
- 'Flags not at the start of the expression %r' % p
+ 'Flags not at the start of the expression %r'
+ ' but at position 1' % p
)
self.assertEqual(warns.warnings[0].filename, __file__)
self.assertTrue(re.match(p, lower_char))
self.assertEqual(
str(warns.warnings[0].message),
- 'Flags not at the start of the expression %r (truncated)' % p[:20]
+ 'Flags not at the start of the expression %r (truncated)'
+ ' but at position 1' % p[:20]
)
self.assertEqual(warns.warnings[0].filename, __file__)
self.assertTrue(re.match(p, b'a'))
self.assertEqual(
str(warns.warnings[0].message),
- 'Flags not at the start of the expression %r' % p
+ 'Flags not at the start of the expression %r'
+ ' but at position 1' % p
)
self.assertEqual(warns.warnings[0].filename, __file__)
self.checkEnumParam(widget, 'activestyle',
'dotbox', 'none', 'underline')
- test_justify = requires_tcl(8, 6, 5)(StandardOptionsTests.test_configure_justify)
+ test_configure_justify = requires_tcl(8, 6, 5)(StandardOptionsTests.test_configure_justify)
def test_configure_listvariable(self):
widget = self.create()
def test_configure_from(self):
widget = self.create()
- conv = False if get_tk_patchlevel() >= (8, 6, 10) else float_round
+ conv = float if get_tk_patchlevel() >= (8, 6, 10) else float_round
self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=conv)
def test_configure_label(self):
from tkinter import ttk
from test import support
from test.support import requires
-from tkinter.test.support import AbstractTkTest
+from tkinter.test.support import AbstractTkTest, get_tk_patchlevel
requires('gui')
newname = f'C.{name}'
self.assertEqual(style.map(newname), {})
style.map(newname, **default)
+ if theme == 'alt' and name == '.' and get_tk_patchlevel() < (8, 6, 1):
+ default['embossed'] = [('disabled', '1')]
self.assertEqual(style.map(newname), default)
for key, value in default.items():
self.assertEqual(style.map(newname, key), value)
name = rel_path
# on Windows both '\' and '/' are used as path
# separators. Better to replace both than rely on os.path.sep
- return name[:-3].replace('\\', '.').replace('/', '.')
+ return os.path.normpath(name)[:-3].replace('\\', '.').replace('/', '.')
return name
def _convert_names(names):
self._lock = lock
self._writing = writing
self.seekable = file.seekable
- self.tell = file.tell
+
+ def tell(self):
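+        # Return the position tracked by this _SharedFile rather than
+        # delegating to the shared file object, whose position another
+        # thread may have moved (bpo-42369).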
+ return self._pos
def seek(self, offset, whence=0):
with self._lock:
Arnaud Ysmal
Bernard Yue
Moshe Zadka
+Bader Zaidan
Elias Zamaria
Milan Zamazal
Artur Zaprzala
Python News
+++++++++++
+What's New in Python 3.10.4 final?
+==================================
+
+*Release date: 2022-03-23*
+
+Core and Builtins
+-----------------
+
+- bpo-46968: Check for the existence of the "sys/auxv.h" header in
+  :mod:`faulthandler` to avoid compilation problems on systems where this
+  header doesn't exist. Patch by Pablo Galindo.
+
+Library
+-------
+
+- bpo-23691: Protect the :func:`re.finditer` iterator against re-entrant use.
+
+- bpo-42369: Fix thread safety of :meth:`zipfile._SharedFile.tell` to avoid
+ a "zipfile.BadZipFile: Bad CRC-32 for file" exception when reading a
+ :class:`ZipFile` from multiple threads.
+
+- bpo-38256: Fix :func:`binascii.crc32` when it is compiled to use zlib's
+  crc32 so that it works properly on inputs of 4 GiB or more instead of
+  returning the wrong result. The workaround prior to this was to always
+  feed the function data in increments smaller than 4 GiB, or to call the
+  zlib module function instead (a sketch of the chunking workaround follows
+  this list).
+
+- bpo-39394: A warning about inline flags not at the start of the regular
+ expression now contains the position of the flag.
+
+- bpo-47061: Deprecate the various modules listed by :pep:`594`:
+
+ aifc, asynchat, asyncore, audioop, cgi, cgitb, chunk, crypt, imghdr,
+ msilib, nntplib, nis, ossaudiodev, pipes, smtpd, sndhdr, spwd, sunau,
+ telnetlib, uu, xdrlib
+
+- bpo-2604: Fix bug where doctests using globals would fail when run
+ multiple times.
+
+- bpo-45997: Fix :class:`asyncio.Semaphore` re-acquiring FIFO order.
+
+- bpo-47022: The :mod:`asynchat`, :mod:`asyncore` and :mod:`smtpd` modules
+  have been deprecated since at least Python 3.6. Their documentation and
+  deprecation warnings have now been updated to note that they will be
+  removed in Python 3.12 (:pep:`594`).
+
+- bpo-46421: Fix a unittest issue where, if the command was invoked as
+  ``python -m unittest`` and the filename(s) began with a dot (.), a
+  ``ValueError`` was raised.
+
+- bpo-40296: Fix support for generic aliases in :mod:`pydoc`.
+
+
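+A minimal sketch of the chunking workaround mentioned in the bpo-38256 entry
+above; ``crc32_chunked`` and ``CHUNK`` are illustrative names, not part of
+the patch::
+
+    import binascii
+
+    CHUNK = 1 << 30  # stay well under the 4 GiB limit of the old binding
+
+    def crc32_chunked(data, crc=0):
+        # Chain crc32 over < 4 GiB slices so the 32-bit length handed to
+        # zlib's crc32() is never truncated; chaining via the second
+        # argument yields the same result as a single call.
+        view = memoryview(data)
+        for start in range(0, len(view), CHUNK):
+            crc = binascii.crc32(view[start:start + CHUNK], crc)
+        return crc
+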
What's New in Python 3.10.3 final?
==================================
Py_DECREF(tp);
}
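+/* bpo-23691: a scanner's embedded SRE_STATE is not re-entrant, so reject
+   any call on a scanner that is already executing rather than corrupting
+   its state. */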
+static int
+scanner_begin(ScannerObject* self)
+{
+ if (self->executing) {
+ PyErr_SetString(PyExc_ValueError,
+ "regular expression scanner already executing");
+ return 0;
+ }
+ self->executing = 1;
+ return 1;
+}
+
+static void
+scanner_end(ScannerObject* self)
+{
+ assert(self->executing);
+ self->executing = 0;
+}
+
/*[clinic input]
_sre.SRE_Scanner.match
PyObject* match;
Py_ssize_t status;
- if (state->start == NULL)
+ if (!scanner_begin(self)) {
+ return NULL;
+ }
+ if (state->start == NULL) {
+ scanner_end(self);
Py_RETURN_NONE;
+ }
state_reset(state);
state->ptr = state->start;
status = sre_match(state, PatternObject_GetCode(self->pattern));
- if (PyErr_Occurred())
+ if (PyErr_Occurred()) {
+ scanner_end(self);
return NULL;
+ }
match = pattern_new_match(module_state, (PatternObject*) self->pattern,
state, status);
state->start = state->ptr;
}
+ scanner_end(self);
return match;
}
PyObject* match;
Py_ssize_t status;
- if (state->start == NULL)
+ if (!scanner_begin(self)) {
+ return NULL;
+ }
+ if (state->start == NULL) {
+ scanner_end(self);
Py_RETURN_NONE;
+ }
state_reset(state);
state->ptr = state->start;
status = sre_search(state, PatternObject_GetCode(self->pattern));
- if (PyErr_Occurred())
+ if (PyErr_Occurred()) {
+ scanner_end(self);
return NULL;
+ }
match = pattern_new_match(module_state, (PatternObject*) self->pattern,
state, status);
state->start = state->ptr;
}
+ scanner_end(self);
return match;
}
if (!scanner)
return NULL;
scanner->pattern = NULL;
+ scanner->executing = 0;
/* create search state object */
if (!state_init(&scanner->state, self, string, pos, endpos)) {
/*[clinic end generated code: output=52cf59056a78593b input=bbe340bc99d25aa8]*/
#ifdef USE_ZLIB_CRC32
-/* This was taken from zlibmodule.c PyZlib_crc32 (but is PY_SSIZE_T_CLEAN) */
+/* The same core as zlibmodule.c zlib_crc32_impl. */
{
- const Byte *buf;
- Py_ssize_t len;
- int signed_val;
-
- buf = (Byte*)data->buf;
- len = data->len;
- signed_val = crc32(crc, buf, len);
- return (unsigned int)signed_val & 0xffffffffU;
+ unsigned char *buf = data->buf;
+ Py_ssize_t len = data->len;
+
+ /* Avoid truncation of length for very large buffers. crc32() takes
+ length as an unsigned int, which may be narrower than Py_ssize_t. */
+ while ((size_t)len > UINT_MAX) {
+ crc = crc32(crc, buf, UINT_MAX);
+ buf += (size_t) UINT_MAX;
+ len -= (size_t) UINT_MAX;
+ }
+ crc = crc32(crc, buf, (unsigned int)len);
+ return crc & 0xffffffff;
}
#else /* USE_ZLIB_CRC32 */
{ /* By Jim Ahlstrom; All rights transferred to CNRI */
# define FAULTHANDLER_USE_ALT_STACK
#endif
-#if defined(FAULTHANDLER_USE_ALT_STACK) && defined(HAVE_LINUX_AUXVEC_H)
-# include <linux/auxvec.h>
-# include <sys/auxv.h>
+#if defined(FAULTHANDLER_USE_ALT_STACK) && defined(HAVE_LINUX_AUXVEC_H) && defined(HAVE_SYS_AUXV_H)
+# include <linux/auxvec.h> // AT_MINSIGSTKSZ
+# include <sys/auxv.h> // getauxval()
#endif
/* Allocate at maximum 100 MiB of the stack to raise the stack overflow */
PyObject_HEAD
PyObject* pattern;
SRE_STATE state;
+ int executing;
} ScannerObject;
#endif
-This is Python version 3.10.3
+This is Python version 3.10.4
=============================
.. image:: https://travis-ci.com/python/cpython.svg?branch=master
sys/times.h sys/types.h sys/uio.h sys/un.h sys/utsname.h sys/wait.h pty.h \
libutil.h sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \
linux/tipc.h linux/random.h spawn.h util.h alloca.h endian.h \
-sys/endian.h sys/sysmacros.h linux/auxvec.h linux/memfd.h linux/wait.h sys/memfd.h \
+sys/endian.h sys/sysmacros.h linux/auxvec.h sys/auxv.h linux/memfd.h linux/wait.h sys/memfd.h \
sys/mman.h sys/eventfd.h
do :
as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh`
sys/times.h sys/types.h sys/uio.h sys/un.h sys/utsname.h sys/wait.h pty.h \
libutil.h sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \
linux/tipc.h linux/random.h spawn.h util.h alloca.h endian.h \
-sys/endian.h sys/sysmacros.h linux/auxvec.h linux/memfd.h linux/wait.h sys/memfd.h \
+sys/endian.h sys/sysmacros.h linux/auxvec.h sys/auxv.h linux/memfd.h linux/wait.h sys/memfd.h \
sys/mman.h sys/eventfd.h)
AC_HEADER_DIRENT
AC_HEADER_MAJOR
/* Define to 1 if you have the <sys/audioio.h> header file. */
#undef HAVE_SYS_AUDIOIO_H
+/* Define to 1 if you have the <sys/auxv.h> header file. */
+#undef HAVE_SYS_AUXV_H
+
/* Define to 1 if you have the <sys/bsdtty.h> header file. */
#undef HAVE_SYS_BSDTTY_H