variables:
testRunTitle: '$(build.sourceBranchName)-linux'
testRunPlatform: linux
- openssl_version: 1.1.0j
+ openssl_version: 1.1.1c
steps:
- template: ./posix-steps.yml
variables:
testRunTitle: '$(Build.SourceBranchName)-linux-coverage'
testRunPlatform: linux-coverage
- openssl_version: 1.1.0j
+ openssl_version: 1.1.1c
steps:
- template: ./posix-steps.yml
clean: true
fetchDepth: 5
+# Work around a known issue affecting Ubuntu VMs on Pipelines
+- script: sudo setfacl -Rb /home/vsts
+ displayName: 'Workaround ACL issue'
+
- script: ${{ parameters.sudo_dependencies }} ./.azure-pipelines/posix-deps-${{ parameters.dependencies }}.sh $(openssl_version)
displayName: 'Install dependencies'
variables:
testRunTitle: '$(system.pullRequest.TargetBranch)-linux'
testRunPlatform: linux
- openssl_version: 1.1.0j
+ openssl_version: 1.1.1c
steps:
- template: ./posix-steps.yml
variables:
testRunTitle: '$(Build.SourceBranchName)-linux-coverage'
testRunPlatform: linux-coverage
- openssl_version: 1.1.0j
+ openssl_version: 1.1.1c
steps:
- template: ./posix-steps.yml
where ``<builder>`` is one of html, text, latex, or htmlhelp (for explanations
see the make targets above).
+Deprecation header
+==================
+
+You can define the ``outdated`` variable in ``html_context`` to show a
+red banner on each page redirecting to the "latest" version.
+
+The link points to the same page on ``/3/``; unfortunately, the
+language selection is lost in the process.
+
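A minimal ``conf.py`` sketch of the mechanism described above (assumption: the banner markup itself lives in the theme templates, which check this ``html_context`` key):

```python
# Sphinx conf.py fragment: expose an "outdated" flag to the HTML
# templates via html_context so the theme can render the red banner.
html_context = {
    'outdated': True,   # this build documents an old Python version
}
```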
Contributing
============
minute, second and microsecond.
+.. c:function:: PyObject* PyDateTime_FromDateAndTimeAndFold(int year, int month, int day, int hour, int minute, int second, int usecond, int fold)
+
+ Return a :class:`datetime.datetime` object with the specified year, month, day, hour,
+ minute, second, microsecond and fold.
+
+ .. versionadded:: 3.6
+
+
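The *fold* argument disambiguates wall times that occur twice when a clock is set back (:pep:`495`); a Python-level sketch of the semantics this C function exposes:

```python
from datetime import datetime

# During a DST "fall back", the same wall time occurs twice; fold=1
# selects the second occurrence.  For naive datetimes, fold is ignored
# in comparisons, so the two instances still compare equal.
first = datetime(2020, 11, 1, 1, 30, 0, 0, fold=0)
second = datetime(2020, 11, 1, 1, 30, 0, 0, fold=1)
print(first == second)   # True: naive comparison ignores fold
print(second.fold)       # 1
```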
.. c:function:: PyObject* PyTime_FromTime(int hour, int minute, int second, int usecond)
Return a :class:`datetime.time` object with the specified hour, minute, second and
microsecond.
+.. c:function:: PyObject* PyTime_FromTimeAndFold(int hour, int minute, int second, int usecond, int fold)
+
+ Return a :class:`datetime.time` object with the specified hour, minute, second,
+ microsecond and fold.
+
+ .. versionadded:: 3.6
+
+
.. c:function:: PyObject* PyDelta_FromDSU(int days, int seconds, int useconds)
Return a :class:`datetime.timedelta` object representing the given number
in which the unraisable exception occurred. If possible,
the repr of *obj* will be printed in the warning message.
+ An exception must be set when calling this function.
+
Raising exceptions
==================
single: SIGINT
single: KeyboardInterrupt (built-in exception)
- This function simulates the effect of a :const:`SIGINT` signal arriving --- the
- next time :c:func:`PyErr_CheckSignals` is called, :exc:`KeyboardInterrupt` will
- be raised. It may be called without holding the interpreter lock.
-
- .. % XXX This was described as obsolete, but is used in
- .. % _thread.interrupt_main() (used from IDLE), so it's still needed.
+ Simulate the effect of a :const:`SIGINT` signal arriving. The next time
+ :c:func:`PyErr_CheckSignals` is called, the Python signal handler for
+ :const:`SIGINT` will be called.
+ If :const:`SIGINT` isn't handled by Python (it was set to
+ :data:`signal.SIG_DFL` or :data:`signal.SIG_IGN`), this function does
+ nothing.
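At the Python level, :func:`_thread.interrupt_main` calls this function; a minimal sketch of the resulting behaviour when Python's default :const:`SIGINT` handler is installed:

```python
import _thread

# _thread.interrupt_main() calls PyErr_SetInterrupt() under the hood.
# With Python's default SIGINT handler installed, the interpreter raises
# KeyboardInterrupt at the next signal check inside the eval loop.
delivered = False
try:
    _thread.interrupt_main()
    while True:          # spin until the pending "signal" is delivered
        pass
except KeyboardInterrupt:
    delivered = True
print("KeyboardInterrupt delivered:", delivered)
```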
.. c:function:: int PySignal_SetWakeupFd(int fd)
* Informative functions:
+ * :c:func:`Py_IsInitialized`
* :c:func:`PyMem_GetAllocator`
* :c:func:`PyObject_GetArenaAllocator`
* :c:func:`Py_GetBuildInfo`
*NULL*. If the lock has been created, the current thread must not have
acquired it, otherwise deadlock ensues.
+ .. note::
+ Calling this function from a thread when the runtime is finalizing
+ will terminate the thread, even if the thread was not created by Python.
+ You can use :c:func:`_Py_IsFinalizing` or :func:`sys.is_finalizing` to
+ check if the interpreter is in the process of being finalized before calling
+ this function to avoid unwanted termination.
.. c:function:: PyThreadState* PyThreadState_Get()
When the function returns, the current thread will hold the GIL and be able
to call arbitrary Python code. Failure is a fatal error.
+ .. note::
+ Calling this function from a thread when the runtime is finalizing
+ will terminate the thread, even if the thread was not created by Python.
+ You can use :c:func:`_Py_IsFinalizing` or :func:`sys.is_finalizing` to
+ check if the interpreter is in the process of being finalized before calling
+ this function to avoid unwanted termination.
.. c:function:: void PyGILState_Release(PyGILState_STATE)
Return the interpreter state object at the head of the list of all such objects.
+.. c:function:: PyInterpreterState* PyInterpreterState_Main()
+
+ Return the main interpreter state object.
+
+
.. c:function:: PyInterpreterState* PyInterpreterState_Next(PyInterpreterState *interp)
Return the next interpreter state object after *interp* from the list of all
All function, type and macro definitions needed to use the Python/C API are
included in your code by the following line::
- #include "Python.h"
+ #define PY_SSIZE_T_CLEAN
+ #include <Python.h>
This implies inclusion of the following standard headers: ``<stdio.h>``,
``<string.h>``, ``<errno.h>``, ``<limits.h>``, ``<assert.h>`` and ``<stdlib.h>``
headers on some systems, you *must* include :file:`Python.h` before any standard
headers are included.
+ It is recommended to always define ``PY_SSIZE_T_CLEAN`` before including
+ ``Python.h``. See :ref:`arg-parsing` for a description of this macro.
+
All user visible names defined by Python.h (except those defined by the included
standard headers) have one of the prefixes ``Py`` or ``_Py``. Names beginning
with ``_Py`` are for internal use by the Python implementation and should not be
If the value of *obj* is out of range for an :c:type:`unsigned long`,
return the reduction of that value modulo ``ULONG_MAX + 1``.
- Returns ``-1`` on error. Use :c:func:`PyErr_Occurred` to disambiguate.
+ Returns ``(unsigned long)-1`` on error. Use :c:func:`PyErr_Occurred` to
+ disambiguate.
.. c:function:: unsigned long long PyLong_AsUnsignedLongLongMask(PyObject *obj)
If the value of *obj* is out of range for an :c:type:`unsigned long long`,
return the reduction of that value modulo ``PY_ULLONG_MAX + 1``.
- Returns ``-1`` on error. Use :c:func:`PyErr_Occurred` to disambiguate.
+ Returns ``(unsigned long long)-1`` on error. Use :c:func:`PyErr_Occurred`
+ to disambiguate.
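The reduction described above is plain modular arithmetic; a Python sketch (assuming a 64-bit :c:type:`unsigned long long`, so ``PY_ULLONG_MAX == 2**64 - 1``):

```python
PY_ULLONG_MAX = 2**64 - 1   # assumption: 64-bit unsigned long long

def as_unsigned_long_long_mask(value):
    # Same reduction PyLong_AsUnsignedLongLongMask applies to
    # out-of-range values: modulo PY_ULLONG_MAX + 1.
    return value % (PY_ULLONG_MAX + 1)

print(as_unsigned_long_long_mask(-1))         # 18446744073709551615
print(as_unsigned_long_long_mask(2**64 + 5))  # 5
```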
.. c:function:: double PyLong_AsDouble(PyObject *pylong)
Setup hooks to detect bugs in the Python memory allocator functions.
- Newly allocated memory is filled with the byte ``0xCB``, freed memory is
- filled with the byte ``0xDB``.
+ Newly allocated memory is filled with the byte ``0xCD`` (``CLEANBYTE``),
+ freed memory is filled with the byte ``0xDD`` (``DEADBYTE``). Memory blocks
+ are surrounded by "forbidden bytes" (``FORBIDDENBYTE``: byte ``0xFD``).
Runtime checks:
if the GIL is held when functions of :c:data:`PYMEM_DOMAIN_OBJ` and
:c:data:`PYMEM_DOMAIN_MEM` domains are called.
+ .. versionchanged:: 3.7.3
+ Byte patterns ``0xCB`` (``CLEANBYTE``), ``0xDB`` (``DEADBYTE``) and
+ ``0xFB`` (``FORBIDDENBYTE``) have been replaced with ``0xCD``, ``0xDD``
+ and ``0xFD`` to use the same values as Windows CRT debug ``malloc()``
+ and ``free()``.
+
.. _pymalloc:
*args* must not be *NULL*, use an empty tuple if no arguments are needed.
If no named arguments are needed, *kwargs* can be *NULL*.
- Returns the result of the call on success, or *NULL* on failure.
+ Return the result of the call on success, or raise an exception and return
+ *NULL* on failure.
This is the equivalent of the Python expression:
``callable(*args, **kwargs)``.
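The stated equivalence can be sketched at the Python level (``object_call`` is a hypothetical helper mirroring the C calling convention, where *kwargs* may be ``NULL``, here ``None``):

```python
def object_call(callable_, args, kwargs=None):
    # Mirrors PyObject_Call(callable, args, kwargs): args is required,
    # kwargs may be None (no keyword arguments).
    if kwargs is None:
        return callable_(*args)
    return callable_(*args, **kwargs)

print(object_call(divmod, (7, 3)))              # (2, 1)
print(object_call(int, ("ff",), {"base": 16}))  # 255
```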
Call a callable Python object *callable*, with arguments given by the
tuple *args*. If no arguments are needed, then *args* can be *NULL*.
- Returns the result of the call on success, or *NULL* on failure.
+ Return the result of the call on success, or raise an exception and return
+ *NULL* on failure.
This is the equivalent of the Python expression: ``callable(*args)``.
The C arguments are described using a :c:func:`Py_BuildValue` style format
string. The format can be *NULL*, indicating that no arguments are provided.
- Returns the result of the call on success, or *NULL* on failure.
+ Return the result of the call on success, or raise an exception and return
+ *NULL* on failure.
This is the equivalent of the Python expression: ``callable(*args)``.
The format can be *NULL*, indicating that no arguments are provided.
- Returns the result of the call on success, or *NULL* on failure.
+ Return the result of the call on success, or raise an exception and return
+ *NULL* on failure.
This is the equivalent of the Python expression:
``obj.name(arg1, arg2, ...)``.
:c:type:`PyObject\*` arguments. The arguments are provided as a variable number
of parameters followed by *NULL*.
- Returns the result of the call on success, or *NULL* on failure.
+ Return the result of the call on success, or raise an exception and return
+ *NULL* on failure.
This is the equivalent of the Python expression:
``callable(arg1, arg2, ...)``.
Calls a method of the Python object *obj*, where the name of the method is given as a
Python string object in *name*. It is called with a variable number of
:c:type:`PyObject\*` arguments. The arguments are provided as a variable number
- of parameters followed by *NULL*. Returns the result of the call on success, or
+ of parameters followed by *NULL*.
+
+ Return the result of the call on success, or raise an exception and return
*NULL* on failure.
.. c:type:: PyCFunctionWithKeywords
- Type of the functions used to implement Python callables in C that take
- keyword arguments: they take three :c:type:`PyObject\*` parameters and return
- one such value. See :c:type:`PyCFunction` above for the meaning of the return
- value.
+ Type of the functions used to implement Python callables in C
+ with signature :const:`METH_VARARGS | METH_KEYWORDS`.
+
+
+.. c:type:: _PyCFunctionFast
+
+ Type of the functions used to implement Python callables in C
+ with signature :const:`METH_FASTCALL`.
+
+
+.. c:type:: _PyCFunctionFastWithKeywords
+
+ Type of the functions used to implement Python callables in C
+ with signature :const:`METH_FASTCALL | METH_KEYWORDS`.
.. c:type:: PyMethodDef
The :attr:`ml_flags` field is a bitfield which can include the following flags.
The individual flags indicate either a calling convention or a binding
-convention. Of the calling convention flags, only :const:`METH_VARARGS` and
-:const:`METH_KEYWORDS` can be combined. Any of the calling convention flags
-can be combined with a binding flag.
+convention.
+There are four basic calling conventions for positional arguments
+and two of them can be combined with :const:`METH_KEYWORDS` to also
+support keyword arguments, for a total of six calling conventions:
.. data:: METH_VARARGS
using :c:func:`PyArg_ParseTuple` or :c:func:`PyArg_UnpackTuple`.
-.. data:: METH_KEYWORDS
+.. data:: METH_VARARGS | METH_KEYWORDS
Methods with these flags must be of type :c:type:`PyCFunctionWithKeywords`.
- The function expects three parameters: *self*, *args*, and a dictionary of
- all the keyword arguments. The flag must be combined with
- :const:`METH_VARARGS`, and the parameters are typically processed using
- :c:func:`PyArg_ParseTupleAndKeywords`.
+ The function expects three parameters: *self*, *args*, and *kwargs*,
+ where *kwargs* is a dictionary of all the keyword arguments, or possibly
+ *NULL* if there are no keyword arguments. The parameters are typically
+ processed using :c:func:`PyArg_ParseTupleAndKeywords`.
+
+
+.. data:: METH_FASTCALL
+
+ Fast calling convention supporting only positional arguments.
+ The methods have the type :c:type:`_PyCFunctionFast`.
+ The first parameter is *self*, the second parameter is a C array
+ of :c:type:`PyObject\*` values indicating the arguments and the third
+ parameter is the number of arguments (the length of the array).
+
+ This is not part of the :ref:`limited API <stable>`.
+
+ .. versionadded:: 3.7
+
+
+.. data:: METH_FASTCALL | METH_KEYWORDS
+
+ Extension of :const:`METH_FASTCALL` supporting also keyword arguments,
+ with methods of type :c:type:`_PyCFunctionFastWithKeywords`.
+ Keyword arguments are passed the same way as in the vectorcall protocol:
+ there is an additional fourth :c:type:`PyObject\*` parameter
+ which is a tuple representing the names of the keyword arguments
+ or possibly *NULL* if there are no keywords. The values of the keyword
+ arguments are stored in the *args* array, after the positional arguments.
+
+ This is not part of the :ref:`limited API <stable>`.
+
+ .. versionadded:: 3.7
.. data:: METH_NOARGS
signature. It should modify its first operand, and return it. This slot
may be left to *NULL*, in this case :c:func:`!PySequence_InPlaceConcat`
will fall back to :c:func:`PySequence_Concat`. It is also used by the
- augmented assignment ``+=``, after trying numeric inplace addition
+ augmented assignment ``+=``, after trying numeric in-place addition
via the :c:member:`~PyNumberMethods.nb_inplace_add` slot.
.. c:member:: ssizeargfunc PySequenceMethods.sq_inplace_repeat
signature. It should modify its first operand, and return it. This slot
may be left to *NULL*, in this case :c:func:`!PySequence_InPlaceRepeat`
will fall back to :c:func:`PySequence_Repeat`. It is also used by the
- augmented assignment ``*=``, after trying numeric inplace multiplication
+ augmented assignment ``*=``, after trying numeric in-place multiplication
via the :c:member:`~PyNumberMethods.nb_inplace_multiply` slot.
and inefficient; it should be avoided in performance- or memory-sensitive
situations.
-Due to the transition between the old APIs and the new APIs, unicode objects
+Due to the transition between the old APIs and the new APIs, Unicode objects
can internally be in two states depending on how they were created:
-* "canonical" unicode objects are all objects created by a non-deprecated
- unicode API. They use the most efficient representation allowed by the
+* "canonical" Unicode objects are all objects created by a non-deprecated
+ Unicode API. They use the most efficient representation allowed by the
implementation.
-* "legacy" unicode objects have been created through one of the deprecated
+* "legacy" Unicode objects have been created through one of the deprecated
APIs (typically :c:func:`PyUnicode_FromUnicode`) and only bear the
:c:type:`Py_UNICODE*` representation; you will have to call
:c:func:`PyUnicode_READY` on them before calling any other API.
.. c:function:: void* PyUnicode_DATA(PyObject *o)
- Return a void pointer to the raw unicode buffer. *o* has to be a Unicode
+ Return a void pointer to the raw Unicode buffer. *o* has to be a Unicode
object in the "canonical" representation (not checked).
.. versionadded:: 3.3
.. c:function:: PyObject* PyUnicode_FromFormat(const char *format, ...)
Take a C :c:func:`printf`\ -style *format* string and a variable number of
- arguments, calculate the size of the resulting Python unicode string and return
+ arguments, calculate the size of the resulting Python Unicode string and return
a string with the values formatted into it. The variable arguments must be C
types and must correspond exactly to the format characters in the *format*
ASCII-encoded string. The following format characters are allowed:
| :attr:`%A` | PyObject\* | The result of calling |
| | | :func:`ascii`. |
+-------------------+---------------------+--------------------------------+
- | :attr:`%U` | PyObject\* | A unicode object. |
+ | :attr:`%U` | PyObject\* | A Unicode object. |
+-------------------+---------------------+--------------------------------+
- | :attr:`%V` | PyObject\*, | A unicode object (which may be |
+ | :attr:`%V` | PyObject\*, | A Unicode object (which may be |
| | const char\* | *NULL*) and a null-terminated |
| | | C character array as a second |
| | | parameter (which will be used, |
.. c:function:: int PyUnicode_CompareWithASCIIString(PyObject *uni, const char *string)
- Compare a unicode object, *uni*, with *string* and return ``-1``, ``0``, ``1`` for less
+ Compare a Unicode object, *uni*, with *string* and return ``-1``, ``0``, ``1`` for less
than, equal, and greater than, respectively. It is best to pass only
ASCII-encoded strings, but the function interprets the input string as
ISO-8859-1 if it contains non-ASCII characters.
.. c:function:: PyObject* PyUnicode_RichCompare(PyObject *left, PyObject *right, int op)
- Rich compare two unicode strings and return one of the following:
+ Rich compare two Unicode strings and return one of the following:
* ``NULL`` in case an exception was raised
* :const:`Py_True` or :const:`Py_False` for successful comparisons
.. c:function:: void PyUnicode_InternInPlace(PyObject **string)
Intern the argument *\*string* in place. The argument must be the address of a
- pointer variable pointing to a Python unicode string object. If there is an
+ pointer variable pointing to a Python Unicode string object. If there is an
existing interned string that is the same as *\*string*, it sets *\*string* to
it (decrementing the reference count of the old string object and incrementing
the reference count of the interned string object), otherwise it leaves
.. c:function:: PyObject* PyUnicode_InternFromString(const char *v)
A combination of :c:func:`PyUnicode_FromString` and
- :c:func:`PyUnicode_InternInPlace`, returning either a new unicode string
+ :c:func:`PyUnicode_InternInPlace`, returning either a new Unicode string
object that has been interned, or a new ("owned") reference to an earlier
interned string object with the same value.
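The Python-level counterpart of this function is :func:`sys.intern`; a quick sketch of the sharing it provides:

```python
import sys

# sys.intern returns the canonical interned object, so two equal strings
# interned separately end up as the very same object.
a = sys.intern("PyUnicode_InternFromString")
b = sys.intern("".join(["PyUnicode_Intern", "FromString"]))  # built at runtime
print(a is b)   # True
```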
(:func:`sys.getfilesystemencoding`). If *closeit* is true, the file is
closed before PyRun_SimpleFileExFlags returns.
+ .. note::
+ On Windows, *fp* should be opened in binary mode (e.g. ``fopen(filename, "rb")``).
+ Otherwise, Python may not handle script files with LF line endings correctly.
+
.. c:function:: int PyRun_InteractiveOne(FILE *fp, const char *filename)
'languages': ['ja', 'fr', 'zh_TW', 'zh_CN'], 'builders': ['man', 'text'],
}
+# Avoid a warning with Sphinx >= 2.0
+master_doc = 'contents'
# Options for HTML output
# -----------------------
PyDateTime_FromDateAndTime:int:second::
PyDateTime_FromDateAndTime:int:usecond::
+PyDateTime_FromDateAndTimeAndFold:PyObject*::+1:
+PyDateTime_FromDateAndTimeAndFold:int:year::
+PyDateTime_FromDateAndTimeAndFold:int:month::
+PyDateTime_FromDateAndTimeAndFold:int:day::
+PyDateTime_FromDateAndTimeAndFold:int:hour::
+PyDateTime_FromDateAndTimeAndFold:int:minute::
+PyDateTime_FromDateAndTimeAndFold:int:second::
+PyDateTime_FromDateAndTimeAndFold:int:usecond::
+PyDateTime_FromDateAndTimeAndFold:int:fold::
+
PyDateTime_FromTimestamp:PyObject*::+1:
PyDateTime_FromTimestamp:PyObject*:args:0:
PyTime_FromTime:int:second::
PyTime_FromTime:int:usecond::
+PyTime_FromTimeAndFold:PyObject*::+1:
+PyTime_FromTimeAndFold:int:hour::
+PyTime_FromTimeAndFold:int:minute::
+PyTime_FromTimeAndFold:int:second::
+PyTime_FromTimeAndFold:int:usecond::
+PyTime_FromTimeAndFold:int:fold::
+
PyTraceMalloc_Track:int:::
PyTraceMalloc_Track:unsigned int:domain::
PyTraceMalloc_Track:uintptr_t:ptr::
.. _currently recommended tools: https://packaging.python.org/guides/tool-recommendations/#packaging-tool-recommendations
-Reading the guide
-=================
+.. index::
+ single: Python Package Index (PyPI)
+ single: PyPI; (see Python Package Index (PyPI))
+
+.. _publishing-python-packages:
+
+Reading the Python Packaging User Guide
+=======================================
The Python Packaging User Guide covers the various key steps and elements
-involved in creating a project:
+involved in creating and publishing a project:
* `Project structure`_
* `Building and packaging the project`_
configfile.rst
sourcedist.rst
builtdist.rst
- packageindex.rst
examples.rst
extending.rst
commandref.rst
-.. index::
- single: Python Package Index (PyPI)
- single: PyPI; (see Python Package Index (PyPI))
+:orphan:
.. _package-index:
The Python Package Index (PyPI)
*******************************
-The `Python Package Index (PyPI)`_ stores :ref:`meta-data <meta-data>`
-describing distributions packaged with distutils, as well as package data like
-distribution files if a package author wishes.
-
-Distutils provides the :command:`register` and :command:`upload` commands for
-pushing meta-data and distribution files to PyPI, respectively. See
-:ref:`package-commands` for information on these commands.
-
-
-PyPI overview
-=============
-
-PyPI lets you submit any number of versions of your distribution to the index.
-If you alter the meta-data for a particular version, you can submit it again
-and the index will be updated.
-
-PyPI holds a record for each (name, version) combination submitted. The first
-user to submit information for a given name is designated the Owner of that
-name. Changes can be submitted through the :command:`register` command or
-through the web interface. Owners can designate other users as Owners or
-Maintainers. Maintainers can edit the package information, but not designate
-new Owners or Maintainers.
-
-By default PyPI displays only the newest version of a given package. The web
-interface lets one change this default behavior and manually select which
-versions to display and hide.
-
-For each version, PyPI displays a home page. The home page is created from
-the ``long_description`` which can be submitted via the :command:`register`
-command. See :ref:`package-display` for more information.
-
-
-.. _package-commands:
-
-Distutils commands
-==================
-
-Distutils exposes two commands for submitting package data to PyPI: the
-:ref:`register <package-register>` command for submitting meta-data to PyPI
-and the :ref:`upload <package-upload>` command for submitting distribution
-files. Both commands read configuration data from a special file called a
-:ref:`.pypirc file <pypirc>`.
-
-
-.. _package-register:
-
-The ``register`` command
-------------------------
-
-The distutils command :command:`register` is used to submit your distribution's
-meta-data to an index server. It is invoked as follows::
-
- python setup.py register
-
-Distutils will respond with the following prompt::
-
- running register
- We need to know who you are, so please choose either:
- 1. use your existing login,
- 2. register as a new user,
- 3. have the server generate a new password for you (and email it to you), or
- 4. quit
- Your selection [default 1]:
-
-Note: if your username and password are saved locally, you will not see this
-menu. Also, refer to :ref:`pypirc` for how to store your credentials in a
-:file:`.pypirc` file.
-
-If you have not registered with PyPI, then you will need to do so now. You
-should choose option 2, and enter your details as required. Soon after
-submitting your details, you will receive an email which will be used to confirm
-your registration.
-
-Once you are registered, you may choose option 1 from the menu. You will be
-prompted for your PyPI username and password, and :command:`register` will then
-submit your meta-data to the index.
-
-See :ref:`package-cmdoptions` for options to the :command:`register` command.
-
-
-.. _package-upload:
-
-The ``upload`` command
-----------------------
-
-The distutils command :command:`upload` pushes the distribution files to PyPI.
-
-The command is invoked immediately after building one or more distribution
-files. For example, the command ::
-
- python setup.py sdist bdist_wininst upload
-
-will cause the source distribution and the Windows installer to be uploaded to
-PyPI. Note that these will be uploaded even if they are built using an earlier
-invocation of :file:`setup.py`, but that only distributions named on the command
-line for the invocation including the :command:`upload` command are uploaded.
-
-If a :command:`register` command was previously called in the same command,
-and if the password was entered in the prompt, :command:`upload` will reuse the
-entered password. This is useful if you do not want to store a password in
-clear text in a :file:`.pypirc` file.
-
-You can use the ``--sign`` option to tell :command:`upload` to sign each
-uploaded file using GPG (GNU Privacy Guard). The :program:`gpg` program must
-be available for execution on the system :envvar:`PATH`. You can also specify
-which key to use for signing using the ``--identity=name`` option.
-
-See :ref:`package-cmdoptions` for additional options to the :command:`upload`
-command.
-
-
-.. _package-cmdoptions:
-
-Additional command options
---------------------------
-
-This section describes options common to both the :command:`register` and
-:command:`upload` commands.
-
-The ``--repository`` or ``-r`` option lets you specify a PyPI server
-different from the default. For example::
-
- python setup.py sdist bdist_wininst upload -r https://example.com/pypi
-
-For convenience, a name can be used in place of the URL when the
-:file:`.pypirc` file is configured to do so. For example::
-
- python setup.py register -r other
-
-See :ref:`pypirc` for more information on defining alternate servers.
-
-The ``--show-response`` option displays the full response text from the PyPI
-server, which is useful when debugging problems with registering and uploading.
-
-
-.. index::
- single: .pypirc file
- single: Python Package Index (PyPI); .pypirc file
-
-.. _pypirc:
-
-The ``.pypirc`` file
---------------------
-
-The :command:`register` and :command:`upload` commands both check for the
-existence of a :file:`.pypirc` file at the location :file:`$HOME/.pypirc`.
-If this file exists, the command uses the username, password, and repository
-URL configured in the file. The format of a :file:`.pypirc` file is as
-follows:
-
-.. code-block:: ini
-
- [distutils]
- index-servers =
- pypi
-
- [pypi]
- repository: <repository-url>
- username: <username>
- password: <password>
-
-The *distutils* section defines an *index-servers* variable that lists the
-name of all sections describing a repository.
-
-Each section describing a repository defines three variables:
-
-- *repository*, that defines the url of the PyPI server. Defaults to
- ``https://upload.pypi.org/legacy/``.
-- *username*, which is the registered username on the PyPI server.
-- *password*, that will be used to authenticate. If omitted the user
- will be prompt to type it when needed.
-
-If you want to define another server a new section can be created and
-listed in the *index-servers* variable:
-
-.. code-block:: ini
-
- [distutils]
- index-servers =
- pypi
- other
-
- [pypi]
- repository: <repository-url>
- username: <username>
- password: <password>
-
- [other]
- repository: https://example.com/pypi
- username: <username>
- password: <password>
-
-This allows the :command:`register` and :command:`upload` commands to be
-called with the ``--repository`` option as described in
-:ref:`package-cmdoptions`.
-
-Specifically, you might want to add the `PyPI Test Repository
-<https://wiki.python.org/moin/TestPyPI>`_ to your ``.pypirc`` to facilitate
-testing before doing your first upload to ``PyPI`` itself.
-
-
-.. _package-display:
-
-PyPI package display
-====================
-
-The ``long_description`` field plays a special role at PyPI. It is used by
-the server to display a home page for the registered package.
-
-If you use the `reStructuredText <http://docutils.sourceforge.net/rst.html>`_
-syntax for this field, PyPI will parse it and display an HTML output for
-the package home page.
-
-The ``long_description`` field can be attached to a text file located
-in the package::
-
- from distutils.core import setup
-
- with open('README.txt') as file:
- long_description = file.read()
-
- setup(name='Distutils',
- long_description=long_description)
-
-In that case, :file:`README.txt` is a regular reStructuredText text file located
-in the root of the package besides :file:`setup.py`.
-
-To prevent registering broken reStructuredText content, you can use the
-:program:`rst2html` program that is provided by the :mod:`docutils` package and
-check the ``long_description`` from the command line:
-
-.. code-block:: shell-session
-
- $ python setup.py --long-description | rst2html.py > output.html
-
-:mod:`docutils` will display a warning if there's something wrong with your
-syntax. Because PyPI applies additional checks (e.g. by passing ``--no-raw``
-to ``rst2html.py`` in the command above), being able to run the command above
-without warnings does not guarantee that PyPI will convert the content
-successfully.
+The `Python Package Index (PyPI)`_ stores metadata describing distributions
+packaged with distutils and other publishing tools, as well as the
+distribution archives themselves.
+References to up-to-date PyPI documentation can be found at
+:ref:`publishing-python-packages`.
.. _Python Package Index (PyPI): https://pypi.org
setup(...,
data_files=[('bitmaps', ['bm/b1.gif', 'bm/b2.gif']),
- ('config', ['cfg/data.cfg']),
+ ('config', ['cfg/data.cfg'])],
)
Each (*directory*, *files*) pair in the sequence specifies the installation
provided, distutils lists it as the author in :file:`PKG-INFO`.
(4)
- The ``long_description`` field is used by PyPI when you are
- :ref:`registering <package-register>` a package, to
- :ref:`build its home page <package-display>`.
+ The ``long_description`` field is used by PyPI when you publish a package,
+ to build its project page.
(5)
The ``license`` field is a text indicating the license covering the
Uploading Packages to the Package Index
***************************************
-The contents of this page have moved to the section :ref:`package-index`.
+References to up-to-date PyPI documentation can be found at
+:ref:`publishing-python-packages`.
to interact with the application directly. This can for example be used to
perform some operation on a file. ::
+ #define PY_SSIZE_T_CLEAN
#include <Python.h>
int
:file:`spammodule.c`; if the module name is very long, like ``spammify``, the
module name can be just :file:`spammify.c`.)
-The first line of our file can be::
+The first two lines of our file can be::
+ #define PY_SSIZE_T_CLEAN
#include <Python.h>
which pulls in the Python API (you can add a comment describing the purpose of
headers on some systems, you *must* include :file:`Python.h` before any standard
headers are included.
+ It is recommended to always define ``PY_SSIZE_T_CLEAN`` before including
+ ``Python.h``. See :ref:`parsetuple` for a description of this macro.
+
All user-visible symbols defined by :file:`Python.h` have a prefix of ``Py`` or
``PY``, except those defined in standard header files. For convenience, and
since they are used extensively by the Python interpreter, ``"Python.h"``
Here is an example module which uses keywords, based on an example by Geoff
Philbrick (philbrick@hks.com)::
- #include "Python.h"
+ #define PY_SSIZE_T_CLEAN /* Make "s#" use Py_ssize_t rather than int. */
+ #include <Python.h>
static PyObject *
keywdarg_parrot(PyObject *self, PyObject *args, PyObject *keywds)
In the beginning of the module, right after the line ::
- #include "Python.h"
+ #include <Python.h>
two more lines must be added::
.tp_doc = "Custom objects",
.tp_basicsize = sizeof(CustomObject),
.tp_itemsize = 0,
+ .tp_flags = Py_TPFLAGS_DEFAULT,
.tp_new = PyType_GenericNew,
};
equal to ``E_EOF``, which means the input is incomplete. Here's a sample code
fragment, untested, inspired by code from Alex Farber::
+ #define PY_SSIZE_T_CLEAN
#include <Python.h>
#include <node.h>
#include <errcode.h>
#include <stdio.h>
#include <readline.h>
+ #define PY_SSIZE_T_CLEAN
#include <Python.h>
#include <object.h>
#include <compile.h>
releases.
The latest stable releases can always be found on the `Python download page
-<https://www.python.org/downloads/>`_. There are two production-ready version
-of Python: 2.x and 3.x, but the recommended one at this times is Python 3.x.
-Although Python 2.x is still widely used, `it will not be
-maintained after January 1, 2020 <https://www.python.org/dev/peps/pep-0373/>`_.
-Python 2.x was known for having more third-party libraries available, however,
-by the time of this writing, most of the widely used libraries support Python 3.x,
-and some are even dropping the Python 2.x support.
-
+<https://www.python.org/downloads/>`_. There are two production-ready versions
+of Python: 2.x and 3.x. The recommended version is 3.x, which is supported by
+most widely used libraries. Although 2.x is still widely used, `it will not
+be maintained after January 1, 2020 <https://www.python.org/dev/peps/pep-0373/>`_.
How many people are using Python?
---------------------------------
========================================================
By installing the `PyObjc Objective-C bridge
-<https://pythonhosted.org/pyobjc/>`_, Python programs can use Mac OS X's
+<https://pypi.org/project/pyobjc/>`_, Python programs can use Mac OS X's
Cocoa libraries.
:ref:`Pythonwin <windows-faq>` by Mark Hammond includes an interface to the
Yes.
+Several debuggers for Python are described below, and the built-in function
+:func:`breakpoint` allows you to drop into any of them.
+
The pdb module is a simple but adequate console-mode debugger for Python. It is
part of the standard Python library, and is :mod:`documented in the Library
Reference Manual <pdb>`. You can also write your own debugger by using the code
.. XXX need review for Python 3.
XXX need review for Windows Vista/Seven?
+.. _faq-run-program-under-windows:
+
How do I run a Python program under Windows?
--------------------------------------------
examples which can be executed interactively in the interpreter.
``...``
- The default Python prompt of the interactive shell when entering code for
- an indented code block, when within a pair of matching left and right
- delimiters (parentheses, square brackets, curly braces or triple quotes),
- or after specifying a decorator.
+ The default Python prompt of the interactive shell when entering the
+ code for an indented code block, when within a pair of matching left and
+ right delimiters (parentheses, square brackets, curly braces or triple
+ quotes), or after specifying a decorator.
2to3
A tool that tries to convert Python 2.x code to Python 3.x code by
statement by defining :meth:`__enter__` and :meth:`__exit__` methods.
See :pep:`343`.
+ context variable
+ A variable which can have different values depending on its context.
+ This is similar to Thread-Local Storage in which each execution
+ thread may have a different value for a variable. However, with context
+ variables, there may be several contexts in one execution thread and the
+ main usage for context variables is to keep track of variables in
+ concurrent asynchronous tasks.
+ See :mod:`contextvars`.
+
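The glossary entry above can be sketched with a small example; the variable and function names here are illustrative, not part of the documented API:

```python
import contextvars

# A context variable with a default value.
request_id = contextvars.ContextVar("request_id", default="none")

def handler():
    return request_id.get()

# A copied Context can hold its own value for the same variable.
ctx = contextvars.copy_context()
ctx.run(request_id.set, "req-42")

assert handler() == "none"           # value in the current context
assert ctx.run(handler) == "req-42"  # value inside the copied context
```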
contiguous
.. index:: C-contiguous, Fortran contiguous
Hashability makes an object usable as a dictionary key and a set member,
because these data structures use the hash value internally.
- All of Python's immutable built-in objects are hashable; mutable
- containers (such as lists or dictionaries) are not. Objects which are
+ Most of Python's immutable built-in objects are hashable; mutable
+ containers (such as lists or dictionaries) are not; immutable
+ containers (such as tuples and frozensets) are only hashable if
+ their elements are hashable. Objects which are
instances of user-defined classes are hashable by default. They all
compare unequal (except with themselves), and their hash value is derived
from their :func:`id`.
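A minimal sketch of the rules described above:

```python
# A tuple of hashable elements is hashable; one containing a list is not.
assert isinstance(hash((1, 2, "a")), int)

try:
    hash((1, [2, 3]))          # the list element makes the tuple unhashable
    tuple_with_list_hashable = True
except TypeError:
    tuple_with_list_hashable = False
assert not tuple_with_list_hashable

# Instances of user-defined classes are hashable by default and compare
# unequal except with themselves.
class C:
    pass

a, b = C(), C()
assert a != b and len({a, b}) == 2
```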
:term:`finder`. See :pep:`302` for details and
:class:`importlib.abc.Loader` for an :term:`abstract base class`.
+ magic method
+ .. index:: pair: magic; method
+
+ An informal synonym for :term:`special method`.
+
mapping
A container object that supports arbitrary key lookups and implements the
methods specified in the :class:`~collections.abc.Mapping` or
(subscript) notation uses :class:`slice` objects internally.
special method
+ .. index:: pair: special; method
+
A method that is called implicitly by Python to execute a certain
operation on a type, such as addition. Such methods have names starting
and ending with double underscores. Special methods are documented in
... print(x)
... f = staticmethod(f)
...
- >>> print(E.f(3))
+ >>> E.f(3)
3
- >>> print(E().f(3))
+ >>> E().f(3)
3
Using the non-data descriptor protocol, a pure Python version of
UTF-8 has several convenient properties:
1. It can handle any Unicode code point.
-2. A Unicode string is turned into a sequence of bytes containing no embedded zero
- bytes. This avoids byte-ordering issues, and means UTF-8 strings can be
- processed by C functions such as ``strcpy()`` and sent through protocols that
- can't handle zero bytes.
+2. A Unicode string is turned into a sequence of bytes that contains embedded
+ zero bytes only where they represent the null character (U+0000). This means
+ that UTF-8 strings can be processed by C functions such as ``strcpy()`` and sent
+ through protocols that can't handle zero bytes for anything other than
+ end-of-string markers.
3. A string of ASCII text is also valid UTF-8 text.
4. UTF-8 is fairly compact; the majority of commonly used characters can be
represented with one or two bytes.
5. If bytes are corrupted or lost, it's possible to determine the start of the
next UTF-8-encoded code point and resynchronize. It's also unlikely that
random 8-bit data will look like valid UTF-8.
-
+6. UTF-8 is a byte-oriented encoding. The encoding specifies that each
+ character is represented by a specific sequence of one or more bytes. This
+ avoids the byte-ordering issues that can occur with integer- and word-oriented
+ encodings, like UTF-16 and UTF-32, where the sequence of bytes varies depending
+ on the hardware on which the string was encoded.
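Several of the listed properties can be observed directly from Python:

```python
# Property 3: ASCII text is already valid UTF-8.
assert "abc".encode("utf-8") == b"abc"

# Property 4: common non-ASCII characters stay compact.
assert len("é".encode("utf-8")) == 2

# Property 6: UTF-8 is byte oriented, so unlike UTF-16 the encoded
# bytes do not depend on endianness.
assert "A".encode("utf-16-le") != "A".encode("utf-16-be")
assert "A".encode("utf-8") == b"A"
```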
References
+#define PY_SSIZE_T_CLEAN
#include <Python.h>
typedef struct {
+#define PY_SSIZE_T_CLEAN
#include <Python.h>
#include "structmember.h"
+#define PY_SSIZE_T_CLEAN
#include <Python.h>
#include "structmember.h"
+#define PY_SSIZE_T_CLEAN
#include <Python.h>
#include "structmember.h"
+#define PY_SSIZE_T_CLEAN
#include <Python.h>
int
now = datetime.datetime.now()
cur.execute("select ?", (now,))
print(cur.fetchone()[0])
+
+con.close()
p = Point(4.0, -3.2)
cur.execute("select ?", (p,))
print(cur.fetchone()[0])
+
+con.close()
p = Point(4.0, -3.2)
cur.execute("select ?", (p,))
print(cur.fetchone()[0])
+
+con.close()
+++ /dev/null
-import sqlite3
-
-con = sqlite3.connect("mydb")
+++ /dev/null
-import sqlite3
-
-con = sqlite3.connect(":memory:")
cur1 = con.cursor()
cur2 = con.cursor()
print(con.numcursors)
+
+con.close()
con.execute("insert into person(firstname) values (?)", ("Joe",))
except sqlite3.IntegrityError:
print("couldn't add Joe twice")
+
+# A Connection object used as a context manager only commits or rolls back transactions,
+# so the connection object should be closed manually
+con.close()
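The behaviour noted in the comments can be sketched end to end with an in-memory database (table and column names here are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table person (firstname text unique)")

# As a context manager, the connection wraps a transaction: commit on
# success, rollback on error; the connection itself stays open.
with con:
    con.execute("insert into person(firstname) values (?)", ("Joe",))

try:
    with con:
        con.execute("insert into person(firstname) values (?)", ("Joe",))
except sqlite3.IntegrityError:
    print("couldn't add Joe twice")

# Only the first insert survived the rolled-back transaction.
count = con.execute("select count(*) from person").fetchone()[0]
con.close()  # must still be called explicitly
```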
cur.execute(SELECT)
for row in cur:
print('%s is %d years old.' % (row[0], row[1]))
+
+con.close()
# Retrieve all rows as a sequence and print that sequence:
print(cur.fetchall())
+
+con.close()
cur.execute("select * from people where name_last=:who and age=:age", {"who": who, "age": age})
print(cur.fetchone())
+
+con.close()
+++ /dev/null
-import sqlite3
-
-con = sqlite3.connect("mydb")
-
-cur = con.cursor()
-
-who = "Yeltsin"
-age = 72
-
-cur.execute("select name_last, age from people where name_last=:who and age=:age",
- locals())
-print(cur.fetchone())
cur.execute("select c from characters")
print(cur.fetchall())
+
+con.close()
cur.execute("select c from characters")
print(cur.fetchall())
+
+con.close()
1987
);
""")
+con.close()
# The changes will not be saved unless the transaction is committed explicitly:
con.commit()
+
+con.close()
""")
for row in con.execute("select rowid, name, ingredients from recipe where name match 'pie'"):
print(row)
+
+con.close()
cur = con.cursor()
cur.execute("select md5(?)", (b"foo",))
print(cur.fetchone()[0])
+
+con.close()
cur.execute("insert into test(i) values (2)")
cur.execute("select mysum(i) from test")
print(cur.fetchone()[0])
+
+con.close()
cur.execute('select ? as "x [timestamp]"', (datetime.datetime.now(),))
dt = cur.fetchone()[0]
print(dt, type(dt))
+
+con.close()
row = cur.fetchone()
print("current_date", row[0], type(row[0]))
print("current_timestamp", row[1], type(row[1]))
+
+con.close()
cur = con.cursor()
cur.execute("select 1 as a")
print(cur.fetchone()["a"])
+
+con.close()
assert row["name"] == row["nAmE"]
assert row[1] == row["age"]
assert row[1] == row["AgE"]
+
+con.close()
print(row)
print("I just deleted", con.execute("delete from person").rowcount, "rows")
+
+# close() is not a shortcut method and it's not called automatically,
+# so the connection object should be closed manually
+con.close()
print(fieldValue.ljust(FIELD_MAX_WIDTH), end=' ')
print() # Finish the row with a newline.
+
+con.close()
cur.execute("select ?", ("bar",))
row = cur.fetchone()
assert row[0] == "barfoo"
+
+con.close()
+#define PY_SSIZE_T_CLEAN
#include <Python.h>
typedef struct {
.. function:: interrupt_main()
- Raise a :exc:`KeyboardInterrupt` exception in the main thread. A subthread can
- use this function to interrupt the main thread.
+ Simulate the effect of a :data:`signal.SIGINT` signal arriving in the main
+ thread. A thread can use this function to interrupt the main thread.
+
+ If :data:`signal.SIGINT` isn't handled by Python (it was set to
+ :data:`signal.SIG_DFL` or :data:`signal.SIG_IGN`), this function does
+ nothing.
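A minimal sketch of a worker thread interrupting the main thread; the delays are illustrative and assume the default :data:`signal.SIGINT` handler is installed:

```python
import _thread
import threading
import time

def worker():
    time.sleep(0.1)
    _thread.interrupt_main()   # simulate Ctrl-C arriving in the main thread

threading.Thread(target=worker).start()

interrupted = False
try:
    while True:                # the main thread is busy...
        time.sleep(0.05)
except KeyboardInterrupt:      # ...until the worker interrupts it
    interrupted = True

assert interrupted
```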
.. function:: exit()
the event loop's internal monotonic clock.
.. note::
-
- Timeouts (relative *delay* or absolute *when*) should not
- exceed one day.
+ .. versionchanged:: 3.8
+ In Python 3.7 and earlier timeouts (relative *delay* or absolute *when*)
+ should not exceed one day. This has been fixed in Python 3.8.
.. seealso::
import os
import signal
- def ask_exit(signame):
+ def ask_exit(signame, loop):
print("got signal %s: exit" % signame)
loop.stop()
for signame in {'SIGINT', 'SIGTERM'}:
loop.add_signal_handler(
getattr(signal, signame),
- functools.partial(ask_exit, signame))
+ functools.partial(ask_exit, signame, loop))
await asyncio.sleep(3600)
various kinds of communication channels.
Transport objects are always instantiated by an
-ref:`asyncio event loop <asyncio-event-loop>`.
+:ref:`asyncio event loop <asyncio-event-loop>`.
asyncio implements transports for TCP, UDP, SSL, and subprocess pipes.
The methods available on a transport depend on the transport's kind.
Creating Subprocesses
=====================
-.. coroutinefunction:: create_subprocess_exec(\*args, stdin=None, \
+.. coroutinefunction:: create_subprocess_exec(program, \*args, stdin=None, \
stdout=None, stderr=None, loop=None, \
limit=None, \*\*kwds)
argument; use the :func:`asyncio.wait_for` function to perform
operations with timeouts.
-asyncio has the following basic sychronization primitives:
+asyncio has the following basic synchronization primitives:
* :class:`Lock`
* :class:`Event`
This method waits until the lock is *unlocked*, sets it to
*locked* and returns ``True``.
+ When more than one coroutine is blocked in :meth:`acquire`
+ waiting for the lock to be unlocked, only one coroutine
+ eventually proceeds.
+
+ Acquiring a lock is *fair*: the coroutine that proceeds will be
+ the first coroutine that started waiting on the lock.
+
.. method:: release()
Release the lock.
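The fairness guarantee can be sketched as follows: three workers block in :meth:`acquire` in a known order, and they proceed in that same order once the lock is released (names here are illustrative):

```python
import asyncio

async def worker(name, lock, order):
    async with lock:            # acquire ... release
        order.append(name)
        await asyncio.sleep(0)

async def main():
    lock = asyncio.Lock()
    order = []
    await lock.acquire()        # hold the lock so every worker must wait
    tasks = [asyncio.create_task(worker(i, lock, order)) for i in range(3)]
    await asyncio.sleep(0)      # let each worker block in acquire()
    lock.release()
    await asyncio.gather(*tasks)
    return order

order = asyncio.run(main())
assert order == [0, 1, 2]       # waiters proceed in FIFO order
```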
>>> main()
<coroutine object main at 0x1053bb7c8>
-To actually run a coroutine asyncio provides three main mechanisms:
+To actually run a coroutine, asyncio provides three main mechanisms:
* The :func:`asyncio.run` function to run the top-level
entry point "main()" function (see the above example.)
The *buffering* argument is ignored. Its use is deprecated.
- If *mode* is ``'w'`` or ``'a'``, *compresslevel* can be a number between
+ If *mode* is ``'w'`` or ``'a'``, *compresslevel* can be an integer between
``1`` and ``9`` specifying the level of compression: ``1`` produces the
least compression, and ``9`` (default) produces the most compression.
incrementally. For one-shot compression, use the :func:`compress` function
instead.
- *compresslevel*, if given, must be a number between ``1`` and ``9``. The
+ *compresslevel*, if given, must be an integer between ``1`` and ``9``. The
default is ``9``.
.. method:: compress(data)
.. function:: compress(data, compresslevel=9)
- Compress *data*.
+ Compress *data*, a :term:`bytes-like object <bytes-like object>`.
- *compresslevel*, if given, must be a number between ``1`` and ``9``. The
+ *compresslevel*, if given, must be an integer between ``1`` and ``9``. The
default is ``9``.
For incremental compression, use a :class:`BZ2Compressor` instead.
.. function:: decompress(data)
- Decompress *data*.
+ Decompress *data*, a :term:`bytes-like object <bytes-like object>`.
If *data* is the concatenation of multiple compressed streams, decompress
all of the streams.
.. versionchanged:: 3.3
Support for multi-stream inputs was added.
+.. _bz2-usage-examples:
+
+Examples of usage
+-----------------
+
+Below are some examples of typical usage of the :mod:`bz2` module.
+
+Using :func:`compress` and :func:`decompress` to demonstrate round-trip compression:
+
+ >>> import bz2
+
+ >>> data = b"""\
+ ... Donec rhoncus quis sapien sit amet molestie. Fusce scelerisque vel augue
+ ... nec ullamcorper. Nam rutrum pretium placerat. Aliquam vel tristique lorem,
+ ... sit amet cursus ante. In interdum laoreet mi, sit amet ultrices purus
+ ... pulvinar a. Nam gravida euismod magna, non varius justo tincidunt feugiat.
+ ... Aliquam pharetra lacus non risus vehicula rutrum. Maecenas aliquam leo
+ ... felis. Pellentesque semper nunc sit amet nibh ullamcorper, ac elementum
+ ... dolor luctus. Curabitur lacinia mi ornare consectetur vestibulum."""
+
+ >>> c = bz2.compress(data)
+ >>> len(data) / len(c) # Data compression ratio
+ 1.513595166163142
+
+ >>> d = bz2.decompress(c)
+ >>> data == d # Check equality to original object after round-trip
+ True
+
+Using :class:`BZ2Compressor` for incremental compression:
+
+ >>> import bz2
+
+ >>> def gen_data(chunks=10, chunksize=1000):
+ ... """Yield incremental blocks of chunksize bytes."""
+ ... for _ in range(chunks):
+ ... yield b"z" * chunksize
+ ...
+ >>> comp = bz2.BZ2Compressor()
+ >>> out = b""
+ >>> for chunk in gen_data():
+ ... # Provide data to the compressor object
+ ... out = out + comp.compress(chunk)
+ ...
+ >>> # Finish the compression process. Call this once you have
+ >>> # finished providing data to the compressor.
+ >>> out = out + comp.flush()
+
+The example above uses a very "nonrandom" stream of data
+(a stream of ``b"z"`` chunks). Random data tends to compress poorly,
+while ordered, repetitive data usually yields a high compression ratio.
+
+Writing and reading a bzip2-compressed file in binary mode:
+
+ >>> import bz2
+
+ >>> data = b"""\
+ ... Donec rhoncus quis sapien sit amet molestie. Fusce scelerisque vel augue
+ ... nec ullamcorper. Nam rutrum pretium placerat. Aliquam vel tristique lorem,
+ ... sit amet cursus ante. In interdum laoreet mi, sit amet ultrices purus
+ ... pulvinar a. Nam gravida euismod magna, non varius justo tincidunt feugiat.
+ ... Aliquam pharetra lacus non risus vehicula rutrum. Maecenas aliquam leo
+ ... felis. Pellentesque semper nunc sit amet nibh ullamcorper, ac elementum
+ ... dolor luctus. Curabitur lacinia mi ornare consectetur vestibulum."""
+
+ >>> with bz2.open("myfile.bz2", "wb") as f:
+ ... # Write compressed data to file
+ ... unused = f.write(data)
+
+ >>> with bz2.open("myfile.bz2", "rb") as f:
+ ... # Decompress data from file
+ ... content = f.read()
+
+ >>> content == data # Check equality to original object after round-trip
+ True
--------------
-This module is always available. It provides access to mathematical functions
-for complex numbers. The functions in this module accept integers,
-floating-point numbers or complex numbers as arguments. They will also accept
-any Python object that has either a :meth:`__complex__` or a :meth:`__float__`
-method: these methods are used to convert the object to a complex or
-floating-point number, respectively, and the function is then applied to the
-result of the conversion.
+This module provides access to mathematical functions for complex numbers. The
+functions in this module accept integers, floating-point numbers or complex
+numbers as arguments. They will also accept any Python object that has either a
+:meth:`__complex__` or a :meth:`__float__` method: these methods are used to
+convert the object to a complex or floating-point number, respectively, and
+the function is then applied to the result of the conversion.
.. note::
.. method:: setstate(state)
- Set the state of the encoder to *state*. *state* must be a decoder state
+ Set the state of the decoder to *state*. *state* must be a decoder state
returned by :meth:`getstate`.
>>> Pixel(11, 22, 128, 255, 0)
Pixel(x=11, y=22, red=128, green=255, blue=0)
-.. attribute:: somenamedtuple._fields_defaults
+.. attribute:: somenamedtuple._field_defaults
Dictionary mapping field names to default values.
.. doctest::
>>> Account = namedtuple('Account', ['type', 'balance'], defaults=[0])
- >>> Account._fields_defaults
+ >>> Account._field_defaults
{'balance': 0}
>>> Account('premium')
Account(type='premium', balance=0)
>>> getattr(p, 'x')
11
-To convert a dictionary to a named tuple, use the double-star-operator
+To convert a dictionary to a named tuple, use the ``**`` operator
(as described in :ref:`tut-unpacking-arguments`):
>>> d = {'x': 11, 'y': 22}
.. seealso::
- * `Recipe for named tuple abstract base class with a metaclass mix-in
- <https://code.activestate.com/recipes/577629-namedtupleabc-abstract-base-class-mix-in-for-named/>`_
- by Jan Kaliszewski. Besides providing an :term:`abstract base class` for
- named tuples, it also supports an alternate :term:`metaclass`-based
- constructor that is convenient for use cases where named tuples are being
- subclassed.
+ * See :class:`typing.NamedTuple` for a way to add type hints for named
+ tuples. It also provides an elegant notation using the :keyword:`class`
+ keyword::
+
+ class Component(NamedTuple):
+ part_number: int
+ weight: float
+ description: Optional[str] = None
* See :meth:`types.SimpleNamespace` for a mutable namespace based on an
underlying dictionary instead of a tuple.
- * See :meth:`typing.NamedTuple` for a way to add type hints for named tuples.
+ * The :mod:`dataclasses` module provides a decorator and functions for
+ automatically adding generated special methods to user-defined classes.
:class:`OrderedDict` objects
given, it will default to the number of processors on the machine.
If *max_workers* is less than or equal to ``0``, then a :exc:`ValueError`
will be raised.
+ On Windows, *max_workers* must be less than or equal to ``61``. If it is not,
+ then :exc:`ValueError` will be raised. If *max_workers* is ``None``, then
+ the default chosen will be at most ``61``, even if more processors are
+ available.
*mp_context* can be a multiprocessing context or None. It will be used to
launch the workers. If *mp_context* is ``None`` or not given, the default
multiprocessing context is used.
.. method:: cancel()
- Attempt to cancel the call. If the call is currently being executed and
- cannot be cancelled then the method will return ``False``, otherwise the
- call will be cancelled and the method will return ``True``.
+ Attempt to cancel the call. If the call is currently being executed or
+ finished running and cannot be cancelled then the method will return
+ ``False``, otherwise the call will be cancelled and the method will
+ return ``True``.
.. method:: cancelled()
Wait for the :class:`Future` instances (possibly created by different
:class:`Executor` instances) given by *fs* to complete. Returns a named
2-tuple of sets. The first set, named ``done``, contains the futures that
- completed (finished or were cancelled) before the wait completed. The second
- set, named ``not_done``, contains uncompleted futures.
+ completed (finished or cancelled futures) before the wait completed. The
+ second set, named ``not_done``, contains the futures that did not complete
+ (pending or running futures).
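A small sketch of the two returned sets, assuming at least three worker threads are available; the delays are illustrative:

```python
import concurrent.futures
import time

def job(delay):
    time.sleep(delay)
    return delay

with concurrent.futures.ThreadPoolExecutor() as executor:
    fs = [executor.submit(job, d) for d in (0, 0, 0.5)]
    # wait() returns a named 2-tuple of sets: futures that completed
    # within the timeout, and those still pending or running.
    done, not_done = concurrent.futures.wait(fs, timeout=0.25)
    assert len(done) == 2 and len(not_done) == 1
```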
*timeout* can be used to control the maximum number of seconds to wait before
returning. *timeout* can be an int or float. If *timeout* is not specified
Returns an iterator over the :class:`Future` instances (possibly created by
different :class:`Executor` instances) given by *fs* that yields futures as
- they complete (finished or were cancelled). Any futures given by *fs* that
+ they complete (finished or cancelled futures). Any futures given by *fs* that
are duplicated will be returned once. Any futures that completed before
:func:`as_completed` is called will be yielded first. The returned iterator
raises a :exc:`concurrent.futures.TimeoutError` if :meth:`~iterator.__next__`
>>> list(custom['Section2'].keys())
['AnotherKey']
+ .. note::
+ The optionxform function transforms option names to a canonical form.
+ This should be an idempotent function: if the name is already in
+ canonical form, it should be returned unchanged.
+
+
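As a sketch, replacing optionxform with the identity function (which is idempotent) makes option names case sensitive:

```python
import configparser

# By default optionxform lower-cases option names; str is the identity
# transform, so the original casing is preserved.
parser = configparser.ConfigParser()
parser.optionxform = str
parser.read_string("[Section]\nKeyName = value\n")

assert list(parser["Section"]) == ["KeyName"]
```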
.. attribute:: ConfigParser.SECTCRE
A compiled regular expression used to parse section headers. The default
and so on). Later assignments to the :attr:`_fields_` class variable will
raise an AttributeError.
- It is possible to defined sub-subclasses of structure types, they inherit
+ It is possible to define sub-subclasses of structure types, they inherit
the fields of the base class plus the :attr:`_fields_` defined in the
sub-subclass, if any.
td.lptdesc = POINTER(some_type)
td.u.lptdesc = POINTER(some_type)
- It is possible to defined sub-subclasses of structures, they inherit the
+ It is possible to define sub-subclasses of structures, they inherit the
fields of the base class. If the subclass definition has a separate
:attr:`_fields_` variable, the fields specified in this are appended to the
fields of the base class.
| | rounded to the nearest multiple of |
| | timedelta.resolution using round-half-to-even.|
+--------------------------------+-----------------------------------------------+
-| ``f = t2 / t3`` | Division (3) of *t2* by *t3*. Returns a |
-| | :class:`float` object. |
+| ``f = t2 / t3`` | Division (3) of overall duration *t2* by |
+| | interval unit *t3*. Returns a :class:`float` |
+| | object. |
+--------------------------------+-----------------------------------------------+
| ``t1 = t2 / f or t1 = t2 / i`` | Delta divided by a float or an int. The result|
| | is rounded to the nearest multiple of |
.. method:: timedelta.total_seconds()
Return the total number of seconds contained in the duration. Equivalent to
- ``td / timedelta(seconds=1)``.
+ ``td / timedelta(seconds=1)``. For interval units other than seconds, use the
+ division form directly (e.g. ``td / timedelta(microseconds=1)``).
Note that for very large time intervals (greater than 270 years on
most platforms) this method will lose microsecond accuracy.
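A quick sketch of the division form:

```python
from datetime import timedelta

td = timedelta(days=1, minutes=30)

# Dividing an overall duration by an interval unit yields a float
# count of that unit.
assert td / timedelta(seconds=1) == td.total_seconds() == 88200.0
assert td / timedelta(minutes=1) == 1470.0
```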
Using datetime with tzinfo:
- >>> from datetime import timedelta, datetime, tzinfo
- >>> class GMT1(tzinfo):
+ >>> from datetime import timedelta, datetime, tzinfo, timezone
+ >>> class KabulTz(tzinfo):
+ ... # Kabul used +4 until 1945, when they moved to +4:30
+ ... UTC_MOVE_DATE = datetime(1944, 12, 31, 20, tzinfo=timezone.utc)
... def utcoffset(self, dt):
- ... return timedelta(hours=1) + self.dst(dt)
- ... def dst(self, dt):
- ... # DST starts last Sunday in March
- ... d = datetime(dt.year, 4, 1) # ends last Sunday in October
- ... self.dston = d - timedelta(days=d.weekday() + 1)
- ... d = datetime(dt.year, 11, 1)
- ... self.dstoff = d - timedelta(days=d.weekday() + 1)
- ... if self.dston <= dt.replace(tzinfo=None) < self.dstoff:
- ... return timedelta(hours=1)
+ ... if dt.year < 1945:
+ ... return timedelta(hours=4)
+ ... elif (1945, 1, 1, 0, 0) <= dt.timetuple()[:5] < (1945, 1, 1, 0, 30):
+ ... # If dt falls in the imaginary range, use fold to decide how
+ ... # to resolve. See PEP495
+ ... return timedelta(hours=4, minutes=(30 if dt.fold else 0))
... else:
- ... return timedelta(0)
- ... def tzname(self,dt):
- ... return "GMT +1"
+ ... return timedelta(hours=4, minutes=30)
+ ...
+ ... def fromutc(self, dt):
+ ... # A custom implementation is required for fromutc as
+ ... # the input to this function is a datetime with utc values
+ ... # but with a tzinfo set to self
+ ... # See datetime.astimezone or fromtimestamp
+ ...
+ ... # Follow same validations as in datetime.tzinfo
+ ... if not isinstance(dt, datetime):
+ ... raise TypeError("fromutc() requires a datetime argument")
+ ... if dt.tzinfo is not self:
+ ... raise ValueError("dt.tzinfo is not self")
+ ...
+ ... if dt.replace(tzinfo=timezone.utc) >= self.UTC_MOVE_DATE:
+ ... return dt + timedelta(hours=4, minutes=30)
+ ... else:
+ ... return dt + timedelta(hours=4)
...
- >>> class GMT2(tzinfo):
- ... def utcoffset(self, dt):
- ... return timedelta(hours=2) + self.dst(dt)
... def dst(self, dt):
- ... d = datetime(dt.year, 4, 1)
- ... self.dston = d - timedelta(days=d.weekday() + 1)
- ... d = datetime(dt.year, 11, 1)
- ... self.dstoff = d - timedelta(days=d.weekday() + 1)
- ... if self.dston <= dt.replace(tzinfo=None) < self.dstoff:
- ... return timedelta(hours=1)
+ ... return timedelta(0)
+ ...
+ ... def tzname(self, dt):
+ ... if dt >= self.UTC_MOVE_DATE:
+ ... return "+04:30"
... else:
- ... return timedelta(0)
- ... def tzname(self,dt):
- ... return "GMT +2"
+ ... return "+04"
...
- >>> gmt1 = GMT1()
- >>> # Daylight Saving Time
- >>> dt1 = datetime(2006, 11, 21, 16, 30, tzinfo=gmt1)
- >>> dt1.dst()
- datetime.timedelta(0)
- >>> dt1.utcoffset()
- datetime.timedelta(seconds=3600)
- >>> dt2 = datetime(2006, 6, 14, 13, 0, tzinfo=gmt1)
- >>> dt2.dst()
- datetime.timedelta(seconds=3600)
- >>> dt2.utcoffset()
- datetime.timedelta(seconds=7200)
+ ... def __repr__(self):
+ ... return f"{self.__class__.__name__}()"
+ ...
+ >>> tz1 = KabulTz()
+ >>> # Datetime before the change
+ >>> dt1 = datetime(1900, 11, 21, 16, 30, tzinfo=tz1)
+ >>> print(dt1.utcoffset())
+ 4:00:00
+ >>> # Datetime after the change
+ >>> dt2 = datetime(2006, 6, 14, 13, 0, tzinfo=tz1)
+ >>> print(dt2.utcoffset())
+ 4:30:00
>>> # Convert datetime to another time zone
- >>> dt3 = dt2.astimezone(GMT2())
- >>> dt3 # doctest: +ELLIPSIS
- datetime.datetime(2006, 6, 14, 14, 0, tzinfo=<GMT2 object at 0x...>)
- >>> dt2 # doctest: +ELLIPSIS
- datetime.datetime(2006, 6, 14, 13, 0, tzinfo=<GMT1 object at 0x...>)
+ >>> dt3 = dt2.astimezone(timezone.utc)
+ >>> dt3
+ datetime.datetime(2006, 6, 14, 8, 30, tzinfo=datetime.timezone.utc)
+ >>> dt2
+ datetime.datetime(2006, 6, 14, 13, 0, tzinfo=KabulTz())
>>> dt2.utctimetuple() == dt3.utctimetuple()
True
Example:
>>> from datetime import time, tzinfo, timedelta
- >>> class GMT1(tzinfo):
+ >>> class TZ1(tzinfo):
... def utcoffset(self, dt):
... return timedelta(hours=1)
... def dst(self, dt):
... return timedelta(0)
... def tzname(self,dt):
- ... return "Europe/Prague"
+ ... return "+01:00"
+ ... def __repr__(self):
+ ... return f"{self.__class__.__name__}()"
...
- >>> t = time(12, 10, 30, tzinfo=GMT1())
- >>> t # doctest: +ELLIPSIS
- datetime.time(12, 10, 30, tzinfo=<GMT1 object at 0x...>)
- >>> gmt = GMT1()
+ >>> t = time(12, 10, 30, tzinfo=TZ1())
+ >>> t
+ datetime.time(12, 10, 30, tzinfo=TZ1())
>>> t.isoformat()
'12:10:30+01:00'
>>> t.dst()
datetime.timedelta(0)
>>> t.tzname()
- 'Europe/Prague'
+ '+01:00'
>>> t.strftime("%H:%M:%S %Z")
- '12:10:30 Europe/Prague'
+ '12:10:30 +01:00'
>>> 'The {} is {:%H:%M}.'.format("time", t)
'The time is 12:10.'
microseconds should not be used, as :class:`date` objects have no such
values. If they're used anyway, ``0`` is substituted for them.
+For the :meth:`datetime.strptime` class method, the default value is ``1900-01-01T00:00:00.000``:
+any components not specified in the format string will be pulled from the default value. [#]_
+
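For example, components missing from the format string are filled from that default:

```python
from datetime import datetime

# Only the time is given, so the date defaults to 1900-01-01.
assert datetime.strptime("12:30", "%H:%M") == datetime(1900, 1, 1, 12, 30)

# Only the year is given, so month and day default to January 1.
assert datetime.strptime("1999", "%Y") == datetime(1999, 1, 1)
```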
The full set of format codes supported varies across platforms, because Python
calls the platform C library's :func:`strftime` function, and platform
variations are common. To see the full set of format codes supported on your
| | where 0 is Sunday and 6 is | | |
| | Saturday. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%d`` | Day of the month as a | 01, 02, ..., 31 | |
+| ``%d`` | Day of the month as a | 01, 02, ..., 31 | \(9) |
| | zero-padded decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
| ``%b`` | Month as locale's abbreviated || Jan, Feb, ..., Dec | \(1) |
| | || Januar, Februar, ..., | |
| | | Dezember (de_DE) | |
+-----------+--------------------------------+------------------------+-------+
-| ``%m`` | Month as a zero-padded | 01, 02, ..., 12 | |
+| ``%m`` | Month as a zero-padded | 01, 02, ..., 12 | \(9) |
| | decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%y`` | Year without century as a | 00, 01, ..., 99 | |
+| ``%y`` | Year without century as a | 00, 01, ..., 99 | \(9) |
| | zero-padded decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
| ``%Y`` | Year with century as a decimal | 0001, 0002, ..., 2013, | \(2) |
| | number. | 2014, ..., 9998, 9999 | |
+-----------+--------------------------------+------------------------+-------+
-| ``%H`` | Hour (24-hour clock) as a | 00, 01, ..., 23 | |
+| ``%H`` | Hour (24-hour clock) as a | 00, 01, ..., 23 | \(9) |
| | zero-padded decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%I`` | Hour (12-hour clock) as a | 01, 02, ..., 12 | |
+| ``%I`` | Hour (12-hour clock) as a | 01, 02, ..., 12 | \(9) |
| | zero-padded decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
| ``%p`` | Locale's equivalent of either || AM, PM (en_US); | \(1), |
| | AM or PM. || am, pm (de_DE) | \(3) |
+-----------+--------------------------------+------------------------+-------+
-| ``%M`` | Minute as a zero-padded | 00, 01, ..., 59 | |
+| ``%M`` | Minute as a zero-padded | 00, 01, ..., 59 | \(9) |
| | decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%S`` | Second as a zero-padded | 00, 01, ..., 59 | \(4) |
-| | decimal number. | | |
+| ``%S`` | Second as a zero-padded | 00, 01, ..., 59 | \(4), |
+| | decimal number. | | \(9) |
+-----------+--------------------------------+------------------------+-------+
| ``%f`` | Microsecond as a decimal | 000000, 000001, ..., | \(5) |
| | number, zero-padded on the | 999999 | |
| ``%Z`` | Time zone name (empty string | (empty), UTC, EST, CST | |
| | if the object is naive). | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%j`` | Day of the year as a | 001, 002, ..., 366 | |
+| ``%j`` | Day of the year as a | 001, 002, ..., 366 | \(9) |
| | zero-padded decimal number. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%U`` | Week number of the year | 00, 01, ..., 53 | \(7) |
-| | (Sunday as the first day of | | |
+| ``%U`` | Week number of the year | 00, 01, ..., 53 | \(7), |
+| | (Sunday as the first day of | | \(9) |
| | the week) as a zero padded | | |
| | decimal number. All days in a | | |
| | new year preceding the first | | |
| | Sunday are considered to be in | | |
| | week 0. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%W`` | Week number of the year | 00, 01, ..., 53 | \(7) |
-| | (Monday as the first day of | | |
+| ``%W`` | Week number of the year | 00, 01, ..., 53 | \(7), |
+| | (Monday as the first day of | | \(9) |
| | the week) as a decimal number. | | |
| | All days in a new year | | |
| | preceding the first Monday | | |
| ``%u`` | ISO 8601 weekday as a decimal | 1, 2, ..., 7 | |
| | number where 1 is Monday. | | |
+-----------+--------------------------------+------------------------+-------+
-| ``%V`` | ISO 8601 week as a decimal | 01, 02, ..., 53 | \(8) |
-| | number with Monday as | | |
+| ``%V`` | ISO 8601 week as a decimal | 01, 02, ..., 53 | \(8), |
+| | number with Monday as | | \(9) |
| | the first day of the week. | | |
| | Week 01 is the week containing | | |
| | Jan 4. | | |
:meth:`strptime` format string. Also note that ``%G`` and ``%Y`` are not
interchangeable.
+(9)
+ When used with the :meth:`strptime` method, the leading zero is optional
+ for formats ``%d``, ``%m``, ``%H``, ``%I``, ``%M``, ``%S``, ``%j``, ``%U``,
+ ``%W``, and ``%V``. Format ``%y`` does require a leading zero.
+
.. rubric:: Footnotes
.. [#] If, that is, we ignore the effects of Relativity
+.. [#] Calling ``datetime.strptime('Feb 29', '%b %d')`` will fail since ``1900`` is not a leap year.
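A short sketch (the dates are illustrative, not from the patch) demonstrating both note (9) and the footnote above:

```python
from datetime import datetime

# Note (9): the leading zero is optional when parsing with strptime().
assert datetime.strptime("5/7/2019", "%d/%m/%Y") == datetime(2019, 7, 5)

# The footnote: without a year, strptime() defaults to 1900, which is
# not a leap year, so 'Feb 29' cannot be parsed on its own.
try:
    datetime.strptime("Feb 29", "%b %d")
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError")
```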
.. opcode:: RAISE_VARARGS (argc)
- Raises an exception. *argc* indicates the number of arguments to the raise
- statement, ranging from 0 to 3. The handler will find the traceback as TOS2,
- the parameter as TOS1, and the exception as TOS.
+ Raises an exception using one of the 3 forms of the ``raise`` statement,
+ depending on the value of *argc*:
+
+ * 0: ``raise`` (re-raise previous exception)
+ * 1: ``raise TOS`` (raise exception instance or type at ``TOS``)
+ * 2: ``raise TOS1 from TOS`` (raise exception instance or type at ``TOS1``
+ with ``__cause__`` set to ``TOS``)
.. opcode:: CALL_FUNCTION (argc)
.. opcode:: EXTENDED_ARG (ext)
- Prefixes any opcode which has an argument too big to fit into the default two
- bytes. *ext* holds two additional bytes which, taken together with the
- subsequent opcode's argument, comprise a four-byte argument, *ext* being the
- two most-significant bytes.
+ Prefixes any opcode which has an argument too big to fit into the default one
+ byte. *ext* holds an additional byte which acts as higher bits in the argument.
+ For each opcode, at most three ``EXTENDED_ARG`` prefixes are allowed, forming
+ an argument from two to four bytes.
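A sketch of how the prefix bytes combine (the helper function name is invented for illustration; each ``EXTENDED_ARG`` byte becomes the next-higher 8 bits of the following opcode's argument):

```python
def combine_arg(prefix_bytes, final_byte):
    # Hypothetical helper: fold EXTENDED_ARG bytes into one argument,
    # most-significant byte first.
    arg = 0
    for b in prefix_bytes + [final_byte]:
        arg = (arg << 8) | b
    return arg

# One prefix byte extends a one-byte argument to two bytes.
assert combine_arg([0x01], 0x02) == 0x0102
# Three prefix bytes form the maximum four-byte argument.
assert combine_arg([0xFF, 0xFF, 0xFF], 0xFF) == 0xFFFFFFFF
```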
.. opcode:: FORMAT_VALUE (flags)
:mod:`doctest` is serious about requiring exact matches in expected output. If
even a single character doesn't match, the test fails. This will probably
surprise you a few times, as you learn exactly what Python does and doesn't
-guarantee about output. For example, when printing a dict, Python doesn't
-guarantee that the key-value pairs will be printed in any particular order, so a
-test like ::
+guarantee about output. For example, when printing a set, Python doesn't
+guarantee that the elements are printed in any particular order, so a test like ::
>>> foo()
- {"Hermione": "hippogryph", "Harry": "broomstick"}
+ {"Hermione", "Harry"}
is vulnerable! One workaround is to do ::
- >>> foo() == {"Hermione": "hippogryph", "Harry": "broomstick"}
+ >>> foo() == {"Hermione", "Harry"}
True
instead. Another is to do ::
- >>> d = sorted(foo().items())
+ >>> d = sorted(foo())
>>> d
- [('Harry', 'broomstick'), ('Hermione', 'hippogryph')]
+ ['Harry', 'Hermione']
+
+.. note::
+
+ Before Python 3.6, when printing a dict, Python did not guarantee that
+ the key-value pairs were printed in any particular order.
There are others, but you get the idea.
.. module:: email.message
:synopsis: The base class representing email messages in a fashion
backward compatible with Python 3.2
+ :noindex:
The :class:`Message` class is very similar to the
new API the functionality is provided by the *cte* parameter of
the :meth:`~email.message.EmailMessage.set_content` method.
+This module is deprecated in Python 3. The functions provided here
+should not be called explicitly since the :class:`~email.mime.text.MIMEText`
+class sets the content type and CTE header using the *_subtype* and *_charset*
+values passed during the instantiation of that class.
+
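A minimal sketch of the preferred approach, letting :class:`~email.mime.text.MIMEText` set the headers itself:

```python
from email.mime.text import MIMEText

# MIMEText derives Content-Type and Content-Transfer-Encoding from
# _subtype and _charset; no explicit encoder call is needed.
msg = MIMEText("hello", _subtype="plain", _charset="utf-8")
assert msg["Content-Type"].startswith("text/plain")
assert msg["Content-Transfer-Encoding"] == "base64"  # utf-8 maps to base64
```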
The remaining text in this section is the original documentation of the module.
When creating :class:`~email.message.Message` objects from scratch, you often
(This is required because strings cannot represent non-ASCII bytes.)
Convert any bytes with the high bit set as needed using an
ASCII-compatible :mailheader:`Content-Transfer-Encoding`. That is,
- transform parts with non-ASCII :mailheader:`Cotnent-Transfer-Encoding`
+ transform parts with non-ASCII :mailheader:`Content-Transfer-Encoding`
(:mailheader:`Content-Transfer-Encoding: 8bit`) to an ASCII compatible
:mailheader:`Content-Transfer-Encoding`, and encode RFC-invalid non-ASCII
bytes in headers using the MIME ``unknown-8bit`` character set, thus
.. exception:: PendingDeprecationWarning
- Base class for warnings about features which will be deprecated in the
- future.
+ Base class for warnings about features which are obsolete and
+ expected to be deprecated in the future, but are not deprecated
+ at the moment.
+
+ This class is rarely used as emitting a warning about a possible
+ upcoming deprecation is unusual, and :exc:`DeprecationWarning`
+ is preferred for already active deprecations.
.. exception:: SyntaxWarning
This iterates over the lines of all files listed in ``sys.argv[1:]``, defaulting
to ``sys.stdin`` if the list is empty. If a filename is ``'-'``, it is also
-replaced by ``sys.stdin``. To specify an alternative list of filenames, pass it
-as the first argument to :func:`.input`. A single file name is also allowed.
+replaced by ``sys.stdin`` and the optional arguments *mode* and *openhook*
+are ignored. To specify an alternative list of filenames, pass it as the
+first argument to :func:`.input`. A single file name is also allowed.
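A small sketch of passing an explicit file list instead of relying on ``sys.argv[1:]`` (the temporary file stands in for a real input file):

```python
import fileinput
import os
import tempfile

# Create a throwaway input file for the illustration.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("one\ntwo\n")

# Pass an alternative list of filenames as the first argument.
lines = [line.rstrip("\n") for line in fileinput.input(files=[f.name])]
assert lines == ["one", "two"]
os.unlink(f.name)
```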
All files are opened in text mode by default, but you can override this by
specifying the *mode* parameter in the call to :func:`.input` or
@classmethod
def f(cls, arg1, arg2, ...): ...
- The ``@classmethod`` form is a function :term:`decorator` -- see the description
- of function definitions in :ref:`function` for details.
+ The ``@classmethod`` form is a function :term:`decorator` -- see
+ :ref:`function` for details.
- It can be called either on the class (such as ``C.f()``) or on an instance (such
+ A class method can be called either on the class (such as ``C.f()``) or on an instance (such
as ``C().f()``). The instance is ignored except for its class. If a class
method is called for a derived class, the derived class object is passed as the
implied first argument.
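The derived-class behaviour described above can be sketched as:

```python
class Base:
    @classmethod
    def create(cls):
        # cls is whichever class the method was looked up on
        return cls()

class Derived(Base):
    pass

assert type(Base.create()) is Base
# Called via an instance of a derived class, the derived class is
# passed as the implied first argument; the instance itself is ignored.
assert type(Derived().create()) is Derived
```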
Class methods are different than C++ or Java static methods. If you want those,
- see :func:`staticmethod` in this section.
+ see :func:`staticmethod`.
- For more information on class methods, consult the documentation on the standard
- type hierarchy in :ref:`types`.
+ For more information on class methods, see :ref:`types`.
.. function:: compile(source, filename, mode, flags=0, dont_inherit=False, optimize=-1)
Update and return a dictionary representing the current local symbol table.
Free variables are returned by :func:`locals` when it is called in function
- blocks, but not in class blocks.
+ blocks, but not in class blocks. Note that at the module level, :func:`locals`
+ and :func:`globals` are the same dictionary.
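A small sketch of the module-level behaviour, using :func:`exec` with a single namespace dict to mimic module level:

```python
# With one dict, exec() uses it as both globals and locals, as at
# module level, so locals() and globals() are the same dictionary.
ns = {}
exec("same = (locals() is globals())", ns)
assert ns["same"] is True
```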
.. note::
The contents of this dictionary should not be modified; changes may not
@staticmethod
def f(arg1, arg2, ...): ...
- The ``@staticmethod`` form is a function :term:`decorator` -- see the
- description of function definitions in :ref:`function` for details.
+ The ``@staticmethod`` form is a function :term:`decorator` -- see
+ :ref:`function` for details.
- It can be called either on the class (such as ``C.f()``) or on an instance (such
- as ``C().f()``). The instance is ignored except for its class.
+ A static method can be called either on the class (such as ``C.f()``) or on an instance (such
+ as ``C().f()``).
Static methods in Python are similar to those found in Java or C++. Also see
:func:`classmethod` for a variant that is useful for creating alternate class
class C:
builtin_open = staticmethod(open)
- For more information on static methods, consult the documentation on the
- standard type hierarchy in :ref:`types`.
+ For more information on static methods, see :ref:`types`.
.. index::
The :mod:`gettext` module provides internationalization (I18N) and localization
(L10N) services for your Python modules and applications. It supports both the
-GNU ``gettext`` message catalog API and a higher level, class-based API that may
+GNU :program:`gettext` message catalog API and a higher level, class-based API that may
be more appropriate for Python files. The interface described below allows you
to write your module and application messages in one natural language, and
provide a catalog of translated messages for running under different natural
Bind the *domain* to the locale directory *localedir*. More concretely,
:mod:`gettext` will look for binary :file:`.mo` files for the given domain using
- the path (on Unix): :file:`localedir/language/LC_MESSAGES/domain.mo`, where
+ the path (on Unix): :file:`{localedir}/{language}/LC_MESSAGES/{domain}.mo`, where
*languages* is searched for in the environment variables :envvar:`LANGUAGE`,
:envvar:`LC_ALL`, :envvar:`LC_MESSAGES`, and :envvar:`LANG` respectively.
The class-based API of the :mod:`gettext` module gives you more flexibility and
greater convenience than the GNU :program:`gettext` API. It is the recommended
way of localizing your Python applications and modules. :mod:`!gettext` defines
-a "translations" class which implements the parsing of GNU :file:`.mo` format
-files, and has methods for returning strings. Instances of this "translations"
-class can also install themselves in the built-in namespace as the function
-:func:`_`.
+a :class:`GNUTranslations` class which implements the parsing of GNU :file:`.mo` format
+files, and has methods for returning strings. Instances of this class can also
+install themselves in the built-in namespace as the function :func:`_`.
.. function:: find(domain, localedir=None, languages=None, all=False)
This function implements the standard :file:`.mo` file search algorithm. It
takes a *domain*, identical to what :func:`textdomain` takes. Optional
- *localedir* is as in :func:`bindtextdomain` Optional *languages* is a list of
+ *localedir* is as in :func:`bindtextdomain`. Optional *languages* is a list of
strings, where each string is a language code.
If *localedir* is not given, then the default system locale directory is used.
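A minimal sketch of a catalog lookup with a fallback (the domain name and directory are invented; no ``.mo`` file exists there, so a :class:`NullTranslations` instance is returned):

```python
import gettext

# With fallback=True a missing catalog yields NullTranslations,
# whose gettext() passes messages through unchanged.
t = gettext.translation("myapp", localedir="/nonexistent", fallback=True)
assert t.gettext("hello") == "hello"
```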
.. function:: translation(domain, localedir=None, languages=None, class_=None, fallback=False, codeset=None)
- Return a :class:`Translations` instance based on the *domain*, *localedir*,
+ Return a ``*Translations`` instance based on the *domain*, *localedir*,
and *languages*, which are first passed to :func:`find` to get a list of the
associated :file:`.mo` file paths. Instances with identical :file:`.mo` file
- names are cached. The actual class instantiated is either *class_* if
+ names are cached. The actual class instantiated is *class_* if
provided, otherwise :class:`GNUTranslations`. The class's constructor must
take a single :term:`file object` argument. If provided, *codeset* will change
the charset used to encode translated strings in the
.. method:: _parse(fp)
- No-op'd in the base class, this method takes file object *fp*, and reads
+ No-op in the base class, this method takes file object *fp*, and reads
the data from the file, initializing its message catalog. If you have an
unsupported message catalog file format, you should override this method
to parse your format.
.. method:: info()
- Return the "protected" :attr:`_info` variable.
+ Return the "protected" :attr:`_info` variable, a dictionary containing
+ the metadata found in the message catalog file.
.. method:: charset()
:meth:`_parse` to enable reading GNU :program:`gettext` format :file:`.mo` files
in both big-endian and little-endian format.
-:class:`GNUTranslations` parses optional meta-data out of the translation
-catalog. It is convention with GNU :program:`gettext` to include meta-data as
-the translation for the empty string. This meta-data is in :rfc:`822`\ -style
+:class:`GNUTranslations` parses optional metadata out of the translation
+catalog. It is convention with GNU :program:`gettext` to include metadata as
+the translation for the empty string. This metadata is in :rfc:`822`\ -style
``key: value`` pairs, and should contain the ``Project-Id-Version`` key. If the
key ``Content-Type`` is found, then the ``charset`` property is used to
initialize the "protected" :attr:`_charset` instance variable, defaulting to
``None`` if not found. If the charset encoding is specified, then all message
ids and message strings read from the catalog are converted to Unicode using
-this encoding, else ASCII encoding is assumed.
+this encoding, else ASCII is assumed.
Since message ids are read as Unicode strings too, all :meth:`*gettext` methods
will assume message ids as Unicode strings, not byte strings.
#. run a suite of tools over your marked files to generate raw messages catalogs
-#. create language specific translations of the message catalogs
+#. create language-specific translations of the message catalogs
#. use the :mod:`gettext` module so that message strings are properly translated
filename = 'mylog.txt'
message = _('writing a log message')
- fp = open(filename, 'w')
- fp.write(message)
- fp.close()
+ with open(filename, 'w') as fp:
+ fp.write(message)
In this example, the string ``'writing a log message'`` is marked as a candidate
for translation, while the strings ``'mylog.txt'`` and ``'w'`` are not.
^^^^^^^^^^^^^^^^^^^^^^
If you are localizing your module, you must take care not to make global
-changes, e.g. to the built-in namespace. You should not use the GNU ``gettext``
+changes, e.g. to the built-in namespace. You should not use the GNU :program:`gettext`
API but instead the class-based API.
Let's say your module is called "spam" and the module's various natural language
.. [#] The default locale directory is system dependent; for example, on RedHat Linux
it is :file:`/usr/share/locale`, but on Solaris it is :file:`/usr/lib/locale`.
The :mod:`gettext` module does not try to support these system dependent
- defaults; instead its default is :file:`sys.prefix/share/locale`. For this
- reason, it is always best to call :func:`bindtextdomain` with an explicit
- absolute path at the start of your application.
+ defaults; instead its default is :file:`{sys.prefix}/share/locale` (see
+ :data:`sys.prefix`). For this reason, it is always best to call
+ :func:`bindtextdomain` with an explicit absolute path at the start of your
+ application.
.. [#] See the footnote for :func:`bindtextdomain` above.
:func:`ssl._create_unverified_context` can be passed to the *context*
parameter.
+ .. versionchanged:: 3.7.4
+ This class now enables TLS 1.3
+ :attr:`ssl.SSLContext.post_handshake_auth` for the default *context* or
+ when *cert_file* is passed with a custom *context*.
+
.. deprecated:: 3.6
*key_file* and *cert_file* are deprecated in favor of *context*.
.. class:: SimpleCookie([input])
This class derives from :class:`BaseCookie` and overrides :meth:`value_decode`
- and :meth:`value_encode` to be the identity and :func:`str` respectively.
-
+ and :meth:`value_encode`. SimpleCookie supports strings as cookie values.
+ When setting the value, SimpleCookie calls the builtin :func:`str()` to convert
+ the value to a string. Values received from HTTP are kept as strings.
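A short sketch of the string conversion described above:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = 12345          # str() is applied to the value
assert cookie["session"].value == "12345"
assert cookie.output() == "Set-Cookie: session=12345"
```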
.. seealso::
.. method:: BaseCookie.value_decode(val)
- Return a decoded value from a string representation. Return value can be any
- type. This method does nothing in :class:`BaseCookie` --- it exists so it can be
- overridden.
+ Return a tuple ``(real_value, coded_value)`` from a string representation.
+ ``real_value`` can be any type. This method does no decoding in
+ :class:`BaseCookie` --- it exists so it can be overridden.
.. method:: BaseCookie.value_encode(val)
- Return an encoded value. *val* can be any type, but return value must be a
- string. This method does nothing in :class:`BaseCookie` --- it exists so it can
+ Return a tuple ``(real_value, coded_value)``. *val* can be any type, but
+ ``coded_value`` will always be converted to a string.
+ This method does no encoding in :class:`BaseCookie` --- it exists so it can
be overridden.
In general, it should be the case that :meth:`value_encode` and
On macOS, there is one application menu. It dynamically changes according
to the window currently selected. It has an IDLE menu, and some entries
-described below are moved around to conform to Apple guidlines.
+described below are moved around to conform to Apple guidelines.
File menu (Shell and Editor)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Run menu (Editor window only)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+.. _python-shell:
+
Python Shell
Open or wake up the Python Shell window.
+.. _check-module:
+
Check Module
Check the syntax of the module currently open in the Editor window. If the
module has not been saved IDLE will either prompt the user to save or
there is a syntax error, the approximate location is indicated in the
Editor window.
+.. _run-module:
+
Run Module
- Do Check Module (above). If no error, restart the shell to clean the
+ Do :ref:`Check Module <check-module>`. If no error, restart the shell to clean the
environment, then execute the module. Output is displayed in the Shell
window. Note that output requires use of ``print`` or ``write``.
When execution is complete, the Shell retains focus and displays a prompt.
This is similar to executing a file with ``python -i file`` at a command
line.
+.. _run-custom:
+
+Run... Customized
+ Same as :ref:`Run Module <run-module>`, but run the module with customized
+ settings. *Command Line Arguments* extend :data:`sys.argv` as if passed
+ on a command line. The module can be run in the Shell without restarting.
+
+
Shell menu (Shell window only)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
menu. For more, see
:ref:`Setting preferences <preferences>` under Help and preferences.
-Zoom/Restore Height
- Toggles the window between normal size and maximum height. The initial size
- defaults to 40 lines by 80 chars unless changed on the General tab of the
- Configure IDLE dialog.
-
Show/Hide Code Context (Editor Window only)
Open a pane at the top of the edit window which shows the block context
of the code which has scrolled above the top of the window. See
:ref:`Code Context <code-context>` in the Editing and Navigation section below.
+Zoom/Restore Height
+ Toggles the window between normal size and maximum height. The initial size
+ defaults to 40 lines by 80 chars unless changed on the General tab of the
+ Configure IDLE dialog. The maximum height for a screen is determined by
+ momentarily maximizing a window the first time one is zoomed on the screen.
+ Changing screen settings may invalidate the saved height. This toggle has
+ no effect when a window is maximized.
+
Window menu (Shell and Editor)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Go to file/line
Same as in Debug menu.
-The Shell window also has an output squeezing facility explained in the
-the *Python Shell window* subsection below.
+The Shell window also has an output squeezing facility explained in the *Python
+Shell window* subsection below.
Squeeze
If the cursor is over an output line, squeeze all the output between
IDLE's changes are lost and input from the keyboard and output to the screen
will not work correctly.
+When user code raises SystemExit either directly or by calling sys.exit, IDLE
+returns to a Shell prompt instead of exiting.
+
User output in Shell
^^^^^^^^^^^^^^^^^^^^
A Windows console, for instance, keeps a user-settable 1 to 9999 lines,
with 300 the default.
-A Tk Text widget, and hence IDLE's Shell, displays characters (codepoints)
-in the the BMP (Basic Multilingual Plane) subset of Unicode.
-Which characters are displayed with a proper glyph and which with a
-replacement box depends on the operating system and installed fonts.
-Tab characters cause the following text to begin after
-the next tab stop. (They occur every 8 'characters').
-Newline characters cause following text to appear on a new line.
-Other control characters are ignored or displayed as a space, box, or
-something else, depending on the operating system and font.
-(Moving the text cursor through such output with arrow keys may exhibit
-some surprising spacing behavior.)
-
-.. code-block:: none
-
- >>> s = 'a\tb\a<\x02><\r>\bc\nd'
+A Tk Text widget, and hence IDLE's Shell, displays characters (codepoints) in
+the BMP (Basic Multilingual Plane) subset of Unicode. Which characters are
+displayed with a proper glyph and which with a replacement box depends on the
+operating system and installed fonts. Tab characters cause the following text
+to begin after the next tab stop. (They occur every 8 'characters'). Newline
+characters cause following text to appear on a new line. Other control
+characters are ignored or displayed as a space, box, or something else,
+depending on the operating system and font. (Moving the text cursor through
+such output with arrow keys may exhibit some surprising spacing behavior.) ::
+
+ >>> s = 'a\tb\a<\x02><\r>\bc\nd' # Enter 22 chars.
>>> len(s)
14
>>> s # Display repr(s)
root = tk.Tk()`` in standard Python and nothing appears. Enter the same
in IDLE and a tk window appears. In standard Python, one must also enter
``root.update()`` to see the window. IDLE does the equivalent in the
-background, about 20 times a second, which is about every 50 milleseconds.
+background, about 20 times a second, which is about every 50 milliseconds.
Next enter ``b = tk.Button(root, text='button'); b.pack()``. Again,
nothing visibly changes in standard Python until one enters ``root.update()``.
implementations represent a file that cannot be read, written or
seeked.
- Even though :class:`IOBase` does not declare :meth:`read`, :meth:`readinto`,
+ Even though :class:`IOBase` does not declare :meth:`read`
or :meth:`write` because their signatures will vary, implementations and
clients should consider those methods part of the interface. Also,
implementations may raise a :exc:`ValueError` (or :exc:`UnsupportedOperation`)
The basic type used for binary data read from or written to a file is
:class:`bytes`. Other :term:`bytes-like objects <bytes-like object>` are
- accepted as method arguments too. In some cases, such as
- :meth:`~RawIOBase.readinto`, a writable object such as :class:`bytearray`
- is required. Text I/O classes work with :class:`str` data.
+ accepted as method arguments too. Text I/O classes work with :class:`str` data.
Note that calling any method (even inquiries) on a closed stream is
undefined. Implementations may raise :exc:`ValueError` in this case.
Read bytes into a pre-allocated, writable
:term:`bytes-like object` *b*, and return the
- number of bytes read. If the object is in non-blocking mode and no bytes
+ number of bytes read. For example, *b* might be a :class:`bytearray`.
+ If the object is in non-blocking mode and no bytes
are available, ``None`` is returned.
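A minimal sketch using a :class:`bytearray` as the pre-allocated buffer:

```python
import io

buf = bytearray(4)                 # pre-allocated, writable buffer
n = io.BytesIO(b"hello").readinto(buf)
assert n == 4                      # number of bytes read
assert bytes(buf) == b"hell"       # buffer filled in place
```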
.. method:: write(b)
Read bytes into a pre-allocated, writable
:term:`bytes-like object` *b* and return the number of bytes read.
+ For example, *b* might be a :class:`bytearray`.
Like :meth:`read`, multiple reads may be issued to the underlying raw
stream, unless the latter is interactive.
.. class:: TextIOBase
Base class for text streams. This class provides a character and line based
- interface to stream I/O. There is no :meth:`readinto` method because
- Python's character strings are immutable. It inherits :class:`IOBase`.
+ interface to stream I/O. It inherits :class:`IOBase`.
There is no public constructor.
:class:`TextIOBase` provides or overrides these data attributes and
will wrap a buffered object inside a :class:`TextIOWrapper`. This includes
standard streams and therefore affects the built-in function :func:`print()` as
well.
-
--------------
-This module is always available. It provides access to the mathematical
-functions defined by the C standard.
+This module provides access to the mathematical functions defined by the C
+standard.
These functions cannot be used with complex numbers; use the functions of the
same name from the :mod:`cmath` module if you require support for complex
.. function:: factorial(x)
- Return *x* factorial. Raises :exc:`ValueError` if *x* is not integral or
+ Return *x* factorial as an integer. Raises :exc:`ValueError` if *x* is not integral or
is negative.
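A short sketch of both the integer result and the error case:

```python
import math

assert math.factorial(5) == 120
assert isinstance(math.factorial(5), int)
try:
    math.factorial(-1)             # negative input
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError")
```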
| Ordering | ``a > b`` | ``gt(a, b)`` |
+-----------------------+-------------------------+---------------------------------------+
-Inplace Operators
------------------
+In-place Operators
+------------------
Many operations have an "in-place" version. Listed below are functions
providing a more primitive access to in-place operators than the usual syntax
>>> a
'hello'
-For mutable targets such as lists and dictionaries, the inplace method
+For mutable targets such as lists and dictionaries, the in-place method
will perform the update, so no subsequent assignment is necessary:
>>> s = ['h', 'e', 'l', 'l', 'o']
.. function:: commonpath(paths)
Return the longest common sub-path of each pathname in the sequence
- *paths*. Raise ValueError if *paths* contains both absolute and relative
+ *paths*. Raise :exc:`ValueError` if *paths* contains both absolute and relative
pathnames, or if *paths* is empty. Unlike :func:`commonprefix`, this
returns a valid path.
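A sketch using the POSIX flavour (:mod:`posixpath`) so the example is platform-independent:

```python
import posixpath

assert posixpath.commonpath(["/usr/lib", "/usr/local/lib"]) == "/usr"
# Unlike commonprefix(), the result is a valid path, not "/usr/l".
try:
    posixpath.commonpath(["/usr/lib", "local/lib"])  # absolute + relative
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError")
```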
.. function:: normcase(path)
- Normalize the case of a pathname. On Unix and Mac OS X, this returns the
- path unchanged; on case-insensitive filesystems, it converts the path to
- lowercase. On Windows, it also converts forward slashes to backward slashes.
+ Normalize the case of a pathname. On Windows, convert all characters in the
+ pathname to lowercase, and also convert forward slashes to backward slashes.
+ On other operating systems, return the path unchanged.
Raise a :exc:`TypeError` if the type of *path* is not ``str`` or ``bytes`` (directly
or indirectly through the :class:`os.PathLike` interface).
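A sketch contrasting the Windows and POSIX flavours directly, which works regardless of the host platform:

```python
import ntpath     # Windows flavour
import posixpath  # POSIX flavour

# Windows: lowercase, and forward slashes become backslashes.
assert ntpath.normcase("C:/Program Files") == "c:\\program files"
# Other systems: the path is returned unchanged.
assert posixpath.normcase("/Program Files") == "/Program Files"
```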
is raised.
.. versionadded:: 3.6
- The *strict* argument.
+ The *strict* argument (pre-3.6 behavior is strict).
.. method:: Path.rglob(pattern)
Spawn a process, and connect its controlling terminal with the current
process's standard io. This is often used to baffle programs which insist on
- reading from the controlling terminal.
+ reading from the controlling terminal. It is expected that the process
+ spawned behind the pty will eventually terminate, and when it does *spawn*
+ will return.
+
+ The functions *master_read* and *stdin_read* are passed a file descriptor
+ which they should read from, and they should always return a byte string. In
+ order to force spawn to return before the child process exits an
+ :exc:`OSError` should be thrown.
+
+ The default implementation for both functions will read and return up to 1024
+ bytes each time the function is called. The *master_read* callback is passed
+ the pseudoterminal’s master file descriptor to read output from the child
+ process, and *stdin_read* is passed file descriptor 0, to read from the
+ parent process's standard input.
+
+ Returning an empty byte string from either callback is interpreted as an
+ end-of-file (EOF) condition, and that callback will not be called after
+ that. If *stdin_read* signals EOF the controlling terminal can no longer
+ communicate with the parent process OR the child process. Unless the child
+ process will quit without any input, *spawn* will then loop forever. If
+ *master_read* signals EOF the same behavior results (on Linux at least).
+
+ If both callbacks signal EOF then *spawn* will probably never return, unless
+ *select* throws an error on your platform when passed three empty lists. This
+ is a bug, documented in `issue 26228 <https://bugs.python.org/issue26228>`_.
- The functions *master_read* and *stdin_read* should be functions which read from
- a file descriptor. The defaults try to read 1024 bytes each time they are
- called.
.. versionchanged:: 3.4
:func:`spawn` now returns the status value from :func:`os.waitpid`
.. versionadded:: 3.7
Descriptors for nested definitions. They are accessed through the
- new children attibute. Each has a new parent attribute.
+ new children attribute. Each has a new parent attribute.
The descriptors returned by these functions are instances of
Function and Class classes. Users are not expected to create instances
Otherwise (*block* is false), return an item if one is immediately available,
else raise the :exc:`Empty` exception (*timeout* is ignored in that case).
+ Prior to 3.0 on POSIX systems, and for all versions on Windows, if
+ *block* is true and *timeout* is ``None``, this operation goes into
+ an uninterruptible wait on an underlying lock. This means that no exceptions
+ can occur, and in particular a SIGINT will not trigger a :exc:`KeyboardInterrupt`.
+
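A short sketch of the non-blocking behaviour:

```python
import queue

q = queue.Queue()
q.put("item")
assert q.get(block=False) == "item"     # item available: returned at once
try:
    q.get_nowait()                      # queue now empty
except queue.Empty:
    pass
else:
    raise AssertionError("expected queue.Empty")
```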
.. method:: Queue.get_nowait()
Alternative Generator
---------------------
+.. class:: Random([seed])
+
+ Class that implements the default pseudo-random number generator used by the
+ :mod:`random` module.
+
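A minimal sketch showing why separate :class:`Random` instances are useful: identically seeded generators produce identical streams, independently of the module-level functions:

```python
import random

a = random.Random(42)
b = random.Random(42)
# Same seed, same sequence; module-level random() state is untouched.
assert [a.random() for _ in range(3)] == [b.random() for _ in range(3)]
```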
.. class:: SystemRandom([seed])
Class that uses the :func:`os.urandom` function for generating random numbers
Unknown escapes in *repl* consisting of ``'\'`` and an ASCII letter
now are errors.
+ .. versionchanged:: 3.7
Empty matches for the pattern are replaced when adjacent to a previous
non-empty match.
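The changed behaviour can be sketched as follows (before 3.7 the result was ``'-a-b-d-'``):

```python
import re

# The empty match immediately after the non-empty match of 'x' is now
# replaced as well, producing the doubled '--'.
assert re.sub("x*", "-", "abxd") == "-a-b--d-"
```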
Readline keybindings may be configured via an initialization file, typically
``.inputrc`` in your home directory. See `Readline Init File
-<https://cnswww.cns.cwru.edu/php/chet/readline/rluserman.html#SEC9>`_
+<https://tiswww.cwru.edu/php/chet/readline/rluserman.html#SEC9>`_
in the GNU Readline manual for information about the format and
allowable constructs of that file, and the capabilities of the
Readline library in general.
The :class:`scheduler` class defines a generic interface to scheduling events.
It needs two functions to actually deal with the "outside world" --- *timefunc*
should be callable without arguments, and return a number (the "time", in any
- units whatsoever). If time.monotonic is not available, the *timefunc* default
- is time.time instead. The *delayfunc* function should be callable with one
+ units whatsoever). The *delayfunc* function should be callable with one
argument, compatible with the output of *timefunc*, and should delay that many
time units. *delayfunc* will also be called with the argument ``0`` after each
event is run to allow other threads an opportunity to run in multi-threaded
executed when a signal is received. A small number of default handlers are
installed: :const:`SIGPIPE` is ignored (so write errors on pipes and sockets
can be reported as ordinary Python exceptions) and :const:`SIGINT` is
-translated into a :exc:`KeyboardInterrupt` exception.
+translated into a :exc:`KeyboardInterrupt` exception if the parent process
+has not changed it.
A handler for a particular signal, once set, remains installed until it is
explicitly reset (Python emulates the BSD style interface regardless of the
Enables CAN FD support in a CAN_RAW socket. This is disabled by default.
This allows your application to send both CAN and CAN FD frames; however,
- you one must accept both CAN and CAN FD frames when reading from the socket.
+ you must accept both CAN and CAN FD frames when reading from the socket.
This constant is documented in the Linux documentation.
with open('dump.sql', 'w') as f:
for line in con.iterdump():
f.write('%s\n' % line)
+ con.close()
.. method:: backup(target, *, pages=0, progress=None, name="main", sleep=0.250)
print(f'Copied {total-remaining} of {total} pages...')
con = sqlite3.connect('existing_db.db')
- with sqlite3.connect('backup.db') as bck:
+ bck = sqlite3.connect('backup.db')
+ with bck:
con.backup(bck, pages=1, progress=progress)
+ bck.close()
+ con.close()
Example 2, copy an existing database into a transient copy::
containing the part before the separator, the separator itself or its
bytearray copy, and the part after the separator.
If the separator is not found, return a 3-tuple
- containing a copy of the original sequence, followed by two empty bytes or
- bytearray objects.
+ containing two empty bytes or bytearray objects, followed by a copy of the
+ original sequence.
The separator to search for may be any :term:`bytes-like object`.
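A short sketch of both cases (the byte strings are illustrative):

```python
data = b"key=value=pair"

found = data.rpartition(b"=")      # splits at the *last* occurrence
missing = data.rpartition(b":")    # not found: two empty objects, then a copy

print(found)     # (b'key=value', b'=', b'pair')
print(missing)   # (b'', b'', b'key=value=pair')
```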
If *capture_output* is true, stdout and stderr will be captured.
When used, the internal :class:`Popen` object is automatically created with
``stdout=PIPE`` and ``stderr=PIPE``. The *stdout* and *stderr* arguments may
- not be used as well.
+ not be supplied at the same time as *capture_output*. If you wish to capture
+ and combine both streams into one, use ``stdout=PIPE`` and ``stderr=STDOUT``
+ instead of *capture_output*.
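A minimal sketch of both forms (the child command here is illustrative):

```python
import subprocess
import sys

cmd = [sys.executable, "-c",
       "import sys; print('out'); print('err', file=sys.stderr)"]

# capture_output=True is shorthand for stdout=PIPE plus stderr=PIPE.
r = subprocess.run(cmd, capture_output=True, text=True)
print(r.stdout.strip(), r.stderr.strip())          # out err

# To combine both streams into one, use stderr=STDOUT instead.
merged = subprocess.run(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, text=True)
print(sorted(merged.stdout.split()))               # ['err', 'out']
```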
The *timeout* argument is passed to :meth:`Popen.communicate`. If the timeout
expires, the child process will be killed and waited for. The
Run the command described by *args*. Wait for command to complete, then
return the :attr:`~Popen.returncode` attribute.
- This is equivalent to::
+ Code needing to capture stdout or stderr should use :func:`run` instead::
run(...).returncode
- (except that the *input* and *check* parameters are not supported)
+ To suppress stdout or stderr, supply a value of :data:`DEVNULL`.
- The arguments shown above are merely the most
- common ones. The full function signature is largely the
+ The arguments shown above are merely some common ones.
+ The full function signature is the
same as that of the :class:`Popen` constructor - this function passes all
supplied arguments other than *timeout* directly through to that interface.
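A quick sketch of the suggested usage (the child command is illustrative):

```python
import subprocess
import sys

# call() returns only the exit status; output is not captured, but it can
# be suppressed by sending it to DEVNULL.
rc = subprocess.call([sys.executable, "-c", "print('noise'); raise SystemExit(3)"],
                     stdout=subprocess.DEVNULL)
print(rc)   # 3
```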
:exc:`CalledProcessError` object will have the return code in the
:attr:`~CalledProcessError.returncode` attribute.
- This is equivalent to::
+ Code needing to capture stdout or stderr should use :func:`run` instead::
run(..., check=True)
- (except that the *input* parameter is not supported)
+ To suppress stdout or stderr, supply a value of :data:`DEVNULL`.
- The arguments shown above are merely the most
- common ones. The full function signature is largely the
+ The arguments shown above are merely some common ones.
+ The full function signature is the
same as that of the :class:`Popen` constructor - this function passes all
supplied arguments other than *timeout* directly through to that interface.
run(..., check=True, stdout=PIPE).stdout
- The arguments shown above are merely the most common ones.
+ The arguments shown above are merely some common ones.
The full function signature is largely the same as that of :func:`run` -
most arguments are passed directly through to that interface.
However, explicitly passing ``input=None`` to inherit the parent's
encoding of the output data may depend on the command being invoked, so the
decoding to text will often need to be handled at the application level.
- This behaviour may be overridden by setting *universal_newlines* to
- ``True`` as described above in :ref:`frequently-used-arguments`.
+ This behaviour may be overridden by setting *text*, *encoding*, *errors*,
+ or *universal_newlines* to ``True`` as described in
+ :ref:`frequently-used-arguments` and :func:`run`.
To also capture standard error in the result, use
``stderr=subprocess.STDOUT``::
To loop over the standard input, or the list of files given on the
command line, see the :mod:`fileinput` module.
+ .. note::
+ On Unix, command line arguments are passed as bytes by the OS. Python decodes
+ them with the filesystem encoding and the ``surrogateescape`` error handler.
+ When you need the original bytes, you can get them with
+ ``[os.fsencode(arg) for arg in sys.argv]``.
+
.. data:: base_exec_prefix
Set the system's trace function, which allows you to implement a Python
source code debugger in Python. The function is thread-specific; for a
- debugger to support multiple threads, it must be registered using
- :func:`settrace` for each thread being debugged.
+ debugger to support multiple threads, it must register a trace function using
+ :func:`settrace` for each thread being debugged or use :func:`threading.settrace`.
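A minimal sketch of installing and removing a trace function (the traced function is illustrative):

```python
import sys

events = []

def tracer(frame, event, arg):
    events.append((event, frame.f_code.co_name))
    return tracer          # keep receiving 'line'/'return' events for this frame

def demo():
    x = 1                  # traced: produces a 'line' event

sys.settrace(tracer)
demo()
sys.settrace(None)         # uninstall
print(events)              # [('call', 'demo'), ('line', 'demo'), ('return', 'demo')]
```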
Trace functions should have three arguments: *frame*, *event*, and
*arg*. *frame* is the current stack frame. *event* is a string: ``'call'``,
You may override this method in a subclass. The standard :meth:`run`
method invokes the callable object passed to the object's constructor as
- the *target* argument, if any, with sequential and keyword arguments taken
+ the *target* argument, if any, with positional and keyword arguments taken
from the *args* and *kwargs* arguments, respectively.
.. method:: join(timeout=None)
:c:func:`QueryPerformanceCounter`. The resolution is typically better than one
microsecond.
- .. deprecated:: 3.3
+ .. deprecated-removed:: 3.3 3.8
The behaviour of this function depends on the platform: use
:func:`perf_counter` or :func:`process_time` instead, depending on your
requirements, to have a well defined behaviour.
>>> timeit.timeit('"-".join(map(str, range(100)))', number=10000)
0.23702679807320237
+A callable can also be passed from the :ref:`python-interface`::
-Note however that :mod:`timeit` will automatically determine the number of
+ >>> timeit.timeit(lambda: "-".join(map(str, range(100))), number=10000)
+ 0.19665591977536678
+
+Note however that :func:`.timeit` will automatically determine the number of
repetitions only when the command-line interface is used. In the
:ref:`timeit-examples` section you can find more advanced examples.
`TKDocs <http://www.tkdocs.com/>`_
Extensive tutorial plus friendlier widget pages for some of the widgets.
- `Tkinter reference: a GUI for Python <https://infohost.nmt.edu/tcc/help/pubs/tkinter/web/index.html>`_
+ `Tkinter 8.5 reference: a GUI for Python <https://web.archive.org/web/20190524140835/https://infohost.nmt.edu/tcc/help/pubs/tkinter/web/index.html>`_
On-line reference material.
`Tkinter docs from effbot <http://effbot.org/tkinterbook/>`_
Book by Mark Lutz, has excellent coverage of Tkinter.
`Modern Tkinter for Busy Python Developers <https://www.amazon.com/Modern-Tkinter-Python-Developers-ebook/dp/B0071QDNLO/>`_
- Book by Mark Rozerman about building attractive and modern graphical user interfaces with Python and Tkinter.
+ Book by Mark Roseman about building attractive and modern graphical user interfaces with Python and Tkinter.
`Python and Tkinter Programming <https://www.manning.com/books/python-and-tkinter-programming>`_
Book by John Grayson (ISBN 1-884777-81-3).
`Tcl/Tk recent man pages <https://www.tcl.tk/doc/>`_
Recent Tcl/Tk manuals on www.tcl.tk.
- `ActiveState Tcl Home Page <http://tcl.activestate.com/>`_
+ `Tcl/Tk Home Page <https://www.tcl.tk>`_
 The home page of Tcl/Tk development.
`Tcl and the Tk Toolkit <https://www.amazon.com/exec/obidos/ASIN/020163337X>`_
============
Turtle graphics is a popular way for introducing programming to kids. It was
-part of the original Logo programming language developed by Wally Feurzig and
-Seymour Papert in 1966.
+part of the original Logo programming language developed by Wally Feurzeig,
+Seymour Papert and Cynthia Solomon in 1967.
Imagine a robotic turtle starting at (0, 0) in the x-y plane. After an ``import turtle``, give it the
command ``turtle.forward(15)``, and it moves (on-screen!) 15 pixels in the
A generic version of :class:`collections.abc.Collection`
- .. versionadded:: 3.6
+ .. versionadded:: 3.6.0
.. class:: AbstractSet(Sized, Collection[T_co])
A generic version of :class:`collections.deque`.
+ .. versionadded:: 3.5.4
.. versionadded:: 3.6.1
.. class:: List(list, MutableSequence[T])
A generic version of :class:`collections.abc.Awaitable`.
+ .. versionadded:: 3.5.2
+
.. class:: Coroutine(Awaitable[V_co], Generic[T_co, T_contra, V_co])
A generic version of :class:`collections.abc.Coroutine`.
async def bar() -> None:
x = await c # type: int
+ .. versionadded:: 3.5.3
+
.. class:: AsyncIterable(Generic[T_co])
A generic version of :class:`collections.abc.AsyncIterable`.
+ .. versionadded:: 3.5.2
+
.. class:: AsyncIterator(AsyncIterable[T_co])
A generic version of :class:`collections.abc.AsyncIterator`.
+ .. versionadded:: 3.5.2
+
.. class:: ContextManager(Generic[T_co])
A generic version of :class:`contextlib.AbstractContextManager`.
- .. versionadded:: 3.6
+ .. versionadded:: 3.5.4
+ .. versionadded:: 3.6.0
.. class:: AsyncContextManager(Generic[T_co])
A generic version of :class:`contextlib.AbstractAsyncContextManager`.
- .. versionadded:: 3.6
+ .. versionadded:: 3.5.4
+ .. versionadded:: 3.6.2
.. class:: Dict(dict, MutableMapping[KT, VT])
A generic version of :class:`collections.Counter`.
+ .. versionadded:: 3.5.4
.. versionadded:: 3.6.1
.. class:: ChainMap(collections.ChainMap, MutableMapping[KT, VT])
A generic version of :class:`collections.ChainMap`.
+ .. versionadded:: 3.5.4
.. versionadded:: 3.6.1
.. class:: Generator(Iterator[T_co], Generic[T_co, T_contra, V_co])
yield start
start = await increment(start)
- .. versionadded:: 3.5.4
+ .. versionadded:: 3.6.1
.. class:: Text
.. class:: NamedTuple
- Typed version of namedtuple.
+ Typed version of :func:`collections.namedtuple`.
Usage::
This wraps the decorator with something that wraps the decorated
function in :func:`no_type_check`.
+.. decorator:: type_check_only
+
+ Decorator to mark a class or function to be unavailable at runtime.
+
+ This decorator is itself not available at runtime. It is mainly
+ intended to mark classes that are defined in type stub files if
+ an implementation returns an instance of a private class::
+
+ @type_check_only
+ class Response: # private or not available at runtime
+ code: int
+ def get_header(self, name: str) -> str: ...
+
+ def fetch_response() -> Response: ...
+
+ Note that returning instances of private classes is not recommended.
+ It is usually preferable to make such classes public.
+
.. data:: Any
Special type indicating an unconstrained type.
raise RuntimeError('no way')
.. versionadded:: 3.5.4
+ .. versionadded:: 3.6.2
.. data:: Union
>>> test_function()
If :func:`patch.multiple` is used as a context manager, the value returned by the
-context manger is a dictionary where created mocks are keyed by name:
+context manager is a dictionary where created mocks are keyed by name::
>>> with patch.multiple('__main__', thing=DEFAULT, other=DEFAULT) as values:
... assert 'other' in repr(values['other'])
:class:`TestResult`.
Skipping a test is simply a matter of using the :func:`skip` :term:`decorator`
-or one of its conditional variants.
+or one of its conditional variants, calling :meth:`TestCase.skipTest` within a
+:meth:`~TestCase.setUp` or test method, or raising :exc:`SkipTest` directly.
Basic skipping looks like this::
# windows specific testing code
pass
+ def test_maybe_skipped(self):
+ if not external_resource_available():
+ self.skipTest("external resource not available")
+ # test code that depends on the external resource
+ pass
+
This is the output of running the example above in verbose mode::
test_format (__main__.MyTestCase) ... skipped 'not supported in this library version'
test_nothing (__main__.MyTestCase) ... skipped 'demonstrating skipping'
+ test_maybe_skipped (__main__.MyTestCase) ... skipped 'external resource not available'
test_windows_support (__main__.MyTestCase) ... skipped 'requires Windows'
----------------------------------------------------------------------
- Ran 3 tests in 0.005s
+ Ran 4 tests in 0.005s
- OK (skipped=3)
+ OK (skipped=4)
Classes can be skipped just like methods::
return lambda func: func
return unittest.skip("{!r} doesn't have {!r}".format(obj, attr))
-The following decorators implement test skipping and expected failures:
+The following decorators and exception implement test skipping and expected failures:
.. decorator:: skip(reason)
int('XYZ')
.. versionadded:: 3.1
- under the name ``assertRaisesRegexp``.
+ Added under the name ``assertRaisesRegexp``.
.. versionchanged:: 3.2
Renamed to :meth:`assertRaisesRegex`.
+---------------------------------------+--------------------------------+--------------+
| :meth:`assertCountEqual(a, b) | *a* and *b* have the same | 3.2 |
| <TestCase.assertCountEqual>` | elements in the same number, | |
- | | regardless of their order | |
+ | | regardless of their order. | |
+---------------------------------------+--------------------------------+--------------+
expression suitable for use by :func:`re.search`.
.. versionadded:: 3.1
- under the name ``assertRegexpMatches``.
+ Added under the name ``assertRegexpMatches``.
.. versionchanged:: 3.2
The method ``assertRegexpMatches()`` has been renamed to
:meth:`.assertRegex`.
============================== ====================== =======================
.. deprecated:: 3.1
- the fail* aliases listed in the second column.
+ The fail* aliases listed in the second column have been deprecated.
.. deprecated:: 3.2
- the assert* aliases listed in the third column.
+ The assert* aliases listed in the third column have been deprecated.
.. deprecated:: 3.2
``assertRegexpMatches`` and ``assertRaisesRegexp`` have been renamed to
:meth:`.assertRegex` and :meth:`.assertRaisesRegex`.
.. deprecated:: 3.5
- the ``assertNotRegexpMatches`` name in favor of :meth:`.assertNotRegex`.
+ The ``assertNotRegexpMatches`` name is deprecated in favor of :meth:`.assertNotRegex`.
.. _testsuite-objects:
.. function:: urlparse(urlstring, scheme='', allow_fragments=True)
- Parse a URL into six components, returning a 6-tuple. This corresponds to the
- general structure of a URL: ``scheme://netloc/path;parameters?query#fragment``.
+ Parse a URL into six components, returning a 6-item :term:`named tuple`. This
+ corresponds to the general structure of a URL:
+ ``scheme://netloc/path;parameters?query#fragment``.
Each tuple item is a string, possibly empty. The components are not broken up in
smaller parts (for example, the network location is a single string), and %
escapes are not expanded. The delimiters as shown above are not part of the
or query component, and :attr:`fragment` is set to the empty string in
the return value.
- The return value is actually an instance of a subclass of :class:`tuple`. This
- class has the following additional read-only convenience attributes:
+ The return value is a :term:`named tuple`, which means that its items can
+ be accessed by index or as named attributes, which are:
+------------------+-------+--------------------------+----------------------+
| Attribute | Index | Value | Value if not present |
``#``, ``@``, or ``:`` will raise a :exc:`ValueError`. If the URL is
decomposed before parsing, no error will be raised.
+ As is the case with all named tuples, the subclass has a few additional methods
+ and attributes that are particularly useful. One such method is :meth:`_replace`.
+ The :meth:`_replace` method will return a new ParseResult object replacing specified
+ fields with new values.
+
+ .. doctest::
+ :options: +NORMALIZE_WHITESPACE
+
+ >>> from urllib.parse import urlparse
+ >>> u = urlparse('//www.cwi.nl:80/%7Eguido/Python.html')
+ >>> u
+ ParseResult(scheme='', netloc='www.cwi.nl:80', path='/%7Eguido/Python.html',
+ params='', query='', fragment='')
+ >>> u._replace(scheme='http')
+ ParseResult(scheme='http', netloc='www.cwi.nl:80', path='/%7Eguido/Python.html',
+ params='', query='', fragment='')
+
+
.. versionchanged:: 3.2
Added IPv6 URL parsing capabilities.
This should generally be used instead of :func:`urlparse` if the more recent URL
syntax allowing parameters to be applied to each segment of the *path* portion
of the URL (see :rfc:`2396`) is wanted. A separate function is needed to
- separate the path segments and parameters. This function returns a 5-tuple:
- (addressing scheme, network location, path, query, fragment identifier).
+ separate the path segments and parameters. This function returns a 5-item
+ :term:`named tuple`::
+
+ (addressing scheme, network location, path, query, fragment identifier).
- The return value is actually an instance of a subclass of :class:`tuple`. This
- class has the following additional read-only convenience attributes:
+ The return value is a :term:`named tuple`, its items can be accessed by index
+ or as named attributes:
+------------------+-------+-------------------------+----------------------+
| Attribute | Index | Value | Value if not present |
string. If there is no fragment identifier in *url*, return *url* unmodified
and an empty string.
- The return value is actually an instance of a subclass of :class:`tuple`. This
- class has the following additional read-only convenience attributes:
+ The return value is a :term:`named tuple`, its items can be accessed by index
+ or as named attributes:
+------------------+-------+-------------------------+----------------------+
| Attribute | Index | Value | Value if not present |
*handler* should be an instance of :class:`BaseHandler`. The following methods
are searched, and added to the possible chains (note that HTTP errors are a
- special case).
+ special case). Note that, in the following, *protocol* should be replaced
+ with the actual protocol to handle, for example :meth:`http_response` would
+ be the HTTP protocol response handler. Also *type* should be replaced with
+ the actual HTTP code, for example :meth:`http_error_404` would handle HTTP
+ 404 errors.
- * :meth:`protocol_open` --- signal that the handler knows how to open *protocol*
+ * :meth:`<protocol>_open` --- signal that the handler knows how to open *protocol*
URLs.
- * :meth:`http_error_type` --- signal that the handler knows how to handle HTTP
+ See |protocol_open|_ for more information.
+
+ * :meth:`http_error_\<type\>` --- signal that the handler knows how to handle HTTP
errors with HTTP error code *type*.
- * :meth:`protocol_error` --- signal that the handler knows how to handle errors
+ See |http_error_nnn|_ for more information.
+
+ * :meth:`<protocol>_error` --- signal that the handler knows how to handle errors
from (non-\ ``http``) *protocol*.
- * :meth:`protocol_request` --- signal that the handler knows how to pre-process
+ * :meth:`<protocol>_request` --- signal that the handler knows how to pre-process
*protocol* requests.
- * :meth:`protocol_response` --- signal that the handler knows how to
+ See |protocol_request|_ for more information.
+
+ * :meth:`<protocol>_response` --- signal that the handler knows how to
post-process *protocol* responses.
+ See |protocol_response|_ for more information.
+
+.. |protocol_open| replace:: :meth:`BaseHandler.<protocol>_open`
+.. |http_error_nnn| replace:: :meth:`BaseHandler.http_error_\<nnn\>`
+.. |protocol_request| replace:: :meth:`BaseHandler.<protocol>_request`
+.. |protocol_response| replace:: :meth:`BaseHandler.<protocol>_response`
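A hypothetical handler sketch showing the naming convention (the class and header names are illustrative, not part of the documented API):

```python
import urllib.request

# Its http_request method (a <protocol>_request method, with protocol
# "http") is called to pre-process every HTTP request.
class HeaderProcessor(urllib.request.BaseHandler):
    def http_request(self, req):
        req.add_header("X-Demo", "1")
        return req

opener = urllib.request.build_opener(HeaderProcessor)
installed = any(isinstance(h, HeaderProcessor) for h in opener.handlers)
print(installed)   # True
```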
.. method:: OpenerDirector.open(url, data=None[, timeout])
Handle an error of the given protocol. This will call the registered error
handlers for the given protocol with the given arguments (which are protocol
specific). The HTTP protocol is a special case which uses the HTTP response
- code to determine the specific error handler; refer to the :meth:`http_error_\*`
+ code to determine the specific error handler; refer to the :meth:`http_error_\<type\>`
methods of the handler classes.
Return values and exceptions raised are the same as those of :func:`urlopen`.
The order in which these methods are called within each stage is determined by
sorting the handler instances.
-#. Every handler with a method named like :meth:`protocol_request` has that
+#. Every handler with a method named like :meth:`<protocol>_request` has that
method called to pre-process the request.
-#. Handlers with a method named like :meth:`protocol_open` are called to handle
+#. Handlers with a method named like :meth:`<protocol>_open` are called to handle
the request. This stage ends when a handler either returns a non-\ :const:`None`
value (ie. a response), or raises an exception (usually
:exc:`~urllib.error.URLError`). Exceptions are allowed to propagate.
In fact, the above algorithm is first tried for methods named
:meth:`default_open`. If all such methods return :const:`None`, the algorithm
- is repeated for methods named like :meth:`protocol_open`. If all such methods
+ is repeated for methods named like :meth:`<protocol>_open`. If all such methods
return :const:`None`, the algorithm is repeated for methods named
:meth:`unknown_open`.
:class:`OpenerDirector` instance's :meth:`~OpenerDirector.open` and
:meth:`~OpenerDirector.error` methods.
-#. Every handler with a method named like :meth:`protocol_response` has that
+#. Every handler with a method named like :meth:`<protocol>_response` has that
method called to post-process the response.
.. note::
The convention has been adopted that subclasses defining
- :meth:`protocol_request` or :meth:`protocol_response` methods are named
+ :meth:`<protocol>_request` or :meth:`<protocol>_response` methods are named
:class:`\*Processor`; all others are named :class:`\*Handler`.
This method will be called before any protocol-specific open method.
-.. method:: BaseHandler.protocol_open(req)
+.. _protocol_open:
+.. method:: BaseHandler.<protocol>_open(req)
:noindex:
This method is *not* defined in :class:`BaseHandler`, but subclasses should
:func:`urlopen`.
-.. method:: BaseHandler.http_error_nnn(req, fp, code, msg, hdrs)
+.. _http_error_nnn:
+.. method:: BaseHandler.http_error_<nnn>(req, fp, code, msg, hdrs)
*nnn* should be a three-digit HTTP error code. This method is also not defined
in :class:`BaseHandler`, but will be called, if it exists, on an instance of a
:meth:`http_error_default`.
-.. method:: BaseHandler.protocol_request(req)
+.. _protocol_request:
+.. method:: BaseHandler.<protocol>_request(req)
:noindex:
This method is *not* defined in :class:`BaseHandler`, but subclasses should
:class:`Request` object.
-.. method:: BaseHandler.protocol_response(req, response)
+.. _protocol_response:
+.. method:: BaseHandler.<protocol>_response(req, response)
:noindex:
This method is *not* defined in :class:`BaseHandler`, but subclasses should
--------------------
-.. method:: ProxyHandler.protocol_open(request)
+.. method:: ProxyHandler.<protocol>_open(request)
:noindex:
- The :class:`ProxyHandler` will have a method :meth:`protocol_open` for every
+ The :class:`ProxyHandler` will have a method :meth:`<protocol>_open` for every
*protocol* which has a proxy in the *proxies* dictionary given in the
constructor. The method will modify requests to go through the proxy, by
calling ``request.set_proxy()``, and call the next handler in the chain to
For 200 error codes, the response object is returned immediately.
For non-200 error codes, this simply passes the job on to the
- :meth:`protocol_error_code` handler methods, via :meth:`OpenerDirector.error`.
+ :meth:`http_error_\<type\>` handler methods, via :meth:`OpenerDirector.error`.
Eventually, :class:`HTTPDefaultErrorHandler` will raise an
:exc:`~urllib.error.HTTPError` if no other handler handles the error.
The *data* argument has the same meaning as the *data* argument of
:func:`urlopen`.
+ This method always quotes *fullurl* using :func:`~urllib.parse.quote`.
.. method:: open_unknown(fullurl, data=None)
There is also a module-level convenience function:
.. function:: create(env_dir, system_site_packages=False, clear=False, \
- symlinks=False, with_pip=False)
+ symlinks=False, with_pip=False, prompt=None)
Create an :class:`EnvBuilder` with the given keyword arguments, and call its
:meth:`~EnvBuilder.create` method with the *env_dir* argument.
+ .. versionadded:: 3.3
+
.. versionchanged:: 3.4
Added the ``with_pip`` parameter
+ .. versionchanged:: 3.6
+ Added the ``prompt`` parameter
+
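A small sketch of the convenience function, including the new *prompt* parameter (the directory handling is illustrative):

```python
import os
import tempfile
import venv

# Create a throwaway environment; *prompt* customizes the name shown by
# the activation scripts (3.6+), and with_pip=False skips bootstrapping pip.
with tempfile.TemporaryDirectory() as env_dir:
    venv.create(env_dir, with_pip=False, prompt="demo")
    cfg_exists = os.path.isfile(os.path.join(env_dir, "pyvenv.cfg"))

print(cfg_exists)   # True
```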
An example of extending ``EnvBuilder``
--------------------------------------
+----------------------------------+-----------------------------------------------+
.. versionchanged:: 3.7
- Previously :exc:`DeprecationWarning` and :exc:`FutureWarning` were
- distinguished based on whether a feature was being removed entirely or
- changing its behaviour. They are now distinguished based on their
- intended audience and the way they're handled by the default warnings
- filters.
+ Previously :exc:`DeprecationWarning` and :exc:`FutureWarning` were
+ distinguished based on whether a feature was being removed entirely or
+ changing its behaviour. They are now distinguished based on their
+ intended audience and the way they're handled by the default warnings
+ filters.
.. _warning-filter:
Not all objects can be weakly referenced; those objects which can include class
instances, functions written in Python (but not in C), instance methods, sets,
-frozensets, some :term:`file objects <file object>`, :term:`generator`\s, type
-objects, sockets, arrays, deques, regular expression pattern objects, and code
+frozensets, some :term:`file objects <file object>`, :term:`generators <generator>`,
+type objects, sockets, arrays, deques, regular expression pattern objects, and code
objects.
.. versionchanged:: 3.2
obj = Dict(red=1, green=2, blue=3) # this object is weak referenceable
-Other built-in types such as :class:`tuple` and :class:`int` do not support weak
-references even when subclassed (This is an implementation detail and may be
-different across various Python implementations.).
+.. impl-detail::
+
+ Other built-in types such as :class:`tuple` and :class:`int` do not support weak
+ references even when subclassed.
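A short sketch of both behaviours under CPython (the subclass names are illustrative):

```python
import weakref

class Dict(dict):
    pass

obj = Dict(red=1)
alive = weakref.ref(obj)() is obj      # dict subclasses are weak referenceable
print(alive)                           # True

class Int(int):
    pass

try:
    weakref.ref(Int(5))                # int subclasses are not, in CPython
    int_supported = True
except TypeError:
    int_supported = False
print(int_supported)                   # False
```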
Extension types can easily be made to support weak references; see
:ref:`weakref-support`.
:const:`False`, a finalizer will be called when the program exits if it
is still alive. For instance
- >>> obj = Object()
- >>> weakref.finalize(obj, print, "obj dead or exiting") #doctest:+ELLIPSIS
- <finalize object at ...; for 'Object' at ...>
- >>> exit() #doctest:+SKIP
- obj dead or exiting
+.. doctest::
+ :options: +SKIP
+
+ >>> obj = Object()
+ >>> weakref.finalize(obj, print, "obj dead or exiting")
+ <finalize object at ...; for 'Object' at ...>
+ >>> exit()
+ obj dead or exiting
Comparing finalizers with :meth:`__del__` methods
.. method:: Node.writexml(writer, indent="", addindent="", newl="")
- Write XML to the writer object. The writer should have a :meth:`write` method
- which matches that of the file object interface. The *indent* parameter is the
- indentation of the current node. The *addindent* parameter is the incremental
- indentation to use for subnodes of the current one. The *newl* parameter
- specifies the string to use to terminate newlines.
+ Write XML to the writer object. The writer receives text rather than bytes
+ as input; it should have a :meth:`write` method which matches that of the
+ file object interface. The *indent* parameter is the indentation of the current node.
+ The *addindent* parameter is the incremental indentation to use for subnodes
+ of the current one. The *newl* parameter specifies the string to use to
+ terminate newlines.
For the :class:`Document` node, an additional keyword argument *encoding* can
be used to specify the encoding field of the XML header.
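A minimal sketch using an in-memory text writer (the document content is illustrative):

```python
import io
from xml.dom.minidom import parseString

doc = parseString("<root><item/></root>")
buf = io.StringIO()                    # accepts text, as writexml writes str
doc.writexml(buf, addindent="  ", newl="\n")
xml_text = buf.getvalue()
print(xml_text)
```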
* :class:`DOMTimeStamp`
-* :class:`DocumentType`
-
-* :class:`DOMImplementation`
-
-* :class:`CharacterData`
-
-* :class:`CDATASection`
-
-* :class:`Notation`
-
-* :class:`Entity`
-
* :class:`EntityReference`
-* :class:`DocumentFragment`
-
Most of these reflect information in the XML document that is not of general
utility to most DOM users.
---------
The file :file:`Python/pyhash.c` contains Marek Majkowski's implementation of
-Dan Bernstein's SipHash24 algorithm. The contains the following note::
+Dan Bernstein's SipHash24 algorithm. It contains the following note::
<MIT License>
Copyright (c) 2013 Marek Majkowski <marek@popcount.org>
| :attr:`__doc__` | The function's documentation | Writable |
| | string, or ``None`` if | |
| | unavailable; not inherited by | |
- | | subclasses | |
+ | | subclasses. | |
+-------------------------+-------------------------------+-----------+
- | :attr:`~definition.\ | The function's name | Writable |
+ | :attr:`~definition.\ | The function's name. | Writable |
| __name__` | | |
+-------------------------+-------------------------------+-----------+
| :attr:`~definition.\ | The function's | Writable |
- | __qualname__` | :term:`qualified name` | |
+ | __qualname__` | :term:`qualified name`. | |
| | | |
| | .. versionadded:: 3.3 | |
+-------------------------+-------------------------------+-----------+
| | argument values for those | |
| | arguments that have defaults, | |
| | or ``None`` if no arguments | |
- | | have a default value | |
+ | | have a default value. | |
+-------------------------+-------------------------------+-----------+
| :attr:`__code__` | The code object representing | Writable |
| | the compiled function body. | |
Called by the :func:`format` built-in function,
and by extension, evaluation of :ref:`formatted string literals
<f-strings>` and the :meth:`str.format` method, to produce a "formatted"
- string representation of an object. The ``format_spec`` argument is
+ string representation of an object. The *format_spec* argument is
a string that contains a description of the formatting options desired.
- The interpretation of the ``format_spec`` argument is up to the type
+ The interpretation of the *format_spec* argument is up to the type
implementing :meth:`__format__`, however most classes will either
delegate formatting to one of the built-in types, or use a similar
formatting option syntax.
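A small sketch of delegating to a built-in type's formatting (the class and unit are illustrative):

```python
class Distance:
    def __init__(self, meters):
        self.meters = meters

    def __format__(self, format_spec):
        # Delegate to built-in float formatting, then append a unit.
        return format(self.meters, format_spec or ".1f") + " m"

plain = format(Distance(1.5))     # empty spec falls back to ".1f"
fixed = f"{Distance(1.5):.3f}"    # format_spec comes from the f-string
print(plain, fixed)               # 1.5 m 1.500 m
```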
When a class definition is executed, the following steps occur:
-* MRO entries are resolved
-* the appropriate metaclass is determined
-* the class namespace is prepared
-* the class body is executed
-* the class object is created
+* MRO entries are resolved;
+* the appropriate metaclass is determined;
+* the class namespace is prepared;
+* the class body is executed;
+* the class object is created.
Resolving MRO entries
The appropriate metaclass for a class definition is determined as follows:
-* if no bases and no explicit metaclass are given, then :func:`type` is used
+* if no bases and no explicit metaclass are given, then :func:`type` is used;
* if an explicit metaclass is given and it is *not* an instance of
- :func:`type`, then it is used directly as the metaclass
+ :func:`type`, then it is used directly as the metaclass;
* if an instance of :func:`type` is given as the explicit metaclass, or
- bases are defined, then the most derived metaclass is used
+ bases are defined, then the most derived metaclass is used.
The most derived metaclass is selected from the explicitly specified
metaclass (if any) and the metaclasses (i.e. ``type(cls)``) of all specified
* first, ``type.__new__`` collects all of the descriptors in the class
namespace that define a :meth:`~object.__set_name__` method;
* second, all of these ``__set_name__`` methods are called with the class
- being defined and the assigned name of that particular descriptor; and
+ being defined and the assigned name of that particular descriptor;
* finally, the :meth:`~object.__init_subclass__` hook is called on the
immediate parent of the new class in its method resolution order.
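The steps above can be sketched as follows (the class names are illustrative):

```python
class Named:
    def __set_name__(self, owner, name):
        # called with the class being defined and the assigned name
        self.name = name

class Base:
    subclasses = []

    def __init_subclass__(cls, **kwargs):
        # called on the immediate parent for each new subclass
        super().__init_subclass__(**kwargs)
        Base.subclasses.append(cls.__name__)

class Child(Base):
    field = Named()

print(Child.field.name, Base.subclasses)   # field ['Child']
```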
-----------------------
One can implement the generic class syntax as specified by :pep:`484`
-(for example ``List[int]``) by defining a special method
+(for example ``List[int]``) by defining a special method:
.. classmethod:: object.__class_getitem__(cls, key)
When an exception is not handled at all, the interpreter terminates execution of
the program, or returns to its interactive main loop. In either case, it prints
-a stack backtrace, except when the exception is :exc:`SystemExit`.
+a stack traceback, except when the exception is :exc:`SystemExit`.
Exceptions are identified by class instances. The :keyword:`except` clause is
selected depending on the class of the instance: it must reference the class of
.. index:: pair: empty; tuple
An empty pair of parentheses yields an empty tuple object. Since tuples are
-immutable, the rules for literals apply (i.e., two occurrences of the empty
+immutable, the same rules as for literals apply (i.e., two occurrences of the empty
tuple may or may not yield the same object).
.. index::
When the underlying iterator is complete, the :attr:`~StopIteration.value`
attribute of the raised :exc:`StopIteration` instance becomes the value of
the yield expression. It can be either set explicitly when raising
-:exc:`StopIteration`, or automatically when the sub-iterator is a generator
-(by returning a value from the sub-generator).
+:exc:`StopIteration`, or automatically when the subiterator is a generator
+(by returning a value from the subgenerator).
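A small sketch of this behaviour: the subgenerator's ``return`` value becomes the value of the ``yield from`` expression in the delegating generator.

```python
def subgenerator():
    yield 1
    yield 2
    return "done"   # stored on StopIteration.value when exhausted

def delegator():
    # yield from forwards the yielded values, then evaluates to the
    # subgenerator's return value.
    result = yield from subgenerator()
    yield result

assert list(delegator()) == [1, 2, "done"]
```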
.. versionchanged:: 3.3
Added ``yield from <expr>`` to delegate control flow to a subiterator.
:pep:`380` - Syntax for Delegating to a Subgenerator
The proposal to introduce the :token:`yield_from` syntax, making delegation
- to sub-generators easy.
+ to subgenerators easy.
:pep:`525` - Asynchronous Generators
The proposal that expanded on :pep:`492` by adding generator capabilities to
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The presence of a yield expression in a function or method defined using
-:keyword:`async def` further defines the function as a
+:keyword:`async def` further defines the function as an
:term:`asynchronous generator` function.
When an asynchronous generator function is called, it returns an
Returns an awaitable which when run starts to execute the asynchronous
generator or resumes it at the last executed yield expression. When an
- asynchronous generator function is resumed with a :meth:`~agen.__anext__`
+ asynchronous generator function is resumed with an :meth:`~agen.__anext__`
method, the current yield expression always evaluates to :const:`None` in
the returned awaitable, which when run will continue to the next yield
expression. The value of the :token:`expression_list` of the yield
expression is the value of the :exc:`StopIteration` exception raised by
the completing coroutine. If the asynchronous generator exits without
- yielding another value, the awaitable instead raises an
+ yielding another value, the awaitable instead raises a
:exc:`StopAsyncIteration` exception, signalling that the asynchronous
iteration has completed.
where the asynchronous generator was paused, and returns the next value
yielded by the generator function as the value of the raised
:exc:`StopIteration` exception. If the asynchronous generator exits
- without yielding another value, an :exc:`StopAsyncIteration` exception is
+ without yielding another value, a :exc:`StopAsyncIteration` exception is
raised by the awaitable.
If the generator function does not catch the passed-in exception, or
raises a different exception, then when the awaitable is run that exception
``False`` otherwise.
For user-defined classes which do not define :meth:`__contains__` but do define
-:meth:`__iter__`, ``x in y`` is ``True`` if some value ``z`` with ``x == z`` is
-produced while iterating over ``y``. If an exception is raised during the
-iteration, it is as if :keyword:`in` raised that exception.
+:meth:`__iter__`, ``x in y`` is ``True`` if some value ``z``, for which the
+expression ``x is z or x == z`` is true, is produced while iterating over ``y``.
+If an exception is raised during the iteration, it is as if :keyword:`in` raised
+that exception.
Lastly, the old-style iteration protocol is tried: if a class defines
:meth:`__getitem__`, ``x in y`` is ``True`` if and only if there is a non-negative
-integer index *i* such that ``x == y[i]``, and all lower integer indices do not
-raise :exc:`IndexError` exception. (If any other exception is raised, it is as
+integer index *i* such that ``x is y[i] or x == y[i]``, and no lower integer index
+raises the :exc:`IndexError` exception. (If any other exception is raised, it is as
if :keyword:`in` raised that exception).
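The two fallbacks can be sketched as follows (class names are illustrative):

```python
class IterOnly:
    """No __contains__; membership falls back to __iter__."""
    def __iter__(self):
        return iter([1, 2, 3])

class GetItemOnly:
    """No __contains__ or __iter__; membership tries y[0], y[1], ...
    until a match is found or IndexError is raised."""
    def __getitem__(self, i):
        data = [1, 2, 3]
        if i >= len(data):
            raise IndexError(i)
        return data[i]

assert 2 in IterOnly()
assert 3 in GetItemOnly()
assert 9 not in GetItemOnly()
```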
.. index::
pair: membership; test
object: sequence
-The operator :keyword:`not in` is defined to have the inverse true value of
+The operator :keyword:`not in` is defined to have the inverse truth value of
:keyword:`in`.
.. index::
Identity comparisons
--------------------
-The operators :keyword:`is` and :keyword:`is not` test for object identity: ``x
-is y`` is true if and only if *x* and *y* are the same object. Object identity
+The operators :keyword:`is` and :keyword:`is not` test for an object's identity: ``x
+is y`` is true if and only if *x* and *y* are the same object. An object's identity
is determined using the :meth:`id` function. ``x is not y`` yields the inverse
truth value. [#]_
``None``. The latter indicates that the meta path search should continue,
while raising an exception terminates it immediately.
+.. _relativeimports:
+
+Package Relative Imports
+========================
+
+Relative imports use leading dots. A single leading dot indicates a relative
+import, starting with the current package. Two or more leading dots indicate a
+relative import to the parent(s) of the current package, one level per dot
+after the first. For example, given the following package layout::
+
+ package/
+ __init__.py
+ subpackage1/
+ __init__.py
+ moduleX.py
+ moduleY.py
+ subpackage2/
+ __init__.py
+ moduleZ.py
+ moduleA.py
+
+In either ``subpackage1/moduleX.py`` or ``subpackage1/__init__.py``,
+the following are valid relative imports::
+
+ from .moduleY import spam
+ from .moduleY import spam as ham
+ from . import moduleY
+ from ..subpackage1 import moduleY
+ from ..subpackage2.moduleZ import eggs
+ from ..moduleA import foo
+
+Absolute imports may use either the ``import <>`` or ``from <> import <>``
+syntax, but relative imports may only use the second form; the reason
+for this is that::
+
+ import XXX.YYY.ZZZ
+
+should expose ``XXX.YYY.ZZZ`` as a usable expression, but ``.moduleY`` is
+not a valid expression.
+
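This restriction can be observed directly: the ``from <> import <>`` form compiles with a leading dot, while the ``import <>`` form is rejected at compile time.

```python
# A relative import may only use the "from <> import <>" form.  The
# "import X" form must name something usable as an expression, and
# ".moduleY" is not a valid expression, so it fails to compile.
compile("from . import moduleY", "<demo>", "exec")   # accepted

try:
    compile("import .moduleY", "<demo>", "exec")
    relative_plain_import_ok = True
except SyntaxError:
    relative_plain_import_ok = False

assert not relative_plain_import_ok
```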
Special considerations for __main__
===================================
A comment starts with a hash character (``#``) that is not part of a string
literal, and ends at the end of the physical line. A comment signifies the end
of the logical line unless the implicit line joining rules are invoked. Comments
-are ignored by the syntax; they are not tokens.
+are ignored by the syntax.
.. _encodings:
So if you execute ``from . import mod`` from a module in the ``pkg`` package
then you will end up importing ``pkg.mod``. If you execute ``from ..subpkg2
import mod`` from within ``pkg.subpkg1`` you will import ``pkg.subpkg2.mod``.
-The specification for relative imports is contained within :pep:`328`.
+The specification for relative imports is contained in
+the :ref:`relativeimports` section.
:func:`importlib.import_module` is provided to support applications that
determine dynamically the modules to be loaded.
from docutils import nodes
from sphinx.builders import Builder
+import sphinx.util
detect_all = re.compile(r'''
::(?=[^=])| # two :: (but NOT ::=)
Checks for possibly invalid markup that may leak into the output.
"""
name = 'suspicious'
+ logger = sphinx.util.logging.getLogger("CheckSuspiciousMarkupBuilder")
def init(self):
# create output file
self.warn('Found %s/%s unused rules:' %
(len(unused_rules), len(self.rules)))
for rule in unused_rules:
- self.info(repr(rule))
+ self.logger.info(repr(rule))
return
def check_issue(self, line, lineno, issue):
return False
def report_issue(self, text, lineno, issue):
- if not self.any_issue: self.info()
+ if not self.any_issue: self.logger.info('')
self.any_issue = True
self.write_log_entry(lineno, issue, text)
if py3:
A csv file, with exactly the same format as suspicious.csv
Fields: document name (normalized), line number, issue, surrounding text
"""
- self.info("loading ignore rules... ", nonl=1)
+ self.logger.info("loading ignore rules... ", nonl=1)
self.rules = rules = []
try:
if py3:
rule = Rule(docname, lineno, issue, text)
rules.append(rule)
f.close()
- self.info('done, %d rules loaded' % len(self.rules))
+ self.logger.info('done, %d rules loaded' % len(self.rules))
def get_lineno(node):
'(?:release/\\d.\\d[\\x\\d\\.]*)'];
var all_versions = {
- '3.8': 'dev (3.8)',
+ '3.9': 'dev (3.9)',
+ '3.8': 'pre (3.8)',
'3.7': '3.7',
'3.6': '3.6',
'3.5': '3.5',
'fr': 'French',
'ja': 'Japanese',
'ko': 'Korean',
+ 'zh-cn': 'Simplified Chinese',
};
function build_version_select(current_version, current_release) {
{% if last_updated %}<p><b>Last updated on: {{ last_updated }}.</b></p>{% endif %}
<p>To download an archive containing all the documents for this version of
-Python in one of various formats, follow one of links in this table. The numbers
-in the table are the size of the download files in megabytes.</p>
+Python in one of various formats, follow one of the links in this table.</p>
<table class="docutils">
<tr><th>Format</th><th>Packed as .zip</th><th>Packed as .tar.bz2</th></tr>
<p><a href="{{ pathto('download') }}">{% trans %}Download these documents{% endtrans %}</a></p>
<h3>{% trans %}Docs by version{% endtrans %}</h3>
<ul>
- <li><a href="https://docs.python.org/3.8/">{% trans %}Python 3.8 (in development){% endtrans %}</a></li>
+ <li><a href="https://docs.python.org/3.9/">{% trans %}Python 3.9 (in development){% endtrans %}</a></li>
+ <li><a href="https://docs.python.org/3.8/">{% trans %}Python 3.8 (pre-release){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.7/">{% trans %}Python 3.7 (stable){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.6/">{% trans %}Python 3.6 (security-fixes){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.5/">{% trans %}Python 3.5 (security-fixes){% endtrans %}</a></li>
{% extends "!layout.html" %}
+
+{% block header %}
+{%- if outdated %}
+<div id="outdated-warning" style="padding: .5em; text-align: center; background-color: #FFBABA; color: #6A0E0E;">
+ {% trans %}This document is for an old version of Python that is no longer supported.
+ You should upgrade, and read the {% endtrans %}
+ <a href="/3/{{ pagename }}{{ file_suffix }}">{% trans %} Python documentation for the current stable release{% endtrans %}</a>.
+</div>
+{%- endif %}
+{% endblock %}
+
{% block rootrellink %}
<li><img src="{{ pathto('_static/py.png', 1) }}" alt=""
style="vertical-align: middle; margin-top: -1px"/></li>
function store the value in the local symbol table; whereas variable references
first look in the local symbol table, then in the local symbol tables of
enclosing functions, then in the global symbol table, and finally in the table
-of built-in names. Thus, global variables cannot be directly assigned a value
-within a function (unless named in a :keyword:`global` statement), although they
-may be referenced.
+of built-in names. Thus, global variables and variables of enclosing functions
+cannot be directly assigned a value within a function (unless, for global
+variables, named in a :keyword:`global` statement, or, for variables of enclosing
+functions, named in a :keyword:`nonlocal` statement), although they may be
+referenced.
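The two statements can be sketched together (names here are illustrative):

```python
counter = 0

def bump_global():
    # Without the global statement, "counter += 1" would be treated as
    # an assignment to a new local variable and raise UnboundLocalError.
    global counter
    counter += 1

def make_counter():
    count = 0
    def bump():
        # nonlocal rebinds the enclosing function's variable instead of
        # creating a local one.
        nonlocal count
        count += 1
        return count
    return bump

bump_global()
bump = make_counter()
bump()
assert counter == 1 and bump() == 2
```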
The actual parameters (arguments) to a function call are introduced in the local
symbol table of the called function when it is called; thus, arguments are
dictionary (see :ref:`typesmapping`) containing all keyword arguments except for
those corresponding to a formal parameter. This may be combined with a formal
parameter of the form ``*name`` (described in the next subsection) which
-receives a tuple containing the positional arguments beyond the formal parameter
-list. (``*name`` must occur before ``**name``.) For example, if we define a
-function like this::
+receives a :ref:`tuple <tut-tuples>` containing the positional
+arguments beyond the formal parameter list. (``*name`` must occur
+before ``**name``.) For example, if we define a function like this::
def cheeseshop(kind, *arguments, **keywords):
print("-- Do you have any", kind, "?")
but need to be unpacked for a function call requiring separate positional
arguments. For instance, the built-in :func:`range` function expects separate
*start* and *stop* arguments. If they are not available separately, write the
-function call with the ``*``\ -operator to unpack the arguments out of a list
+function call with the ``*`` operator to unpack the arguments out of a list
or tuple::
>>> list(range(3, 6)) # normal call with separate arguments
single: **; in function calls
In the same fashion, dictionaries can deliver keyword arguments with the
-``**``\ -operator::
+``**`` operator::
>>> def parrot(voltage, state='a stiff', action='voom'):
... print("-- This parrot wouldn't", action, end=' ')
Python guru or system administrator. (E.g., :file:`/usr/local/python` is a
popular alternative location.)
-On Windows machines, the Python installation is usually placed in
-:file:`C:\\Python37`, though you can change this when you're running the
-installer. To add this directory to your path, you can type the following
-command into the command prompt in a DOS box::
-
- set path=%path%;C:\python37
+On Windows machines where you have installed from the :ref:`Microsoft Store
+<windows-store>`, the :file:`python3.7` command will be available. If you have
+the :ref:`py.exe launcher <launcher>` installed, you can use the :file:`py`
+command. See :ref:`setting-envvars` for other ways to launch Python.
Typing an end-of-file character (:kbd:`Control-D` on Unix, :kbd:`Control-Z` on
Windows) at the primary prompt causes the interpreter to exit with a zero exit
>>> squares
[1, 4, 9, 16, 25]
-Like strings (and all other built-in :term:`sequence` type), lists can be
+Like strings (and all other built-in :term:`sequence` types), lists can be
indexed and sliced::
>>> squares[0] # indexing returns the item
When importing the package, Python searches through the directories on
``sys.path`` looking for the package subdirectory.
-The :file:`__init__.py` files are required to make Python treat the directories
-as containing packages; this is done to prevent directories with a common name,
-such as ``string``, from unintentionally hiding valid modules that occur later
+The :file:`__init__.py` files are required to make Python treat directories
+containing the file as packages. This prevents directories with a common name,
+such as ``string``, from unintentionally hiding valid modules that occur later
on the module search path. In the simplest case, :file:`__init__.py` can just be
an empty file, but it can also execute initialization code for the package or
set the ``__all__`` variable, described later.
patterns when you use ``import *``, it is still considered bad practice in
production code.
-Remember, there is nothing wrong with using ``from Package import
+Remember, there is nothing wrong with using ``from package import
specific_submodule``! In fact, this is the recommended notation unless the
importing module needs to use submodules with the same name from different
packages.
* https://docs.python.org: Fast access to Python's documentation.
* https://pypi.org: The Python Package Index, previously also nicknamed
- the Cheese Shop, is an index of user-created Python modules that are available
+ the Cheese Shop [#]_, is an index of user-created Python modules that are available
for download. Once you begin releasing code, you can register it here so that
others can find it.
:ref:`Frequently Asked Questions <faq-index>` (also called the FAQ). The
FAQ answers many of the questions that come up again and again, and may
already contain the solution for your problem.
+
+.. rubric:: Footnotes
+
+.. [#] "Cheese Shop" is a Monty Python's sketch: a customer enters a cheese shop,
+ but whatever cheese he asks for, the clerk says it's missing.
+
What you get after installing is a number of things:
-* A :file:`MacPython 3.6` folder in your :file:`Applications` folder. In here
+* A :file:`Python 3.7` folder in your :file:`Applications` folder. In here
you find IDLE, the development environment that is a standard part of official
Python distributions; PythonLauncher, which handles double-clicking Python
scripts from the Finder; and the "Build Applet" tool, which allows you to
anything that has a GUI) need to be run in a special way. Use :program:`pythonw`
instead of :program:`python` to start such scripts.
-With Python 3.6, you can use either :program:`python` or :program:`pythonw`.
+With Python 3.7, you can use either :program:`python` or :program:`pythonw`.
Configuration
*PyObjC* is a Python binding to Apple's Objective-C/Cocoa framework, which is
the foundation of most modern Mac development. Information on PyObjC is
-available from https://pythonhosted.org/pyobjc/.
+available from https://pypi.org/project/pyobjc/.
The standard Python GUI toolkit is :mod:`tkinter`, based on the cross-platform
Tk toolkit (https://www.tcl.tk). An Aqua-native version of Tk is bundled with OS
| DefaultJustForMeTargetDir | The default install directory for | :file:`%LocalAppData%\\\ |
| | just-for-me installs | Programs\\PythonXY` or |
| | | :file:`%LocalAppData%\\\ |
-| | | Programs\\PythonXY-32` |
+| | | Programs\\PythonXY-32` or|
+| | | :file:`%LocalAppData%\\\ |
+| | | Programs\\PythonXY-64` |
+---------------------------+--------------------------------------+--------------------------+
| DefaultCustomTargetDir | The default custom install directory | (empty) |
| | displayed in the UI | |
shebang lines starting with ``/usr``.
Any of the above virtual commands can be suffixed with an explicit version
-(either just the major version, or the major and minor version) - for example
-``/usr/bin/python2.7`` - which will cause that specific version to be located
-and used.
+(either just the major version, or the major and minor version).
+Furthermore, the 32-bit version can be requested by adding "-32" after the
+minor version. For example, ``/usr/bin/python2.7-32`` will request usage of the
+32-bit Python 2.7.
+
+.. versionadded:: 3.7
+
+ Beginning with the Python launcher 3.7 it is possible to request the 64-bit
+ version with the "-64" suffix. Furthermore it is possible to specify a major
+ version and architecture without a minor version (e.g. ``/usr/bin/python3-64``).
The ``/usr/bin/env`` form of shebang line has one further special property.
Before looking for installed Python interpreters, this form will search the
In some cases, a version qualifier can be included in a command to dictate
which version of Python will be used by the command. A version qualifier
starts with a major version number and can optionally be followed by a period
-('.') and a minor version specifier. If the minor qualifier is specified, it
-may optionally be followed by "-32" to indicate the 32-bit implementation of
-that version be used.
+('.') and a minor version specifier. Furthermore, it is possible to specify
+whether a 32-bit or 64-bit implementation shall be requested by adding "-32" or "-64".
For example, a shebang line of ``#!python`` has no version qualifier, while
``#!python3`` has a version qualifier which specifies only a major version.
-If no version qualifiers are found in a command, the environment variable
-``PY_PYTHON`` can be set to specify the default version qualifier - the default
-value is "2". Note this value could specify just a major version (e.g. "2") or
-a major.minor qualifier (e.g. "2.6"), or even major.minor-32.
+If no version qualifiers are found in a command, the environment
+variable :envvar:`PY_PYTHON` can be set to specify the default version
+qualifier. If it is not set, the default is "3". The variable can
+specify any value that may be passed on the command line, such as "3",
+"3.7", "3.7-32" or "3.7-64". (Note that the "-64" option is only
+available with the launcher included with Python 3.7 or newer.)
If no minor version qualifiers are found, the environment variable
``PY_PYTHON{major}`` (where ``{major}`` is the current major version qualifier
can be predicted knowing only what versions are installed on the PC and
without regard to the order in which they were installed (i.e., without knowing
whether a 32 or 64-bit version of Python and corresponding launcher was
-installed last). As noted above, an optional "-32" suffix can be used on a
-version specifier to change this behaviour.
+installed last). As noted above, an optional "-32" or "-64" suffix can be
+used on a version specifier to change this behaviour.
Examples:
is a collection of modules for advanced Windows-specific support. This includes
utilities for:
-* `Component Object Model <https://www.microsoft.com/com/>`_ (COM)
+* `Component Object Model
+ <https://docs.microsoft.com/en-us/windows/desktop/com/component-object-model--com--portal>`_
+ (COM)
* Win32 API calls
* Registry
* Event log
MinGW gcc under Windows" or "Installing Python extension with distutils
and without Microsoft Visual C++" by Sébastien Sauvage, 2003
- `MingW -- Python extensions <http://oldwiki.mingw.org/index.php/Python%20extensions>`_
- by Trent Apted et al, 2007
+ `MingW -- Python extensions <http://www.mingw.org/wiki/FAQ#toc14>`_
Other Platforms
Sphinx is a standalone package that can be used for writing, and
almost two dozen other projects
-(`listed on the Sphinx web site <http://sphinx-doc.org/examples.html>`__)
+(`listed on the Sphinx web site <https://www.sphinx-doc.org/en/master/examples.html>`__)
have adopted Sphinx as their documentation tool.
.. seealso::
This article explains the new features in Python 3.2 as compared to 3.1. It
focuses on a few highlights and gives a few examples. For full details, see the
-`Misc/NEWS <https://hg.python.org/cpython/file/3.2/Misc/NEWS>`_ file.
+`Misc/NEWS
+<https://github.com/python/cpython/blob/076ca6c3c8df3030307e548d9be792ce3c1c6eea/Misc/NEWS>`_
+file.
.. seealso::
sealed and deposited in a queue for later handling.
See `Barrier Synchronization Patterns
-<https://parlab.eecs.berkeley.edu/wiki/_media/patterns/paraplop_g1_3.pdf>`_ for
-more examples of how barriers can be used in parallel computing. Also, there is
+<http://osl.cs.illinois.edu/media/papers/karmani-2009-barrier_synchronization_pattern.pdf>`_
+for more examples of how barriers can be used in parallel computing. Also, there is
a simple but thorough explanation of barriers in `The Little Book of Semaphores
-<http://greenteapress.com/semaphores/downey08semaphores.pdf>`_, *section 3.6*.
+<https://greenteapress.com/semaphores/LittleBookOfSemaphores.pdf>`_, *section 3.6*.
(Contributed by Kristján Valur Jónsson with an API review by Jeffrey Yasskin in
:issue:`8777`.)
members of the community to create and share external changesets. See
:pep:`385` for details.
-To learn to use the new version control system, see the `tutorial by Joel
-Spolsky <http://hginit.com>`_ or the `Guide to Mercurial Workflows
-<https://www.mercurial-scm.org/guide>`_.
+To learn to use the new version control system, see the `Quick Start
+<https://www.mercurial-scm.org/wiki/QuickStart>`_ or the `Guide to
+Mercurial Workflows <https://www.mercurial-scm.org/guide>`_.
Build and C API Changes
:exc:`~ssl.SSLCertVerificationError` and aborts the handshake with a proper
TLS Alert message. The new exception contains additional information.
Host name validation can be customized with
-:attr:`SSLContext.host_flags <ssl.SSLContext.host_flags>`.
+:attr:`SSLContext.hostname_checks_common_name <ssl.SSLContext.hostname_checks_common_name>`.
(Contributed by Christian Heimes in :issue:`31399`.)
.. note::
(Contributed by Christian Heimes in :issue:`32185`.)
:func:`~ssl.match_hostname` no longer supports partial wildcards like
-``www*.example.org``. :attr:`SSLContext.host_flags <ssl.SSLContext.host_flags>`
-has partial wildcard matching disabled by default.
+``www*.example.org``.
(Contributed by Mandeep Singh in :issue:`23033` and Christian Heimes in
:issue:`31399`.)
If nargs is equal to zero, args can be NULL. kwargs can be NULL.
nargs must be greater or equal to zero.
- Return the result on success. Raise an exception on return NULL on
+ Return the result on success. Raise an exception and return NULL on
error. */
PyAPI_FUNC(PyObject *) _PyObject_FastCallDict(
PyObject *callable,
#define _PyGC_generation0 _PyRuntime.gc.generation0
+/* Heuristic for checking whether a pointer value is newly allocated
+ (uninitialized) or newly freed. The pointer is not dereferenced, only the
+ pointer value is checked.
+
+ The heuristic relies on the debug hooks of the Python memory allocators,
+ which fill newly allocated memory with CLEANBYTE (0xCD) and newly freed
+ memory with DEADBYTE (0xDD). It also detects "untouchable bytes" marked
+ with FORBIDDENBYTE (0xFD). */
+static inline int _PyMem_IsPtrFreed(void *ptr)
+{
+ uintptr_t value = (uintptr_t)ptr;
+#if SIZEOF_VOID_P == 8
+ return (value == (uintptr_t)0xCDCDCDCDCDCDCDCD
+ || value == (uintptr_t)0xDDDDDDDDDDDDDDDD
+ || value == (uintptr_t)0xFDFDFDFDFDFDFDFD);
+#elif SIZEOF_VOID_P == 4
+ return (value == (uintptr_t)0xCDCDCDCD
+ || value == (uintptr_t)0xDDDDDDDD
+ || value == (uintptr_t)0xFDFDFDFD);
+#else
+# error "unknown pointer size"
+#endif
+}
+
#ifdef __cplusplus
}
#endif
Return NULL on success, or return an error message on failure. */
PyAPI_FUNC(_PyInitError) _PyRuntime_Initialize(void);
+PyAPI_FUNC(void) _PyRuntime_Finalize(void);
+
+
#define _Py_CURRENTLY_FINALIZING(tstate) \
(_PyRuntime.finalizing == tstate)
union _gc_head *gc_prev;
Py_ssize_t gc_refs;
} gc;
- double dummy; /* force worst-case alignment */
+ long double dummy; /* force worst-case alignment */
+ // malloc returns a memory block aligned for any built-in type, and
+ // long double is the largest standard C type.
+ // On amd64 linux, long double requires 16 byte alignment.
+ // See bpo-27987 for more discussion.
} PyGC_Head;
extern PyGC_Head *_PyGC_generation0;
/*--start constants--*/
#define PY_MAJOR_VERSION 3
#define PY_MINOR_VERSION 7
-#define PY_MICRO_VERSION 3
+#define PY_MICRO_VERSION 4
#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL
#define PY_RELEASE_SERIAL 0
/* Version as a string */
-#define PY_VERSION "3.7.3"
+#define PY_VERSION "3.7.4"
/*--end constants--*/
/* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2.
static inline void PyDTrace_FUNCTION_ENTRY(const char *arg0, const char *arg1, int arg2) {}
static inline void PyDTrace_FUNCTION_RETURN(const char *arg0, const char *arg1, int arg2) {}
static inline void PyDTrace_GC_START(int arg0) {}
-static inline void PyDTrace_GC_DONE(int arg0) {}
+static inline void PyDTrace_GC_DONE(Py_ssize_t arg0) {}
static inline void PyDTrace_INSTANCE_NEW_START(int arg0) {}
static inline void PyDTrace_INSTANCE_NEW_DONE(int arg0) {}
static inline void PyDTrace_INSTANCE_DELETE_START(int arg0) {}
PyAPI_FUNC(wchar_t *) Py_GetPath(void);
#ifdef Py_BUILD_CORE
PyAPI_FUNC(_PyInitError) _PyPathConfig_Init(const _PyCoreConfig *core_config);
-PyAPI_FUNC(PyObject*) _PyPathConfig_ComputeArgv0(int argc, wchar_t **argv);
+PyAPI_FUNC(int) _PyPathConfig_ComputeArgv0(
+ int argc, wchar_t **argv,
+ PyObject **argv0_p);
PyAPI_FUNC(int) _Py_FindEnvConfigValue(
FILE *env_file,
const wchar_t *key,
PyAPI_FUNC(PyObject*) _PyTraceMalloc_GetTraceback(
unsigned int domain,
uintptr_t ptr);
-
-PyAPI_FUNC(int) _PyMem_IsFreed(void *ptr, size_t size);
#endif /* !defined(Py_LIMITED_API) */
# Exports only things specified by thread documentation;
# skipping obsolete synonyms allocate(), start_new(), exit_thread().
__all__ = ['error', 'start_new_thread', 'exit', 'get_ident', 'allocate_lock',
- 'interrupt_main', 'LockType']
+ 'interrupt_main', 'LockType', 'RLock']
# A dummy value
TIMEOUT_MAX = 2**31
hex(id(self))
)
+
+class RLock(LockType):
+ """Dummy implementation of threading._RLock.
+
+ A re-entrant lock can be acquired multiple times and needs to be released
+ just as many times. This dummy implementation does not check whether the
+ current thread actually owns the lock, but does accounting on the call
+ counts.
+ """
+ def __init__(self):
+ super().__init__()
+ self._levels = 0
+
+ def acquire(self, waitflag=None, timeout=-1):
+ """Aquire the lock, can be called multiple times in succession.
+ """
+ locked = super().acquire(waitflag, timeout)
+ if locked:
+ self._levels += 1
+ return locked
+
+ def release(self):
+ """Release needs to be called once for every call to acquire().
+ """
+ if self._levels == 0:
+ raise error
+ if self._levels == 1:
+ super().release()
+ self._levels -= 1
+
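The call-counting behaviour added above can be sketched standalone. The ``_DummyLock`` base class here is a hypothetical stand-in for the module's ``LockType`` (which, in a single "thread", always succeeds), and ``RuntimeError`` stands in for the module's ``error`` exception:

```python
class _DummyLock:
    """Stand-in for the dummy LockType: acquire always succeeds."""
    def __init__(self):
        self.locked_status = False
    def acquire(self, waitflag=None, timeout=-1):
        self.locked_status = True
        return True
    def release(self):
        self.locked_status = False
        return True

class RLock(_DummyLock):
    """Counts nested acquire() calls; release() must match each one."""
    def __init__(self):
        super().__init__()
        self._levels = 0
    def acquire(self, waitflag=None, timeout=-1):
        locked = super().acquire(waitflag, timeout)
        if locked:
            self._levels += 1
        return locked
    def release(self):
        if self._levels == 0:
            raise RuntimeError("release of unacquired lock")
        if self._levels == 1:
            super().release()
        self._levels -= 1

r = RLock()
r.acquire()
r.acquire()          # re-entrant: second acquire succeeds
r.release()
assert r.locked_status      # still held after the first release
r.release()
assert not r.locked_status  # fully released after the matching call
```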
# Used to signal that interrupt_main was called in a "thread"
_interrupt = False
# True when not executing in a "thread"
derived classes can override selectively; the default implementations
represent a file that cannot be read, written or seeked.
- Even though IOBase does not declare read, readinto, or write because
+ Even though IOBase does not declare read or write because
their signatures will vary, implementations and clients should
consider those methods part of the interface. Also, implementations
may raise UnsupportedOperation when operations they do not support are
called.
The basic type used for binary data read from or written to a file is
- bytes. Other bytes-like objects are accepted as method arguments too. In
- some cases (such as readinto), a writable object is required. Text I/O
- classes work with str data.
+ bytes. Other bytes-like objects are accepted as method arguments too.
+ Text I/O classes work with str data.
Note that calling any method (even inquiries) on a closed stream is
undefined. Implementations may raise OSError in this case.
return lines
def writelines(self, lines):
+ """Write a list of lines to the stream.
+
+ Line separators are not added, so it is usual for each of the lines
+ provided to have a line separator at the end.
+ """
self._checkClosed()
for line in lines:
self.write(line)
"""Buffered I/O implementation using an in-memory bytes buffer."""
+ # Initialize _buffer as soon as possible since it's used by __del__()
+ # which calls close()
+ _buffer = None
+
def __init__(self, initial_bytes=None):
buf = bytearray()
if initial_bytes is not None:
return memoryview(self._buffer)
def close(self):
- self._buffer.clear()
+ if self._buffer is not None:
+ self._buffer.clear()
super().close()
def read(self, size=-1):
"""Base class for text I/O.
This class provides a character and line based interface to stream
- I/O. There is no readinto method because Python's character strings
- are immutable. There is no public constructor.
+ I/O. There is no public constructor.
"""
def read(self, size=-1):
_CHUNK_SIZE = 2048
+ # Initialize _buffer as soon as possible since it's used by __del__()
+ # which calls close()
+ _buffer = None
+
# The write_through argument has no effect here since this
# implementation always writes through. The argument is present only
# so that the signature can match the signature of the C version.
# before cleanup of cancelled handles is performed.
_MIN_CANCELLED_TIMER_HANDLES_FRACTION = 0.5
-# Exceptions which must not call the exception handler in fatal error
-# methods (_fatal_error())
-_FATAL_ERROR_IGNORE = (BrokenPipeError,
- ConnectionResetError, ConnectionAbortedError)
-
_HAS_IPv6 = hasattr(socket, 'AF_INET6')
# Maximum timeout passed to select to avoid OS limitations
'SO_REUSEPORT defined but not implemented.')
-def _ipaddr_info(host, port, family, type, proto):
+def _ipaddr_info(host, port, family, type, proto, flowinfo=0, scopeid=0):
# Try to skip getaddrinfo if "host" is already an IP. Users might have
# handled name resolution in their own code and pass in resolved IPs.
if not hasattr(socket, 'inet_pton'):
socket.inet_pton(af, host)
# The host has already been resolved.
if _HAS_IPv6 and af == socket.AF_INET6:
- return af, type, proto, '', (host, port, 0, 0)
+ return af, type, proto, '', (host, port, flowinfo, scopeid)
else:
return af, type, proto, '', (host, port)
except OSError:
read = await self.run_in_executor(None, file.readinto, view)
if not read:
break # EOF
- await self.sock_sendall(sock, view)
+ await self.sock_sendall(sock, view[:read])
total_sent += read
return total_sent
finally:
if blocksize <= 0:
return total_sent
view = memoryview(buf)[:blocksize]
- read = file.readinto(view)
+ read = await self.run_in_executor(None, file.readinto, view)
if not read:
return total_sent # EOF
await proto.drain()
- transp.write(view)
+ transp.write(view[:read])
total_sent += read
finally:
if total_sent > 0 and hasattr(file, 'seek'):
if local_addr:
sock.bind(local_address)
if remote_addr:
- await self.sock_connect(sock, remote_address)
+ if not allow_broadcast:
+ await self.sock_connect(sock, remote_address)
r_addr = remote_address
except OSError as exc:
if sock is not None:
family=0, type=socket.SOCK_STREAM,
proto=0, flags=0, loop):
host, port = address[:2]
- info = _ipaddr_info(host, port, family, type, proto)
+ info = _ipaddr_info(host, port, family, type, proto, *address[2:])
if info is not None:
# "host" is already a resolved IP.
return [info]
def _fatal_error(self, exc, message='Fatal error on pipe transport'):
try:
- if isinstance(exc, base_events._FATAL_ERROR_IGNORE):
+ if isinstance(exc, OSError):
if self._loop.get_debug():
logger.debug("%r: %s", self, message, exc_info=True)
else:
def __init__(self, loop, sock, protocol, extra=None, server=None):
super().__init__(extra, loop)
self._extra['socket'] = sock
- self._extra['sockname'] = sock.getsockname()
+ try:
+ self._extra['sockname'] = sock.getsockname()
+ except OSError:
+ self._extra['sockname'] = None
if 'peername' not in self._extra:
try:
self._extra['peername'] = sock.getpeername()
def _fatal_error(self, exc, message='Fatal error on transport'):
# Should be called from exception handler only.
- if isinstance(exc, base_events._FATAL_ERROR_IGNORE):
+ if isinstance(exc, OSError):
if self._loop.get_debug():
logger.debug("%r: %s", self, message, exc_info=True)
else:
if not data:
return
- if self._address and addr not in (None, self._address):
- raise ValueError(
- f'Invalid address: must be None or {self._address}')
+ if self._address:
+ if addr not in (None, self._address):
+ raise ValueError(
+ f'Invalid address: must be None or {self._address}')
+ addr = self._address
if self._conn_lost and self._address:
if self._conn_lost >= constants.LOG_THRESHOLD_FOR_CONNLOST_WRITES:
if not self._buffer:
# Attempt to send it right away first.
try:
- if self._address:
+ if self._extra['peername']:
self._sock.send(data)
else:
self._sock.sendto(data, addr)
while self._buffer:
data, addr = self._buffer.popleft()
try:
- if self._address:
+ if self._extra['peername']:
self._sock.send(data)
else:
self._sock.sendto(data, addr)
self._app_transport._closed = True
self._transport = None
self._app_transport = None
+ if getattr(self, '_handshake_timeout_handle', None):
+ self._handshake_timeout_handle.cancel()
self._wakeup_waiter(exc)
+ self._app_protocol = None
+ self._sslpipe = None
def pause_writing(self):
"""Called when the low-level transport's buffer goes over
self._fatal_error(exc, 'Fatal error on SSL transport')
def _fatal_error(self, exc, message='Fatal error on transport'):
- if isinstance(exc, base_events._FATAL_ERROR_IGNORE):
+ if isinstance(exc, OSError):
if self._loop.get_debug():
logger.debug("%r: %s", self, message, exc_info=True)
else:
"""Return a set of all tasks for the loop."""
if loop is None:
loop = events.get_running_loop()
- # NB: set(_all_tasks) is required to protect
- # from https://bugs.python.org/issue34970 bug
- return {t for t in list(_all_tasks)
+ # Looping over a WeakSet (_all_tasks) isn't safe as it can be updated from another
+ # thread while we do so. Therefore we cast it to list prior to filtering. The list
+ # cast itself requires iteration, so we repeat it several times ignoring
+ # RuntimeErrors (which are not very likely to occur). See issues 34970 and 36607 for
+ # details.
+ i = 0
+ while True:
+ try:
+ tasks = list(_all_tasks)
+ except RuntimeError:
+ i += 1
+ if i >= 1000:
+ raise
+ else:
+ break
+ return {t for t in tasks
if futures._get_loop(t) is loop and not t.done()}
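The retry loop above guards `list(_all_tasks)` against a concurrent mutation of the WeakSet, which can make iteration raise `RuntimeError`. The same pattern can be sketched against a plain `weakref.WeakSet`; the names `snapshot_weakset` and `MAX_RETRIES` are illustrative, not from asyncio:

```python
import weakref

MAX_RETRIES = 1000  # illustrative bound, mirroring the 1000-iteration cap above

def snapshot_weakset(ws):
    """Return a list snapshot of a WeakSet, retrying if a concurrent
    mutation makes iteration raise RuntimeError."""
    for _ in range(MAX_RETRIES):
        try:
            return list(ws)
        except RuntimeError:
            continue
    raise RuntimeError("WeakSet kept changing during iteration")

class Obj:  # weakref-able placeholder
    pass

objs = [Obj() for _ in range(3)]  # keep strong refs so entries survive
ws = weakref.WeakSet(objs)
assert len(snapshot_weakset(ws)) == 3
```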
# method.
if loop is None:
loop = events.get_event_loop()
- # NB: set(_all_tasks) is required to protect
- # from https://bugs.python.org/issue34970 bug
- return {t for t in list(_all_tasks) if futures._get_loop(t) is loop}
+ # Looping over a WeakSet (_all_tasks) isn't safe as it can be updated from another
+ # thread while we do so. Therefore we cast it to list prior to filtering. The list
+ # cast itself requires iteration, so we repeat it several times ignoring
+ # RuntimeErrors (which are not very likely to occur). See issues 34970 and 36607 for
+ # details.
+ i = 0
+ while True:
+ try:
+ tasks = list(_all_tasks)
+ except RuntimeError:
+ i += 1
+ if i >= 1000:
+ raise
+ else:
+ break
+ return {t for t in tasks if futures._get_loop(t) is loop}
class Task(futures._PyFuture): # Inherit Python Task implementation
finally:
if timeout_handle is not None:
timeout_handle.cancel()
+ for f in fs:
+ f.remove_done_callback(_on_completion)
done, pending = set(), set()
for f in fs:
- f.remove_done_callback(_on_completion)
if f.done():
done.add(f)
else:
loop = futures._get_loop(inner)
outer = loop.create_future()
- def _done_callback(inner):
+ def _inner_done_callback(inner):
if outer.cancelled():
if not inner.cancelled():
# Mark inner's result as retrieved.
else:
outer.set_result(inner.result())
- inner.add_done_callback(_done_callback)
+
+ def _outer_done_callback(outer):
+ if not inner.done():
+ inner.remove_done_callback(_inner_done_callback)
+
+ inner.add_done_callback(_inner_done_callback)
+ outer.add_done_callback(_outer_done_callback)
return outer
def _fatal_error(self, exc, message='Fatal error on pipe transport'):
# should be called by exception handler only
- if isinstance(exc, base_events._FATAL_ERROR_IGNORE):
+ if isinstance(exc, OSError):
if self._loop.get_debug():
logger.debug("%r: %s", self, message, exc_info=True)
else:
# other end). Notably this is needed on AIX, and works
# just fine on other platforms.
stdin, stdin_w = socket.socketpair()
- self._proc = subprocess.Popen(
- args, shell=shell, stdin=stdin, stdout=stdout, stderr=stderr,
- universal_newlines=False, bufsize=bufsize, **kwargs)
- if stdin_w is not None:
- stdin.close()
- self._proc.stdin = open(stdin_w.detach(), 'wb', buffering=bufsize)
+ try:
+ self._proc = subprocess.Popen(
+ args, shell=shell, stdin=stdin, stdout=stdout, stderr=stderr,
+ universal_newlines=False, bufsize=bufsize, **kwargs)
+ if stdin_w is not None:
+ stdin.close()
+ self._proc.stdin = open(stdin_w.detach(), 'wb', buffering=bufsize)
+ stdin_w = None
+ finally:
+ if stdin_w is not None:
+ stdin.close()
+ stdin_w.close()
class AbstractChildWatcher:
# This method is more useful to debug a single function call.
- def runcall(self, func, *args, **kwds):
+ def runcall(*args, **kwds):
"""Debug a single function call.
Return the result of the function call.
"""
+ if len(args) >= 2:
+ self, func, *args = args
+ elif not args:
+ raise TypeError("descriptor 'runcall' of 'Bdb' object "
+ "needs an argument")
+ elif 'func' in kwds:
+ func = kwds.pop('func')
+ self, *args = args
+ else:
+ raise TypeError('runcall expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
self.reset()
sys.settrace(self.trace_dispatch)
res = None
return self
# This method is more useful to profile a single function call.
- def runcall(self, func, *args, **kw):
+ def runcall(*args, **kw):
+ if len(args) >= 2:
+ self, func, *args = args
+ elif not args:
+ raise TypeError("descriptor 'runcall' of 'Profile' object "
+ "needs an argument")
+ elif 'func' in kw:
+ func = kw.pop('func')
+ self, *args = args
+ else:
+ raise TypeError('runcall expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
self.enable()
try:
return func(*args, **kw)
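The `runcall(*args, **kw)` rewrites above (and the matching `submit`, `callback`, and `partialmethod.__init__` changes later in this patch series) all implement one idiom: extract `self` and the callable from `*args` positionally, so that a keyword literally named `func`/`fn` is forwarded to the wrapped callable instead of colliding with the wrapper's own parameter. A minimal sketch of the idiom (class `Runner` is hypothetical):

```python
class Runner:
    def runcall(*args, **kw):
        # Extract self and func positionally so a keyword named 'func'
        # passes through to the wrapped callable untouched.
        if len(args) >= 2:
            self, func, *args = args
        elif not args:
            raise TypeError("descriptor 'runcall' needs an argument")
        elif 'func' in kw:
            func = kw.pop('func')
            self, *args = args
        else:
            raise TypeError('runcall expected at least 1 positional argument')
        return func(*args, **kw)

r = Runner()
assert r.runcall(str.upper, 'hi') == 'HI'
# A callable whose own parameter is named 'func' no longer collides:
assert r.runcall(lambda func: func.upper(), func='hi') == 'HI'
```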
def writelines(self, list):
- data = ''.join(list)
+ data = b''.join(list)
data, bytesdecoded = self.decode(data, self.errors)
return self.writer.write(data)
self.reader.reset()
self.writer.reset()
+ def seek(self, offset, whence=0):
+ # Seeks must be propagated to both the readers and writers
+ # as they might need to reset their internal buffers.
+ self.reader.seek(offset, whence)
+ self.writer.seek(offset, whence)
+
def __getattr__(self, name,
getattr=getattr):
'__doc__': f'{typename}({arg_list})',
'__slots__': (),
'_fields': field_names,
+ '_field_defaults': field_defaults,
+ # alternate spelling for backward compatibility
'_fields_defaults': field_defaults,
'__new__': __new__,
'_make': _make,
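The class namespace above now exposes namedtuple defaults under both the corrected `_field_defaults` name and the old misspelled `_fields_defaults` alias. The public attribute can be exercised directly:

```python
from collections import namedtuple

# defaults apply to the rightmost fields
Point = namedtuple('Point', ['x', 'y'], defaults=[0])

assert Point._field_defaults == {'y': 0}  # correctly spelled public name
assert Point(1) == Point(x=1, y=0)        # the default fills in y
```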
# Now, add the methods in dicts but not in MutableMapping
def __repr__(self): return repr(self.data)
+ def __copy__(self):
+ inst = self.__class__.__new__(self.__class__)
+ inst.__dict__.update(self.__dict__)
+ # Create a copy and avoid triggering descriptors
+ inst.__dict__["data"] = self.__dict__["data"].copy()
+ return inst
+
def copy(self):
if self.__class__ is UserDict:
return UserDict(self.data.copy())
self.data = data
c.update(self)
return c
+
@classmethod
def fromkeys(cls, iterable, value=None):
d = cls()
return other.data if isinstance(other, UserList) else other
def __contains__(self, item): return item in self.data
def __len__(self): return len(self.data)
- def __getitem__(self, i): return self.data[i]
+ def __getitem__(self, i):
+ if isinstance(i, slice):
+ return self.__class__(self.data[i])
+ else:
+ return self.data[i]
def __setitem__(self, i, item): self.data[i] = item
def __delitem__(self, i): del self.data[i]
def __add__(self, other):
def __imul__(self, n):
self.data *= n
return self
+ def __copy__(self):
+ inst = self.__class__.__new__(self.__class__)
+ inst.__dict__.update(self.__dict__)
+ # Create a copy and avoid triggering descriptors
+ inst.__dict__["data"] = self.__dict__["data"][:]
+ return inst
def append(self, item): self.data.append(item)
def insert(self, i, item): self.data.insert(i, item)
def pop(self, i=-1): return self.data.pop(i)
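With `__copy__` defined as in the hunk above, `copy.copy()` of a `UserList` duplicates the underlying `data` list instead of sharing it, and the slicing `__getitem__` preserves the subclass type. A quick check of the intended semantics using the stdlib class:

```python
import copy
from collections import UserList

ul = UserList([1, 2, 3])
dup = copy.copy(ul)
dup.append(4)

# The copy has its own backing list; mutating it leaves the original alone.
assert ul.data == [1, 2, 3]
assert dup.data == [1, 2, 3, 4]

# Slicing returns an instance of the class, not a bare list.
assert type(ul[1:]) is UserList
```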
if self._state not in [CANCELLED, CANCELLED_AND_NOTIFIED, FINISHED]:
self._done_callbacks.append(fn)
return
- fn(self)
+ try:
+ fn(self)
+ except Exception:
+ LOGGER.exception('exception calling callback for %r', self)
def result(self, timeout=None):
"""Return the result of the call that the future represents.
class Executor(object):
"""This is an abstract base class for concrete asynchronous executors."""
- def submit(self, fn, *args, **kwargs):
+ def submit(*args, **kwargs):
"""Submits a callable to be executed with the given arguments.
Schedules the callable to be executed as fn(*args, **kwargs) and returns
Returns:
A Future representing the given call.
"""
+ if len(args) >= 2:
+ pass
+ elif not args:
+ raise TypeError("descriptor 'submit' of 'Executor' object "
+ "needs an argument")
+ elif 'fn' not in kwargs:
+ raise TypeError('submit expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
raise NotImplementedError()
def map(self, fn, *iterables, timeout=None, chunksize=1):
import weakref
from functools import partial
import itertools
+import sys
import traceback
# Workers are created as daemon threads and processes. This is done to allow the
EXTRA_QUEUED_CALLS = 1
+# On Windows, WaitForMultipleObjects is used to wait for processes to finish.
+# It can wait on, at most, 63 objects. There is an overhead of two objects:
+# - the result queue reader
+# - the thread wakeup reader
+_MAX_WINDOWS_WORKERS = 63 - 2
+
# Hack to embed stringification of remote traceback in local traceback
class _RemoteTraceback(Exception):
worker processes will be created as the machine has processors.
mp_context: A multiprocessing context to launch the workers. This
object should provide SimpleQueue, Queue and Process.
- initializer: An callable used to initialize worker processes.
+ initializer: A callable used to initialize worker processes.
initargs: A tuple of arguments to pass to the initializer.
"""
_check_system_limits()
if max_workers is None:
self._max_workers = os.cpu_count() or 1
+ if sys.platform == 'win32':
+ self._max_workers = min(_MAX_WINDOWS_WORKERS,
+ self._max_workers)
else:
if max_workers <= 0:
raise ValueError("max_workers must be greater than 0")
+ elif (sys.platform == 'win32' and
+ max_workers > _MAX_WINDOWS_WORKERS):
+ raise ValueError(
+ f"max_workers must be <= {_MAX_WINDOWS_WORKERS}")
self._max_workers = max_workers
p.start()
self._processes[p.pid] = p
- def submit(self, fn, *args, **kwargs):
+ def submit(*args, **kwargs):
+ if len(args) >= 2:
+ self, fn, *args = args
+ elif not args:
+ raise TypeError("descriptor 'submit' of 'ProcessPoolExecutor' object "
+ "needs an argument")
+ elif 'fn' in kwargs:
+ fn = kwargs.pop('fn')
+ self, *args = args
+ else:
+ raise TypeError('submit expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
with self._shutdown_lock:
if self._broken:
raise BrokenProcessPool(self._broken)
max_workers: The maximum number of threads that can be used to
execute the given calls.
thread_name_prefix: An optional name prefix to give our threads.
- initializer: An callable used to initialize worker threads.
+ initializer: A callable used to initialize worker threads.
initargs: A tuple of arguments to pass to the initializer.
"""
if max_workers is None:
self._initializer = initializer
self._initargs = initargs
- def submit(self, fn, *args, **kwargs):
+ def submit(*args, **kwargs):
+ if len(args) >= 2:
+ self, fn, *args = args
+ elif not args:
+ raise TypeError("descriptor 'submit' of 'ThreadPoolExecutor' object "
+ "needs an argument")
+ elif 'fn' in kwargs:
+ fn = kwargs.pop('fn')
+ self, *args = args
+ else:
+ raise TypeError('submit expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
with self._shutdown_lock:
if self._broken:
raise BrokenThreadPool(self._broken)
return _exit_wrapper
@staticmethod
- def _create_cb_wrapper(callback, *args, **kwds):
+ def _create_cb_wrapper(*args, **kwds):
+ callback, *args = args
def _exit_wrapper(exc_type, exc, tb):
callback(*args, **kwds)
return _exit_wrapper
self._push_cm_exit(cm, _exit)
return result
- def callback(self, callback, *args, **kwds):
+ def callback(*args, **kwds):
"""Registers an arbitrary callback and arguments.
Cannot suppress exceptions.
"""
+ if len(args) >= 2:
+ self, callback, *args = args
+ elif not args:
+ raise TypeError("descriptor 'callback' of '_BaseExitStack' object "
+ "needs an argument")
+ elif 'callback' in kwds:
+ callback = kwds.pop('callback')
+ self, *args = args
+ else:
+ raise TypeError('callback expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
_exit_wrapper = self._create_cb_wrapper(callback, *args, **kwds)
# We changed the signature, so using @wraps is not appropriate, but
return _exit_wrapper
@staticmethod
- def _create_async_cb_wrapper(callback, *args, **kwds):
+ def _create_async_cb_wrapper(*args, **kwds):
+ callback, *args = args
async def _exit_wrapper(exc_type, exc, tb):
await callback(*args, **kwds)
return _exit_wrapper
self._push_async_cm_exit(exit, exit_method)
return exit # Allow use as a decorator
- def push_async_callback(self, callback, *args, **kwds):
+ def push_async_callback(*args, **kwds):
"""Registers an arbitrary coroutine function and arguments.
Cannot suppress exceptions.
"""
+ if len(args) >= 2:
+ self, callback, *args = args
+ elif not args:
+ raise TypeError("descriptor 'push_async_callback' of "
+ "'AsyncExitStack' object needs an argument")
+ elif 'callback' in kwds:
+ callback = kwds.pop('callback')
+ self, *args = args
+ else:
+ raise TypeError('push_async_callback expected at least 1 '
+ 'positional argument, got %d' % (len(args)-1))
+
_exit_wrapper = self._create_async_cb_wrapper(callback, *args, **kwds)
# We changed the signature, so using @wraps is not appropriate, but
"""
if isinstance(init, str):
if size is None:
- size = len(init)+1
+ if sizeof(c_wchar) == 2:
+ # UTF-16 requires a surrogate pair (2 wchar_t) for non-BMP
+ # characters (outside [U+0000; U+FFFF] range). +1 for trailing
+ # NUL character.
+ size = sum(2 if ord(c) > 0xFFFF else 1 for c in init) + 1
+ else:
+ # 32-bit wchar_t (1 wchar_t per Unicode character). +1 for
+ # trailing NUL character.
+ size = len(init) + 1
buftype = c_wchar * size
buf = buftype()
buf.value = init
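On platforms where `wchar_t` is 16-bit (notably Windows), a non-BMP character occupies two UTF-16 code units, so the buffer must be sized by counting surrogate pairs, not characters. The sizing rule from the hunk above can be checked in isolation, with no ctypes required (`utf16_units` is an illustrative helper name):

```python
def utf16_units(s):
    # One wchar_t per BMP character, two for characters above U+FFFF
    # (a UTF-16 surrogate pair); matches the sizing logic in the patch.
    return sum(2 if ord(c) > 0xFFFF else 1 for c in s)

assert utf16_units('abc') == 3
assert utf16_units('\U00010000') == 2            # non-BMP: surrogate pair
assert utf16_units('\U00010000\U0010ffff') == 4
# +1 for the trailing NUL gives the buffer length expected by the new test
assert utf16_units('\U00010000\U0010ffff') + 1 == 5
```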
from operator import delitem
self.assertRaises(TypeError, delitem, ca, 0)
+ def test_step_overflow(self):
+ a = (c_int * 5)()
+ a[3::sys.maxsize] = (1,)
+ self.assertListEqual(a[3::sys.maxsize], [1])
+ a = (c_char * 5)()
+ a[3::sys.maxsize] = b"A"
+ self.assertEqual(a[3::sys.maxsize], b"A")
+ a = (c_wchar * 5)()
+ a[3::sys.maxsize] = u"X"
+ self.assertEqual(a[3::sys.maxsize], u"X")
+
def test_numeric_arrays(self):
alen = 5
_type_ = c_int
_length_ = 1.87
+ def test_empty_element_struct(self):
+ class EmptyStruct(Structure):
+ _fields_ = []
+
+ obj = (EmptyStruct * 2)() # bpo37188: Floating point exception
+ self.assertEqual(sizeof(obj), 0)
+
+ def test_empty_element_array(self):
+ class EmptyArray(Array):
+ _type_ = c_int
+ _length_ = 0
+
+ obj = (EmptyArray * 2)() # bpo37188: Floating point exception
+ self.assertEqual(sizeof(obj), 0)
+
+ def test_bpo36504_signed_int_overflow(self):
+ # The overflow check in PyCArrayType_new() could cause signed integer
+ # overflow.
+ with self.assertRaises(OverflowError):
+ c_char * sys.maxsize * 2
+
@unittest.skipUnless(sys.maxsize > 2**32, 'requires 64bit platform')
@bigmemtest(size=_2G, memuse=1, dry_run=False)
def test_large_array(self, size):
self.assertEqual(b[::2], "ac")
self.assertEqual(b[::5], "a")
+ @need_symbol('c_wchar')
+ def test_create_unicode_buffer_non_bmp(self):
+ expected = 5 if sizeof(c_wchar) == 2 else 3
+ for s in '\U00010000\U00100000', '\U00010000\U0010ffff':
+ b = create_unicode_buffer(s)
+ self.assertEqual(len(b), expected)
+ self.assertEqual(b[-1], '\0')
+
+
if __name__ == "__main__":
unittest.main()
# raises an exception, wrapper() will restore the terminal to a sane state so
# you can read the resulting traceback.
-def wrapper(func, *args, **kwds):
+def wrapper(*args, **kwds):
"""Wrapper function that initializes curses and calls another function,
restoring normal keyboard/screen behavior on error.
The callable object 'func' is then passed the main window 'stdscr'
wrapper().
"""
+ if args:
+ func, *args = args
+ elif 'func' in kwds:
+ func = kwds.pop('func')
+ else:
+ raise TypeError('wrapper expected at least 1 positional argument, '
+ 'got %d' % len(args))
+
try:
# Initialize curses
stdscr = initscr()
self.build_scripts = os.path.join(self.build_base,
'scripts-%d.%d' % sys.version_info[:2])
- if self.executable is None:
+ if self.executable is None and sys.executable:
self.executable = os.path.normpath(sys.executable)
if isinstance(self.parallel, str):
def _check_rst_data(self, data):
"""Returns warnings when the provided data doesn't compile."""
- source_path = StringIO()
+ # the include and csv_table directives need this to be a path
+ source_path = self.distribution.script_name or 'setup.py'
parser = Parser()
settings = frontend.OptionParser(components=(Parser,)).get_default_values()
settings.tab_width = 4
A string listing directories separated by 'os.pathsep'; defaults to
os.environ['PATH']. Returns the complete filename or None if not found.
"""
- if path is None:
- path = os.environ.get('PATH', os.defpath)
-
- paths = path.split(os.pathsep)
- base, ext = os.path.splitext(executable)
-
+ _, ext = os.path.splitext(executable)
if (sys.platform == 'win32') and (ext != '.exe'):
executable = executable + '.exe'
- if not os.path.isfile(executable):
- for p in paths:
- f = os.path.join(p, executable)
- if os.path.isfile(f):
- # the file exists, we have a shot at spawn working
- return f
- return None
- else:
+ if os.path.isfile(executable):
return executable
+
+ if path is None:
+ path = os.environ.get('PATH', None)
+ if path is None:
+ try:
+ path = os.confstr("CS_PATH")
+ except (AttributeError, ValueError):
+ # os.confstr() or CS_PATH is not available
+ path = os.defpath
+ # bpo-35755: Don't use os.defpath if the PATH environment variable is
+ # set to an empty string
+
+ # PATH='' doesn't match, whereas PATH=':' looks in the current directory
+ if not path:
+ return None
+
+ paths = path.split(os.pathsep)
+ for p in paths:
+ f = os.path.join(p, executable)
+ if os.path.isfile(f):
+ # the file exists, we have a shot at spawn working
+ return f
+ return None
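The rewrite distinguishes `PATH=''` (no match anywhere, hence the early `return None`) from `PATH=':'`, which splits into two empty entries, each denoting the current directory. The `os.pathsep` splitting semantics the patch relies on can be seen directly:

```python
import os

# An empty string splits into a single empty entry, but the patch
# returns early on `not path`, so nothing is searched at all.
assert ''.split(os.pathsep) == ['']

# A lone separator splits into two empty components; joining an empty
# component with a filename yields the bare filename, i.e. the cwd.
parts = os.pathsep.split(os.pathsep)
assert parts == ['', '']
assert os.path.join('', 'prog') == 'prog'
```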
if "_PYTHON_PROJECT_BASE" in os.environ:
project_base = os.path.abspath(os.environ["_PYTHON_PROJECT_BASE"])
else:
- project_base = os.path.dirname(os.path.abspath(sys.executable))
+ if sys.executable:
+ project_base = os.path.dirname(os.path.abspath(sys.executable))
+ else:
+ # sys.executable can be empty if argv[0] has been changed and Python is
+ # unable to retrieve the real program name
+ project_base = os.getcwd()
# python_build: (Boolean) if true, we're either building Python or
_osx_support.customize_compiler(_config_vars)
_config_vars['CUSTOMIZED_OSX_COMPILER'] = 'True'
- (cc, cxx, opt, cflags, ccshared, ldshared, shlib_suffix, ar, ar_flags) = \
- get_config_vars('CC', 'CXX', 'OPT', 'CFLAGS',
+ (cc, cxx, cflags, ccshared, ldshared, shlib_suffix, ar, ar_flags) = \
+ get_config_vars('CC', 'CXX', 'CFLAGS',
'CCSHARED', 'LDSHARED', 'SHLIB_SUFFIX', 'AR', 'ARFLAGS')
if 'CC' in os.environ:
if 'LDFLAGS' in os.environ:
ldshared = ldshared + ' ' + os.environ['LDFLAGS']
if 'CFLAGS' in os.environ:
- cflags = opt + ' ' + os.environ['CFLAGS']
+ cflags = cflags + ' ' + os.environ['CFLAGS']
ldshared = ldshared + ' ' + os.environ['CFLAGS']
if 'CPPFLAGS' in os.environ:
cpp = cpp + ' ' + os.environ['CPPFLAGS']
--- /dev/null
+This should be included.
"""Tests for distutils.command.check."""
+import os
import textwrap
import unittest
from test.support import run_unittest
pygments = None
+HERE = os.path.dirname(__file__)
+
+
class CheckTestCase(support.LoggingSilencer,
support.TempdirManager,
unittest.TestCase):
- def _run(self, metadata=None, **options):
+ def _run(self, metadata=None, cwd=None, **options):
if metadata is None:
metadata = {}
+ if cwd is not None:
+ old_dir = os.getcwd()
+ os.chdir(cwd)
pkg_info, dist = self.create_dist(**metadata)
cmd = check(dist)
cmd.initialize_options()
setattr(cmd, name, value)
cmd.ensure_finalized()
cmd.run()
+ if cwd is not None:
+ os.chdir(old_dir)
return cmd
def test_check_metadata(self):
cmd = self._run(metadata, strict=1, restructuredtext=1)
self.assertEqual(cmd._warnings, 0)
+ # check that includes work to test #31292
+ metadata['long_description'] = 'title\n=====\n\n.. include:: includetest.rst'
+ cmd = self._run(metadata, cwd=HERE, strict=1, restructuredtext=1)
+ self.assertEqual(cmd._warnings, 0)
+
@unittest.skipUnless(HAS_DOCUTILS, "won't test without docutils")
def test_check_restructuredtext_with_syntax_highlight(self):
# Don't fail if there is a `code` or `code-block` directive
rv = find_executable(dont_exist_program , path=tmp_dir)
self.assertIsNone(rv)
- # test os.defpath: missing PATH environment variable
+ # PATH='': no match, except in the current directory
with test_support.EnvironmentVarGuard() as env:
- with mock.patch('distutils.spawn.os.defpath', tmp_dir):
- env.pop('PATH')
+ env['PATH'] = ''
+ with unittest.mock.patch('distutils.spawn.os.confstr',
+ return_value=tmp_dir, create=True), \
+ unittest.mock.patch('distutils.spawn.os.defpath',
+ tmp_dir):
+ rv = find_executable(program)
+ self.assertIsNone(rv)
+
+ # look in current directory
+ with test_support.change_cwd(tmp_dir):
+ rv = find_executable(program)
+ self.assertEqual(rv, program)
+
+ # PATH=':': explicitly looks in the current directory
+ with test_support.EnvironmentVarGuard() as env:
+ env['PATH'] = os.pathsep
+ with unittest.mock.patch('distutils.spawn.os.confstr',
+ return_value='', create=True), \
+ unittest.mock.patch('distutils.spawn.os.defpath', ''):
+ rv = find_executable(program)
+ self.assertIsNone(rv)
+
+ # look in current directory
+ with test_support.change_cwd(tmp_dir):
+ rv = find_executable(program)
+ self.assertEqual(rv, program)
+
+ # missing PATH: test os.confstr("CS_PATH") and os.defpath
+ with test_support.EnvironmentVarGuard() as env:
+ env.pop('PATH', None)
+
+ # without confstr
+ with unittest.mock.patch('distutils.spawn.os.confstr',
+ side_effect=ValueError,
+ create=True), \
+ unittest.mock.patch('distutils.spawn.os.defpath',
+ tmp_dir):
+ rv = find_executable(program)
+ self.assertEqual(rv, filename)
+ # with confstr
+ with unittest.mock.patch('distutils.spawn.os.confstr',
+ return_value=tmp_dir, create=True), \
+ unittest.mock.patch('distutils.spawn.os.defpath', ''):
rv = find_executable(program)
self.assertEqual(rv, filename)
"""Tests for distutils.sysconfig."""
+import contextlib
import os
import shutil
import subprocess
from distutils import sysconfig
from distutils.ccompiler import get_default_compiler
from distutils.tests import support
-from test.support import TESTFN, run_unittest, check_warnings
+from test.support import TESTFN, run_unittest, check_warnings, swap_item
class SysconfigTestCase(support.EnvironGuard, unittest.TestCase):
def setUp(self):
os.chdir(cwd)
self.assertEqual(srcdir, srcdir2)
- @unittest.skipUnless(get_default_compiler() == 'unix',
- 'not testing if default compiler is not unix')
- def test_customize_compiler(self):
- os.environ['AR'] = 'my_ar'
- os.environ['ARFLAGS'] = '-arflags'
-
+ def customize_compiler(self):
# make sure AR gets caught
class compiler:
compiler_type = 'unix'
def set_executables(self, **kw):
self.exes = kw
+ sysconfig_vars = {
+ 'AR': 'sc_ar',
+ 'CC': 'sc_cc',
+ 'CXX': 'sc_cxx',
+ 'ARFLAGS': '--sc-arflags',
+ 'CFLAGS': '--sc-cflags',
+ 'CCSHARED': '--sc-ccshared',
+ 'LDSHARED': 'sc_ldshared',
+ 'SHLIB_SUFFIX': 'sc_shutil_suffix',
+
+ # On macOS, disable _osx_support.customize_compiler()
+ 'CUSTOMIZED_OSX_COMPILER': 'True',
+ }
+
comp = compiler()
- sysconfig.customize_compiler(comp)
- self.assertEqual(comp.exes['archiver'], 'my_ar -arflags')
+ with contextlib.ExitStack() as cm:
+ for key, value in sysconfig_vars.items():
+ cm.enter_context(swap_item(sysconfig._config_vars, key, value))
+ sysconfig.customize_compiler(comp)
+
+ return comp
+
+ @unittest.skipUnless(get_default_compiler() == 'unix',
+ 'not testing if default compiler is not unix')
+ def test_customize_compiler(self):
+ # Make sure that sysconfig._config_vars is initialized
+ sysconfig.get_config_vars()
+
+ os.environ['AR'] = 'env_ar'
+ os.environ['CC'] = 'env_cc'
+ os.environ['CPP'] = 'env_cpp'
+ os.environ['CXX'] = 'env_cxx --env-cxx-flags'
+ os.environ['LDSHARED'] = 'env_ldshared'
+ os.environ['LDFLAGS'] = '--env-ldflags'
+ os.environ['ARFLAGS'] = '--env-arflags'
+ os.environ['CFLAGS'] = '--env-cflags'
+ os.environ['CPPFLAGS'] = '--env-cppflags'
+
+ comp = self.customize_compiler()
+ self.assertEqual(comp.exes['archiver'],
+ 'env_ar --env-arflags')
+ self.assertEqual(comp.exes['preprocessor'],
+ 'env_cpp --env-cppflags')
+ self.assertEqual(comp.exes['compiler'],
+ 'env_cc --sc-cflags --env-cflags --env-cppflags')
+ self.assertEqual(comp.exes['compiler_so'],
+ ('env_cc --sc-cflags '
+ '--env-cflags --env-cppflags --sc-ccshared'))
+ self.assertEqual(comp.exes['compiler_cxx'],
+ 'env_cxx --env-cxx-flags')
+ self.assertEqual(comp.exes['linker_exe'],
+ 'env_cc')
+ self.assertEqual(comp.exes['linker_so'],
+ ('env_ldshared --env-ldflags --env-cflags'
+ ' --env-cppflags'))
+ self.assertEqual(comp.shared_lib_extension, 'sc_shutil_suffix')
+
+ del os.environ['AR']
+ del os.environ['CC']
+ del os.environ['CPP']
+ del os.environ['CXX']
+ del os.environ['LDSHARED']
+ del os.environ['LDFLAGS']
+ del os.environ['ARFLAGS']
+ del os.environ['CFLAGS']
+ del os.environ['CPPFLAGS']
+
+ comp = self.customize_compiler()
+ self.assertEqual(comp.exes['archiver'],
+ 'sc_ar --sc-arflags')
+ self.assertEqual(comp.exes['preprocessor'],
+ 'sc_cc -E')
+ self.assertEqual(comp.exes['compiler'],
+ 'sc_cc --sc-cflags')
+ self.assertEqual(comp.exes['compiler_so'],
+ 'sc_cc --sc-cflags --sc-ccshared')
+ self.assertEqual(comp.exes['compiler_cxx'],
+ 'sc_cxx')
+ self.assertEqual(comp.exes['linker_exe'],
+ 'sc_cc')
+ self.assertEqual(comp.exes['linker_so'],
+ 'sc_ldshared')
+ self.assertEqual(comp.shared_lib_extension, 'sc_shutil_suffix')
def test_parse_makefile_base(self):
self.makefile = TESTFN
"""
import re
+import sys
import urllib # For urllib.parse.unquote
from string import hexdigits
from collections import OrderedDict
def quote_string(value):
return '"'+str(value).replace('\\', '\\\\').replace('"', r'\"')+'"'
+# Match an RFC 2047 encoded word; it looks like =?utf-8?q?someword?=
+rfc2047_matcher = re.compile(r'''
+ =\? # literal =?
+ [^?]* # charset
+ \? # literal ?
+ [qQbB] # literal 'q' or 'b', case insensitive
+ \? # literal ?
+ .*? # encoded word
+ \?= # literal ?=
+''', re.VERBOSE | re.MULTILINE)
+
+
#
# TokenList and its subclasses
#
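The matcher above lets `get_unstructured` split an atom in the middle when an RFC 2047 encoded word appears without whitespace around it. Reproducing the same pattern standalone shows what it does and does not match:

```python
import re

rfc2047_matcher = re.compile(r'''
   =\?            # literal =?
   [^?]*          # charset
   \?             # literal ?
   [qQbB]         # literal 'q' or 'b', either case
   \?             # literal ?
   .*?            # encoded word
   \?=            # literal ?=
''', re.VERBOSE | re.MULTILINE)

assert rfc2047_matcher.search('=?utf-8?q?somevalue?=')
assert rfc2047_matcher.search('prefix=?utf-8?B?Zm9v?=suffix')  # no WSP around it
assert rfc2047_matcher.search('plain text') is None
```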
_validate_xtext(vtext)
ew.append(vtext)
text = ''.join(remainder)
+ # Encoded words should be followed by a WS
+ if value and value[0] not in WSP:
+ ew.defects.append(errors.InvalidHeaderDefect(
+ "missing trailing whitespace after encoded-word"))
return ew, value
def get_unstructured(value):
unstructured.append(token)
continue
tok, *remainder = _wsp_splitter(value, 1)
+ # Split in the middle of an atom if there is a rfc2047 encoded word
+ # which does not have WSP on both sides. The defect will be registered
+ # the next time through the loop.
+ if rfc2047_matcher.search(tok):
+ tok, *remainder = value.partition('=?')
vtext = ValueTerminal(tok, 'vtext')
_validate_xtext(vtext)
unstructured.append(vtext)
"""
# max_line_length 0/None means no limit, ie: infinitely long.
- maxlen = policy.max_line_length or float("+inf")
+ maxlen = policy.max_line_length or sys.maxsize
encoding = 'utf-8' if policy.utf8 else 'us-ascii'
lines = ['']
last_ew = None
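`float('+inf')` is a poor "no limit" sentinel here because downstream code derives slice bounds from `maxlen`, and Python rejects float slice indices; `sys.maxsize` behaves like an ordinary integer in both comparisons and slicing:

```python
import sys

s = 'abcdef'
limit = sys.maxsize            # integer "no limit" sentinel
assert s[:limit] == s          # valid as a slice bound
assert isinstance(limit - 10, int)

inf = float('+inf')
failed = False
try:
    s[:inf]                    # float bounds are rejected by slicing
except TypeError:
    failed = True
assert failed
```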
want_encoding = False
last_ew = None
if part.syntactic_break:
- encoded_part = part.fold(policy=policy)[:-1] # strip nl
+ encoded_part = part.fold(policy=policy)[:-len(policy.linesep)]
if policy.linesep not in encoded_part:
# It fits on a single line
if len(encoded_part) > maxlen - len(lines[-1]):
newline = _steal_trailing_WSP_if_exists(lines)
if newline or part.startswith_fws():
lines.append(newline + tstr)
+ last_ew = None
continue
if not hasattr(part, 'encode'):
# It's not a terminal, try folding the subparts.
lines.append(' ')
# XXX We'll get an infinite loop here if maxlen is <= 7
continue
- first_part = to_encode[:text_space]
- ew = _ew.encode(first_part, charset=encode_as)
- excess = len(ew) - remaining_space
- if excess > 0:
- # encode always chooses the shortest encoding, so this
- # is guaranteed to fit at this point.
- first_part = first_part[:-excess]
- ew = _ew.encode(first_part)
- lines[-1] += ew
- to_encode = to_encode[len(first_part):]
+
+ to_encode_word = to_encode[:text_space]
+ encoded_word = _ew.encode(to_encode_word, charset=encode_as)
+ excess = len(encoded_word) - remaining_space
+ while excess > 0:
+ # Since the chunk to encode is guaranteed to fit into less than 100 characters,
+ # shrinking it by one at a time shouldn't take long.
+ to_encode_word = to_encode_word[:-1]
+ encoded_word = _ew.encode(to_encode_word, charset=encode_as)
+ excess = len(encoded_word) - remaining_space
+ lines[-1] += encoded_word
+ to_encode = to_encode[len(to_encode_word):]
+
if to_encode:
lines.append(' ')
new_last_ew = len(lines[-1])
self._cur.set_payload(EMPTYSTRING.join(lines))
return
# Make sure a valid content type was specified per RFC 2045:6.4.
- if (self._cur.get('content-transfer-encoding', '8bit').lower()
+ if (str(self._cur.get('content-transfer-encoding', '8bit')).lower()
not in ('7bit', '8bit', 'binary')):
defect = errors.InvalidMultipartContentTransferEncodingDefect()
self.policy.handle_defect(self._cur, defect)
if end_of_line != (' ', ''):
self._current_line.push(*end_of_line)
if len(self._current_line) > 0:
- if self._current_line.is_onlyws():
+ if self._current_line.is_onlyws() and self._lines:
self._lines[-1] += str(self._current_line)
else:
self._lines.append(str(self._current_line))
"""
import re
+import sys
from email._policybase import Policy, Compat32, compat32, _extend_docstrings
from email.utils import _has_surrogates
from email.headerregistry import HeaderRegistry as HeaderRegistry
def _fold(self, name, value, refold_binary=False):
if hasattr(value, 'name'):
return value.fold(policy=self)
- maxlen = self.max_line_length if self.max_line_length else float('inf')
+ maxlen = self.max_line_length if self.max_line_length else sys.maxsize
lines = value.splitlines()
refold = (self.refold_source == 'all' or
self.refold_source == 'long' and
This iterates over the lines of all files listed in sys.argv[1:],
defaulting to sys.stdin if the list is empty. If a filename is '-' it
-is also replaced by sys.stdin. To specify an alternative list of
-filenames, pass it as the argument to input(). A single file name is
-also allowed.
+is also replaced by sys.stdin and the optional arguments mode and
+openhook are ignored. To specify an alternative list of filenames,
+pass it as the argument to input(). A single file name is also allowed.
Functions filename(), lineno() return the filename and cumulative line
number of the line that has just been read; filelineno() returns its
callables as instance methods.
"""
- def __init__(self, func, *args, **keywords):
+ def __init__(*args, **keywords):
+ if len(args) >= 2:
+ self, func, *args = args
+ elif not args:
+ raise TypeError("descriptor '__init__' of partialmethod "
+ "needs an argument")
+ elif 'func' in keywords:
+ func = keywords.pop('func')
+ self, *args = args
+ else:
+ raise TypeError("type 'partialmethod' takes at least one argument, "
+ "got %d" % (len(args)-1))
+ args = tuple(args)
+
if not callable(func) and not hasattr(func, "__get__"):
raise TypeError("{!r} is not callable or a descriptor"
.format(func))
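For reference, ordinary `partialmethod` use is unchanged by this rewrite; the manual `*args` parsing exists so the constructor also works when the wrapped callable is supplied via the `func` keyword. A minimal sketch in the spirit of the `functools` docs (`Cell` and `set_state` are illustrative names):

```python
from functools import partialmethod

class Cell:
    def __init__(self):
        self._alive = False

    def set_state(self, state):
        self._alive = state

    # partialmethod pre-binds arguments for the eventual method call.
    set_alive = partialmethod(set_state, True)
    set_dead = partialmethod(set_state, False)

c = Cell()
c.set_alive()
print(c._alive)  # True
c.set_dead()
print(c._alive)  # False
```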
_is_legal_header_name = re.compile(rb'[^:\s][^:\r\n]*').fullmatch
_is_illegal_header_value = re.compile(rb'\n(?![ \t])|\r(?![ \t\n])').search
+# These characters are not allowed within HTTP URL paths.
+# See https://tools.ietf.org/html/rfc3986#section-3.3 and the
+# https://tools.ietf.org/html/rfc3986#appendix-A pchar definition.
+# Prevents CVE-2019-9740. Includes control characters such as \r\n.
+# We don't restrict chars above \x7f as putrequest() limits us to ASCII.
+_contains_disallowed_url_pchar_re = re.compile('[\x00-\x20\x7f]')
+# Arguably only these _should_ be allowed:
+# _is_allowed_url_pchars_re = re.compile(r"^[/!$&'()*+,;=:@%a-zA-Z0-9._~-]+$")
+# We are more lenient for assumed real world compatibility purposes.
+
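As a sanity check, the new pattern flags CR/LF (and any other C0 control character, space, or DEL) before the request line is assembled. A small sketch reusing the regex from the patch; `first_disallowed` is an illustrative helper, not part of `http.client`:

```python
import re

# Same pattern as in the patch: reject ASCII control chars,
# space, and DEL inside an HTTP request-target.
_contains_disallowed_url_pchar_re = re.compile('[\x00-\x20\x7f]')

def first_disallowed(url):
    """Return repr of the first disallowed char in url, or None."""
    match = _contains_disallowed_url_pchar_re.search(url)
    return repr(match.group()) if match else None

print(first_disallowed('/index.html?q=1'))        # None
print(first_disallowed('/a\r\nHost: evil.test'))  # '\r'
```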
# We always set the Content-Length header for these methods because some
# servers will otherwise respond with a 411
_METHODS_EXPECTING_BODY = {'PATCH', 'POST', 'PUT'}
self.headers = self.msg = parse_headers(self.fp)
if self.debuglevel > 0:
- for hdr in self.headers:
- print("header:", hdr + ":", self.headers.get(hdr))
+ for hdr, val in self.headers.items():
+ print("header:", hdr + ":", val)
# are we using the chunked-style of transfer encoding?
tr_enc = self.headers.get("transfer-encoding")
self._method = method
if not url:
url = '/'
+ # Prevent CVE-2019-9740.
+ match = _contains_disallowed_url_pchar_re.search(url)
+ if match:
+ raise InvalidURL(f"URL can't contain control characters. {url!r} "
+ f"(found at least {match.group()!r})")
request = '%s %s %s' % (method, url, self._http_vsn_str)
# Non-ASCII characters should have been eliminated earlier
self.cert_file = cert_file
if context is None:
context = ssl._create_default_https_context()
+ # enable PHA for TLS 1.3 connections if available
+ if context.post_handshake_auth is not None:
+ context.post_handshake_auth = True
will_verify = context.verify_mode != ssl.CERT_NONE
if check_hostname is None:
check_hostname = context.check_hostname
"either CERT_OPTIONAL or CERT_REQUIRED")
if key_file or cert_file:
context.load_cert_chain(cert_file, key_file)
+ # cert and key file means the user wants to authenticate.
+ # enable TLS 1.3 PHA implicitly even for custom contexts.
+ if context.post_handshake_auth is not None:
+ context.post_handshake_auth = True
self._context = context
if check_hostname is not None:
self._context.check_hostname = check_hostname
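Both hunks guard on `post_handshake_auth is not None` because the attribute is `None` when the interpreter was built against an OpenSSL without TLS 1.3 post-handshake-auth support. A defensive sketch of the same pattern, safe to run on any build:

```python
import ssl

ctx = ssl.create_default_context()
# post_handshake_auth is None on OpenSSL builds lacking TLS 1.3
# PHA support, so check before enabling, exactly as the patch does.
if getattr(ctx, 'post_handshake_auth', None) is not None:
    ctx.post_handshake_auth = True
print(getattr(ctx, 'post_handshake_auth', None))
```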
-What's New in IDLE 3.7.3
-Released on 2019-??-??
+What's New in IDLE 3.7.4
+Released on 2019-06-24?
======================================
+bpo-37321: Both subprocess connection error messages now refer to
+the 'Startup failure' section of the IDLE doc.
+
+bpo-37039: Adjust "Zoom Height" to individual screens by momentarily
+maximizing the window on first use with a particular screen. Changing
+screen settings may invalidate the saved height. While a window is
+maximized, "Zoom Height" has no effect.
+
+bpo-35763: Make calltip reminder about '/' meaning positional-only less
+obtrusive by only adding it when there is room on the first line.
+
+bpo-35610: Replace now redundant editor.context_use_ps1 with
+.prompt_last_line. This finishes the change started in bpo-31858.
+
+bpo-32411: Stop sorting dict created with desired line order.
+
+bpo-37038: Make idlelib.run runnable; add test clause.
+
+bpo-36958: Print any argument other than None or int passed to
+SystemExit or sys.exit().
+
+bpo-36807: When saving a file, call file.flush() and os.fsync()
+so bits are flushed to e.g. a USB drive.
+
+bpo-36429: Fix starting IDLE with pyshell.
+Add idlelib.pyshell alias at top; remove pyshell alias at bottom.
+Remove obsolete __name__=='__main__' command.
+
+bpo-30348: Increase test coverage of idlelib.autocomplete by 30%.
+Patch by Louie Lu.
+
+bpo-23205: Add tests and refactor grep's findfiles.
+
+bpo-36405: Use dict unpacking in idlelib.
+
+bpo-36396: Remove fgBg param of idlelib.config.GetHighlight().
+This param was only used twice and changed the return type.
+
+bpo-23216: IDLE: Add docstrings to search modules.
+
+
+What's New in IDLE 3.7.3
+Released on 2019-03-25
+======================================
+
bpo-36176: Fix IDLE autocomplete & calltip popup colors.
Prevent conflicts with Linux dark themes
(and slightly darken calltip background).
What's New in IDLE 3.7.2
-Released on 2018-12-21?
+Released on 2018-12-24
======================================
bpo-34864: When starting IDLE on MacOS, warn if the system setting
What's New in IDLE 3.7.1
-Released on 2018-07-31?
+Released on 2018-10-20
======================================
bpo-1529353: Output over N lines (50 by default) is squeezed down to a button.
Either on demand or after a user-selected delay after a key character,
pop up a list of candidates.
"""
+import __main__
import os
import string
import sys
from idlelib import autocomplete_w
from idlelib.config import idleConf
from idlelib.hyperparser import HyperParser
-import __main__
# This string includes all chars that may be in an identifier.
# TODO Update this here and elsewhere.
def open_completions(self, evalfuncs, complete, userWantsWin, mode=None):
"""Find the completions and create the AutoCompleteWindow.
Return True if successful (no syntax error or so found).
- if complete is True, then if there's nothing to complete and no
+ If complete is True, then if there's nothing to complete and no
start of completion, won't open completions and return False.
If mode is given, will open a completion list only in this mode.
+
+ Action Function Eval Complete WantWin Mode
+ ^space force_open_completions True, False, True no
+ . or / try_open_completions False, False, False yes
+ tab autocomplete False, True, True no
"""
# Cancel another delayed call, if it exists.
if self._delayed_completion_id is not None:
curline = self.text.get("insert linestart", "insert")
i = j = len(curline)
if hp.is_in_string() and (not mode or mode==COMPLETE_FILES):
- # Find the beginning of the string
- # fetch_completions will look at the file system to determine whether the
- # string value constitutes an actual file name
- # XXX could consider raw strings here and unescape the string value if it's
- # not raw.
+ # Find the beginning of the string.
+ # fetch_completions will look at the file system to determine
+ # whether the string value constitutes an actual file name
+ # XXX could consider raw strings here and unescape the string
+ # value if it's not raw.
self._remove_autocomplete_window()
mode = COMPLETE_FILES
# Find last separator or string start
else:
if mode == COMPLETE_ATTRIBUTES:
if what == "":
- namespace = __main__.__dict__.copy()
- namespace.update(__main__.__builtins__.__dict__)
+ namespace = {**__main__.__builtins__.__dict__,
+ **__main__.__dict__}
bigl = eval("dir()", namespace)
bigl.sort()
if "__all__" in bigl:
return smalll, bigl
def get_entity(self, name):
- """Lookup name in a namespace spanning sys.modules and __main.dict__"""
- namespace = sys.modules.copy()
- namespace.update(__main__.__dict__)
- return eval(name, namespace)
+        "Lookup name in a namespace spanning sys.modules and __main__.__dict__."
+ return eval(name, {**sys.modules, **__main__.__dict__})
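These `{**a, **b}` rewrites rely on later unpackings overriding earlier keys, which matches the old `copy()` plus `update()` behavior. A tiny demonstration with made-up namespaces:

```python
builtins_ns = {'len': len, 'answer': 1}
main_ns = {'answer': 42, 'x': 'hi'}

# Later unpackings win, so __main__ entries shadow builtins,
# just as namespace.update(__main__.__dict__) did before.
namespace = {**builtins_ns, **main_ns}
print(namespace['answer'])  # 42
print(sorted(namespace))    # ['answer', 'len', 'x']
```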
AutoComplete.reload()
The dictionary maps names to pyclbr information objects.
Filter out imported objects.
Augment class names with bases.
- Sort objects by line number.
+    The insertion order of the dictionary is assumed to have been in line
+ number order, so sorting is not necessary.
- The current tree only calls this once per child_dic as it saves
+ The current tree only calls this once per child_dict as it saves
TreeItems once created. A future tree and tests might violate this,
so a check prevents multiple in-place augmentations.
"""
supers.append(sname)
obj.name += '({})'.format(', '.join(supers))
obs.append(obj)
- return sorted(obs, key=lambda o: o.lineno)
+ return obs
class ModuleBrowser:
parameter and docstring information when you type an opening parenthesis, and
which disappear when you type a closing parenthesis.
"""
+import __main__
import inspect
import re
import sys
from idlelib import calltip_w
from idlelib.hyperparser import HyperParser
-import __main__
class Calltip:
    in a namespace spanning sys.modules and __main__.__dict__.
"""
if expression:
- namespace = sys.modules.copy()
- namespace.update(__main__.__dict__)
+ namespace = {**sys.modules, **__main__.__dict__}
try:
- return eval(expression, namespace)
+ return eval(expression, namespace) # Only protect user code.
except BaseException:
# An uncaught exception closes idle, and eval can raise any
# exception, especially if user classes are involved.
_first_param = re.compile(r'(?<=\()\w*\,?\s*')
_default_callable_argspec = "See source or doc"
_invalid_method = "invalid method signature"
-_argument_positional = "\n['/' marks preceding arguments as positional-only]\n"
+_argument_positional = " # '/' marks preceding args as positional-only."
def get_argspec(ob):
'''Return a string describing the signature of a callable object, or ''.
if msg.startswith(_invalid_method):
return _invalid_method
- if '/' in argspec:
- """Using AC's positional argument should add the explain"""
+ if '/' in argspec and len(argspec) < _MAX_COLS - len(_argument_positional):
+ # Add explanation TODO remove after 3.7, before 3.9.
argspec += _argument_positional
if isinstance(fob, type) and argspec == '()':
- """fob with no argument, use default callable argspec"""
+ # If fob has no argument, use default callable argspec.
argspec = _default_callable_argspec
lines = (textwrap.wrap(argspec, _MAX_COLS, subsequent_indent=_INDENT)
# Not automatic because ColorDelegator does not know 'text'.
theme = idleConf.CurrentTheme()
normal_colors = idleConf.GetHighlight(theme, 'normal')
- cursor_color = idleConf.GetHighlight(theme, 'cursor', fgBg='fg')
+ cursor_color = idleConf.GetHighlight(theme, 'cursor')['foreground']
select_colors = idleConf.GetHighlight(theme, 'hilite')
text.config(
foreground=normal_colors['foreground'],
format-paragraph= <Alt-Key-q>
flash-paren= <Control-Key-0>
run-module= <Key-F5>
+run-custom= <Shift-Key-F5>
check-module= <Alt-Key-x>
zoom-height= <Alt-Key-2>
format-paragraph= <Alt-Key-q>
flash-paren= <Control-Key-0>
run-module= <Key-F5>
+run-custom= <Shift-Key-F5>
check-module= <Alt-Key-x>
zoom-height= <Alt-Key-2>
format-paragraph= <Alt-Key-q>
flash-paren= <Control-Key-0>
run-module= <Key-F5>
+run-custom= <Shift-Key-F5>
check-module= <Alt-Key-x>
zoom-height= <Alt-Key-2>
format-paragraph= <Option-Key-q>
flash-paren= <Control-Key-0>
run-module= <Key-F5>
+run-custom= <Shift-Key-F5>
check-module= <Option-Key-x>
zoom-height= <Option-Key-0>
format-paragraph= <Option-Key-q>
flash-paren= <Control-Key-0>
run-module= <Key-F5>
+run-custom= <Shift-Key-F5>
check-module= <Option-Key-x>
zoom-height= <Option-Key-0>
there are separate dicts for default and user values. Each has
config-type keys 'main', 'extensions', 'highlight', and 'keys'. The
value for each key is a ConfigParser instance that maps section and item
-to values. For 'main' and 'extenstons', user values override
+to values. For 'main' and 'extensions', user values override
default values. For 'highlight' and 'keys', user sections augment the
default sections (and must, therefore, have distinct names).
class InvalidConfigType(Exception): pass
class InvalidConfigSet(Exception): pass
-class InvalidFgBg(Exception): pass
class InvalidTheme(Exception): pass
class IdleConfParser(ConfigParser):
raise InvalidConfigSet('Invalid configSet specified')
return cfgParser.sections()
- def GetHighlight(self, theme, element, fgBg=None):
- """Return individual theme element highlight color(s).
+ def GetHighlight(self, theme, element):
+ """Return dict of theme element highlight colors.
- fgBg - string ('fg' or 'bg') or None.
- If None, return a dictionary containing fg and bg colors with
- keys 'foreground' and 'background'. Otherwise, only return
- fg or bg color, as specified. Colors are intended to be
- appropriate for passing to Tkinter in, e.g., a tag_config call).
+ The keys are 'foreground' and 'background'. The values are
+ tkinter color strings for configuring backgrounds and tags.
"""
- if self.defaultCfg['highlight'].has_section(theme):
- themeDict = self.GetThemeDict('default', theme)
- else:
- themeDict = self.GetThemeDict('user', theme)
- fore = themeDict[element + '-foreground']
- if element == 'cursor': # There is no config value for cursor bg
- back = themeDict['normal-background']
- else:
- back = themeDict[element + '-background']
- highlight = {"foreground": fore, "background": back}
- if not fgBg: # Return dict of both colors
- return highlight
- else: # Return specified color only
- if fgBg == 'fg':
- return highlight["foreground"]
- if fgBg == 'bg':
- return highlight["background"]
- else:
- raise InvalidFgBg('Invalid fgBg specified')
+ cfg = ('default' if self.defaultCfg['highlight'].has_section(theme)
+ else 'user')
+ theme_dict = self.GetThemeDict(cfg, theme)
+ fore = theme_dict[element + '-foreground']
+ if element == 'cursor':
+ element = 'normal'
+ back = theme_dict[element + '-background']
+ return {"foreground": fore, "background": back}
def GetThemeDict(self, type, themeName):
"""Return {option:value} dict for elements in themeName.
former_extension_events = { # Those with user-configurable keys.
'<<force-open-completions>>', '<<expand-word>>',
'<<force-open-calltip>>', '<<flash-paren>>', '<<format-paragraph>>',
- '<<run-module>>', '<<check-module>>', '<<zoom-height>>'}
+ '<<run-module>>', '<<check-module>>', '<<zoom-height>>',
+ '<<run-custom>>',
+ }
def GetCoreKeys(self, keySetName=None):
"""Return dict of core virtual-key keybindings for keySetName.
'<<flash-paren>>': ['<Control-Key-0>'],
'<<format-paragraph>>': ['<Alt-Key-q>'],
'<<run-module>>': ['<Key-F5>'],
+ '<<run-custom>>': ['<Shift-Key-F5>'],
'<<check-module>>': ['<Alt-Key-x>'],
'<<zoom-height>>': ['<Alt-Key-2>'],
}
def help(self):
"""Create textview for config dialog help.
- Attrbutes accessed:
+ Attributes accessed:
note
Methods:
colors = idleConf.GetHighlight(theme, element)
if element == 'cursor': # Cursor sample needs special painting.
colors['background'] = idleConf.GetHighlight(
- theme, 'normal', fgBg='bg')
+ theme, 'normal')['background']
# Handle any unsaved changes to this theme.
if theme in changes['highlight']:
theme_dict = changes['highlight'][theme]
'General': '''
General:
-AutoComplete: Popupwait is milleseconds to wait after key char, without
+AutoComplete: Popupwait is milliseconds to wait after key char, without
cursor movement, before popping up completion box. Key char is '.' after
identifier or a '/' (or '\\' on Windows) within a string.
def resetcache(self):
"Removes added attributes while leaving original attributes."
- # Function is really about resetting delagator dict
+ # Function is really about resetting delegator dict
# to original state. Cache is just a means
for key in self.__cache:
try:
self.indentwidth = self.tabwidth
self.set_notabs_indentwidth()
- # If context_use_ps1 is true, parsing searches back for a ps1 line;
- # else searches for a popular (if, def, ...) Python stmt.
- self.context_use_ps1 = False
-
# When searching backwards for a reliable place to begin parsing,
# first start num_context_lines[0] lines back, then
# num_context_lines[1] lines back if that didn't work, and so on.
scriptbinding = ScriptBinding(self)
text.bind("<<check-module>>", scriptbinding.check_module_event)
text.bind("<<run-module>>", scriptbinding.run_module_event)
+ text.bind("<<run-custom>>", scriptbinding.run_custom_event)
text.bind("<<do-rstrip>>", self.Rstrip(self).do_rstrip)
ctip = self.Calltip(self)
text.bind("<<try-open-calltip>>", ctip.try_open_calltip_event)
self.CodeContext(self).toggle_code_context_event)
def _filename_to_unicode(self, filename):
- """Return filename as BMP unicode so diplayable in Tk."""
+ """Return filename as BMP unicode so displayable in Tk."""
# Decode bytes to unicode.
if isinstance(filename, bytes):
try:
# open/close first need to find the last stmt
lno = index2line(text.index('insert'))
y = pyparse.Parser(self.indentwidth, self.tabwidth)
- if not self.context_use_ps1:
+ if not self.prompt_last_line:
for context in self.num_context_lines:
startat = max(lno - context, 1)
startatindex = repr(startat) + ".0"
rawtext = text.get(startatindex, "insert")
y.set_code(rawtext)
bod = y.find_good_parse_start(
- self.context_use_ps1,
self._build_char_in_string_func(startatindex))
if bod is not None or startat == 1:
break
from idlelib import searchengine
# Importing OutputWindow here fails due to import loop
-# EditorWindow -> GrepDialop -> OutputWindow -> EditorWindow
+# EditorWindow -> GrepDialog -> OutputWindow -> EditorWindow
def grep(text, io=None, flist=None):
- """Create or find singleton GrepDialog instance.
+ """Open the Find in Files dialog.
+
+ Module-level function to access the singleton GrepDialog
+ instance and open the dialog. If text is selected, it is
+ used as the search phrase; otherwise, the previous entry
+ is used.
Args:
text: Text widget that contains the selected text for
io: iomenu.IOBinding instance with default path to search.
flist: filelist.FileList instance for OutputWindow parent.
"""
-
root = text._root()
engine = searchengine.get(root)
if not hasattr(engine, "_grepdialog"):
dialog.open(text, searchphrase, io)
+def walk_error(msg):
+ "Handle os.walk error."
+ print(msg)
+
+
+def findfiles(folder, pattern, recursive):
+    """Generate file names in folder that match pattern.
+
+ Args:
+ folder: Root directory to search.
+ pattern: File pattern to match.
+ recursive: True to include subdirectories.
+ """
+ for dirpath, _, filenames in os.walk(folder, onerror=walk_error):
+ yield from (os.path.join(dirpath, name)
+ for name in filenames
+ if fnmatch.fnmatch(name, pattern))
+ if not recursive:
+ break
+
+
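The generator's non-recursive mode works because `os.walk` (top-down by default) yields the top directory first, so a single `break` stops the traversal. A self-contained sketch of the same function run against a throwaway tree (file names are illustrative):

```python
import fnmatch
import os
import tempfile

def findfiles(folder, pattern, recursive):
    """Generate file paths under folder matching pattern."""
    for dirpath, _, filenames in os.walk(folder):
        yield from (os.path.join(dirpath, name)
                    for name in filenames
                    if fnmatch.fnmatch(name, pattern))
        if not recursive:
            break  # os.walk yields the top directory first

with tempfile.TemporaryDirectory() as root:
    os.mkdir(os.path.join(root, 'sub'))
    for rel in ('a.py', 'b.txt', os.path.join('sub', 'c.py')):
        open(os.path.join(root, rel), 'w').close()
    flat = sorted(os.path.basename(p) for p in findfiles(root, '*.py', False))
    deep = sorted(os.path.basename(p) for p in findfiles(root, '*.py', True))
    print(flat)  # ['a.py']
    print(deep)  # ['a.py', 'c.py']
```

Returning a generator instead of a sorted list leaves ordering to the caller, which is why `default_command` now wraps the call in `sorted()`.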
class GrepDialog(SearchDialogBase):
"Dialog for searching multiple files."
searchengine instance to prepare the search.
Attributes:
- globvar: Value of Text Entry widget for path to search.
- recvar: Boolean value of Checkbutton widget
- for traversing through subdirectories.
+        flist: filelist.FileList instance for OutputWindow parent.
+ globvar: String value of Entry widget for path to search.
+ globent: Entry widget for globvar. Created in
+ create_entries().
+ recvar: Boolean value of Checkbutton widget for
+ traversing through subdirectories.
"""
- SearchDialogBase.__init__(self, root, engine)
+ super().__init__(root, engine)
self.flist = flist
self.globvar = StringVar(root)
self.recvar = BooleanVar(root)
def open(self, text, searchphrase, io=None):
- "Make dialog visible on top of others and ready to use."
+ """Make dialog visible on top of others and ready to use.
+
+ Extend the SearchDialogBase open() to set the initial value
+ for globvar.
+
+ Args:
+ text: Multicall object containing the text information.
+ searchphrase: String phrase to search.
+ io: iomenu.IOBinding instance containing file path.
+ """
SearchDialogBase.open(self, text, searchphrase)
if io:
path = io.filename or ""
btn.pack(side="top", fill="both")
def create_command_buttons(self):
- "Create base command buttons and add button for search."
+ "Create base command buttons and add button for Search Files."
SearchDialogBase.create_command_buttons(self)
- self.make_button("Search Files", self.default_command, 1)
+ self.make_button("Search Files", self.default_command, isdef=True)
def default_command(self, event=None):
"""Grep for search pattern in file path. The default command is bound
search each line for the matching pattern. If the pattern is
found, write the file and line information to stdout (which
is an OutputWindow).
+
+ Args:
+ prog: The compiled, cooked search pattern.
+ path: String containing the search path.
"""
- dir, base = os.path.split(path)
- list = self.findfiles(dir, base, self.recvar.get())
- list.sort()
+ folder, filepat = os.path.split(path)
+ if not folder:
+ folder = os.curdir
+ filelist = sorted(findfiles(folder, filepat, self.recvar.get()))
self.close()
pat = self.engine.getpat()
print(f"Searching {pat!r} in {path} ...")
hits = 0
try:
- for fn in list:
+ for fn in filelist:
try:
with open(fn, errors='replace') as f:
for lineno, line in enumerate(f, 1):
# so in OW.write, OW.text.insert fails.
pass
- def findfiles(self, dir, base, rec):
- """Return list of files in the dir that match the base pattern.
-
- If rec is True, recursively iterate through subdirectories.
- """
- try:
- names = os.listdir(dir or os.curdir)
- except OSError as msg:
- print(msg)
- return []
- list = []
- subdirs = []
- for name in names:
- fn = os.path.join(dir, name)
- if os.path.isdir(fn):
- subdirs.append(fn)
- else:
- if fnmatch.fnmatch(name, base):
- list.append(fn)
- if rec:
- for subdir in subdirs:
- list.extend(self.findfiles(subdir, base, rec))
- return list
-
def _grep_dialog(parent): # htest #
from tkinter import Toplevel, Text, SEL, END
<head>
<meta http-equiv="X-UA-Compatible" content="IE=Edge" />
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
- <title>IDLE — Python 3.8.0a1 documentation</title>
+ <title>IDLE — Python 3.9.0a0 documentation</title>
<link rel="stylesheet" href="../_static/pydoctheme.css" type="text/css" />
<link rel="stylesheet" href="../_static/pygments.css" type="text/css" />
<script type="text/javascript" src="../_static/sidebar.js"></script>
<link rel="search" type="application/opensearchdescription+xml"
- title="Search within Python 3.8.0a1 documentation"
+ title="Search within Python 3.9.0a0 documentation"
href="../_static/opensearch.xml"/>
<link rel="author" title="About these documents" href="../about.html" />
<link rel="index" title="Index" href="../genindex.html" />
</head><body>
+
<div class="related" role="navigation" aria-label="related navigation">
<h3>Navigation</h3>
<ul>
<li>
- <a href="../index.html">3.8.0a1 Documentation</a> »
+ <a href="../index.html">3.9.0a0 Documentation</a> »
</li>
<li class="nav-item nav-item-1"><a href="index.html" >The Python Standard Library</a> »</li>
default title and context menu.</p>
<p>On macOS, there is one application menu. It dynamically changes according
to the window currently selected. It has an IDLE menu, and some entries
-described below are moved around to conform to Apple guidlines.</p>
+described below are moved around to conform to Apple guidelines.</p>
<div class="section" id="file-menu-shell-and-editor">
<h3>File menu (Shell and Editor)<a class="headerlink" href="#file-menu-shell-and-editor" title="Permalink to this headline">¶</a></h3>
<dl class="docutils">
</div>
<div class="section" id="run-menu-editor-window-only">
<span id="index-2"></span><h3>Run menu (Editor window only)<a class="headerlink" href="#run-menu-editor-window-only" title="Permalink to this headline">¶</a></h3>
-<dl class="docutils">
+<dl class="docutils" id="python-shell">
<dt>Python Shell</dt>
<dd>Open or wake up the Python Shell window.</dd>
+</dl>
+<dl class="docutils" id="check-module">
<dt>Check Module</dt>
<dd>Check the syntax of the module currently open in the Editor window. If the
module has not been saved IDLE will either prompt the user to save or
autosave, as selected in the General tab of the Idle Settings dialog. If
there is a syntax error, the approximate location is indicated in the
Editor window.</dd>
+</dl>
+<dl class="docutils" id="run-module">
<dt>Run Module</dt>
-<dd>Do Check Module (above). If no error, restart the shell to clean the
+<dd>Do <a class="reference internal" href="#check-module"><span class="std std-ref">Check Module</span></a>. If no error, restart the shell to clean the
environment, then execute the module. Output is displayed in the Shell
window. Note that output requires use of <code class="docutils literal notranslate"><span class="pre">print</span></code> or <code class="docutils literal notranslate"><span class="pre">write</span></code>.
When execution is complete, the Shell retains focus and displays a prompt.
This is similar to executing a file with <code class="docutils literal notranslate"><span class="pre">python</span> <span class="pre">-i</span> <span class="pre">file</span></code> at a command
line.</dd>
</dl>
+<dl class="docutils" id="run-custom">
+<dt>Run… Customized</dt>
+<dd>Same as <a class="reference internal" href="#run-module"><span class="std std-ref">Run Module</span></a>, but run the module with customized
+settings. <em>Command Line Arguments</em> extend <a class="reference internal" href="sys.html#sys.argv" title="sys.argv"><code class="xref py py-data docutils literal notranslate"><span class="pre">sys.argv</span></code></a> as if passed
+on a command line. The module can be run in the Shell without restarting.</dd>
+</dl>
</div>
<div class="section" id="shell-menu-shell-window-only">
<h3>Shell menu (Shell window only)<a class="headerlink" href="#shell-menu-shell-window-only" title="Permalink to this headline">¶</a></h3>
configuration dialog by selecting Preferences in the application
menu. For more, see
<a class="reference internal" href="#preferences"><span class="std std-ref">Setting preferences</span></a> under Help and preferences.</dd>
-<dt>Zoom/Restore Height</dt>
-<dd>Toggles the window between normal size and maximum height. The initial size
-defaults to 40 lines by 80 chars unless changed on the General tab of the
-Configure IDLE dialog.</dd>
<dt>Show/Hide Code Context (Editor Window only)</dt>
<dd>Open a pane at the top of the edit window which shows the block context
of the code which has scrolled above the top of the window. See
<a class="reference internal" href="#code-context"><span class="std std-ref">Code Context</span></a> in the Editing and Navigation section below.</dd>
+<dt>Zoom/Restore Height</dt>
+<dd>Toggles the window between normal size and maximum height. The initial size
+defaults to 40 lines by 80 chars unless changed on the General tab of the
+Configure IDLE dialog. The maximum height for a screen is determined by
+momentarily maximizing a window the first time one is zoomed on the screen.
+Changing screen settings may invalidate the saved height. This toggle has
+no effect when a window is maximized.</dd>
</dl>
</div>
<div class="section" id="window-menu-shell-and-editor">
<dt>Go to file/line</dt>
<dd>Same as in Debug menu.</dd>
</dl>
-<p>The Shell window also has an output squeezing facility explained in the
-the <em>Python Shell window</em> subsection below.</p>
+<p>The Shell window also has an output squeezing facility explained in the <em>Python
+Shell window</em> subsection below.</p>
<dl class="docutils">
<dt>Squeeze</dt>
<dd>If the cursor is over an output line, squeeze all the output between
<p>If <code class="docutils literal notranslate"><span class="pre">sys</span></code> is reset by user code, such as with <code class="docutils literal notranslate"><span class="pre">importlib.reload(sys)</span></code>,
IDLE’s changes are lost and input from the keyboard and output to the screen
will not work correctly.</p>
+<p>When user code raises SystemExit either directly or by calling sys.exit, IDLE
+returns to a Shell prompt instead of exiting.</p>
</div>
<div class="section" id="user-output-in-shell">
<h3>User output in Shell<a class="headerlink" href="#user-output-in-shell" title="Permalink to this headline">¶</a></h3>
In contrast, some system text windows only keep the last n lines of output.
A Windows console, for instance, keeps a user-settable 1 to 9999 lines,
with 300 the default.</p>
-<p>A Tk Text widget, and hence IDLE’s Shell, displays characters (codepoints)
-in the the BMP (Basic Multilingual Plane) subset of Unicode.
-Which characters are displayed with a proper glyph and which with a
-replacement box depends on the operating system and installed fonts.
-Tab characters cause the following text to begin after
-the next tab stop. (They occur every 8 ‘characters’).
-Newline characters cause following text to appear on a new line.
-Other control characters are ignored or displayed as a space, box, or
-something else, depending on the operating system and font.
-(Moving the text cursor through such output with arrow keys may exhibit
-some surprising spacing behavior.)</p>
-<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>>>> s = 'a\tb\a<\x02><\r>\bc\nd'
->>> len(s)
-14
->>> s # Display repr(s)
-'a\tb\x07<\x02><\r>\x08c\nd'
->>> print(s, end='') # Display s as is.
-# Result varies by OS and font. Try it.
+<p>A Tk Text widget, and hence IDLE’s Shell, displays characters (codepoints) in
+the BMP (Basic Multilingual Plane) subset of Unicode. Which characters are
+displayed with a proper glyph and which with a replacement box depends on the
+operating system and installed fonts. Tab characters cause the following text
+to begin after the next tab stop. (They occur every 8 ‘characters’). Newline
+characters cause following text to appear on a new line. Other control
+characters are ignored or displayed as a space, box, or something else,
+depending on the operating system and font. (Moving the text cursor through
+such output with arrow keys may exhibit some surprising spacing behavior.)</p>
+<div class="highlight-python3 notranslate"><div class="highlight"><pre><span></span><span class="gp">>>> </span><span class="n">s</span> <span class="o">=</span> <span class="s1">'a</span><span class="se">\t</span><span class="s1">b</span><span class="se">\a</span><span class="s1"><</span><span class="se">\x02</span><span class="s1">><</span><span class="se">\r</span><span class="s1">></span><span class="se">\b</span><span class="s1">c</span><span class="se">\n</span><span class="s1">d'</span> <span class="c1"># Enter 22 chars.</span>
+<span class="gp">>>> </span><span class="nb">len</span><span class="p">(</span><span class="n">s</span><span class="p">)</span>
+<span class="go">14</span>
+<span class="gp">>>> </span><span class="n">s</span> <span class="c1"># Display repr(s)</span>
+<span class="go">'a\tb\x07<\x02><\r>\x08c\nd'</span>
+<span class="gp">>>> </span><span class="nb">print</span><span class="p">(</span><span class="n">s</span><span class="p">,</span> <span class="n">end</span><span class="o">=</span><span class="s1">''</span><span class="p">)</span> <span class="c1"># Display s as is.</span>
+<span class="go"># Result varies by OS and font. Try it.</span>
</pre></div>
</div>
<p>The <code class="docutils literal notranslate"><span class="pre">repr</span></code> function is used for interactive echo of expression
<span class="pre">root</span> <span class="pre">=</span> <span class="pre">tk.Tk()</span></code> in standard Python and nothing appears. Enter the same
in IDLE and a tk window appears. In standard Python, one must also enter
<code class="docutils literal notranslate"><span class="pre">root.update()</span></code> to see the window. IDLE does the equivalent in the
-background, about 20 times a second, which is about every 50 milleseconds.
+background, about 20 times a second, which is about every 50 milliseconds.
Next enter <code class="docutils literal notranslate"><span class="pre">b</span> <span class="pre">=</span> <span class="pre">tk.Button(root,</span> <span class="pre">text='button');</span> <span class="pre">b.pack()</span></code>. Again,
nothing visibly changes in standard Python until one enters <code class="docutils literal notranslate"><span class="pre">root.update()</span></code>.</p>
<p>Most tkinter programs run <code class="docutils literal notranslate"><span class="pre">root.mainloop()</span></code>, which usually does not
<li>
- <a href="../index.html">3.8.0a1 Documentation</a> »
+ <a href="../index.html">3.9.0a0 Documentation</a> »
</li>
<li class="nav-item nav-item-1"><a href="index.html" >The Python Standard Library</a> »</li>
<br />
<br />
- Last updated on Feb 23, 2019.
+ Last updated on Jun 17, 2019.
<a href="https://docs.python.org/3/bugs.html">Found a bug</a>?
<br />
Contents are subject to revision at any time, without notice.
-Help => About IDLE: diplay About Idle dialog
+Help => About IDLE: display About Idle dialog
<to be moved here from help_about.py>
return int(float(index))
lno = index2line(text.index(index))
- if not editwin.context_use_ps1:
+ if not editwin.prompt_last_line:
for context in editwin.num_context_lines:
startat = max(lno - context, 1)
startatindex = repr(startat) + ".0"
The first parameter of X must be 'parent'. When called, the parent
argument will be the root window. X must create a child Toplevel
window (or subclass thereof). The Toplevel may be a test widget or
-dialog, in which case the callable is the corresonding class. Or the
+dialog, in which case the callable is the corresponding class. Or the
Toplevel may contain the widget to be tested or set up a context in
which a test widget is invoked. In this latter case, the callable is a
wrapper function that sets up the Toplevel and other objects. Wrapper
"The default color scheme is in idlelib/config-highlight.def"
}
+CustomRun_spec = {
+ 'file': 'query',
+ 'kwds': {'title': 'Custom Run Args',
+ '_htest': True},
+ 'msg': "Enter with <Return> or [Ok]. Print valid entry to Shell\n"
+ "Arguments are parsed into a list\n"
+ "Close dialog with valid entry, <Escape>, [Cancel], [X]"
+ }
+
ConfigDialog_spec = {
'file': 'configdialog',
'kwds': {'title': 'ConfigDialogTest',
"""Generic mock for messagebox functions, which all have the same signature.
Instead of displaying a message box, the mock's call method saves the
- arguments as instance attributes, which test functions can then examime.
+ arguments as instance attributes, which test functions can then examine.
The test can set the result returned to ask function
"""
def __init__(self, result=None):
-"Test autocomplete, coverage 57%."
+"Test autocomplete, coverage 87%."
import unittest
+from unittest.mock import Mock, patch
from test.support import requires
from tkinter import Tk, Text
+import os
+import __main__
import idlelib.autocomplete as ac
import idlelib.autocomplete_w as acw
self.text = text
self.indentwidth = 8
self.tabwidth = 8
- self.context_use_ps1 = True
+ self.prompt_last_line = '>>>' # Currently not used by autocomplete.
class AutoCompleteTest(unittest.TestCase):
def setUpClass(cls):
requires('gui')
cls.root = Tk()
+ cls.root.withdraw()
cls.text = Text(cls.root)
cls.editor = DummyEditwin(cls.root, cls.text)
@classmethod
def tearDownClass(cls):
del cls.editor, cls.text
+ cls.root.update_idletasks()
cls.root.destroy()
del cls.root
def setUp(self):
- self.editor.text.delete('1.0', 'end')
+ self.text.delete('1.0', 'end')
self.autocomplete = ac.AutoComplete(self.editor)
def test_init(self):
self.assertIsNone(self.autocomplete.autocompletewindow)
def test_force_open_completions_event(self):
- # Test that force_open_completions_event calls _open_completions
+ # Test that force_open_completions_event calls _open_completions.
o_cs = Func()
self.autocomplete.open_completions = o_cs
self.autocomplete.force_open_completions_event('event')
o_c_l = Func()
autocomplete._open_completions_later = o_c_l
- # _open_completions_later should not be called with no text in editor
+ # _open_completions_later should not be called with no text in editor.
trycompletions('event')
Equal(o_c_l.args, None)
- # _open_completions_later should be called with COMPLETE_ATTRIBUTES (1)
+ # _open_completions_later should be called with COMPLETE_ATTRIBUTES (1).
self.text.insert('1.0', 're.')
trycompletions('event')
Equal(o_c_l.args, (False, False, False, 1))
- # _open_completions_later should be called with COMPLETE_FILES (2)
+ # _open_completions_later should be called with COMPLETE_FILES (2).
self.text.delete('1.0', 'end')
self.text.insert('1.0', '"./Lib/')
trycompletions('event')
autocomplete = self.autocomplete
# Test that the autocomplete event is ignored if user is pressing a
- # modifier key in addition to the tab key
+ # modifier key in addition to the tab key.
ev = Event(mc_state=True)
self.assertIsNone(autocomplete.autocomplete_event(ev))
del ev.mc_state
self.assertIsNone(autocomplete.autocomplete_event(ev))
self.text.delete('1.0', 'end')
- # If autocomplete window is open, complete() method is called
+ # If autocomplete window is open, complete() method is called.
self.text.insert('1.0', 're.')
- # This must call autocomplete._make_autocomplete_window()
+ # This must call autocomplete._make_autocomplete_window().
Equal(self.autocomplete.autocomplete_event(ev), 'break')
# If autocomplete window is not active or does not exist,
# open_completions is called. Return depends on its return.
autocomplete._remove_autocomplete_window()
- o_cs = Func() # .result = None
+ o_cs = Func() # .result = None.
autocomplete.open_completions = o_cs
Equal(self.autocomplete.autocomplete_event(ev), None)
Equal(o_cs.args, (False, True, True))
Equal(o_cs.args, (False, True, True))
def test_open_completions_later(self):
- # Test that autocomplete._delayed_completion_id is set
- pass
+ # Test that autocomplete._delayed_completion_id is set.
+ acp = self.autocomplete
+ acp._delayed_completion_id = None
+ acp._open_completions_later(False, False, False, ac.COMPLETE_ATTRIBUTES)
+ cb1 = acp._delayed_completion_id
+ self.assertTrue(cb1.startswith('after'))
+
+ # Test that cb1 is cancelled and cb2 is new.
+ acp._open_completions_later(False, False, False, ac.COMPLETE_FILES)
+ self.assertNotIn(cb1, self.root.tk.call('after', 'info'))
+ cb2 = acp._delayed_completion_id
+ self.assertTrue(cb2.startswith('after') and cb2 != cb1)
+ self.text.after_cancel(cb2)
def test_delayed_open_completions(self):
- # Test that autocomplete._delayed_completion_id set to None and that
- # open_completions only called if insertion index is the same as
- # _delayed_completion_index
- pass
+ # Test that autocomplete._delayed_completion_id set to None
+ # and that open_completions is not called if the index is not
+ # equal to _delayed_completion_index.
+ acp = self.autocomplete
+ acp.open_completions = Func()
+ acp._delayed_completion_id = 'after'
+ acp._delayed_completion_index = self.text.index('insert+1c')
+ acp._delayed_open_completions(1, 2, 3)
+ self.assertIsNone(acp._delayed_completion_id)
+ self.assertEqual(acp.open_completions.called, 0)
+
+ # Test that open_completions is called if indexes match.
+ acp._delayed_completion_index = self.text.index('insert')
+ acp._delayed_open_completions(1, 2, 3, ac.COMPLETE_FILES)
+ self.assertEqual(acp.open_completions.args, (1, 2, 3, 2))
def test_open_completions(self):
# Test completions of files and attributes as well as non-completion
- # of errors
- pass
+ # of errors.
+ self.text.insert('1.0', 'pr')
+ self.assertTrue(self.autocomplete.open_completions(False, True, True))
+ self.text.delete('1.0', 'end')
+
+ # Test files.
+ self.text.insert('1.0', '"t')
+ #self.assertTrue(self.autocomplete.open_completions(False, True, True))
+ self.text.delete('1.0', 'end')
+
+ # Test that blank input fails.
+ self.assertFalse(self.autocomplete.open_completions(False, True, True))
+
+ # Test that a lone string quote fails.
+ self.text.insert('1.0', '"')
+ self.assertFalse(self.autocomplete.open_completions(False, True, True))
+ self.text.delete('1.0', 'end')
def test_fetch_completions(self):
# Test that fetch_completions returns 2 lists:
# For attribute completion, a large list containing all variables, and
# a small list containing non-private variables.
# For file completion, a large list containing all files in the path,
- # and a small list containing files that do not start with '.'
- pass
+ # and a small list containing files that do not start with '.'.
+ autocomplete = self.autocomplete
+ small, large = self.autocomplete.fetch_completions(
+ '', ac.COMPLETE_ATTRIBUTES)
+ if __main__.__file__ != ac.__file__:
+ self.assertNotIn('AutoComplete', small) # See issue 36405.
+
+ # Test attributes
+ s, b = autocomplete.fetch_completions('', ac.COMPLETE_ATTRIBUTES)
+ self.assertLess(len(small), len(large))
+ self.assertFalse(any(x.startswith('_') for x in s))
+ self.assertTrue(any(x.startswith('_') for x in b))
+
+ # Test that the small list respects __all__.
+ with patch.dict('__main__.__dict__', {'__all__': ['a', 'b']}):
+ s, b = autocomplete.fetch_completions('', ac.COMPLETE_ATTRIBUTES)
+ self.assertEqual(s, ['a', 'b'])
+ self.assertIn('__name__', b) # From __main__.__dict__
+ self.assertIn('sum', b) # From __main__.__builtins__.__dict__
+
+ # Test attributes of a named entity.
+ mock = Mock()
+ mock._private = Mock()
+ with patch.dict('__main__.__dict__', {'foo': mock}):
+ s, b = autocomplete.fetch_completions('foo', ac.COMPLETE_ATTRIBUTES)
+ self.assertNotIn('_private', s)
+ self.assertIn('_private', b)
+ self.assertEqual(s, [i for i in sorted(dir(mock)) if i[:1] != '_'])
+ self.assertEqual(b, sorted(dir(mock)))
+
+ # Test files
+ def _listdir(path):
+ # This will be patched in and used by fetch_completions.
+ if path == '.':
+ return ['foo', 'bar', '.hidden']
+ return ['monty', 'python', '.hidden']
+
+ with patch.object(os, 'listdir', _listdir):
+ s, b = autocomplete.fetch_completions('', ac.COMPLETE_FILES)
+ self.assertEqual(s, ['bar', 'foo'])
+ self.assertEqual(b, ['.hidden', 'bar', 'foo'])
+
+ s, b = autocomplete.fetch_completions('~', ac.COMPLETE_FILES)
+ self.assertEqual(s, ['monty', 'python'])
+ self.assertEqual(b, ['.hidden', 'monty', 'python'])
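The fetch_completions tests above assert a consistent split of candidate names into a small (public) and a big (all names) list. As a hedged, standalone sketch of that split — `split_completions` is a hypothetical helper, not the actual idlelib function:

```python
def split_completions(names):
    # Mirror the small/big split the tests above assert: the big list
    # holds every name sorted, the small list drops leading-underscore
    # (private) names.
    big = sorted(set(names))
    small = [name for name in big if not name.startswith('_')]
    return small, big

small, big = split_completions(['foo', '_private', 'bar', '__dunder__'])
# small == ['bar', 'foo']; big keeps the private names as well.
```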
def test_get_entity(self):
# Test that a name is in the namespace of sys.modules and
- # __main__.__dict__
- pass
+ # __main__.__dict__.
+ autocomplete = self.autocomplete
+ Equal = self.assertEqual
+
+ Equal(autocomplete.get_entity('int'), int)
+
+ # Test name from sys.modules.
+ mock = Mock()
+ with patch.dict('sys.modules', {'tempfile': mock}):
+ Equal(autocomplete.get_entity('tempfile'), mock)
+
+ # Test name from __main__.__dict__.
+ di = {'foo': 10, 'bar': 20}
+ with patch.dict('__main__.__dict__', {'d': di}):
+ Equal(autocomplete.get_entity('d'), di)
+
+ # Test name not in namespace.
+ with patch.dict('__main__.__dict__', {}):
+ with self.assertRaises(NameError):
+ autocomplete.get_entity('not_exist')
if __name__ == '__main__':
from tkinter import Text, Tk
-class Dummy_Editwin:
+class DummyEditwin:
# AutoExpand.__init__ only needs .text
def __init__(self, text):
self.text = text
requires('gui')
cls.tk = Tk()
cls.text = Text(cls.tk)
- cls.auto_expand = AutoExpand(Dummy_Editwin(cls.text))
+ cls.auto_expand = AutoExpand(DummyEditwin(cls.text))
cls.auto_expand.bell = lambda: None
# If mock_tk.Text._decode understood indexes 'insert' with suffixed 'linestart',
# Nested tree same as in test_pyclbr.py except for supers on C0. C1.
mb = pyclbr
module, fname = 'test', 'test.py'
-f0 = mb.Function(module, 'f0', fname, 1)
-f1 = mb._nest_function(f0, 'f1', 2)
-f2 = mb._nest_function(f1, 'f2', 3)
-c1 = mb._nest_class(f0, 'c1', 5)
-C0 = mb.Class(module, 'C0', ['base'], fname, 6)
-F1 = mb._nest_function(C0, 'F1', 8)
-C1 = mb._nest_class(C0, 'C1', 11, [''])
-C2 = mb._nest_class(C1, 'C2', 12)
-F3 = mb._nest_function(C2, 'F3', 14)
-mock_pyclbr_tree = {'f0': f0, 'C0': C0}
+C0 = mb.Class(module, 'C0', ['base'], fname, 1)
+F1 = mb._nest_function(C0, 'F1', 3)
+C1 = mb._nest_class(C0, 'C1', 6, [''])
+C2 = mb._nest_class(C1, 'C2', 7)
+F3 = mb._nest_function(C2, 'F3', 9)
+f0 = mb.Function(module, 'f0', fname, 11)
+f1 = mb._nest_function(f0, 'f1', 12)
+f2 = mb._nest_function(f1, 'f2', 13)
+c1 = mb._nest_class(f0, 'c1', 15)
+mock_pyclbr_tree = {'C0': C0, 'f0': f0}
# Adjust C0.name, C1.name so tests do not depend on order.
browser.transform_children(mock_pyclbr_tree, 'test') # C0(base)
transform = browser.transform_children
# Parameter matches tree module.
tcl = list(transform(mock_pyclbr_tree, 'test'))
- eq(tcl, [f0, C0])
- eq(tcl[0].name, 'f0')
- eq(tcl[1].name, 'C0(base)')
+ eq(tcl, [C0, f0])
+ eq(tcl[0].name, 'C0(base)')
+ eq(tcl[1].name, 'f0')
# Check that second call does not change suffix.
tcl = list(transform(mock_pyclbr_tree, 'test'))
- eq(tcl[1].name, 'C0(base)')
+ eq(tcl[0].name, 'C0(base)')
# Nothing to traverse if parameter name isn't same as tree module.
tcl = list(transform(mock_pyclbr_tree, 'different name'))
eq(tcl, [])
import unittest
import textwrap
import types
-
-default_tip = calltip._default_callable_argspec
+import re
# Test Class TC is used in multiple get_argspec test methods
t6.tip = "(no, self)"
def __call__(self, ci): 'doc'
__call__.tip = "(self, ci)"
+ def nd(self): pass # No doc.
# attaching .tip to wrapped methods does not work
@classmethod
def cm(cls, a): 'doc'
tc = TC()
-signature = calltip.get_argspec # 2.7 and 3.x use different functions
+default_tip = calltip._default_callable_argspec
+get_spec = calltip.get_argspec
-class Get_signatureTest(unittest.TestCase):
- # The signature function must return a string, even if blank.
+class Get_argspecTest(unittest.TestCase):
+ # The get_spec function must return a string, even if blank.
# Test a variety of objects to be sure that none cause it to raise
# (quite aside from getting as correct an answer as possible).
# The tests of builtins may break if inspect or the docstrings change,
def test_builtins(self):
+ def tiptest(obj, out):
+ self.assertEqual(get_spec(obj), out)
+
# Python class that inherits builtin methods
class List(list): "List() doc"
# Simulate builtin with no docstring for default tip test
class SB: __call__ = None
- def gtest(obj, out):
- self.assertEqual(signature(obj), out)
-
if List.__doc__ is not None:
- gtest(List, '(iterable=(), /)' + calltip._argument_positional
- + '\n' + List.__doc__)
- gtest(list.__new__,
+ tiptest(List,
+ f'(iterable=(), /){calltip._argument_positional}'
+ f'\n{List.__doc__}')
+ tiptest(list.__new__,
'(*args, **kwargs)\n'
'Create and return a new object. '
'See help(type) for accurate signature.')
- gtest(list.__init__,
+ tiptest(list.__init__,
'(self, /, *args, **kwargs)'
+ calltip._argument_positional + '\n' +
'Initialize self. See help(type(self)) for accurate signature.')
append_doc = (calltip._argument_positional
+ "\nAppend object to the end of the list.")
- gtest(list.append, '(self, object, /)' + append_doc)
- gtest(List.append, '(self, object, /)' + append_doc)
- gtest([].append, '(object, /)' + append_doc)
+ tiptest(list.append, '(self, object, /)' + append_doc)
+ tiptest(List.append, '(self, object, /)' + append_doc)
+ tiptest([].append, '(object, /)' + append_doc)
+
+ tiptest(types.MethodType, "method(function, instance)")
+ tiptest(SB(), default_tip)
- gtest(types.MethodType, "method(function, instance)")
- gtest(SB(), default_tip)
- import re
p = re.compile('')
- gtest(re.sub, '''\
+ tiptest(re.sub, '''\
(pattern, repl, string, count=0, flags=0)
Return the string obtained by replacing the leftmost
non-overlapping occurrences of the pattern in string by the
replacement repl. repl can be either a string or a callable;
if a string, backslash escapes in it are processed. If it is
a callable, it's passed the Match object and must return''')
- gtest(p.sub, '''\
+ tiptest(p.sub, '''\
(repl, string, count=0)
Return the string obtained by replacing the leftmost \
non-overlapping occurrences o...''')
def test_signature_wrap(self):
if textwrap.TextWrapper.__doc__ is not None:
- self.assertEqual(signature(textwrap.TextWrapper), '''\
+ self.assertEqual(get_spec(textwrap.TextWrapper), '''\
(width=70, initial_indent='', subsequent_indent='', expand_tabs=True,
replace_whitespace=True, fix_sentence_endings=False, break_long_words=True,
drop_whitespace=True, break_on_hyphens=True, tabsize=8, *, max_lines=None,
placeholder=' [...]')''')
def test_properly_formated(self):
+
def foo(s='a'*100):
pass
indent = calltip._INDENT
- str_foo = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\
- "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\
- "aaaaaaaaaa')"
- str_bar = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\
- "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\
- "aaaaaaaaaa')\nHello Guido"
- str_baz = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\
- "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\
- "aaaaaaaaaa', z='bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb"\
- "bbbbbbbbbbbbbbbbb\n" + indent + "bbbbbbbbbbbbbbbbbbbbbb"\
- "bbbbbbbbbbbbbbbbbbbbbb')"
-
- self.assertEqual(calltip.get_argspec(foo), str_foo)
- self.assertEqual(calltip.get_argspec(bar), str_bar)
- self.assertEqual(calltip.get_argspec(baz), str_baz)
+ sfoo = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\
+ "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\
+ "aaaaaaaaaa')"
+ sbar = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\
+ "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\
+ "aaaaaaaaaa')\nHello Guido"
+ sbaz = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\
+ "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\
+ "aaaaaaaaaa', z='bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb"\
+ "bbbbbbbbbbbbbbbbb\n" + indent + "bbbbbbbbbbbbbbbbbbbbbb"\
+ "bbbbbbbbbbbbbbbbbbbbbb')"
+
+ for func, doc in [(foo, sfoo), (bar, sbar), (baz, sbaz)]:
+ with self.subTest(func=func, doc=doc):
+ self.assertEqual(get_spec(func), doc)
def test_docline_truncation(self):
def f(): pass
f.__doc__ = 'a'*300
- self.assertEqual(signature(f), '()\n' + 'a' * (calltip._MAX_COLS-3) + '...')
+ self.assertEqual(get_spec(f), f"()\n{'a'*(calltip._MAX_COLS-3) + '...'}")
def test_multiline_docstring(self):
# Test fewer lines than max.
- self.assertEqual(signature(range),
+ self.assertEqual(get_spec(range),
"range(stop) -> range object\n"
"range(start, stop[, step]) -> range object")
# Test max lines
- self.assertEqual(signature(bytes), '''\
+ self.assertEqual(get_spec(bytes), '''\
bytes(iterable_of_ints) -> bytes
bytes(string, encoding[, errors]) -> bytes
bytes(bytes_or_buffer) -> immutable copy of bytes_or_buffer
# Test more than max lines
def f(): pass
f.__doc__ = 'a\n' * 15
- self.assertEqual(signature(f), '()' + '\na' * calltip._MAX_LINES)
+ self.assertEqual(get_spec(f), '()' + '\na' * calltip._MAX_LINES)
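The truncation tests above check two limits on the docstring portion of a calltip: overlong first lines are cut to `_MAX_COLS` characters ending in `'...'`, and at most `_MAX_LINES` lines are kept. A minimal sketch of that trimming, assuming the limits are 85 and 5 (`truncate_doc` is a hypothetical helper, not the calltip function itself):

```python
MAX_COLS, MAX_LINES = 85, 5  # assumed values of calltip._MAX_COLS/_MAX_LINES

def truncate_doc(doc):
    # Keep at most MAX_LINES lines; shorten any overlong line to
    # MAX_COLS characters, the last three being '...'.
    lines = doc.split('\n')[:MAX_LINES]
    return '\n'.join(
        line if len(line) <= MAX_COLS else line[:MAX_COLS - 3] + '...'
        for line in lines)
```

Under these assumptions, `truncate_doc('a' * 300)` yields 82 `a`s followed by `'...'`, matching the expectation in `test_docline_truncation`.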
def test_functions(self):
def t1(): 'doc'
doc = '\ndoc' if t1.__doc__ is not None else ''
for func in (t1, t2, t3, t4, t5, TC):
- self.assertEqual(signature(func), func.tip + doc)
+ with self.subTest(func=func):
+ self.assertEqual(get_spec(func), func.tip + doc)
def test_methods(self):
doc = '\ndoc' if TC.__doc__ is not None else ''
for meth in (TC.t1, TC.t2, TC.t3, TC.t4, TC.t5, TC.t6, TC.__call__):
- self.assertEqual(signature(meth), meth.tip + doc)
- self.assertEqual(signature(TC.cm), "(a)" + doc)
- self.assertEqual(signature(TC.sm), "(b)" + doc)
+ with self.subTest(meth=meth):
+ self.assertEqual(get_spec(meth), meth.tip + doc)
+ self.assertEqual(get_spec(TC.cm), "(a)" + doc)
+ self.assertEqual(get_spec(TC.sm), "(b)" + doc)
def test_bound_methods(self):
# test that first parameter is correctly removed from argspec
for meth, mtip in ((tc.t1, "()"), (tc.t4, "(*args)"),
(tc.t6, "(self)"), (tc.__call__, '(ci)'),
(tc, '(ci)'), (TC.cm, "(a)"),):
- self.assertEqual(signature(meth), mtip + doc)
+ with self.subTest(meth=meth, mtip=mtip):
+ self.assertEqual(get_spec(meth), mtip + doc)
def test_starred_parameter(self):
# test that starred first parameter is *not* removed from argspec
def m1(*args): pass
c = C()
for meth, mtip in ((C.m1, '(*args)'), (c.m1, "(*args)"),):
- self.assertEqual(signature(meth), mtip)
+ with self.subTest(meth=meth, mtip=mtip):
+ self.assertEqual(get_spec(meth), mtip)
- def test_invalid_method_signature(self):
+ def test_invalid_method_get_spec(self):
class C:
def m2(**kwargs): pass
class Test:
def __call__(*, a): pass
mtip = calltip._invalid_method
- self.assertEqual(signature(C().m2), mtip)
- self.assertEqual(signature(Test()), mtip)
+ self.assertEqual(get_spec(C().m2), mtip)
+ self.assertEqual(get_spec(Test()), mtip)
def test_non_ascii_name(self):
# test that re works to delete a first parameter name that
assert calltip._first_param.sub('', uni) == '(a)'
def test_no_docstring(self):
- def nd(s):
- pass
- TC.nd = nd
- self.assertEqual(signature(nd), "(s)")
- self.assertEqual(signature(TC.nd), "(s)")
- self.assertEqual(signature(tc.nd), "()")
+ for meth, mtip in ((TC.nd, "(self)"), (tc.nd, "()")):
+ with self.subTest(meth=meth, mtip=mtip):
+ self.assertEqual(get_spec(meth), mtip)
def test_attribute_exception(self):
class NoCall:
for meth, mtip in ((NoCall, default_tip), (CallA, default_tip),
(NoCall(), ''), (CallA(), '(a, b, c)'),
(CallB(), '(ci)')):
- self.assertEqual(signature(meth), mtip)
+ with self.subTest(meth=meth, mtip=mtip):
+ self.assertEqual(get_spec(meth), mtip)
def test_non_callables(self):
for obj in (0, 0.0, '0', b'0', [], {}):
- self.assertEqual(signature(obj), '')
+ with self.subTest(obj=obj):
+ self.assertEqual(get_spec(obj), '')
class Get_entityTest(unittest.TestCase):
eq = self.assertEqual
eq(conf.GetHighlight('IDLE Classic', 'normal'), {'foreground': '#000000',
'background': '#ffffff'})
- eq(conf.GetHighlight('IDLE Classic', 'normal', 'fg'), '#000000')
- eq(conf.GetHighlight('IDLE Classic', 'normal', 'bg'), '#ffffff')
- with self.assertRaises(config.InvalidFgBg):
- conf.GetHighlight('IDLE Classic', 'normal', 'fb')
# Test cursor (this background should be normal-background)
eq(conf.GetHighlight('IDLE Classic', 'cursor'), {'foreground': 'black',
def test_get_keyset(self):
conf = self.mock_config()
- # Conflic with key set, should be disable to ''
+ # Conflict with key set; should be disabled to ''.
conf.defaultCfg['extensions'].add_section('Foobar')
conf.defaultCfg['extensions'].add_section('Foobar_cfgBindings')
conf.defaultCfg['extensions'].set('Foobar', 'enable', 'True')
def test_paint_theme_sample(self):
eq = self.assertEqual
- d = self.page
- del d.paint_theme_sample
- hs_tag = d.highlight_sample.tag_cget
+ page = self.page
+ del page.paint_theme_sample # Delete masking mock.
+ hs_tag = page.highlight_sample.tag_cget
gh = idleConf.GetHighlight
- fg = 'foreground'
- bg = 'background'
# Create custom theme based on IDLE Dark.
- d.theme_source.set(True)
- d.builtin_name.set('IDLE Dark')
+ page.theme_source.set(True)
+ page.builtin_name.set('IDLE Dark')
theme = 'IDLE Test'
- d.create_new(theme)
- d.set_color_sample.called = 0
+ page.create_new(theme)
+ page.set_color_sample.called = 0
# Base theme with nothing in `changes`.
- d.paint_theme_sample()
- eq(hs_tag('break', fg), gh(theme, 'break', fgBg='fg'))
- eq(hs_tag('cursor', bg), gh(theme, 'normal', fgBg='bg'))
- self.assertNotEqual(hs_tag('console', fg), 'blue')
- self.assertNotEqual(hs_tag('console', bg), 'yellow')
- eq(d.set_color_sample.called, 1)
+ page.paint_theme_sample()
+ new_console = {'foreground': 'blue',
+ 'background': 'yellow',}
+ for key, value in new_console.items():
+ self.assertNotEqual(hs_tag('console', key), value)
+ eq(page.set_color_sample.called, 1)
# Apply changes.
- changes.add_option('highlight', theme, 'console-foreground', 'blue')
- changes.add_option('highlight', theme, 'console-background', 'yellow')
- d.paint_theme_sample()
-
- eq(hs_tag('break', fg), gh(theme, 'break', fgBg='fg'))
- eq(hs_tag('cursor', bg), gh(theme, 'normal', fgBg='bg'))
- eq(hs_tag('console', fg), 'blue')
- eq(hs_tag('console', bg), 'yellow')
- eq(d.set_color_sample.called, 2)
+ for key, value in new_console.items():
+ changes.add_option('highlight', theme, 'console-'+key, value)
+ page.paint_theme_sample()
+ for key, value in new_console.items():
+ eq(hs_tag('console', key), value)
+ eq(page.set_color_sample.called, 2)
- d.paint_theme_sample = Func()
+ page.paint_theme_sample = Func()
def test_delete_custom(self):
eq = self.assertEqual
Otherwise, tests are mostly independent.
Currently only test grep_it, coverage 51%.
"""
-from idlelib.grep import GrepDialog
+from idlelib import grep
import unittest
from test.support import captured_stdout
from idlelib.idle_test.mock_tk import Var
+import os
import re
class Dummy_grep:
# Methods tested
#default_command = GrepDialog.default_command
- grep_it = GrepDialog.grep_it
- findfiles = GrepDialog.findfiles
+ grep_it = grep.GrepDialog.grep_it
# Other stuff needed
recvar = Var(False)
engine = searchengine
def close(self): # gui method
pass
-grep = Dummy_grep()
+_grep = Dummy_grep()
class FindfilesTest(unittest.TestCase):
- # findfiles is really a function, not a method, could be iterator
- # test that filename return filename
- # test that idlelib has many .py files
- # test that recursive flag adds idle_test .py files
- pass
+
+ @classmethod
+ def setUpClass(cls):
+ cls.realpath = os.path.realpath(__file__)
+ cls.path = os.path.dirname(cls.realpath)
+
+ @classmethod
+ def tearDownClass(cls):
+ del cls.realpath, cls.path
+
+ def test_invaliddir(self):
+ with captured_stdout() as s:
+ filelist = list(grep.findfiles('invaliddir', '*.*', False))
+ self.assertEqual(filelist, [])
+ self.assertIn('invalid', s.getvalue())
+
+ def test_curdir(self):
+ # Test os.curdir.
+ ff = grep.findfiles
+ save_cwd = os.getcwd()
+ os.chdir(self.path)
+ filename = 'test_grep.py'
+ filelist = list(ff(os.curdir, filename, False))
+ self.assertIn(os.path.join(os.curdir, filename), filelist)
+ os.chdir(save_cwd)
+
+ def test_base(self):
+ ff = grep.findfiles
+ readme = os.path.join(self.path, 'README.txt')
+
+ # Check for Python files in path where this file lives.
+ filelist = list(ff(self.path, '*.py', False))
+ # This directory has many Python files.
+ self.assertGreater(len(filelist), 10)
+ self.assertIn(self.realpath, filelist)
+ self.assertNotIn(readme, filelist)
+
+ # Look for .txt files in path where this file lives.
+ filelist = list(ff(self.path, '*.txt', False))
+ self.assertNotEqual(len(filelist), 0)
+ self.assertNotIn(self.realpath, filelist)
+ self.assertIn(readme, filelist)
+
+ # Look for non-matching pattern.
+ filelist = list(ff(self.path, 'grep.*', False))
+ self.assertEqual(len(filelist), 0)
+ self.assertNotIn(self.realpath, filelist)
+
+ def test_recurse(self):
+ ff = grep.findfiles
+ parent = os.path.dirname(self.path)
+ grepfile = os.path.join(parent, 'grep.py')
+ pat = '*.py'
+
+ # Get Python files only in parent directory.
+ filelist = list(ff(parent, pat, False))
+ parent_size = len(filelist)
+ # Lots of Python files in idlelib.
+ self.assertGreater(parent_size, 20)
+ self.assertIn(grepfile, filelist)
+ # Without subdirectories, this file isn't returned.
+ self.assertNotIn(self.realpath, filelist)
+
+ # Include subdirectories.
+ filelist = list(ff(parent, pat, True))
+ # More files found now.
+ self.assertGreater(len(filelist), parent_size)
+ self.assertIn(grepfile, filelist)
+ # This file exists in list now.
+ self.assertIn(self.realpath, filelist)
+
+ # Check another level up the tree.
+ parent = os.path.dirname(parent)
+ filelist = list(ff(parent, '*.py', True))
+ self.assertIn(self.realpath, filelist)
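The new tests above treat `grep.findfiles` as a generator yielding paths whose basename matches a glob pattern, descending into subdirectories only when the recursive flag is set. A hedged sketch of such a generator, built on `os.walk` and `fnmatch` (an illustration of the behavior the tests rely on, not the actual idlelib implementation):

```python
import fnmatch
import os

def findfiles(folder, pattern, recursive):
    # Yield paths under folder whose basename matches the glob pattern.
    # Without the recursive flag, stop after the top-level directory.
    for dirpath, subdirs, filenames in os.walk(os.path.expanduser(folder)):
        for name in fnmatch.filter(sorted(filenames), pattern):
            yield os.path.join(dirpath, name)
        if not recursive:
            break
```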
class Grep_itTest(unittest.TestCase):
# from incomplete replacement, so 'later'.
def report(self, pat):
- grep.engine._pat = pat
+ _grep.engine._pat = pat
with captured_stdout() as s:
- grep.grep_it(re.compile(pat), __file__)
+ _grep.grep_it(re.compile(pat), __file__)
lines = s.getvalue().split('\n')
lines.pop() # remove bogus '' after last \n
return lines
self.text = text
self.indentwidth = 8
self.tabwidth = 8
- self.context_use_ps1 = True
+ self.prompt_last_line = '>>>'
self.num_context_lines = 50, 500, 1000
_build_char_in_string_func = EditorWindow._build_char_in_string_func
def tearDown(self):
self.text.delete('1.0', 'end')
- self.editwin.context_use_ps1 = True
+ self.editwin.prompt_last_line = '>>>'
def get_parser(self, index):
"""
self.assertIn('precedes', str(ve.exception))
# test without ps1
- self.editwin.context_use_ps1 = False
+ self.editwin.prompt_last_line = ''
# number of lines lesser than 50
p = self.get_parser('end')
self.text = text
self.indentwidth = 8
self.tabwidth = 8
- self.context_use_ps1 = True
+ self.prompt_last_line = '>>>' # Currently not used by parenmatch.
class ParenMatchTest(unittest.TestCase):
TestInfo('\n def function1(self, a,\n', [0, 1, 2], BRACKET),
TestInfo('())\n', [0, 1], NONE), # Extra closer.
TestInfo(')(\n', [0, 1], BRACKET), # Extra closer.
- # For the mismatched example, it doesn't look like contination.
+ # For the mismatched example, it doesn't look like continuation.
TestInfo('{)(]\n', [0, 1], NONE), # Mismatched.
)
-"""Test query, coverage 91%).
+"""Test query, coverage 93%.
Non-gui tests for Query, SectionName, ModuleName, and HelpSource use
dummy versions that extract the non-gui methods and add other needed
ok = query.Query.ok
cancel = query.Query.cancel
# Add attributes and initialization needed for tests.
- entry = Var()
- entry_error = {}
def __init__(self, dummy_entry):
- self.entry.set(dummy_entry)
- self.entry_error['text'] = ''
+ self.entry = Var(value=dummy_entry)
+ self.entry_error = {'text': ''}
self.result = None
self.destroyed = False
def showerror(self, message):
class Dummy_SectionName:
entry_ok = query.SectionName.entry_ok # Function being tested.
used_names = ['used']
- entry = Var()
- entry_error = {}
def __init__(self, dummy_entry):
- self.entry.set(dummy_entry)
- self.entry_error['text'] = ''
+ self.entry = Var(value=dummy_entry)
+ self.entry_error = {'text': ''}
def showerror(self, message):
self.entry_error['text'] = message
class Dummy_ModuleName:
entry_ok = query.ModuleName.entry_ok # Function being tested.
text0 = ''
- entry = Var()
- entry_error = {}
def __init__(self, dummy_entry):
- self.entry.set(dummy_entry)
- self.entry_error['text'] = ''
+ self.entry = Var(value=dummy_entry)
+ self.entry_error = {'text': ''}
def showerror(self, message):
self.entry_error['text'] = message
self.assertEqual(dialog.entry_error['text'], '')
-# 3 HelpSource test classes each test one function.
-
-orig_platform = query.platform
+# 3 HelpSource test classes each test one method.
class HelpsourceBrowsefileTest(unittest.TestCase):
"Test browse_file method of ModuleName subclass of Query."
class Dummy_HelpSource:
path_ok = query.HelpSource.path_ok
- path = Var()
- path_error = {}
def __init__(self, dummy_path):
- self.path.set(dummy_path)
- self.path_error['text'] = ''
+ self.path = Var(value=dummy_path)
+ self.path_error = {'text': ''}
def showerror(self, message, widget=None):
self.path_error['text'] = message
+ orig_platform = query.platform # Set in test_path_ok_file.
@classmethod
def tearDownClass(cls):
- query.platform = orig_platform
+ query.platform = cls.orig_platform
def test_path_ok_blank(self):
dialog = self.Dummy_HelpSource(' ')
self.assertEqual(dialog.entry_ok(), result)
+# 2 CustomRun test classes each test one method.
+
+class CustomRunCLIargsokTest(unittest.TestCase):
+ "Test cli_args_ok method of the CustomRun subclass of Query."
+
+ class Dummy_CustomRun:
+ cli_args_ok = query.CustomRun.cli_args_ok
+ def __init__(self, dummy_entry):
+ self.entry = Var(value=dummy_entry)
+ self.entry_error = {'text': ''}
+ def showerror(self, message):
+ self.entry_error['text'] = message
+
+ def test_blank_args(self):
+ dialog = self.Dummy_CustomRun(' ')
+ self.assertEqual(dialog.cli_args_ok(), [])
+
+ def test_invalid_args(self):
+ dialog = self.Dummy_CustomRun("'no-closing-quote")
+ self.assertEqual(dialog.cli_args_ok(), None)
+ self.assertIn('No closing', dialog.entry_error['text'])
+
+ def test_good_args(self):
+ args = ['-n', '10', '--verbose', '-p', '/path', '--name']
+ dialog = self.Dummy_CustomRun(' '.join(args) + ' "my name"')
+ self.assertEqual(dialog.cli_args_ok(), args + ["my name"])
+ self.assertEqual(dialog.entry_error['text'], '')
+
+
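The `cli_args_ok` tests above expect shell-style parsing: blank input yields an empty list, an unbalanced quote is rejected with a "No closing" message, and quoted strings stay together as one argument. The standard library's `shlex.split` exhibits exactly these behaviors; a minimal sketch of such a validator (`parse_cli_args` is a hypothetical stand-in, not the dialog method):

```python
import shlex

def parse_cli_args(entry):
    # Split the entry like a shell command line. shlex.split returns []
    # for blank input and raises ValueError ("No closing quotation")
    # on an unbalanced quote, which we report as invalid input (None).
    try:
        return shlex.split(entry.strip())
    except ValueError as err:
        print(f"Invalid arguments: {err}")
        return None
```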
+class CustomRunEntryokTest(unittest.TestCase):
+ "Test entry_ok method of the CustomRun subclass of Query."
+
+ class Dummy_CustomRun:
+ entry_ok = query.CustomRun.entry_ok
+ entry_error = {}
+ restartvar = Var()
+ def cli_args_ok(self):
+ return self.cli_args
+
+ def test_entry_ok_customrun(self):
+ dialog = self.Dummy_CustomRun()
+ for restart in {True, False}:
+ dialog.restartvar.set(restart)
+ for cli_args, result in ((None, None),
+ (['my arg'], (['my arg'], restart))):
+ with self.subTest(restart=restart, cli_args=cli_args):
+ dialog.cli_args = cli_args
+ self.assertEqual(dialog.entry_ok(), result)
+
+
# GUI TESTS
class QueryGuiTest(unittest.TestCase):
dialog.entry.insert(0, 'okay')
dialog.button_ok.invoke()
self.assertEqual(dialog.result, 'okay')
- del dialog
root.destroy()
- del root
class ModulenameGuiTest(unittest.TestCase):
self.assertEqual(dialog.entry.get(), 'idlelib')
dialog.button_ok.invoke()
self.assertTrue(dialog.result.endswith('__init__.py'))
- del dialog
root.destroy()
- del root
class HelpsourceGuiTest(unittest.TestCase):
dialog.button_ok.invoke()
prefix = "file://" if sys.platform == 'darwin' else ''
Equal(dialog.result, ('__test__', prefix + __file__))
- del dialog
root.destroy()
- del root
+
+
+class CustomRunGuiTest(unittest.TestCase):
+
+ @classmethod
+ def setUpClass(cls):
+ requires('gui')
+
+ def test_click_args(self):
+ root = Tk()
+ root.withdraw()
+ dialog = query.CustomRun(root, 'Title', _utest=True)
+ dialog.entry.insert(0, 'okay')
+ dialog.button_ok.invoke()
+ self.assertEqual(dialog.result, (['okay'], True))
+ root.destroy()
if __name__ == '__main__':
import unittest
from test.support import requires
-from tkinter import Tk
+from tkinter import Text, Tk, Toplevel
from tkinter.ttk import Frame
from idlelib import searchengine as se
from idlelib import searchbase as sdb
@classmethod
def tearDownClass(cls):
+ cls.root.update_idletasks()
cls.root.destroy()
del cls.root
# open calls create_widgets, which needs default_command
self.dialog.default_command = None
- # Since text parameter of .open is not used in base class,
- # pass dummy 'text' instead of tk.Text().
- self.dialog.open('text')
+ toplevel = Toplevel(self.root)
+ text = Text(toplevel)
+ self.dialog.open(text)
self.assertEqual(self.dialog.top.state(), 'normal')
self.dialog.close()
self.assertEqual(self.dialog.top.state(), 'withdrawn')
- self.dialog.open('text', searchphrase="hello")
+ self.dialog.open(text, searchphrase="hello")
self.assertEqual(self.dialog.ent.get(), 'hello')
- self.dialog.close()
+ toplevel.update_idletasks()
+ toplevel.destroy()
def test_create_widgets(self):
self.dialog.create_entries = Func()
# Look for close button command in buttonframe
closebuttoncommand = ''
for child in self.dialog.buttonframe.winfo_children():
- if child['text'] == 'close':
+ if child['text'] == 'Close':
closebuttoncommand = child['command']
self.assertIn('close', closebuttoncommand)
try:
with open(filename, "wb") as f:
f.write(chars)
+ f.flush()
+ os.fsync(f.fileno())
return True
except OSError as msg:
tkMessageBox.showerror("I/O Error", str(msg),
('Python Shell', '<<open-python-shell>>'),
('C_heck Module', '<<check-module>>'),
('R_un Module', '<<run-module>>'),
+ ('Run... _Customized', '<<run-custom>>'),
]),
('shell', [
#! /usr/bin/env python3
import sys
+if __name__ == "__main__":
+ sys.modules['idlelib.pyshell'] = sys.modules['__main__']
try:
from tkinter import *
# run from the IDLE source directory.
del_exitf = idleConf.GetOption('main', 'General', 'delete-exitfunc',
default=False, type='bool')
- if __name__ == 'idlelib.pyshell':
- command = "__import__('idlelib.run').run.main(%r)" % (del_exitf,)
- else:
- command = "__import__('run').main(%r)" % (del_exitf,)
+ command = "__import__('idlelib.run').run.main(%r)" % (del_exitf,)
return [sys.executable] + w + ["-c", command, str(self.port)]
def start_subprocess(self):
def display_no_subprocess_error(self):
tkMessageBox.showerror(
- "Subprocess Startup Error",
- "IDLE's subprocess didn't make connection. Either IDLE can't "
- "start a subprocess or personal firewall software is blocking "
- "the connection.",
+ "Subprocess Connection Error",
+ "IDLE's subprocess didn't make connection.\n"
+ "See the 'Startup failure' section of the IDLE doc, online at\n"
+ "https://docs.python.org/3/library/idle.html#startup-failure",
parent=self.tkconsole.text)
def display_executing_dialog(self):
self.usetabs = True
# indentwidth must be 8 when using tabs. See note in EditorWindow:
self.indentwidth = 8
- self.context_use_ps1 = True
+
self.sys_ps1 = sys.ps1 if hasattr(sys, 'ps1') else '>>> '
self.prompt_last_line = self.sys_ps1.split('\n')[-1]
self.prompt = self.sys_ps1 # Changes when debug active
capture_warnings(False)
if __name__ == "__main__":
- sys.modules['pyshell'] = sys.modules['__main__']
main()
capture_warnings(False) # Make sure turned off; see issue 18081
import importlib
import os
+import shlex
from sys import executable, platform # Platform is set for one test.
-from tkinter import Toplevel, StringVar, W, E, S
-from tkinter.ttk import Frame, Button, Entry, Label
+from tkinter import Toplevel, StringVar, BooleanVar, W, E, S
+from tkinter.ttk import Frame, Button, Entry, Label, Checkbutton
from tkinter import filedialog
from tkinter.font import Font
self.deiconify() # Unhide now that geometry set.
self.wait_window()
- def create_widgets(self): # Call from override, if any.
+ def create_widgets(self, ok_text='OK'): # Call from override, if any.
# Bind to self widgets needed for entry_ok or unittest.
self.frame = frame = Frame(self, padding=10)
frame.grid(column=0, row=0, sticky='news')
self.entry_error = Label(frame, text=' ', foreground='red',
font=self.error_font)
self.button_ok = Button(
- frame, text='OK', default='active', command=self.ok)
+ frame, text=ok_text, default='active', command=self.ok)
self.button_cancel = Button(
frame, text='Cancel', command=self.cancel)
path = self.path_ok()
return None if name is None or path is None else (name, path)
+class CustomRun(Query):
+ """Get settings for custom run of module.
+
+ 1. Command line arguments to extend sys.argv.
+ 2. Whether to restart Shell or not.
+ """
+ # Used in runscript.run_custom_event
+
+ def __init__(self, parent, title, *, cli_args='',
+ _htest=False, _utest=False):
+ # TODO Use cli_args to pre-populate entry.
+ message = 'Command Line Arguments for sys.argv:'
+ super().__init__(
+ parent, title, message, text0=cli_args,
+ _htest=_htest, _utest=_utest)
+
+ def create_widgets(self):
+ super().create_widgets(ok_text='Run')
+ frame = self.frame
+ self.restartvar = BooleanVar(self, value=True)
+ restart = Checkbutton(frame, variable=self.restartvar, onvalue=True,
+ offvalue=False, text='Restart shell')
+ self.args_error = Label(frame, text=' ', foreground='red',
+ font=self.error_font)
+
+ restart.grid(column=0, row=4, columnspan=3, padx=5, sticky='w')
+ self.args_error.grid(column=0, row=12, columnspan=3, padx=5,
+ sticky='we')
+
+ def cli_args_ok(self):
+ "Validity check and parsing for command line arguments."
+ cli_string = self.entry.get().strip()
+ try:
+ cli_args = shlex.split(cli_string, posix=True)
+ except ValueError as err:
+ self.showerror(str(err))
+ return None
+ return cli_args
+
+ def entry_ok(self):
+ "Return apparently valid (cli_args, restart) or None"
+ self.entry_error['text'] = ''
+ cli_args = self.cli_args_ok()
+ restart = self.restartvar.get()
+ return None if cli_args is None else (cli_args, restart)
+
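The `cli_args_ok` method above leans on `shlex.split` with POSIX rules. A quick sketch of the behavior it depends on: quoted tokens are grouped, and an unbalanced quote raises the `ValueError` ("No closing quotation") that `cli_args_ok` converts into a `showerror` message and a `None` return.

```python
import shlex

# Quoted arguments are grouped; whitespace separates tokens.
args = shlex.split('-n 10 --name "my name"', posix=True)
print(args)  # ['-n', '10', '--name', 'my name']

# An unbalanced quote raises ValueError, which cli_args_ok turns
# into an error message in the dialog and a None return value.
try:
    shlex.split("'no-closing-quote", posix=True)
except ValueError as err:
    print(err)  # No closing quotation
```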
if __name__ == '__main__':
from unittest import main
main('idlelib.idle_test.test_query', verbosity=2, exit=False)
from idlelib.idle_test.htest import run
- run(Query, HelpSource)
+ run(Query, HelpSource, CustomRun)
"""Replace dialog for IDLE. Inherits SearchDialogBase for GUI.
-Uses idlelib.SearchEngine for search capability.
+Uses idlelib.searchengine.SearchEngine for search capability.
Defines various replace related functions like replace, replace all,
-replace+find.
+and replace+find.
"""
import re
from idlelib.searchbase import SearchDialogBase
from idlelib import searchengine
+
def replace(text):
- """Returns a singleton ReplaceDialog instance.The single dialog
- saves user entries and preferences across instances."""
+ """Create or reuse a singleton ReplaceDialog instance.
+
+ The singleton dialog saves user entries and preferences
+ across instances.
+
+ Args:
+ text: Text widget containing the text to be searched.
+ """
root = text._root()
engine = searchengine.get(root)
if not hasattr(engine, "_replacedialog"):
class ReplaceDialog(SearchDialogBase):
+ "Dialog for finding and replacing a pattern in text."
title = "Replace Dialog"
icon = "Replace"
def __init__(self, root, engine):
- SearchDialogBase.__init__(self, root, engine)
+ """Create search dialog for finding and replacing text.
+
+ Uses SearchDialogBase as the basis for the GUI and a
+ searchengine instance to prepare the search.
+
+ Attributes:
+ replvar: StringVar containing 'Replace with:' value.
+ replent: Entry widget for replvar. Created in
+ create_entries().
+ ok: Boolean used in searchengine.search_text to indicate
+ whether the search includes the selection.
+ """
+ super().__init__(root, engine)
self.replvar = StringVar(root)
def open(self, text):
- """Display the replace dialog"""
+ """Make dialog visible on top of others and ready to use.
+
+ Also, highlight the currently selected text and set the
+ search to include the current selection (self.ok).
+
+ Args:
+ text: Text widget being searched.
+ """
SearchDialogBase.open(self, text)
try:
first = text.index("sel.first")
first = first or text.index("insert")
last = last or first
self.show_hit(first, last)
- self.ok = 1
+ self.ok = True
def create_entries(self):
- """Create label and text entry widgets"""
+ "Create base and additional label and text entry widgets."
SearchDialogBase.create_entries(self)
self.replent = self.make_entry("Replace with:", self.replvar)[0]
def create_command_buttons(self):
+ """Create base and additional command buttons.
+
+ The additional buttons are for Find, Replace,
+ Replace+Find, and Replace All.
+ """
SearchDialogBase.create_command_buttons(self)
self.make_button("Find", self.find_it)
self.make_button("Replace", self.replace_it)
- self.make_button("Replace+Find", self.default_command, 1)
+ self.make_button("Replace+Find", self.default_command, isdef=True)
self.make_button("Replace All", self.replace_all)
def find_it(self, event=None):
- self.do_find(0)
+ "Handle the Find button."
+ self.do_find(False)
def replace_it(self, event=None):
+ """Handle the Replace button.
+
+ If the find is successful, then perform replace.
+ """
if self.do_find(self.ok):
self.do_replace()
def default_command(self, event=None):
- "Replace and find next."
+ """Handle the Replace+Find button as the default command.
+
+ First performs a replace and then, if the replace was
+ successful, a find next.
+ """
if self.do_find(self.ok):
if self.do_replace(): # Only find next match if replace succeeded.
# A bad re can cause it to fail.
- self.do_find(0)
+ self.do_find(False)
def _replace_expand(self, m, repl):
- """ Helper function for expanding a regular expression
- in the replace field, if needed. """
+ "Expand replacement text if regular expression."
if self.engine.isre():
try:
new = m.expand(repl)
return new
def replace_all(self, event=None):
- """Replace all instances of patvar with replvar in text"""
+ """Handle the Replace All button.
+
+ Search text for occurrences of the Find value and replace
+ each of them. The 'wrap around' value controls the start
+ point for searching. If wrap isn't set, then the searching
+ starts at the first occurrence after the current selection;
+ if wrap is set, the replacement starts at the first line.
+ The replacement is always done top-to-bottom in the text.
+ """
prog = self.engine.getprog()
if not prog:
return
if self.engine.iswrap():
line = 1
col = 0
- ok = 1
+ ok = True
first = last = None
# XXX ought to replace circular instead of top-to-bottom when wrapping
text.undo_block_start()
- while 1:
- res = self.engine.search_forward(text, prog, line, col, 0, ok)
+ while True:
+ res = self.engine.search_forward(text, prog, line, col,
+ wrap=False, ok=ok)
if not res:
break
line, m = res
if new:
text.insert(first, new)
col = i + len(new)
- ok = 0
+ ok = False
text.undo_block_stop()
if first and last:
self.show_hit(first, last)
self.close()
- def do_find(self, ok=0):
+ def do_find(self, ok=False):
+ """Search for and highlight next occurrence of pattern in text.
+
+ No text replacement is done with this option.
+ """
if not self.engine.getprog():
return False
text = self.text
first = "%d.%d" % (line, i)
last = "%d.%d" % (line, j)
self.show_hit(first, last)
- self.ok = 1
+ self.ok = True
return True
def do_replace(self):
+ "Replace search pattern in text with replacement value."
prog = self.engine.getprog()
if not prog:
return False
text.insert(first, new)
text.undo_block_stop()
self.show_hit(first, text.index("insert"))
- self.ok = 0
+ self.ok = False
return True
def show_hit(self, first, last):
- """Highlight text from 'first' to 'last'.
- 'first', 'last' - Text indices"""
+ """Highlight text between first and last indices.
+
+ Text is highlighted via the 'hit' tag and the marked
+ section is brought into view.
+
+ The colors from the 'hit' tag aren't currently shown
+ when the text is displayed. This is due to the 'sel'
+ tag being added first, so the colors in the 'sel'
+ config are seen instead of the colors for 'hit'.
+ """
text = self.text
text.mark_set("insert", first)
text.tag_remove("sel", "1.0", "end")
text.update_idletasks()
def close(self, event=None):
+ "Close the dialog and remove hit tags."
SearchDialogBase.close(self, event)
self.text.tag_remove("hit", "1.0", "end")
+""" idlelib.run
+
+Simplified, pyshell.ModifiedInterpreter spawns a subprocess with
+f'''{sys.executable} -c "__import__('idlelib.run').run.main()"'''
+'.run' is needed because __import__ returns idlelib, not idlelib.run.
+"""
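The docstring's note that "'.run' is needed" reflects how `__import__` works: given a dotted name, it returns the top-level package, and the submodule is reached as an attribute. A minimal illustration using the stdlib:

```python
# __import__ with a dotted name returns the top-level package ...
mod = __import__('os.path')
print(mod.__name__)  # os

# ... while the submodule is reachable as an attribute of it.
print(mod.path.basename('/tmp/example.txt'))
```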
import io
import linecache
import queue
import threading
import warnings
-import tkinter # Tcl, deletions, messagebox if startup fails
-
from idlelib import autocomplete # AutoComplete, fetch_encodings
from idlelib import calltip # Calltip
from idlelib import debugger_r # start_debugger
from idlelib import stackviewer # StackTreeItem
import __main__
-for mod in ('simpledialog', 'messagebox', 'font',
- 'dialog', 'filedialog', 'commondialog',
- 'ttk'):
- delattr(tkinter, mod)
- del sys.modules['tkinter.' + mod]
+import tkinter # Use tcl and, if startup fails, messagebox.
+if not hasattr(sys.modules['idlelib.run'], 'firstrun'):
+ # Undo modifications of tkinter by idlelib imports; see bpo-25507.
+ for mod in ('simpledialog', 'messagebox', 'font',
+ 'dialog', 'filedialog', 'commondialog',
+ 'ttk'):
+ delattr(tkinter, mod)
+ del sys.modules['tkinter.' + mod]
+ # Avoid AttributeError if run again; see bpo-37038.
+ sys.modules['idlelib.run'].firstrun = False
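The `firstrun` guard above is a run-once marker stored on the module object itself, so re-executing the module's top-level code skips the one-time tkinter cleanup. A generic sketch of the pattern, using a hypothetical module name:

```python
import sys
import types

# Simulate a module whose top-level code runs twice.
mod = types.ModuleType('fakemod')
sys.modules['fakemod'] = mod

runs = []
for _ in range(2):
    # Top-level guard: only the first execution does the one-time work.
    if not hasattr(sys.modules['fakemod'], 'firstrun'):
        runs.append('did one-time setup')
        sys.modules['fakemod'].firstrun = False

print(runs)  # ['did one-time setup']
```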
LOCALHOST = '127.0.0.1'
root = tkinter.Tk()
fix_scaling(root)
root.withdraw()
- msg = f"IDLE's subprocess can't connect to {address[0]}:{address[1]}.\n"\
- f"Fatal OSError #{err.errno}: {err.strerror}.\n"\
- f"See the 'Startup failure' section of the IDLE doc, online at\n"\
- f"https://docs.python.org/3/library/idle.html#startup-failure"
- showerror("IDLE Subprocess Error", msg, parent=root)
+ showerror(
+ "Subprocess Connection Error",
+ f"IDLE's subprocess can't connect to {address[0]}:{address[1]}.\n"
+ f"Fatal OSError #{err.errno}: {err.strerror}.\n"
+ "See the 'Startup failure' section of the IDLE doc, online at\n"
+ "https://docs.python.org/3/library/idle.html#startup-failure",
+ parent=root)
root.destroy()
def print_exception():
exec(code, self.locals)
finally:
interruptable = False
- except SystemExit:
- # Scripts that raise SystemExit should just
- # return to the interactive prompt
- pass
+ except SystemExit as e:
+ if e.args: # SystemExit called with an argument.
+ ob = e.args[0]
+ if not isinstance(ob, (type(None), int)):
+ print('SystemExit: ' + str(ob), file=sys.stderr)
+ # Return to the interactive prompt.
except:
self.usr_exc_info = sys.exc_info()
if quitting:
exit()
- # even print a user code SystemExit exception, continue
print_exception()
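The revised `SystemExit` handling distinguishes the exception's argument: exiting with no argument or an integer status stays silent, while any other object is echoed as a message before returning to the prompt. A sketch of just that branching logic (not the interpreter loop itself):

```python
def describe_system_exit(e):
    # Mirrors the branching above: silent for None/int, message otherwise.
    if e.args:  # SystemExit called with an argument.
        ob = e.args[0]
        if not isinstance(ob, (type(None), int)):
            return 'SystemExit: ' + str(ob)
    return None

print(describe_system_exit(SystemExit()))        # None
print(describe_system_exit(SystemExit(2)))       # None
print(describe_system_exit(SystemExit('oops')))  # SystemExit: oops
```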
jit = self.rpchandler.console.getvar("<<toggle-jit-stack-viewer>>")
if jit:
item = stackviewer.StackTreeItem(flist, tb)
return debugobj_r.remote_object_tree_item(item)
-capture_warnings(False) # Make sure turned off; see issue 18081
+
+if __name__ == '__main__':
+ from unittest import main
+ main('idlelib.idle_test.test_run', verbosity=2)
+
+capture_warnings(False) # Make sure turned off; see bpo-18081.
from idlelib.config import idleConf
from idlelib import macosx
from idlelib import pyshell
+from idlelib.query import CustomRun
indent_message = """Error: Inconsistent indentation detected!
# tries to run a module using the keyboard shortcut
# (the menu item works fine).
self.editwin.text_frame.after(200,
- lambda: self.editwin.text_frame.event_generate('<<run-module-event-2>>'))
+ lambda: self.editwin.text_frame.event_generate(
+ '<<run-module-event-2>>'))
return 'break'
else:
return self._run_module_event(event)
- def _run_module_event(self, event):
+ def run_custom_event(self, event):
+ return self._run_module_event(event, customize=True)
+
+ def _run_module_event(self, event, *, customize=False):
"""Run the module after setting up the environment.
- First check the syntax. If OK, make sure the shell is active and
- then transfer the arguments, set the run environment's working
- directory to the directory of the module being executed and also
- add that directory to its sys.path if not already included.
+ First check the syntax. Next get customization. If OK, make
+ sure the shell is active and then transfer the arguments, set
+ the run environment's working directory to the directory of the
+ module being executed and also add that directory to its
+ sys.path if not already included.
"""
-
filename = self.getfilename()
if not filename:
return 'break'
return 'break'
if not self.tabnanny(filename):
return 'break'
+ if customize:
+ title = f"Customize {self.editwin.short_title()} Run"
+ run_args = CustomRun(self.shell.text, title).result
+ if not run_args: # User cancelled.
+ return 'break'
+ cli_args, restart = run_args if customize else ([], True)
interp = self.shell.interp
- if pyshell.use_subprocess:
- interp.restart_subprocess(with_cwd=False, filename=
- self.editwin._filename_to_unicode(filename))
+ if pyshell.use_subprocess and restart:
+ interp.restart_subprocess(
+ with_cwd=False, filename=
+ self.editwin._filename_to_unicode(filename))
dirname = os.path.dirname(filename)
- # XXX Too often this discards arguments the user just set...
- interp.runcommand("""if 1:
+ argv = [filename]
+ if cli_args:
+ argv += cli_args
+ interp.runcommand(f"""if 1:
__file__ = {filename!r}
import sys as _sys
from os.path import basename as _basename
+ argv = {argv!r}
if (not _sys.argv or
- _basename(_sys.argv[0]) != _basename(__file__)):
- _sys.argv = [__file__]
+ _basename(_sys.argv[0]) != _basename(__file__) or
+ len(argv) > 1):
+ _sys.argv = argv
import os as _os
_os.chdir({dirname!r})
del _sys, _basename, _os
- \n""".format(filename=filename, dirname=dirname))
+ \n""")
interp.prepend_syspath(filename)
# XXX KBK 03Jul04 When run w/o subprocess, runtime warnings still
# go to __stderr__. With subprocess, they go to the shell.
+"""Search dialog for Find, Find Again, and Find Selection
+ functionality.
+
+ Inherits from SearchDialogBase for GUI and uses searchengine
+ to prepare search pattern.
+"""
from tkinter import TclError
from idlelib import searchengine
from idlelib.searchbase import SearchDialogBase
def _setup(text):
- "Create or find the singleton SearchDialog instance."
+ """Return the new or existing singleton SearchDialog instance.
+
+ The singleton dialog saves user entries and preferences
+ across instances.
+
+ Args:
+ text: Text widget containing the text to be searched.
+ """
root = text._root()
engine = searchengine.get(root)
if not hasattr(engine, "_searchdialog"):
return engine._searchdialog
def find(text):
- "Handle the editor edit menu item and corresponding event."
+ """Open the search dialog.
+
+ Module-level function to access the singleton SearchDialog
+ instance and open the dialog. If text is selected, it is
+ used as the search phrase; otherwise, the previous entry
+ is used. No search is done with this command.
+ """
pat = text.get("sel.first", "sel.last")
return _setup(text).open(text, pat) # Open is inherited from SDBase.
def find_again(text):
- "Handle the editor edit menu item and corresponding event."
+ """Repeat the search for the last pattern and preferences.
+
+ Module-level function to access the singleton SearchDialog
+ instance to search again using the user entries and preferences
+ from the last dialog. If there was no prior search, open the
+ search dialog; otherwise, perform the search without showing the
+ dialog.
+ """
return _setup(text).find_again(text)
def find_selection(text):
- "Handle the editor edit menu item and corresponding event."
+ """Search for the selected pattern in the text.
+
+ Module-level function to access the singleton SearchDialog
+ instance to search using the selected text. With a text
+ selection, perform the search without displaying the dialog.
+ Without a selection, use the prior entry as the search phrase
+ and don't display the dialog. If there has been no prior
+ search, open the search dialog.
+ """
return _setup(text).find_selection(text)
class SearchDialog(SearchDialogBase):
+ "Dialog for finding a pattern in text."
def create_widgets(self):
+ "Create the base search dialog and add a button for Find Next."
SearchDialogBase.create_widgets(self)
- self.make_button("Find Next", self.default_command, 1)
+ # TODO - why is this here and not in a create_command_buttons?
+ self.make_button("Find Next", self.default_command, isdef=True)
def default_command(self, event=None):
+ "Handle the Find Next button as the default command."
if not self.engine.getprog():
return
self.find_again(self.text)
def find_again(self, text):
+ """Repeat the last search.
+
+ If no search was previously run, open a new search dialog. In
+ this case, no search is done.
+
+ If a search was previously run, the search dialog won't be
+ shown and the options from the previous search (including the
+ search pattern) will be used to find the next occurrence
+ of the pattern. Next is relative to the search direction.
+
+ Position the window to display the located occurrence in the
+ text.
+
+ Return True if the search was successful and False otherwise.
+ """
if not self.engine.getpat():
self.open(text)
return False
return False
def find_selection(self, text):
+ """Search for selected text with previous dialog preferences.
+
+ Instead of using the same pattern for searching (as Find
+ Again does), this first resets the pattern to the currently
+ selected text. If the selected text isn't changed, then use
+ the prior search phrase.
+ """
pat = text.get("sel.first", "sel.last")
if pat:
self.engine.setcookedpat(pat)
text (Text searched): set in open(), only used in subclasses().
ent (ry): created in make_entry() called from create_entry().
row (of grid): 0 in create_widgets(), +1 in make_entry/frame().
- default_command: set in subclasses, used in create_widgers().
+ default_command: set in subclasses, used in create_widgets().
title (of dialog): class attribute, override in subclasses.
icon (of dialog): ditto, use unclear if cannot minimize dialog.
else:
self.top.deiconify()
self.top.tkraise()
+ self.top.transient(text.winfo_toplevel())
if searchphrase:
self.ent.delete(0,"end")
self.ent.insert("end",searchphrase)
"Put dialog away for later use."
if self.top:
self.top.grab_release()
+ self.top.transient('')
self.top.withdraw()
def create_widgets(self):
f = self.buttonframe = Frame(self.top)
f.grid(row=0,column=2,padx=2,pady=2,ipadx=2,ipady=2)
- b = self.make_button("close", self.close)
+ b = self.make_button("Close", self.close)
b.lower()
completely unusable.
This extension will automatically replace long texts with a small button.
-Double-cliking this button will remove it and insert the original text instead.
+Double-clicking this button will remove it and insert the original text instead.
Middle-clicking will copy the text to the clipboard. Right-clicking will open
the text in a separate viewing window.
from idlelib.delegator import Delegator
-# tkintter import not needed because module does not create widgets,
+# tkinter import not needed because module does not create widgets,
# although many methods operate on text widget arguments.
#$ event <<redo>>
import re
import sys
+import tkinter
-from idlelib import macosx
+
+class WmInfoGatheringError(Exception):
+ pass
class ZoomHeight:
+ # Cached values for maximized window dimensions, one for each set
+ # of screen dimensions.
+ _max_height_and_y_coords = {}
def __init__(self, editwin):
self.editwin = editwin
+ self.top = self.editwin.top
def zoom_height_event(self, event=None):
- top = self.editwin.top
- zoomed = zoom_height(top)
- menu_status = 'Restore' if zoomed else 'Zoom'
- self.editwin.update_menu_label(menu='options', index='* Height',
- label=f'{menu_status} Height')
+ zoomed = self.zoom_height()
+
+ if zoomed is None:
+ self.top.bell()
+ else:
+ menu_status = 'Restore' if zoomed else 'Zoom'
+ self.editwin.update_menu_label(menu='options', index='* Height',
+ label=f'{menu_status} Height')
+
return "break"
+ def zoom_height(self):
+ top = self.top
+
+ width, height, x, y = get_window_geometry(top)
+
+ if top.wm_state() != 'normal':
+ # Can't zoom/restore window height for windows not in the 'normal'
+ # state, e.g. maximized and full-screen windows.
+ return None
+
+ try:
+ maxheight, maxy = self.get_max_height_and_y_coord()
+ except WmInfoGatheringError:
+ return None
+
+ if height != maxheight:
+ # Maximize the window's height.
+ set_window_geometry(top, (width, maxheight, x, maxy))
+ return True
+ else:
+ # Restore the window's height.
+ #
+ # .wm_geometry('') makes the window revert to the size requested
+ # by the widgets it contains.
+ top.wm_geometry('')
+ return False
+
+ def get_max_height_and_y_coord(self):
+ top = self.top
+
+ screen_dimensions = (top.winfo_screenwidth(),
+ top.winfo_screenheight())
+ if screen_dimensions not in self._max_height_and_y_coords:
+ orig_state = top.wm_state()
-def zoom_height(top):
+ # Get window geometry info for maximized windows.
+ try:
+ top.wm_state('zoomed')
+ except tkinter.TclError:
+ # The 'zoomed' state is not supported by some esoteric WMs,
+ # such as Xvfb.
+ raise WmInfoGatheringError(
+ 'Failed getting geometry of maximized windows, because ' +
+ 'the "zoomed" window state is unavailable.')
+ top.update()
+ maxwidth, maxheight, maxx, maxy = get_window_geometry(top)
+ if sys.platform == 'win32':
+ # On Windows, the returned Y coordinate is the one before
+ # maximizing, so we use 0 which is correct unless a user puts
+ # their dock on the top of the screen (very rare).
+ maxy = 0
+ maxrooty = top.winfo_rooty()
+
+ # Get the "root y" coordinate for non-maximized windows with their
+ # y coordinate set to that of maximized windows. This is needed
+ # to properly handle different title bar heights for non-maximized
+ # vs. maximized windows, as seen e.g. in Windows 10.
+ top.wm_state('normal')
+ top.update()
+ orig_geom = get_window_geometry(top)
+ max_y_geom = orig_geom[:3] + (maxy,)
+ set_window_geometry(top, max_y_geom)
+ top.update()
+ max_y_geom_rooty = top.winfo_rooty()
+
+ # Adjust the maximum window height to account for the different
+ # title bar heights of non-maximized vs. maximized windows.
+ maxheight += maxrooty - max_y_geom_rooty
+
+ self._max_height_and_y_coords[screen_dimensions] = maxheight, maxy
+
+ set_window_geometry(top, orig_geom)
+ top.wm_state(orig_state)
+
+ return self._max_height_and_y_coords[screen_dimensions]
+
+
+def get_window_geometry(top):
geom = top.wm_geometry()
m = re.match(r"(\d+)x(\d+)\+(-?\d+)\+(-?\d+)", geom)
- if not m:
- top.bell()
- return
- width, height, x, y = map(int, m.groups())
- newheight = top.winfo_screenheight()
- if sys.platform == 'win32':
- newy = 0
- newheight = newheight - 72
-
- elif macosx.isAquaTk():
- # The '88' below is a magic number that avoids placing the bottom
- # of the window below the panel on my machine. I don't know how
- # to calculate the correct value for this with tkinter.
- newy = 22
- newheight = newheight - newy - 88
-
- else:
- #newy = 24
- newy = 0
- #newheight = newheight - 96
- newheight = newheight - 88
- if height >= newheight:
- newgeom = ""
- else:
- newgeom = "%dx%d+%d+%d" % (width, newheight, x, newy)
- top.wm_geometry(newgeom)
- return newgeom != ""
+ return tuple(map(int, m.groups()))
+
+
+def set_window_geometry(top, geometry):
+ top.wm_geometry("{:d}x{:d}+{:d}+{:d}".format(*geometry))
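The two helpers above round-trip Tk's `wm_geometry` string, which has the form `WIDTHxHEIGHT+X+Y` where the offsets may be negative. A sketch of the parsing using the same regular expression, with no Tk required:

```python
import re

def parse_geometry(geom):
    # Same pattern as get_window_geometry: WIDTHxHEIGHT+X+Y,
    # where X and Y may be negative (e.g. on multi-monitor setups).
    m = re.match(r"(\d+)x(\d+)\+(-?\d+)\+(-?\d+)", geom)
    return tuple(map(int, m.groups()))

geom = parse_geometry("800x600+10+-20")
print(geom)  # (800, 600, 10, -20)

# set_window_geometry reassembles the same string from the tuple.
print("{:d}x{:d}+{:d}+{:d}".format(*geom))  # 800x600+10+-20
```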
if __name__ == "__main__":
raise TypeError('{!r} is a built-in module'.format(object))
if isclass(object):
if hasattr(object, '__module__'):
- object = sys.modules.get(object.__module__)
- if getattr(object, '__file__', None):
- return object.__file__
+ module = sys.modules.get(object.__module__)
+ if getattr(module, '__file__', None):
+ return module.__file__
raise TypeError('{!r} is a built-in class'.format(object))
if ismethod(object):
object = object.__func__
if arg not in cls._netmask_cache:
if isinstance(arg, int):
prefixlen = arg
+ if not (0 <= prefixlen <= cls._max_prefixlen):
+ cls._report_invalid_netmask(prefixlen)
else:
try:
# Check for a netmask in prefix length form
if arg not in cls._netmask_cache:
if isinstance(arg, int):
prefixlen = arg
+ if not (0 <= prefixlen <= cls._max_prefixlen):
+ cls._report_invalid_netmask(prefixlen)
else:
prefixlen = cls._prefix_from_prefix_string(arg)
netmask = IPv6Address(cls._ip_int_from_prefix(prefixlen))
Round-trip invariant for full input:
Untokenized source will match input source exactly
- Round-trip invariant for limited intput:
+ Round-trip invariant for limited input:
# Output text will tokenize back to the input
t1 = [tok[:2] for tok in generate_tokens(f.readline)]
newcode = untokenize(t1)
# Prevent a held logging lock from blocking a child from logging.
if not hasattr(os, 'register_at_fork'): # Windows and friends.
- def _register_at_fork_acquire_release(instance):
+ def _register_at_fork_reinit_lock(instance):
pass # no-op when os.register_at_fork does not exist.
-else: # The os.register_at_fork API exists
- os.register_at_fork(before=_acquireLock,
- after_in_child=_releaseLock,
- after_in_parent=_releaseLock)
-
- # A collection of instances with acquire and release methods (logging.Handler)
- # to be called before and after fork. The weakref avoids us keeping discarded
- # Handler instances alive forever in case an odd program creates and destroys
- # many over its lifetime.
- _at_fork_acquire_release_weakset = weakref.WeakSet()
-
-
- def _register_at_fork_acquire_release(instance):
- # We put the instance itself in a single WeakSet as we MUST have only
- # one atomic weak ref. used by both before and after atfork calls to
- # guarantee matched pairs of acquire and release calls.
- _at_fork_acquire_release_weakset.add(instance)
-
+else:
+ # A collection of instances with a createLock method (logging.Handler)
+ # to be called in the child after forking. The weakref avoids us keeping
+ # discarded Handler instances alive. A set is used to avoid accumulating
+ # duplicate registrations as createLock() is responsible for registering
+ # a new Handler instance with this set in the first place.
+ _at_fork_reinit_lock_weakset = weakref.WeakSet()
+
+ def _register_at_fork_reinit_lock(instance):
+ _acquireLock()
+ try:
+ _at_fork_reinit_lock_weakset.add(instance)
+ finally:
+ _releaseLock()
- def _at_fork_weak_calls(method_name):
- for instance in _at_fork_acquire_release_weakset:
- method = getattr(instance, method_name)
+ def _after_at_fork_child_reinit_locks():
+ # _acquireLock() was called in the parent before forking.
+ for handler in _at_fork_reinit_lock_weakset:
try:
- method()
+ handler.createLock()
except Exception as err:
# Similar to what PyErr_WriteUnraisable does.
-                print("Ignoring exception from logging atfork", instance,
+                print("Ignoring exception from logging atfork", handler,
- method_name, "method:", err, file=sys.stderr)
-
-
- def _before_at_fork_weak_calls():
- _at_fork_weak_calls('acquire')
+                      ".createLock() method:", err, file=sys.stderr)
+ _releaseLock() # Acquired by os.register_at_fork(before=.
- def _after_at_fork_weak_calls():
- _at_fork_weak_calls('release')
-
-
- os.register_at_fork(before=_before_at_fork_weak_calls,
- after_in_child=_after_at_fork_weak_calls,
- after_in_parent=_after_at_fork_weak_calls)
+ os.register_at_fork(before=_acquireLock,
+ after_in_child=_after_at_fork_child_reinit_locks,
+ after_in_parent=_releaseLock)
#---------------------------------------------------------------------------
Acquire a thread lock for serializing access to the underlying I/O.
"""
self.lock = threading.RLock()
- _register_at_fork_acquire_release(self)
+ _register_at_fork_reinit_lock(self)
def acquire(self):
"""
sys.stderr.write('Message: %r\n'
'Arguments: %s\n' % (record.msg,
record.args))
+ except RecursionError: # See issue 36272
+ raise
except Exception:
sys.stderr.write('Unable to print the message and arguments'
' - possible formatting error.\nUse the'
# issue 35046: merged two stream.writes into one.
stream.write(msg + self.terminator)
self.flush()
+ except RecursionError: # See issue 36272
+ raise
except Exception:
self.handleError(record)
def __repr__(self):
level = getLevelName(self.level)
name = getattr(self.stream, 'name', '')
+ # bpo-36015: name can be an int
+ name = str(name)
if name:
name += ' '
return '<%s %s(%s)>' % (self.__class__.__name__, name, level)
from stat import ST_DEV, ST_INO, ST_MTIME
import queue
import threading
+import copy
#
# Some constants...
# exc_info and exc_text attributes, as they are no longer
# needed and, if not None, will typically not be pickleable.
msg = self.format(record)
+ # bpo-35726: make copy of record to avoid affecting other handlers in the chain.
+ record = copy.copy(record)
record.message = msg
record.msg = msg
record.args = None
try:
record = self.dequeue(True)
if record is self._sentinel:
+ if has_task_done:
+ q.task_done()
break
self.handle(record)
if has_task_done:
if AMD64:
flags |= 256
if keyfile:
- keyid = self.cab.gen_id(self.absolute, keyfile)
+ keyid = self.cab.gen_id(keyfile)
self.keyfiles[keyfile] = keyid
else:
keyid = None
finally:
self.stop_event.set()
- def create(self, c, typeid, *args, **kwds):
+ def create(*args, **kwds):
'''
Create a new shared object and return its id
'''
+ if len(args) >= 3:
+ self, c, typeid, *args = args
+ elif not args:
+ raise TypeError("descriptor 'create' of 'Server' object "
+ "needs an argument")
+ else:
+ if 'typeid' not in kwds:
+ raise TypeError('create expected at least 2 positional '
+ 'arguments, got %d' % (len(args)-1))
+ typeid = kwds.pop('typeid')
+ if len(args) >= 2:
+ self, c, *args = args
+ else:
+ if 'c' not in kwds:
+ raise TypeError('create expected at least 2 positional '
+ 'arguments, got %d' % (len(args)-1))
+ c = kwds.pop('c')
+ self, *args = args
+ args = tuple(args)
+
with self.mutex:
callable, exposed, method_to_typeid, proxytype = \
self.registry[typeid]
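The manual `*args` unpacking introduced above exists so that caller-supplied keyword arguments that happen to be named `self` or `typeid` are forwarded to the managed type's constructor instead of colliding with the method's own parameters. A minimal sketch under that assumption (`Server` here is a hypothetical stand-in, not multiprocessing's):

```python
class Server:
    def create(*args, **kwds):
        # Consume self and typeid positionally, never by name, so that
        # kwds may legitimately contain keys named 'self' or 'typeid'.
        self, typeid, *args = args
        return typeid, tuple(args), kwds

s = Server()
# a keyword literally named 'self' is forwarded rather than raising TypeError
assert s.create('list', 1, self='payload') == ('list', (1,), {'self': 'payload'})
```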
util.info('manager serving at %r', server.address)
server.serve_forever()
- def _create(self, typeid, *args, **kwds):
+ def _create(*args, **kwds):
'''
Create a new shared object; return the token and exposed tuple
'''
+ self, typeid, *args = args
+ args = tuple(args)
+
assert self._state.value == State.STARTED, 'server not yet started'
conn = self._Client(self._address, authkey=self._authkey)
try:
class _ResourceSharer(object):
- '''Manager for resouces using background thread.'''
+ '''Manager for resources using background thread.'''
def __init__(self):
self._key = 0
self._cache = {}
import re
import sys
from _collections_abc import Sequence
-from errno import EINVAL, ENOENT, ENOTDIR, EBADF
+from errno import EINVAL, ENOENT, ENOTDIR, EBADF, ELOOP
from operator import attrgetter
from stat import S_ISDIR, S_ISLNK, S_ISREG, S_ISSOCK, S_ISBLK, S_ISCHR, S_ISFIFO
from urllib.parse import quote_from_bytes as urlquote_from_bytes
# Internals
#
-# EBADF - guard agains macOS `stat` throwing EBADF
-_IGNORED_ERROS = (ENOENT, ENOTDIR, EBADF)
+# EBADF - guard against macOS `stat` throwing EBADF
+_IGNORED_ERROS = (ENOENT, ENOTDIR, EBADF, ELOOP)
_IGNORED_WINERRORS = (
21, # ERROR_NOT_READY - drive exists but is not accessible
+ 1921, # ERROR_CANT_RESOLVE_FILENAME - fix for broken symlink pointing to itself
)
def _ignore_error(exception):
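The failure mode the new `ELOOP` entry covers can be reproduced in isolation (POSIX only, a hedged sketch rather than pathlib's code): `stat()` on a symlink that points at itself fails with `ELOOP`, which `exists()`-style helpers want to treat like "no such file" instead of letting it propagate.

```python
import errno
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    loop = os.path.join(d, "loop")
    os.symlink(loop, loop)         # a symlink pointing at itself
    try:
        os.stat(loop)
        caught = None
    except OSError as e:
        caught = e.errno

assert caught == errno.ELOOP
```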
cf = parent_path._flavour.casefold
entries = list(scandir(parent_path))
for entry in entries:
- if not self.dironly or entry.is_dir():
+ entry_is_dir = False
+ try:
+ entry_is_dir = entry.is_dir()
+ except OSError as e:
+ if not _ignore_error(e):
+ raise
+ if not self.dironly or entry_is_dir:
name = entry.name
casefolded = cf(name)
if self.pat.match(casefolded):
"""
co = self.curframe.f_code
dict = self.curframe_locals
- n = co.co_argcount
- if co.co_flags & 4: n = n+1
- if co.co_flags & 8: n = n+1
+ n = co.co_argcount + co.co_kwonlyargcount
+ if co.co_flags & inspect.CO_VARARGS: n = n+1
+ if co.co_flags & inspect.CO_VARKEYWORDS: n = n+1
for i in range(n):
name = co.co_varnames[i]
if name in dict:
self.write(BINUNICODE + pack("<I", n) + encoded)
else:
obj = obj.replace("\\", "\\u005c")
+ obj = obj.replace("\0", "\\u0000")
obj = obj.replace("\n", "\\u000a")
+ obj = obj.replace("\r", "\\u000d")
+ obj = obj.replace("\x1a", "\\u001a") # EOF on DOS
self.write(UNICODE + obj.encode('raw-unicode-escape') +
b'\n')
self.memoize(obj)
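The `replace()` chain above, shown in isolation: a protocol-0 pickle terminates a `UNICODE` record with `'\n'`, and `raw-unicode-escape` leaves `'\0'`, `'\n'`, `'\r'` and `'\x1a'` verbatim, so they must be escaped by hand first (backslash first, so later substitutions are not double-escaped). The helper name is illustrative, not part of the pickle module.

```python
def escape_p0(obj):
    # Same substitution order as the patch: backslash before the rest.
    for ch, rep in (("\\", "\\u005c"), ("\0", "\\u0000"),
                    ("\n", "\\u000a"), ("\r", "\\u000d"),
                    ("\x1a", "\\u001a")):
        obj = obj.replace(ch, rep)
    return obj.encode("raw-unicode-escape")

assert b"\n" not in escape_p0("a\nb")      # record stays on one line
assert escape_p0("a\rb") == b"a\\u000db"
assert escape_p0("a\\b") == b"a\\u005cb"
```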
# type information
# 0.4.0 - added win32_ver() and modified the platform() output for WinXX
# 0.3.4 - fixed a bug in _follow_symlinks()
-# 0.3.3 - fixed popen() and "file" command invokation bugs
+# 0.3.3 - fixed popen() and "file" command invocation bugs
# 0.3.2 - added architecture() API and support for it in platform()
# 0.3.1 - fixed syscmd_ver() RE to support Windows NT
# 0.3.0 - added system alias support
from sys import getwindowsversion
except ImportError:
return release, version, csd, ptype
- try:
- from winreg import OpenKeyEx, QueryValueEx, CloseKey, HKEY_LOCAL_MACHINE
- except ImportError:
- from _winreg import OpenKeyEx, QueryValueEx, CloseKey, HKEY_LOCAL_MACHINE
winver = getwindowsversion()
maj, min, build = winver.platform_version or winver[:3]
_WIN32_SERVER_RELEASES.get((maj, None)) or
release)
- key = None
try:
- key = OpenKeyEx(HKEY_LOCAL_MACHINE,
- r'SOFTWARE\Microsoft\Windows NT\CurrentVersion')
- ptype = QueryValueEx(key, 'CurrentType')[0]
- except:
+ try:
+ import winreg
+ except ImportError:
+ import _winreg as winreg
+ except ImportError:
pass
- finally:
- if key:
- CloseKey(key)
+ else:
+ try:
+ cvkey = r'SOFTWARE\Microsoft\Windows NT\CurrentVersion'
+            with winreg.OpenKeyEx(winreg.HKEY_LOCAL_MACHINE, cvkey) as key:
+                ptype = winreg.QueryValueEx(key, 'CurrentType')[0]
+ except:
+ pass
return release, version, csd, ptype
extsep = '.'
sep = '/'
pathsep = ':'
-defpath = ':/bin:/usr/bin'
+defpath = '/bin:/usr/bin'
altsep = None
devnull = '/dev/null'
return self
# This method is more useful to profile a single function call.
- def runcall(self, func, *args, **kw):
+ def runcall(*args, **kw):
+ if len(args) >= 2:
+ self, func, *args = args
+ elif not args:
+ raise TypeError("descriptor 'runcall' of 'Profile' object "
+ "needs an argument")
+ elif 'func' in kw:
+ func = kw.pop('func')
+ self, *args = args
+ else:
+ raise TypeError('runcall expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
self.set_cmd(repr(func))
sys.setprofile(self.dispatcher)
try:
return "%s:%d(%s)" % func_name
#**************************************************************************
-# The following functions combine statists for pairs functions.
+# The following functions combine statistics for pairs of functions.
# The bulk of the processing involves correctly handling "call" lists,
# such as callers and callees.
#**************************************************************************
# -*- coding: utf-8 -*-
-# Autogenerated by Sphinx on Tue Mar 12 14:56:48 2019
+# Autogenerated by Sphinx on Tue Jun 18 16:49:39 2019
topics = {'assert': 'The "assert" statement\n'
'**********************\n'
'\n'
'\n'
'For user-defined classes which do not define "__contains__()" '
'but do\n'
- 'define "__iter__()", "x in y" is "True" if some value "z" '
- 'with "x ==\n'
- 'z" is produced while iterating over "y". If an exception is '
- 'raised\n'
- 'during the iteration, it is as if "in" raised that '
- 'exception.\n'
+ 'define "__iter__()", "x in y" is "True" if some value "z", '
+ 'for which\n'
+ 'the expression "x is z or x == z" is true, is produced while '
+ 'iterating\n'
+ 'over "y". If an exception is raised during the iteration, it '
+ 'is as if\n'
+ '"in" raised that exception.\n'
'\n'
'Lastly, the old-style iteration protocol is tried: if a class '
'defines\n'
'"__getitem__()", "x in y" is "True" if and only if there is a '
'non-\n'
- 'negative integer index *i* such that "x == y[i]", and all '
- 'lower\n'
- 'integer indices do not raise "IndexError" exception. (If any '
- 'other\n'
+ 'negative integer index *i* such that "x is y[i] or x == '
+ 'y[i]", and no\n'
+ 'lower integer index raises the "IndexError" exception. (If '
+ 'any other\n'
'exception is raised, it is as if "in" raised that '
'exception).\n'
'\n'
- 'The operator "not in" is defined to have the inverse true '
+ 'The operator "not in" is defined to have the inverse truth '
'value of\n'
'"in".\n'
'\n'
'Identity comparisons\n'
'====================\n'
'\n'
- 'The operators "is" and "is not" test for object identity: "x '
- 'is y" is\n'
- 'true if and only if *x* and *y* are the same object. Object '
- 'identity\n'
- 'is determined using the "id()" function. "x is not y" yields '
- 'the\n'
- 'inverse truth value. [4]\n',
+ 'The operators "is" and "is not" test for an object’s '
+ 'identity: "x is\n'
+ 'y" is true if and only if *x* and *y* are the same object. '
+ 'An\n'
                    + 'object’s identity is determined using the "id()" function. '
+ '"x is not\n'
+ 'y" yields the inverse truth value. [4]\n',
'compound': 'Compound statements\n'
'*******************\n'
'\n'
'"str.format()"\n'
' method, to produce a “formatted” string representation '
'of an\n'
- ' object. The "format_spec" argument is a string that '
+ ' object. The *format_spec* argument is a string that '
'contains a\n'
' description of the formatting options desired. The '
'interpretation\n'
- ' of the "format_spec" argument is up to the type '
+ ' of the *format_spec* argument is up to the type '
'implementing\n'
' "__format__()", however most classes will either '
'delegate\n'
'terminates\n'
'execution of the program, or returns to its interactive main '
'loop. In\n'
- 'either case, it prints a stack backtrace, except when the '
+ 'either case, it prints a stack traceback, except when the '
'exception is\n'
'"SystemExit".\n'
'\n'
'terminates\n'
'execution of the program, or returns to its interactive main '
'loop. In\n'
- 'either case, it prints a stack backtrace, except when the '
+ 'either case, it prints a stack traceback, except when the '
'exception is\n'
'"SystemExit".\n'
'\n'
'Meaning '
'|\n'
' '
- '+===========+============================================================+\n'
+ '|===========|============================================================|\n'
' | "\'<\'" | Forces the field to be left-aligned '
'within the available |\n'
' | | space (this is the default for most '
'Meaning '
'|\n'
' '
- '+===========+============================================================+\n'
+ '|===========|============================================================|\n'
' | "\'+\'" | indicates that a sign should be used for '
'both positive as |\n'
' | | well as negative '
'Meaning '
'|\n'
' '
- '+===========+============================================================+\n'
+ '|===========|============================================================|\n'
' | "\'s\'" | String format. This is the default type '
'for strings and |\n'
' | | may be '
'Meaning '
'|\n'
' '
- '+===========+============================================================+\n'
+ '|===========|============================================================|\n'
' | "\'b\'" | Binary format. Outputs the number in '
'base 2. |\n'
' '
'Meaning '
'|\n'
' '
- '+===========+============================================================+\n'
+ '|===========|============================================================|\n'
' | "\'e\'" | Exponent notation. Prints the number in '
'scientific |\n'
' | | notation using the letter ‘e’ to indicate '
'end up importing "pkg.mod". If you execute "from ..subpkg2 import '
'mod"\n'
'from within "pkg.subpkg1" you will import "pkg.subpkg2.mod". The\n'
- 'specification for relative imports is contained within **PEP '
- '328**.\n'
+ 'specification for relative imports is contained in the Package\n'
+ 'Relative Imports section.\n'
'\n'
'"importlib.import_module()" is provided to support applications '
'that\n'
'"False" otherwise.\n'
'\n'
'For user-defined classes which do not define "__contains__()" but do\n'
- 'define "__iter__()", "x in y" is "True" if some value "z" with "x ==\n'
- 'z" is produced while iterating over "y". If an exception is raised\n'
- 'during the iteration, it is as if "in" raised that exception.\n'
+ 'define "__iter__()", "x in y" is "True" if some value "z", for which\n'
+ 'the expression "x is z or x == z" is true, is produced while '
+ 'iterating\n'
+ 'over "y". If an exception is raised during the iteration, it is as if\n'
+ '"in" raised that exception.\n'
'\n'
'Lastly, the old-style iteration protocol is tried: if a class defines\n'
'"__getitem__()", "x in y" is "True" if and only if there is a non-\n'
- 'negative integer index *i* such that "x == y[i]", and all lower\n'
- 'integer indices do not raise "IndexError" exception. (If any other\n'
+ 'negative integer index *i* such that "x is y[i] or x == y[i]", and no\n'
+ 'lower integer index raises the "IndexError" exception. (If any other\n'
'exception is raised, it is as if "in" raised that exception).\n'
'\n'
- 'The operator "not in" is defined to have the inverse true value of\n'
+ 'The operator "not in" is defined to have the inverse truth value of\n'
'"in".\n',
'integers': 'Integer literals\n'
'****************\n'
'+-------------------------------------------------+---------------------------------------+\n'
'| Operator | '
'Description |\n'
- '+=================================================+=======================================+\n'
+ '|=================================================|=======================================|\n'
'| "lambda" | '
'Lambda expression |\n'
'+-------------------------------------------------+---------------------------------------+\n'
'"str.format()"\n'
' method, to produce a “formatted” string representation of '
'an\n'
- ' object. The "format_spec" argument is a string that '
+ ' object. The *format_spec* argument is a string that '
'contains a\n'
' description of the formatting options desired. The '
'interpretation\n'
- ' of the "format_spec" argument is up to the type '
+ ' of the *format_spec* argument is up to the type '
'implementing\n'
' "__format__()", however most classes will either '
'delegate\n'
'When a class definition is executed, the following steps '
'occur:\n'
'\n'
- '* MRO entries are resolved\n'
+ '* MRO entries are resolved;\n'
'\n'
- '* the appropriate metaclass is determined\n'
+ '* the appropriate metaclass is determined;\n'
'\n'
- '* the class namespace is prepared\n'
+ '* the class namespace is prepared;\n'
'\n'
- '* the class body is executed\n'
+ '* the class body is executed;\n'
'\n'
- '* the class object is created\n'
+ '* the class object is created.\n'
'\n'
'\n'
'Resolving MRO entries\n'
'\n'
'* if no bases and no explicit metaclass are given, then '
'"type()" is\n'
- ' used\n'
+ ' used;\n'
'\n'
'* if an explicit metaclass is given and it is *not* an '
'instance of\n'
- ' "type()", then it is used directly as the metaclass\n'
+ ' "type()", then it is used directly as the metaclass;\n'
'\n'
'* if an instance of "type()" is given as the explicit '
'metaclass, or\n'
' bases are defined, then the most derived metaclass is '
- 'used\n'
+ 'used.\n'
'\n'
'The most derived metaclass is selected from the explicitly '
'specified\n'
'with the\n'
' class being defined and the assigned name of that '
'particular\n'
- ' descriptor; and\n'
+ ' descriptor;\n'
'\n'
'* finally, the "__init_subclass__()" hook is called on the '
'immediate\n'
'\n'
'One can implement the generic class syntax as specified by '
'**PEP 484**\n'
- '(for example "List[int]") by defining a special method\n'
+ '(for example "List[int]") by defining a special method:\n'
'\n'
'classmethod object.__class_getitem__(cls, key)\n'
'\n'
' | Representation | '
'Description |\n'
' '
- '+=========================+===============================+\n'
+ '|=========================|===============================|\n'
' | "\\n" | Line '
'Feed |\n'
' '
'+-------------------+-----------------------------------+---------+\n'
'| Escape Sequence | Meaning | Notes '
'|\n'
- '+===================+===================================+=========+\n'
+ '|===================|===================================|=========|\n'
'| "\\newline" | Backslash and newline ignored '
'| |\n'
'+-------------------+-----------------------------------+---------+\n'
'+-------------------+-----------------------------------+---------+\n'
'| Escape Sequence | Meaning | Notes '
'|\n'
- '+===================+===================================+=========+\n'
+ '|===================|===================================|=========|\n'
'| "\\N{name}" | Character named *name* in the | '
'(4) |\n'
'| | Unicode database | '
' | Attribute | Meaning '
'| |\n'
' '
- '+===========================+=================================+=============+\n'
+ '|===========================|=================================|=============|\n'
' | "__doc__" | The function’s documentation '
'| Writable |\n'
' | | string, or "None" if '
'| |\n'
' | | unavailable; not inherited by '
'| |\n'
- ' | | subclasses '
+ ' | | subclasses. '
'| |\n'
' '
'+---------------------------+---------------------------------+-------------+\n'
- ' | "__name__" | The function’s name '
+ ' | "__name__" | The function’s name. '
'| Writable |\n'
' '
'+---------------------------+---------------------------------+-------------+\n'
- ' | "__qualname__" | The function’s *qualified name* '
+ ' | "__qualname__" | The function’s *qualified '
'| Writable |\n'
- ' | | New in version 3.3. '
+ ' | | name*. New in version 3.3. '
'| |\n'
' '
'+---------------------------+---------------------------------+-------------+\n'
'| |\n'
' | | or "None" if no arguments have '
'| |\n'
- ' | | a default value '
+ ' | | a default value. '
'| |\n'
' '
'+---------------------------+---------------------------------+-------------+\n'
'+----------------------------+----------------------------------+------------+\n'
'| Operation | Result '
'| Notes |\n'
- '+============================+==================================+============+\n'
+ '|============================|==================================|============|\n'
'| "x in s" | "True" if an item of *s* is '
'| (1) |\n'
'| | equal to *x*, else "False" '
'+--------------------------------+----------------------------------+-----------------------+\n'
'| Operation | '
'Result | Notes |\n'
- '+================================+==================================+=======================+\n'
+ '|================================|==================================|=======================|\n'
'| "s[i] = x" | item *i* of *s* is replaced '
'by | |\n'
'| | '
'| Operation | '
'Result | Notes '
'|\n'
- '+================================+==================================+=======================+\n'
+ '|================================|==================================|=======================|\n'
'| "s[i] = x" | item *i* of *s* is '
'replaced by | |\n'
'| | '
try:
names = os.listxattr(src, follow_symlinks=follow_symlinks)
except OSError as e:
- if e.errno not in (errno.ENOTSUP, errno.ENODATA):
+ if e.errno not in (errno.ENOTSUP, errno.ENODATA, errno.EINVAL):
raise
return
for name in names:
value = os.getxattr(src, name, follow_symlinks=follow_symlinks)
os.setxattr(dst, name, value, follow_symlinks=follow_symlinks)
except OSError as e:
- if e.errno not in (errno.EPERM, errno.ENOTSUP, errno.ENODATA):
+ if e.errno not in (errno.EPERM, errno.ENOTSUP, errno.ENODATA,
+ errno.EINVAL):
raise
else:
def _copyxattr(*args, **kwargs):
mode = stat.S_IMODE(st.st_mode)
lookup("utime")(dst, ns=(st.st_atime_ns, st.st_mtime_ns),
follow_symlinks=follow)
+ # We must copy extended attributes before the file is (potentially)
+ # chmod()'ed read-only, otherwise setxattr() will error with -EACCES.
+ _copyxattr(src, dst, follow_symlinks=follow)
try:
lookup("chmod")(dst, mode, follow_symlinks=follow)
except NotImplementedError:
break
else:
raise
- _copyxattr(src, dst, follow_symlinks=follow)
def copy(src, dst, *, follow_symlinks=True):
"""Copy data and mode bits ("cp src dst"). Return the file's destination.
return None
if path is None:
- path = os.environ.get("PATH", os.defpath)
+ path = os.environ.get("PATH", None)
+ if path is None:
+ try:
+ path = os.confstr("CS_PATH")
+ except (AttributeError, ValueError):
+ # os.confstr() or CS_PATH is not available
+ path = os.defpath
+ # bpo-35755: Don't use os.defpath if the PATH environment variable is
+ # set to an empty string
+
+ # PATH='' doesn't match, whereas PATH=':' looks in the current directory
if not path:
return None
path = path.split(os.pathsep)
env = os.environ
if sys.platform == 'darwin' and '__PYVENV_LAUNCHER__' in env:
executable = sys._base_executable = os.environ['__PYVENV_LAUNCHER__']
- elif sys.platform == 'win32' and '__PYVENV_LAUNCHER__' in env:
- executable = sys.executable
- import _winapi
- sys._base_executable = _winapi.GetModuleFileName(0)
- # bpo-35873: Clear the environment variable to avoid it being
- # inherited by child processes.
- del os.environ['__PYVENV_LAUNCHER__']
else:
executable = sys.executable
exe_dir, _ = os.path.split(os.path.abspath(executable))
Supports IPv4 addresses on all platforms and IPv6 on platforms with IPv6
support.
"""
- # inet_aton() also accepts strings like '1'
- if ipname.count('.') == 3:
- try:
- return _socket.inet_aton(ipname)
- except OSError:
- pass
+ # inet_aton() also accepts strings like '1', '127.1', some also trailing
+ # data like '127.0.0.1 whatever'.
+ try:
+ addr = _socket.inet_aton(ipname)
+ except OSError:
+ # not an IPv4 address
+ pass
+ else:
+ if _socket.inet_ntoa(addr) == ipname:
+ # only accept injective ipnames
+ return addr
+ else:
+ # refuse for short IPv4 notation and additional trailing data
+ raise ValueError(
+ "{!r} is not a quad-dotted IPv4 address.".format(ipname)
+ )
try:
return _socket.inet_pton(_socket.AF_INET6, ipname)
raise ValueError("{!r} is not an IPv4 address.".format(ipname))
-def _ipaddress_match(ipname, host_ip):
+def _ipaddress_match(cert_ipaddress, host_ip):
"""Exact matching of IP addresses.
RFC 6125 explicitly doesn't define an algorithm for this
(section 1.7.2 - "Out of Scope").
"""
- # OpenSSL may add a trailing newline to a subjectAltName's IP address
- ip = _inet_paton(ipname.rstrip())
+ # OpenSSL may add a trailing newline to a subjectAltName's IP address,
+    # commonly with IPv6 addresses. Strip off trailing \n.
+ ip = _inet_paton(cert_ipaddress.rstrip())
return ip == host_ip
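The injectivity check added to `_inet_paton` above, in sketch form (the helper name is illustrative): `inet_aton()` accepts short notations such as `'127.1'`, so a candidate is only accepted when round-tripping through `inet_ntoa()` reproduces the original string.

```python
import socket

def is_quad_dotted(ipname):
    try:
        addr = socket.inet_aton(ipname)
    except OSError:
        return False               # not an IPv4 address at all
    # only accept injective spellings: short forms and trailing data
    # do not survive the round trip through inet_ntoa()
    return socket.inet_ntoa(addr) == ipname

assert is_quad_dotted("127.0.0.1")
assert not is_quad_dotted("127.1")               # short notation rejected
assert not is_quad_dotted("127.0.0.1 whatever")  # trailing data rejected
```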
return self._sslobj.verify_client_post_handshake()
+def _sslcopydoc(func):
+ """Copy docstring from SSLObject to SSLSocket"""
+ func.__doc__ = getattr(SSLObject, func.__name__).__doc__
+ return func
+
+
class SSLSocket(socket):
"""This class implements a subtype of socket.socket that wraps
the underlying OS socket in an SSL context when necessary, and
return self
@property
+ @_sslcopydoc
def context(self):
return self._context
self._sslobj.context = ctx
@property
+ @_sslcopydoc
def session(self):
- """The SSLSession for client socket."""
if self._sslobj is not None:
return self._sslobj.session
self._sslobj.session = session
@property
+ @_sslcopydoc
def session_reused(self):
- """Was the client session reused during handshake"""
if self._sslobj is not None:
return self._sslobj.session_reused
raise ValueError("Write on closed or unwrapped SSL socket.")
return self._sslobj.write(data)
+ @_sslcopydoc
def getpeercert(self, binary_form=False):
- """Returns a formatted version of the data in the
- certificate provided by the other end of the SSL channel.
- Return None if no certificate was provided, {} if a
- certificate was provided, but not validated."""
-
self._checkClosed()
self._check_connected()
return self._sslobj.getpeercert(binary_form)
+ @_sslcopydoc
def selected_npn_protocol(self):
self._checkClosed()
if self._sslobj is None or not _ssl.HAS_NPN:
else:
return self._sslobj.selected_npn_protocol()
+ @_sslcopydoc
def selected_alpn_protocol(self):
self._checkClosed()
if self._sslobj is None or not _ssl.HAS_ALPN:
else:
return self._sslobj.selected_alpn_protocol()
+ @_sslcopydoc
def cipher(self):
self._checkClosed()
if self._sslobj is None:
else:
return self._sslobj.cipher()
+ @_sslcopydoc
def shared_ciphers(self):
self._checkClosed()
if self._sslobj is None:
else:
return self._sslobj.shared_ciphers()
+ @_sslcopydoc
def compression(self):
self._checkClosed()
if self._sslobj is None:
raise NotImplementedError("recvmsg_into not allowed on instances of "
"%s" % self.__class__)
+ @_sslcopydoc
def pending(self):
self._checkClosed()
if self._sslobj is not None:
self._sslobj = None
super().shutdown(how)
+ @_sslcopydoc
def unwrap(self):
if self._sslobj:
s = self._sslobj.shutdown()
else:
raise ValueError("No SSL wrapper around " + str(self))
+ @_sslcopydoc
def verify_client_post_handshake(self):
if self._sslobj:
return self._sslobj.verify_client_post_handshake()
self._sslobj = None
super()._real_close()
+ @_sslcopydoc
def do_handshake(self, block=False):
- """Perform a TLS/SSL handshake."""
self._check_connected()
timeout = self.gettimeout()
try:
server_side=True)
return newsock, addr
+ @_sslcopydoc
def get_channel_binding(self, cb_type="tls-unique"):
- """Get channel binding data for current connection. Raise ValueError
- if the requested `cb_type` is not supported. Return bytes of the data
- or None if the data is not available (e.g. before the handshake).
- """
if self._sslobj is not None:
return self._sslobj.get_channel_binding(cb_type)
else:
)
return None
+ @_sslcopydoc
def version(self):
- """
- Return a string identifying the protocol version used by the
- current SSL channel, or None if there is no established channel.
- """
if self._sslobj is not None:
return self._sslobj.version()
else:
The other arguments are the same as for the Popen constructor.
"""
if input is not None:
- if 'stdin' in kwargs:
+ if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE
if capture_output:
- if ('stdout' in kwargs) or ('stderr' in kwargs):
+ if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
#
# Timeout to wait until a process completes
-TIMEOUT = 30.0 # seconds
+TIMEOUT = 60.0 # seconds
def latin(s):
return s.encode('latin')
Find the test_os test method which alters the environment:
- ./python -m test.bisect --fail-env-changed test_os
+ ./python -m test.bisect_cmd --fail-env-changed test_os
Find a reference leak in "test_os", write the list of failing tests into the
"bisect" file:
- ./python -m test.bisect -o bisect -R 3:3 test_os
+ ./python -m test.bisect_cmd -o bisect -R 3:3 test_os
Load an existing list of tests from a file using -i option:
./python -m test --list-cases -m FileTests test_os > tests
- ./python -m test.bisect -i tests test_os
+ ./python -m test.bisect_cmd -i tests test_os
"""
import argparse
self.assertEqual(expected, got)
strptime = self.theclass.strptime
+
self.assertEqual(strptime("+0002", "%z").utcoffset(), 2 * MINUTE)
self.assertEqual(strptime("-0002", "%z").utcoffset(), -2 * MINUTE)
self.assertEqual(
with self.assertRaises(ValueError): strptime("-2400", "%z")
with self.assertRaises(ValueError): strptime("-000", "%z")
+ def test_strptime_single_digit(self):
+ # bpo-34903: Check that single digit dates and times are allowed.
+
+ strptime = self.theclass.strptime
+
+ with self.assertRaises(ValueError):
+ # %y does require two digits.
+ newdate = strptime('01/02/3 04:05:06', '%d/%m/%y %H:%M:%S')
+ dt1 = self.theclass(2003, 2, 1, 4, 5, 6)
+ dt2 = self.theclass(2003, 1, 2, 4, 5, 6)
+ dt3 = self.theclass(2003, 2, 1, 0, 0, 0)
+ dt4 = self.theclass(2003, 1, 25, 0, 0, 0)
+ inputs = [
+ ('%d', '1/02/03 4:5:6', '%d/%m/%y %H:%M:%S', dt1),
+ ('%m', '01/2/03 4:5:6', '%d/%m/%y %H:%M:%S', dt1),
+ ('%H', '01/02/03 4:05:06', '%d/%m/%y %H:%M:%S', dt1),
+ ('%M', '01/02/03 04:5:06', '%d/%m/%y %H:%M:%S', dt1),
+ ('%S', '01/02/03 04:05:6', '%d/%m/%y %H:%M:%S', dt1),
+            ('%j', '2/03 04am:05:06', '%j/%y %I%p:%M:%S', dt2),
+            ('%I', '02/03 4am:05:06', '%j/%y %I%p:%M:%S', dt2),
+ ('%w', '6/04/03', '%w/%U/%y', dt3),
+ # %u requires a single digit.
+ ('%W', '6/4/2003', '%u/%W/%Y', dt3),
+ ('%V', '6/4/2003', '%u/%V/%G', dt4),
+ ]
+ for reason, string, format, target in inputs:
+ reason = 'test single digit ' + reason
+ with self.subTest(reason=reason,
+ string=string,
+ format=format,
+ target=target):
+ newdate = strptime(string, format)
+ self.assertEqual(newdate, target, msg=reason)
+
def test_more_timetuple(self):
# This tests fields beyond those tested by the TestDate.test_timetuple.
t = self.theclass(2004, 12, 31, 6, 22, 33)
self.assertEqual(got, expected)
# However, if they're different members, uctoffset is not ignored.
- # Note that a time can't actually have an operand-depedent offset,
+ # Note that a time can't actually have an operand-dependent offset,
# though (and time.utcoffset() passes None to tzinfo.utcoffset()),
# so skip this test for time.
if cls is not time:
'(instead of the Python stdlib test suite)')
group = parser.add_argument_group('Special runs')
- group.add_argument('-l', '--findleaks', action='store_true',
- help='if GC is available detect tests that leak memory')
+ group.add_argument('-l', '--findleaks', action='store_const', const=2,
+ default=1,
+ help='deprecated alias to --fail-env-changed')
group.add_argument('-L', '--runleaks', action='store_true',
help='run the leaks(1) command just before exit.' +
more_details)
help='suppress error message boxes on Windows')
group.add_argument('-F', '--forever', action='store_true',
help='run the specified tests in a loop, until an '
- 'error happens')
+ 'error happens; imply --failfast')
group.add_argument('--list-tests', action='store_true',
help="only write the name of tests that will be run, "
"don't execute them")
# Defaults
ns = argparse.Namespace(testdir=None, verbose=0, quiet=False,
exclude=False, single=False, randomize=False, fromfile=None,
- findleaks=False, use_resources=None, trace=False, coverdir='coverage',
+ findleaks=1, use_resources=None, trace=False, coverdir='coverage',
runleaks=False, huntrleaks=False, verbose2=False, print_slow=False,
random_seed=None, use_mp=None, verbose3=False, forever=False,
header=False, failfast=False, match_tests=None, pgo=False)
parser.error("unrecognized arguments: %s" % arg)
sys.exit(1)
+ if ns.findleaks > 1:
+ # --findleaks implies --fail-env-changed
+ ns.fail_env_changed = True
if ns.single and ns.fromfile:
parser.error("-s and -f don't go together!")
if ns.use_mp is not None and ns.trace:
parser.error("-T and -j don't go together!")
- if ns.use_mp is not None and ns.findleaks:
- parser.error("-l and -j don't go together!")
if ns.failfast and not (ns.verbose or ns.verbose3):
parser.error("-G/--failfast needs either -v or -W")
if ns.pgo and (ns.verbose or ns.verbose2 or ns.verbose3):
with open(ns.match_filename) as fp:
for line in fp:
ns.match_tests.append(line.strip())
+ if ns.forever:
+ # --forever implies --failfast
+ ns.failfast = True
return ns
import datetime
import faulthandler
-import json
import locale
import os
import platform
findtests, runtest, get_abs_module,
STDTESTS, NOTTESTS, PASSED, FAILED, ENV_CHANGED, SKIPPED, RESOURCE_DENIED,
INTERRUPTED, CHILD_ERROR, TEST_DID_NOT_RUN,
- PROGRESS_MIN_TIME, format_test_result)
+ PROGRESS_MIN_TIME, format_test_result, is_failed)
from test.libregrtest.setup import setup_tests
from test.libregrtest.utils import removepy, count, format_duration, printlist
from test import support
-try:
- import gc
-except ImportError:
- gc = None
-
-
-# When tests are run from the Python build directory, it is best practice
-# to keep the test files in a subfolder. This eases the cleanup of leftover
-# files using the "make distclean" command.
-if sysconfig.is_python_build():
- TEMPDIR = sysconfig.get_config_var('abs_builddir')
- if TEMPDIR is None:
- # bpo-30284: On Windows, only srcdir is available. Using abs_builddir
- # mostly matters on UNIX when building Python out of the source tree,
- # especially when the source tree is read only.
- TEMPDIR = sysconfig.get_config_var('srcdir')
- TEMPDIR = os.path.join(TEMPDIR, 'build')
-else:
- TEMPDIR = tempfile.gettempdir()
-TEMPDIR = os.path.abspath(TEMPDIR)
class Regrtest:
self.skipped = []
self.resource_denieds = []
self.environment_changed = []
- self.rerun = []
self.run_no_tests = []
+ self.rerun = []
self.first_result = None
self.interrupted = False
# used by --coverage, trace.Trace instance
self.tracer = None
- # used by --findleaks, store for gc.garbage
- self.found_garbage = []
-
# used to display the progress bar "[ 3/100]"
self.start_time = time.monotonic()
self.test_count = ''
# used by --junit-xml
self.testsuite_xml = None
- def accumulate_result(self, test, result):
- ok, test_time, xml_data = result
- if ok not in (CHILD_ERROR, INTERRUPTED):
- self.test_times.append((test_time, test))
+ # misc
+ self.win_load_tracker = None
+ self.tmp_dir = None
+ self.worker_test_name = None
+
+ def get_executed(self):
+ return (set(self.good) | set(self.bad) | set(self.skipped)
+ | set(self.resource_denieds) | set(self.environment_changed)
+ | set(self.run_no_tests))
+
+ def accumulate_result(self, result, rerun=False):
+ test_name = result.test_name
+ ok = result.result
+
+ if ok not in (CHILD_ERROR, INTERRUPTED) and not rerun:
+ self.test_times.append((result.test_time, test_name))
+
if ok == PASSED:
- self.good.append(test)
+ self.good.append(test_name)
elif ok in (FAILED, CHILD_ERROR):
- self.bad.append(test)
+ if not rerun:
+ self.bad.append(test_name)
elif ok == ENV_CHANGED:
- self.environment_changed.append(test)
+ self.environment_changed.append(test_name)
elif ok == SKIPPED:
- self.skipped.append(test)
+ self.skipped.append(test_name)
elif ok == RESOURCE_DENIED:
- self.skipped.append(test)
- self.resource_denieds.append(test)
+ self.skipped.append(test_name)
+ self.resource_denieds.append(test_name)
elif ok == TEST_DID_NOT_RUN:
- self.run_no_tests.append(test)
- elif ok != INTERRUPTED:
+ self.run_no_tests.append(test_name)
+ elif ok == INTERRUPTED:
+ self.interrupted = True
+ else:
raise ValueError("invalid test result: %r" % ok)
+ if rerun and ok not in {FAILED, CHILD_ERROR, INTERRUPTED}:
+ self.bad.remove(test_name)
+
+ xml_data = result.xml_data
if xml_data:
import xml.etree.ElementTree as ET
for e in xml_data:
print(xml_data, file=sys.__stderr__)
raise
- def display_progress(self, test_index, test):
+ def display_progress(self, test_index, text):
if self.ns.quiet:
return
fails = len(self.bad) + len(self.environment_changed)
if fails and not self.ns.pgo:
line = f"{line}/{fails}"
- line = f"[{line}] {test}"
+ line = f"[{line}] {text}"
# add the system load prefix: "load avg: 1.80 "
- if hasattr(os, 'getloadavg'):
- load_avg_1min = os.getloadavg()[0]
- line = f"load avg: {load_avg_1min:.2f} {line}"
+ load_avg = self.getloadavg()
+ if load_avg is not None:
+ line = f"load avg: {load_avg:.2f} {line}"
# add the timestamp prefix: "0:01:05 "
test_time = time.monotonic() - self.start_time
"faulthandler.dump_traceback_later", file=sys.stderr)
ns.timeout = None
- if ns.threshold is not None and gc is None:
- print('No GC available, ignore --threshold.', file=sys.stderr)
- ns.threshold = None
-
- if ns.findleaks:
- if gc is not None:
- # Uncomment the line below to report garbage that is not
- # freeable by reference counting alone. By default only
- # garbage that is not collectable by the GC is reported.
- pass
- #gc.set_debug(gc.DEBUG_SAVEALL)
- else:
- print('No GC available, disabling --findleaks',
- file=sys.stderr)
- ns.findleaks = False
-
if ns.xmlpath:
support.junit_xml_list = self.testsuite_xml = []
+ worker_args = ns.worker_args
+ if worker_args is not None:
+ from test.libregrtest.runtest_mp import parse_worker_args
+ ns, test_name = parse_worker_args(ns.worker_args)
+ ns.worker_args = worker_args
+ self.worker_test_name = test_name
+
# Strip .py extensions.
removepy(ns.args)
self.tests = tests
if self.ns.single:
- self.next_single_filename = os.path.join(TEMPDIR, 'pynexttest')
+ self.next_single_filename = os.path.join(self.tmp_dir, 'pynexttest')
try:
with open(self.next_single_filename, 'r') as fp:
next_test = fp.read().strip()
support.verbose = False
support.set_match_tests(self.ns.match_tests)
- for test in self.selected:
- abstest = get_abs_module(self.ns, test)
+ for test_name in self.selected:
+ abstest = get_abs_module(self.ns, test_name)
try:
suite = unittest.defaultTestLoader.loadTestsFromName(abstest)
self._list_cases(suite)
except unittest.SkipTest:
- self.skipped.append(test)
+ self.skipped.append(test_name)
if self.skipped:
print(file=sys.stderr)
print()
print("Re-running failed tests in verbose mode")
self.rerun = self.bad[:]
- for test in self.rerun:
- print("Re-running test %r in verbose mode" % test, flush=True)
- try:
- self.ns.verbose = True
- ok = runtest(self.ns, test)
- except KeyboardInterrupt:
- self.interrupted = True
- # print a newline separate from the ^C
- print()
+ for test_name in self.rerun:
+ print(f"Re-running {test_name} in verbose mode", flush=True)
+ self.ns.verbose = True
+ result = runtest(self.ns, test_name)
+
+ self.accumulate_result(result, rerun=True)
+
+ if result.result == INTERRUPTED:
break
- else:
- if ok[0] in {PASSED, ENV_CHANGED, SKIPPED, RESOURCE_DENIED}:
- self.bad.remove(test)
- else:
- if self.bad:
- print(count(len(self.bad), 'test'), "failed again:")
- printlist(self.bad)
+
+ if self.bad:
+ print(count(len(self.bad), 'test'), "failed again:")
+ printlist(self.bad)
self.display_result()
print("== Tests result: %s ==" % self.get_tests_result())
if self.interrupted:
- print()
- # print a newline after ^C
print("Test suite interrupted by signal SIGINT.")
- executed = set(self.good) | set(self.bad) | set(self.skipped)
- omitted = set(self.selected) - executed
+
+ omitted = set(self.selected) - self.get_executed()
+ if omitted:
+ print()
print(count(len(omitted), "test"), "omitted:")
printlist(omitted)
self.test_times.sort(reverse=True)
print()
print("10 slowest tests:")
- for time, test in self.test_times[:10]:
- print("- %s: %s" % (test, format_duration(time)))
+ for test_time, test in self.test_times[:10]:
+ print("- %s: %s" % (test, format_duration(test_time)))
if self.bad:
print()
print("Run tests sequentially")
previous_test = None
- for test_index, test in enumerate(self.tests, 1):
+ for test_index, test_name in enumerate(self.tests, 1):
start_time = time.monotonic()
- text = test
+ text = test_name
if previous_test:
text = '%s -- %s' % (text, previous_test)
self.display_progress(test_index, text)
if self.tracer:
# If we're tracing code coverage, then we don't exit with status
# if on a false return value from main.
- cmd = ('result = runtest(self.ns, test); '
- 'self.accumulate_result(test, result)')
+ cmd = ('result = runtest(self.ns, test_name); '
+ 'self.accumulate_result(result)')
ns = dict(locals())
self.tracer.runctx(cmd, globals=globals(), locals=ns)
result = ns['result']
else:
- try:
- result = runtest(self.ns, test)
- except KeyboardInterrupt:
- self.interrupted = True
- self.accumulate_result(test, (INTERRUPTED, None, None))
- break
- else:
- self.accumulate_result(test, result)
-
- previous_test = format_test_result(test, result[0])
+ result = runtest(self.ns, test_name)
+ self.accumulate_result(result)
+
+ if result.result == INTERRUPTED:
+ break
+
+ previous_test = format_test_result(result)
test_time = time.monotonic() - start_time
if test_time >= PROGRESS_MIN_TIME:
previous_test = "%s in %s" % (previous_test, format_duration(test_time))
- elif result[0] == PASSED:
+ elif result.result == PASSED:
# be quiet: say nothing if the test passed shortly
previous_test = None
- if self.ns.findleaks:
- gc.collect()
- if gc.garbage:
- print("Warning: test created", len(gc.garbage), end=' ')
- print("uncollectable object(s).")
- # move the uncollectable objects somewhere so we don't see
- # them again
- self.found_garbage.extend(gc.garbage)
- del gc.garbage[:]
-
# Unload the newly imported modules (best effort finalization)
for module in sys.modules.keys():
if module not in save_modules and module.startswith("test."):
support.unload(module)
+ if self.ns.failfast and is_failed(result, self.ns):
+ break
+
if previous_test:
print(previous_test)
def _test_forever(self, tests):
while True:
- for test in tests:
- yield test
+ for test_name in tests:
+ yield test_name
if self.bad:
return
if self.ns.fail_env_changed and self.environment_changed:
self.run_tests_sequential()
def finalize(self):
+ if self.win_load_tracker is not None:
+ self.win_load_tracker.close()
+ self.win_load_tracker = None
+
if self.next_single_filename:
if self.next_single_test:
with open(self.next_single_filename, 'w') as fp:
for s in ET.tostringlist(root):
f.write(s)
- def main(self, tests=None, **kwargs):
- global TEMPDIR
- self.ns = self.parse_args(kwargs)
-
+ def create_temp_dir(self):
if self.ns.tempdir:
- TEMPDIR = self.ns.tempdir
- elif self.ns.worker_args:
- ns_dict, _ = json.loads(self.ns.worker_args)
- TEMPDIR = ns_dict.get("tempdir") or TEMPDIR
+ self.tmp_dir = self.ns.tempdir
+
+ if not self.tmp_dir:
+ # When tests are run from the Python build directory, it is best practice
+ # to keep the test files in a subfolder. This eases the cleanup of leftover
+ # files using the "make distclean" command.
+ if sysconfig.is_python_build():
+ self.tmp_dir = sysconfig.get_config_var('abs_builddir')
+ if self.tmp_dir is None:
+ # bpo-30284: On Windows, only srcdir is available. Using
+ # abs_builddir mostly matters on UNIX when building Python
+ # out of the source tree, especially when the source tree
+ # is read only.
+ self.tmp_dir = sysconfig.get_config_var('srcdir')
+ self.tmp_dir = os.path.join(self.tmp_dir, 'build')
+ else:
+ self.tmp_dir = tempfile.gettempdir()
- os.makedirs(TEMPDIR, exist_ok=True)
+ self.tmp_dir = os.path.abspath(self.tmp_dir)
+ os.makedirs(self.tmp_dir, exist_ok=True)
# Define a writable temp dir that will be used as cwd while running
# the tests. The name of the dir includes the pid to allow parallel
# testing (see the -j option).
- test_cwd = 'test_python_{}'.format(os.getpid())
- test_cwd = os.path.join(TEMPDIR, test_cwd)
+ pid = os.getpid()
+ if self.worker_test_name is not None:
+ test_cwd = 'worker_{}'.format(pid)
+ else:
+ test_cwd = 'test_python_{}'.format(pid)
+ test_cwd = os.path.join(self.tmp_dir, test_cwd)
+ return test_cwd
+
+ def main(self, tests=None, **kwargs):
+ self.ns = self.parse_args(kwargs)
+
+ test_cwd = self.create_temp_dir()
- # Run the tests in a context manager that temporarily changes the CWD to a
- # temporary and writable directory. If it's not possible to create or
- # change the CWD, the original CWD will be used. The original CWD is
- # available from support.SAVEDCWD.
+ # Run the tests in a context manager that temporarily changes the CWD
+ # to a temporary and writable directory. If it's not possible to
+ # create or change the CWD, the original CWD will be used.
+ # The original CWD is available from support.SAVEDCWD.
with support.temp_cwd(test_cwd, quiet=True):
+ # When using multiprocessing, worker processes will use test_cwd
+ # as their parent temporary directory. So when the main process
+            # exits, it also removes the subdirectories of worker processes.
+ self.ns.tempdir = test_cwd
self._main(tests, kwargs)
+ def getloadavg(self):
+ if self.win_load_tracker is not None:
+ return self.win_load_tracker.getloadavg()
+
+ if hasattr(os, 'getloadavg'):
+ return os.getloadavg()[0]
+
+ return None
+
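The getloadavg() fallback above can be sketched standalone. This is a hedged sketch: the real `WindowsLoadTracker` object lives in `test.libregrtest.win_utils` and is replaced here by an optional parameter.

```python
import os

def get_load_avg(win_load_tracker=None):
    """Return the 1-minute system load average, or None if unavailable.

    A dedicated tracker object (used on Windows) takes priority;
    otherwise fall back to os.getloadavg() when the platform has it.
    """
    if win_load_tracker is not None:
        return win_load_tracker.getloadavg()
    if hasattr(os, 'getloadavg'):
        return os.getloadavg()[0]
    # No load average available on this platform
    return None
```

On platforms without `os.getloadavg()` (and no tracker), the progress line simply omits the "load avg:" prefix.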
def _main(self, tests, kwargs):
if self.ns.huntrleaks:
warmup, repetitions, _ = self.ns.huntrleaks
print(msg, file=sys.stderr, flush=True)
sys.exit(2)
- if self.ns.worker_args is not None:
+ if self.worker_test_name is not None:
from test.libregrtest.runtest_mp import run_tests_worker
- run_tests_worker(self.ns.worker_args)
+ run_tests_worker(self.ns, self.worker_test_name)
if self.ns.wait:
input("Press any key to continue...")
self.list_cases()
sys.exit(0)
+    # If we're on Windows and this is the parent runner (not a worker),
+ # track the load average.
+ if sys.platform == 'win32' and self.worker_test_name is None:
+ from test.libregrtest.win_utils import WindowsLoadTracker
+
+ try:
+ self.win_load_tracker = WindowsLoadTracker()
+ except FileNotFoundError as error:
+ # Windows IoT Core and Windows Nano Server do not provide
+ # typeperf.exe for x64, x86 or ARM
+ print(f'Failed to create WindowsLoadTracker: {error}')
+
self.run_tests()
self.display_result()
-import errno
import os
import re
import sys
try:
from _abc import _get_dump
except ImportError:
+ import weakref
+
def _get_dump(cls):
- # For legacy Python version
- return (cls._abc_registry, cls._abc_cache,
+        # Reimplement _get_dump() for the pure-Python implementation
+        # of the abc module (Lib/_py_abc.py)
+ registry_weakrefs = set(weakref.ref(obj) for obj in cls._abc_registry)
+ return (registry_weakrefs, cls._abc_cache,
cls._abc_negative_cache, cls._abc_negative_cache_version)
-def dash_R(the_module, test, indirect_test, huntrleaks):
+def dash_R(ns, test_name, test_func):
"""Run a test multiple times, looking for reference leaks.
Returns:
raise Exception("Tracking reference leaks requires a debug build "
"of Python")
+ # Avoid false positives due to various caches
+ # filling slowly with random data:
+ warm_caches()
+
# Save current values for dash_R_cleanup() to restore.
fs = warnings.filters[:]
ps = copyreg.dispatch_table.copy()
def get_pooled_int(value):
return int_pool.setdefault(value, value)
- nwarmup, ntracked, fname = huntrleaks
+ nwarmup, ntracked, fname = ns.huntrleaks
fname = os.path.join(support.SAVEDCWD, fname)
repcount = nwarmup + ntracked
+
+ # Pre-allocate to ensure that the loop doesn't allocate anything new
+ rep_range = list(range(repcount))
rc_deltas = [0] * repcount
alloc_deltas = [0] * repcount
fd_deltas = [0] * repcount
+ getallocatedblocks = sys.getallocatedblocks
+ gettotalrefcount = sys.gettotalrefcount
+ fd_count = support.fd_count
- print("beginning", repcount, "repetitions", file=sys.stderr)
- print(("1234567890"*(repcount//10 + 1))[:repcount], file=sys.stderr,
- flush=True)
# initialize variables to make pyflakes quiet
rc_before = alloc_before = fd_before = 0
- for i in range(repcount):
- indirect_test()
- alloc_after, rc_after, fd_after = dash_R_cleanup(fs, ps, pic, zdc,
- abcs)
- print('.', end='', file=sys.stderr, flush=True)
- if i >= nwarmup:
- rc_deltas[i] = get_pooled_int(rc_after - rc_before)
- alloc_deltas[i] = get_pooled_int(alloc_after - alloc_before)
- fd_deltas[i] = get_pooled_int(fd_after - fd_before)
+
+ if not ns.quiet:
+ print("beginning", repcount, "repetitions", file=sys.stderr)
+ print(("1234567890"*(repcount//10 + 1))[:repcount], file=sys.stderr,
+ flush=True)
+
+ dash_R_cleanup(fs, ps, pic, zdc, abcs)
+
+ for i in rep_range:
+ test_func()
+ dash_R_cleanup(fs, ps, pic, zdc, abcs)
+
+ # dash_R_cleanup() ends with collecting cyclic trash:
+ # read memory statistics immediately after.
+ alloc_after = getallocatedblocks()
+ rc_after = gettotalrefcount()
+ fd_after = fd_count()
+
+ if not ns.quiet:
+ print('.', end='', file=sys.stderr, flush=True)
+
+ rc_deltas[i] = get_pooled_int(rc_after - rc_before)
+ alloc_deltas[i] = get_pooled_int(alloc_after - alloc_before)
+ fd_deltas[i] = get_pooled_int(fd_after - fd_before)
+
alloc_before = alloc_after
rc_before = rc_after
fd_before = fd_after
- print(file=sys.stderr)
+
+ if not ns.quiet:
+ print(file=sys.stderr)
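The int-pooling trick behind get_pooled_int() above keeps the measurement loop from allocating a fresh int object for every repeated delta value, which would itself perturb the allocation counts. A minimal sketch of the technique:

```python
def make_int_pool():
    # Intern equal values: the first object seen for a given value is
    # stored and returned for all later equal values, so the loop reuses
    # one object per value instead of allocating a new int each time.
    int_pool = {}
    def get_pooled_int(value):
        return int_pool.setdefault(value, value)
    return get_pooled_int

get_pooled_int = make_int_pool()
a = get_pooled_int(10 ** 6)
b = get_pooled_int(10 ** 6)  # same object as a, not just an equal one
```

Small ints are already cached by CPython; the pool matters for larger deltas, which would otherwise be distinct objects on every iteration.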
# These checkers return False on success, True on failure
def check_rc_deltas(deltas):
deltas = deltas[nwarmup:]
if checker(deltas):
msg = '%s leaked %s %s, sum=%s' % (
- test, deltas, item_name, sum(deltas))
+ test_name, deltas, item_name, sum(deltas))
print(msg, file=sys.stderr, flush=True)
with open(fname, "a") as refrep:
print(msg, file=refrep)
def dash_R_cleanup(fs, ps, pic, zdc, abcs):
- import gc, copyreg
+ import copyreg
import collections.abc
# Restore some original values.
clear_caches()
- # Collect cyclic trash and read memory statistics immediately after.
- func1 = sys.getallocatedblocks
- func2 = sys.gettotalrefcount
- gc.collect()
- return func1(), func2(), support.fd_count()
-
def clear_caches():
- import gc
-
# Clear the warnings registry, so they can be displayed again
for mod in sys.modules.values():
if hasattr(mod, '__warningregistry__'):
for f in typing._cleanups:
f()
- gc.collect()
+ support.gc_collect()
def warm_caches():
+import collections
import faulthandler
+import functools
+import gc
import importlib
import io
import os
import time
import traceback
import unittest
+
from test import support
from test.libregrtest.refleak import dash_R, clear_caches
from test.libregrtest.save_env import saved_test_environment
+from test.libregrtest.utils import print_warning
# Test result constants.
RESOURCE_DENIED = -3
INTERRUPTED = -4
CHILD_ERROR = -5 # error in a child process
-TEST_DID_NOT_RUN = -6 # error in a child process
+TEST_DID_NOT_RUN = -6
_FORMAT_TEST_RESULT = {
PASSED: '%s passed',
NOTTESTS = set()
-def format_test_result(test_name, result):
- fmt = _FORMAT_TEST_RESULT.get(result, "%s")
- return fmt % test_name
+# used by --findleaks, store for gc.garbage
+FOUND_GARBAGE = []
+
+
+def is_failed(result, ns):
+ ok = result.result
+ if ok in (PASSED, RESOURCE_DENIED, SKIPPED, TEST_DID_NOT_RUN):
+ return False
+ if ok == ENV_CHANGED:
+ return ns.fail_env_changed
+ return True
+
+
+def format_test_result(result):
+ fmt = _FORMAT_TEST_RESULT.get(result.result, "%s")
+ return fmt % result.test_name
+
+
+def findtestdir(path=None):
+ return path or os.path.dirname(os.path.dirname(__file__)) or os.curdir
def findtests(testdir=None, stdtests=STDTESTS, nottests=NOTTESTS):
return stdtests + sorted(tests)
-def get_abs_module(ns, test):
- if test.startswith('test.') or ns.testdir:
- return test
+def get_abs_module(ns, test_name):
+ if test_name.startswith('test.') or ns.testdir:
+ return test_name
else:
- # Always import it from the test package
- return 'test.' + test
-
+ # Import it from the test package
+ return 'test.' + test_name
-def runtest(ns, test):
- """Run a single test.
-
- ns -- regrtest namespace of options
- test -- the name of the test
-
- Returns the tuple (result, test_time, xml_data), where result is one
- of the constants:
- INTERRUPTED KeyboardInterrupt when run under -j
- RESOURCE_DENIED test skipped because resource denied
- SKIPPED test skipped for some other reason
- ENV_CHANGED test failed because it changed the execution environment
- FAILED test failed
- PASSED test passed
- EMPTY_TEST_SUITE test ran no subtests.
+TestResult = collections.namedtuple('TestResult',
+ 'test_name result test_time xml_data')
- If ns.xmlpath is not None, xml_data is a list containing each
- generated testsuite element.
- """
+def _runtest(ns, test_name):
+ # Handle faulthandler timeout, capture stdout+stderr, XML serialization
+ # and measure time.
output_on_failure = ns.verbose3
use_timeout = (ns.timeout is not None)
if use_timeout:
faulthandler.dump_traceback_later(ns.timeout, exit=True)
+
+ start_time = time.perf_counter()
try:
support.set_match_tests(ns.match_tests)
- # reset the environment_altered flag to detect if a test altered
- # the environment
- support.environment_altered = False
support.junit_xml_list = xml_list = [] if ns.xmlpath else None
if ns.failfast:
support.failfast = True
+
if output_on_failure:
support.verbose = True
try:
sys.stdout = stream
sys.stderr = stream
- result = runtest_inner(ns, test, display_failure=False)
- if result[0] != PASSED:
+ result = _runtest_inner(ns, test_name,
+ display_failure=False)
+ if result != PASSED:
output = stream.getvalue()
orig_stderr.write(output)
orig_stderr.flush()
sys.stdout = orig_stdout
sys.stderr = orig_stderr
else:
- support.verbose = ns.verbose # Tell tests to be moderately quiet
- result = runtest_inner(ns, test, display_failure=not ns.verbose)
+ # Tell tests to be moderately quiet
+ support.verbose = ns.verbose
+
+ result = _runtest_inner(ns, test_name,
+ display_failure=not ns.verbose)
if xml_list:
import xml.etree.ElementTree as ET
xml_data = [ET.tostring(x).decode('us-ascii') for x in xml_list]
else:
xml_data = None
- return result + (xml_data,)
+
+ test_time = time.perf_counter() - start_time
+
+ return TestResult(test_name, result, test_time, xml_data)
finally:
if use_timeout:
faulthandler.cancel_dump_traceback_later()
- cleanup_test_droppings(test, ns.verbose)
support.junit_xml_list = None
-def post_test_cleanup():
+def runtest(ns, test_name):
+ """Run a single test.
+
+ ns -- regrtest namespace of options
+ test_name -- the name of the test
+
+ Returns the tuple (result, test_time, xml_data), where result is one
+ of the constants:
+
+ INTERRUPTED KeyboardInterrupt
+ RESOURCE_DENIED test skipped because resource denied
+ SKIPPED test skipped for some other reason
+ ENV_CHANGED test failed because it changed the execution environment
+ FAILED test failed
+ PASSED test passed
+ EMPTY_TEST_SUITE test ran no subtests.
+
+ If ns.xmlpath is not None, xml_data is a list containing each
+ generated testsuite element.
+ """
+ try:
+ return _runtest(ns, test_name)
+ except:
+ if not ns.pgo:
+ msg = traceback.format_exc()
+ print(f"test {test_name} crashed -- {msg}",
+ file=sys.stderr, flush=True)
+ return TestResult(test_name, FAILED, 0.0, None)
+
+
+def _test_module(the_module):
+ loader = unittest.TestLoader()
+ tests = loader.loadTestsFromModule(the_module)
+ for error in loader.errors:
+ print(error, file=sys.stderr)
+ if loader.errors:
+ raise Exception("errors while loading tests")
+ support.run_unittest(tests)
+
+
+def _runtest_inner2(ns, test_name):
+    # Load the test module, run its tests, and handle the huntrleaks
+    # and findleaks options to detect leaks.
+
+ abstest = get_abs_module(ns, test_name)
+
+    # remove the module from sys.modules to reload it if it was already imported
+ support.unload(abstest)
+
+ the_module = importlib.import_module(abstest)
+
+ # If the test has a test_main, that will run the appropriate
+ # tests. If not, use normal unittest test loading.
+ test_runner = getattr(the_module, "test_main", None)
+ if test_runner is None:
+ test_runner = functools.partial(_test_module, the_module)
+
+ try:
+ if ns.huntrleaks:
+ # Return True if the test leaked references
+ refleak = dash_R(ns, test_name, test_runner)
+ else:
+ test_runner()
+ refleak = False
+ finally:
+ cleanup_test_droppings(test_name, ns.verbose)
+
+ support.gc_collect()
+
+ if gc.garbage:
+ support.environment_altered = True
+ print_warning(f"{test_name} created {len(gc.garbage)} "
+ f"uncollectable object(s).")
+
+ # move the uncollectable objects somewhere,
+ # so we don't see them again
+ FOUND_GARBAGE.extend(gc.garbage)
+ gc.garbage.clear()
+
support.reap_children()
+ return refleak
+
+
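The gc.garbage check above normally only fires for truly uncollectable objects. With `gc.DEBUG_SAVEALL` the collector saves every collected object there, which makes the move-aside pattern easy to demonstrate. A sketch, not regrtest code:

```python
import gc

FOUND_GARBAGE = []

def drain_garbage():
    # Collect cyclic trash, then move anything reported in gc.garbage
    # aside so later checks do not see it again.
    gc.collect()
    found = len(gc.garbage)
    FOUND_GARBAGE.extend(gc.garbage)
    gc.garbage.clear()
    return found

# DEBUG_SAVEALL makes the collector report *all* collected objects,
# not only uncollectable ones, so we can trigger the code path.
gc.set_debug(gc.DEBUG_SAVEALL)
cycle = []
cycle.append(cycle)   # self-referential list: only reachable via a cycle
del cycle
count = drain_garbage()
gc.set_debug(0)
```

In regrtest itself the objects are kept in a module-level `FOUND_GARBAGE` list precisely so repeated `gc.garbage` checks in later tests stay clean.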
+def _runtest_inner(ns, test_name, display_failure=True):
+ # Detect environment changes, handle exceptions.
-def runtest_inner(ns, test, display_failure=True):
- support.unload(test)
+ # Reset the environment_altered flag to detect if a test altered
+ # the environment
+ support.environment_altered = False
+
+ if ns.pgo:
+ display_failure = False
- test_time = 0.0
- refleak = False # True if the test leaked references.
try:
- abstest = get_abs_module(ns, test)
clear_caches()
- with saved_test_environment(test, ns.verbose, ns.quiet, pgo=ns.pgo) as environment:
- start_time = time.perf_counter()
- the_module = importlib.import_module(abstest)
- # If the test has a test_main, that will run the appropriate
- # tests. If not, use normal unittest test loading.
- test_runner = getattr(the_module, "test_main", None)
- if test_runner is None:
- def test_runner():
- loader = unittest.TestLoader()
- tests = loader.loadTestsFromModule(the_module)
- for error in loader.errors:
- print(error, file=sys.stderr)
- if loader.errors:
- raise Exception("errors while loading tests")
- support.run_unittest(tests)
- if ns.huntrleaks:
- refleak = dash_R(the_module, test, test_runner, ns.huntrleaks)
- else:
- test_runner()
- test_time = time.perf_counter() - start_time
- post_test_cleanup()
+
+ with saved_test_environment(test_name, ns.verbose, ns.quiet, pgo=ns.pgo) as environment:
+ refleak = _runtest_inner2(ns, test_name)
except support.ResourceDenied as msg:
if not ns.quiet and not ns.pgo:
- print(test, "skipped --", msg, flush=True)
- return RESOURCE_DENIED, test_time
+ print(f"{test_name} skipped -- {msg}", flush=True)
+ return RESOURCE_DENIED
except unittest.SkipTest as msg:
if not ns.quiet and not ns.pgo:
- print(test, "skipped --", msg, flush=True)
- return SKIPPED, test_time
- except KeyboardInterrupt:
- raise
- except support.TestFailed as msg:
- if not ns.pgo:
- if display_failure:
- print("test", test, "failed --", msg, file=sys.stderr,
- flush=True)
- else:
- print("test", test, "failed", file=sys.stderr, flush=True)
- return FAILED, test_time
+ print(f"{test_name} skipped -- {msg}", flush=True)
+ return SKIPPED
+ except support.TestFailed as exc:
+ msg = f"test {test_name} failed"
+ if display_failure:
+ msg = f"{msg} -- {exc}"
+ print(msg, file=sys.stderr, flush=True)
+ return FAILED
except support.TestDidNotRun:
- return TEST_DID_NOT_RUN, test_time
+ return TEST_DID_NOT_RUN
+ except KeyboardInterrupt:
+ print()
+ return INTERRUPTED
except:
- msg = traceback.format_exc()
if not ns.pgo:
- print("test", test, "crashed --", msg, file=sys.stderr,
- flush=True)
- return FAILED, test_time
- else:
- if refleak:
- return FAILED, test_time
- if environment.changed:
- return ENV_CHANGED, test_time
- return PASSED, test_time
+ msg = traceback.format_exc()
+ print(f"test {test_name} crashed -- {msg}",
+ file=sys.stderr, flush=True)
+ return FAILED
+ if refleak:
+ return FAILED
+ if environment.changed:
+ return ENV_CHANGED
+ return PASSED
-def cleanup_test_droppings(testname, verbose):
- import shutil
- import stat
- import gc
+def cleanup_test_droppings(test_name, verbose):
# First kill any dangling references to open files etc.
# This can also issue some ResourceWarnings which would otherwise get
# triggered during the following test run, and possibly produce failures.
- gc.collect()
+ support.gc_collect()
# Try to clean up junk commonly left behind. While tests shouldn't leave
# any files or directories behind, when a test fails that can be tedious
continue
if os.path.isdir(name):
+ import shutil
kind, nuker = "directory", shutil.rmtree
elif os.path.isfile(name):
kind, nuker = "file", os.unlink
else:
- raise SystemError("os.path says %r exists but is neither "
- "directory nor file" % name)
+        raise RuntimeError(f"os.path says {name!r} exists but is neither "
+                           "directory nor file")
if verbose:
- print("%r left behind %s %r" % (testname, kind, name))
+ print_warning("%r left behind %s %r" % (test_name, kind, name))
+ support.environment_altered = True
+
try:
- # if we have chmod, fix possible permissions problems
- # that might prevent cleanup
- if (hasattr(os, 'chmod')):
- os.chmod(name, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO)
+ import stat
+ # fix possible permissions problems that might prevent cleanup
+ os.chmod(name, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO)
nuker(name)
- except Exception as msg:
- print(("%r left behind %s %r and it couldn't be "
- "removed: %s" % (testname, kind, name, msg)), file=sys.stderr)
-
-
-def findtestdir(path=None):
- return path or os.path.dirname(os.path.dirname(__file__)) or os.curdir
+ except Exception as exc:
+ print_warning(f"{test_name} left behind {kind} {name!r} "
+ f"and it couldn't be removed: {exc}")
+import collections
import faulthandler
import json
import os
import queue
+import subprocess
import sys
import threading
import time
from test.libregrtest.runtest import (
runtest, INTERRUPTED, CHILD_ERROR, PROGRESS_MIN_TIME,
- format_test_result)
+ format_test_result, TestResult, is_failed)
from test.libregrtest.setup import setup_tests
from test.libregrtest.utils import format_duration
# Display the running tests if nothing happened last N seconds
PROGRESS_UPDATE = 30.0 # seconds
-# If interrupted, display the wait progress every N seconds
-WAIT_PROGRESS = 2.0 # seconds
+# Time to wait until a worker completes: should be immediate
+JOIN_TIMEOUT = 30.0 # seconds
-def run_test_in_subprocess(testname, ns):
- """Run the given test in a subprocess with --worker-args.
+def must_stop(result, ns):
+ if result.result == INTERRUPTED:
+ return True
+ if ns.failfast and is_failed(result, ns):
+ return True
+ return False
+
+
+def parse_worker_args(worker_args):
+ ns_dict, test_name = json.loads(worker_args)
+ ns = types.SimpleNamespace(**ns_dict)
+ return (ns, test_name)
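The worker handoff relies on the options namespace being JSON round-trippable: the parent serializes `vars(ns)` together with the test name, and parse_worker_args() rebuilds a SimpleNamespace on the worker side. A minimal round trip (note that only JSON-encodable attribute values survive):

```python
import json
import types

def dump_worker_args(ns, test_name):
    # Parent side: namespace -> plain dict -> JSON string.
    return json.dumps((vars(ns), test_name))

def parse_worker_args(worker_args):
    # Worker side: JSON string -> dict -> SimpleNamespace.
    ns_dict, test_name = json.loads(worker_args)
    return types.SimpleNamespace(**ns_dict), test_name

# Hypothetical option values for illustration only
ns = types.SimpleNamespace(verbose=2, failfast=True, match_tests=None)
blob = dump_worker_args(ns, 'test_os')
ns2, name = parse_worker_args(blob)
```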
- ns is the option Namespace parsed from command-line arguments. regrtest
- is invoked in a subprocess with the --worker-args argument; when the
- subprocess exits, its return code, stdout and stderr are returned as a
- 3-tuple.
- """
- from subprocess import Popen, PIPE
+def run_test_in_subprocess(testname, ns):
ns_dict = vars(ns)
worker_args = (ns_dict, testname)
worker_args = json.dumps(worker_args)
'-u', # Unbuffered stdout and stderr
'-m', 'test.regrtest',
'--worker-args', worker_args]
- if ns.pgo:
- cmd += ['--pgo']
# Running the child from the same working directory as regrtest's original
# invocation ensures that TEMPDIR for the child is the same when
# sysconfig.is_python_build() is true. See issue 15300.
- popen = Popen(cmd,
- stdout=PIPE, stderr=PIPE,
- universal_newlines=True,
- close_fds=(os.name != 'nt'),
- cwd=support.SAVEDCWD)
- with popen:
- stdout, stderr = popen.communicate()
- retcode = popen.wait()
- return retcode, stdout, stderr
-
-
-def run_tests_worker(worker_args):
- ns_dict, testname = json.loads(worker_args)
- ns = types.SimpleNamespace(**ns_dict)
+ return subprocess.Popen(cmd,
+ stdout=subprocess.PIPE,
+ stderr=subprocess.PIPE,
+ universal_newlines=True,
+ close_fds=(os.name != 'nt'),
+ cwd=support.SAVEDCWD)
+
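Launching an unbuffered Python worker and reading its output works as in this reduced sketch; the real command also passes `-m test.regrtest --worker-args` and runs from `support.SAVEDCWD`.

```python
import subprocess
import sys

def run_child(code):
    # -u keeps the child's stdout/stderr unbuffered so the parent
    # sees output promptly rather than on buffer flushes.
    popen = subprocess.Popen([sys.executable, '-u', '-c', code],
                             stdout=subprocess.PIPE,
                             stderr=subprocess.PIPE,
                             universal_newlines=True)
    # Popen is a context manager: pipes are closed and the process
    # is waited on when the block exits.
    with popen:
        stdout, stderr = popen.communicate()
    return popen.returncode, stdout, stderr

retcode, out, err = run_child('print("ok")')
```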
+def run_tests_worker(ns, test_name):
setup_tests(ns)
- try:
- result = runtest(ns, testname)
- except KeyboardInterrupt:
- result = INTERRUPTED, '', None
- except BaseException as e:
- traceback.print_exc()
- result = CHILD_ERROR, str(e)
+ result = runtest(ns, test_name)
print() # Force a newline (just in case)
- print(json.dumps(result), flush=True)
+
+    # Serialize TestResult as a JSON list
+ print(json.dumps(list(result)), flush=True)
sys.exit(0)
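TestResult is written as a plain JSON list on the worker's stdout; the parent can rebuild the namedtuple by splatting the decoded list. A sketch of the round trip (the result code 0 here is just a stand-in value):

```python
import collections
import json

TestResult = collections.namedtuple('TestResult',
                                    'test_name result test_time xml_data')

result = TestResult('test_os', 0, 1.5, None)

# Worker side: namedtuple -> list -> one JSON line on stdout.
line = json.dumps(list(result))

# Parent side: JSON line -> list -> namedtuple again.
decoded = TestResult(*json.loads(line))
```

namedtuples compare equal field by field, so the round trip preserves equality as long as every field is JSON-encodable.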
"""A thread-safe iterator over tests for multiprocess mode."""
- def __init__(self, tests):
- self.interrupted = False
+ def __init__(self, tests_iter):
self.lock = threading.Lock()
- self.tests = tests
+ self.tests_iter = tests_iter
def __iter__(self):
return self
def __next__(self):
with self.lock:
- if self.interrupted:
- raise StopIteration('tests interrupted')
- return next(self.tests)
+ if self.tests_iter is None:
+ raise StopIteration
+ return next(self.tests_iter)
+
+ def stop(self):
+ with self.lock:
+ self.tests_iter = None
+
+
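The iterator above hands out tests to worker threads under a lock; stop() swaps the underlying iterator for None so every thread then sees StopIteration. A self-contained sketch of the same pattern:

```python
import threading

class LockedIterator:
    """Thread-safe wrapper: each item is handed to exactly one consumer."""

    def __init__(self, it):
        self._lock = threading.Lock()
        self._it = it

    def __iter__(self):
        return self

    def __next__(self):
        with self._lock:
            if self._it is None:
                raise StopIteration
            return next(self._it)

    def stop(self):
        # After stop(), every consumer gets StopIteration.
        with self._lock:
            self._it = None

source = LockedIterator(iter(range(100)))
seen = []
seen_lock = threading.Lock()

def consume():
    for item in source:
        with seen_lock:
            seen.append(item)

threads = [threading.Thread(target=consume) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
source.stop()
```

The lock guarantees that `next()` on the shared iterator is never entered concurrently, so no test is dispatched to two workers.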
+MultiprocessResult = collections.namedtuple('MultiprocessResult',
+ 'result stdout stderr error_msg')
+
+class ExitThread(Exception):
+ pass
class MultiprocessThread(threading.Thread):
self.pending = pending
self.output = output
self.ns = ns
- self.current_test = None
+ self.current_test_name = None
self.start_time = None
-
- def _runtest(self):
- try:
- test = next(self.pending)
- except StopIteration:
- self.output.put((None, None, None, None))
- return True
-
+ self._popen = None
+ self._killed = False
+
+ def __repr__(self):
+ info = ['MultiprocessThread']
+ test = self.current_test_name
+ if self.is_alive():
+ info.append('alive')
+ if test:
+ info.append(f'test={test}')
+ popen = self._popen
+ if popen:
+ info.append(f'pid={popen.pid}')
+ return '<%s>' % ' '.join(info)
+
+ def kill(self):
+ self._killed = True
+
+ popen = self._popen
+ if popen is None:
+ return
+ popen.kill()
+ # stdout and stderr must be closed to ensure that communicate()
+ # does not hang
+ popen.stdout.close()
+ popen.stderr.close()
+
+ def _runtest(self, test_name):
try:
self.start_time = time.monotonic()
- self.current_test = test
-
- retcode, stdout, stderr = run_test_in_subprocess(test, self.ns)
+ self.current_test_name = test_name
+
+ self._popen = run_test_in_subprocess(test_name, self.ns)
+ popen = self._popen
+ with popen:
+ try:
+ if self._killed:
+                    # If kill() has been called before self._popen was set,
+                    # the process may still be running. Call kill() again
+                    # to ensure that the process is killed.
+ self.kill()
+ raise ExitThread
+
+ try:
+ stdout, stderr = popen.communicate()
+ except OSError:
+ if self._killed:
+ # kill() has been called: communicate() fails
+ # on reading closed stdout/stderr
+ raise ExitThread
+ raise
+ except:
+ self.kill()
+ popen.wait()
+ raise
+
+ retcode = popen.wait()
finally:
- self.current_test = None
+ self.current_test_name = None
+ self._popen = None
- if retcode != 0:
- result = (CHILD_ERROR, "Exit code %s" % retcode, None)
- self.output.put((test, stdout.rstrip(), stderr.rstrip(),
- result))
- return False
-
- stdout, _, result = stdout.strip().rpartition("\n")
- if not result:
- self.output.put((None, None, None, None))
- return True
+ stdout = stdout.strip()
+ stderr = stderr.rstrip()
- result = json.loads(result)
- assert len(result) == 3, f"Invalid result tuple: {result!r}"
- self.output.put((test, stdout.rstrip(), stderr.rstrip(),
- result))
- return False
+ err_msg = None
+ if retcode != 0:
+ err_msg = "Exit code %s" % retcode
+ else:
+ stdout, _, result = stdout.rpartition("\n")
+ stdout = stdout.rstrip()
+ if not result:
+ err_msg = "Failed to parse worker stdout"
+ else:
+ try:
+ # deserialize run_tests_worker() output
+ result = json.loads(result)
+ result = TestResult(*result)
+ except Exception as exc:
+ err_msg = "Failed to parse worker JSON: %s" % exc
+
+ if err_msg is not None:
+ test_time = time.monotonic() - self.start_time
+ result = TestResult(test_name, CHILD_ERROR, test_time, None)
+
+ return MultiprocessResult(result, stdout, stderr, err_msg)
def run(self):
- try:
- stop = False
- while not stop:
- stop = self._runtest()
- except BaseException:
- self.output.put((None, None, None, None))
- raise
+ while not self._killed:
+ try:
+ try:
+ test_name = next(self.pending)
+ except StopIteration:
+ break
+ mp_result = self._runtest(test_name)
+ self.output.put((False, mp_result))
-def run_tests_multiprocess(regrtest):
- output = queue.Queue()
- pending = MultiprocessIterator(regrtest.tests)
- test_timeout = regrtest.ns.timeout
- use_timeout = (test_timeout is not None)
-
- workers = [MultiprocessThread(pending, output, regrtest.ns)
- for i in range(regrtest.ns.use_mp)]
- print("Run tests in parallel using %s child processes"
- % len(workers))
+ if must_stop(mp_result.result, self.ns):
+ break
+ except ExitThread:
+ break
+ except BaseException:
+ self.output.put((True, traceback.format_exc()))
+ break
+
+
+def get_running(workers):
+ running = []
for worker in workers:
- worker.start()
-
- def get_running(workers):
- running = []
- for worker in workers:
- current_test = worker.current_test
- if not current_test:
- continue
- dt = time.monotonic() - worker.start_time
- if dt >= PROGRESS_MIN_TIME:
- text = '%s (%s)' % (current_test, format_duration(dt))
- running.append(text)
- return running
-
- finished = 0
- test_index = 1
- get_timeout = max(PROGRESS_UPDATE, PROGRESS_MIN_TIME)
- try:
- while finished < regrtest.ns.use_mp:
- if use_timeout:
- faulthandler.dump_traceback_later(test_timeout, exit=True)
+ current_test_name = worker.current_test_name
+ if not current_test_name:
+ continue
+ dt = time.monotonic() - worker.start_time
+ if dt >= PROGRESS_MIN_TIME:
+ text = '%s (%s)' % (current_test_name, format_duration(dt))
+ running.append(text)
+ return running
+
+
+class MultiprocessRunner:
+ def __init__(self, regrtest):
+ self.regrtest = regrtest
+ self.ns = regrtest.ns
+ self.output = queue.Queue()
+ self.pending = MultiprocessIterator(self.regrtest.tests)
+ if self.ns.timeout is not None:
+ self.test_timeout = self.ns.timeout * 1.5
+ else:
+ self.test_timeout = None
+ self.workers = None
+
+ def start_workers(self):
+ self.workers = [MultiprocessThread(self.pending, self.output, self.ns)
+ for _ in range(self.ns.use_mp)]
+ print("Run tests in parallel using %s child processes"
+ % len(self.workers))
+ for worker in self.workers:
+ worker.start()
+
+ def wait_workers(self):
+ start_time = time.monotonic()
+ for worker in self.workers:
+ worker.kill()
+ for worker in self.workers:
+ while True:
+ worker.join(1.0)
+ if not worker.is_alive():
+ break
+ dt = time.monotonic() - start_time
+                print("Waiting for regrtest worker %r for %.1f sec" % (worker, dt))
+ if dt > JOIN_TIMEOUT:
+ print("Warning -- failed to join a regrtest worker %s"
+ % worker)
+ break
+
+ def _get_result(self):
+ if not any(worker.is_alive() for worker in self.workers):
+ # all worker threads are done: consume pending results
+ try:
+ return self.output.get(timeout=0)
+ except queue.Empty:
+ return None
+ while True:
+ if self.test_timeout is not None:
+ faulthandler.dump_traceback_later(self.test_timeout, exit=True)
+
+ # wait for a thread
+ timeout = max(PROGRESS_UPDATE, PROGRESS_MIN_TIME)
try:
- item = output.get(timeout=get_timeout)
+ return self.output.get(timeout=timeout)
except queue.Empty:
- running = get_running(workers)
- if running and not regrtest.ns.pgo:
- print('running: %s' % ', '.join(running), flush=True)
- continue
-
- test, stdout, stderr, result = item
- if test is None:
- finished += 1
- continue
- regrtest.accumulate_result(test, result)
-
- # Display progress
- ok, test_time, xml_data = result
- text = format_test_result(test, ok)
- if (ok not in (CHILD_ERROR, INTERRUPTED)
- and test_time >= PROGRESS_MIN_TIME
- and not regrtest.ns.pgo):
- text += ' (%s)' % format_duration(test_time)
- elif ok == CHILD_ERROR:
- text = '%s (%s)' % (text, test_time)
- running = get_running(workers)
- if running and not regrtest.ns.pgo:
- text += ' -- running: %s' % ', '.join(running)
- regrtest.display_progress(test_index, text)
-
- # Copy stdout and stderr from the child process
- if stdout:
- print(stdout, flush=True)
- if stderr and not regrtest.ns.pgo:
- print(stderr, file=sys.stderr, flush=True)
-
- if result[0] == INTERRUPTED:
- raise KeyboardInterrupt
- test_index += 1
- except KeyboardInterrupt:
- regrtest.interrupted = True
- pending.interrupted = True
- print()
- finally:
- if use_timeout:
- faulthandler.cancel_dump_traceback_later()
-
- # If tests are interrupted, wait until tests complete
- wait_start = time.monotonic()
- while True:
- running = [worker.current_test for worker in workers]
- running = list(filter(bool, running))
- if not running:
- break
-
- dt = time.monotonic() - wait_start
- line = "Waiting for %s (%s tests)" % (', '.join(running), len(running))
- if dt >= WAIT_PROGRESS:
- line = "%s since %.0f sec" % (line, dt)
- print(line, flush=True)
- for worker in workers:
- worker.join(WAIT_PROGRESS)
+ pass
+
+ # display progress
+ running = get_running(self.workers)
+ if running and not self.ns.pgo:
+ print('running: %s' % ', '.join(running), flush=True)
+
+ def display_result(self, mp_result):
+ result = mp_result.result
+
+ text = format_test_result(result)
+ if mp_result.error_msg is not None:
+ # CHILD_ERROR
+ text += ' (%s)' % mp_result.error_msg
+ elif (result.test_time >= PROGRESS_MIN_TIME and not self.ns.pgo):
+ text += ' (%s)' % format_duration(result.test_time)
+ running = get_running(self.workers)
+ if running and not self.ns.pgo:
+ text += ' -- running: %s' % ', '.join(running)
+ self.regrtest.display_progress(self.test_index, text)
+
+ def _process_result(self, item):
+ if item[0]:
+ # Thread got an exception
+ format_exc = item[1]
+ print(f"regrtest worker thread failed: {format_exc}",
+ file=sys.stderr, flush=True)
+ return True
+
+ self.test_index += 1
+ mp_result = item[1]
+ self.regrtest.accumulate_result(mp_result.result)
+ self.display_result(mp_result)
+
+ if mp_result.stdout:
+ print(mp_result.stdout, flush=True)
+ if mp_result.stderr and not self.ns.pgo:
+ print(mp_result.stderr, file=sys.stderr, flush=True)
+
+ if must_stop(mp_result.result, self.ns):
+ return True
+
+ return False
+
+ def run_tests(self):
+ self.start_workers()
+
+ self.test_index = 0
+ try:
+ while True:
+ item = self._get_result()
+ if item is None:
+ break
+
+ stop = self._process_result(item)
+ if stop:
+ break
+ except KeyboardInterrupt:
+ print()
+ self.regrtest.interrupted = True
+ finally:
+ if self.test_timeout is not None:
+ faulthandler.cancel_dump_traceback_later()
+
+ # a test failed (and --failfast is set) or all tests completed
+ self.pending.stop()
+ self.wait_workers()
+
+
+def run_tests_multiprocess(regrtest):
+ MultiprocessRunner(regrtest).run_tests()
import threading
import warnings
from test import support
+from test.libregrtest.utils import print_warning
try:
import _multiprocessing, multiprocessing.process
except ImportError:
self.changed = True
restore(original)
if not self.quiet and not self.pgo:
- print(f"Warning -- {name} was modified by {self.testname}",
- file=sys.stderr, flush=True)
+ print_warning(f"{name} was modified by {self.testname}")
print(f" Before: {original}\n After: {current} ",
file=sys.stderr, flush=True)
return False
except ImportError:
gc = None
-from test.libregrtest.refleak import warm_caches
-
def setup_tests(ns):
try:
if getattr(module, '__file__', None):
module.__file__ = os.path.abspath(module.__file__)
- # MacOSX (a.k.a. Darwin) has a default stack size that is too small
- # for deeply recursive regular expressions. We see this as crashes in
- # the Python test suite when running test_re.py and test_sre.py. The
- # fix is to set the stack limit to 2048.
- # This approach may also be useful for other Unixy platforms that
- # suffer from small default stack limits.
- if sys.platform == 'darwin':
- try:
- import resource
- except ImportError:
- pass
- else:
- soft, hard = resource.getrlimit(resource.RLIMIT_STACK)
- newsoft = min(hard, max(soft, 1024*2048))
- resource.setrlimit(resource.RLIMIT_STACK, (newsoft, hard))
-
if ns.huntrleaks:
unittest.BaseTestSuite._cleanup = False
- # Avoid false positives due to various caches
- # filling slowly with random data:
- warm_caches()
-
if ns.memlimit is not None:
support.set_memlimit(ns.memlimit)
-import os.path
import math
+import os.path
+import sys
import textwrap
print(textwrap.fill(' '.join(str(elt) for elt in sorted(x)), width,
initial_indent=blanks, subsequent_indent=blanks),
file=file)
+
+
+def print_warning(msg):
+ print(f"Warning -- {msg}", file=sys.stderr, flush=True)
--- /dev/null
+import _winapi
+import msvcrt
+import os
+import subprocess
+import uuid
+from test import support
+
+
+# Max size of asynchronous reads
+BUFSIZE = 8192
+# Exponential damping factor (see below)
+LOAD_FACTOR_1 = 0.9200444146293232478931553241
+# Seconds per measurement
+SAMPLING_INTERVAL = 5
+COUNTER_NAME = r'\System\Processor Queue Length'
+
+
+class WindowsLoadTracker():
+ """
+ This class asynchronously interacts with the `typeperf` command to read
+    the system load on Windows. Multiprocessing and threads can't be used
+ here because they interfere with the test suite's cases for those
+ modules.
+ """
+
+ def __init__(self):
+ self.load = 0.0
+ self.start()
+
+ def start(self):
+ # Create a named pipe which allows for asynchronous IO in Windows
+ pipe_name = r'\\.\pipe\typeperf_output_' + str(uuid.uuid4())
+
+ open_mode = _winapi.PIPE_ACCESS_INBOUND
+ open_mode |= _winapi.FILE_FLAG_FIRST_PIPE_INSTANCE
+ open_mode |= _winapi.FILE_FLAG_OVERLAPPED
+
+ # This is the read end of the pipe, where we will be grabbing output
+ self.pipe = _winapi.CreateNamedPipe(
+ pipe_name, open_mode, _winapi.PIPE_WAIT,
+ 1, BUFSIZE, BUFSIZE, _winapi.NMPWAIT_WAIT_FOREVER, _winapi.NULL
+ )
+ # The write end of the pipe which is passed to the created process
+ pipe_write_end = _winapi.CreateFile(
+ pipe_name, _winapi.GENERIC_WRITE, 0, _winapi.NULL,
+ _winapi.OPEN_EXISTING, 0, _winapi.NULL
+ )
+        # Open the handle as a C runtime file descriptor so we can pass it
+        # to subprocess
+ command_stdout = msvcrt.open_osfhandle(pipe_write_end, 0)
+
+ # Connect to the read end of the pipe in overlap/async mode
+ overlap = _winapi.ConnectNamedPipe(self.pipe, overlapped=True)
+ overlap.GetOverlappedResult(True)
+
+ # Spawn off the load monitor
+ command = ['typeperf', COUNTER_NAME, '-si', str(SAMPLING_INTERVAL)]
+ self.p = subprocess.Popen(command, stdout=command_stdout, cwd=support.SAVEDCWD)
+
+ # Close our copy of the write end of the pipe
+ os.close(command_stdout)
+
+ def close(self):
+ if self.p is None:
+ return
+ self.p.kill()
+ self.p.wait()
+ self.p = None
+
+ def __del__(self):
+ self.close()
+
+ def read_output(self):
+ overlapped, _ = _winapi.ReadFile(self.pipe, BUFSIZE, True)
+ bytes_read, res = overlapped.GetOverlappedResult(False)
+ if res != 0:
+ return
+
+ return overlapped.getbuffer().decode()
+
+ def getloadavg(self):
+ typeperf_output = self.read_output()
+ # Nothing to update, just return the current load
+ if not typeperf_output:
+ return self.load
+
+ # Process the backlog of load values
+ for line in typeperf_output.splitlines():
+ # typeperf outputs in a CSV format like this:
+ # "07/19/2018 01:32:26.605","3.000000"
+ toks = line.split(',')
+ # Ignore blank lines and the initial header
+ if line.strip() == '' or (COUNTER_NAME in line) or len(toks) != 2:
+ continue
+
+ load = float(toks[1].replace('"', ''))
+ # We use an exponentially weighted moving average, imitating the
+ # load calculation on Unix systems.
+ # https://en.wikipedia.org/wiki/Load_(computing)#Unix-style_load_calculation
+ new_load = self.load * LOAD_FACTOR_1 + load * (1.0 - LOAD_FACTOR_1)
+ self.load = new_load
+
+ return self.load
frame_size = self.FRAME_SIZE_TARGET
num_frames = 20
- # Large byte objects (dict values) intermitted with small objects
+        # Large byte objects (dict values) interleaved with small objects
# (dict keys)
obj = {i: bytes([i]) * frame_size for i in range(num_frames)}
class AbstractPickleModuleTests(unittest.TestCase):
def test_dump_closed_file(self):
- import os
f = open(TESTFN, "wb")
try:
f.close()
self.assertRaises(ValueError, self.dump, 123, f)
finally:
- os.remove(TESTFN)
+ support.unlink(TESTFN)
def test_load_closed_file(self):
- import os
f = open(TESTFN, "wb")
try:
f.close()
self.assertRaises(ValueError, self.dump, 123, f)
finally:
- os.remove(TESTFN)
+ support.unlink(TESTFN)
def test_load_from_and_dump_to_file(self):
stream = io.BytesIO()
self.Pickler(f, -1)
self.Pickler(f, protocol=-1)
+ def test_dump_text_file(self):
+ f = open(TESTFN, "w")
+ try:
+ for proto in protocols:
+ self.assertRaises(TypeError, self.dump, 123, f, proto)
+ finally:
+ f.close()
+ support.unlink(TESTFN)
+
+ def test_incomplete_input(self):
+ s = io.BytesIO(b"X''.")
+ self.assertRaises((EOFError, struct.error, pickle.UnpicklingError), self.load, s)
+
def test_bad_init(self):
# Test issue3664 (pickle can segfault from a badly initialized Pickler).
# Override initialization without calling __init__() of the superclass.
import re
import sys
import traceback
+import warnings
def normalize_text(text):
info_add('platform.platform',
platform.platform(aliased=True))
+ libc_ver = ('%s %s' % platform.libc_ver()).strip()
+ if libc_ver:
+ info_add('platform.libc_ver', libc_ver)
+
def collect_locale(info_add):
import locale
copy_attributes(info_add, time, 'time.%s', attributes)
if hasattr(time, 'get_clock_info'):
- for clock in ('time', 'perf_counter'):
- tinfo = time.get_clock_info(clock)
- info_add('time.get_clock_info(%s)' % clock, tinfo)
+ for clock in ('clock', 'monotonic', 'perf_counter',
+ 'process_time', 'thread_time', 'time'):
+ try:
+                # prevent DeprecationWarning on get_clock_info('clock')
+ with warnings.catch_warnings(record=True):
+ clock_info = time.get_clock_info(clock)
+ except ValueError:
+ # missing clock like time.thread_time()
+ pass
+ else:
+ info_add('time.get_clock_info(%s)' % clock, clock_info)
def collect_datetime(info_add):
except ImportError:
args = CC.split()
args.append('--version')
- proc = subprocess.Popen(args,
- stdout=subprocess.PIPE,
- stderr=subprocess.STDOUT,
- universal_newlines=True)
+ try:
+ proc = subprocess.Popen(args,
+ stdout=subprocess.PIPE,
+ stderr=subprocess.STDOUT,
+ universal_newlines=True)
+ except OSError:
+ # Cannot run the compiler, for example when Python has been
+ # cross-compiled and installed on the target platform where the
+ # compiler is missing.
+ return
+
stdout = proc.communicate()[0]
if proc.returncode:
# CC --version failed: ignore error
# Dump global configuration variables, _PyCoreConfig
# and _PyMainInterpreterConfig
try:
- from _testcapi import get_global_config, get_core_config, get_main_config
+ from _testinternalcapi import get_configs
except ImportError:
return
- for prefix, get_config_func in (
- ('global_config', get_global_config),
- ('core_config', get_core_config),
- ('main_config', get_main_config),
- ):
- config = get_config_func()
+ all_configs = get_configs()
+ for config_type in sorted(all_configs):
+ config = all_configs[config_type]
for key in sorted(config):
- info_add('%s[%s]' % (prefix, key), repr(config[key]))
+ info_add('%s[%s]' % (config_type, key), repr(config[key]))
+
+
+def collect_subprocess(info_add):
+ import subprocess
+ copy_attributes(info_add, subprocess, 'subprocess.%s', ('_USE_POSIX_SPAWN',))
def collect_info(info):
collect_cc,
collect_gdbm,
collect_get_config,
+ collect_subprocess,
# Collecting from tests should be last as they have side effects.
collect_test_socket,
-----BEGIN CERTIFICATE-----
-MIIClTCCAf6gAwIBAgIJAKGU95wKR8pTMA0GCSqGSIb3DQEBBQUAMHAxCzAJBgNV
-BAYTAlhZMRcwFQYDVQQHDA5DYXN0bGUgQW50aHJheDEjMCEGA1UECgwaUHl0aG9u
-IFNvZnR3YXJlIEZvdW5kYXRpb24xIzAhBgNVBAMMGnNlbGYtc2lnbmVkLnB5dGhv
-bnRlc3QubmV0MB4XDTE0MTEwMjE4MDkyOVoXDTI0MTAzMDE4MDkyOVowcDELMAkG
-A1UEBhMCWFkxFzAVBgNVBAcMDkNhc3RsZSBBbnRocmF4MSMwIQYDVQQKDBpQeXRo
-b24gU29mdHdhcmUgRm91bmRhdGlvbjEjMCEGA1UEAwwac2VsZi1zaWduZWQucHl0
-aG9udGVzdC5uZXQwgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBANDXQXW9tjyZ
-Xt0Iv2tLL1+jinr4wGg36ioLDLFkMf+2Y1GL0v0BnKYG4N1OKlAU15LXGeGer8vm
-Sv/yIvmdrELvhAbbo3w4a9TMYQA4XkIVLdvu3mvNOAet+8PMJxn26dbDhG809ALv
-EHY57lQsBS3G59RZyBPVqAqmImWNJnVzAgMBAAGjNzA1MCUGA1UdEQQeMByCGnNl
-bGYtc2lnbmVkLnB5dGhvbnRlc3QubmV0MAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcN
-AQEFBQADgYEAIuzAhgMouJpNdf3URCHIineyoSt6WK/9+eyUcjlKOrDoXNZaD72h
-TXMeKYoWvJyVcSLKL8ckPtDobgP2OTt0UkyAaj0n+ZHaqq1lH2yVfGUA1ILJv515
-C8BqbvVZuqm3i7ygmw3bqE/lYMgOrYtXXnqOrz6nvsE6Yc9V9rFflOM=
+MIIF9zCCA9+gAwIBAgIUH98b4Fw/DyugC9cV7VK7ZODzHsIwDQYJKoZIhvcNAQEL
+BQAwgYoxCzAJBgNVBAYTAlhZMRcwFQYDVQQIDA5DYXN0bGUgQW50aHJheDEYMBYG
+A1UEBwwPQXJndW1lbnQgQ2xpbmljMSMwIQYDVQQKDBpQeXRob24gU29mdHdhcmUg
+Rm91bmRhdGlvbjEjMCEGA1UEAwwac2VsZi1zaWduZWQucHl0aG9udGVzdC5uZXQw
+HhcNMTkwNTA4MDEwMjQzWhcNMjcwNzI0MDEwMjQzWjCBijELMAkGA1UEBhMCWFkx
+FzAVBgNVBAgMDkNhc3RsZSBBbnRocmF4MRgwFgYDVQQHDA9Bcmd1bWVudCBDbGlu
+aWMxIzAhBgNVBAoMGlB5dGhvbiBTb2Z0d2FyZSBGb3VuZGF0aW9uMSMwIQYDVQQD
+DBpzZWxmLXNpZ25lZC5weXRob250ZXN0Lm5ldDCCAiIwDQYJKoZIhvcNAQEBBQAD
+ggIPADCCAgoCggIBAMKdJlyCThkahwoBb7pl5q64Pe9Fn5jrIvzsveHTc97TpjV2
+RLfICnXKrltPk/ohkVl6K5SUZQZwMVzFubkyxE0nZPHYHlpiKWQxbsYVkYv01rix
+IFdLvaxxbGYke2jwQao31s4o61AdlsfK1SdpHQUynBBMssqI3SB4XPmcA7e+wEEx
+jxjVish4ixA1vuIZOx8yibu+CFCf/geEjoBMF3QPdzULzlrCSw8k/45iZCSoNbvK
+DoL4TVV07PHOxpheDh8ZQmepGvU6pVqhb9m4lgmV0OGWHgozd5Ur9CbTVDmxIEz3
+TSoRtNJK7qtyZdGNqwjksQxgZTjM/d/Lm/BJG99AiOmYOjsl9gbQMZgvQmMAtUsI
+aMJnQuZ6R+KEpW/TR5qSKLWZSG45z/op+tzI2m+cE6HwTRVAWbcuJxcAA55MZjqU
+OOOu3BBYMjS5nf2sQ9uoXsVBFH7i0mQqoW1SLzr9opI8KsWwFxQmO2vBxWYaN+lH
+OmwBZBwyODIsmI1YGXmTp09NxRYz3Qe5GCgFzYowpMrcxUC24iduIdMwwhRM7rKg
+7GtIWMSrFfuI1XCLRmSlhDbhNN6fVg2f8Bo9PdH9ihiIyxSrc+FOUasUYCCJvlSZ
+8hFUlLvcmrZlWuazohm0lsXuMK1JflmQr/DA/uXxP9xzFfRy+RU3jDyxJbRHAgMB
+AAGjUzBRMB0GA1UdDgQWBBSQJyxiPMRK01i+0BsV9zUwDiBaHzAfBgNVHSMEGDAW
+gBSQJyxiPMRK01i+0BsV9zUwDiBaHzAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3
+DQEBCwUAA4ICAQCR+7a7N/m+WLkxPPIA/CB4MOr2Uf8ixTv435Nyv6rXOun0+lTP
+ExSZ0uYQ+L0WylItI3cQHULldDueD+s8TGzxf5woaLKf6tqyr0NYhKs+UeNEzDnN
+9PHQIhX0SZw3XyXGUgPNBfRCg2ZDdtMMdOU4XlQN/IN/9hbYTrueyY7eXq9hmtI9
+1srftAMqr9SR1JP7aHI6DVgrEsZVMTDnfT8WmLSGLlY1HmGfdEn1Ip5sbo9uSkiH
+AEPgPfjYIvR5LqTOMn4KsrlZyBbFIDh9Sl99M1kZzgH6zUGVLCDg1y6Cms69fx/e
+W1HoIeVkY4b4TY7Bk7JsqyNhIuqu7ARaxkdaZWhYaA2YyknwANdFfNpfH+elCLIk
+BUt5S3f4i7DaUePTvKukCZiCq4Oyln7RcOn5If73wCeLB/ZM9Ei1HforyLWP1CN8
+XLfpHaoeoPSWIveI0XHUl65LsPN2UbMbul/F23hwl+h8+BLmyAS680Yhn4zEN6Ku
+B7Po90HoFa1Du3bmx4jsN73UkT/dwMTi6K072FbipnC1904oGlWmLwvAHvrtxxmL
+Pl3pvEaZIu8wa/PNF6Y7J7VIewikIJq6Ta6FrWeFfzMWOj2qA1ZZi6fUaDSNYvuV
+J5quYKCc/O+I/yDDf8wyBbZ/gvUXzUHTMYGG+bFrn1p7XDbYYeEJ6R/xEg==
-----END CERTIFICATE-----
a = self.type2test([0,1,2,3,4])
self.assertEqual(a[ -pow(2,128): 3 ], self.type2test([0,1,2]))
self.assertEqual(a[ 3: pow(2,145) ], self.type2test([3,4]))
+ self.assertEqual(a[3::sys.maxsize], self.type2test([3]))
def test_contains(self):
u = self.type2test([0, 1, 2])
def test_extended_getslice(self):
# Test extended slicing by comparing with list slicing.
s = string.ascii_letters + string.digits
- indices = (0, None, 1, 3, 41, -1, -2, -37)
+ indices = (0, None, 1, 3, 41, sys.maxsize, -1, -2, -37)
for start in indices:
for stop in indices:
# Skip step 0 (invalid)
yield path
finally:
# In case the process forks, let only the parent remove the
- # directory. The child has a diffent process id. (bpo-30028)
+ # directory. The child has a different process id. (bpo-30028)
if dir_created and pid == os.getpid():
rmtree(path)
ioerror_peer_reset = TransientResource(OSError, errno=errno.ECONNRESET)
+def get_socket_conn_refused_errs():
+ """
+ Get the different socket error numbers ('errno') which can be received
+ when a connection is refused.
+ """
+ errors = [errno.ECONNREFUSED]
+ if hasattr(errno, 'ENETUNREACH'):
+        # On Solaris, ENETUNREACH is sometimes returned instead of ECONNREFUSED
+ errors.append(errno.ENETUNREACH)
+ if hasattr(errno, 'EADDRNOTAVAIL'):
+ # bpo-31910: socket.create_connection() fails randomly
+ # with EADDRNOTAVAIL on Travis CI
+ errors.append(errno.EADDRNOTAVAIL)
+ return errors
+
+
@contextlib.contextmanager
def transient_internet(resource_name, *, timeout=30.0, errnos=()):
"""Return a context manager that raises ResourceDenied when various issues
# Test extended slicing by comparing with list slicing
# (Assumes list conversion works correctly, too)
a = array.array(self.typecode, self.example)
- indices = (0, None, 1, 3, 19, 100, -1, -2, -31, -100)
+ indices = (0, None, 1, 3, 19, 100, sys.maxsize, -1, -2, -31, -100)
for start in indices:
for stop in indices:
# Everything except the initial 0 (invalid step)
self.assertRaises(TypeError, a.__setitem__, slice(0, 1), b)
def test_extended_set_del_slice(self):
- indices = (0, None, 1, 3, 19, 100, -1, -2, -31, -100)
+ indices = (0, None, 1, 3, 19, 100, sys.maxsize, -1, -2, -31, -100)
for start in indices:
for stop in indices:
# Everything except the initial 0 (invalid step)
t.close()
test_utils.run_briefly(self.loop) # allow transport to close
+ @patch_socket
+ def test_create_connection_ipv6_scope(self, m_socket):
+ m_socket.getaddrinfo = socket.getaddrinfo
+ sock = m_socket.socket.return_value
+ sock.family = socket.AF_INET6
+
+ self.loop._add_reader = mock.Mock()
+ self.loop._add_reader._is_coroutine = False
+ self.loop._add_writer = mock.Mock()
+ self.loop._add_writer._is_coroutine = False
+
+ coro = self.loop.create_connection(asyncio.Protocol, 'fe80::1%1', 80)
+ t, p = self.loop.run_until_complete(coro)
+ try:
+ sock.connect.assert_called_with(('fe80::1', 80, 0, 1))
+ _, kwargs = m_socket.socket.call_args
+ self.assertEqual(kwargs['family'], m_socket.AF_INET6)
+ self.assertEqual(kwargs['type'], m_socket.SOCK_STREAM)
+ finally:
+ t.close()
+ test_utils.run_briefly(self.loop) # allow transport to close
+
@patch_socket
def test_create_connection_ip_addr(self, m_socket):
self._test_create_connection_ip_addr(m_socket, True)
self.assertRaises(
OSError, self.loop.run_until_complete, coro)
+ def test_create_datagram_endpoint_allow_broadcast(self):
+ protocol = MyDatagramProto(create_future=True, loop=self.loop)
+ self.loop.sock_connect = sock_connect = mock.Mock()
+ sock_connect.return_value = []
+
+ coro = self.loop.create_datagram_endpoint(
+ lambda: protocol,
+ remote_addr=('127.0.0.1', 0),
+ allow_broadcast=True)
+
+ transport, _ = self.loop.run_until_complete(coro)
+ self.assertFalse(sock_connect.called)
+
+ transport.close()
+ self.loop.run_until_complete(protocol.done)
+ self.assertEqual('CLOSED', protocol.state)
+
@patch_socket
def test_create_datagram_endpoint_socket_err(self, m_socket):
m_socket.getaddrinfo = socket.getaddrinfo
class SendfileBase:
- DATA = b"SendfileBaseData" * (1024 * 8) # 128 KiB
+ # 128 KiB plus small unaligned to buffer chunk
+ DATA = b"SendfileBaseData" * (1024 * 8 + 1)
# Reduce socket buffer size to test on relative small data sets.
BUF_SIZE = 4 * 1024 # 4 KiB
tr._force_close = mock.Mock()
tr._fatal_error(exc)
+ m_exc.assert_not_called()
+
+ tr._force_close.assert_called_with(exc)
+
+ @mock.patch('asyncio.log.logger.error')
+ def test_fatal_error_custom_exception(self, m_exc):
+ class MyError(Exception):
+ pass
+ exc = MyError()
+ tr = self.create_transport()
+ tr._force_close = mock.Mock()
+ tr._fatal_error(exc)
+
m_exc.assert_called_with(
test_utils.MockPattern(
'Fatal error on transport\nprotocol:.*\ntransport:.*'),
- exc_info=(OSError, MOCK_ANY, MOCK_ANY))
+ exc_info=(MyError, MOCK_ANY, MOCK_ANY))
tr._force_close.assert_called_with(exc)
self.sock.fileno.return_value = 7
def datagram_transport(self, address=None):
+ self.sock.getpeername.side_effect = None if address else OSError
transport = _SelectorDatagramTransport(self.loop, self.sock,
self.protocol,
address=address)
err = ConnectionRefusedError()
transport._fatal_error(err)
self.assertFalse(self.protocol.error_received.called)
+ m_exc.assert_not_called()
+
+ @mock.patch('asyncio.base_events.logger.error')
+ def test_fatal_error_connected_custom_error(self, m_exc):
+ class MyException(Exception):
+ pass
+ transport = self.datagram_transport(address=('0.0.0.0', 1))
+ err = MyException()
+ transport._fatal_error(err)
+ self.assertFalse(self.protocol.error_received.called)
m_exc.assert_called_with(
test_utils.MockPattern(
'Fatal error on transport\nprotocol:.*\ntransport:.*'),
- exc_info=(ConnectionRefusedError, MOCK_ANY, MOCK_ANY))
+ exc_info=(MyException, MOCK_ANY, MOCK_ANY))
if __name__ == '__main__':
import socket
import sys
import unittest
+import weakref
from unittest import mock
try:
import ssl
self.loop.run_until_complete(
asyncio.wait_for(client(srv.addr), loop=self.loop, timeout=10))
+ # No garbage is left if SSL is closed uncleanly
+ client_context = weakref.ref(client_context)
+ self.assertIsNone(client_context())
+
+ def test_create_connection_memory_leak(self):
+ HELLO_MSG = b'1' * self.PAYLOAD_SIZE
+
+ server_context = test_utils.simple_server_sslcontext()
+ client_context = test_utils.simple_client_sslcontext()
+
+ def serve(sock):
+ sock.settimeout(self.TIMEOUT)
+
+ sock.start_tls(server_context, server_side=True)
+
+ sock.sendall(b'O')
+ data = sock.recv_all(len(HELLO_MSG))
+ self.assertEqual(len(data), len(HELLO_MSG))
+
+ sock.shutdown(socket.SHUT_RDWR)
+ sock.close()
+
+ class ClientProto(asyncio.Protocol):
+ def __init__(self, on_data, on_eof):
+ self.on_data = on_data
+ self.on_eof = on_eof
+ self.con_made_cnt = 0
+
+ def connection_made(proto, tr):
+ # XXX: We assume user stores the transport in protocol
+ proto.tr = tr
+ proto.con_made_cnt += 1
+ # Ensure connection_made gets called only once.
+ self.assertEqual(proto.con_made_cnt, 1)
+
+ def data_received(self, data):
+ self.on_data.set_result(data)
+
+ def eof_received(self):
+ self.on_eof.set_result(True)
+
+ async def client(addr):
+ await asyncio.sleep(0.5)
+
+ on_data = self.loop.create_future()
+ on_eof = self.loop.create_future()
+
+ tr, proto = await self.loop.create_connection(
+ lambda: ClientProto(on_data, on_eof), *addr,
+ ssl=client_context)
+
+ self.assertEqual(await on_data, b'O')
+ tr.write(HELLO_MSG)
+ await on_eof
+
+ tr.close()
+
+ with self.tcp_server(serve, timeout=self.TIMEOUT) as srv:
+ self.loop.run_until_complete(
+ asyncio.wait_for(client(srv.addr), timeout=10))
+
+        # No garbage is left for the SSL client from loop.create_connection,
+        # even if the user stores the SSLTransport in the corresponding
+        # protocol instance
+ client_context = weakref.ref(client_context)
+ self.assertIsNone(client_context())
+
def test_start_tls_client_buf_proto_1(self):
HELLO_MSG = b'1' * self.PAYLOAD_SIZE
def test_start_tls_server_1(self):
HELLO_MSG = b'1' * self.PAYLOAD_SIZE
+ ANSWER = b'answer'
server_context = test_utils.simple_server_sslcontext()
client_context = test_utils.simple_client_sslcontext()
- if sys.platform.startswith('freebsd'):
- # bpo-35031: Some FreeBSD buildbots fail to run this test
+ if (sys.platform.startswith('freebsd')
+ or sys.platform.startswith('win')
+ or sys.platform.startswith('darwin')):
+ # bpo-35031: Some FreeBSD and Windows buildbots fail to run this test
# as the eof was not being received by the server if the payload
# size is not big enough. This behaviour only appears if the
- # client is using TLS1.3.
+ # client is using TLS1.3. Also seen on macOS.
client_context.options |= ssl.OP_NO_TLSv1_3
+ answer = None
def client(sock, addr):
+ nonlocal answer
sock.settimeout(self.TIMEOUT)
sock.connect(addr)
sock.start_tls(client_context)
sock.sendall(HELLO_MSG)
-
- sock.shutdown(socket.SHUT_RDWR)
+ answer = sock.recv_all(len(ANSWER))
sock.close()
class ServerProto(asyncio.Protocol):
- def __init__(self, on_con, on_eof, on_con_lost):
+ def __init__(self, on_con, on_con_lost):
self.on_con = on_con
- self.on_eof = on_eof
self.on_con_lost = on_con_lost
self.data = b''
+ self.transport = None
def connection_made(self, tr):
+ self.transport = tr
self.on_con.set_result(tr)
+ def replace_transport(self, tr):
+ self.transport = tr
+
def data_received(self, data):
self.data += data
-
- def eof_received(self):
- self.on_eof.set_result(1)
+ if len(self.data) >= len(HELLO_MSG):
+ self.transport.write(ANSWER)
def connection_lost(self, exc):
+ self.transport = None
if exc is None:
self.on_con_lost.set_result(None)
else:
self.on_con_lost.set_exception(exc)
- async def main(proto, on_con, on_eof, on_con_lost):
+ async def main(proto, on_con, on_con_lost):
tr = await on_con
tr.write(HELLO_MSG)
server_side=True,
ssl_handshake_timeout=self.TIMEOUT)
- await on_eof
+ proto.replace_transport(new_tr)
+
await on_con_lost
self.assertEqual(proto.data, HELLO_MSG)
new_tr.close()
async def run_main():
on_con = self.loop.create_future()
- on_eof = self.loop.create_future()
on_con_lost = self.loop.create_future()
- proto = ServerProto(on_con, on_eof, on_con_lost)
+ proto = ServerProto(on_con, on_con_lost)
server = await self.loop.create_server(
lambda: proto, '127.0.0.1', 0)
with self.tcp_client(lambda sock: client(sock, addr),
timeout=self.TIMEOUT):
await asyncio.wait_for(
- main(proto, on_con, on_eof, on_con_lost),
+ main(proto, on_con, on_con_lost),
loop=self.loop, timeout=self.TIMEOUT)
server.close()
await server.wait_closed()
+ self.assertEqual(answer, ANSWER)
self.loop.run_until_complete(run_main())
# exception or log an error, even if the handshake failed
self.assertEqual(messages, [])
+ # The 10s handshake timeout should be cancelled to free related
+ # objects without really waiting for 10s
+ client_sslctx = weakref.ref(client_sslctx)
+ self.assertIsNone(client_sslctx())
+
def test_create_connection_ssl_slow_handshake(self):
client_sslctx = test_utils.simple_client_sslcontext()
isinstance(self, SubprocessFastWatcherTests)):
asyncio.get_child_watcher()._callbacks.clear()
- def test_popen_error(self):
- # Issue #24763: check that the subprocess transport is closed
- # when BaseSubprocessTransport fails
+ def _test_popen_error(self, stdin):
if sys.platform == 'win32':
target = 'asyncio.windows_utils.Popen'
else:
popen.side_effect = exc
create = asyncio.create_subprocess_exec(sys.executable, '-c',
- 'pass', loop=self.loop)
+ 'pass', stdin=stdin,
+ loop=self.loop)
with warnings.catch_warnings(record=True) as warns:
with self.assertRaises(exc):
self.loop.run_until_complete(create)
self.assertEqual(warns, [])
+ def test_popen_error(self):
+ # Issue #24763: check that the subprocess transport is closed
+ # when BaseSubprocessTransport fails
+ self._test_popen_error(stdin=None)
+
+ def test_popen_error_with_stdin_pipe(self):
+ # Issue #35721: check that the newly created socket pair is closed when
+ # Popen fails
+ self._test_popen_error(stdin=subprocess.PIPE)
+
def test_read_stdout_after_process_exit(self):
async def execute():
test_utils.run_briefly(self.loop)
self.assertIs(outer.exception(), exc)
- def test_shield_cancel(self):
+ def test_shield_cancel_inner(self):
inner = self.new_future(self.loop)
outer = asyncio.shield(inner)
test_utils.run_briefly(self.loop)
test_utils.run_briefly(self.loop)
self.assertTrue(outer.cancelled())
+ def test_shield_cancel_outer(self):
+ inner = self.new_future(self.loop)
+ outer = asyncio.shield(inner)
+ test_utils.run_briefly(self.loop)
+ outer.cancel()
+ test_utils.run_briefly(self.loop)
+ self.assertTrue(outer.cancelled())
+ self.assertEqual(0, 0 if outer._callbacks is None else len(outer._callbacks))
+
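The `test_shield_cancel_outer` hunk above relies on a subtlety of `asyncio.shield`: cancelling the *outer* (shielding) future does not cancel the inner one. A minimal standalone sketch of that behaviour (independent of the test suite):

```python
import asyncio

async def demo():
    inner = asyncio.get_running_loop().create_future()
    outer = asyncio.shield(inner)
    outer.cancel()              # cancel the shield, not the inner future
    await asyncio.sleep(0)      # let done-callbacks run
    return outer.cancelled(), inner.cancelled()

print(asyncio.run(demo()))  # (True, False)
```

The inner future stays pending; shield's whole purpose is to insulate it from cancellation of the wrapper.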
def test_shield_shortcut(self):
fut = self.new_future(self.loop)
fut.set_result(42)
self.assertFalse(self.loop.readers)
self.assertEqual(bytearray(), tr._buffer)
self.assertTrue(tr.is_closing())
- m_logexc.assert_called_with(
- test_utils.MockPattern(
- 'Fatal write error on pipe transport'
- '\nprotocol:.*\ntransport:.*'),
- exc_info=(OSError, MOCK_ANY, MOCK_ANY))
+ m_logexc.assert_not_called()
self.assertEqual(1, tr._conn_lost)
test_utils.run_briefly(self.loop)
self.protocol.connection_lost.assert_called_with(err)
# Test extended slicing by comparing with list slicing.
L = list(range(255))
b = self.type2test(L)
- indices = (0, None, 1, 3, 19, 100, -1, -2, -31, -100)
+ indices = (0, None, 1, 3, 19, 100, sys.maxsize, -1, -2, -31, -100)
for start in indices:
for stop in indices:
# Skip step 0 (invalid)
self.assertLessEqual(sys.getsizeof(b), size)
def test_extended_set_del_slice(self):
- indices = (0, None, 1, 3, 19, 300, 1<<333, -1, -2, -31, -300)
+ indices = (0, None, 1, 3, 19, 300, 1<<333, sys.maxsize,
+ -1, -2, -31, -300)
for start in indices:
for stop in indices:
# Skip invalid step 0
import struct
import collections
import itertools
+import gc
class FunctionCalls(unittest.TestCase):
result = _testcapi.pyobject_fastcallkeywords(func, args, kwnames)
self.check_result(result, expected)
+ def test_fastcall_clearing_dict(self):
+ # Test bpo-36907: the point of the test is just checking that this
+ # does not crash.
+ class IntWithDict:
+ __slots__ = ["kwargs"]
+ def __init__(self, **kwargs):
+ self.kwargs = kwargs
+ def __int__(self):
+ self.kwargs.clear()
+ gc.collect()
+ return 0
+ x = IntWithDict(dont_inherit=IntWithDict())
+ # We test the argument handling of "compile" here, the compilation
+ # itself is not relevant. When we pass flags=x below, x.__int__() is
+ # called, which changes the keywords dict.
+ compile("pass", "", "exec", x, **x.kwargs)
if __name__ == "__main__":
unittest.main()
r" The [0-9] pad bytes at p-[0-9] are FORBIDDENBYTE, as expected.\n"
r" The [0-9] pad bytes at tail={ptr} are not all FORBIDDENBYTE \(0x[0-9a-f]{{2}}\):\n"
r" at tail\+0: 0x78 \*\*\* OUCH\n"
- r" at tail\+1: 0xfb\n"
- r" at tail\+2: 0xfb\n"
+ r" at tail\+1: 0xfd\n"
+ r" at tail\+2: 0xfd\n"
r" .*\n"
r" The block was made by call #[0-9]+ to debug malloc/realloc.\n"
- r" Data at p: cb cb cb .*\n"
+ r" Data at p: cd cd cd .*\n"
r"\n"
r"Enable tracemalloc to get the memory block allocation traceback\n"
r"\n"
r" The [0-9] pad bytes at p-[0-9] are FORBIDDENBYTE, as expected.\n"
r" The [0-9] pad bytes at tail={ptr} are FORBIDDENBYTE, as expected.\n"
r" The block was made by call #[0-9]+ to debug malloc/realloc.\n"
- r" Data at p: cb cb cb .*\n"
+ r" Data at p: cd cd cd .*\n"
r"\n"
r"Enable tracemalloc to get the memory block allocation traceback\n"
r"\n"
code = 'import _testcapi; _testcapi.pyobject_malloc_without_gil()'
self.check_malloc_without_gil(code)
+ def check_pyobject_is_freed(self, func):
+ code = textwrap.dedent('''
+ import gc, os, sys, _testcapi
+ # Disable the GC to avoid a crash during garbage collection
+ gc.disable()
+ obj = _testcapi.{func}()
+ error = (_testcapi.pyobject_is_freed(obj) == False)
+ # Exit immediately to avoid a crash while deallocating
+ # the invalid object
+ os._exit(int(error))
+ ''')
+ code = code.format(func=func)
+ assert_python_ok('-c', code, PYTHONMALLOC=self.PYTHONMALLOC)
+
+ def test_pyobject_is_freed_uninitialized(self):
+ self.check_pyobject_is_freed('pyobject_uninitialized')
+
+ def test_pyobject_is_freed_forbidden_bytes(self):
+ self.check_pyobject_is_freed('pyobject_forbidden_bytes')
+
+ def test_pyobject_is_freed_free(self):
+ self.check_pyobject_is_freed('pyobject_freed')
+
class PyMemMallocDebugTests(PyMemDebugTests):
PYTHONMALLOC = 'malloc_debug'
script_name, script_name, script_dir, '',
importlib.machinery.SourceFileLoader)
+ def test_issue20884(self):
+ # On Windows, a script with an encoding cookie and LF line endings
+ # used to fail to run (issue #20884).
+ with support.temp_dir() as script_dir:
+ script_name = os.path.join(script_dir, "issue20884.py")
+ with open(script_name, "w", newline='\n') as f:
+ f.write("#coding: iso-8859-1\n")
+ f.write('"""\n')
+ for _ in range(30):
+ f.write('x'*80 + '\n')
+ f.write('"""\n')
+
+ with support.change_cwd(path=script_dir):
+ rc, out, err = assert_python_ok(script_name)
+ self.assertEqual(b"", out)
+ self.assertEqual(b"", err)
+
@contextlib.contextmanager
def setup_test_pkg(self, *args):
with support.temp_dir() as script_dir, \
self.assertEqual(test_sequence.decode(self.encoding, "backslashreplace"),
before + backslashreplace + after)
+ def test_incremental_surrogatepass(self):
+ # Test incremental decoder for surrogatepass handler:
+ # see issue #24214
+ # High surrogate
+ data = '\uD901'.encode(self.encoding, 'surrogatepass')
+ for i in range(1, len(data)):
+ dec = codecs.getincrementaldecoder(self.encoding)('surrogatepass')
+ self.assertEqual(dec.decode(data[:i]), '')
+ self.assertEqual(dec.decode(data[i:], True), '\uD901')
+ # Low surrogate
+ data = '\uDC02'.encode(self.encoding, 'surrogatepass')
+ for i in range(1, len(data)):
+ dec = codecs.getincrementaldecoder(self.encoding)('surrogatepass')
+ self.assertEqual(dec.decode(data[:i]), '')
+ final = self.encoding == "cp65001"
+ self.assertEqual(dec.decode(data[i:], final), '\uDC02')
+
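The incremental-surrogatepass tests above feed an encoded lone surrogate to the decoder one byte at a time. The same behaviour can be sketched standalone with the UTF-8 codec (on interpreters that include the issue #24214 fix):

```python
import codecs

# A lone surrogate encoded with the surrogatepass handler survives
# incremental decoding even when the bytes arrive one at a time.
data = '\ud901'.encode('utf-8', 'surrogatepass')   # b'\xed\xa4\x81'
dec = codecs.getincrementaldecoder('utf-8')('surrogatepass')
print(repr(dec.decode(data[:1])))        # '' (incomplete sequence buffered)
print(repr(dec.decode(data[1:], True)))  # the lone surrogate '\ud901'
```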
class UTF32Test(ReadTest, unittest.TestCase):
encoding = "utf-32"
with self.assertRaises(UnicodeDecodeError):
b"abc\xed\xa0z".decode(self.encoding, "surrogatepass")
+ def test_incremental_errors(self):
+ # Test that the incremental decoder can fail with final=False.
+ # See issue #24214
+ cases = [b'\x80', b'\xBF', b'\xC0', b'\xC1', b'\xF5', b'\xF6', b'\xFF']
+ for prefix in (b'\xC2', b'\xDF', b'\xE0', b'\xE0\xA0', b'\xEF',
+ b'\xEF\xBF', b'\xF0', b'\xF0\x90', b'\xF0\x90\x80',
+ b'\xF4', b'\xF4\x8F', b'\xF4\x8F\xBF'):
+ for suffix in b'\x7F', b'\xC0':
+ cases.append(prefix + suffix)
+ cases.extend((b'\xE0\x80', b'\xE0\x9F', b'\xED\xA0\x80',
+ b'\xED\xBF\xBF', b'\xF0\x80', b'\xF0\x8F', b'\xF4\x90'))
+
+ for data in cases:
+ with self.subTest(data=data):
+ dec = codecs.getincrementaldecoder(self.encoding)()
+ self.assertRaises(UnicodeDecodeError, dec.decode, data)
+
@unittest.skipUnless(sys.platform == 'win32',
'cp65001 is a Windows-only codec')
('[\U0010ffff\uDC80]', 'replace', b'[\xf4\x8f\xbf\xbf?]'),
))
+ def test_code_page_decode_flags(self):
+ # Issue #36312: For some code pages (e.g. UTF-7) flags for
+ # MultiByteToWideChar() must be set to 0.
+ for cp in (50220, 50221, 50222, 50225, 50227, 50229,
+ *range(57002, 57011+1), 65000):
+ self.assertEqual(codecs.code_page_decode(cp, b'abc'), ('abc', 3))
+ self.assertEqual(codecs.code_page_decode(42, b'abc'),
+ ('\uf061\uf062\uf063', 3))
+
def test_incremental(self):
decoded = codecs.code_page_decode(932, b'\x82', 'strict', False)
self.assertEqual(decoded, ('', 0))
self.assertEqual(data.decode('latin1'), expected)
+class StreamRecoderTest(unittest.TestCase):
+ def test_writelines(self):
+ bio = io.BytesIO()
+ codec = codecs.lookup('ascii')
+ sr = codecs.StreamRecoder(bio, codec.encode, codec.decode,
+ encodings.ascii.StreamReader, encodings.ascii.StreamWriter)
+ sr.writelines([b'a', b'b'])
+ self.assertEqual(bio.getvalue(), b'ab')
+
+ def test_write(self):
+ bio = io.BytesIO()
+ codec = codecs.lookup('latin1')
+ # Recode from Latin-1 to utf-8.
+ sr = codecs.StreamRecoder(bio, codec.encode, codec.decode,
+ encodings.utf_8.StreamReader, encodings.utf_8.StreamWriter)
+
+ text = 'àñé'
+ sr.write(text.encode('latin1'))
+ self.assertEqual(bio.getvalue(), text.encode('utf-8'))
+
+ def test_seeking_read(self):
+ bio = io.BytesIO('line1\nline2\nline3\n'.encode('utf-16-le'))
+ sr = codecs.EncodedFile(bio, 'utf-8', 'utf-16-le')
+
+ self.assertEqual(sr.readline(), b'line1\n')
+ sr.seek(0)
+ self.assertEqual(sr.readline(), b'line1\n')
+ self.assertEqual(sr.readline(), b'line2\n')
+ self.assertEqual(sr.readline(), b'line3\n')
+ self.assertEqual(sr.readline(), b'')
+
+ def test_seeking_write(self):
+ bio = io.BytesIO('123456789\n'.encode('utf-16-le'))
+ sr = codecs.EncodedFile(bio, 'utf-8', 'utf-16-le')
+
+ # Test that seek() only resets its internal buffer when offset
+ # and whence are zero.
+ sr.seek(2)
+ sr.write(b'\nabc\n')
+ self.assertEqual(sr.readline(), b'789\n')
+ sr.seek(0)
+ self.assertEqual(sr.readline(), b'1\n')
+ self.assertEqual(sr.readline(), b'abc\n')
+ self.assertEqual(sr.readline(), b'789\n')
+
+
if __name__ == "__main__":
unittest.main()
b=b.__name__,
),
)
+
+ def _copy_test(self, obj):
+ # Test internal copy
+ obj_copy = obj.copy()
+ self.assertIsNot(obj.data, obj_copy.data)
+ self.assertEqual(obj.data, obj_copy.data)
+
+ # Test copy.copy
+ obj.test = [1234] # Make sure instance vars are also copied.
+ obj_copy = copy.copy(obj)
+ self.assertIsNot(obj.data, obj_copy.data)
+ self.assertEqual(obj.data, obj_copy.data)
+ self.assertIs(obj.test, obj_copy.test)
+
def test_str_protocol(self):
self._superset_test(UserString, str)
def test_dict_protocol(self):
self._superset_test(UserDict, dict)
+ def test_list_copy(self):
+ obj = UserList()
+ obj.append(123)
+ self._copy_test(obj)
+
+ def test_dict_copy(self):
+ obj = UserDict()
+ obj[123] = "abc"
+ self._copy_test(obj)
+
################################################################################
### ChainMap (helper class for configparser and the string module)
def test_defaults(self):
Point = namedtuple('Point', 'x y', defaults=(10, 20)) # 2 defaults
- self.assertEqual(Point._fields_defaults, {'x': 10, 'y': 20})
+ self.assertEqual(Point._field_defaults, {'x': 10, 'y': 20})
self.assertEqual(Point(1, 2), (1, 2))
self.assertEqual(Point(1), (1, 20))
self.assertEqual(Point(), (10, 20))
Point = namedtuple('Point', 'x y', defaults=(20,)) # 1 default
- self.assertEqual(Point._fields_defaults, {'y': 20})
+ self.assertEqual(Point._field_defaults, {'y': 20})
self.assertEqual(Point(1, 2), (1, 2))
self.assertEqual(Point(1), (1, 20))
Point = namedtuple('Point', 'x y', defaults=()) # 0 defaults
- self.assertEqual(Point._fields_defaults, {})
+ self.assertEqual(Point._field_defaults, {})
self.assertEqual(Point(1, 2), (1, 2))
with self.assertRaises(TypeError):
Point(1)
Point = namedtuple('Point', 'x y', defaults=False)
Point = namedtuple('Point', 'x y', defaults=None) # default is None
- self.assertEqual(Point._fields_defaults, {})
+ self.assertEqual(Point._field_defaults, {})
self.assertIsNone(Point.__new__.__defaults__, None)
self.assertEqual(Point(10, 20), (10, 20))
with self.assertRaises(TypeError): # catch too few args
Point(10)
Point = namedtuple('Point', 'x y', defaults=[10, 20]) # allow non-tuple iterable
- self.assertEqual(Point._fields_defaults, {'x': 10, 'y': 20})
+ self.assertEqual(Point._field_defaults, {'x': 10, 'y': 20})
self.assertEqual(Point.__new__.__defaults__, (10, 20))
self.assertEqual(Point(1, 2), (1, 2))
self.assertEqual(Point(1), (1, 20))
self.assertEqual(Point(), (10, 20))
Point = namedtuple('Point', 'x y', defaults=iter([10, 20])) # allow plain iterator
- self.assertEqual(Point._fields_defaults, {'x': 10, 'y': 20})
+ self.assertEqual(Point._field_defaults, {'x': 10, 'y': 20})
self.assertEqual(Point.__new__.__defaults__, (10, 20))
self.assertEqual(Point(1, 2), (1, 2))
self.assertEqual(Point(1), (1, 20))
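The hunk above renames `_fields_defaults` to `_field_defaults`. A minimal standalone illustration of the attribute and of how defaults apply right-to-left:

```python
from collections import namedtuple

# defaults are assigned to the rightmost fields; the mapping is
# exposed as the _field_defaults class attribute.
Point = namedtuple('Point', 'x y', defaults=(10, 20))

print(Point._field_defaults)  # {'x': 10, 'y': 20}
print(Point())                # Point(x=10, y=20)
print(Point(1))               # Point(x=1, y=20)
```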
def mul(x, y):
return x * y
+def capture(*args, **kwargs):
+ return args, kwargs
+
def sleep_and_raise(t):
time.sleep(t)
raise Exception('this is an exception')
def test_submit_keyword(self):
future = self.executor.submit(mul, 2, y=8)
self.assertEqual(16, future.result())
+ future = self.executor.submit(capture, 1, self=2, fn=3)
+ self.assertEqual(future.result(), ((1,), {'self': 2, 'fn': 3}))
+ future = self.executor.submit(fn=capture, arg=1)
+ self.assertEqual(future.result(), ((), {'arg': 1}))
+ with self.assertRaises(TypeError):
+ self.executor.submit(arg=1)
def test_map(self):
self.assertEqual(
class ProcessPoolExecutorTest(ExecutorTest):
+
+ @unittest.skipUnless(sys.platform == 'win32', 'Windows-only process limit')
+ def test_max_workers_too_large(self):
+ with self.assertRaisesRegex(ValueError,
+ "max_workers must be <= 61"):
+ futures.ProcessPoolExecutor(max_workers=62)
+
def test_killed_child(self):
# When a child process is abruptly terminated, the whole pool gets
# "broken".
f.add_done_callback(fn)
self.assertTrue(was_cancelled)
+ def test_done_callback_raises_already_succeeded(self):
+ with test.support.captured_stderr() as stderr:
+ def raising_fn(callback_future):
+ raise Exception('doh!')
+
+ f = Future()
+
+ # Set the result first to simulate a future that runs instantly,
+ # effectively allowing the callback to be run immediately.
+ f.set_result(5)
+ f.add_done_callback(raising_fn)
+
+ self.assertIn('exception calling callback for', stderr.getvalue())
+ self.assertIn('doh!', stderr.getvalue())
+
+
def test_repr(self):
self.assertRegex(repr(PENDING_FUTURE),
'<Future at 0x[0-9a-f]+ state=pending>')
((), dict(example=1)),
((1,), dict(example=1)),
((1,2), dict(example=1)),
+ ((1,2), dict(self=3, callback=4)),
]
result = []
def _exit(*args, **kwds):
self.assertIsNone(wrapper[1].__doc__, _exit.__doc__)
self.assertEqual(result, expected)
+ result = []
+ with self.exit_stack() as stack:
+ with self.assertRaises(TypeError):
+ stack.callback(arg=1)
+ with self.assertRaises(TypeError):
+ self.exit_stack.callback(arg=2)
+ stack.callback(callback=_exit, arg=3)
+ self.assertEqual(result, [((), {'arg': 3})])
+
def test_push(self):
exc_raised = ZeroDivisionError
def _expect_exc(exc_type, exc, exc_tb):
self.assertEqual(result, expected)
+ result = []
+ async with AsyncExitStack() as stack:
+ with self.assertRaises(TypeError):
+ stack.push_async_callback(arg=1)
+ with self.assertRaises(TypeError):
+ self.exit_stack.push_async_callback(arg=2)
+ stack.push_async_callback(callback=_exit, arg=3)
+ self.assertEqual(result, [((), {'arg': 3})])
+
@_async_test
async def test_async_push(self):
exc_raised = ZeroDivisionError
}
)
- # Make sure that the returned dicts are actuall OrderedDicts.
+ # Make sure that the returned dicts are actually OrderedDicts.
self.assertIs(type(d), OrderedDict)
self.assertIs(type(d['y'][1]), OrderedDict)
a.setstate(100)
self.assertEqual(a.getstate(), 100)
+ def test_wrap_lenfunc_bad_cast(self):
+ self.assertEqual(range(sys.maxsize).__len__(), sys.maxsize)
+
+
class ClassPropertiesAndMethods(unittest.TestCase):
def assertHasAttr(self, obj, name):
self.assertEqual(x2, SubSpam)
self.assertEqual(a2, a1)
self.assertEqual(d2, d1)
- with self.assertRaises(TypeError):
+
+ with self.assertRaises(TypeError) as cm:
spam_cm()
- with self.assertRaises(TypeError):
+ self.assertEqual(
+ str(cm.exception),
+ "descriptor 'classmeth' of 'xxsubtype.spamlist' "
+ "object needs an argument")
+
+ with self.assertRaises(TypeError) as cm:
spam_cm(spam.spamlist())
- with self.assertRaises(TypeError):
+ self.assertEqual(
+ str(cm.exception),
+ "descriptor 'classmeth' requires a type "
+ "but received a 'xxsubtype.spamlist' instance")
+
+ with self.assertRaises(TypeError) as cm:
spam_cm(list)
+ self.assertEqual(
+ str(cm.exception),
+ "descriptor 'classmeth' requires a subtype of 'xxsubtype.spamlist' "
+ "but received 'list'")
def test_staticmethods(self):
# Testing static methods...
self.assertIn("unlocked", repr(self.lock))
+class RLockTests(unittest.TestCase):
+ """Test dummy RLock objects."""
+
+ def setUp(self):
+ self.rlock = _thread.RLock()
+
+ def test_multiple_acquire(self):
+ self.assertIn("unlocked", repr(self.rlock))
+ self.rlock.acquire()
+ self.rlock.acquire()
+ self.assertIn("locked", repr(self.rlock))
+ self.rlock.release()
+ self.assertIn("locked", repr(self.rlock))
+ self.rlock.release()
+ self.assertIn("unlocked", repr(self.rlock))
+ self.assertRaises(RuntimeError, self.rlock.release)
+
+
class MiscTests(unittest.TestCase):
"""Miscellaneous tests."""
func = mock.Mock(side_effect=Exception)
_thread.start_new_thread(func, tuple())
self.assertTrue(mock_print_exc.called)
+
+if __name__ == '__main__':
+ unittest.main()
'=?us-ascii?q?first?==?utf-8?q?second?=',
'first',
'first',
- [],
+ [errors.InvalidHeaderDefect],
'=?utf-8?q?second?=')
def test_get_encoded_word_sets_extra_attributes(self):
'=?utf-8?q?foo?==?utf-8?q?bar?=',
'foobar',
'foobar',
+ [errors.InvalidHeaderDefect,
+ errors.InvalidHeaderDefect],
+ '')
+
+ def test_get_unstructured_ew_without_leading_whitespace(self):
+ self._test_get_x(
+ self._get_unst,
+ 'nowhitespace=?utf-8?q?somevalue?=',
+ 'nowhitespacesomevalue',
+ 'nowhitespacesomevalue',
+ [errors.InvalidHeaderDefect],
+ '')
+
+ def test_get_unstructured_ew_without_trailing_whitespace(self):
+ self._test_get_x(
+ self._get_unst,
+ '=?utf-8?q?somevalue?=nowhitespace',
+ 'somevaluenowhitespace',
+ 'somevaluenowhitespace',
[errors.InvalidHeaderDefect],
'')
'"=?utf-8?Q?not_really_valid?="',
'"not really valid"',
'not really valid',
- [errors.InvalidHeaderDefect],
+ [errors.InvalidHeaderDefect,
+ errors.InvalidHeaderDefect],
'')
# get_comment
g.flatten(msg)
self.assertEqual(b.getvalue(), source + b'>From R\xc3\xb6lli\n')
+ def test_multipart_with_bad_bytes_in_cte(self):
+ # bpo30835
+ source = textwrap.dedent("""\
+ From: aperson@example.com
+ Content-Type: multipart/mixed; boundary="1"
+ Content-Transfer-Encoding: \xc8
+ """).encode('utf-8')
+ msg = email.message_from_bytes(source)
+
# Test the basic MIMEAudio class
class TestMIMEAudio(unittest.TestCase):
msg['SomeHeader'] = ' value with leading ws'
self.assertEqual(str(msg), "SomeHeader: value with leading ws\n\n")
+ def test_whitespace_header(self):
+ self.assertEqual(Header(' ').encode(), ' ')
+
# Test RFC 2231 header parameters (en/de)coding
from email import message_from_string, message_from_bytes
from email.message import EmailMessage
from email.generator import Generator, BytesGenerator
+from email.headerregistry import Address
from email import policy
from test.test_email import TestEmailBase, parameterize
g.flatten(msg)
self.assertEqual(s.getvalue(), expected)
+ def test_smtp_policy(self):
+ msg = EmailMessage()
+ msg["From"] = Address(addr_spec="foo@bar.com", display_name="Páolo")
+ msg["To"] = Address(addr_spec="bar@foo.com", display_name="Dinsdale")
+ msg["Subject"] = "Nudge nudge, wink, wink"
+ msg.set_content("oh boy, know what I mean, know what I mean?")
+ expected = textwrap.dedent("""\
+ From: =?utf-8?q?P=C3=A1olo?= <foo@bar.com>
+ To: Dinsdale <bar@foo.com>
+ Subject: Nudge nudge, wink, wink
+ Content-Type: text/plain; charset="utf-8"
+ Content-Transfer-Encoding: 7bit
+ MIME-Version: 1.0
+
+ oh boy, know what I mean, know what I mean?
+ """).encode().replace(b"\n", b"\r\n")
+ s = io.BytesIO()
+ g = BytesGenerator(s, policy=policy.SMTP)
+ g.flatten(msg)
+ self.assertEqual(s.getvalue(), expected)
+
if __name__ == '__main__':
unittest.main()
'rfc2047_atom_in_quoted_string_is_decoded':
('"=?utf-8?q?=C3=89ric?=" <foo@example.com>',
- [errors.InvalidHeaderDefect],
+ [errors.InvalidHeaderDefect,
+ errors.InvalidHeaderDefect],
'Éric <foo@example.com>',
'Éric',
'foo@example.com',
self.assertEqual(
h.fold(policy=policy.default),
'X-Report-Abuse: =?utf-8?q?=3Chttps=3A//www=2Emailitapp=2E'
- 'com/report=5F?=\n'
- ' =?utf-8?q?abuse=2Ephp=3Fmid=3Dxxx-xxx-xxxx'
- 'xxxxxxxxxxxxxxxxxxxx=3D=3D-xxx-?=\n'
- ' =?utf-8?q?xx-xx=3E?=\n')
+ 'com/report=5Fabuse?=\n'
+ ' =?utf-8?q?=2Ephp=3Fmid=3Dxxx-xxx-xxxx'
+ 'xxxxxxxxxxxxxxxxxxxx=3D=3D-xxx-xx-xx?=\n'
+ ' =?utf-8?q?=3E?=\n')
if __name__ == '__main__':
m['Subject'] = 'unicöde'
self.assertEqual(str(m), 'Subject: unicöde\n\n')
+ def test_folding_with_utf8_encoding_1(self):
+ # bpo-36520
+ #
+ # Fold a line that contains UTF-8 words before
+ # and after the whitespace fold point, where the
+ # line length limit is reached within an ASCII
+ # word.
+
+ m = EmailMessage()
+ m['Subject'] = 'Hello Wörld! Hello Wörld! ' \
+ 'Hello Wörld! Hello Wörld!Hello Wörld!'
+ self.assertEqual(bytes(m),
+ b'Subject: Hello =?utf-8?q?W=C3=B6rld!_Hello_W'
+ b'=C3=B6rld!_Hello_W=C3=B6rld!?=\n'
+ b' Hello =?utf-8?q?W=C3=B6rld!Hello_W=C3=B6rld!?=\n\n')
+
+
+ def test_folding_with_utf8_encoding_2(self):
+ # bpo-36520
+ #
+ # Fold a line that contains UTF-8 words before
+ # and after the whitespace fold point, where the
+ # line length limit is reached at the end of an
+ # encoded word.
+
+ m = EmailMessage()
+ m['Subject'] = 'Hello Wörld! Hello Wörld! ' \
+ 'Hello Wörlds123! Hello Wörld!Hello Wörld!'
+ self.assertEqual(bytes(m),
+ b'Subject: Hello =?utf-8?q?W=C3=B6rld!_Hello_W'
+ b'=C3=B6rld!_Hello_W=C3=B6rlds123!?=\n'
+ b' Hello =?utf-8?q?W=C3=B6rld!Hello_W=C3=B6rld!?=\n\n')
+
+ def test_folding_with_utf8_encoding_3(self):
+ # bpo-36520
+ #
+ # Fold a line that contains UTF-8 words before
+ # and after the whitespace fold point, where the
+ # line length limit is reached at the end of the
+ # first word.
+
+ m = EmailMessage()
+ m['Subject'] = 'Hello-Wörld!-Hello-Wörld!-Hello-Wörlds123! ' \
+ 'Hello Wörld!Hello Wörld!'
+ self.assertEqual(bytes(m),
+ b'Subject: =?utf-8?q?Hello-W=C3=B6rld!-Hello-W'
+ b'=C3=B6rld!-Hello-W=C3=B6rlds123!?=\n'
+ b' Hello =?utf-8?q?W=C3=B6rld!Hello_W=C3=B6rld!?=\n\n')
+
+ def test_folding_with_utf8_encoding_4(self):
+ # bpo-36520
+ #
+ # Fold a line that contains UTF-8 words before
+ # and after the fold point, where the first
+ # word is UTF-8 and the fold point is within
+ # the word.
+
+ m = EmailMessage()
+ m['Subject'] = 'Hello-Wörld!-Hello-Wörld!-Hello-Wörlds123!-Hello' \
+ ' Wörld!Hello Wörld!'
+ self.assertEqual(bytes(m),
+ b'Subject: =?utf-8?q?Hello-W=C3=B6rld!-Hello-W'
+ b'=C3=B6rld!-Hello-W=C3=B6rlds123!?=\n'
+ b' =?utf-8?q?-Hello_W=C3=B6rld!Hello_W=C3=B6rld!?=\n\n')
+
+ def test_folding_with_utf8_encoding_5(self):
+ # bpo-36520
+ #
+ # Fold a line that contains a UTF-8 word after
+ # the fold point.
+
+ m = EmailMessage()
+ m['Subject'] = '123456789 123456789 123456789 123456789 123456789' \
+ ' 123456789 123456789 Hello Wörld!'
+ self.assertEqual(bytes(m),
+ b'Subject: 123456789 123456789 123456789 123456789'
+ b' 123456789 123456789 123456789\n'
+ b' Hello =?utf-8?q?W=C3=B6rld!?=\n\n')
+
+ def test_folding_with_utf8_encoding_6(self):
+ # bpo-36520
+ #
+ # Fold a line that contains a UTF-8 word before
+ # the fold point and ASCII words after
+
+ m = EmailMessage()
+ m['Subject'] = '123456789 123456789 123456789 123456789 Hello Wörld!' \
+ ' 123456789 123456789 123456789 123456789 123456789' \
+ ' 123456789'
+ self.assertEqual(bytes(m),
+ b'Subject: 123456789 123456789 123456789 123456789'
+ b' Hello =?utf-8?q?W=C3=B6rld!?=\n 123456789 '
+ b'123456789 123456789 123456789 123456789 '
+ b'123456789\n\n')
+
+ def test_folding_with_utf8_encoding_7(self):
+ # bpo-36520
+ #
+ # Fold a line twice that contains UTF-8 words before
+ # and after the first fold point, and ASCII words
+ # after the second fold point.
+
+ m = EmailMessage()
+ m['Subject'] = '123456789 123456789 Hello Wörld! Hello Wörld! ' \
+ '123456789-123456789 123456789 Hello Wörld! 123456789' \
+ ' 123456789'
+ self.assertEqual(bytes(m),
+ b'Subject: 123456789 123456789 Hello =?utf-8?q?'
+ b'W=C3=B6rld!_Hello_W=C3=B6rld!?=\n'
+ b' 123456789-123456789 123456789 Hello '
+ b'=?utf-8?q?W=C3=B6rld!?= 123456789\n 123456789\n\n')
+
+ def test_folding_with_utf8_encoding_8(self):
+ # bpo-36520
+ #
+ # Fold a line twice that contains UTF-8 words before
+ # the first fold point, and ASCII words after the
+ # first fold point, and UTF-8 words after the second
+ # fold point.
+
+ m = EmailMessage()
+ m['Subject'] = '123456789 123456789 Hello Wörld! Hello Wörld! ' \
+ '123456789 123456789 123456789 123456789 123456789 ' \
+ '123456789-123456789 123456789 Hello Wörld! 123456789' \
+ ' 123456789'
+ self.assertEqual(bytes(m),
+ b'Subject: 123456789 123456789 Hello '
+ b'=?utf-8?q?W=C3=B6rld!_Hello_W=C3=B6rld!?=\n 123456789 '
+ b'123456789 123456789 123456789 123456789 '
+ b'123456789-123456789\n 123456789 Hello '
+ b'=?utf-8?q?W=C3=B6rld!?= 123456789 123456789\n\n')
class TestMIMEPart(TestEmailMessageBase, TestEmailBase):
# Doing the full test run here may seem a bit redundant, since the two
import io
+import sys
import types
import textwrap
import unittest
for attr, value in expected.items():
self.assertEqual(getattr(added, attr), value)
+ def test_fold_zero_max_line_length(self):
+ expected = 'Subject: =?utf-8?q?=C3=A1?=\n'
+
+ msg = email.message.EmailMessage()
+ msg['Subject'] = 'á'
+
+ p1 = email.policy.default.clone(max_line_length=0)
+ p2 = email.policy.default.clone(max_line_length=None)
+
+ self.assertEqual(p1.fold('Subject', msg['Subject']), expected)
+ self.assertEqual(p2.fold('Subject', msg['Subject']), expected)
+
def test_register_defect(self):
class Dummy:
def __init__(self):
email.policy.EmailPolicy.header_factory)
self.assertEqual(newpolicy.__dict__, {'raise_on_defect': True})
+ def test_non_ascii_chars_do_not_cause_inf_loop(self):
+ policy = email.policy.default.clone(max_line_length=20)
+ actual = policy.fold('Subject', 'ą' * 12)
+ self.assertEqual(
+ actual,
+ 'Subject: \n' +
+ 12 * ' =?utf-8?q?=C4=85?=\n')
+
# XXX: Need subclassing tests.
# For adding subclassed objects, make sure the usual rules apply (subclass
# wins), but that the order still works (right overrides left).
positional = functools.partialmethod(capture, 1)
keywords = functools.partialmethod(capture, a=2)
both = functools.partialmethod(capture, 3, b=4)
+ spec_keywords = functools.partialmethod(capture, self=1, func=2)
nested = functools.partialmethod(positional, 5)
self.assertEqual(self.A.both(self.a, 5, c=6), ((self.a, 3, 5), {'b': 4, 'c': 6}))
+ self.assertEqual(self.a.spec_keywords(), ((self.a,), {'self': 1, 'func': 2}))
+
def test_nested(self):
self.assertEqual(self.a.nested(), ((self.a, 1, 5), {}))
self.assertEqual(self.a.nested(6), ((self.a, 1, 5, 6), {}))
with self.assertRaises(TypeError):
class B(object):
method = functools.partialmethod(None, 1)
+ with self.assertRaises(TypeError):
+ class B:
+ method = functools.partialmethod()
+ class B:
+ method = functools.partialmethod(func=capture, a=1)
+ b = B()
+ self.assertEqual(b.method(2, x=3), ((b, 2), {'a': 1, 'x': 3}))
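The `partialmethod` hunk above depends on `functools.partial`'s own parameters being positional-only, so names like `self` and `func` pass through cleanly as keyword arguments for the wrapped callable. A standalone sketch:

```python
import functools

def capture(*args, **kwargs):
    return args, kwargs

# 'self' and 'func' do not collide with partial's own parameters;
# they are simply forwarded to capture() as keywords.
p = functools.partial(capture, self=1, func=2)
print(p())   # ((), {'self': 1, 'func': 2})
print(p(0))  # ((0,), {'self': 1, 'func': 2})
```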
def test_repr(self):
self.assertEqual(repr(vars(self.A)['both']),
self.assertEqual(f(20), '.20.')
self.assertEqual(f.cache_info().currsize, 10)
+ def test_lru_bug_36650(self):
+ # C version of lru_cache was treating a call with an empty **kwargs
+ # dictionary as being distinct from a call with no keywords at all.
+ # This did not result in an incorrect answer, but it did trigger
+ # an unexpected cache miss.
+
+ @self.module.lru_cache()
+ def f(x):
+ pass
+
+ f(0)
+ f(0, **{})
+ self.assertEqual(f.cache_info().hits, 1)
+
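The bpo-36650 hunk above asserts that a call with an empty `**kwargs` dict hits the same cache entry as a call with no keywords. A minimal reproduction (the assertion holds on interpreters with the fix):

```python
import functools

@functools.lru_cache()
def f(x):
    return x * x

f(0)          # miss: populates the cache
f(0, **{})    # empty **kwargs must map to the same cache key
print(f.cache_info().hits)  # 1
```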
def test_lru_hash_only_once(self):
# To protect against weird reentrancy bugs and to improve
# efficiency when faced with slow __hash__ methods, the
self.assertEqual(getitem_args[0], (C, (int, str)))
self.assertEqual(getitem_args[1], {})
- def test_class_getitem(self):
+ def test_class_getitem_format(self):
class C:
def __class_getitem__(cls, item):
return f'C[{item.__name__}]'
self.assertEqual(out[:2], b"\x1f\x8b")
@create_and_remove_directory(TEMPDIR)
- def test_compress_infile_outfile(self):
+ def test_compress_infile_outfile_default(self):
local_testgzip = os.path.join(TEMPDIR, 'testgzip')
gzipname = local_testgzip + '.gz'
self.assertFalse(os.path.exists(gzipname))
import itertools
import os
import array
+import re
import socket
import threading
body = (
b'HTTP/1.1 200 OK\r\n'
b'First: val\r\n'
- b'Second: val\r\n'
+ b'Second: val1\r\n'
+ b'Second: val2\r\n'
)
sock = FakeSocket(body)
resp = client.HTTPResponse(sock, debuglevel=1)
lines = output.getvalue().splitlines()
self.assertEqual(lines[0], "reply: 'HTTP/1.1 200 OK\\r\\n'")
self.assertEqual(lines[1], "header: First: val")
- self.assertEqual(lines[2], "header: Second: val")
+ self.assertEqual(lines[2], "header: Second: val1")
+ self.assertEqual(lines[3], "header: Second: val2")
class TransferEncodingTest(TestCase):
# We feed the server's cert as a validating cert
import ssl
support.requires('network')
- with support.transient_internet('self-signed.pythontest.net'):
+ selfsigned_pythontestdotnet = 'self-signed.pythontest.net'
+ with support.transient_internet(selfsigned_pythontestdotnet):
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
self.assertEqual(context.verify_mode, ssl.CERT_REQUIRED)
self.assertEqual(context.check_hostname, True)
context.load_verify_locations(CERT_selfsigned_pythontestdotnet)
- h = client.HTTPSConnection('self-signed.pythontest.net', 443, context=context)
- h.request('GET', '/')
- resp = h.getresponse()
+ try:
+ h = client.HTTPSConnection(selfsigned_pythontestdotnet, 443,
+ context=context)
+ h.request('GET', '/')
+ resp = h.getresponse()
+ except ssl.SSLError as ssl_err:
+ ssl_err_str = str(ssl_err)
+ # On modern Linux distros (Debian Buster, etc.) with their default
+ # OpenSSL configuration, verification fails with
+ # [SSL: CERTIFICATE_VERIFY_FAILED] and "key too weak" until we
+ # address https://bugs.python.org/issue36816 to use a proper
+ # key size on self-signed.pythontest.net.
+ if re.search(r'(?i)key.too.weak', ssl_err_str):
+ raise unittest.SkipTest(
+ f'Got {ssl_err_str} trying to connect '
+ f'to {selfsigned_pythontestdotnet}. '
+ 'See https://bugs.python.org/issue36816.')
+ raise
server_string = resp.getheader('server')
resp.close()
h.close()
self.assertEqual(h, c.host)
self.assertEqual(p, c.port)
+ def test_tls13_pha(self):
+ import ssl
+ if not ssl.HAS_TLSv1_3:
+ self.skipTest('TLS 1.3 support required')
+ # just check status of PHA flag
+ h = client.HTTPSConnection('localhost', 443)
+ self.assertTrue(h._context.post_handshake_auth)
+
+ context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+ self.assertFalse(context.post_handshake_auth)
+ h = client.HTTPSConnection('localhost', 443, context=context)
+ self.assertIs(h._context, context)
+ self.assertFalse(h._context.post_handshake_auth)
+
+ h = client.HTTPSConnection('localhost', 443, context=context,
+ cert_file=CERT_localhost)
+ self.assertTrue(h._context.post_handshake_auth)
+
class RequestBodyTest(TestCase):
"""Test cases where a request includes a message body."""
except socket.error:
pass
- expected_errnos = [
- # This is the exception that should be raised.
- errno.ECONNREFUSED,
- ]
- if hasattr(errno, 'EADDRNOTAVAIL'):
- # socket.create_connection() fails randomly with
- # EADDRNOTAVAIL on Travis CI.
- expected_errnos.append(errno.EADDRNOTAVAIL)
+ # This is the exception that should be raised.
+ expected_errnos = support.get_socket_conn_refused_errs()
with self.assertRaises(OSError) as cm:
imaplib.IMAP4()
self.assertIn(cm.exception.errno, expected_errnos)
self.assertEqual(typ, 'OK')
self.assertEqual(data[0], b'LOGIN completed')
typ, data = client.logout()
- self.assertEqual(typ, 'BYE')
- self.assertEqual(data[0], b'IMAP4ref1 Server logging out')
+ self.assertEqual(typ, 'BYE', (typ, data))
+ self.assertEqual(data[0], b'IMAP4ref1 Server logging out', (typ, data))
self.assertEqual(client.state, 'LOGOUT')
def test_lsub(self):
with transient_internet(self.host):
rs = self.server.logout()
self.server = None
- self.assertEqual(rs[0], 'BYE')
+ self.assertEqual(rs[0], 'BYE', rs)
@unittest.skipUnless(ssl, "SSL not available")
with transient_internet(self.host):
_server = self.imap_class(self.host, self.port)
rs = _server.logout()
- self.assertEqual(rs[0], 'BYE')
+ self.assertEqual(rs[0], 'BYE', rs)
def test_ssl_context_certfile_exclusive(self):
with transient_internet(self.host):
def test_init(self):
with self.assertRaises(TypeError):
- # Classes that dono't define exec_module() trigger TypeError.
+ # Classes that don't define exec_module() trigger TypeError.
util.LazyLoader(object)
def new_module(self, source_code=None):
def test_getfile(self):
self.assertEqual(inspect.getfile(mod.StupidGit), mod.__file__)
+ def test_getfile_builtin_module(self):
+ with self.assertRaises(TypeError) as e:
+ inspect.getfile(sys)
+ self.assertTrue(str(e.exception).startswith('<module'))
+
+ def test_getfile_builtin_class(self):
+ with self.assertRaises(TypeError) as e:
+ inspect.getfile(int)
+ self.assertTrue(str(e.exception).startswith('<class'))
+
+ def test_getfile_builtin_function_or_method(self):
+ with self.assertRaises(TypeError) as e_abs:
+ inspect.getfile(abs)
+ self.assertIn('expected, got', str(e_abs.exception))
+ with self.assertRaises(TypeError) as e_append:
+ inspect.getfile(list.append)
+ self.assertIn('expected, got', str(e_append.exception))
+
def test_getfile_class_without_module(self):
class CM(type):
@property
def test_repr(self):
raw = self.MockRawIO()
b = self.tp(raw)
- clsname = "%s.%s" % (self.tp.__module__, self.tp.__qualname__)
- self.assertEqual(repr(b), "<%s>" % clsname)
+ clsname = r"(%s\.)?%s" % (self.tp.__module__, self.tp.__qualname__)
+ self.assertRegex(repr(b), "<%s>" % clsname)
raw.name = "dummy"
- self.assertEqual(repr(b), "<%s name='dummy'>" % clsname)
+ self.assertRegex(repr(b), "<%s name='dummy'>" % clsname)
raw.name = b"dummy"
- self.assertEqual(repr(b), "<%s name=b'dummy'>" % clsname)
+ self.assertRegex(repr(b), "<%s name=b'dummy'>" % clsname)
def test_recursive_repr(self):
# Issue #25455
b = self.BufferedReader(raw)
t = self.TextIOWrapper(b, encoding="utf-8")
modname = self.TextIOWrapper.__module__
- self.assertEqual(repr(t),
- "<%s.TextIOWrapper encoding='utf-8'>" % modname)
+ self.assertRegex(repr(t),
+ r"<(%s\.)?TextIOWrapper encoding='utf-8'>" % modname)
raw.name = "dummy"
- self.assertEqual(repr(t),
- "<%s.TextIOWrapper name='dummy' encoding='utf-8'>" % modname)
+ self.assertRegex(repr(t),
+ r"<(%s\.)?TextIOWrapper name='dummy' encoding='utf-8'>" % modname)
t.mode = "r"
- self.assertEqual(repr(t),
- "<%s.TextIOWrapper name='dummy' mode='r' encoding='utf-8'>" % modname)
+ self.assertRegex(repr(t),
+ r"<(%s\.)?TextIOWrapper name='dummy' mode='r' encoding='utf-8'>" % modname)
raw.name = b"dummy"
- self.assertEqual(repr(t),
- "<%s.TextIOWrapper name=b'dummy' mode='r' encoding='utf-8'>" % modname)
+ self.assertRegex(repr(t),
+ r"<(%s\.)?TextIOWrapper name=b'dummy' mode='r' encoding='utf-8'>" % modname)
t.buffer.detach()
repr(t) # Should not raise an exception
err = res.err.decode()
if res.rc != 0:
# Failure: should be a fatal error
- self.assertIn("Fatal Python error: could not acquire lock "
- "for <_io.BufferedWriter name='<{stream_name}>'> "
- "at interpreter shutdown, possibly due to "
- "daemon threads".format_map(locals()),
- err)
+ pattern = (r"Fatal Python error: could not acquire lock "
+ r"for <(_io\.)?BufferedWriter name='<{stream_name}>'> "
+ r"at interpreter shutdown, possibly due to "
+ r"daemon threads".format_map(locals()))
+ self.assertRegex(err, pattern)
else:
self.assertFalse(err.strip('.!'))
assertBadNetmask("1.1.1.1", "pudding")
assertBadNetmask("1.1.1.1", "::")
+ def test_netmask_in_tuple_errors(self):
+ def assertBadNetmask(addr, netmask):
+ msg = "%r is not a valid netmask" % netmask
+ with self.assertNetmaskError(re.escape(msg)):
+ self.factory((addr, netmask))
+ assertBadNetmask("1.1.1.1", -1)
+ assertBadNetmask("1.1.1.1", 33)
+
def test_pickle(self):
self.pickle_test('192.0.2.0/27')
self.pickle_test('192.0.2.0/31') # IPV4LENGTH - 1
assertBadNetmask("::1", "pudding")
assertBadNetmask("::", "::")
+ def test_netmask_in_tuple_errors(self):
+ def assertBadNetmask(addr, netmask):
+ msg = "%r is not a valid netmask" % netmask
+ with self.assertNetmaskError(re.escape(msg)):
+ self.factory((addr, netmask))
+ assertBadNetmask("::1", -1)
+ assertBadNetmask("::1", 129)
+
def test_pickle(self):
self.pickle_test('2001:db8::1000/124')
self.pickle_test('2001:db8::1000/127') # IPV6LENGTH - 1
import struct
import sys
import tempfile
-from test.support.script_helper import assert_python_ok
+from test.support.script_helper import assert_python_ok, assert_python_failure
from test import support
import textwrap
import threading
# register_at_fork mechanism is also present and used.
@unittest.skipIf(not hasattr(os, 'fork'), 'Test requires os.fork().')
def test_post_fork_child_no_deadlock(self):
- """Ensure forked child logging locks are not held; bpo-6721."""
- refed_h = logging.Handler()
+ """Ensure child logging locks are not held; bpo-6721 & bpo-36533."""
+ class _OurHandler(logging.Handler):
+ def __init__(self):
+ super().__init__()
+ self.sub_handler = logging.StreamHandler(
+ stream=open('/dev/null', 'wt'))
+
+ def emit(self, record):
+ self.sub_handler.acquire()
+ try:
+ self.sub_handler.emit(record)
+ finally:
+ self.sub_handler.release()
+
+ self.assertEqual(len(logging._handlers), 0)
+ refed_h = _OurHandler()
+ self.addCleanup(refed_h.sub_handler.stream.close)
refed_h.name = 'because we need at least one for this test'
self.assertGreater(len(logging._handlers), 0)
+ self.assertGreater(len(logging._at_fork_reinit_lock_weakset), 1)
+ test_logger = logging.getLogger('test_post_fork_child_no_deadlock')
+ test_logger.addHandler(refed_h)
+ test_logger.setLevel(logging.DEBUG)
locks_held__ready_to_fork = threading.Event()
fork_happened__release_locks_and_end_thread = threading.Event()
locks_held__ready_to_fork.wait()
pid = os.fork()
if pid == 0: # Child.
- logging.error(r'Child process did not deadlock. \o/')
- os._exit(0)
+ try:
+ test_logger.info(r'Child process did not deadlock. \o/')
+ finally:
+ os._exit(0)
else: # Parent.
+ test_logger.info(r'Parent process returned from fork. \o/')
fork_happened__release_locks_and_end_thread.set()
lock_holder_thread.join()
start_time = time.monotonic()
while True:
+ test_logger.debug('Waiting for child process.')
waited_pid, status = os.waitpid(pid, os.WNOHANG)
if waited_pid == pid:
break # child process exited.
if time.monotonic() - start_time > 7:
break # so long? implies child deadlock.
time.sleep(0.05)
+ test_logger.debug('Done waiting.')
if waited_pid != pid:
os.kill(pid, signal.SIGKILL)
waited_pid, status = os.waitpid(pid, 0)
def handleError(self, record):
self.error_record = record
+class StreamWithIntName(object):
+ level = logging.NOTSET
+ name = 2
+
class StreamHandlerTest(BaseTest):
def test_error_handling(self):
h = TestStreamHandler(BadStream())
actual = h.setStream(old)
self.assertIsNone(actual)
+ def test_can_represent_stream_with_int_name(self):
+ h = logging.StreamHandler(StreamWithIntName())
+ self.assertEqual(repr(h), '<StreamHandler 2 (NOTSET)>')
+
# -- The following section could be moved into a server_helper.py module
# -- if it proves to be of wider utility than just test_logging
listener.stop()
self.assertEqual(self.stream.getvalue().strip().count('Traceback'), 1)
+ @unittest.skipUnless(hasattr(logging.handlers, 'QueueListener'),
+ 'logging.handlers.QueueListener required for this test')
+ def test_queue_listener_with_multiple_handlers(self):
+ # Test that queue handler format doesn't affect other handler formats (bpo-35726).
+ self.que_hdlr.setFormatter(self.root_formatter)
+ self.que_logger.addHandler(self.root_hdlr)
+
+ listener = logging.handlers.QueueListener(self.queue, self.que_hdlr)
+ listener.start()
+ self.que_logger.error("error")
+ listener.stop()
+ self.assertEqual(self.stream.getvalue().strip(), "que -> ERROR: error")
+
if hasattr(logging.handlers, 'QueueListener'):
import multiprocessing
from unittest.mock import patch
[m.msg if isinstance(m, logging.LogRecord)
else m for m in items]))
+ def test_calls_task_done_after_stop(self):
+ # Issue 36813: Make sure queue.join does not deadlock.
+ log_queue = queue.Queue()
+ listener = logging.handlers.QueueListener(log_queue)
+ listener.start()
+ listener.stop()
+ with self.assertRaises(ValueError):
+ # Make sure all tasks are done and .join won't block.
+ log_queue.task_done()
+
ZERO = datetime.timedelta(0)
self.assertIn("exception in __del__", err)
self.assertIn("ValueError: some error", err)
+ def test_recursion_error(self):
+ # Issue 36272
+ code = """if 1:
+ import logging
+
+ def rec():
+ logging.error("foo")
+ rec()
+
+ rec()"""
+ rc, out, err = assert_python_failure("-c", code)
+ err = err.decode()
+ self.assertNotIn("Cannot recover from stack overflow.", err)
+ self.assertEqual(rc, 1)
+
class LogRecordTest(BaseTest):
def test_str_rep(self):
m = mmap.mmap(-1, len(s))
m[:] = s
self.assertEqual(m[:], s)
- indices = (0, None, 1, 3, 19, 300, -1, -2, -31, -300)
+ indices = (0, None, 1, 3, 19, 300, sys.maxsize, -1, -2, -31, -300)
for start in indices:
for stop in indices:
# Skip step 0 (invalid)
# Test extended slicing by comparing with list slicing.
s = bytes(reversed(range(256)))
m = mmap.mmap(-1, len(s))
- indices = (0, None, 1, 3, 19, 300, -1, -2, -31, -300)
+ indices = (0, None, 1, 3, 19, 300, sys.maxsize, -1, -2, -31, -300)
for start in indices:
for stop in indices:
# Skip invalid step 0
db.Close()
self.addCleanup(unlink, db_path)
+ def test_directory_start_component_keyfile(self):
+ db, db_path = init_database()
+ self.addCleanup(db.Close)
+ feature = msilib.Feature(db, 0, 'Feature', 'A feature', 'Python')
+ cab = msilib.CAB('CAB')
+ dir = msilib.Directory(db, cab, None, TESTFN, 'TARGETDIR',
+ 'SourceDir', 0)
+ dir.start_component(None, feature, None, 'keyfile')
+
class Test_make_id(unittest.TestCase):
#http://msdn.microsoft.com/en-us/library/aa369212(v=vs.85).aspx
import functools
import contextlib
import os.path
+import re
import threading
from test import support
TIMEOUT = 30
certfile = os.path.join(os.path.dirname(__file__), 'keycert3.pem')
+if ssl is not None:
+ SSLError = ssl.SSLError
+else:
+ class SSLError(Exception):
+ """Non-existent exception class when we lack SSL support."""
+ reason = "This will never be raised."
+
# TODO:
# - test the `file` arg to more commands
# - test error conditions
return False
return True
- with self.NNTP_CLASS(self.NNTP_HOST, timeout=TIMEOUT, usenetrc=False) as server:
- self.assertTrue(is_connected())
- self.assertTrue(server.help())
- self.assertFalse(is_connected())
-
- with self.NNTP_CLASS(self.NNTP_HOST, timeout=TIMEOUT, usenetrc=False) as server:
- server.quit()
- self.assertFalse(is_connected())
+ try:
+ with self.NNTP_CLASS(self.NNTP_HOST, timeout=TIMEOUT, usenetrc=False) as server:
+ self.assertTrue(is_connected())
+ self.assertTrue(server.help())
+ self.assertFalse(is_connected())
+
+ with self.NNTP_CLASS(self.NNTP_HOST, timeout=TIMEOUT, usenetrc=False) as server:
+ server.quit()
+ self.assertFalse(is_connected())
+ except SSLError as ssl_err:
+ # matches "[SSL: DH_KEY_TOO_SMALL] dh key too small"
+ if re.search(r'(?i)KEY.TOO.SMALL', ssl_err.reason):
+ raise unittest.SkipTest(f"Got {ssl_err} connecting "
+ f"to {self.NNTP_HOST!r}")
+ raise
NetworkedNNTPTestsMixin.wrap_methods()
try:
cls.server = cls.NNTP_CLASS(cls.NNTP_HOST, timeout=TIMEOUT,
usenetrc=False)
+ except SSLError as ssl_err:
+ # matches "[SSL: DH_KEY_TOO_SMALL] dh key too small"
+ if re.search(r'(?i)KEY.TOO.SMALL', ssl_err.reason):
+ raise unittest.SkipTest(f"{cls} got {ssl_err} connecting "
+ f"to {cls.NNTP_HOST!r}")
+ raise
except EOF_ERRORS:
raise unittest.SkipTest(f"{cls} got EOF error on connecting "
f"to {cls.NNTP_HOST!r}")
self.addCleanup(os.close, fd2)
self.assertEqual(os.get_inheritable(fd2), False)
+ @unittest.skipUnless(sys.platform == 'win32', 'win32-specific test')
+ def test_dup_nul(self):
+ # os.dup() was creating inheritable fds for character files.
+ fd1 = os.open('NUL', os.O_RDONLY)
+ self.addCleanup(os.close, fd1)
+ fd2 = os.dup(fd1)
+ self.addCleanup(os.close, fd2)
+ self.assertFalse(os.get_inheritable(fd2))
+
@unittest.skipUnless(hasattr(os, 'dup2'), "need os.dup2()")
def test_dup2(self):
fd = os.open(__file__, os.O_RDONLY)
self.check_suite("try: pass\nexcept: pass\nelse: pass\n"
"finally: pass\n")
+ def test_if_stmt(self):
+ self.check_suite("if True:\n pass\nelse:\n pass\n")
+ self.check_suite("if True:\n pass\nelif True:\n pass\nelse:\n pass\n")
+
def test_position(self):
# An absolutely minimal test of position information. Better
# tests would be a big project.
with self.assertRaises(UnicodeEncodeError):
parser.sequence2st(tree)
+ def test_invalid_node_id(self):
+ tree = (257, (269, (-7, '')))
+ self.check_bad_tree(tree, "negative node id")
+ tree = (257, (269, (99, '')))
+ self.check_bad_tree(tree, "invalid token id")
+ tree = (257, (269, (9999, (0, ''))))
+ self.check_bad_tree(tree, "invalid symbol id")
+
+ def test_ParserError_message(self):
+ try:
+ parser.sequence2st((257,(269,(257,(0,'')))))
+ except parser.ParserError as why:
+ self.assertIn("simple_stmt", str(why)) # Expected
+ self.assertIn("file_input", str(why)) # Got
+
+
class CompileTestCase(unittest.TestCase):
# |-- dirE # No permissions
# |-- fileA
# |-- linkA -> fileA
- # `-- linkB -> dirB
+ # |-- linkB -> dirB
+ # `-- brokenLinkLoop -> brokenLinkLoop
#
def setUp(self):
self.dirlink(os.path.join('..', 'dirB'), join('dirA', 'linkC'))
# This one goes upwards, creating a loop
self.dirlink(os.path.join('..', 'dirB'), join('dirB', 'linkD'))
+ # Broken symlink (pointing to itself).
+ os.symlink('brokenLinkLoop', join('brokenLinkLoop'))
if os.name == 'nt':
# Workaround for http://bugs.python.org/issue13772
paths = set(it)
expected = ['dirA', 'dirB', 'dirC', 'dirE', 'fileA']
if support.can_symlink():
- expected += ['linkA', 'linkB', 'brokenLink']
+ expected += ['linkA', 'linkB', 'brokenLink', 'brokenLinkLoop']
self.assertEqual(paths, { P(BASE, q) for q in expected })
@support.skip_unless_symlink
'fileA',
'linkA',
'linkB',
+ 'brokenLinkLoop',
}
self.assertEqual(given, {p / x for x in expect})
... print('...')
... return foo.upper()
+ >>> def test_function3(arg=None, *, kwonly=None):
+ ... pass
+
>>> def test_function():
... import pdb; pdb.Pdb(nosigint=True, readrc=False).set_trace()
... ret = test_function_2('baz')
+ ... test_function3(kwonly=True)
... print(ret)
>>> with PdbTestInput([ # doctest: +ELLIPSIS, +NORMALIZE_WHITESPACE
... 'jump 8', # jump over second for loop
... 'return', # return out of function
... 'retval', # display return value
+ ... 'next', # step to test_function3()
+ ... 'step', # stepping into test_function3()
+ ... 'args', # display function args
... 'continue',
... ]):
... test_function()
- > <doctest test.test_pdb.test_pdb_basic_commands[1]>(3)test_function()
+ > <doctest test.test_pdb.test_pdb_basic_commands[2]>(3)test_function()
-> ret = test_function_2('baz')
(Pdb) step
--Call--
[EOF]
(Pdb) bt
...
- <doctest test.test_pdb.test_pdb_basic_commands[2]>(18)<module>()
+ <doctest test.test_pdb.test_pdb_basic_commands[3]>(21)<module>()
-> test_function()
- <doctest test.test_pdb.test_pdb_basic_commands[1]>(3)test_function()
+ <doctest test.test_pdb.test_pdb_basic_commands[2]>(3)test_function()
-> ret = test_function_2('baz')
> <doctest test.test_pdb.test_pdb_basic_commands[0]>(1)test_function_2()
-> def test_function_2(foo, bar='default'):
(Pdb) up
- > <doctest test.test_pdb.test_pdb_basic_commands[1]>(3)test_function()
+ > <doctest test.test_pdb.test_pdb_basic_commands[2]>(3)test_function()
-> ret = test_function_2('baz')
(Pdb) down
> <doctest test.test_pdb.test_pdb_basic_commands[0]>(1)test_function_2()
-> return foo.upper()
(Pdb) retval
'BAZ'
+ (Pdb) next
+ > <doctest test.test_pdb.test_pdb_basic_commands[2]>(4)test_function()
+ -> test_function3(kwonly=True)
+ (Pdb) step
+ --Call--
+ > <doctest test.test_pdb.test_pdb_basic_commands[1]>(1)test_function3()
+ -> def test_function3(arg=None, *, kwonly=None):
+ (Pdb) args
+ arg = None
+ kwonly = True
(Pdb) continue
BAZ
"""
any('main.py(5)foo()->None' in l for l in stdout.splitlines()),
'Fail to step into the caller after a return')
- def test_issue13210(self):
- # invoking "continue" on a non-main thread triggered an exception
- # inside signal.signal
+ def test_issue13120(self):
+ # Invoking "continue" on a non-main thread triggered an exception
+ # inside signal.signal.
with open(support.TESTFN, 'wb') as f:
f.write(textwrap.dedent("""
self.assertFalse(instr.opname.startswith('BINARY_'))
self.assertFalse(instr.opname.startswith('BUILD_'))
+ def test_condition_with_binop_with_bools(self):
+ def f():
+ if True or False:
+ return 1
+ return 0
+ self.assertEqual(f(), 1)
+
class TestBuglets(unittest.TestCase):
@support.skip_unless_symlink
def test_architecture_via_symlink(self): # issue3762
+ if sys.platform == "win32" and not os.path.exists(sys.executable):
+            # The app symlink appears not to exist, but we want the
+            # real executable here anyway
+ import _winapi
+ real = _winapi.GetModuleFileName(0)
+ else:
+ real = os.path.realpath(sys.executable)
+ link = os.path.abspath(support.TESTFN)
+ os.symlink(real, link)
+
# On Windows, the EXE needs to know where pythonXY.dll and *.pyd is at
# so we add the directory to the path, PYTHONHOME and PYTHONPATH.
env = None
if sys.platform == "win32":
env = {k.upper(): os.environ[k] for k in os.environ}
env["PATH"] = "{};{}".format(
- os.path.dirname(sys.executable), env.get("PATH", ""))
- env["PYTHONHOME"] = os.path.dirname(sys.executable)
+ os.path.dirname(real), env.get("PATH", ""))
+ env["PYTHONHOME"] = os.path.dirname(real)
if sysconfig.is_python_build(True):
env["PYTHONPATH"] = os.path.dirname(os.__file__)
.format(p.returncode))
return r
- real = os.path.realpath(sys.executable)
- link = os.path.abspath(support.TESTFN)
- os.symlink(real, link)
try:
- self.assertEqual(get(real), get(link, env=env))
+ self.assertEqual(get(sys.executable), get(link, env=env))
finally:
os.remove(link)
os.path.exists(sys.executable+'.exe'):
# Cygwin horror
executable = sys.executable + '.exe'
+ elif sys.platform == "win32" and not os.path.exists(sys.executable):
+        # The app symlink appears not to exist, but we want the
+        # real executable here anyway
+ import _winapi
+ executable = _winapi.GetModuleFileName(0)
else:
executable = sys.executable
res = platform.libc_ver(executable)
buf = [bytearray(i) for i in [5, 3, 2]]
self.assertEqual(posix.preadv(fd, buf, 3, os.RWF_HIPRI), 10)
self.assertEqual([b't1tt2', b't3t', b'5t'], list(buf))
+ except NotImplementedError:
+ self.skipTest("preadv2 not available")
+ except OSError as inst:
+            # It is possible that the macro RWF_HIPRI was defined at
+            # compilation time but that the option is not supported by the
+            # kernel or the runtime libc shared library.
+ if inst.errno in {errno.EINVAL, errno.ENOTSUP}:
+ raise unittest.SkipTest("RWF_HIPRI is not supported by the current system")
+ else:
+ raise
finally:
os.close(fd)
c = self.gen.choices(range(n), cum_weights=range(1, n+1), k=10000)
self.assertEqual(a, c)
- # Amerian Roulette
+ # American Roulette
population = ['Red', 'Black', 'Green']
weights = [18, 18, 2]
cum_weights = [18, 36, 38]
def test_match_repr(self):
for string in '[abracadabra]', S('[abracadabra]'):
m = re.search(r'(.+)(.*?)\1', string)
- self.assertEqual(repr(m), "<%s.%s object; "
- "span=(1, 12), match='abracadabra'>" %
- (type(m).__module__, type(m).__qualname__))
+ pattern = r"<(%s\.)?%s object; span=\(1, 12\), match='abracadabra'>" % (
+ type(m).__module__, type(m).__qualname__
+ )
+ self.assertRegex(repr(m), pattern)
for string in (b'[abracadabra]', B(b'[abracadabra]'),
bytearray(b'[abracadabra]'),
memoryview(b'[abracadabra]')):
m = re.search(br'(.+)(.*?)\1', string)
- self.assertEqual(repr(m), "<%s.%s object; "
- "span=(1, 12), match=b'abracadabra'>" %
- (type(m).__module__, type(m).__qualname__))
+ pattern = r"<(%s\.)?%s object; span=\(1, 12\), match=b'abracadabra'>" % (
+ type(m).__module__, type(m).__qualname__
+ )
+ self.assertRegex(repr(m), pattern)
first, second = list(re.finditer("(aa)|(bb)", "aa bb"))
- self.assertEqual(repr(first), "<%s.%s object; "
- "span=(0, 2), match='aa'>" %
- (type(second).__module__, type(first).__qualname__))
- self.assertEqual(repr(second), "<%s.%s object; "
- "span=(3, 5), match='bb'>" %
- (type(second).__module__, type(second).__qualname__))
+ pattern = r"<(%s\.)?%s object; span=\(0, 2\), match='aa'>" % (
+ type(second).__module__, type(second).__qualname__
+ )
+ self.assertRegex(repr(first), pattern)
+ pattern = r"<(%s\.)?%s object; span=\(3, 5\), match='bb'>" % (
+ type(second).__module__, type(second).__qualname__
+ )
+ self.assertRegex(repr(second), pattern)
def test_zerowidth(self):
# Issues 852532, 1647489, 3262, 25054.
from test.libregrtest import utils
-Py_DEBUG = hasattr(sys, 'getobjects')
+Py_DEBUG = hasattr(sys, 'gettotalrefcount')
ROOT_DIR = os.path.join(os.path.dirname(__file__), '..', '..')
ROOT_DIR = os.path.abspath(os.path.normpath(ROOT_DIR))
self.assertTrue(ns.quiet)
self.assertEqual(ns.verbose, 0)
- def test_slow(self):
+ def test_slowest(self):
for opt in '-o', '--slowest':
with self.subTest(opt=opt):
ns = libregrtest._parse_args([opt])
self.checkError([opt], 'expected one argument')
self.checkError([opt, 'foo'], 'invalid int value')
self.checkError([opt, '2', '-T'], "don't go together")
- self.checkError([opt, '2', '-l'], "don't go together")
self.checkError([opt, '0', '-T'], "don't go together")
- self.checkError([opt, '0', '-l'], "don't go together")
def test_coverage(self):
for opt in '-T', '--coverage':
regex = list_regex('%s re-run test%s', rerun)
self.check_line(output, regex)
self.check_line(output, "Re-running failed tests in verbose mode")
- for name in rerun:
- regex = "Re-running test %r in verbose mode" % name
+ for test_name in rerun:
+ regex = f"Re-running {test_name} in verbose mode"
self.check_line(output, regex)
if no_test_ran:
result.append('SUCCESS')
result = ', '.join(result)
if rerun:
- self.check_line(output, 'Tests result: %s' % result)
+ self.check_line(output, 'Tests result: FAILURE')
result = 'FAILURE then %s' % result
self.check_line(output, 'Tests result: %s' % result)
% (self.TESTNAME_REGEX, len(tests)))
self.check_line(output, regex)
- def test_slow_interrupted(self):
+ def test_slowest_interrupted(self):
# Issue #25373: test --slowest with an interrupted test
code = TEST_INTERRUPTED
test = self.create_test("sigint", code=code)
for multiprocessing in (False, True):
- if multiprocessing:
- args = ("--slowest", "-j2", test)
- else:
- args = ("--slowest", test)
- output = self.run_tests(*args, exitcode=130)
- self.check_executed_tests(output, test,
- omitted=test, interrupted=True)
-
- regex = ('10 slowest tests:\n')
- self.check_line(output, regex)
+ with self.subTest(multiprocessing=multiprocessing):
+ if multiprocessing:
+ args = ("--slowest", "-j2", test)
+ else:
+ args = ("--slowest", test)
+ output = self.run_tests(*args, exitcode=130)
+ self.check_executed_tests(output, test,
+ omitted=test, interrupted=True)
+
+ regex = ('10 slowest tests:\n')
+ self.check_line(output, regex)
def test_coverage(self):
# test --coverage
testname)
self.assertEqual(output.splitlines(), all_methods)
+ @support.cpython_only
def test_crashed(self):
# Any code which causes a crash
code = 'import faulthandler; faulthandler._sigsegv()'
crash_test = self.create_test(name="crash", code=code)
- ok_test = self.create_test(name="ok")
- tests = [crash_test, ok_test]
+ tests = [crash_test]
output = self.run_tests("-j2", *tests, exitcode=2)
self.check_executed_tests(output, tests, failed=crash_test,
randomize=True)
fail_env_changed=True)
def test_rerun_fail(self):
+ # FAILURE then FAILURE
code = textwrap.dedent("""
import unittest
self.check_executed_tests(output, [testname],
failed=testname, rerun=testname)
+ def test_rerun_success(self):
+ # FAILURE then SUCCESS
+ code = textwrap.dedent("""
+ import builtins
+ import unittest
+
+ class Tests(unittest.TestCase):
+ failed = False
+
+ def test_fail_once(self):
+ if not hasattr(builtins, '_test_failed'):
+ builtins._test_failed = True
+ self.fail("bug")
+ """)
+ testname = self.create_test(code=code)
+
+ output = self.run_tests("-w", testname, exitcode=0)
+ self.check_executed_tests(output, [testname],
+ rerun=testname)
+
def test_no_tests_ran(self):
code = textwrap.dedent("""
import unittest
self.check_executed_tests(output, [testname, testname2],
no_test_ran=[testname])
+ @support.cpython_only
+ def test_findleaks(self):
+ code = textwrap.dedent(r"""
+ import _testcapi
+ import gc
+ import unittest
+
+ @_testcapi.with_tp_del
+ class Garbage:
+ def __tp_del__(self):
+ pass
+
+ class Tests(unittest.TestCase):
+ def test_garbage(self):
+ # create an uncollectable object
+ obj = Garbage()
+ obj.ref_cycle = obj
+ obj = None
+ """)
+ testname = self.create_test(code=code)
+
+ output = self.run_tests("--fail-env-changed", testname, exitcode=3)
+ self.check_executed_tests(output, [testname],
+ env_changed=[testname],
+ fail_env_changed=True)
+
+        # --findleaks is now basically an alias for --fail-env-changed
+ output = self.run_tests("--findleaks", testname, exitcode=3)
+ self.check_executed_tests(output, [testname],
+ env_changed=[testname],
+ fail_env_changed=True)
+
class TestUtils(unittest.TestCase):
def test_format_duration(self):
class BaseRequestRateTest(BaseRobotTest):
+ request_rate = None
+ crawl_delay = None
def test_request_rate(self):
+ parser = self.parser
for url in self.good + self.bad:
agent, url = self.get_agent_and_url(url)
with self.subTest(url=url, agent=agent):
- if self.crawl_delay:
- self.assertEqual(
- self.parser.crawl_delay(agent), self.crawl_delay
- )
- if self.request_rate:
+ self.assertEqual(parser.crawl_delay(agent), self.crawl_delay)
+
+ parsed_request_rate = parser.request_rate(agent)
+ self.assertEqual(parsed_request_rate, self.request_rate)
+ if self.request_rate is not None:
self.assertIsInstance(
- self.parser.request_rate(agent),
+ parsed_request_rate,
urllib.robotparser.RequestRate
)
self.assertEqual(
- self.parser.request_rate(agent).requests,
+ parsed_request_rate.requests,
self.request_rate.requests
)
self.assertEqual(
- self.parser.request_rate(agent).seconds,
+ parsed_request_rate.seconds,
self.request_rate.seconds
)
+class EmptyFileTest(BaseRequestRateTest, unittest.TestCase):
+ robots_txt = ''
+ good = ['/foo']
+
+
class CrawlDelayAndRequestRateTest(BaseRequestRateTest, unittest.TestCase):
robots_txt = """\
User-agent: figtree
class DifferentAgentTest(CrawlDelayAndRequestRateTest):
agent = 'FigTree Robot libwww-perl/5.04'
- # these are not actually tested, but we still need to parse it
- # in order to accommodate the input parameters
- request_rate = None
- crawl_delay = None
class InvalidRequestRateTest(BaseRobotTest, unittest.TestCase):
self.assertEqual(self.set, copy,
"%s != %s" % (self.set, copy))
+ def test_issue_37219(self):
+ with self.assertRaises(TypeError):
+ set().difference(123)
+ with self.assertRaises(TypeError):
+ set().difference_update(123)
+
#------------------------------------------------------------------------------
class TestBasicOpsEmpty(TestBasicOps, unittest.TestCase):
# test that shutil.copystat copies xattrs
src = os.path.join(tmp_dir, 'the_original')
+ srcro = os.path.join(tmp_dir, 'the_original_ro')
write_file(src, src)
+ write_file(srcro, srcro)
os.setxattr(src, 'user.the_value', b'fiddly')
+ os.setxattr(srcro, 'user.the_value', b'fiddly')
+ os.chmod(srcro, 0o444)
dst = os.path.join(tmp_dir, 'the_copy')
+ dstro = os.path.join(tmp_dir, 'the_copy_ro')
write_file(dst, dst)
+ write_file(dstro, dstro)
shutil.copystat(src, dst)
+ shutil.copystat(srcro, dstro)
self.assertEqual(os.getxattr(dst, 'user.the_value'), b'fiddly')
+ self.assertEqual(os.getxattr(dstro, 'user.the_value'), b'fiddly')
@support.skip_unless_symlink
@support.skip_unless_xattr
rv = shutil.which(self.file)
self.assertEqual(rv, self.temp_file.name)
+ def test_environ_path_empty(self):
+ # PATH='': no match
+ with support.EnvironmentVarGuard() as env:
+ env['PATH'] = ''
+ with unittest.mock.patch('os.confstr', return_value=self.dir, \
+ create=True), \
+ support.swap_attr(os, 'defpath', self.dir), \
+ support.change_cwd(self.dir):
+ rv = shutil.which(self.file)
+ self.assertIsNone(rv)
+
+ def test_environ_path_cwd(self):
+ expected_cwd = os.path.basename(self.temp_file.name)
+ if sys.platform == "win32":
+ curdir = os.curdir
+ if isinstance(expected_cwd, bytes):
+ curdir = os.fsencode(curdir)
+ expected_cwd = os.path.join(curdir, expected_cwd)
+
+ # PATH=':': explicitly looks in the current directory
+ with support.EnvironmentVarGuard() as env:
+ env['PATH'] = os.pathsep
+ with unittest.mock.patch('os.confstr', return_value=self.dir, \
+ create=True), \
+ support.swap_attr(os, 'defpath', self.dir):
+ rv = shutil.which(self.file)
+ self.assertIsNone(rv)
+
+ # look in current directory
+ with support.change_cwd(self.dir):
+ rv = shutil.which(self.file)
+ self.assertEqual(rv, expected_cwd)
+
+ def test_environ_path_missing(self):
+ with support.EnvironmentVarGuard() as env:
+ env.pop('PATH', None)
+
+ # without confstr
+ with unittest.mock.patch('os.confstr', side_effect=ValueError, \
+ create=True), \
+ support.swap_attr(os, 'defpath', self.dir):
+ rv = shutil.which(self.file)
+ self.assertEqual(rv, self.temp_file.name)
+
+ # with confstr
+ with unittest.mock.patch('os.confstr', return_value=self.dir, \
+ create=True), \
+ support.swap_attr(os, 'defpath', ''):
+ rv = shutil.which(self.file)
+ self.assertEqual(rv, self.temp_file.name)
+
def test_empty_path(self):
base_dir = os.path.dirname(self.dir)
with support.change_cwd(path=self.dir), \
rv = shutil.which(self.file)
self.assertIsNone(rv)
+ @unittest.skipUnless(sys.platform == "win32", 'test specific to Windows')
+ def test_pathext(self):
+ ext = ".xyz"
+ temp_filexyz = tempfile.NamedTemporaryFile(dir=self.temp_dir,
+ prefix="Tmp2", suffix=ext)
+ os.chmod(temp_filexyz.name, stat.S_IXUSR)
+ self.addCleanup(temp_filexyz.close)
+
+ # strip path and extension
+ program = os.path.basename(temp_filexyz.name)
+ program = os.path.splitext(program)[0]
+
+ with support.EnvironmentVarGuard() as env:
+ env['PATHEXT'] = ext
+ rv = shutil.which(program, path=self.temp_dir)
+ self.assertEqual(rv, temp_filexyz.name)
+
class TestMove(unittest.TestCase):
# http://bugs.python.org/issue19205
re_mods = {'re', '_sre', 'sre_compile', 'sre_constants', 'sre_parse'}
- # _osx_support uses the re module in many placs
- if sys.platform != 'darwin':
- self.assertFalse(modules.intersection(re_mods), stderr)
+ self.assertFalse(modules.intersection(re_mods), stderr)
+
# http://bugs.python.org/issue9548
self.assertNotIn('locale', modules, stderr)
- if sys.platform != 'darwin':
- # http://bugs.python.org/issue19209
- self.assertNotIn('copyreg', modules, stderr)
- # http://bugs.python.org/issue19218>
+
+ # http://bugs.python.org/issue19209
+ self.assertNotIn('copyreg', modules, stderr)
+
+ # http://bugs.python.org/issue19218
collection_mods = {'_collections', 'collections', 'functools',
'heapq', 'itertools', 'keyword', 'operator',
'reprlib', 'types', 'weakref'
# On Solaris, ENETUNREACH is returned in this circumstance instead
# of ECONNREFUSED. So, if that errno exists, add it to our list of
# expected errnos.
- expected_errnos = [ errno.ECONNREFUSED, ]
- if hasattr(errno, 'ENETUNREACH'):
- expected_errnos.append(errno.ENETUNREACH)
- if hasattr(errno, 'EADDRNOTAVAIL'):
- # bpo-31910: socket.create_connection() fails randomly
- # with EADDRNOTAVAIL on Travis CI
- expected_errnos.append(errno.EADDRNOTAVAIL)
-
+ expected_errnos = support.get_socket_conn_refused_errs()
self.assertIn(cm.exception.errno, expected_errnos)
def test_create_connection_timeout(self):
check_against_PyObject_RichCompareBool(self, [float('nan')]*100)
check_against_PyObject_RichCompareBool(self, [float('nan') for
_ in range(100)])
+
+ def test_not_all_tuples(self):
+ self.assertRaises(TypeError, [(1.0, 1.0), (False, "A"), 6].sort)
+ self.assertRaises(TypeError, [('a', 1), (1, 'a')].sort)
+ self.assertRaises(TypeError, [(1, 'a'), ('a', 1)].sort)
#==============================================================================
if __name__ == "__main__":
cert = {'subject': ((('commonName', 'example.com'),),),
'subjectAltName': (('DNS', 'example.com'),
('IP Address', '10.11.12.13'),
- ('IP Address', '14.15.16.17'))}
+ ('IP Address', '14.15.16.17'),
+ ('IP Address', '127.0.0.1'))}
ok(cert, '10.11.12.13')
ok(cert, '14.15.16.17')
+ # socket.inet_ntoa(socket.inet_aton('127.1')) == '127.0.0.1'
+ fail(cert, '127.1')
+ fail(cert, '14.15.16.17 ')
+ fail(cert, '14.15.16.17 extra data')
fail(cert, '14.15.16.18')
fail(cert, 'example.net')
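The `'127.1'` case matters because `inet_aton` accepts legacy shorthand dotted forms, while certificate IP matching must compare the literal string. A quick demonstration:

```python
import socket

# inet_aton accepts the legacy shorthand form and expands it:
expanded = socket.inet_ntoa(socket.inet_aton('127.1'))
print(expanded)  # '127.0.0.1'

# Certificate matching compares the presented string against the
# SAN entry verbatim, so '127.1' must NOT match '127.0.0.1' even
# though both parse to the same address.
assert expanded == '127.0.0.1'
assert '127.1' != '127.0.0.1'
```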
('IP Address', '2003:0:0:0:0:0:0:BABA\n'))}
ok(cert, '2001::cafe')
ok(cert, '2003::baba')
+ fail(cert, '2003::baba ')
+ fail(cert, '2003::baba extra data')
fail(cert, '2003::bebe')
fail(cert, 'example.net')
self.sock, server_side=True)
self.server.selected_npn_protocols.append(self.sslconn.selected_npn_protocol())
self.server.selected_alpn_protocols.append(self.sslconn.selected_alpn_protocol())
- except (ConnectionResetError, BrokenPipeError) as e:
+ except (ConnectionResetError, BrokenPipeError, ConnectionAbortedError) as e:
# We treat ConnectionResetError as though it were an
# SSLError - OpenSSL on Ubuntu abruptly closes the
# connection when asked to use an unsupported protocol.
# BrokenPipeError is raised in TLS 1.3 mode, when OpenSSL
# tries to send session tickets after handshake.
# https://github.com/openssl/openssl/issues/6342
+ #
+ # ConnectionAbortedError is raised in TLS 1.3 mode, when OpenSSL
+ # tries to send session tickets after handshake when using WinSock.
self.server.conn_errors.append(str(e))
if self.server.chatty:
handle_error("\n server: bad connection attempt from " + repr(self.addr) + ":\n")
sys.stdout.write(" server: read %r (%s), sending back %r (%s)...\n"
% (msg, ctype, msg.lower(), ctype))
self.write(msg.lower())
- except ConnectionResetError:
+ except (ConnectionResetError, ConnectionAbortedError):
# XXX: OpenSSL 1.1.1 sometimes raises ConnectionResetError
# when connection is not shut down gracefully.
if self.server.chatty and support.verbose:
)
self.close()
self.running = False
+ except ssl.SSLError as err:
+ # On Windows sometimes test_pha_required_nocert receives the
+ # PEER_DID_NOT_RETURN_A_CERTIFICATE exception
+ # before the 'tlsv13 alert certificate required' exception.
+ # If the server is stopped when PEER_DID_NOT_RETURN_A_CERTIFICATE
+                # is received, test_pha_required_nocert fails with ConnectionResetError

+ # because the underlying socket is closed
+ if 'PEER_DID_NOT_RETURN_A_CERTIFICATE' == err.reason:
+ if self.server.chatty and support.verbose:
+ sys.stdout.write(err.args[1])
+ # test_pha_required_nocert is expecting this exception
+ raise ssl.SSLError('tlsv13 alert certificate required')
except OSError:
if self.server.chatty:
handle_error("Test server failure:\n")
self.assertEqual(s.recv(1024), b'FALSE\n')
s.write(b'PHA')
self.assertEqual(s.recv(1024), b'OK\n')
- # optional doens't fail when client does not have a cert
+ # optional doesn't fail when client does not have a cert
s.write(b'HASCERT')
self.assertEqual(s.recv(1024), b'FALSE\n')
s.write(b'PHA')
self.assertIn(b'WRONG_SSL_VERSION', s.recv(1024))
+ def test_bpo37428_pha_cert_none(self):
+ # verify that post_handshake_auth does not implicitly enable cert
+ # validation.
+ hostname = SIGNED_CERTFILE_HOSTNAME
+ client_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+ client_context.post_handshake_auth = True
+ client_context.load_cert_chain(SIGNED_CERTFILE)
+ # no cert validation and CA on client side
+ client_context.check_hostname = False
+ client_context.verify_mode = ssl.CERT_NONE
+
+ server_context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
+ server_context.load_cert_chain(SIGNED_CERTFILE)
+ server_context.load_verify_locations(SIGNING_CA)
+ server_context.post_handshake_auth = True
+ server_context.verify_mode = ssl.CERT_REQUIRED
+
+ server = ThreadedEchoServer(context=server_context, chatty=False)
+ with server:
+ with client_context.wrap_socket(socket.socket(),
+ server_hostname=hostname) as s:
+ s.connect((HOST, server.port))
+ s.write(b'HASCERT')
+ self.assertEqual(s.recv(1024), b'FALSE\n')
+ s.write(b'PHA')
+ self.assertEqual(s.recv(1024), b'OK\n')
+ s.write(b'HASCERT')
+ self.assertEqual(s.recv(1024), b'TRUE\n')
+ # server cert has not been validated
+ self.assertEqual(s.getpeercert(), {})
+
def test_main(verbose=False):
if support.verbose:
except ImportError:
with_pymalloc = support.with_pymalloc()
else:
- alloc_name = _testcapi.pymem_getallocatorsname()
- with_pymalloc = (alloc_name in ('pymalloc', 'pymalloc_debug'))
+ try:
+ alloc_name = _testcapi.pymem_getallocatorsname()
+ except RuntimeError as exc:
+ # "cannot get allocators name" (ex: tracemalloc is used)
+ with_pymalloc = True
+ else:
+ with_pymalloc = (alloc_name in ('pymalloc', 'pymalloc_debug'))
# Some sanity checks
a = sys.getallocatedblocks()
@skip_unless_symlink
def test_symlink(self):
+ if sys.platform == "win32" and not os.path.exists(sys.executable):
+ # App symlink appears to not exist, but we want the
+ # real executable here anyway
+ import _winapi
+ real = _winapi.GetModuleFileName(0)
+ else:
+ real = os.path.realpath(sys.executable)
+ link = os.path.abspath(TESTFN)
+ os.symlink(real, link)
+
# On Windows, the EXE needs to know where pythonXY.dll is at so we have
# to add the directory to the path.
env = None
if sys.platform == "win32":
env = {k.upper(): os.environ[k] for k in os.environ}
env["PATH"] = "{};{}".format(
- os.path.dirname(sys.executable), env.get("PATH", ""))
+ os.path.dirname(real), env.get("PATH", ""))
# Requires PYTHONHOME as well since we locate stdlib from the
# EXE path and not the DLL path (which should be fixed)
- env["PYTHONHOME"] = os.path.dirname(sys.executable)
+ env["PYTHONHOME"] = os.path.dirname(real)
if sysconfig.is_python_build(True):
env["PYTHONPATH"] = os.path.dirname(os.__file__)
self.fail('Non-zero return code {0} (0x{0:08X})'
.format(p.returncode))
return out, err
- real = os.path.realpath(sys.executable)
- link = os.path.abspath(TESTFN)
- os.symlink(real, link)
try:
self.assertEqual(get(real), get(link, env))
finally:
import weakref
import os
import subprocess
+import signal
from test import lock_tests
from test import support
self.assertEqual(data.splitlines(),
["GC: True True True"] * 2)
+ def test_finalization_shutdown(self):
+ # bpo-36402: Py_Finalize() calls threading._shutdown() which must wait
+ # until Python thread states of all non-daemon threads get deleted.
+ #
+ # Test similar to SubinterpThreadingTests.test_threads_join_2(), but
+ # test the finalization of the main interpreter.
+ code = """if 1:
+ import os
+ import threading
+ import time
+ import random
+
+ def random_sleep():
+ seconds = random.random() * 0.010
+ time.sleep(seconds)
+
+ class Sleeper:
+ def __del__(self):
+ random_sleep()
+
+ tls = threading.local()
+
+ def f():
+ # Sleep a bit so that the thread is still running when
+ # Py_Finalize() is called.
+ random_sleep()
+ tls.x = Sleeper()
+ random_sleep()
+
+ threading.Thread(target=f).start()
+ random_sleep()
+ """
+ rc, out, err = assert_python_ok("-c", code)
+ self.assertEqual(err, b"")
+
def test_tstate_lock(self):
# Test an implementation detail of Thread objects.
started = _thread.allocate_lock()
finally:
sys.settrace(old_trace)
+ @cpython_only
+ def test_shutdown_locks(self):
+ for daemon in (False, True):
+ with self.subTest(daemon=daemon):
+ event = threading.Event()
+ thread = threading.Thread(target=event.wait, daemon=daemon)
+
+ # Thread.start() must add lock to _shutdown_locks,
+ # but only for non-daemon thread
+ thread.start()
+ tstate_lock = thread._tstate_lock
+ if not daemon:
+ self.assertIn(tstate_lock, threading._shutdown_locks)
+ else:
+ self.assertNotIn(tstate_lock, threading._shutdown_locks)
+
+ # unblock the thread and join it
+ event.set()
+ thread.join()
+
+ # Thread._stop() must remove tstate_lock from _shutdown_locks.
+ # Daemon threads must never add it to _shutdown_locks.
+ self.assertNotIn(tstate_lock, threading._shutdown_locks)
+
class ThreadJoinOnShutdown(BaseTestCase):
self.addCleanup(os.close, w)
code = r"""if 1:
import os
+ import random
import threading
import time
+ def random_sleep():
+ seconds = random.random() * 0.010
+ time.sleep(seconds)
+
def f():
# Sleep a bit so that the thread is still running when
# Py_EndInterpreter is called.
- time.sleep(0.05)
+ random_sleep()
os.write(%d, b"x")
+
threading.Thread(target=f).start()
+ random_sleep()
""" % (w,)
ret = test.support.run_in_subinterp(code)
self.assertEqual(ret, 0)
self.addCleanup(os.close, w)
code = r"""if 1:
import os
+ import random
import threading
import time
+ def random_sleep():
+ seconds = random.random() * 0.010
+ time.sleep(seconds)
+
class Sleeper:
def __del__(self):
- time.sleep(0.05)
+ random_sleep()
tls = threading.local()
def f():
# Sleep a bit so that the thread is still running when
# Py_EndInterpreter is called.
- time.sleep(0.05)
+ random_sleep()
tls.x = Sleeper()
os.write(%d, b"x")
+
threading.Thread(target=f).start()
+ random_sleep()
""" % (w,)
ret = test.support.run_in_subinterp(code)
self.assertEqual(ret, 0)
class BarrierTests(lock_tests.BarrierTests):
barriertype = staticmethod(threading.Barrier)
+
class MiscTestCase(unittest.TestCase):
def test__all__(self):
extra = {"ThreadError"}
support.check__all__(self, threading, ('threading', '_thread'),
extra=extra, blacklist=blacklist)
+
+class InterruptMainTests(unittest.TestCase):
+ def test_interrupt_main_subthread(self):
+ # Calling start_new_thread with a function that executes interrupt_main
+ # should raise KeyboardInterrupt upon completion.
+ def call_interrupt():
+ _thread.interrupt_main()
+ t = threading.Thread(target=call_interrupt)
+ with self.assertRaises(KeyboardInterrupt):
+ t.start()
+ t.join()
+ t.join()
+
+ def test_interrupt_main_mainthread(self):
+ # Make sure that if interrupt_main is called in main thread that
+ # KeyboardInterrupt is raised instantly.
+ with self.assertRaises(KeyboardInterrupt):
+ _thread.interrupt_main()
+
+ def test_interrupt_main_noerror(self):
+ handler = signal.getsignal(signal.SIGINT)
+ try:
+ # No exception should arise.
+ signal.signal(signal.SIGINT, signal.SIG_IGN)
+ _thread.interrupt_main()
+
+ signal.signal(signal.SIGINT, signal.SIG_DFL)
+ _thread.interrupt_main()
+ finally:
+ # Restore original handler
+ signal.signal(signal.SIGINT, handler)
+
+
if __name__ == "__main__":
unittest.main()
t2 = time.monotonic()
dt = t2 - t1
self.assertGreater(t2, t1)
- # Issue #20101: On some Windows machines, dt may be slightly low
- self.assertTrue(0.45 <= dt <= 1.0, dt)
+ # bpo-20101: tolerate a difference of 50 ms because of bad timer
+ # resolution on Windows
+ self.assertTrue(0.450 <= dt)
# monotonic() is a monotonic but non adjustable clock
info = time.get_clock_info('monotonic')
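The relaxed assertion above only bounds the elapsed time from below, because `monotonic()` guarantees it never goes backwards but makes no promise about timer resolution. A small sketch of both properties:

```python
import time

# monotonic() can never go backwards, even if the wall clock is
# adjusted, so only a lower bound on elapsed time is reliable.
t1 = time.monotonic()
time.sleep(0.05)
t2 = time.monotonic()
assert t2 > t1

# The clock advertises these properties through get_clock_info().
info = time.get_clock_info('monotonic')
assert info.monotonic
assert not info.adjustable
```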
self.assertIn('doc3', msgids)
def test_classdocstring_early_colon(self):
- """ Test docstring extraction for a class with colons occuring within
+ """ Test docstring extraction for a class with colons occurring within
the parentheses.
"""
msgids = self.extract_docstrings_from_str(dedent('''\
--- /dev/null
+"""Tests for the lll script in the Tools/script directory."""
+
+import os
+import tempfile
+from test import support
+from test.test_tools import skip_if_missing, import_tool
+import unittest
+
+skip_if_missing()
+
+
+class lllTests(unittest.TestCase):
+
+ def setUp(self):
+ self.lll = import_tool('lll')
+
+ @support.skip_unless_symlink
+ def test_lll_multiple_dirs(self):
+ with tempfile.TemporaryDirectory() as dir1, \
+ tempfile.TemporaryDirectory() as dir2:
+ fn1 = os.path.join(dir1, 'foo1')
+ fn2 = os.path.join(dir2, 'foo2')
+ for fn, dir in (fn1, dir1), (fn2, dir2):
+ open(fn, 'w').close()
+ os.symlink(fn, os.path.join(dir, 'symlink'))
+
+ with support.captured_stdout() as output:
+ self.lll.main([dir1, dir2])
+ self.assertEqual(output.getvalue(),
+ f'{dir1}:\n'
+ f'symlink -> {fn1}\n'
+ f'\n'
+ f'{dir2}:\n'
+ f'symlink -> {fn2}\n'
+ )
+
+
+if __name__ == '__main__':
+ unittest.main()
def traced_doubler(num):
return num * 2
+def traced_capturer(*args, **kwargs):
+ return args, kwargs
+
def traced_caller_list_comprehension():
k = 10
mylist = [traced_doubler(i) for i in range(k)]
}
self.assertEqual(self.tracer.results().calledfuncs, expected)
+ def test_arg_errors(self):
+ res = self.tracer.runfunc(traced_capturer, 1, 2, self=3, func=4)
+ self.assertEqual(res, ((1, 2), {'self': 3, 'func': 4}))
+ res = self.tracer.runfunc(func=traced_capturer, arg=1)
+ self.assertEqual(res, ((), {'arg': 1}))
+ with self.assertRaises(TypeError):
+ self.tracer.runfunc()
+
def test_loop_caller_importing(self):
self.tracer.runfunc(traced_func_importing_caller, 1)
import pickle
import re
import sys
-from unittest import TestCase, main, skipUnless, SkipTest, expectedFailure
+from unittest import TestCase, main, skipUnless, SkipTest, skip
from copy import copy, deepcopy
from typing import Any, NoReturn
self.assertEqual(gth(ann_module2), {})
self.assertEqual(gth(ann_module3), {})
- @expectedFailure
+ @skip("known bug")
def test_get_type_hints_modules_forwardref(self):
# FIXME: This currently exposes a bug in typing. Cached forward references
# don't account for the case where there are multiple types of the same
self.assertIn('SupportsBytes', a)
self.assertIn('SupportsComplex', a)
+ def test_all_exported_names(self):
+ import typing
+
+ actual_all = set(typing.__all__)
+ computed_all = {
+ k for k, v in vars(typing).items()
+ # explicitly exported, not a thing with __module__
+ if k in actual_all or (
+ # avoid private names
+ not k.startswith('_') and
+ # avoid things in the io / re typing submodules
+ k not in typing.io.__all__ and
+ k not in typing.re.__all__ and
+ k not in {'io', 're'} and
+ # there's a few types and metaclasses that aren't exported
+ not k.endswith(('Meta', '_contra', '_co')) and
+ not k.upper() == k and
+ # but export all things that have __module__ == 'typing'
+ getattr(v, '__module__', None) == typing.__name__
+ )
+ }
+ self.assertSetEqual(computed_all, actual_all)
+
+
if __name__ == '__main__':
main()
ssl = None
import sys
import tempfile
+import warnings
from nturl2path import url2pathname, pathname2url
from base64 import b64encode
finally:
self.unfakehttp()
+ @unittest.skipUnless(ssl, "ssl module required")
+ def test_url_with_control_char_rejected(self):
+ for char_no in list(range(0, 0x21)) + [0x7f]:
+ char = chr(char_no)
+ schemeless_url = f"//localhost:7777/test{char}/"
+ self.fakehttp(b"HTTP/1.1 200 OK\r\n\r\nHello.")
+ try:
+ # We explicitly test urllib.request.urlopen() instead of the top
+ # level 'def urlopen()' function defined in this... (quite ugly)
+ # test suite. They use different url opening codepaths. Plain
+ # urlopen uses FancyURLOpener which goes via a codepath that
+ # calls urllib.parse.quote() on the URL which makes all of the
+ # above attempts at injection within the url _path_ safe.
+ escaped_char_repr = repr(char).replace('\\', r'\\')
+ InvalidURL = http.client.InvalidURL
+ with self.assertRaisesRegex(
+ InvalidURL, f"contain control.*{escaped_char_repr}"):
+ urllib.request.urlopen(f"http:{schemeless_url}")
+ with self.assertRaisesRegex(
+ InvalidURL, f"contain control.*{escaped_char_repr}"):
+ urllib.request.urlopen(f"https:{schemeless_url}")
+ # This code path quotes the URL so there is no injection.
+ resp = urlopen(f"http:{schemeless_url}")
+ self.assertNotIn(char, resp.geturl())
+ finally:
+ self.unfakehttp()
+
+ @unittest.skipUnless(ssl, "ssl module required")
+ def test_url_with_newline_header_injection_rejected(self):
+ self.fakehttp(b"HTTP/1.1 200 OK\r\n\r\nHello.")
+ host = "localhost:7777?a=1 HTTP/1.1\r\nX-injected: header\r\nTEST: 123"
+ schemeless_url = "//" + host + ":8080/test/?test=a"
+ try:
+ # We explicitly test urllib.request.urlopen() instead of the top
+ # level 'def urlopen()' function defined in this... (quite ugly)
+ # test suite. They use different url opening codepaths. Plain
+ # urlopen uses FancyURLOpener which goes via a codepath that
+ # calls urllib.parse.quote() on the URL which makes all of the
+ # above attempts at injection within the url _path_ safe.
+ InvalidURL = http.client.InvalidURL
+ with self.assertRaisesRegex(
+ InvalidURL, r"contain control.*\\r.*(found at least . .)"):
+ urllib.request.urlopen(f"http:{schemeless_url}")
+ with self.assertRaisesRegex(InvalidURL, r"contain control.*\\n"):
+ urllib.request.urlopen(f"https:{schemeless_url}")
+ # This code path quotes the URL so there is no injection.
+ resp = urlopen(f"http:{schemeless_url}")
+ self.assertNotIn(' ', resp.geturl())
+ self.assertNotIn('\r', resp.geturl())
+ self.assertNotIn('\n', resp.geturl())
+ finally:
+ self.unfakehttp()
+
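The rejection exercised by these tests happens inside `http.client` before any bytes reach the network: a request target containing a control character raises `InvalidURL` during `putrequest()`. A minimal sketch (no server needed, since validation precedes the connection):

```python
import http.client

# putrequest() validates the request target before sending anything,
# so the injection attempt is rejected without any network access.
conn = http.client.HTTPConnection('localhost')
try:
    conn.putrequest('GET', '/evil\r\nX-injected: header')
except http.client.InvalidURL as exc:
    print('rejected:', exc)
```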
def test_read_0_9(self):
# "0.9" response accepted (but not "simple responses" without
# a status line)
"spam://c:|windows%/:=&?~#+!$,;'@()*[]|/path/"),
"//c:|windows%/:=&?~#+!$,;'@()*[]|/path/")
+ def test_local_file_open(self):
+ # bpo-35907, CVE-2019-9948: urllib must reject local_file:// scheme
+ class DummyURLopener(urllib.request.URLopener):
+ def open_local_file(self, url):
+ return url
+
+ with warnings.catch_warnings(record=True):
+ warnings.simplefilter("ignore", DeprecationWarning)
+
+ for url in ('local_file://example', 'local-file://example'):
+ self.assertRaises(OSError, urllib.request.urlopen, url)
+ self.assertRaises(OSError, urllib.request.URLopener().open, url)
+ self.assertRaises(OSError, urllib.request.URLopener().retrieve, url)
+ self.assertRaises(OSError, DummyURLopener().open, url)
+ self.assertRaises(OSError, DummyURLopener().retrieve, url)
+
+
# Just commented them out.
# Can't really tell why keep failing in windows and sparc.
# Everywhere else they work ok, but on those machines, sometimes
self.assertIn('\u2100', denorm_chars)
self.assertIn('\uFF03', denorm_chars)
+ # bpo-36742: Verify port separators are ignored when they
+ # existed prior to decomposition
+ urllib.parse.urlsplit('http://\u30d5\u309a:80')
+ with self.assertRaises(ValueError):
+ urllib.parse.urlsplit('http://\u30d5\u309a\ufe1380')
+
for scheme in ["http", "https", "ftp"]:
- for c in denorm_chars:
- url = "{}://netloc{}false.netloc/path".format(scheme, c)
- with self.subTest(url=url, char='{:04X}'.format(ord(c))):
- with self.assertRaises(ValueError):
- urllib.parse.urlsplit(url)
+ for netloc in ["netloc{}false.netloc", "n{}user@netloc"]:
+ for c in denorm_chars:
+ url = "{}://{}/path".format(scheme, netloc.format(c))
+ with self.subTest(url=url, char='{:04X}'.format(ord(c))):
+ with self.assertRaises(ValueError):
+ urllib.parse.urlsplit(url)
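The denormalized characters are dangerous because NFKC normalization can turn them into URL syntax after the netloc has been split. A short illustration of the two characters the test checks, and of the resulting rejection:

```python
import unicodedata
from urllib.parse import urlsplit

# NFKC normalization can introduce URL metacharacters:
assert unicodedata.normalize('NFKC', '\u2100') == 'a/c'  # ACCOUNT OF -> contains '/'
assert unicodedata.normalize('NFKC', '\uFF03') == '#'    # FULLWIDTH NUMBER SIGN

# bpo-36216: urlsplit rejects a netloc that changes under NFKC,
# since the normalized form would split differently.
try:
    urlsplit('http://netloc\u2100false.netloc/path')
except ValueError as exc:
    print('rejected:', exc)
```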
class Utility_Tests(unittest.TestCase):
"""Testcase to test the various utility functions in the urllib."""
for j in range(-3, 6):
self.assertEqual(u[i:j], l[i:j])
+ def test_slice_type(self):
+ l = [0, 1, 2, 3, 4]
+ u = UserList(l)
+ self.assertIsInstance(u[:], u.__class__)
+        self.assertEqual(u[:], u)
+
def test_add_specials(self):
u = UserList("spam")
u2 = u + "eggs"
except ImportError:
ctypes = None
-skipInVenv = unittest.skipIf(sys.prefix != sys.base_prefix,
- 'Test not appropriate in a venv')
+# Platforms that set sys._base_executable can create venvs from within
+# another venv, so no need to skip tests that require venv.create().
+requireVenvCreate = unittest.skipUnless(
+ hasattr(sys, '_base_executable')
+ or sys.prefix == sys.base_prefix,
+ 'cannot run venv.create from within a venv on this platform')
def check_output(cmd, encoding=None):
p = subprocess.Popen(cmd,
self.include = 'include'
executable = getattr(sys, '_base_executable', sys.executable)
self.exe = os.path.split(executable)[-1]
+ if (sys.platform == 'win32'
+ and os.path.lexists(executable)
+ and not os.path.exists(executable)):
+ self.cannot_link_exe = True
+ else:
+ self.cannot_link_exe = False
def tearDown(self):
rmtree(self.env_dir)
context = builder.ensure_directories(self.env_dir)
self.assertEqual(context.prompt, '(My prompt) ')
- @skipInVenv
+ @requireVenvCreate
def test_prefixes(self):
"""
Test that the prefix values are as expected.
"""
- #check our prefixes
- self.assertEqual(sys.base_prefix, sys.prefix)
- self.assertEqual(sys.base_exec_prefix, sys.exec_prefix)
-
# check a venv's prefixes
rmtree(self.env_dir)
self.run_with_capture(venv.create, self.env_dir)
for prefix, expected in (
('prefix', self.env_dir),
('prefix', self.env_dir),
- ('base_prefix', sys.prefix),
- ('base_exec_prefix', sys.exec_prefix)):
+ ('base_prefix', sys.base_prefix),
+ ('base_exec_prefix', sys.base_exec_prefix)):
cmd[2] = 'import sys; print(sys.%s)' % prefix
out, err = check_output(cmd)
self.assertEqual(out.strip(), expected.encode())
# symlinked to 'python3.3' in the env, even when symlinking in
# general isn't wanted.
if usl:
- self.assertTrue(os.path.islink(fn))
+ if self.cannot_link_exe:
+ # Symlinking is skipped when our executable is already a
+ # special app symlink
+ self.assertFalse(os.path.islink(fn))
+ else:
+ self.assertTrue(os.path.islink(fn))
# If a venv is created from a source build and that venv is used to
# run the test, the pyvenv.cfg in the venv created in the test will
# point to the venv being used to run the test, and we lose the link
# to the source build - so Python can't initialise properly.
- @skipInVenv
+ @requireVenvCreate
def test_executable(self):
"""
Test that the sys.executable value is as expected.
)
self.assertEqual(out.strip(), '0')
+ @requireVenvCreate
def test_multiprocessing(self):
"""
Test that the multiprocessing is able to spawn.
envpy = os.path.join(os.path.realpath(self.env_dir),
self.bindir, self.exe)
out, err = check_output([envpy, '-c',
- 'from multiprocessing import Pool; ' +
- 'print(Pool(1).apply_async("Python".lower).get(3))'])
+ 'from multiprocessing import Pool; '
+ 'pool = Pool(1); '
+ 'print(pool.apply_async("Python".lower).get(3)); '
+ 'pool.terminate()'])
self.assertEqual(out.strip(), "python".encode())
-@skipInVenv
+@requireVenvCreate
class EnsurePipTest(BaseTest):
"""Test venv module installation of pip."""
def assert_pip_not_installed(self):
self.assertEqual(f.alive, False)
self.assertEqual(res, [199])
+ def test_arg_errors(self):
+ def fin(*args, **kwargs):
+ res.append((args, kwargs))
+
+ a = self.A()
+
+ res = []
+ f = weakref.finalize(a, fin, 1, 2, func=3, obj=4)
+ self.assertEqual(f.peek(), (a, fin, (1, 2), {'func': 3, 'obj': 4}))
+ f()
+ self.assertEqual(res, [((1, 2), {'func': 3, 'obj': 4})])
+
+ res = []
+ f = weakref.finalize(a, func=fin, arg=1)
+ self.assertEqual(f.peek(), (a, fin, (), {'arg': 1}))
+ f()
+ self.assertEqual(res, [((), {'arg': 1})])
+
+ res = []
+ f = weakref.finalize(obj=a, func=fin, arg=1)
+ self.assertEqual(f.peek(), (a, fin, (), {'arg': 1}))
+ f()
+ self.assertEqual(res, [((), {'arg': 1})])
+
+ self.assertRaises(TypeError, weakref.finalize, a)
+ self.assertRaises(TypeError, weakref.finalize)
+
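Outside the test harness, the `weakref.finalize` behavior being exercised looks like this: `peek()` exposes the stored `(obj, func, args, kwargs)` without firing the callback, and calling the finalizer early runs it exactly once. A small self-contained sketch:

```python
import weakref

class Resource:
    pass

events = []
obj = Resource()
finalizer = weakref.finalize(obj, events.append, 'cleaned up')

# peek() returns (obj, func, args, kwargs) without triggering anything.
assert finalizer.peek() == (obj, events.append, ('cleaned up',), {})

# Calling the finalizer early runs the callback once and kills it.
finalizer()
assert events == ['cleaned up']
assert not finalizer.alive
```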
def test_order(self):
a = self.A()
res = []
b"Hello, world!",
written)
+ def testClientConnectionTerminations(self):
+ environ = {"SERVER_PROTOCOL": "HTTP/1.0"}
+ for exception in (
+ ConnectionAbortedError,
+ BrokenPipeError,
+ ConnectionResetError,
+ ):
+ with self.subTest(exception=exception):
+ class AbortingWriter:
+ def write(self, b):
+ raise exception
+
+ stderr = StringIO()
+ h = SimpleHandler(BytesIO(), AbortingWriter(), stderr, environ)
+ h.run(hello_app)
+
+ self.assertFalse(stderr.getvalue())
+
+ def testDontResetInternalStateOnException(self):
+ class CustomException(ValueError):
+ pass
+
+ # We are raising CustomException here to trigger an exception
+ # during the execution of SimpleHandler.finish_response(), so
+ # we can easily test that the internal state of the handler is
+ # preserved in case of an exception.
+ class AbortingWriter:
+ def write(self, b):
+ raise CustomException
+
+ stderr = StringIO()
+ environ = {"SERVER_PROTOCOL": "HTTP/1.0"}
+ h = SimpleHandler(BytesIO(), AbortingWriter(), stderr, environ)
+ h.run(hello_app)
+
+ self.assertIn("CustomException", stderr.getvalue())
+
+ # Test that the internal state of the handler is preserved.
+ self.assertIsNotNone(h.result)
+ self.assertIsNotNone(h.headers)
+ self.assertIsNotNone(h.status)
+ self.assertIsNotNone(h.environ)
+
if __name__ == "__main__":
unittest.main()
def test_partial_post(self):
# Check that a partial POST doesn't make the server loop: issue #14001.
conn = http.client.HTTPConnection(ADDR, PORT)
- conn.request('POST', '/RPC2 HTTP/1.0\r\nContent-Length: 100\r\n\r\nbye')
+ conn.send('POST /RPC2 HTTP/1.0\r\n'
+ 'Content-Length: 100\r\n\r\n'
+ 'bye HTTP/1.1\r\n'
+ f'Host: {ADDR}:{PORT}\r\n'
+ 'Accept-Encoding: identity\r\n'
+ 'Content-Length: 0\r\n\r\n'.encode('ascii'))
conn.close()
def test_context_manager(self):
self.assertEqual(one_info._compresslevel, 1)
self.assertEqual(nine_info._compresslevel, 9)
+ def test_writing_errors(self):
+ class BrokenFile(io.BytesIO):
+ def write(self, data):
+ nonlocal count
+ if count is not None:
+ if count == stop:
+ raise OSError
+ count += 1
+ super().write(data)
+
+ stop = 0
+ while True:
+ testfile = BrokenFile()
+ count = None
+ with zipfile.ZipFile(testfile, 'w', self.compression) as zipfp:
+ with zipfp.open('file1', 'w') as f:
+ f.write(b'data1')
+ count = 0
+ try:
+ with zipfp.open('file2', 'w') as f:
+ f.write(b'data2')
+ except OSError:
+ stop += 1
+ else:
+ break
+ finally:
+ count = None
+ with zipfile.ZipFile(io.BytesIO(testfile.getvalue())) as zipfp:
+ self.assertEqual(zipfp.namelist(), ['file1'])
+ self.assertEqual(zipfp.read('file1'), b'data1')
+
+ with zipfile.ZipFile(io.BytesIO(testfile.getvalue())) as zipfp:
+ self.assertEqual(zipfp.namelist(), ['file1', 'file2'])
+ self.assertEqual(zipfp.read('file1'), b'data1')
+ self.assertEqual(zipfp.read('file2'), b'data2')
+
+
def tearDown(self):
unlink(TESTFN)
unlink(TESTFN2)
_active = {} # maps thread id to Thread object
_limbo = {}
_dangling = WeakSet()
+# Set of Thread._tstate_lock locks of non-daemon threads used by _shutdown()
+# to wait until all Python thread states get deleted:
+# see Thread._set_tstate_lock().
+_shutdown_locks_lock = _allocate_lock()
+_shutdown_locks = set()
# Main class for threads
self._tstate_lock = _set_sentinel()
self._tstate_lock.acquire()
+ if not self.daemon:
+ with _shutdown_locks_lock:
+ _shutdown_locks.add(self._tstate_lock)
+
def _bootstrap_inner(self):
try:
self._set_ident()
assert not lock.locked()
self._is_stopped = True
self._tstate_lock = None
+ if not self.daemon:
+ with _shutdown_locks_lock:
+ _shutdown_locks.discard(lock)
def _delete(self):
"Remove current thread from the dict of currently running threads."
_main_thread = _MainThread()
def _shutdown():
+ """
+    Wait until the Python thread states of all non-daemon threads get deleted.
+ """
# Obscure: other threads may be waiting to join _main_thread. That's
# dubious, but some code does it. We can't wait for C code to release
# the main thread's tstate_lock - that won't happen until the interpreter
if _main_thread._is_stopped:
# _shutdown() was already called
return
+
+ # Main thread
tlock = _main_thread._tstate_lock
# The main thread isn't finished yet, so its thread state lock can't have
# been released.
assert tlock.locked()
tlock.release()
_main_thread._stop()
- t = _pickSomeNonDaemonThread()
- while t:
- t.join()
- t = _pickSomeNonDaemonThread()
-def _pickSomeNonDaemonThread():
- for t in enumerate():
- if not t.daemon and t.is_alive():
- return t
- return None
+    # Join all non-daemon threads
+ while True:
+ with _shutdown_locks_lock:
+ locks = list(_shutdown_locks)
+ _shutdown_locks.clear()
+
+ if not locks:
+ break
+
+ for lock in locks:
+            # mimic Thread.join()
+ lock.acquire()
+ lock.release()
+
+        # new threads can be spawned while we are waiting for the other
+        # threads to complete
+
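The acquire/release pair in the loop above "joins" a thread it only knows through its `_tstate_lock`: the lock is held while the thread runs and released when its thread state is deleted, so acquiring it blocks exactly until the thread is finished. A sketch of that pattern with an ordinary lock:

```python
import threading

# A locked Lock that the worker releases on exit behaves like
# Thread._tstate_lock: acquiring it blocks the waiter until the
# worker is done, without ever holding a Thread object.
done = threading.Lock()
done.acquire()
results = []

def worker():
    results.append('work finished')
    done.release()       # plays the role of the thread state being deleted

threading.Thread(target=worker).start()
done.acquire()           # blocks until worker() released the lock
done.release()
print(results)
```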
def main_thread():
"""Return the main thread object.
# Reset _active_limbo_lock, in case we forked while the lock was held
# by another (non-forked) thread. http://bugs.python.org/issue874900
global _active_limbo_lock, _main_thread
+ global _shutdown_locks_lock, _shutdown_locks
_active_limbo_lock = _allocate_lock()
# fork() only copied the current thread; clear references to others.
new_active = {}
current = current_thread()
_main_thread = current
+
+ # reset _shutdown() locks: threads re-register their _tstate_lock below
+ _shutdown_locks_lock = _allocate_lock()
+ _shutdown_locks = set()
+
with _active_limbo_lock:
# Dangling thread instances must still have their locks reset,
# because someone may join() them.
# Return the empty string, plus all of the valid string prefixes.
def _all_string_prefixes():
# The valid string prefixes. Only contain the lower case versions,
- # and don't contain any permuations (include 'fr', but not
+ # and don't contain any permutations (include 'fr', but not
# 'rf'). The various permutations will be generated.
_valid_string_prefixes = ['b', 'r', 'u', 'f', 'br', 'fr']
# if we add binary f-strings, add: ['fb', 'fbr']
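The comment above describes how a handful of lower-case prefixes expand into every ordering and casing. A sketch of that expansion (the helper name `prefix_variants` is hypothetical; the real tokenize helper also adds the empty prefix and iterates all valid prefixes):

```python
import itertools

def prefix_variants(prefix):
    # Expand one lower-case prefix into every character ordering and
    # every upper/lower casing, e.g. 'br' -> 'br', 'bR', ..., 'RB'.
    variants = set()
    for ordering in itertools.permutations(prefix):
        for casing in itertools.product(*[(c, c.upper()) for c in ordering]):
            variants.add(''.join(casing))
    return variants

print(sorted(prefix_variants('br')))
```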
sys.settrace(None)
threading.settrace(None)
- def runfunc(self, func, *args, **kw):
+ def runfunc(*args, **kw):
+ if len(args) >= 2:
+ self, func, *args = args
+ elif not args:
+ raise TypeError("descriptor 'runfunc' of 'Trace' object "
+ "needs an argument")
+ elif 'func' in kw:
+ func = kw.pop('func')
+ self, *args = args
+ else:
+ raise TypeError('runfunc expected at least 1 positional argument, '
+ 'got %d' % (len(args)-1))
+
result = None
if not self.donothing:
sys.settrace(self.globaltrace)
fullcircle - a number
Set angle measurement units, i. e. set number
- of 'degrees' for a full circle. Dafault value is
+ of 'degrees' for a full circle. Default value is
360 degrees.
Example (for a Turtle instance named turtle):
'Any',
'Callable',
'ClassVar',
+ 'ForwardRef',
'Generic',
'Optional',
'Tuple',
'SupportsRound',
# Concrete collection types.
+ 'ChainMap',
'Counter',
'Deque',
'Dict',
'DefaultDict',
'List',
+ 'OrderedDict',
'Set',
'FrozenSet',
'NamedTuple', # Not really a type.
"""
self._type_equality_funcs[typeobj] = function
- def addCleanup(self, function, *args, **kwargs):
+ def addCleanup(*args, **kwargs):
"""Add a function, with arguments, to be called when the test is
completed. Functions added are called on a LIFO basis and are
called after tearDown on test failure or success.
Cleanup items are called even if setUp fails (unlike tearDown)."""
+ if len(args) >= 2:
+ self, function, *args = args
+ elif not args:
+ raise TypeError("descriptor 'addCleanup' of 'TestCase' object "
+ "needs an argument")
+ elif 'function' in kwargs:
+ function = kwargs.pop('function')
+ self, *args = args
+ else:
+ raise TypeError('addCleanup expected at least 1 positional '
+ 'argument, got %d' % (len(args)-1))
+ args = tuple(args)
+
self._cleanups.append((function, args, kwargs))
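The `*args` unpacking above exists so callers may pass `function=` (or even `self=`) as keyword arguments without colliding with the method's own parameter names, which a plain `(self, function, *args)` signature would reject with "got multiple values for argument". A minimal sketch of the idiom, with hypothetical names:

```python
class Runner:
    def run(*args, **kwargs):
        # Peel off self and the callable positionally, so that keyword
        # arguments named 'self' or 'func' pass through to the callable
        # instead of binding to this method's parameters.
        self, func, *args = args
        return func(*args, **kwargs)

def capture(*args, **kwargs):
    return args, kwargs

r = Runner()
# 'self' and 'func' reach capture() untouched as keywords:
print(r.run(capture, 1, self=3, func=4))  # ((1,), {'self': 3, 'func': 4})
```

The real `addCleanup`/`runfunc` implementations add the extra branches shown in the diff to keep `function=`/`func=` working as a keyword as well.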
def setUp(self):
__version__ = '1.0'
+import io
import inspect
import pprint
import sys
import builtins
-from types import ModuleType
+from types import ModuleType, MethodType
from functools import wraps, partial
def _callable(obj):
if isinstance(obj, type):
return True
+ if isinstance(obj, (staticmethod, classmethod, MethodType)):
+ return _callable(obj.__func__)
if getattr(obj, '__call__', None) is not None:
return True
return False
extras = self._mock_methods or []
from_type = dir(type(self))
from_dict = list(self.__dict__)
+ from_child_mocks = [
+ m_name for m_name, m_value in self._mock_children.items()
+ if m_value is not _deleted]
from_type = [e for e in from_type if not e.startswith('_')]
from_dict = [e for e in from_dict if not e.startswith('_') or
_is_magic(e)]
- return sorted(set(extras + from_type + from_dict +
- list(self._mock_children)))
+ return sorted(set(extras + from_type + from_dict + from_child_mocks))
def __setattr__(self, name, value):
obj = self._mock_children.get(name, _missing)
if name in self.__dict__:
- super().__delattr__(name)
+ _safe_super(NonCallableMock, self).__delattr__(name)
elif obj is _deleted:
raise AttributeError(name)
if obj is not _missing:
file_spec = None
-def _iterate_read_data(read_data):
- # Helper for mock_open:
- # Retrieve lines from read_data via a generator so that separate calls to
- # readline, read, and readlines are properly interleaved
- sep = b'\n' if isinstance(read_data, bytes) else '\n'
- data_as_list = [l + sep for l in read_data.split(sep)]
-
- if data_as_list[-1] == sep:
- # If the last line ended in a newline, the list comprehension will have an
- # extra entry that's just a newline. Remove this.
- data_as_list = data_as_list[:-1]
- else:
- # If there wasn't an extra newline by itself, then the file being
- # emulated doesn't have a newline to end the last line remove the
- # newline that our naive format() added
- data_as_list[-1] = data_as_list[-1][:-1]
- for line in data_as_list:
- yield line
+def _to_stream(read_data):
+ if isinstance(read_data, bytes):
+ return io.BytesIO(read_data)
+ else:
+ return io.StringIO(read_data)
def mock_open(mock=None, read_data=''):
`read_data` is a string for the `read`, `readline` and `readlines` of the
file handle to return. This is an empty string by default.
"""
+ _read_data = _to_stream(read_data)
+ _state = [_read_data, None]
+
def _readlines_side_effect(*args, **kwargs):
if handle.readlines.return_value is not None:
return handle.readlines.return_value
- return list(_state[0])
+ return _state[0].readlines(*args, **kwargs)
def _read_side_effect(*args, **kwargs):
if handle.read.return_value is not None:
return handle.read.return_value
- return type(read_data)().join(_state[0])
+ return _state[0].read(*args, **kwargs)
- def _readline_side_effect():
+ def _readline_side_effect(*args, **kwargs):
yield from _iter_side_effect()
while True:
- yield type(read_data)()
+ yield _state[0].readline(*args, **kwargs)
def _iter_side_effect():
if handle.readline.return_value is not None:
for line in _state[0]:
yield line
+ def _next_side_effect():
+ if handle.readline.return_value is not None:
+ return handle.readline.return_value
+ return next(_state[0])
+
global file_spec
if file_spec is None:
import _io
handle = MagicMock(spec=file_spec)
handle.__enter__.return_value = handle
- _state = [_iterate_read_data(read_data), None]
-
handle.write.return_value = None
handle.read.return_value = None
handle.readline.return_value = None
handle.readline.side_effect = _state[1]
handle.readlines.side_effect = _readlines_side_effect
handle.__iter__.side_effect = _iter_side_effect
+ handle.__next__.side_effect = _next_side_effect
def reset_data(*args, **kwargs):
- _state[0] = _iterate_read_data(read_data)
+ _state[0] = _to_stream(read_data)
if handle.readline.side_effect == _state[1]:
# Only reset the side effect if the user hasn't overridden it.
_state[1] = _readline_side_effect()
from unittest.mock import (
call, _Call, create_autospec, MagicMock,
- Mock, ANY, _CallList, patch, PropertyMock
+ Mock, ANY, _CallList, patch, PropertyMock, _callable
)
from datetime import datetime
self.assertNotIsInstance(returned, PropertyMock)
+class TestCallablePredicate(unittest.TestCase):
+
+ def test_type(self):
+ for obj in [str, bytes, int, list, tuple, SomeClass]:
+ self.assertTrue(_callable(obj))
+
+ def test_call_magic_method(self):
+ class Callable:
+ def __call__(self):
+ pass
+ instance = Callable()
+ self.assertTrue(_callable(instance))
+
+ def test_staticmethod(self):
+ class WithStaticMethod:
+ @staticmethod
+ def staticfunc():
+ pass
+ self.assertTrue(_callable(WithStaticMethod.staticfunc))
+
+ def test_non_callable_staticmethod(self):
+ class BadStaticMethod:
+ not_callable = staticmethod(None)
+ self.assertFalse(_callable(BadStaticMethod.not_callable))
+
+ def test_classmethod(self):
+ class WithClassMethod:
+ @classmethod
+ def classfunc(cls):
+ pass
+ self.assertTrue(_callable(WithClassMethod.classfunc))
+
+ def test_non_callable_classmethod(self):
+ class BadClassMethod:
+ not_callable = classmethod(None)
+ self.assertFalse(_callable(BadClassMethod.not_callable))
+
+
if __name__ == '__main__':
unittest.main()
patcher.stop()
+ def test_dir_does_not_include_deleted_attributes(self):
+ mock = Mock()
+ mock.child.return_value = 1
+
+ self.assertIn('child', dir(mock))
+ del mock.child
+ self.assertNotIn('child', dir(mock))
+
+
def test_configure_mock(self):
mock = Mock(foo='bar')
self.assertEqual(mock.foo, 'bar')
m = mock.create_autospec(object(), name='sweet_func')
self.assertIn('sweet_func', repr(m))
+ #Issue23078
+ def test_create_autospec_classmethod_and_staticmethod(self):
+ class TestClass:
+ @classmethod
+ def class_method(cls):
+ pass
+
+ @staticmethod
+ def static_method():
+ pass
+ for method in ('class_method', 'static_method'):
+ with self.subTest(method=method):
+ mock_method = mock.create_autospec(getattr(TestClass, method))
+ mock_method()
+ mock_method.assert_called_once_with()
+ self.assertRaises(TypeError, mock_method, 'extra_arg')
+
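The ``_callable`` change is what lets ``create_autospec`` see through ``staticmethod`` and ``classmethod`` wrappers, as the test above verifies. A sketch with a made-up class:

```python
from unittest import mock

class Calculator:              # hypothetical class for illustration
    @staticmethod
    def add(a, b):
        return a + b

spec_add = mock.create_autospec(Calculator.add)
spec_add(1, 2)
spec_add.assert_called_once_with(1, 2)

try:
    spec_add(1, 2, 3)          # the autospec enforces the real signature
except TypeError:
    print('extra argument rejected')
```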
#Issue21238
def test_mock_unsafe(self):
m = Mock()
self.assertEqual(lines[1], 'Norwegian Blue')
self.assertEqual(list(f1), [])
+ def test_mock_open_using_next(self):
+ mocked_open = mock.mock_open(read_data='1st line\n2nd line\n3rd line')
+ f1 = mocked_open('a-name')
+ line1 = next(f1)
+ line2 = f1.__next__()
+ lines = [line for line in f1]
+ self.assertEqual(line1, '1st line\n')
+ self.assertEqual(line2, '2nd line\n')
+ self.assertEqual(lines[0], '3rd line')
+ self.assertEqual(list(f1), [])
+ with self.assertRaises(StopIteration):
+ next(f1)
+
def test_mock_open_write(self):
# Test exception in file writing write()
mock_namedtemp = mock.mock_open(mock.MagicMock(name='JLV'))
self.assertRaises(TypeError, mock.child, 1)
self.assertEqual(mock.mock_calls, [call.child(1, 2)])
+ def test_isinstance_under_settrace(self):
+        # bpo-36593: __class__ is not set for a class that has a __class__
+        # property defined when it's used with sys.settrace(trace) set.
+        # Delete the module to force a reimport with the tracing function
+        # set, and restore the old reference later, since other tests depend
+        # on unittest.mock.patch: in testpatch.PatchTest, not restoring would
+        # cause the objects patched by test_patch_dict_test_prefix and
+        # test_patch_test_prefix to go out of sync.
+
+ old_patch = unittest.mock.patch
+
+ # Directly using __setattr__ on unittest.mock causes current imported
+ # reference to be updated. Use a lambda so that during cleanup the
+ # re-imported new reference is updated.
+ self.addCleanup(lambda patch: setattr(unittest.mock, 'patch', patch),
+ old_patch)
+
+ with patch.dict('sys.modules'):
+ del sys.modules['unittest.mock']
+
+ def trace(frame, event, arg):
+ return trace
+
+ sys.settrace(trace)
+ self.addCleanup(sys.settrace, None)
+
+ from unittest.mock import (
+ Mock, MagicMock, NonCallableMock, NonCallableMagicMock
+ )
+
+ mocks = [
+ Mock, MagicMock, NonCallableMock, NonCallableMagicMock
+ ]
+
+ for mock in mocks:
+ obj = mock(spec=Something)
+ self.assertIsInstance(obj, Something)
+
if __name__ == '__main__':
unittest.main()
pass
foo = 'bar'
+ @staticmethod
+ def static_method():
+ return 24
+
+ @classmethod
+ def class_method(cls):
+ return 42
+
class Bar(object):
def a(self):
pass
self.assertEqual(result, 3)
+ def test_autospec_staticmethod(self):
+ with patch('%s.Foo.static_method' % __name__, autospec=True) as method:
+ Foo.static_method()
+ method.assert_called_once_with()
+
+
+ def test_autospec_classmethod(self):
+ with patch('%s.Foo.class_method' % __name__, autospec=True) as method:
+ Foo.class_method()
+ method.assert_called_once_with()
+
+
def test_autospec_with_new(self):
patcher = patch('%s.function' % __name__, new=3, autospec=True)
self.assertRaises(TypeError, patcher.start)
self.assertEqual(lines[1], 'bar\n')
self.assertEqual(lines[2], 'baz\n')
self.assertEqual(h.readline(), '')
+ with self.assertRaises(StopIteration):
+ next(h)
+ def test_next_data(self):
+        # Check that next() correctly returns the next available line
+        # and plays well with the __iter__ side effect.
+ mock = mock_open(read_data='foo\nbar\nbaz\n')
+ with patch('%s.open' % __name__, mock, create=True):
+ h = open('bar')
+ line1 = next(h)
+ line2 = next(h)
+ lines = [l for l in h]
+ self.assertEqual(line1, 'foo\n')
+ self.assertEqual(line2, 'bar\n')
+ self.assertEqual(lines[0], 'baz\n')
+ self.assertEqual(h.readline(), '')
def test_readlines_data(self):
# Test that emulating a file that ends in a newline character works
# for mocks returned by mock_open
some_data = 'foo\nbar\nbaz'
mock = mock_open(read_data=some_data)
- self.assertEqual(mock().read(10), some_data)
+ self.assertEqual(mock().read(10), some_data[:10])
+ self.assertEqual(mock().read(10), some_data[:10])
+
+ f = mock()
+ self.assertEqual(f.read(10), some_data[:10])
+ self.assertEqual(f.read(10), some_data[10:])
def test_interleaved_reads(self):
# looking for characters like \u2100 that expand to 'a/c'
# IDNA uses NFKC equivalence, so normalize for this check
import unicodedata
- netloc2 = unicodedata.normalize('NFKC', netloc)
- if netloc == netloc2:
+ n = netloc.replace('@', '') # ignore characters already included
+ n = n.replace(':', '') # but not the surrounding text
+ n = n.replace('#', '')
+ n = n.replace('?', '')
+ netloc2 = unicodedata.normalize('NFKC', n)
+ if n == netloc2:
return
- _, _, netloc = netloc.rpartition('@') # anything to the left of '@' is okay
for c in '/?#@:':
if c in netloc2:
- raise ValueError("netloc '" + netloc2 + "' contains invalid " +
+ raise ValueError("netloc '" + netloc + "' contains invalid " +
"characters under NFKC normalization")
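The tightened check above leaves ordinary netloc delimiters alone and only rejects characters whose NFKC normalization would introduce new delimiters; a sketch (``exam\u2100ple.org`` is a contrived hostname for illustration):

```python
from urllib.parse import urlsplit

# Ordinary delimiters in the netloc are untouched by the check.
parts = urlsplit('https://user@example.com:8042/path')
print(parts.hostname, parts.port)   # example.com 8042

# U+2100 (ACCOUNT OF) NFKC-normalizes to 'a/c', which would smuggle a
# '/' into the authority, so urlsplit rejects it.
try:
    urlsplit('https://exam\u2100ple.org')
    rejected = False
except ValueError:
    rejected = True
print(rejected)  # True
```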
def urlsplit(url, scheme='', allow_fragments=True):
"""quote('abc def') -> 'abc%20def'
Each part of a URL, e.g. the path info, the query, etc., has a
- different set of reserved characters that must be quoted.
+ different set of reserved characters that must be quoted. The
+ quote function offers a cautious (not minimal) way to quote a
+ string for most of these parts.
- RFC 3986 Uniform Resource Identifiers (URI): Generic Syntax lists
- the following reserved characters.
+ RFC 3986 Uniform Resource Identifier (URI): Generic Syntax lists
+ the following (un)reserved characters.
- reserved = ";" | "/" | "?" | ":" | "@" | "&" | "=" | "+" |
- "$" | "," | "~"
+ unreserved = ALPHA / DIGIT / "-" / "." / "_" / "~"
+ reserved = gen-delims / sub-delims
+ gen-delims = ":" / "/" / "?" / "#" / "[" / "]" / "@"
+ sub-delims = "!" / "$" / "&" / "'" / "(" / ")"
+ / "*" / "+" / "," / ";" / "="
- Each of these characters is reserved in some component of a URL,
+ Each of the reserved characters is reserved in some component of a URL,
but not necessarily in all of them.
- Python 3.7 updates from using RFC 2396 to RFC 3986 to quote URL strings.
- Now, "~" is included in the set of reserved characters.
+ The quote function %-escapes all characters that are neither in the
+ unreserved chars ("always safe") nor the additional chars set via the
+ safe arg.
+
+ The default for the safe arg is '/'. The character is reserved, but in
+ typical usage the quote function is being called on a path where the
+ existing slash characters are to be preserved.
- By default, the quote function is intended for quoting the path
- section of a URL. Thus, it will not encode '/'. This character
- is reserved, but in typical usage the quote function is being
- called on a path where the existing slash characters are used as
- reserved characters.
+ Python 3.7 updates from using RFC 2396 to RFC 3986 to quote URL strings.
+ Now, "~" is included in the set of unreserved characters.
string and safe may be either str or bytes objects. encoding and errors
must not be specified if string is a bytes object.
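The rewritten docstring above can be illustrated with a few calls, assuming a Python where the RFC 3986 character sets apply (3.7+):

```python
from urllib.parse import quote

print(quote('abc def'))              # abc%20def
print(quote('/srv/files/a b'))       # /srv/files/a%20b  ('/' is safe by default)
print(quote('/srv/files', safe=''))  # %2Fsrv%2Ffiles
print(quote('~user'))                # ~user  ('~' is unreserved per RFC 3986)
```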
name = 'open_' + urltype
self.type = urltype
name = name.replace('-', '_')
- if not hasattr(self, name):
+ if not hasattr(self, name) or name == 'open_local_file':
if proxy:
return self.open_unknown_proxy(proxy, fullurl, data)
else:
for entry in self.entries:
if entry.applies_to(useragent):
return entry.delay
- return self.default_entry.delay
+ if self.default_entry:
+ return self.default_entry.delay
+ return None
def request_rate(self, useragent):
if not self.mtime():
for entry in self.entries:
if entry.applies_to(useragent):
return entry.req_rate
- return self.default_entry.req_rate
+ if self.default_entry:
+ return self.default_entry.req_rate
+ return None
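With the guards above, ``crawl_delay`` and ``request_rate`` return ``None`` instead of raising ``AttributeError`` when no entry matches and there is no default ``*`` entry. A sketch (``figbot`` is a made-up user agent):

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.modified()   # mark the parser as having read data so the accessors work
rp.parse([
    'User-agent: figbot',
    'Crawl-delay: 2',
    'Request-rate: 3/5',
])

print(rp.crawl_delay('figbot'))    # 2
print(rp.crawl_delay('otherbot'))  # None: no matching entry, no default entry
```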
def __str__(self):
entries = self.entries
f.write('include-system-site-packages = %s\n' % incl)
f.write('version = %d.%d.%d\n' % sys.version_info[:3])
- def symlink_or_copy(self, src, dst, relative_symlinks_ok=False):
- """
- Try symlinking a file, and if that fails, fall back to copying.
- """
- force_copy = not self.symlinks
- if not force_copy:
- try:
- if not os.path.islink(dst): # can't link to itself!
+ if os.name != 'nt':
+ def symlink_or_copy(self, src, dst, relative_symlinks_ok=False):
+ """
+ Try symlinking a file, and if that fails, fall back to copying.
+ """
+ force_copy = not self.symlinks
+ if not force_copy:
+ try:
+ if not os.path.islink(dst): # can't link to itself!
+ if relative_symlinks_ok:
+ assert os.path.dirname(src) == os.path.dirname(dst)
+ os.symlink(os.path.basename(src), dst)
+ else:
+ os.symlink(src, dst)
+ except Exception: # may need to use a more specific exception
+ logger.warning('Unable to symlink %r to %r', src, dst)
+ force_copy = True
+ if force_copy:
+ shutil.copyfile(src, dst)
+ else:
+ def symlink_or_copy(self, src, dst, relative_symlinks_ok=False):
+ """
+ Try symlinking a file, and if that fails, fall back to copying.
+ """
+ bad_src = os.path.lexists(src) and not os.path.exists(src)
+ if self.symlinks and not bad_src and not os.path.islink(dst):
+ try:
if relative_symlinks_ok:
assert os.path.dirname(src) == os.path.dirname(dst)
os.symlink(os.path.basename(src), dst)
else:
os.symlink(src, dst)
- except Exception: # may need to use a more specific exception
- logger.warning('Unable to symlink %r to %r', src, dst)
- force_copy = True
- if force_copy:
- if os.name == 'nt':
- # On Windows, we rewrite symlinks to our base python.exe into
- # copies of venvlauncher.exe
- basename, ext = os.path.splitext(os.path.basename(src))
+ return
+ except Exception: # may need to use a more specific exception
+ logger.warning('Unable to symlink %r to %r', src, dst)
+
+ # On Windows, we rewrite symlinks to our base python.exe into
+ # copies of venvlauncher.exe
+ basename, ext = os.path.splitext(os.path.basename(src))
+ srcfn = os.path.join(os.path.dirname(__file__),
+ "scripts",
+ "nt",
+ basename + ext)
+ # Builds or venv's from builds need to remap source file
+ # locations, as we do not put them into Lib/venv/scripts
+ if sysconfig.is_python_build(True) or not os.path.isfile(srcfn):
if basename.endswith('_d'):
ext = '_d' + ext
basename = basename[:-2]
- if sysconfig.is_python_build(True):
- if basename == 'python':
- basename = 'venvlauncher'
- elif basename == 'pythonw':
- basename = 'venvwlauncher'
- scripts = os.path.dirname(src)
- else:
- scripts = os.path.join(os.path.dirname(__file__), "scripts", "nt")
- src = os.path.join(scripts, basename + ext)
+ if basename == 'python':
+ basename = 'venvlauncher'
+ elif basename == 'pythonw':
+ basename = 'venvwlauncher'
+ src = os.path.join(os.path.dirname(src), basename + ext)
+ else:
+ src = srcfn
+ if not os.path.exists(src):
+ if not bad_src:
+ logger.warning('Unable to copy %r', src)
+ return
shutil.copyfile(src, dst)
for suffix in suffixes:
src = os.path.join(dirname, suffix)
- if os.path.exists(src):
+ if os.path.lexists(src):
copier(src, os.path.join(binpath, suffix))
if sysconfig.is_python_build(True):
@echo off\r
\r
rem This file is UTF-8 encoded, so we need to update the current code page while executing it\r
-for /f "tokens=2 delims=:" %%a in ('"%SystemRoot%\System32\chcp.com"') do (\r
+for /f "tokens=2 delims=:." %%a in ('"%SystemRoot%\System32\chcp.com"') do (\r
set "_OLD_CODEPAGE=%%a"\r
)\r
if defined _OLD_CODEPAGE (\r
class _Info:
__slots__ = ("weakref", "func", "args", "kwargs", "atexit", "index")
- def __init__(self, obj, func, *args, **kwargs):
+ def __init__(*args, **kwargs):
+ if len(args) >= 3:
+ self, obj, func, *args = args
+ elif not args:
+ raise TypeError("descriptor '__init__' of 'finalize' object "
+ "needs an argument")
+ else:
+ if 'func' not in kwargs:
+ raise TypeError('finalize expected at least 2 positional '
+ 'arguments, got %d' % (len(args)-1))
+ func = kwargs.pop('func')
+ if len(args) >= 2:
+ self, obj, *args = args
+ else:
+ if 'obj' not in kwargs:
+ raise TypeError('finalize expected at least 2 positional '
+ 'arguments, got %d' % (len(args)-1))
+ obj = kwargs.pop('obj')
+ self, *args = args
+ args = tuple(args)
+
if not self._registered_with_atexit:
# We may register the exit function more than once because
# of a thread race, but that is harmless
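The descriptor-style ``__init__`` above keeps the public behaviour of ``weakref.finalize`` unchanged; a small sketch (relies on CPython's reference counting collecting the object as soon as the last reference is dropped):

```python
import weakref

class Resource:          # hypothetical object that needs cleanup
    pass

events = []
res = Resource()
fin = weakref.finalize(res, events.append, 'released')

print(fin.alive)   # True
del res            # on CPython, the finalizer fires immediately
print(events)      # ['released']
print(fin.alive)   # False
```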
self.setup_environ()
self.result = application(self.environ, self.start_response)
self.finish_response()
+ except (ConnectionAbortedError, BrokenPipeError, ConnectionResetError):
+ # We expect the client to close the connection abruptly from time
+ # to time.
+ return
except:
try:
self.handle_error()
for data in self.result:
self.write(data)
self.finish_content()
- finally:
+ except:
+ # Call close() on the iterable returned by the WSGI application
+ # in case of an exception.
+ if hasattr(self.result, 'close'):
+ self.result.close()
+ raise
+ else:
+ # We only call close() when no exception is raised, because it
+ # will set status, result, headers, and environ fields to None.
+ # See bpo-29183 for more details.
self.close()
if sig != stringEndArchive64Locator:
return endrec
- if diskno != 0 or disks != 1:
+ if diskno != 0 or disks > 1:
raise BadZipFile("zipfiles that span multiple disks are not supported")
# Assume no 'zip64 extensible data'
def read(self, n=-1):
"""Read and return up to n bytes.
- If the argument is omitted, None, or negative, data is read and returned until EOF is reached..
+ If the argument is omitted, None, or negative, data is read and returned until EOF is reached.
"""
if n is None or n < 0:
buf = self._readbuffer[self._offset:]
def close(self):
if self.closed:
return
- super().close()
- # Flush any data from the compressor, and update header info
- if self._compressor:
- buf = self._compressor.flush()
- self._compress_size += len(buf)
- self._fileobj.write(buf)
- self._zinfo.compress_size = self._compress_size
- else:
- self._zinfo.compress_size = self._file_size
- self._zinfo.CRC = self._crc
- self._zinfo.file_size = self._file_size
-
- # Write updated header info
- if self._zinfo.flag_bits & 0x08:
- # Write CRC and file sizes after the file data
- fmt = '<LLQQ' if self._zip64 else '<LLLL'
- self._fileobj.write(struct.pack(fmt, _DD_SIGNATURE, self._zinfo.CRC,
- self._zinfo.compress_size, self._zinfo.file_size))
- self._zipfile.start_dir = self._fileobj.tell()
- else:
- if not self._zip64:
- if self._file_size > ZIP64_LIMIT:
- raise RuntimeError('File size unexpectedly exceeded ZIP64 '
- 'limit')
- if self._compress_size > ZIP64_LIMIT:
- raise RuntimeError('Compressed size unexpectedly exceeded '
- 'ZIP64 limit')
- # Seek backwards and write file header (which will now include
- # correct CRC and file sizes)
-
- # Preserve current position in file
- self._zipfile.start_dir = self._fileobj.tell()
- self._fileobj.seek(self._zinfo.header_offset)
- self._fileobj.write(self._zinfo.FileHeader(self._zip64))
- self._fileobj.seek(self._zipfile.start_dir)
-
- self._zipfile._writing = False
-
- # Successfully written: Add file to our caches
- self._zipfile.filelist.append(self._zinfo)
- self._zipfile.NameToInfo[self._zinfo.filename] = self._zinfo
+ try:
+ super().close()
+ # Flush any data from the compressor, and update header info
+ if self._compressor:
+ buf = self._compressor.flush()
+ self._compress_size += len(buf)
+ self._fileobj.write(buf)
+ self._zinfo.compress_size = self._compress_size
+ else:
+ self._zinfo.compress_size = self._file_size
+ self._zinfo.CRC = self._crc
+ self._zinfo.file_size = self._file_size
+
+ # Write updated header info
+ if self._zinfo.flag_bits & 0x08:
+ # Write CRC and file sizes after the file data
+ fmt = '<LLQQ' if self._zip64 else '<LLLL'
+ self._fileobj.write(struct.pack(fmt, _DD_SIGNATURE, self._zinfo.CRC,
+ self._zinfo.compress_size, self._zinfo.file_size))
+ self._zipfile.start_dir = self._fileobj.tell()
+ else:
+ if not self._zip64:
+ if self._file_size > ZIP64_LIMIT:
+ raise RuntimeError(
+ 'File size unexpectedly exceeded ZIP64 limit')
+ if self._compress_size > ZIP64_LIMIT:
+ raise RuntimeError(
+ 'Compressed size unexpectedly exceeded ZIP64 limit')
+ # Seek backwards and write file header (which will now include
+ # correct CRC and file sizes)
+
+ # Preserve current position in file
+ self._zipfile.start_dir = self._fileobj.tell()
+ self._fileobj.seek(self._zinfo.header_offset)
+ self._fileobj.write(self._zinfo.FileHeader(self._zip64))
+ self._fileobj.seek(self._zipfile.start_dir)
+
+ # Successfully written: Add file to our caches
+ self._zipfile.filelist.append(self._zinfo)
+ self._zipfile.NameToInfo[self._zinfo.filename] = self._zinfo
+ finally:
+ self._zipfile._writing = False
+
+
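The ``try``/``finally`` restructuring above guarantees ``_writing`` is cleared even if flushing or header rewriting fails, so the ``ZipFile`` is not left locked. A round-trip sketch through the writer that ``close()`` belongs to:

```python
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    # ZipFile.open(..., mode='w') hands back the writer whose close()
    # is shown above.
    with zf.open('hello.txt', mode='w') as f:
        f.write(b'hello world')

with zipfile.ZipFile(buf) as zf:
    payload = zf.read('hello.txt')
print(payload)  # b'hello world'
```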
class ZipFile:
""" Class with methods to open, read, write, close, list zip files.
result.extend([
dict(
- name="OpenSSL 1.1.0j",
- url="https://www.openssl.org/source/openssl-1.1.0j.tar.gz",
- checksum='b4ca5b78ae6ae79da80790b30dbedbdc',
+ name="OpenSSL 1.1.1c",
+ url="https://www.openssl.org/source/openssl-1.1.1c.tar.gz",
+ checksum='15e21da6efe8aa0e0768ffd8cd37a5f6',
buildrecipe=build_universal_openssl,
configure=None,
install=None,
),
),
dict(
- name="SQLite 3.22.0",
- url="https://www.sqlite.org/2018/sqlite-autoconf-3220000.tar.gz",
- checksum='96b5648d542e8afa6ab7ffb8db8ddc3d',
+ name="SQLite 3.28.0",
+ url="https://www.sqlite.org/2019/sqlite-autoconf-3280000.tar.gz",
+ checksum='3c68eb400f8354605736cd55400e1572',
extra_cflags=('-Os '
'-DSQLITE_ENABLE_FTS5 '
'-DSQLITE_ENABLE_FTS4 '
"ppc": ["darwin-ppc-cc"],
"ppc64": ["darwin64-ppc-cc"],
}
+
+ # Somewhere between OpenSSL 1.1.0j and 1.1.1c, changes cause the
+ # "enable-ec_nistp_64_gcc_128" option to get compile errors when
+ # building on our 10.6 gcc-4.2 environment. There have been other
+ # reports of projects running into this when using older compilers.
+ # So, for now, do not try to use "enable-ec_nistp_64_gcc_128" when
+ # building for 10.6.
+ if getDeptargetTuple() == (10, 6):
+ arch_opts['x86_64'].remove('enable-ec_nistp_64_gcc_128')
+
configure_opts = [
"no-idea",
"no-mdc2",
print(" -- retrying hdiutil create")
time.sleep(5)
else:
- raise RuntimeError("command failed: %s"%(commandline,))
+ raise RuntimeError("command failed: %s"%(cmd,))
if not os.path.exists(os.path.join(WORKDIR, "mnt")):
os.mkdir(os.path.join(WORKDIR, "mnt"))
-{\rtf1\ansi\ansicpg1252\cocoartf1561\cocoasubrtf400
-{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fmodern\fcharset0 CourierNewPSMT;}
+{\rtf1\ansi\ansicpg1252\cocoartf1671\cocoasubrtf500
+{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fswiss\fcharset0 Helvetica-Bold;\f2\fswiss\fcharset0 Helvetica-Oblique;
+\f3\fmodern\fcharset0 CourierNewPSMT;}
{\colortbl;\red255\green255\blue255;}
{\*\expandedcolortbl;;}
\margl1440\margr1440\vieww13380\viewh14600\viewkind0
\
\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\b \cf0 \ul \ulc0 Which installer variant should I use?
-\b0 \ulnone \
+\f1\b \cf0 \ul \ulc0 Which installer variant should I use?
+\f0\b0 \ulnone \
\
-For Python.3.7, python.org currently provides two installer variants for download: one that installs a
-\i 64-bit-only
-\i0 Python capable of running on
-\i macOS 10.9 (Mavericks)
-\i0 or later; and one that installs a
-\i 64-bit/32-bit Intel
-\i0 Python capable of running on
-\i macOS 10.6 (Snow Leopard)
-\i0 or later. (This ReadMe was installed with the
-\i $MACOSX_DEPLOYMENT_TARGET
-\i0 variant.) If you are running on macOS 10.9 or later and if you have no need for compatibility with older systems, use the 10.9 variant. Use the 10.6 variant if you are running on macOS 10.6 through 10.8 or if you want to produce standalone applications that can run on systems from 10.6. The Pythons installed by these installers are built with private copies of some third-party libraries not included with or newer than those in macOS itself. The list of these libraries varies by installer variant and is included at the end of the License.rtf file.
-\b \ul \
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\partightenfactor0
+
+\f1\b \cf0 CHANGED in 3.7.4:
+\f0\b0 The
+\f1\b 10.6+ 64-/32-bit installer variant is being deprecated
+\f0\b0 .
+\f1\b Python 3.8.0
+\f0\b0 will
+\f1\b not
+\f0\b0 include a binary installer for 10.6+ and
+\f1\b future bugfix releases of 3.7.x
+\f0\b0 may not, either. Mac OS X 10.6 Snow Leopard was released in 2009 and has not been supported by Apple for many years including lack of security updates. It is becoming increasingly difficult to ensure new Python features and bug fixes are compatible with such old systems especially with Apple's deprecation and removal of 32-bit support in recent and upcoming macOS releases. We believe that there is now very little usage of this installer variant and so we would like to focus our resources on supporting newer systems. We do not plan to intentionally break Python support on 10.6 and we will consider bug fixes for problems found when building from source on 10.6. \
+\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
+\cf0 For Python 3.7, python.org currently provides two installer variants for download: one that installs a
+\f2\i 64-bit-only
+\f0\i0 Python capable of running on
+\f2\i macOS 10.9 (Mavericks)
+\f0\i0 or later; and one that installs a
+\f2\i 64-bit/32-bit Intel
+\f0\i0 Python capable of running on
+\f2\i macOS 10.6 (Snow Leopard)
+\f0\i0 or later. (This ReadMe was installed with the
+\f2\i $MACOSX_DEPLOYMENT_TARGET
+\f0\i0 variant.) If you are running on macOS 10.9 or later and if you have no need for compatibility with older systems, use the 10.9 variant. Use the 10.6 variant if you are running on macOS 10.6 through 10.8 or if you want to produce standalone applications that can run on systems from 10.6. The Pythons installed by these installers are built with private copies of some third-party libraries not included with or newer than those in macOS itself. The list of these libraries varies by installer variant and is included at the end of the License.rtf file.
+\f1\b \ul \
\
Certificate verification and OpenSSL\
-\b0 \ulnone \
-This variant of Python 3.7 includes its own private copy of OpenSSL 1.1.0. The deprecated Apple-supplied OpenSSL libraries are no longer used. This means that the trust certificates in system and user keychains managed by the
-\i Keychain Access
-\i0 application and the
-\i security
-\i0 command line utility are no longer used as defaults by the Python
-\f1 ssl
+\f0\b0 \ulnone \
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\partightenfactor0
+\cf0 This variant of Python 3.7 includes its own private copy of OpenSSL 1.1.1. The deprecated Apple-supplied OpenSSL libraries are no longer used. This means that the trust certificates in system and user keychains managed by the
+\f2\i Keychain Access
+\f0\i0 application and the
+\f2\i security
+\f0\i0 command line utility are no longer used as defaults by the Python
+\f3 ssl
\f0 module. A sample command script is included in
-\f1 /Applications/Python 3.7
+\f3 /Applications/Python 3.7
\f0 to install a curated bundle of default root certificates from the third-party
-\f1 certifi
+\f3 certifi
\f0 package ({\field{\*\fldinst{HYPERLINK "https://pypi.org/project/certifi/"}}{\fldrslt https://pypi.org/project/certifi/}}). If you choose to use
-\f1 certifi
+\f3 certifi
\f0 , you should consider subscribing to the{\field{\*\fldinst{HYPERLINK "https://certifi.io/en/latest/"}}{\fldrslt project's email update service}} to be notified when the certificate bundle is updated.\
-\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
+\cf0 \
The bundled
-\f1 pip
+\f3 pip
\f0 included with this installer has its own default certificate store for verifying download connections.\
-\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\partightenfactor0
+
+\f1\b \cf0 \
+CHANGED in 3.7.4:
+\f0\b0 OpenSSL has been updated from 1.1.0 to 1.1.1.\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
+\cf0 \
-\b \ul Using IDLE or other Tk applications
-\b0 \ulnone \
+\f1\b \ul Using IDLE or other Tk applications
+\f0\b0 \ulnone \
\
Both installer variants now come with their own private version of Tcl/Tk 8.6. They no longer use system-supplied or third-party supplied versions of Tcl/Tk as in previous releases.\
-\b \ul \
+\f1\b \ul \
Other changes\
+\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\partightenfactor0
+\cf0 \ulnone CHANGED in 3.7.4:
+\f0\b0  SQLite has been updated from 3.22.0 to 3.28.0.
+\f1\b \ul \ulc0 \
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\b0 \ulnone \
+\f0\b0 \cf0 \ulnone \
For other changes in this release, see the
-\i What's new
-\i0 section in the {\field{\*\fldinst{HYPERLINK "https://www.python.org/doc/"}}{\fldrslt Documentation Set}} for this release and its
-\i Release Notes
-\i0 link at {\field{\*\fldinst{HYPERLINK "https://www.python.org/downloads/"}}{\fldrslt https://www.python.org/downloads/}}.\
+\f2\i What's new
+\f0\i0 section in the {\field{\*\fldinst{HYPERLINK "https://www.python.org/doc/"}}{\fldrslt Documentation Set}} for this release and its
+\f2\i Release Notes
+\f0\i0 link at {\field{\*\fldinst{HYPERLINK "https://www.python.org/downloads/"}}{\fldrslt https://www.python.org/downloads/}}.\
-\b \ul \
+\f1\b \ul \
Python 3 and Python 2 Co-existence\
-\b0 \ulnone \
+\f0\b0 \ulnone \
Python.org Python $VERSION and 2.7.x versions can both be installed on your system and will not conflict. Command names for Python 3 contain a 3 in them,
-\f1 python3
+\f3 python3
\f0 (or
-\f1 python$VERSION
+\f3 python$VERSION
\f0 ),
-\f1 idle3
+\f3 idle3
\f0 (or i
-\f1 dle$VERSION
+\f3 dle$VERSION
\f0 ),
-\f1 pip3
+\f3 pip3
\f0 (or
-\f1 pip$VERSION
+\f3 pip$VERSION
\f0 ), etc. Python 2.7 command names contain a 2 or no digit:
-\f1 python2
+\f3 python2
\f0 (or
-\f1 python2.7
+\f3 python2.7
\f0 or
-\f1 python
+\f3 python
\f0 ),
-\f1 idle2
+\f3 idle2
\f0 (or
-\f1 idle2.7
+\f3 idle2.7
\f0 or
-\f1 idle
+\f3 idle
\f0 ), etc.\
}
\ No newline at end of file
-{\rtf1\ansi\ansicpg1252\cocoartf1671\cocoasubrtf100
+{\rtf1\ansi\ansicpg1252\cocoartf1671\cocoasubrtf500
\cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fswiss\fcharset0 Helvetica-Bold;\f2\fmodern\fcharset0 CourierNewPSMT;
}
{\colortbl;\red255\green255\blue255;}
\
\f1\b NEW in 3.7.0:
-\f0\b0 two installer variants (10.9+ 64-bit-only, 10.6+ 64-/32-bit), built-in Tcl/Tk 8.6 support (no additional third-party downloads!), OpenSSL 1.1.0, and more!\
+\f0\b0 two installer variants (10.9+ 64-bit-only, 10.6+ 64-/32-bit), built-in Tcl/Tk 8.6 support (no additional third-party downloads!), OpenSSL 1.1, and more!\
+
+\f1\b \
+CHANGED in 3.7.4:
+\f0\b0 OpenSSL 1.1.1, SQLite 3.28.0, 10.6+ 64-/32-bit installer variant deprecated\
}
\ No newline at end of file
LDLAST= @LDLAST@
SGI_ABI= @SGI_ABI@
CCSHARED= @CCSHARED@
+# LINKFORSHARED are the flags passed to the $(CC) command that links
+# the python executable -- this is only needed for a few systems
LINKFORSHARED= @LINKFORSHARED@
ARFLAGS= @ARFLAGS@
# Extra C flags added for building the interpreter object files.
Chris Angelico
Jérémy Anger
Jon Anglin
+Michele Angrisano
Ankur Ankan
Heidi Annexstad
Ramchandra Apte
Julian Berman
Brice Berna
Olivier Bernard
+Maxwell Bernstein
Eric Beser
Steven Bethard
Stephen Bevan
Ron Bickers
Natalia B. Bidart
Adrian von Bidder
+Olexa Bilaniuk
David Binger
Dominic Binks
Philippe Biondi
Charles Cazabon
Jesús Cea Avión
Per Cederqvist
-Matej Cepl
+Matěj Cepl
Carl Cerecke
Octavian Cerna
Michael Cetrulo
John Feuerstein
Carl Feynman
Vincent Fiack
+Niklas Fiekas
Anastasia Filatova
Tomer Filiba
Segev Finer
Gerhard Häring
Fredrik Håård
Florian Höch
+Robert Hölzl
Catalin Iacob
Mihai Ibanescu
Ali Ikinci
Zubin Mithra
Florian Mladitsch
Doug Moen
+Paul Monson
The Dragon De Monsyne
Bastien Montagne
Skip Montanaro
Peter Moody
Alan D. Moore
+Nicolai Moore
Paul Moore
Ross Moore
Ben Morgan
Doug Zongker
Peter Åstrand
Zheao Li
+Geoff Shannon
Python News
+++++++++++
+What's New in Python 3.7.4 final?
+=================================
+
+*Release date: 2019-07-08*
+
+Core and Builtins
+-----------------
+
+- bpo-37500: Due to unintended side effects, revert the change introduced by
+ :issue:`1875` in 3.7.4rc1 to check for syntax errors in dead conditional
+ code blocks.
+
+Documentation
+-------------
+
+- bpo-37149: Replace the dead link to the Tkinter 8.5 reference by John
+ Shipman, New Mexico Tech, with a link to the archive.org copy.
+
+
+What's New in Python 3.7.4 release candidate 2?
+===============================================
+
+*Release date: 2019-07-02*
+
+Security
+--------
+
+- bpo-37463: ssl.match_hostname() no longer accepts IPv4 addresses with
+  additional text after the address; only quad-dotted notation without
+  trailing whitespace is accepted. Some inet_aton() implementations ignore
+  whitespace and all data after it, e.g. '127.0.0.1 whatever'.
+
+Core and Builtins
+-----------------
+
+- bpo-24214: Improved support of the surrogatepass error handler in the
+ UTF-8 and UTF-16 incremental decoders.
+
+Library
+-------
+
+- bpo-37440: http.client now enables TLS 1.3 post-handshake authentication
+ for default context or if a cert_file is passed to HTTPSConnection.
+
+- bpo-37437: Update vendorized expat version to 2.2.7.
+
+- bpo-37428: SSLContext.post_handshake_auth = True no longer sets
+ SSL_VERIFY_POST_HANDSHAKE verify flag for client connections. Although the
+ option is documented as ignored for clients, OpenSSL implicitly enables
+ cert chain validation when the flag is set.
+
+- bpo-32627: Fix a compile error when conflicting ``_uuid`` headers are
+  included.
+
+Windows
+-------
+
+- bpo-37369: Fixes path for :data:`sys.executable` when running from the
+ Microsoft Store.
+
+- bpo-35360: Update Windows builds to use SQLite 3.28.0.
+
+macOS
+-----
+
+- bpo-34602: Avoid test suite failures on macOS by no longer calling
+ resource.setrlimit to increase the process stack size limit at runtime.
+ The runtime change is no longer needed since the interpreter is being
+ built with a larger default stack size.
+
+
+What's New in Python 3.7.4 release candidate 1?
+===============================================
+
+*Release date: 2019-06-18*
+
+Security
+--------
+
+- bpo-35907: CVE-2019-9948: Avoid file reading by disallowing
+ ``local-file://`` and ``local_file://`` URL schemes in
+ ``URLopener().open()`` and ``URLopener().retrieve()`` of
+ :mod:`urllib.request`.
+
+- bpo-36742: Fixes mishandling of pre-normalization characters in
+ urlsplit().
+
+- bpo-30458: Address CVE-2019-9740 by disallowing URL paths with embedded
+  whitespace or control characters through to the underlying http client
+  request. Such potentially malicious header injection URLs now cause an
+  http.client.InvalidURL exception to be raised.
+
+- bpo-33529: Prevent fold function used in email header encoding from
+ entering infinite loop when there are too many non-ASCII characters in a
+ header.
+
+- bpo-35755: :func:`shutil.which` now uses ``os.confstr("CS_PATH")`` if
+  available and if the :envvar:`PATH` environment variable is not set. The
+  current directory is also removed from :data:`posixpath.defpath`. On
+  Unix, :func:`shutil.which` and the :mod:`subprocess` module no longer
+  search for the executable in the current directory if the :envvar:`PATH`
+  environment variable is not set.
+
+Core and Builtins
+-----------------
+
+- bpo-37269: Fix a bug in the peephole optimizer that was not correctly
+  treating constant conditions with binary operators. Patch by Pablo
+  Galindo.
+
+- bpo-37219: Remove erroneous optimization for empty set differences.
+
+- bpo-26423: Fix possible overflow in ``wrap_lenfunc()`` when ``sizeof(long)
+ < sizeof(Py_ssize_t)`` (e.g., 64-bit Windows).
+
+- bpo-36829: :c:func:`PyErr_WriteUnraisable` now displays the exception even
+ if displaying the traceback failed. Moreover, hold a strong reference to
+ :data:`sys.stderr` while using it. Document that an exception must be set
+ when calling :c:func:`PyErr_WriteUnraisable`.
+
+- bpo-36907: Fix a crash when calling a C function with a keyword dict
+ (``f(**kwargs)``) and changing the dict ``kwargs`` while that function is
+ running.
+
+- bpo-36946: Fix possible signed integer overflow when handling slices.
+
+- bpo-27987: ``PyGC_Head`` structure is aligned to ``long double``. This is
+ needed to ensure GC-ed objects are aligned properly. Patch by Inada
+ Naoki.
+
+- bpo-1875: A :exc:`SyntaxError` is now raised if a code block that will be
+  optimized away (e.g. ``if`` conditions that are always false) contains
+  syntax errors. Patch by Pablo Galindo. (Reverted in 3.7.4 final by
+  :issue:`37500`.)
+
+- bpo-28866: Avoid caching attributes of classes whose type defines mro(),
+  to avoid a hard cache invalidation problem.
+
+- bpo-27639: Correct return type for UserList slicing operations. Patch by
+  Michael Blahay, Erick Cervantes, and vaultah.
+
+- bpo-32849: Fix Python initialization code on FreeBSD to properly detect
+  when the stdin file descriptor (fd 0) is invalid.
+
+- bpo-27987: pymalloc returns memory blocks aligned to 16 bytes, instead
+  of 8 bytes, on 64-bit platforms to conform to the x86-64 ABI. Recent
+  compilers assume this alignment more often. Patch by Inada Naoki.
+
+- bpo-36504: Fix signed integer overflow in _ctypes.c's
+ ``PyCArrayType_new()``.
+
+- bpo-20844: Fix a bug where running a script with an encoding cookie and
+  LF line endings could fail on Windows.
+
+- bpo-24214: Fixed support of the surrogatepass error handler in the UTF-8
+ incremental decoder.
+
+- bpo-36459: Fix a possible double ``PyMem_FREE()`` due to tokenizer.c's
+ ``tok_nextc()``.
+
+- bpo-36433: Fixed TypeError message in classmethoddescr_call.
+
+- bpo-36430: Fix a possible reference leak in :func:`itertools.count`.
+
+- bpo-36440: Include node names in ``ParserError`` messages, instead of
+ numeric IDs. Patch by A. Skrobov.
+
+- bpo-36421: Fix a possible double decref in _ctypes.c's
+ ``PyCArrayType_new()``.
+
+- bpo-36256: Fix bug in parsermodule when parsing a state in a DFA that has
+ two or more arcs with labels of the same type. Patch by Pablo Galindo.
+
+- bpo-36236: At Python initialization, the current directory is no longer
+ prepended to :data:`sys.path` if it has been removed.
+
+- bpo-36262: Fix an unlikely memory leak on conversion from string to float
+ in the function ``_Py_dg_strtod()`` used by ``float(str)``,
+ ``complex(str)``, :func:`pickle.load`, :func:`marshal.load`, etc.
+
+- bpo-36218: Fix a segfault occurring when sorting a list of heterogeneous
+  values. Patch contributed by Rémi Lapeyre and Elliot Gorokhovsky.
+
+- bpo-36035: Added a fix for broken symlinks in combination with pathlib.
+
+- bpo-18372: Add missing :c:func:`PyObject_GC_Track` calls in the
+ :mod:`pickle` module. Patch by Zackery Spytz.
+
+- bpo-34408: Prevent a null pointer dereference and resource leakage in
+ ``PyInterpreterState_New()``.
+
+Library
+-------
+
+- bpo-37280: Use a thread pool for reading from a file in sendfile
+  fallback mode.
+
+- bpo-37279: Fix asyncio sendfile support when sendfile sends extra data in
+ fallback mode.
+
+- bpo-19865: :func:`ctypes.create_unicode_buffer()` now also supports
+ non-BMP characters on platforms with 16-bit :c:type:`wchar_t` (for
+ example, Windows and AIX).
+
+- bpo-35922: Fix :meth:`RobotFileParser.crawl_delay` and
+ :meth:`RobotFileParser.request_rate` to return ``None`` rather than raise
+ :exc:`AttributeError` when no relevant rule is defined in the robots.txt
+ file. Patch by Rémi Lapeyre.
+
+- bpo-36607: Eliminate :exc:`RuntimeError` raised by
+ :func:`asyncio.all_tasks()` if internal tasks weak set is changed by
+ another thread during iteration.
+
+- bpo-36402: Fix a race condition at Python shutdown when waiting for
+  threads. Wait until the Python thread state of all non-daemon threads
+  gets deleted (join all non-daemon threads), rather than just waiting
+  until non-daemon Python threads complete.
+
+- bpo-34886: Fix an unintended ValueError from :func:`subprocess.run` when
+  checking for conflicting ``input`` and ``stdin`` or ``capture_output``
+  and ``stdout`` or ``stderr`` args when they were explicitly provided but
+  with ``None`` values within a passed in ``**kwargs`` dict rather than as
+  passed directly by name. Patch contributed by Rémi Lapeyre.
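The fixed behavior can be sketched as follows; this is a hypothetical
minimal reproduction (the command and keyword values are arbitrary), run on
a Python version that includes the fix. Explicit ``None`` values passed
through a ``**kwargs`` dict no longer trigger the conflict check:

```python
import subprocess
import sys

# stdin=None inside **kwargs means "not provided"; with the fix it no
# longer conflicts with input=..., so the call runs normally.
kwargs = {"stdin": None, "stdout": None}
result = subprocess.run(
    [sys.executable, "-c", "import sys; sys.exit(0)"],
    input=b"", **kwargs)
print(result.returncode)  # 0
```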
+
+- bpo-37173: The exception message for ``inspect.getfile()`` now correctly
+ reports the passed class rather than the builtins module.
+
+- bpo-12639: :meth:`msilib.Directory.start_component()` no longer fails if
+ *keyfile* is not ``None``.
+
+- bpo-36520: Lengthy email headers with UTF-8 characters are now properly
+ encoded when they are folded. Patch by Jeffrey Kintscher.
+
+- bpo-37054: Fix the destructor of :class:`_pyio.BytesIO` and
+  :class:`_pyio.TextIOWrapper`: initialize their ``_buffer`` attribute as
+  soon as possible (in the class body), because it's used by ``__del__()``
+  which calls ``close()``.
+
+- bpo-30835: Fixed a bug in email parsing where a message with invalid
+  bytes in the content-transfer-encoding of a multipart message could
+  cause an AttributeError. Patch by Andrew Donnellan.
+
+- bpo-37035: Don't log OSError based exceptions if a fatal error has
+  occurred in an asyncio transport. The peer can generate almost any
+  OSError, and the user cannot avoid these exceptions by fixing their own
+  code. Errors are still propagated to user code; logging them is
+  pointless and pollutes the asyncio logs.
+
+- bpo-37008: Add support for calling :func:`next` with the mock resulting
+  from :func:`unittest.mock.mock_open`.
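A short illustration on a Python version with this change (the file name is
an arbitrary placeholder): the handle returned by ``mock_open()`` now
supports ``next()``, matching real file objects read line by line.

```python
from unittest.mock import mock_open

# read_data is split into lines; next() consumes them one at a time.
m = mock_open(read_data="first\nsecond\n")
with m("notes.txt") as f:       # "notes.txt" is an arbitrary name
    first = next(f)
    second = next(f)
print(first, second)
```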
+
+- bpo-27737: Allow whitespace-only header encoding in ``email.header`` -
+  by Batuhan Taskaya.
+
+- bpo-36969: The pdb command ``args`` now displays keyword-only arguments.
+  Patch contributed by Rémi Lapeyre.
+
+- bpo-36983: Add missing names to ``typing.__all__``: ``ChainMap``,
+ ``ForwardRef``, ``OrderedDict`` - by Anthony Sottile.
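On a Python version with this fix, the added names are importable via a
star import and visible in the module's public list:

```python
import typing

# The three names this change added to typing.__all__:
for name in ("ChainMap", "ForwardRef", "OrderedDict"):
    print(name, name in typing.__all__)
```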
+
+- bpo-21315: Email headers containing RFC 2047 encoded words are now
+  parsed despite missing whitespace, and a defect is registered. Missing
+  trailing whitespace after encoded words is now also registered as a
+  defect.
+
+- bpo-33524: Fix the folding of email headers when max_line_length is 0 or
+  None and the header contains non-ASCII characters. Contributed by Licht
+  Takeuchi (@Licht-T).
+
+- bpo-24564: :func:`shutil.copystat` now ignores :const:`errno.EINVAL` on
+ :func:`os.setxattr` which may occur when copying files on filesystems
+ without extended attributes support.
+
+ Original patch by Giampaolo Rodola, updated by Ying Wang.
+
+- bpo-36845: Added validation of integer prefixes to the construction of IP
+ networks and interfaces in the ipaddress module.
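A sketch of the validated behavior on a current Python (addresses here are
the documentation-reserved 192.0.2.0/24 block): a valid prefix constructs
normally, while an out-of-range prefix is rejected at construction time.

```python
import ipaddress

# A valid integer prefix constructs normally:
net = ipaddress.ip_network("192.0.2.0/24")
print(net.num_addresses)  # 256

# An out-of-range prefix raises ValueError on construction:
try:
    ipaddress.ip_network("192.0.2.0/33")
except ValueError as exc:
    print(type(exc).__name__)  # ValueError
```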
+
+- bpo-35545: Fix asyncio discarding IPv6 scopes when ensuring hostname
+  resolutions internally.
+
+- bpo-35070: posix.getgrouplist() now works correctly when the user belongs
+ to NGROUPS_MAX supplemental groups. Patch by Jeffrey Kintscher.
+
+- bpo-24538: In ``shutil.copystat()``, first copy extended file attributes
+  and then file permissions, since extended attributes can only be set on
+  the destination while it is still writeable.
+
+- bpo-33110: Handle exceptions raised by functions added by
+ concurrent.futures add_done_callback correctly when the Future has already
+ completed.
+
+- bpo-26903: Limit ``max_workers`` in ``ProcessPoolExecutor`` to 61 to
+  work around a WaitForMultipleObjects limitation.
+
+- bpo-36813: Fix :class:`~logging.handlers.QueueListener` to call
+ ``queue.task_done()`` upon stopping. Patch by Bar Harel.
+
+- bpo-36734: Fix compilation of ``faulthandler.c`` on HP-UX. Initialize
+ ``stack_t current_stack`` to zero using ``memset()``.
+
+- bpo-29183: Fix double exceptions in :class:`wsgiref.handlers.BaseHandler`
+ by calling its :meth:`~wsgiref.handlers.BaseHandler.close` method only
+ when no exception is raised.
+
+- bpo-36650: The C version of functools.lru_cache() was treating calls with
+ an empty ``**kwargs`` dictionary as being distinct from calls with no
+ keywords at all. This did not result in an incorrect answer, but it did
+ trigger an unexpected cache miss.
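A minimal sketch of the fixed caching behavior, assuming a Python version
that includes this change (the function itself is arbitrary):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def double(x):
    return x * 2

double(1)          # first call: a cache miss
double(1, **{})    # an empty **kwargs dict now hits the same cache entry
info = double.cache_info()
print(info.hits, info.misses)
```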
+
+- bpo-28552: Fix :mod:`distutils.sysconfig` if :data:`sys.executable` is
+ ``None`` or an empty string: use :func:`os.getcwd` to initialize
+ ``project_base``. Fix also the distutils build command: don't use
+ :data:`sys.executable` if it is ``None`` or an empty string.
+
+- bpo-35755: :func:`shutil.which` and
+ :func:`distutils.spawn.find_executable` now use ``os.confstr("CS_PATH")``
+ if available instead of :data:`os.defpath`, if the ``PATH`` environment
+ variable is not set. Moreover, don't use ``os.confstr("CS_PATH")`` nor
+ :data:`os.defpath` if the ``PATH`` environment variable is set to an empty
+ string.
+
+- bpo-36613: Fix :mod:`asyncio` wait() not removing its callback when an
+  exception is raised.
+
+- bpo-36598: Fix ``isinstance`` check for Mock objects with spec when the
+ code is executed under tracing. Patch by Karthikeyan Singaravelan.
+
+- bpo-36533: Reinitialize logging.Handler locks in forked child processes
+  instead of attempting to acquire them all in the parent before forking
+  only to be released in the child process. The acquire/release pattern
+  was leading to deadlocks in code that has implemented any form of
+  chained logging handlers that depend upon one another, as the lock
+  acquisition order cannot be guaranteed.
+
+- bpo-36522: If *debuglevel* is set to >0 in :mod:`http.client`, print all
+ values for headers with multiple values for the same header name. Patch by
+ Matt Houglum.
+
+- bpo-36492: Arbitrary keyword arguments (even with names "self" and
+  "func") can now be passed to some functions which should accept
+  arbitrary keyword arguments and pass them on to another function (for
+  example partialmethod(), TestCase.addCleanup() and Profile.runcall()) if
+  the required arguments are passed as positional arguments.
+
+- bpo-36434: Errors during writing to a ZIP file no longer prevent it from
+  being closed properly.
+
+- bpo-34745: Fix :mod:`asyncio` ssl memory issues caused by circular
+  references.
+
+- bpo-36321: collections.namedtuple() misspelled the name of an attribute.
+ To be consistent with typing.NamedTuple, the attribute name should have
+ been "_field_defaults" instead of "_fields_defaults". For backwards
+ compatibility, both spellings are now created. The misspelled version may
+ be removed in the future.
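The corrected attribute can be demonstrated as follows (the misspelled
alias is omitted here, since the entry notes it may be removed in the
future; the example class is arbitrary):

```python
from collections import namedtuple

# defaults= fills trailing fields; the field-to-default mapping is
# exposed under the correctly spelled attribute name.
Point = namedtuple("Point", ["x", "y"], defaults=[0])
print(Point._field_defaults)    # {'y': 0}
print(Point(1))                 # Point(x=1, y=0)
```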
+
+- bpo-36272: :mod:`logging` does not silently ignore RecursionError anymore.
+ Patch contributed by Rémi Lapeyre.
+
+- bpo-36235: Fix ``CFLAGS`` in ``customize_compiler()`` of
+ ``distutils.sysconfig``: when the ``CFLAGS`` environment variable is
+ defined, don't override ``CFLAGS`` variable with the ``OPT`` variable
+ anymore. Initial patch written by David Malcolm.
+
+- bpo-35125: Asyncio: Remove inner callback on outer cancellation in
+  shield.
+
+- bpo-35802: Clean up code which checked presence of ``os.stat`` /
+ ``os.lstat`` / ``os.chmod`` which are always present. Patch by Anthony
+ Sottile.
+
+- bpo-23078: Add support for :func:`classmethod` and :func:`staticmethod` to
+ :func:`unittest.mock.create_autospec`. Initial patch by Felipe Ochoa.
+
+- bpo-35721: Fix :meth:`asyncio.SelectorEventLoop.subprocess_exec()`
+  leaking file descriptors if ``Popen`` fails and it was called with
+  ``stdin=subprocess.PIPE``. Patch by Niklas Fiekas.
+
+- bpo-35726: QueueHandler.prepare() now makes a copy of the record before
+ modifying and enqueueing it, to avoid affecting other handlers in the
+ chain.
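The copy semantics can be sketched like this on a Python version with the
change (logger name and message are arbitrary): the prepared record is a
distinct object, and the original keeps its args for other handlers.

```python
import logging
import logging.handlers
import queue

handler = logging.handlers.QueueHandler(queue.Queue())
record = logging.LogRecord(
    "demo", logging.INFO, __file__, 1, "value=%s", ("x",), None)

prepared = handler.prepare(record)
# prepare() returns a copy; the original record is left untouched.
print(prepared is record, record.args)
```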
+
+- bpo-31855: The result of :func:`unittest.mock.mock_open` now respects
+  the argument of ``read([size])``. Patch contributed by Rémi Lapeyre.
+
+- bpo-35082: Don't return deleted attributes when calling dir on a
+ :class:`unittest.mock.Mock`.
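A brief illustration, assuming a Python version with this fix (the
attribute name is arbitrary):

```python
from unittest.mock import Mock

m = Mock()
m.enabled = True
del m.enabled
# After deletion, the attribute no longer shows up in dir().
print("enabled" in dir(m))      # False
```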
+
+- bpo-34547: :class:`wsgiref.handlers.BaseHandler` now handles abrupt client
+ connection terminations gracefully. Patch by Petter Strandmark.
+
+- bpo-34424: Fix serialization of messages containing encoded strings when
+ the policy.linesep is set to a multi-character string. Patch by Jens
+ Troeger.
+
+- bpo-33361: Fix a bug in :class:`codecs.StreamRecoder` where seeking might
+ leave old data in a buffer and break subsequent read calls. Patch by Ammar
+ Askar.
+
+- bpo-31922: :meth:`asyncio.AbstractEventLoop.create_datagram_endpoint`:
+  Do not connect a UDP socket when broadcast is allowed. This allows
+  receiving replies after a UDP broadcast.
+
+- bpo-22102: Added support for ZIP files with disks set to 0. Such files
+  are commonly created by built-in tools on Windows when the ZIP64
+  extension is used. Patch by Francisco Facioni.
+
+- bpo-27141: Added a ``__copy__()`` to ``collections.UserList`` and
+ ``collections.UserDict`` in order to correctly implement shallow copying
+ of the objects. Patch by Bar Harel.
+
+- bpo-31829: ``\r``, ``\0`` and ``\x1a`` (end-of-file on Windows) are now
+  escaped in protocol 0 pickles of Unicode strings. This allows loading
+  them without loss from files open in text mode in Python 2.
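The round-trip this enables can be checked directly on a Python version
with the fix (the payload string is arbitrary):

```python
import pickle

# Strings containing \r, \0 or \x1a now survive protocol 0, which is a
# text-based pickle format.
payload = "a\rb\x00c\x1ad"
data = pickle.dumps(payload, protocol=0)
print(pickle.loads(data) == payload)    # True
```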
+
+- bpo-31292: Fix ``setup.py check --restructuredtext`` for files containing
+ ``include`` directives.
+
+- bpo-23395: ``_thread.interrupt_main()`` now avoids setting the Python
+ error status if the ``SIGINT`` signal is ignored or not handled by Python.
+
+Documentation
+-------------
+
+- bpo-34903: Documented that in :meth:`datetime.datetime.strptime()`, the
+ leading zero in some two-digit formats is optional. Patch by Mike Gleen.
+
+- bpo-36984: Improve version added references in ``typing`` module - by
+ Anthony Sottile.
+
+- bpo-36868: What's new now mentions SSLContext.hostname_checks_common_name
+ instead of SSLContext.host_flags.
+
+- bpo-36783: Added C API documentation for PyTime_FromTimeAndFold and
+  PyDateTime_FromDateAndTimeAndFold as per PEP 495. Patch by Edison
+  Abahurire.
+
+- bpo-30840: Document relative imports.
+
+- bpo-36523: Add docstring for io.IOBase.writelines().
+
+- bpo-36425: New documentation translation: `Simplified Chinese
+ <https://docs.python.org/zh-cn/>`_.
+
+- bpo-36157: Added documentation for PyInterpreterState_Main().
+
+- bpo-36138: Improve documentation about converting datetime.timedelta to
+ scalars.
+
+- bpo-22865: Add detail to the documentation on the ``pty.spawn``
+  function.
+
+- bpo-35581: @typing.type_check_only now allows type stubs to mark
+  functions and classes as not available at runtime.
+
+- bpo-35564: Explicitly set the master_doc variable in conf.py for
+  compliance with Sphinx 2.0.
+
+- bpo-10536: Enhance the gettext docs. Patch by Éric Araujo.
+
+- bpo-32995: Added 'context variable' to the glossary.
+
+- bpo-33832: Add glossary entry for 'magic method'.
+
+- bpo-33482: Make ``codecs.StreamRecoder.writelines`` take a list of
+  bytes.
+
+- bpo-25735: Added documentation for math.factorial to indicate that it
+  returns integer values.
+
+Tests
+-----
+
+- bpo-35998: Avoid TimeoutError in test_asyncio: test_start_tls_server_1()
+
+- bpo-37153: ``test_venv.test_multiprocessing()`` now explicitly calls
+  ``pool.terminate()`` to wait until the pool completes.
+
+- bpo-37081: Test with OpenSSL 1.1.1c.
+
+- bpo-36915: The main regrtest process now always removes all temporary
+ directories of worker processes even if they crash or if they are killed
+ on KeyboardInterrupt (CTRL+c).
+
+- bpo-36719: "python3 -m test -jN ..." now continues executing the
+  remaining tests when a worker process crashes (CHILD_ERROR state).
+  Previously, the test suite stopped immediately. Use --failfast to stop
+  at the first error.
+
+- bpo-36816: Update Lib/test/selfsigned_pythontestdotnet.pem to match
+ self-signed.pythontest.net's new TLS certificate.
+
+- bpo-35925: Skip httplib and nntplib networking tests when they would
+ otherwise fail due to a modern OS or distro with a default OpenSSL policy
+ of rejecting connections to servers with weak certificates.
+
+- bpo-36719: regrtest now always detects uncollectable objects. Previously,
+ the check was only enabled by ``--findleaks``. The check now also works
+ with ``-jN/--multiprocess N``. ``--findleaks`` becomes a deprecated alias
+ to ``--fail-env-changed``.
+
+- bpo-36725: When using multiprocessing mode (-jN), regrtest now better
+  reports errors if a worker process fails, and it exits immediately on a
+  worker thread failure or when interrupted.
+
+- bpo-36454: Change test_time.test_monotonic() to test only the lower bound
+ of elapsed time after a sleep command rather than the upper bound. This
+ prevents unnecessary test failures on slow buildbots. Patch by Victor
+ Stinner.
+
+- bpo-36629: Fix ``test_imap4_host_default_value()`` of ``test_imaplib``:
+  also catch the :data:`errno.ENETUNREACH` error.
+
+- bpo-36611: Fix ``test_sys.test_getallocatedblocks()`` when
+ :mod:`tracemalloc` is enabled.
+
+- bpo-36560: Fix reference leak hunting in regrtest: also compute deltas
+  (of reference count, allocated memory blocks, file descriptor count)
+  during warmup, to ensure that everything is initialized before starting
+  to hunt reference leaks.
+
+- bpo-36565: Fix reference hunting (``python3 -m test -R 3:3``) when Python
+ has no built-in abc module.
+
+- bpo-36436: Fix ``_testcapi.pymem_buffer_overflow()``: handle memory
+ allocation failure.
+
+Build
+-----
+
+- bpo-36605: ``make tags`` and ``make TAGS`` now also parse
+ ``Modules/_io/*.c`` and ``Modules/_io/*.h``.
+
+- bpo-36508: ``python-config --ldflags`` no longer includes flags of the
+ ``LINKFORSHARED`` variable. The ``LINKFORSHARED`` variable must only be
+ used to build executables.
+
+Windows
+-------
+
+- bpo-34631: Updated OpenSSL to 1.1.1c in the Windows installer.
+
+- bpo-37267: On Windows, :func:`os.dup` no longer creates an inheritable fd
+ when handling a character file.
+
+- bpo-36779: Ensure ``time.tzname`` is correct on Windows when the active
+ code page is set to CP_UTF7 or CP_UTF8.
+
+- bpo-36965: Include STATUS_CONTROL_C_EXIT without depending on the MSC
+  compiler.
+
+- bpo-36649: Remove trailing spaces for registry keys when installed via the
+ Store.
+
+- bpo-34144: Fixed activate.bat to correctly update codepage when chcp.com
+ returns dots in output. Patch by Lorenz Mende.
+
+- bpo-35941: The enum_certificates function of the ssl module now returns
+  certificates from all available certificate stores inside Windows in a
+  query instead of returning only certificates from the system-wide
+  certificate store. This includes certificates from these certificate
+  stores: local machine, local machine enterprise, local machine group
+  policy, current user, current user group policy, services, users. The
+  ssl.enum_crls() function is changed in the same way to return all
+  certificate revocation lists inside the Windows certificate revocation
+  list stores.
+
+- bpo-36441: Fixes creating a venv when debug binaries are installed.
+
+- bpo-36312: Fixed decoders for the following code pages: 50220, 50221,
+ 50222, 50225, 50227, 50229, 57002 through 57011, 65000 and 42.
+
+- bpo-36010: Add the venv standard library module to the nuget distribution
+ for Windows.
+
+- bpo-34060: Report system load when running test suite on Windows. Patch by
+ Ammar Askar. Based on prior work by Jeremy Kloth.
+
+macOS
+-----
+
+- bpo-35360: Update macOS installer to use SQLite 3.28.0.
+
+- bpo-34631: Updated OpenSSL to 1.1.1c in macOS installer.
+
+- bpo-36231: Support building Python on macOS without /usr/include
+ installed. As of macOS 10.14, system header files are only available
+ within an SDK provided by either the Command Line Tools or the Xcode app.
+
+- bpo-34602: Avoid failures setting macOS stack resource limit with
+ resource.setrlimit. This reverts an earlier fix for bpo-18075 which forced
+ a non-default stack size when building the interpreter executable on
+ macOS.
+
+IDLE
+----
+
+- bpo-37321: Both subprocess connection error messages now refer to the
+ 'Startup failure' section of the IDLE doc.
+
+- bpo-37177: Properly 'attach' search dialogs to their main window so that
+ they behave like other dialogs and do not get hidden behind their main
+ window.
+
+- bpo-37039: Adjust "Zoom Height" to individual screens by momentarily
+ maximizing the window on first use with a particular screen. Changing
+ screen settings may invalidate the saved height. While a window is
+ maximized, "Zoom Height" has no effect.
+
+- bpo-35763: Make calltip reminder about '/' meaning positional-only less
+ obtrusive by only adding it when there is room on the first line.
+
+- bpo-5680: Add 'Run... Customized' to the Run menu to run a module with
+ customized settings. Any 'command line arguments' entered are added to
+ sys.argv. One can suppress the normal Shell main module restart.
+
+- bpo-35610: Replace now redundant .context_use_ps1 with .prompt_last_line.
+ This finishes change started in bpo-31858.
+
+- bpo-37038: Make idlelib.run runnable; add test clause.
+
+- bpo-36958: Print any argument other than None or int passed to SystemExit
+ or sys.exit().
+
+- bpo-13102: When saving a file, call os.fsync() so bits are flushed to e.g.
+ USB drive.
+
+- bpo-36429: Fix starting IDLE with pyshell. Add idlelib.pyshell alias at
+ top; remove pyshell alias at bottom. Remove obsolete __name__=='__main__'
+ command.
+
+- bpo-36405: Use dict unpacking in idlelib.
+
+- bpo-36396: Remove fgBg param of idlelib.config.GetHighlight(). This param
+ was only used twice and changed the return type.
+
+- bpo-23205: For the grep module, add tests for findfiles, refactor
+ findfiles to be a module-level function, and refactor findfiles to use
+ os.walk.
+
+- bpo-23216: Add docstrings to IDLE search modules.
+
+- bpo-30348: Increase test coverage of idlelib.autocomplete by 30%.
+
+- bpo-32411: In browser.py, remove extraneous sorting by line number since
+ dictionary was created in line number order.
+
+Tools/Demos
+-----------
+
+- bpo-14546: Fix the argument handling in Tools/scripts/lll.py.
+
+- bpo-32217: Fix freeze script on Windows.
+
+C API
+-----
+
+- bpo-28805: The :const:`METH_FASTCALL` calling convention has been
+ documented.
+
+- bpo-37170: Fix the cast on error in
+ :c:func:`PyLong_AsUnsignedLongLongMask()`.
+
+- bpo-36389: Change the value of ``CLEANBYTE``, ``DEADBYTE`` and
+  ``FORBIDDENBYTE`` internal constants used by debug hooks on Python
+  memory allocators (:c:func:`PyMem_SetupDebugHooks` function). Byte
+  patterns ``0xCB``, ``0xDB`` and ``0xFB`` have been replaced with
+  ``0xCD``, ``0xDD`` and ``0xFD`` to use the same values as Windows CRT
+  debug ``malloc()`` and ``free()``.
+
+
What's New in Python 3.7.3 final?
=================================
if opt == '--ldflags':
if not getvar('Py_ENABLE_SHARED'):
libs.insert(0, '-L' + getvar('LIBPL'))
- if not getvar('PYTHONFRAMEWORK'):
- libs.extend(getvar('LINKFORSHARED').split())
print(' '.join(libs))
elif opt == '--extension-suffix':
LIBS="-lpython${VERSION}${ABIFLAGS} @LIBS@ $SYSLIBS"
BASECFLAGS="@BASECFLAGS@"
LDLIBRARY="@LDLIBRARY@"
-LINKFORSHARED="@LINKFORSHARED@"
OPT="@OPT@"
PY_ENABLE_SHARED="@PY_ENABLE_SHARED@"
LDVERSION="@LDVERSION@"
echo "$LIBS"
;;
--ldflags)
- LINKFORSHAREDUSED=
- if [ -z "$PYTHONFRAMEWORK" ] ; then
- LINKFORSHAREDUSED=$LINKFORSHARED
- fi
LIBPLUSED=
if [ "$PY_ENABLE_SHARED" = "0" ] ; then
LIBPLUSED="-L$LIBPL"
fi
- echo "$LIBPLUSED -L$libdir $LIBS $LINKFORSHAREDUSED"
+ echo "$LIBPLUSED -L$libdir $LIBS"
;;
--extension-suffix)
echo "$SO"
# Interface to the Expat XML parser
# More information on Expat can be found at www.libexpat.org.
#
-#pyexpat expat/xmlparse.c expat/xmlrole.c expat/xmltok.c pyexpat.c -I$(srcdir)/Modules/expat -DHAVE_EXPAT_CONFIG_H -DXML_POOR_ENTROPY=1 -DUSE_PYEXPAT_CAPI
+#pyexpat expat/xmlparse.c expat/xmlrole.c expat/xmltok.c pyexpat.c -I$(srcdir)/Modules/expat -DHAVE_EXPAT_CONFIG_H -DXML_POOR_ENTROPY -DUSE_PYEXPAT_CAPI
# Hye-Shik Chang's CJKCodecs
}
itemsize = itemdict->size;
- if (length * itemsize < 0) {
+ if (itemsize != 0 && length > PY_SSIZE_T_MAX / itemsize) {
PyErr_SetString(PyExc_OverflowError,
"array too large");
goto error;
stgdict->align = itemalign;
stgdict->length = length;
stgdict->proto = type_attr;
+ type_attr = NULL;
stgdict->paramfunc = &PyCArrayType_paramfunc;
"ctypes objects containing pointers cannot be pickled");
return NULL;
}
- return Py_BuildValue("O(O(NN))",
- _unpickle,
- Py_TYPE(myself),
- PyObject_GetAttrString(myself, "__dict__"),
+ PyObject *dict = PyObject_GetAttrString(myself, "__dict__");
+ if (dict == NULL) {
+ return NULL;
+ }
+ return Py_BuildValue("O(O(NN))", _unpickle, Py_TYPE(myself), dict,
PyBytes_FromStringAndSize(self->b_ptr, self->b_size));
}
StgDictObject *stgdict, *itemdict;
PyObject *proto;
PyObject *np;
- Py_ssize_t start, stop, step, slicelen, cur, i;
+ Py_ssize_t start, stop, step, slicelen, i;
+ size_t cur;
if (PySlice_Unpack(item, &start, &stop, &step) < 0) {
return NULL;
return Array_ass_item(myself, i, value);
}
else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelen, otherlen, i, cur;
+ Py_ssize_t start, stop, step, slicelen, otherlen, i;
+ size_t cur;
if (PySlice_Unpack(item, &start, &stop, &step) < 0) {
return -1;
if (cnv)
dict = PyType_stgdict(cnv);
else {
- PrintError("Getting argument converter %d\n", i);
+ PrintError("Getting argument converter %zd\n", i);
goto Done;
}
if (dict && dict->getfunc && !_ctypes_simple_instance(cnv)) {
PyObject *v = dict->getfunc(*pArgs, dict->size);
if (!v) {
- PrintError("create argument %d:\n", i);
+ PrintError("create argument %zd:\n", i);
Py_DECREF(cnv);
goto Done;
}
/* Hm, shouldn't we use PyCData_AtAddress() or something like that instead? */
CDataObject *obj = (CDataObject *)_PyObject_CallNoArg(cnv);
if (!obj) {
- PrintError("create argument %d:\n", i);
+ PrintError("create argument %zd:\n", i);
Py_DECREF(cnv);
goto Done;
}
if (!CDataObject_Check(obj)) {
Py_DECREF(obj);
Py_DECREF(cnv);
- PrintError("unexpected result of create argument %d:\n", i);
+ PrintError("unexpected result of create argument %zd:\n", i);
goto Done;
}
memcpy(obj->b_ptr, *pArgs, dict->size);
} else {
PyErr_SetString(PyExc_TypeError,
"cannot build parameter");
- PrintError("Parsing argument %d\n", i);
+ PrintError("Parsing argument %zd\n", i);
Py_DECREF(cnv);
goto Done;
}
converter = PyTuple_GET_ITEM(argtypes, i);
v = PyObject_CallFunctionObjArgs(converter, arg, NULL);
if (v == NULL) {
- _ctypes_extend_error(PyExc_ArgError, "argument %d: ", i+1);
+ _ctypes_extend_error(PyExc_ArgError, "argument %zd: ", i+1);
goto cleanup;
}
err = ConvParam(v, i+1, pa);
Py_DECREF(v);
if (-1 == err) {
- _ctypes_extend_error(PyExc_ArgError, "argument %d: ", i+1);
+ _ctypes_extend_error(PyExc_ArgError, "argument %zd: ", i+1);
goto cleanup;
}
} else {
err = ConvParam(arg, i+1, pa);
if (-1 == err) {
- _ctypes_extend_error(PyExc_ArgError, "argument %d: ", i+1);
+ _ctypes_extend_error(PyExc_ArgError, "argument %zd: ", i+1);
goto cleanup; /* leaking ? */
}
}
#ifdef MALLOC_CLOSURE_DEBUG
printf("block at %p allocated (%d bytes), %d ITEMs\n",
- item, count * sizeof(ITEM), count);
+ item, count * (int)sizeof(ITEM), count);
#endif
/* put them into the free list */
for (i = 0; i < count; ++i) {
return element_getitem(self_, i);
}
else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelen, cur, i;
+ Py_ssize_t start, stop, step, slicelen, i;
+ size_t cur;
PyObject* list;
if (!self->extra)
return element_setitem(self_, i, value);
}
else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelen, newlen, cur, i;
+ Py_ssize_t start, stop, step, slicelen, newlen, i;
+ size_t cur;
PyObject* recycle = NULL;
PyObject* seq;
PyObject *key, *keyword, *value;
Py_ssize_t key_size, pos, key_pos, kwds_size;
+ kwds_size = kwds ? PyDict_GET_SIZE(kwds) : 0;
+
/* short path, key will match args anyway, which is a tuple */
- if (!typed && !kwds) {
+ if (!typed && !kwds_size) {
if (PyTuple_GET_SIZE(args) == 1) {
key = PyTuple_GET_ITEM(args, 0);
if (PyUnicode_CheckExact(key) || PyLong_CheckExact(key)) {
return args;
}
- kwds_size = kwds ? PyDict_GET_SIZE(kwds) : 0;
- assert(kwds_size >= 0);
-
key_size = PyTuple_GET_SIZE(args);
if (kwds_size)
key_size += kwds_size * 2 + 1;
if (universal) {
if (creating || writing || appending || updating) {
PyErr_SetString(PyExc_ValueError,
- "mode U cannot be combined with x', 'w', 'a', or '+'");
+ "mode U cannot be combined with 'x', 'w', 'a', or '+'");
goto error;
}
if (PyErr_WarnEx(PyExc_DeprecationWarning,
Py_ssize_t res;
res = _PyObject_SIZE(Py_TYPE(self));
- if (self->buf && !SHARED_BUF(self))
- res += _PySys_GetSizeOf(self->buf);
+ if (self->buf && !SHARED_BUF(self)) {
+ Py_ssize_t s = _PySys_GetSizeOf(self->buf);
+ if (s == -1) {
+ return NULL;
+ }
+ res += s;
+ }
return PyLong_FromSsize_t(res);
}
PyDoc_STRVAR(_io__IOBase_writelines__doc__,
"writelines($self, lines, /)\n"
"--\n"
-"\n");
+"\n"
+"Write a list of lines to stream.\n"
+"\n"
+"Line separators are not added, so it is usual for each of the\n"
+"lines provided to have a line separator at the end.");
#define _IO__IOBASE_WRITELINES_METHODDEF \
{"writelines", (PyCFunction)_io__IOBase_writelines, METH_O, _io__IOBase_writelines__doc__},
{
return _io__RawIOBase_readall_impl(self);
}
-/*[clinic end generated code: output=64989ec3dbf44a7c input=a9049054013a1b77]*/
+/*[clinic end generated code: output=6f8d078401fb9d48 input=a9049054013a1b77]*/
_io._IOBase.writelines
lines: object
/
+
+Write a list of lines to stream.
+
+Line separators are not added, so it is usual for each of the
+lines provided to have a line separator at the end.
[clinic start generated code]*/
static PyObject *
_io__IOBase_writelines(PyObject *self, PyObject *lines)
-/*[clinic end generated code: output=976eb0a9b60a6628 input=432e729a8450b3cb]*/
+/*[clinic end generated code: output=976eb0a9b60a6628 input=cac3fc8864183359]*/
{
PyObject *iter, *res;
if (u8n) {
PyErr_Format(PyExc_SystemError,
- "Buffer had room for %d bytes but %d bytes required",
+ "Buffer had room for %zd bytes but %u bytes required",
len, u8n);
return -1;
}
char encoding[100];
char locale[100];
- PyOS_snprintf(encoding, sizeof(encoding), "cp%d", GetACP());
+ PyOS_snprintf(encoding, sizeof(encoding), "cp%u", GetACP());
if (GetLocaleInfo(LOCALE_USER_DEFAULT,
LOCALE_SISO639LANGNAME,
if (lzma_lzma_preset(options, preset)) {
PyMem_Free(options);
- PyErr_Format(Error, "Invalid compression preset: %d", preset);
+ PyErr_Format(Error, "Invalid compression preset: %u", preset);
return NULL;
}
lzma_options_lzma options;
if (lzma_lzma_preset(&options, preset)) {
- PyErr_Format(Error, "Invalid compression preset: %d", preset);
+ PyErr_Format(Error, "Invalid compression preset: %u", preset);
return -1;
}
lzret = lzma_alone_encoder(lzs, &options);
default:
PyErr_Format(PyExc_RuntimeError, "WaitForSingleObject() or "
"WaitForMultipleObjects() gave unrecognized "
- "value %d", res);
+ "value %u", res);
return NULL;
}
}
Py_DECREF(self);
return NULL;
}
+
+ PyObject_GC_Track(self);
return self;
}
return NULL;
}
+ PyObject_GC_Track(self);
return self;
}
*p++ = Py_hexdigits[ch & 15];
}
/* Map 16-bit characters, '\\' and '\n' to '\uxxxx' */
- else if (ch >= 256 || ch == '\\' || ch == '\n') {
+ else if (ch >= 256 ||
+ ch == '\\' || ch == 0 || ch == '\n' || ch == '\r' ||
+ ch == 0x1a)
+ {
/* -1: subtract 1 preallocated byte */
p = _PyBytesWriter_Prepare(&writer, p, 6-1);
if (p == NULL)
const char *name = "main";
int rc;
int callback_error = 0;
- double sleep_secs = 0.250;
+ PyObject *sleep_obj = NULL;
+ int sleep_ms = 250;
sqlite3 *bck_conn;
sqlite3_backup *bck_handle;
static char *keywords[] = {"target", "pages", "progress", "name", "sleep", NULL};
- if (!PyArg_ParseTupleAndKeywords(args, kwds, "O!|$iOsd:backup", keywords,
+ if (!PyArg_ParseTupleAndKeywords(args, kwds, "O!|$iOsO:backup", keywords,
&pysqlite_ConnectionType, &target,
- &pages, &progress, &name, &sleep_secs)) {
+ &pages, &progress, &name, &sleep_obj)) {
return NULL;
}
+ if (sleep_obj != NULL) {
+ _PyTime_t sleep_secs;
+ if (_PyTime_FromSecondsObject(&sleep_secs, sleep_obj,
+ _PyTime_ROUND_TIMEOUT)) {
+ return NULL;
+ }
+ _PyTime_t ms = _PyTime_AsMilliseconds(sleep_secs,
+ _PyTime_ROUND_TIMEOUT);
+ if (ms < INT_MIN || ms > INT_MAX) {
+ PyErr_SetString(PyExc_OverflowError, "sleep is too large");
+ return NULL;
+ }
+ sleep_ms = (int)ms;
+ }
+
if (!pysqlite_check_connection((pysqlite_Connection *)target)) {
return NULL;
}
the engine could not make any progress */
if (rc == SQLITE_BUSY || rc == SQLITE_LOCKED) {
Py_BEGIN_ALLOW_THREADS
- sqlite3_sleep(sleep_secs * 1000.0);
+ sqlite3_sleep(sleep_ms);
Py_END_ALLOW_THREADS
}
} while (rc == SQLITE_OK || rc == SQLITE_BUSY || rc == SQLITE_LOCKED);
if (msg == NULL)
goto fail;
- init_value = Py_BuildValue("iN", ssl_errno, msg);
+ init_value = Py_BuildValue("iN", ERR_GET_REASON(ssl_errno), msg);
if (init_value == NULL)
goto fail;
SSL_set_mode(self->ssl,
SSL_MODE_ACCEPT_MOVING_WRITE_BUFFER | SSL_MODE_AUTO_RETRY);
+#ifdef TLS1_3_VERSION
+ if (sslctx->post_handshake_auth == 1) {
+ if (socket_type == PY_SSL_SERVER) {
+ /* bpo-37428: OpenSSL does not ignore SSL_VERIFY_POST_HANDSHAKE.
+ * Set SSL_VERIFY_POST_HANDSHAKE flag only for server sockets and
+ * only in combination with SSL_VERIFY_PEER flag. */
+ int mode = SSL_get_verify_mode(self->ssl);
+ if (mode & SSL_VERIFY_PEER) {
+ int (*verify_cb)(int, X509_STORE_CTX *) = NULL;
+ verify_cb = SSL_get_verify_callback(self->ssl);
+ mode |= SSL_VERIFY_POST_HANDSHAKE;
+ SSL_set_verify(self->ssl, mode, verify_cb);
+ }
+ } else {
+ /* client socket */
+ SSL_set_post_handshake_auth(self->ssl, 1);
+ }
+ }
+#endif
+
if (server_hostname != NULL) {
if (_ssl_configure_hostname(self, server_hostname) < 0) {
Py_DECREF(self);
"invalid value for verify_mode");
return -1;
}
-#ifdef TLS1_3_VERSION
- if (self->post_handshake_auth)
- mode |= SSL_VERIFY_POST_HANDSHAKE;
-#endif
+
+ /* bpo-37428: newPySSLSocket() sets SSL_VERIFY_POST_HANDSHAKE flag for
+ * server sockets and SSL_set_post_handshake_auth() for client. */
+
/* keep current verify cb */
verify_cb = SSL_CTX_get_verify_callback(self->ctx);
SSL_CTX_set_verify(self->ctx, mode, verify_cb);
#if HAVE_ALPN
if ((size_t)protos->len > UINT_MAX) {
PyErr_Format(PyExc_OverflowError,
- "protocols longer than %d bytes", UINT_MAX);
+ "protocols longer than %u bytes", UINT_MAX);
return NULL;
}
#if TLS1_3_VERSION
static int
set_post_handshake_auth(PySSLContext *self, PyObject *arg, void *c) {
- int (*verify_cb)(int, X509_STORE_CTX *) = NULL;
- int mode = SSL_CTX_get_verify_mode(self->ctx);
if (arg == NULL) {
PyErr_SetString(PyExc_AttributeError, "cannot delete attribute");
return -1;
}
self->post_handshake_auth = pha;
- /* client-side socket setting, ignored by server-side */
- SSL_CTX_set_post_handshake_auth(self->ctx, pha);
-
- /* server-side socket setting, ignored by client-side */
- verify_cb = SSL_CTX_get_verify_callback(self->ctx);
- if (pha) {
- mode |= SSL_VERIFY_POST_HANDSHAKE;
- } else {
- mode ^= SSL_VERIFY_POST_HANDSHAKE;
- }
- SSL_CTX_set_verify(self->ctx, mode, verify_cb);
+ /* bpo-37428: newPySSLSocket() sets SSL_VERIFY_POST_HANDSHAKE flag for
+ * server sockets and SSL_set_post_handshake_auth() for client. */
return 0;
}
return retval;
}
+static HCERTSTORE
+ssl_collect_certificates(const char *store_name)
+{
+/* This function collects the system certificate stores listed in
+ * system_stores into a collection certificate store so that they
+ * can be enumerated. A store must be readable to be added to the
+ * collection.
+ */
+
+ HCERTSTORE hCollectionStore = NULL, hSystemStore = NULL;
+ static DWORD system_stores[] = {
+ CERT_SYSTEM_STORE_LOCAL_MACHINE,
+ CERT_SYSTEM_STORE_LOCAL_MACHINE_ENTERPRISE,
+ CERT_SYSTEM_STORE_LOCAL_MACHINE_GROUP_POLICY,
+ CERT_SYSTEM_STORE_CURRENT_USER,
+ CERT_SYSTEM_STORE_CURRENT_USER_GROUP_POLICY,
+ CERT_SYSTEM_STORE_SERVICES,
+ CERT_SYSTEM_STORE_USERS};
+ size_t i, storesAdded;
+ BOOL result;
+
+ hCollectionStore = CertOpenStore(CERT_STORE_PROV_COLLECTION, 0,
+ (HCRYPTPROV)NULL, 0, NULL);
+ if (!hCollectionStore) {
+ return NULL;
+ }
+ storesAdded = 0;
+ for (i = 0; i < sizeof(system_stores) / sizeof(DWORD); i++) {
+ hSystemStore = CertOpenStore(CERT_STORE_PROV_SYSTEM_A, 0,
+ (HCRYPTPROV)NULL,
+ CERT_STORE_READONLY_FLAG |
+ system_stores[i], store_name);
+ if (hSystemStore) {
+ result = CertAddStoreToCollection(hCollectionStore, hSystemStore,
+ CERT_PHYSICAL_STORE_ADD_ENABLE_FLAG, 0);
+ if (result) {
+ ++storesAdded;
+ }
+ }
+ }
+ if (storesAdded == 0) {
+ CertCloseStore(hCollectionStore, CERT_CLOSE_STORE_FORCE_FLAG);
+ return NULL;
+ }
+
+ return hCollectionStore;
+}
+
+/* code from Objects/listobject.c */
+
+static int
+list_contains(PyListObject *a, PyObject *el)
+{
+ Py_ssize_t i;
+ int cmp;
+
+ for (i = 0, cmp = 0 ; cmp == 0 && i < Py_SIZE(a); ++i)
+ cmp = PyObject_RichCompareBool(el, PyList_GET_ITEM(a, i),
+ Py_EQ);
+ return cmp;
+}
+
/*[clinic input]
_ssl.enum_certificates
store_name: str
_ssl_enum_certificates_impl(PyObject *module, const char *store_name)
/*[clinic end generated code: output=5134dc8bb3a3c893 input=915f60d70461ea4e]*/
{
- HCERTSTORE hStore = NULL;
+ HCERTSTORE hCollectionStore = NULL;
PCCERT_CONTEXT pCertCtx = NULL;
PyObject *keyusage = NULL, *cert = NULL, *enc = NULL, *tup = NULL;
PyObject *result = NULL;
if (result == NULL) {
return NULL;
}
- hStore = CertOpenStore(CERT_STORE_PROV_SYSTEM_A, 0, (HCRYPTPROV)NULL,
- CERT_STORE_READONLY_FLAG | CERT_SYSTEM_STORE_LOCAL_MACHINE,
- store_name);
- if (hStore == NULL) {
+ hCollectionStore = ssl_collect_certificates(store_name);
+ if (hCollectionStore == NULL) {
Py_DECREF(result);
return PyErr_SetFromWindowsErr(GetLastError());
}
- while (pCertCtx = CertEnumCertificatesInStore(hStore, pCertCtx)) {
+ while (pCertCtx = CertEnumCertificatesInStore(hCollectionStore, pCertCtx)) {
cert = PyBytes_FromStringAndSize((const char*)pCertCtx->pbCertEncoded,
pCertCtx->cbCertEncoded);
if (!cert) {
enc = NULL;
PyTuple_SET_ITEM(tup, 2, keyusage);
keyusage = NULL;
- if (PyList_Append(result, tup) < 0) {
- Py_CLEAR(result);
- break;
+ if (!list_contains((PyListObject*)result, tup)) {
+ if (PyList_Append(result, tup) < 0) {
+ Py_CLEAR(result);
+ break;
+ }
}
Py_CLEAR(tup);
}
Py_XDECREF(keyusage);
Py_XDECREF(tup);
- if (!CertCloseStore(hStore, 0)) {
+ /* CERT_CLOSE_STORE_FORCE_FLAG forces freeing of memory for all contexts
+ associated with the store, in this case our collection store and the
+ associated system stores. */
+ if (!CertCloseStore(hCollectionStore, CERT_CLOSE_STORE_FORCE_FLAG)) {
/* This error case might shadow another exception.*/
Py_XDECREF(result);
return PyErr_SetFromWindowsErr(GetLastError());
}
+
return result;
}
_ssl_enum_crls_impl(PyObject *module, const char *store_name)
/*[clinic end generated code: output=bce467f60ccd03b6 input=a1f1d7629f1c5d3d]*/
{
- HCERTSTORE hStore = NULL;
+ HCERTSTORE hCollectionStore = NULL;
PCCRL_CONTEXT pCrlCtx = NULL;
PyObject *crl = NULL, *enc = NULL, *tup = NULL;
PyObject *result = NULL;
if (result == NULL) {
return NULL;
}
- hStore = CertOpenStore(CERT_STORE_PROV_SYSTEM_A, 0, (HCRYPTPROV)NULL,
- CERT_STORE_READONLY_FLAG | CERT_SYSTEM_STORE_LOCAL_MACHINE,
- store_name);
- if (hStore == NULL) {
+ hCollectionStore = ssl_collect_certificates(store_name);
+ if (hCollectionStore == NULL) {
Py_DECREF(result);
return PyErr_SetFromWindowsErr(GetLastError());
}
- while (pCrlCtx = CertEnumCRLsInStore(hStore, pCrlCtx)) {
+ while (pCrlCtx = CertEnumCRLsInStore(hCollectionStore, pCrlCtx)) {
crl = PyBytes_FromStringAndSize((const char*)pCrlCtx->pbCrlEncoded,
pCrlCtx->cbCrlEncoded);
if (!crl) {
PyTuple_SET_ITEM(tup, 1, enc);
enc = NULL;
- if (PyList_Append(result, tup) < 0) {
- Py_CLEAR(result);
- break;
+ if (!list_contains((PyListObject*)result, tup)) {
+ if (PyList_Append(result, tup) < 0) {
+ Py_CLEAR(result);
+ break;
+ }
}
Py_CLEAR(tup);
}
Py_XDECREF(enc);
Py_XDECREF(tup);
- if (!CertCloseStore(hStore, 0)) {
+ /* CERT_CLOSE_STORE_FORCE_FLAG forces freeing of memory for all contexts
+ associated with the store, in this case our collection store and the
+ associated system stores. */
+ if (!CertCloseStore(hCollectionStore, CERT_CLOSE_STORE_FORCE_FLAG)) {
/* This error case might shadow another exception.*/
Py_XDECREF(result);
return PyErr_SetFromWindowsErr(GetLastError());
return Py_None;
}
+static PyObject *
+test_long_as_unsigned_long_long_mask(PyObject *self,
+ PyObject *Py_UNUSED(ignored))
+{
+ unsigned long long res = PyLong_AsUnsignedLongLongMask(NULL);
+
+ if (res != (unsigned long long)-1 || !PyErr_Occurred()) {
+ return raiseTestError("test_long_as_unsigned_long_long_mask",
+ "PyLong_AsUnsignedLongLongMask(NULL) didn't "
+ "complain");
+ }
+ if (!PyErr_ExceptionMatches(PyExc_SystemError)) {
+ return raiseTestError("test_long_as_unsigned_long_long_mask",
+ "PyLong_AsUnsignedLongLongMask(NULL) raised "
+ "something other than SystemError");
+ }
+ PyErr_Clear();
+ Py_RETURN_NONE;
+}
+
/* Test the PyLong_AsDouble API. At present this just tests that
non-integer arguments are handled correctly.
*/
/* Deliberate buffer overflow to check that PyMem_Free() detects
the overflow when debug hooks are installed. */
buffer = PyMem_Malloc(16);
+ if (buffer == NULL) {
+ PyErr_NoMemory();
+ return NULL;
+ }
buffer[16] = 'x';
PyMem_Free(buffer);
}
+static PyObject*
+pyobject_is_freed(PyObject *self, PyObject *op)
+{
+ int res = _PyObject_IsFreed(op);
+ return PyBool_FromLong(res);
+}
+
+
+static PyObject*
+pyobject_uninitialized(PyObject *self, PyObject *args)
+{
+ PyObject *op = (PyObject *)PyObject_Malloc(sizeof(PyObject));
+ if (op == NULL) {
+ return NULL;
+ }
+ /* Initialize reference count to avoid early crash in ceval or GC */
+ Py_REFCNT(op) = 1;
+ /* object fields like ob_type are uninitialized! */
+ return op;
+}
+
+
+static PyObject*
+pyobject_forbidden_bytes(PyObject *self, PyObject *args)
+{
+ /* Allocate an incomplete PyObject structure: truncate 'ob_type' field */
+ PyObject *op = (PyObject *)PyObject_Malloc(offsetof(PyObject, ob_type));
+ if (op == NULL) {
+ return NULL;
+ }
+ /* Initialize reference count to avoid early crash in ceval or GC */
+ Py_REFCNT(op) = 1;
+ /* ob_type field is after the memory block: part of "forbidden bytes"
+       when using debug hooks on memory allocators! */
+ return op;
+}
+
+
+static PyObject*
+pyobject_freed(PyObject *self, PyObject *args)
+{
+ PyObject *op = _PyObject_CallNoArg((PyObject *)&PyBaseObject_Type);
+ if (op == NULL) {
+ return NULL;
+ }
+ Py_TYPE(op)->tp_dealloc(op);
+ /* Reset reference count to avoid early crash in ceval or GC */
+ Py_REFCNT(op) = 1;
+ /* object memory is freed! */
+ return op;
+}
+
+
static PyObject*
pyobject_malloc_without_gil(PyObject *self, PyObject *args)
{
METH_NOARGS},
{"test_long_as_double", (PyCFunction)test_long_as_double,METH_NOARGS},
{"test_long_as_size_t", (PyCFunction)test_long_as_size_t,METH_NOARGS},
+ {"test_long_as_unsigned_long_long_mask",
+ (PyCFunction)test_long_as_unsigned_long_long_mask, METH_NOARGS},
{"test_long_numbits", (PyCFunction)test_long_numbits, METH_NOARGS},
{"test_k_code", (PyCFunction)test_k_code, METH_NOARGS},
{"test_empty_argparse", (PyCFunction)test_empty_argparse,METH_NOARGS},
{"pymem_api_misuse", pymem_api_misuse, METH_NOARGS},
{"pymem_malloc_without_gil", pymem_malloc_without_gil, METH_NOARGS},
{"pymem_getallocatorsname", test_pymem_getallocatorsname, METH_NOARGS},
+ {"pyobject_is_freed", (PyCFunction)(void(*)(void))pyobject_is_freed, METH_O},
+ {"pyobject_uninitialized", pyobject_uninitialized, METH_NOARGS},
+ {"pyobject_forbidden_bytes", pyobject_forbidden_bytes, METH_NOARGS},
+ {"pyobject_freed", pyobject_freed, METH_NOARGS},
{"pyobject_malloc_without_gil", pyobject_malloc_without_gil, METH_NOARGS},
{"tracemalloc_track", tracemalloc_track, METH_VARARGS},
{"tracemalloc_untrack", tracemalloc_untrack, METH_VARARGS},
+/*
+ * Python UUID module that wraps libuuid -
+ * DCE compatible Universally Unique Identifier library.
+ */
+
#define PY_SSIZE_T_CLEAN
#include "Python.h"
#ifdef HAVE_UUID_UUID_H
#include <uuid/uuid.h>
-#endif
-#ifdef HAVE_UUID_H
+#elif defined(HAVE_UUID_H)
#include <uuid.h>
#endif
-
static PyObject *
py_uuid_generate_time_safe(void)
{
``LLVMFuzzerTestOneInput`` will run in oss-fuzz, with each test in
``fuzz_tests.txt`` run separately.
+Seed data (corpus) for the test can be provided in a subfolder called
+``<test_name>_corpus`` such as ``fuzz_json_loads_corpus``. A wide variety
+of good input samples allows the fuzzer to explore a diverse set of
+paths more easily and provides a better base from which to find buggy
+inputs.
+
+Dictionaries of tokens (see oss-fuzz documentation for more details) can
+be placed in the ``dictionaries`` folder with the name of the test.
+For example, ``dictionaries/fuzz_json_loads.dict`` contains JSON tokens
+to guide the fuzzer.
+
What makes a good fuzz test
---------------------------
--- /dev/null
+"0"
+",0"
+":0"
+"0:"
+"-1.2e+3"
+
+"true"
+"false"
+"null"
+
+"\"\""
+",\"\""
+":\"\""
+"\"\":"
+
+"{}"
+",{}"
+":{}"
+"{\"\":0}"
+"{{}}"
+
+"[]"
+",[]"
+":[]"
+"[0]"
+"[[]]"
+
+"''"
+"\\"
+"\\b"
+"\\f"
+"\\n"
+"\\r"
+"\\t"
+"\\u0000"
+"\\x00"
+"\\0"
+"\\uD800\\uDC00"
+"\\uDBFF\\uDFFF"
+
--- /dev/null
+[
+ "JSON Test Pattern pass1",
+ {"object with 1 member":["array with 1 element"]},
+ {},
+ [],
+ -42,
+ true,
+ false,
+ null,
+ {
+ "integer": 1234567890,
+ "real": -9876.543210,
+ "e": 0.123456789e-12,
+ "E": 1.234567890E+34,
+ "": 23456789012E66,
+ "zero": 0,
+ "one": 1,
+ "space": " ",
+ "quote": "\"",
+ "backslash": "\\",
+ "controls": "\b\f\n\r\t",
+ "slash": "/ & \/",
+ "alpha": "abcdefghijklmnopqrstuvwyz",
+ "ALPHA": "ABCDEFGHIJKLMNOPQRSTUVWYZ",
+ "digit": "0123456789",
+ "0123456789": "digit",
+ "special": "`1~!@#$%^&*()_+-={':[,]}|;.</>?",
+ "hex": "\u0123\u4567\u89AB\uCDEF\uabcd\uef4A",
+ "true": true,
+ "false": false,
+ "null": null,
+ "array":[ ],
+ "object":{ },
+ "address": "50 St. James Street",
+ "url": "http://www.JSON.org/",
+ "comment": "// /* <!-- --",
+ "# -- --> */": " ",
+ " s p a c e d " :[1,2 , 3
+
+,
+
+4 , 5 , 6 ,7 ],"compact":[1,2,3,4,5,6,7],
+ "jsontext": "{\"object with 1 member\":[\"array with 1 element\"]}",
+    "quotes": "&#34; \u0022 %22 0x22 034 &#x22;",
+ "\/\\\"\uCAFE\uBABE\uAB98\uFCDE\ubcda\uef4A\b\f\n\r\t`1~!@#$%^&*()_+-=[]{}|;:',./<>?"
+: "A key can be any string"
+ },
+ 0.5 ,98.6
+,
+99.44
+,
+
+1066,
+1e1,
+0.1e1,
+1e-1,
+1e00,2e+00,2e-00
+,"rosebud"]
\ No newline at end of file
--- /dev/null
+[[[[[[[[[[[[[[[[[[["Not too deep"]]]]]]]]]]]]]]]]]]]
\ No newline at end of file
--- /dev/null
+{
+ "JSON Test Pattern pass3": {
+ "The outermost value": "must be an object or array.",
+ "In this test": "It is an object."
+ }
+}
--- /dev/null
+[1, 2, 3, "abcd", "xyz"]
fuzz_builtin_float
fuzz_builtin_int
fuzz_builtin_unicode
+fuzz_json_loads
return 0;
}
+#define MAX_INT_TEST_SIZE 0x10000
+
/* Fuzz PyLong_FromUnicodeObject as a proxy for int(str). */
static int fuzz_builtin_int(const char* data, size_t size) {
+ /* Ignore test cases with very long ints to avoid timeouts
+       int("9" * 1000000) is not a very interesting test case */
+ if (size > MAX_INT_TEST_SIZE) {
+ return 0;
+ }
/* Pick a random valid base. (When the fuzzed function takes extra
parameters, it's somewhat normal to hash the input to generate those
parameters. We want to exercise all code paths, so we do so here.) */
return 0;
}
+#define MAX_JSON_TEST_SIZE 0x10000
+
+/* Initialized in LLVMFuzzerTestOneInput */
+PyObject* json_loads_method = NULL;
+/* Fuzz json.loads(x) */
+static int fuzz_json_loads(const char* data, size_t size) {
+    /* Since Python supports arbitrarily large ints in JSON,
+ long inputs can lead to timeouts on boring inputs like
+ `json.loads("9" * 100000)` */
+ if (size > MAX_JSON_TEST_SIZE) {
+ return 0;
+ }
+ PyObject* input_bytes = PyBytes_FromStringAndSize(data, size);
+ if (input_bytes == NULL) {
+ return 0;
+ }
+ PyObject* parsed = PyObject_CallFunctionObjArgs(json_loads_method, input_bytes, NULL);
+    /* Ignore ValueError as the fuzzer will more than likely
+       generate some invalid JSON values */
+ if (parsed == NULL && PyErr_ExceptionMatches(PyExc_ValueError)) {
+ PyErr_Clear();
+ }
+ /* Ignore RecursionError as the fuzzer generates long sequences of
+ arrays such as `[[[...` */
+ if (parsed == NULL && PyErr_ExceptionMatches(PyExc_RecursionError)) {
+ PyErr_Clear();
+ }
+ /* Ignore unicode errors, invalid byte sequences are common */
+ if (parsed == NULL && PyErr_ExceptionMatches(PyExc_UnicodeDecodeError)) {
+ PyErr_Clear();
+ }
+ Py_DECREF(input_bytes);
+ Py_XDECREF(parsed);
+ return 0;
+}
+
/* Run fuzzer and abort on failure. */
static int _run_fuzz(const uint8_t *data, size_t size, int(*fuzzer)(const char* , size_t)) {
int rv = fuzzer((const char*) data, size);
/* CPython generates a lot of leak warnings for whatever reason. */
int __lsan_is_turned_off(void) { return 1; }
+
+int LLVMFuzzerInitialize(int *argc, char ***argv) {
+ wchar_t* wide_program_name = Py_DecodeLocale(*argv[0], NULL);
+ Py_SetProgramName(wide_program_name);
+ return 0;
+}
+
/* Fuzz test interface.
This returns the bitwise or of all fuzz test's return values.
initialize CPython ourselves on the first run. */
Py_InitializeEx(0);
}
+#if !defined(_Py_FUZZ_ONE) || defined(_Py_FUZZ_fuzz_json_loads)
+ if (json_loads_method == NULL) {
+ PyObject* json_module = PyImport_ImportModule("json");
+ json_loads_method = PyObject_GetAttrString(json_module, "loads");
+ }
+#endif
int rv = 0;
#endif
#if !defined(_Py_FUZZ_ONE) || defined(_Py_FUZZ_fuzz_builtin_unicode)
rv |= _run_fuzz(data, size, fuzz_builtin_unicode);
+#endif
+#if !defined(_Py_FUZZ_ONE) || defined(_Py_FUZZ_fuzz_json_loads)
+ rv |= _run_fuzz(data, size, fuzz_json_loads);
#endif
return rv;
}
return array_item(self, i);
}
else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelength, cur, i;
+ Py_ssize_t start, stop, step, slicelength, i;
+ size_t cur;
PyObject* result;
arrayobject* ar;
int itemsize = self->ob_descr->itemsize;
return 0;
}
else {
- Py_ssize_t cur, i;
+ size_t cur;
+ Py_ssize_t i;
if (needed != slicelength) {
PyErr_Format(PyExc_ValueError,
*/
PyErr_Format(Error,
"Invalid base64-encoded string: "
- "number of data characters (%d) cannot be 1 more "
+ "number of data characters (%zd) cannot be 1 more "
"than a multiple of 4",
(bin_data - bin_data_start) / 3 * 4 + 1);
} else {
}
PyDoc_STRVAR(module_doc,
-"This module is always available. It provides access to mathematical\n"
-"functions for complex numbers.");
+"This module provides access to mathematical functions for complex\n"
+"numbers.");
static PyMethodDef cmath_methods[] = {
CMATH_ACOS_METHODDEF
*/
#define XML_MAJOR_VERSION 2
#define XML_MINOR_VERSION 2
-#define XML_MICRO_VERSION 6
+#define XML_MICRO_VERSION 7
#ifdef __cplusplus
}
/* External API definitions */
-/* Namespace external symbols to allow multiple libexpat version to
- co-exist. */
-#include "pyexpatns.h"
-
#if defined(_MSC_EXTENSIONS) && !defined(__BEOS__) && !defined(__CYGWIN__)
# define XML_USE_MSC_EXTENSIONS 1
#endif
# endif
#endif /* not defined XMLCALL */
+/* Namespace external symbols to allow multiple libexpat version to
+ co-exist. */
+#include "pyexpatns.h"
+
#if !defined(XML_STATIC) && !defined(XMLIMPORT)
# ifndef XML_BUILDING_EXPAT
# endif
#endif /* not defined XML_STATIC */
-#if !defined(XMLIMPORT) && defined(__GNUC__) && (__GNUC__ >= 4)
+#ifndef XML_ENABLE_VISIBILITY
+# define XML_ENABLE_VISIBILITY 0
+#endif
+
+#if !defined(XMLIMPORT) && XML_ENABLE_VISIBILITY
# define XMLIMPORT __attribute__ ((visibility ("default")))
#endif
#endif
+#ifdef XML_ENABLE_VISIBILITY
+#if XML_ENABLE_VISIBILITY
+__attribute__ ((visibility ("default")))
+#endif
+#endif
void
_INTERNAL_trim_to_complete_utf8_characters(const char * from, const char ** fromLimRef);
/* we will assume all Windows platforms are little endian */
#define BYTEORDER 1234
-/* Windows has memmove() available. */
-#define HAVE_MEMMOVE
-
-
#endif /* !defined(HAVE_EXPAT_CONFIG_H) */
-/* 19ac4776051591216f1874e34ee99b6a43a3784c8bd7d70efeb9258dd22b906a (2.2.6+)
+/* 69df5be70289a11fb834869ce4a91c23c1d9dd04baffcbd10e86742d149a080c (2.2.7+)
__ __ _
___\ \/ /_ __ __ _| |_
/ _ \\ /| '_ \ / _` | __|
/* Do safe (NULL-aware) pointer arithmetic */
#define EXPAT_SAFE_PTR_DIFF(p, q) (((p) && (q)) ? ((p) - (q)) : 0)
-/* Handle the case where memmove() doesn't exist. */
-#ifndef HAVE_MEMMOVE
-#ifdef HAVE_BCOPY
-#define memmove(d,s,l) bcopy((s),(d),(l))
-#else
-#error memmove does not exist on this platform, nor is a substitute available
-#endif /* HAVE_BCOPY */
-#endif /* HAVE_MEMMOVE */
-
#include "internal.h"
#include "xmltok.h"
#include "xmlrole.h"
#endif /* ! defined(HAVE_ARC4RANDOM_BUF) && ! defined(HAVE_ARC4RANDOM) */
-#if defined(HAVE_ARC4RANDOM)
+#if defined(HAVE_ARC4RANDOM) && ! defined(HAVE_ARC4RANDOM_BUF)
static void
writeRandomBytes_arc4random(void * target, size_t count) {
}
}
-#endif /* defined(HAVE_ARC4RANDOM) */
+#endif /* defined(HAVE_ARC4RANDOM) && ! defined(HAVE_ARC4RANDOM_BUF) */
#ifdef _WIN32
enum XML_Error result;
if (parser->m_startCdataSectionHandler)
parser->m_startCdataSectionHandler(parser->m_handlerArg);
-#if 0
+/* BEGIN disabled code */
/* Suppose you doing a transformation on a document that involves
changing only the character data. You set up a defaultHandler
and a characterDataHandler. The defaultHandler simply copies
However, now we have a start/endCdataSectionHandler, so it seems
easier to let the user deal with this.
*/
- else if (parser->m_characterDataHandler)
+ else if (0 && parser->m_characterDataHandler)
parser->m_characterDataHandler(parser->m_handlerArg, parser->m_dataBuf, 0);
-#endif
+/* END disabled code */
else if (parser->m_defaultHandler)
reportDefault(parser, enc, s, next);
result = doCdataSection(parser, enc, &next, end, nextPtr, haveMore);
case XML_TOK_CDATA_SECT_CLOSE:
if (parser->m_endCdataSectionHandler)
parser->m_endCdataSectionHandler(parser->m_handlerArg);
-#if 0
+/* BEGIN disabled code */
/* see comment under XML_TOK_CDATA_SECT_OPEN */
- else if (parser->m_characterDataHandler)
+ else if (0 && parser->m_characterDataHandler)
parser->m_characterDataHandler(parser->m_handlerArg, parser->m_dataBuf, 0);
-#endif
+/* END disabled code */
else if (parser->m_defaultHandler)
reportDefault(parser, enc, s, next);
*startPtr = next;
else
poolDiscard(&dtd->pool);
elementType->prefix = prefix;
-
+ break;
}
}
return 1;
USE OR OTHER DEALINGS IN THE SOFTWARE.
*/
-#if !defined(_WIN32) && defined(HAVE_EXPAT_CONFIG_H)
-# include <pyconfig.h>
-#endif
#include <stddef.h>
#include <string.h> /* memcpy */
{
size_t depth, size;
uintptr_t sp = (uintptr_t)&depth;
- uintptr_t stop;
+ uintptr_t stop, lower_limit, upper_limit;
faulthandler_suppress_crash_report();
depth = 0;
- stop = stack_overflow(sp - STACK_OVERFLOW_MAX_SIZE,
- sp + STACK_OVERFLOW_MAX_SIZE,
- &depth);
+
+ if (STACK_OVERFLOW_MAX_SIZE <= sp) {
+ lower_limit = sp - STACK_OVERFLOW_MAX_SIZE;
+ }
+ else {
+ lower_limit = 0;
+ }
+
+ if (UINTPTR_MAX - STACK_OVERFLOW_MAX_SIZE >= sp) {
+ upper_limit = sp + STACK_OVERFLOW_MAX_SIZE;
+ }
+ else {
+ upper_limit = UINTPTR_MAX;
+ }
+
+ stop = stack_overflow(lower_limit, upper_limit, &depth);
if (sp < stop)
size = stop - sp;
else
#ifdef HAVE_SIGALTSTACK
if (stack.ss_sp != NULL) {
/* Fetch the current alt stack */
- stack_t current_stack = {};
+ stack_t current_stack;
+ memset(¤t_stack, 0, sizeof(current_stack));
if (sigaltstack(NULL, ¤t_stack) == 0) {
if (current_stack.ss_sp == stack.ss_sp) {
/* The current alt stack is the one that we installed.
lz = (countobject *)type->tp_alloc(type, 0);
if (lz == NULL) {
Py_XDECREF(long_cnt);
+ Py_DECREF(long_step);
return NULL;
}
lz->cnt = cnt;
# endif
#endif
-#ifdef _MSC_VER
-# include <crtdbg.h>
+#ifdef MS_WINDOWS
+# include <windows.h> /* STATUS_CONTROL_C_EXIT */
#endif
#ifdef __FreeBSD__
orig_argc = 0;
orig_argv = NULL;
+ _PyRuntime_Finalize();
+
PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &old_alloc);
}
}
-static int
-pymain_compute_path0(_PyMain *pymain, _PyCoreConfig *config, PyObject **path0)
-{
- if (pymain->main_importer_path != NULL) {
- /* Let pymain_run_main_from_importer() adjust sys.path[0] later */
- *path0 = NULL;
- return 0;
- }
-
- if (Py_IsolatedFlag) {
- *path0 = NULL;
- return 0;
- }
-
- *path0 = _PyPathConfig_ComputeArgv0(config->argc, config->argv);
- if (*path0 == NULL) {
- pymain->err = _Py_INIT_NO_MEMORY();
- return -1;
- }
- return 0;
-}
-
-
static int
pymain_update_sys_path(_PyMain *pymain, PyObject *path0)
{
const _PyCoreConfig *config = &_PyGILState_GetInterpreterStateUnsafe()->core_config;
FILE* fp;
- fp = _Py_wfopen(pymain->filename, L"r");
+ fp = _Py_wfopen(pymain->filename, L"rb");
if (fp == NULL) {
char *cfilename_buffer;
const char *cfilename;
pymain->main_importer_path = pymain_get_importer(pymain->filename);
}
- PyObject *path0;
- if (pymain_compute_path0(pymain, config, &path0) < 0) {
+ if (pymain->main_importer_path != NULL) {
+ /* Let pymain_run_main_from_importer() adjust sys.path[0] later */
+ return 0;
+ }
+
+ if (Py_IsolatedFlag) {
+ return 0;
+ }
+
+ PyObject *path0 = NULL;
+ if (!_PyPathConfig_ComputeArgv0(config->argc, config->argv, &path0)) {
+ return 0;
+ }
+ if (path0 == NULL) {
+ pymain->err = _Py_INIT_NO_MEMORY();
return -1;
}
- if (path0 != NULL) {
- if (pymain_update_sys_path(pymain, path0) < 0) {
- Py_DECREF(path0);
- return -1;
- }
+ if (pymain_update_sys_path(pymain, path0) < 0) {
Py_DECREF(path0);
+ return -1;
}
+ Py_DECREF(path0);
return 0;
}
PyDoc_STRVAR(module_doc,
-"This module is always available. It provides access to the\n"
-"mathematical functions defined by the C standard.");
+"This module provides access to the mathematical functions\n"
+"defined by the C standard.");
static struct PyModuleDef mathmodule = {
slicelen);
else {
char *result_buf = (char *)PyMem_Malloc(slicelen);
- Py_ssize_t cur, i;
+ size_t cur;
+ Py_ssize_t i;
PyObject *result;
if (result_buf == NULL)
memcpy(self->data + start, vbuf.buf, slicelen);
}
else {
- Py_ssize_t cur, i;
+ size_t cur;
+ Py_ssize_t i;
for (cur = start, i = 0;
i < slicelen;
* Py_[X]DECREF() and Py_[X]INCREF() macros. The lint annotations
* look like "NOTE(...)".
*
- * To debug parser errors like
- * "parser.ParserError: Expected node type 12, got 333."
- * decode symbol numbers using the automatically-generated files
- * Lib/symbol.h and Include/token.h.
*/
#include "Python.h" /* general Python API */
for (pos = 0; pos < nch; ++pos) {
node *ch = CHILD(tree, pos);
int ch_type = TYPE(ch);
+ if ((ch_type >= NT_OFFSET + _PyParser_Grammar.g_ndfas)
+ || (ISTERMINAL(ch_type) && (ch_type >= N_TOKENS))
+ || (ch_type < 0)
+ ) {
+ PyErr_Format(parser_error, "Unrecognized node type %d.", ch_type);
+ return 0;
+ }
for (arc = 0; arc < dfa_state->s_narcs; ++arc) {
short a_label = dfa_state->s_arc[arc].a_lbl;
assert(a_label < _PyParser_Grammar.g_ll.ll_nlabels);
- if (_PyParser_Grammar.g_ll.ll_label[a_label].lb_type == ch_type) {
+
+ const char *label_str = _PyParser_Grammar.g_ll.ll_label[a_label].lb_str;
+ if ((_PyParser_Grammar.g_ll.ll_label[a_label].lb_type == ch_type)
+ && ((ch->n_str == NULL) || (label_str == NULL)
+ || (strcmp(ch->n_str, label_str) == 0))
+ ) {
/* The child is acceptable; if non-terminal, validate it recursively. */
if (ISNONTERMINAL(ch_type) && !validate_node(ch))
return 0;
/* What would this state have accepted? */
{
short a_label = dfa_state->s_arc->a_lbl;
- int next_type;
if (!a_label) /* Wouldn't accept any more children */
goto illegal_num_children;
- next_type = _PyParser_Grammar.g_ll.ll_label[a_label].lb_type;
- if (ISNONTERMINAL(next_type))
- PyErr_Format(parser_error, "Expected node type %d, got %d.",
- next_type, ch_type);
- else
+ int next_type = _PyParser_Grammar.g_ll.ll_label[a_label].lb_type;
+ const char *expected_str = _PyParser_Grammar.g_ll.ll_label[a_label].lb_str;
+
+ if (ISNONTERMINAL(next_type)) {
+ PyErr_Format(parser_error, "Expected %s, got %s.",
+ _PyParser_Grammar.g_dfa[next_type - NT_OFFSET].d_name,
+ ISTERMINAL(ch_type) ? _PyParser_TokenNames[ch_type] :
+ _PyParser_Grammar.g_dfa[ch_type - NT_OFFSET].d_name);
+ }
+ else if (expected_str != NULL) {
+ PyErr_Format(parser_error, "Illegal terminal: expected '%s'.",
+ expected_str);
+ }
+ else {
PyErr_Format(parser_error, "Illegal terminal: expected %s.",
_PyParser_TokenNames[next_type]);
+ }
return 0;
}
}
#endif /* HAVE_GETPID */
+#ifdef NGROUPS_MAX
+#define MAX_GROUPS NGROUPS_MAX
+#else
+ /* defined to be 16 on Solaris7, so this should be a small number */
+#define MAX_GROUPS 64
+#endif
+
#ifdef HAVE_GETGROUPLIST
/* AC 3.5: funny apple logic below */
static PyObject *
posix_getgrouplist(PyObject *self, PyObject *args)
{
-#ifdef NGROUPS_MAX
-#define MAX_GROUPS NGROUPS_MAX
-#else
- /* defined to be 16 on Solaris7, so this should be a small number */
-#define MAX_GROUPS 64
-#endif
-
const char *user;
int i, ngroups;
PyObject *list;
#else
gid_t *groups, basegid;
#endif
- ngroups = MAX_GROUPS;
+
+ /*
+ * NGROUPS_MAX is defined by POSIX.1 as the maximum
+     * number of supplemental groups a user can belong to.
+ * We have to increment it by one because
+ * getgrouplist() returns both the supplemental groups
+ * and the primary group, i.e. all of the groups the
+ * user belongs to.
+ */
+ ngroups = 1 + MAX_GROUPS;
#ifdef __APPLE__
if (!PyArg_ParseTuple(args, "si:getgrouplist", &user, &basegid))
/*[clinic end generated code: output=42b0c17758561b56 input=d3f109412e6a155c]*/
{
PyObject *result = NULL;
-
-#ifdef NGROUPS_MAX
-#define MAX_GROUPS NGROUPS_MAX
-#else
- /* defined to be 16 on Solaris7, so this should be a small number */
-#define MAX_GROUPS 64
-#endif
gid_t grouplist[MAX_GROUPS];
/* On MacOSX getgroups(2) can return more than MAX_GROUPS results
call_readline(FILE *sys_stdin, FILE *sys_stdout, const char *prompt)
{
size_t n;
- char *p, *q;
+ char *p;
int signal;
#ifdef SAVE_LOCALE
}
/* Copy the malloc'ed buffer into a PyMem_Malloc'ed one and
release the original. */
- q = p;
+ char *q = p;
p = PyMem_RawMalloc(n+2);
if (p != NULL) {
- strncpy(p, q, n);
+ memcpy(p, q, n);
p[n] = '\n';
p[n+1] = '\0';
}
}
-/* Replacements for intrcheck.c functionality
- * Declared in pyerrors.h
- */
+/* Simulate the effect of a SIGINT signal arriving. The next time
+   PyErr_CheckSignals() is called, the Python SIGINT signal handler will
+   be called.
+
+   If the signal handler for SIGINT is missing, the call is silently
+   ignored. */
void
PyErr_SetInterrupt(void)
{
- trip_signal(SIGINT);
+ if ((Handlers[SIGINT].func != IgnoreHandler) &&
+ (Handlers[SIGINT].func != DefaultHandler)) {
+ trip_signal(SIGINT);
+ }
}
void
break;
default:
PyErr_Format(PyExc_TypeError,
- "sendto() takes 2 or 3 arguments (%d given)",
+ "sendto() takes 2 or 3 arguments (%zd given)",
arglen);
return NULL;
}
return PyLong_FromUnsignedLong(recv); }
#endif
default:
- PyErr_Format(PyExc_ValueError, "invalid ioctl command %d", cmd);
+ PyErr_Format(PyExc_ValueError, "invalid ioctl command %lu", cmd);
return NULL;
}
}
PyModule_AddIntConstant(m, "altzone", _Py_timezone-3600);
#endif
PyModule_AddIntConstant(m, "daylight", _Py_daylight);
+#ifdef MS_WINDOWS
+ TIME_ZONE_INFORMATION tzinfo = {0};
+ GetTimeZoneInformation(&tzinfo);
+ otz0 = PyUnicode_FromWideChar(tzinfo.StandardName, -1);
+ if (otz0 == NULL) {
+ return -1;
+ }
+ otz1 = PyUnicode_FromWideChar(tzinfo.DaylightName, -1);
+ if (otz1 == NULL) {
+ Py_DECREF(otz0);
+ return -1;
+ }
+#else
otz0 = PyUnicode_DecodeLocale(_Py_tzname[0], "surrogateescape");
if (otz0 == NULL) {
return -1;
Py_DECREF(otz0);
return -1;
}
+#endif // MS_WINDOWS
PyObject *tzname_obj = Py_BuildValue("(NN)", otz0, otz1);
if (tzname_obj == NULL) {
return -1;
return PyLong_FromLong((unsigned char)(PyByteArray_AS_STRING(self)[i]));
}
else if (PySlice_Check(index)) {
- Py_ssize_t start, stop, step, slicelength, cur, i;
+ Py_ssize_t start, stop, step, slicelength, i;
+ size_t cur;
if (PySlice_Unpack(index, &start, &stop, &step) < 0) {
return NULL;
}
if (!errors || strcmp(errors, "strict") == 0) {
PyErr_Format(PyExc_ValueError,
- "invalid \\x escape at position %d",
+ "invalid \\x escape at position %zd",
s - 2 - (end - len));
goto failed;
}
return PyLong_FromLong((unsigned char)self->ob_sval[i]);
}
else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelength, cur, i;
+ Py_ssize_t start, stop, step, slicelength, i;
+ size_t cur;
char* source_buf;
char* result_buf;
PyObject* result;
}
result = (*fastmeth) (self, stack, nargs, kwnames);
- if (stack != args) {
+ if (kwnames != NULL) {
+ Py_ssize_t i, n = nargs + PyTuple_GET_SIZE(kwnames);
+ for (i = 0; i < n; i++) {
+ Py_DECREF(stack[i]);
+ }
PyMem_Free((PyObject **)stack);
+ Py_DECREF(kwnames);
}
- Py_XDECREF(kwnames);
break;
}
default:
PyErr_SetString(PyExc_SystemError,
- "Bad call flags in _PyCFunction_FastCallKeywords. "
+ "Bad call flags in _PyMethodDef_RawFastCallKeywords. "
"METH_OLDARGS is no longer supported!");
goto exit;
}
return -1;
}
- /* Copy position arguments (borrowed references) */
- memcpy(stack, args, nargs * sizeof(stack[0]));
+ /* Copy positional arguments */
+ for (i = 0; i < nargs; i++) {
+ Py_INCREF(args[i]);
+ stack[i] = args[i];
+ }
kwstack = stack + nargs;
pos = i = 0;
called in the performance critical hot code. */
while (PyDict_Next(kwargs, &pos, &key, &value)) {
Py_INCREF(key);
+ Py_INCREF(value);
PyTuple_SET_ITEM(kwnames, i, key);
- /* The stack contains borrowed references */
kwstack[i] = value;
i++;
}
if (!PyType_Check(self)) {
PyErr_Format(PyExc_TypeError,
"descriptor '%V' requires a type "
- "but received a '%.100s'",
+ "but received a '%.100s' instance",
descr_name((PyDescrObject *)descr), "?",
- PyDescr_TYPE(descr)->tp_name,
self->ob_type->tp_name);
return NULL;
}
if (!PyType_IsSubtype((PyTypeObject *)self, PyDescr_TYPE(descr))) {
PyErr_Format(PyExc_TypeError,
- "descriptor '%V' "
- "requires a subtype of '%.100s' "
- "but received '%.100s",
+ "descriptor '%V' requires a subtype of '%.100s' "
+ "but received '%.100s'",
descr_name((PyDescrObject *)descr), "?",
PyDescr_TYPE(descr)->tp_name,
- self->ob_type->tp_name);
+ ((PyTypeObject*)self)->tp_name);
return NULL;
}
while (ofs < maxofs) {
IFLT(a[ofs], key) {
lastofs = ofs;
+ assert(ofs <= (PY_SSIZE_T_MAX - 1) / 2);
ofs = (ofs << 1) + 1;
- if (ofs <= 0) /* int overflow */
- ofs = maxofs;
}
else /* key <= a[hint + ofs] */
break;
break;
/* key <= a[hint - ofs] */
lastofs = ofs;
+ assert(ofs <= (PY_SSIZE_T_MAX - 1) / 2);
ofs = (ofs << 1) + 1;
- if (ofs <= 0) /* int overflow */
- ofs = maxofs;
}
if (ofs > maxofs)
ofs = maxofs;
while (ofs < maxofs) {
IFLT(key, *(a-ofs)) {
lastofs = ofs;
+ assert(ofs <= (PY_SSIZE_T_MAX - 1) / 2);
ofs = (ofs << 1) + 1;
- if (ofs <= 0) /* int overflow */
- ofs = maxofs;
}
else /* a[hint - ofs] <= key */
break;
break;
/* a[hint + ofs] <= key */
lastofs = ofs;
+ assert(ofs <= (PY_SSIZE_T_MAX - 1) / 2);
ofs = (ofs << 1) + 1;
- if (ofs <= 0) /* int overflow */
- ofs = maxofs;
}
if (ofs > maxofs)
ofs = maxofs;
if (key->ob_type != key_type) {
keys_are_all_same_type = 0;
- break;
+            /* If the keys are in tuples, we must loop over the whole
+               list to make sure all items are tuples */
+ if (!keys_are_in_tuples) {
+ break;
+ }
}
- if (key_type == &PyLong_Type) {
- if (ints_are_bounded && Py_ABS(Py_SIZE(key)) > 1)
+ if (keys_are_all_same_type) {
+ if (key_type == &PyLong_Type &&
+ ints_are_bounded &&
+ Py_ABS(Py_SIZE(key)) > 1) {
+
ints_are_bounded = 0;
+ }
+ else if (key_type == &PyUnicode_Type &&
+ strings_are_latin &&
+ PyUnicode_KIND(key) != PyUnicode_1BYTE_KIND) {
+
+ strings_are_latin = 0;
+ }
+ }
}
- else if (key_type == &PyUnicode_Type){
- if (strings_are_latin &&
- PyUnicode_KIND(key) != PyUnicode_1BYTE_KIND)
- strings_are_latin = 0;
- }
- }
/* Choose the best compare, given what we now know about the keys. */
if (keys_are_all_same_type) {
if (keys_are_in_tuples) {
/* Make sure we're not dealing with tuples of tuples
         * (remember: here, key_type refers to the list [key[0] for key in keys]) */
- if (key_type == &PyTuple_Type)
+ if (key_type == &PyTuple_Type) {
ms.tuple_elem_compare = safe_object_compare;
- else
+ }
+ else {
ms.tuple_elem_compare = ms.key_compare;
+ }
ms.key_compare = unsafe_tuple_compare;
}
if (vv == NULL || !PyLong_Check(vv)) {
PyErr_BadInternalCall();
- return (unsigned long) -1;
+ return (unsigned long long) -1;
}
v = (PyLongObject *)vv;
switch(Py_SIZE(v)) {
if (op == NULL) {
PyErr_BadInternalCall();
- return (unsigned long)-1;
+ return (unsigned long long)-1;
}
if (PyLong_Check(op)) {
}
-/* Heuristic checking if the object memory has been deallocated.
- Rely on the debug hooks on Python memory allocators which fills the memory
- with DEADBYTE (0xDB) when memory is deallocated.
+/* Heuristic checking if the object memory is uninitialized or deallocated.
+ Rely on the debug hooks on Python memory allocators:
+ see _PyMem_IsPtrFreed().
The function can be used to prevent segmentation fault on dereferencing
- pointers like 0xdbdbdbdbdbdbdbdb. Such pointer is very unlikely to be mapped
- in memory. */
+ pointers like 0xDDDDDDDDDDDDDDDD. */
int
_PyObject_IsFreed(PyObject *op)
{
- uintptr_t ptr = (uintptr_t)op;
- if (_PyMem_IsFreed(&ptr, sizeof(ptr))) {
+ if (_PyMem_IsPtrFreed(op) || _PyMem_IsPtrFreed(op->ob_type)) {
return 1;
}
- int freed = _PyMem_IsFreed(&op->ob_type, sizeof(op->ob_type));
- /* ignore op->ob_ref: the value can have be modified
+    /* ignore op->ob_ref: its value can have been modified
by Py_INCREF() and Py_DECREF(). */
#ifdef Py_TRACE_REFS
- freed &= _PyMem_IsFreed(&op->_ob_next, sizeof(op->_ob_next));
- freed &= _PyMem_IsFreed(&op->_ob_prev, sizeof(op->_ob_prev));
+ if (_PyMem_IsPtrFreed(op->_ob_next) || _PyMem_IsPtrFreed(op->_ob_prev)) {
+ return 1;
+ }
#endif
- return freed;
+ return 0;
}
if (_PyObject_IsFreed(op)) {
/* It seems like the object memory has been freed:
don't access it to prevent a segmentation fault. */
- fprintf(stderr, "<freed object>\n");
+ fprintf(stderr, "<Freed object>\n");
return;
}
}
if (tp->tp_setattr != NULL) {
const char *name_str = PyUnicode_AsUTF8(name);
- if (name_str == NULL)
+ if (name_str == NULL) {
+ Py_DECREF(name);
return -1;
+ }
err = (*tp->tp_setattr)(v, (char *)name_str, value);
Py_DECREF(name);
return err;
*
* You shouldn't change this unless you know what you are doing.
*/
+
+#if SIZEOF_VOID_P > 4
+#define ALIGNMENT 16 /* must be 2^N */
+#define ALIGNMENT_SHIFT 4
+#else
#define ALIGNMENT 8 /* must be 2^N */
#define ALIGNMENT_SHIFT 3
+#endif
/* Return the number of bytes in size class I, as a uint. */
#define INDEX2SIZE(I) (((uint)(I) + 1) << ALIGNMENT_SHIFT)
/* Special bytes broadcast into debug memory blocks at appropriate times.
* Strings of these are unlikely to be valid addresses, floats, ints or
- * 7-bit ASCII.
+ * 7-bit ASCII. If modified, _PyMem_IsPtrFreed() should be updated as well.
+ *
+ * Byte patterns 0xCB, 0xDB and 0xFB have been replaced with 0xCD, 0xDD and
+ * 0xFD to use the same values as Windows CRT debug malloc() and free().
*/
#undef CLEANBYTE
#undef DEADBYTE
#undef FORBIDDENBYTE
-#define CLEANBYTE 0xCB /* clean (newly allocated) memory */
-#define DEADBYTE 0xDB /* dead (newly freed) memory */
-#define FORBIDDENBYTE 0xFB /* untouchable bytes at each end of a block */
+#define CLEANBYTE 0xCD /* clean (newly allocated) memory */
+#define DEADBYTE 0xDD /* dead (newly freed) memory */
+#define FORBIDDENBYTE 0xFD /* untouchable bytes at each end of a block */
static size_t serialno = 0; /* incremented on each debug {m,re}alloc */
}
-/* Heuristic checking if the memory has been freed. Rely on the debug hooks on
- Python memory allocators which fills the memory with DEADBYTE (0xDB) when
- memory is deallocated. */
-int
-_PyMem_IsFreed(void *ptr, size_t size)
-{
- unsigned char *bytes = ptr;
- for (size_t i=0; i < size; i++) {
- if (bytes[i] != DEADBYTE) {
- return 0;
- }
- }
- return 1;
-}
-
-
/* The debug free first checks the 2*SST bytes on each end for sanity (in
particular, that the FORBIDDENBYTEs with the api ID are still intact).
Then fills the original bytes with DEADBYTE.
if (len == -1)
return -1;
if (len > 1) {
- const char *msg = "expected at most 1 arguments, got %d";
+ const char *msg = "expected at most 1 arguments, got %zd";
PyErr_Format(PyExc_TypeError, msg, len);
return -1;
}
assert(args == NULL || PyTuple_Check(args));
len = (args != NULL) ? PyTuple_GET_SIZE(args) : 0;
if (len > 1) {
- const char *msg = "update() takes at most 1 positional argument (%d given)";
+ const char *msg = "update() takes at most 1 positional argument (%zd given)";
PyErr_Format(PyExc_TypeError, msg, len);
return NULL;
}
"rangeobject.count(value) -> integer -- return number of occurrences of value");
PyDoc_STRVAR(index_doc,
-"rangeobject.index(value, [start, [stop]]) -> integer -- return index of value.\n"
+"rangeobject.index(value) -> integer -- return index of value.\n"
"Raise ValueError if the value is not present.");
static PyMethodDef range_methods[] = {
static int
set_difference_update_internal(PySetObject *so, PyObject *other)
{
- if (PySet_GET_SIZE(so) == 0) {
- return 0;
- }
-
if ((PyObject *)so == other)
return set_clear_internal(so);
Py_ssize_t pos = 0, other_size;
int rv;
- if (PySet_GET_SIZE(so) == 0) {
- return set_copy(so);
- }
-
if (PyAnySet_Check(other)) {
other_size = PySet_GET_SIZE(other);
}
goto InvalidContinuation1;
} else if (ch == 0xF4 && ch2 >= 0x90) {
/* invalid sequence
- \xF4\x90\x80\80- -- 110000- overflow */
+ \xF4\x90\x80\x80- -- 110000- overflow */
goto InvalidContinuation1;
}
if (!IS_CONTINUATION_BYTE(ch3)) {
}
/* UTF-16 code pair: */
- if (q >= e)
- goto UnexpectedEnd;
if (!Py_UNICODE_IS_HIGH_SURROGATE(ch))
goto IllegalEncoding;
+ if (q >= e)
+ goto UnexpectedEnd;
ch2 = (q[ihi] << 8) | q[ilo];
q += 2;
if (!Py_UNICODE_IS_LOW_SURROGATE(ch2))
endofbuf= &buf[REPR_BUFFER_SIZE-5];
/* "typename(", limited to TYPE_MAXSIZE */
- len = strlen(typ->tp_name) > TYPE_MAXSIZE ? TYPE_MAXSIZE :
- strlen(typ->tp_name);
- strncpy(pbuf, typ->tp_name, len);
- pbuf += len;
+ assert(TYPE_MAXSIZE < sizeof(buf));
+ len = strlen(typ->tp_name);
+ if (len <= TYPE_MAXSIZE) {
+ strcpy(pbuf, typ->tp_name);
+ pbuf += len;
+ }
+ else {
+ strncpy(pbuf, typ->tp_name, TYPE_MAXSIZE);
+ pbuf += TYPE_MAXSIZE;
+ }
*pbuf++ = '(';
for (i=0; i < VISIBLE_SIZE(obj); i++) {
cname = typ->tp_members[i].name;
if (cname == NULL) {
- PyErr_Format(PyExc_SystemError, "In structseq_repr(), member %d name is NULL"
+ PyErr_Format(PyExc_SystemError, "In structseq_repr(), member %zd name is NULL"
" for type %.500s", i, typ->tp_name);
return NULL;
}
return tupleitem(self, i);
}
else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelength, cur, i;
+ Py_ssize_t start, stop, step, slicelength, i;
+ size_t cur;
PyObject* result;
PyObject* it;
PyObject **src, **dest;
static void
clear_slotdefs(void);
+static PyObject *
+lookup_maybe_method(PyObject *self, _Py_Identifier *attrid, int *unbound);
+
/*
* finds the beginning of the docstring's introspection signature.
* if present, returns a pointer pointing to the first '('.
Unset HAVE_VERSION_TAG and VALID_VERSION_TAG if the type
has a custom MRO that includes a type which is not officially
- super type.
+ super type, or if the type implements its own mro() method.
Called from mro_internal, which will subsequently be called on
each subclass when their mro is recursively updated.
*/
Py_ssize_t i, n;
- int clear = 0;
+ int custom = (Py_TYPE(type) != &PyType_Type);
+ int unbound;
+ PyObject *mro_meth = NULL;
+ PyObject *type_mro_meth = NULL;
if (!PyType_HasFeature(type, Py_TPFLAGS_HAVE_VERSION_TAG))
return;
+ if (custom) {
+ _Py_IDENTIFIER(mro);
+ mro_meth = lookup_maybe_method(
+ (PyObject *)type, &PyId_mro, &unbound);
+ if (mro_meth == NULL)
+ goto clear;
+ type_mro_meth = lookup_maybe_method(
+ (PyObject *)&PyType_Type, &PyId_mro, &unbound);
+ if (type_mro_meth == NULL)
+ goto clear;
+ if (mro_meth != type_mro_meth)
+ goto clear;
+ Py_XDECREF(mro_meth);
+ Py_XDECREF(type_mro_meth);
+ }
n = PyTuple_GET_SIZE(bases);
for (i = 0; i < n; i++) {
PyObject *b = PyTuple_GET_ITEM(bases, i);
if (!PyType_HasFeature(cls, Py_TPFLAGS_HAVE_VERSION_TAG) ||
!PyType_IsSubtype(type, cls)) {
- clear = 1;
- break;
+ goto clear;
}
}
-
- if (clear)
- type->tp_flags &= ~(Py_TPFLAGS_HAVE_VERSION_TAG|
- Py_TPFLAGS_VALID_VERSION_TAG);
+ return;
+ clear:
+ Py_XDECREF(mro_meth);
+ Py_XDECREF(type_mro_meth);
+ type->tp_flags &= ~(Py_TPFLAGS_HAVE_VERSION_TAG|
+ Py_TPFLAGS_VALID_VERSION_TAG);
}
static int
static PyMemberDef type_members[] = {
{"__basicsize__", T_PYSSIZET, offsetof(PyTypeObject,tp_basicsize),READONLY},
{"__itemsize__", T_PYSSIZET, offsetof(PyTypeObject, tp_itemsize), READONLY},
- {"__flags__", T_LONG, offsetof(PyTypeObject, tp_flags), READONLY},
- {"__weakrefoffset__", T_LONG,
+ {"__flags__", T_ULONG, offsetof(PyTypeObject, tp_flags), READONLY},
+ {"__weakrefoffset__", T_PYSSIZET,
offsetof(PyTypeObject, tp_weaklistoffset), READONLY},
{"__base__", T_OBJECT, offsetof(PyTypeObject, tp_base), READONLY},
- {"__dictoffset__", T_LONG,
+ {"__dictoffset__", T_PYSSIZET,
offsetof(PyTypeObject, tp_dictoffset), READONLY},
{"__mro__", T_OBJECT, offsetof(PyTypeObject, tp_mro), READONLY},
{0}
size_t len = strlen(old_doc)+1;
char *tp_doc = PyObject_MALLOC(len);
if (tp_doc == NULL) {
+ type->tp_doc = NULL;
PyErr_NoMemory();
goto fail;
}
res = (*func)(self);
if (res == -1 && PyErr_Occurred())
return NULL;
- return PyLong_FromLong((long)res);
+ return PyLong_FromSsize_t(res);
}
static PyObject *
endinpos = startinpos + 1;
break;
case 2:
+ if (consumed && (unsigned char)s[0] == 0xED && end - s == 2
+ && (unsigned char)s[1] >= 0xA0 && (unsigned char)s[1] <= 0xBF)
+ {
+ /* Truncated surrogate code in range D800-DFFF */
+ goto End;
+ }
+ /* fall through */
case 3:
case 4:
errmsg = "invalid continuation byte";
const char *in,
int insize)
{
- const DWORD flags = decode_code_page_flags(code_page);
+ DWORD flags = MB_ERR_INVALID_CHARS;
wchar_t *out;
DWORD outsize;
/* First get the size of the result */
assert(insize > 0);
- outsize = MultiByteToWideChar(code_page, flags, in, insize, NULL, 0);
- if (outsize <= 0)
- goto error;
+ while ((outsize = MultiByteToWideChar(code_page, flags,
+ in, insize, NULL, 0)) <= 0)
+ {
+ if (!flags || GetLastError() != ERROR_INVALID_FLAGS) {
+ goto error;
+ }
+ /* For some code pages (e.g. UTF-7) flags must be set to 0. */
+ flags = 0;
+ }
if (*v == NULL) {
/* Create unicode object */
{
const char *startin = in;
const char *endin = in + size;
- const DWORD flags = decode_code_page_flags(code_page);
+ DWORD flags = MB_ERR_INVALID_CHARS;
/* Ideally, we should get reason from FormatMessage. This is the Windows
2000 English version of the message. */
const char *reason = "No mapping for the Unicode character exists "
if (outsize > 0)
break;
err = GetLastError();
+ if (err == ERROR_INVALID_FLAGS && flags) {
+ /* For some code pages (e.g. UTF-7) flags must be set to 0. */
+ flags = 0;
+ continue;
+ }
if (err != ERROR_NO_UNICODE_TRANSLATION
&& err != ERROR_INSUFFICIENT_BUFFER)
{
i += PyUnicode_GET_LENGTH(self);
return unicode_getitem(self, i);
} else if (PySlice_Check(item)) {
- Py_ssize_t start, stop, step, slicelength, cur, i;
+ Py_ssize_t start, stop, step, slicelength, i;
+ size_t cur;
PyObject *result;
void *src_data, *dest_data;
int src_kind, dest_kind;
wchar_t program_full_path[MAXPATHLEN+1];
memset(program_full_path, 0, sizeof(program_full_path));
+ if (!GetModuleFileNameW(NULL, program_full_path, MAXPATHLEN)) {
+ /* GetModuleFileName should never fail when passed NULL */
+ return _Py_INIT_ERR("Cannot determine program path");
+ }
+
/* The launcher may need to force the executable path to a
* different environment, so override it here. */
pyvenv_launcher = _wgetenv(L"__PYVENV_LAUNCHER__");
if (pyvenv_launcher && pyvenv_launcher[0]) {
+ _wputenv_s(L"__PYVENV_BASE_EXECUTABLE__", program_full_path);
wcscpy_s(program_full_path, MAXPATHLEN+1, pyvenv_launcher);
- } else if (!GetModuleFileNameW(NULL, program_full_path, MAXPATHLEN)) {
- /* GetModuleFileName should never fail when passed NULL */
- return _Py_INIT_ERR("Cannot determine program path");
+ /* bpo-35873: Clear the environment variable to avoid it being
+ * inherited by child processes. */
+ _wputenv_s(L"__PYVENV_LAUNCHER__", L"");
}
config->program_full_path = PyMem_RawMalloc(
}
child_command = calloc(child_command_size, sizeof(wchar_t));
if (child_command == NULL)
- error(RC_CREATE_PROCESS, L"unable to allocate %d bytes for child command.",
+ error(RC_CREATE_PROCESS, L"unable to allocate %zd bytes for child command.",
child_command_size);
if (no_suffix)
_snwprintf_s(child_command, child_command_size,
if (rc == 0) {
read = fread(buffer, sizeof(char), BUFSIZE, fp);
- debug(L"maybe_handle_shebang: read %d bytes\n", read);
+ debug(L"maybe_handle_shebang: read %zd bytes\n", read);
fclose(fp);
if ((read >= 4) && (buffer[3] == '\n') && (buffer[2] == '\r')) {
bom = BOMs; /* points to UTF-8 entry - the default */
}
else {
- debug(L"maybe_handle_shebang: BOM found, code page %d\n",
+ debug(L"maybe_handle_shebang: BOM found, code page %u\n",
bom->code_page);
start = &buffer[bom->length];
}
"SysVersion": VER_DOT,
"Version": "{}.{}.{}".format(VER_MAJOR, VER_MINOR, VER_MICRO),
"InstallPath": {
- # I have no idea why the trailing spaces are needed, but they seem to be needed.
- "": "[{AppVPackageRoot}][ ]",
- "ExecutablePath": "[{AppVPackageRoot}]python.exe[ ]",
- "WindowedExecutablePath": "[{AppVPackageRoot}]pythonw.exe[ ]",
+ "": "[{AppVPackageRoot}]",
+ "ExecutablePath": "[{AppVPackageRoot}]\\python.exe",
+ "WindowedExecutablePath": "[{AppVPackageRoot}]\\pythonw.exe",
},
"Help": {
"Main Python Documentation": {
"_condition": lambda ns: ns.include_chm,
- "": "[{{AppVPackageRoot}}]Doc\\{}[ ]".format(
+ "": "[{{AppVPackageRoot}}]\\Doc\\{}".format(
PYTHON_CHM_NAME
),
},
"Local Python Documentation": {
"_condition": lambda ns: ns.include_html_doc,
- "": "[{AppVPackageRoot}]Doc\\html\\index.html[ ]",
+ "": "[{AppVPackageRoot}]\\Doc\\html\\index.html",
},
"Online Python Documentation": {
"": "https://docs.python.org/{}".format(VER_DOT)
},
"Idle": {
"_condition": lambda ns: ns.include_idle,
- "": "[{AppVPackageRoot}]Lib\\idlelib\\idle.pyw[ ]",
+ "": "[{AppVPackageRoot}]\\Lib\\idlelib\\idle.pyw",
},
}
}
},
"nuget": {
"help": "nuget package",
- "options": ["stable", "pip", "distutils", "dev", "props"],
+ "options": [
+ "dev",
+ "tools",
+ "pip",
+ "stable",
+ "distutils",
+ "venv",
+ "props"
+ ],
},
"default": {
"help": "development kit package",
#define WIN32_LEAN_AND_MEAN
#include <Windows.h>
#include <shellapi.h>
+#include <shlobj.h>
+
+#include <string>
#include <winrt\Windows.ApplicationModel.h>
#include <winrt\Windows.Storage.h>
#endif
#endif
-static void
-set_user_base()
+static std::wstring
+get_user_base()
{
- wchar_t envBuffer[2048];
try {
const auto appData = winrt::Windows::Storage::ApplicationData::Current();
if (appData) {
const auto localCache = appData.LocalCacheFolder();
if (localCache) {
auto path = localCache.Path();
- if (!path.empty() &&
- !wcscpy_s(envBuffer, path.c_str()) &&
- !wcscat_s(envBuffer, L"\\local-packages")
- ) {
- _wputenv_s(L"PYTHONUSERBASE", envBuffer);
+ if (!path.empty()) {
+ return std::wstring(path) + L"\\local-packages";
}
}
}
} catch (...) {
}
+ return std::wstring();
}
-static const wchar_t *
-get_argv0(const wchar_t *argv0)
+static std::wstring
+get_package_family()
{
- winrt::hstring installPath;
- const wchar_t *launcherPath;
- wchar_t *buffer;
- size_t len;
-
- launcherPath = _wgetenv(L"__PYVENV_LAUNCHER__");
- if (launcherPath && launcherPath[0]) {
- len = wcslen(launcherPath) + 1;
- buffer = (wchar_t *)malloc(sizeof(wchar_t) * len);
- if (!buffer) {
- Py_FatalError("out of memory");
- return NULL;
- }
- if (wcscpy_s(buffer, len, launcherPath)) {
- Py_FatalError("failed to copy to buffer");
- return NULL;
+ try {
+ const auto package = winrt::Windows::ApplicationModel::Package::Current();
+ if (package) {
+ const auto id = package.Id();
+ if (id) {
+ return std::wstring(id.FamilyName());
+ }
}
- return buffer;
}
+ catch (...) {
+ }
+
+ return std::wstring();
+}
+static std::wstring
+get_package_home()
+{
try {
const auto package = winrt::Windows::ApplicationModel::Package::Current();
if (package) {
- const auto install = package.InstalledLocation();
- if (install) {
- installPath = install.Path();
+ const auto path = package.InstalledLocation();
+ if (path) {
+ return std::wstring(path.Path());
}
}
}
catch (...) {
}
- if (!installPath.empty()) {
- len = installPath.size() + wcslen(PROGNAME) + 2;
- } else {
- len = wcslen(argv0) + wcslen(PROGNAME) + 1;
- }
-
- buffer = (wchar_t *)malloc(sizeof(wchar_t) * len);
- if (!buffer) {
- Py_FatalError("out of memory");
- return NULL;
- }
+ return std::wstring();
+}
- if (!installPath.empty()) {
- if (wcscpy_s(buffer, len, installPath.c_str())) {
- Py_FatalError("failed to copy to buffer");
- return NULL;
- }
- if (wcscat_s(buffer, len, L"\\")) {
- Py_FatalError("failed to concatenate backslash");
- return NULL;
- }
- } else {
- if (wcscpy_s(buffer, len, argv0)) {
- Py_FatalError("failed to copy argv[0]");
- return NULL;
+static int
+set_process_name()
+{
+ const auto home = get_package_home();
+ const auto family = get_package_family();
+
+ std::wstring executable;
+
+ /* If inside a package, use user's symlink name for executable */
+ if (!family.empty()) {
+ PWSTR localAppData;
+ if (SUCCEEDED(SHGetKnownFolderPath(FOLDERID_LocalAppData, 0,
+ NULL, &localAppData))) {
+ executable = std::wstring(localAppData)
+ + L"\\Microsoft\\WindowsApps\\"
+ + family
+ + L"\\"
+ + PROGNAME;
+
+ CoTaskMemFree(localAppData);
}
+ }
- wchar_t *name = wcsrchr(buffer, L'\\');
- if (name) {
- name[1] = L'\0';
- } else {
- buffer[0] = L'\0';
+ /* Only use module filename if we don't have a home */
+ if (home.empty() && executable.empty()) {
+ executable.resize(MAX_PATH);
+ while (true) {
+ DWORD len = GetModuleFileNameW(
+ NULL, executable.data(), (DWORD)executable.size());
+ if (len == 0) {
+ executable.clear();
+ break;
+ } else if (len == executable.size() &&
+ GetLastError() == ERROR_INSUFFICIENT_BUFFER) {
+ executable.resize(len * 2);
+ } else {
+ executable.resize(len);
+ break;
+ }
}
}
- if (wcscat_s(buffer, len, PROGNAME)) {
- Py_FatalError("failed to concatenate program name");
- return NULL;
+ if (!home.empty()) {
+ Py_SetPythonHome(home.c_str());
}
- return buffer;
-}
-
-static wchar_t *
-get_process_name()
-{
- DWORD bufferLen = MAX_PATH;
- DWORD len = bufferLen;
- wchar_t *r = NULL;
-
- while (!r) {
- r = (wchar_t *)malloc(bufferLen * sizeof(wchar_t));
- if (!r) {
- Py_FatalError("out of memory");
- return NULL;
- }
- len = GetModuleFileNameW(NULL, r, bufferLen);
- if (len == 0) {
- free((void *)r);
- return NULL;
- } else if (len == bufferLen &&
- GetLastError() == ERROR_INSUFFICIENT_BUFFER) {
- free(r);
- r = NULL;
- bufferLen *= 2;
+ const wchar_t *launcherPath = _wgetenv(L"__PYVENV_LAUNCHER__");
+ if (launcherPath) {
+ if (!executable.empty()) {
+ _wputenv_s(L"__PYVENV_BASE_EXECUTABLE__", executable.c_str());
}
+ _Py_SetProgramFullPath(launcherPath);
+ /* bpo-35873: Clear the environment variable to avoid it being
+ * inherited by child processes. */
+ _wputenv_s(L"__PYVENV_LAUNCHER__", L"");
+ } else if (!executable.empty()) {
+ _Py_SetProgramFullPath(executable.c_str());
}
- return r;
+ return 1;
}
int
wmain(int argc, wchar_t **argv)
{
- const wchar_t **new_argv;
- int new_argc;
- const wchar_t *exeName;
-
- new_argc = argc;
- new_argv = (const wchar_t**)malloc(sizeof(wchar_t *) * (argc + 2));
- if (new_argv == NULL) {
- Py_FatalError("out of memory");
- return -1;
+ if (!set_process_name()) {
+ return 121;
}
-
- exeName = get_process_name();
-
- new_argv[0] = get_argv0(exeName ? exeName : argv[0]);
- for (int i = 1; i < argc; ++i) {
- new_argv[i] = argv[i];
+ const wchar_t *p = _wgetenv(L"PYTHONUSERBASE");
+ if (!p || !*p) {
+ _wputenv_s(L"PYTHONUSERBASE", get_user_base().c_str());
}
- set_user_base();
+ p = wcsrchr(argv[0], L'\\');
+ if (!p) {
+ p = argv[0];
+ }
+ if (p) {
+ if (*p == L'\\') {
+ p++;
+ }
- if (exeName) {
- const wchar_t *p = wcsrchr(exeName, L'\\');
- if (p) {
- const wchar_t *moduleName = NULL;
- if (*p++ == L'\\') {
- if (wcsnicmp(p, L"pip", 3) == 0) {
- moduleName = L"pip";
- _wputenv_s(L"PIP_USER", L"true");
- }
- else if (wcsnicmp(p, L"idle", 4) == 0) {
- moduleName = L"idlelib";
- }
- }
+ const wchar_t *moduleName = NULL;
+ if (wcsnicmp(p, L"pip", 3) == 0) {
+ moduleName = L"pip";
+ /* No longer required when pip 19.1 is added */
+ _wputenv_s(L"PIP_USER", L"true");
+ } else if (wcsnicmp(p, L"idle", 4) == 0) {
+ moduleName = L"idlelib";
+ }
- if (moduleName) {
- new_argc += 2;
- for (int i = argc; i >= 1; --i) {
- new_argv[i + 2] = new_argv[i];
- }
- new_argv[1] = L"-m";
- new_argv[2] = moduleName;
+ if (moduleName) {
+ /* Not even pretending we're going to free this memory.
+             * The OS will clean it all up when our process exits.
+ */
+ wchar_t **new_argv = (wchar_t **)PyMem_RawMalloc((argc + 2) * sizeof(wchar_t *));
+ new_argv[0] = argv[0];
+ new_argv[1] = _PyMem_RawWcsdup(L"-m");
+ new_argv[2] = _PyMem_RawWcsdup(moduleName);
+ for (int i = 1; i < argc; ++i) {
+ new_argv[i + 2] = argv[i];
}
+ argv = new_argv;
+ argc += 2;
}
}
- /* Override program_full_path from here so that
- sys.executable is set correctly. */
- _Py_SetProgramFullPath(new_argv[0]);
-
- int result = Py_Main(new_argc, (wchar_t **)new_argv);
-
- free((void *)exeName);
- free((void *)new_argv);
-
- return result;
+ return Py_Main(argc, (wchar_t**)argv);
}
#ifdef PYTHONW
Q = data + len;
for (P = data, i = 0; P < Q && *P != '\0'; P++, i++) {
str[i] = P;
- for(; *P != '\0'; P++)
+ for (; P < Q && *P != '\0'; P++)
;
}
}
}
Py_BEGIN_ALLOW_THREADS
- rc = RegSetValueW(key, sub_key, REG_SZ, value, value_length+1);
+ rc = RegSetValueW(key, sub_key, REG_SZ, value, (DWORD)(value_length + 1));
Py_END_ALLOW_THREADS
if (rc != ERROR_SUCCESS)
return PyErr_SetFromWindowsErrWithFunction(rc, "RegSetValue");
setlocal\r
rem Simple script to fetch source for external libraries\r
\r
-if "%PCBUILD%"=="" (set PCBUILD=%~dp0)\r
-if "%EXTERNALS_DIR%"=="" (set EXTERNALS_DIR=%PCBUILD%\..\externals)\r
+if NOT DEFINED PCBUILD (set PCBUILD=%~dp0)\r
+if NOT DEFINED EXTERNALS_DIR (set EXTERNALS_DIR=%PCBUILD%\..\externals)\r
\r
set DO_FETCH=true\r
set DO_CLEAN=false\r
if "%ORG%"=="" (set ORG=python)\r
call "%PCBUILD%\find_python.bat" "%PYTHON%"\r
\r
-if "%PYTHON%"=="" (\r
+if NOT DEFINED PYTHON (\r
where /Q git || echo Python 3.6 could not be found or installed, and git.exe is not on your PATH && exit /B 1\r
)\r
\r
\r
set libraries=\r
set libraries=%libraries% bzip2-1.0.6\r
-if NOT "%IncludeSSLSrc%"=="false" set libraries=%libraries% openssl-1.1.0j\r
-set libraries=%libraries% sqlite-3.21.0.0\r
+if NOT "%IncludeSSLSrc%"=="false" set libraries=%libraries% openssl-1.1.1c\r
+set libraries=%libraries% sqlite-3.28.0.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tcl-core-8.6.9.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tk-8.6.9.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tix-8.4.3.6\r
for %%e in (%libraries%) do (\r
if exist "%EXTERNALS_DIR%\%%e" (\r
echo.%%e already exists, skipping.\r
- ) else if "%PYTHON%"=="" (\r
+ ) else if NOT DEFINED PYTHON (\r
echo.Fetching %%e with git...\r
git clone --depth 1 https://github.com/%ORG%/cpython-source-deps --branch %%e "%EXTERNALS_DIR%\%%e"\r
) else (\r
echo.Fetching external binaries...\r
\r
set binaries=\r
-if NOT "%IncludeSSL%"=="false" set binaries=%binaries% openssl-bin-1.1.0j\r
+if NOT "%IncludeSSL%"=="false" set binaries=%binaries% openssl-bin-1.1.1c\r
if NOT "%IncludeTkinter%"=="false" set binaries=%binaries% tcltk-8.6.9.0\r
if NOT "%IncludeSSLSrc%"=="false" set binaries=%binaries% nasm-2.11.06\r
\r
for %%b in (%binaries%) do (\r
if exist "%EXTERNALS_DIR%\%%b" (\r
echo.%%b already exists, skipping.\r
- ) else if "%PYTHON%"=="" (\r
+ ) else if NOT DEFINED PYTHON (\r
echo.Fetching %%b with git...\r
git clone --depth 1 https://github.com/%ORG%/cpython-bin-deps --branch %%b "%EXTERNALS_DIR%\%%b"\r
) else (\r
</ItemDefinitionGroup>\r
<PropertyGroup>\r
<_DLLSuffix>-1_1</_DLLSuffix>\r
- <_DLLSuffix Condition="$(Platform) == 'x64'">$(_DLLSuffix)-x64</_DLLSuffix>\r
+ <_DLLSuffix Condition="$(Platform) == 'ARM'">$(_DLLSuffix)-arm</_DLLSuffix>\r
+ <_DLLSuffix Condition="$(Platform) == 'ARM64'">$(_DLLSuffix)-arm64</_DLLSuffix>\r
</PropertyGroup>\r
<ItemGroup>\r
<_SSLDLL Include="$(opensslOutDir)\libcrypto$(_DLLSuffix).dll" />\r
<?xml version="1.0" encoding="utf-8"?>\r
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">\r
<ItemGroup Label="ProjectConfigurations">\r
- <ProjectConfiguration Include="Debug|Win32">\r
- <Configuration>Debug</Configuration>\r
- <Platform>Win32</Platform>\r
- </ProjectConfiguration>\r
<ProjectConfiguration Include="Release|Win32">\r
<Configuration>Release</Configuration>\r
<Platform>Win32</Platform>\r
</ProjectConfiguration>\r
- <ProjectConfiguration Include="PGInstrument|Win32">\r
- <Configuration>PGInstrument</Configuration>\r
- <Platform>Win32</Platform>\r
- </ProjectConfiguration>\r
- <ProjectConfiguration Include="PGInstrument|x64">\r
- <Configuration>PGInstrument</Configuration>\r
- <Platform>x64</Platform>\r
- </ProjectConfiguration>\r
- <ProjectConfiguration Include="PGUpdate|Win32">\r
- <Configuration>PGUpdate</Configuration>\r
- <Platform>Win32</Platform>\r
- </ProjectConfiguration>\r
- <ProjectConfiguration Include="PGUpdate|x64">\r
- <Configuration>PGUpdate</Configuration>\r
+ <ProjectConfiguration Include="Release|x64">\r
+ <Configuration>Release</Configuration>\r
<Platform>x64</Platform>\r
</ProjectConfiguration>\r
- <ProjectConfiguration Include="Debug|x64">\r
- <Configuration>Debug</Configuration>\r
- <Platform>x64</Platform>\r
+ <ProjectConfiguration Include="Release|ARM">\r
+ <Configuration>Release</Configuration>\r
+ <Platform>ARM</Platform>\r
</ProjectConfiguration>\r
- <ProjectConfiguration Include="Release|x64">\r
+ <ProjectConfiguration Include="Release|ARM64">\r
<Configuration>Release</Configuration>\r
- <Platform>x64</Platform>\r
+ <Platform>ARM64</Platform>\r
</ProjectConfiguration>\r
</ItemGroup>\r
<PropertyGroup Label="Globals">\r
\r
<Import Project="python.props" />\r
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />\r
- \r
- <PropertyGroup Label="Configuration">\r
+\r
+ <PropertyGroup Label="Configuration" Condition="$(Platform) == 'Win32'">\r
<ConfigurationType>Makefile</ConfigurationType>\r
<Bitness>32</Bitness>\r
- <Bitness Condition="$(Platform) == 'x64'">64</Bitness>\r
<ArchName>x86</ArchName>\r
- <ArchName Condition="$(Platform) == 'x64'">amd64</ArchName>\r
<OpenSSLPlatform>VC-WIN32</OpenSSLPlatform>\r
- <OpenSSLPlatform Condition="$(Platform) == 'x64'">VC-WIN64A</OpenSSLPlatform>\r
+ <SupportSigning>true</SupportSigning>\r
+ </PropertyGroup>\r
+\r
+ <PropertyGroup Label="Configuration" Condition="$(Platform) == 'x64'">\r
+ <ConfigurationType>Makefile</ConfigurationType>\r
+ <Bitness>64</Bitness>\r
+ <ArchName>amd64</ArchName>\r
+ <OpenSSLPlatform>VC-WIN64A-masm</OpenSSLPlatform>\r
+ <SupportSigning>true</SupportSigning>\r
+ </PropertyGroup>\r
+\r
+ <PropertyGroup Label="Configuration" Condition="$(Platform) == 'ARM'">\r
+ <ConfigurationType>Makefile</ConfigurationType>\r
+ <Bitness>ARM</Bitness>\r
+ <ArchName>ARM</ArchName>\r
+ <OpenSSLPlatform>VC-WIN32-ARM</OpenSSLPlatform>\r
+ <SupportSigning>true</SupportSigning>\r
+ </PropertyGroup>\r
+\r
+ <PropertyGroup Label="Configuration" Condition="$(Platform) == 'ARM64'">\r
+ <ConfigurationType>Makefile</ConfigurationType>\r
+ <Bitness>ARM64</Bitness>\r
+ <ArchName>ARM64</ArchName>\r
+ <OpenSSLPlatform>VC-WIN64-ARM</OpenSSLPlatform>\r
<SupportSigning>true</SupportSigning>\r
</PropertyGroup>\r
\r
call "%PCBUILD%\find_python.bat" "%PYTHON%"\r
if ERRORLEVEL 1 (echo Cannot locate python.exe on PATH or as PYTHON variable & exit /b 3)\r
\r
-call "%PCBUILD%\get_externals.bat" --openssl-src %ORG_SETTING%\r
+call "%PCBUILD%\get_externals.bat" --openssl-src --no-openssl %ORG_SETTING%\r
\r
if "%PERL%" == "" where perl > "%TEMP%\perl.loc" 2> nul && set /P PERL= <"%TEMP%\perl.loc" & del "%TEMP%\perl.loc"\r
if "%PERL%" == "" (echo Cannot locate perl.exe on PATH or as PERL variable & exit /b 4)\r
if errorlevel 1 exit /b\r
%MSBUILD% "%PCBUILD%\openssl.vcxproj" /p:Configuration=Release /p:Platform=x64\r
if errorlevel 1 exit /b\r
+%MSBUILD% "%PCBUILD%\openssl.vcxproj" /p:Configuration=Release /p:Platform=ARM\r
+if errorlevel 1 exit /b\r
+%MSBUILD% "%PCBUILD%\openssl.vcxproj" /p:Configuration=Release /p:Platform=ARM64\r
+if errorlevel 1 exit /b\r
\r
<ExternalsDir>$(EXTERNALS_DIR)</ExternalsDir>\r
<ExternalsDir Condition="$(ExternalsDir) == ''">$([System.IO.Path]::GetFullPath(`$(PySourcePath)externals`))</ExternalsDir>\r
<ExternalsDir Condition="!HasTrailingSlash($(ExternalsDir))">$(ExternalsDir)\</ExternalsDir>\r
- <sqlite3Dir>$(ExternalsDir)sqlite-3.21.0.0\</sqlite3Dir>\r
+ <sqlite3Dir>$(ExternalsDir)sqlite-3.28.0.0\</sqlite3Dir>\r
<bz2Dir>$(ExternalsDir)bzip2-1.0.6\</bz2Dir>\r
<lzmaDir>$(ExternalsDir)xz-5.2.2\</lzmaDir>\r
- <opensslDir>$(ExternalsDir)openssl-1.1.0j\</opensslDir>\r
- <opensslOutDir>$(ExternalsDir)openssl-bin-1.1.0j\$(ArchName)\</opensslOutDir>\r
+ <opensslDir>$(ExternalsDir)openssl-1.1.1c\</opensslDir>\r
+ <opensslOutDir>$(ExternalsDir)openssl-bin-1.1.1c\$(ArchName)\</opensslOutDir>\r
<opensslIncludeDir>$(opensslOutDir)include</opensslIncludeDir>\r
<nasmDir>$(ExternalsDir)\nasm-2.11.06\</nasmDir>\r
<zlibDir>$(ExternalsDir)\zlib-1.2.11\</zlibDir>\r
Homepage:\r
http://tukaani.org/xz/\r
_ssl\r
- Python wrapper for version 1.1.0h of the OpenSSL secure sockets\r
+ Python wrapper for version 1.1.1c of the OpenSSL secure sockets\r
library, which is downloaded from our binaries repository at\r
https://github.com/python/cpython-bin-deps.\r
\r
again when building.\r
\r
_sqlite3\r
- Wraps SQLite 3.21.0.0, which is itself built by sqlite3.vcxproj\r
+ Wraps SQLite 3.28.0.0, which is itself built by sqlite3.vcxproj\r
Homepage:\r
http://www.sqlite.org/\r
_tkinter\r
}
*current = '\0';
final_length = current - buf + 1;
- if (final_length < needed_length && final_length)
+ if (final_length < needed_length && final_length) {
/* should never fail */
- buf = PyMem_REALLOC(buf, final_length);
+ char* result = PyMem_REALLOC(buf, final_length);
+ if (result == NULL) {
+ PyMem_FREE(buf);
+ }
+ buf = result;
+ }
return buf;
}
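The hunk above fixes the classic realloc leak: overwriting the only pointer with the return value of `PyMem_REALLOC` loses the original allocation when the call fails. A minimal plain-C sketch of the corrected pattern (hypothetical `shrink_buffer` helper):

```c
#include <stdlib.h>

/* Shrink buf to final_length. On realloc failure the original buffer is
   freed and NULL is returned, instead of leaking it by overwriting the
   only pointer to it (the bug the patch fixes). */
static char *shrink_buffer(char *buf, size_t final_length)
{
    char *result = realloc(buf, final_length);
    if (result == NULL) {
        free(buf);   /* realloc with n > 0 failed: buf is still valid */
        return NULL;
    }
    return result;
}
```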
break;
start--;
}
- cols += substr - start;
+ cols += (int)(substr - start);
/* Fix lineno in multiline strings. */
while ((substr = strchr(substr + 1, '\n')))
lines--;
}
if (PyLong_CheckExact(item)) {
long b = PyLong_AsLongAndOverflow(item, &overflow);
- long x = i_result + b;
- if (overflow == 0 && ((x^i_result) >= 0 || (x^b) >= 0)) {
- i_result = x;
+ if (overflow == 0 &&
+ (i_result >= 0 ? (b <= LONG_MAX - i_result)
+ : (b >= LONG_MIN - i_result)))
+ {
+ i_result += b;
Py_DECREF(item);
continue;
}
}
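The old check `(x^i_result) >= 0 || (x^b) >= 0` computed the sum first and only then inspected the sign bits, which is undefined behavior once the signed addition overflows. The replacement tests against `LONG_MAX`/`LONG_MIN` before adding. As a standalone sketch (hypothetical `add_no_overflow` helper):

```c
#include <limits.h>

/* Add b to *acc only when the result fits in a long; the bounds are
   checked before the addition, so no signed overflow (undefined
   behavior) can occur. Returns 1 on success, 0 if the sum would
   overflow and leaves *acc unchanged. */
static int add_no_overflow(long *acc, long b)
{
    if (*acc >= 0 ? (b <= LONG_MAX - *acc)
                  : (b >= LONG_MIN - *acc)) {
        *acc += b;
        return 1;
    }
    return 0;
}
```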
x = PyImport_GetModule(fullmodname);
Py_DECREF(fullmodname);
- if (x == NULL) {
+ if (x == NULL && !PyErr_Occurred()) {
goto error;
}
Py_DECREF(pkgname);
"cannot import name %R from %R (unknown location)",
name, pkgname_or_unknown
);
- /* NULL check for errmsg done by PyErr_SetImportError. */
+ /* NULL checks for errmsg and pkgname done by PyErr_SetImportError. */
PyErr_SetImportError(errmsg, pkgname, NULL);
}
else {
"cannot import name %R from %R (%S)",
name, pkgname_or_unknown, pkgpath
);
- /* NULL check for errmsg done by PyErr_SetImportError. */
+ /* NULL checks for errmsg and pkgname done by PyErr_SetImportError. */
PyErr_SetImportError(errmsg, pkgname, pkgpath);
}
return 1;
}
+// Return 1 if the method call was optimized, -1 if not, and 0 on error.
static int
maybe_optimize_method_call(struct compiler *c, expr_ty e)
{
static int
compiler_call(struct compiler *c, expr_ty e)
{
- if (maybe_optimize_method_call(c, e) > 0)
- return 1;
-
+ int ret = maybe_optimize_method_call(c, e);
+ if (ret >= 0) {
+ return ret;
+ }
VISIT(c, expr, e->v.Call.func);
return compiler_call_helper(c, 0,
e->v.Call.args,
ULong y, z, abs_exp;
Long L;
BCinfo bc;
- Bigint *bb, *bb1, *bd, *bd0, *bs, *delta;
+ Bigint *bb = NULL, *bd = NULL, *bd0 = NULL, *bs = NULL, *delta = NULL;
size_t ndigits, fraclen;
+ double result;
dval(&rv) = 0.;
if (k > 9) {
dval(&rv) = tens[k - 9] * dval(&rv) + z;
}
- bd0 = 0;
if (nd <= DBL_DIG
&& Flt_Rounds == 1
) {
bd = Balloc(bd0->k);
if (bd == NULL) {
- Bfree(bd0);
goto failed_malloc;
}
Bcopy(bd, bd0);
bb = sd2b(&rv, bc.scale, &bbe); /* srv = bb * 2^bbe */
if (bb == NULL) {
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
/* Record whether lsb of bb is odd, in case we need this
/* tdv = bd * 10**e; srv = bb * 2**bbe */
bs = i2b(1);
if (bs == NULL) {
- Bfree(bb);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
if (bb5 > 0) {
bs = pow5mult(bs, bb5);
if (bs == NULL) {
- Bfree(bb);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
- bb1 = mult(bs, bb);
+ Bigint *bb1 = mult(bs, bb);
Bfree(bb);
bb = bb1;
if (bb == NULL) {
- Bfree(bs);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
}
if (bb2 > 0) {
bb = lshift(bb, bb2);
if (bb == NULL) {
- Bfree(bs);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
}
if (bd5 > 0) {
bd = pow5mult(bd, bd5);
if (bd == NULL) {
- Bfree(bb);
- Bfree(bs);
- Bfree(bd0);
goto failed_malloc;
}
}
if (bd2 > 0) {
bd = lshift(bd, bd2);
if (bd == NULL) {
- Bfree(bb);
- Bfree(bs);
- Bfree(bd0);
goto failed_malloc;
}
}
if (bs2 > 0) {
bs = lshift(bs, bs2);
if (bs == NULL) {
- Bfree(bb);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
}
delta = diff(bb, bd);
if (delta == NULL) {
- Bfree(bb);
- Bfree(bs);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
dsign = delta->sign;
}
delta = lshift(delta,Log2P);
if (delta == NULL) {
- Bfree(bb);
- Bfree(bs);
- Bfree(bd);
- Bfree(bd0);
goto failed_malloc;
}
if (cmp(delta, bs) > 0)
if ((word0(&rv) & Exp_mask) >=
Exp_msk1*(DBL_MAX_EXP+Bias-P)) {
if (word0(&rv0) == Big0 && word1(&rv0) == Big1) {
- Bfree(bb);
- Bfree(bd);
- Bfree(bs);
- Bfree(bd0);
- Bfree(delta);
goto ovfl;
}
word0(&rv) = Big0;
}
}
cont:
- Bfree(bb);
- Bfree(bd);
- Bfree(bs);
- Bfree(delta);
+ Bfree(bb); bb = NULL;
+ Bfree(bd); bd = NULL;
+ Bfree(bs); bs = NULL;
+ Bfree(delta); delta = NULL;
}
- Bfree(bb);
- Bfree(bd);
- Bfree(bs);
- Bfree(bd0);
- Bfree(delta);
if (bc.nd > nd) {
error = bigcomp(&rv, s0, &bc);
if (error)
}
ret:
- return sign ? -dval(&rv) : dval(&rv);
+ result = sign ? -dval(&rv) : dval(&rv);
+ goto done;
parse_error:
- return 0.0;
+ result = 0.0;
+ goto done;
failed_malloc:
errno = ENOMEM;
- return -1.0;
+ result = -1.0;
+ goto done;
undfl:
- return sign ? -0.0 : 0.0;
+ result = sign ? -0.0 : 0.0;
+ goto done;
ovfl:
errno = ERANGE;
/* Can't trust HUGE_VAL */
word0(&rv) = Exp_mask;
word1(&rv) = 0;
- return sign ? -dval(&rv) : dval(&rv);
+ result = sign ? -dval(&rv) : dval(&rv);
+ goto done;
+
+ done:
+ Bfree(bb);
+ Bfree(bd);
+ Bfree(bs);
+ Bfree(bd0);
+ Bfree(delta);
+ return result;
}
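The `_Py_dg_strtod` hunk replaces per-error-path `Bfree` calls with NULL-initialized pointers and a single `done:` label; since `Bfree(NULL)` is a no-op, every exit path can fall through one cleanup block. The same idiom in miniature, using `free` (which also accepts NULL) and a hypothetical `muldiv` function:

```c
#include <stdlib.h>

/* Compute (a*b)/c via heap temporaries, with all error paths funneled
   through one cleanup label. The pointers start NULL and free(NULL) is
   a no-op, so the cleanup block is safe no matter where we jump from.
   Returns 0 on success, -1 on error. */
static int muldiv(int a, int b, int c, int *out)
{
    int *prod = NULL, *quot = NULL;
    int status = -1;

    if (c == 0)
        goto done;           /* nothing allocated yet: still safe */
    prod = malloc(sizeof *prod);
    if (prod == NULL)
        goto done;
    *prod = a * b;
    quot = malloc(sizeof *quot);
    if (quot == NULL)
        goto done;
    *quot = *prod / c;
    *out = *quot;
    status = 0;
done:
    free(prod);              /* single cleanup point, like Bfree(NULL) */
    free(quot);
    return status;
}
```

The payoff is the same as in the patch: adding a new error path cannot introduce a leak, because there is only one place where resources are released.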
This should not happen if called correctly. */
if (theLength == 0) {
message = PyUnicode_FromFormat(
- "DLL load failed with error code %d",
+ "DLL load failed with error code %u",
errorCode);
} else {
/* For some reason a \r\n
}
-/* Call when an exception has occurred but there is no way for Python
- to handle it. Examples: exception in __del__ or during GC. */
-void
-PyErr_WriteUnraisable(PyObject *obj)
+static void
+write_unraisable_exc_file(PyObject *exc_type, PyObject *exc_value,
+ PyObject *exc_tb, PyObject *obj, PyObject *file)
{
- _Py_IDENTIFIER(__module__);
- PyObject *f, *t, *v, *tb;
- PyObject *moduleName = NULL;
- char* className;
-
- PyErr_Fetch(&t, &v, &tb);
-
- f = _PySys_GetObjectId(&PyId_stderr);
- if (f == NULL || f == Py_None)
- goto done;
-
if (obj) {
- if (PyFile_WriteString("Exception ignored in: ", f) < 0)
- goto done;
- if (PyFile_WriteObject(obj, f, 0) < 0) {
+ if (PyFile_WriteString("Exception ignored in: ", file) < 0) {
+ return;
+ }
+ if (PyFile_WriteObject(obj, file, 0) < 0) {
PyErr_Clear();
- if (PyFile_WriteString("<object repr() failed>", f) < 0) {
- goto done;
+ if (PyFile_WriteString("<object repr() failed>", file) < 0) {
+ return;
}
}
- if (PyFile_WriteString("\n", f) < 0)
- goto done;
+ if (PyFile_WriteString("\n", file) < 0) {
+ return;
+ }
}
- if (PyTraceBack_Print(tb, f) < 0)
- goto done;
+ if (exc_tb != NULL) {
+ if (PyTraceBack_Print(exc_tb, file) < 0) {
+ /* continue even if writing the traceback failed */
+ PyErr_Clear();
+ }
+ }
- if (!t)
- goto done;
+ if (!exc_type) {
+ return;
+ }
- assert(PyExceptionClass_Check(t));
- className = PyExceptionClass_Name(t);
+ assert(PyExceptionClass_Check(exc_type));
+ char* className = PyExceptionClass_Name(exc_type);
if (className != NULL) {
char *dot = strrchr(className, '.');
- if (dot != NULL)
+ if (dot != NULL) {
className = dot+1;
+ }
}
- moduleName = _PyObject_GetAttrId(t, &PyId___module__);
+ _Py_IDENTIFIER(__module__);
+ PyObject *moduleName = _PyObject_GetAttrId(exc_type, &PyId___module__);
if (moduleName == NULL || !PyUnicode_Check(moduleName)) {
+ Py_XDECREF(moduleName);
PyErr_Clear();
- if (PyFile_WriteString("<unknown>", f) < 0)
- goto done;
+ if (PyFile_WriteString("<unknown>", file) < 0) {
+ return;
+ }
}
else {
if (!_PyUnicode_EqualToASCIIId(moduleName, &PyId_builtins)) {
- if (PyFile_WriteObject(moduleName, f, Py_PRINT_RAW) < 0)
- goto done;
- if (PyFile_WriteString(".", f) < 0)
- goto done;
+ if (PyFile_WriteObject(moduleName, file, Py_PRINT_RAW) < 0) {
+ Py_DECREF(moduleName);
+ return;
+ }
+ Py_DECREF(moduleName);
+ if (PyFile_WriteString(".", file) < 0) {
+ return;
+ }
+ }
+ else {
+ Py_DECREF(moduleName);
}
}
+
if (className == NULL) {
- if (PyFile_WriteString("<unknown>", f) < 0)
- goto done;
+ if (PyFile_WriteString("<unknown>", file) < 0) {
+ return;
+ }
}
else {
- if (PyFile_WriteString(className, f) < 0)
- goto done;
+ if (PyFile_WriteString(className, file) < 0) {
+ return;
+ }
}
- if (v && v != Py_None) {
- if (PyFile_WriteString(": ", f) < 0)
- goto done;
- if (PyFile_WriteObject(v, f, Py_PRINT_RAW) < 0) {
+ if (exc_value && exc_value != Py_None) {
+ if (PyFile_WriteString(": ", file) < 0) {
+ return;
+ }
+ if (PyFile_WriteObject(exc_value, file, Py_PRINT_RAW) < 0) {
PyErr_Clear();
- if (PyFile_WriteString("<exception str() failed>", f) < 0) {
- goto done;
+ if (PyFile_WriteString("<exception str() failed>", file) < 0) {
+ return;
}
}
}
- if (PyFile_WriteString("\n", f) < 0)
- goto done;
+ if (PyFile_WriteString("\n", file) < 0) {
+ return;
+ }
+}
-done:
- Py_XDECREF(moduleName);
- Py_XDECREF(t);
- Py_XDECREF(v);
- Py_XDECREF(tb);
+
+/* Display an unraisable exception to sys.stderr.
+
+ Called when an exception has occurred but there is no way for Python to
+ handle it. For example, when a destructor raises an exception or during
+ garbage collection (gc.collect()).
+
+ An exception must be set when calling this function. */
+void
+PyErr_WriteUnraisable(PyObject *obj)
+{
+ PyObject *f, *exc_type, *exc_value, *exc_tb;
+
+ PyErr_Fetch(&exc_type, &exc_value, &exc_tb);
+
+ f = _PySys_GetObjectId(&PyId_stderr);
+ /* Do nothing if sys.stderr is not available or set to None */
+ if (f != NULL && f != Py_None) {
+ write_unraisable_exc_file(exc_type, exc_value, exc_tb, obj, f);
+ }
+
+ Py_XDECREF(exc_type);
+ Py_XDECREF(exc_value);
+ Py_XDECREF(exc_tb);
PyErr_Clear(); /* Just in case */
}
{
#ifdef MS_WINDOWS
HANDLE handle;
- DWORD ftype;
#endif
assert(PyGILState_Check());
return -1;
}
- /* get the file type, ignore the error if it failed */
- ftype = GetFileType(handle);
-
Py_BEGIN_ALLOW_THREADS
_Py_BEGIN_SUPPRESS_IPH
fd = dup(fd);
return -1;
}
- /* Character files like console cannot be make non-inheritable */
- if (ftype != FILE_TYPE_CHAR) {
- if (_Py_set_inheritable(fd, 0, NULL) < 0) {
- _Py_BEGIN_SUPPRESS_IPH
- close(fd);
- _Py_END_SUPPRESS_IPH
- return -1;
- }
+ if (_Py_set_inheritable(fd, 0, NULL) < 0) {
+ _Py_BEGIN_SUPPRESS_IPH
+ close(fd);
+ _Py_END_SUPPRESS_IPH
+ return -1;
}
#elif defined(HAVE_FCNTL_H) && defined(F_DUPFD_CLOEXEC)
Py_BEGIN_ALLOW_THREADS
if (nargs < min || max < nargs) {
if (message == NULL)
PyErr_Format(PyExc_TypeError,
- "%.150s%s takes %s %d argument%s (%ld given)",
+ "%.150s%s takes %s %d argument%s (%zd given)",
fname==NULL ? "function" : fname,
fname==NULL ? "" : "()",
min==max ? "exactly"
: nargs < min ? "at least" : "at most",
nargs < min ? min : max,
(nargs < min ? min : max) == 1 ? "" : "s",
- Py_SAFE_DOWNCAST(nargs, Py_ssize_t, long));
+ nargs);
else
PyErr_SetString(PyExc_TypeError, message);
return cleanreturn(0, &freelist);
else {
PyErr_Format(PyExc_TypeError,
"%.200s%s takes %s %d positional arguments"
- " (%d given)",
+ " (%zd given)",
(fname == NULL) ? "function" : fname,
(fname == NULL) ? "" : "()",
(min != INT_MAX) ? "at most" : "exactly",
if (skip) {
PyErr_Format(PyExc_TypeError,
"%.200s%s takes %s %d positional arguments"
- " (%d given)",
+ " (%zd given)",
(fname == NULL) ? "function" : fname,
(fname == NULL) ? "" : "()",
(Py_MIN(pos, min) < i) ? "at least" : "exactly",
}
else {
PyErr_Format(PyExc_TypeError,
- "%.200s%s takes %s %d positional arguments (%d given)",
+ "%.200s%s takes %s %d positional arguments (%zd given)",
(parser->fname == NULL) ? "function" : parser->fname,
(parser->fname == NULL) ? "" : "()",
(parser->min != INT_MAX) ? "at most" : "exactly",
Py_ssize_t min = Py_MIN(pos, parser->min);
PyErr_Format(PyExc_TypeError,
"%.200s%s takes %s %d positional arguments"
- " (%d given)",
+ " (%zd given)",
(parser->fname == NULL) ? "function" : parser->fname,
(parser->fname == NULL) ? "" : "()",
min < parser->max ? "at least" : "exactly",
goto error;
}
- if (_hamt_dump_format(writer, "%d::\n", i)) {
+ if (_hamt_dump_format(writer, "%zd::\n", i)) {
goto error;
}
Py_DECREF(v);
m = PyImport_GetModule(name);
- if (m == NULL) {
+ if (m == NULL && !PyErr_Occurred()) {
PyErr_Format(PyExc_ImportError,
"Loaded module %R not found in sys.modules",
name);
- return NULL;
}
return m;
}
mod = PyImport_GetModule(abs_name);
+ if (mod == NULL && PyErr_Occurred()) {
+ goto error;
+ }
+
if (mod != NULL && mod != Py_None) {
_Py_IDENTIFIER(__spec__);
_Py_IDENTIFIER(_initializing);
final_mod = PyImport_GetModule(to_return);
Py_DECREF(to_return);
if (final_mod == NULL) {
- PyErr_Format(PyExc_KeyError,
- "%R not in sys.modules as expected",
- to_return);
+ if (!PyErr_Occurred()) {
+ PyErr_Format(PyExc_KeyError,
+ "%R not in sys.modules as expected",
+ to_return);
+ }
goto error;
}
}
PyObject *reloaded_module = NULL;
PyObject *imp = _PyImport_GetModuleId(&PyId_imp);
if (imp == NULL) {
+ if (PyErr_Occurred()) {
+ return NULL;
+ }
+
imp = PyImport_ImportModule("imp");
if (imp == NULL) {
return NULL;
}
/* Compute argv[0] which will be prepended to sys.argv */
-PyObject*
-_PyPathConfig_ComputeArgv0(int argc, wchar_t **argv)
+int
+_PyPathConfig_ComputeArgv0(int argc, wchar_t **argv, PyObject **argv0_p)
{
wchar_t *argv0;
wchar_t *p = NULL;
wchar_t fullpath[MAX_PATH];
#endif
+ assert(*argv0_p == NULL);
+
argv0 = argv[0];
if (argc > 0 && argv0 != NULL) {
have_module_arg = (wcscmp(argv0, L"-m") == 0);
if (have_module_arg) {
#if defined(HAVE_REALPATH) || defined(MS_WINDOWS)
- _Py_wgetcwd(fullpath, Py_ARRAY_LENGTH(fullpath));
+ if (!_Py_wgetcwd(fullpath, Py_ARRAY_LENGTH(fullpath))) {
+ return 0;
+ }
argv0 = fullpath;
n = wcslen(argv0);
#else
}
#endif /* All others */
- return PyUnicode_FromWideChar(argv0, n);
+ *argv0_p = PyUnicode_FromWideChar(argv0, n);
+ return 1;
}
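The `_PyPathConfig_ComputeArgv0` change switches from returning a `PyObject*` to an int status plus an out-parameter, so callers can tell a hard failure (e.g. `_Py_wgetcwd` failing) apart from "succeeded but produced nothing". A small sketch of that out-parameter convention, with a hypothetical `first_word` helper rather than the CPython API:

```c
#include <stdlib.h>
#include <string.h>

/* The int return reports whether the operation itself succeeded, while
   *out_p carries the optional result. This distinguishes "failed" from
   "succeeded with no value", which a single pointer return cannot.
   Returns 1 on success, 0 on failure. */
static int first_word(const char *s, char **out_p)
{
    *out_p = NULL;
    if (s == NULL)
        return 0;                 /* hard failure */
    size_t n = strcspn(s, " ");
    if (n == 0)
        return 1;                 /* success, but nothing to return */
    char *w = malloc(n + 1);
    if (w == NULL)
        return 0;
    memcpy(w, s, n);
    w[n] = '\0';
    *out_p = w;
    return 1;
}
```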
PyTuple_SET_ITEM(newconst, i, constant);
}
+ Py_ssize_t index = PyList_GET_SIZE(consts);
+#if SIZEOF_SIZE_T > SIZEOF_INT
+ if ((size_t)index >= UINT_MAX - 1) {
+ Py_DECREF(newconst);
+ PyErr_SetString(PyExc_OverflowError, "too many constants");
+ return -1;
+ }
+#endif
+
/* Append folded constant onto consts */
if (PyList_Append(consts, newconst)) {
Py_DECREF(newconst);
Py_DECREF(newconst);
return copy_op_arg(codestr, c_start, LOAD_CONST,
- PyList_GET_SIZE(consts)-1, opcode_end);
+ (unsigned int)index, opcode_end);
}
static unsigned int *
PyCode_Optimize(PyObject *code, PyObject* consts, PyObject *names,
PyObject *lnotab_obj)
{
- Py_ssize_t h, i, nexti, op_start, codelen, tgt;
+ Py_ssize_t h, i, nexti, op_start, tgt;
unsigned int j, nops;
unsigned char opcode, nextop;
_Py_CODEUNIT *codestr = NULL;
the peephole optimizer doesn't modify line numbers. */
assert(PyBytes_Check(code));
- codelen = PyBytes_GET_SIZE(code);
- assert(codelen % sizeof(_Py_CODEUNIT) == 0);
+ Py_ssize_t codesize = PyBytes_GET_SIZE(code);
+ assert(codesize % sizeof(_Py_CODEUNIT) == 0);
+ Py_ssize_t codelen = codesize / sizeof(_Py_CODEUNIT);
+ if (codelen > INT_MAX) {
+ /* Python assembler is limited to INT_MAX: see assembler.a_offset in
+ compile.c. */
+ goto exitUnchanged;
+ }
/* Make a modifiable copy of the code string */
- codestr = (_Py_CODEUNIT *)PyMem_Malloc(codelen);
+ codestr = (_Py_CODEUNIT *)PyMem_Malloc(codesize);
if (codestr == NULL) {
PyErr_NoMemory();
goto exitError;
}
- memcpy(codestr, PyBytes_AS_STRING(code), codelen);
- codelen /= sizeof(_Py_CODEUNIT);
+ memcpy(codestr, PyBytes_AS_STRING(code), codesize);
blocks = markblocks(codestr, codelen);
if (blocks == NULL)
jump past it), and all conditional jumps pop their
argument when they're not taken (so change the
first jump to pop its argument when it's taken). */
- h = set_arg(codestr, i, (tgt + 1) * sizeof(_Py_CODEUNIT));
+ Py_ssize_t arg = (tgt + 1);
+ /* cannot overflow: codelen <= INT_MAX */
+ assert((size_t)arg <= UINT_MAX / sizeof(_Py_CODEUNIT));
+ arg *= sizeof(_Py_CODEUNIT);
+ h = set_arg(codestr, i, (unsigned int)arg);
j = opcode == JUMP_IF_TRUE_OR_POP ?
POP_JUMP_IF_TRUE : POP_JUMP_IF_FALSE;
}
codestr[op_start] = PACKOPARG(RETURN_VALUE, 0);
fill_nops(codestr, op_start + 1, i + 1);
} else if (UNCONDITIONAL_JUMP(_Py_OPCODE(codestr[tgt]))) {
- j = GETJUMPTGT(codestr, tgt);
+ size_t arg = GETJUMPTGT(codestr, tgt);
if (opcode == JUMP_FORWARD) { /* JMP_ABS can go backwards */
opcode = JUMP_ABSOLUTE;
} else if (!ABSOLUTE_JUMP(opcode)) {
- if ((Py_ssize_t)j < i + 1) {
+ if (arg < (size_t)(i + 1)) {
break; /* No backward relative jumps */
}
- j -= i + 1; /* Calc relative jump addr */
+ arg -= i + 1; /* Calc relative jump addr */
}
- j *= sizeof(_Py_CODEUNIT);
- copy_op_arg(codestr, op_start, opcode, j, i + 1);
+ /* cannot overflow: codelen <= INT_MAX */
+ assert(arg <= (UINT_MAX / sizeof(_Py_CODEUNIT)));
+ arg *= sizeof(_Py_CODEUNIT);
+ copy_op_arg(codestr, op_start, opcode,
+ (unsigned int)arg, i + 1);
}
break;
/* Fixup lnotab */
for (i = 0, nops = 0; i < codelen; i++) {
- assert(i - nops <= INT_MAX);
+ size_t block = (size_t)i - nops;
+ /* cannot overflow: codelen <= INT_MAX */
+ assert(block <= UINT_MAX);
/* original code offset => new code offset */
- blocks[i] = i - nops;
- if (_Py_OPCODE(codestr[i]) == NOP)
+ blocks[i] = (unsigned int)block;
+ if (_Py_OPCODE(codestr[i]) == NOP) {
nops++;
+ }
}
cum_orig_offset = 0;
last_offset = 0;
j *= sizeof(_Py_CODEUNIT);
break;
}
- nexti = i - op_start + 1;
- if (instrsize(j) > nexti)
+ Py_ssize_t ilen = i - op_start + 1;
+ if (instrsize(j) > ilen) {
goto exitUnchanged;
- /* If instrsize(j) < nexti, we'll emit EXTENDED_ARG 0 */
- write_op_arg(codestr + h, opcode, j, nexti);
- h += nexti;
+ }
+ assert(ilen <= INT_MAX);
+ /* If instrsize(j) < ilen, we'll emit EXTENDED_ARG 0 */
+ write_op_arg(codestr + h, opcode, j, (int)ilen);
+ h += ilen;
}
assert(h + (Py_ssize_t)nops == codelen);
#if defined(Py_DEBUG)
/*
fprintf(stderr,
- "alloc=%d size=%d blocks=%d block_size=%d big=%d objects=%d\n",
+ "alloc=%zu size=%zu blocks=%zu block_size=%zu big=%zu objects=%zu\n",
arena->total_allocs, arena->total_size, arena->total_blocks,
arena->total_block_size, arena->total_big_blocks,
PyList_Size(arena->a_objects));
static int
is_valid_fd(int fd)
{
-#ifdef __APPLE__
- /* bpo-30225: On macOS Tiger, when stdout is redirected to a pipe
- and the other side of the pipe is closed, dup(1) succeed, whereas
- fstat(1, &st) fails with EBADF. Prefer fstat() over dup() to detect
- such error. */
- struct stat st;
- return (fstat(fd, &st) == 0);
-#else
- int fd2;
- if (fd < 0)
+/* dup() is faster than fstat(): fstat() can require input/output operations,
+ whereas dup() doesn't. There is a low risk of EMFILE/ENFILE at Python
+ startup. Problem: dup() doesn't check if the file descriptor is valid on
+ some platforms.
+
+ bpo-30225: On macOS Tiger, when stdout is redirected to a pipe and the other
+ side of the pipe is closed, dup(1) succeeds, whereas fstat(1, &st) fails with
+ EBADF. FreeBSD has a similar issue (bpo-32849).
+
+ Only use dup() on platforms where dup() is enough to detect invalid FD in
+ corner cases: on Linux and Windows (bpo-32849). */
+#if defined(__linux__) || defined(MS_WINDOWS)
+ if (fd < 0) {
return 0;
+ }
+ int fd2;
+
_Py_BEGIN_SUPPRESS_IPH
- /* Prefer dup() over fstat(). fstat() can require input/output whereas
- dup() doesn't, there is a low risk of EMFILE/ENFILE at Python
- startup. */
fd2 = dup(fd);
- if (fd2 >= 0)
+ if (fd2 >= 0) {
close(fd2);
+ }
_Py_END_SUPPRESS_IPH
- return fd2 >= 0;
+
+ return (fd2 >= 0);
+#else
+ struct stat st;
+ return (fstat(fd, &st) == 0);
#endif
}
PyObject *result;
PyObject *threading = _PyImport_GetModuleId(&PyId_threading);
if (threading == NULL) {
- /* threading not imported */
- PyErr_Clear();
+ if (PyErr_Occurred()) {
+ PyErr_WriteUnraisable(NULL);
+ }
+ /* else: threading not imported */
return;
}
result = _PyObject_CallMethodId(threading, &PyId__shutdown, NULL);
interp->pyexitmodule = NULL;
HEAD_LOCK();
- interp->next = _PyRuntime.interpreters.head;
- if (_PyRuntime.interpreters.main == NULL) {
- _PyRuntime.interpreters.main = interp;
- }
- _PyRuntime.interpreters.head = interp;
if (_PyRuntime.interpreters.next_id < 0) {
/* overflow or Py_Initialize() not called! */
PyErr_SetString(PyExc_RuntimeError,
"failed to get an interpreter ID");
- /* XXX deallocate! */
+ PyMem_RawFree(interp);
interp = NULL;
} else {
interp->id = _PyRuntime.interpreters.next_id;
_PyRuntime.interpreters.next_id += 1;
+ interp->next = _PyRuntime.interpreters.head;
+ if (_PyRuntime.interpreters.main == NULL) {
+ _PyRuntime.interpreters.main = interp;
+ }
+ _PyRuntime.interpreters.head = interp;
}
HEAD_UNLOCK();
+ if (interp == NULL) {
+ return NULL;
+ }
+
interp->tstate_next_unique_id = 0;
return interp;
builtins = _PyImport_GetModuleId(&PyId_builtins);
if (builtins == NULL) {
- PyErr_SetString(PyExc_RuntimeError, "lost builtins module");
+ if (!PyErr_Occurred()) {
+ PyErr_SetString(PyExc_RuntimeError, "lost builtins module");
+ }
return NULL;
}
Py_DECREF(builtins);
return _Py_INIT_ERR("can't initialize sys module");
}
-#undef SET_SYS_FROM_STRING
-
/* Updating the sys namespace, returning integer error codes */
#define SET_SYS_FROM_STRING_INT_RESULT(key, value) \
do { \
SET_SYS_FROM_STRING_BORROW("exec_prefix", config->exec_prefix);
SET_SYS_FROM_STRING_BORROW("base_exec_prefix", config->base_exec_prefix);
+#ifdef MS_WINDOWS
+ const wchar_t *baseExecutable = _wgetenv(L"__PYVENV_BASE_EXECUTABLE__");
+ if (baseExecutable) {
+ SET_SYS_FROM_STRING("_base_executable",
+ PyUnicode_FromWideChar(baseExecutable, -1));
+ _wputenv_s(L"__PYVENV_BASE_EXECUTABLE__", L"");
+ } else {
+ SET_SYS_FROM_STRING_BORROW("_base_executable", config->executable);
+ }
+#endif
+
if (config->argv != NULL) {
SET_SYS_FROM_STRING_BORROW("argv", config->argv);
}
return -1;
}
+#undef SET_SYS_FROM_STRING
#undef SET_SYS_FROM_STRING_BORROW
#undef SET_SYS_FROM_STRING_INT_RESULT
if (updatepath) {
/* If argv[0] is not '-c' nor '-m', prepend argv[0] to sys.path.
If argv[0] is a symlink, use the real path. */
- PyObject *argv0 = _PyPathConfig_ComputeArgv0(argc, argv);
+ PyObject *argv0 = NULL;
+ if (!_PyPathConfig_ComputeArgv0(argc, argv, &argv0)) {
+ return;
+ }
if (argv0 == NULL) {
Py_FatalError("can't compute path0 from argv");
}
-This is Python version 3.7.3
+This is Python version 3.7.4
============================
.. image:: https://travis-ci.org/python/cpython.svg?branch=master
port = PORT
i = host.find(':')
if i >= 0:
- port = int(port[i+1:])
+ port = int(host[i+1:])
host = host[:i]
command = ' '.join(sys.argv[2:])
s = socket(AF_INET, SOCK_STREAM)
# locations derived from options
version = '%d.%d' % sys.version_info[:2]
- flagged_version = version + sys.abiflags
+ if hasattr(sys, 'abiflags'):
+ flagged_version = version + sys.abiflags
+ else:
+ flagged_version = version
if win:
extensions_c = 'frozen_extensions.c'
if ishome:
</Fragment>
<Fragment>
+ <!-- The auto-generated directory is not available when building debug binaries -->
+ <DirectoryRef Id="Lib">
+ <Directory Id="Lib_venv__d" Name="venv">
+ <Directory Id="Lib_venv_scripts__d" Name="scripts">
+ <Directory Id="Lib_venv_scripts_nt__d" Name="nt" />
+ </Directory>
+ </Directory>
+ </DirectoryRef>
+
<ComponentGroup Id="lib_extensions_d">
<?foreach ext in $(var.exts)?>
<Component Id="sqlite3_d.pdb" Directory="DLLs" Guid="*">
<File Name="sqlite3_d.pdb" KeyPath="yes" />
</Component>
+ <Component Id="venvlauncher_d.exe" Directory="Lib_venv_scripts_nt__d" Guid="*">
+ <File Name="python_d.exe" Source="venvlauncher_d.exe" KeyPath="yes" />
+ </Component>
+ <Component Id="venvwlauncher_d.exe" Directory="Lib_venv_scripts_nt__d" Guid="*">
+ <File Name="pythonw_d.exe" Source="venvwlauncher_d.exe" KeyPath="yes" />
+ </Component>
</ComponentGroup>
</Fragment>
<Fragment>
#>\r
param(\r
[Parameter(Mandatory=$true)][string]$catalog,\r
+ [switch]$sign,\r
[string]$description,\r
[string]$certname,\r
[string]$certsha1,\r
if (-not $?) {\r
throw "Catalog compilation failed"\r
}\r
-Sign-File -certname $certname -certsha1 $certsha1 -certfile $certfile -description $description -files @($catalog -replace 'cdf$', 'cat')\r
+if ($sign) {\r
+ Sign-File -certname $certname -certsha1 $certsha1 -certfile $certfile -description $description -files @($catalog -replace 'cdf$', 'cat')\r
+}\r
PyArchExt=$(PyArchExt);\r
PyTestExt=$(PyTestExt);\r
OptionalFeatureName=$(OutputName);\r
+ ssltag=-1_1;\r
</DefineConstants>\r
<DefineConstants Condition="'$(CRTRedist)' != ''">\r
$(DefineConstants);CRTRedist=$(CRTRedist);\r
</DefineConstants>\r
<DefineConstants Condition="$(Platform) != 'x64'">\r
- $(DefineConstants);Suffix32=-32;ssltag=-1_1;\r
+ $(DefineConstants);Suffix32=-32;\r
</DefineConstants>\r
<DefineConstants Condition="$(Platform) == 'x64'">\r
- $(DefineConstants);Suffix32=;ssltag=-1_1-x64;\r
+ $(DefineConstants);Suffix32=;\r
</DefineConstants>\r
</PropertyGroup>\r
\r
$certfile = $env:SigningCertificateFile;
}
+ if (-not ($certsha1 -or $certname -or $certfile)) {
+ throw "No signing certificate specified"
+ }
+
foreach ($a in $files) {
if ($certsha1) {
SignTool sign /sha1 $certsha1 /fd sha256 /t http://timestamp.verisign.com/scripts/timestamp.dll /d $description $a
SignTool sign /a /n $certname /fd sha256 /t http://timestamp.verisign.com/scripts/timestamp.dll /d $description $a
} elseif ($certfile) {
SignTool sign /f $certfile /fd sha256 /t http://timestamp.verisign.com/scripts/timestamp.dll /d $description $a
- } else {
- SignTool sign /a /fd sha256 /t http://timestamp.verisign.com/scripts/timestamp.dll /d $description $a
}
}
}
<PythonArguments>$(PythonArguments) -b "$(BuildPath.TrimEnd(`\`))" -s "$(PySourcePath.TrimEnd(`\`))"</PythonArguments>\r
<PythonArguments>$(PythonArguments) -t "$(IntermediateOutputPath)obj"</PythonArguments>\r
<PythonArguments>$(PythonArguments) --copy "$(IntermediateOutputPath)pkg"</PythonArguments>\r
- <PythonArguments>$(PythonArguments) --include-dev --include-tools --include-pip --include-stable --include-launcher --include-props</PythonArguments>\r
+ <PythonArguments>$(PythonArguments) --preset-nuget</PythonArguments>\r
\r
<PackageArguments Condition="$(Packages) != ''">"$(IntermediateOutputPath)pkg\pip.exe" -B -m pip install -U $(Packages)</PackageArguments>\r
\r
full = os.path.join(dirname, name)
if os.path.islink(full):
print(name, '->', os.readlink(full))
-def main():
- args = sys.argv[1:]
+def main(args):
if not args: args = [os.curdir]
first = 1
for arg in args:
if not first: print()
first = 0
print(arg + ':')
- lll(arg)
+ lll(arg)
if __name__ == '__main__':
- main()
+ main(sys.argv[1:])
]
OPENSSL_RECENT_VERSIONS = [
- "1.0.2p",
- "1.1.0i",
- "1.1.1",
+ "1.0.2s",
+ "1.1.0k",
+ "1.1.1c",
]
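The bumped entries follow OpenSSL's pre-3.0 scheme, where a trailing letter denotes a patch release (so 1.1.1c is newer than 1.1.1). A hedged sketch of a sortable key for such strings; the function name `openssl_version_key` is illustrative and not part of the multissl tooling:

```python
import re

def openssl_version_key(version):
    """Turn a pre-3.0 OpenSSL version string such as '1.1.1c' into a
    sortable tuple; the optional letter suffix counts as a patch level,
    with no letter sorting before 'a'."""
    m = re.match(r'^(\d+)\.(\d+)\.(\d+)([a-z]*)$', version)
    if m is None:
        raise ValueError("unrecognized version: %r" % version)
    major, minor, fix, patch = m.groups()
    patch_num = 0
    for ch in patch:  # 'a' -> 1, 'b' -> 2, ..., 'aa' -> 27
        patch_num = patch_num * 26 + (ord(ch) - ord('a') + 1)
    return (int(major), int(minor), int(fix), patch_num)
```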
LIBRESSL_OLD_VERSIONS = [
docdir
oldincludedir
includedir
-runstatedir
localstatedir
sharedstatedir
sysconfdir
sysconfdir='${prefix}/etc'
sharedstatedir='${prefix}/com'
localstatedir='${prefix}/var'
-runstatedir='${localstatedir}/run'
includedir='${prefix}/include'
oldincludedir='/usr/include'
docdir='${datarootdir}/doc/${PACKAGE_TARNAME}'
| -silent | --silent | --silen | --sile | --sil)
silent=yes ;;
- -runstatedir | --runstatedir | --runstatedi | --runstated \
- | --runstate | --runstat | --runsta | --runst | --runs \
- | --run | --ru | --r)
- ac_prev=runstatedir ;;
- -runstatedir=* | --runstatedir=* | --runstatedi=* | --runstated=* \
- | --runstate=* | --runstat=* | --runsta=* | --runst=* | --runs=* \
- | --run=* | --ru=* | --r=*)
- runstatedir=$ac_optarg ;;
-
-sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb)
ac_prev=sbindir ;;
-sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \
for ac_var in exec_prefix prefix bindir sbindir libexecdir datarootdir \
datadir sysconfdir sharedstatedir localstatedir includedir \
oldincludedir docdir infodir htmldir dvidir pdfdir psdir \
- libdir localedir mandir runstatedir
+ libdir localedir mandir
do
eval ac_val=\$$ac_var
# Remove trailing slashes.
--sysconfdir=DIR read-only single-machine data [PREFIX/etc]
--sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com]
--localstatedir=DIR modifiable single-machine data [PREFIX/var]
- --runstatedir=DIR modifiable per-process data [LOCALSTATEDIR/run]
--libdir=DIR object code libraries [EPREFIX/lib]
--includedir=DIR C header files [PREFIX/include]
--oldincludedir=DIR C header files for non-gcc [/usr/include]
done
-SRCDIRS="Parser Objects Python Modules Programs"
+SRCDIRS="Parser Objects Python Modules Modules/_io Programs"
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for build directories" >&5
$as_echo_n "checking for build directories... " >&6; }
for dir in $SRCDIRS; do
done
AC_SUBST(SRCDIRS)
-SRCDIRS="Parser Objects Python Modules Programs"
+SRCDIRS="Parser Objects Python Modules Modules/_io Programs"
AC_MSG_CHECKING(for build directories)
for dir in $SRCDIRS; do
if test ! -d $dir; then
break
return dirs
+MACOS_SDK_ROOT = None
+
def macosx_sdk_root():
+ """Return the directory of the current macOS SDK.
+
+ If no SDK was explicitly configured, call the compiler to find which
+ include file paths are being searched by default. Use '/' if the
+ compiler is searching /usr/include (meaning system header files are
+ installed) or use the root of an SDK if that is being searched.
+ (The SDK may be supplied via Xcode or via the Command Line Tools).
+ The SDK paths used by Apple-supplied tool chains depend on the
+ setting of various variables; see the xcrun man page for more info.
"""
- Return the directory of the current OSX SDK,
- or '/' if no SDK was specified.
- """
+ global MACOS_SDK_ROOT
+
+ # If already called, return cached result.
+ if MACOS_SDK_ROOT:
+ return MACOS_SDK_ROOT
+
cflags = sysconfig.get_config_var('CFLAGS')
m = re.search(r'-isysroot\s+(\S+)', cflags)
- if m is None:
- sysroot = '/'
+ if m is not None:
+ MACOS_SDK_ROOT = m.group(1)
else:
- sysroot = m.group(1)
- return sysroot
+ MACOS_SDK_ROOT = '/'
+ cc = sysconfig.get_config_var('CC')
+ tmpfile = '/tmp/setup_sdk_root.%d' % os.getpid()
+ try:
+ os.unlink(tmpfile)
+ except OSError:
+ pass
+ ret = os.system('%s -E -v - </dev/null 2>%s 1>/dev/null' % (cc, tmpfile))
+ in_incdirs = False
+ try:
+ if ret >> 8 == 0:
+ with open(tmpfile) as fp:
+ for line in fp.readlines():
+ if line.startswith("#include <...>"):
+ in_incdirs = True
+ elif line.startswith("End of search list"):
+ in_incdirs = False
+ elif in_incdirs:
+ line = line.strip()
+ if line == '/usr/include':
+ MACOS_SDK_ROOT = '/'
+ elif line.endswith(".sdk/usr/include"):
+ MACOS_SDK_ROOT = line[:-12]
+ finally:
+ os.unlink(tmpfile)
+
+ return MACOS_SDK_ROOT
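The core of the change above is parsing the verbose output of `cc -E -v` for the default include search list, which is delimited by the `#include <...> search starts here:` and `End of search list.` markers. A minimal sketch of just that parsing step; the helper name `parse_include_dirs` and the sample path in the usage below are hypothetical:

```python
def parse_include_dirs(preprocessor_output):
    """Extract the default include search directories from the verbose
    output of 'cc -E -v'; the directory list is delimited by the
    '#include <...>' and 'End of search list' marker lines."""
    dirs = []
    in_incdirs = False
    for line in preprocessor_output.splitlines():
        if line.startswith("#include <...>"):
            in_incdirs = True
        elif line.startswith("End of search list"):
            in_incdirs = False
        elif in_incdirs:
            dirs.append(line.strip())
    return dirs
```

Given that list, the SDK root is then `/` when `/usr/include` is searched directly, or the path with the trailing `/usr/include` stripped when a `*.sdk/usr/include` directory is searched.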
def is_macosx_sdk_path(path):
"""
sqlite_setup_debug = False # verbose debug prints from this script?
# We hunt for #define SQLITE_VERSION "n.n.n"
- # We need to find >= sqlite version 3.0.8
+ # We need to find >= sqlite version 3.3.9, for sqlite3_prepare_v2
sqlite_incdir = sqlite_libdir = None
sqlite_inc_paths = [ '/usr/include',
'/usr/include/sqlite',
]
if cross_compiling:
sqlite_inc_paths = []
- MIN_SQLITE_VERSION_NUMBER = (3, 0, 8)
+ MIN_SQLITE_VERSION_NUMBER = (3, 3, 9)
MIN_SQLITE_VERSION = ".".join([str(x)
for x in MIN_SQLITE_VERSION_NUMBER])
break
else:
if sqlite_setup_debug:
- print("%s: version %d is too old, need >= %s"%(d,
+ print("%s: version %s is too old, need >= %s"%(d,
sqlite_version, MIN_SQLITE_VERSION))
elif sqlite_setup_debug:
print("sqlite: %s had no SQLITE_VERSION"%(f,))
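The minimum bump to 3.3.9 works because the check compares version tuples rather than strings, so 3.31.x correctly sorts above 3.3.9. A hedged sketch of that comparison, assuming a clean dotted version string; the helper name `sqlite_version_ok` is illustrative:

```python
MIN_SQLITE_VERSION_NUMBER = (3, 3, 9)

def sqlite_version_ok(version_string):
    """Compare a dotted SQLITE_VERSION string such as '3.3.9' against
    the minimum version tuple, component by component."""
    version_number = tuple(int(x) for x in version_string.split("."))
    return version_number >= MIN_SQLITE_VERSION_NUMBER
```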
cc = sysconfig.get_config_var('CC').split()[0]
ret = os.system(
- '"%s" -Werror -Wimplicit-fallthrough -E -xc /dev/null >/dev/null 2>&1' % cc)
+ '"%s" -Werror -Wno-unreachable-code -E -xc /dev/null >/dev/null 2>&1' % cc)
if ret >> 8 == 0:
- extra_compile_args.append('-Wno-implicit-fallthrough')
+ extra_compile_args.append('-Wno-unreachable-code')
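The probe above tests whether the compiler accepts a flag by preprocessing an empty C translation unit under `-Werror` and checking the exit status. A generalized sketch of the same idea; the function name `compiler_accepts_flag` is illustrative, and it assumes a POSIX shell with `/dev/null`:

```python
import os
import shlex

def compiler_accepts_flag(cc, flag):
    """Return True if 'cc' exits successfully when given 'flag' while
    preprocessing an empty C source (-E -xc /dev/null); -Werror turns
    an unknown-flag warning into a failure."""
    cmd = '%s -Werror %s -E -xc /dev/null >/dev/null 2>&1' % (
        shlex.quote(cc), shlex.quote(flag))
    return os.system(cmd) >> 8 == 0
```

Using `true`/`false` as stand-in commands exercises the exit-status logic without requiring a real compiler.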
exts.append(Extension('pyexpat',
define_macros = define_macros,