venv:
$(PYTHON) -m venv $(VENVDIR)
$(VENVDIR)/bin/python3 -m pip install -U pip setuptools
- $(VENVDIR)/bin/python3 -m pip install -U Sphinx==1.8.2 blurb python-docs-theme
+ $(VENVDIR)/bin/python3 -m pip install -U Sphinx==2.3.1 blurb python-docs-theme
@echo "The venv has been created in the $(VENVDIR) directory"
dist:
Functions to initialize a preconfiguration:
- .. c:function:: void PyPreConfig_InitIsolatedConfig(PyPreConfig *preconfig)
+ .. c:function:: void PyPreConfig_InitPythonConfig(PyPreConfig *preconfig)
Initialize the preconfiguration with :ref:`Python Configuration
<init-python-config>`.
- .. c:function:: void PyPreConfig_InitPythonConfig(PyPreConfig *preconfig)
+ .. c:function:: void PyPreConfig_InitIsolatedConfig(PyPreConfig *preconfig)
Initialize the preconfiguration with :ref:`Isolated Configuration
<init-isolated-conf>`.
.. c:function:: PyObject* PyLong_FromUnicode(Py_UNICODE *u, Py_ssize_t length, int base)
- Convert a sequence of Unicode digits to a Python integer value. The Unicode
- string is first encoded to a byte string using :c:func:`PyUnicode_EncodeDecimal`
- and then converted using :c:func:`PyLong_FromString`.
+ Convert a sequence of Unicode digits to a Python integer value.
- .. deprecated-removed:: 3.3 4.0
+ .. deprecated-removed:: 3.3 3.10
Part of the old-style :c:type:`Py_UNICODE` API; please migrate to using
:c:func:`PyLong_FromUnicodeObject`.
.. c:function:: PyObject* PyLong_FromUnicodeObject(PyObject *u, int base)
Convert a sequence of Unicode digits in the string *u* to a Python integer
- value. The Unicode string is first encoded to a byte string using
- :c:func:`PyUnicode_EncodeDecimal` and then converted using
- :c:func:`PyLong_FromString`.
+ value.
.. versionadded:: 3.3
:c:func:`PyUnicode_AsWideChar`, :c:func:`PyUnicode_ReadChar` or similar new
APIs.
+ .. deprecated-removed:: 3.3 3.10
+
.. c:function:: PyObject* PyUnicode_TransformDecimalToASCII(Py_UNICODE *s, Py_ssize_t size)
* `Project structure`_
* `Building and packaging the project`_
* `Uploading the project to the Python Packaging Index`_
+* `The .pypirc file`_
.. _Project structure: \
https://packaging.python.org/tutorials/distributing-packages/
.. _Building and packaging the project: \
   https://packaging.python.org/tutorials/distributing-packages/#packaging-your-project
.. _Uploading the project to the Python Packaging Index: \
https://packaging.python.org/tutorials/distributing-packages/#uploading-your-project-to-pypi
+.. _The .pypirc file: \
+ https://packaging.python.org/specifications/pypirc/
How do I...?
Python is an interpreted, interactive, object-oriented programming language. It
incorporates modules, exceptions, dynamic typing, very high level dynamic data
-types, and classes. Python combines remarkable power with very clear syntax.
-It has interfaces to many system calls and libraries, as well as to various
-window systems, and is extensible in C or C++. It is also usable as an
-extension language for applications that need a programmable interface.
-Finally, Python is portable: it runs on many Unix variants, on the Mac, and on
-Windows 2000 and later.
+types, and classes. It supports multiple programming paradigms beyond
+object-oriented programming, such as procedural and functional programming.
+Python combines remarkable power with very clear syntax. It has interfaces to
+many system calls and libraries, as well as to various window systems, and is
+extensible in C or C++. It is also usable as an extension language for
+applications that need a programmable interface. Finally, Python is portable:
+it runs on many Unix variants including Linux and macOS, and on Windows.
To find out more, start with :ref:`tutorial-index`. The `Beginner's Guide to
Python <https://wiki.python.org/moin/BeginnersGuide>`_ links to other
---------------------
Very stable. New, stable releases have been coming out roughly every 6 to 18
-months since 1991, and this seems likely to continue. Currently there are
-usually around 18 months between major releases.
+months since 1991, and this seems likely to continue. As of version 3.9,
+Python will have a major new release every 12 months (:pep:`602`).
The developers issue "bugfix" releases of older versions, so the stability of
existing releases gradually improves. Bugfix releases, indicated by a third
How many people are using Python?
---------------------------------
-There are probably tens of thousands of users, though it's difficult to obtain
-an exact count.
+There are probably millions of users, though it's difficult to obtain an exact
+count.
Python is available for free download, so there are no sales figures, and it's
available from many different sites and packaged with many Linux distributions,
A list of bytecode instructions can be found in the documentation for
:ref:`the dis module <bytecodes>`.
+ callback
+ A subroutine function which is passed as an argument to be executed at
+ some point in the future.
+
class
A template for creating user-defined objects. Class definitions
normally contain method definitions which operate on instances of the
In Python, you use ``socket.setblocking(0)`` to make it non-blocking. In C, it's
more complex, (for one thing, you'll need to choose between the BSD flavor
-``O_NONBLOCK`` and the almost indistinguishable Posix flavor ``O_NDELAY``, which
+``O_NONBLOCK`` and the almost indistinguishable POSIX flavor ``O_NDELAY``, which
is completely different from ``TCP_NODELAY``), but it's the exact same idea. You
do this after creating the socket, but before using it. (Actually, if you're
nuts, you can switch back and forth.)
executes an ``await`` expression, the running Task gets suspended, and
the event loop executes the next Task.
-To schedule a callback from a different OS thread, the
+To schedule a :term:`callback` from another OS thread, the
:meth:`loop.call_soon_threadsafe` method should be used. Example::
loop.call_soon_threadsafe(callback, *args)
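For illustration, a minimal sketch of handing a callback to the loop from a
worker thread (``notify`` and the thread itself are made-up for the example,
not part of the asyncio API)::

   import asyncio
   import threading

   def notify(msg):
       print("ran in the event loop:", msg)

   async def main():
       loop = asyncio.get_running_loop()
       # A plain thread must not call loop.call_soon() directly; the
       # thread-safe variant wakes the loop and queues the callback.
       worker = threading.Thread(
           target=lambda: loop.call_soon_threadsafe(notify, "done"))
       worker.start()
       worker.join()
       await asyncio.sleep(0)  # let the loop run the queued callback

   asyncio.run(main())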
.. method:: loop.call_soon(callback, *args, context=None)
- Schedule a *callback* to be called with *args* arguments at
- the next iteration of the event loop.
+ Schedule the *callback* :term:`callback` to be called with
+ *args* arguments at the next iteration of the event loop.
Callbacks are called in the order in which they are registered.
Each callback will be called exactly once.
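For instance, a small self-contained sketch of that FIFO ordering (the
``order`` list exists only for the example)::

   import asyncio

   async def main():
       loop = asyncio.get_running_loop()
       order = []
       loop.call_soon(order.append, "first")
       loop.call_soon(order.append, "second")
       await asyncio.sleep(0)   # yield to the loop so the callbacks run
       print(order)             # ['first', 'second'] -- registration order

   asyncio.run(main())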
An example of a subprocess protocol used to get the output of a
subprocess and to wait for the subprocess exit.
-The subprocess is created by th :meth:`loop.subprocess_exec` method::
+The subprocess is created by the :meth:`loop.subprocess_exec` method::
import asyncio
import sys
See the documentation of :meth:`loop.subprocess_shell` for other
parameters.
-.. important::
-
- It is the application's responsibility to ensure that all whitespace and
- special characters are quoted appropriately to avoid `shell injection
- <https://en.wikipedia.org/wiki/Shell_injection#Shell_injection>`_
- vulnerabilities. The :func:`shlex.quote` function can be used to properly
- escape whitespace and special shell characters in strings that are going
- to be used to construct shell commands.
+ .. important::
+
+ It is the application's responsibility to ensure that all whitespace and
+ special characters are quoted appropriately to avoid `shell injection
+ <https://en.wikipedia.org/wiki/Shell_injection#Shell_injection>`_
+ vulnerabilities. The :func:`shlex.quote` function can be used to properly
+ escape whitespace and special shell characters in strings that are going
+ to be used to construct shell commands.
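For illustration, a short sketch of such quoting (the ``filename`` value is a
made-up hostile input)::

   import shlex

   filename = "data; rm -rf ~"            # untrusted input
   cmd = "ls -l " + shlex.quote(filename)
   print(cmd)   # ls -l 'data; rm -rf ~' -- safe to hand to a shell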
.. deprecated-removed:: 3.8 3.10
.. function:: as_completed(aws, \*, loop=None, timeout=None)
Run :ref:`awaitable objects <asyncio-awaitables>` in the *aws*
- set concurrently. Return an iterator of :class:`Future` objects.
- Each Future object returned represents the earliest result
- from the set of the remaining awaitables.
+ set concurrently. Return an iterator of coroutines.
+ Each coroutine returned can be awaited to get the earliest next
+ result from the set of the remaining awaitables.
Raises :exc:`asyncio.TimeoutError` if the timeout occurs before
all Futures are done.
Example::
- for f in as_completed(aws):
- earliest_result = await f
+ for coro in as_completed(aws):
+ earliest_result = await coro
# ...
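A runnable sketch of the same pattern (the ``job`` coroutine is illustrative)::

   import asyncio

   async def job(name, delay):
       await asyncio.sleep(delay)
       return name

   async def main():
       aws = [job("slow", 0.2), job("fast", 0.1)]
       for coro in asyncio.as_completed(aws):
           print(await coro)     # prints "fast" first, then "slow"

   asyncio.run(main())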
.. function:: monthcalendar(year, month)
Returns a matrix representing a month's calendar. Each row represents a week;
- days outside of the month a represented by zeros. Each week begins with Monday
+ days outside of the month are represented by zeros. Each week begins with Monday
unless set by :func:`setfirstweekday`.
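For illustration, with the default Monday start (output wrapped here for
readability)::

   >>> import calendar
   >>> calendar.monthcalendar(2020, 7)
   [[0, 0, 1, 2, 3, 4, 5],
    [6, 7, 8, 9, 10, 11, 12],
    [13, 14, 15, 16, 17, 18, 19],
    [20, 21, 22, 23, 24, 25, 26],
    [27, 28, 29, 30, 31, 0, 0]]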
*source* is the source string; *filename* is the optional filename from which
source was read, defaulting to ``'<input>'``; and *symbol* is the optional
- grammar start symbol, which should be either ``'single'`` (the default) or
- ``'eval'``.
+ grammar start symbol, which should be ``'single'`` (the default), ``'eval'``
+ or ``'exec'``.
Returns a code object (the same as ``compile(source, filename, symbol)``) if the
command is complete and valid; ``None`` if the command is incomplete; raises
:exc:`OverflowError` or :exc:`ValueError` if there is an invalid literal.
The *symbol* argument determines whether *source* is compiled as a statement
- (``'single'``, the default) or as an :term:`expression` (``'eval'``). Any
- other value will cause :exc:`ValueError` to be raised.
+ (``'single'``, the default), as a sequence of statements (``'exec'``) or
+ as an :term:`expression` (``'eval'``). Any other value will
+ cause :exc:`ValueError` to be raised.
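For illustration, a small interactive sketch (the statements themselves are
arbitrary)::

   >>> from codeop import compile_command
   >>> print(compile_command("if True:"))     # incomplete statement
   None
   >>> code = compile_command("x = 1\nprint(x)\n", symbol="exec")
   >>> exec(code)
   1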
.. note::
return value
def __setitem__(self, key, value):
+ if key in self:
+ self.move_to_end(key)
super().__setitem__(key, value)
if len(self) > self.maxsize:
oldest = next(iter(self))
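Pulled together, a minimal self-contained version of this size-limited mapping
could read as follows; it is a sketch built from the fragment above (the class
name ``LRU`` and the ``maxsize`` attribute are assumed from that fragment)::

   from collections import OrderedDict

   class LRU(OrderedDict):
       'Limit size, evicting the least recently used key when full'

       def __init__(self, maxsize=128, *args, **kwds):
           self.maxsize = maxsize
           super().__init__(*args, **kwds)

       def __getitem__(self, key):
           value = super().__getitem__(key)
           self.move_to_end(key)          # mark as most recently used
           return value

       def __setitem__(self, key, value):
           if key in self:
               self.move_to_end(key)
           super().__setitem__(key, value)
           if len(self) > self.maxsize:
               oldest = next(iter(self))  # least recently used key
               del self[oldest]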
All other optional or keyword arguments are passed to the underlying
:class:`reader` instance.
+ .. versionchanged:: 3.6
+ Returned rows are now of type :class:`OrderedDict`.
+
.. versionchanged:: 3.8
Returned rows are now of type :class:`dict`.
@dataclass
class InventoryItem:
- '''Class for keeping track of an item in inventory.'''
+ """Class for keeping track of an item in inventory."""
name: str
unit_price: float
quantity_on_hand: int = 0
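For example, the generated ``__init__`` and ``__repr__`` behave like this (the
values are arbitrary)::

   >>> item = InventoryItem('widget', unit_price=3.0, quantity_on_hand=10)
   >>> item.unit_price * item.quantity_on_hand
   30.0
   >>> item
   InventoryItem(name='widget', unit_price=3.0, quantity_on_hand=10)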
Aware and Naive Objects
-----------------------
-Date and time objects may be categorized as "aware" or "naive."
+Date and time objects may be categorized as "aware" or "naive" depending on
+whether or not they include timezone information.
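A quick, illustrative way to tell the two apart is to look at the ``tzinfo``
attribute (the dates are arbitrary)::

   >>> from datetime import datetime, timezone
   >>> datetime(2020, 7, 13, 12, 0).tzinfo is None                 # naive
   True
   >>> datetime(2020, 7, 13, 12, 0, tzinfo=timezone.utc).tzinfo    # aware
   datetime.timezone.utc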
With sufficient knowledge of applicable algorithmic and political time
adjustments, such as time zone and daylight saving time information,
A. Yes. In the CPython and PyPy3 implementations, the C/CFFI versions of
the decimal module integrate the high speed `libmpdec
<https://www.bytereef.org/mpdecimal/doc/libmpdec/index.html>`_ library for
-arbitrary precision correctly-rounded decimal floating point arithmetic [#]_.
+arbitrary precision correctly-rounded decimal floating point arithmetic.
``libmpdec`` uses `Karatsuba multiplication
<https://en.wikipedia.org/wiki/Karatsuba_algorithm>`_
for medium-sized numbers and the `Number Theoretic Transform
<https://en.wikipedia.org/wiki/Discrete_Fourier_transform_(general)#Number-theoretic_transform>`_
-for very large numbers.
+for very large numbers. However, to realize this performance gain, the
+context needs to be set for unrounded calculations.
-The context must be adapted for exact arbitrary precision arithmetic. :attr:`Emin`
-and :attr:`Emax` should always be set to the maximum values, :attr:`clamp`
-should always be 0 (the default). Setting :attr:`prec` requires some care.
+ >>> c = getcontext()
+ >>> c.prec = MAX_PREC
+ >>> c.Emax = MAX_EMAX
+ >>> c.Emin = MIN_EMIN
-The easiest approach for trying out bignum arithmetic is to use the maximum
-value for :attr:`prec` as well [#]_::
-
- >>> setcontext(Context(prec=MAX_PREC, Emax=MAX_EMAX, Emin=MIN_EMIN))
- >>> x = Decimal(2) ** 256
- >>> x / 128
- Decimal('904625697166532776746648320380374280103671755200316906558262375061821325312')
-
-
-For inexact results, :attr:`MAX_PREC` is far too large on 64-bit platforms and
-the available memory will be insufficient::
-
- >>> Decimal(1) / 3
- Traceback (most recent call last):
- File "<stdin>", line 1, in <module>
- MemoryError
-
-On systems with overallocation (e.g. Linux), a more sophisticated approach is to
-adjust :attr:`prec` to the amount of available RAM. Suppose that you have 8GB of
-RAM and expect 10 simultaneous operands using a maximum of 500MB each::
-
- >>> import sys
- >>>
- >>> # Maximum number of digits for a single operand using 500MB in 8-byte words
- >>> # with 19 digits per word (4-byte and 9 digits for the 32-bit build):
- >>> maxdigits = 19 * ((500 * 1024**2) // 8)
- >>>
- >>> # Check that this works:
- >>> c = Context(prec=maxdigits, Emax=MAX_EMAX, Emin=MIN_EMIN)
- >>> c.traps[Inexact] = True
- >>> setcontext(c)
- >>>
- >>> # Fill the available precision with nines:
- >>> x = Decimal(0).logical_invert() * 9
- >>> sys.getsizeof(x)
- 524288112
- >>> x + 2
- Traceback (most recent call last):
- File "<stdin>", line 1, in <module>
- decimal.Inexact: [<class 'decimal.Inexact'>]
-
-In general (and especially on systems without overallocation), it is recommended
-to estimate even tighter bounds and set the :attr:`Inexact` trap if all calculations
-are expected to be exact.
-
-
-.. [#]
- .. versionadded:: 3.3
-
-.. [#]
- .. versionchanged:: 3.9
- This approach now works for all exact results except for non-integer powers.
- Also backported to 3.7 and 3.8.
+.. versionadded:: 3.3
\ No newline at end of file
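With the context widened as shown above, an exact power-of-two division
(mirroring the bignum example that previously appeared in this section) gives
the full, unrounded result::

   >>> from decimal import getcontext, Decimal, MAX_PREC, MAX_EMAX, MIN_EMIN
   >>> c = getcontext()
   >>> c.prec, c.Emax, c.Emin = MAX_PREC, MAX_EMAX, MIN_EMIN
   >>> Decimal(2) ** 256 / 128
   Decimal('904625697166532776746648320380374280103671755200316906558262375061821325312')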
the next :class:`int` in sequence with the last :class:`int` provided, but
the way it does this is an implementation detail and may change.
+.. note::
+
+ The :meth:`_generate_next_value_` method must be defined before any members.
+
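A short sketch of the required ordering (the ``Color`` enum is illustrative)::

   from enum import Enum, auto

   class Color(Enum):
       # Must appear before any members so that auto() can use it.
       def _generate_next_value_(name, start, count, last_values):
           return name.lower()

       RED = auto()
       GREEN = auto()

   print(list(Color))   # [<Color.RED: 'red'>, <Color.GREEN: 'green'>]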
Iteration
---------
Clear any selection and update the line and column status.
Show Completions
- Open a scrollable list allowing selection of keywords and attributes. See
+ Open a scrollable list allowing selection of existing names. See
:ref:`Completions <completions>` in the Editing and navigation section below.
Expand Word
See also the indent/dedent region commands on the
:ref:`Format menu <format-menu>`.
-
.. _completions:
Completions
^^^^^^^^^^^
-Completions are supplied for functions, classes, and attributes of classes,
-both built-in and user-defined. Completions are also provided for
-filenames.
-
-The AutoCompleteWindow (ACW) will open after a predefined delay (default is
-two seconds) after a '.' or (in a string) an os.sep is typed. If after one
-of those characters (plus zero or more other characters) a tab is typed
-the ACW will open immediately if a possible continuation is found.
-
-If there is only one possible completion for the characters entered, a
-:kbd:`Tab` will supply that completion without opening the ACW.
-
-'Show Completions' will force open a completions window, by default the
-:kbd:`C-space` will open a completions window. In an empty
-string, this will contain the files in the current directory. On a
-blank line, it will contain the built-in and user-defined functions and
-classes in the current namespaces, plus any modules imported. If some
-characters have been entered, the ACW will attempt to be more specific.
-
-If a string of characters is typed, the ACW selection will jump to the
-entry most closely matching those characters. Entering a :kbd:`tab` will
-cause the longest non-ambiguous match to be entered in the Editor window or
-Shell. Two :kbd:`tab` in a row will supply the current ACW selection, as
-will return or a double click. Cursor keys, Page Up/Down, mouse selection,
-and the scroll wheel all operate on the ACW.
-
-"Hidden" attributes can be accessed by typing the beginning of hidden
-name after a '.', e.g. '_'. This allows access to modules with
-``__all__`` set, or to class-private attributes.
-
-Completions and the 'Expand Word' facility can save a lot of typing!
-
-Completions are currently limited to those in the namespaces. Names in
-an Editor window which are not via ``__main__`` and :data:`sys.modules` will
-not be found. Run the module once with your imports to correct this situation.
-Note that IDLE itself places quite a few modules in sys.modules, so
-much can be found by default, e.g. the re module.
-
-If you don't like the ACW popping up unbidden, simply make the delay
-longer or disable the extension.
+Completions are supplied, when requested and available, for module
+names, attributes of classes or functions, or filenames. Each request
+method displays a completion box with existing names. (See tab
+completions below for an exception.) For any box, change the name
+being completed and the item highlighted in the box by
+typing and deleting characters; by hitting :kbd:`Up`, :kbd:`Down`,
+:kbd:`PageUp`, :kbd:`PageDown`, :kbd:`Home`, and :kbd:`End` keys;
+and by a single click within the box. Close the box with :kbd:`Escape`,
+:kbd:`Enter`, and double :kbd:`Tab` keys or clicks outside the box.
+A double click within the box selects and closes.
+
+One way to open a box is to type a key character and wait for a
+predefined interval. This defaults to 2 seconds; customize it
+in the settings dialog. (To prevent auto popups, set the delay to a
+large number of milliseconds, such as 100000000.) For imported module
+names or class or function attributes, type '.'.
+For filenames in the root directory, type :data:`os.sep` or
+:data:`os.altsep` immediately after an opening quote. (On Windows,
+one can specify a drive first.) Move into subdirectories by typing a
+directory name and a separator.
+
+Instead of waiting, or after a box is closed, open a completion box
+immediately with Show Completions on the Edit menu. The default hot
+key is :kbd:`C-space`. If one types a prefix for the desired name
+before opening the box, the first match or near miss is made visible.
+The result is the same as if one enters a prefix
+after the box is displayed. Show Completions after a quote completes
+filenames in the current directory instead of a root directory.
+
+Hitting :kbd:`Tab` after a prefix usually has the same effect as Show
+Completions. (With no prefix, it indents.) However, if there is only
+one match to the prefix, that match is immediately added to the editor
+text without opening a box.
+
+Invoking 'Show Completions', or hitting :kbd:`Tab` after a prefix,
+outside of a string and without a preceding '.' opens a box with
+keywords, builtin names, and available module-level names.
+
+When editing code in an editor (as opposed to Shell), increase the
+available module-level names by running your code
+and not restarting the Shell thereafter. This is especially useful
+after adding imports at the top of a file. This also increases
+possible attribute completions.
+
+Completion boxes initially exclude names beginning with '_' or, for
+modules, not included in '__all__'. The hidden names can be accessed
+by typing '_' after '.', either before or after the box is opened.
.. _calltips:
**Source code:** :source:`Lib/imp.py`
.. deprecated:: 3.4
- The :mod:`imp` package is pending deprecation in favor of :mod:`importlib`.
+ The :mod:`imp` module is deprecated in favor of :mod:`importlib`.
.. index:: statement: import
Return *r* length subsequences of elements from the input *iterable*.
- Combinations are emitted in lexicographic sort order. So, if the
- input *iterable* is sorted, the combination tuples will be produced
- in sorted order.
+ The combination tuples are emitted in lexicographic ordering according to
+ the order of the input *iterable*. So, if the input *iterable* is sorted,
+ the combination tuples will be produced in sorted order.
Elements are treated as unique based on their position, not on their
value. So if the input elements are unique, there will be no repeat
Return *r* length subsequences of elements from the input *iterable*
allowing individual elements to be repeated more than once.
- Combinations are emitted in lexicographic sort order. So, if the
- input *iterable* is sorted, the combination tuples will be produced
- in sorted order.
+ The combination tuples are emitted in lexicographic ordering according to
+ the order of the input *iterable*. So, if the input *iterable* is sorted,
+ the combination tuples will be produced in sorted order.
Elements are treated as unique based on their position, not on their
value. So if the input elements are unique, the generated combinations
of the *iterable* and all possible full-length permutations
are generated.
- Permutations are emitted in lexicographic sort order. So, if the
- input *iterable* is sorted, the permutation tuples will be produced
- in sorted order.
+ The permutation tuples are emitted in lexicographic ordering according to
+ the order of the input *iterable*. So, if the input *iterable* is sorted,
+ the permutation tuples will be produced in sorted order.
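For instance, with a sorted input both functions emit sorted tuples::

   >>> from itertools import combinations, permutations
   >>> list(combinations('ABC', 2))
   [('A', 'B'), ('A', 'C'), ('B', 'C')]
   >>> list(permutations('ABC', 2))
   [('A', 'B'), ('A', 'C'), ('B', 'A'), ('B', 'C'), ('C', 'A'), ('C', 'B')]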
Elements are treated as unique based on their position, not on their
value. So if the input elements are unique, there will be no repeat
for generating bindings for C++ libraries as Python classes, and
is specifically designed for Python.
- `PySide <https://wiki.qt.io/PySide>`_
- PySide is a newer binding to the Qt toolkit, provided by Nokia.
- Compared to PyQt, its licensing scheme is friendlier to non-open source
- applications.
+ `PySide2 <https://doc.qt.io/qtforpython/>`_
+ Also known as the Qt for Python project, PySide2 is a newer binding to the
+ Qt toolkit. It is provided by The Qt Company and aims to provide a
+ complete port of PySide to Qt 5. Compared to PyQt, its licensing scheme is
+ friendlier to non-open source applications.
`wxPython <https://www.wxpython.org>`_
wxPython is a cross-platform GUI toolkit for Python that is built around
an XML-based resource format and more, including an ever growing library
of user-contributed modules.
-PyGTK, PyQt, and wxPython, all have a modern look and feel and more
+PyGTK, PyQt, PySide2, and wxPython, all have a modern look and feel and more
widgets than Tkinter. In addition, there are many other GUI toolkits for
Python, both cross-platform, and platform-specific. See the `GUI Programming
<https://wiki.python.org/moin/GuiProgramming>`_ page in the Python Wiki for a
executed in the current environment).
.. pdbcommand:: retval
+
Print the return value for the last return of a function.
.. rubric:: Footnotes
.. function:: select(rlist, wlist, xlist[, timeout])
This is a straightforward interface to the Unix :c:func:`select` system call.
- The first three arguments are sequences of 'waitable objects': either
+ The first three arguments are iterables of 'waitable objects': either
integers representing file descriptors or objects with a parameterless method
named :meth:`~io.IOBase.fileno` returning such an integer:
* *xlist*: wait for an "exceptional condition" (see the manual page for what
your system considers such a condition)
- Empty sequences are allowed, but acceptance of three empty sequences is
+ Empty iterables are allowed, but acceptance of three empty iterables is
platform-dependent. (It is known to work on Unix but not on Windows.) The
optional *timeout* argument specifies a time-out as a floating point number
in seconds. When the *timeout* argument is omitted the function blocks until
single: socket() (in module socket)
single: popen() (in module os)
- Among the acceptable object types in the sequences are Python :term:`file
+ Among the acceptable object types in the iterables are Python :term:`file
objects <file object>` (e.g. ``sys.stdin``, or objects returned by
:func:`open` or :func:`os.popen`), socket objects returned by
:func:`socket.socket`. You may also define a :dfn:`wrapper` class yourself,
available), or "xztar" (if the :mod:`lzma` module is available).
*root_dir* is a directory that will be the root directory of the
- archive; for example, we typically chdir into *root_dir* before creating the
- archive.
+ archive; all paths in the archive will be relative to it. For example,
+ we typically chdir into *root_dir* before creating the archive.
*base_dir* is the directory where we start archiving from;
i.e. *base_dir* will be the common prefix of all files and
- directories in the archive.
+ directories in the archive. *base_dir* must be given relative
+ to *root_dir*. See :ref:`shutil-archiving-example-with-basedir` for how to
+ use *base_dir* and *root_dir* together.
*root_dir* and *base_dir* both default to the current directory.
-rw-r--r-- tarek/staff 37192 2010-02-06 18:23:10 ./known_hosts
+.. _shutil-archiving-example-with-basedir:
+
+Archiving example with *base_dir*
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+In this example, similar to the `one above <shutil-archiving-example_>`_,
+we show how to use :func:`make_archive`, but this time with the usage of
+*base_dir*. We now have the following directory structure:
+
+.. code-block:: shell-session
+
+    $ tree tmp
+    tmp
+    └── root
+        └── structure
+            ├── content
+            │   └── please_add.txt
+            └── do_not_add.txt
+
+In the final archive, :file:`please_add.txt` should be included, but
+:file:`do_not_add.txt` should not. Therefore we use the following::
+
+ >>> from shutil import make_archive
+ >>> import os
+ >>> archive_name = os.path.expanduser(os.path.join('~', 'myarchive'))
+ >>> make_archive(
+ ... archive_name,
+ ... 'tar',
+ ... root_dir='tmp/root',
+ ... base_dir='structure/content',
+ ... )
+ '/Users/tarek/myarchive.tar'
+
+Listing the files in the resulting archive gives us:
+
+.. code-block:: shell-session
+
+ $ python -m tarfile -l /Users/tarek/myarchive.tar
+ structure/content/
+ structure/content/please_add.txt
+
+
Querying the size of the output terminal
----------------------------------------
- :meth:`~SSLSocket.read`
- :meth:`~SSLSocket.write`
- :meth:`~SSLSocket.getpeercert`
+ - :meth:`~SSLSocket.selected_alpn_protocol`
- :meth:`~SSLSocket.selected_npn_protocol`
- :meth:`~SSLSocket.cipher`
- :meth:`~SSLSocket.shared_ciphers`
- :meth:`~SSLSocket.compression`
- :meth:`~SSLSocket.pending`
- :meth:`~SSLSocket.do_handshake`
+ - :meth:`~SSLSocket.verify_client_post_handshake`
- :meth:`~SSLSocket.unwrap`
- :meth:`~SSLSocket.get_channel_binding`
+ - :meth:`~SSLSocket.version`
When compared to :class:`SSLSocket`, this object lacks the following
features:
Negative shift counts are illegal and cause a :exc:`ValueError` to be raised.
(2)
- A left shift by *n* bits is equivalent to multiplication by ``pow(2, n)``
- without overflow check.
+ A left shift by *n* bits is equivalent to multiplication by ``pow(2, n)``.
(3)
- A right shift by *n* bits is equivalent to division by ``pow(2, n)`` without
- overflow check.
+ A right shift by *n* bits is equivalent to floor division by ``pow(2, n)``.
(4)
Performing these calculations with at least one extra sign extension bit in
.. function:: run(args, *, stdin=None, input=None, stdout=None, stderr=None,\
capture_output=False, shell=False, cwd=None, timeout=None, \
check=False, encoding=None, errors=None, text=None, env=None, \
- universal_newlines=None)
+ universal_newlines=None, **other_popen_kwargs)
Run the command described by *args*. Wait for command to complete, then
return a :class:`CompletedProcess` instance.
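A typical invocation on a POSIX system looks like this (the command itself is
arbitrary)::

   >>> import subprocess
   >>> result = subprocess.run(["echo", "hello"], capture_output=True, text=True)
   >>> result.returncode
   0
   >>> result.stdout
   'hello\n'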
.. method:: Popen.communicate(input=None, timeout=None)
Interact with process: Send data to stdin. Read data from stdout and stderr,
- until end-of-file is reached. Wait for process to terminate. The optional
- *input* argument should be data to be sent to the child process, or
- ``None``, if no data should be sent to the child. If streams were opened in
- text mode, *input* must be a string. Otherwise, it must be bytes.
+ until end-of-file is reached. Wait for process to terminate and set the
+ :attr:`~Popen.returncode` attribute. The optional *input* argument should be
+ data to be sent to the child process, or ``None``, if no data should be sent
+ to the child. If streams were opened in text mode, *input* must be a string.
+ Otherwise, it must be bytes.
:meth:`communicate` returns a tuple ``(stdout_data, stderr_data)``.
The data will be strings if streams were opened in text mode; otherwise,
.. method:: Popen.terminate()
- Stop the child. On Posix OSs the method sends SIGTERM to the
+ Stop the child. On POSIX OSs the method sends SIGTERM to the
child. On Windows the Win32 API function :c:func:`TerminateProcess` is called
to stop the child.
.. method:: Popen.kill()
- Kills the child. On Posix OSs the function sends SIGKILL to the child.
+ Kills the child. On POSIX OSs the function sends SIGKILL to the child.
On Windows :meth:`kill` is an alias for :meth:`terminate`.
subprocess. You can now use :func:`run` in many cases, but lots of existing code
calls these functions.
-.. function:: call(args, *, stdin=None, stdout=None, stderr=None, shell=False, cwd=None, timeout=None)
+.. function:: call(args, *, stdin=None, stdout=None, stderr=None, \
+ shell=False, cwd=None, timeout=None, **other_popen_kwargs)
Run the command described by *args*. Wait for command to complete, then
return the :attr:`~Popen.returncode` attribute.
.. versionchanged:: 3.3
*timeout* was added.
-.. function:: check_call(args, *, stdin=None, stdout=None, stderr=None, shell=False, cwd=None, timeout=None)
+.. function:: check_call(args, *, stdin=None, stdout=None, stderr=None, \
+ shell=False, cwd=None, timeout=None, \
+ **other_popen_kwargs)
Run command with arguments. Wait for command to complete. If the return
code was zero then return, otherwise raise :exc:`CalledProcessError`. The
.. function:: check_output(args, *, stdin=None, stderr=None, shell=False, \
cwd=None, encoding=None, errors=None, \
- universal_newlines=None, timeout=None, text=None)
+ universal_newlines=None, timeout=None, text=None, \
+ **other_popen_kwargs)
Run command with arguments and return its output.
Python currently supports seven schemes:
-- *posix_prefix*: scheme for Posix platforms like Linux or Mac OS X. This is
+- *posix_prefix*: scheme for POSIX platforms like Linux or Mac OS X. This is
the default scheme used when Python or a component is installed.
-- *posix_home*: scheme for Posix platforms used when a *home* option is used
+- *posix_home*: scheme for POSIX platforms used when a *home* option is used
upon installation. This scheme is used when a component is installed through
Distutils with a specific home prefix.
-- *posix_user*: scheme for Posix platforms used when a component is installed
+- *posix_user*: scheme for POSIX platforms used when a component is installed
through Distutils and the *user* option is used. This scheme defines paths
located under the user home directory.
- *nt*: scheme for NT platforms like Windows.
import tarfile
tar = tarfile.open("sample.tar.gz", "r:gz")
for tarinfo in tar:
- print(tarinfo.name, "is", tarinfo.size, "bytes in size and is", end="")
+ print(tarinfo.name, "is", tarinfo.size, "bytes in size and is ", end="")
if tarinfo.isreg():
print("a regular file.")
elif tarinfo.isdir():
A generic version of :class:`collections.abc.ByteString`.
This type represents the types :class:`bytes`, :class:`bytearray`,
- and :class:`memoryview`.
+ and :class:`memoryview` of byte sequences.
As a shorthand for this type, :class:`bytes` can be used to
annotate arguments of any of the types mentioned above.
``List[ForwardRef("SomeClass")]``. This class should not be instantiated by
a user, but may be used by introspection tools.
-.. function:: NewType(typ)
+.. function:: NewType(name, tp)
- A helper function to indicate a distinct types to a typechecker,
+ A helper function to indicate a distinct type to a typechecker,
see :ref:`distinct`. At runtime it returns a function that returns
its argument. Usage::
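   # A minimal sketch of the usual pattern; UserId and get_user_name are
   # illustrative names, not taken from this document.
   from typing import NewType

   UserId = NewType('UserId', int)

   def get_user_name(user_id: UserId) -> str: ...

   # A static type checker treats UserId as a distinct type, while at
   # runtime UserId(524313) simply returns the plain int 524313.
   some_id = UserId(524313)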
.. seealso::
`Python Packaging User Guide: Creating and using virtual environments
- <https://packaging.python.org/installing/#creating-virtual-environments>`__
+ <https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#creating-a-virtual-environment>`__
Creating virtual environments
application without adding attributes to those objects. This can be especially
useful with objects that override attribute accesses.
- .. note::
-
- Caution: Because a :class:`WeakKeyDictionary` is built on top of a Python
- dictionary, it must not change size when iterating over it. This can be
- difficult to ensure for a :class:`WeakKeyDictionary` because actions
- performed by the program during iteration may cause items in the
- dictionary to vanish "by magic" (as a side effect of garbage collection).
:class:`WeakKeyDictionary` objects have an additional method that
exposes the internal references directly. The references are not guaranteed to
Mapping class that references values weakly. Entries in the dictionary will be
discarded when no strong reference to the value exists any more.
- .. note::
-
- Caution: Because a :class:`WeakValueDictionary` is built on top of a Python
- dictionary, it must not change size when iterating over it. This can be
- difficult to ensure for a :class:`WeakValueDictionary` because actions performed
- by the program during iteration may cause items in the dictionary to vanish "by
- magic" (as a side effect of garbage collection).
:class:`WeakValueDictionary` objects have an additional method that has the
same issues as the :meth:`keyrefs` method of :class:`WeakKeyDictionary`
%PYTHON% -c "import sphinx" > nul 2> nul\r
if errorlevel 1 (\r
echo Installing sphinx with %PYTHON%\r
- %PYTHON% -m pip install sphinx\r
+ %PYTHON% -m pip install sphinx==2.2.0\r
if errorlevel 1 exit /B\r
)\r
set SPHINXBUILD=%PYTHON% -c "import sphinx.cmd.build, sys; sys.exit(sphinx.cmd.build.main())"\r
A non-normative HTML file listing all valid identifier characters for Unicode
4.1 can be found at
-https://www.dcl.hpi.uni-potsdam.de/home/loewis/table-3131.html.
+https://www.unicode.org/Public/13.0.0/ucd/DerivedCoreProperties.txt
.. _keywords:
final_argument_whitespace = True
option_spec = {}
- _label = 'Deprecated since version {deprecated}, will be removed in version {removed}'
+ _deprecated_label = 'Deprecated since version {deprecated}, will be removed in version {removed}'
+ _removed_label = 'Deprecated since version {deprecated}, removed in version {removed}'
def run(self):
node = addnodes.versionmodified()
node['type'] = 'deprecated-removed'
version = (self.arguments[0], self.arguments[1])
node['version'] = version
- label = translators['sphinx'].gettext(self._label)
+ env = self.state.document.settings.env
+ current_version = tuple(int(e) for e in env.config.version.split('.'))
+ removed_version = tuple(int(e) for e in self.arguments[1].split('.'))
+ if current_version < removed_version:
+ label = self._deprecated_label
+ else:
+ label = self._removed_label
+
+ label = translators['sphinx'].gettext(label)
text = label.format(deprecated=self.arguments[0], removed=self.arguments[1])
if len(self.arguments) == 3:
inodes, messages = self.state.inline_text(self.arguments[2],
'(?:release/\\d.\\d[\\x\\d\\.]*)'];
var all_versions = {
- '3.9': 'dev (3.9)',
+ '3.10': 'dev (3.10)',
+ '3.9': 'pre (3.9)',
'3.8': '3.8',
'3.7': '3.7',
'3.6': '3.6',
{% trans %}CPython implementation detail:{% endtrans %}
{% trans %}Deprecated since version {deprecated}, will be removed in version {removed}{% endtrans %}
+{% trans %}Deprecated since version {deprecated}, removed in version {removed}{% endtrans %}
<p><a href="{{ pathto('download') }}">{% trans %}Download these documents{% endtrans %}</a></p>
<h3>{% trans %}Docs by version{% endtrans %}</h3>
<ul>
- <li><a href="https://docs.python.org/3.9/">{% trans %}Python 3.9 (in development){% endtrans %}</a></li>
+ <li><a href="https://docs.python.org/3.10/">{% trans %}Python 3.10 (in development){% endtrans %}</a></li>
+ <li><a href="https://docs.python.org/3.9/">{% trans %}Python 3.9 (pre-release){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.8/">{% trans %}Python 3.8 (stable){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.7/">{% trans %}Python 3.7 (stable){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.6/">{% trans %}Python 3.6 (security-fixes){% endtrans %}</a></li>
name attempts to find the name in the namespace.
Although scopes are determined statically, they are used dynamically. At any
-time during execution, there are at least three nested scopes whose namespaces
-are directly accessible:
+time during execution, there are 3 or 4 nested scopes whose namespaces are
+directly accessible:
* the innermost scope, which is searched first, contains the local names
* the scopes of any enclosing functions, which are searched starting with the
not the value of the object). [#]_ When a function calls another function, a new
local symbol table is created for that call.
-A function definition introduces the function name in the current symbol table.
-The value of the function name has a type that is recognized by the interpreter
-as a user-defined function. This value can be assigned to another name which
-can then also be used as a function. This serves as a general renaming
-mechanism::
+A function definition associates the function name with the function object in
+the current symbol table. The interpreter recognizes the object pointed to by
+that name as a user-defined function. Other names can also point to that same
+function object and can also be used to access the function::
>>> fib
<function fib at 10042ed0>
If you have a really long format string that you don't want to split up, it
would be nice if you could reference the variables to be formatted by name
instead of by position. This can be done by simply passing the dict and using
-square brackets ``'[]'`` to access the keys ::
+square brackets ``'[]'`` to access the keys. ::
>>> table = {'Sjoerd': 4127, 'Jack': 4098, 'Dcab': 8637678}
>>> print('Jack: {0[Jack]:d}; Sjoerd: {0[Sjoerd]:d}; '
Old string formatting
---------------------
-The ``%`` operator can also be used for string formatting. It interprets the
-left argument much like a :c:func:`sprintf`\ -style format string to be applied
-to the right argument, and returns the string resulting from this formatting
-operation. For example::
+The % operator (modulo) can also be used for string formatting. Given ``'string'
+% values``, instances of ``%`` in ``string`` are replaced with zero or more
+elements of ``values``. This operation is commonly known as string
+interpolation. For example::
>>> import math
>>> print('The value of pi is approximately %5.3f.' % math.pi)
* A :file:`Python 3.8` folder in your :file:`Applications` folder. In here
you find IDLE, the development environment that is a standard part of official
- Python distributions; PythonLauncher, which handles double-clicking Python
- scripts from the Finder; and the "Build Applet" tool, which allows you to
- package Python scripts as standalone applications on your system.
+ Python distributions; and PythonLauncher, which handles double-clicking Python
+ scripts from the Finder.
* A framework :file:`/Library/Frameworks/Python.framework`, which includes the
Python executable and libraries. The installer adds this location to your shell
Distributing Python Applications on the Mac
===========================================
-The "Build Applet" tool that is placed in the MacPython 3.6 folder is fine for
-packaging small Python scripts on your own machine to run as a standard Mac
-application. This tool, however, is not robust enough to distribute Python
-applications to other users.
-
The standard tool for deploying standalone Python applications on the Mac is
:program:`py2app`. More information on installing and using py2app can be found
at http://undefined.org/python/#py2app.
PS C:\> Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
See `About Execution Policies
- <ttps:/go.microsoft.com/fwlink/?LinkID=135170>`_
+ <https://go.microsoft.com/fwlink/?LinkID=135170>`_
for more information.
The created ``pyvenv.cfg`` file also includes the
to ``1``.
This allows the :func:`open` function, the :mod:`os` module and most other
-path functionality to accept and return paths longer than 260 characters when
-using strings. (Use of bytes as paths is deprecated on Windows, and this feature
-is not available when using bytes.)
+path functionality to accept and return paths longer than 260 characters.
After changing the above option, no further configuration is required.
What's New In Python 3.0
****************************
-TEST CHANGE TO BE UNDONE
-
.. XXX Add trademark info for Apple, Microsoft.
:Author: Guido van Rossum
PyAPI_FUNC(void) _PyGILState_Reinit(_PyRuntimeState *runtime);
+
+PyAPI_FUNC(int) _PyOS_InterruptOccurred(PyThreadState *tstate);
+
#ifdef __cplusplus
}
#endif
/*--start constants--*/
#define PY_MAJOR_VERSION 3
#define PY_MINOR_VERSION 8
-#define PY_MICRO_VERSION 3
+#define PY_MICRO_VERSION 4
#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL
#define PY_RELEASE_SERIAL 0
/* Version as a string */
-#define PY_VERSION "3.8.3"
+#define PY_VERSION "3.8.4"
/*--end constants--*/
/* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2.
return type.__instancecheck__(cls, inst)
def _new(cls, *args, **kwargs):
+ for key in kwargs:
+ if key not in cls._fields:
+ # arbitrary keyword arguments are accepted
+ continue
+ pos = cls._fields.index(key)
+ if pos < len(args):
+ raise TypeError(f"{cls.__name__} got multiple values for argument {key!r}")
if cls in _const_types:
return Constant(*args, **kwargs)
return Constant.__new__(cls, *args, **kwargs)
try:
# Register a dummy signal handler to ask Python to write the signal
- # number in the wakup file descriptor. _process_self_data() will
+ # number in the wakeup file descriptor. _process_self_data() will
# read signal numbers from this file descriptor to handle signals.
signal.signal(sig, _sighandler_noop)
ctype = "multipart/form-data; boundary={}".format(boundary)
headers = Message()
headers.set_type(ctype)
- headers['Content-Length'] = pdict['CONTENT-LENGTH']
+ try:
+ headers['Content-Length'] = pdict['CONTENT-LENGTH']
+ except KeyError:
+ pass
fs = FieldStorage(fp, headers=headers, encoding=encoding, errors=errors,
environ={'REQUEST_METHOD': 'POST'})
return {k: fs.getlist(k) for k in fs}
last_line_lfend = True
_read = 0
while 1:
- if self.limit is not None and _read >= self.limit:
+
+ if self.limit is not None and 0 <= self.limit <= _read:
break
line = self.fp.readline(1<<16) # bytes
self.bytes_read += len(line)
"""
import __future__
+import warnings
_features = [getattr(__future__, fname)
for fname in __future__.all_feature_names]
except SyntaxError as err:
pass
- try:
- code1 = compiler(source + "\n", filename, symbol)
- except SyntaxError as e:
- err1 = e
+ # Suppress warnings after the first compile to avoid duplication.
+ with warnings.catch_warnings():
+ warnings.simplefilter("ignore")
+ try:
+ code1 = compiler(source + "\n", filename, symbol)
+ except SyntaxError as e:
+ err1 = e
- try:
- code2 = compiler(source + "\n\n", filename, symbol)
- except SyntaxError as e:
- err2 = e
+ try:
+ code2 = compiler(source + "\n\n", filename, symbol)
+ except SyntaxError as e:
+ err2 = e
try:
if code:
source -- the source string; may contain \n characters
filename -- optional filename from which source was read; default
"<input>"
- symbol -- optional grammar start symbol; "single" (default) or "eval"
+ symbol -- optional grammar start symbol; "single" (default), "exec"
+ or "eval"
Return value / exceptions raised:
import functools
import unittest
+from test import support
+
from ctypes import *
from ctypes.test import need_symbol
import _ctypes_test
self.assertEqual(s.second, check.second)
self.assertEqual(s.third, check.third)
-################################################################
+ def test_callback_too_many_args(self):
+ def func(*args):
+ return len(args)
+
+ CTYPES_MAX_ARGCOUNT = 1024
+ proto = CFUNCTYPE(c_int, *(c_int,) * CTYPES_MAX_ARGCOUNT)
+ cb = proto(func)
+ args1 = (1,) * CTYPES_MAX_ARGCOUNT
+ self.assertEqual(cb(*args1), CTYPES_MAX_ARGCOUNT)
+
+ args2 = (1,) * (CTYPES_MAX_ARGCOUNT + 1)
+ with self.assertRaises(ArgumentError):
+ cb(*args2)
+
+ def test_convert_result_error(self):
+ def func():
+ return ("tuple",)
+
+ proto = CFUNCTYPE(c_int)
+ ctypes_func = proto(func)
+ with support.catch_unraisable_exception() as cm:
+ # don't test the result since it is an uninitialized value
+ result = ctypes_func()
+
+ self.assertIsInstance(cm.unraisable.exc_value, TypeError)
+ self.assertEqual(cm.unraisable.err_msg,
+ "Exception ignored on converting result "
+ "of ctypes callback function")
+ self.assertIs(cm.unraisable.object, func)
+
if __name__ == '__main__':
unittest.main()
# Relative path (but not just filename) should succeed
should_pass("WinDLL('./_sqlite3.dll')")
- # XXX: This test has started failing on Azure Pipelines CI. See
- # bpo-40214 for more information.
- if 0:
- # Insecure load flags should succeed
- should_pass("WinDLL('_sqlite3.dll', winmode=0)")
+ # Insecure load flags should succeed
+ # Clear the DLL directory to avoid safe search settings propagating
+ should_pass("windll.kernel32.SetDllDirectoryW(None); WinDLL('_sqlite3.dll', winmode=0)")
# Full path load without DLL_LOAD_DIR shouldn't find dependency
should_fail("WinDLL(nt._getfullpathname('_sqlite3.dll'), " +
from ctypes import *
-import unittest, sys
+import contextlib
+from test import support
+import unittest
+import sys
+
def callback_func(arg):
42 / arg
# created, then a full traceback printed. When SystemExit is
# raised in a callback function, the interpreter exits.
- def capture_stderr(self, func, *args, **kw):
- # helper - call function 'func', and return the captured stderr
- import io
- old_stderr = sys.stderr
- logger = sys.stderr = io.StringIO()
- try:
- func(*args, **kw)
- finally:
- sys.stderr = old_stderr
- return logger.getvalue()
+ @contextlib.contextmanager
+ def expect_unraisable(self, exc_type, exc_msg=None):
+ with support.catch_unraisable_exception() as cm:
+ yield
+
+ self.assertIsInstance(cm.unraisable.exc_value, exc_type)
+ if exc_msg is not None:
+ self.assertEqual(str(cm.unraisable.exc_value), exc_msg)
+ self.assertEqual(cm.unraisable.err_msg,
+ "Exception ignored on calling ctypes "
+ "callback function")
+ self.assertIs(cm.unraisable.object, callback_func)
def test_ValueError(self):
cb = CFUNCTYPE(c_int, c_int)(callback_func)
- out = self.capture_stderr(cb, 42)
- self.assertEqual(out.splitlines()[-1],
- "ValueError: 42")
+ with self.expect_unraisable(ValueError, '42'):
+ cb(42)
def test_IntegerDivisionError(self):
cb = CFUNCTYPE(c_int, c_int)(callback_func)
- out = self.capture_stderr(cb, 0)
- self.assertEqual(out.splitlines()[-1][:19],
- "ZeroDivisionError: ")
+ with self.expect_unraisable(ZeroDivisionError):
+ cb(0)
def test_FloatDivisionError(self):
cb = CFUNCTYPE(c_int, c_double)(callback_func)
- out = self.capture_stderr(cb, 0.0)
- self.assertEqual(out.splitlines()[-1][:19],
- "ZeroDivisionError: ")
+ with self.expect_unraisable(ZeroDivisionError):
+ cb(0.0)
def test_TypeErrorDivisionError(self):
cb = CFUNCTYPE(c_int, c_char_p)(callback_func)
- out = self.capture_stderr(cb, b"spam")
- self.assertEqual(out.splitlines()[-1],
- "TypeError: "
- "unsupported operand type(s) for /: 'int' and 'bytes'")
+ err_msg = "unsupported operand type(s) for /: 'int' and 'bytes'"
+ with self.expect_unraisable(TypeError, err_msg):
+ cb(b"spam")
+
if __name__ == '__main__':
unittest.main()
class TestStructures(unittest.TestCase):
def test_native(self):
for typ in structures:
-## print typ.value
self.assertEqual(typ.value.offset, 1)
o = typ()
o.value = 4
def test_swapped(self):
for typ in byteswapped_structures:
-## print >> sys.stderr, typ.value
self.assertEqual(typ.value.offset, 1)
o = typ()
o.value = 4
# method, because:
# - it does not recurse in to the namedtuple fields and
# convert them to dicts (using dict_factory).
- # - I don't actually want to return a dict here. The the main
+ # - I don't actually want to return a dict here. The main
# use case here is json.dumps, and it handles converting
# namedtuples to lists. Admittedly we're losing some
# information here when we produce a json list instead of a
import os
import importlib.util
import sys
-from glob import glob
+import glob
from distutils.core import Command
from distutils.errors import *
files = []
for pattern in globs:
# Each pattern has to be converted to a platform-specific path
- filelist = glob(os.path.join(src_dir, convert_path(pattern)))
+ filelist = glob.glob(os.path.join(glob.escape(src_dir), convert_path(pattern)))
# Files that match more than one pattern are only added once
files.extend([fn for fn in filelist if fn not in files
and os.path.isfile(fn)])
def find_package_modules(self, package, package_dir):
self.check_package(package, package_dir)
- module_files = glob(os.path.join(package_dir, "*.py"))
+ module_files = glob.glob(os.path.join(glob.escape(package_dir), "*.py"))
modules = []
setup_script = os.path.abspath(self.distribution.script_name)
import os
import sys
import unittest
-from test.support import run_unittest
+from test.support import run_unittest, save_restore_warnings_filters
here = os.path.dirname(__file__) or os.curdir
for fn in os.listdir(here):
if fn.startswith("test") and fn.endswith(".py"):
modname = "distutils.tests." + fn[:-3]
- __import__(modname)
+ # bpo-40055: Save/restore warnings filters to leave them unchanged.
+ # Importing tests imports docutils which imports pkg_resources
+ # which adds a warnings filter.
+ with save_restore_warnings_filters():
+ __import__(modname)
module = sys.modules[modname]
suite.addTest(module.test_suite())
return suite
if value[0] in WSP:
token, value = get_fws(value)
elif value[:2] == '=?':
+ valid_ew = False
try:
token, value = get_encoded_word(value)
bare_quoted_string.defects.append(errors.InvalidHeaderDefect(
"encoded word inside quoted string"))
+ valid_ew = True
except errors.HeaderParseError:
token, value = get_qcontent(value)
+ # Collapse the whitespace between two encoded words that occur in a
+ # bare-quoted-string.
+ if valid_ew and len(bare_quoted_string) > 1:
+ if (bare_quoted_string[-1].token_type == 'fws' and
+ bare_quoted_string[-2].token_type == 'encoded-word'):
+ bare_quoted_string[-1] = EWWhiteSpaceTerminal(
+ bare_quoted_string[-1], 'fws')
else:
token, value = get_qcontent(value)
bare_quoted_string.append(token)
def normal_body(lines): return b'\n'.join(lines) + b'\n'
if cte==None:
# Use heuristics to decide on the "best" encoding.
- try:
- return '7bit', normal_body(lines).decode('ascii')
- except UnicodeDecodeError:
- pass
- if (policy.cte_type == '8bit' and
- max(len(x) for x in lines) <= policy.max_line_length):
- return '8bit', normal_body(lines).decode('ascii', 'surrogateescape')
+ if max((len(x) for x in lines), default=0) <= policy.max_line_length:
+ try:
+ return '7bit', normal_body(lines).decode('ascii')
+ except UnicodeDecodeError:
+ pass
+ if policy.cte_type == '8bit':
+ return '8bit', normal_body(lines).decode('ascii', 'surrogateescape')
sniff = embedded_body(lines[:10])
sniff_qp = quoprimime.body_encode(sniff.decode('latin-1'),
policy.max_line_length)
without any Content Transfer Encoding.
"""
+
+ inputs = ''.join(filter(None, (display_name, username, domain, addr_spec)))
+ if '\r' in inputs or '\n' in inputs:
+ raise ValueError("invalid arguments; address parts cannot contain CR or LF")
+
# This clause with its potential 'raise' may only happen when an
# application program creates an Address object using an addr_spec
# keyword. The email library code itself must always supply username
If the first element of pair is false, then the second element is
returned unmodified.
- Optional charset if given is the character set that is used to encode
+ The optional charset is the character set that is used to encode
realname in case realname is not ASCII safe. Can be an instance of str or
a Charset-like object which has a header_encode method. Default is
'utf-8'.
__all__ = ["version", "bootstrap"]
-_SETUPTOOLS_VERSION = "41.2.0"
+_SETUPTOOLS_VERSION = "47.1.0"
-_PIP_VERSION = "19.2.3"
+_PIP_VERSION = "20.1.1"
_PROJECTS = [
- ("setuptools", _SETUPTOOLS_VERSION),
- ("pip", _PIP_VERSION),
+ ("setuptools", _SETUPTOOLS_VERSION, "py3"),
+ ("pip", _PIP_VERSION, "py2.py3"),
]
# Put our bundled wheels into a temporary directory and construct the
# additional paths that need added to sys.path
additional_paths = []
- for project, version in _PROJECTS:
- wheel_name = "{}-{}-py2.py3-none-any.whl".format(project, version)
+ for project, version, py_tag in _PROJECTS:
+ wheel_name = "{}-{}-{}-none-any.whl".format(project, version, py_tag)
whl = pkgutil.get_data(
"ensurepip",
"_bundled/{}".format(wheel_name),
additional_paths.append(os.path.join(tmpdir, wheel_name))
# Construct the arguments to be passed to the pip command
- args = ["install", "--no-index", "--find-links", tmpdir]
+ args = ["install", "--no-cache-dir", "--no-index", "--find-links", tmpdir]
if root:
args += ["--root", root]
if upgrade:
self._member_names = []
self._last_values = []
self._ignore = []
+ self._auto_called = False
def __setitem__(self, key, value):
"""Changes anything not dundered or not a descriptor.
):
raise ValueError('_names_ are reserved for future Enum use')
if key == '_generate_next_value_':
+ # check if members already defined as auto()
+ if self._auto_called:
+ raise TypeError("_generate_next_value_ must be defined before members")
setattr(self, '_generate_next_value', value)
elif key == '_ignore_':
if isinstance(value, str):
# enum overwriting a descriptor?
raise TypeError('%r already defined as: %r' % (key, self[key]))
if isinstance(value, auto):
+ self._auto_called = True
if value.value == _auto_null:
value.value = self._generate_next_value(key, 1, len(self._member_names), self._last_values[:])
value = value.value
-What's New in IDLE 3.8.1
-Released on 2019-12-16?
+What's New in IDLE 3.8.4
+Released on 2020-07-03?
======================================
+bpo-37765: Add keywords to module name completion list. Rewrite
+Completions section of IDLE doc.
+
+bpo-41152: The encoding of ``stdin``, ``stdout`` and ``stderr`` in IDLE
+is now always UTF-8.
+
+bpo-41144: Make Open Module open a special module such as os.path.
+
+bpo-40723: Make test_idle pass when run after import.
+Patch by Florian Dahlitz.
+
+
+What's New in IDLE 3.8.3
+Released on 2020-05-13
+======================================
+
bpo-38689: IDLE will no longer freeze when inspect.signature fails
when fetching a calltip.
bpo-27115: For 'Go to Line', use a Query entry box subclass with
IDLE standard behavior and improved error checking.
-bpo-39885: Since clicking to get an IDLE context menu moves the
-cursor, any text selection should be and now is cleared.
+bpo-39885: When a context menu is invoked by right-clicking outside
+of a selection, clear the selection and move the cursor. Cut and
+Copy require that the click be within the selection.
bpo-39852: Edit "Go to line" now clears any selection, preventing
accidental deletion. It also updates Ln and Col on the status bar.
bpo-39663: Add tests for pyparse find_good_parse_start().
+
+What's New in IDLE 3.8.2
+Released on 2020-02-17
+======================================
+
bpo-39600: Remove duplicate font names from configuration list.
bpo-38792: Close a shell calltip if a :exc:`KeyboardInterrupt`
Remove unneeded arguments and dead code from pyparse
find_good_parse_start method.
+
+What's New in IDLE 3.8.1
+Released on 2019-12-18
+======================================
+
bpo-38943: Fix autocomplete windows not always appearing on some
systems. Patch by Johnny Najera.
pop up a list of candidates.
"""
import __main__
+import keyword
import os
import string
import sys
(what, mode), {})
else:
if mode == ATTRS:
- if what == "":
+ if what == "": # Main module names.
namespace = {**__main__.__builtins__.__dict__,
**__main__.__dict__}
bigl = eval("dir()", namespace)
+ kwds = (s for s in keyword.kwlist
+ if s not in {'True', 'False', 'None'})
+ bigl.extend(kwds)
bigl.sort()
if "__all__" in bigl:
smalll = sorted(eval("__all__", namespace))
rmenu = None
def right_menu_event(self, event):
- self.text.tag_remove("sel", "1.0", "end")
- self.text.mark_set("insert", "@%d,%d" % (event.x, event.y))
+ text = self.text
+ newdex = text.index(f'@{event.x},{event.y}')
+ try:
+ in_selection = (text.compare('sel.first', '<=', newdex) and
+ text.compare(newdex, '<=', 'sel.last'))
+ except TclError:
+ in_selection = False
+ if not in_selection:
+ text.tag_remove("sel", "1.0", "end")
+ text.mark_set("insert", newdex)
if not self.rmenu:
self.make_rmenu()
rmenu = self.rmenu
self.event = event
iswin = sys.platform[:3] == 'win'
if iswin:
- self.text.config(cursor="arrow")
+ text.config(cursor="arrow")
for item in self.rmenu_specs:
try:
state = getattr(self, verify_state)()
rmenu.entryconfigure(label, state=state)
-
rmenu.tk_popup(event.x_root, event.y_root)
if iswin:
self.text.config(cursor="ibeam")
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta charset="utf-8" />
- <title>IDLE — Python 3.9.0a4 documentation</title>
+ <title>IDLE — Python 3.10.0a0 documentation</title>
<link rel="stylesheet" href="../_static/pydoctheme.css" type="text/css" />
<link rel="stylesheet" href="../_static/pygments.css" type="text/css" />
<script type="text/javascript" src="../_static/sidebar.js"></script>
<link rel="search" type="application/opensearchdescription+xml"
- title="Search within Python 3.9.0a4 documentation"
+ title="Search within Python 3.10.0a0 documentation"
href="../_static/opensearch.xml"/>
<link rel="author" title="About these documents" href="../about.html" />
<link rel="index" title="Index" href="../genindex.html" />
<li>
- <a href="../index.html">3.9.0a4 Documentation</a> »
+ <a href="../index.html">3.10.0a0 Documentation</a> »
</li>
<li class="nav-item nav-item-1"><a href="index.html" >The Python Standard Library</a> »</li>
line visible. A request past the end of the file goes to the end.
Clear any selection and update the line and column status.</p>
</dd>
-<dt>Show Completions</dt><dd><p>Open a scrollable list allowing selection of keywords and attributes. See
+<dt>Show Completions</dt><dd><p>Open a scrollable list allowing selection of existing names. See
<a class="reference internal" href="#completions"><span class="std std-ref">Completions</span></a> in the Editing and navigation section below.</p>
</dd>
<dt>Expand Word</dt><dd><p>Expand a prefix you have typed to match a full word in the same window;
</div>
<div class="section" id="completions">
<span id="id3"></span><h3>Completions<a class="headerlink" href="#completions" title="Permalink to this headline">¶</a></h3>
-<p>Completions are supplied for functions, classes, and attributes of classes,
-both built-in and user-defined. Completions are also provided for
-filenames.</p>
-<p>The AutoCompleteWindow (ACW) will open after a predefined delay (default is
-two seconds) after a ‘.’ or (in a string) an os.sep is typed. If after one
-of those characters (plus zero or more other characters) a tab is typed
-the ACW will open immediately if a possible continuation is found.</p>
-<p>If there is only one possible completion for the characters entered, a
-<kbd class="kbd docutils literal notranslate">Tab</kbd> will supply that completion without opening the ACW.</p>
-<p>‘Show Completions’ will force open a completions window, by default the
-<kbd class="kbd docutils literal notranslate">C-space</kbd> will open a completions window. In an empty
-string, this will contain the files in the current directory. On a
-blank line, it will contain the built-in and user-defined functions and
-classes in the current namespaces, plus any modules imported. If some
-characters have been entered, the ACW will attempt to be more specific.</p>
-<p>If a string of characters is typed, the ACW selection will jump to the
-entry most closely matching those characters. Entering a <kbd class="kbd docutils literal notranslate">tab</kbd> will
-cause the longest non-ambiguous match to be entered in the Editor window or
-Shell. Two <kbd class="kbd docutils literal notranslate">tab</kbd> in a row will supply the current ACW selection, as
-will return or a double click. Cursor keys, Page Up/Down, mouse selection,
-and the scroll wheel all operate on the ACW.</p>
-<p>“Hidden” attributes can be accessed by typing the beginning of hidden
-name after a ‘.’, e.g. ‘_’. This allows access to modules with
-<code class="docutils literal notranslate"><span class="pre">__all__</span></code> set, or to class-private attributes.</p>
-<p>Completions and the ‘Expand Word’ facility can save a lot of typing!</p>
-<p>Completions are currently limited to those in the namespaces. Names in
-an Editor window which are not via <code class="docutils literal notranslate"><span class="pre">__main__</span></code> and <a class="reference internal" href="sys.html#sys.modules" title="sys.modules"><code class="xref py py-data docutils literal notranslate"><span class="pre">sys.modules</span></code></a> will
-not be found. Run the module once with your imports to correct this situation.
-Note that IDLE itself places quite a few modules in sys.modules, so
-much can be found by default, e.g. the re module.</p>
-<p>If you don’t like the ACW popping up unbidden, simply make the delay
-longer or disable the extension.</p>
+<p>Completions are supplied, when requested and available, for module
+names, attributes of classes or functions, or filenames. Each request
+method displays a completion box with existing names. (See tab
+completions below for an exception.) For any box, change the name
+being completed and the item highlighted in the box by
+typing and deleting characters; by hitting <kbd class="kbd docutils literal notranslate">Up</kbd>, <kbd class="kbd docutils literal notranslate">Down</kbd>,
+<kbd class="kbd docutils literal notranslate">PageUp</kbd>, <kbd class="kbd docutils literal notranslate">PageDown</kbd>, <kbd class="kbd docutils literal notranslate">Home</kbd>, and <kbd class="kbd docutils literal notranslate">End</kbd> keys;
+and by a single click within the box. Close the box with <kbd class="kbd docutils literal notranslate">Escape</kbd>,
+<kbd class="kbd docutils literal notranslate">Enter</kbd>, and double <kbd class="kbd docutils literal notranslate">Tab</kbd> keys or clicks outside the box.
+A double click within the box selects and closes.</p>
+<p>One way to open a box is to type a key character and wait for a
+predefined interval. This defaults to 2 seconds; customize it
+in the settings dialog. (To prevent auto popups, set the delay to a
+large number of milliseconds, such as 100000000.) For imported module
+names or class or function attributes, type ‘.’.
+For filenames in the root directory, type <a class="reference internal" href="os.html#os.sep" title="os.sep"><code class="xref py py-data docutils literal notranslate"><span class="pre">os.sep</span></code></a> or
+<cite>os.altsep</cite> immediately after an opening quote. (On Windows,
+one can specify a drive first.) Move into subdirectories by typing a
+directory name and a separator.</p>
+<p>Instead of waiting, or after a box is closed, open a completion box
+immediately with Show Completions on the Edit menu. The default hot
+key is <kbd class="kbd docutils literal notranslate">C-space</kbd>. If one types a prefix for the desired name
+before opening the box, the first match is displayed.
+The result is the same as if one enters a prefix
+after the box is displayed. Show Completions after a quote completes
+filenames in the current directory instead of a root directory.</p>
+<p>Hitting <kbd class="kbd docutils literal notranslate">Tab</kbd> after a prefix usually has the same effect as Show
+Completions. (With no prefix, it indents.) However, if there is only
+one match to the prefix, that match is immediately added to the editor
+text without opening a box.</p>
+<p>Invoking ‘Show Completions’, or hitting <kbd class="kbd docutils literal notranslate">Tab</kbd> after a prefix,
+outside of a string and without a preceding ‘.’ opens a box with
+keywords, builtin names, and available module-level names.</p>
+<p>When editing code in an editor (as opposed to Shell), increase the
+available module-level names by running your code
+and not restarting the Shell thereafter. This is especially useful
+after adding imports at the top of a file. This also increases
+possible attribute completions.</p>
+<p>Completion boxes initially exclude names beginning with ‘_’ or, for
+modules, not included in ‘__all__’. The hidden names can be accessed
+by typing ‘_’ after ‘.’, either before or after the box is opened.</p>
</div>
<div class="section" id="calltips">
<span id="id4"></span><h3>Calltips<a class="headerlink" href="#calltips" title="Permalink to this headline">¶</a></h3>
<li>
- <a href="../index.html">3.9.0a4 Documentation</a> »
+ <a href="../index.html">3.10.0a0 Documentation</a> »
</li>
<li class="nav-item nav-item-1"><a href="index.html" >The Python Standard Library</a> »</li>
<br />
<br />
- Last updated on Mar 07, 2020.
+ Last updated on Jul 08, 2020.
<a href="https://docs.python.org/3/bugs.html">Found a bug</a>?
<br />
acp = self.autocomplete
small, large = acp.fetch_completions(
'', ac.ATTRS)
- if __main__.__file__ != ac.__file__:
+ if hasattr(__main__, '__file__') and __main__.__file__ != ac.__file__:
self.assertNotIn('AutoComplete', small) # See issue 36405.
# Test attributes
with patch.dict('__main__.__dict__', {'__all__': ['a', 'b']}):
s, b = acp.fetch_completions('', ac.ATTRS)
self.assertEqual(s, ['a', 'b'])
- self.assertIn('__name__', b) # From __main__.__dict__
- self.assertIn('sum', b) # From __main__.__builtins__.__dict__
+ self.assertIn('__name__', b) # From __main__.__dict__.
+ self.assertIn('sum', b) # From __main__.__builtins__.__dict__.
+ self.assertIn('nonlocal', b) # From keyword.kwlist.
+ pos = b.index('False') # Test False not included twice.
+ self.assertNotEqual(b[pos+1], 'False')
# Test attributes with name entity.
mock = Mock()
from collections import namedtuple
from test.support import requires
from tkinter import Tk
+from idlelib.idle_test.mock_idle import Func
Editor = editor.EditorWindow
)
+def insert(text, string):
+ text.delete('1.0', 'end')
+ text.insert('end', string)
+ text.update() # Force update for colorizer to finish.
+
+
class IndentAndNewlineTest(unittest.TestCase):
@classmethod
cls.root.destroy()
del cls.root
- def insert(self, text):
- t = self.window.text
- t.delete('1.0', 'end')
- t.insert('end', text)
- # Force update for colorizer to finish.
- t.update()
-
def test_indent_and_newline_event(self):
eq = self.assertEqual
w = self.window
w.prompt_last_line = ''
for test in tests:
with self.subTest(label=test.label):
- self.insert(test.text)
+ insert(text, test.text)
text.mark_set('insert', test.mark)
nl(event=None)
eq(get('1.0', 'end'), test.expected)
# Selected text.
- self.insert(' def f1(self, a, b):\n return a + b')
+ insert(text, ' def f1(self, a, b):\n return a + b')
text.tag_add('sel', '1.17', '1.end')
nl(None)
# Deletes selected text before adding new line.
# Preserves the whitespace in shell prompt.
w.prompt_last_line = '>>> '
- self.insert('>>> \t\ta =')
+ insert(text, '>>> \t\ta =')
text.mark_set('insert', '1.5')
nl(None)
eq(get('1.0', 'end'), '>>> \na =\n')
+class RMenuTest(unittest.TestCase):
+
+ @classmethod
+ def setUpClass(cls):
+ requires('gui')
+ cls.root = Tk()
+ cls.root.withdraw()
+ cls.window = Editor(root=cls.root)
+
+ @classmethod
+ def tearDownClass(cls):
+ cls.window._close()
+ del cls.window
+ cls.root.update_idletasks()
+ for id in cls.root.tk.call('after', 'info'):
+ cls.root.after_cancel(id)
+ cls.root.destroy()
+ del cls.root
+
+ class DummyRMenu:
+ def tk_popup(x, y): pass
+
+ def test_rclick(self):
+ pass
+
+
if __name__ == '__main__':
unittest.main(verbosity=2)
get = self.text.get
write = self.window.write
- # Test bytes.
- b = b'Test bytes.'
- eq(write(b), len(b))
- eq(get('1.0', '1.end'), b.decode())
-
# No new line - insert stays on same line.
delete('1.0', 'end')
test_text = 'test text'
dialog = self.Dummy_ModuleName('idlelib')
self.assertTrue(dialog.entry_ok().endswith('__init__.py'))
self.assertEqual(dialog.entry_error['text'], '')
+ dialog = self.Dummy_ModuleName('os.path')
+ self.assertTrue(dialog.entry_ok().endswith('path.py'))
+ self.assertEqual(dialog.entry_error['text'], '')
class GotoTest(unittest.TestCase):
-import codecs
-from codecs import BOM_UTF8
import os
-import re
import shlex
import sys
import tempfile
+import tokenize
import tkinter.filedialog as tkFileDialog
import tkinter.messagebox as tkMessageBox
import idlelib
from idlelib.config import idleConf
-if idlelib.testing: # Set True by test.test_idle to avoid setlocale.
- encoding = 'utf-8'
- errors = 'surrogateescape'
+encoding = 'utf-8'
+if sys.platform == 'win32':
+ errors = 'surrogatepass'
else:
- # Try setting the locale, so that we can find out
- # what encoding to use
- try:
- import locale
- locale.setlocale(locale.LC_CTYPE, "")
- except (ImportError, locale.Error):
- pass
-
- if sys.platform == 'win32':
- encoding = 'utf-8'
- errors = 'surrogateescape'
- else:
- try:
- # Different things can fail here: the locale module may not be
- # loaded, it may not offer nl_langinfo, or CODESET, or the
- # resulting codeset may be unknown to Python. We ignore all
- # these problems, falling back to ASCII
- locale_encoding = locale.nl_langinfo(locale.CODESET)
- if locale_encoding:
- codecs.lookup(locale_encoding)
- except (NameError, AttributeError, LookupError):
- # Try getdefaultlocale: it parses environment variables,
- # which may give a clue. Unfortunately, getdefaultlocale has
- # bugs that can cause ValueError.
- try:
- locale_encoding = locale.getdefaultlocale()[1]
- if locale_encoding:
- codecs.lookup(locale_encoding)
- except (ValueError, LookupError):
- pass
-
- if locale_encoding:
- encoding = locale_encoding.lower()
- errors = 'strict'
- else:
- # POSIX locale or macOS
- encoding = 'ascii'
- errors = 'surrogateescape'
- # Encoding is used in multiple files; locale_encoding nowhere.
- # The only use of 'encoding' below is in _decode as initial value
- # of deprecated block asking user for encoding.
- # Perhaps use elsewhere should be reviewed.
-
-coding_re = re.compile(r'^[ \t\f]*#.*?coding[:=][ \t]*([-\w.]+)', re.ASCII)
-blank_re = re.compile(r'^[ \t\f]*(?:[#\r\n]|$)', re.ASCII)
-
-def coding_spec(data):
- """Return the encoding declaration according to PEP 263.
-
- When checking encoded data, only the first two lines should be passed
- in to avoid a UnicodeDecodeError if the rest of the data is not unicode.
- The first two lines would contain the encoding specification.
-
- Raise a LookupError if the encoding is declared but unknown.
- """
- if isinstance(data, bytes):
- # This encoding might be wrong. However, the coding
- # spec must be ASCII-only, so any non-ASCII characters
- # around here will be ignored. Decoding to Latin-1 should
- # never fail (except for memory outage)
- lines = data.decode('iso-8859-1')
- else:
- lines = data
- # consider only the first two lines
- if '\n' in lines:
- lst = lines.split('\n', 2)[:2]
- elif '\r' in lines:
- lst = lines.split('\r', 2)[:2]
- else:
- lst = [lines]
- for line in lst:
- match = coding_re.match(line)
- if match is not None:
- break
- if not blank_re.match(line):
- return None
- else:
- return None
- name = match.group(1)
- try:
- codecs.lookup(name)
- except LookupError:
- # The standard encoding error does not indicate the encoding
- raise LookupError("Unknown encoding: "+name)
- return name
+ errors = 'surrogateescape'
+
class IOBinding:
self.save_as)
self.__id_savecopy = self.text.bind("<<save-copy-of-window-as-file>>",
self.save_a_copy)
- self.fileencoding = None
+ self.fileencoding = 'utf-8'
self.__id_print = self.text.bind("<<print-window>>", self.print_window)
def close(self):
self.text.focus_set()
return "break"
- eol = r"(\r\n)|\n|\r" # \r\n (Windows), \n (UNIX), or \r (Mac)
- eol_re = re.compile(eol)
eol_convention = os.linesep # default
def loadfile(self, filename):
try:
- # open the file in binary mode so that we can handle
- # end-of-line convention ourselves.
- with open(filename, 'rb') as f:
- two_lines = f.readline() + f.readline()
- f.seek(0)
- bytes = f.read()
- except OSError as msg:
- tkMessageBox.showerror("I/O Error", str(msg), parent=self.text)
+ try:
+ with tokenize.open(filename) as f:
+ chars = f.read()
+ fileencoding = f.encoding
+ eol_convention = f.newlines
+ converted = False
+ except (UnicodeDecodeError, SyntaxError):
+ # Wait for the editor window to appear
+ self.editwin.text.update()
+ enc = askstring(
+ "Specify file encoding",
+ "The file's encoding is invalid for Python 3.x.\n"
+ "IDLE will convert it to UTF-8.\n"
+ "What is the current encoding of the file?",
+ initialvalue='utf-8',
+ parent=self.editwin.text)
+ with open(filename, encoding=enc) as f:
+ chars = f.read()
+ fileencoding = f.encoding
+ eol_convention = f.newlines
+ converted = True
+ except OSError as err:
+ tkMessageBox.showerror("I/O Error", str(err), parent=self.text)
return False
- chars, converted = self._decode(two_lines, bytes)
- if chars is None:
+ except UnicodeDecodeError:
tkMessageBox.showerror("Decoding Error",
"File %s\nFailed to Decode" % filename,
parent=self.text)
return False
- # We now convert all end-of-lines to '\n's
- firsteol = self.eol_re.search(chars)
- if firsteol:
- self.eol_convention = firsteol.group(0)
- chars = self.eol_re.sub(r"\n", chars)
+
self.text.delete("1.0", "end")
self.set_filename(None)
+ self.fileencoding = fileencoding
+ self.eol_convention = eol_convention
self.text.insert("1.0", chars)
self.reset_undo()
self.set_filename(filename)
self.updaterecentfileslist(filename)
return True
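The rewritten loadfile() relies on tokenize.open(), which handles PEP 263 coding cookies, the UTF-8 BOM, and newline translation itself. A small sketch of that behavior, using a throwaway temporary file (the file name and contents here are illustrative only):

    import os
    import tempfile
    import tokenize

    # Write a small file with a latin-1 coding cookie and Windows line endings.
    with tempfile.NamedTemporaryFile('w', suffix='.py', encoding='latin-1',
                                     newline='\r\n', delete=False) as tmp:
        tmp.write('# -*- coding: latin-1 -*-\ns = "caf\xe9"\n')
        path = tmp.name

    # tokenize.open() decodes using the declared cookie (or a UTF-8 BOM)
    # and translates newlines, recording the convention it saw.
    with tokenize.open(path) as f:
        chars = f.read()
        print(f.encoding)    # the codec named by the coding cookie
        print(f.newlines)    # '\r\n' for this file
    os.unlink(path)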
- def _decode(self, two_lines, bytes):
- "Create a Unicode string."
- chars = None
- # Check presence of a UTF-8 signature first
- if bytes.startswith(BOM_UTF8):
- try:
- chars = bytes[3:].decode("utf-8")
- except UnicodeDecodeError:
- # has UTF-8 signature, but fails to decode...
- return None, False
- else:
- # Indicates that this file originally had a BOM
- self.fileencoding = 'BOM'
- return chars, False
- # Next look for coding specification
- try:
- enc = coding_spec(two_lines)
- except LookupError as name:
- tkMessageBox.showerror(
- title="Error loading the file",
- message="The encoding '%s' is not known to this Python "\
- "installation. The file may not display correctly" % name,
- parent = self.text)
- enc = None
- except UnicodeDecodeError:
- return None, False
- if enc:
- try:
- chars = str(bytes, enc)
- self.fileencoding = enc
- return chars, False
- except UnicodeDecodeError:
- pass
- # Try ascii:
- try:
- chars = str(bytes, 'ascii')
- self.fileencoding = None
- return chars, False
- except UnicodeDecodeError:
- pass
- # Try utf-8:
- try:
- chars = str(bytes, 'utf-8')
- self.fileencoding = 'utf-8'
- return chars, False
- except UnicodeDecodeError:
- pass
- # Finally, try the locale's encoding. This is deprecated;
- # the user should declare a non-ASCII encoding
- try:
- # Wait for the editor window to appear
- self.editwin.text.update()
- enc = askstring(
- "Specify file encoding",
- "The file's encoding is invalid for Python 3.x.\n"
- "IDLE will convert it to UTF-8.\n"
- "What is the current encoding of the file?",
- initialvalue = encoding,
- parent = self.editwin.text)
-
- if enc:
- chars = str(bytes, enc)
- self.fileencoding = None
- return chars, True
- except (UnicodeDecodeError, LookupError):
- pass
- return None, False # None on failure
-
def maybesave(self):
if self.get_saved():
return "yes"
# text to us. Don't try to guess further.
return chars
# Preserve a BOM that might have been present on opening
- if self.fileencoding == 'BOM':
- return BOM_UTF8 + chars.encode("utf-8")
+ if self.fileencoding == 'utf-8-sig':
+ return chars.encode('utf-8-sig')
# See whether there is anything non-ASCII in it.
# If not, no need to figure out the encoding.
try:
return chars.encode('ascii')
- except UnicodeError:
+ except UnicodeEncodeError:
pass
# Check if there is an encoding declared
try:
- # a string, let coding_spec slice it to the first two lines
- enc = coding_spec(chars)
- failed = None
- except LookupError as msg:
- failed = msg
- enc = None
- else:
- if not enc:
- # PEP 3120: default source encoding is UTF-8
- enc = 'utf-8'
- if enc:
- try:
- return chars.encode(enc)
- except UnicodeError:
- failed = "Invalid encoding '%s'" % enc
+ encoded = chars.encode('ascii', 'replace')
+ enc, _ = tokenize.detect_encoding(io.BytesIO(encoded).readline)
+ return chars.encode(enc)
+ except SyntaxError as err:
+ failed = str(err)
+ except UnicodeEncodeError:
+ failed = "Invalid encoding '%s'" % enc
tkMessageBox.showerror(
"I/O Error",
"%s.\nSaving as UTF-8" % failed,
- parent = self.text)
+ parent=self.text)
# Fallback: save as UTF-8, with BOM - ignoring the incorrect
# declared encoding
- return BOM_UTF8 + chars.encode("utf-8")
+ return chars.encode('utf-8-sig')
def print_window(self, event):
confirm = tkMessageBox.askokcancel(
from tkinter import messagebox
from idlelib.editor import EditorWindow
-from idlelib import iomenu
file_line_pats = [
Return:
Length of text inserted.
"""
- if isinstance(s, bytes):
- s = s.decode(iomenu.encoding, "replace")
+ assert isinstance(s, str)
self.text.insert(mark, s, tags)
self.text.see(mark)
self.text.update()
# HelpSource was extracted from configHelpSourceEdit.py (temporarily
# config_help.py), with darwin code moved from ok to path_ok.
-import importlib
+import importlib.util, importlib.abc
import os
import shlex
from sys import executable, platform # Platform is set for one test.
self.withdraw() # Hide while configuring, especially geometry.
self.title(title)
self.transient(parent)
- self.grab_set()
+ if not _utest: # Otherwise fails when unittest is run directly.
+ self.grab_set()
windowingsystem = self.tk.call('tk', 'windowingsystem')
if windowingsystem == 'aqua':
self.showerror(str(msg))
return None
if spec is None:
- self.showerror("module not found")
+ self.showerror("module not found.")
return None
if not isinstance(spec.loader, importlib.abc.SourceLoader):
- self.showerror("not a source-based module")
+ self.showerror("not a source-based module.")
return None
try:
file_path = spec.loader.get_filename(name)
except AttributeError:
- self.showerror("loader does not support get_filename",
- parent=self)
+ self.showerror("loader does not support get_filename.")
return None
+ except ImportError:
+ # Some special modules require this (e.g. os.path)
+ try:
+ file_path = spec.loader.get_filename()
+ except TypeError:
+ self.showerror("loader failed to get filename.")
+ return None
return file_path
return cli_args
def entry_ok(self):
- "Return apparently valid (cli_args, restart) or None"
+ "Return apparently valid (cli_args, restart) or None."
cli_args = self.cli_args_ok()
restart = self.restartvar.get()
return None if cli_args is None else (cli_args, restart)
"""Utility to display the available icons."""
root = Tk()
import glob
- list = glob.glob(os.path.join(icondir, "*.gif"))
+ list = glob.glob(os.path.join(glob.escape(icondir), "*.gif"))
list.sort()
images = []
row = column = 0
if recursive or toplevel:
print('recursing down:')
import glob
- names = glob.glob(os.path.join(filename, '*'))
+ names = glob.glob(os.path.join(glob.escape(filename), '*'))
testall(names, recursive, 0)
else:
print('*** directory (use -r) ***')
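Several hunks in this patch wrap the fixed part of a pattern in glob.escape() before appending a wildcard. A minimal illustration of why, with a hypothetical directory name:

    import glob
    import os

    # If the fixed part of the pattern contains glob metacharacters
    # ('*', '?', '['), they must be escaped so that only the appended
    # wildcard is interpreted as a pattern.
    dirname = 'build [x86]'                      # hypothetical path
    pattern = os.path.join(glob.escape(dirname), '*.py')
    print(pattern)                               # build [[]x86]/*.py on POSIX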
return False
def __hash__(self):
- return self._ip ^ self._prefixlen ^ int(self.network.network_address)
+ return hash((self._ip, self._prefixlen, int(self.network.network_address)))
__reduce__ = _IPAddressBase.__reduce__
return False
def __hash__(self):
- return self._ip ^ self._prefixlen ^ int(self.network.network_address)
+ return hash((self._ip, self._prefixlen, int(self.network.network_address)))
__reduce__ = _IPAddressBase.__reduce__
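With the old XOR-based hash, every single-host interface hashed to its prefix length (ip ^ prefixlen ^ network_address collapses to prefixlen when the network address equals the ip); hashing the tuple of components keeps the values distinct, which the regression tests added later in this patch verify. A quick check:

    import ipaddress

    # bpo-41004: previously both of these hashed to 32.
    a = ipaddress.IPv4Interface('1.2.3.4')
    b = ipaddress.IPv4Interface('2.3.4.5')
    print(hash(a) != hash(b))   # True with the tuple-based hash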
try:
stat = os.stat(fullname)
except OSError:
- del cache[filename]
+ cache.pop(filename, None)
continue
if size != stat.st_size or mtime != stat.st_mtime:
- del cache[filename]
+ cache.pop(filename, None)
def updatecache(filename, module_globals=None):
if filename in cache:
if len(cache[filename]) != 1:
- del cache[filename]
+ cache.pop(filename, None)
if not filename or (filename.startswith('<') and filename.endswith('>')):
return []
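Replacing del cache[filename] with cache.pop(filename, None) makes the removal tolerant of the entry already being gone, for example when another thread cleared it first. A trivial illustration:

    cache = {'module.py': ('size', 'mtime', ['lines'], 'fullname')}

    # pop() with a default never raises, even if the key is already gone;
    # del cache['module.py'] would raise KeyError the second time.
    cache.pop('module.py', None)
    cache.pop('module.py', None)
    print(cache)   # {}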
def read_mime_types(file):
try:
- f = open(file)
+ f = open(file, encoding='utf-8')
except OSError:
return None
with f:
'.dvi' : 'application/x-dvi',
'.gtar' : 'application/x-gtar',
'.hdf' : 'application/x-hdf',
+ '.h5' : 'application/x-hdf5',
'.latex' : 'application/x-latex',
'.mif' : 'application/x-mif',
'.cdf' : 'application/x-netcdf',
if sys.platform == 'win32':
return ['spawn']
else:
+ methods = ['spawn', 'fork'] if sys.platform == 'darwin' else ['fork', 'spawn']
if reduction.HAVE_SEND_HANDLE:
- return ['fork', 'spawn', 'forkserver']
- else:
- return ['fork', 'spawn']
+ methods.append('forkserver')
+ return methods
+
#
# Context types for fixed start method
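The reordering in get_all_start_methods() above puts the platform's default start method first, and 'forkserver' is only appended when file descriptors can be passed between processes. A quick check with platform-dependent output:

    import multiprocessing as mp

    print(mp.get_start_method())         # 'spawn' on macOS/Windows, 'fork' on Linux
    print(mp.get_all_start_methods())    # e.g. ['spawn', 'fork', 'forkserver'] on macOS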
import pprint
import signal
import inspect
+import tokenize
import traceback
import linecache
def find_function(funcname, filename):
cre = re.compile(r'def\s+%s\s*[(]' % re.escape(funcname))
try:
- fp = open(filename)
+ fp = tokenize.open(filename)
except OSError:
return None
# consumer of this info expects the first line to be 1
except Exception:
ret = []
# Then, try to complete file names as well.
- globs = glob.glob(text + '*')
+ globs = glob.glob(glob.escape(text) + '*')
for fn in globs:
if os.path.isdir(fn):
ret.append(fn + '/')
return module_name
# Protect the iteration by using a list copy of sys.modules against dynamic
# modules that trigger imports of other modules upon calls to getattr.
- for module_name, module in list(sys.modules.items()):
+ for module_name, module in sys.modules.copy().items():
if module_name == '__main__' or module is None:
continue
try:
# -*- coding: utf-8 -*-
-# Autogenerated by Sphinx on Wed May 13 19:29:27 2020
+# Autogenerated by Sphinx on Mon Jul 13 13:47:56 2020
topics = {'assert': 'The "assert" statement\n'
'**********************\n'
'\n'
' the current environment).\n'
'\n'
'retval\n'
- 'Print the return value for the last return of a function.\n'
+ '\n'
+ ' Print the return value for the last return of a function.\n'
'\n'
'-[ Footnotes ]-\n'
'\n'
'\n'
'A non-normative HTML file listing all valid identifier '
'characters for\n'
- 'Unicode 4.1 can be found at https://www.dcl.hpi.uni-\n'
- 'potsdam.de/home/loewis/table-3131.html.\n'
+ 'Unicode 4.1 can be found at\n'
+ 'https://www.unicode.org/Public/13.0.0/ucd/DerivedCoreProperties.txt\n'
'\n'
'\n'
'Keywords\n'
def write_history():
try:
readline.write_history_file(history)
- except (FileNotFoundError, PermissionError):
- # home directory does not exist or is not writable
- # https://bugs.python.org/issue19891
+ except OSError:
+ # bpo-19891, bpo-41193: Home directory does not exist
+ # or is not writable, or the filesystem is read-only.
pass
atexit.register(write_history)
if recursive or toplevel:
print('recursing down:')
import glob
- names = glob.glob(os.path.join(filename, '*'))
+ names = glob.glob(os.path.join(glob.escape(filename), '*'))
testall(names, recursive, 0)
else:
print('*** directory (use -r) ***')
-#-*- coding: iso-8859-1 -*-
# pysqlite2/test/userfunctions.py: tests for user-defined functions and
# aggregates.
#
-# Copyright (C) 2005-2007 Gerhard Häring <gh@ghaering.de>
+# Copyright (C) 2005-2007 Gerhard Häring <gh@ghaering.de>
#
# This file is part of pysqlite.
#
self.con.create_function("isblob", 1, func_isblob)
self.con.create_function("islonglong", 1, func_islonglong)
self.con.create_function("spam", -1, func)
+ self.con.execute("create table test(t text)")
def tearDown(self):
self.con.close()
val = cur.fetchone()[0]
self.assertEqual(val, 2)
+ # Regarding deterministic functions:
+ #
+ # Between 3.8.3 and 3.15.0, deterministic functions were only used to
+ # optimize inner loops, so for those versions we can only test if the
+ # sqlite machinery has factored out a call or not. From 3.15.0 and onward,
+ # deterministic functions were permitted in WHERE clauses of partial
+ # indices, which allows testing based on syntax instead of the query optimizer.
+ @unittest.skipIf(sqlite.sqlite_version_info < (3, 8, 3), "Requires SQLite 3.8.3 or higher")
def CheckFuncNonDeterministic(self):
mock = unittest.mock.Mock(return_value=None)
- self.con.create_function("deterministic", 0, mock, deterministic=False)
- self.con.execute("select deterministic() = deterministic()")
- self.assertEqual(mock.call_count, 2)
-
- @unittest.skipIf(sqlite.sqlite_version_info < (3, 8, 3), "deterministic parameter not supported")
+ self.con.create_function("nondeterministic", 0, mock, deterministic=False)
+ if sqlite.sqlite_version_info < (3, 15, 0):
+ self.con.execute("select nondeterministic() = nondeterministic()")
+ self.assertEqual(mock.call_count, 2)
+ else:
+ with self.assertRaises(sqlite.OperationalError):
+ self.con.execute("create index t on test(t) where nondeterministic() is not null")
+
+ @unittest.skipIf(sqlite.sqlite_version_info < (3, 8, 3), "Requires SQLite 3.8.3 or higher")
def CheckFuncDeterministic(self):
mock = unittest.mock.Mock(return_value=None)
self.con.create_function("deterministic", 0, mock, deterministic=True)
- self.con.execute("select deterministic() = deterministic()")
- self.assertEqual(mock.call_count, 1)
+ if sqlite.sqlite_version_info < (3, 15, 0):
+ self.con.execute("select deterministic() = deterministic()")
+ self.assertEqual(mock.call_count, 1)
+ else:
+ try:
+ self.con.execute("create index t on test(t) where deterministic() is not null")
+ except sqlite.OperationalError:
+ self.fail("Unexpected failure while creating partial index")
@unittest.skipIf(sqlite.sqlite_version_info >= (3, 8, 3), "SQLite < 3.8.3 needed")
def CheckFuncDeterministicNotSupported(self):
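For reference, a minimal sketch (not part of the patch) of the deterministic flag exercised above. It requires Python 3.8+ and SQLite 3.8.3 or newer; on older SQLite, sqlite3 raises NotSupportedError.

    import sqlite3

    con = sqlite3.connect(':memory:')
    # Declaring the function deterministic lets SQLite use it in indexes
    # and fold repeated calls with identical arguments.
    con.create_function('twice', 1, lambda x: 2 * x, deterministic=True)
    print(con.execute('SELECT twice(21)').fetchone()[0])   # 42
    con.close()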
calculated from ``c`` as given. Use the second case with care, as it can
lead to garbage results.
"""
- if c is None:
- c = mean(data)
+ if c is not None:
+ T, total, count = _sum((x-c)**2 for x in data)
+ return (T, total)
+ c = mean(data)
T, total, count = _sum((x-c)**2 for x in data)
# The following sum should mathematically equal zero, but due to rounding
# error may not.
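A float-based sketch of the two branches above (the real _ss() works on exact values via _sum()): with an explicit c the squared deviations are summed directly; otherwise the mean is used and a correction term removes the rounding residue.

    from statistics import mean

    def ss(data, c=None):
        # Sum of squared deviations about c, or about the mean if c is None.
        if c is not None:
            return sum((x - c) ** 2 for x in data)
        c = mean(data)
        total = sum((x - c) ** 2 for x in data)
        # The deviations should sum to exactly zero; subtract any residue
        # introduced by floating-point rounding.
        total -= sum(x - c for x in data) ** 2 / len(data)
        return total

    print(ss([2.0, 4.0, 6.0]))          # 8.0
    print(ss([2.0, 4.0, 6.0], c=4.0))   # 8.0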
# Skip tests if _multiprocessing wasn't built.
_multiprocessing = test.support.import_module('_multiprocessing')
# Skip tests if sem_open implementation is broken.
-test.support.import_module('multiprocessing.synchronize')
+support.skip_if_broken_multiprocessing_synchronize()
import threading
import multiprocessing.connection
def get_module_names(self):
import glob
folder = os.path.dirname(multiprocessing.__file__)
- pattern = os.path.join(folder, '*.py')
+ pattern = os.path.join(glob.escape(folder), '*.py')
files = glob.glob(pattern)
modules = [os.path.splitext(os.path.split(f)[1])[0] for f in files]
modules = ['multiprocessing.' + m for m in modules]
self.assertEqual(methods, ['spawn'])
else:
self.assertTrue(methods == ['fork', 'spawn'] or
- methods == ['fork', 'spawn', 'forkserver'])
+ methods == ['spawn', 'fork'] or
+ methods == ['fork', 'spawn', 'forkserver'] or
+ methods == ['spawn', 'fork', 'forkserver'])
def test_preload_resources(self):
if multiprocessing.get_start_method() != 'forkserver':
raise self.exc_type("saw event " + event)
-class TestFinalizeHook:
- """Used in the test_finalize_hooks function to ensure that hooks
- are correctly cleaned up, that they are notified about the cleanup,
- and are unable to prevent it.
- """
-
- def __init__(self):
- print("Created", id(self), file=sys.stdout, flush=True)
-
- def __call__(self, event, args):
- # Avoid recursion when we call id() below
- if event == "builtins.id":
- return
-
- print(event, id(self), file=sys.stdout, flush=True)
-
- if event == "cpython._PySys_ClearAuditHooks":
- raise RuntimeError("Should be ignored")
- elif event == "cpython.PyInterpreterState_Clear":
- raise RuntimeError("Should be ignored")
-
-
# Simple helpers, since we are not in unittest here
def assertEqual(x, y):
if x != y:
pass
-def test_finalize_hooks():
- sys.addaudithook(TestFinalizeHook())
-
-
def test_pickle():
import pickle
if __name__ == "__main__":
- from test.libregrtest.setup import suppress_msvcrt_asserts
+ from test.support import suppress_msvcrt_asserts
- suppress_msvcrt_asserts(False)
+ suppress_msvcrt_asserts()
test = sys.argv[1]
globals()[test]()
def cleanup(self):
import glob
- path = os.path.join(self.tmp_dir, 'test_python_*')
+ path = os.path.join(glob.escape(self.tmp_dir), 'test_python_*')
print("Cleanup %s directory" % self.tmp_dir)
for name in glob.glob(path):
if os.path.isdir(name):
if ns.threshold is not None:
gc.set_threshold(ns.threshold)
- suppress_msvcrt_asserts(ns.verbose and ns.verbose >= 2)
+ support.suppress_msvcrt_asserts(ns.verbose and ns.verbose >= 2)
support.use_resources = ns.use_resources
sys.addaudithook(_test_audit_hook)
-def suppress_msvcrt_asserts(verbose):
- try:
- import msvcrt
- except ImportError:
- return
-
- msvcrt.SetErrorMode(msvcrt.SEM_FAILCRITICALERRORS|
- msvcrt.SEM_NOALIGNMENTFAULTEXCEPT|
- msvcrt.SEM_NOGPFAULTERRORBOX|
- msvcrt.SEM_NOOPENFILEERRORBOX)
- try:
- msvcrt.CrtSetReportMode
- except AttributeError:
- # release build
- return
-
- for m in [msvcrt.CRT_WARN, msvcrt.CRT_ERROR, msvcrt.CRT_ASSERT]:
- if verbose:
- msvcrt.CrtSetReportMode(m, msvcrt.CRTDBG_MODE_FILE)
- msvcrt.CrtSetReportFile(m, msvcrt.CRTDBG_FILE_STDERR)
- else:
- msvcrt.CrtSetReportMode(m, 0)
-
-
-
def replace_stdout():
"""Set stdout encoder error handler to backslashreplace (as stderr error
handler) to avoid UnicodeEncodeError when printing a traceback"""
except ImportError:
_testbuffer = None
-try:
- import numpy as np
-except ImportError:
- np = None
-
from test import support
from test.support import (
TestFailed, TESTFN, run_with_locale, no_tracing,
_2G, _4G, bigmemtest, reap_threads, forget,
+ save_restore_warnings_filters
)
from pickle import bytes_types
+
+# bpo-41003: Save/restore warnings filters to leave them unchanged.
+# Ignore filters installed by numpy.
+try:
+ with save_restore_warnings_filters():
+ import numpy as np
+except ImportError:
+ np = None
+
+
requires_32b = unittest.skipUnless(sys.maxsize < 2**32,
"test is only meaningful on 32-bit builds")
For example, @_requires_unix_version('FreeBSD', (7, 2)) raises SkipTest if
the FreeBSD version is less than 7.2.
"""
- def decorator(func):
- @functools.wraps(func)
- def wrapper(*args, **kw):
- if platform.system() == sysname:
- version_txt = platform.release().split('-', 1)[0]
- try:
- version = tuple(map(int, version_txt.split('.')))
- except ValueError:
- pass
- else:
- if version < min_version:
- min_version_txt = '.'.join(map(str, min_version))
- raise unittest.SkipTest(
- "%s version %s or higher required, not %s"
- % (sysname, min_version_txt, version_txt))
- return func(*args, **kw)
- wrapper.min_version = min_version
- return wrapper
- return decorator
+ import platform
+ min_version_txt = '.'.join(map(str, min_version))
+ version_txt = platform.release().split('-', 1)[0]
+ if platform.system() == sysname:
+ try:
+ version = tuple(map(int, version_txt.split('.')))
+ except ValueError:
+ skip = False
+ else:
+ skip = version < min_version
+ else:
+ skip = False
+
+ return unittest.skipIf(
+ skip,
+ f"{sysname} version {min_version_txt} or higher required, not "
+ f"{version_txt}"
+ )
+
def requires_freebsd_version(*min_version):
"""Decorator raising SkipTest if the OS is FreeBSD and the FreeBSD version is
dll,
os.path.join(dest_dir, os.path.basename(dll))
))
- for runtime in glob.glob(os.path.join(src_dir, "vcruntime*.dll")):
+ for runtime in glob.glob(os.path.join(glob.escape(src_dir), "vcruntime*.dll")):
self._also_link.append((
runtime,
os.path.join(dest_dir, os.path.basename(runtime))
test_case.assertCountEqual(module.__all__, expected)
+def suppress_msvcrt_asserts(verbose=False):
+ try:
+ import msvcrt
+ except ImportError:
+ return
+
+ msvcrt.SetErrorMode(msvcrt.SEM_FAILCRITICALERRORS
+ | msvcrt.SEM_NOALIGNMENTFAULTEXCEPT
+ | msvcrt.SEM_NOGPFAULTERRORBOX
+ | msvcrt.SEM_NOOPENFILEERRORBOX)
+
+ # CrtSetReportMode() is only available in debug build
+ if hasattr(msvcrt, 'CrtSetReportMode'):
+ for m in [msvcrt.CRT_WARN, msvcrt.CRT_ERROR, msvcrt.CRT_ASSERT]:
+ if verbose:
+ msvcrt.CrtSetReportMode(m, msvcrt.CRTDBG_MODE_FILE)
+ msvcrt.CrtSetReportFile(m, msvcrt.CRTDBG_FILE_STDERR)
+ else:
+ msvcrt.CrtSetReportMode(m, 0)
+
+
class SuppressCrashReport:
"""Try to prevent a crash report from popping up.
def __enter__(self):
"""On Windows, disable Windows Error Reporting dialogs using
- SetErrorMode.
+ SetErrorMode() and CrtSetReportMode().
On UNIX, try to save the previous core file size limit, then set
soft limit to 0.
# see http://msdn.microsoft.com/en-us/library/windows/desktop/ms680621.aspx
# GetErrorMode is not available on Windows XP and Windows Server 2003,
# but SetErrorMode returns the previous value, so we can use that
- import ctypes
- self._k32 = ctypes.windll.kernel32
- SEM_NOGPFAULTERRORBOX = 0x02
- self.old_value = self._k32.SetErrorMode(SEM_NOGPFAULTERRORBOX)
- self._k32.SetErrorMode(self.old_value | SEM_NOGPFAULTERRORBOX)
-
- # Suppress assert dialogs in debug builds
- # (see http://bugs.python.org/issue23314)
try:
import msvcrt
- msvcrt.CrtSetReportMode
- except (AttributeError, ImportError):
- # no msvcrt or a release build
- pass
- else:
+ except ImportError:
+ return
+
+ self.old_value = msvcrt.SetErrorMode(msvcrt.SEM_NOGPFAULTERRORBOX)
+
+ msvcrt.SetErrorMode(self.old_value | msvcrt.SEM_NOGPFAULTERRORBOX)
+
+ # bpo-23314: Suppress assert dialogs in debug builds.
+ # CrtSetReportMode() is only available in debug build.
+ if hasattr(msvcrt, 'CrtSetReportMode'):
self.old_modes = {}
for report_type in [msvcrt.CRT_WARN,
msvcrt.CRT_ERROR,
return
if sys.platform.startswith('win'):
- self._k32.SetErrorMode(self.old_value)
+ import msvcrt
+ msvcrt.SetErrorMode(self.old_value)
if self.old_modes:
- import msvcrt
for report_type, (old_mode, old_file) in self.old_modes.items():
msvcrt.CrtSetReportMode(report_type, old_mode)
msvcrt.CrtSetReportFile(report_type, old_file)
del self.exc_value
del self.exc_traceback
del self.thread
+
+
+@contextlib.contextmanager
+def save_restore_warnings_filters():
+ old_filters = warnings.filters[:]
+ try:
+ yield
+ finally:
+ warnings.filters[:] = old_filters
+
+
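A usage sketch for the helper just defined (it assumes the helper is importable from test.support, as the pickle test earlier in this patch does); any filter installed inside the block is discarded on exit:

    import warnings
    from test.support import save_restore_warnings_filters  # added by this patch

    before = list(warnings.filters)
    with save_restore_warnings_filters():
        warnings.simplefilter('ignore')        # temporary filter
    assert list(warnings.filters) == before    # restored on exit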
+def skip_if_broken_multiprocessing_synchronize():
+ """
+ Skip tests if the multiprocessing.synchronize module is missing, if there
+ is no available semaphore implementation, or if creating a lock raises an
+ OSError (on Linux only).
+ """
+
+ # Skip tests if the _multiprocessing extension is missing.
+ import_module('_multiprocessing')
+
+ # Skip tests if there is no available semaphore implementation:
+ # multiprocessing.synchronize requires _multiprocessing.SemLock.
+ synchronize = import_module('multiprocessing.synchronize')
+
+ if sys.platform == "linux":
+ try:
+ # bpo-38377: On Linux, creating a semaphore fails with OSError
+ # if the current user does not have the permission to create
+ # a file in /dev/shm/ directory.
+ synchronize.Lock(ctx=None)
+ except OSError as exc:
+ raise unittest.SkipTest(f"broken multiprocessing SemLock: {exc!r}")
self._assert_values(i.to_bytes(2, 'little', signed=True)
for i in range(-1, 258))
+ def test_strs(self):
+ self._assert_values(['hello world', '你好世界', ''])
+
def test_int(self):
self._assert_values(itertools.chain(range(-1, 258),
[sys.maxsize, -sys.maxsize - 1]))
self.assertRaises(TypeError, ast.Num, 1, None, 2)
self.assertRaises(TypeError, ast.Num, 1, None, 2, lineno=0)
+ # Arbitrary keyword arguments are supported
+ self.assertEqual(ast.Constant(1, foo='bar').foo, 'bar')
+ self.assertEqual(ast.Num(1, foo='bar').foo, 'bar')
+
+ with self.assertRaisesRegex(TypeError, "Num got multiple values for argument 'n'"):
+ ast.Num(1, n=2)
+ with self.assertRaisesRegex(TypeError, "Constant got multiple values for argument 'value'"):
+ ast.Constant(1, value=2)
+
self.assertEqual(ast.Num(42).n, 42)
self.assertEqual(ast.Num(4.25).n, 4.25)
self.assertEqual(ast.Num(4.25j).n, 4.25j)
attr_b = tree.body[0].decorator_list[0].value
self.assertEqual(attr_b.end_col_offset, 4)
+ def test_issue40614_feature_version(self):
+ ast.parse('f"{x=}"', feature_version=(3, 8))
+ with self.assertRaises(SyntaxError):
+ ast.parse('f"{x=}"', feature_version=(3, 7))
+
+ def test_constant_as_name(self):
+ for constant in "True", "False", "None":
+ expr = ast.Expression(ast.Name(constant, ast.Load()))
+ ast.fix_missing_locations(expr)
+ with self.assertRaisesRegex(ValueError, f"Name node can't be used with '{constant}' constant"):
+ compile(expr, "<test>", "eval")
+
+
class ASTHelpers_Test(unittest.TestCase):
maxDiff = None
ConnectionRefusedError, client.connect, ('127.0.0.1', port))
client.close()
- def test_create_datagram_endpoint(self):
+ def _test_create_datagram_endpoint(self, local_addr, family):
class TestMyDatagramProto(MyDatagramProto):
def __init__(inner_self):
super().__init__(loop=self.loop)
self.transport.sendto(b'resp:'+data, addr)
coro = self.loop.create_datagram_endpoint(
- TestMyDatagramProto, local_addr=('127.0.0.1', 0))
+ TestMyDatagramProto, local_addr=local_addr, family=family)
s_transport, server = self.loop.run_until_complete(coro)
- host, port = s_transport.get_extra_info('sockname')
+ sockname = s_transport.get_extra_info('sockname')
+ host, port = socket.getnameinfo(
+ sockname, socket.NI_NUMERICHOST|socket.NI_NUMERICSERV)
self.assertIsInstance(s_transport, asyncio.Transport)
self.assertIsInstance(server, TestMyDatagramProto)
self.assertEqual('CLOSED', client.state)
server.transport.close()
+ def test_create_datagram_endpoint(self):
+ self._test_create_datagram_endpoint(('127.0.0.1', 0), socket.AF_INET)
+
+ @unittest.skipUnless(support.IPV6_ENABLED, 'IPv6 not supported or enabled')
+ def test_create_datagram_endpoint_ipv6(self):
+ self._test_create_datagram_endpoint(('::1', 0), socket.AF_INET6)
+
def test_create_datagram_endpoint_sock(self):
sock = None
local_address = ('127.0.0.1', 0)
if sys.platform != 'win32':
def test_get_event_loop_new_process(self):
- # Issue bpo-32126: The multiprocessing module used by
+ # bpo-32126: The multiprocessing module used by
# ProcessPoolExecutor is not functional when the
# multiprocessing.synchronize module cannot be imported.
- support.import_module('multiprocessing.synchronize')
+ support.skip_if_broken_multiprocessing_synchronize()
async def main():
pool = concurrent.futures.ProcessPoolExecutor()
def test_block_add_hook_baseexception(self):
self.do_test("test_block_add_hook_baseexception")
- def test_finalize_hooks(self):
- returncode, events, stderr = self.run_python("test_finalize_hooks")
- if stderr:
- print(stderr, file=sys.stderr)
- if returncode:
- self.fail(stderr)
-
- firstId = events[0][2]
- self.assertSequenceEqual(
- [
- ("Created", " ", firstId),
- ("cpython._PySys_ClearAuditHooks", " ", firstId),
- ],
- events,
- )
-
def test_pickle(self):
support.import_module("pickle")
('line', 2, 'tfunc_import'), ('step', ),
('line', 3, 'tfunc_import'), ('quit', ),
]
- skip = ('importlib*', 'zipimport', TEST_MODULE)
+ skip = ('importlib*', 'zipimport', 'encodings.*', TEST_MODULE)
with TracerRun(self, skip=skip) as tracer:
tracer.runcall(tfunc_import)
rv = ns['f']()
self.assertEqual(rv, tuple(expected))
+ def test_compile_top_level_await_no_coro(self):
+ """Make sure top level non-await codes get the correct coroutine flags"""
+ modes = ('single', 'exec')
+ code_samples = [
+ '''def f():pass\n''',
+ '''[x for x in l]''',
+ '''{x for x in l}''',
+ '''(x for x in l)''',
+ '''{x:x for x in l}''',
+ ]
+ for mode, code_sample in product(modes, code_samples):
+ source = dedent(code_sample)
+ co = compile(source,
+ '?',
+ mode,
+ flags=ast.PyCF_ALLOW_TOP_LEVEL_AWAIT)
+
+ self.assertNotEqual(co.co_flags & CO_COROUTINE, CO_COROUTINE,
+ msg=f"source={source} mode={mode}")
+
+
def test_compile_top_level_await(self):
"""Test whether code some top level await can be compiled.
# simply use the bigger test data for all tests.
test_size = 0
BIG_TEXT = bytearray(128*1024)
- for fname in glob.glob(os.path.join(os.path.dirname(__file__), '*.py')):
+ for fname in glob.glob(os.path.join(glob.escape(os.path.dirname(__file__)), '*.py')):
with open(fname, 'rb') as fh:
test_size += fh.readinto(memoryview(BIG_TEXT)[test_size:])
if test_size > 128*1024:
# Test that subtype_dealloc decref the newly assigned __class__ only once
self.assertEqual(new_type_refcnt, sys.getrefcount(_testcapi.HeapCTypeSubclass))
+ def test_heaptype_with_setattro(self):
+ obj = _testcapi.HeapCTypeSetattr()
+ self.assertEqual(obj.pvalue, 10)
+ obj.value = 12
+ self.assertEqual(obj.pvalue, 12)
+ del obj.value
+ self.assertEqual(obj.pvalue, 0)
+
def test_pynumber_tobase(self):
from _testcapi import pynumber_tobase
self.assertEqual(pynumber_tobase(123, 2), '0b1111011')
self.assertNotEqual(pickle.load(f), id(sys.modules))
self.assertNotEqual(pickle.load(f), id(builtins))
+ def test_subinterps_recent_language_features(self):
+ r, w = os.pipe()
+ code = """if 1:
+ import pickle
+ with open({:d}, "wb") as f:
+
+ def noop(x): return x
+
+ a = (b := f'1{{2}}3') + noop('x') # Py 3.8 (:=) / 3.6 (f'')
+
+ async def foo(arg): return await arg # Py 3.5
+
+ pickle.dump(dict(a=a, b=b), f)
+ """.format(w)
+
+ with open(r, "rb") as f:
+ ret = support.run_in_subinterp(code)
+ self.assertEqual(ret, 0)
+ self.assertEqual(pickle.load(f), {'a': '123x', 'b': '123'})
+
def test_mutate_exception(self):
"""
Exceptions saved in global module state get shared between
'file': [b'Testing 123.\n'], 'title': ['']}
self.assertEqual(result, expected)
+ def test_parse_multipart_without_content_length(self):
+ POSTDATA = '''--JfISa01
+Content-Disposition: form-data; name="submit-name"
+
+just a string
+
+--JfISa01--
+'''
+ fp = BytesIO(POSTDATA.encode('latin1'))
+ env = {'boundary': 'JfISa01'.encode('latin1')}
+ result = cgi.parse_multipart(fp, env)
+ expected = {'submit-name': ['just a string\n']}
+ self.assertEqual(result, expected)
+
def test_parse_multipart_invalid_encoding(self):
BOUNDARY = "JfISa01"
POSTDATA = """--JfISa01
Nick Mathewson
"""
import unittest
-from test.support import is_jython
+from test import support
from codeop import compile_command, PyCF_DONT_IMPLY_DEDENT
import io
-if is_jython:
+if support.is_jython:
import sys
def unify_callables(d):
def assertValid(self, str, symbol='single'):
'''succeed iff str is a valid piece of code'''
- if is_jython:
+ if support.is_jython:
code = compile_command(str, "<input>", symbol)
self.assertTrue(code)
if symbol == "single":
av = self.assertValid
# special case
- if not is_jython:
+ if not support.is_jython:
self.assertEqual(compile_command(""),
compile("pass", "<input>", 'single',
PyCF_DONT_IMPLY_DEDENT))
self.assertNotEqual(compile_command("a = 1\n", "abc").co_filename,
compile("a = 1\n", "def", 'single').co_filename)
+ def test_warning(self):
+ # Test that the warning is only returned once.
+ with support.check_warnings((".*literal", SyntaxWarning)) as w:
+ compile_command("0 is 0")
+ self.assertEqual(len(w.warnings), 1)
if __name__ == "__main__":
unittest.main()
# Skip tests if _multiprocessing wasn't built.
test.support.import_module('_multiprocessing')
# Skip tests if sem_open implementation is broken.
-test.support.import_module('multiprocessing.synchronize')
+test.support.skip_if_broken_multiprocessing_synchronize()
from test.support.script_helper import assert_python_ok
from test.support.script_helper import assert_python_failure
CRASHER_DIR = os.path.join(os.path.dirname(__file__), "crashers")
-CRASHER_FILES = os.path.join(CRASHER_DIR, "*.py")
+CRASHER_FILES = os.path.join(glob.escape(CRASHER_DIR), "*.py")
infinite_loops = ["infinite_loop_re.py", "nasty_eq_vs_dict.py"]
def delete_files():
# we don't know the precise name the underlying database uses
# so we use glob to locate all names
- for f in glob.glob(_fname + "*"):
+ for f in glob.glob(glob.escape(_fname) + "*"):
test.support.unlink(f)
self.assertEqual(Decimal.from_float(cls(101.1)),
Decimal.from_float(101.1))
- def test_maxcontext_exact_arith(self):
-
- # Make sure that exact operations do not raise MemoryError due
- # to huge intermediate values when the context precision is very
- # large.
-
- # The following functions fill the available precision and are
- # therefore not suitable for large precisions (by design of the
- # specification).
- MaxContextSkip = ['logical_invert', 'next_minus', 'next_plus',
- 'logical_and', 'logical_or', 'logical_xor',
- 'next_toward', 'rotate', 'shift']
-
- Decimal = C.Decimal
- Context = C.Context
- localcontext = C.localcontext
-
- # Here only some functions that are likely candidates for triggering a
- # MemoryError are tested. deccheck.py has an exhaustive test.
- maxcontext = Context(prec=C.MAX_PREC, Emin=C.MIN_EMIN, Emax=C.MAX_EMAX)
- with localcontext(maxcontext):
- self.assertEqual(Decimal(0).exp(), 1)
- self.assertEqual(Decimal(1).ln(), 0)
- self.assertEqual(Decimal(1).log10(), 0)
- self.assertEqual(Decimal(10**2).log10(), 2)
- self.assertEqual(Decimal(10**223).log10(), 223)
- self.assertEqual(Decimal(10**19).logb(), 19)
- self.assertEqual(Decimal(4).sqrt(), 2)
- self.assertEqual(Decimal("40E9").sqrt(), Decimal('2.0E+5'))
- self.assertEqual(divmod(Decimal(10), 3), (3, 1))
- self.assertEqual(Decimal(10) // 3, 3)
- self.assertEqual(Decimal(4) / 2, 2)
- self.assertEqual(Decimal(400) ** -1, Decimal('0.0025'))
-
-
@requires_docstrings
@unittest.skipUnless(C, "test requires C version")
class SignatureTest(unittest.TestCase):
True
>>> real_tests = [t for t in tests if len(t.examples) > 0]
>>> len(real_tests) # objects that actually have doctests
- 12
+ 13
>>> for t in real_tests:
... print('{} {}'.format(len(t.examples), t.name))
...
2 builtins.int.bit_length
5 builtins.memoryview.hex
1 builtins.oct
+ 1 builtins.zip
Note here that 'bin', 'oct', and 'hex' are functions; 'float.as_integer_ratio',
'float.hex', and 'int.bit_length' are methods; 'float.fromhex' is a classmethod,
self.assertEqual(m.get_payload(decode=True).decode('utf-8'), content)
self.assertEqual(m.get_content(), content)
+ def test_set_text_plain_null(self):
+ m = self._make_message()
+ content = ''
+ raw_data_manager.set_content(m, content)
+ self.assertEqual(str(m), textwrap.dedent("""\
+ Content-Type: text/plain; charset="utf-8"
+ Content-Transfer-Encoding: 7bit
+
+
+ """))
+ self.assertEqual(m.get_payload(decode=True).decode('utf-8'), '\n')
+ self.assertEqual(m.get_content(), '\n')
+
def test_set_text_html(self):
m = self._make_message()
content = "<p>Simple message.</p>\n"
self.assertEqual(m.get_payload(decode=True).decode('utf-8'), content)
self.assertEqual(m.get_content(), content)
+ def test_set_text_plain_long_line_heuristics(self):
+ m = self._make_message()
+ content = ("Simple but long message that is over 78 characters"
+ " long to force transfer encoding.\n")
+ raw_data_manager.set_content(m, content)
+ self.assertEqual(str(m), textwrap.dedent("""\
+ Content-Type: text/plain; charset="utf-8"
+ Content-Transfer-Encoding: quoted-printable
+
+ Simple but long message that is over 78 characters long to =
+ force transfer encoding.
+ """))
+ self.assertEqual(m.get_payload(decode=True).decode('utf-8'), content)
+ self.assertEqual(m.get_content(), content)
+
def test_set_text_short_line_minimal_non_ascii_heuristics(self):
m = self._make_message()
content = "et là il est monté sur moi et il commence à m'éto.\n"
{'filename': 'foo'},
[errors.InvalidHeaderDefect]),
+ 'invalid_parameter_value_with_fws_between_ew': (
+ 'attachment; filename="=?UTF-8?Q?Schulbesuchsbest=C3=A4ttigung=2E?='
+ ' =?UTF-8?Q?pdf?="',
+ 'attachment',
+ {'filename': 'Schulbesuchsbestättigung.pdf'},
+ [errors.InvalidHeaderDefect]*3,
+ ('attachment; filename="Schulbesuchsbestättigung.pdf"'),
+ ('Content-Disposition: attachment;\n'
+ ' filename*=utf-8\'\'Schulbesuchsbest%C3%A4ttigung.pdf\n'),
+ ),
+
+ 'parameter_value_with_fws_between_tokens': (
+ 'attachment; filename="File =?utf-8?q?Name?= With Spaces.pdf"',
+ 'attachment',
+ {'filename': 'File Name With Spaces.pdf'},
+ [errors.InvalidHeaderDefect],
+ 'attachment; filename="File Name With Spaces.pdf"',
+ ('Content-Disposition: attachment; filename="File Name With Spaces.pdf"\n'),
+ )
}
# with self.assertRaises(ValueError):
# Address('foo', 'wők', 'example.com')
+ def test_crlf_in_constructor_args_raises(self):
+ cases = (
+ dict(display_name='foo\r'),
+ dict(display_name='foo\n'),
+ dict(display_name='foo\r\n'),
+ dict(domain='example.com\r'),
+ dict(domain='example.com\n'),
+ dict(domain='example.com\r\n'),
+ dict(username='wok\r'),
+ dict(username='wok\n'),
+ dict(username='wok\r\n'),
+ dict(addr_spec='wok@example.com\r'),
+ dict(addr_spec='wok@example.com\n'),
+ dict(addr_spec='wok@example.com\r\n')
+ )
+ for kwargs in cases:
+ with self.subTest(kwargs=kwargs), self.assertRaisesRegex(ValueError, "invalid arguments"):
+ Address(**kwargs)
+
def test_non_ascii_username_in_addr_spec_raises(self):
with self.assertRaises(ValueError):
Address('foo', addr_spec='wők@example.com')
def debug_build(program):
program = os.path.basename(program)
name = os.path.splitext(program)[0]
- return name.endswith("_d")
+ return name.casefold().endswith("_d".casefold())
def remove_python_envvars():
if expected['stdio_errors'] is self.GET_DEFAULT_CONFIG:
expected['stdio_errors'] = 'surrogateescape'
- if sys.platform == 'win32':
+ if MS_WINDOWS:
default_executable = self.test_exe
elif expected['program_name'] is not self.GET_DEFAULT_CONFIG:
default_executable = os.path.abspath(expected['program_name'])
pre_config = dict(configs['pre_config'])
for key, value in list(expected.items()):
if value is self.IGNORE_CONFIG:
- del pre_config[key]
+ pre_config.pop(key, None)
del expected[key]
self.assertEqual(pre_config, expected)
config = dict(configs['config'])
for key, value in list(expected.items()):
if value is self.IGNORE_CONFIG:
- del config[key]
+ config.pop(key, None)
del expected[key]
self.assertEqual(config, expected)
self.check_pre_config(configs, expected_preconfig)
self.check_config(configs, expected_config)
self.check_global_config(configs)
+ return configs
def test_init_default_config(self):
self.check_all_configs("test_init_initialize_config", api=API_COMPAT)
}
self.default_program_name(config)
env = {'TESTPATH': os.path.pathsep.join(paths)}
+
self.check_all_configs("test_init_setpath", config,
api=API_COMPAT, env=env,
ignore_stderr=True)
# Copy pythonXY.dll (or pythonXY_d.dll)
ver = sys.version_info
dll = f'python{ver.major}{ver.minor}'
+ dll3 = f'python{ver.major}'
if debug_build(sys.executable):
dll += '_d'
+ dll3 += '_d'
dll += '.dll'
+ dll3 += '.dll'
dll = os.path.join(os.path.dirname(self.test_exe), dll)
+ dll3 = os.path.join(os.path.dirname(self.test_exe), dll3)
dll_copy = os.path.join(tmpdir, os.path.basename(dll))
+ dll3_copy = os.path.join(tmpdir, os.path.basename(dll3))
shutil.copyfile(dll, dll_copy)
+ shutil.copyfile(dll3, dll3_copy)
# Copy Python program
exec_copy = os.path.join(tmpdir, os.path.basename(self.test_exe))
config['base_prefix'] = pyvenv_home
config['prefix'] = pyvenv_home
env = self.copy_paths_by_env(config)
- self.check_all_configs("test_init_compat_config", config,
- api=API_COMPAT, env=env,
- ignore_stderr=True, cwd=tmpdir)
+ actual = self.check_all_configs("test_init_compat_config", config,
+ api=API_COMPAT, env=env,
+ ignore_stderr=True, cwd=tmpdir)
+ if MS_WINDOWS:
+ self.assertEqual(
+ actual['windows']['python3_dll'],
+ os.path.join(
+ tmpdir,
+ os.path.basename(self.EXPECTED_CONFIG['windows']['python3_dll'])
+ )
+ )
+
def test_global_pathconfig(self):
# Test C API functions getting the path configuration:
self.run_embedded_interpreter("test_audit_run_file", timeout=3, returncode=1)
def test_audit_run_interactivehook(self):
- startup = os.path.join(self.oldcwd, support.TESTFN) + ".py"
+ startup = os.path.join(self.oldcwd, support.TESTFN) + (support.FS_NONASCII or '') + ".py"
with open(startup, "w", encoding="utf-8") as f:
print("import sys", file=f)
print("sys.__interactivehook__ = lambda: None", file=f)
os.unlink(startup)
def test_audit_run_startup(self):
- startup = os.path.join(self.oldcwd, support.TESTFN) + ".py"
+ startup = os.path.join(self.oldcwd, support.TESTFN) + (support.FS_NONASCII or '') + ".py"
with open(startup, "w", encoding="utf-8") as f:
print("pass", file=f)
try:
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "setuptools", "pip",
],
unittest.mock.ANY,
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "--root", "/foo/bar/",
"setuptools", "pip",
],
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "--user", "setuptools", "pip",
],
unittest.mock.ANY,
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "--upgrade", "setuptools", "pip",
],
unittest.mock.ANY,
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "-v", "setuptools", "pip",
],
unittest.mock.ANY,
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "-vv", "setuptools", "pip",
],
unittest.mock.ANY,
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "-vvv", "setuptools", "pip",
],
unittest.mock.ANY,
self.run_pip.assert_called_once_with(
[
- "install", "--no-index", "--find-links",
+ "install", "--no-cache-dir", "--no-index", "--find-links",
unittest.mock.ANY, "setuptools", "pip",
],
unittest.mock.ANY,
self.assertEqual(Color.blue.value, 2)
self.assertEqual(Color.green.value, 3)
+ def test_auto_order(self):
+ with self.assertRaises(TypeError):
+ class Color(Enum):
+ red = auto()
+ green = auto()
+ blue = auto()
+ def _generate_next_value_(name, start, count, last):
+ return name
+
+
def test_duplicate_auto(self):
class Dupes(Enum):
first = primero = auto()
eq("dict[str, int]")
eq("set[str,]")
eq("tuple[str, ...]")
+ eq("tuple[(str, *types)]")
+ eq("tuple[xx:yy, (*types,)]")
+ eq("tuple[str, int, (str, int)]")
+ eq("tuple[(*int, str, str, (str, int))]")
eq("tuple[str, int, float, dict[str, int]]")
eq("slice[0]")
eq("slice[0:1]")
# 'GNU gdb (GDB) Fedora 7.9.1-17.fc22\n' -> 7.9
# 'GNU gdb 6.1.1 [FreeBSD]\n' -> 6.1
# 'GNU gdb (GDB) Fedora (7.5.1-37.fc18)\n' -> 7.5
- match = re.search(r"^GNU gdb.*?\b(\d+)\.(\d+)", version)
+ # 'HP gdb 6.7 for HP Itanium (32 or 64 bit) and target HP-UX 11iv2 and 11iv3.\n' -> 6.7
+ match = re.search(r"^(?:GNU|HP) gdb.*?\b(\d+)\.(\d+)", version)
if match is None:
raise Exception("unable to parse GDB version: %r" % version)
return (version, int(match.group(1)), int(match.group(2)))
@unittest.skipUnless(
support.is_resource_enabled('network'), 'network resource disabled')
+@unittest.skip('cyrus.andrew.cmu.edu blocks connections')
class RemoteIMAPTest(unittest.TestCase):
host = 'cyrus.andrew.cmu.edu'
port = 143
@unittest.skipUnless(ssl, "SSL not available")
@unittest.skipUnless(
support.is_resource_enabled('network'), 'network resource disabled')
+@unittest.skip('cyrus.andrew.cmu.edu blocks connections')
class RemoteIMAP_STARTTLSTest(RemoteIMAPTest):
def setUp(self):
@unittest.skipUnless(ssl, "SSL not available")
+@unittest.skip('cyrus.andrew.cmu.edu blocks connections')
class RemoteIMAP_SSLTest(RemoteIMAPTest):
port = 993
imap_class = IMAP4_SSL
pyexe = os.path.join(tmp, os.path.basename(sys.executable))
shutil.copy(sys.executable, pyexe)
shutil.copy(dllname, tmp)
- for f in glob.glob(os.path.join(sys.prefix, "vcruntime*.dll")):
+ for f in glob.glob(os.path.join(glob.escape(sys.prefix), "vcruntime*.dll")):
shutil.copy(f, tmp)
shutil.copy(pydname, tmp2)
-from __future__ import unicode_literals
-
import os
import sys
import shutil
+import pathlib
import tempfile
import textwrap
import contextlib
-try:
- from contextlib import ExitStack
-except ImportError:
- from contextlib2 import ExitStack
-
-try:
- import pathlib
-except ImportError:
- import pathlib2 as pathlib
-
-
-__metaclass__ = type
-
@contextlib.contextmanager
def tempdir():
class Fixtures:
def setUp(self):
- self.fixtures = ExitStack()
+ self.fixtures = contextlib.ExitStack()
self.addCleanup(self.fixtures.close)
sixtofouraddr.sixtofour)
self.assertFalse(bad_addr.sixtofour)
+ # issue41004 Hash collisions in IPv4Interface and IPv6Interface
+ def testV4HashIsNotConstant(self):
+ ipv4_address1 = ipaddress.IPv4Interface("1.2.3.4")
+ ipv4_address2 = ipaddress.IPv4Interface("2.3.4.5")
+ self.assertNotEqual(ipv4_address1.__hash__(), ipv4_address2.__hash__())
+
+ # issue41004 Hash collisions in IPv4Interface and IPv6Interface
+ def testV6HashIsNotConstant(self):
+ ipv6_address1 = ipaddress.IPv6Interface("2001:658:22a:cafe:200:0:0:1")
+ ipv6_address2 = ipaddress.IPv6Interface("2001:658:22a:cafe:200:0:0:2")
+ self.assertNotEqual(ipv6_address1.__hash__(), ipv6_address2.__hash__())
+
if __name__ == '__main__':
unittest.main()
return i
__iter__ = None
+class BadIterableClass:
+ def __iter__(self):
+ raise ZeroDivisionError
+
# Main test suite
class TestCase(unittest.TestCase):
self.assertRaises(TypeError, lambda: 3 in 12)
self.assertRaises(TypeError, lambda: 3 not in map)
+ self.assertRaises(ZeroDivisionError, lambda: 3 in BadIterableClass())
d = {"one": 1, "two": 2, "three": 3, 1j: 2j}
for k in d:
self.assertRaises(TypeError, indexOf, 42, 1)
self.assertRaises(TypeError, indexOf, indexOf, indexOf)
+ self.assertRaises(ZeroDivisionError, indexOf, BadIterableClass(), 1)
f = open(TESTFN, "w")
try:
def test_error_iter(self):
for typ in (DefaultIterClass, NoIterClass):
self.assertRaises(TypeError, iter, typ())
+ self.assertRaises(ZeroDivisionError, iter, BadIterableClass())
def test_main():
@patch.object(logging.handlers.QueueListener, 'handle')
def test_handle_called_with_mp_queue(self, mock_handle):
- # Issue 28668: The multiprocessing (mp) module is not functional
+ # bpo-28668: The multiprocessing (mp) module is not functional
# when the mp.synchronize module cannot be imported.
- support.import_module('multiprocessing.synchronize')
+ support.skip_if_broken_multiprocessing_synchronize()
for i in range(self.repeat):
log_queue = multiprocessing.Queue()
self.setup_and_log(log_queue, '%s_%s' % (self.id(), i))
indicates that messages were not registered on the queue until
_after_ the QueueListener stopped.
"""
- # Issue 28668: The multiprocessing (mp) module is not functional
+ # bpo-28668: The multiprocessing (mp) module is not functional
# when the mp.synchronize module cannot be imported.
- support.import_module('multiprocessing.synchronize')
+ support.skip_if_broken_multiprocessing_synchronize()
for i in range(self.repeat):
queue = multiprocessing.Queue()
self.setup_and_log(queue, '%s_%s' %(self.id(), i))
super().tearDown()
self._box.close()
self._delete_recursively(self._path)
- for lock_remnant in glob.glob(self._path + '.*'):
+ for lock_remnant in glob.glob(glob.escape(self._path) + '.*'):
support.unlink(lock_remnant)
def assertMailboxEmpty(self):
super().tearDown()
self._box.close()
self._delete_recursively(self._path)
- for lock_remnant in glob.glob(self._path + '.*'):
+ for lock_remnant in glob.glob(glob.escape(self._path) + '.*'):
support.unlink(lock_remnant)
def test_labels(self):
mime_dict = mimetypes.read_mime_types(file)
eq(mime_dict[".pyunit"], "x-application/x-unittest")
+ # bpo-41048: read_mime_types should read the rule file with 'utf-8' encoding.
+ # Not with locale encoding. _bootlocale has been imported because io.open(...)
+ # uses it.
+ with support.temp_dir() as directory:
+ data = "application/no-mans-land Fran\u00E7ais"
+ file = pathlib.Path(directory, "sample.mimetype")
+ file.write_text(data, encoding='utf-8')
+ import _bootlocale
+ with support.swap_attr(_bootlocale, 'getpreferredencoding', lambda do_setlocale=True: 'ASCII'):
+ mime_dict = mimetypes.read_mime_types(file)
+ eq(mime_dict[".Français"], "application/no-mans-land")
+
def test_non_standard_types(self):
eq = self.assertEqual
# First try strict
# 0xe2 is not allowed in utf8
print('CP1252 test P\xe2t\xe9')
import b_utf8
+""" + """\
b_utf8.py
# use the default of utf8
print('Unicode test A code point 2090 \u2090 that is not valid in cp1252')
-"""]
+""".encode('utf-8')]
def open_file(path):
dirname = os.path.dirname(path)
""" Test suite for the code in msilib """
import os
import unittest
-from test.support import TESTFN, import_module, unlink
+from test.support import TESTFN, FS_NONASCII, import_module, unlink
msilib = import_module('msilib')
import msilib.schema
def init_database():
- path = TESTFN + '.msi'
+ path = TESTFN + (FS_NONASCII or '') + '.msi'
db = msilib.init_database(
path,
msilib.schema,
)
self.addCleanup(unlink, db_path)
+ def test_view_non_ascii(self):
+ db, db_path = init_database()
+ view = db.OpenView("SELECT 'ß-розпад' FROM Property")
+ view.Execute(None)
+ record = view.Fetch()
+ self.assertEqual(record.GetString(1), 'ß-розпад')
+ view.Close()
+ db.Close()
+ self.addCleanup(unlink, db_path)
+
def test_summaryinfo_getproperty_issue1104(self):
db, db_path = init_database()
try:
AVAILABLE_START_METHODS = set(multiprocessing.get_all_start_methods())
# Issue #22332: Skip tests if sem_open implementation is broken.
-support.import_module('multiprocessing.synchronize')
+support.skip_if_broken_multiprocessing_synchronize()
verbose = support.verbose
def __rmul__(self, other):
return other * self.lst
+class BadIterable:
+ def __iter__(self):
+ raise ZeroDivisionError
+
class OperatorTestCase:
def test_lt(self):
operator = self.module
self.assertRaises(TypeError, operator.countOf)
self.assertRaises(TypeError, operator.countOf, None, None)
+ self.assertRaises(ZeroDivisionError, operator.countOf, BadIterable(), 1)
self.assertEqual(operator.countOf([1, 2, 1, 3, 1, 4], 3), 1)
self.assertEqual(operator.countOf([1, 2, 1, 3, 1, 4], 5), 0)
operator = self.module
self.assertRaises(TypeError, operator.indexOf)
self.assertRaises(TypeError, operator.indexOf, None, None)
+ self.assertRaises(ZeroDivisionError, operator.indexOf, BadIterable(), 1)
self.assertEqual(operator.indexOf([4, 3, 2, 1], 3), 1)
self.assertRaises(ValueError, operator.indexOf, [4, 3, 2, 1], 0)
operator = self.module
self.assertRaises(TypeError, operator.contains)
self.assertRaises(TypeError, operator.contains, None, None)
+ self.assertRaises(ZeroDivisionError, operator.contains, BadIterable(), 1)
self.assertTrue(operator.contains(range(4), 2))
self.assertFalse(operator.contains(range(4), 5))
import pdb
import sys
import types
+import codecs
import unittest
import subprocess
import textwrap
return self._run_pdb(['-m', self.module_name], commands)
def _assert_find_function(self, file_content, func_name, expected):
- file_content = textwrap.dedent(file_content)
-
- with open(support.TESTFN, 'w') as f:
+ with open(support.TESTFN, 'wb') as f:
f.write(file_content)
expected = None if not expected else (
expected, pdb.find_function(func_name, support.TESTFN))
def test_find_function_empty_file(self):
- self._assert_find_function('', 'foo', None)
+ self._assert_find_function(b'', 'foo', None)
def test_find_function_found(self):
self._assert_find_function(
"""\
- def foo():
- pass
+def foo():
+ pass
- def bar():
- pass
+def bœr():
+ pass
- def quux():
- pass
- """,
- 'bar',
- ('bar', 4),
+def quux():
+ pass
+""".encode(),
+ 'bœr',
+ ('bœr', 4),
+ )
+
+ def test_find_function_found_with_encoding_cookie(self):
+ self._assert_find_function(
+ """\
+# coding: iso-8859-15
+def foo():
+ pass
+
+def bœr():
+ pass
+
+def quux():
+ pass
+""".encode('iso-8859-15'),
+ 'bœr',
+ ('bœr', 5),
+ )
+
+ def test_find_function_found_with_bom(self):
+ self._assert_find_function(
+ codecs.BOM_UTF8 + """\
+def bœr():
+ pass
+""".encode(),
+ 'bœr',
+ ('bœr', 1),
)
def test_issue7964(self):
args = ['-Wd', '-E', '-bb', '-m', 'test.regrtest', '--list-tests']
output = self.run_python(args)
rough_number_of_tests_found = len(output.splitlines())
- actual_testsuite_glob = os.path.join(os.path.dirname(__file__),
+ actual_testsuite_glob = os.path.join(glob.escape(os.path.dirname(__file__)),
'test*.py')
rough_counted_test_py_files = len(glob.glob(actual_testsuite_glob))
# We're not trying to duplicate test finding logic in here,
# test.support.script_helper.
env = kw.setdefault('env', dict(os.environ))
env['TERM'] = 'vt100'
- return subprocess.Popen(cmd_line, executable=sys.executable,
+ return subprocess.Popen(cmd_line,
+ executable=sys.executable,
+ text=True,
stdin=subprocess.PIPE,
stdout=stdout, stderr=stderr,
**kw)
sys.exit(0)
"""
user_input = dedent(user_input)
- user_input = user_input.encode()
p = spawn_repl()
with SuppressCrashReport():
p.stdin.write(user_input)
output = kill_python(p)
- self.assertIn(b'After the exception.', output)
+ self.assertIn('After the exception.', output)
# Exit code 120: Py_FinalizeEx() failed to flush stdout and stderr.
self.assertIn(p.returncode, (1, 120))
</test>"""
'''
user_input = dedent(user_input)
- user_input = user_input.encode()
p = spawn_repl()
- with SuppressCrashReport():
- p.stdin.write(user_input)
+ p.stdin.write(user_input)
output = kill_python(p)
self.assertEqual(p.returncode, 0)
+ def test_close_stdin(self):
+ user_input = dedent('''
+ import os
+ print("before close")
+ os.close(0)
+ ''')
+ prepare_repl = dedent('''
+ from test.support import suppress_msvcrt_asserts
+ suppress_msvcrt_asserts()
+ ''')
+ process = spawn_repl('-c', prepare_repl)
+ output = process.communicate(user_input)[0]
+ self.assertEqual(process.returncode, 0)
+ self.assertIn('before close', output)
+
if __name__ == "__main__":
unittest.main()
# found in sys.path (see site.addpackage()). Skip the test if at least
# one .pth file is found.
for path in isolated_paths:
- pth_files = glob.glob(os.path.join(path, "*.pth"))
+ pth_files = glob.glob(os.path.join(glob.escape(path), "*.pth"))
if pth_files:
self.skipTest(f"found {len(pth_files)} .pth files in: {path}")
self.assertEqual(result, exact)
self.assertIsInstance(result, Decimal)
+ def test_center_not_at_mean(self):
+ data = (1.0, 2.0)
+ self.assertEqual(self.func(data), 0.5)
+ self.assertEqual(self.func(data, xbar=2.0), 1.0)
class TestPStdev(VarianceStdevMixin, NumericTestCase):
# Tests for population standard deviation.
expected = math.sqrt(statistics.pvariance(data))
self.assertEqual(self.func(data), expected)
+ def test_center_not_at_mean(self):
+ # See issue: 40855
+ data = (3, 6, 7, 10)
+ self.assertEqual(self.func(data), 2.5)
+ self.assertEqual(self.func(data, mu=0.5), 6.5)
class TestStdev(VarianceStdevMixin, NumericTestCase):
# Tests for sample standard deviation.
expected = math.sqrt(statistics.variance(data))
self.assertEqual(self.func(data), expected)
+ def test_center_not_at_mean(self):
+ data = (1.0, 2.0)
+ self.assertEqual(self.func(data, xbar=2.0), 1.0)
class TestGeometricMean(unittest.TestCase):
s2 = struct.Struct(s.format.encode())
self.assertEqual(s2.format, s.format)
+ def test_issue35714(self):
+ # Embedded null characters should not be allowed in format strings.
+ for s in '\0', '2\0i', b'\0':
+ with self.assertRaisesRegex(struct.error,
+ 'embedded null character'):
+ struct.calcsize(s)
+
class UnpackIteratorTest(unittest.TestCase):
"""
import glob, random
fn = support.findfile("tokenize_tests.txt")
tempdir = os.path.dirname(fn) or os.curdir
- testfiles = glob.glob(os.path.join(tempdir, "test*.py"))
+ testfiles = glob.glob(os.path.join(glob.escape(tempdir), "test*.py"))
# Tokenize is broken on test_pep3131.py because regular expressions are
# broken on the obscure unicode identifiers in it. *sigh*
import os
import sys
-from test.support import TESTFN, rmtree, unlink, captured_stdout
+from test.support import TESTFN, TESTFN_UNICODE, FS_NONASCII, rmtree, unlink, captured_stdout
from test.support.script_helper import assert_python_ok, assert_python_failure
import textwrap
import unittest
coverfile = 'tmp.cover'
def setUp(self):
- with open(self.codefile, 'w') as f:
+ with open(self.codefile, 'w', encoding='iso-8859-15') as f:
f.write(textwrap.dedent('''\
- x = 42
+ # coding: iso-8859-15
+ x = 'spœm'
if []:
print('unreachable')
'''))
self.assertEqual(stderr, b'')
self.assertFalse(os.path.exists(tracecoverpath))
self.assertTrue(os.path.exists(self.coverfile))
- with open(self.coverfile) as f:
+ with open(self.coverfile, encoding='iso-8859-15') as f:
self.assertEqual(f.read(),
- " 1: x = 42\n"
+ " # coding: iso-8859-15\n"
+ " 1: x = 'spœm'\n"
" 1: if []:\n"
" print('unreachable')\n"
)
argv = '-m trace --count --missing'.split() + [self.codefile]
status, stdout, stderr = assert_python_ok(*argv)
self.assertTrue(os.path.exists(self.coverfile))
- with open(self.coverfile) as f:
+ with open(self.coverfile, encoding='iso-8859-15') as f:
self.assertEqual(f.read(), textwrap.dedent('''\
- 1: x = 42
+ # coding: iso-8859-15
+ 1: x = 'spœm'
1: if []:
>>>>>> print('unreachable')
'''))
self.assertIn(message, stderr)
def test_listfuncs_flag_success(self):
- with open(TESTFN, 'w') as fd:
- self.addCleanup(unlink, TESTFN)
+ filename = TESTFN + '.py'
+ modulename = os.path.basename(TESTFN)
+ with open(filename, 'w', encoding='utf-8') as fd:
+ self.addCleanup(unlink, filename)
fd.write("a = 1\n")
- status, stdout, stderr = assert_python_ok('-m', 'trace', '-l', TESTFN)
+ status, stdout, stderr = assert_python_ok('-m', 'trace', '-l', filename,
+ PYTHONIOENCODING='utf-8')
self.assertIn(b'functions called:', stdout)
+ expected = f'filename: {filename}, modulename: {modulename}, funcname: <module>'
+ self.assertIn(expected.encode(), stdout)
def test_sys_argv_list(self):
- with open(TESTFN, 'w') as fd:
+ with open(TESTFN, 'w', encoding='utf-8') as fd:
self.addCleanup(unlink, TESTFN)
fd.write("import sys\n")
fd.write("print(type(sys.argv))\n")
def test_count_and_summary(self):
filename = f'{TESTFN}.py'
coverfilename = f'{TESTFN}.cover'
- with open(filename, 'w') as fd:
+ modulename = os.path.basename(TESTFN)
+ with open(filename, 'w', encoding='utf-8') as fd:
self.addCleanup(unlink, filename)
self.addCleanup(unlink, coverfilename)
fd.write(textwrap.dedent("""\
stdout = stdout.decode()
self.assertEqual(status, 0)
self.assertIn('lines cov% module (path)', stdout)
- self.assertIn(f'6 100% {TESTFN} ({filename})', stdout)
+ self.assertIn(f'6 100% {modulename} ({filename})', stdout)
def test_run_as_module(self):
assert_python_ok('-m', 'trace', '-l', '--module', 'timeit', '-n', '1')
self._do_copyish(filename, filename)
# Filename should appear in glob output
self.assertTrue(
- os.path.abspath(filename)==os.path.abspath(glob.glob(filename)[0]))
+ os.path.abspath(filename)==os.path.abspath(glob.glob(glob.escape(filename))[0]))
# basename should appear in listdir.
path, base = os.path.split(os.path.abspath(filename))
file_list = os.listdir(path)
import tempfile
from test.support import (captured_stdout, captured_stderr, requires_zlib,
can_symlink, EnvironmentVarGuard, rmtree,
- import_module)
+ import_module,
+ skip_if_broken_multiprocessing_synchronize)
import threading
import unittest
import venv
"""
Test that the multiprocessing is able to spawn.
"""
- # Issue bpo-36342: Instanciation of a Pool object imports the
+ # bpo-36342: Instantiation of a Pool object imports the
# multiprocessing.synchronize module. Skip the test if this module
# cannot be imported.
- import_module('multiprocessing.synchronize')
+ skip_if_broken_multiprocessing_synchronize()
+
rmtree(self.env_dir)
self.run_with_capture(venv.create, self.env_dir)
envpy = os.path.join(os.path.realpath(self.env_dir),
# executing pip with sudo, you may want sudo's -H flag."
# where $HOME is replaced by the HOME environment variable.
err = re.sub("^(WARNING: )?The directory .* or its parent directory "
- "is not owned by the current user .*$", "",
+ "is not owned or is not writable by the current user.*$", "",
err, flags=re.MULTILINE)
self.assertEqual(err.rstrip(), "")
# Being fairly specific regarding the expected behaviour for the
@unittest.skipUnless(sys.getfilesystemencoding() != 'ascii',
'requires non-ascii filesystemencoding')
def test_nonascii(self):
+ PYTHONWARNINGS="ignore:DeprecationWarning" + (support.FS_NONASCII or '')
rc, stdout, stderr = assert_python_ok("-c",
"import sys; sys.stdout.write(str(sys.warnoptions))",
PYTHONIOENCODING="utf-8",
- PYTHONWARNINGS="ignore:DeprecaciónWarning",
+ PYTHONWARNINGS=PYTHONWARNINGS,
PYTHONDEVMODE="")
- self.assertEqual(stdout,
- "['ignore:DeprecaciónWarning']".encode('utf-8'))
+ self.assertEqual(stdout, str([PYTHONWARNINGS]).encode())
class CEnvironmentVariableTests(EnvironmentVariableTests, unittest.TestCase):
module = c_warnings
self.assertEqual(zf.filelist[0].filename, "foo.txt")
self.assertEqual(zf.filelist[1].filename, "\xf6.txt")
+ def test_read_after_write_unicode_filenames(self):
+ with zipfile.ZipFile(TESTFN2, 'w') as zipfp:
+ zipfp.writestr('приклад', b'sample')
+ self.assertEqual(zipfp.read('приклад'), b'sample')
+
def test_exclusive_create_zip_file(self):
"""Test exclusive creating a new zipfile."""
unlink(TESTFN2)
if self.outfile:
# try and store counts and module info into self.outfile
try:
- pickle.dump((self.counts, self.calledfuncs, self.callers),
- open(self.outfile, 'wb'), 1)
+ with open(self.outfile, 'wb') as f:
+ pickle.dump((self.counts, self.calledfuncs, self.callers),
+ f, 1)
except OSError as err:
print("Can't save counts files because %s" % err, file=sys.stderr)
sys.argv = [opts.progname, *opts.arguments]
sys.path[0] = os.path.dirname(opts.progname)
- with open(opts.progname) as fp:
+ with open(opts.progname, 'rb') as fp:
code = compile(fp.read(), opts.progname, 'exec')
# try to emulate __main__ namespace as much as possible
globs = {
def __enter__(self):
# The __warningregistry__'s need to be in a pristine state for tests
# to work properly.
- for v in sys.modules.values():
+ for v in list(sys.modules.values()):
if getattr(v, '__warningregistry__', None):
v.__warningregistry__ = {}
self.warnings_manager = warnings.catch_warnings(record=True)
import warnings
import weakref
import inspect
+import types
from copy import deepcopy
from test import support
pass
self.assertRaises(TypeError, self.assertWarnsRegex, MyWarn, lambda: True)
+ def testAssertWarnsModifySysModules(self):
+ # bpo-29620: handle modified sys.modules during iteration
+ class Foo(types.ModuleType):
+ @property
+ def __warningregistry__(self):
+ sys.modules['@bar@'] = 'bar'
+
+ sys.modules['@foo@'] = Foo('foo')
+ try:
+ self.assertWarns(UserWarning, warnings.warn, 'expected')
+ finally:
+ del sys.modules['@foo@']
+ del sys.modules['@bar@']
+
def testAssertRaisesRegexMismatch(self):
def Stub():
raise Exception('Unexpected')
PS C:\> Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser\r
\r
For more information on Execution Policies: \r
-ttps:/go.microsoft.com/fwlink/?LinkID=135170\r
+https://go.microsoft.com/fwlink/?LinkID=135170\r
\r
#>\r
Param(\r
tempdir = os.path.join(tempfile.gettempdir(),
".grail-unix")
user = pwd.getpwuid(os.getuid())[0]
- filename = os.path.join(tempdir, user + "-*")
+ filename = os.path.join(glob.escape(tempdir), glob.escape(user) + "-*")
maybes = glob.glob(filename)
if not maybes:
return None
register(browser, None, BackgroundBrowser(browser))
else:
# Prefer X browsers if present
- if os.environ.get("DISPLAY"):
+ if os.environ.get("DISPLAY") or os.environ.get("WAYLAND_DISPLAY"):
try:
cmd = "xdg-settings get default-web-browser".split()
raw_result = subprocess.check_output(cmd, stderr=subprocess.DEVNULL)
result = raw_result.decode().strip()
- except (FileNotFoundError, subprocess.CalledProcessError):
+ except (FileNotFoundError, subprocess.CalledProcessError, PermissionError):
pass
else:
global _os_preferred_browser
__all__ = ["BadZipFile", "BadZipfile", "error",
"ZIP_STORED", "ZIP_DEFLATED", "ZIP_BZIP2", "ZIP_LZMA",
- "is_zipfile", "ZipInfo", "ZipFile", "PyZipFile", "LargeZipFile"]
+ "is_zipfile", "ZipInfo", "ZipFile", "PyZipFile", "LargeZipFile",
+ "Path"]
class BadZipFile(Exception):
pass
# strong encryption
raise NotImplementedError("strong encryption (flag bit 6)")
- if zinfo.flag_bits & 0x800:
+ if fheader[_FH_GENERAL_PURPOSE_FLAG_BITS] & 0x800:
# UTF-8 filename
fname_str = fname.decode("utf-8")
else:
$DESTROOT, massages that installation to remove .pyc files and such, creates
an Installer package from the installation plus other files in ``resources``
and ``scripts`` and places that on a ``.dmg`` disk image.
-
-For Python 3.4.0, PSF practice is to build two installer variants
-for each release.
-
-1. 32-bit-only, i386 and PPC universal, capable on running on all machines
- supported by Mac OS X 10.5 through (at least) 10.9::
-
- /path/to/bootstrap/python2.7 build-installer.py \
- --sdk-path=/Developer/SDKs/MacOSX10.5.sdk \
- --universal-archs=32-bit \
- --dep-target=10.5
-
- - builds the following third-party libraries
-
- * NCurses 5.9 (http://bugs.python.org/issue15037)
- * SQLite 3.8.11
- * XZ 5.0.5
-
- - uses system-supplied versions of third-party libraries
-
- * readline module links with Apple BSD editline (libedit)
-
- - requires ActiveState ``Tcl/Tk 8.4`` (currently 8.4.20) to be installed for building
-
- - recommended build environment:
-
- * Mac OS X 10.5.8 Intel or PPC
- * Xcode 3.1.4
- * ``MacOSX10.5`` SDK
- * ``MACOSX_DEPLOYMENT_TARGET=10.5``
- * Apple ``gcc-4.2``
- * bootstrap non-framework Python 2.7 for documentation build with
- Sphinx (as of 3.4.1)
-
- - alternate build environments:
-
- * Mac OS X 10.6.8 with Xcode 3.2.6
- - need to change ``/System/Library/Frameworks/{Tcl,Tk}.framework/Version/Current`` to ``8.4``
- * Note Xcode 4.* does not support building for PPC so cannot be used for this build
-
-2. 64-bit / 32-bit, x86_64 and i386 universal, for OS X 10.6 (and later)::
+The installer package built on the dmg is a macOS bundle format installer
+package. This format is deprecated and is no longer supported by modern
+macOS systems; it is usable on macOS 10.6 and earlier systems.
+To be usable on newer versions of macOS, the bits in the bundle package
+must be assembled in a macOS flat installer package, using current
+versions of the pkgbuild and productbuild utilities. To pass macOS
+Gatekeeper download quarantine, the final package must be signed
+with a valid Apple Developer ID certificate using productsign.
+Starting with macOS 10.15 Catalina, Gatekeeper now also requires
+that installer packages are submitted to and pass Apple's automated
+notarization service using the altool command. To pass notarization,
+the binaries included in the package must be built with at least
+the macOS 10.9 SDK, must now be signed with the codesign utility
+and executables must opt in to the hardened runtime option with
+any necessary entitlements. Details of these processes are
+available in the on-line Apple Developer Documentation and man pages.
+
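The following is a minimal illustrative sketch, not part of build-installer.py
or the actual release process, of how the repackaging and notarization steps
described above might be driven from Python.  The package root, version,
identifiers, certificate name and Apple ID credentials are all placeholders::

    import subprocess

    # Placeholders: adjust the staging root, identifiers, certificate name
    # and Apple ID credentials for a real build.  The binaries under PKGROOT
    # are assumed to have already been signed with codesign using the
    # hardened runtime option and any required entitlements.
    PKGROOT = "/tmp/_py/_root"
    COMPONENT_PKG = "python-component.pkg"
    DIST_PKG = "python-dist.pkg"
    SIGNED_PKG = "python-signed.pkg"

    def run(*cmd):
        print("running:", " ".join(cmd))
        subprocess.check_call(cmd)

    # 1. Assemble the installed bits into a flat component package.
    run("pkgbuild", "--root", PKGROOT,
        "--identifier", "org.example.python",
        "--version", "3.8.4",
        COMPONENT_PKG)

    # 2. Wrap the component package in a distribution-style package.
    run("productbuild", "--package", COMPONENT_PKG, DIST_PKG)

    # 3. Sign with a Developer ID Installer certificate (placeholder name).
    run("productsign", "--sign", "Developer ID Installer: Example Org",
        DIST_PKG, SIGNED_PKG)

    # 4. Submit the signed package to the notarization service via altool.
    run("xcrun", "altool", "--notarize-app",
        "--primary-bundle-id", "org.example.python.pkg",
        "--username", "appleid@example.org",
        "--password", "@keychain:AC_PASSWORD",
        "--file", SIGNED_PKG)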
+As of 3.8.0 and 3.7.7, PSF practice is to build one installer variant
+for each release. Note that as of this writing, no Pythons support
+building on a newer version of macOS that will run on older versions
+by setting MACOSX_DEPLOYMENT_TARGET. This is because the various
+Python C modules do not yet support runtime testing of macOS
+feature availability (for example, by using macOS AvailabilityMacros.h
+and weak-linking). To build a Python that is to be used on a
+range of macOS releases, always build on the oldest release to be
+supported; the necessary shared libraries for that release will
+normally also be available on later systems, with the occasional
+exception such as the removal of 32-bit libraries in macOS 10.15.
+
+build-installer requires Apple Developer tools, either from the
+Command Line Tools package or from a full Xcode installation.
+You should use the most recent version of either for the operating
+system version in use. (One notable exception: on macOS 10.6,
+Snow Leopard, use Xcode 3, not Xcode 4 which was released later
+in the 10.6 support cycle.)
+
+1. 64-bit, x86_64, for OS X 10.9 (and later)::
/path/to/bootstrap/python2.7 build-installer.py \
- --sdk-path=/Developer/SDKs/MacOSX10.6.sdk \
- --universal-archs=intel \
- --dep-target=10.6
+ --universal-archs=intel-64 \
+ --dep-target=10.9
- builds the following third-party libraries
- * NCurses 5.9 (http://bugs.python.org/issue15037)
- * SQLite 3.8.11
- * XZ 5.0.5
+ * OpenSSL 1.1.1
+ * Tcl/Tk 8.6
+ * NCurses
+ * SQLite
+ * XZ
+ * libffi
- uses system-supplied versions of third-party libraries
* readline module links with Apple BSD editline (libedit)
-
- - requires ActiveState Tcl/Tk 8.5.15.1 (or later) to be installed for building
-
- - recommended build environment:
-
- * Mac OS X 10.6.8 (or later)
- * Xcode 3.2.6
- * ``MacOSX10.6`` SDK
- * ``MACOSX_DEPLOYMENT_TARGET=10.6``
- * Apple ``gcc-4.2``
- * bootstrap non-framework Python 2.7 for documentation build with
- Sphinx (as of 3.4.1)
-
- - alternate build environments:
-
- * none. Xcode 4.x currently supplies two C compilers.
- ``llvm-gcc-4.2.1`` has been found to miscompile Python 3.3.x and
- produce a non-functional Python executable. As it appears to be
- considered a migration aid by Apple and is not likely to be fixed,
- its use should be avoided. The other compiler, ``clang``, has been
- undergoing rapid development. While it appears to have become
- production-ready in the most recent Xcode 5 releases, the versions
- available on the deprecated Xcode 4.x for 10.6 were early releases
- and did not receive the level of exposure in production environments
- that the Xcode 3 gcc-4.2 compiler has had.
-
-
-* For Python 2.7.x and 3.2.x, the 32-bit-only installer was configured to
- support Mac OS X 10.3.9 through (at least) 10.6. Because it is
- believed that there are few systems still running OS X 10.3 or 10.4
- and because it has become increasingly difficult to test and
- support the differences in these earlier systems, as of Python 3.3.0 the PSF
- 32-bit installer no longer supports them. For reference in building such
- an installer yourself, the details are::
-
- /usr/bin/python build-installer.py \
- --sdk-path=/Developer/SDKs/MacOSX10.4u.sdk \
- --universal-archs=32-bit \
- --dep-target=10.3
-
- - builds the following third-party libraries
-
- * Bzip2
- * NCurses
- * GNU Readline (GPL)
- * SQLite 3
- * XZ
- * Zlib 1.2.3
- * Oracle Sleepycat DB 4.8 (Python 2.x only)
-
- - requires ActiveState ``Tcl/Tk 8.4`` (currently 8.4.20) to be installed for building
+ * zlib
+ * bz2
- recommended build environment:
- * Mac OS X 10.5.8 PPC or Intel
- * Xcode 3.1.4 (or later)
- * ``MacOSX10.4u`` SDK (later SDKs do not support PPC G3 processors)
- * ``MACOSX_DEPLOYMENT_TARGET=10.3``
- * Apple ``gcc-4.0``
- * system Python 2.5 for documentation build with Sphinx
-
- - alternate build environments:
-
- * Mac OS X 10.6.8 with Xcode 3.2.6
- - need to change ``/System/Library/Frameworks/{Tcl,Tk}.framework/Version/Current`` to ``8.4``
-
+ * Mac OS X 10.9.5
+ * Xcode Command Line Tools 6.2
+ * ``MacOSX10.9`` SDK
+ * ``MACOSX_DEPLOYMENT_TARGET=10.9``
+ * Apple ``clang``
General Prerequisites
---------------------
-* No Fink (in ``/sw``) or MacPorts (in ``/opt/local``) or other local
- libraries or utilities (in ``/usr/local``) as they could
+* No Fink (in ``/sw``) or MacPorts (in ``/opt/local``) or Homebrew or
+ other local libraries or utilities (in ``/usr/local``) as they could
interfere with the build.
-* The documentation for the release is built using Sphinx
- because it is included in the installer. For 2.7.x and 3.x.x up to and
- including 3.4.0, the ``Doc/Makefile`` uses ``svn`` to download repos of
- ``Sphinx`` and its dependencies. Beginning with 3.4.1, the ``Doc/Makefile``
- assumes there is an externally-provided ``sphinx-build`` and requires at
- least Python 2.6 to run. Because of this, it is no longer possible to
- build a 3.4.1 or later installer on OS X 10.5 using the Apple-supplied
- Python 2.5.
-
* It is safest to start each variant build with an empty source directory
- populated with a fresh copy of the untarred source.
+ populated with a fresh copy of the untarred source or a source repo.
* It is recommended that you remove any existing installed version of the
Python being built::
sudo rm -rf /Library/Frameworks/Python.framework/Versions/n.n
-
-The Recipe
-----------
-
-Here are the steps you need to follow to build a Python installer:
-
-* Run ``build-installer.py``. Optionally you can pass a number of arguments
- to specify locations of various files. Please see the top of
- ``build-installer.py`` for its usage.
-
- Running this script takes some time, it will not only build Python itself
- but also some 3th-party libraries that are needed for extensions.
-
-* When done the script will tell you where the DMG image is (by default
- somewhere in ``/tmp/_py``).
-
-Building other universal installers
-...................................
-
-It is also possible to build a 4-way universal installer that runs on
-OS X 10.5 Leopard or later::
-
- /usr/bin/python /build-installer.py \
- --dep-target=10.5
- --universal-archs=all
- --sdk-path=/Developer/SDKs/MacOSX10.5.sdk
-
-This requires that the deployment target is 10.5, and hence
-also that you are building on at least OS X 10.5. 4-way includes
-``i386``, ``x86_64``, ``ppc``, and ``ppc64`` (G5). ``ppc64`` executable
-variants can only be run on G5 machines running 10.5. Note that,
-while OS X 10.6 is only supported on Intel-based machines, it is possible
-to run ``ppc`` (32-bit) executables unmodified thanks to the Rosetta ppc
-emulation in OS X 10.5 and 10.6. The 4-way installer variant must be
-built with Xcode 3. It is not regularly built or tested.
-
-Other ``--universal-archs`` options are ``64-bit`` (``x86_64``, ``ppc64``),
-and ``3-way`` (``ppc``, ``i386``, ``x86_64``). None of these options
-are regularly exercised; use at your own risk.
-
-
-Testing
--------
-
-Ideally, the resulting binaries should be installed and the test suite run
-on all supported OS X releases and architectures. As a practical matter,
-that is generally not possible. At a minimum, variant 1 should be run on
-a PPC G4 system with OS X 10.5 and at least one Intel system running OS X
-10.9, 10.8, 10.7, 10.6, or 10.5. Variant 2 should be run on 10.9, 10.8,
-10.7, and 10.6 systems in both 32-bit and 64-bit modes.::
-
- /usr/local/bin/pythonn.n -m test -w -u all,-largefile
- /usr/local/bin/pythonn.n-32 -m test -w -u all
-
-Certain tests will be skipped and some cause the interpreter to fail
-which will likely generate ``Python quit unexpectedly`` alert messages
-to be generated at several points during a test run. These are normal
-during testing and can be ignored.
-
-It is also recommend to launch IDLE and verify that it is at least
-functional. Double-click on the IDLE app icon in ``/Applications/Python n.n``.
-It should also be tested from the command line::
-
- /usr/local/bin/idlen.n
-
"""
This script is used to build "official" universal installers on macOS.
+NEW for 3.9.0 and backports:
+- 2.7 end-of-life issues:
+ - Python 3 installs now update the Current version link
+ in /Library/Frameworks/Python.framework/Versions
+- fully support running under Python 3 as well as 2.7
+- support building on newer macOS systems with SIP
+- fully support building on macOS 10.9+
+- support 10.6+ on best effort
+- support bypassing docs build by supplying a prebuilt
+ docs html tarball in the third-party source library,
+  in the format and filename convention of those
+ downloadable from python.org:
+ python-3.x.y-docs-html.tar.bz2
+
NEW for 3.7.0:
- support Intel 64-bit-only () and 32-bit-only installer builds
- build and use internal Tcl/Tk 8.6 for 10.6+ builds
- use generic "gcc" as compiler (CC env var) rather than "gcc-4.2"
TODO:
-- support SDKROOT and DEVELOPER_DIR xcrun env variables
-- test with 10.5 and 10.4 and determine support status
-
-Please ensure that this script keeps working with Python 2.5, to avoid
-bootstrap issues (/usr/bin/python is Python 2.5 on OSX 10.5). Doc builds
-use current versions of Sphinx and require a reasonably current python3.
-Sphinx and dependencies are installed into a venv using the python3's pip
-so will fetch them from PyPI if necessary. Since python3 is now used for
-Sphinx, build-installer.py should also be converted to use python3!
-
-For 3.7.0, when building for a 10.6 or higher deployment target,
-build-installer builds and links with its own copy of Tcl/Tk 8.6.
-Otherwise, it requires an installed third-party version of
-Tcl/Tk 8.4 (for OS X 10.4 and 10.5 deployment targets), Tcl/TK 8.5
-(for 10.6 or later), or Tcl/TK 8.6 (for 10.9 or later)
-installed in /Library/Frameworks. When installed,
-the Python built by this script will attempt to dynamically link first to
-Tcl and Tk frameworks in /Library/Frameworks if available otherwise fall
-back to the ones in /System/Library/Framework. For the build, we recommend
-installing the most recent ActiveTcl 8.6. 8.5, or 8.4 version, depending
-on the deployment target. The actual version linked to depends on the
-path of /Library/Frameworks/{Tcl,Tk}.framework/Versions/Current.
+- test building with SDKROOT and DEVELOPER_DIR xcrun env variables
Usage: see USAGE variable in the script.
"""
INCLUDE_TIMESTAMP = 1
VERBOSE = 1
-from plistlib import Plist
+RUNNING_ON_PYTHON2 = sys.version_info.major == 2
-try:
+if RUNNING_ON_PYTHON2:
from plistlib import writePlist
-except ImportError:
- # We're run using python2.3
- def writePlist(plist, path):
- plist.write(path)
+else:
+ from plistlib import dump
+ def writePlist(path, plist):
+ with open(plist, 'wb') as fp:
+ dump(path, fp)
def shellQuote(value):
"""
"--libdir=/Library/Frameworks/Python.framework/Versions/%s/lib"%(getVersion(),),
],
patchscripts=[
- ("ftp://invisible-island.net/ncurses//5.9/ncurses-5.9-20120616-patch.sh.bz2",
+ ("ftp://ftp.invisible-island.net/ncurses//5.9/ncurses-5.9-20120616-patch.sh.bz2",
"f54bf02a349f96a7c4f0d00922f3a0d4"),
],
useLDFlags=False,
),
),
dict(
- name="SQLite 3.31.1",
- url="https://sqlite.org/2020/sqlite-autoconf-3310100.tar.gz",
- checksum='2d0a553534c521504e3ac3ad3b90f125',
+ name="SQLite 3.32.2",
+ url="https://sqlite.org/2020/sqlite-autoconf-3320200.tar.gz",
+ checksum='eb498918a33159cdf8104997aad29e83',
extra_cflags=('-Os '
'-DSQLITE_ENABLE_FTS5 '
'-DSQLITE_ENABLE_FTS4 '
curDir = os.getcwd()
os.chdir(buildDir)
runCommand('make clean')
- # Create virtual environment for docs builds with blurb and sphinx
- runCommand('make venv')
- runCommand('venv/bin/python3 -m pip install -U Sphinx==2.0.1')
- runCommand('make html PYTHON=venv/bin/python')
+
+ # Search third-party source directory for a pre-built version of the docs.
+ # Use the naming convention of the docs.python.org html downloads:
+ # python-3.9.0b1-docs-html.tar.bz2
+ doctarfiles = [ f for f in os.listdir(DEPSRC)
+ if f.startswith('python-'+getFullVersion())
+ if f.endswith('-docs-html.tar.bz2') ]
+ if doctarfiles:
+ doctarfile = doctarfiles[0]
+ if not os.path.exists('build'):
+ os.mkdir('build')
+ # if build directory existed, it was emptied by make clean, above
+ os.chdir('build')
+ # Extract the first archive found for this version into build
+ runCommand('tar xjf %s'%shellQuote(os.path.join(DEPSRC, doctarfile)))
+ # see if tar extracted a directory ending in -docs-html
+ archivefiles = [ f for f in os.listdir('.')
+ if f.endswith('-docs-html')
+ if os.path.isdir(f) ]
+ if archivefiles:
+ archivefile = archivefiles[0]
+ # make it our 'Docs/build/html' directory
+ print(' -- using pre-built python documentation from %s'%archivefile)
+ os.rename(archivefile, 'html')
+ os.chdir(buildDir)
+
+ htmlDir = os.path.join('build', 'html')
+ if not os.path.exists(htmlDir):
+ # Create virtual environment for docs builds with blurb and sphinx
+ runCommand('make venv')
+ runCommand('venv/bin/python3 -m pip install -U Sphinx==2.3.1')
+ runCommand('make html PYTHON=venv/bin/python')
+ os.rename(htmlDir, docdir)
os.chdir(curDir)
- if not os.path.exists(docdir):
- os.mkdir(docdir)
- os.rename(os.path.join(buildDir, 'build', 'html'), docdir)
def buildPython():
# Since the extra libs are not in their installed framework location
# during the build, augment the library path so that the interpreter
# will find them during its extension import sanity checks.
- os.environ['DYLD_LIBRARY_PATH'] = os.path.join(WORKDIR,
- 'libraries', 'usr', 'local', 'lib')
+
print("Running configure...")
runCommand("%s -C --enable-framework --enable-universalsdk=/ "
"--with-universal-archs=%s "
"%s "
"%s "
"%s "
+ "%s "
"LDFLAGS='-g -L%s/libraries/usr/local/lib' "
"CFLAGS='-g -I%s/libraries/usr/local/include' 2>&1"%(
shellQuote(os.path.join(SRCDIR, 'configure')),
UNIVERSALARCHS,
(' ', '--with-computed-gotos ')[PYTHON_3],
(' ', '--without-ensurepip ')[PYTHON_3],
+ (' ', "--with-openssl='%s/libraries/usr/local'"%(
+ shellQuote(WORKDIR)[1:-1],))[PYTHON_3],
(' ', "--with-tcltk-includes='-I%s/libraries/usr/local/include'"%(
shellQuote(WORKDIR)[1:-1],))[internalTk()],
(' ', "--with-tcltk-libs='-L%s/libraries/usr/local/lib -ltcl8.6 -ltk8.6'"%(
shellQuote(WORKDIR)[1:-1],
shellQuote(WORKDIR)[1:-1]))
+ # As of macOS 10.11 with SYSTEM INTEGRITY PROTECTION, DYLD_*
+ # environment variables are no longer automatically inherited
+ # by child processes from their parents. We used to just set
+ # DYLD_LIBRARY_PATH, pointing to the third-party libs,
+ # in build-installer.py's process environment and it was
+ # passed through the make utility into the environment of
+ # setup.py. Instead, we now append DYLD_LIBRARY_PATH to
+ # the existing RUNSHARED configuration value when we call
+ # make for extension module builds.
+
+ runshared_for_make = "".join([
+ " RUNSHARED=",
+ "'",
+ grepValue("Makefile", "RUNSHARED"),
+ ' DYLD_LIBRARY_PATH=',
+ os.path.join(WORKDIR, 'libraries', 'usr', 'local', 'lib'),
+ "'" ])
+
# Look for environment value BUILDINSTALLER_BUILDPYTHON_MAKE_EXTRAS
# and, if defined, append its value to the make command. This allows
# us to pass in version control tags, like GITTAG, to a build from a
make_extras = os.getenv("BUILDINSTALLER_BUILDPYTHON_MAKE_EXTRAS")
if make_extras:
- make_cmd = "make " + make_extras
+ make_cmd = "make " + make_extras + runshared_for_make
else:
- make_cmd = "make"
+ make_cmd = "make" + runshared_for_make
print("Running " + make_cmd)
runCommand(make_cmd)
- print("Running make install")
- runCommand("make install DESTDIR=%s"%(
- shellQuote(rootDir)))
+ make_cmd = "make install DESTDIR=%s %s"%(
+ shellQuote(rootDir),
+ runshared_for_make)
+ print("Running " + make_cmd)
+ runCommand(make_cmd)
- print("Running make frameworkinstallextras")
- runCommand("make frameworkinstallextras DESTDIR=%s"%(
- shellQuote(rootDir)))
+ make_cmd = "make frameworkinstallextras DESTDIR=%s %s"%(
+ shellQuote(rootDir),
+ runshared_for_make)
+ print("Running " + make_cmd)
+ runCommand(make_cmd)
- del os.environ['DYLD_LIBRARY_PATH']
print("Copying required shared libraries")
if os.path.exists(os.path.join(WORKDIR, 'libraries', 'Library')):
build_lib_dir = os.path.join(
data = fp.read()
fp.close()
# create build_time_vars dict
- exec(data)
+ if RUNNING_ON_PYTHON2:
+ exec(data)
+ else:
+ g_dict = {}
+ l_dict = {}
+ exec(data, g_dict, l_dict)
+ build_time_vars = l_dict['build_time_vars']
vars = {}
for k, v in build_time_vars.items():
if type(v) == type(''):
os.chdir(curdir)
- if PYTHON_3:
- # Remove the 'Current' link, that way we don't accidentally mess
- # with an already installed version of python 2
- os.unlink(os.path.join(rootDir, 'Library', 'Frameworks',
- 'Python.framework', 'Versions', 'Current'))
-
def patchFile(inPath, outPath):
data = fileContents(inPath)
data = data.replace('$FULL_VERSION', getFullVersion())
vers = getFullVersion()
major, minor = getVersionMajorMinor()
- pl = Plist(
+ pl = dict(
CFBundleGetInfoString="Python.%s %s"%(pkgname, vers,),
CFBundleIdentifier='org.python.Python.%s'%(pkgname,),
CFBundleName='Python.%s'%(pkgname,),
)
writePlist(pl, os.path.join(packageContents, 'Info.plist'))
- pl = Plist(
+ pl = dict(
IFPkgDescriptionDescription=readme,
IFPkgDescriptionTitle=recipe.get('long_name', "Python.%s"%(pkgname,)),
IFPkgDescriptionVersion=vers,
vers = getFullVersion()
major, minor = getVersionMajorMinor()
- pl = Plist(
+ pl = dict(
CFBundleGetInfoString="Python %s"%(vers,),
CFBundleIdentifier='org.python.Python',
CFBundleName='Python',
os.mkdir(rsrcDir)
makeMpkgPlist(os.path.join(pkgroot, 'Info.plist'))
- pl = Plist(
+ pl = dict(
IFPkgDescriptionTitle="Python",
IFPkgDescriptionVersion=getVersion(),
)
-{\rtf1\ansi\ansicpg1252\cocoartf2511
+{\rtf1\ansi\ansicpg1252\cocoartf2513
\cocoatextscaling0\cocoaplatform0{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fswiss\fcharset0 Helvetica-Bold;\f2\fswiss\fcharset0 Helvetica-Oblique;
\f3\fmodern\fcharset0 CourierNewPSMT;}
{\colortbl;\red255\green255\blue255;}
macOS 10.15 (Catalina) Gatekeeper Requirements [changed in 3.8.2]\
\f0\b0 \ulnone \
-As of 2020-02-03, Apple has changed how third-party installer packages, like those provided by python.org, are notarized for verification by Gatekeeper and begun enforcing additional requirements such as code signing and use of the hardened runtime. As of 3.8.2, python.org installer packages now meet those additional notarization requirements. The necessary changes in packaging should be transparent to your use of Python but, in the unlikely event that you encounter changes in behavior between 3.8.1 and 3.8.2 in areas like ctypes, importlib, or mmap, please check bugs.python.org for existing reports and, if necessary, open a new issue.\
+As of 2020-02-03, Apple has changed how third-party installer packages, like those provided by python.org, are notarized for verification by Gatekeeper and begun enforcing additional requirements such as code signing and use of the hardened runtime. As of 3.8.2, python.org installer packages now meet those additional notarization requirements. The necessary changes in packaging should be transparent to your use of Python but, in the unlikely event that you encounter changes in behavior between 3.8.1 and newer 3.8.x releases in areas like ctypes, importlib, or mmap, please check bugs.python.org for existing reports and, if necessary, open a new issue.\
+
+\f1\b \ul \
+Python 2.7 end-of-life [changed in 3.8.4]\
+\
+
+\f0\b0 \ulnone Python 2.7 has now reached end-of-life. As of Python 3.8.4, the
+\f3 Python Launcher
+\f0 app now has
+\f3 python3
+\f0 factory defaults. Also, the
+\f3 Current
+\f0 link in the
+\f3 /Library/Frameworks/Python.framework/Versions
+\f0 directory is now updated to point to the Python 3 being installed; previously, only Python 2 installs updated
+\f3 Current
+\f0 . This change might affect developers using the framework to embed Python in their applications. If another version is desired for embedding, the
+\f3 Current
+\f0 symlink can be changed manually without affecting 3.8.x behavior.\
\f1\b \ul \
Other changes\
# make link in /Applications/Python m.n/ for Finder users
if [ -d "${APPDIR}" ]; then
ln -fhs "${FWK_DOCDIR}/index.html" "${APPDIR}/Python Documentation.html"
- open "${APPDIR}" || true # open the applications folder
+ if [ "${COMMAND_LINE_INSTALL}" != 1 ]; then
+ open "${APPDIR}" || true # open the applications folder
+ fi
fi
# make share/doc link in framework for command line users
<false/>
<key>interpreter_list</key>
<array>
- <string>/usr/local/bin/pythonw</string>
- <string>/usr/bin/pythonw</string>
- <string>/sw/bin/pythonw</string>
+ <string>/usr/local/bin/python3</string>
+ <string>/opt/local/bin/python3</string>
+ <string>/sw/bin/python3</string>
</array>
<key>honourhashbang</key>
<false/>
<false/>
<key>interpreter_list</key>
<array>
- <string>/usr/local/bin/pythonw</string>
- <string>/usr/local/bin/python</string>
- <string>/usr/bin/pythonw</string>
- <string>/usr/bin/python</string>
- <string>/sw/bin/pythonw</string>
- <string>/sw/bin/python</string>
+ <string>/usr/local/bin/python3</string>
+ <string>/opt/local/bin/python3</string>
+ <string>/sw/bin/python3</string>
</array>
<key>honourhashbang</key>
<false/>
<false/>
<key>interpreter_list</key>
<array>
- <string>/usr/local/bin/pythonw</string>
- <string>/usr/local/bin/python</string>
- <string>/usr/bin/pythonw</string>
- <string>/usr/bin/python</string>
- <string>/sw/bin/pythonw</string>
- <string>/sw/bin/python</string>
+ <string>/usr/local/bin/python3</string>
+ <string>/opt/local/bin/python3</string>
+ <string>/sw/bin/python3</string>
</array>
<key>honourhashbang</key>
<false/>
<key>CFBundleExecutable</key>
<string>Python</string>
<key>CFBundleGetInfoString</key>
- <string>%version%, (c) 2001-2016 Python Software Foundation.</string>
+ <string>%version%, (c) 2001-2020 Python Software Foundation.</string>
<key>CFBundleHelpBookFolder</key>
<array>
<string>Documentation</string>
<key>NSAppleScriptEnabled</key>
<true/>
<key>NSHumanReadableCopyright</key>
- <string>(c) 2001-2016 Python Software Foundation.</string>
+ <string>(c) 2001-2020 Python Software Foundation.</string>
<key>NSHighResolutionCapable</key>
<true/>
</dict>
Guido Kollerie
Jacek Kołodziej
Jacek Konieczny
+Krzysztof Konopko
Arkady Koplyarov
Peter A. Koren
Марк Коренберг
Bryan Olson
Grant Olson
Koray Oner
+Ethan Onstott
Piet van Oostrum
Tomas Oppelstrup
Jason Orendorff
Victor Terrón
Pablo Galindo
Richard M. Tew
+Srinivas Reddy Thatiparthy
Tobias Thelen
Christian Theune
Févry Thibault
Python News
+++++++++++
+What's New in Python 3.8.4 final?
+=================================
+
+*Release date: 2020-07-13*
+
+Security
+--------
+
+- bpo-41162: Audit hooks are now cleared later during finalization to avoid
+ missing events.
+
+- bpo-29778: Ensure :file:`python3.dll` is loaded from correct locations
+ when Python is embedded (CVE-2020-15523).
+
+Core and Builtins
+-----------------
+
+- bpo-41247: Always cache the running loop holder when running
+ ``asyncio.set_running_loop``.
+
+- bpo-41252: Fix incorrect refcounting in _ssl.c's
+ ``_servername_callback()``.
+
+- bpo-41218: Python 3.8.3 had a regression where compiling with
+  ast.PyCF_ALLOW_TOP_LEVEL_AWAIT would aggressively mark list comprehensions
+  with CO_COROUTINE. Now only list comprehensions making use of async/await
+  will be tagged as such.
+
+- bpo-41175: Guard against a NULL pointer dereference within bytearrayobject
+ triggered by the ``bytearray() + bytearray()`` operation.
+
+- bpo-39960: The "hackcheck" that prevents sneaking around a type's
+ __setattr__() by calling the superclass method was rewritten to allow C
+ implemented heap types.
+
+Library
+-------
+
+- bpo-41235: Fix the error handling in
+ :meth:`ssl.SSLContext.load_dh_params`.
+
+- bpo-41193: The ``write_history()`` atexit function of the readline
+  completer now ignores any :exc:`OSError` (for example, when the filesystem
+  is read-only), instead of only ignoring :exc:`FileNotFoundError` and
+ :exc:`PermissionError`.
+
+- bpo-41043: Fixed the use of :func:`~glob.glob` in the stdlib: literal part
+  of the path is now always correctly escaped (see the sketch after this list).
+
+- bpo-39384: Fixed email.contentmanager to allow set_content() to set a null
+ string.
+
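A minimal sketch, not stdlib code, of why the escaping in the bpo-41043 change
above is needed; the directory name is a made-up example that happens to
contain glob metacharacters::

    import glob
    import os

    base = "/tmp/[build]"    # hypothetical path containing glob metacharacters

    unsafe = os.path.join(base, "*.pth")             # "[build]" is read as a character class
    safe = os.path.join(glob.escape(base), "*.pth")  # literal part escaped, wildcard kept

    print(glob.glob(unsafe))  # typically [] even when .pth files exist under /tmp/[build]
    print(glob.glob(safe))    # matches files under the literal "[build]" directory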
+IDLE
+----
+
+- bpo-37765: Add keywords to module name completion list. Rewrite
+ Completions section of IDLE doc.
+
+- bpo-41152: The encoding of ``stdin``, ``stdout`` and ``stderr`` in IDLE is
+ now always UTF-8.
+
+
+What's New in Python 3.8.4 release candidate 1?
+===============================================
+
+*Release date: 2020-06-29*
+
+Security
+--------
+
+- bpo-41004: The __hash__() methods of ipaddress.IPv4Interface and
+ ipaddress.IPv6Interface incorrectly generated constant hash values of 32
+  and 128 respectively. This caused all instances to collide in hash-based
+  containers. The fix uses hash() to generate hash values for the tuple of
+  (address, mask length, network address); a sketch of the approach follows
+  this section.
+
+- bpo-39073: Disallow CR or LF in email.headerregistry.Address arguments to
+ guard against header injection attacks.
+
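An illustrative sketch, not the actual ipaddress implementation, of the hashing
approach described in the bpo-41004 entry above; the class and attribute names
are invented for the example::

    class InterfaceLike:
        """Toy stand-in for IPv4Interface/IPv6Interface."""

        def __init__(self, address, masklen, network_address):
            self._key = (address, masklen, network_address)

        def __eq__(self, other):
            return isinstance(other, InterfaceLike) and self._key == other._key

        def __hash__(self):
            # Hash the distinguishing tuple instead of returning a constant,
            # so distinct interfaces no longer all collide.
            return hash(self._key)

    a = InterfaceLike("1.2.3.4", 24, "1.2.3.0")
    b = InterfaceLike("2.3.4.5", 24, "2.3.4.0")
    print(hash(a) != hash(b))   # True: hashes now differ between interfaces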
+Core and Builtins
+-----------------
+
+- bpo-41094: Fix decoding errors with auditing when opening files with
+  non-ASCII names on a non-UTF-8 locale.
+
+- bpo-41056: Fixes a reference to deallocated stack space during startup
+ when constructing sys.path involving a relative symlink when code was
+ supplied via -c. (discovered via Coverity)
+
+- bpo-35975: Stefan Behnel reported that cf_feature_version is used even
+ when PyCF_ONLY_AST is not set. This is against the intention and against
+ the documented behavior, so it's been fixed.
+
+- bpo-40957: Fix refleak in _Py_fopen_obj() when PySys_Audit() fails.
+
+- bpo-40870: Raise :exc:`ValueError` when validating custom AST's where the
+ constants ``True``, ``False`` and ``None`` are used within a
+ :class:`ast.Name` node.
+
+- bpo-40826: Fix GIL usage in :c:func:`PyOS_Readline`: lock the GIL to set
+ an exception and pass the Python thread state when checking if there is a
+ pending signal.
+
+- bpo-40824: Unexpected errors in calling the ``__iter__`` method are no
+ longer masked by ``TypeError`` in the :keyword:`in` operator and functions
+ :func:`~operator.contains`, :func:`~operator.indexOf` and
+ :func:`~operator.countOf` of the :mod:`operator` module.
+
+- bpo-40663: Correctly generate annotations where parentheses are omitted
+  but required (e.g. ``Type[(str, int, *other)]``).
+
+Library
+-------
+
+- bpo-41138: Fixed the :mod:`trace` module CLI for Python source files with
+ non-UTF-8 encoding.
+
+- bpo-31938: Fix default-value signatures of several functions in the
+ :mod:`select` module - by Anthony Sottile.
+
+- bpo-41068: Fixed reading files with non-ASCII names from ZIP archive
+ directly after writing them.
+
+- bpo-41058: :func:`pdb.find_function` now correctly determines the source
+ file encoding.
+
+- bpo-41056: Fix a NULL pointer dereference within the ssl module during a
+ MemoryError in the keylog callback. (discovered by Coverity)
+
+- bpo-41048: :func:`mimetypes.read_mime_types` function reads the rule file
+ using UTF-8 encoding, not the locale encoding. Patch by Srinivas Reddy
+ Thatiparthy.
+
+- bpo-40448: :mod:`ensurepip` now disables the use of `pip` cache when
+ installing the bundled versions of `pip` and `setuptools`. Patch by
+ Krzysztof Konopko.
+
+- bpo-40855: The standard deviation and variance functions in the statistics
+ module were ignoring their mu and xbar arguments.
+
+- bpo-40807: Stop codeop._maybe_compile, used by code.InteractiveInterpreter
+  (and IDLE), from emitting each warning three times.
+
+- bpo-40834: Fix truncation when sending a str object with
+  ``_xxsubinterpreters.channel_send``.
+
+- bpo-38488: Update ensurepip to install pip 20.1.1 and setuptools 47.1.0.
+
+- bpo-40767: :mod:`webbrowser` now properly finds the default browser in
+ pure Wayland systems by checking the WAYLAND_DISPLAY environment variable.
+ Patch contributed by Jérémy Attali.
+
+- bpo-40795: :mod:`ctypes` module: If ctypes fails to convert the result of
+ a callback or if a ctypes callback function raises an exception,
+ sys.unraisablehook is now called with an exception set. Previously, the
+ error was logged into stderr by :c:func:`PyErr_Print`.
+
+- bpo-30008: Fix :mod:`ssl` code to be compatible with OpenSSL 1.1.x builds
+ that use ``no-deprecated`` and ``--api=1.1.0``.
+
+- bpo-40614: :func:`ast.parse` will not parse self-documenting expressions
+  in f-strings when the passed ``feature_version`` is less than ``(3, 8)``.
+
+- bpo-40626: Add h5 file extension as MIME Type application/x-hdf5, as per
+ HDF Group recommendation for HDF5 formatted data files. Patch contributed
+ by Mark Schwab.
+
+- bpo-25872: :mod:`linecache` could crash with a :exc:`KeyError` when
+ accessed from multiple threads. Fix by Michael Graczyk.
+
+- bpo-40597: If text content lines are longer than policy.max_line_length,
+ always use a content-encoding to make sure they are wrapped.
+
+- bpo-40515: The :mod:`ssl` and :mod:`hashlib` modules now actively check
+  that OpenSSL is built with thread support. Python 3.7.0 made thread
+  support mandatory and no longer works safely with no-thread builds.
+
+- bpo-13097: ``ctypes`` now raises an ``ArgumentError`` when a callback is
+ invoked with more than 1024 arguments.
+
+- bpo-40457: The ssl module now supports OpenSSL builds without TLS 1.0 and
+  1.1 methods.
+
+- bpo-39830: Add :class:`zipfile.Path` to ``__all__`` in the :mod:`zipfile`
+ module.
+
+- bpo-40025: Raise TypeError when _generate_next_value_ is defined after
+ members. Patch by Ethan Onstott.
+
+- bpo-39244: Fixed :class:`multiprocessing.context.get_all_start_methods` to
+ properly return the default method first on macOS.
+
+- bpo-39040: Fix parsing of invalid mime headers parameters by collapsing
+ whitespace between encoded words in a bare-quote-string.
+
+- bpo-35714: :exc:`struct.error` is now raised if there is a null character
+ in a :mod:`struct` format string.
+
+- bpo-36290: AST nodes are now raising :exc:`TypeError` on conflicting
+ keyword arguments. Patch contributed by Rémi Lapeyre.
+
+- bpo-29620: :func:`~unittest.TestCase.assertWarns` no longer raises a
+ ``RuntimeException`` when accessing a module's ``__warningregistry__``
+ causes importation of a new module, or when a new module is imported in
+ another thread. Patch by Kernc.
+
+- bpo-34226: Fix `cgi.parse_multipart` without content_length. Patch by
+  Roger Duran.
+
+Tests
+-----
+
+- bpo-41085: Fix integer overflow in the :meth:`array.array.index` method on
+ 64-bit Windows for index larger than ``2**31``.
+
+- bpo-38377: On Linux, skip tests using multiprocessing if the current user
+ cannot create a file in ``/dev/shm/`` directory. Add the
+ :func:`~test.support.skip_if_broken_multiprocessing_synchronize` function
+ to the :mod:`test.support` module.
+
+- bpo-41009: Fix use of ``support.require_{linux|mac|freebsd}_version()``
+ decorators as class decorator.
+
+- bpo-41003: Fix ``test_copyreg`` when ``numpy`` is installed:
+ ``test.pickletester`` now saves/restores warnings filters when importing
+ ``numpy``, to ignore filters installed by ``numpy``.
+
+- bpo-40964: Disable remote :mod:`imaplib` tests, host cyrus.andrew.cmu.edu
+ is blocking incoming connections.
+
+- bpo-40055: distutils.tests now saves/restores warnings filters to leave
+ them unchanged. Importing tests imports docutils which imports
+ pkg_resources which adds a warnings filter.
+
+- bpo-34401: Make test_gdb properly run on HP-UX. Patch by Michael Osipov.
+
+Build
+-----
+
+- bpo-40204: Pin Sphinx version to 2.3.1 in ``Doc/Makefile``.
+
+- bpo-40653: Move _dirnameW out of HAVE_SYMLINK to fix a potential compiling
+ issue.
+
+Windows
+-------
+
+- bpo-41074: Fixed support of non-ASCII names in functions
+ :func:`msilib.OpenDatabase` and :func:`msilib.init_database` and non-ASCII
+ SQL in method :meth:`msilib.Database.OpenView`.
+
+- bpo-40164: Updates Windows OpenSSL to 1.1.1g.
+
+- bpo-39631: Changes the registered MIME type for ``.py`` files on Windows
+ to ``text/x-python`` instead of ``text/plain``.
+
+- bpo-40677: Manually define IO_REPARSE_TAG_APPEXECLINK in case some old
+ Windows SDK doesn't have it.
+
+- bpo-40650: Include winsock2.h in pytime.c for timeval.
+
+- bpo-39148: Add IPv6 support to :mod:`asyncio` datagram endpoints in
+  ProactorEventLoop. Change the raised exception for unknown address
+  families to ValueError, as it does not come from the Windows API.
+
+macOS
+-----
+
+- bpo-39580: Avoid opening Finder window if running installer from the
+ command line. Patch contributed by Rick Heil.
+
+- bpo-41100: Fix configure error when building on macOS 11. Note that the
+ current Python release was released shortly after the first developer
+ preview of macOS 11 (Big Sur); there are other known issues with building
+ and running on the developer preview. Big Sur is expected to be fully
+ supported in a future bugfix release of Python 3.8.x and with 3.9.0.
+
+- bpo-41005: Fixed an XDG settings issue not allowing macOS to open the
+  browser in webbrowser.py.
+
+- bpo-40741: Update macOS installer to use SQLite 3.32.2.
+
+IDLE
+----
+
+- bpo-41144: Make Open Module open a special module such as os.path.
+
+- bpo-39885: Make context menu Cut and Copy work again when right-clicking
+ within a selection.
+
+- bpo-40723: Make test_idle pass when run after import.
+
+Tools/Demos
+-----------
+
+- bpo-40479: Update multissltest helper to test with latest OpenSSL 1.0.2,
+ 1.1.0, 1.1.1, and 3.0.0-alpha.
+
+- bpo-40163: Fix multissltest tool. OpenSSL has changed the download URL for
+  old releases. The multissltest tool now tries to download from the current
+  and old download URLs.
+
+
What's New in Python 3.8.3 final?
=================================
static int
set_running_loop(PyObject *loop)
{
- cached_running_holder = NULL;
- cached_running_holder_tsid = 0;
-
PyObject *ts_dict = PyThreadState_GetDict(); // borrowed
if (ts_dict == NULL) {
PyErr_SetString(
}
Py_DECREF(rl);
+ cached_running_holder = (PyObject *)rl;
+
+ /* safe to assume state is not NULL as the call to PyThreadState_GetDict()
+ above already checks if state is NULL */
+ cached_running_holder_tsid = PyThreadState_Get()->id;
+
return 0;
}
pArgs++;
}
-#define CHECK(what, x) \
-if (x == NULL) _PyTraceback_Add(what, "_ctypes/callbacks.c", __LINE__ - 1), PyErr_Print()
-
if (flags & (FUNCFLAG_USE_ERRNO | FUNCFLAG_USE_LASTERROR)) {
error_object = _ctypes_get_errobj(&space);
if (error_object == NULL)
}
result = PyObject_CallObject(callable, arglist);
- CHECK("'calling callback function'", result);
+ if (result == NULL) {
+ _PyErr_WriteUnraisableMsg("on calling ctypes callback function",
+ callable);
+ }
#ifdef MS_WIN32
if (flags & FUNCFLAG_USE_LASTERROR) {
}
Py_XDECREF(error_object);
- if ((restype != &ffi_type_void) && result) {
- PyObject *keep;
+ if (restype != &ffi_type_void && result) {
assert(setfunc);
+
#ifdef WORDS_BIGENDIAN
- /* See the corresponding code in callproc.c, around line 961 */
- if (restype->type != FFI_TYPE_FLOAT && restype->size < sizeof(ffi_arg))
+ /* See the corresponding code in _ctypes_callproc():
+ in callproc.c, around line 1219. */
+ if (restype->type != FFI_TYPE_FLOAT && restype->size < sizeof(ffi_arg)) {
mem = (char *)mem + sizeof(ffi_arg) - restype->size;
+ }
#endif
- keep = setfunc(mem, result, 0);
- CHECK("'converting callback result'", keep);
+
/* keep is an object we have to keep alive so that the result
stays valid. If there is no such object, the setfunc will
have returned Py_None.
be the result. EXCEPT when restype is py_object - Python
itself knows how to manage the refcount of these objects.
*/
- if (keep == NULL) /* Could not convert callback result. */
- PyErr_WriteUnraisable(callable);
- else if (keep == Py_None) /* Nothing to keep */
+ PyObject *keep = setfunc(mem, result, 0);
+
+ if (keep == NULL) {
+ /* Could not convert callback result. */
+ _PyErr_WriteUnraisableMsg("on converting result "
+ "of ctypes callback function",
+ callable);
+ }
+ else if (keep == Py_None) {
+ /* Nothing to keep */
Py_DECREF(keep);
+ }
else if (setfunc != _ctypes_get_fielddesc("O")->setfunc) {
if (-1 == PyErr_WarnEx(PyExc_RuntimeWarning,
"memory leak in callback function.",
1))
- PyErr_WriteUnraisable(callable);
+ {
+ _PyErr_WriteUnraisableMsg("on converting result "
+ "of ctypes callback function",
+ callable);
+ }
}
}
+
Py_XDECREF(result);
+
Done:
Py_XDECREF(arglist);
PyGILState_Release(state);
#define IS_PASS_BY_REF(x) (x > 8 || !POW2(x))
#endif
+/*
+ * bpo-13097: Max number of arguments _ctypes_callproc will accept.
+ *
+ * This limit is enforced for the `alloca()` call in `_ctypes_callproc`,
+ * to avoid allocating a massive buffer on the stack.
+ */
+#define CTYPES_MAX_ARGCOUNT 1024
+
/*
* Requirements, must be ensured by the caller:
* - argtuple is tuple of arguments
++argcount;
#endif
+ if (argcount > CTYPES_MAX_ARGCOUNT)
+ {
+ PyErr_Format(PyExc_ArgError, "too many arguments (%zi), maximum is %i",
+ argcount, CTYPES_MAX_ARGCOUNT);
+ return NULL;
+ }
+
args = (struct argument *)alloca(sizeof(struct argument) * argcount);
if (!args) {
PyErr_NoMemory();
if (rtype->type != FFI_TYPE_FLOAT
&& rtype->type != FFI_TYPE_STRUCT
&& rtype->size < sizeof(ffi_arg))
+ {
resbuf = (char *)resbuf + sizeof(ffi_arg) - rtype->size;
+ }
#endif
#ifdef MS_WIN32
if (name != Py_None) {
if (PyUnicode_FSConverter(name, &name2) == 0)
return NULL;
- if (PyBytes_Check(name2))
- name_str = PyBytes_AS_STRING(name2);
- else
- name_str = PyByteArray_AS_STRING(name2);
+ name_str = PyBytes_AS_STRING(name2);
} else {
name_str = NULL;
name2 = NULL;
}
- if (PySys_Audit("ctypes.dlopen", "s", name_str) < 0) {
+ if (PySys_Audit("ctypes.dlopen", "O", name) < 0) {
return NULL;
}
handle = ctypes_dlopen(name_str, mode);
const mpd_context_t *ctx, uint32_t *status)
{
_mpd_qdiv(SET_IDEAL_EXP, q, a, b, ctx, status);
-
- if (*status & MPD_Malloc_error) {
- /* Inexact quotients (the usual case) fill the entire context precision,
- * which can lead to malloc() failures for very high precisions. Retry
- * the operation with a lower precision in case the result is exact.
- *
- * We need an upper bound for the number of digits of a_coeff / b_coeff
- * when the result is exact. If a_coeff' * 1 / b_coeff' is in lowest
- * terms, then maxdigits(a_coeff') + maxdigits(1 / b_coeff') is a suitable
- * bound.
- *
- * 1 / b_coeff' is exact iff b_coeff' exclusively has prime factors 2 or 5.
- * The largest amount of digits is generated if b_coeff' is a power of 2 or
- * a power of 5 and is less than or equal to log5(b_coeff') <= log2(b_coeff').
- *
- * We arrive at a total upper bound:
- *
- * maxdigits(a_coeff') + maxdigits(1 / b_coeff') <=
- * a->digits + log2(b_coeff) =
- * a->digits + log10(b_coeff) / log10(2) <=
- * a->digits + b->digits * 4;
- */
- uint32_t workstatus = 0;
- mpd_context_t workctx = *ctx;
- workctx.prec = a->digits + b->digits * 4;
- if (workctx.prec >= ctx->prec) {
- return; /* No point in retrying, keep the original error. */
- }
-
- _mpd_qdiv(SET_IDEAL_EXP, q, a, b, &workctx, &workstatus);
- if (workstatus == 0) { /* The result is exact, unrounded, normal etc. */
- *status = 0;
- return;
- }
-
- mpd_seterror(q, *status, status);
- }
}
/* Internal function. */
/* END LIBMPDEC_ONLY */
/* Algorithm from decimal.py */
-static void
-_mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx,
- uint32_t *status)
+void
+mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx,
+ uint32_t *status)
{
mpd_context_t maxcontext;
MPD_NEW_STATIC(c,0,0,0,0);
goto out;
}
-void
-mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx,
- uint32_t *status)
-{
- _mpd_qsqrt(result, a, ctx, status);
-
- if (*status & (MPD_Malloc_error|MPD_Division_impossible)) {
- /* The above conditions can occur at very high context precisions
- * if intermediate values get too large. Retry the operation with
- * a lower context precision in case the result is exact.
- *
- * If the result is exact, an upper bound for the number of digits
- * is the number of digits in the input.
- *
- * NOTE: sqrt(40e9) = 2.0e+5 /\ digits(40e9) = digits(2.0e+5) = 2
- */
- uint32_t workstatus = 0;
- mpd_context_t workctx = *ctx;
- workctx.prec = a->digits;
-
- if (workctx.prec >= ctx->prec) {
- return; /* No point in repeating this, keep the original error. */
- }
-
- _mpd_qsqrt(result, a, &workctx, &workstatus);
- if (workstatus == 0) {
- *status = 0;
- return;
- }
-
- mpd_seterror(result, *status, status);
- }
-}
-
/******************************************************************************/
/* Base conversions */
'special': ('context.__reduce_ex__', 'context.create_decimal_from_float')
}
-# Functions that set no context flags but whose result can differ depending
-# on prec, Emin and Emax.
-MaxContextSkip = ['is_normal', 'is_subnormal', 'logical_invert', 'next_minus',
- 'next_plus', 'number_class', 'logical_and', 'logical_or',
- 'logical_xor', 'next_toward', 'rotate', 'shift']
-
# Functions that require a restricted exponent range for reasonable runtimes.
UnaryRestricted = [
'__ceil__', '__floor__', '__int__', '__trunc__',
self.pex = RestrictedList() # Python exceptions for P.Decimal
self.presults = RestrictedList() # P.Decimal results
- # If the above results are exact, unrounded and not clamped, repeat
- # the operation with a maxcontext to ensure that huge intermediate
- # values do not cause a MemoryError.
- self.with_maxcontext = False
- self.maxcontext = context.c.copy()
- self.maxcontext.prec = C.MAX_PREC
- self.maxcontext.Emax = C.MAX_EMAX
- self.maxcontext.Emin = C.MIN_EMIN
- self.maxcontext.clear_flags()
-
- self.maxop = RestrictedList() # converted C.Decimal operands
- self.maxex = RestrictedList() # Python exceptions for C.Decimal
- self.maxresults = RestrictedList() # C.Decimal results
-
# ======================================================================
# SkipHandler: skip known discrepancies
if t.contextfunc:
cargs = t.cop
pargs = t.pop
- maxargs = t.maxop
cfunc = "c_func: %s(" % t.funcname
pfunc = "p_func: %s(" % t.funcname
- maxfunc = "max_func: %s(" % t.funcname
else:
cself, cargs = t.cop[0], t.cop[1:]
pself, pargs = t.pop[0], t.pop[1:]
- maxself, maxargs = t.maxop[0], t.maxop[1:]
cfunc = "c_func: %s.%s(" % (repr(cself), t.funcname)
pfunc = "p_func: %s.%s(" % (repr(pself), t.funcname)
- maxfunc = "max_func: %s.%s(" % (repr(maxself), t.funcname)
err = cfunc
for arg in cargs:
err = err.rstrip(", ")
err += ")"
- if t.with_maxcontext:
- err += "\n"
- err += maxfunc
- for arg in maxargs:
- err += "%s, " % repr(arg)
- err = err.rstrip(", ")
- err += ")"
-
return err
def raise_error(t):
err = "Error in %s:\n\n" % t.funcname
err += "input operands: %s\n\n" % (t.op,)
err += function_as_string(t)
-
- err += "\n\nc_result: %s\np_result: %s\n" % (t.cresults, t.presults)
- if t.with_maxcontext:
- err += "max_result: %s\n\n" % (t.maxresults)
- else:
- err += "\n"
-
- err += "c_exceptions: %s\np_exceptions: %s\n" % (t.cex, t.pex)
- if t.with_maxcontext:
- err += "max_exceptions: %s\n\n" % t.maxex
- else:
- err += "\n"
-
- err += "%s\n" % str(t.context)
- if t.with_maxcontext:
- err += "%s\n" % str(t.maxcontext)
- else:
- err += "\n"
+ err += "\n\nc_result: %s\np_result: %s\n\n" % (t.cresults, t.presults)
+ err += "c_exceptions: %s\np_exceptions: %s\n\n" % (t.cex, t.pex)
+ err += "%s\n\n" % str(t.context)
raise VerifyError(err)
# are printed to stdout.
# ======================================================================
-def all_nan(a):
- if isinstance(a, C.Decimal):
- return a.is_nan()
- elif isinstance(a, tuple):
- return all(all_nan(v) for v in a)
- return False
-
def convert(t, convstr=True):
""" t is the testset. At this stage the testset contains a tuple of
operands t.op of various types. For decimal methods the first
for i, op in enumerate(t.op):
context.clear_status()
- t.maxcontext.clear_flags()
if op in RoundModes:
t.cop.append(op)
t.pop.append(op)
- t.maxop.append(op)
elif not t.contextfunc and i == 0 or \
convstr and isinstance(op, str):
p = None
pex = e.__class__
- try:
- C.setcontext(t.maxcontext)
- maxop = C.Decimal(op)
- maxex = None
- except (TypeError, ValueError, OverflowError) as e:
- maxop = None
- maxex = e.__class__
- finally:
- C.setcontext(context.c)
-
t.cop.append(c)
t.cex.append(cex)
-
t.pop.append(p)
t.pex.append(pex)
- t.maxop.append(maxop)
- t.maxex.append(maxex)
-
if cex is pex:
if str(c) != str(p) or not context.assert_eq_status():
raise_error(t)
else:
raise_error(t)
- # The exceptions in the maxcontext operation can legitimately
- # differ, only test that maxex implies cex:
- if maxex is not None and cex is not maxex:
- raise_error(t)
-
elif isinstance(op, Context):
t.context = op
t.cop.append(op.c)
t.pop.append(op.p)
- t.maxop.append(t.maxcontext)
else:
t.cop.append(op)
t.pop.append(op)
- t.maxop.append(op)
return 1
t.rc and t.rp are the results of the operation.
"""
context.clear_status()
- t.maxcontext.clear_flags()
try:
if t.contextfunc:
t.rp = None
t.pex.append(e.__class__)
- # If the above results are exact, unrounded, normal etc., repeat the
- # operation with a maxcontext to ensure that huge intermediate values
- # do not cause a MemoryError.
- if (t.funcname not in MaxContextSkip and
- not context.c.flags[C.InvalidOperation] and
- not context.c.flags[C.Inexact] and
- not context.c.flags[C.Rounded] and
- not context.c.flags[C.Subnormal] and
- not context.c.flags[C.Clamped] and
- not context.clamp and # results are padded to context.prec if context.clamp==1.
- not any(isinstance(v, C.Context) for v in t.cop)): # another context is used.
- t.with_maxcontext = True
- try:
- if t.contextfunc:
- maxargs = t.maxop
- t.rmax = getattr(t.maxcontext, t.funcname)(*maxargs)
- else:
- maxself = t.maxop[0]
- maxargs = t.maxop[1:]
- try:
- C.setcontext(t.maxcontext)
- t.rmax = getattr(maxself, t.funcname)(*maxargs)
- finally:
- C.setcontext(context.c)
- t.maxex.append(None)
- except (TypeError, ValueError, OverflowError, MemoryError) as e:
- t.rmax = None
- t.maxex.append(e.__class__)
-
def verify(t, stat):
""" t is the testset. At this stage the testset contains the following
tuples:
"""
t.cresults.append(str(t.rc))
t.presults.append(str(t.rp))
- if t.with_maxcontext:
- t.maxresults.append(str(t.rmax))
-
if isinstance(t.rc, C.Decimal) and isinstance(t.rp, P.Decimal):
# General case: both results are Decimals.
t.cresults.append(t.rc.to_eng_string())
t.presults.append(str(t.rp.imag))
t.presults.append(str(t.rp.real))
- if t.with_maxcontext and isinstance(t.rmax, C.Decimal):
- t.maxresults.append(t.rmax.to_eng_string())
- t.maxresults.append(t.rmax.as_tuple())
- t.maxresults.append(str(t.rmax.imag))
- t.maxresults.append(str(t.rmax.real))
-
nc = t.rc.number_class().lstrip('+-s')
stat[nc] += 1
else:
if not isinstance(t.rc, tuple) and not isinstance(t.rp, tuple):
if t.rc != t.rp:
raise_error(t)
- if t.with_maxcontext and not isinstance(t.rmax, tuple):
- if t.rmax != t.rc:
- raise_error(t)
stat[type(t.rc).__name__] += 1
# The return value lists must be equal.
if not t.context.assert_eq_status():
raise_error(t)
- if t.with_maxcontext:
- # NaN payloads etc. depend on precision and clamp.
- if all_nan(t.rc) and all_nan(t.rmax):
- return
- # The return value lists must be equal.
- if t.maxresults != t.cresults:
- raise_error(t)
- # The Python exception lists (TypeError, etc.) must be equal.
- if t.maxex != t.cex:
- raise_error(t)
- # The context flags must be equal.
- if t.maxcontext.flags != t.context.c.flags:
- raise_error(t)
-
# ======================================================================
# Main test loops
#include <openssl/objects.h>
#include "openssl/err.h"
+#ifndef OPENSSL_THREADS
+# error "OPENSSL_THREADS is not defined, Python requires thread-safe OpenSSL"
+#endif
+
#if (OPENSSL_VERSION_NUMBER < 0x10100000L) || defined(LIBRESSL_VERSION_NUMBER)
/* OpenSSL < 1.1.0 */
#define EVP_MD_CTX_new EVP_MD_CTX_create
# endif
#endif
+#ifndef OPENSSL_THREADS
+# error "OPENSSL_THREADS is not defined, Python requires thread-safe OpenSSL"
+#endif
+
/* SSL error object */
static PyObject *PySSLErrorObject;
static PyObject *PySSLCertVerificationErrorObject;
# define PY_OPENSSL_1_1_API 1
#endif
+/* OpenSSL API compat */
+#ifdef OPENSSL_API_COMPAT
+#if OPENSSL_API_COMPAT >= 0x10100000L
+
+/* OpenSSL API 1.1.0+ does not include version methods */
+#ifndef OPENSSL_NO_TLS1_METHOD
+#define OPENSSL_NO_TLS1_METHOD 1
+#endif
+#ifndef OPENSSL_NO_TLS1_1_METHOD
+#define OPENSSL_NO_TLS1_1_METHOD 1
+#endif
+#ifndef OPENSSL_NO_TLS1_2_METHOD
+#define OPENSSL_NO_TLS1_2_METHOD 1
+#endif
+
+#endif /* >= 1.1.0 compat */
+#endif /* OPENSSL_API_COMPAT */
+
/* LibreSSL 2.7.0 provides necessary OpenSSL 1.1.0 APIs */
#if defined(LIBRESSL_VERSION_NUMBER) && LIBRESSL_VERSION_NUMBER >= 0x2070000fL
# define PY_OPENSSL_1_1_API 1
#endif
-/* Openssl comes with TLSv1.1 and TLSv1.2 between 1.0.0h and 1.0.1
- http://www.openssl.org/news/changelog.html
- */
-#if OPENSSL_VERSION_NUMBER >= 0x10001000L
-# define HAVE_TLSv1_2 1
-#else
-# define HAVE_TLSv1_2 0
-#endif
-
/* SNI support (client- and server-side) appeared in OpenSSL 1.0.0 and 0.9.8f
* This includes the SSL_set_SSL_CTX() function.
*/
#define TLS_method SSLv23_method
#define TLS_client_method SSLv23_client_method
#define TLS_server_method SSLv23_server_method
+#define ASN1_STRING_get0_data ASN1_STRING_data
+#define X509_get0_notBefore X509_get_notBefore
+#define X509_get0_notAfter X509_get_notAfter
+#define OpenSSL_version_num SSLeay
+#define OpenSSL_version SSLeay_version
+#define OPENSSL_VERSION SSLEAY_VERSION
static int X509_NAME_ENTRY_set(const X509_NAME_ENTRY *ne)
{
PY_SSL_VERSION_SSL2,
PY_SSL_VERSION_SSL3=1,
PY_SSL_VERSION_TLS, /* SSLv23 */
-#if HAVE_TLSv1_2
PY_SSL_VERSION_TLS1,
PY_SSL_VERSION_TLS1_1,
PY_SSL_VERSION_TLS1_2,
-#else
- PY_SSL_VERSION_TLS1,
-#endif
PY_SSL_VERSION_TLS_CLIENT=0x10,
PY_SSL_VERSION_TLS_SERVER,
};
goto error;
}
} else {
- if (!X509_VERIFY_PARAM_set1_ip(param, ASN1_STRING_data(ip),
+ if (!X509_VERIFY_PARAM_set1_ip(param, ASN1_STRING_get0_data(ip),
ASN1_STRING_length(ip))) {
_setSSLError(NULL, 0, __FILE__, __LINE__);
goto error;
goto fail;
}
PyTuple_SET_ITEM(t, 0, v);
- v = PyUnicode_FromStringAndSize((char *)ASN1_STRING_data(as),
+ v = PyUnicode_FromStringAndSize((char *)ASN1_STRING_get0_data(as),
ASN1_STRING_length(as));
if (v == NULL) {
Py_DECREF(t);
ASN1_INTEGER *serialNumber;
char buf[2048];
int len, result;
- ASN1_TIME *notBefore, *notAfter;
+ const ASN1_TIME *notBefore, *notAfter;
PyObject *pnotBefore, *pnotAfter;
retval = PyDict_New();
Py_DECREF(sn_obj);
(void) BIO_reset(biobuf);
- notBefore = X509_get_notBefore(certificate);
+ notBefore = X509_get0_notBefore(certificate);
ASN1_TIME_print(biobuf, notBefore);
len = BIO_gets(biobuf, buf, sizeof(buf)-1);
if (len < 0) {
Py_DECREF(pnotBefore);
(void) BIO_reset(biobuf);
- notAfter = X509_get_notAfter(certificate);
+ notAfter = X509_get0_notAfter(certificate);
ASN1_TIME_print(biobuf, notAfter);
len = BIO_gets(biobuf, buf, sizeof(buf)-1);
if (len < 0) {
#endif
PySSL_BEGIN_ALLOW_THREADS
- if (proto_version == PY_SSL_VERSION_TLS1)
+ switch(proto_version) {
+#if defined(SSL3_VERSION) && !defined(OPENSSL_NO_SSL3)
+ case PY_SSL_VERSION_SSL3:
+ ctx = SSL_CTX_new(SSLv3_method());
+ break;
+#endif
+#if (defined(TLS1_VERSION) && \
+ !defined(OPENSSL_NO_TLS1) && \
+ !defined(OPENSSL_NO_TLS1_METHOD))
+ case PY_SSL_VERSION_TLS1:
ctx = SSL_CTX_new(TLSv1_method());
-#if HAVE_TLSv1_2
- else if (proto_version == PY_SSL_VERSION_TLS1_1)
- ctx = SSL_CTX_new(TLSv1_1_method());
- else if (proto_version == PY_SSL_VERSION_TLS1_2)
- ctx = SSL_CTX_new(TLSv1_2_method());
+ break;
#endif
-#ifndef OPENSSL_NO_SSL3
- else if (proto_version == PY_SSL_VERSION_SSL3)
- ctx = SSL_CTX_new(SSLv3_method());
+#if (defined(TLS1_1_VERSION) && \
+ !defined(OPENSSL_NO_TLS1_1) && \
+ !defined(OPENSSL_NO_TLS1_1_METHOD))
+ case PY_SSL_VERSION_TLS1_1:
+ ctx = SSL_CTX_new(TLSv1_1_method());
+ break;
#endif
-#ifndef OPENSSL_NO_SSL2
- else if (proto_version == PY_SSL_VERSION_SSL2)
- ctx = SSL_CTX_new(SSLv2_method());
+#if (defined(TLS1_2_VERSION) && \
+ !defined(OPENSSL_NO_TLS1_2) && \
+ !defined(OPENSSL_NO_TLS1_2_METHOD))
+ case PY_SSL_VERSION_TLS1_2:
+ ctx = SSL_CTX_new(TLSv1_2_method());
+ break;
#endif
- else if (proto_version == PY_SSL_VERSION_TLS) /* SSLv23 */
+ case PY_SSL_VERSION_TLS:
+ /* SSLv23 */
ctx = SSL_CTX_new(TLS_method());
- else if (proto_version == PY_SSL_VERSION_TLS_CLIENT)
+ break;
+ case PY_SSL_VERSION_TLS_CLIENT:
ctx = SSL_CTX_new(TLS_client_method());
- else if (proto_version == PY_SSL_VERSION_TLS_SERVER)
+ break;
+ case PY_SSL_VERSION_TLS_SERVER:
ctx = SSL_CTX_new(TLS_server_method());
- else
+ break;
+ default:
proto_version = -1;
+ }
PySSL_END_ALLOW_THREADS
if (proto_version == -1) {
PyErr_SetString(PyExc_ValueError,
- "invalid protocol version");
+ "invalid or unsupported protocol version");
return NULL;
}
if (ctx == NULL) {
conservative and assume it wasn't fixed until release. We do this check
at runtime to avoid problems from the dynamic linker.
See #25672 for more on this. */
- libver = SSLeay();
+ libver = OpenSSL_version_num();
if (!(libver >= 0x10001000UL && libver < 0x1000108fUL) &&
!(libver >= 0x10000000UL && libver < 0x100000dfUL)) {
SSL_CTX_set_mode(self->ctx, SSL_MODE_RELEASE_BUFFERS);
}
return NULL;
}
- if (SSL_CTX_set_tmp_dh(self->ctx, dh) == 0)
- _setSSLError(NULL, 0, __FILE__, __LINE__);
+ if (!SSL_CTX_set_tmp_dh(self->ctx, dh)) {
+ DH_free(dh);
+ return _setSSLError(NULL, 0, __FILE__, __LINE__);
+ }
DH_free(dh);
Py_RETURN_NONE;
}
* back into a str object, but still as an A-label (bpo-28414)
*/
servername_str = PyUnicode_FromEncodedObject(servername_bytes, "ascii", NULL);
- Py_DECREF(servername_bytes);
if (servername_str == NULL) {
PyErr_WriteUnraisable(servername_bytes);
+ Py_DECREF(servername_bytes);
goto error;
}
+ Py_DECREF(servername_bytes);
result = PyObject_CallFunctionObjArgs(
ssl_ctx->set_sni_cb, ssl_socket, servername_str,
ssl_ctx, NULL);
if (bytes == NULL)
return NULL;
if (pseudo) {
+#ifdef PY_OPENSSL_1_1_API
+ ok = RAND_bytes((unsigned char*)PyBytes_AS_STRING(bytes), len);
+#else
ok = RAND_pseudo_bytes((unsigned char*)PyBytes_AS_STRING(bytes), len);
+#endif
if (ok == 0 || ok == 1)
return Py_BuildValue("NO", bytes, ok == 1 ? Py_True : Py_False);
}
if (!_setup_ssl_threads()) {
return NULL;
}
-#elif OPENSSL_VERSION_1_1 && defined(OPENSSL_THREADS)
+#elif OPENSSL_VERSION_1_1
/* OpenSSL 1.1.0 builtin thread support is enabled */
_ssl_locks_count++;
#endif
PY_SSL_VERSION_TLS_SERVER);
PyModule_AddIntConstant(m, "PROTOCOL_TLSv1",
PY_SSL_VERSION_TLS1);
-#if HAVE_TLSv1_2
PyModule_AddIntConstant(m, "PROTOCOL_TLSv1_1",
PY_SSL_VERSION_TLS1_1);
PyModule_AddIntConstant(m, "PROTOCOL_TLSv1_2",
PY_SSL_VERSION_TLS1_2);
-#endif
/* protocol options */
PyModule_AddIntConstant(m, "OP_ALL",
PyModule_AddIntConstant(m, "OP_NO_SSLv2", SSL_OP_NO_SSLv2);
PyModule_AddIntConstant(m, "OP_NO_SSLv3", SSL_OP_NO_SSLv3);
PyModule_AddIntConstant(m, "OP_NO_TLSv1", SSL_OP_NO_TLSv1);
-#if HAVE_TLSv1_2
PyModule_AddIntConstant(m, "OP_NO_TLSv1_1", SSL_OP_NO_TLSv1_1);
PyModule_AddIntConstant(m, "OP_NO_TLSv1_2", SSL_OP_NO_TLSv1_2);
-#endif
#ifdef SSL_OP_NO_TLSv1_3
PyModule_AddIntConstant(m, "OP_NO_TLSv1_3", SSL_OP_NO_TLSv1_3);
#else
/* SSLeay() gives us the version of the library linked against,
which could be different from the headers version.
*/
- libver = SSLeay();
+ libver = OpenSSL_version_num();
r = PyLong_FromUnsignedLong(libver);
if (r == NULL)
return NULL;
r = Py_BuildValue("IIIII", major, minor, fix, patch, status);
if (r == NULL || PyModule_AddObject(m, "OPENSSL_VERSION_INFO", r))
return NULL;
- r = PyUnicode_FromString(SSLeay_version(SSLEAY_VERSION));
+ r = PyUnicode_FromString(OpenSSL_version(OPENSSL_VERSION));
if (r == NULL || PyModule_AddObject(m, "OPENSSL_VERSION", r))
return NULL;
threadstate = PyGILState_Ensure();
+ ssl_obj = (PySSLSocket *)SSL_get_app_data(ssl);
+ assert(PySSLSocket_Check(ssl_obj));
+ if (ssl_obj->ctx->keylog_bio == NULL) {
+ return;
+ }
+
/* Allocate a static lock to synchronize writes to keylog file.
* The lock is neither released on exit nor on fork(). The lock is
* also shared between all SSLContexts although contexts may write to
}
}
- ssl_obj = (PySSLSocket *)SSL_get_app_data(ssl);
- assert(PySSLSocket_Check(ssl_obj));
- if (ssl_obj->ctx->keylog_bio == NULL) {
- return;
- }
-
PySSL_BEGIN_ALLOW_THREADS
PyThread_acquire_lock(lock, 1);
res = BIO_printf(ssl_obj->ctx->keylog_bio, "%s\n", line);
# define FILE_ATTRIBUTE_NO_SCRUB_DATA 0x20000
#endif
+#ifndef IO_REPARSE_TAG_APPEXECLINK
+# define IO_REPARSE_TAG_APPEXECLINK 0x8000001BL
+#endif
+
#endif /* MS_WINDOWS */
/* From Python's stat.py */
size_t ncodes;
fmt = PyBytes_AS_STRING(self->s_format);
+ if (strlen(fmt) != (size_t)PyBytes_GET_SIZE(self->s_format)) {
+ PyErr_SetString(StructError, "embedded null character");
+ return -1;
+ }
f = whichtable(&fmt);
const char *code;
int r;
PyThreadState *substate, *mainstate;
+ /* only initialise 'cflags.cf_flags' to test backwards compatibility */
+ PyCompilerFlags cflags = {0};
if (!PyArg_ParseTuple(args, "s:run_in_subinterp",
&code))
PyErr_SetString(PyExc_RuntimeError, "sub-interpreter creation failed");
return NULL;
}
- r = PyRun_SimpleString(code);
+ r = PyRun_SimpleStringFlags(code, &cflags);
Py_EndInterpreter(substate);
PyThreadState_Swap(mainstate);
HeapCTypeSubclassWithFinalizer_slots
};
+PyDoc_STRVAR(heapctypesetattr__doc__,
+"A heap type without GC, but with overridden __setattr__.\n\n"
+"The 'value' attribute is set to 10 in __init__ and updated via attribute setting.");
+
+typedef struct {
+ PyObject_HEAD
+ long value;
+} HeapCTypeSetattrObject;
+
+static struct PyMemberDef heapctypesetattr_members[] = {
+ {"pvalue", T_LONG, offsetof(HeapCTypeSetattrObject, value)},
+ {NULL} /* Sentinel */
+};
+
+static int
+heapctypesetattr_init(PyObject *self, PyObject *args, PyObject *kwargs)
+{
+ ((HeapCTypeSetattrObject *)self)->value = 10;
+ return 0;
+}
+
+static void
+heapctypesetattr_dealloc(HeapCTypeSetattrObject *self)
+{
+ PyTypeObject *tp = Py_TYPE(self);
+ PyObject_Del(self);
+ Py_DECREF(tp);
+}
+
+static int
+heapctypesetattr_setattro(HeapCTypeSetattrObject *self, PyObject *attr, PyObject *value)
+{
+ PyObject *svalue = PyUnicode_FromString("value");
+ if (svalue == NULL)
+ return -1;
+ int eq = PyObject_RichCompareBool(svalue, attr, Py_EQ);
+ Py_DECREF(svalue);
+ if (eq < 0)
+ return -1;
+ if (!eq) {
+ return PyObject_GenericSetAttr((PyObject*) self, attr, value);
+ }
+ if (value == NULL) {
+ self->value = 0;
+ return 0;
+ }
+ PyObject *ivalue = PyNumber_Long(value);
+ if (ivalue == NULL)
+ return -1;
+ long v = PyLong_AsLong(ivalue);
+ Py_DECREF(ivalue);
+ if (v == -1 && PyErr_Occurred())
+ return -1;
+ self->value = v;
+ return 0;
+}
+
+static PyType_Slot HeapCTypeSetattr_slots[] = {
+ {Py_tp_init, heapctypesetattr_init},
+ {Py_tp_members, heapctypesetattr_members},
+ {Py_tp_setattro, heapctypesetattr_setattro},
+ {Py_tp_dealloc, heapctypesetattr_dealloc},
+ {Py_tp_doc, (char*)heapctypesetattr__doc__},
+ {0, 0},
+};
+
+static PyType_Spec HeapCTypeSetattr_spec = {
+ "_testcapi.HeapCTypeSetattr",
+ sizeof(HeapCTypeSetattrObject),
+ 0,
+ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,
+ HeapCTypeSetattr_slots
+};
static struct PyModuleDef _testcapimodule = {
PyModuleDef_HEAD_INIT,
Py_DECREF(subclass_bases);
PyModule_AddObject(m, "HeapCTypeSubclass", HeapCTypeSubclass);
+ PyObject *HeapCTypeSetattr = PyType_FromSpec(&HeapCTypeSetattr_spec);
+ if (HeapCTypeSetattr == NULL) {
+ return NULL;
+ }
+ PyModule_AddObject(m, "HeapCTypeSetattr", HeapCTypeSetattr);
+
PyObject *subclass_with_finalizer_bases = PyTuple_Pack(1, HeapCTypeSubclass);
if (subclass_with_finalizer_bases == NULL) {
return NULL;
#include "pycore_initconfig.h"
+#ifdef MS_WINDOWS
+#include <windows.h>
+
+static int
+_add_windows_config(PyObject *configs)
+{
+ HMODULE hPython3;
+ wchar_t py3path[MAX_PATH];
+ PyObject *dict = PyDict_New();
+ PyObject *obj = NULL;
+ if (!dict) {
+ return -1;
+ }
+
+ hPython3 = GetModuleHandleW(PY3_DLLNAME);
+ if (hPython3 && GetModuleFileNameW(hPython3, py3path, MAX_PATH)) {
+ obj = PyUnicode_FromWideChar(py3path, -1);
+ } else {
+ obj = Py_None;
+ Py_INCREF(obj);
+ }
+ if (obj &&
+ !PyDict_SetItemString(dict, "python3_dll", obj) &&
+ !PyDict_SetItemString(configs, "windows", dict)) {
+ Py_DECREF(obj);
+ Py_DECREF(dict);
+ return 0;
+ }
+ Py_DECREF(obj);
+ Py_DECREF(dict);
+ return -1;
+}
+#endif
+
+
static PyObject *
get_configs(PyObject *self, PyObject *Py_UNUSED(args))
{
- return _Py_GetConfigsAsDict();
+ PyObject *dict = _Py_GetConfigsAsDict();
+#ifdef MS_WINDOWS
+ if (dict) {
+ if (_add_windows_config(dict) < 0) {
+ Py_CLEAR(dict);
+ }
+ }
+#endif
+ return dict;
}
cmp = PyObject_RichCompareBool(selfi, v, Py_EQ);
Py_DECREF(selfi);
if (cmp > 0) {
- return PyLong_FromLong((long)i);
+ return PyLong_FromSsize_t(i);
}
else if (cmp < 0)
return NULL;
"\n"
"Wait until one or more file descriptors are ready for some kind of I/O.\n"
"\n"
-"The first three arguments are sequences of file descriptors to be waited for:\n"
+"The first three arguments are iterables of file descriptors to be waited for:\n"
"rlist -- wait until ready for reading\n"
"wlist -- wait until ready for writing\n"
"xlist -- wait for an \"exceptional condition\"\n"
#if (defined(HAVE_POLL) && !defined(HAVE_BROKEN_POLL))
PyDoc_STRVAR(select_poll_register__doc__,
-"register($self, fd, eventmask=POLLIN | POLLPRI | POLLOUT, /)\n"
+"register($self, fd,\n"
+" eventmask=select.POLLIN | select.POLLPRI | select.POLLOUT, /)\n"
"--\n"
"\n"
"Register a file descriptor with the polling object.\n"
#if (defined(HAVE_POLL) && !defined(HAVE_BROKEN_POLL)) && defined(HAVE_SYS_DEVPOLL_H)
PyDoc_STRVAR(select_devpoll_register__doc__,
-"register($self, fd, eventmask=POLLIN | POLLPRI | POLLOUT, /)\n"
+"register($self, fd,\n"
+" eventmask=select.POLLIN | select.POLLPRI | select.POLLOUT, /)\n"
"--\n"
"\n"
"Register a file descriptor with the polling object.\n"
#if (defined(HAVE_POLL) && !defined(HAVE_BROKEN_POLL)) && defined(HAVE_SYS_DEVPOLL_H)
PyDoc_STRVAR(select_devpoll_modify__doc__,
-"modify($self, fd, eventmask=POLLIN | POLLPRI | POLLOUT, /)\n"
+"modify($self, fd,\n"
+" eventmask=select.POLLIN | select.POLLPRI | select.POLLOUT, /)\n"
"--\n"
"\n"
"Modify a possible already registered file descriptor.\n"
#if defined(HAVE_EPOLL)
PyDoc_STRVAR(select_epoll_register__doc__,
-"register($self, /, fd, eventmask=EPOLLIN | EPOLLPRI | EPOLLOUT)\n"
+"register($self, /, fd,\n"
+" eventmask=select.EPOLLIN | select.EPOLLPRI | select.EPOLLOUT)\n"
"--\n"
"\n"
"Registers a new fd or raises an OSError if the fd is already registered.\n"
#ifndef SELECT_KQUEUE_CONTROL_METHODDEF
#define SELECT_KQUEUE_CONTROL_METHODDEF
#endif /* !defined(SELECT_KQUEUE_CONTROL_METHODDEF) */
-/*[clinic end generated code: output=03041f3d09b04a3d input=a9049054013a1b77]*/
+/*[clinic end generated code: output=86010dde10ca89c6 input=a9049054013a1b77]*/
static int
pymain_run_startup(PyConfig *config, PyCompilerFlags *cf, int *exitcode)
{
+ int ret;
+ PyObject *startup_obj = NULL;
+ if (!config->use_environment) {
+ return 0;
+ }
+#ifdef MS_WINDOWS
+ const wchar_t *wstartup = _wgetenv(L"PYTHONSTARTUP");
+ if (wstartup == NULL || wstartup[0] == L'\0') {
+ return 0;
+ }
+ PyObject *startup_bytes = NULL;
+ startup_obj = PyUnicode_FromWideChar(wstartup, wcslen(wstartup));
+ if (startup_obj == NULL) {
+ goto error;
+ }
+ startup_bytes = PyUnicode_EncodeFSDefault(startup_obj);
+ if (startup_bytes == NULL) {
+ goto error;
+ }
+ const char *startup = PyBytes_AS_STRING(startup_bytes);
+#else
const char *startup = _Py_GetEnv(config->use_environment, "PYTHONSTARTUP");
if (startup == NULL) {
return 0;
}
- if (PySys_Audit("cpython.run_startup", "s", startup) < 0) {
- return pymain_err_print(exitcode);
+ startup_obj = PyUnicode_DecodeFSDefault(startup);
+ if (startup_obj == NULL) {
+ goto error;
+ }
+#endif
+ if (PySys_Audit("cpython.run_startup", "O", startup_obj) < 0) {
+ goto error;
}
+#ifdef MS_WINDOWS
+ FILE *fp = _Py_wfopen(wstartup, L"r");
+#else
FILE *fp = _Py_fopen(startup, "r");
+#endif
if (fp == NULL) {
int save_errno = errno;
+ PyErr_Clear();
PySys_WriteStderr("Could not open PYTHONSTARTUP\n");
errno = save_errno;
- PyErr_SetFromErrnoWithFilename(PyExc_OSError, startup);
-
- return pymain_err_print(exitcode);
+ PyErr_SetFromErrnoWithFilenameObjects(PyExc_OSError, startup_obj, NULL);
+ goto error;
}
(void) PyRun_SimpleFileExFlags(fp, startup, 0, cf);
PyErr_Clear();
fclose(fp);
- return 0;
+ ret = 0;
+
+done:
+#ifdef MS_WINDOWS
+ Py_XDECREF(startup_bytes);
+#endif
+ Py_XDECREF(startup_obj);
+ return ret;
+
+error:
+ ret = pymain_err_print(exitcode);
+ goto done;
}
return PyUnicode_FromString(buf);
}
-#ifdef ENABLE_IPV6
/* Convert IPv6 sockaddr to a Python str. */
static PyObject *
}
return PyUnicode_FromString(buf);
}
-#endif
static PyObject*
unparse_address(LPSOCKADDR Address, DWORD Length)
}
return ret;
}
-#ifdef ENABLE_IPV6
case AF_INET6: {
const struct sockaddr_in6 *a = (const struct sockaddr_in6 *)Address;
PyObject *addrobj = make_ipv6_addr(a);
}
return ret;
}
-#endif /* ENABLE_IPV6 */
default: {
- return SetFromWindowsErr(ERROR_INVALID_PARAMETER);
+ PyErr_SetString(PyExc_ValueError, "recvfrom returned unsupported address family");
+ return NULL;
}
}
}
}
#endif /* defined(HAVE_READLINK) || defined(MS_WINDOWS) */
-#ifdef HAVE_SYMLINK
-
#if defined(MS_WINDOWS)
/* Remove the last portion of the path - return 0 on success */
return 0;
}
+#endif
+
+#ifdef HAVE_SYMLINK
+
+#if defined(MS_WINDOWS)
+
/* Is this path absolute? */
static int
_is_absW(const WCHAR *path)
Wait until one or more file descriptors are ready for some kind of I/O.
-The first three arguments are sequences of file descriptors to be waited for:
+The first three arguments are iterables of file descriptors to be waited for:
rlist -- wait until ready for reading
wlist -- wait until ready for writing
xlist -- wait for an "exceptional condition"
static PyObject *
select_select_impl(PyObject *module, PyObject *rlist, PyObject *wlist,
PyObject *xlist, PyObject *timeout_obj)
-/*[clinic end generated code: output=2b3cfa824f7ae4cf input=177e72184352df25]*/
+/*[clinic end generated code: output=2b3cfa824f7ae4cf input=e467f5d68033de00]*/
{
#ifdef SELECT_USES_HEAP
pylist *rfd2obj, *wfd2obj, *efd2obj;
}
#endif /* SELECT_USES_HEAP */
- /* Convert sequences to fd_sets, and get maximum fd number
+ /* Convert iterables to fd_sets, and get maximum fd number
* propagates the Python exception set in seq2set()
*/
rfd2obj[0].sentinel = -1;
fd: fildes
either an integer, or an object with a fileno() method returning an int
- eventmask: unsigned_short(c_default="POLLIN | POLLPRI | POLLOUT") = POLLIN | POLLPRI | POLLOUT
+ eventmask: unsigned_short(c_default="POLLIN | POLLPRI | POLLOUT") = select.POLLIN | select.POLLPRI | select.POLLOUT
an optional bitmask describing the type of events to check for
/
static PyObject *
select_poll_register_impl(pollObject *self, int fd, unsigned short eventmask)
-/*[clinic end generated code: output=0dc7173c800a4a65 input=f18711d9bb021e25]*/
+/*[clinic end generated code: output=0dc7173c800a4a65 input=34e16cfb28d3c900]*/
{
PyObject *key, *value;
int err;
fd: fildes
either an integer, or an object with a fileno() method returning
an int
- eventmask: unsigned_short(c_default="POLLIN | POLLPRI | POLLOUT") = POLLIN | POLLPRI | POLLOUT
+ eventmask: unsigned_short(c_default="POLLIN | POLLPRI | POLLOUT") = select.POLLIN | select.POLLPRI | select.POLLOUT
an optional bitmask describing the type of events to check for
/
static PyObject *
select_devpoll_register_impl(devpollObject *self, int fd,
unsigned short eventmask)
-/*[clinic end generated code: output=6e07fe8b74abba0c input=5bd7cacc47a8ee46]*/
+/*[clinic end generated code: output=6e07fe8b74abba0c input=22006fabe9567522]*/
{
return internal_devpoll_register(self, fd, eventmask, 0);
}
fd: fildes
either an integer, or an object with a fileno() method returning
an int
- eventmask: unsigned_short(c_default="POLLIN | POLLPRI | POLLOUT") = POLLIN | POLLPRI | POLLOUT
+ eventmask: unsigned_short(c_default="POLLIN | POLLPRI | POLLOUT") = select.POLLIN | select.POLLPRI | select.POLLOUT
an optional bitmask describing the type of events to check for
/
static PyObject *
select_devpoll_modify_impl(devpollObject *self, int fd,
unsigned short eventmask)
-/*[clinic end generated code: output=bc2e6d23aaff98b4 input=48a820fc5967165d]*/
+/*[clinic end generated code: output=bc2e6d23aaff98b4 input=09fa335db7cdc09e]*/
{
return internal_devpoll_register(self, fd, eventmask, 1);
}
fd: fildes
the target file descriptor of the operation
- eventmask: unsigned_int(c_default="EPOLLIN | EPOLLPRI | EPOLLOUT", bitwise=True) = EPOLLIN | EPOLLPRI | EPOLLOUT
+ eventmask: unsigned_int(c_default="EPOLLIN | EPOLLPRI | EPOLLOUT", bitwise=True) = select.EPOLLIN | select.EPOLLPRI | select.EPOLLOUT
a bit set composed of the various EPOLL constants
Registers a new fd or raises an OSError if the fd is already registered.
static PyObject *
select_epoll_register_impl(pyEpoll_Object *self, int fd,
unsigned int eventmask)
-/*[clinic end generated code: output=318e5e6386520599 input=6cf699c152dd8ca9]*/
+/*[clinic end generated code: output=318e5e6386520599 input=a5071b71edfe3578]*/
{
return pyepoll_internal_ctl(self->epfd, EPOLL_CTL_ADD, fd, eventmask);
}
#endif
static int
-is_main(_PyRuntimeState *runtime)
+is_main_interp(_PyRuntimeState *runtime, PyInterpreterState *interp)
{
unsigned long thread = PyThread_get_thread_ident();
- PyInterpreterState *interp = _PyRuntimeState_GetThreadState(runtime)->interp;
return (thread == runtime->main_thread
&& interp == runtime->interpreters.main);
}
+static int
+is_main(_PyRuntimeState *runtime)
+{
+ PyInterpreterState *interp = _PyRuntimeState_GetThreadState(runtime)->interp;
+ return is_main_interp(runtime, interp);
+}
+
static PyObject *
signal_default_int_handler(PyObject *self, PyObject *args)
{
finisignal();
}
+
+// The caller doesn't have to hold the GIL
int
-PyOS_InterruptOccurred(void)
+_PyOS_InterruptOccurred(PyThreadState *tstate)
{
if (_Py_atomic_load_relaxed(&Handlers[SIGINT].tripped)) {
_PyRuntimeState *runtime = &_PyRuntime;
- if (!is_main(runtime)) {
+ if (!is_main_interp(runtime, tstate->interp)) {
return 0;
}
_Py_atomic_store_relaxed(&Handlers[SIGINT].tripped, 0);
return 0;
}
+
+// The caller must hold the GIL
+int
+PyOS_InterruptOccurred(void)
+{
+ PyThreadState *tstate = _PyThreadState_GET();
+ return _PyOS_InterruptOccurred(tstate);
+}
+
+
static void
_clear_pending_signals(void)
{
it = PyObject_GetIter(seq);
if (it == NULL) {
- type_error("argument of type '%.200s' is not iterable", seq);
+ if (PyErr_ExceptionMatches(PyExc_TypeError)) {
+ type_error("argument of type '%.200s' is not iterable", seq);
+ }
return -1;
}
result = (PyByteArrayObject *) \
PyByteArray_FromStringAndSize(NULL, va.len + vb.len);
- if (result != NULL) {
+ // result->ob_bytes is NULL if result is an empty string:
+ // if va.len + vb.len equals zero.
+ if (result != NULL && result->ob_bytes != NULL) {
memcpy(result->ob_bytes, va.buf, va.len);
memcpy(result->ob_bytes + va.len, vb.buf, vb.len);
}
static PyObject *
lookup_maybe_method(PyObject *self, _Py_Identifier *attrid, int *unbound);
+static int
+slot_tp_setattro(PyObject *self, PyObject *name, PyObject *value);
+
/*
* finds the beginning of the docstring's introspection signature.
* if present, returns a pointer pointing to the first '('.
}
/* Helper to check for object.__setattr__ or __delattr__ applied to a type.
- This is called the Carlo Verre hack after its discoverer. */
+ This is called the Carlo Verre hack after its discoverer. See
+ https://mail.python.org/pipermail/python-dev/2003-April/034535.html
+ */
static int
hackcheck(PyObject *self, setattrofunc func, const char *what)
{
PyTypeObject *type = Py_TYPE(self);
- while (type && type->tp_flags & Py_TPFLAGS_HEAPTYPE)
- type = type->tp_base;
- /* If type is NULL now, this is a really weird type.
- In the spirit of backwards compatibility (?), just shut up. */
- if (type && type->tp_setattro != func) {
- PyErr_Format(PyExc_TypeError,
- "can't apply this %s to %s object",
- what,
- type->tp_name);
- return 0;
+ PyObject *mro = type->tp_mro;
+ if (!mro) {
+ /* Probably ok not to check the call in this case. */
+ return 1;
+ }
+ assert(PyTuple_Check(mro));
+ Py_ssize_t i, n;
+ n = PyTuple_GET_SIZE(mro);
+ for (i = 0; i < n; i++) {
+ PyTypeObject *base = (PyTypeObject*) PyTuple_GET_ITEM(mro, i);
+ if (base->tp_setattro == func) {
+ /* 'func' is the earliest non-Python implementation in the MRO. */
+ break;
+ } else if (base->tp_setattro != slot_tp_setattro) {
+ /* 'base' is not a Python class and overrides 'func'.
+ Its tp_setattro should be called instead. */
+ PyErr_Format(PyExc_TypeError,
+ "can't apply this %s to %s object",
+ what,
+ type->tp_name);
+ return 0;
+ }
}
+ /* Either 'func' is not in the mro (which should fail when checking 'self'),
+ or it's the right slot function to call. */
return 1;
}
msidb_openview(msiobj *msidb, PyObject *args)
{
int status;
- char *sql;
+ const wchar_t *sql;
MSIHANDLE hView;
msiobj *result;
- if (!PyArg_ParseTuple(args, "s:OpenView", &sql))
+ if (!PyArg_ParseTuple(args, "u:OpenView", &sql))
return NULL;
- if ((status = MsiDatabaseOpenView(msidb->h, sql, &hView)) != ERROR_SUCCESS)
+ if ((status = MsiDatabaseOpenViewW(msidb->h, sql, &hView)) != ERROR_SUCCESS)
return msierror(status);
result = PyObject_NEW(struct msiobj, &msiview_Type);
static PyObject* msiopendb(PyObject *obj, PyObject *args)
{
int status;
- char *path;
+ const wchar_t *path;
int persist;
MSIHANDLE h;
msiobj *result;
- if (!PyArg_ParseTuple(args, "si:MSIOpenDatabase", &path, &persist))
+ if (!PyArg_ParseTuple(args, "ui:MSIOpenDatabase", &path, &persist))
return NULL;
/* We need to validate that persist is a valid MSIDBOPEN_* value. Otherwise,
MsiOpenDatabase may treat the value as a pointer, leading to unexpected
behavior. */
if (Py_INVALID_PERSIST(persist))
return msierror(ERROR_INVALID_PARAMETER);
- status = MsiOpenDatabase(path, (LPCSTR)(SIZE_T)persist, &h);
+ status = MsiOpenDatabaseW(path, (LPCWSTR)(SIZE_T)persist, &h);
if (status != ERROR_SUCCESS)
return msierror(status);
wchar_t *machine_path; /* from HKEY_LOCAL_MACHINE */
wchar_t *user_path; /* from HKEY_CURRENT_USER */
- wchar_t *dll_path;
-
const wchar_t *pythonpath_env;
} PyCalculatePath;
static int
change_ext(wchar_t *dest, const wchar_t *src, const wchar_t *ext)
{
- size_t src_len = wcsnlen_s(src, MAXPATHLEN+1);
- size_t i = src_len;
- if (i >= MAXPATHLEN+1) {
- Py_FatalError("buffer overflow in getpathp.c's reduce()");
- }
+ if (src && src != dest) {
+ size_t src_len = wcsnlen_s(src, MAXPATHLEN+1);
+ size_t i = src_len;
+ if (i >= MAXPATHLEN+1) {
+ Py_FatalError("buffer overflow in getpathp.c's reduce()");
+ }
- while (i > 0 && src[i] != '.' && !is_sep(src[i]))
- --i;
+ while (i > 0 && src[i] != '.' && !is_sep(src[i]))
+ --i;
- if (i == 0) {
- dest[0] = '\0';
- return -1;
- }
+ if (i == 0) {
+ dest[0] = '\0';
+ return -1;
+ }
+
+ if (is_sep(src[i])) {
+ i = src_len;
+ }
- if (is_sep(src[i])) {
- i = src_len;
+ if (wcsncpy_s(dest, MAXPATHLEN+1, src, i)) {
+ dest[0] = '\0';
+ return -1;
+ }
+ } else {
+ wchar_t *s = wcsrchr(dest, L'.');
+ if (s) {
+ s[0] = '\0';
+ }
}
- if (wcsncpy_s(dest, MAXPATHLEN+1, src, i) ||
- wcscat_s(dest, MAXPATHLEN+1, ext))
- {
+ if (wcscat_s(dest, MAXPATHLEN+1, ext)) {
dest[0] = '\0';
return -1;
}
}
+static int
+get_dllpath(wchar_t *dllpath)
+{
+#ifdef Py_ENABLE_SHARED
+ extern HANDLE PyWin_DLLhModule;
+ if (PyWin_DLLhModule && GetModuleFileNameW(PyWin_DLLhModule, dllpath, MAXPATHLEN)) {
+ return 0;
+ }
+#endif
+ return -1;
+}
+
+
#ifdef Py_ENABLE_SHARED
/* a string loaded from the DLL at startup.*/
goto done;
}
/* Find out how big our core buffer is, and how many subkeys we have */
- rc = RegQueryInfoKey(newKey, NULL, NULL, NULL, &numKeys, NULL, NULL,
+ rc = RegQueryInfoKeyW(newKey, NULL, NULL, NULL, &numKeys, NULL, NULL,
NULL, NULL, &dataSize, NULL, NULL);
if (rc!=ERROR_SUCCESS) {
goto done;
#endif /* Py_ENABLE_SHARED */
-wchar_t*
-_Py_GetDLLPath(void)
-{
- wchar_t dll_path[MAXPATHLEN+1];
- memset(dll_path, 0, sizeof(dll_path));
-
-#ifdef Py_ENABLE_SHARED
- extern HANDLE PyWin_DLLhModule;
- if (PyWin_DLLhModule) {
- if (!GetModuleFileNameW(PyWin_DLLhModule, dll_path, MAXPATHLEN)) {
- dll_path[0] = 0;
- }
- }
-#else
- dll_path[0] = 0;
-#endif
-
- return _PyMem_RawWcsdup(dll_path);
-}
-
-
static PyStatus
get_program_full_path(_PyPathConfig *pathconfig)
{
get_pth_filename(PyCalculatePath *calculate, wchar_t *filename,
const _PyPathConfig *pathconfig)
{
- if (calculate->dll_path[0]) {
- if (!change_ext(filename, calculate->dll_path, L"._pth") &&
- exists(filename))
- {
- return 1;
- }
+ if (get_dllpath(filename) &&
+ !change_ext(filename, filename, L"._pth") &&
+ exists(filename))
+ {
+ return 1;
}
- if (pathconfig->program_full_path[0]) {
- if (!change_ext(filename, pathconfig->program_full_path, L"._pth") &&
- exists(filename))
- {
- return 1;
- }
+ if (pathconfig->program_full_path[0] &&
+ !change_ext(filename, pathconfig->program_full_path, L"._pth") &&
+ exists(filename))
+ {
+ return 1;
}
return 0;
}
wchar_t zip_path[MAXPATHLEN+1];
memset(zip_path, 0, sizeof(zip_path));
- change_ext(zip_path,
- calculate->dll_path[0] ? calculate->dll_path : pathconfig->program_full_path,
- L".zip");
+ if (get_dllpath(zip_path) || change_ext(zip_path, zip_path, L".zip"))
+ {
+ if (change_ext(zip_path, pathconfig->program_full_path, L".zip")) {
+ zip_path[0] = L'\0';
+ }
+ }
calculate_home_prefix(calculate, argv0_path, zip_path, prefix);
calculate->home = pathconfig->home;
calculate->path_env = _wgetenv(L"PATH");
- calculate->dll_path = _Py_GetDLLPath();
- if (calculate->dll_path == NULL) {
- return _PyStatus_NO_MEMORY();
- }
-
calculate->pythonpath_env = config->pythonpath_env;
return _PyStatus_OK();
{
PyMem_RawFree(calculate->machine_path);
PyMem_RawFree(calculate->user_path);
- PyMem_RawFree(calculate->dll_path);
}
- PyConfig.pythonpath_env: PYTHONPATH environment variable
- _PyPathConfig.home: Py_SetPythonHome() or PYTHONHOME environment variable
- - DLL path: _Py_GetDLLPath()
- PATH environment variable
- __PYVENV_LAUNCHER__ environment variable
- GetModuleFileNameW(NULL): fully qualified path of the executable file of
_Py_CheckPython3(void)
{
wchar_t py3path[MAXPATHLEN+1];
- wchar_t *s;
if (python3_checked) {
return hPython3 != NULL;
}
python3_checked = 1;
/* If there is a python3.dll next to the python3y.dll,
- assume this is a build tree; use that DLL */
- if (_Py_dll_path != NULL) {
- wcscpy(py3path, _Py_dll_path);
- }
- else {
- wcscpy(py3path, L"");
- }
- s = wcsrchr(py3path, L'\\');
- if (!s) {
- s = py3path;
+ use that DLL */
+ if (!get_dllpath(py3path)) {
+ reduce(py3path);
+ join(py3path, PY3_DLLNAME);
+ hPython3 = LoadLibraryExW(py3path, NULL, LOAD_LIBRARY_SEARCH_DEFAULT_DIRS);
+ if (hPython3 != NULL) {
+ return 1;
+ }
}
- wcscpy(s, L"\\python3.dll");
- hPython3 = LoadLibraryExW(py3path, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
+
+ /* If we can locate python3.dll in our application dir,
+ use that DLL */
+ hPython3 = LoadLibraryExW(PY3_DLLNAME, NULL, LOAD_LIBRARY_SEARCH_APPLICATION_DIR);
if (hPython3 != NULL) {
return 1;
}
- /* Check sys.prefix\DLLs\python3.dll */
+ /* For back-compat, also search {sys.prefix}\DLLs, though
+ that has not been a normal install layout for a while */
wcscpy(py3path, Py_GetPrefix());
- wcscat(py3path, L"\\DLLs\\python3.dll");
- hPython3 = LoadLibraryExW(py3path, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
+ if (py3path[0]) {
+ join(py3path, L"DLLs\\" PY3_DLLNAME);
+ hPython3 = LoadLibraryExW(py3path, NULL, LOAD_LIBRARY_SEARCH_DEFAULT_DIRS);
+ }
return hPython3 != NULL;
}
// Resource script for Python core DLL.
// Currently only holds version information.
//
+#pragma code_page(1252)
#include "winver.h"
#define PYTHON_COMPANY "Python Software Foundation"
if (PySys_Audit("winreg.QueryInfoKey", "n", (Py_ssize_t)key) < 0) {
return NULL;
}
- if ((rc = RegQueryInfoKey(key, NULL, NULL, 0, &nSubKeys, NULL, NULL,
- &nValues, NULL, NULL, NULL, &ft))
- != ERROR_SUCCESS) {
+ if ((rc = RegQueryInfoKeyW(key, NULL, NULL, 0, &nSubKeys, NULL, NULL,
+ &nValues, NULL, NULL, NULL, &ft))
+ != ERROR_SUCCESS) {
return PyErr_SetFromWindowsErrWithFunction(rc, "RegQueryInfoKey");
}
li.LowPart = ft.dwLowDateTime;
set libraries=\r
set libraries=%libraries% bzip2-1.0.6\r
if NOT "%IncludeLibffiSrc%"=="false" set libraries=%libraries% libffi-3.3.0-rc0-r1\r
-if NOT "%IncludeSSLSrc%"=="false" set libraries=%libraries% openssl-1.1.1f\r
+if NOT "%IncludeSSLSrc%"=="false" set libraries=%libraries% openssl-1.1.1g\r
set libraries=%libraries% sqlite-3.31.1.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tcl-core-8.6.9.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tk-8.6.9.0\r
\r
set binaries=\r
if NOT "%IncludeLibffi%"=="false" set binaries=%binaries% libffi\r
-if NOT "%IncludeSSL%"=="false" set binaries=%binaries% openssl-bin-1.1.1f\r
+if NOT "%IncludeSSL%"=="false" set binaries=%binaries% openssl-bin-1.1.1g\r
if NOT "%IncludeTkinter%"=="false" set binaries=%binaries% tcltk-8.6.9.0\r
if NOT "%IncludeSSLSrc%"=="false" set binaries=%binaries% nasm-2.11.06\r
\r
</PropertyGroup>\r
<ItemDefinitionGroup>\r
<ClCompile>\r
- <PreprocessorDefinitions>WIN32;HAVE_CONFIG_H;_DEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>\r
- <WarningLevel>Level3</WarningLevel>\r
- <DebugInformationFormat>ProgramDatabase</DebugInformationFormat>\r
- <Optimization>Disabled</Optimization>\r
- <AdditionalIncludeDirectories>$(lzmaDir)windows;$(lzmaDir)src/liblzma/common;$(lzmaDir)src/common;$(lzmaDir)src/liblzma/api;$(lzmaDir)src/liblzma/check;$(lzmaDir)src/liblzma/delta;$(lzmaDir)src/liblzma/lz;$(lzmaDir)src/liblzma/lzma;$(lzmaDir)src/liblzma/rangecoder;$(lzmaDir)src/liblzma/simple</AdditionalIncludeDirectories>\r
+ <PreprocessorDefinitions>WIN32;HAVE_CONFIG_H;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>\r
+ <AdditionalIncludeDirectories>$(lzmaDir)windows;$(lzmaDir)src/liblzma/common;$(lzmaDir)src/common;$(lzmaDir)src/liblzma/api;$(lzmaDir)src/liblzma/check;$(lzmaDir)src/liblzma/delta;$(lzmaDir)src/liblzma/lz;$(lzmaDir)src/liblzma/lzma;$(lzmaDir)src/liblzma/rangecoder;$(lzmaDir)src/liblzma/simple;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>\r
<DisableSpecificWarnings>4028;4113;4133;4244;4267;4996;%(DisableSpecificWarnings)</DisableSpecificWarnings>\r
</ClCompile>\r
</ItemDefinitionGroup>\r
<_PlatformPreprocessorDefinition>_WIN32;</_PlatformPreprocessorDefinition>\r
<_PlatformPreprocessorDefinition Condition="$(Platform) == 'x64'">_WIN64;_M_X64;</_PlatformPreprocessorDefinition>\r
<_PydPreprocessorDefinition Condition="$(TargetExt) == '.pyd'">Py_BUILD_CORE_MODULE;</_PydPreprocessorDefinition>\r
+ <_Py3NamePreprocessorDefinition>PY3_DLLNAME=L"$(Py3DllName)";</_Py3NamePreprocessorDefinition>\r
</PropertyGroup>\r
<ItemDefinitionGroup>\r
<ClCompile>\r
<AdditionalIncludeDirectories>$(PySourcePath)Include;$(PySourcePath)Include\internal;$(PySourcePath)PC;$(IntDir);%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>\r
- <PreprocessorDefinitions>WIN32;$(_PlatformPreprocessorDefinition)$(_DebugPreprocessorDefinition)$(_PydPreprocessorDefinition)%(PreprocessorDefinitions)</PreprocessorDefinitions>\r
+ <PreprocessorDefinitions>WIN32;$(_Py3NamePreprocessorDefinition);$(_PlatformPreprocessorDefinition)$(_DebugPreprocessorDefinition)$(_PydPreprocessorDefinition)%(PreprocessorDefinitions)</PreprocessorDefinitions>\r
\r
<Optimization>MaxSpeed</Optimization>\r
<IntrinsicFunctions>true</IntrinsicFunctions>\r
<libffiDir>$(ExternalsDir)libffi\</libffiDir>\r
<libffiOutDir>$(ExternalsDir)libffi\$(ArchName)\</libffiOutDir>\r
<libffiIncludeDir>$(libffiOutDir)include</libffiIncludeDir>\r
- <opensslDir>$(ExternalsDir)openssl-1.1.1f\</opensslDir>\r
- <opensslOutDir>$(ExternalsDir)openssl-bin-1.1.1f\$(ArchName)\</opensslOutDir>\r
+ <opensslDir>$(ExternalsDir)openssl-1.1.1g\</opensslDir>\r
+ <opensslOutDir>$(ExternalsDir)openssl-bin-1.1.1g\$(ArchName)\</opensslOutDir>\r
<opensslIncludeDir>$(opensslOutDir)include</opensslIncludeDir>\r
<nasmDir>$(ExternalsDir)\nasm-2.11.06\</nasmDir>\r
<zlibDir>$(ExternalsDir)\zlib-1.2.11\</zlibDir>\r
\r
<!-- The name of the resulting pythonXY.dll (without the extension) -->\r
<PyDllName>python$(MajorVersionNumber)$(MinorVersionNumber)$(PyDebugExt)</PyDllName>\r
+ <!-- The name of the resulting pythonX.dll (without the extension) -->\r
+ <Py3DllName>python3$(PyDebugExt)</Py3DllName>\r
\r
<!-- The version and platform tag to include in .pyd filenames -->\r
<PydTag Condition="$(ArchName) == 'win32'">.cp$(MajorVersionNumber)$(MinorVersionNumber)-win32</PydTag>\r
}
if (fields) {
numfields = PySequence_Size(fields);
- if (numfields == -1)
+ if (numfields == -1) {
goto cleanup;
+ }
}
res = 0; /* if no error occurs, this stays 0 to the end */
}
res = PyObject_SetAttr(self, name, PyTuple_GET_ITEM(args, i));
Py_DECREF(name);
- if (res < 0)
+ if (res < 0) {
goto cleanup;
+ }
}
if (kw) {
i = 0; /* needed by PyDict_Next */
while (PyDict_Next(kw, &i, &key, &value)) {
+ int contains = PySequence_Contains(fields, key);
+ if (contains == -1) {
+ res = -1;
+ goto cleanup;
+ } else if (contains == 1) {
+ Py_ssize_t p = PySequence_Index(fields, key);
+ if (p == -1) {
+ res = -1;
+ goto cleanup;
+ }
+ if (p < PyTuple_GET_SIZE(args)) {
+ PyErr_Format(PyExc_TypeError,
+ "%.400s got multiple values for argument '%U'",
+ Py_TYPE(self)->tp_name, key);
+ res = -1;
+ goto cleanup;
+ }
+ }
res = PyObject_SetAttr(self, key, value);
- if (res < 0)
+ if (res < 0) {
goto cleanup;
+ }
}
}
cleanup:
int (*PyOS_InputHook)(void) = NULL;
/* This function restarts a fgets() after an EINTR error occurred
- except if PyOS_InterruptOccurred() returns true. */
+ except if _PyOS_InterruptOccurred() returns true. */
static int
-my_fgets(char *buf, int len, FILE *fp)
+my_fgets(PyThreadState* tstate, char *buf, int len, FILE *fp)
{
#ifdef MS_WINDOWS
- HANDLE hInterruptEvent;
+ HANDLE handle;
+ _Py_BEGIN_SUPPRESS_IPH
+ handle = (HANDLE)_get_osfhandle(fileno(fp));
+ _Py_END_SUPPRESS_IPH
+
+ /* bpo-40826: fgets(fp) does crash if fileno(fp) is closed */
+ if (handle == INVALID_HANDLE_VALUE) {
+ return -1; /* EOF */
+ }
#endif
- char *p;
- int err;
+
while (1) {
- if (PyOS_InputHook != NULL)
+ if (PyOS_InputHook != NULL) {
(void)(PyOS_InputHook)();
+ }
+
errno = 0;
clearerr(fp);
- p = fgets(buf, len, fp);
- if (p != NULL)
+ char *p = fgets(buf, len, fp);
+ if (p != NULL) {
return 0; /* No error */
- err = errno;
+ }
+ int err = errno;
+
#ifdef MS_WINDOWS
/* Ctrl-C anywhere on the line or Ctrl-Z if the only character
on a line will set ERROR_OPERATION_ABORTED. Under normal
through to check for EOF.
*/
if (GetLastError()==ERROR_OPERATION_ABORTED) {
- hInterruptEvent = _PyOS_SigintEvent();
+ HANDLE hInterruptEvent = _PyOS_SigintEvent();
switch (WaitForSingleObjectEx(hInterruptEvent, 10, FALSE)) {
case WAIT_OBJECT_0:
ResetEvent(hInterruptEvent);
}
}
#endif /* MS_WINDOWS */
+
if (feof(fp)) {
clearerr(fp);
return -1; /* EOF */
}
+
#ifdef EINTR
if (err == EINTR) {
- int s;
- PyEval_RestoreThread(_PyOS_ReadlineTState);
- s = PyErr_CheckSignals();
+ PyEval_RestoreThread(tstate);
+ int s = PyErr_CheckSignals();
PyEval_SaveThread();
- if (s < 0)
- return 1;
- /* try again */
+
+ if (s < 0) {
+ return 1;
+ }
+ /* try again */
continue;
}
#endif
- if (PyOS_InterruptOccurred()) {
+
+ if (_PyOS_InterruptOccurred(tstate)) {
return 1; /* Interrupt */
}
return -2; /* Error */
extern char _get_console_type(HANDLE handle);
char *
-_PyOS_WindowsConsoleReadline(HANDLE hStdIn)
+_PyOS_WindowsConsoleReadline(PyThreadState *tstate, HANDLE hStdIn)
{
static wchar_t wbuf_local[1024 * 16];
const DWORD chunk_size = 1024;
if (WaitForSingleObjectEx(hInterruptEvent, 100, FALSE)
== WAIT_OBJECT_0) {
ResetEvent(hInterruptEvent);
- PyEval_RestoreThread(_PyOS_ReadlineTState);
+ PyEval_RestoreThread(tstate);
s = PyErr_CheckSignals();
PyEval_SaveThread();
- if (s < 0)
+ if (s < 0) {
goto exit;
+ }
}
break;
}
if (wbuf == wbuf_local) {
wbuf[total_read] = '\0';
wbuf = (wchar_t*)PyMem_RawMalloc(wbuflen * sizeof(wchar_t));
- if (wbuf)
+ if (wbuf) {
wcscpy_s(wbuf, wbuflen, wbuf_local);
+ }
else {
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
goto exit;
}
}
else {
wchar_t *tmp = PyMem_RawRealloc(wbuf, wbuflen * sizeof(wchar_t));
if (tmp == NULL) {
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
goto exit;
}
wbuf = tmp;
if (wbuf[0] == '\x1a') {
buf = PyMem_RawMalloc(1);
- if (buf)
+ if (buf) {
buf[0] = '\0';
+ }
else {
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
}
goto exit;
}
- u8len = WideCharToMultiByte(CP_UTF8, 0, wbuf, total_read, NULL, 0, NULL, NULL);
+ u8len = WideCharToMultiByte(CP_UTF8, 0,
+ wbuf, total_read,
+ NULL, 0,
+ NULL, NULL);
buf = PyMem_RawMalloc(u8len + 1);
if (buf == NULL) {
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
goto exit;
}
- u8len = WideCharToMultiByte(CP_UTF8, 0, wbuf, total_read, buf, u8len, NULL, NULL);
+
+ u8len = WideCharToMultiByte(CP_UTF8, 0,
+ wbuf, total_read,
+ buf, u8len,
+ NULL, NULL);
buf[u8len] = '\0';
exit:
- if (wbuf != wbuf_local)
+ if (wbuf != wbuf_local) {
PyMem_RawFree(wbuf);
+ }
if (err) {
- PyEval_RestoreThread(_PyOS_ReadlineTState);
+ PyEval_RestoreThread(tstate);
PyErr_SetFromWindowsErr(err);
PyEval_SaveThread();
}
-
return buf;
}
{
size_t n;
char *p, *pr;
+ PyThreadState *tstate = _PyOS_ReadlineTState;
+ assert(tstate != NULL);
#ifdef MS_WINDOWS
if (!Py_LegacyWindowsStdioFlag && sys_stdin == stdin) {
if (wlen) {
wbuf = PyMem_RawMalloc(wlen * sizeof(wchar_t));
if (wbuf == NULL) {
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
return NULL;
}
wlen = MultiByteToWideChar(CP_UTF8, 0, prompt, -1,
}
}
clearerr(sys_stdin);
- return _PyOS_WindowsConsoleReadline(hStdIn);
+ return _PyOS_WindowsConsoleReadline(tstate, hStdIn);
}
}
#endif
n = 100;
p = (char *)PyMem_RawMalloc(n);
if (p == NULL) {
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
return NULL;
}
fflush(sys_stdout);
- if (prompt)
+ if (prompt) {
fprintf(stderr, "%s", prompt);
+ }
fflush(stderr);
- switch (my_fgets(p, (int)n, sys_stdin)) {
+ switch (my_fgets(tstate, p, (int)n, sys_stdin)) {
case 0: /* Normal case */
break;
case 1: /* Interrupt */
*p = '\0';
break;
}
+
n = strlen(p);
while (n > 0 && p[n-1] != '\n') {
size_t incr = n+2;
if (incr > INT_MAX) {
PyMem_RawFree(p);
+ PyEval_RestoreThread(tstate);
PyErr_SetString(PyExc_OverflowError, "input line too long");
+ PyEval_SaveThread();
return NULL;
}
+
pr = (char *)PyMem_RawRealloc(p, n + incr);
if (pr == NULL) {
PyMem_RawFree(p);
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
return NULL;
}
p = pr;
- if (my_fgets(p+n, (int)incr, sys_stdin) != 0)
+
+ if (my_fgets(tstate, p+n, (int)incr, sys_stdin) != 0) {
break;
+ }
n += strlen(p+n);
}
+
pr = (char *)PyMem_RawRealloc(p, n+1);
if (pr == NULL) {
PyMem_RawFree(p);
+ PyEval_RestoreThread(tstate);
PyErr_NoMemory();
+ PyEval_SaveThread();
return NULL;
}
return pr;
char *rv, *res;
size_t len;
- if (_PyOS_ReadlineTState == _PyThreadState_GET()) {
+ PyThreadState *tstate = _PyThreadState_GET();
+ if (_PyOS_ReadlineTState == tstate) {
PyErr_SetString(PyExc_RuntimeError,
"can't re-enter readline");
return NULL;
}
}
- _PyOS_ReadlineTState = _PyThreadState_GET();
+ _PyOS_ReadlineTState = tstate;
Py_BEGIN_ALLOW_THREADS
PyThread_acquire_lock(_PyOS_ReadlineLock, 1);
return result;
}
+static int _audit_hook_clear_count = 0;
+
static int _audit_hook(const char *event, PyObject *args, void *userdata)
{
+ assert(args && PyTuple_CheckExact(args));
if (strcmp(event, "_testembed.raise") == 0) {
PyErr_SetString(PyExc_RuntimeError, "Intentional error");
return -1;
return -1;
}
return 0;
+ } else if (strcmp(event, "cpython._PySys_ClearAuditHooks") == 0) {
+ _audit_hook_clear_count += 1;
}
return 0;
}
{
int result = _test_audit(42);
Py_Finalize();
+ if (_audit_hook_clear_count != 1) {
+ return 0x1000 | _audit_hook_clear_count;
+ }
return result;
}
}
if (fields) {
numfields = PySequence_Size(fields);
- if (numfields == -1)
+ if (numfields == -1) {
goto cleanup;
+ }
}
res = 0; /* if no error occurs, this stays 0 to the end */
}
res = PyObject_SetAttr(self, name, PyTuple_GET_ITEM(args, i));
Py_DECREF(name);
- if (res < 0)
+ if (res < 0) {
goto cleanup;
+ }
}
if (kw) {
i = 0; /* needed by PyDict_Next */
while (PyDict_Next(kw, &i, &key, &value)) {
+ int contains = PySequence_Contains(fields, key);
+ if (contains == -1) {
+ res = -1;
+ goto cleanup;
+ } else if (contains == 1) {
+ Py_ssize_t p = PySequence_Index(fields, key);
+ if (p == -1) {
+ res = -1;
+ goto cleanup;
+ }
+ if (p < PyTuple_GET_SIZE(args)) {
+ PyErr_Format(PyExc_TypeError,
+ "%.400s got multiple values for argument '%U'",
+ Py_TYPE(self)->tp_name, key);
+ res = -1;
+ goto cleanup;
+ }
+ }
res = PyObject_SetAttr(self, key, value);
- if (res < 0)
+ if (res < 0) {
goto cleanup;
+ }
}
}
cleanup:
static int validate_stmt(stmt_ty);
static int validate_expr(expr_ty, expr_context_ty);
+static int
+validate_name(PyObject *name)
+{
+ assert(PyUnicode_Check(name));
+ static const char * const forbidden[] = {
+ "None",
+ "True",
+ "False",
+ NULL
+ };
+ for (int i = 0; forbidden[i] != NULL; i++) {
+ if (_PyUnicode_EqualToASCIIString(name, forbidden[i])) {
+ PyErr_Format(PyExc_ValueError, "Name node can't be used with '%s' constant", forbidden[i]);
+ return 0;
+ }
+ }
+ return 1;
+}
+
static int
validate_comprehension(asdl_seq *gens)
{
actual_ctx = exp->v.Starred.ctx;
break;
case Name_kind:
+ if (!validate_name(exp->v.Name.id)) {
+ return 0;
+ }
actual_ctx = exp->v.Name.ctx;
break;
case List_kind:
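Together, the new validate_name() helper and the Name_kind hook above make AST validation reject forged Name nodes whose id is one of the constant singletons. A minimal sketch of what this catches when a hand-built tree is compiled (the node construction is illustrative only):

    import ast

    expr = ast.Expression(ast.Name("True", ast.Load()))  # forged Name node
    ast.fix_missing_locations(expr)
    try:
        compile(expr, "<test>", "eval")
    except ValueError as exc:
        print(exc)  # Name node can't be used with 'True' constant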
/* borrowed reference */
c.c_filename = filename;
c.c_normalize = NULL;
- c.c_feature_version = flags ? flags->cf_feature_version : PY_MINOR_VERSION;
+ c.c_feature_version = flags && (flags->cf_flags & PyCF_ONLY_AST) ?
+ flags->cf_feature_version : PY_MINOR_VERSION;
if (TYPE(n) == encoding_decl)
n = CHILD(n, 0);
len = expr_end - expr_start;
/* Allocate 3 extra bytes: open paren, close paren, null byte. */
- str = PyMem_RawMalloc(len + 3);
+ str = PyMem_Malloc(len + 3);
if (str == NULL) {
PyErr_NoMemory();
return NULL;
mod_n = PyParser_SimpleParseStringFlagsFilename(str, "<fstring>",
Py_eval_input, 0);
if (!mod_n) {
- PyMem_RawFree(str);
+ PyMem_Free(str);
return NULL;
}
/* Reuse str to find the correct column offset. */
str[len+1] = '}';
fstring_fix_node_location(n, mod_n, str);
mod = PyAST_FromNode(mod_n, &cf, "<fstring>", c->c_arena);
- PyMem_RawFree(str);
+ PyMem_Free(str);
PyNode_Free(mod_n);
if (!mod)
return NULL;
/* Check for =, which puts the text value of the expression in
expr_text. */
if (**str == '=') {
+ if (c->c_feature_version < 8) {
+ ast_error(c, n,
+ "f-string: self documenting expressions are "
+ "only supported in Python 3.8 and greater");
+ goto error;
+ }
*str += 1;
/* Skip over ASCII whitespace. No need to test for end of string
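These feature_version checks only take effect when code is parsed through the AST path (PyCF_ONLY_AST), i.e. via ast.parse() with an explicit feature_version; compile() and exec() always use the running interpreter's grammar. A minimal sketch of the resulting behaviour on a 3.8.4 interpreter, using only the documented ast.parse() parameter:

    import ast

    # Accepted: the 3.8 grammar supports the f-string '=' specifier.
    ast.parse('f"{x=}"', feature_version=(3, 8))

    # Rejected: requesting the 3.7 grammar raises SyntaxError
    # ("... only supported in Python 3.8 and greater").
    try:
        ast.parse('f"{x=}"', feature_version=(3, 7))
    except SyntaxError as exc:
        print(exc)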
Py_ssize_t i;
/* We're still using the cached data. Switch to
alloc-ing. */
- l->p = PyMem_RawMalloc(sizeof(expr_ty) * new_size);
+ l->p = PyMem_Malloc(sizeof(expr_ty) * new_size);
if (!l->p)
return -1;
/* Copy the cached data into the new buffer. */
l->p[i] = l->data[i];
} else {
/* Just realloc. */
- expr_ty *tmp = PyMem_RawRealloc(l->p, sizeof(expr_ty) * new_size);
+ expr_ty *tmp = PyMem_Realloc(l->p, sizeof(expr_ty) * new_size);
if (!tmp) {
- PyMem_RawFree(l->p);
+ PyMem_Free(l->p);
l->p = NULL;
return -1;
}
/* Do nothing. */
} else {
/* We have dynamically allocated. Free the memory. */
- PyMem_RawFree(l->p);
+ PyMem_Free(l->p);
}
l->p = NULL;
l->size = -1;
return 0;
}
+static int
+append_ast_index_slice(_PyUnicodeWriter *writer, slice_ty slice)
+{
+ int level = PR_TUPLE;
+ expr_ty value = slice->v.Index.value;
+ if (value->kind == Tuple_kind) {
+ for (Py_ssize_t i = 0; i < asdl_seq_LEN(value->v.Tuple.elts); i++) {
+ expr_ty element = asdl_seq_GET(value->v.Tuple.elts, i);
+ if (element->kind == Starred_kind) {
+ ++level;
+ break;
+ }
+ }
+ }
+ APPEND_EXPR(value, level);
+ return 0;
+}
+
static int
append_ast_slice(_PyUnicodeWriter *writer, slice_ty slice)
{
case ExtSlice_kind:
return append_ast_ext_slice(writer, slice);
case Index_kind:
- APPEND_EXPR(slice->v.Index.value, PR_TUPLE);
- return 0;
+ return append_ast_index_slice(writer, slice);
default:
PyErr_SetString(PyExc_SystemError,
"unexpected slice kind");
};
PyDoc_STRVAR(zip_doc,
-"zip(*iterables) --> zip object\n\
+"zip(*iterables) --> A zip object yielding tuples until an input is exhausted.\n\
\n\
-Return a zip object whose .__next__() method returns a tuple where\n\
-the i-th element comes from the i-th iterable argument. The .__next__()\n\
-method continues until the shortest iterable in the argument sequence\n\
-is exhausted and then it raises StopIteration.");
+ >>> list(zip('abcdefg', range(3), range(4)))\n\
+ [('a', 0, 0), ('b', 1, 1), ('c', 2, 2)]\n\
+\n\
+The zip object yields n-length tuples, where n is the number of iterables\n\
+passed as positional arguments to zip(). The i-th element in every tuple\n\
+comes from the i-th iterable argument to zip(). This continues until the\n\
+shortest argument is exhausted.");
PyTypeObject PyZip_Type = {
PyVarObject_HEAD_INIT(&PyType_Type, 0)
win32_urandom_init(int raise)
{
/* Acquire context */
- if (!CryptAcquireContext(&hCryptProv, NULL, NULL,
- PROV_RSA_FULL, CRYPT_VERIFYCONTEXT))
+ if (!CryptAcquireContextW(&hCryptProv, NULL, NULL,
+ PROV_RSA_FULL, CRYPT_VERIFYCONTEXT))
goto error;
return 0;
comprehension_ty outermost;
PyObject *qualname = NULL;
int is_async_generator = 0;
+ int top_level_await = IS_TOP_LEVEL_AWAIT(c);
+
- if (IS_TOP_LEVEL_AWAIT(c)) {
- c->u->u_ste->ste_coroutine = 1;
- }
int is_async_function = c->u->u_ste->ste_coroutine;
outermost = (comprehension_ty) asdl_seq_GET(generators, 0);
is_async_generator = c->u->u_ste->ste_coroutine;
- if (is_async_generator && !is_async_function && type != COMP_GENEXP) {
+ if (is_async_generator && !is_async_function && type != COMP_GENEXP && !top_level_await) {
compiler_error(c, "asynchronous comprehension outside of "
"an asynchronous function");
goto error_in_scope;
qualname = c->u->u_qualname;
Py_INCREF(qualname);
compiler_exit_scope(c);
+ if (top_level_await && is_async_generator) {
+ c->u->u_ste->ste_coroutine = 1;
+ }
if (co == NULL)
goto error;
char funcname[258], *import_python;
const wchar_t *wpathname;
-#ifndef _DEBUG
_Py_CheckPython3();
-#endif
wpathname = _PyUnicode_AsUnicode(pathname);
if (wpathname == NULL)
#endif
if (gil_held) {
- if (PySys_Audit("open", "sOi", pathname, Py_None, flags) < 0) {
+ PyObject *pathname_obj = PyUnicode_DecodeFSDefault(pathname);
+ if (pathname_obj == NULL) {
+ return -1;
+ }
+ if (PySys_Audit("open", "OOi", pathname_obj, Py_None, flags) < 0) {
+ Py_DECREF(pathname_obj);
return -1;
}
Py_END_ALLOW_THREADS
} while (fd < 0
&& errno == EINTR && !(async_err = PyErr_CheckSignals()));
- if (async_err)
+ if (async_err) {
+ Py_DECREF(pathname_obj);
return -1;
+ }
if (fd < 0) {
- PyErr_SetFromErrnoWithFilename(PyExc_OSError, pathname);
+ PyErr_SetFromErrnoWithFilenameObjects(PyExc_OSError, pathname_obj, NULL);
+ Py_DECREF(pathname_obj);
return -1;
}
+ Py_DECREF(pathname_obj);
}
else {
fd = open(pathname, flags);
FILE*
_Py_fopen(const char *pathname, const char *mode)
{
- if (PySys_Audit("open", "ssi", pathname, mode, 0) < 0) {
+ PyObject *pathname_obj = PyUnicode_DecodeFSDefault(pathname);
+ if (pathname_obj == NULL) {
return NULL;
}
+ if (PySys_Audit("open", "Osi", pathname_obj, mode, 0) < 0) {
+ Py_DECREF(pathname_obj);
+ return NULL;
+ }
+ Py_DECREF(pathname_obj);
FILE *f = fopen(pathname, mode);
if (f == NULL)
path_bytes = PyBytes_AS_STRING(bytes);
if (PySys_Audit("open", "Osi", path, mode, 0) < 0) {
+ Py_DECREF(bytes);
return NULL;
}
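The changes above decode the C-level path with PyUnicode_DecodeFSDefault before raising the "open" audit event, so hooks receive a str path from these helpers as well. A small sketch of observing the event from Python; the file name is only an illustration:

    import sys

    def hook(event, args):
        # The "open" event carries (path, mode, flags).
        if event == "open":
            print("open:", args)

    sys.addaudithook(hook)              # hooks cannot be removed once added
    open("example.txt", "w").close()    # hypothetical file; triggers the hook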
if (lnotab == NULL)
goto code_error;
+ if (PySys_Audit("code.__new__", "OOOiiiiii",
+ code, filename, name, argcount, posonlyargcount,
+ kwonlyargcount, nlocals, stacksize, flags) < 0) {
+ goto code_error;
+ }
+
v = (PyObject *) PyCode_NewWithPosOnlyArgs(
argcount, posonlyargcount, kwonlyargcount,
nlocals, stacksize, flags,
_PyPathConfig _Py_path_config = _PyPathConfig_INIT;
-#ifdef MS_WINDOWS
-wchar_t *_Py_dll_path = NULL;
-#endif
static int
_PyMem_SetDefaultAllocator(PYMEM_DOMAIN_RAW, &old_alloc);
pathconfig_clear(&_Py_path_config);
-#ifdef MS_WINDOWS
- PyMem_RawFree(_Py_dll_path);
- _Py_dll_path = NULL;
-#endif
PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &old_alloc);
}
}
-#ifdef MS_WINDOWS
-/* Initialize _Py_dll_path on Windows. Do nothing on other platforms. */
-static PyStatus
-_PyPathConfig_InitDLLPath(void)
-{
- if (_Py_dll_path != NULL) {
- /* Already set: nothing to do */
- return _PyStatus_OK();
- }
-
- PyMemAllocatorEx old_alloc;
- _PyMem_SetDefaultAllocator(PYMEM_DOMAIN_RAW, &old_alloc);
-
- _Py_dll_path = _Py_GetDLLPath();
-
- PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &old_alloc);
-
- if (_Py_dll_path == NULL) {
- return _PyStatus_NO_MEMORY();
- }
- return _PyStatus_OK();
-}
-#endif
-
-
static PyStatus
pathconfig_set_from_config(_PyPathConfig *pathconfig, const PyConfig *config)
{
PyStatus
_PyConfig_WritePathConfig(const PyConfig *config)
{
-#ifdef MS_WINDOWS
- PyStatus status = _PyPathConfig_InitDLLPath();
- if (_PyStatus_EXCEPTION(status)) {
- return status;
- }
-#endif
-
return pathconfig_set_from_config(&_Py_path_config, config);
}
{
PyStatus status;
-#ifdef MS_WINDOWS
- status = _PyPathConfig_InitDLLPath();
- if (_PyStatus_EXCEPTION(status)) {
- Py_ExitStatusException(status);
- }
-#endif
-
if (_Py_path_config.module_search_path == NULL) {
status = pathconfig_global_read(&_Py_path_config);
if (_PyStatus_EXCEPTION(status)) {
#ifdef HAVE_READLINK
wchar_t link[MAXPATHLEN + 1];
int nr = 0;
+ wchar_t path0copy[2 * MAXPATHLEN + 1];
if (have_script_arg) {
nr = _Py_wreadlink(path0, link, Py_ARRAY_LENGTH(link));
}
else {
/* Must make a copy, path0copy has room for 2 * MAXPATHLEN */
- wchar_t path0copy[2 * MAXPATHLEN + 1];
wcsncpy(path0copy, path0, MAXPATHLEN);
q = wcsrchr(path0copy, SEP);
wcsncpy(q+1, link, MAXPATHLEN);
/* nothing */;
#endif
- /* Clear all loghooks */
- /* We want minimal exposure of this function, so define the extern
- * here. The linker should discover the correct function without
- * exporting a symbol. */
- extern void _PySys_ClearAuditHooks(void);
- _PySys_ClearAuditHooks();
-
/* Destroy all modules */
PyImport_Cleanup();
/* Clear interpreter state and all thread states. */
PyInterpreterState_Clear(interp);
+ /* Clear all loghooks */
+ /* We want minimal exposure of this function, so define the extern
+ * here. The linker should discover the correct function without
+ * exporting a symbol. */
+ extern void _PySys_ClearAuditHooks(void);
+ _PySys_ClearAuditHooks();
+
/* Now we decref the exception classes. After this point nothing
can raise an exception. That's okay, because each Fini() method
below has been checked to make sure no exceptions are ever
struct _shared_str_data *shared = PyMem_NEW(struct _shared_str_data, 1);
shared->kind = PyUnicode_KIND(obj);
shared->buffer = PyUnicode_DATA(obj);
- shared->len = PyUnicode_GET_LENGTH(obj) - 1;
+ shared->len = PyUnicode_GET_LENGTH(obj);
data->data = (void *)shared;
Py_INCREF(obj);
data->obj = obj; // Will be "released" (decref'ed) when data released.
PyCompilerFlags localflags = _PyCompilerFlags_INIT;
perrdetail err;
int iflags = PARSER_FLAGS(flags);
- if (flags && flags->cf_feature_version < 7)
+ if (flags && (flags->cf_flags & PyCF_ONLY_AST) && flags->cf_feature_version < 7)
iflags |= PyPARSE_ASYNC_HACKS;
node *n = PyParser_ParseStringObject(s, filename,
#include "Python.h"
#ifdef MS_WINDOWS
-#include <windows.h>
+#include <winsock2.h> /* struct timeval */
#endif
#if defined(__APPLE__)
-This is Python version 3.8.3
+This is Python version 3.8.4
============================
.. image:: https://travis-ci.org/python/cpython.svg?branch=3.8
def find_capi_vars(root):
capi_vars = {}
for dirname in SOURCE_DIRS:
- for filename in glob.glob(os.path.join(ROOT_DIR, dirname, '**/*.[hc]'),
+ for filename in glob.glob(os.path.join(
+ glob.escape(os.path.join(ROOT_DIR, dirname)),
+ '**/*.[hc]'),
recursive=True):
with open(filename) as file:
for name in _find_capi_vars(file):
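glob.escape() is applied to the directory prefix here (and in the make_ssl_data and setup.py hunks below) so that any glob metacharacters in the checkout path are matched literally; only the trailing pattern is meant to be interpreted. A short sketch, assuming a hypothetical path that contains brackets:

    import glob
    import os

    root = "/tmp/cpython[3.8]"   # hypothetical checkout path with metacharacters
    pattern = os.path.join(glob.escape(root), "**/*.[hc]")
    # Without glob.escape(), "[3.8]" would be read as a character class and
    # nothing under this directory would match.
    print(glob.glob(pattern, recursive=True))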
<RegistryValue KeyPath="yes" Root="HKMU" Key="Software\Python\PyLauncher" Name="AssociateFiles" Value="1" Type="integer" />
<ProgId Id="Python.File" Description="!(loc.PythonFileDescription)" Advertise="no" Icon="py.exe" IconIndex="1">
- <Extension Id="py" ContentType="text/plain">
+ <Extension Id="py" ContentType="text/x-python">
<Verb Id="open" TargetFile="py.exe" Argument=""%L" %*" />
</Extension>
</ProgId>
<RegistryValue Root="HKCR" Key="Python.File\shellex\DropHandler" Value="{BEA218D2-6950-497B-9434-61683EC065FE}" Type="string" />
<ProgId Id="Python.NoConFile" Description="!(loc.PythonNoConFileDescription)" Advertise="no" Icon="py.exe" IconIndex="1">
- <Extension Id="pyw" ContentType="text/plain">
+ <Extension Id="pyw" ContentType="text/x-python">
<Verb Id="open" TargetFile="pyw.exe" Argument=""%L" %*" />
</Extension>
</ProgId>
f = sys.stdout if use_stdout else open(outfile, "w")
# mnemonic -> (library code, error prefix, header file)
error_libraries = {}
- for error_header in glob.glob(os.path.join(openssl_inc, 'include/openssl/*err.h')):
+ for error_header in glob.glob(os.path.join(glob.escape(openssl_inc), 'include/openssl/*err.h')):
base = os.path.basename(error_header)
if base in ('buffererr.h', 'objectserr.h', 'storeerr.h'):
# Deprecated in 3.0.
import os
try:
from urllib.request import urlopen
+ from urllib.error import HTTPError
except ImportError:
- from urllib2 import urlopen
-import subprocess
+ from urllib2 import urlopen, HTTPError
import shutil
+import string
+import subprocess
import sys
import tarfile
log = logging.getLogger("multissl")
OPENSSL_OLD_VERSIONS = [
- "1.0.2",
+ "1.0.2u",
+ "1.1.0l",
]
OPENSSL_RECENT_VERSIONS = [
- "1.0.2t",
- "1.1.0l",
- "1.1.1f",
+ "1.1.1g",
+ # "3.0.0-alpha2"
]
LIBRESSL_OLD_VERSIONS = [
+ "2.9.2",
]
LIBRESSL_RECENT_VERSIONS = [
- "2.9.2",
+ "3.1.0",
]
# store files in ../multissl
parser.add_argument(
'--disable-ancient',
action='store_true',
- help="Don't test OpenSSL < 1.0.2 and LibreSSL < 2.5.3.",
+ help="Don't test OpenSSL and LibreSSL versions without upstream support",
)
parser.add_argument(
'--openssl',
help="Keep original sources for debugging."
)
+OPENSSL_FIPS_CNF = """\
+openssl_conf = openssl_init
+
+.include {self.install_dir}/ssl/fipsinstall.cnf
+# .include {self.install_dir}/ssl/openssl.cnf
+
+[openssl_init]
+providers = provider_sect
+
+[provider_sect]
+fips = fips_sect
+default = default_sect
+
+[default_sect]
+activate = 1
+"""
+
class AbstractBuilder(object):
library = None
- url_template = None
+ url_templates = None
src_template = None
build_template = None
install_target = 'install'
def __hash__(self):
return hash((self.library, self.version))
+ @property
+ def short_version(self):
+ """Short version for OpenSSL download URL"""
+ return None
+
@property
def openssl_cli(self):
"""openssl CLI binary"""
src_dir = os.path.dirname(self.src_file)
if not os.path.isdir(src_dir):
os.makedirs(src_dir)
- url = self.url_template.format(self.version)
- log.info("Downloading from {}".format(url))
- req = urlopen(url)
- # KISS, read all, write all
- data = req.read()
+ data = None
+ for url_template in self.url_templates:
+ url = url_template.format(v=self.version, s=self.short_version)
+ log.info("Downloading from {}".format(url))
+ try:
+ req = urlopen(url)
+ # KISS, read all, write all
+ data = req.read()
+ except HTTPError as e:
+ log.error(
+ "Download from {} has from failed: {}".format(url, e)
+ )
+ else:
+ log.info("Successfully downloaded from {}".format(url))
+ break
+ if data is None:
+ raise ValueError("All download URLs have failed")
log.info("Storing {}".format(self.src_file))
with open(self.src_file, "wb") as f:
f.write(data)
"shared", "--debug",
"--prefix={}".format(self.install_dir)
]
+ # cmd.extend(["no-deprecated", "--api=1.1.0"])
env = os.environ.copy()
# set rpath
env["LD_RUN_PATH"] = self.lib_dir
["make", "-j1", self.install_target],
cwd=self.build_dir
)
+ self._post_install()
if not self.args.keep_sources:
shutil.rmtree(self.build_dir)
+ def _post_install(self):
+ pass
+
def install(self):
log.info(self.openssl_cli)
if not self.has_openssl or self.args.force:
class BuildOpenSSL(AbstractBuilder):
library = "OpenSSL"
- url_template = "https://www.openssl.org/source/openssl-{}.tar.gz"
+ url_templates = (
+ "https://www.openssl.org/source/openssl-{v}.tar.gz",
+ "https://www.openssl.org/source/old/{s}/openssl-{v}.tar.gz"
+ )
src_template = "openssl-{}.tar.gz"
build_template = "openssl-{}"
# only install software, skip docs
install_target = 'install_sw'
+ def _post_install(self):
+ if self.version.startswith("3.0"):
+ self._post_install_300()
+
+ def _post_install_300(self):
+ # create ssl/ subdir with example configs
+ self._subprocess_call(
+ ["make", "-j1", "install_ssldirs"],
+ cwd=self.build_dir
+ )
+ # Install FIPS module
+ # https://wiki.openssl.org/index.php/OpenSSL_3.0#Completing_the_installation_of_the_FIPS_Module
+ fipsinstall_cnf = os.path.join(
+ self.install_dir, "ssl", "fipsinstall.cnf"
+ )
+ openssl_fips_cnf = os.path.join(
+ self.install_dir, "ssl", "openssl-fips.cnf"
+ )
+ fips_mod = os.path.join(self.lib_dir, "ossl-modules/fips.so")
+ self._subprocess_call(
+ [
+ self.openssl_cli, "fipsinstall",
+ "-out", fipsinstall_cnf,
+ "-module", fips_mod,
+ "-provider_name", "fips",
+ "-mac_name", "HMAC",
+ "-macopt", "digest:SHA256",
+ "-macopt", "hexkey:00",
+ "-section_name", "fips_sect"
+ ]
+ )
+ with open(openssl_fips_cnf, "w") as f:
+ f.write(OPENSSL_FIPS_CNF.format(self=self))
+ @property
+ def short_version(self):
+ """Short version for OpenSSL download URL"""
+ short_version = self.version.rstrip(string.ascii_letters)
+ if short_version.startswith("0.9"):
+ short_version = "0.9.x"
+ return short_version
+
class BuildLibreSSL(AbstractBuilder):
library = "LibreSSL"
- url_template = (
- "https://ftp.openbsd.org/pub/OpenBSD/LibreSSL/libressl-{}.tar.gz")
+ url_templates = (
+ "https://ftp.openbsd.org/pub/OpenBSD/LibreSSL/libressl-{v}.tar.gz",
+ )
src_template = "libressl-{}.tar.gz"
build_template = "libressl-{}"
# has no effect, don't bother defining them
Darwin/[6789].*)
define_xopen_source=no;;
- Darwin/1[0-9].*)
+ Darwin/[12][0-9].*)
define_xopen_source=no;;
# On AIX 4 and 5.1, mbstate_t is defined only when _XOPEN_SOURCE == 500 but
# used in wcsnrtombs() and mbsnrtowcs() even if _XOPEN_SOURCE is not defined
# has no effect, don't bother defining them
Darwin/@<:@6789@:>@.*)
define_xopen_source=no;;
- Darwin/1@<:@0-9@:>@.*)
+ Darwin/@<:@[12]@:>@@<:@0-9@:>@.*)
define_xopen_source=no;;
# On AIX 4 and 5.1, mbstate_t is defined only when _XOPEN_SOURCE == 500 but
# used in wcsnrtombs() and mbsnrtowcs() even if _XOPEN_SOURCE is not defined
import re
import sys
import sysconfig
-from glob import glob
+from glob import glob, escape
from distutils import log
from distutils.command.build_ext import build_ext
# Python header files
headers = [sysconfig.get_config_h_filename()]
- headers += glob(os.path.join(sysconfig.get_path('include'), "*.h"))
+ headers += glob(os.path.join(escape(sysconfig.get_path('include')), "*.h"))
# The sysconfig variables built by makesetup that list the already
# built modules and the disabled modules as configured by the Setup
self.add(Extension('_sha1', ['sha1module.c'],
depends=['hashlib.h']))
- blake2_deps = glob(os.path.join(self.srcdir,
+ blake2_deps = glob(os.path.join(escape(self.srcdir),
'Modules/_blake2/impl/*'))
blake2_deps.append('hashlib.h')
'_blake2/blake2s_impl.c'],
depends=blake2_deps))
- sha3_deps = glob(os.path.join(self.srcdir,
+ sha3_deps = glob(os.path.join(escape(self.srcdir),
'Modules/_sha3/kcp/*'))
sha3_deps.append('hashlib.h')
self.add(Extension('_sha3',