- job: macOS_CI_Tests
displayName: macOS CI Tests
dependsOn: Prebuild
- condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true'))
+ #condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true'))
+ # bpo-39837: macOS tests on Azure Pipelines are disabled
+ condition: false
variables:
testRunTitle: '$(build.sourceBranchName)-macos'
- script: ./configure --with-pydebug --with-openssl=/usr/local/opt/openssl --prefix=/opt/python-azdev
displayName: 'Configure CPython (debug)'
-- script: make -s -j4
+- script: make -j4
displayName: 'Build CPython'
- script: make pythoninfo
- script: ./configure --with-pydebug
displayName: 'Configure CPython (debug)'
-- script: make -s -j4
+- script: make -j4
displayName: 'Build CPython'
- ${{ if eq(parameters.coverage, 'true') }}:
- job: macOS_PR_Tests
displayName: macOS PR Tests
dependsOn: Prebuild
- condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true'))
+ #condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true'))
+ # bpo-39837: macOS tests on Azure Pipelines are disabled
+ condition: false
variables:
testRunTitle: '$(system.pullRequest.TargetBranch)-macos'
Like ``s*``, but the Python object may also be ``None``, in which case the
``buf`` member of the :c:type:`Py_buffer` structure is set to ``NULL``.
-``z#`` (:class:`str`, read-only :term:`bytes-like object` or ``None``) [const char \*, int]
+``z#`` (:class:`str`, read-only :term:`bytes-like object` or ``None``) [const char \*, int or :c:type:`Py_ssize_t`]
Like ``s#``, but the Python object may also be ``None``, in which case the C
pointer is set to ``NULL``.
bytes-like objects. **This is the recommended way to accept
binary data.**
-``y#`` (read-only :term:`bytes-like object`) [const char \*, int]
+``y#`` (read-only :term:`bytes-like object`) [const char \*, int or :c:type:`Py_ssize_t`]
This variant on ``s#`` doesn't accept Unicode objects, only bytes-like
objects.
Part of the old-style :c:type:`Py_UNICODE` API; please migrate to using
:c:func:`PyUnicode_AsWideCharString`.
-``u#`` (:class:`str`) [const Py_UNICODE \*, int]
+``u#`` (:class:`str`) [const Py_UNICODE \*, int or :c:type:`Py_ssize_t`]
This variant on ``u`` stores into two C variables, the first one a pointer to a
Unicode data buffer, the second one its length. This variant allows
null code points.
Part of the old-style :c:type:`Py_UNICODE` API; please migrate to using
:c:func:`PyUnicode_AsWideCharString`.
-``Z#`` (:class:`str` or ``None``) [const Py_UNICODE \*, int]
+``Z#`` (:class:`str` or ``None``) [const Py_UNICODE \*, int or :c:type:`Py_ssize_t`]
Like ``u#``, but the Python object may also be ``None``, in which case the
:c:type:`Py_UNICODE` pointer is set to ``NULL``.
recoding them. Instead, the implementation assumes that the byte string object uses
the encoding passed in as parameter.
-``es#`` (:class:`str`) [const char \*encoding, char \*\*buffer, int \*buffer_length]
+``es#`` (:class:`str`) [const char \*encoding, char \*\*buffer, int or :c:type:`Py_ssize_t` \*buffer_length]
This variant on ``s#`` is used for encoding Unicode into a character buffer.
Unlike the ``es`` format, this variant allows input data which contains NUL
characters.
In both cases, *\*buffer_length* is set to the length of the encoded data
without the trailing NUL byte.
-``et#`` (:class:`str`, :class:`bytes` or :class:`bytearray`) [const char \*encoding, char \*\*buffer, int \*buffer_length]
+``et#`` (:class:`str`, :class:`bytes` or :class:`bytearray`) [const char \*encoding, char \*\*buffer, int or :c:type:`Py_ssize_t` \*buffer_length]
Same as ``es#`` except that byte string objects are passed through without recoding
them. Instead, the implementation assumes that the byte string object uses the
encoding passed in as parameter.
Convert a null-terminated C string to a Python :class:`str` object using ``'utf-8'``
encoding. If the C string pointer is ``NULL``, ``None`` is used.
- ``s#`` (:class:`str` or ``None``) [const char \*, int]
+ ``s#`` (:class:`str` or ``None``) [const char \*, int or :c:type:`Py_ssize_t`]
Convert a C string and its length to a Python :class:`str` object using ``'utf-8'``
encoding. If the C string pointer is ``NULL``, the length is ignored and
``None`` is returned.
This converts a C string to a Python :class:`bytes` object. If the C
string pointer is ``NULL``, ``None`` is returned.
- ``y#`` (:class:`bytes`) [const char \*, int]
+ ``y#`` (:class:`bytes`) [const char \*, int or :c:type:`Py_ssize_t`]
   This converts a C string and its length to a Python object. If the C
string pointer is ``NULL``, ``None`` is returned.
``z`` (:class:`str` or ``None``) [const char \*]
Same as ``s``.
- ``z#`` (:class:`str` or ``None``) [const char \*, int]
+ ``z#`` (:class:`str` or ``None``) [const char \*, int or :c:type:`Py_ssize_t`]
Same as ``s#``.
``u`` (:class:`str`) [const wchar_t \*]
data to a Python Unicode object. If the Unicode buffer pointer is ``NULL``,
``None`` is returned.
- ``u#`` (:class:`str`) [const wchar_t \*, int]
+ ``u#`` (:class:`str`) [const wchar_t \*, int or :c:type:`Py_ssize_t`]
Convert a Unicode (UTF-16 or UCS-4) data buffer and its length to a Python
Unicode object. If the Unicode buffer pointer is ``NULL``, the length is ignored
and ``None`` is returned.
``U`` (:class:`str` or ``None``) [const char \*]
Same as ``s``.
- ``U#`` (:class:`str` or ``None``) [const char \*, int]
+ ``U#`` (:class:`str` or ``None``) [const char \*, int or :c:type:`Py_ssize_t`]
Same as ``s#``.
``i`` (:class:`int`) [int]
This is a base class for other standard exceptions.
(2)
- This is the same as :exc:`weakref.ReferenceError`.
-
-(3)
Only defined on Windows; protect code that uses this by testing that the
preprocessor macro ``MS_WINDOWS`` is defined.
/* propagate error */
}
- while (item = PyIter_Next(iterator)) {
+ while ((item = PyIter_Next(iterator))) {
/* do something with item */
...
/* release reference when done */
(if present) to convert it to a :c:type:`PyLongObject`.
Raise :exc:`OverflowError` if the value of *obj* is out of range for a
- :c:type:`long`.
+ :c:type:`long long`.
Returns ``-1`` on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. versionadded:: 3.7
-.. c:function: int PyTraceMalloc_Track(unsigned int domain, uintptr_t ptr, size_t size)
+.. c:function:: int PyTraceMalloc_Track(unsigned int domain, uintptr_t ptr, size_t size)
Track an allocated memory block in the :mod:`tracemalloc` module.
   If the memory block is already tracked, update the existing trace.
-.. c:function: int PyTraceMalloc_Untrack(unsigned int domain, uintptr_t ptr)
+.. c:function:: int PyTraceMalloc_Untrack(unsigned int domain, uintptr_t ptr)
Untrack an allocated memory block in the :mod:`tracemalloc` module.
Do nothing if the block was not tracked.
The :c:member:`~PyTypeObject.tp_traverse` pointer is used by the garbage collector to detect
reference cycles. A typical implementation of a :c:member:`~PyTypeObject.tp_traverse` function
simply calls :c:func:`Py_VISIT` on each of the instance's members that are Python
- objects. For example, this is function :c:func:`local_traverse` from the
+ objects that the instance owns. For example, this is function :c:func:`local_traverse` from the
:mod:`_thread` extension module::
static int
debugging aid you may want to visit it anyway just so the :mod:`gc` module's
:func:`~gc.get_referents` function will include it.
+ .. warning::
+ When implementing :c:member:`~PyTypeObject.tp_traverse`, only the members
+ that the instance *owns* (by having strong references to them) must be
+ visited. For instance, if an object supports weak references via the
+ :c:member:`~PyTypeObject.tp_weaklist` slot, the pointer supporting
+ the linked list (what *tp_weaklist* points to) must **not** be
+ visited as the instance does not directly own the weak references to itself
+ (the weakreference list is there to support the weak reference machinery,
+ but the instance has no strong reference to the elements inside it, as they
+ are allowed to be removed even if the instance is still alive).
+
+
Note that :c:func:`Py_VISIT` requires the *visit* and *arg* parameters to
:c:func:`local_traverse` to have these specific names; don't name them just
anything.
Python and this documentation is:
-Copyright © 2001-2019 Python Software Foundation. All rights reserved.
+Copyright © 2001-2020 Python Software Foundation. All rights reserved.
Copyright © 2000 BeOpen.com. All rights reserved.
remember before diving further:
* Performance characteristics vary across Python implementations. This FAQ
- focusses on :term:`CPython`.
+ focuses on :term:`CPython`.
* Behaviour can vary across operating systems, especially when talking about
I/O or multi-threading.
* You should always find the hot spots in your program *before* attempting to
Mertz also wrote a 3-part series of articles on functional programming
for IBM's DeveloperWorks site; see
-`part 1 <https://www.ibm.com/developerworks/linux/library/l-prog/index.html>`__,
-`part 2 <https://www.ibm.com/developerworks/linux/library/l-prog2/index.html>`__, and
-`part 3 <https://www.ibm.com/developerworks/linux/library/l-prog3/index.html>`__,
+`part 1 <https://developer.ibm.com/articles/l-prog/>`__,
+`part 2 <https://developer.ibm.com/tutorials/l-prog2/>`__, and
+`part 3 <https://developer.ibm.com/tutorials/l-prog3/>`__,
Python documentation
Python 3! But to fully understand how your code is going to change and what
you want to look out for while you code, you will want to learn what changes
Python 3 makes in terms of Python 2. Typically the two best ways of doing that
-is reading the `"What's New"`_ doc for each release of Python 3 and the
+is reading the :ref:`"What's New" <whatsnew-index>` doc for each release of Python 3 and the
`Porting to Python 3`_ book (which is free online). There is also a handy
`cheat sheet`_ from the Python-Future project.
against Python 2 and not Python 3. To help explain this, let's look at an
example.
-Let's pretend that you need access to a feature of importlib_ that
+Let's pretend that you need access to a feature of :mod:`importlib` that
is available in Python's standard library since Python 3.3 and available for
Python 2 through importlib2_ on PyPI. You might be tempted to write code to
-access e.g. the ``importlib.abc`` module by doing the following::
+access e.g. the :mod:`importlib.abc` module by doing the following::
import sys
to make sure everything functions as expected in both versions of Python.
-.. _2to3: https://docs.python.org/3/library/2to3.html
.. _caniusepython3: https://pypi.org/project/caniusepython3
.. _cheat sheet: http://python-future.org/compatible_idioms.html
.. _coverage.py: https://pypi.org/project/coverage
.. _Futurize: http://python-future.org/automatic_conversion.html
-.. _importlib: https://docs.python.org/3/library/importlib.html#module-importlib
.. _importlib2: https://pypi.org/project/importlib2
.. _Modernize: https://python-modernize.readthedocs.io/
.. _mypy: http://mypy-lang.org/
.. _tox: https://pypi.org/project/tox
.. _trove classifier: https://pypi.org/classifiers
-.. _"What's New": https://docs.python.org/3/whatsnew/index.html
-
.. _Why Python 3 exists: https://snarky.ca/why-python-3-exists
Py_INCREF(&CustomType);
if (PyModule_AddObject(m, "Custom", (PyObject *) &CustomType) < 0) {
Py_DECREF(&CustomType);
- PY_DECREF(m);
+ Py_DECREF(m);
return NULL;
}
class RewriteName(NodeTransformer):
def visit_Name(self, node):
- return copy_location(Subscript(
+ return Subscript(
value=Name(id='data', ctx=Load()),
slice=Index(value=Str(s=node.id)),
ctx=node.ctx
statement nodes), the visitor may also return a list of nodes rather than
just a single node.
+   If :class:`NodeTransformer` introduces new nodes (that weren't part of
+   the original tree) without giving them location information (such as
+ :attr:`lineno`), :func:`fix_missing_locations` should be called with
+ the new sub-tree to recalculate the location information::
+
+ tree = ast.parse('foo', mode='eval')
+ new_tree = fix_missing_locations(RewriteName().visit(tree))
+
Usually you use the transformer like this::
node = YourTransformer().visit(node)
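A self-contained sketch of the transformer above, end to end (adapted for Python 3.9+, where the slice can be an expression directly and :class:`ast.Constant` replaces the deprecated ``ast.Str``/``ast.Index``):

```python
import ast

class RewriteName(ast.NodeTransformer):
    def visit_Name(self, node):
        # Replace every bare name with a lookup in the 'data' mapping.
        return ast.Subscript(
            value=ast.Name(id='data', ctx=ast.Load()),
            slice=ast.Constant(value=node.id),
            ctx=node.ctx,
        )

tree = ast.parse('foo', mode='eval')
# The new Subscript nodes lack line numbers, so fix them before compiling.
new_tree = ast.fix_missing_locations(RewriteName().visit(tree))
data = {'foo': 42}
print(eval(compile(new_tree, '<ast>', 'eval')))  # 42
```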
.. function:: get_event_loop()
- Get the current event loop. If there is no current event loop set
- in the current OS thread and :func:`set_event_loop` has not yet
+ Get the current event loop.
+
+ If there is no current event loop set in the current OS thread,
+ the OS thread is main, and :func:`set_event_loop` has not yet
been called, asyncio will create a new event loop and set it as the
current one.
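As a minimal sketch (assuming Python 3.7+): inside a running coroutine, :func:`get_event_loop` simply returns the loop that is already running, the same one :func:`get_running_loop` reports:

```python
import asyncio

async def main():
    # Inside a coroutine, get_event_loop() returns the already-running loop.
    return asyncio.get_event_loop() is asyncio.get_running_loop()

print(asyncio.run(main()))  # True
```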
Compile and run some source in the interpreter. Arguments are the same as for
:func:`compile_command`; the default for *filename* is ``'<input>'``, and for
- *symbol* is ``'single'``. One several things can happen:
+ *symbol* is ``'single'``. One of several things can happen:
* The input is incorrect; :func:`compile_command` raised an exception
(:exc:`SyntaxError` or :exc:`OverflowError`). A syntax traceback will be
0x1d000000
>>>
-.. note::
-
- :mod:`ctypes` may raise a :exc:`ValueError` after calling the function, if
- it detects that an invalid number of arguments were passed. This behavior
- should not be relied upon. It is deprecated in 3.6.2, and will be removed
- in 3.7.
-
:exc:`ValueError` is raised when you call an ``stdcall`` function with the
``cdecl`` calling convention, or vice versa::
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
By default, Structure and Union fields are aligned in the same way the C
-compiler does it. It is possible to override this behavior be specifying a
+compiler does it. It is possible to override this behavior by specifying a
:attr:`_pack_` class attribute in the subclass definition. This must be set to a
positive integer and specifies the maximum alignment for the fields. This is
what ``#pragma pack(n)`` also does in MSVC.
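A small sketch of the effect (the structure names here are made up for illustration; sizes assume a typical platform where ``int`` is 4 bytes):

```python
import ctypes

class Default(ctypes.Structure):
    _fields_ = [("tag", ctypes.c_char), ("value", ctypes.c_int)]

class Packed(ctypes.Structure):
    _pack_ = 1  # like #pragma pack(1) in MSVC
    _fields_ = [("tag", ctypes.c_char), ("value", ctypes.c_int)]

# The compiler-style layout pads 'tag' so that 'value' is aligned;
# _pack_ = 1 removes that padding.
print(ctypes.sizeof(Default), ctypes.sizeof(Packed))  # typically: 8 5
```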
... ("next", POINTER(cell))]
>>>
-Lets try it. We create two instances of ``cell``, and let them point to each
+Let's try it. We create two instances of ``cell``, and let them point to each
other, and finally follow the pointer chain a few times::
>>> c1 = cell()
>>>
The fact that standard Python has a frozen module and a frozen package
-(indicated by the negative size member) is not well known, it is only used for
-testing. Try it out with ``import __hello__`` for example.
+(indicated by the negative ``size`` member) is not well known, it is only used
+for testing. Try it out with ``import __hello__`` for example.
.. _ctypes-surprises:
.. data:: HAVE_THREADS
- The default value is ``True``. If Python is compiled without threads, the
- C version automatically disables the expensive thread local context
- machinery. In this case, the value is ``False``.
+ The value is ``True``. Deprecated, because Python now always has threads.
+
+.. deprecated:: 3.9
+
+.. data:: HAVE_CONTEXTVAR
+
+   The default value is ``True``. If Python is compiled with ``--without-decimal-contextvar``,
+ the C version uses a thread-local rather than a coroutine-local context and the value
+ is ``False``. This is slightly faster in some nested context scenarios.
+
+.. versionadded:: 3.9 backported to 3.7 and 3.8
+
Rounding modes
--------------
A. Yes. In the CPython and PyPy3 implementations, the C/CFFI versions of
the decimal module integrate the high speed `libmpdec
<https://www.bytereef.org/mpdecimal/doc/libmpdec/index.html>`_ library for
-arbitrary precision correctly-rounded decimal floating point arithmetic.
+arbitrary precision correctly-rounded decimal floating point arithmetic [#]_.
``libmpdec`` uses `Karatsuba multiplication
<https://en.wikipedia.org/wiki/Karatsuba_algorithm>`_
for medium-sized numbers and the `Number Theoretic Transform
<https://en.wikipedia.org/wiki/Discrete_Fourier_transform_(general)#Number-theoretic_transform>`_
-for very large numbers. However, to realize this performance gain, the
-context needs to be set for unrounded calculations.
+for very large numbers.
+
+The context must be adapted for exact arbitrary precision arithmetic. :attr:`Emin`
+and :attr:`Emax` should always be set to the maximum values, :attr:`clamp`
+should always be 0 (the default). Setting :attr:`prec` requires some care.
+
+The easiest approach for trying out bignum arithmetic is to use the maximum
+value for :attr:`prec` as well [#]_::
+
+ >>> setcontext(Context(prec=MAX_PREC, Emax=MAX_EMAX, Emin=MIN_EMIN))
+ >>> x = Decimal(2) ** 256
+ >>> x / 128
+ Decimal('904625697166532776746648320380374280103671755200316906558262375061821325312')
+
+
+For inexact results, :attr:`MAX_PREC` is far too large on 64-bit platforms and
+the available memory will be insufficient::
+
+ >>> Decimal(1) / 3
+ Traceback (most recent call last):
+ File "<stdin>", line 1, in <module>
+ MemoryError
+
+On systems with overallocation (e.g. Linux), a more sophisticated approach is to
+adjust :attr:`prec` to the amount of available RAM. Suppose that you have 8GB of
+RAM and expect 10 simultaneous operands using a maximum of 500MB each::
+
+ >>> import sys
+ >>>
+ >>> # Maximum number of digits for a single operand using 500MB in 8-byte words
+ >>> # with 19 digits per word (4-byte and 9 digits for the 32-bit build):
+ >>> maxdigits = 19 * ((500 * 1024**2) // 8)
+ >>>
+ >>> # Check that this works:
+ >>> c = Context(prec=maxdigits, Emax=MAX_EMAX, Emin=MIN_EMIN)
+ >>> c.traps[Inexact] = True
+ >>> setcontext(c)
+ >>>
+ >>> # Fill the available precision with nines:
+ >>> x = Decimal(0).logical_invert() * 9
+ >>> sys.getsizeof(x)
+ 524288112
+ >>> x + 2
+ Traceback (most recent call last):
+ File "<stdin>", line 1, in <module>
+ decimal.Inexact: [<class 'decimal.Inexact'>]
+
+In general (and especially on systems without overallocation), it is recommended
+to estimate even tighter bounds and set the :attr:`Inexact` trap if all calculations
+are expected to be exact.
+
- >>> c = getcontext()
- >>> c.prec = MAX_PREC
- >>> c.Emax = MAX_EMAX
- >>> c.Emin = MIN_EMIN
+.. [#]
+ .. versionadded:: 3.3
-.. versionadded:: 3.3
\ No newline at end of file
+.. [#]
+ .. versionchanged:: 3.9
+ This approach now works for all exact results except for non-integer powers.
+ Also backported to 3.7 and 3.8.
.. opcode:: LOAD_METHOD (namei)
- Loads a method named ``co_names[namei]`` from TOS object. TOS is popped and
- method and TOS are pushed when interpreter can call unbound method directly.
- TOS will be used as the first argument (``self``) by :opcode:`CALL_METHOD`.
- Otherwise, ``NULL`` and method is pushed (method is bound method or
- something else).
+ Loads a method named ``co_names[namei]`` from the TOS object. TOS is popped.
+ This bytecode distinguishes two cases: if TOS has a method with the correct
+ name, the bytecode pushes the unbound method and TOS. TOS will be used as
+ the first argument (``self``) by :opcode:`CALL_METHOD` when calling the
  unbound method. Otherwise, ``NULL`` and the object returned by the attribute
+ lookup are pushed.
.. versionadded:: 3.7
.. opcode:: CALL_METHOD (argc)
- Calls a method. *argc* is number of positional arguments.
+ Calls a method. *argc* is the number of positional arguments.
Keyword arguments are not supported. This opcode is designed to be used
with :opcode:`LOAD_METHOD`. Positional arguments are on top of the stack.
- Below them, two items described in :opcode:`LOAD_METHOD` on the stack.
- All of them are popped and return value is pushed.
+ Below them, the two items described in :opcode:`LOAD_METHOD` are on the
+ stack (either ``self`` and an unbound method object or ``NULL`` and an
+ arbitrary callable). All of them are popped and the return value is pushed.
.. versionadded:: 3.7
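A quick way to observe this opcode pair is to disassemble a method call (the names below are placeholders; on CPython 3.7–3.11 the output shows ``LOAD_METHOD`` followed by ``CALL_METHOD``, while later versions use a different method-call opcode set):

```python
import dis

# dis.dis() on a source string compiles it without executing it,
# so 'obj' does not need to exist.
dis.dis("obj.method(1, 2)")
```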
script will *not* be installed.
* ``--default-pip``: if a "default pip" installation is requested, the
- ``pip`` script will be installed in addition to the two regular scripts.
+ ``pip`` script will be installed in addition to the two regular scripts.
Providing both of the script selection options will trigger an exception.
.. class:: auto
- Instances are replaced with an appropriate value for Enum members.
+   Instances are replaced with an appropriate value for Enum members. The initial value starts at 1.
.. versionadded:: 3.6 ``Flag``, ``IntFlag``, ``auto``
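A short sketch of ``auto`` in use, counting up from the initial value of 1:

```python
import enum

class Color(enum.Enum):
    RED = enum.auto()    # 1
    GREEN = enum.auto()  # 2
    BLUE = enum.auto()   # 3

print([c.value for c in Color])  # [1, 2, 3]
```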
.. _func-memoryview:
-.. function:: memoryview(obj)
+.. class:: memoryview(obj)
:noindex:
Return a "memory view" object created from the given argument. See
.. _func-range:
-.. function:: range(stop)
+.. class:: range(stop)
range(start, stop[, step])
:noindex:
.. _func-tuple:
-.. function:: tuple([iterable])
+.. class:: tuple([iterable])
:noindex:
Rather than being a function, :class:`tuple` is actually an immutable
.. [#] The default locale directory is system dependent; for example, on RedHat Linux
it is :file:`/usr/share/locale`, but on Solaris it is :file:`/usr/lib/locale`.
The :mod:`gettext` module does not try to support these system dependent
- defaults; instead its default is :file:`{sys.prefix}/share/locale` (see
- :data:`sys.prefix`). For this reason, it is always best to call
+ defaults; instead its default is :file:`{sys.base_prefix}/share/locale` (see
+ :data:`sys.base_prefix`). For this reason, it is always best to call
:func:`bindtextdomain` with an explicit absolute path at the start of your
application.
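A minimal sketch of that advice (the domain name and path are hypothetical); :func:`bindtextdomain` returns the directory now bound to the domain:

```python
import gettext

# Bind a (hypothetical) domain to an explicit absolute path at startup,
# instead of relying on the system-dependent default locale directory.
localedir = gettext.bindtextdomain('myapplication', '/usr/share/locale')
print(localedir)  # /usr/share/locale
```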
Client side ``HTTP PUT`` requests are very similar to ``POST`` requests. The
difference lies only on the server side, where the HTTP server will allow resources
to be created via a ``PUT`` request. It should be noted that custom HTTP methods
-+are also handled in :class:`urllib.request.Request` by sending the appropriate
-+method attribute.Here is an example session that shows how to do ``PUT``
+are also handled in :class:`urllib.request.Request` by setting the appropriate
+method attribute. Here is an example session that shows how to send a ``PUT``
request using http.client::
>>> # This creates an HTTP message
Editor windows also have breakpoint functions. Lines with a breakpoint set are
specially marked. Breakpoints only have an effect when running under the
-debugger. Breakpoints for a file are saved in the user's .idlerc directory.
+debugger. Breakpoints for a file are saved in the user's ``.idlerc``
+directory.
Set Breakpoint
Set a breakpoint on the current line.
completely remove Python and start over.
A zombie pythonw.exe process could be a problem. On Windows, use Task
-Manager to detect and stop one. Sometimes a restart initiated by a program
-crash or Keyboard Interrupt (control-C) may fail to connect. Dismissing
-the error box or Restart Shell on the Shell menu may fix a temporary problem.
+Manager to check for one and stop it if there is. Sometimes a restart
+initiated by a program crash or Keyboard Interrupt (control-C) may fail
+to connect. Dismissing the error box or using Restart Shell on the Shell
+menu may fix a temporary problem.
When IDLE first starts, it attempts to read user configuration files in
-~/.idlerc/ (~ is one's home directory). If there is a problem, an error
+``~/.idlerc/`` (~ is one's home directory). If there is a problem, an error
message should be displayed. Leaving aside random disk glitches, this can
-be prevented by never editing the files by hand, using the configuration
-dialog, under Options, instead Options. Once it happens, the solution may
-be to delete one or more of the configuration files.
+be prevented by never editing the files by hand. Instead, use the
+configuration dialog, under Options. Once there is an error in a user
+configuration file, the best solution may be to delete it and start over
+with the settings dialog.
If IDLE quits with no message, and it was not started from a console, try
-starting from a console (``python -m idlelib)`` and see if a message appears.
+starting it from a console or terminal (``python -m idlelib``) and see if
+this results in an error message.
Running user code
^^^^^^^^^^^^^^^^^
header in the opened box.
Help menu entry "Python Docs" opens the extensive sources of help,
-including tutorials, available at docs.python.org/x.y, where 'x.y'
+including tutorials, available at ``docs.python.org/x.y``, where 'x.y'
is the currently running Python version. If your system
has an off-line copy of the docs (this may be an installation option),
that will be opened instead.
Selected URLs can be added or removed from the help menu at any time using the
-General tab of the Configure IDLE dialog .
+General tab of the Configure IDLE dialog.
.. _preferences:
The font preferences, highlighting, keys, and general preferences can be
changed via Configure IDLE on the Option menu.
-Non-default user settings are saved in a .idlerc directory in the user's
+Non-default user settings are saved in a ``.idlerc`` directory in the user's
home directory. Problems caused by bad user configuration files are solved
-by editing or deleting one or more of the files in .idlerc.
+by editing or deleting one or more of the files in ``.idlerc``.
On the Font tab, see the text sample for the effect of font face and size
on multiple characters in multiple languages. Edit the sample to add
| | | method is bound, or |
| | | ``None`` |
+-----------+-------------------+---------------------------+
+| | __module__ | name of module in which |
+| | | this method was defined |
++-----------+-------------------+---------------------------+
| function | __doc__ | documentation string |
+-----------+-------------------+---------------------------+
| | __name__ | name with which this |
| | | reserved for return |
| | | annotations. |
+-----------+-------------------+---------------------------+
+| | __module__ | name of module in which |
+| | | this function was defined |
++-----------+-------------------+---------------------------+
| traceback | tb_frame | frame object at this |
| | | level |
+-----------+-------------------+---------------------------+
.. method:: successful()
Return whether the call completed without raising an exception. Will
- raise :exc:`AssertionError` if the result is not ready.
+ raise :exc:`ValueError` if the result is not ready.
The following example demonstrates the use of a pool::
Here are two small examples of how it can be used. To list some statistics
about a newsgroup and print the subjects of the last 10 articles::
- >>> s = nntplib.NNTP('news.gmane.org')
+ >>> s = nntplib.NNTP('news.gmane.io')
>>> resp, count, first, last, name = s.group('gmane.comp.python.committers')
>>> print('Group', name, 'has', count, 'articles, range', first, 'to', last)
Group gmane.comp.python.committers has 1096 articles, range 1 to 1096
To post an article from a binary file (this assumes that the article has valid
headers, and that you have right to post on the particular newsgroup)::
- >>> s = nntplib.NNTP('news.gmane.org')
+ >>> s = nntplib.NNTP('news.gmane.io')
>>> f = open('article.txt', 'rb')
>>> s.post(f)
'240 Article posted successfully.'
connection when done, e.g.:
>>> from nntplib import NNTP
- >>> with NNTP('news.gmane.org') as n:
+ >>> with NNTP('news.gmane.io') as n:
... n.group('gmane.comp.python.committers')
... # doctest: +SKIP
('211 1755 1 1755 gmane.comp.python.committers', 1755, 1, 1755, 'gmane.comp.python.committers')
of values. On legacy servers which don't understand the ``CAPABILITIES``
command, an empty dictionary is returned instead.
- >>> s = NNTP('news.gmane.org')
+ >>> s = NNTP('news.gmane.io')
>>> 'POST' in s.getcapabilities()
True
calls to :func:`unsetenv` don't update ``os.environ``, so it is actually
preferable to delete items of ``os.environ``.
- .. availability:: most flavors of Unix, Windows.
+ .. availability:: most flavors of Unix.
.. _os-newstreams:
See the Unix manual page
:manpage:`times(2)` and :manpage:`times(3)` manual page on Unix or `the GetProcessTimes MSDN
- <https://docs.microsoft.com/windows/win32/api/processthreadsapi/nf-processthreadsapi-getprocesstimes>`
- _ on Windows.
- On Windows, only :attr:`user` and :attr:`system` are known; the other
- attributes are zero.
+ <https://docs.microsoft.com/windows/win32/api/processthreadsapi/nf-processthreadsapi-getprocesstimes>`_
+ on Windows. On Windows, only :attr:`user` and :attr:`system` are known; the other attributes are zero.
.. availability:: Unix, Windows.
.. method:: Path.stat()
- Return information about this path (similarly to :func:`os.stat`).
+      Return an :class:`os.stat_result` object containing information about this path, like :func:`os.stat`.
The result is looked up at each call to this method.
::
.. function:: system()
- Returns the system/OS name, e.g. ``'Linux'``, ``'Windows'``, or ``'Java'``. An
- empty string is returned if the value cannot be determined.
+ Returns the system/OS name, such as ``'Linux'``, ``'Darwin'``, ``'Java'``,
+ ``'Windows'``. An empty string is returned if the value cannot be determined.
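For example (the output naturally depends on the machine this runs on):

```python
import platform

name = platform.system()  # e.g. 'Linux', 'Darwin', or 'Windows'
print(name)
```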
.. function:: system_alias(system, release, version)
using :program:`gcc`.
The file is read and scanned in chunks of *chunksize* bytes.
-
-:mod:`pyclbr` --- Python class browser support
-==============================================
+:mod:`pyclbr` --- Python module browser support
+===============================================
.. module:: pyclbr
- :synopsis: Supports information extraction for a Python class browser.
+ :synopsis: Supports information extraction for a Python module browser.
.. sectionauthor:: Fred L. Drake, Jr. <fdrake@acm.org>
*path* is a sequence of directory paths prepended to ``sys.path``,
which is used to locate the module source code.
+   This function is the original interface and is only kept for backward
+   compatibility. It returns a filtered version of the following.
+
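A quick sketch of the filtered interface, using the standard :mod:`queue` module as input:

```python
import pyclbr

# readmodule() returns only the top-level classes of the module,
# mapped by name to their descriptors.
classes = pyclbr.readmodule('queue')
print('Queue' in classes)  # True
```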
.. function:: readmodule_ex(module, path=None)
.. versionadded:: 3.2
+.. _site-commandline:
+
+Command Line Interface
+----------------------
+
+.. program:: site
+
The :mod:`site` module also provides a way to get the user directories from the
command line:
$ python3 -m site --user-site
/home/user/.local/lib/python3.3/site-packages
-.. program:: site
-
If it is called without arguments, it will print the contents of
:data:`sys.path` on the standard output, followed by the value of
:data:`USER_BASE` and whether the directory exists, then the same thing for
When :const:`SOCK_NONBLOCK` or :const:`SOCK_CLOEXEC`
bit flags are applied to *type* they are cleared, and
:attr:`socket.type` will not reflect them. They are still passed
- to the underlying system `socket()` call. Therefore::
+ to the underlying system `socket()` call. Therefore,
+
+ ::
sock = socket.socket(
socket.AF_INET,
Python fully supports mixed arithmetic: when a binary arithmetic operator has
operands of different numeric types, the operand with the "narrower" type is
widened to that of the other, where integer is narrower than floating point,
-which is narrower than complex. Comparisons between numbers of mixed type use
-the same rule. [2]_ The constructors :func:`int`, :func:`float`, and
+which is narrower than complex. A comparison between numbers of different types
+behaves as though the exact values of those numbers were being compared. [2]_
+
+The constructors :func:`int`, :func:`float`, and
:func:`complex` can be used to produce numbers of a specific type.
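For instance, an :class:`int` and a :class:`float` compare by exact mathematical value, not by first converting the int to a float:

```python
# 1e16 is exactly 10**16 as a float, so this comparison is exact:
print(10**16 == 1e16)      # True
# float(10**16 + 1) would round back down to 1e16, but the comparison
# still uses the exact integer value:
print(10**16 + 1 == 1e16)  # False
print(10**16 + 1 > 1e16)   # True
```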
All numeric types (except complex) support the following operations (for priorities of
Most built-in types implement the following options for format specifications,
although some of the formatting options are only supported by the numeric types.
-A general convention is that an empty format string (``""``) produces
+A general convention is that an empty format specification produces
the same result as if you had called :func:`str` on the value. A
-non-empty format string typically modifies the result.
+non-empty format specification typically modifies the result.
The general form of a *standard format specifier* is:
.. versionchanged:: 3.6
Added the ``'_'`` option (see also :pep:`515`).
-*width* is a decimal integer defining the minimum field width. If not
-specified, then the field width will be determined by the content.
+*width* is a decimal integer defining the minimum total field width,
+including any prefixes, separators, and other formatting characters.
+If not specified, then the field width will be determined by the content.
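That the minimum width counts prefixes and separators, not just the digits, can be seen here:

```python
n = 255

# Width 10 includes the '0x' prefix produced by the '#' alternate form.
hex_form = f"{n:#010x}"
print(hex_form)          # 0x000000ff -- 10 characters total

# Grouping separators also count toward the field width.
grouped = f"{1234567:>12,}"
print(repr(grouped))     # '   1,234,567' -- 12 characters total
```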
When no explicit alignment is given, preceding the *width* field by a zero
(``'0'``) character enables
arguments for additional differences from the default behavior. Unless
otherwise stated, it is recommended to pass *args* as a sequence.
+ An example of passing some arguments to an external program
+ as a sequence is::
+
+ Popen(["/usr/bin/git", "commit", "-m", "Fixes a bug."])
+
On POSIX, if *args* is a string, the string is interpreted as the name or
path of the program to execute. However, this can only be done if not
passing arguments to the program.
.. note::
- :meth:`shlex.split` can be useful when determining the correct
- tokenization for *args*, especially in complex cases::
+ It may not be obvious how to break a shell command into a sequence of arguments,
+ especially in complex cases. :meth:`shlex.split` can illustrate how to
+ determine the correct tokenization for *args*::
>>> import shlex, subprocess
>>> command_line = input()
+---------------------+----------------+--------------------------------------------------+
| attribute | float.h macro | explanation |
+=====================+================+==================================================+
- | :const:`epsilon` | DBL_EPSILON | difference between 1 and the least value greater |
- | | | than 1 that is representable as a float |
+ | :const:`epsilon` | DBL_EPSILON | difference between 1.0 and the least value |
+ | | | greater than 1.0 that is representable as a float|
+---------------------+----------------+--------------------------------------------------+
| :const:`dig` | DBL_DIG | maximum number of decimal digits that can be |
| | | faithfully represented in a float; see below |
| :const:`mant_dig` | DBL_MANT_DIG | float precision: the number of base-``radix`` |
| | | digits in the significand of a float |
+---------------------+----------------+--------------------------------------------------+
- | :const:`max` | DBL_MAX | maximum representable finite float |
+ | :const:`max` | DBL_MAX | maximum representable positive finite float |
+---------------------+----------------+--------------------------------------------------+
- | :const:`max_exp` | DBL_MAX_EXP | maximum integer e such that ``radix**(e-1)`` is |
+ | :const:`max_exp` | DBL_MAX_EXP | maximum integer *e* such that ``radix**(e-1)`` is|
| | | a representable finite float |
+---------------------+----------------+--------------------------------------------------+
- | :const:`max_10_exp` | DBL_MAX_10_EXP | maximum integer e such that ``10**e`` is in the |
+ | :const:`max_10_exp` | DBL_MAX_10_EXP | maximum integer *e* such that ``10**e`` is in the|
| | | range of representable finite floats |
+---------------------+----------------+--------------------------------------------------+
- | :const:`min` | DBL_MIN | minimum positive normalized float |
+ | :const:`min` | DBL_MIN | minimum representable positive *normalized* float|
+---------------------+----------------+--------------------------------------------------+
- | :const:`min_exp` | DBL_MIN_EXP | minimum integer e such that ``radix**(e-1)`` is |
+ | :const:`min_exp` | DBL_MIN_EXP | minimum integer *e* such that ``radix**(e-1)`` is|
| | | a normalized float |
+---------------------+----------------+--------------------------------------------------+
- | :const:`min_10_exp` | DBL_MIN_10_EXP | minimum integer e such that ``10**e`` is a |
+ | :const:`min_10_exp` | DBL_MIN_10_EXP | minimum integer *e* such that ``10**e`` is a |
| | | normalized float |
+---------------------+----------------+--------------------------------------------------+
| :const:`radix` | FLT_RADIX | radix of exponent representation |
There is no return value.
.. method:: locked()
+
Return true if the lock is acquired.
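A minimal illustration with :class:`threading.Lock`:

```python
import threading

lock = threading.Lock()
before = lock.locked()   # False: nobody holds the lock yet

lock.acquire()
during = lock.locked()   # True: the lock is now held

lock.release()
after = lock.locked()    # False again after release
print(before, during, after)
```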
4,10-4,11: RPAR ')'
4,11-4,12: NEWLINE '\n'
5,0-5,0: ENDMARKER ''
+
+Example of tokenizing a file programmatically, reading Unicode
+strings instead of bytes with :func:`generate_tokens`::
+
+ import tokenize
+
+ with tokenize.open('hello.py') as f:
+ tokens = tokenize.generate_tokens(f.readline)
+ for token in tokens:
+ print(token)
+
+Or reading bytes directly with :func:`.tokenize`::
+
+ import tokenize
+
+ with open('hello.py', 'rb') as f:
+ tokens = tokenize.tokenize(f.readline)
+ for token in tokens:
+ print(token)
Fill the shape drawn after the last call to :func:`begin_fill`.
+ Whether or not overlap regions for self-intersecting polygons
+ or multiple shapes are filled depends on the operating system graphics,
+ type of overlap, and number of overlaps. For example, the Turtle star
+ above may be either all yellow or have some white regions.
+
.. doctest::
>>> turtle.color("black", "red")
types.
-.. exception:: ReferenceError
-
- Exception raised when a proxy object is used but the underlying object has been
- collected. This is the same as the standard :exc:`ReferenceError` exception.
-
-
.. seealso::
:pep:`205` - Weak References
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python |release| alone or in any derivative
version, provided, however, that PSF's License Agreement and PSF's notice of
- copyright, i.e., "Copyright © 2001-2019 Python Software Foundation; All Rights
+ copyright, i.e., "Copyright © 2001-2020 Python Software Foundation; All Rights
Reserved" are retained in Python |release| alone or in any derivative version
prepared by Licensee.
)\r
:skiphhcsearch\r
\r
-if "%DISTVERSION%" EQU "" for /f "usebackq" %%v in (`%PYTHON% tools/extensions/patchlevel.py`) do set DISTVERSION=%%v\r
+if not defined DISTVERSION for /f "usebackq" %%v in (`%PYTHON% tools/extensions/patchlevel.py`) do set DISTVERSION=%%v\r
\r
-if "%BUILDDIR%" EQU "" set BUILDDIR=build\r
+if not defined BUILDDIR set BUILDDIR=build\r
\r
rem Targets that don't require sphinx-build\r
if "%1" EQU "" goto help\r
)\r
)\r
\r
-if NOT "%PAPER%" == "" (\r
+if defined PAPER (\r
set SPHINXOPTS=-D latex_elements.papersize=%PAPER% %SPHINXOPTS%\r
)\r
if "%1" EQU "htmlhelp" (\r
compiled; :attr:`co_firstlineno` is the first line number of the function;
:attr:`co_lnotab` is a string encoding the mapping from bytecode offsets to
line numbers (for details see the source code of the interpreter);
- :attr:`co_stacksize` is the required stack size (including local variables);
- :attr:`co_flags` is an integer encoding a number of flags for the interpreter.
+ :attr:`co_stacksize` is the required stack size; :attr:`co_flags` is an
+ integer encoding a number of flags for the interpreter.
.. index:: object: generator
Once the appropriate metaclass has been identified, then the class namespace
is prepared. If the metaclass has a ``__prepare__`` attribute, it is called
as ``namespace = metaclass.__prepare__(name, bases, **kwds)`` (where the
-additional keyword arguments, if any, come from the class definition).
+additional keyword arguments, if any, come from the class definition). The
+``__prepare__`` method should be implemented as a :func:`classmethod`. The
+namespace returned by ``__prepare__`` is passed in to ``__new__``, but when
+the final class object is created the namespace is copied into a new ``dict``.
If the metaclass has no ``__prepare__`` attribute, then the class namespace
is initialised as an empty ordered mapping.
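The sequence above can be sketched as follows; ``Meta`` and ``Widget`` are illustrative names, not part of any protocol:

```python
class Meta(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        # Called first; the returned mapping becomes the namespace in
        # which the class body is executed.
        return {"created_by": "Meta.__prepare__"}

    def __new__(mcls, name, bases, namespace, **kwds):
        # Receives the populated namespace; the final class object gets
        # a fresh dict copied from it.
        return super().__new__(mcls, name, bases, namespace)

class Widget(metaclass=Meta):
    pass

print(Widget.created_by)   # entry pre-seeded by __prepare__
```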
object.__rfloordiv__(self, other)
object.__rmod__(self, other)
object.__rdivmod__(self, other)
- object.__rpow__(self, other)
+ object.__rpow__(self, other[, modulo])
object.__rlshift__(self, other)
object.__rrshift__(self, other)
object.__rand__(self, other)
This spec will always have "loader" set (with one exception).
To indicate to the import machinery that the spec represents a namespace
-:term:`portion`. the path entry finder sets "loader" on the spec to
+:term:`portion`, the path entry finder sets "loader" on the spec to
``None`` and "submodule_search_locations" to a list containing the
portion.
cannot contain comments. Each expression is evaluated in the context
where the formatted string literal appears, in order from left to right.
+.. versionchanged:: 3.7
+ Prior to Python 3.7, an :keyword:`await` expression and comprehensions
+ containing an :keyword:`async for` clause were illegal in the expressions
+ in formatted string literals due to a problem with the implementation.
+
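On Python 3.7 and later the following is valid (on 3.6 the :keyword:`await` inside the f-string raised :exc:`SyntaxError`):

```python
import asyncio

async def fetch_value():
    return 42

async def main():
    # An await expression inside an f-string expression: legal since 3.7.
    return f"value={await fetch_value()}"

result = asyncio.run(main())
print(result)   # value=42
```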
If a conversion is specified, the result of evaluating the expression
is converted before formatting. Conversion ``'!s'`` calls :func:`str` on
the result, ``'!r'`` calls :func:`repr`, and ``'!a'`` calls :func:`ascii`.
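The three conversions side by side:

```python
name = "café"
s_form = f"{name!s}"   # str(name)   -> café
r_form = f"{name!r}"   # repr(name)  -> 'café'
a_form = f"{name!a}"   # ascii(name) -> 'caf\xe9'
print(s_form, r_form, a_form)
```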
<li><a href="https://docs.python.org/3.7/">{% trans %}Python 3.7 (stable){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.6/">{% trans %}Python 3.6 (security-fixes){% endtrans %}</a></li>
<li><a href="https://docs.python.org/3.5/">{% trans %}Python 3.5 (security-fixes){% endtrans %}</a></li>
- <li><a href="https://docs.python.org/2.7/">{% trans %}Python 2.7 (stable){% endtrans %}</a></li>
+ <li><a href="https://docs.python.org/2.7/">{% trans %}Python 2.7 (EOL){% endtrans %}</a></li>
<li><a href="https://www.python.org/doc/versions/">{% trans %}All versions{% endtrans %}</a></li>
</ul>
time, so don't rely on dynamic name resolution! (In fact, local variables are
already determined statically.)
-A special quirk of Python is that -- if no :keyword:`global` statement is in
-effect -- assignments to names always go into the innermost scope. Assignments
-do not copy data --- they just bind names to objects. The same is true for
-deletions: the statement ``del x`` removes the binding of ``x`` from the
+A special quirk of Python is that -- if no :keyword:`global` or :keyword:`nonlocal`
+statement is in effect -- assignments to names always go into the innermost scope.
+Assignments do not copy data --- they just bind names to objects. The same is true
+for deletions: the statement ``del x`` removes the binding of ``x`` from the
namespace referenced by the local scope. In fact, all operations that introduce
new names use the local scope: in particular, :keyword:`import` statements and
function definitions bind the module or function name in the local scope.
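The innermost-scope rule, and opting out of it with :keyword:`global`, in a short sketch:

```python
x = "outer"

def rebind_locally():
    x = "inner"      # binds a new name in the innermost scope
    return x

def rebind_globally():
    global x         # opt out of the innermost-scope rule
    x = "rebound"
    return x

first = rebind_locally()    # the module-level x is untouched
second = rebind_globally()  # the module-level x is rebound
print(first, second, x)
```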
File "<stdin>", line 2, in <module>
KeyboardInterrupt
-If a :keyword:`finally` clause is present, the :keyword:`finally` clause will execute as the last task before the :keyword:`try` statement completes. The :keyword:`finally` clause runs whether or not the :keyword:`try` statement produces an exception. The following points discuss more complex cases when an exception occurs:
-
-* If an exception occurs during execution of the :keyword:`!try` clause, the exception may be handled by an :keyword:`except` clause. If the exception is not handled by an :keyword:`except` clause, the exception is re-raised after the :keyword:`!finally` clause has been executed.
-
-* An exception could occur during execution of an :keyword:`!except` or :keyword:`!else` clause. Again, the exception is re-raised after the :keyword:`!finally` clause has been executed.
-
-* If the :keyword:`!try` statement reaches a :keyword:`break`, :keyword:`continue` or :keyword:`return` statement, the :keyword:`finally` clause will execute just prior to the :keyword:`break`, :keyword:`continue` or :keyword:`return` statement's execution.
-
-* If a :keyword:`finally` clause includes a :keyword:`return` statement, the :keyword:`finally` clause's :keyword:`return` statement will execute before, and instead of, the :keyword:`return` statement in a :keyword:`try` clause.
+If a :keyword:`finally` clause is present, the :keyword:`!finally`
+clause will execute as the last task before the :keyword:`try`
+statement completes. The :keyword:`!finally` clause runs whether or
+not the :keyword:`!try` statement produces an exception. The following
+points discuss more complex cases when an exception occurs:
+
+* If an exception occurs during execution of the :keyword:`!try`
+ clause, the exception may be handled by an :keyword:`except`
+ clause. If the exception is not handled by an :keyword:`!except`
+ clause, the exception is re-raised after the :keyword:`!finally`
+ clause has been executed.
+
+* An exception could occur during execution of an :keyword:`!except`
+ or :keyword:`!else` clause. Again, the exception is re-raised after
+ the :keyword:`!finally` clause has been executed.
+
+* If the :keyword:`!try` statement reaches a :keyword:`break`,
+ :keyword:`continue` or :keyword:`return` statement, the
+ :keyword:`!finally` clause will execute just prior to the
+ :keyword:`!break`, :keyword:`!continue` or :keyword:`!return`
+ statement's execution.
+
+* If a :keyword:`!finally` clause includes a :keyword:`!return`
+ statement, the returned value will be the one from the
+ :keyword:`!finally` clause's :keyword:`!return` statement, not the
+ value from the :keyword:`!try` clause's :keyword:`!return`
+ statement.
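The last point, a :keyword:`return` in a :keyword:`finally` clause replacing the :keyword:`try` clause's value, is easy to trip over; a minimal demonstration:

```python
def which_return():
    try:
        return "from try"
    finally:
        # This return runs last and replaces the value from the try clause.
        return "from finally"

print(which_return())   # from finally
```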
For example::
Many standard library modules contain code that is invoked on their execution
as a script. An example is the :mod:`timeit` module::
- python -mtimeit -s 'setup here' 'benchmarked code here'
- python -mtimeit -h # for details
+ python -m timeit -s 'setup here' 'benchmarked code here'
+ python -m timeit -h # for details
.. seealso::
:func:`runpy.run_module`
Also available as the :option:`-X` ``utf8`` option.
- .. availability:: \*nix.
-
.. versionadded:: 3.7
See :pep:`540` for more details.
C:\WINDOWS\system32;C:\WINDOWS;C:\Program Files\Python 3.7
+.. _win-utf8-mode:
+
+UTF-8 mode
+==========
+
+.. versionadded:: 3.7
+
+Windows still uses legacy encodings for the system encoding (the ANSI Code
+Page). Python uses the ANSI code page as the default encoding for text
+files, as reported by :func:`locale.getpreferredencoding`.
+
+This may cause issues because UTF-8 is widely used on the internet
+and on most Unix systems, including WSL (Windows Subsystem for Linux).
+
+You can use UTF-8 mode to change the default text encoding to UTF-8.
+You can enable UTF-8 mode via the ``-X utf8`` command line option, or
+the ``PYTHONUTF8=1`` environment variable. See :envvar:`PYTHONUTF8` for
+enabling UTF-8 mode, and :ref:`setting-envvars` for how to modify
+environment variables.
+
+When UTF-8 mode is enabled:
+
+* :func:`locale.getpreferredencoding` returns ``'UTF-8'`` instead of
+ the system encoding. This function is used for the default text
+ encoding in many places, including :func:`open`, :class:`Popen`,
+ :meth:`Path.read_text`, etc.
+* :data:`sys.stdin`, :data:`sys.stdout`, and :data:`sys.stderr`
+ all use UTF-8 as their text encoding.
+* You can still use the system encoding via the "mbcs" codec.
+
+Note that adding ``PYTHONUTF8=1`` to the default environment variables
+will affect all Python 3.7+ applications on your system.
+If you have any Python 3.7+ applications which rely on the legacy
+system encoding, it is recommended to set the environment variable
+temporarily or use the ``-X utf8`` command line option.
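One way to observe the effect is a sketch like the following, which spawns the current interpreter in UTF-8 mode and asks for its preferred text encoding (the exact capitalization of the reported name varies across versions):

```python
import subprocess
import sys

# Start a child interpreter with -X utf8 and print the encoding that
# open(), Popen(), etc. would use by default in that process.
out = subprocess.run(
    [sys.executable, "-X", "utf8", "-c",
     "import locale; print(locale.getpreferredencoding(False))"],
    stdout=subprocess.PIPE, text=True, check=True)
print(out.stdout.strip())   # a UTF-8 codec name
```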
+
+.. note::
+ Even when UTF-8 mode is disabled, Python uses UTF-8 by default
+ on Windows for:
+
+ * Console I/O including standard I/O (see :pep:`528` for details).
+ * The filesystem encoding (see :pep:`529` for details).
+
+
.. _launcher:
Python Launcher for Windows
Other Language Changes
======================
+* An :keyword:`await` expression and comprehensions containing an
+ :keyword:`async for` clause were illegal in the expressions in
+ :ref:`formatted string literals <f-strings>` due to a problem with the
+ implementation. In Python 3.7 this restriction was lifted.
+
* More than 255 arguments can now be passed to a function, and a function can
now have more than 255 parameters. (Contributed by Serhiy Storchaka in
:issue:`12844` and :issue:`18896`.)
/*--start constants--*/
#define PY_MAJOR_VERSION 3
#define PY_MINOR_VERSION 7
-#define PY_MICRO_VERSION 6
+#define PY_MICRO_VERSION 7
#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL
#define PY_RELEASE_SERIAL 0
/* Version as a string */
-#define PY_VERSION "3.7.6"
+#define PY_VERSION "3.7.7"
/*--end constants--*/
/* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2.
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
-2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation;
+2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.
# Limits for the C version for compatibility
'MAX_PREC', 'MAX_EMAX', 'MIN_EMIN', 'MIN_ETINY',
- # C version: compile time choice that enables the thread local context
- 'HAVE_THREADS'
+ # C version: compile time choice that enables the thread local context (deprecated, now always true)
+ 'HAVE_THREADS',
+
+ # C version: compile time choice that enables the coroutine local context
+ 'HAVE_CONTEXTVAR'
]
__xname__ = __name__ # sys.modules lookup (--without-threads)
# Compatibility with the C version
HAVE_THREADS = True
+HAVE_CONTEXTVAR = True
if sys.maxsize == 2**63-1:
MAX_PREC = 999999999999999999
MAX_EMAX = 999999999999999999
'asyncgen': agen
})
- def run_forever(self):
- """Run until stop() is called."""
- self._check_closed()
+ def _check_running(self):
if self.is_running():
raise RuntimeError('This event loop is already running')
if events._get_running_loop() is not None:
raise RuntimeError(
'Cannot run the event loop while another loop is running')
+
+ def run_forever(self):
+ """Run until stop() is called."""
+ self._check_closed()
+ self._check_running()
self._set_coroutine_origin_tracking(self._debug)
self._thread_id = threading.get_ident()
Return the Future's result, or raise its exception.
"""
self._check_closed()
+ self._check_running()
new_task = not futures.isfuture(future)
future = tasks.ensure_future(future, loop=self)
altchars = _bytes_from_decode_data(altchars)
assert len(altchars) == 2, repr(altchars)
s = s.translate(bytes.maketrans(altchars, b'+/'))
- if validate and not re.match(b'^[A-Za-z0-9+/]*={0,2}$', s):
+ if validate and not re.fullmatch(b'[A-Za-z0-9+/]*={0,2}', s):
raise binascii.Error('Non-base64 digit found')
return binascii.a2b_base64(s)
s += frame.f_code.co_name
else:
s += "<lambda>"
- if '__args__' in frame.f_locals:
- args = frame.f_locals['__args__']
- else:
- args = None
- if args:
- s += reprlib.repr(args)
- else:
- s += '()'
+ s += '()'
if '__return__' in frame.f_locals:
rv = frame.f_locals['__return__']
s += '->'
Arguments are as for compile_command().
- One several things can happen:
+ One of several things can happen:
1) The input is incorrect; compile_command() raised an
exception (SyntaxError or OverflowError). A syntax traceback
file = builtins.open(filename, mode, buffering)
if encoding is None:
return file
- info = lookup(encoding)
- srw = StreamReaderWriter(file, info.streamreader, info.streamwriter, errors)
- # Add attributes to simplify introspection
- srw.encoding = encoding
- return srw
+
+ try:
+ info = lookup(encoding)
+ srw = StreamReaderWriter(file, info.streamreader, info.streamwriter, errors)
+ # Add attributes to simplify introspection
+ srw.encoding = encoding
+ return srw
+ except:
+ file.close()
+ raise
def EncodedFile(file, data_encoding, file_encoding=None, errors='strict'):
else:
dfile = None
if not os.path.isdir(fullname):
- yield fullname
+ yield fullname, ddir
elif (maxlevels > 0 and name != os.curdir and name != os.pardir and
os.path.isdir(fullname) and not os.path.islink(fullname)):
yield from _walk_dir(fullname, ddir=dfile,
from concurrent.futures import ProcessPoolExecutor
except ImportError:
workers = 1
- files = _walk_dir(dir, quiet=quiet, maxlevels=maxlevels,
- ddir=ddir)
+ files_and_ddirs = _walk_dir(dir, quiet=quiet, maxlevels=maxlevels,
+ ddir=ddir)
success = True
if workers is not None and workers != 1 and ProcessPoolExecutor is not None:
workers = workers or None
with ProcessPoolExecutor(max_workers=workers) as executor:
- results = executor.map(partial(compile_file,
- ddir=ddir, force=force,
- rx=rx, quiet=quiet,
- legacy=legacy,
- optimize=optimize,
- invalidation_mode=invalidation_mode),
- files)
+ results = executor.map(
+ partial(_compile_file_tuple,
+ force=force, rx=rx, quiet=quiet,
+ legacy=legacy, optimize=optimize,
+ invalidation_mode=invalidation_mode,
+ ),
+ files_and_ddirs)
success = min(results, default=True)
else:
- for file in files:
- if not compile_file(file, ddir, force, rx, quiet,
+ for file, dfile in files_and_ddirs:
+ if not compile_file(file, dfile, force, rx, quiet,
legacy, optimize, invalidation_mode):
success = False
return success
+def _compile_file_tuple(file_and_dfile, **kwargs):
+ """Needs to be toplevel for ProcessPoolExecutor."""
+ file, dfile = file_and_dfile
+ return compile_file(file, dfile, **kwargs)
+
def compile_file(fullname, ddir=None, force=False, rx=None, quiet=0,
legacy=False, optimize=-1,
invalidation_mode=None):
def _copy_immutable(x):
return x
for t in (type(None), int, float, bool, complex, str, tuple,
- bytes, frozenset, type, range, slice,
+ bytes, frozenset, type, range, slice, property,
types.BuiltinFunctionType, type(Ellipsis), type(NotImplemented),
types.FunctionType, weakref.ref):
d[t] = _copy_immutable
d[types.BuiltinFunctionType] = _deepcopy_atomic
d[types.FunctionType] = _deepcopy_atomic
d[weakref.ref] = _deepcopy_atomic
+d[property] = _deepcopy_atomic
def _deepcopy_list(x, memo, deepcopy=deepcopy):
y = []
self.assertEqual(f2, [0x4567, 0x0123, 0xcdef, 0x89ab,
0x3210, 0x7654, 0xba98, 0xfedc])
+ @unittest.skipIf(True, 'Test disabled for now - see bpo-16575/bpo-16576')
def test_union_by_value(self):
# See bpo-16575
self.assertEqual(test5.nested.an_int, 0)
self.assertEqual(test5.another_int, 0)
- #@unittest.skipIf('s390' in MACHINE, 'Test causes segfault on S390')
+ @unittest.skipIf(True, 'Test disabled for now - see bpo-16575/bpo-16576')
def test_bitfield_by_value(self):
# See bpo-16576
return None, None
def _find_vcvarsall(plat_spec):
+ # bpo-38597: Removed vcruntime return value
_, best_dir = _find_vc2017()
- vcruntime = None
- vcruntime_plat = 'x64' if 'amd64' in plat_spec else 'x86'
- if best_dir:
- vcredist = os.path.join(best_dir, "..", "..", "redist", "MSVC", "**",
- vcruntime_plat, "Microsoft.VC14*.CRT", "vcruntime140.dll")
- try:
- import glob
- vcruntime = glob.glob(vcredist, recursive=True)[-1]
- except (ImportError, OSError, LookupError):
- vcruntime = None
if not best_dir:
best_version, best_dir = _find_vc2015()
- if best_version:
- vcruntime = os.path.join(best_dir, 'redist', vcruntime_plat,
- "Microsoft.VC140.CRT", "vcruntime140.dll")
if not best_dir:
log.debug("No suitable Visual C++ version found")
log.debug("%s cannot be found", vcvarsall)
return None, None
- if not vcruntime or not os.path.isfile(vcruntime):
- log.debug("%s cannot be found", vcruntime)
- vcruntime = None
-
- return vcvarsall, vcruntime
+ return vcvarsall, None
def _get_vc_env(plat_spec):
if os.getenv("DISTUTILS_USE_SDK"):
for key, value in os.environ.items()
}
- vcvarsall, vcruntime = _find_vcvarsall(plat_spec)
+ vcvarsall, _ = _find_vcvarsall(plat_spec)
if not vcvarsall:
raise DistutilsPlatformError("Unable to find vcvarsall.bat")
if key and value
}
- if vcruntime:
- env['py_vcruntime_redist'] = vcruntime
return env
def _find_exe(exe, paths=None):
'win-amd64' : 'x86_amd64',
}
-# A set containing the DLLs that are guaranteed to be available for
-# all micro versions of this Python version. Known extension
-# dependencies that are not in this set will be copied to the output
-# path.
-_BUNDLED_DLLS = frozenset(['vcruntime140.dll'])
-
class MSVCCompiler(CCompiler) :
"""Concrete class that implements an interface to Microsoft Visual C++,
as defined by the CCompiler abstract class."""
self.rc = _find_exe("rc.exe", paths) # resource compiler
self.mc = _find_exe("mc.exe", paths) # message compiler
self.mt = _find_exe("mt.exe", paths) # message compiler
- self._vcruntime_redist = vc_env.get('py_vcruntime_redist', '')
for dir in vc_env.get('include', '').split(os.pathsep):
if dir:
self.add_library_dir(dir.rstrip(os.sep))
self.preprocess_options = None
- # If vcruntime_redist is available, link against it dynamically. Otherwise,
- # use /MT[d] to build statically, then switch from libucrt[d].lib to ucrt[d].lib
- # later to dynamically link to ucrtbase but not vcruntime.
+ # bpo-38597: Always compile with dynamic linking
+ # Future releases of Python 3.x will include all past
+ # versions of vcruntime*.dll for compatibility.
self.compile_options = [
- '/nologo', '/Ox', '/W3', '/GL', '/DNDEBUG'
+ '/nologo', '/Ox', '/W3', '/GL', '/DNDEBUG', '/MD'
]
- self.compile_options.append('/MD' if self._vcruntime_redist else '/MT')
self.compile_options_debug = [
'/nologo', '/Od', '/MDd', '/Zi', '/W3', '/D_DEBUG'
ldflags = [
'/nologo', '/INCREMENTAL:NO', '/LTCG'
]
- if not self._vcruntime_redist:
- ldflags.extend(('/nodefaultlib:libucrt.lib', 'ucrt.lib'))
ldflags_debug = [
'/nologo', '/INCREMENTAL:NO', '/LTCG', '/DEBUG:FULL'
try:
log.debug('Executing "%s" %s', self.linker, ' '.join(ld_args))
self.spawn([self.linker] + ld_args)
- self._copy_vcruntime(output_dir)
except DistutilsExecError as msg:
raise LinkError(msg)
else:
log.debug("skipping %s (up-to-date)", output_filename)
- def _copy_vcruntime(self, output_dir):
- vcruntime = self._vcruntime_redist
- if not vcruntime or not os.path.isfile(vcruntime):
- return
-
- if os.path.basename(vcruntime).lower() in _BUNDLED_DLLS:
- return
-
- log.debug('Copying "%s"', vcruntime)
- vcruntime = shutil.copy(vcruntime, output_dir)
- os.chmod(vcruntime, stat.S_IWRITE)
-
def spawn(self, cmd):
old_path = os.getenv('path')
try:
finally:
_msvccompiler._find_vcvarsall = old_find_vcvarsall
- def test_compiler_options(self):
- import distutils._msvccompiler as _msvccompiler
- # suppress path to vcruntime from _find_vcvarsall to
- # check that /MT is added to compile options
- old_find_vcvarsall = _msvccompiler._find_vcvarsall
- def _find_vcvarsall(plat_spec):
- return old_find_vcvarsall(plat_spec)[0], None
- _msvccompiler._find_vcvarsall = _find_vcvarsall
- try:
- compiler = _msvccompiler.MSVCCompiler()
- compiler.initialize()
-
- self.assertIn('/MT', compiler.compile_options)
- self.assertNotIn('/MD', compiler.compile_options)
- finally:
- _msvccompiler._find_vcvarsall = old_find_vcvarsall
-
- def test_vcruntime_copy(self):
- import distutils._msvccompiler as _msvccompiler
- # force path to a known file - it doesn't matter
- # what we copy as long as its name is not in
- # _msvccompiler._BUNDLED_DLLS
- old_find_vcvarsall = _msvccompiler._find_vcvarsall
- def _find_vcvarsall(plat_spec):
- return old_find_vcvarsall(plat_spec)[0], __file__
- _msvccompiler._find_vcvarsall = _find_vcvarsall
- try:
- tempdir = self.mkdtemp()
- compiler = _msvccompiler.MSVCCompiler()
- compiler.initialize()
- compiler._copy_vcruntime(tempdir)
-
- self.assertTrue(os.path.isfile(os.path.join(
- tempdir, os.path.basename(__file__))))
- finally:
- _msvccompiler._find_vcvarsall = old_find_vcvarsall
-
- def test_vcruntime_skip_copy(self):
- import distutils._msvccompiler as _msvccompiler
-
- tempdir = self.mkdtemp()
- compiler = _msvccompiler.MSVCCompiler()
- compiler.initialize()
- dll = compiler._vcruntime_redist
- self.assertTrue(os.path.isfile(dll), dll or "<None>")
-
- compiler._copy_vcruntime(tempdir)
-
- self.assertFalse(os.path.isfile(os.path.join(
- tempdir, os.path.basename(dll))), dll or "<None>")
-
def test_get_vc_env_unicode(self):
import distutils._msvccompiler as _msvccompiler
if module is None:
filename = None
else:
- filename = getattr(module, '__file__', module.__name__)
+ # __file__ can be None for namespace packages.
+ filename = getattr(module, '__file__', None) or module.__name__
if filename[-4:] == ".pyc":
filename = filename[:-1]
return self._parser.get_doctest(docstring, globs, name,
digit = char - 22 # 0x30-26
elif errors == "strict":
raise UnicodeError("Invalid extended code point '%s'"
- % extended[extpos])
+ % extended[extpos-1])
else:
return extpos, None
t = T(j, bias)
def __bool__(a):
"""a != 0"""
- return a._numerator != 0
+ # bpo-39274: Use bool() because (a._numerator != 0) can return an
+ # object which is not a bool.
+ return bool(a._numerator)
# support for pickling, copy, and deepcopy
READ, WRITE = 1, 2
+_COMPRESS_LEVEL_FAST = 1
+_COMPRESS_LEVEL_TRADEOFF = 6
+_COMPRESS_LEVEL_BEST = 9
+
+
def open(filename, mode="rb", compresslevel=9,
encoding=None, errors=None, newline=None):
"""Open a gzip-compressed file in binary or text mode.
self.fileobj = fileobj
if self.mode == WRITE:
- self._write_gzip_header()
+ self._write_gzip_header(compresslevel)
@property
def filename(self):
self.bufsize = 0
self.offset = 0 # Current file offset for seek(), tell(), etc
- def _write_gzip_header(self):
+ def _write_gzip_header(self, compresslevel):
self.fileobj.write(b'\037\213') # magic header
self.fileobj.write(b'\010') # compression method
try:
if mtime is None:
mtime = time.time()
write32u(self.fileobj, int(mtime))
- self.fileobj.write(b'\002')
+ if compresslevel == _COMPRESS_LEVEL_BEST:
+ xfl = b'\002'
+ elif compresslevel == _COMPRESS_LEVEL_FAST:
+ xfl = b'\004'
+ else:
+ xfl = b'\000'
+ self.fileobj.write(xfl)
self.fileobj.write(b'\377')
if fname:
self.fileobj.write(fname + b'\000')
======================================
+bpo-39781: Selecting code context lines no longer causes a jump.
+
+bpo-39663: Add tests for pyparse find_good_parse_start().
+
+bpo-39600: Remove duplicate font names from configuration list.
+
+bpo-38792: Close a shell calltip if a :exc:`KeyboardInterrupt`
+or shell restart occurs. Patch by Zackery Spytz.
+
+bpo-30780: Add remaining configdialog tests for buttons and
+highlights and keys tabs.
+
+bpo-39388: Settings dialog Cancel button cancels pending changes.
+
+bpo-39050: Settings dialog Help button again displays help text.
+
+bpo-32989: Add tests for editor newline_and_indent_event method.
+Remove unneeded arguments and dead code from pyparse
+find_good_parse_start method.
+
bpo-38943: Fix autocomplete windows not always appearing on some
systems. Patch by Johnny Najera.
-bpo-38944: Excape key now closes IDLE completion windows. Patch by
+bpo-38944: Escape key now closes IDLE completion windows. Patch by
Johnny Najera.
bpo-38862: 'Strip Trailing Whitespace' on the Format menu removes extra
config-extensions.def # Defaults for extensions
config-highlight.def # Defaults for colorizing
config-keys.def # Defaults for key bindings
-config-main.def # Defai;ts fpr font and geneal
+config-main.def # Defaults for font and general tabs
Text
----
# See __init__ for usage
return calltip_w.CalltipWindow(self.text)
- def _remove_calltip_window(self, event=None):
+ def remove_calltip_window(self, event=None):
if self.active_calltip:
self.active_calltip.hidetip()
self.active_calltip = None
self.open_calltip(False)
def open_calltip(self, evalfuncs):
- self._remove_calltip_window()
+ self.remove_calltip_window()
hp = HyperParser(self.editwin, "insert")
sur_paren = hp.get_surrounding_brackets('(')
enclosing block. The number of hint lines is determined by the maxlines
variable in the codecontext section of config-extensions.def. Lines which do
not open blocks are not shown in the context hints pane.
-
"""
import re
from sys import maxsize as INFINITY
from idlelib.config import idleConf
-BLOCKOPENERS = {"class", "def", "elif", "else", "except", "finally", "for",
- "if", "try", "while", "with", "async"}
+BLOCKOPENERS = {'class', 'def', 'if', 'elif', 'else', 'while', 'for',
+ 'try', 'except', 'finally', 'with', 'async'}
def get_spaces_firstword(codeline, c=re.compile(r"^(\s*)(\w*)")):
if self.t1 is not None:
try:
self.text.after_cancel(self.t1)
- except tkinter.TclError:
+ except tkinter.TclError: # pragma: no cover
pass
self.t1 = None
padx += widget.tk.getint(info['padx'])
padx += widget.tk.getint(widget.cget('padx'))
border += widget.tk.getint(widget.cget('border'))
- self.context = tkinter.Text(
+ context = self.context = tkinter.Text(
self.editwin.text_frame,
height=1,
width=1, # Don't request more than we get.
padx=padx, border=border, relief=SUNKEN, state='disabled')
self.update_font()
self.update_highlight_colors()
- self.context.bind('<ButtonRelease-1>', self.jumptoline)
+ context.bind('<ButtonRelease-1>', self.jumptoline)
# Get the current context and initiate the recurring update event.
self.timer_event()
# Grid the context widget above the text widget.
- self.context.grid(row=0, column=1, sticky=NSEW)
+ context.grid(row=0, column=1, sticky=NSEW)
line_number_colors = idleConf.GetHighlight(idleConf.CurrentTheme(),
'linenumber')
self.context['state'] = 'disabled'
def jumptoline(self, event=None):
- "Show clicked context line at top of editor."
- lines = len(self.info)
- if lines == 1: # No context lines are showing.
- newtop = 1
- else:
- # Line number clicked.
- contextline = int(float(self.context.index('insert')))
- # Lines not displayed due to maxlines.
- offset = max(1, lines - self.context_depth) - 1
- newtop = self.info[offset + contextline][0]
- self.text.yview(f'{newtop}.0')
- self.update_code_context()
+ """ Show clicked context line at top of editor.
+
+ If a selection was made, don't jump; allow copying.
+ If no visible context, show the top line of the file.
+ """
+ try:
+ self.context.index("sel.first")
+ except tkinter.TclError:
+ lines = len(self.info)
+ if lines == 1: # No context lines are showing.
+ newtop = 1
+ else:
+ # Line number clicked.
+ contextline = int(float(self.context.index('insert')))
+ # Lines not displayed due to maxlines.
+ offset = max(1, lines - self.context_depth) - 1
+ newtop = self.info[offset + contextline][0]
+ self.text.yview(f'{newtop}.0')
+ self.update_code_context()
def timer_event(self):
"Event on editor text widget triggered every UPDATEINTERVAL ms."
else:
padding_args = {'padding': (6, 3)}
outer = Frame(self, padding=2)
- buttons = Frame(outer, padding=2)
+ buttons_frame = Frame(outer, padding=2)
+ self.buttons = {}
for txt, cmd in (
('Ok', self.ok),
('Apply', self.apply),
('Cancel', self.cancel),
('Help', self.help)):
- Button(buttons, text=txt, command=cmd, takefocus=FALSE,
- **padding_args).pack(side=LEFT, padx=5)
+ self.buttons[txt] = Button(buttons_frame, text=txt, command=cmd,
+ takefocus=FALSE, **padding_args)
+ self.buttons[txt].pack(side=LEFT, padx=5)
# Add space above buttons.
Frame(outer, height=2, borderwidth=0).pack(side=TOP)
- buttons.pack(side=BOTTOM)
+ buttons_frame.pack(side=BOTTOM)
return outer
def ok(self):
Methods:
destroy: inherited
"""
+ changes.clear()
self.destroy()
def destroy(self):
Attributes accessed:
note
-
Methods:
view_text: Method from textview module.
"""
page = self.note.tab(self.note.select(), option='text').strip()
view_text(self, title='Help for IDLE preferences',
- text=help_common+help_pages.get(page, ''))
+ contents=help_common+help_pages.get(page, ''))
def deactivate_current_config(self):
"""Remove current key bindings.
font_size = configured_font[1]
font_bold = configured_font[2]=='bold'
- # Set editor font selection list and font_name.
- fonts = list(tkFont.families(self))
- fonts.sort()
+ # Set sorted no-duplicate editor font selection list and font_name.
+ fonts = sorted(set(tkFont.families(self)))
for font in fonts:
self.fontlist.insert(END, font)
self.font_name.set(font_name)
text.configure(
font=('courier', 12, ''), cursor='hand2', width=1, height=1,
takefocus=FALSE, highlightthickness=0, wrap=NONE)
+ # Prevent perhaps invisible selection of word or slice.
text.bind('<Double-Button-1>', lambda e: 'break')
text.bind('<B1-Motion>', lambda e: 'break')
string_tags=(
theme_name - string, the name of the new theme
theme - dictionary containing the new theme
"""
- if not idleConf.userCfg['highlight'].has_section(theme_name):
- idleConf.userCfg['highlight'].add_section(theme_name)
+ idleConf.userCfg['highlight'].AddSection(theme_name)
for element in theme:
value = theme[element]
idleConf.userCfg['highlight'].SetOption(theme_name, element, value)
keyset_name - string, the name of the new key set
keyset - dictionary containing the new keybindings
"""
- if not idleConf.userCfg['keys'].has_section(keyset_name):
- idleConf.userCfg['keys'].add_section(keyset_name)
+ idleConf.userCfg['keys'].AddSection(keyset_name)
for event in keyset:
value = keyset[event]
idleConf.userCfg['keys'].SetOption(keyset_name, event, value)
text.bind("<<run-module>>", scriptbinding.run_module_event)
text.bind("<<run-custom>>", scriptbinding.run_custom_event)
text.bind("<<do-rstrip>>", self.Rstrip(self).do_rstrip)
- ctip = self.Calltip(self)
+ self.ctip = ctip = self.Calltip(self)
text.bind("<<try-open-calltip>>", ctip.try_open_calltip_event)
#refresh-calltip must come after paren-closed to work right
text.bind("<<refresh-calltip>>", ctip.refresh_calltip_event)
text.undo_block_stop()
def newline_and_indent_event(self, event):
+ """Insert a newline and indentation after Enter keypress event.
+
+ Properly position the cursor on the new line based on information
+ from the current line. This takes into account if the current line
+ is a shell prompt, is empty, has selected text, contains a block
+ opener, contains a block closer, is a continuation line, or
+ is inside a string.
+ """
text = self.text
first, last = self.get_selection_indices()
text.undo_block_start()
- try:
+ try: # Close undo block and expose new line in finally clause.
if first and last:
text.delete(first, last)
text.mark_set("insert", first)
line = text.get("insert linestart", "insert")
+
+ # Count leading whitespace for indent size.
i, n = 0, len(line)
while i < n and line[i] in " \t":
- i = i+1
+ i += 1
if i == n:
- # the cursor is in or at leading indentation in a continuation
- # line; just inject an empty line at the start
+ # The cursor is in or at leading indentation in a continuation
+ # line; just inject an empty line at the start.
text.insert("insert linestart", '\n')
return "break"
indent = line[:i]
- # strip whitespace before insert point unless it's in the prompt
+
+ # Strip whitespace before insert point unless it's in the prompt.
i = 0
while line and line[-1] in " \t" and line != self.prompt_last_line:
line = line[:-1]
- i = i+1
+ i += 1
if i:
text.delete("insert - %d chars" % i, "insert")
- # strip whitespace after insert point
+
+ # Strip whitespace after insert point.
while text.get("insert") in " \t":
text.delete("insert")
- # start new line
+
+ # Insert new line.
text.insert("insert", '\n')
- # adjust indentation for continuations and block
- # open/close first need to find the last stmt
+ # Adjust indentation for continuations and block open/close.
+ # First need to find the last statement.
lno = index2line(text.index('insert'))
y = pyparse.Parser(self.indentwidth, self.tabwidth)
if not self.prompt_last_line:
rawtext = text.get(startatindex, "insert")
y.set_code(rawtext)
bod = y.find_good_parse_start(
- self._build_char_in_string_func(startatindex))
+ self._build_char_in_string_func(startatindex))
if bod is not None or startat == 1:
break
y.set_lo(bod or 0)
c = y.get_continuation_type()
if c != pyparse.C_NONE:
- # The current stmt hasn't ended yet.
+ # The current statement hasn't ended yet.
if c == pyparse.C_STRING_FIRST_LINE:
- # after the first line of a string; do not indent at all
+ # After the first line of a string do not indent at all.
pass
elif c == pyparse.C_STRING_NEXT_LINES:
- # inside a string which started before this line;
- # just mimic the current indent
+ # Inside a string which started before this line;
+ # just mimic the current indent.
text.insert("insert", indent)
elif c == pyparse.C_BRACKET:
- # line up with the first (if any) element of the
+ # Line up with the first (if any) element of the
# last open bracket structure; else indent one
# level beyond the indent of the line with the
- # last open bracket
+ # last open bracket.
self.reindent_to(y.compute_bracket_indent())
elif c == pyparse.C_BACKSLASH:
- # if more than one line in this stmt already, just
+ # If more than one line in this statement already, just
# mimic the current indent; else if initial line
# has a start on an assignment stmt, indent to
# beyond leftmost =; else to beyond first chunk of
- # non-whitespace on initial line
+ # non-whitespace on initial line.
if y.get_num_lines_in_stmt() > 1:
text.insert("insert", indent)
else:
assert 0, "bogus continuation type %r" % (c,)
return "break"
- # This line starts a brand new stmt; indent relative to
+ # This line starts a brand new statement; indent relative to
# indentation of initial line of closest preceding
- # interesting stmt.
+ # interesting statement.
indent = y.get_base_indent_string()
text.insert("insert", indent)
if y.is_block_opener():
</dl>
<p>Editor windows also have breakpoint functions. Lines with a breakpoint set are
specially marked. Breakpoints only have an effect when running under the
-debugger. Breakpoints for a file are saved in the user’s .idlerc directory.</p>
+debugger. Breakpoints for a file are saved in the user’s <code class="docutils literal notranslate"><span class="pre">.idlerc</span></code>
+directory.</p>
<dl class="simple">
<dt>Set Breakpoint</dt><dd><p>Set a breakpoint on the current line.</p>
</dd>
crash or Keyboard Interrupt (control-C) may fail to connect. Dismissing
the error box or Restart Shell on the Shell menu may fix a temporary problem.</p>
<p>When IDLE first starts, it attempts to read user configuration files in
-~/.idlerc/ (~ is one’s home directory). If there is a problem, an error
+<code class="docutils literal notranslate"><span class="pre">~/.idlerc/</span></code> (~ is one’s home directory). If there is a problem, an error
message should be displayed. Leaving aside random disk glitches, this can
be prevented by never editing the files by hand; instead, use the configuration
dialog, under Options. Once it happens, the solution may
be to delete one or more of the configuration files.</p>
<p>If IDLE quits with no message, and it was not started from a console, try
-starting from a console (<code class="docutils literal notranslate"><span class="pre">python</span> <span class="pre">-m</span> <span class="pre">idlelib)</span></code> and see if a message appears.</p>
+starting from a console (<code class="docutils literal notranslate"><span class="pre">python</span> <span class="pre">-m</span> <span class="pre">idlelib</span></code>) and see if a message appears.</p>
</div>
<div class="section" id="running-user-code">
<h3>Running user code<a class="headerlink" href="#running-user-code" title="Permalink to this headline">¶</a></h3>
Or click the TOC (Table of Contents) button and select a section
header in the opened box.</p>
<p>Help menu entry “Python Docs” opens the extensive sources of help,
-including tutorials, available at docs.python.org/x.y, where ‘x.y’
+including tutorials, available at <code class="docutils literal notranslate"><span class="pre">docs.python.org/x.y</span></code>, where ‘x.y’
is the currently running Python version. If your system
has an off-line copy of the docs (this may be an installation option),
that will be opened instead.</p>
<p>Selected URLs can be added or removed from the help menu at any time using the
-General tab of the Configure IDLE dialog .</p>
+General tab of the Configure IDLE dialog.</p>
</div>
<div class="section" id="setting-preferences">
<span id="preferences"></span><h3>Setting preferences<a class="headerlink" href="#setting-preferences" title="Permalink to this headline">¶</a></h3>
<p>The font preferences, highlighting, keys, and general preferences can be
changed via Configure IDLE on the Option menu.
-Non-default user settings are saved in a .idlerc directory in the user’s
+Non-default user settings are saved in a <code class="docutils literal notranslate"><span class="pre">.idlerc</span></code> directory in the user’s
home directory. Problems caused by bad user configuration files are solved
-by editing or deleting one or more of the files in .idlerc.</p>
+by editing or deleting one or more of the files in <code class="docutils literal notranslate"><span class="pre">.idlerc</span></code>.</p>
<p>On the Font tab, see the text sample for the effect of font face and size
on multiple characters in multiple languages. Edit the sample to add
other characters of personal interest. Use the sample to select
jump()
eq(cc.topvisible, 8)
+ # Context selection stops jump.
+ cc.text.yview('5.0')
+ cc.update_code_context()
+ cc.context.tag_add('sel', '1.0', '2.0')
+ cc.context.mark_set('insert', '1.0')
+ jump() # With a selection present, no jump occurs (would be line 2 without it).
+ eq(cc.topvisible, 5)
+
@mock.patch.object(codecontext.CodeContext, 'update_code_context')
def test_timer_event(self, mock_update):
# Ensure code context is not active.
import unittest
from unittest import mock
from idlelib.idle_test.mock_idle import Func
-from tkinter import Tk, StringVar, IntVar, BooleanVar, DISABLED, NORMAL
+from tkinter import (Tk, StringVar, IntVar, BooleanVar, DISABLED, NORMAL)
from idlelib import config
from idlelib.configdialog import idleConf, changes, tracers
keyspage = changes['keys']
extpage = changes['extensions']
+
def setUpModule():
global root, dialog
idleConf.userCfg = testcfg
# root.withdraw() # Comment out, see issue 30870
dialog = configdialog.ConfigDialog(root, 'Test', _utest=True)
+
def tearDownModule():
global root, dialog
idleConf.userCfg = usercfg
root = dialog = None
+class ConfigDialogTest(unittest.TestCase):
+
+ def test_deactivate_current_config(self):
+ pass
+
+ def test_activate_config_changes(self):
+ pass
+
+
+class ButtonTest(unittest.TestCase):
+
+ def test_click_ok(self):
+ d = dialog
+ apply = d.apply = mock.Mock()
+ destroy = d.destroy = mock.Mock()
+ d.buttons['Ok'].invoke()
+ apply.assert_called_once()
+ destroy.assert_called_once()
+ del d.destroy, d.apply
+
+ def test_click_apply(self):
+ d = dialog
+ deactivate = d.deactivate_current_config = mock.Mock()
+ save_ext = d.save_all_changed_extensions = mock.Mock()
+ activate = d.activate_config_changes = mock.Mock()
+ d.buttons['Apply'].invoke()
+ deactivate.assert_called_once()
+ save_ext.assert_called_once()
+ activate.assert_called_once()
+ del d.save_all_changed_extensions
+ del d.activate_config_changes, d.deactivate_current_config
+
+ def test_click_cancel(self):
+ d = dialog
+ d.destroy = Func()
+ changes['main']['something'] = 1
+ d.buttons['Cancel'].invoke()
+ self.assertEqual(changes['main'], {})
+ self.assertEqual(d.destroy.called, 1)
+ del d.destroy
+
+ def test_click_help(self):
+ dialog.note.select(dialog.keyspage)
+ with mock.patch.object(configdialog, 'view_text',
+ new_callable=Func) as view:
+ dialog.buttons['Help'].invoke()
+ title, contents = view.kwds['title'], view.kwds['contents']
+ self.assertEqual(title, 'Help for IDLE preferences')
+ self.assertTrue(contents.startswith('When you click') and
+ contents.endswith('a different name.\n'))
+
+
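The pattern in `ButtonTest` above — assigning a `Mock` to an instance attribute so it shadows the class method, then `del`-ing it to restore the original — can be sketched in isolation (the `Dialog` class here is a stand-in, not the real `ConfigDialog`):

```python
from unittest import mock

class Dialog:
    def apply(self):
        raise RuntimeError('real apply should not run in this test')
    def destroy(self):
        raise RuntimeError('real destroy should not run in this test')
    def ok(self):
        self.apply()      # Ok = Apply + close.
        self.destroy()

d = Dialog()
# Instance attributes shadow the class methods...
d.apply = mock.Mock()
d.destroy = mock.Mock()
d.ok()
d.apply.assert_called_once()
d.destroy.assert_called_once()
# ...and deleting them restores the class-level originals.
del d.apply, d.destroy
assert d.apply.__func__ is Dialog.apply
```

Because the mocks live on the instance, the shared module-level `dialog` object is left clean for later test classes.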
class FontPageTest(unittest.TestCase):
"""Test that font widgets enable users to make font changes.
eq(d.highlight_target.get(), elem[tag])
eq(d.set_highlight_target.called, count)
+ def test_highlight_sample_double_click(self):
+ # Test double click on highlight_sample.
+ eq = self.assertEqual
+ d = self.page
+
+ hs = d.highlight_sample
+ hs.focus_force()
+ hs.see(1.0)
+ hs.update_idletasks()
+
+ # Test binding from configdialog.
+ hs.event_generate('<Enter>', x=0, y=0)
+ hs.event_generate('<Motion>', x=0, y=0)
+ # Double click is a sequence of two clicks in a row.
+ for _ in range(2):
+ hs.event_generate('<ButtonPress-1>', x=0, y=0)
+ hs.event_generate('<ButtonRelease-1>', x=0, y=0)
+
+ eq(hs.tag_ranges('sel'), ())
+
+ def test_highlight_sample_b1_motion(self):
+ # Test button motion on highlight_sample.
+ eq = self.assertEqual
+ d = self.page
+
+ hs = d.highlight_sample
+ hs.focus_force()
+ hs.see(1.0)
+ hs.update_idletasks()
+
+ x, y, dx, dy, offset = hs.dlineinfo('1.0')
+
+ # Test binding from configdialog.
+ hs.event_generate('<Leave>')
+ hs.event_generate('<Enter>')
+ hs.event_generate('<Motion>', x=x, y=y)
+ hs.event_generate('<ButtonPress-1>', x=x, y=y)
+ hs.event_generate('<B1-Motion>', x=dx, y=dy)
+ hs.event_generate('<ButtonRelease-1>', x=dx, y=dy)
+
+ eq(hs.tag_ranges('sel'), ())
+
def test_set_theme_type(self):
eq = self.assertEqual
d = self.page
idleConf.userCfg['highlight'].SetOption(theme_name, 'name', 'value')
highpage[theme_name] = {'option': 'True'}
+ theme_name2 = 'other theme'
+ idleConf.userCfg['highlight'].SetOption(theme_name2, 'name', 'value')
+ highpage[theme_name2] = {'option': 'False'}
+
# Force custom theme.
- d.theme_source.set(False)
+ d.custom_theme_on.state(('!disabled',))
+ d.custom_theme_on.invoke()
d.custom_name.set(theme_name)
# Cancel deletion.
d.button_delete_custom.invoke()
eq(yesno.called, 1)
eq(highpage[theme_name], {'option': 'True'})
- eq(idleConf.GetSectionList('user', 'highlight'), ['spam theme'])
+ eq(idleConf.GetSectionList('user', 'highlight'), [theme_name, theme_name2])
eq(dialog.deactivate_current_config.called, 0)
eq(dialog.activate_config_changes.called, 0)
eq(d.set_theme_type.called, 0)
d.button_delete_custom.invoke()
eq(yesno.called, 2)
self.assertNotIn(theme_name, highpage)
- eq(idleConf.GetSectionList('user', 'highlight'), [])
- eq(d.custom_theme_on.state(), ('disabled',))
- eq(d.custom_name.get(), '- no custom themes -')
+ eq(idleConf.GetSectionList('user', 'highlight'), [theme_name2])
+ eq(d.custom_theme_on.state(), ())
+ eq(d.custom_name.get(), theme_name2)
eq(dialog.deactivate_current_config.called, 1)
eq(dialog.activate_config_changes.called, 1)
eq(d.set_theme_type.called, 1)
+ # Confirm deletion of second theme - empties list.
+ d.custom_name.set(theme_name2)
+ yesno.result = True
+ d.button_delete_custom.invoke()
+ eq(yesno.called, 3)
+ self.assertNotIn(theme_name, highpage)
+ eq(idleConf.GetSectionList('user', 'highlight'), [])
+ eq(d.custom_theme_on.state(), ('disabled',))
+ eq(d.custom_name.get(), '- no custom themes -')
+ eq(dialog.deactivate_current_config.called, 2)
+ eq(dialog.activate_config_changes.called, 2)
+ eq(d.set_theme_type.called, 2)
+
del dialog.activate_config_changes, dialog.deactivate_current_config
del d.askyesno
idleConf.userCfg['keys'].SetOption(keyset_name, 'name', 'value')
keyspage[keyset_name] = {'option': 'True'}
+ keyset_name2 = 'other key set'
+ idleConf.userCfg['keys'].SetOption(keyset_name2, 'name', 'value')
+ keyspage[keyset_name2] = {'option': 'False'}
+
# Force custom keyset.
- d.keyset_source.set(False)
+ d.custom_keyset_on.state(('!disabled',))
+ d.custom_keyset_on.invoke()
d.custom_name.set(keyset_name)
# Cancel deletion.
d.button_delete_custom_keys.invoke()
eq(yesno.called, 1)
eq(keyspage[keyset_name], {'option': 'True'})
- eq(idleConf.GetSectionList('user', 'keys'), ['spam key set'])
+ eq(idleConf.GetSectionList('user', 'keys'), [keyset_name, keyset_name2])
eq(dialog.deactivate_current_config.called, 0)
eq(dialog.activate_config_changes.called, 0)
eq(d.set_keys_type.called, 0)
d.button_delete_custom_keys.invoke()
eq(yesno.called, 2)
self.assertNotIn(keyset_name, keyspage)
- eq(idleConf.GetSectionList('user', 'keys'), [])
- eq(d.custom_keyset_on.state(), ('disabled',))
- eq(d.custom_name.get(), '- no custom keys -')
+ eq(idleConf.GetSectionList('user', 'keys'), [keyset_name2])
+ eq(d.custom_keyset_on.state(), ())
+ eq(d.custom_name.get(), keyset_name2)
eq(dialog.deactivate_current_config.called, 1)
eq(dialog.activate_config_changes.called, 1)
eq(d.set_keys_type.called, 1)
+ # Confirm deletion of second keyset - empties list.
+ d.custom_name.set(keyset_name2)
+ yesno.result = True
+ d.button_delete_custom_keys.invoke()
+ eq(yesno.called, 3)
+ self.assertNotIn(keyset_name, keyspage)
+ eq(idleConf.GetSectionList('user', 'keys'), [])
+ eq(d.custom_keyset_on.state(), ('disabled',))
+ eq(d.custom_name.get(), '- no custom keys -')
+ eq(dialog.deactivate_current_config.called, 2)
+ eq(dialog.activate_config_changes.called, 2)
+ eq(d.set_keys_type.called, 2)
+
del dialog.activate_config_changes, dialog.deactivate_current_config
del d.askyesno
from idlelib import editor
import unittest
+from collections import namedtuple
from test.support import requires
from tkinter import Tk
)
+class IndentAndNewlineTest(unittest.TestCase):
+
+ @classmethod
+ def setUpClass(cls):
+ requires('gui')
+ cls.root = Tk()
+ cls.root.withdraw()
+ cls.window = Editor(root=cls.root)
+ cls.window.indentwidth = 2
+ cls.window.tabwidth = 2
+
+ @classmethod
+ def tearDownClass(cls):
+ cls.window._close()
+ del cls.window
+ cls.root.update_idletasks()
+ for id in cls.root.tk.call('after', 'info'):
+ cls.root.after_cancel(id)
+ cls.root.destroy()
+ del cls.root
+
+ def insert(self, text):
+ t = self.window.text
+ t.delete('1.0', 'end')
+ t.insert('end', text)
+ # Force update for colorizer to finish.
+ t.update()
+
+ def test_indent_and_newline_event(self):
+ eq = self.assertEqual
+ w = self.window
+ text = w.text
+ get = text.get
+ nl = w.newline_and_indent_event
+
+ TestInfo = namedtuple('Tests', ['label', 'text', 'expected', 'mark'])
+
+ tests = (TestInfo('Empty line inserts with no indent.',
+ ' \n def __init__(self):',
+ '\n \n def __init__(self):\n',
+ '1.end'),
+ TestInfo('Inside bracket before space, deletes space.',
+ ' def f1(self, a, b):',
+ ' def f1(self,\n a, b):\n',
+ '1.14'),
+ TestInfo('Inside bracket after space, deletes space.',
+ ' def f1(self, a, b):',
+ ' def f1(self,\n a, b):\n',
+ '1.15'),
+ TestInfo('Inside string with one line - no indent.',
+ ' """Docstring."""',
+ ' """Docstring.\n"""\n',
+ '1.15'),
+ TestInfo('Inside string with more than one line.',
+ ' """Docstring.\n Docstring Line 2"""',
+ ' """Docstring.\n Docstring Line 2\n """\n',
+ '2.18'),
+ TestInfo('Backslash with one line.',
+ 'a =\\',
+ 'a =\\\n \n',
+ '1.end'),
+ TestInfo('Backslash with more than one line.',
+ 'a =\\\n multiline\\',
+ 'a =\\\n multiline\\\n \n',
+ '2.end'),
+ TestInfo('Block opener - indents +1 level.',
+ ' def f1(self):\n pass',
+ ' def f1(self):\n \n pass\n',
+ '1.end'),
+ TestInfo('Block closer - dedents -1 level.',
+ ' def f1(self):\n pass',
+ ' def f1(self):\n pass\n \n',
+ '2.end'),
+ )
+
+ w.prompt_last_line = ''
+ for test in tests:
+ with self.subTest(label=test.label):
+ self.insert(test.text)
+ text.mark_set('insert', test.mark)
+ nl(event=None)
+ eq(get('1.0', 'end'), test.expected)
+
+ # Selected text.
+ self.insert(' def f1(self, a, b):\n return a + b')
+ text.tag_add('sel', '1.17', '1.end')
+ nl(None)
+ # Deletes selected text before adding new line.
+ eq(get('1.0', 'end'), ' def f1(self, a,\n \n return a + b\n')
+
+ # Preserves the whitespace in shell prompt.
+ w.prompt_last_line = '>>> '
+ self.insert('>>> \t\ta =')
+ text.mark_set('insert', '1.5')
+ nl(None)
+ eq(get('1.0', 'end'), '>>> \na =\n')
+
+
if __name__ == '__main__':
unittest.main(verbosity=2)
# trans is the production instance of ParseMap, used in _study1
parser = pyparse.Parser(4, 4)
self.assertEqual('\t a([{b}])b"c\'d\n'.translate(pyparse.trans),
- 'xxx(((x)))x"x\'x\n')
+ 'xxx(((x)))x"x\'x\n')
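The `pyparse.trans` table exercised above maps every character the parser does not care about to `'x'`, collapses all bracket types to `(`/`)`, and preserves quotes, backslash, newline, and `#`. A minimal reimplementation of that idea (a sketch; the real table lives in `idlelib.pyparse`):

```python
class ParseMap(dict):
    # str.translate() looks keys up by code point; anything
    # missing collapses to 'x'.
    def __missing__(self, key):
        return 120  # ord('x')

trans = ParseMap()
trans.update((ord(c), ord('(')) for c in '([{')       # openers -> '('
trans.update((ord(c), ord(')')) for c in ')]}')       # closers -> ')'
trans.update((ord(c), ord(c)) for c in '"\'\\\n#')    # kept as-is

result = '\t a([{b}])b"c\'d\n'.translate(trans)
assert result == 'xxx(((x)))x"x\'x\n'
```

Normalizing the code this way lets the parser study structure with a handful of characters instead of the full alphabet.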
class PyParseTest(unittest.TestCase):
p = self.parser
setcode = p.set_code
start = p.find_good_parse_start
+ def char_in_string_false(index): return False
+
+ # First line starts with 'def' and ends with ':', then 0 is the pos.
+ setcode('def spam():\n')
+ eq(start(char_in_string_false), 0)
+
+ # First line begins with a keyword in the list and ends
+ # with an open brace, then 0 is the pos. This is how
+ # hyperparser calls this function as the newline is not added
+ # in the editor, but rather on the call to setcode.
+ setcode('class spam( ' + ' \n')
+ eq(start(char_in_string_false), 0)
# Split def across lines.
setcode('"""This is a module docstring"""\n'
- 'class C():\n'
- ' def __init__(self, a,\n'
- ' b=True):\n'
- ' pass\n'
- )
+ 'class C():\n'
+ ' def __init__(self, a,\n'
+ ' b=True):\n'
+ ' pass\n'
+ )
- # No value sent for is_char_in_string().
- self.assertIsNone(start())
+ # Passing no value or non-callable should fail (issue 32989).
+ with self.assertRaises(TypeError):
+ start()
+ with self.assertRaises(TypeError):
+ start(False)
# Make text look like a string. This returns pos as the start
# position, but it's set to None.
# Make all text look like it's not in a string. This means that it
# found a good start position.
- eq(start(is_char_in_string=lambda index: False), 44)
+ eq(start(char_in_string_false), 44)
# If the beginning of the def line is not in a string, then it
# returns that as the index.
# Code without extra line break in def line - mostly returns the same
# values.
setcode('"""This is a module docstring"""\n'
- 'class C():\n'
- ' def __init__(self, a, b=True):\n'
- ' pass\n'
- )
- eq(start(is_char_in_string=lambda index: False), 44)
+ 'class C():\n'
+ ' def __init__(self, a, b=True):\n'
+ ' pass\n'
+ )
+ eq(start(char_in_string_false), 44)
eq(start(is_char_in_string=lambda index: index > 44), 44)
eq(start(is_char_in_string=lambda index: index >= 44), 33)
# When the def line isn't split, this returns which doesn't match the
self.code = s
self.study_level = 0
- def find_good_parse_start(self, is_char_in_string=None,
- _synchre=_synchre):
+ def find_good_parse_start(self, is_char_in_string):
"""
Return index of a good place to begin parsing, as close to the
end of the string as possible. This will be the start of some
"""
code, pos = self.code, None
- if not is_char_in_string:
- # no clue -- make the caller pass everything
- return None
-
# Peek back from the end for a good place to start,
# but don't try too often; pos will be left None, or
# bumped to a legitimate synch point.
self.text.insert("end-1c", "\n")
self.text.mark_set("iomark", "end-1c")
self.set_line_and_column()
+ self.ctip.remove_calltip_window()
def write(self, s, tags=()):
try:
return sys.modules.get(modulesbyfile[file])
# Update the filename to module name cache and check yet again
# Copy sys.modules in order to cope with changes while iterating
- for modname, module in list(sys.modules.items()):
+ for modname, module in sys.modules.copy().items():
if ismodule(module) and hasattr(module, '__file__'):
f = module.__file__
if f == _filesbymodname.get(modname, None):
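The switch from `list(sys.modules.items())` to `sys.modules.copy().items()` above exists because an import happening mid-loop can mutate `sys.modules` and raise `RuntimeError: dictionary changed size during iteration`; iterating a snapshot is safe. A standalone illustration:

```python
d = {'a': 1, 'b': 2}

# Mutating a dict while iterating over its live view fails...
try:
    for k, v in d.items():
        d['c'] = 3
    survived_live_iteration = True
except RuntimeError:
    survived_live_iteration = False
assert survived_live_iteration is False

# ...but iterating over a snapshot is safe.
d = {'a': 1, 'b': 2}
for k, v in d.copy().items():
    d['new_' + k] = v
assert d == {'a': 1, 'b': 2, 'new_a': 1, 'new_b': 2}
```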
from .. import fixer_base
from ..pytree import Node
from ..pygram import python_symbols as syms
-from ..fixer_util import Name, ArgList, ListComp, in_special_context
+from ..fixer_util import Name, ArgList, ListComp, in_special_context, parenthesize
class FixFilter(fixer_base.ConditionalFix):
trailers.append(t.clone())
if "filter_lambda" in results:
+ xp = results.get("xp").clone()
+ if xp.type == syms.test:
+ xp.prefix = ""
+ xp = parenthesize(xp)
+
new = ListComp(results.get("fp").clone(),
results.get("fp").clone(),
- results.get("it").clone(),
- results.get("xp").clone())
+ results.get("it").clone(), xp)
new = Node(syms.power, [new] + trailers, prefix="")
elif "none" in results:
a = """x = [x for x in range(10) if x%2 == 0]"""
self.check(b, a)
+ # bpo-38871
+ b = """filter(lambda x: True if x > 2 else False, [1, 2, 3])"""
+ a = """[x for x in [1, 2, 3] if (True if x > 2 else False)]"""
+ self.check(b, a)
+
def test_filter_trailers(self):
b = """x = filter(None, 'abc')[0]"""
a = """x = [_f for _f in 'abc' if _f][0]"""
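The `parenthesize` call added to `fix_filter` matters because Python's grammar forbids a bare conditional expression as a comprehension's `if` filter; the 2to3 output only parses once wrapped. A quick check of both forms:

```python
data = [1, 2, 3]

# An unparenthesized ternary in a comprehension filter is a SyntaxError:
try:
    compile('[x for x in data if True if x > 2 else False]', '<test>', 'eval')
    parses = True
except SyntaxError:
    parses = False
assert parses is False

# Parenthesized, it is valid and equivalent to the filter() original:
result = [x for x in data if (True if x > 2 else False)]
assert result == [3]
assert result == list(filter(lambda x: True if x > 2 else False, data))
```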
return self._cache[level]
except KeyError:
_acquireLock()
- if self.manager.disable >= level:
- is_enabled = self._cache[level] = False
- else:
- is_enabled = self._cache[level] = level >= self.getEffectiveLevel()
- _releaseLock()
-
+ try:
+ if self.manager.disable >= level:
+ is_enabled = self._cache[level] = False
+ else:
+ is_enabled = self._cache[level] = (
+ level >= self.getEffectiveLevel()
+ )
+ finally:
+ _releaseLock()
return is_enabled
def getChild(self, suffix):
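The `try`/`finally` added to `isEnabledFor` guarantees the module lock is released even if the level lookup raises; without it, one exception would leave logging's global lock held and deadlock every later call. The underlying pattern, sketched with stand-in names (`_acquire`/`_release` are not the real `logging._acquireLock`/`_releaseLock`):

```python
import threading

_lock = threading.RLock()

def _acquire():
    _lock.acquire()

def _release():
    _lock.release()

def guarded(compute):
    """Run compute() under the lock, releasing it even on error."""
    _acquire()
    try:
        return compute()
    finally:
        _release()

assert guarded(lambda: 42) == 42
try:
    guarded(lambda: 1 / 0)
except ZeroDivisionError:
    pass
# The lock was released despite the exception, so this cannot deadlock:
assert guarded(lambda: 'ok') == 'ok'
```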
value = ConvertingList(value)
value.configurator = self
elif not isinstance(value, ConvertingTuple) and\
- isinstance(value, tuple):
+ isinstance(value, tuple) and not hasattr(value, '_fields'):
value = ConvertingTuple(value)
value.configurator = self
elif isinstance(value, str): # str for py3k
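The `hasattr(value, '_fields')` check added above distinguishes `namedtuple` instances (which define `_fields`) from plain tuples, so named tuples escape wrapping in `ConvertingTuple` and keep their attribute access. Standalone:

```python
from collections import namedtuple

Point = namedtuple('Point', ['x', 'y'])
p = Point(1, 2)

assert isinstance(p, tuple)            # it *is* a tuple...
assert hasattr(p, '_fields')           # ...but carries _fields
assert not hasattr((1, 2), '_fields')  # plain tuples do not

def should_convert(value):
    # Mirrors the logging.config condition (sketch):
    return isinstance(value, tuple) and not hasattr(value, '_fields')

assert should_convert((1, 2))
assert not should_convert(p)
```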
try:
hp, ht, pid, tid = _winapi.CreateProcess(
python_exe, cmd,
- env, None, False, 0, None, None, None)
+ None, None, False, 0, env, None, None)
_winapi.CloseHandle(ht)
except:
_winapi.CloseHandle(rhandle)
nntplib built-in demo - display the latest articles in a newsgroup""")
parser.add_argument('-g', '--group', default='gmane.comp.python.general',
help='group to fetch messages from (default: %(default)s)')
- parser.add_argument('-s', '--server', default='news.gmane.org',
+ parser.add_argument('-s', '--server', default='news.gmane.io',
help='NNTP server hostname (default: %(default)s)')
parser.add_argument('-p', '--port', default=-1, type=int,
help='NNTP port number (default: %s / %s)' % (NNTP_PORT, NNTP_SSL_PORT))
import sys
import stat as st
+from _collections_abc import _check_methods
+
_names = sys.builtin_module_names
# Note: more names are added to __all__ later.
@classmethod
def __subclasshook__(cls, subclass):
- return hasattr(subclass, '__fspath__')
+ if cls is PathLike:
+ return _check_methods(subclass, '__fspath__')
+ return NotImplemented
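With the `_check_methods`-based hook, any class that implements `__fspath__` counts as a virtual subclass of `os.PathLike`, while the `cls is PathLike` guard keeps the hook from leaking into subclasses of `PathLike`. For example:

```python
import os

class MyPath:
    def __init__(self, p):
        self._p = p
    def __fspath__(self):
        return self._p

assert issubclass(MyPath, os.PathLike)
assert isinstance(MyPath('/tmp/x'), os.PathLike)
assert os.fspath(MyPath('/tmp/x')) == '/tmp/x'

class NotAPath:
    pass

assert not issubclass(NotAPath, os.PathLike)
```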
import platform
import re
import sys
+import sysconfig
import time
import tokenize
import urllib.parse
docmodule = docclass = docroutine = docother = docproperty = docdata = fail
- def getdocloc(self, object,
- basedir=os.path.join(sys.base_exec_prefix, "lib",
- "python%d.%d" % sys.version_info[:2])):
+ def getdocloc(self, object, basedir=sysconfig.get_path('stdlib')):
"""Return the location of module docs or None"""
try:
# -*- coding: utf-8 -*-
-# Autogenerated by Sphinx on Wed Dec 18 13:43:31 2019
+# Autogenerated by Sphinx on Tue Mar 10 02:06:42 2020
topics = {'assert': 'The "assert" statement\n'
'**********************\n'
'\n'
'only\n'
'supported by the numeric types.\n'
'\n'
- 'A general convention is that an empty format string ("""") '
+ 'A general convention is that an empty format specification '
'produces\n'
'the same result as if you had called "str()" on the value. '
'A non-empty\n'
- 'format string typically modifies the result.\n'
+ 'format specification typically modifies the result.\n'
'\n'
'The general form of a *standard format specifier* is:\n'
'\n'
'Changed in version 3.6: Added the "\'_\'" option (see also '
'**PEP 515**).\n'
'\n'
- '*width* is a decimal integer defining the minimum field '
- 'width. If not\n'
- 'specified, then the field width will be determined by the '
+ '*width* is a decimal integer defining the minimum total '
+ 'field width,\n'
+ 'including any prefixes, separators, and other formatting '
+ 'characters.\n'
+ 'If not specified, then the field width will be determined '
+ 'by the\n'
'content.\n'
'\n'
'When no explicit alignment is given, preceding the *width* '
'object.__rfloordiv__(self, other)\n'
'object.__rmod__(self, other)\n'
'object.__rdivmod__(self, other)\n'
- 'object.__rpow__(self, other)\n'
+ 'object.__rpow__(self, other[, modulo])\n'
'object.__rlshift__(self, other)\n'
'object.__rrshift__(self, other)\n'
'object.__rand__(self, other)\n'
'bases,\n'
'**kwds)" (where the additional keyword arguments, if any, '
'come from\n'
- 'the class definition).\n'
+ 'the class definition). The "__prepare__" method should be '
+ 'implemented\n'
+ 'as a "classmethod()". The namespace returned by '
+ '"__prepare__" is\n'
+ 'passed in to "__new__", but when the final class object is '
+ 'created the\n'
+ 'namespace is copied into a new "dict".\n'
'\n'
'If the metaclass has no "__prepare__" attribute, then the '
'class\n'
'object.__rfloordiv__(self, other)\n'
'object.__rmod__(self, other)\n'
'object.__rdivmod__(self, other)\n'
- 'object.__rpow__(self, other)\n'
+ 'object.__rpow__(self, other[, modulo])\n'
'object.__rlshift__(self, other)\n'
'object.__rrshift__(self, other)\n'
'object.__rand__(self, other)\n'
' "co_lnotab" is a string encoding the mapping from bytecode\n'
' offsets to line numbers (for details see the source code of '
'the\n'
- ' interpreter); "co_stacksize" is the required stack size\n'
- ' (including local variables); "co_flags" is an integer '
- 'encoding a\n'
- ' number of flags for the interpreter.\n'
+ ' interpreter); "co_stacksize" is the required stack size;\n'
+ ' "co_flags" is an integer encoding a number of flags for the\n'
+ ' interpreter.\n'
'\n'
' The following flag bits are defined for "co_flags": bit '
'"0x04"\n'
with _PopenSelector() as selector:
if self.stdin and input:
selector.register(self.stdin, selectors.EVENT_WRITE)
- if self.stdout:
+ if self.stdout and not self.stdout.closed:
selector.register(self.stdout, selectors.EVENT_READ)
- if self.stderr:
+ if self.stderr and not self.stderr.closed:
selector.register(self.stderr, selectors.EVENT_READ)
while selector.get_map():
raise ValueError("mode must be 'r', 'w' or 'x'")
try:
- import gzip
- gzip.GzipFile
- except (ImportError, AttributeError):
+ from gzip import GzipFile
+ except ImportError:
raise CompressionError("gzip module is not available")
try:
- fileobj = gzip.GzipFile(name, mode + "b", compresslevel, fileobj)
+ fileobj = GzipFile(name, mode + "b", compresslevel, fileobj)
except OSError:
if fileobj is not None and mode == 'r':
raise ReadError("not a gzip file")
raise ValueError("mode must be 'r', 'w' or 'x'")
try:
- import bz2
+ from bz2 import BZ2File
except ImportError:
raise CompressionError("bz2 module is not available")
- fileobj = bz2.BZ2File(fileobj or name, mode,
- compresslevel=compresslevel)
+ fileobj = BZ2File(fileobj or name, mode, compresslevel=compresslevel)
try:
t = cls.taropen(name, mode, fileobj, **kwargs)
raise ValueError("mode must be 'r', 'w' or 'x'")
try:
- import lzma
+ from lzma import LZMAFile, LZMAError
except ImportError:
raise CompressionError("lzma module is not available")
- fileobj = lzma.LZMAFile(fileobj or name, mode, preset=preset)
+ fileobj = LZMAFile(fileobj or name, mode, preset=preset)
try:
t = cls.taropen(name, mode, fileobj, **kwargs)
- except (lzma.LZMAError, EOFError):
+ except (LZMAError, EOFError):
fileobj.close()
if mode == 'r':
raise ReadError("not an lzma file")
L = list(range(half - 1, -1, -1))
L.extend(range(half))
# Force to float, so that the timings are comparable. This is
- # significantly faster if we leave tham as ints.
+ # significantly faster if we leave them as ints.
L = list(map(float, L))
doit(L) # !sort
print()
"while v:pass",
# If
"if v:pass",
+ # If-Elif
+ "if a:\n pass\nelif b:\n pass",
+ # If-Elif-Else
+ "if a:\n pass\nelif b:\n pass\nelse:\n pass",
# With
"with x as y: pass",
"with x as y, z as q: pass",
node = ast.parse('async def foo():\n x = "not docstring"')
self.assertIsNone(ast.get_docstring(node.body[0]))
+ def test_elif_stmt_start_position(self):
+ node = ast.parse('if a:\n pass\nelif b:\n pass\n')
+ elif_stmt = node.body[0].orelse[0]
+ self.assertEqual(elif_stmt.lineno, 3)
+ self.assertEqual(elif_stmt.col_offset, 0)
+
+ def test_elif_stmt_start_position_with_else(self):
+ node = ast.parse('if a:\n pass\nelif b:\n pass\nelse:\n pass\n')
+ elif_stmt = node.body[0].orelse[0]
+ self.assertEqual(elif_stmt.lineno, 3)
+ self.assertEqual(elif_stmt.col_offset, 0)
+
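Aside for reviewers: a minimal standalone sketch (not part of the patch) of the representation these tests pin down. An `elif` clause is stored as a nested `If` node inside the parent's `orelse` list, and on current interpreters its position points at the `elif` keyword itself:

```python
import ast

# An elif clause is a nested If node in the parent's orelse list;
# its lineno/col_offset point at the "elif" keyword's own line.
tree = ast.parse("if a:\n    pass\nelif b:\n    pass\n")
elif_node = tree.body[0].orelse[0]
print(type(elif_node).__name__)                # If
print(elif_node.lineno, elif_node.col_offset)  # 3 0
```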
def test_literal_eval(self):
self.assertEqual(ast.literal_eval('[1, 2, 3]'), [1, 2, 3])
self.assertEqual(ast.literal_eval('{"foo": 42}'), {"foo": 42})
('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Pass', (1, 11))], [])]),
('Module', [('While', (1, 0), ('Name', (1, 6), 'v', ('Load',)), [('Pass', (1, 8))], [])]),
('Module', [('If', (1, 0), ('Name', (1, 3), 'v', ('Load',)), [('Pass', (1, 5))], [])]),
+('Module', [('If', (1, 0), ('Name', (1, 3), 'a', ('Load',)), [('Pass', (2, 2))], [('If', (3, 0), ('Name', (3, 5), 'b', ('Load',)), [('Pass', (4, 2))], [])])]),
+('Module', [('If', (1, 0), ('Name', (1, 3), 'a', ('Load',)), [('Pass', (2, 2))], [('If', (3, 0), ('Name', (3, 5), 'b', ('Load',)), [('Pass', (4, 2))], [('Pass', (6, 2))])])]),
('Module', [('With', (1, 0), [('withitem', ('Name', (1, 5), 'x', ('Load',)), ('Name', (1, 10), 'y', ('Store',)))], [('Pass', (1, 13))])]),
('Module', [('With', (1, 0), [('withitem', ('Name', (1, 5), 'x', ('Load',)), ('Name', (1, 10), 'y', ('Store',))), ('withitem', ('Name', (1, 13), 'z', ('Load',)), ('Name', (1, 18), 'q', ('Store',)))], [('Pass', (1, 21))])]),
('Module', [('Raise', (1, 0), ('Call', (1, 6), ('Name', (1, 6), 'Exception', ('Load',)), [('Str', (1, 16), 'string')], []), None)]),
self.assertEqual([], messages)
+ def test_async_gen_await_same_anext_coro_twice(self):
+ async def async_iterate():
+ yield 1
+ yield 2
+
+ async def run():
+ it = async_iterate()
+ nxt = it.__anext__()
+ await nxt
+ with self.assertRaisesRegex(
+ RuntimeError,
+ r"cannot reuse already awaited __anext__\(\)/asend\(\)"
+ ):
+ await nxt
+
+ await it.aclose() # prevent unfinished iterator warning
+
+ self.loop.run_until_complete(run())
+
+ def test_async_gen_await_same_aclose_coro_twice(self):
+ async def async_iterate():
+ yield 1
+ yield 2
+
+ async def run():
+ it = async_iterate()
+ nxt = it.aclose()
+ await nxt
+ with self.assertRaisesRegex(
+ RuntimeError,
+ r"cannot reuse already awaited aclose\(\)/athrow\(\)"
+ ):
+ await nxt
+
+ self.loop.run_until_complete(run())
+
+ def test_async_gen_aclose_twice_with_different_coros(self):
+ # Regression test for https://bugs.python.org/issue39606
+ async def async_iterate():
+ yield 1
+ yield 2
+
+ async def run():
+ it = async_iterate()
+ await it.aclose()
+ await it.aclose()
+
+ self.loop.run_until_complete(run())
+
+ def test_async_gen_aclose_after_exhaustion(self):
+ # Regression test for https://bugs.python.org/issue39606
+ async def async_iterate():
+ yield 1
+ yield 2
+
+ async def run():
+ it = async_iterate()
+ async for _ in it:
+ pass
+ await it.aclose()
+
+ self.loop.run_until_complete(run())
if __name__ == "__main__":
unittest.main()
import unittest
+@unittest.skipUnless(decimal.HAVE_CONTEXTVAR, "decimal is built with a thread-local context")
class DecimalContextTest(unittest.TestCase):
def test_asyncio_task_decimal_context(self):
(b'3d}==', b'\xdd'),
(b'@@', b''),
(b'!', b''),
+        (b'YWJj\n', b'abc'),
(b'YWJj\nYWI=', b'abcab'))
funcs = (
base64.b64decode,
# the same way as the original. Thus, a substantial part of the
# memoryview tests is now in this module.
#
+# Written and designed by Stefan Krah for Python 3.3.
+#
import contextlib
import unittest
got = ostream.getvalue()
self.assertEqual(got, unistring)
+
class EscapeDecodeTest(unittest.TestCase):
def test_empty(self):
self.assertEqual(codecs.escape_decode(b""), (b"", 0))
puny = puny.decode("ascii").encode("ascii")
self.assertEqual(uni, puny.decode("punycode"))
+ def test_decode_invalid(self):
+ testcases = [
+ (b"xn--w&", "strict", UnicodeError()),
+ (b"xn--w&", "ignore", "xn-"),
+ ]
+ for puny, errors, expected in testcases:
+ with self.subTest(puny=puny, errors=errors):
+ if isinstance(expected, Exception):
+ self.assertRaises(UnicodeError, puny.decode, "punycode", errors)
+ else:
+ self.assertEqual(puny.decode("punycode", errors), expected)
+
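For context on the codec under test, a standalone sketch (not part of the patch) of a punycode round trip; the encoded form below is the well-known ASCII form of "bücher" (as in the IDNA label `xn--bcher-kva`):

```python
# The punycode codec turns non-ASCII text into an ASCII form;
# decoding invalid input raises UnicodeError under errors="strict".
text = "bücher"
puny = text.encode("punycode")
print(puny)                     # b'bcher-kva'
print(puny.decode("punycode"))  # 'bücher'
```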
class UnicodeInternalTest(unittest.TestCase):
@unittest.skipUnless(SIZEOF_WCHAR_T == 4, 'specific to 32-bit wchar_t')
self.assertRaises(UnicodeError,
codecs.decode, b'abc', 'undefined', errors)
+ def test_file_closes_if_lookup_error_raised(self):
+ mock_open = mock.mock_open()
+ with mock.patch('builtins.open', mock_open) as file:
+ with self.assertRaises(LookupError):
+ codecs.open(support.TESTFN, 'wt', 'invalid-encoding')
+
+ file().close.assert_called()
+
class StreamReaderTest(unittest.TestCase):
self.assertTrue(compile_dir.called)
self.assertEqual(compile_dir.call_args[-1]['workers'], None)
+ def _test_ddir_only(self, *, ddir, parallel=True):
+ """Recursive compile_dir ddir must contain package paths; bpo39769."""
+ fullpath = ["test", "foo"]
+ path = self.directory
+ mods = []
+ for subdir in fullpath:
+ path = os.path.join(path, subdir)
+ os.mkdir(path)
+ script_helper.make_script(path, "__init__", "")
+ mods.append(script_helper.make_script(path, "mod",
+ "def fn(): 1/0\nfn()\n"))
+ compileall.compile_dir(
+ self.directory, quiet=True, ddir=ddir,
+ workers=2 if parallel else 1)
+ self.assertTrue(mods)
+ for mod in mods:
+ self.assertTrue(mod.startswith(self.directory), mod)
+ modcode = importlib.util.cache_from_source(mod)
+ modpath = mod[len(self.directory+os.sep):]
+ _, _, err = script_helper.assert_python_failure(modcode)
+ expected_in = os.path.join(ddir, modpath)
+ mod_code_obj = test.test_importlib.util._get_code_from_pyc(modcode)
+ self.assertEqual(mod_code_obj.co_filename, expected_in)
+ self.assertIn(f'"{expected_in}"', os.fsdecode(err))
+
+ def test_ddir_only_one_worker(self):
+ """Recursive compile_dir ddir= contains package paths; bpo39769."""
+ return self._test_ddir_only(ddir="<a prefix>", parallel=False)
+
+ def test_ddir_multiple_workers(self):
+ """Recursive compile_dir ddir= contains package paths; bpo39769."""
+ return self._test_ddir_only(ddir="<a prefix>", parallel=True)
+
+ def test_ddir_empty_only_one_worker(self):
+ """Recursive compile_dir ddir='' contains package paths; bpo39769."""
+ return self._test_ddir_only(ddir="", parallel=False)
+
+ def test_ddir_empty_multiple_workers(self):
+ """Recursive compile_dir ddir='' contains package paths; bpo39769."""
+ return self._test_ddir_only(ddir="", parallel=True)
+
class CommmandLineTestsWithSourceEpoch(CommandLineTestsBase,
unittest.TestCase,
class EventfulGCObj():
- def __init__(self, ctx):
- mgr = get_context(ctx).Manager()
+ def __init__(self, mgr):
self.event = mgr.Event()
def __del__(self):
def test_ressources_gced_in_workers(self):
        # Ensure that arguments for a job are correctly gc-ed after the job
# is finished
- obj = EventfulGCObj(self.ctx)
+ mgr = get_context(self.ctx).Manager()
+ obj = EventfulGCObj(mgr)
future = self.executor.submit(id, obj)
future.result()
self.assertTrue(obj.event.wait(timeout=1))
+        # Explicitly destroy the object to ensure that EventfulGCObj.__del__()
+        # is called while the manager is still running.
+ obj = None
+ test.support.gc_collect()
+
+ mgr.shutdown()
+ mgr.join()
+
create_executor_tests(ProcessPoolExecutorTest,
executor_mixins=(ProcessPoolForkMixin,
42, 2**100, 3.14, True, False, 1j,
"hello", "hello\u1234", f.__code__,
b"world", bytes(range(256)), range(10), slice(1, 10, 2),
- NewStyle, Classic, max, WithMetaclass]
+ NewStyle, Classic, max, WithMetaclass, property()]
for x in tests:
self.assertIs(copy.copy(x), x)
pass
tests = [None, 42, 2**100, 3.14, True, False, 1j,
"hello", "hello\u1234", f.__code__,
- NewStyle, Classic, max]
+ NewStyle, Classic, max, property()]
for x in tests:
self.assertIs(copy.deepcopy(x), x)
o = C(42)
self.assertEqual(o.x, 42)
+ def test_field_default_default_factory_error(self):
+ msg = "cannot specify both default and default_factory"
+ with self.assertRaisesRegex(ValueError, msg):
+ @dataclass
+ class C:
+ x: int = field(default=1, default_factory=int)
+
+ def test_field_repr(self):
+ int_field = field(default=1, init=True, repr=False)
+ int_field.name = "id"
+ repr_output = repr(int_field)
+ expected_output = "Field(name='id',type=None," \
+ f"default=1,default_factory={MISSING!r}," \
+ "init=True,repr=False,hash=None," \
+ "compare=True,metadata=mappingproxy({})," \
+ "_field_type=None)"
+
+ self.assertEqual(repr_output, expected_output)
+
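A standalone sketch (not part of the patch) of the behavior the first test above checks: `field()` rejects a `default` combined with a `default_factory`, and the `ValueError` is raised as soon as `field()` is called, before any class is decorated:

```python
from dataclasses import field

# field() rejects a default combined with a default_factory, since
# the two would conflict when constructing instances.
try:
    field(default=1, default_factory=int)
except ValueError as exc:
    print(exc)  # cannot specify both default and default_factory
```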
def test_named_init_params(self):
@dataclass
class C:
self.assertEqual(Decimal.from_float(cls(101.1)),
Decimal.from_float(101.1))
+ def test_maxcontext_exact_arith(self):
+
+ # Make sure that exact operations do not raise MemoryError due
+ # to huge intermediate values when the context precision is very
+ # large.
+
+ # The following functions fill the available precision and are
+ # therefore not suitable for large precisions (by design of the
+ # specification).
+ MaxContextSkip = ['logical_invert', 'next_minus', 'next_plus',
+ 'logical_and', 'logical_or', 'logical_xor',
+ 'next_toward', 'rotate', 'shift']
+
+ Decimal = C.Decimal
+ Context = C.Context
+ localcontext = C.localcontext
+
+ # Here only some functions that are likely candidates for triggering a
+ # MemoryError are tested. deccheck.py has an exhaustive test.
+ maxcontext = Context(prec=C.MAX_PREC, Emin=C.MIN_EMIN, Emax=C.MAX_EMAX)
+ with localcontext(maxcontext):
+ self.assertEqual(Decimal(0).exp(), 1)
+ self.assertEqual(Decimal(1).ln(), 0)
+ self.assertEqual(Decimal(1).log10(), 0)
+ self.assertEqual(Decimal(10**2).log10(), 2)
+ self.assertEqual(Decimal(10**223).log10(), 223)
+ self.assertEqual(Decimal(10**19).logb(), 19)
+ self.assertEqual(Decimal(4).sqrt(), 2)
+ self.assertEqual(Decimal("40E9").sqrt(), Decimal('2.0E+5'))
+ self.assertEqual(divmod(Decimal(10), 3), (3, 1))
+ self.assertEqual(Decimal(10) // 3, 3)
+ self.assertEqual(Decimal(4) / 2, 2)
+ self.assertEqual(Decimal(400) ** -1, Decimal('0.0025'))
+
+
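A standalone sketch (not part of the patch) of the idea behind `test_maxcontext_exact_arith`: operations with exact results must stay cheap even when the context precision is enormous, instead of filling the precision with intermediate digits:

```python
import decimal

# Exact results such as ln(1) == 0 and sqrt(4) == 2 should not
# allocate huge intermediates at a very large precision.
with decimal.localcontext(decimal.Context(prec=10**6)):
    ln_one = decimal.Decimal(1).ln()
    root = decimal.Decimal(4).sqrt()
print(ln_one, root)  # 0 2
```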
@requires_docstrings
@unittest.skipUnless(C, "test requires C version")
class SignatureTest(unittest.TestCase):
support.check_free_after_iterating(self, lambda d: iter(d.items()), dict)
def test_equal_operator_modifying_operand(self):
- # test fix for seg fault reported in issue 27945 part 3.
+ # test fix for seg fault reported in bpo-27945 part 3.
class X():
def __del__(self):
dict_b.clear()
dict_b = {X(): X()}
self.assertTrue(dict_a == dict_b)
+ # test fix for seg fault reported in bpo-38588 part 1.
+ class Y:
+ def __eq__(self, other):
+ dict_d.clear()
+ return True
+
+ dict_c = {0: Y()}
+ dict_d = {0: set()}
+ self.assertTrue(dict_c == dict_d)
+
def test_fromkeys_operator_modifying_dict_operand(self):
# test fix for seg fault reported in issue 27945 part 4a.
class X(int):
finally:
support.forget(pkg_name)
sys.path.pop()
- assert doctest.DocTestFinder().find(mod) == []
+ include_empty_finder = doctest.DocTestFinder(exclude_empty=False)
+ exclude_empty_finder = doctest.DocTestFinder(exclude_empty=True)
+
+ self.assertEqual(len(include_empty_finder.find(mod)), 1)
+ self.assertEqual(len(exclude_empty_finder.find(mod)), 0)
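A standalone sketch (not part of the patch) of the `exclude_empty` distinction the fixed assertion relies on: a module with no docstring still yields one (example-less) `DocTest` when `exclude_empty=False`, and none when `exclude_empty=True`:

```python
import doctest
import types

# exclude_empty controls whether objects with an empty docstring
# still produce a DocTest that simply has no examples.
mod = types.ModuleType('sample_mod')  # __doc__ is None
print(len(doctest.DocTestFinder(exclude_empty=False).find(mod)))  # 1
print(len(doctest.DocTestFinder(exclude_empty=True).find(mod)))   # 0
```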
def test_DocTestParser(): r"""
Unit tests for the `DocTestParser` class.
import numbers
import operator
import fractions
+import functools
import sys
import unittest
import warnings
self.assertTypedEquals(0.1+0j, complex(F(1,10)))
+    def testBoolGuaranteesBoolReturn(self):
+        # Ensure that __bool__ is used on the numerator, which guarantees
+        # a bool return. See also bpo-39274.
+ @functools.total_ordering
+ class CustomValue:
+ denominator = 1
+
+ def __init__(self, value):
+ self.value = value
+
+ def __bool__(self):
+ return bool(self.value)
+
+ @property
+ def numerator(self):
+ # required to preserve `self` during instantiation
+ return self
+
+ def __eq__(self, other):
+ raise AssertionError("Avoid comparisons in Fraction.__bool__")
+
+ __lt__ = __eq__
+
+ # We did not implement all abstract methods, so register:
+ numbers.Rational.register(CustomValue)
+
+ numerator = CustomValue(1)
+ r = F(numerator)
+ # ensure the numerator was not lost during instantiation:
+ self.assertIs(r.numerator, numerator)
+ self.assertIs(bool(r), True)
+
+ numerator = CustomValue(0)
+ r = F(numerator)
+ self.assertIs(bool(r), False)
+
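A standalone sketch (not part of the patch) of the guarantee the test above establishes: a `Fraction`'s truth value is driven by its numerator's `__bool__`, so no ordering comparisons are needed to decide truthiness:

```python
from fractions import Fraction

# bool() of a Fraction follows the numerator's truth value.
print(bool(Fraction(0, 5)))   # False
print(bool(Fraction(-3, 5)))  # True
```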
def testRound(self):
self.assertTypedEquals(F(-200), round(F(-150), -2))
self.assertTypedEquals(F(-200), round(F(-250), -2))
isizeBytes = fRead.read(4)
self.assertEqual(isizeBytes, struct.pack('<i', len(data1)))
+ def test_compresslevel_metadata(self):
+ # see RFC 1952: http://www.faqs.org/rfcs/rfc1952.html
+ # specifically, discussion of XFL in section 2.3.1
+ cases = [
+ ('fast', 1, b'\x04'),
+ ('best', 9, b'\x02'),
+ ('tradeoff', 6, b'\x00'),
+ ]
+ xflOffset = 8
+
+ for (name, level, expectedXflByte) in cases:
+ with self.subTest(name):
+ fWrite = gzip.GzipFile(self.filename, 'w', compresslevel=level)
+ with fWrite:
+ fWrite.write(data1)
+ with open(self.filename, 'rb') as fRead:
+ fRead.seek(xflOffset)
+ xflByte = fRead.read(1)
+ self.assertEqual(xflByte, expectedXflByte)
+
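For reviewers, a standalone sketch (not part of the patch) of the header byte this test inspects. Per RFC 1952 section 2.3.1, the XFL byte at offset 8 records whether deflate used maximum compression (2) or the fastest setting (4):

```python
import gzip
import io

# Write a gzip stream at best compression and read back the XFL byte
# (offset 8 in the 10-byte fixed header).
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode='wb', compresslevel=9) as f:
    f.write(b'payload')
print(buf.getvalue()[8])  # 2
```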
def test_with_open(self):
# GzipFile supports the context management protocol
with gzip.GzipFile(self.filename, "wb") as f:
with self.assertRaises((IndexError, RuntimeError)):
self.module.heappop(heap)
+    def test_comparison_operator_modifying_heap(self):
+ # See bpo-39421: Strong references need to be taken
+ # when comparing objects as they can alter the heap
+ class EvilClass(int):
+ def __lt__(self, o):
+ heap.clear()
+ return NotImplemented
+
+ heap = []
+ self.module.heappush(heap, EvilClass(0))
+ self.assertRaises(IndexError, self.module.heappushpop, heap, 1)
+
+    def test_comparison_operator_modifying_heap_two_heaps(self):
+
+ class h(int):
+ def __lt__(self, o):
+ list2.clear()
+ return NotImplemented
+
+ class g(int):
+ def __lt__(self, o):
+ list1.clear()
+ return NotImplemented
+
+ list1, list2 = [], []
+
+ self.module.heappush(list1, h(0))
+ self.module.heappush(list2, g(0))
+
+ self.assertRaises((IndexError, RuntimeError), self.module.heappush, list1, g(1))
+ self.assertRaises((IndexError, RuntimeError), self.module.heappush, list2, h(1))
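A standalone sketch (not part of the patch) of the failure mode bpo-39421 guards against: a comparison method that mutates the heap mid-operation. With the fix, `heapq` surfaces this as a clean `IndexError` instead of reading freed memory:

```python
import heapq

# A comparison that clears the heap while heappushpop() is running.
class Evil(int):
    def __lt__(self, other):
        heap.clear()  # mutate the heap mid-comparison
        return NotImplemented

heap = []
heapq.heappush(heap, Evil(0))
caught = False
try:
    heapq.heappushpop(heap, 1)
except IndexError:
    caught = True
print(caught)  # True
```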
class TestErrorHandlingPython(TestErrorHandling, TestCase):
module = py_heapq
from importlib import machinery, util, invalidate_caches
from importlib.abc import ResourceReader
import io
+import marshal
import os
import os.path
from pathlib import Path, PurePath
return '{}.{}'.format(parent, name), path
+def _get_code_from_pyc(pyc_path):
+ """Reads a pyc file and returns the unmarshalled code object within.
+
+ No header validation is performed.
+ """
+ with open(pyc_path, 'rb') as pyc_f:
+ pyc_f.seek(16)
+ return marshal.load(pyc_f)
+
+
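For context on the helper above, a standalone sketch (not part of the patch): since Python 3.7 a `.pyc` file carries a 16-byte header (magic number, bit flags, source mtime or hash, source size) before the marshalled code object, so seeking past 16 bytes reaches the code directly:

```python
import marshal
import os
import py_compile
import tempfile

# Compile a tiny module, then skip the 16-byte pyc header and
# unmarshal the code object that follows.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, 'mod.py')
    with open(src, 'w') as f:
        f.write('ANSWER = 42\n')
    pyc = py_compile.compile(src)
    with open(pyc, 'rb') as f:
        f.seek(16)
        code = marshal.load(f)
    ns = {}
    exec(code, ns)
print(ns['ANSWER'])  # 42
```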
@contextlib.contextmanager
def uncache(*names):
"""Uncache a module from sys.modules.
file.seek(0)
file.close()
self.assertRaises(ValueError, file.read)
+ with self.open(support.TESTFN, "rb") as f:
+ file = self.open(f.fileno(), "rb", closefd=False)
+ self.assertEqual(file.read()[:3], b"egg")
+ file.close()
+ self.assertRaises(ValueError, file.readinto, bytearray(1))
def test_no_closefd_with_filename(self):
# can't use closefd in combination with a file name
# blown
self.assertRaises(RecursionError, blowstack, isinstance, '', str)
+ def test_issubclass_refcount_handling(self):
+        # bpo-39382: abstract_issubclass() didn't hold a reference to the
+        # bases tuple item while peeking at it, in the single inheritance case.
+ class A:
+ @property
+ def __bases__(self):
+ return (int, )
+
+ class B:
+ def __init__(self):
+ # setting this here increases the chances of exhibiting the bug,
+ # probably due to memory layout changes.
+ self.x = 1
+
+ @property
+ def __bases__(self):
+ return (A(), )
+
+ self.assertEqual(True, issubclass(B(), int))
+
+
def blowstack(fxn, arg, compare_to):
# Make sure that calling isinstance with a deeply nested tuple for its
# argument will raise RecursionError eventually.
with self.assertRaises(TypeError):
(3,) + L([1,2])
+ def test_equal_operator_modifying_operand(self):
+ # test fix for seg fault reported in bpo-38588 part 2.
+ class X:
+            def __eq__(self, other):
+ list2.clear()
+ return NotImplemented
+
+ class Y:
+ def __eq__(self, other):
+ list1.clear()
+ return NotImplemented
+
+ class Z:
+ def __eq__(self, other):
+ list3.clear()
+ return NotImplemented
+
+ list1 = [X()]
+ list2 = [Y()]
+ self.assertTrue(list1 == list2)
+
+ list3 = [Z()]
+ list4 = [1]
+ self.assertFalse(list3 == list4)
+
+ def test_count_index_remove_crashes(self):
+ # bpo-38610: The count(), index(), and remove() methods were not
+ # holding strong references to list elements while calling
+ # PyObject_RichCompareBool().
+ class X:
+ def __eq__(self, other):
+ lst.clear()
+ return NotImplemented
+
+ lst = [X()]
+ with self.assertRaises(ValueError):
+ lst.index(lst)
+
+ class L(list):
+ def __eq__(self, other):
+ str(other)
+ return NotImplemented
+
+ lst = L([X()])
+ lst.count(lst)
+
+ lst = L([X()])
+ with self.assertRaises(ValueError):
+ lst.remove(lst)
+
+ # bpo-39453: list.__contains__ was not holding strong references
+ # to list elements while calling PyObject_RichCompareBool().
+ lst = [X(), X()]
+ 3 in lst
+ lst = [X(), X()]
+ X() in lst
+
+
if __name__ == "__main__":
unittest.main()
self.assertRaises(ValueError, bc.convert, 'cfg://!')
self.assertRaises(KeyError, bc.convert, 'cfg://adict[2]')
+ def test_namedtuple(self):
+ # see bpo-39142
+ from collections import namedtuple
+
+ class MyHandler(logging.StreamHandler):
+ def __init__(self, resource, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+ self.resource: namedtuple = resource
+
+ def emit(self, record):
+ record.msg += f' {self.resource.type}'
+ return super().emit(record)
+
+ Resource = namedtuple('Resource', ['type', 'labels'])
+ resource = Resource(type='my_type', labels=['a'])
+
+ config = {
+ 'version': 1,
+ 'handlers': {
+ 'myhandler': {
+ '()': MyHandler,
+ 'resource': resource
+ }
+ },
+ 'root': {'level': 'INFO', 'handlers': ['myhandler']},
+ }
+ with support.captured_stderr() as stderr:
+ self.apply_config(config)
+ logging.info('some log')
+ self.assertEqual(stderr.getvalue(), 'some log my_type\n')
+
class ManagerTest(BaseTest):
def test_manager_loggerclass(self):
logged = []
"\tSat, 19 Jun 2010 18:04:08 -0400"
"\t<4FD05F05-F98B-44DC-8111-C6009C925F0C@gmail.com>"
"\t<hvalf7$ort$1@dough.gmane.org>\t7103\t16"
- "\tXref: news.gmane.org gmane.comp.python.authors:57"
+ "\tXref: news.gmane.io gmane.comp.python.authors:57"
"\n"
"58\tLooking for a few good bloggers"
"\tDoug Hellmann <doug.hellmann-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org>"
"references": "<hvalf7$ort$1@dough.gmane.org>",
":bytes": "7103",
":lines": "16",
- "xref": "news.gmane.org gmane.comp.python.authors:57"
+ "xref": "news.gmane.io gmane.comp.python.authors:57"
})
art_num, over = overviews[1]
self.assertEqual(over["xref"], None)
self.assertEqual(list(unpickled), expected)
self.assertEqual(list(it), expected)
+ @support.cpython_only
+ def test_weakref_list_is_not_traversed(self):
+ # Check that the weakref list is not traversed when collecting
+ # OrderedDict objects. See bpo-39778 for more information.
+
+ gc.collect()
+
+ x = self.OrderedDict()
+ x.cycle = x
+
+ cycle = []
+ cycle.append(cycle)
+
+ x_ref = weakref.ref(x)
+ cycle.append(x_ref)
+
+ del x, cycle, x_ref
+
+ gc.collect()
+
class PurePythonOrderedDictSubclassTests(PurePythonOrderedDictTests):
self.assertRaises(ZeroDivisionError, self.fspath,
FakePath(ZeroDivisionError()))
+ def test_pathlike_subclasshook(self):
+        # bpo-38878: __subclasshook__ made issubclass checks against
+        # abstract subclasses of os.PathLike incorrectly return True.
+ class A(os.PathLike):
+ pass
+ self.assertFalse(issubclass(FakePath, A))
+ self.assertTrue(issubclass(FakePath, os.PathLike))
+
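A standalone sketch (not part of the patch) of the hook under test: `os.PathLike.__subclasshook__` treats any class defining `__fspath__` as a virtual subclass of `os.PathLike` itself:

```python
import os

# Any class with __fspath__ is a virtual subclass of os.PathLike.
class MyPath:
    def __fspath__(self):
        return '/tmp/example'

print(issubclass(MyPath, os.PathLike))    # True
print(isinstance(MyPath(), os.PathLike))  # True
```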
class TimesTests(unittest.TestCase):
def test_times(self):
self.assertEqual(returncode, -3)
+ def test_communicate_repeated_call_after_stdout_close(self):
+ proc = subprocess.Popen([sys.executable, '-c',
+ 'import os, time; os.close(1), time.sleep(2)'],
+ stdout=subprocess.PIPE)
+ while True:
+ try:
+ proc.communicate(timeout=0.1)
+ return
+ except subprocess.TimeoutExpired:
+ pass
+
@unittest.skipUnless(mswindows, "Windows specific tests")
class Win32ProcessTestCase(BaseTestCase):
self.assertEqual(fr, typing.ForwardRef('int'))
self.assertNotEqual(List['int'], List[int])
+ def test_forward_equality_gth(self):
+ c1 = typing.ForwardRef('C')
+ c1_gth = typing.ForwardRef('C')
+ c2 = typing.ForwardRef('C')
+ c2_gth = typing.ForwardRef('C')
+
+ class C:
+ pass
+ def foo(a: c1_gth, b: c2_gth):
+ pass
+
+ self.assertEqual(get_type_hints(foo, globals(), locals()), {'a': C, 'b': C})
+ self.assertEqual(c1, c2)
+ self.assertEqual(c1, c1_gth)
+ self.assertEqual(c1_gth, c2_gth)
+ self.assertEqual(List[c1], List[c1_gth])
+ self.assertNotEqual(List[c1], List[C])
+ self.assertNotEqual(List[c1_gth], List[C])
+        self.assertEqual(Union[c1, c1_gth], Union[c1])
+        self.assertEqual(Union[c1, c1_gth, int], Union[c1, int])
+
+ def test_forward_equality_hash(self):
+ c1 = typing.ForwardRef('int')
+ c1_gth = typing.ForwardRef('int')
+ c2 = typing.ForwardRef('int')
+ c2_gth = typing.ForwardRef('int')
+
+ def foo(a: c1_gth, b: c2_gth):
+ pass
+ get_type_hints(foo, globals(), locals())
+
+ self.assertEqual(hash(c1), hash(c2))
+ self.assertEqual(hash(c1_gth), hash(c2_gth))
+ self.assertEqual(hash(c1), hash(c1_gth))
+
+ def test_forward_equality_namespace(self):
+ class A:
+ pass
+ def namespace1():
+ a = typing.ForwardRef('A')
+ def fun(x: a):
+ pass
+ get_type_hints(fun, globals(), locals())
+ return a
+
+ def namespace2():
+ a = typing.ForwardRef('A')
+
+ class A:
+ pass
+ def fun(x: a):
+ pass
+
+ get_type_hints(fun, globals(), locals())
+ return a
+
+ self.assertEqual(namespace1(), namespace1())
+ self.assertNotEqual(namespace1(), namespace2())
+
def test_forward_repr(self):
self.assertEqual(repr(List['int']), "typing.List[ForwardRef('int')]")
self.assertEqual(get_type_hints(foo, globals(), locals()),
{'a': Tuple[T]})
+ def test_forward_recursion_actually(self):
+ def namespace1():
+ a = typing.ForwardRef('A')
+ A = a
+ def fun(x: a): pass
+
+ ret = get_type_hints(fun, globals(), locals())
+ return a
+
+ def namespace2():
+ a = typing.ForwardRef('A')
+ A = a
+ def fun(x: a): pass
+
+ ret = get_type_hints(fun, globals(), locals())
+ return a
+
+ def cmp(o1, o2):
+ return o1 == o2
+
+ r1 = namespace1()
+ r2 = namespace2()
+ self.assertIsNot(r1, r2)
+ self.assertRaises(RecursionError, cmp, r1, r2)
+
+ def test_union_forward_recursion(self):
+ ValueList = List['Value']
+ Value = Union[str, ValueList]
+
+ class C:
+ foo: List[Value]
+ class D:
+ foo: Union[Value, ValueList]
+ class E:
+ foo: Union[List[Value], ValueList]
+ class F:
+ foo: Union[Value, List[Value], ValueList]
+
+ self.assertEqual(get_type_hints(C, globals(), locals()), get_type_hints(C, globals(), locals()))
+ self.assertEqual(get_type_hints(C, globals(), locals()),
+ {'foo': List[Union[str, List[Union[str, List['Value']]]]]})
+ self.assertEqual(get_type_hints(D, globals(), locals()),
+ {'foo': Union[str, List[Union[str, List['Value']]]]})
+ self.assertEqual(get_type_hints(E, globals(), locals()),
+ {'foo': Union[
+ List[Union[str, List[Union[str, List['Value']]]]],
+ List[Union[str, List['Value']]]
+ ]
+ })
+ self.assertEqual(get_type_hints(F, globals(), locals()),
+ {'foo': Union[
+ str,
+ List[Union[str, List['Value']]],
+ List[Union[str, List[Union[str, List['Value']]]]]
+ ]
+ })
+
def test_callable_forward(self):
def foo(a: Callable[['T'], 'T']):
self.assertTrue(bypass('localhost'))
self.assertTrue(bypass('LocalHost')) # MixedCase
self.assertTrue(bypass('LOCALHOST')) # UPPERCASE
+ self.assertTrue(bypass('.localhost'))
self.assertTrue(bypass('newdomain.com:1234'))
+ self.assertTrue(bypass('.newdomain.com:1234'))
self.assertTrue(bypass('foo.d.o.t')) # issue 29142
+ self.assertTrue(bypass('d.o.t'))
self.assertTrue(bypass('anotherdomain.com:8888'))
+ self.assertTrue(bypass('.anotherdomain.com:8888'))
self.assertTrue(bypass('www.newdomain.com:1234'))
self.assertFalse(bypass('prelocalhost'))
self.assertFalse(bypass('newdomain.com')) # no port
self.assertFalse(bypass('newdomain.com:1235')) # wrong port
+ def test_proxy_bypass_environment_always_match(self):
+ bypass = urllib.request.proxy_bypass_environment
+ self.env.set('NO_PROXY', '*')
+ self.assertTrue(bypass('newdomain.com'))
+ self.assertTrue(bypass('newdomain.com:1234'))
+ self.env.set('NO_PROXY', '*, anotherdomain.com')
+ self.assertTrue(bypass('anotherdomain.com'))
+ self.assertFalse(bypass('newdomain.com'))
+ self.assertFalse(bypass('newdomain.com:1234'))
+
+ def test_proxy_bypass_environment_newline(self):
+ bypass = urllib.request.proxy_bypass_environment
+ self.env.set('NO_PROXY',
+ 'localhost, anotherdomain.com, newdomain.com:1234')
+ self.assertFalse(bypass('localhost\n'))
+ self.assertFalse(bypass('anotherdomain.com:8888\n'))
+ self.assertFalse(bypass('newdomain.com:1234\n'))
+
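A standalone sketch (not part of the patch) of the matching rule the new tests cover: `proxy_bypass_environment` compares a host against the `no` proxy list, and a lone `*` entry bypasses the proxy for every host:

```python
import urllib.request

# A "*" no_proxy entry matches every host; an explicit list matches
# only the named hosts (and ports, when given).
always = urllib.request.proxy_bypass_environment(
    'newdomain.com', {'no': '*'})
never = urllib.request.proxy_bypass_environment(
    'other.org', {'no': 'localhost, newdomain.com:1234'})
print(bool(always), bool(never))  # True False
```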
class ProxyTests_withOrderedEnv(unittest.TestCase):
def test_portseparator(self):
# Issue 754016 makes changes for port separator ':' from scheme separator
- self.assertEqual(urllib.parse.urlparse("http:80"), ('http','','80','','',''))
- self.assertEqual(urllib.parse.urlparse("https:80"), ('https','','80','','',''))
- self.assertEqual(urllib.parse.urlparse("path:80"), ('path','','80','','',''))
+ self.assertEqual(urllib.parse.urlparse("path:80"),
+ ('','','path:80','','',''))
self.assertEqual(urllib.parse.urlparse("http:"),('http','','','','',''))
self.assertEqual(urllib.parse.urlparse("https:"),('https','','','','',''))
self.assertEqual(urllib.parse.urlparse("http://www.python.org:80"),
('http','www.python.org:80','','','',''))
# As usual, need to check bytes input as well
- self.assertEqual(urllib.parse.urlparse(b"http:80"), (b'http',b'',b'80',b'',b'',b''))
- self.assertEqual(urllib.parse.urlparse(b"https:80"), (b'https',b'',b'80',b'',b'',b''))
- self.assertEqual(urllib.parse.urlparse(b"path:80"), (b'path',b'',b'80',b'',b'',b''))
+ self.assertEqual(urllib.parse.urlparse(b"path:80"),
+ (b'',b'',b'path:80',b'',b'',b''))
self.assertEqual(urllib.parse.urlparse(b"http:"),(b'http',b'',b'',b'',b'',b''))
self.assertEqual(urllib.parse.urlparse(b"https:"),(b'https',b'',b'',b'',b'',b''))
self.assertEqual(urllib.parse.urlparse(b"http://www.python.org:80"),
module.filters = original_filters
+class TestWarning(Warning):
+ pass
+
+
class BaseTest:
"""Basic bookkeeping required for testing."""
self.module._setoption, 'bogus::Warning')
self.assertRaises(self.module._OptionError,
self.module._setoption, 'ignore:2::4:-5')
+ with self.assertRaises(self.module._OptionError):
+ self.module._setoption('ignore::123')
+ with self.assertRaises(self.module._OptionError):
+ self.module._setoption('ignore::123abc')
+ with self.assertRaises(self.module._OptionError):
+ self.module._setoption('ignore::===')
+ with self.assertRaisesRegex(self.module._OptionError, 'Wärning'):
+ self.module._setoption('ignore::Wärning')
self.module._setoption('error::Warning::0')
self.assertRaises(UserWarning, self.module.warn, 'convert to error')
+ def test_import_from_module(self):
+ with original_warnings.catch_warnings(module=self.module):
+ self.module._setoption('ignore::Warning')
+ with self.assertRaises(self.module._OptionError):
+ self.module._setoption('ignore::TestWarning')
+ with self.assertRaises(self.module._OptionError):
+ self.module._setoption('ignore::test.test_warnings.bogus')
+ self.module._setoption('error::test.test_warnings.TestWarning')
+ with self.assertRaises(TestWarning):
+ self.module.warn('test warning', TestWarning)
+
class CWCmdLineTests(WCmdLineTests, unittest.TestCase):
module = c_warnings
import unittest
import sys
import tkinter
-from tkinter.ttk import Scale
from tkinter.test.support import (AbstractTkTest, tcl_version, requires_tcl,
get_tk_patchlevel, pixels_conv, tcl_obj_eq)
import test.support
eq = tcl_obj_eq
self.assertEqual2(widget[name], expected, eq=eq)
self.assertEqual2(widget.cget(name), expected, eq=eq)
- # XXX
- if not isinstance(widget, Scale):
- t = widget.configure(name)
- self.assertEqual(len(t), 5)
- self.assertEqual2(t[4], expected, eq=eq)
+ t = widget.configure(name)
+ self.assertEqual(len(t), 5)
+ self.assertEqual2(t[4], expected, eq=eq)
def checkInvalidParam(self, widget, name, value, errmsg=None, *,
keep_orig=True):
def test_keys(self):
widget = self.create()
keys = widget.keys()
- # XXX
- if not isinstance(widget, Scale):
- self.assertEqual(sorted(keys), sorted(widget.configure()))
+ self.assertEqual(sorted(keys), sorted(widget.configure()))
for k in keys:
widget[k]
# Test if OPTIONS contains all keys
containing the current size setting of the given column. When
option-value pairs are given, the corresponding options of the
size setting of the given column are changed. Options may be one
- of the follwing:
+ of the following:
pad0 pixels
Specifies the paddings to the left of a column.
pad1 pixels
When no option-value pair is given, this command returns a list con-
taining the current size setting of the given row . When option-value
pairs are given, the corresponding options of the size setting of the
- given row are changed. Options may be one of the follwing:
+ given row are changed. Options may be one of the following:
pad0 pixels
Specifies the paddings to the top of a row.
pad1 pixels
Setting a value for any of the "from", "from_" or "to" options
generates a <<RangeChanged>> event."""
- if cnf:
+ retval = Widget.configure(self, cnf, **kw)
+ if not isinstance(cnf, (type(None), str)):
kw.update(cnf)
- Widget.configure(self, **kw)
if any(['from' in kw, 'from_' in kw, 'to' in kw]):
self.event_generate('<<RangeChanged>>')
+ return retval
def get(self, x=None, y=None):
import os
import re
import sys
+import sysconfig
import token
import tokenize
import inspect
opts = parser.parse_args()
if opts.ignore_dir:
- rel_path = 'lib', 'python{0.major}.{0.minor}'.format(sys.version_info)
- _prefix = os.path.join(sys.base_prefix, *rel_path)
- _exec_prefix = os.path.join(sys.base_exec_prefix, *rel_path)
+ _prefix = sysconfig.get_path("stdlib")
+ _exec_prefix = sysconfig.get_path("platstdlib")
def parse_ignore_dir(s):
s = os.path.expanduser(os.path.expandvars(s))
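The change above replaces hand-built ``lib/pythonX.Y`` paths with :mod:`sysconfig` lookups, which also cover distro-specific layouts such as ``/usr/lib64/pythonX.Y`` on Fedora. A minimal sketch of what those calls return:

```python
import sysconfig

# sysconfig knows the configured standard-library locations, including
# platform-specific layouts that sys.base_prefix arithmetic would miss.
stdlib = sysconfig.get_path("stdlib")          # platform-independent stdlib
platstdlib = sysconfig.get_path("platstdlib")  # platform-specific stdlib

print(stdlib)
print(platstdlib)
```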
def __eq__(self, other):
if not isinstance(other, ForwardRef):
return NotImplemented
- return (self.__forward_arg__ == other.__forward_arg__ and
- self.__forward_value__ == other.__forward_value__)
+ if self.__forward_evaluated__ and other.__forward_evaluated__:
+ return (self.__forward_arg__ == other.__forward_arg__ and
+ self.__forward_value__ == other.__forward_value__)
+ return self.__forward_arg__ == other.__forward_arg__
def __hash__(self):
- return hash((self.__forward_arg__, self.__forward_value__))
+ return hash(self.__forward_arg__)
def __repr__(self):
return f'ForwardRef({self.__forward_arg__!r})'
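Under the new semantics, two unevaluated references to the same name compare equal and hash alike, so they deduplicate correctly in caches and sets. A minimal illustration:

```python
from typing import ForwardRef

# Two unevaluated forward references to the same name: with the fix, only
# the forward-referenced name matters until both have been evaluated.
a = ForwardRef('MyClass')
b = ForwardRef('MyClass')
result = (a == b, hash(a) == hash(b))
print(result)
```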
def close(self) -> None:
pass
- @abstractmethod
+ @abstractproperty
def closed(self) -> bool:
pass
the specified test method's docstring.
"""
doc = self._testMethodDoc
- return doc and doc.split("\n")[0].strip() or None
+ return doc.strip().split("\n")[0].strip() if doc else None
def id(self):
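With the trimming change above, a docstring that begins with a blank line still yields a one-line description instead of ``None``. A small sketch:

```python
import unittest

class Demo(unittest.TestCase):
    def test_x(self):
        """
        First line of the description.
        Further detail that shortDescription() ignores.
        """

# The leading blank line is stripped before the first line is taken.
desc = Demo('test_x').shortDescription()
print(desc)
```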
if child is None or isinstance(child, _SpecState):
break
else:
+ # If an autospecced object is attached using attach_mock the
+ # child would be a function with mock object as attribute from
+ # which signature has to be derived.
+ child = _extract_mock(child)
children = child._mock_children
sig = child._spec_signature
continue
if isinstance(result, (staticmethod, classmethod)):
return False
- elif isinstance(getattr(result, '__get__', None), MethodWrapperTypes):
+ elif isinstance(result, FunctionTypes):
# Normal method => skip if looked up on type
# (if looked up on instance, self is already skipped)
return is_type
type(ANY.__eq__),
)
-MethodWrapperTypes = (
- type(ANY.__eq__.__get__),
-)
-
file_spec = None
'Tests shortDescription() for a method with a longer '
'docstring.')
+ def testShortDescriptionWhitespaceTrimming(self):
+ """
+ Tests shortDescription() whitespace is trimmed, so that the first
+        line of non-whitespace text becomes the docstring.
+ """
+ self.assertEqual(
+ self.shortDescription(),
+ 'Tests shortDescription() whitespace is trimmed, so that the first')
+
def testAddTypeEqualityFunc(self):
class SadSnake(object):
"""Dummy class for test_addTypeEqualityFunc."""
self.assertEqual(mock_func.mock._extract_mock_name(), 'mock.child')
+ def test_attach_mock_patch_autospec_signature(self):
+ with mock.patch(f'{__name__}.Something.meth', autospec=True) as mocked:
+ manager = Mock()
+ manager.attach_mock(mocked, 'attach_meth')
+ obj = Something()
+ obj.meth(1, 2, 3, d=4)
+ manager.assert_has_calls([call.attach_meth(mock.ANY, 1, 2, 3, d=4)])
+ obj.meth.assert_has_calls([call(mock.ANY, 1, 2, 3, d=4)])
+ mocked.assert_has_calls([call(mock.ANY, 1, 2, 3, d=4)])
+
+ with mock.patch(f'{__name__}.something', autospec=True) as mocked:
+ manager = Mock()
+ manager.attach_mock(mocked, 'attach_func')
+ something(1)
+ manager.assert_has_calls([call.attach_func(1)])
+ something.assert_has_calls([call(1)])
+ mocked.assert_has_calls([call(1)])
+
+ with mock.patch(f'{__name__}.Something', autospec=True) as mocked:
+ manager = Mock()
+ manager.attach_mock(mocked, 'attach_obj')
+ obj = Something()
+ obj.meth(1, 2, 3, d=4)
+ manager.assert_has_calls([call.attach_obj(),
+ call.attach_obj().meth(1, 2, 3, d=4)])
+ obj.meth.assert_has_calls([call(1, 2, 3, d=4)])
+ mocked.assert_has_calls([call(), call().meth(1, 2, 3, d=4)])
+
+
def test_attribute_deletion(self):
for mock in (Mock(), MagicMock(), NonCallableMagicMock(),
NonCallableMock()):
netloc = query = fragment = ''
i = url.find(':')
if i > 0:
+ if url[:i] == 'http': # optimize the common case
+ url = url[i+1:]
+ if url[:2] == '//':
+ netloc, url = _splitnetloc(url, 2)
+ if (('[' in netloc and ']' not in netloc) or
+ (']' in netloc and '[' not in netloc)):
+ raise ValueError("Invalid IPv6 URL")
+ if allow_fragments and '#' in url:
+ url, fragment = url.split('#', 1)
+ if '?' in url:
+ url, query = url.split('?', 1)
+ _checknetloc(netloc)
+ v = SplitResult('http', netloc, url, query, fragment)
+ _parse_cache[key] = v
+ return _coerce_result(v)
for c in url[:i]:
if c not in scheme_chars:
break
else:
- scheme, url = url[:i].lower(), url[i+1:]
+ # make sure "url" is not actually a port number (in which case
+ # "scheme" is really part of the path)
+ rest = url[i+1:]
+ if not rest or any(c not in '0123456789' for c in rest):
+ # not a port number
+ scheme, url = url[:i].lower(), rest
if url[:2] == '//':
netloc, url = _splitnetloc(url, 2)
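The port-number guard restored above can be sketched in isolation. The helper ``split_scheme`` below is hypothetical (a simplification of the real ``urlsplit``, which additionally requires the scheme to start with a letter):

```python
scheme_chars = ('abcdefghijklmnopqrstuvwxyz'
                'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
                '0123456789+-.')

def split_scheme(url):
    """Sketch of the restored heuristic: 'name:rest' only counts as a
    scheme when rest is not purely digits (i.e. not a bare port)."""
    i = url.find(':')
    if i > 0 and all(c in scheme_chars for c in url[:i]):
        rest = url[i + 1:]
        if not rest or any(c not in '0123456789' for c in rest):
            return url[:i].lower(), rest
    return '', url

print(split_scheme('http://example.com'))  # scheme recognized
print(split_scheme('path:80'))             # bare port: no scheme split
```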
"""splitport('host:port') --> 'host', 'port'."""
global _portprog
if _portprog is None:
- _portprog = re.compile('(.*):([0-9]*)$', re.DOTALL)
+ _portprog = re.compile('(.*):([0-9]*)', re.DOTALL)
- match = _portprog.match(host)
+ match = _portprog.fullmatch(host)
if match:
host, port = match.groups()
if port:
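The switch from ``match`` with a trailing ``$`` to ``fullmatch`` matters because ``$`` silently tolerates a final newline. A quick demonstration:

```python
import re

old = re.compile('(.*):([0-9]*)$', re.DOTALL)
new = re.compile('(.*):([0-9]*)', re.DOTALL)

host = 'example.com:80\n'   # stray trailing newline

# '$' also matches just before a final newline, so the old pattern accepted it
print(old.match(host) is not None)
# fullmatch must consume the whole string, so the newline is rejected
print(new.fullmatch(host) is None)
print(new.fullmatch('example.com:80').groups())
```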
req.selector)
        # NOTE: As per RFC 2617, when the server sends "auth,auth-int", the
        # client may use either `auth` or `auth-int` in the response. We use
        # `auth` to send the response back.
- if 'auth' in qop.split(','):
+ if qop is None:
+ respdig = KD(H(A1), "%s:%s" % (nonce, H(A2)))
+ elif 'auth' in qop.split(','):
if nonce == self.last_nonce:
self.nonce_count += 1
else:
cnonce = self.get_cnonce(nonce)
noncebit = "%s:%s:%s:%s:%s" % (nonce, ncvalue, cnonce, 'auth', H(A2))
respdig = KD(H(A1), noncebit)
- elif qop is None:
- respdig = KD(H(A1), "%s:%s" % (nonce, H(A2)))
else:
# XXX handle auth-int.
raise URLError("qop '%s' is not supported." % qop)
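When the server omits ``qop`` entirely, the fix above falls back to the pre-qop RFC 2617 form ``KD(H(A1), nonce ":" H(A2))``. A sketch with :mod:`hashlib` (the credentials and nonce are illustrative values, not from a real exchange):

```python
import hashlib

def H(x):
    return hashlib.md5(x.encode()).hexdigest()

def KD(secret, data):
    return H('%s:%s' % (secret, data))

# Illustrative inputs only
A1 = 'user:realm@example.com:secret'
A2 = 'GET:/protected'
nonce = '0123456789abcdef'

# Digest response when no qop was negotiated
respdig = KD(H(A1), '%s:%s' % (nonce, H(A2)))
print(respdig)
```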
try:
no_proxy = proxies['no']
except KeyError:
- return 0
+ return False
# '*' is special case for always bypass
if no_proxy == '*':
- return 1
+ return True
+ host = host.lower()
# strip port off host
hostonly, port = splitport(host)
# check if the host ends with any of the DNS suffixes
- no_proxy_list = [proxy.strip() for proxy in no_proxy.split(',')]
- for name in no_proxy_list:
+ for name in no_proxy.split(','):
+ name = name.strip()
if name:
name = name.lstrip('.') # ignore leading dots
- name = re.escape(name)
- pattern = r'(.+\.)?%s$' % name
- if (re.match(pattern, hostonly, re.I)
- or re.match(pattern, host, re.I)):
- return 1
+ name = name.lower()
+ if hostonly == name or host == name:
+ return True
+ name = '.' + name
+ if hostonly.endswith(name) or host.endswith(name):
+ return True
# otherwise, don't bypass
- return 0
+ return False
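The rewritten matching logic drops the regex in favor of case-insensitive exact and dot-suffix comparisons. An isolated sketch (``bypass_proxy`` is a hypothetical helper that mirrors, but does not import, the stdlib code, with a simplified port split):

```python
def bypass_proxy(host, no_proxy):
    """Return True if *host* should bypass the proxy per NO_PROXY rules."""
    if no_proxy == '*':
        return True          # special case: always bypass
    host = host.lower()
    # simplified stand-in for urllib's splitport()
    hostonly = host.rsplit(':', 1)[0] if ':' in host else host
    for name in no_proxy.split(','):
        name = name.strip().lstrip('.').lower()  # ignore leading dots
        if not name:
            continue
        if host == name or hostonly == name:
            return True
        suffix = '.' + name
        if host.endswith(suffix) or hostonly.endswith(suffix):
            return True
    return False

print(bypass_proxy('www.example.com:8080', '.example.com, localhost'))
print(bypass_proxy('example.org', 'example.com'))
```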
# This code tests an OSX specific data structure but is testable on all
for p in proxyServer.split(';'):
protocol, address = p.split('=', 1)
# See if address has a type:// prefix
- if not re.match('^([^/:]+)://', address):
+ if not re.match('(?:[^/:]+)://', address):
address = '%s://%s' % (protocol, address)
proxies[protocol] = address
else:
# Helper for _processoptions()
def _setoption(arg):
- import re
parts = arg.split(':')
if len(parts) > 5:
raise _OptionError("too many fields (max 5): %r" % (arg,))
action, message, category, module, lineno = [s.strip()
for s in parts]
action = _getaction(action)
- message = re.escape(message)
category = _getcategory(category)
- module = re.escape(module)
+ if message or module:
+ import re
+ if message:
+ message = re.escape(message)
if module:
- module = module + '$'
+ module = re.escape(module) + r'\Z'
if lineno:
try:
lineno = int(lineno)
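The change above anchors the module pattern with ``\Z`` instead of ``$``, so a ``-W`` module field matches the module name exactly rather than as a prefix:

```python
import re

# The -W module field is escaped and anchored with \Z, matching the
# module name exactly.
pattern = re.escape('pkg.mod') + r'\Z'

print(bool(re.match(pattern, 'pkg.mod')))      # exact name matches
print(bool(re.match(pattern, 'pkg.module')))   # a longer name does not
```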
# Helper for _setoption()
def _getcategory(category):
- import re
if not category:
return Warning
- if re.match("^[a-zA-Z0-9_]+$", category):
- try:
- cat = eval(category)
- except NameError:
- raise _OptionError("unknown warning category: %r" % (category,)) from None
+ if '.' not in category:
+ import builtins as m
+ klass = category
else:
- i = category.rfind(".")
- module = category[:i]
- klass = category[i+1:]
+ module, _, klass = category.rpartition('.')
try:
m = __import__(module, None, None, [klass])
except ImportError:
raise _OptionError("invalid module name: %r" % (module,)) from None
- try:
- cat = getattr(m, klass)
- except AttributeError:
- raise _OptionError("unknown warning category: %r" % (category,)) from None
+ try:
+ cat = getattr(m, klass)
+ except AttributeError:
+ raise _OptionError("unknown warning category: %r" % (category,)) from None
if not issubclass(cat, Warning):
raise _OptionError("invalid warning category: %r" % (category,))
return cat
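The dotted-name resolution above, which replaces the old ``eval``-based lookup, can be sketched standalone. ``resolve_category`` is a hypothetical helper using :func:`importlib.import_module` instead of raw ``__import__`` and without the ``_OptionError`` wrapping:

```python
import importlib

def resolve_category(category):
    """Resolve a -W category the way _getcategory now does: bare names
    come from builtins, dotted names from the named module."""
    if not category:
        return Warning
    if '.' not in category:
        import builtins as m
        klass = category
    else:
        module, _, klass = category.rpartition('.')
        m = importlib.import_module(module)
    cat = getattr(m, klass)
    if not issubclass(cat, Warning):
        raise ValueError('invalid warning category: %r' % (category,))
    return cat

print(resolve_category('DeprecationWarning'))
print(resolve_category('builtins.UserWarning'))
```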
def _synthesize(browser, *, preferred=False):
- """Attempt to synthesize a controller base on existing controllers.
+ """Attempt to synthesize a controller based on existing controllers.
This is useful to create a controller when a user specifies a path to
an entry in the BROWSER environment variable -- we can copy a general
),
),
dict(
- name="SQLite 3.28.0",
- url="https://www.sqlite.org/2019/sqlite-autoconf-3280000.tar.gz",
- checksum='3c68eb400f8354605736cd55400e1572',
+ name="SQLite 3.31.1",
+ url="https://sqlite.org/2020/sqlite-autoconf-3310100.tar.gz",
+ checksum='2d0a553534c521504e3ac3ad3b90f125',
extra_cflags=('-Os '
'-DSQLITE_ENABLE_FTS5 '
'-DSQLITE_ENABLE_FTS4 '
-{\rtf1\ansi\ansicpg1252\cocoartf1671\cocoasubrtf600
-{\fonttbl\f0\fswiss\fcharset0 Helvetica-Bold;\f1\fswiss\fcharset0 Helvetica;\f2\fmodern\fcharset0 CourierNewPS-BoldMT;
+{\rtf1\ansi\ansicpg1252\cocoartf2511
+\cocoatextscaling0\cocoaplatform0{\fonttbl\f0\fswiss\fcharset0 Helvetica-Bold;\f1\fswiss\fcharset0 Helvetica;\f2\fmodern\fcharset0 CourierNewPS-BoldMT;
\f3\fmodern\fcharset0 CourierNewPSMT;}
{\colortbl;\red255\green255\blue255;}
{\*\expandedcolortbl;;}
\f1\b0 \
1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and the Individual or Organization ("Licensee") accessing and otherwise using this software ("Python") in source or binary form and its associated documentation.\
\
-2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\
+2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright \'a9 2001-2020 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\
\
3. In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to Python.\
\
-{\rtf1\ansi\ansicpg1252\cocoartf1671\cocoasubrtf600
-{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fswiss\fcharset0 Helvetica-Bold;\f2\fswiss\fcharset0 Helvetica-Oblique;
+{\rtf1\ansi\ansicpg1252\cocoartf2511
+\cocoatextscaling0\cocoaplatform0{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fswiss\fcharset0 Helvetica-Bold;\f2\fswiss\fcharset0 Helvetica-Oblique;
\f3\fmodern\fcharset0 CourierNewPSMT;}
{\colortbl;\red255\green255\blue255;}
{\*\expandedcolortbl;;}
\cf0 \
\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\f1\b \cf0 \ul Certificate verification and OpenSSL\
+\f1\b \cf0 \ul \ulc0 Certificate verification and OpenSSL\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\f0\b0 \ulnone \
+\f0\b0 \cf0 \ulnone \
This package includes its own private copy of OpenSSL 1.1.1. The trust certificates in system and user keychains managed by the
\f2\i Keychain Access
\f0\i0 application and the
\f3 pip
\f0 has its own default certificate store for verifying download connections.\
\
+\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\f1\b \ul Which installer variant should I use?
+\f1\b \cf0 \ul \ulc0 Which installer variant should I use?
\f0\b0 \ulnone \
\
\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\partightenfactor0
-\cf0 In almost all cases, you should use the
+\cf0 Use the
\f1\b macOS 64-bit installer for OS X 10.9 and later
-\f0\b0 .\
-\
-The legacy
+\f0\b0 . As of 3.7.7, the deprecated
\f1\b macOS 64-bit/32-bit installer for Mac OS X 10.6 and later
-\f0\b0 variant is now deprecated.
-\f1\b Python 3.8.0
-\f0\b0 will
-\f1\b not
-\f0\b0 include a binary installer for the 10.6+ variant and
-\f1\b future bugfix releases of 3.7.x
-\f0\b0 may not, either. macOS 10.6 Snow Leopard was released in 2009 and has not been supported by Apple for many years including lack of security updates. It is becoming increasingly difficult to ensure new Python features and bug fixes are compatible with such old systems. Note that, due to recent Apple installer packaging changes, the 10.6+ installer pkg we provide can no longer be opened by the Apple system installer application on 10.6; 10.7 and 10.8 are not affected. We believe that there is now very little usage of this installer variant and so we would like to focus our resources on supporting newer systems. We do not plan to intentionally break Python support on 10.6 through 10.8 and we will consider bug fixes for problems found when building from source on those systems.
-\f1\b macOS 10.15 Catalina
-\f0\b0 removes support for running 32-bit architecture programs; we do not recommend trying to use the 10.6+ variant on it and it may not install on 10.15 systems without intervention. \
+\f0\b0 variant is no longer provided. \
\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\f1\b \cf0 \ul \
+\f1\b \cf0 \ul \ulc0 \
Using IDLE or other Tk applications
\f0\b0 \ulnone \
\
This package includes its own private version of Tcl/Tk 8.6. It does not use any system-supplied or third-party supplied versions of Tcl/Tk.\
\
-\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural\partightenfactor0
-\cf0 Due to new security checks on macOS 10.15 Catalina, when launching IDLE macOS may open a window with a message
+Due to new security checks on macOS 10.15 Catalina, when launching IDLE macOS may open a window with a message
\f1\b "Python" would like to access files in your Documents folder
\f0\b0 . This is normal as IDLE uses your
\f1\b Documents
<key>CFBundleExecutable</key>
<string>IDLE</string>
<key>CFBundleGetInfoString</key>
- <string>%version%, © 2001-2019 Python Software Foundation</string>
+ <string>%version%, © 2001-2020 Python Software Foundation</string>
<key>CFBundleIconFile</key>
<string>IDLE.icns</string>
<key>CFBundleIdentifier</key>
<key>CFBundleExecutable</key>
<string>Python Launcher</string>
<key>CFBundleGetInfoString</key>
- <string>%VERSION%, © 2001-2019 Python Software Foundation</string>
+ <string>%VERSION%, © 2001-2020 Python Software Foundation</string>
<key>CFBundleIconFile</key>
<string>PythonLauncher.icns</string>
<key>CFBundleIdentifier</key>
system header files in their traditional locations, like ``/usr/include`` and
``/System/Library/Frameworks``; instead they are found within a MacOSX SDK.
The Apple-supplied build tools handle this transparently and current
- versiona of Python now handle this as well. So it is no longer necessary,
+ versions of Python now handle this as well. So it is no longer necessary,
and since macOS 10.14, no longer possible to force the installation of system
headers with ``xcode-select``.
<key>CFBundleInfoDictionaryVersion</key>
<string>6.0</string>
<key>CFBundleLongVersionString</key>
- <string>%version%, (c) 2001-2019 Python Software Foundation.</string>
+ <string>%version%, (c) 2001-2020 Python Software Foundation.</string>
<key>CFBundleName</key>
<string>Python</string>
<key>CFBundlePackageType</key>
ctags -w $(srcdir)/Include/*.h $(srcdir)/Include/internal/*.h
for i in $(SRCDIRS); do ctags -f tags -w -a $(srcdir)/$$i/*.[ch]; done
ctags -f tags -w -a $(srcdir)/Modules/_ctypes/*.[ch]
+ find $(srcdir)/Lib -type f -name "*.py" -not -name "test_*.py" -not -path "*/test/*" -not -path "*/tests/*" -not -path "*/*_test/*" | ctags -f tags -w -a -L -
LC_ALL=C sort -o tags tags
# Create a tags file for GNU Emacs
cd $(srcdir); \
etags Include/*.h Include/internal/*.h; \
for i in $(SRCDIRS); do etags -a $$i/*.[ch]; done
+ etags -a $(srcdir)/Modules/_ctypes/*.[ch]
+ find $(srcdir)/Lib -type f -name "*.py" -not -name "test_*.py" -not -path "*/test/*" -not -path "*/tests/*" -not -path "*/*_test/*" | etags - -a
# Sanitation targets -- clean leaves libraries, executables and tags
# files, which clobber removes as well
Jeroen Van Goey
Christoph Gohlke
Tim Golden
+Yonatan Goldschmidt
Mark Gollahon
Guilherme Gonçalves
Tiago Gonçalves
Robert Li
Xuanji Li
Zekun Li
+Zheao Li
Robert van Liere
Ross Light
Shawn Ligocki
Anish Shah
Daniel Shahaf
Hui Shang
+Geoff Shannon
Mark Shannon
Ha Shao
Richard Shapiro
Gennadiy Zlobin
Doug Zongker
Peter Åstrand
-Zheao Li
-Geoff Shannon
-Ngalim Siregar
+
+(Entries should be added in rough alphabetical order by last names)
Python News
+++++++++++
+What's New in Python 3.7.7 final?
+=================================
+
+*Release date: 2020-03-10*
+
+Library
+-------
+
+- bpo-13487: Avoid a possible *"RuntimeError: dictionary changed size during
+ iteration"* from :func:`inspect.getmodule` when it tried to loop through
+ :attr:`sys.modules`.
+
+Documentation
+-------------
+
+- bpo-17422: The language reference no longer restricts default class
+ namespaces to dicts only.
+
+
+What's New in Python 3.7.7 release candidate 1?
+===============================================
+
+*Release date: 2020-03-04*
+
+Security
+--------
+
+- bpo-39401: Avoid unsafe load of ``api-ms-win-core-path-l1-1-0.dll`` at
+ startup on Windows 7.
+
+Core and Builtins
+-----------------
+
+- bpo-39776: Fix race condition where threads created by PyGILState_Ensure()
+ could get a duplicate id.
+
+ This affects consumers of tstate->id like the contextvar caching
+ machinery, which could return invalid cached objects under heavy thread
+ load (observed in embedded scenarios).
+
+- bpo-39778: Fixed a crash due to incorrect handling of weak references in
+ ``collections.OrderedDict`` classes. Patch by Pablo Galindo.
+
+- bpo-39382: Fix a use-after-free in the single inheritance path of
+ ``issubclass()``, when the ``__bases__`` of an object has a single
+ reference, and so does its first item. Patch by Yonatan Goldschmidt.
+
+- bpo-39606: Fix regression caused by fix for bpo-39386, that prevented
+ calling ``aclose`` on an async generator that had already been closed or
+ exhausted.
+
+- bpo-39510: Fix segfault in ``readinto()`` method on closed BufferedReader.
+
+- bpo-39453: Fixed a possible crash in :meth:`list.__contains__` when a list
+ is changed during comparing items. Patch by Dong-hee Na.
+
+- bpo-39427: Document all possibilities for the ``-X`` options in the
+ command line help section. Patch by Pablo Galindo.
+
+- bpo-39421: Fix possible crashes when operating with the functions in the
+ :mod:`heapq` module and custom comparison operators.
+
+- bpo-39386: Prevent double awaiting of async iterator.
+
+- bpo-38588: Fix possible crashes in dict and list when calling
+ :c:func:`PyObject_RichCompareBool`.
+
+- bpo-39031: When parsing an "elif" node, lineno and col_offset of the node
+ now point to the "elif" keyword and not to its condition, making it
+ consistent with the "if" node. Patch by Lysandros Nikolaou.
+
+- bpo-38610: Fix possible crashes in several list methods by holding strong
+ references to list elements when calling
+ :c:func:`PyObject_RichCompareBool`.
+
+Library
+-------
+
+- bpo-39794: Add --without-decimal-contextvar build option. This enables a
+  thread-local rather than a coroutine-local context.
+
+- bpo-39769: The :func:`compileall.compile_dir` function's *ddir* parameter
+ and the compileall command line flag `-d` no longer write the wrong
+ pathname to the generated pyc file for submodules beneath the root of the
+ directory tree being compiled. This fixes a regression introduced with
+ Python 3.5.
+
+- bpo-30566: Fix :exc:`IndexError` when trying to decode an invalid string
+  with the punycode codec.
+
+- bpo-39649: Remove obsolete check for `__args__` in
+ bdb.Bdb.format_stack_entry.
+
+- bpo-27657: The original fix for bpo-27657, "Fix urlparse() with numeric
+ paths" (GH-16839) included in 3.7.6, inadvertently introduced a behavior
+ change that broke several third-party packages relying on the original
+ undefined parsing behavior. The change is reverted in 3.7.7, restoring the
+ behavior of 3.7.5 and earlier releases.
+
+- bpo-21016: The :mod:`pydoc` and :mod:`trace` modules now use the
+ :mod:`sysconfig` module to get the path to the Python standard library, to
+  support uncommon installation paths like ``/usr/lib64/python3.9/`` on
+ Fedora. Patch by Jan Matějek.
+
+- bpo-39548: Fix handling of header in
+ :class:`urllib.request.AbstractDigestAuthHandler` when the optional
+ ``qop`` parameter is not present.
+
+- bpo-39450: Stripped whitespace from the docstring before returning it from
+  :func:`unittest.case.shortDescription`.
+
+- bpo-39493: Mark ``typing.IO.closed`` as a property.
+
+- bpo-39485: Fix a bug in :func:`unittest.mock.create_autospec` that would
+ complain about the wrong number of arguments for custom descriptors
+ defined in an extension module returning functions.
+
+- bpo-39430: Fixed race condition in lazy imports in :mod:`tarfile`.
+
+- bpo-39389: Write accurate compression level metadata in :mod:`gzip`
+ archives, rather than always signaling maximum compression.
+
+- bpo-39274: ``bool(fraction.Fraction)`` now returns a boolean even if
+ (numerator != 0) does not return a boolean (ex: numpy number).
+
+- bpo-39242: Updated the Gmane domain from news.gmane.org to news.gmane.io
+ which is used for examples of :class:`~nntplib.NNTP` news reader server
+ and nntplib tests.
+
+- bpo-39152: Fix ttk.Scale.configure([name]) to return configuration tuple
+ for name or all options. Giovanni Lombardo contributed part of the patch.
+
+- bpo-39198: If an exception were to be thrown in `Logger.isEnabledFor`
+  (say, by asyncio timeouts or stopit), the `logging` global lock may not be
+  released appropriately, resulting in deadlock. This change wraps that block
+  of code with `try...finally` to ensure the lock is released.
+
+- bpo-39191: Perform a check for running loop before starting a new task in
+ ``loop.run_until_complete()`` to fail fast; it prevents the side effect of
+ new task spawning before exception raising.
+
+- bpo-38871: Correctly parenthesize filter-based statements that contain
+  lambda expressions in :mod:`lib2to3`. Patch by Dong-hee Na.
+
+- bpo-39142: A change was made to logging.config.dictConfig to avoid
+ converting instances of named tuples to ConvertingTuple. It's assumed that
+ named tuples are too specialised to be treated like ordinary tuples; if a
+ user of named tuples requires ConvertingTuple functionality, they will
+ have to implement that themselves in their named tuple class.
+
+- bpo-38971: ``codecs.open()`` is now at parity with ``io.open()``, which
+  uses a try/except to ensure the file stream is closed before an exception
+  is raised.
+
+- bpo-39057: :func:`urllib.request.proxy_bypass_environment` now ignores
+ leading dots and no longer ignores a trailing newline.
+
+- bpo-39056: Fixed handling of an invalid warning category in the -W option.
+  No longer import the re module if it is not needed.
+
+- bpo-39055: :func:`base64.b64decode` with ``validate=True`` now raises a
+  binascii.Error if the input ends with a single ``\n``.
+
+- bpo-38878: Fixed __subclasshook__ of :class:`os.PathLike` to return a
+  correct result upon inheritance. Patch by Bar Harel.
+
+- bpo-35182: Fixed a crash on subsequent calls to :func:`Popen.communicate`
+  when the child process has already closed any piped standard stream but
+  still continues to run. Patch by Andriy Maletsky.
+
+- bpo-38473: Use signature from inner mock for autospecced methods attached
+ with :func:`unittest.mock.attach_mock`. Patch by Karthikeyan Singaravelan.
+
+- bpo-38293: Add :func:`copy.copy` and :func:`copy.deepcopy` support to
+ :func:`property` objects.
+
+- bpo-37953: In :mod:`typing`, improved the ``__hash__`` and ``__eq__``
+  methods for :class:`ForwardRef`.
+
+- bpo-36406: Handle namespace packages in :mod:`doctest`. Patch by
+ Karthikeyan Singaravelan.
+
+Documentation
+-------------
+
+- bpo-13790: Change 'string' to 'specification' in format doc.
+
+- bpo-39530: Fix misleading documentation about mixed-type numeric
+ comparisons.
+
+- bpo-17422: The language reference now specifies restrictions on class
+ namespaces. Adapted from a patch by Ethan Furman.
+
+- bpo-39654: In pyclbr doc, update 'class' to 'module' where appropriate and
+ add readmodule comment. Patch by Hakan Çelik.
+
+- bpo-39392: Explain that when filling with turtle, overlap regions may be
+ left unfilled.
+
+- bpo-39381: Mention in docs that :func:`asyncio.get_event_loop` implicitly
+  creates a new event loop only if called from the main thread.
+
+- bpo-38918: Add an entry for ``__module__`` in the "function" & "method"
+ sections of the `inspect docs types and members table
+ <https://docs.python.org/3/library/inspect.html#types-and-members>`_
+
+- bpo-3530: In the :mod:`ast` module documentation, fix a misleading
+ ``NodeTransformer`` example and add advice on when to use the
+ ``fix_missing_locations`` function.
+
+Tests
+-----
+
+- bpo-38546: Fix test_ressources_gced_in_workers() of
+ test_concurrent_futures: explicitly stop the manager to prevent leaking a
+ child process running in the background after the test completes.
+
+Build
+-----
+
+- bpo-39144: The ctags and etags build targets both include Modules/_ctypes
+ and Python standard library source files.
+
+Windows
+-------
+
+- bpo-38597: :mod:`distutils` will no longer statically link
+ :file:`vcruntime140.dll` when a redistributable version is unavailable.
+ All future releases of CPython will include a copy of this DLL to ensure
+ distributed extensions can continue to load.
+
+- bpo-38380: Update Windows builds to use SQLite 3.31.1
+
+- bpo-39439: Reduce overhead when using multiprocessing in a Windows virtual
+ environment
+
+- bpo-39185: The build.bat script has additional options for very-quiet
+ output (-q) and very-verbose output (-vv)
+
+macOS
+-----
+
+- bpo-38380: Update macOS builds to use SQLite 3.31.1
+
+IDLE
+----
+
+- bpo-39781: Selecting code context lines no longer causes a jump.
+
+- bpo-39663: Add tests for pyparse find_good_parse_start().
+
+- bpo-39600: In the font configuration window, remove duplicated font names.
+
+- bpo-30780: Add remaining configdialog tests for buttons and highlights and
+ keys tabs.
+
+- bpo-39388: IDLE Settings Cancel button now cancels pending changes
+
+- bpo-39050: Make IDLE Settings dialog Help button work again.
+
+- bpo-34118: Tag memoryview, range, and tuple as classes, the same as list,
+ etcetera, in the library manual built-in functions list.
+
+- bpo-38792: Close an IDLE shell calltip if a :exc:`KeyboardInterrupt` or
+ shell restart occurs. Patch by Zackery Spytz.
+
+- bpo-32989: Add tests for editor newline_and_indent_event method. Remove
+ dead code from pyparse find_good_parse_start method.
+
+
What's New in Python 3.7.6 final?
=================================
is thus equivalent to an omitted line number.
.TP
.BI "\-X " option
-Set implementation specific option.
+Set implementation specific option. The following options are available:
+
+ -X faulthandler: enable faulthandler
+
+ -X showrefcount: output the total reference count and number of used
+ memory blocks when the program finishes or after each statement in the
+ interactive interpreter. This only works on debug builds
+
+ -X tracemalloc: start tracing Python memory allocations using the
+ tracemalloc module. By default, only the most recent frame is stored in a
+ traceback of a trace. Use -X tracemalloc=NFRAME to start tracing with a
+ traceback limit of NFRAME frames
+
+ -X showalloccount: output the total count of allocated objects for each
+ type when the program finishes. This only works when Python was built with
+ COUNT_ALLOCS defined
+
+ -X importtime: show how long each import takes. It shows module name,
+ cumulative time (including nested imports) and self time (excluding
+  nested imports). Note that its output may be broken in multi-threaded
+  applications. Typical usage is python3 -X importtime -c 'import asyncio'
+
+  -X dev: enable CPython's "development mode", introducing additional runtime
+ checks which are too expensive to be enabled by default. It will not be
+ more verbose than the default if the code is correct: new warnings are
+ only emitted when an issue is detected. Effect of the developer mode:
+ * Add default warning filter, as -W default
+ * Install debug hooks on memory allocators: see the PyMem_SetupDebugHooks() C function
+ * Enable the faulthandler module to dump the Python traceback on a crash
+ * Enable asyncio debug mode
+ * Set the dev_mode attribute of sys.flags to True
+
+ -X utf8: enable UTF-8 mode for operating system interfaces, overriding the default
+ locale-aware mode. -X utf8=0 explicitly disables UTF-8 mode (even when it would
+ otherwise activate automatically). See PYTHONUTF8 for more details
+
.TP
.B \-x
Skip the first line of the source. This is intended for a DOS
}
{
+ Uninitialised byte(s) false alarm, see bpo-35561
+ Memcheck:Param
+ epoll_ctl(event)
+ fun:epoll_ctl
+ fun:pyepoll_internal_ctl
+}
+
+{
ZLIB problems, see test_gzip
Memcheck:Cond
obj:/lib/libz.so.1.2.3
for (i = 0; i < nArgs; ++i) {
PyObject *tp = PyTuple_GET_ITEM(ob, i);
PyObject *cnv;
+/*
+ * The following checks, relating to bpo-16575 and bpo-16576, have been
+ * disabled. The reason is that, although there is a definite problem with
+ * how libffi handles unions (https://github.com/libffi/libffi/issues/33),
+ * there are numerous libraries which pass structures containing unions
+ * by values - especially on Windows but examples also exist on Linux
+ * (https://bugs.python.org/msg359834).
+ *
+ * It may not be possible to get proper support for unions and bitfields
+ * until support is forthcoming in libffi, but for now, adding the checks
+ * has caused problems in otherwise-working software, which suggests it
+ * is better to disable the checks.
+ *
+ * Although specific examples reported relate specifically to unions and
+ * not bitfields, the bitfields check is also being disabled as a
+ * precaution.
+
StgDictObject *stgdict = PyType_stgdict(tp);
if (stgdict != NULL) {
return NULL;
}
}
+ */
+
cnv = PyObject_GetAttrString(tp, "from_param");
if (!cnv)
goto argtypes_error_1;
}
-static PyObject *current_context_var;
+#ifndef WITH_DECIMAL_CONTEXTVAR
+/* Key for thread state dictionary */
+static PyObject *tls_context_key = NULL;
+/* Invariant: NULL or the most recently accessed thread local context */
+static PyDecContextObject *cached_context = NULL;
+#else
+static PyObject *current_context_var = NULL;
+#endif
/* Template for creating new thread contexts, calling Context() without
* arguments and initializing the module_context on first access. */
static void
context_dealloc(PyDecContextObject *self)
{
+#ifndef WITH_DECIMAL_CONTEXTVAR
+ if (self == cached_context) {
+ cached_context = NULL;
+ }
+#endif
+
Py_XDECREF(self->traps);
Py_XDECREF(self->flags);
Py_TYPE(self)->tp_free(self);
* operation.
*/
+#ifndef WITH_DECIMAL_CONTEXTVAR
+/* Get the context from the thread state dictionary. */
+static PyObject *
+current_context_from_dict(void)
+{
+ PyObject *dict;
+ PyObject *tl_context;
+ PyThreadState *tstate;
+
+ dict = PyThreadState_GetDict();
+ if (dict == NULL) {
+ PyErr_SetString(PyExc_RuntimeError,
+ "cannot get thread state");
+ return NULL;
+ }
+
+ tl_context = PyDict_GetItemWithError(dict, tls_context_key);
+ if (tl_context != NULL) {
+ /* We already have a thread local context. */
+ CONTEXT_CHECK(tl_context);
+ }
+ else {
+ if (PyErr_Occurred()) {
+ return NULL;
+ }
+
+ /* Set up a new thread local context. */
+ tl_context = context_copy(default_context_template, NULL);
+ if (tl_context == NULL) {
+ return NULL;
+ }
+ CTX(tl_context)->status = 0;
+
+ if (PyDict_SetItem(dict, tls_context_key, tl_context) < 0) {
+ Py_DECREF(tl_context);
+ return NULL;
+ }
+ Py_DECREF(tl_context);
+ }
+
+ /* Cache the context of the current thread, assuming that it
+ * will be accessed several times before a thread switch. */
+ tstate = PyThreadState_GET();
+ if (tstate) {
+ cached_context = (PyDecContextObject *)tl_context;
+ cached_context->tstate = tstate;
+ }
+
+ /* Borrowed reference with refcount==1 */
+ return tl_context;
+}
+
+/* Return borrowed reference to thread local context. */
+static PyObject *
+current_context(void)
+{
+ PyThreadState *tstate;
+
+ tstate = PyThreadState_GET();
+ if (cached_context && cached_context->tstate == tstate) {
+ return (PyObject *)cached_context;
+ }
+
+ return current_context_from_dict();
+}
+
+/* ctxobj := borrowed reference to the current context */
+#define CURRENT_CONTEXT(ctxobj) \
+ ctxobj = current_context(); \
+ if (ctxobj == NULL) { \
+ return NULL; \
+ }
+
+/* Return a new reference to the current context */
+static PyObject *
+PyDec_GetCurrentContext(PyObject *self UNUSED, PyObject *args UNUSED)
+{
+ PyObject *context;
+
+ context = current_context();
+ if (context == NULL) {
+ return NULL;
+ }
+
+ Py_INCREF(context);
+ return context;
+}
+
+/* Set the thread local context to a new context, decrement old reference */
+static PyObject *
+PyDec_SetCurrentContext(PyObject *self UNUSED, PyObject *v)
+{
+ PyObject *dict;
+
+ CONTEXT_CHECK(v);
+
+ dict = PyThreadState_GetDict();
+ if (dict == NULL) {
+ PyErr_SetString(PyExc_RuntimeError,
+ "cannot get thread state");
+ return NULL;
+ }
+
+ /* If the new context is one of the templates, make a copy.
+ * This is the current behavior of decimal.py. */
+ if (v == default_context_template ||
+ v == basic_context_template ||
+ v == extended_context_template) {
+ v = context_copy(v, NULL);
+ if (v == NULL) {
+ return NULL;
+ }
+ CTX(v)->status = 0;
+ }
+ else {
+ Py_INCREF(v);
+ }
+
+ cached_context = NULL;
+ if (PyDict_SetItem(dict, tls_context_key, v) < 0) {
+ Py_DECREF(v);
+ return NULL;
+ }
+
+ Py_DECREF(v);
+ Py_RETURN_NONE;
+}
+#else
static PyObject *
init_current_context(void)
{
Py_RETURN_NONE;
}
+#endif
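Whichever storage backend is compiled in (the thread-state dictionary without `WITH_DECIMAL_CONTEXTVAR`, or a `PyContextVar` with it), the Python-level semantics stay the same: a thread that has not touched decimal yet starts from a copy of the default context template, so context changes are thread-local. A minimal sketch:

```python
import decimal
import threading

results = {}

def worker(name, prec):
    # The first getcontext() in a new thread creates that thread's own
    # context, so this precision change is invisible to other threads.
    decimal.getcontext().prec = prec
    results[name] = str(decimal.Decimal(1) / decimal.Decimal(7))

threads = [threading.Thread(target=worker, args=("low", 5)),
           threading.Thread(target=worker, args=("high", 10))]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results["low"], results["high"])  # -> 0.14286 0.1428571429
```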
/* Context manager object for the 'with' statement. The manager
* owns one reference to the global (outer) context and one
mpd_ssize_t exp;
uint32_t status = 0;
mpd_context_t maxctx;
- PyObject *context;
- context = current_context();
- if (context == NULL) {
- return -1;
- }
- Py_DECREF(context);
-
if (mpd_isspecial(MPD(v))) {
if (mpd_issnan(MPD(v))) {
PyErr_SetString(PyExc_TypeError,
mpd_free = PyMem_Free;
mpd_setminalloc(_Py_DEC_MINALLOC);
- /* Init context variable */
- current_context_var = PyContextVar_New("decimal_context", NULL);
- if (current_context_var == NULL) {
- goto error;
- }
/* Init external C-API functions */
_py_long_multiply = PyLong_Type.tp_as_number->nb_multiply;
CHECK_INT(PyModule_AddObject(m, "DefaultContext",
default_context_template));
+#ifndef WITH_DECIMAL_CONTEXTVAR
+ ASSIGN_PTR(tls_context_key, PyUnicode_FromString("___DECIMAL_CTX__"));
+ Py_INCREF(Py_False);
+ CHECK_INT(PyModule_AddObject(m, "HAVE_CONTEXTVAR", Py_False));
+#else
+ ASSIGN_PTR(current_context_var, PyContextVar_New("decimal_context", NULL));
+ Py_INCREF(Py_True);
+ CHECK_INT(PyModule_AddObject(m, "HAVE_CONTEXTVAR", Py_True));
+#endif
Py_INCREF(Py_True);
CHECK_INT(PyModule_AddObject(m, "HAVE_THREADS", Py_True));
Py_CLEAR(SignalTuple); /* GCOV_NOT_REACHED */
Py_CLEAR(DecimalTuple); /* GCOV_NOT_REACHED */
Py_CLEAR(default_context_template); /* GCOV_NOT_REACHED */
+#ifndef WITH_DECIMAL_CONTEXTVAR
+ Py_CLEAR(tls_context_key); /* GCOV_NOT_REACHED */
+#else
+ Py_CLEAR(current_context_var); /* GCOV_NOT_REACHED */
+#endif
Py_CLEAR(basic_context_template); /* GCOV_NOT_REACHED */
Py_CLEAR(extended_context_template); /* GCOV_NOT_REACHED */
- Py_CLEAR(current_context_var); /* GCOV_NOT_REACHED */
Py_CLEAR(m); /* GCOV_NOT_REACHED */
return NULL; /* GCOV_NOT_REACHED */
const mpd_context_t *ctx, uint32_t *status)
{
_mpd_qdiv(SET_IDEAL_EXP, q, a, b, ctx, status);
+
+ if (*status & MPD_Malloc_error) {
+ /* Inexact quotients (the usual case) fill the entire context precision,
+ * which can lead to malloc() failures for very high precisions. Retry
+ * the operation with a lower precision in case the result is exact.
+ *
+ * We need an upper bound for the number of digits of a_coeff / b_coeff
+ * when the result is exact. If a_coeff' * 1 / b_coeff' is in lowest
+ * terms, then maxdigits(a_coeff') + maxdigits(1 / b_coeff') is a suitable
+ * bound.
+ *
+ * 1 / b_coeff' is exact iff b_coeff' exclusively has prime factors 2 or 5.
+ * The largest number of digits is generated when b_coeff' is a power of 2
+ * or a power of 5; the digit count is then log2(b_coeff') or log5(b_coeff')
+ * respectively, so log2(b_coeff') is an upper bound.
+ *
+ * We arrive at a total upper bound:
+ *
+ * maxdigits(a_coeff') + maxdigits(1 / b_coeff') <=
+ * a->digits + log2(b_coeff) =
+ * a->digits + log10(b_coeff) / log10(2) <=
+ * a->digits + b->digits * 4;
+ */
+ uint32_t workstatus = 0;
+ mpd_context_t workctx = *ctx;
+ workctx.prec = a->digits + b->digits * 4;
+ if (workctx.prec >= ctx->prec) {
+ return; /* No point in retrying, keep the original error. */
+ }
+
+ _mpd_qdiv(SET_IDEAL_EXP, q, a, b, &workctx, &workstatus);
+ if (workstatus == 0) { /* The result is exact, unrounded, normal etc. */
+ *status = 0;
+ return;
+ }
+
+ mpd_seterror(q, *status, status);
+ }
}
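The bound the comment above relies on - an exact quotient has at most `a.digits + 4 * b.digits` digits - can be spot-checked from Python's decimal module (illustrative only; the actual retry happens inside libmpdec):

```python
from decimal import Decimal, getcontext

getcontext().prec = 200  # large enough that these quotients are exact

def ndigits(d):
    return len(d.as_tuple().digits)

for a, b in [(Decimal(1), Decimal(2) ** 20),
             (Decimal(123456), Decimal(5) ** 15),
             (Decimal(7), Decimal(10) ** 8)]:
    q = a / b
    assert q * b == a  # the quotient is exact at this precision
    assert ndigits(q) <= ndigits(a) + 4 * ndigits(b)

print(ndigits(Decimal(1) / Decimal(2) ** 20))  # -> 14
```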
/* Internal function. */
/* END LIBMPDEC_ONLY */
/* Algorithm from decimal.py */
-void
-mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx,
- uint32_t *status)
+static void
+_mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx,
+ uint32_t *status)
{
mpd_context_t maxcontext;
MPD_NEW_STATIC(c,0,0,0,0);
goto out;
}
+void
+mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx,
+ uint32_t *status)
+{
+ _mpd_qsqrt(result, a, ctx, status);
+
+ if (*status & (MPD_Malloc_error|MPD_Division_impossible)) {
+ /* The above conditions can occur at very high context precisions
+ * if intermediate values get too large. Retry the operation with
+ * a lower context precision in case the result is exact.
+ *
+ * If the result is exact, an upper bound for the number of digits
+ * is the number of digits in the input.
+ *
+ * NOTE: sqrt(40e9) = 2.0e+5 /\ digits(40e9) = digits(2.0e+5) = 2
+ */
+ uint32_t workstatus = 0;
+ mpd_context_t workctx = *ctx;
+ workctx.prec = a->digits;
+
+ if (workctx.prec >= ctx->prec) {
+ return; /* No point in repeating this, keep the original error. */
+ }
+
+ _mpd_qsqrt(result, a, &workctx, &workstatus);
+ if (workstatus == 0) {
+ *status = 0;
+ return;
+ }
+
+ mpd_seterror(result, *status, status);
+ }
+}
+
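The analogous bound for the square root - an exact result has at most as many digits as the input - can likewise be illustrated with Python's decimal module, including the `sqrt(40e9)` example from the comment:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50

def ndigits(d):
    return len(d.as_tuple().digits)

for x in [Decimal("4"), Decimal("2.25"), Decimal("40E9"), Decimal("1.21")]:
    r = x.sqrt()
    assert r * r == x                # these roots are exact
    assert ndigits(r) <= ndigits(x)  # e.g. digits(40E9) = digits(2.0E+5) = 2

print(Decimal("40E9").sqrt())
```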
/******************************************************************************/
/* Base conversions */
'special': ('context.__reduce_ex__', 'context.create_decimal_from_float')
}
+# Functions that set no context flags but whose result can differ depending
+# on prec, Emin and Emax.
+MaxContextSkip = ['is_normal', 'is_subnormal', 'logical_invert', 'next_minus',
+ 'next_plus', 'number_class', 'logical_and', 'logical_or',
+ 'logical_xor', 'next_toward', 'rotate', 'shift']
+
# Functions that require a restricted exponent range for reasonable runtimes.
UnaryRestricted = [
'__ceil__', '__floor__', '__int__', '__trunc__',
self.pex = RestrictedList() # Python exceptions for P.Decimal
self.presults = RestrictedList() # P.Decimal results
+ # If the above results are exact, unrounded and not clamped, repeat
+ # the operation with a maxcontext to ensure that huge intermediate
+ # values do not cause a MemoryError.
+ self.with_maxcontext = False
+ self.maxcontext = context.c.copy()
+ self.maxcontext.prec = C.MAX_PREC
+ self.maxcontext.Emax = C.MAX_EMAX
+ self.maxcontext.Emin = C.MIN_EMIN
+ self.maxcontext.clear_flags()
+
+ self.maxop = RestrictedList() # converted C.Decimal operands
+ self.maxex = RestrictedList() # Python exceptions for C.Decimal
+ self.maxresults = RestrictedList() # C.Decimal results
+
# ======================================================================
# SkipHandler: skip known discrepancies
if t.contextfunc:
cargs = t.cop
pargs = t.pop
+ maxargs = t.maxop
cfunc = "c_func: %s(" % t.funcname
pfunc = "p_func: %s(" % t.funcname
+ maxfunc = "max_func: %s(" % t.funcname
else:
cself, cargs = t.cop[0], t.cop[1:]
pself, pargs = t.pop[0], t.pop[1:]
+ maxself, maxargs = t.maxop[0], t.maxop[1:]
cfunc = "c_func: %s.%s(" % (repr(cself), t.funcname)
pfunc = "p_func: %s.%s(" % (repr(pself), t.funcname)
+ maxfunc = "max_func: %s.%s(" % (repr(maxself), t.funcname)
err = cfunc
for arg in cargs:
err = err.rstrip(", ")
err += ")"
+ if t.with_maxcontext:
+ err += "\n"
+ err += maxfunc
+ for arg in maxargs:
+ err += "%s, " % repr(arg)
+ err = err.rstrip(", ")
+ err += ")"
+
return err
def raise_error(t):
err = "Error in %s:\n\n" % t.funcname
err += "input operands: %s\n\n" % (t.op,)
err += function_as_string(t)
- err += "\n\nc_result: %s\np_result: %s\n\n" % (t.cresults, t.presults)
- err += "c_exceptions: %s\np_exceptions: %s\n\n" % (t.cex, t.pex)
- err += "%s\n\n" % str(t.context)
+
+ err += "\n\nc_result: %s\np_result: %s\n" % (t.cresults, t.presults)
+ if t.with_maxcontext:
+ err += "max_result: %s\n\n" % (t.maxresults)
+ else:
+ err += "\n"
+
+ err += "c_exceptions: %s\np_exceptions: %s\n" % (t.cex, t.pex)
+ if t.with_maxcontext:
+ err += "max_exceptions: %s\n\n" % t.maxex
+ else:
+ err += "\n"
+
+ err += "%s\n" % str(t.context)
+ if t.with_maxcontext:
+ err += "%s\n" % str(t.maxcontext)
+ else:
+ err += "\n"
raise VerifyError(err)
# are printed to stdout.
# ======================================================================
+def all_nan(a):
+ if isinstance(a, C.Decimal):
+ return a.is_nan()
+ elif isinstance(a, tuple):
+ return all(all_nan(v) for v in a)
+ return False
+
def convert(t, convstr=True):
""" t is the testset. At this stage the testset contains a tuple of
operands t.op of various types. For decimal methods the first
for i, op in enumerate(t.op):
context.clear_status()
+ t.maxcontext.clear_flags()
if op in RoundModes:
t.cop.append(op)
t.pop.append(op)
+ t.maxop.append(op)
elif not t.contextfunc and i == 0 or \
convstr and isinstance(op, str):
p = None
pex = e.__class__
+ try:
+ C.setcontext(t.maxcontext)
+ maxop = C.Decimal(op)
+ maxex = None
+ except (TypeError, ValueError, OverflowError) as e:
+ maxop = None
+ maxex = e.__class__
+ finally:
+ C.setcontext(context.c)
+
t.cop.append(c)
t.cex.append(cex)
+
t.pop.append(p)
t.pex.append(pex)
+ t.maxop.append(maxop)
+ t.maxex.append(maxex)
+
if cex is pex:
if str(c) != str(p) or not context.assert_eq_status():
raise_error(t)
else:
raise_error(t)
+ # The exceptions in the maxcontext operation can legitimately
+        # differ; only test that maxex implies cex:
+ if maxex is not None and cex is not maxex:
+ raise_error(t)
+
elif isinstance(op, Context):
t.context = op
t.cop.append(op.c)
t.pop.append(op.p)
+ t.maxop.append(t.maxcontext)
else:
t.cop.append(op)
t.pop.append(op)
+ t.maxop.append(op)
return 1
t.rc and t.rp are the results of the operation.
"""
context.clear_status()
+ t.maxcontext.clear_flags()
try:
if t.contextfunc:
t.rp = None
t.pex.append(e.__class__)
+ # If the above results are exact, unrounded, normal etc., repeat the
+ # operation with a maxcontext to ensure that huge intermediate values
+ # do not cause a MemoryError.
+ if (t.funcname not in MaxContextSkip and
+ not context.c.flags[C.InvalidOperation] and
+ not context.c.flags[C.Inexact] and
+ not context.c.flags[C.Rounded] and
+ not context.c.flags[C.Subnormal] and
+ not context.c.flags[C.Clamped] and
+ not context.clamp and # results are padded to context.prec if context.clamp==1.
+ not any(isinstance(v, C.Context) for v in t.cop)): # another context is used.
+ t.with_maxcontext = True
+ try:
+ if t.contextfunc:
+ maxargs = t.maxop
+ t.rmax = getattr(t.maxcontext, t.funcname)(*maxargs)
+ else:
+ maxself = t.maxop[0]
+ maxargs = t.maxop[1:]
+ try:
+ C.setcontext(t.maxcontext)
+ t.rmax = getattr(maxself, t.funcname)(*maxargs)
+ finally:
+ C.setcontext(context.c)
+ t.maxex.append(None)
+ except (TypeError, ValueError, OverflowError, MemoryError) as e:
+ t.rmax = None
+ t.maxex.append(e.__class__)
+
def verify(t, stat):
""" t is the testset. At this stage the testset contains the following
tuples:
"""
t.cresults.append(str(t.rc))
t.presults.append(str(t.rp))
+ if t.with_maxcontext:
+ t.maxresults.append(str(t.rmax))
+
if isinstance(t.rc, C.Decimal) and isinstance(t.rp, P.Decimal):
# General case: both results are Decimals.
t.cresults.append(t.rc.to_eng_string())
t.presults.append(str(t.rp.imag))
t.presults.append(str(t.rp.real))
+ if t.with_maxcontext and isinstance(t.rmax, C.Decimal):
+ t.maxresults.append(t.rmax.to_eng_string())
+ t.maxresults.append(t.rmax.as_tuple())
+ t.maxresults.append(str(t.rmax.imag))
+ t.maxresults.append(str(t.rmax.real))
+
nc = t.rc.number_class().lstrip('+-s')
stat[nc] += 1
else:
if not isinstance(t.rc, tuple) and not isinstance(t.rp, tuple):
if t.rc != t.rp:
raise_error(t)
+ if t.with_maxcontext and not isinstance(t.rmax, tuple):
+ if t.rmax != t.rc:
+ raise_error(t)
stat[type(t.rc).__name__] += 1
# The return value lists must be equal.
if not t.context.assert_eq_status():
raise_error(t)
+ if t.with_maxcontext:
+ # NaN payloads etc. depend on precision and clamp.
+ if all_nan(t.rc) and all_nan(t.rmax):
+ return
+ # The return value lists must be equal.
+ if t.maxresults != t.cresults:
+ raise_error(t)
+ # The Python exception lists (TypeError, etc.) must be equal.
+ if t.maxex != t.cex:
+ raise_error(t)
+ # The context flags must be equal.
+ if t.maxcontext.flags != t.context.c.flags:
+ raise_error(t)
+
# ======================================================================
# Main test loops
#!/bin/sh
#
-# Purpose: test all machine configurations, pydebug, refleaks, release build
-# and release build with valgrind.
+# Purpose: test with and without contextvar, all machine configurations, pydebug,
+# refleaks, release build and release build with valgrind.
#
# Synopsis: ./runall-memorydebugger.sh [--all-configs64 | --all-configs32]
#
CONFIGS_32="ppro ansi32 ansi-legacy universal"
VALGRIND="valgrind --tool=memcheck --leak-resolution=high \
- --db-attach=yes --suppressions=Misc/valgrind-python.supp"
+ --suppressions=Misc/valgrind-python.supp"
# Get args
case $@ in
cd ..
# test_decimal: refleak, regular and Valgrind tests
-for config in $CONFIGS; do
+for args in "--without-decimal-contextvar" ""; do
+ for config in $CONFIGS; do
unset PYTHON_DECIMAL_WITH_MACHINE
libmpdec_config=$config
fi
############ refleak tests ###########
- print_config "refleak tests: config=$config"
+ print_config "refleak tests: config=$config" $args
printf "\nbuilding python ...\n\n"
cd ../../
$GMAKE distclean > /dev/null 2>&1
- ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --with-pydebug > /dev/null 2>&1
+ ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --with-pydebug $args > /dev/null 2>&1
$GMAKE | grep _decimal
printf "\n\n# ======================== refleak tests ===========================\n\n"
- ./python -m test -uall -R 2:2 test_decimal
+ ./python -m test -uall -R 3:3 test_decimal
############ regular tests ###########
- print_config "regular tests: config=$config"
+ print_config "regular tests: config=$config" $args
printf "\nbuilding python ...\n\n"
$GMAKE distclean > /dev/null 2>&1
- ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" > /dev/null 2>&1
+ ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" $args > /dev/null 2>&1
$GMAKE | grep _decimal
printf "\n\n# ======================== regular tests ===========================\n\n"
esac
esac
- print_config "valgrind tests: config=$config"
+ print_config "valgrind tests: config=$config" $args
printf "\nbuilding python ...\n\n"
$GMAKE distclean > /dev/null 2>&1
- ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --without-pymalloc > /dev/null 2>&1
+ ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --without-pymalloc $args > /dev/null 2>&1
$GMAKE | grep _decimal
printf "\n\n# ======================== valgrind tests ===========================\n\n"
$valgrind ./python -m test -uall test_decimal
cd Modules/_decimal
+ done
done
# deccheck
cd ../../
-for config in $CONFIGS; do
+for args in "--without-decimal-contextvar" ""; do
+ for config in $CONFIGS; do
unset PYTHON_DECIMAL_WITH_MACHINE
if [ X"$config" != X"auto" ]; then
fi
############ debug ############
- print_config "deccheck: config=$config --with-pydebug"
+ print_config "deccheck: config=$config --with-pydebug" $args
printf "\nbuilding python ...\n\n"
$GMAKE distclean > /dev/null 2>&1
- ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --with-pydebug > /dev/null 2>&1
+ ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --with-pydebug $args > /dev/null 2>&1
$GMAKE | grep _decimal
printf "\n\n# ========================== debug ===========================\n\n"
./python Modules/_decimal/tests/deccheck.py
########### regular ###########
- print_config "deccheck: config=$config "
+ print_config "deccheck: config=$config" $args
printf "\nbuilding python ...\n\n"
$GMAKE distclean > /dev/null 2>&1
- ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" > /dev/null 2>&1
+ ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" $args > /dev/null 2>&1
$GMAKE | grep _decimal
printf "\n\n# ======================== regular ===========================\n\n"
esac
esac
- print_config "valgrind deccheck: config=$config "
+ print_config "valgrind deccheck: config=$config" $args
printf "\nbuilding python ...\n\n"
$GMAKE distclean > /dev/null 2>&1
- ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --without-pymalloc > /dev/null 2>&1
+ ./configure CFLAGS="$ADD_CFLAGS" LDFLAGS="$ADD_LDFLAGS" --without-pymalloc $args > /dev/null 2>&1
$GMAKE | grep _decimal
printf "\n\n# ======================== valgrind ==========================\n\n"
$valgrind ./python Modules/_decimal/tests/deccheck.py
+ done
done
\r
echo.\r
echo # ======================================================================\r
-echo # Building Python\r
+echo # Building Python (Debug^|x64)\r
echo # ======================================================================\r
echo.\r
\r
-call "%VS100COMNTOOLS%\..\..\VC\vcvarsall.bat" x64\r
-msbuild /noconsolelogger /target:clean PCbuild\pcbuild.sln /p:Configuration=Release /p:PlatformTarget=x64\r
-msbuild /noconsolelogger /target:clean PCbuild\pcbuild.sln /p:Configuration=Debug /p:PlatformTarget=x64\r
-msbuild /noconsolelogger PCbuild\pcbuild.sln /p:Configuration=Release /p:Platform=x64\r
-msbuild /noconsolelogger PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=x64\r
-\r
-call "%VS100COMNTOOLS%\..\..\VC\vcvarsall.bat" x86\r
-msbuild /noconsolelogger PCbuild\pcbuild.sln /p:Configuration=Release /p:Platform=Win32\r
-msbuild /noconsolelogger PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=Win32\r
-echo.\r
-echo.\r
+call .\Tools\buildbot\clean.bat\r
+call .\Tools\buildbot\build.bat -c Debug -p x64\r
\r
echo.\r
echo # ======================================================================\r
-echo # test_decimal: platform=x64\r
+echo # platform=Debug^|x64\r
echo # ======================================================================\r
echo.\r
\r
-cd PCbuild\amd64\r
-\r
echo # ==================== refleak tests =======================\r
echo.\r
-python_d.exe -m test -uall -R 2:2 test_decimal\r
+call python.bat -m test -uall -R 3:3 test_decimal\r
echo.\r
echo.\r
\r
echo # ==================== regular tests =======================\r
echo.\r
-python.exe -m test -uall test_decimal\r
+call python.bat -m test -uall test_decimal\r
+echo.\r
+echo.\r
+\r
+echo # ==================== deccheck =======================\r
+echo.\r
+call python.bat .\Modules\_decimal\tests\deccheck.py\r
echo.\r
echo.\r
\r
-cd ..\r
\r
echo.\r
echo # ======================================================================\r
-echo # test_decimal: platform=x86\r
+echo # Building Python (Release^|x64)\r
echo # ======================================================================\r
echo.\r
\r
-echo # ==================== refleak tests =======================\r
-echo.\r
-python_d.exe -m test -uall -R 2:2 test_decimal\r
+call .\Tools\buildbot\clean.bat\r
+call .\Tools\buildbot\build.bat -c Release -p x64\r
+\r
echo.\r
+echo # ======================================================================\r
+echo # platform=Release^|x64\r
+echo # ======================================================================\r
echo.\r
\r
echo # ==================== regular tests =======================\r
echo.\r
-python.exe -m test -uall test_decimal\r
+call python.bat -m test -uall test_decimal\r
+echo.\r
+echo.\r
+\r
+echo # ==================== deccheck =======================\r
+echo.\r
+call python.bat .\Modules\_decimal\tests\deccheck.py\r
+echo.\r
+echo.\r
+\r
+\r
echo.\r
+echo # ======================================================================\r
+echo # Building Python (Debug^|Win32)\r
+echo # ======================================================================\r
echo.\r
\r
-cd amd64\r
+call .\Tools\buildbot\clean.bat\r
+call Tools\buildbot\build.bat -c Debug -p Win32\r
\r
echo.\r
echo # ======================================================================\r
-echo # deccheck: platform=x64\r
+echo # platform=Debug^|Win32\r
echo # ======================================================================\r
echo.\r
\r
-echo # ==================== debug build =======================\r
+echo # ==================== refleak tests =======================\r
+echo.\r
+call python.bat -m test -uall -R 3:3 test_decimal\r
+echo.\r
+echo.\r
+\r
+echo # ==================== regular tests =======================\r
echo.\r
-python_d.exe ..\..\Modules\_decimal\tests\deccheck.py\r
+call python.bat -m test -uall test_decimal\r
echo.\r
echo.\r
\r
-echo # =================== release build ======================\r
+echo # ==================== deccheck =======================\r
echo.\r
-python.exe ..\..\Modules\_decimal\tests\deccheck.py\r
+call python.bat .\Modules\_decimal\tests\deccheck.py\r
echo.\r
echo.\r
\r
-cd ..\r
\r
echo.\r
echo # ======================================================================\r
-echo # deccheck: platform=x86\r
+echo # Building Python (Release^|Win32)\r
echo # ======================================================================\r
echo.\r
+\r
+call .\Tools\buildbot\clean.bat\r
+call .\Tools\buildbot\build.bat -c Release -p Win32\r
+\r
+echo.\r
+echo # ======================================================================\r
+echo # platform=Release^|Win32\r
+echo # ======================================================================\r
echo.\r
\r
-echo # ==================== debug build =======================\r
+echo # ==================== regular tests =======================\r
echo.\r
-python_d.exe ..\Modules\_decimal\tests\deccheck.py\r
+call python.bat -m test -uall test_decimal\r
echo.\r
echo.\r
\r
-echo # =================== release build ======================\r
+echo # ==================== deccheck =======================\r
echo.\r
-python.exe ..\Modules\_decimal\tests\deccheck.py\r
+call python.bat .\Modules\_decimal\tests\deccheck.py\r
echo.\r
echo.\r
-\r
-\r
-cd ..\Modules\_decimal\tests\r
-\r
-\r
-\r
while (pos > startpos) {
parentpos = (pos - 1) >> 1;
parent = arr[parentpos];
+ Py_INCREF(newitem);
+ Py_INCREF(parent);
cmp = PyObject_RichCompareBool(newitem, parent, Py_LT);
+ Py_DECREF(parent);
+ Py_DECREF(newitem);
if (cmp < 0)
return -1;
if (size != PyList_GET_SIZE(heap)) {
/* Set childpos to index of smaller child. */
childpos = 2*pos + 1; /* leftmost child position */
if (childpos + 1 < endpos) {
- cmp = PyObject_RichCompareBool(
- arr[childpos],
- arr[childpos + 1],
- Py_LT);
+ PyObject* a = arr[childpos];
+ PyObject* b = arr[childpos + 1];
+ Py_INCREF(a);
+ Py_INCREF(b);
+ cmp = PyObject_RichCompareBool(a, b, Py_LT);
+ Py_DECREF(a);
+ Py_DECREF(b);
if (cmp < 0)
return -1;
childpos += ((unsigned)cmp ^ 1); /* increment when cmp==0 */
return item;
}
- cmp = PyObject_RichCompareBool(PyList_GET_ITEM(heap, 0), item, Py_LT);
+ PyObject* top = PyList_GET_ITEM(heap, 0);
+ Py_INCREF(top);
+ cmp = PyObject_RichCompareBool(top, item, Py_LT);
+ Py_DECREF(top);
if (cmp < 0)
return NULL;
if (cmp == 0) {
while (pos > startpos) {
parentpos = (pos - 1) >> 1;
parent = arr[parentpos];
+ Py_INCREF(parent);
+ Py_INCREF(newitem);
cmp = PyObject_RichCompareBool(parent, newitem, Py_LT);
+ Py_DECREF(parent);
+ Py_DECREF(newitem);
if (cmp < 0)
return -1;
if (size != PyList_GET_SIZE(heap)) {
/* Set childpos to index of smaller child. */
childpos = 2*pos + 1; /* leftmost child position */
if (childpos + 1 < endpos) {
- cmp = PyObject_RichCompareBool(
- arr[childpos + 1],
- arr[childpos],
- Py_LT);
+ PyObject* a = arr[childpos + 1];
+ PyObject* b = arr[childpos];
+ Py_INCREF(a);
+ Py_INCREF(b);
+ cmp = PyObject_RichCompareBool(a, b, Py_LT);
+ Py_DECREF(a);
+ Py_DECREF(b);
if (cmp < 0)
return -1;
childpos += ((unsigned)cmp ^ 1); /* increment when cmp==0 */
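These INCREF/DECREF pairs guard against comparison methods that mutate the heap while a sift is in progress; previously the C code compared borrowed references that such a mutation could free. A sketch of the hostile pattern (the `Evil` class is made up for illustration); the existing size check turns it into a clean RuntimeError:

```python
import heapq

class Evil:
    # Hostile element: comparing it empties the heap mid-sift.
    def __init__(self, n, heap):
        self.n = n
        self.heap = heap
    def __lt__(self, other):
        self.heap.clear()
        return self.n < other.n

heap = []
heapq.heappush(heap, Evil(3, heap))
try:
    heapq.heappush(heap, Evil(1, heap))  # triggers a compare that mutates
    detected = False
except RuntimeError as exc:
    detected = True
    print(exc)  # the size change is detected instead of crashing
```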
PyObject *res = NULL;
CHECK_INITIALIZED(self)
+ CHECK_CLOSED(self, "readinto of closed file")
n = Py_SAFE_DOWNCAST(READAHEAD(self), Py_off_t, Py_ssize_t);
if (n > 0) {
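The observable effect of the added CHECK_CLOSED guard: `readinto()` on a closed buffered stream now fails the same way the other buffered methods do, with a ValueError instead of undefined behavior:

```python
import io

buf = io.BufferedReader(io.BytesIO(b"abc"))
buf.close()
try:
    buf.readinto(bytearray(3))
    raised = None
except ValueError as exc:
    raised = exc
    print(exc)  # e.g. "readinto of closed file"
```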
Py_VISIT(st->import_mapping_3to2);
Py_VISIT(st->codecs_encode);
Py_VISIT(st->getattr);
+ Py_VISIT(st->partial);
return 0;
}
-W arg : warning control; arg is action:message:category:module:lineno\n\
also PYTHONWARNINGS=arg\n\
-x : skip first line of source, allowing use of non-Unix forms of #!cmd\n\
--X opt : set implementation-specific option\n\
+-X opt : set implementation-specific option. The following options are available:\n\
+\n\
+ -X faulthandler: enable faulthandler\n\
+ -X showrefcount: output the total reference count and number of used\n\
+ memory blocks when the program finishes or after each statement in the\n\
+ interactive interpreter. This only works on debug builds\n\
+ -X tracemalloc: start tracing Python memory allocations using the\n\
+ tracemalloc module. By default, only the most recent frame is stored in a\n\
+ traceback of a trace. Use -X tracemalloc=NFRAME to start tracing with a\n\
+ traceback limit of NFRAME frames\n\
+ -X showalloccount: output the total count of allocated objects for each\n\
+ type when the program finishes. This only works when Python was built with\n\
+ COUNT_ALLOCS defined\n\
+ -X importtime: show how long each import takes. It shows module name,\n\
+ cumulative time (including nested imports) and self time (excluding\n\
+ nested imports). Note that its output may be broken in multi-threaded\n\
+  applications. Typical usage is python3 -X importtime -c 'import asyncio'\n\
+-X dev: enable CPython's \"development mode\", introducing additional runtime\n\
+ checks which are too expensive to be enabled by default. Effect of the\n\
+ developer mode:\n\
+ * Add default warning filter, as -W default\n\
+ * Install debug hooks on memory allocators: see the PyMem_SetupDebugHooks() C function\n\
+ * Enable the faulthandler module to dump the Python traceback on a crash\n\
+ * Enable asyncio debug mode\n\
+ * Set the dev_mode attribute of sys.flags to True\n\
+ -X utf8: enable UTF-8 mode for operating system interfaces, overriding the default\n\
+ locale-aware mode. -X utf8=0 explicitly disables UTF-8 mode (even when it would\n\
+ otherwise activate automatically)\n\
+\n\
--check-hash-based-pycs always|default|never:\n\
control how Python invalidates hash-based .pyc files\n\
";
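The listed options take effect at interpreter startup; for instance, `-X faulthandler` can be verified from a subprocess:

```python
import subprocess
import sys

out = subprocess.run(
    [sys.executable, "-X", "faulthandler", "-c",
     "import faulthandler; print(faulthandler.is_enabled())"],
    capture_output=True, text=True)
print(out.stdout.strip())  # -> True
```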
int r = 0;
while (1) {
- if (derived == cls)
+ if (derived == cls) {
+ Py_XDECREF(bases); /* See below comment */
return 1;
- bases = abstract_get_bases(derived);
+ }
+ /* Use XSETREF to drop bases reference *after* finishing with
+ derived; bases might be the only reference to it.
+ XSETREF is used instead of SETREF, because bases is NULL on the
+ first iteration of the loop.
+ */
+ Py_XSETREF(bases, abstract_get_bases(derived));
if (bases == NULL) {
if (PyErr_Occurred())
return -1;
/* Avoid recursivity in the single inheritance case */
if (n == 1) {
derived = PyTuple_GET_ITEM(bases, 0);
- Py_DECREF(bases);
continue;
}
for (i = 0; i < n; i++) {
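The hazard this Py_XSETREF fix addresses arises when `__bases__` is a property that builds a fresh tuple on every access, so the reference abstract_issubclass() holds may be the only one keeping the tuple (and its contents) alive. A Python-level sketch of such a class-like object (the `PseudoClass` name is made up):

```python
class PseudoClass:
    # Not a real class: issubclass() falls back to walking __bases__.
    def __init__(self, bases=()):
        self._bases = list(bases)

    @property
    def __bases__(self):
        # A fresh tuple is created on every access; the caller's
        # reference may be the only thing keeping it alive.
        return tuple(self._bases)

root = PseudoClass()
child = PseudoClass([root])

# Follows the single-inheritance chain child -> root.
print(issubclass(child, root))  # -> True
```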
return -1;
return 0;
}
+ Py_INCREF(bval);
cmp = PyObject_RichCompareBool(aval, bval, Py_EQ);
Py_DECREF(key);
Py_DECREF(aval);
+ Py_DECREF(bval);
if (cmp <= 0) /* error or not equal */
return cmp;
}
PyObject *result;
if (o->ags_state == AWAITABLE_STATE_CLOSED) {
- PyErr_SetNone(PyExc_StopIteration);
+ PyErr_SetString(
+ PyExc_RuntimeError,
+ "cannot reuse already awaited __anext__()/asend()");
return NULL;
}
PyObject *result;
if (o->ags_state == AWAITABLE_STATE_CLOSED) {
- PyErr_SetNone(PyExc_StopIteration);
+ PyErr_SetString(
+ PyExc_RuntimeError,
+ "cannot reuse already awaited __anext__()/asend()");
return NULL;
}
PyFrameObject *f = gen->gi_frame;
PyObject *retval;
- if (f == NULL || f->f_stacktop == NULL ||
- o->agt_state == AWAITABLE_STATE_CLOSED) {
+ if (o->agt_state == AWAITABLE_STATE_CLOSED) {
+ PyErr_SetString(
+ PyExc_RuntimeError,
+ "cannot reuse already awaited aclose()/athrow()");
+ return NULL;
+ }
+
+ if (f == NULL || f->f_stacktop == NULL) {
+ o->agt_state = AWAITABLE_STATE_CLOSED;
PyErr_SetNone(PyExc_StopIteration);
return NULL;
}
}
yield_close:
+ o->agt_state = AWAITABLE_STATE_CLOSED;
PyErr_SetString(
PyExc_RuntimeError, ASYNC_GEN_IGNORED_EXIT_MSG);
return NULL;
PyObject *retval;
if (o->agt_state == AWAITABLE_STATE_CLOSED) {
- PyErr_SetNone(PyExc_StopIteration);
+ PyErr_SetString(
+ PyExc_RuntimeError,
+ "cannot reuse already awaited aclose()/athrow()");
return NULL;
}
} else {
/* aclose() mode */
if (retval && _PyAsyncGenWrappedValue_CheckExact(retval)) {
+ o->agt_state = AWAITABLE_STATE_CLOSED;
Py_DECREF(retval);
PyErr_SetString(PyExc_RuntimeError, ASYNC_GEN_IGNORED_EXIT_MSG);
return NULL;
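After this change, awaiting a spent `__anext__()`/`asend()` or `aclose()`/`athrow()` object raises RuntimeError instead of a bare StopIteration, which PEP 479 would otherwise convert or swallow inside coroutines. A sketch of the new behavior:

```python
import asyncio

async def gen():
    yield 1

async def main():
    g = gen()
    awaitable = g.__anext__()
    assert await awaitable == 1
    try:
        await awaitable  # the same awaitable, already consumed
    except RuntimeError as exc:
        return str(exc)

msg = asyncio.run(main())
print(msg)  # -> cannot reuse already awaited __anext__()/asend()
```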
static int
list_contains(PyListObject *a, PyObject *el)
{
+ PyObject *item;
Py_ssize_t i;
int cmp;
- for (i = 0, cmp = 0 ; cmp == 0 && i < Py_SIZE(a); ++i)
- cmp = PyObject_RichCompareBool(el, PyList_GET_ITEM(a, i),
- Py_EQ);
+ for (i = 0, cmp = 0 ; cmp == 0 && i < Py_SIZE(a); ++i) {
+ item = PyList_GET_ITEM(a, i);
+ Py_INCREF(item);
+ cmp = PyObject_RichCompareBool(el, item, Py_EQ);
+ Py_DECREF(item);
+ }
return cmp;
}
stop = 0;
}
for (i = start; i < stop && i < Py_SIZE(self); i++) {
- int cmp = PyObject_RichCompareBool(self->ob_item[i], value, Py_EQ);
+ PyObject *obj = self->ob_item[i];
+ Py_INCREF(obj);
+ int cmp = PyObject_RichCompareBool(obj, value, Py_EQ);
+ Py_DECREF(obj);
if (cmp > 0)
return PyLong_FromSsize_t(i);
else if (cmp < 0)
Py_ssize_t i;
for (i = 0; i < Py_SIZE(self); i++) {
- int cmp = PyObject_RichCompareBool(self->ob_item[i], value, Py_EQ);
+ PyObject *obj = self->ob_item[i];
+ if (obj == value) {
+ count++;
+ continue;
+ }
+ Py_INCREF(obj);
+ int cmp = PyObject_RichCompareBool(obj, value, Py_EQ);
+ Py_DECREF(obj);
if (cmp > 0)
count++;
else if (cmp < 0)
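The pointer-identity shortcut (`obj == value`) added to list.count() above mirrors the fast path that PyObject_RichCompareBool itself applies: identical objects are treated as equal without ever calling `__eq__`. That is what makes counting a NaN work even though NaN compares unequal to itself:

```python
nan = float("nan")
data = [nan, 1.0, nan]

assert nan != nan            # NaN is unequal to itself...
assert data.count(nan) == 2  # ...but identity still finds both occurrences
assert nan in data           # membership relies on the same identity rule
print("identity shortcut ok")
```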
Py_ssize_t i;
for (i = 0; i < Py_SIZE(self); i++) {
- int cmp = PyObject_RichCompareBool(self->ob_item[i], value, Py_EQ);
+ PyObject *obj = self->ob_item[i];
+ Py_INCREF(obj);
+ int cmp = PyObject_RichCompareBool(obj, value, Py_EQ);
+ Py_DECREF(obj);
if (cmp > 0) {
if (list_ass_slice(self, i, i+1,
(PyObject *)NULL) == 0)
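The Py_INCREF/Py_DECREF pairs added around PyObject_RichCompareBool in the hunks above guard against an `__eq__` that mutates the list mid-iteration: without the extra reference, shrinking the list inside the comparison could free the very item being compared. The hazardous pattern, seen from Python (the class name is illustrative only):

```python
class Evil:
    """Illustrative only: an object whose __eq__ mutates the list."""
    def __init__(self, victims):
        self.victims = victims
    def __eq__(self, other):
        self.victims.clear()   # shrink the list mid-iteration
        return False

data = [object() for _ in range(8)]
probe = Evil(data)
result = probe in data         # safe: each item is kept alive while compared
print(result, len(data))
```

With the fix, the membership test simply comes back False once the list is emptied, instead of risking a use-after-free.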
/* Search for the first index where items are different */
for (i = 0; i < Py_SIZE(vl) && i < Py_SIZE(wl); i++) {
+ PyObject *vitem = vl->ob_item[i];
+ PyObject *witem = wl->ob_item[i];
+ if (vitem == witem) {
+ continue;
+ }
+
+ Py_INCREF(vitem);
+ Py_INCREF(witem);
int k = PyObject_RichCompareBool(vl->ob_item[i],
wl->ob_item[i], Py_EQ);
+ Py_DECREF(vitem);
+ Py_DECREF(witem);
if (k < 0)
return NULL;
if (!k)
although this may require some temp memory (more on that later).
When a run is identified, its base address and length are pushed on a stack
-in the MergeState struct. merge_collapse() is then called to see whether it
-should merge it with preceding run(s). We would like to delay merging as
-long as possible in order to exploit patterns that may come up later, but we
-like even more to do merging as soon as possible to exploit that the run just
-found is still high in the memory hierarchy. We also can't delay merging
-"too long" because it consumes memory to remember the runs that are still
-unmerged, and the stack has a fixed size.
+in the MergeState struct. merge_collapse() is then called to potentially
+merge runs on that stack. We would like to delay merging as long as possible
+in order to exploit patterns that may come up later, but we like even more to
+do merging as soon as possible to exploit that the run just found is still
+high in the memory hierarchy. We also can't delay merging "too long" because
+it consumes memory to remember the runs that are still unmerged, and the
+stack has a fixed size.
What turned out to be a good compromise maintains two invariants on the
stack entries, where A, B and C are the lengths of the three rightmost not-yet
inclusive = 2**(k-1) through (2**k-1)-1 inclusive, which has
(2**k-1)-1 - 2**(k-1) + 1 =
2**k-1 - 2**(k-1) =
- 2*2**k-1 - 2**(k-1) =
+ 2*2**(k-1)-1 - 2**(k-1) =
(2-1)*2**(k-1) - 1 =
2**(k-1) - 1
elements.
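The corrected line of arithmetic above is easy to spot-check: the run of integers from 2**(k-1) through (2**k-1)-1 inclusive should contain exactly 2**(k-1) - 1 values. A short sanity check:

```python
for k in range(1, 32):
    lo = 2 ** (k - 1)            # smallest k-bit integer
    hi = (2 ** k - 1) - 1        # inclusive upper bound from the text
    count = hi - lo + 1          # number of integers in [lo, hi]
    assert count == 2 ** (k - 1) - 1, k
print("arithmetic verified for k = 1..31")
```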
-/* Memoryview object implementation */
+/*
+ * Memoryview object implementation
+ * --------------------------------
+ *
+ * This implementation is a complete rewrite contributed by Stefan Krah in
+ * Python 3.3. Substantial credit goes to Antoine Pitrou (who had already
+ * fortified and rewritten the previous implementation) and Nick Coghlan
+ * (who came up with the idea of the ManagedBuffer) for analyzing the complex
+ * ownership rules.
+ *
+ */
#include "Python.h"
#include "internal/mem.h"
/* ignore op->ob_ref: its value can have been modified
by Py_INCREF() and Py_DECREF(). */
#ifdef Py_TRACE_REFS
- if (_PyMem_IsPtrFreed(op->_ob_next) || _PyMem_IsPtrFreed(op->_ob_prev)) {
+ if (op->_ob_next != NULL && _PyMem_IsPtrFreed(op->_ob_next)) {
return 1;
}
+ if (op->_ob_prev != NULL && _PyMem_IsPtrFreed(op->_ob_prev)) {
+ return 1;
+ }
#endif
return 0;
}
#if defined(__has_feature) /* Clang */
# if __has_feature(address_sanitizer) /* is ASAN enabled? */
-# define _Py_NO_ADDRESS_SAFETY_ANALYSIS \
- __attribute__((no_address_safety_analysis))
+# define _Py_NO_SANITIZE_ADDRESS \
+ __attribute__((no_sanitize("address")))
# endif
# if __has_feature(thread_sanitizer) /* is TSAN enabled? */
# define _Py_NO_SANITIZE_THREAD __attribute__((no_sanitize_thread))
# endif
#elif defined(__GNUC__)
# if defined(__SANITIZE_ADDRESS__) /* GCC 4.8+, is ASAN enabled? */
-# define _Py_NO_ADDRESS_SAFETY_ANALYSIS \
- __attribute__((no_address_safety_analysis))
+# define _Py_NO_SANITIZE_ADDRESS \
+ __attribute__((no_sanitize_address))
# endif
// TSAN is supported since GCC 5.1, but __SANITIZE_THREAD__ macro
// is provided only since GCC 7.
# endif
#endif
-#ifndef _Py_NO_ADDRESS_SAFETY_ANALYSIS
-# define _Py_NO_ADDRESS_SAFETY_ANALYSIS
+#ifndef _Py_NO_SANITIZE_ADDRESS
+# define _Py_NO_SANITIZE_ADDRESS
#endif
#ifndef _Py_NO_SANITIZE_THREAD
# define _Py_NO_SANITIZE_THREAD
extremely desirable that it be this fast.
*/
-static bool _Py_NO_ADDRESS_SAFETY_ANALYSIS
+static bool _Py_NO_SANITIZE_ADDRESS
_Py_NO_SANITIZE_THREAD
_Py_NO_SANITIZE_MEMORY
address_in_range(void *p, poolp pool)
_ODictNode *node;
Py_VISIT(od->od_inst_dict);
- Py_VISIT(od->od_weakreflist);
_odict_FOREACH(od, node) {
Py_VISIT(_odictnode_KEY(node));
}
odict_tp_clear(PyODictObject *od)
{
Py_CLEAR(od->od_inst_dict);
- Py_CLEAR(od->od_weakreflist);
PyDict_Clear((PyObject *)od);
_odict_clear_nodes(od);
return 0;
join(wchar_t *buffer, const wchar_t *stuff)
{
if (_PathCchCombineEx_Initialized == 0) {
- HMODULE pathapi = LoadLibraryW(L"api-ms-win-core-path-l1-1-0.dll");
+ HMODULE pathapi = LoadLibraryExW(L"api-ms-win-core-path-l1-1-0.dll", NULL,
+ LOAD_LIBRARY_SEARCH_SYSTEM32);
if (pathapi) {
_PathCchCombineEx = (PPathCchCombineEx)GetProcAddress(pathapi, "PathCchCombineEx");
}
}
if (_PathCchCanonicalizeEx_Initialized == 0) {
- HMODULE pathapi = LoadLibraryW(L"api-ms-win-core-path-l1-1-0.dll");
+ HMODULE pathapi = LoadLibraryExW(L"api-ms-win-core-path-l1-1-0.dll", NULL,
+ LOAD_LIBRARY_SEARCH_SYSTEM32);
if (pathapi) {
_PathCchCanonicalizeEx = (PPathCchCanonicalizeEx)GetProcAddress(pathapi, "PathCchCanonicalizeEx");
}
(which you can't on SCO ODT 3.0). */
/* #undef SYS_SELECT_WITH_SYS_TIME */
+/* Define if you want to build the _decimal module using a coroutine-local rather
+   than a thread-local context */
+#define WITH_DECIMAL_CONTEXTVAR 1
+
/* Define if you want documentation strings in extension modules */
#define WITH_DOC_STRINGS 1
echo. -m Enable parallel build (enabled by default)\r
echo. -M Disable parallel build\r
echo. -v Increased output messages\r
+echo. -vv Verbose output messages\r
+echo. -q Quiet output messages (errors and warnings only)\r
echo. -k Attempt to kill any running Pythons before building (usually done\r
echo. automatically by the pythoncore project)\r
echo. --pgo Build with Profile-Guided Optimization. This flag\r
if "%~1"=="-m" (set parallel=/m) & shift & goto CheckOpts\r
if "%~1"=="-M" (set parallel=) & shift & goto CheckOpts\r
if "%~1"=="-v" (set verbose=/v:n) & shift & goto CheckOpts\r
+if "%~1"=="-vv" (set verbose=/v:d /ds) & shift & goto CheckOpts\r
+if "%~1"=="-q" (set verbose=/v:q /nologo /clp:summary) & shift & goto CheckOpts\r
if "%~1"=="-k" (set kill=true) & shift & goto CheckOpts\r
if "%~1"=="--pgo" (set do_pgo=true) & shift & goto CheckOpts\r
if "%~1"=="--pgo-job" (set do_pgo=true) & (set pgo_job=%~2) & shift & shift & goto CheckOpts\r
set libraries=\r
set libraries=%libraries% bzip2-1.0.6\r
if NOT "%IncludeSSLSrc%"=="false" set libraries=%libraries% openssl-1.1.1d\r
-set libraries=%libraries% sqlite-3.28.0.0\r
+set libraries=%libraries% sqlite-3.31.1.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tcl-core-8.6.9.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tk-8.6.9.0\r
if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tix-8.4.3.6\r
<ExternalsDir>$(EXTERNALS_DIR)</ExternalsDir>\r
<ExternalsDir Condition="$(ExternalsDir) == ''">$([System.IO.Path]::GetFullPath(`$(PySourcePath)externals`))</ExternalsDir>\r
<ExternalsDir Condition="!HasTrailingSlash($(ExternalsDir))">$(ExternalsDir)\</ExternalsDir>\r
- <sqlite3Dir>$(ExternalsDir)sqlite-3.28.0.0\</sqlite3Dir>\r
+ <sqlite3Dir>$(ExternalsDir)sqlite-3.31.1.0\</sqlite3Dir>\r
<bz2Dir>$(ExternalsDir)bzip2-1.0.6\</bz2Dir>\r
<lzmaDir>$(ExternalsDir)xz-5.2.2\</lzmaDir>\r
<opensslDir>$(ExternalsDir)openssl-1.1.1d\</opensslDir>\r
</ItemGroup>\r
<PropertyGroup Label="Globals">\r
<ProjectGuid>{9DE9E23D-C8D4-4817-92A9-920A8B1FE5FF}</ProjectGuid>\r
+ <PlatformToolset Condition="'$(PlatformToolset)' == '' and ('$(MSBuildToolsVersion)' == '16.0' or '$(VisualStudioVersion)' == '16.0')">v142</PlatformToolset>\r
</PropertyGroup>\r
<Import Project="python.props" />\r
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />\r
<VCRuntimeDLL Include="$(VCRedistDir)\**\vcruntime*.dll" />\r
</ItemGroup>\r
<Target Name="_CopyVCRuntime" AfterTargets="Build" Inputs="@(VCRuntimeDLL)" Outputs="$(OutDir)%(Filename)%(Extension)">\r
+ <!-- bpo-38597: When we switch to another VCRuntime DLL, include vcruntime140.dll as well -->\r
+ <Warning Text="A copy of vcruntime140.dll is also required" Condition="!$(VCToolsRedistVersion.StartsWith(`14.`))" />\r
<Copy SourceFiles="%(VCRuntimeDLL.FullPath)" DestinationFolder="$(OutDir)" />\r
</Target>\r
<Target Name="_CleanVCRuntime" AfterTargets="Clean">\r
</ItemGroup>\r
<PropertyGroup Label="Globals">\r
<ProjectGuid>{AB603547-1E2A-45B3-9E09-B04596006393}</ProjectGuid>\r
+ <PlatformToolset Condition="'$(PlatformToolset)' == '' and ('$(MSBuildToolsVersion)' == '16.0' or '$(VisualStudioVersion)' == '16.0')">v142</PlatformToolset>\r
</PropertyGroup>\r
<Import Project="python.props" />\r
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />\r
again when building.\r
\r
_sqlite3\r
- Wraps SQLite 3.28.0.0, which is itself built by sqlite3.vcxproj\r
+ Wraps SQLite 3.31.1.0, which is itself built by sqlite3.vcxproj\r
Homepage:\r
http://www.sqlite.org/\r
_tkinter\r
*registry = PyDict_New();
if (*registry == NULL)
- return 0;
+ goto handle_error;
rc = PyDict_SetItemString(globals, "__warningregistry__", *registry);
if (rc < 0)
dangling reference. */
Py_XDECREF(*registry);
Py_XDECREF(*module);
+ Py_XDECREF(*filename);
return 0;
}
asdl_seq_SET(orelse, 0,
If(expression, suite_seq, suite_seq2,
- LINENO(CHILD(n, NCH(n) - 6)),
- CHILD(n, NCH(n) - 6)->n_col_offset,
+ LINENO(CHILD(n, NCH(n) - 7)),
+ CHILD(n, NCH(n) - 7)->n_col_offset,
c->c_arena));
/* the just-created orelse handled the last elif */
n_elif--;
asdl_seq_SET(newobj, 0,
If(expression, suite_seq, orelse,
- LINENO(CHILD(n, off)),
- CHILD(n, off)->n_col_offset, c->c_arena));
+ LINENO(CHILD(n, off - 1)),
+ CHILD(n, off - 1)->n_col_offset, c->c_arena));
orelse = newobj;
}
expression = ast_for_expr(c, CHILD(n, 1));
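The off-by-one fixes above change which token the synthesized If node for an elif clause takes its position from, so that an elif reports the line of the elif keyword itself. The effect is observable from pure Python in interpreters that include this fix:

```python
import ast

src = "if a:\n    x = 1\nelif b:\n    x = 2\nelse:\n    x = 3\n"
outer = ast.parse(src).body[0]        # the outer If node
inner = outer.orelse[0]               # the If node synthesized for 'elif'
print(inner.lineno)                   # line of the 'elif' keyword
```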
static const char cprt[] =
"\
-Copyright (c) 2001-2019 Python Software Foundation.\n\
+Copyright (c) 2001-2020 Python Software Foundation.\n\
All Rights Reserved.\n\
\n\
Copyright (c) 2000 BeOpen.com.\n\
tstate->context = NULL;
tstate->context_ver = 1;
- tstate->id = ++interp->tstate_next_unique_id;
if (init)
_PyThreadState_Init(tstate);
HEAD_LOCK();
+ tstate->id = ++interp->tstate_next_unique_id;
tstate->prev = NULL;
tstate->next = interp->tstate_head;
if (tstate->next)
-This is Python version 3.7.6
+This is Python version 3.7.7
============================
.. image:: https://travis-ci.org/python/cpython.svg?branch=3.7
:alt: CPython code coverage on Codecov
:target: https://codecov.io/gh/python/cpython/branch/3.7
-Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011,
-2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All
-rights reserved.
+Copyright (c) 2001-2020 Python Software Foundation. All rights reserved.
See the end of this file for further copyright and license information.
Copyright and License Information
---------------------------------
-Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011,
-2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All
-rights reserved.
+Copyright (c) 2001-2020 Python Software Foundation. All rights reserved.
Copyright (c) 2000 BeOpen.com. All rights reserved.
}
} else {
if (IsWindows7SP1OrGreater()) {
- BalLog(BOOTSTRAPPER_LOG_LEVEL_STANDARD, "Target OS is Windows 7 SP1 or later");
- return;
+ HMODULE hKernel32 = GetModuleHandleW(L"kernel32");
+ if (hKernel32 && !GetProcAddress(hKernel32, "AddDllDirectory")) {
+ BalLog(BOOTSTRAPPER_LOG_LEVEL_ERROR, "Detected Windows 7 SP1 without KB2533623");
+ BalLog(BOOTSTRAPPER_LOG_LEVEL_ERROR, "KB2533623 update is required to continue.");
+ /* The "MissingSP1" error also specifies updates are required */
+ LocGetString(_wixLoc, L"#(loc.FailureWin7MissingSP1)", &pLocString);
+ } else {
+ BalLog(BOOTSTRAPPER_LOG_LEVEL_STANDARD, "Target OS is Windows 7 SP1 or later");
+ return;
+ }
} else if (IsWindows7OrGreater()) {
BalLog(BOOTSTRAPPER_LOG_LEVEL_ERROR, "Detected Windows 7 RTM");
BalLog(BOOTSTRAPPER_LOG_LEVEL_ERROR, "Service Pack 1 is required to continue installation");
-# generated automatically by aclocal 1.16.1 -*- Autoconf -*-
+# generated automatically by aclocal 1.15 -*- Autoconf -*-
-# Copyright (C) 1996-2018 Free Software Foundation, Inc.
+# Copyright (C) 1996-2014 Free Software Foundation, Inc.
# This file is free software; the Free Software Foundation
# gives unlimited permission to copy and/or distribute it,
with_system_expat
with_system_ffi
with_system_libmpdec
+with_decimal_contextvar
enable_loadable_sqlite_extensions
with_tcltk_includes
with_tcltk_libs
--with-system-ffi build _ctypes module using an installed ffi library
--with-system-libmpdec build _decimal module using an installed libmpdec
library
+ --with-decimal-contextvar
+ build _decimal module using a coroutine-local rather
+ than a thread-local context (default is yes)
--with-tcltk-includes='-I...'
override search for Tcl and Tk include files
--with-tcltk-libs='-L...'
{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $with_system_libmpdec" >&5
$as_echo "$with_system_libmpdec" >&6; }
+# Check whether _decimal should use a coroutine-local or thread-local context
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-decimal-contextvar" >&5
+$as_echo_n "checking for --with-decimal-contextvar... " >&6; }
+
+# Check whether --with-decimal_contextvar was given.
+if test "${with_decimal_contextvar+set}" = set; then :
+ withval=$with_decimal_contextvar;
+else
+ with_decimal_contextvar="yes"
+fi
+
+
+if test "$with_decimal_contextvar" != "no"
+then
+
+$as_echo "#define WITH_DECIMAL_CONTEXTVAR 1" >>confdefs.h
+
+fi
+
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $with_decimal_contextvar" >&5
+$as_echo "$with_decimal_contextvar" >&6; }
+
# Check for support for loadable sqlite extensions
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --enable-loadable-sqlite-extensions" >&5
$as_echo_n "checking for --enable-loadable-sqlite-extensions... " >&6; }
AC_MSG_RESULT($with_system_libmpdec)
+# Check whether _decimal should use a coroutine-local or thread-local context
+AC_MSG_CHECKING(for --with-decimal-contextvar)
+AC_ARG_WITH(decimal_contextvar,
+ AS_HELP_STRING([--with-decimal-contextvar], [build _decimal module using a coroutine-local rather than a thread-local context (default is yes)]),
+ [],
+ [with_decimal_contextvar="yes"])
+
+if test "$with_decimal_contextvar" != "no"
+then
+ AC_DEFINE(WITH_DECIMAL_CONTEXTVAR, 1,
+    [Define if you want to build the _decimal module using a coroutine-local rather than a thread-local context])
+fi
+
+AC_MSG_RESULT($with_decimal_contextvar)
+
# Check for support for loadable sqlite extensions
AC_MSG_CHECKING(for --enable-loadable-sqlite-extensions)
AC_ARG_ENABLE(loadable-sqlite-extensions,
/* Define if WINDOW in curses.h offers a field _flags. */
#undef WINDOW_HAS_FLAGS
+/* Define if you want to build the _decimal module using a coroutine-local rather
+   than a thread-local context */
+#undef WITH_DECIMAL_CONTEXTVAR
+
/* Define if you want documentation strings in extension modules */
#undef WITH_DOC_STRINGS