Metadata-Version: 2.1
Name: numpy
-Version: 1.22.2
+Version: 1.22.3
Summary: NumPy is the fundamental package for array computing with Python.
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
--- /dev/null
+
+Contributors
+============
+
+A total of 9 people contributed to this release. People with a "+" by their
+names contributed a patch for the first time.
+
+* @GalaxySnail +
+* Alexandre de Siqueira
+* Bas van Beek
+* Charles Harris
+* Melissa Weber Mendonça
+* Ross Barnowski
+* Sebastian Berg
+* Tirth Patel
+* Matthieu Darbois
+
+Pull requests merged
+====================
+
+A total of 10 pull requests were merged for this release.
+
+* `#21048 <https://github.com/numpy/numpy/pull/21048>`__: MAINT: Use "3.10" instead of "3.10-dev" on travis.
+* `#21106 <https://github.com/numpy/numpy/pull/21106>`__: TYP,MAINT: Explicitly allow sequences of array-likes in ``np.concatenate``
+* `#21137 <https://github.com/numpy/numpy/pull/21137>`__: BLD,DOC: skip broken ipython 8.1.0
+* `#21138 <https://github.com/numpy/numpy/pull/21138>`__: BUG, ENH: np._from_dlpack: export correct device information
+* `#21139 <https://github.com/numpy/numpy/pull/21139>`__: BUG: Fix numba DUFuncs added loops getting picked up
+* `#21140 <https://github.com/numpy/numpy/pull/21140>`__: BUG: Fix unpickling an empty ndarray with a non-zero dimension...
+* `#21141 <https://github.com/numpy/numpy/pull/21141>`__: BUG: use ThreadPoolExecutor instead of ThreadPool
+* `#21142 <https://github.com/numpy/numpy/pull/21142>`__: API: Disallow strings in logical ufuncs
+* `#21143 <https://github.com/numpy/numpy/pull/21143>`__: MAINT, DOC: Fix SciPy intersphinx link
+* `#21148 <https://github.com/numpy/numpy/pull/21148>`__: BUG,ENH: np._from_dlpack: export arrays with any strided size-1...
# -----------------------------------------------------------------------------
intersphinx_mapping = {
'neps': ('https://numpy.org/neps', None),
- 'python': ('https://docs.python.org/dev', None),
- 'scipy': ('https://docs.scipy.org/doc/scipy/reference', None),
+ 'python': ('https://docs.python.org/3', None),
+ 'scipy': ('https://docs.scipy.org/doc/scipy', None),
'matplotlib': ('https://matplotlib.org/stable', None),
'imageio': ('https://imageio.readthedocs.io/en/stable', None),
'skimage': ('https://scikit-image.org/docs/stable', None),
.. toctree::
:maxdepth: 3
+ 1.22.3 <release/1.22.3-notes>
1.22.2 <release/1.22.2-notes>
1.22.1 <release/1.22.1-notes>
1.22.0 <release/1.22.0-notes>
The Python versions supported in this release are 3.8-3.10; Python 3.7 has been
dropped. Note that 32 bit wheels are only provided for Python 3.8 and 3.9 on
Windows, all other wheels are 64 bits on account of Ubuntu, Fedora, and other
-Linux distributions dropping 32 bit support. All 64 bit wheels are also linked
-with 64 bit integer OpenBLAS, which should fix the occasional problems
+Linux distributions dropping 32 bit support. The Mac wheels are now based on
+OS X 10.14 rather than the 10.6 used in previous NumPy release cycles.
+10.14 is the oldest release supported by Apple. All 64 bit wheels are also
+linked with 64 bit integer OpenBLAS, which should fix the occasional problems
encountered by folks using truly huge arrays.
--- /dev/null
+.. currentmodule:: numpy
+
+==========================
+NumPy 1.22.3 Release Notes
+==========================
+
+NumPy 1.22.3 is a maintenance release that fixes bugs discovered after the
+1.22.2 release. The most noticeable fixes may be those for DLPack. One change
+that may cause some problems is that strings are no longer allowed as inputs
+to the logical ufuncs. It is still undecided how strings should be treated in
+those functions, so it was thought best to simply disallow them until a
+decision is reached. That should not cause problems with older code.
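+
+A minimal sketch of the change: passing strings to a logical ufunc now raises
+a ``TypeError`` rather than being accepted (error text abridged)::
+
+    >>> import numpy as np
+    >>> np.logical_and(["1"], ["3"])
+    Traceback (most recent call last):
+        ...
+    TypeError: ufunc 'logical_and' did not contain a loop with signature ...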
+
+The Python versions supported for this release are 3.8-3.10. Note that the Mac
+wheels are now based on OS X 10.14 rather than the 10.6 used in previous
+NumPy release cycles. 10.14 is the oldest release supported by Apple.
+
+Contributors
+============
+
+A total of 9 people contributed to this release. People with a "+" by their
+names contributed a patch for the first time.
+
+* @GalaxySnail +
+* Alexandre de Siqueira
+* Bas van Beek
+* Charles Harris
+* Melissa Weber Mendonça
+* Ross Barnowski
+* Sebastian Berg
+* Tirth Patel
+* Matthieu Darbois
+
+Pull requests merged
+====================
+
+A total of 10 pull requests were merged for this release.
+
+* `#21048 <https://github.com/numpy/numpy/pull/21048>`__: MAINT: Use "3.10" instead of "3.10-dev" on travis.
+* `#21106 <https://github.com/numpy/numpy/pull/21106>`__: TYP,MAINT: Explicitly allow sequences of array-likes in ``np.concatenate``
+* `#21137 <https://github.com/numpy/numpy/pull/21137>`__: BLD,DOC: skip broken ipython 8.1.0
+* `#21138 <https://github.com/numpy/numpy/pull/21138>`__: BUG, ENH: np._from_dlpack: export correct device information
+* `#21139 <https://github.com/numpy/numpy/pull/21139>`__: BUG: Fix numba DUFuncs added loops getting picked up
+* `#21140 <https://github.com/numpy/numpy/pull/21140>`__: BUG: Fix unpickling an empty ndarray with a non-zero dimension...
+* `#21141 <https://github.com/numpy/numpy/pull/21141>`__: BUG: use ThreadPoolExecutor instead of ThreadPool
+* `#21142 <https://github.com/numpy/numpy/pull/21142>`__: API: Disallow strings in logical ufuncs
+* `#21143 <https://github.com/numpy/numpy/pull/21143>`__: MAINT, DOC: Fix SciPy intersphinx link
+* `#21148 <https://github.com/numpy/numpy/pull/21148>`__: BUG,ENH: np._from_dlpack: export arrays with any strided size-1...
numpydoc==1.1.0
pydata-sphinx-theme==0.7.2
sphinx-panels
-ipython
+ipython!=8.1.0
scipy
matplotlib
pandas
version_json = '''
{
- "date": "2022-02-03T14:24:02-0700",
+ "date": "2022-03-07T12:27:54-0700",
"dirty": false,
"error": null,
- "full-revisionid": "f6dddcb2e5ea5ed39675f14429af3585c585a666",
- "version": "1.22.2"
+ "full-revisionid": "7d4349e332fcba2bc3f266267421531b3ec5d3e6",
+ "version": "1.22.3"
}
''' # END VERSION_JSON
_TD64Like_co,
)
+_T_co = TypeVar("_T_co", covariant=True)
+_T_contra = TypeVar("_T_contra", contravariant=True)
_SCT = TypeVar("_SCT", bound=generic)
_ArrayType = TypeVar("_ArrayType", bound=NDArray[Any])
"modifiedpreceding",
]
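+# Minimal protocol for sequence-like containers: anything that implements
+# ``__len__`` and ``__getitem__`` (e.g. a list or tuple of array-likes)
+# matches, without requiring the full ``Sequence`` interface.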
+class _SupportsLenAndGetItem(Protocol[_T_contra, _T_co]):
+ def __len__(self) -> int: ...
+ def __getitem__(self, __key: _T_contra) -> _T_co: ...
+
__all__: List[str]
ALLOW_THREADS: Final[int] # 0 or 1 (system-specific)
order: _OrderCF = ...,
) -> NDArray[intp]: ...
+# NOTE: Allow any sequence of array-like objects
@overload
def concatenate( # type: ignore[misc]
arrays: _ArrayLike[_SCT],
) -> NDArray[_SCT]: ...
@overload
def concatenate( # type: ignore[misc]
- arrays: ArrayLike,
+ arrays: _SupportsLenAndGetItem[int, ArrayLike],
/,
axis: Optional[SupportsIndex] = ...,
out: None = ...,
) -> NDArray[Any]: ...
@overload
def concatenate( # type: ignore[misc]
- arrays: ArrayLike,
+ arrays: _SupportsLenAndGetItem[int, ArrayLike],
/,
axis: Optional[SupportsIndex] = ...,
out: None = ...,
) -> NDArray[_SCT]: ...
@overload
def concatenate( # type: ignore[misc]
- arrays: ArrayLike,
+ arrays: _SupportsLenAndGetItem[int, ArrayLike],
/,
axis: Optional[SupportsIndex] = ...,
out: None = ...,
) -> NDArray[Any]: ...
@overload
def concatenate(
- arrays: ArrayLike,
+ arrays: _SupportsLenAndGetItem[int, ArrayLike],
/,
axis: Optional[SupportsIndex] = ...,
out: _ArrayType = ...,
ret.device_type = kDLCPU;
ret.device_id = 0;
PyObject *base = PyArray_BASE(self);
+
+    // Walk the chain of bases: for a view of an array created from DLPack,
+    // the internal capsule is stored on the root base (see gh-20340).
+ while (base != NULL && PyArray_Check(base)) {
+ base = PyArray_BASE((PyArrayObject *)base);
+ }
+
// The outer if is due to the fact that NumPy arrays are on the CPU
// by default (if not created from DLPack).
if (PyCapsule_IsValid(base, NPY_DLPACK_INTERNAL_CAPSULE_NAME)) {
if (!PyArray_IS_C_CONTIGUOUS(self) && PyArray_SIZE(self) != 1) {
for (int i = 0; i < ndim; ++i) {
- if (strides[i] % itemsize != 0) {
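+            // A length-1 axis is never actually stepped over, so its stride
+            // may be arbitrary and need not be a multiple of the itemsize.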
+ if (shape[i] != 1 && strides[i] % itemsize != 0) {
PyErr_SetString(PyExc_RuntimeError,
"DLPack only supports strides which are a multiple of "
"itemsize.");
Py_DECREF(typecode);
}
else {
- memcpy(PyArray_DATA(self), datastr, num);
+ memcpy(PyArray_DATA(self), datastr, PyArray_NBYTES(self));
}
PyArray_ENABLEFLAGS(self, NPY_ARRAY_OWNDATA);
fa->base = NULL;
}
info = promote_and_get_info_and_ufuncimpl(ufunc,
ops, signature, new_op_dtypes, NPY_FALSE);
+ if (info == NULL) {
+ /*
+ * NOTE: This block exists solely to support numba's DUFuncs which add
+ * new loops dynamically, so our list may get outdated. Thus, we
+ * have to make sure that the loop exists.
+ *
+ * Before adding a new loop, ensure that it actually exists. There
+ * is a tiny chance that this would not work, but it would require an
+     * extension to additionally have a custom loop getter.
+     * This check should ensure the right error message, but in principle
+ * we could try to call the loop getter here.
+ */
+ char *types = ufunc->types;
+ npy_bool loop_exists = NPY_FALSE;
+ for (int i = 0; i < ufunc->ntypes; ++i) {
+ loop_exists = NPY_TRUE; /* assume it exists, break if not */
+ for (int j = 0; j < ufunc->nargs; ++j) {
+ if (types[j] != new_op_dtypes[j]->type_num) {
+ loop_exists = NPY_FALSE;
+ break;
+ }
+ }
+ if (loop_exists) {
+ break;
+ }
+ types += ufunc->nargs;
+ }
+
+ if (loop_exists) {
+ info = add_and_return_legacy_wrapping_ufunc_loop(
+ ufunc, new_op_dtypes, 0);
+ }
+ }
+
for (int i = 0; i < ufunc->nargs; i++) {
Py_XDECREF(new_op_dtypes[i]);
}
/* bail out, this is _only_ to give future/deprecation warning! */
return -1;
}
+ if ((op_dtypes[0] != NULL && PyTypeNum_ISSTRING(op_dtypes[0]->type_num))
+ || PyTypeNum_ISSTRING(op_dtypes[1]->type_num)) {
+ /* bail out on strings: currently casting them to bool is too weird */
+ return -1;
+ }
for (int i = 0; i < 3; i++) {
PyArray_DTypeMeta *item;
def test_dlpack_device(self):
x = np.arange(5)
assert x.__dlpack_device__() == (1, 0)
- assert np._from_dlpack(x).__dlpack_device__() == (1, 0)
+ y = np._from_dlpack(x)
+ assert y.__dlpack_device__() == (1, 0)
+ z = y[::2]
+ assert z.__dlpack_device__() == (1, 0)
def dlpack_deleter_exception(self):
x = np.arange(5)
_ = x.__dlpack__()
raise RuntimeError
-
+
def test_dlpack_destructor_exception(self):
with pytest.raises(RuntimeError):
self.dlpack_deleter_exception()
x.flags.writeable = False
with pytest.raises(TypeError):
x.__dlpack__()
+
+ def test_ndim0(self):
+ x = np.array(1.0)
+ y = np._from_dlpack(x)
+ assert_array_equal(x, y)
+
+ def test_size1dims_arrays(self):
+ x = np.ndarray(dtype='f8', shape=(10, 5, 1), strides=(8, 80, 4),
+ buffer=np.ones(1000, dtype=np.uint8), order='F')
+ y = np._from_dlpack(x)
+ assert_array_equal(x, y)
assert_equal(zs.dtype, zs2.dtype)
+ def test_pickle_empty(self):
+        """Check that pickling and unpickling an empty array does not cause
+        a segmentation fault."""
+ arr = np.array([]).reshape(999999, 0)
+ pk_dmp = pickle.dumps(arr)
+ pk_load = pickle.loads(pk_dmp)
+
+ assert pk_load.size == 0
+
@pytest.mark.skipif(pickle.HIGHEST_PROTOCOL < 5,
reason="requires pickle protocol 5")
def test_pickle_with_buffercallback(self):
[np.logical_and, np.logical_or, np.logical_xor])
def test_logical_ufuncs_support_anything(self, ufunc):
# The logical ufuncs support even input that can't be promoted:
- a = np.array('1')
+ a = np.array(b'1', dtype="V3")
c = np.array([1., 2.])
assert_array_equal(ufunc(a, c), ufunc([True, True], True))
assert ufunc.reduce(a) == True
out = np.zeros((), dtype=a.dtype)
assert ufunc.reduce(a, out=out) == 1
+ @pytest.mark.parametrize("ufunc",
+ [np.logical_and, np.logical_or, np.logical_xor])
+ def test_logical_ufuncs_reject_string(self, ufunc):
+ """
+ Logical ufuncs are normally well defined by working with the boolean
+ equivalent, i.e. casting all inputs to bools should work.
+
+ However, casting strings to bools is *currently* weird, because it
+ actually uses `bool(int(str))`. Thus we explicitly reject strings.
+    Once string to bool casts are well defined in NumPy, these calls should
+    succeed again and this test can probably just be removed.
+ """
+ with pytest.raises(TypeError, match="contain a loop with signature"):
+ ufunc(["1"], ["3"])
+ with pytest.raises(TypeError, match="contain a loop with signature"):
+ ufunc.reduce(["1", "2", "0"])
+
@pytest.mark.parametrize("ufunc",
[np.logical_and, np.logical_or, np.logical_xor])
def test_logical_ufuncs_out_cast_check(self, ufunc):
if len(build) > 1 and jobs > 1:
# build parallel
- import multiprocessing.pool
- pool = multiprocessing.pool.ThreadPool(jobs)
- pool.map(single_compile, build_items)
- pool.close()
+ from concurrent.futures import ThreadPoolExecutor
+ with ThreadPoolExecutor(jobs) as pool:
+        # Executor.map is lazy, so consume the iterator to surface any
+        # exceptions raised during compilation.
+        list(pool.map(single_compile, build_items))
else:
# build serial
for o in build_items:
--- /dev/null
+import numpy as np
+import numpy.typing as npt
+
+AR_f8: npt.NDArray[np.float64]
+
+# NOTE: Mypy bug presumably due to the special-casing of heterogeneous tuples;
+# xref numpy/numpy#20901
+#
+# The expected output should be no different than, e.g., when using a
+# list instead of a tuple
+np.concatenate(([1], AR_f8)) # E: Argument 1 to "concatenate" has incompatible type
reveal_type(np.empty([1, 5, 6], dtype='c16')) # E: ndarray[Any, dtype[Any]]
reveal_type(np.concatenate(A)) # E: ndarray[Any, dtype[{float64}]]
+reveal_type(np.concatenate([A, A])) # E: Any
+reveal_type(np.concatenate([[1], A])) # E: ndarray[Any, dtype[Any]]
+reveal_type(np.concatenate([[1], [1]])) # E: ndarray[Any, dtype[Any]]
+reveal_type(np.concatenate((A, A))) # E: ndarray[Any, dtype[{float64}]]
+reveal_type(np.concatenate(([1], [1]))) # E: ndarray[Any, dtype[Any]]
reveal_type(np.concatenate([1, 1.0])) # E: ndarray[Any, dtype[Any]]
reveal_type(np.concatenate(A, dtype=np.int64)) # E: ndarray[Any, dtype[{int64}]]
reveal_type(np.concatenate(A, dtype='c16')) # E: ndarray[Any, dtype[Any]]
--- /dev/null
+from typing import Any
+import numpy.typing as npt
+
+AR_Any: npt.NDArray[Any]
+
+# Mypy bug where overload ambiguity is ignored for `Any`-parametrized types;
+# xref numpy/numpy#20099 and python/mypy#11347
+#
+# The expected output would be something akin to `ndarray[Any, dtype[Any]]`
+reveal_type(AR_Any + 2) # E: ndarray[Any, dtype[signedinteger[Any]]]
#-----------------------------------
# Path to the release notes
-RELEASE_NOTES = 'doc/source/release/1.22.2-notes.rst'
+RELEASE_NOTES = 'doc/source/release/1.22.3-notes.rst'
#-------------------------------------------------------