--- /dev/null
+<!-- First time contributors: Take a moment to review https://setuptools.readthedocs.io/en/latest/developer-guide.html! -->
+<!-- Remove sections if not applicable -->
+
+## Summary of changes
+
+<!-- Summary goes here -->
+
+Closes <!-- issue number here -->
+
+### Pull Request Checklist
+- [ ] Changes have tests
+- [ ] News fragment added in changelog.d. See [documentation](https://setuptools.readthedocs.io/en/latest/developer-guide.html#making-a-pull-request) for details
bin
build
dist
+docs/build
include
lib
distribute.egg-info
language: python
python:
- &latest_py2 2.7
-- 3.3
- 3.4
- 3.5
- &latest_py3 3.6
cache: pip
install:
+# ensure we have recent pip/setuptools
+- pip install --upgrade pip setuptools
# need tox to get started
-- pip install tox 'tox-venv; python_version!="3.3"'
+- pip install tox tox-venv
# Output the env, to verify behavior
- env
+v39.2.0
+-------
+
+* #1359: Support using "file:" to load a PEP 440-compliant package version from
+ a text file.
+* #1360: Fixed an issue with a mismatch between the name of the package and
+  the name of the .dist-info file in wheel files.
+* #1365: Take the package_dir option into account when loading the version from
+ a module attribute.
+* #1353: Added coverage badge to README.
+* #1356: Made small fixes to the developer guide documentation.
+* #1357: Fixed warnings in documentation builds and started enforcing that the
+ docs build without warnings in tox.
+* #1376: Updated release process docs.
+* #1343: The ``setuptools`` specific ``long_description_content_type``,
+ ``project_urls`` and ``provides_extras`` fields are now set consistently
+ after any ``distutils`` ``setup_keywords`` calls, allowing them to override
+ values.
+* #1352: Added ``tox`` environment for documentation builds.
+* #1354: Added ``towncrier`` for changelog management.
+* #1355: Added a PR template.
+* #1368: Fixed tests which failed without network connectivity.
+* #1369: Added unit tests for PEP 425 compatibility tags support.
+* #1372: Stopped testing Python 3.3 in Travis CI, now that the latest version
+  of ``wheel`` no longer installs on it.
+
v39.1.0
-------
.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI
:target: https://travis-ci.org/pypa/setuptools
-.. image:: https://img.shields.io/appveyor/ci/jaraco/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
- :target: https://ci.appveyor.com/project/jaraco/setuptools/branch/master
+.. image:: https://img.shields.io/appveyor/ci/pypa/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
+ :target: https://ci.appveyor.com/project/pypa/setuptools/branch/master
+
+.. image:: https://img.shields.io/codecov/c/github/pypa/setuptools/master.svg
+ :target: https://codecov.io/gh/pypa/setuptools
.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg
CODECOV_ENV: APPVEYOR_JOB_NAME
matrix:
- - APPVEYOR_JOB_NAME: "python35-x64"
- PYTHON: "C:\\Python35-x64"
+ - APPVEYOR_JOB_NAME: "python36-x64"
+ PYTHON: "C:\\Python36-x64"
- APPVEYOR_JOB_NAME: "python27-x64"
PYTHON: "C:\\Python27-x64"
--- /dev/null
+Add a ``__dir__()`` implementation to ``pkg_resources.Distribution`` that includes the attributes of the ``_provider`` instance variable.
\ No newline at end of file
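
An illustrative sketch of the resulting behavior (the distribution queried
below is arbitrary; on Python 2.7 the default ``dir()`` is kept)::

    import pkg_resources

    dist = pkg_resources.get_distribution('setuptools')
    # Public attributes of the metadata provider are now listed alongside
    # the Distribution's own attributes.
    print(sorted(name for name in dir(dist) if not name.startswith('_')))
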
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
-extensions = ['jaraco.packaging.sphinx', 'rst.linker', 'sphinx.ext.autosectionlabel']
+extensions = ['jaraco.packaging.sphinx', 'rst.linker']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The master toctree document.
master_doc = 'index'
+# A list of glob-style patterns that should be excluded when looking for source files.
+exclude_patterns = ['requirements.txt']
+
# List of directories, relative to source directory, that shouldn't be searched
# for source files.
exclude_trees = []
that system integrators and other users can get a quick summary, but then
jump to the in-depth discussion about any subject referenced.
+---------------------
+Making a pull request
+---------------------
+
+When making a pull request, please include a short summary of the changes
+and a reference to any issue tickets that the PR is intended to solve.
+All PRs with code changes should include tests. All changes should include a
+changelog entry.
+
+``setuptools`` uses `towncrier <https://town-crier.readthedocs.io/en/latest/>`_
+for changelog management, so when making a PR, please add a news fragment in
+the ``changelog.d/`` folder. Changelog files are written in reStructuredText
+and should contain a one- or two-sentence description of the substantive
+changes in the PR. They should be named ``<pr_number>.<category>.rst``, where
+the categories are:
+
+- ``change``: Any backwards compatible code change
+- ``breaking``: Any backwards-compatibility breaking change
+- ``doc``: A change to the documentation
+- ``misc``: Changes internal to the repo like CI, test and build changes
+- ``deprecation``: For deprecations of an existing feature or behavior
+
+A pull request may have more than one of these components; for example, a code
+change may introduce a new feature that deprecates an old feature, in which
+case two fragments should be added. It is not necessary to make a separate
+documentation fragment for documentation changes accompanying the relevant
+code changes. See the following for an example news fragment:
+
+.. code-block:: bash
+
+ $ cat changelog.d/1288.change.rst
+ Add support for maintainer in PKG-INFO
+
-----------
Source Code
-----------
Grab the code at Github::
- $ git checkout https://github.com/pypa/setuptools
+ $ git clone https://github.com/pypa/setuptools
If you want to contribute changes, we recommend you fork the repository on
Github, commit the changes to your repository, and then make a pull request
Testing
-------
-The primary tests are run using tox. To run the tests, first make
-sure you have tox installed, then invoke it::
+The primary tests are run using tox. To run them, first generate the metadata
+needed by the test suite::
+
+ $ python bootstrap.py
+
+Then make sure you have tox installed, and invoke it::
$ tox
Building Documentation
----------------------
-Setuptools relies on the Sphinx system for building documentation.
-To accommodate RTD, docs must be built from the docs/ directory.
+Setuptools relies on the `Sphinx`_ system for building documentation.
+The `published documentation`_ is hosted on Read the Docs.
+
+To build the docs locally, use tox::
-To build them, you need to have installed the requirements specified
-in docs/requirements.txt. One way to do this is to use rwt:
+ $ tox -e docs
- setuptools/docs$ python -m rwt -r requirements.txt -- -m sphinx . html
+.. _Sphinx: http://www.sphinx-doc.org/en/master/
+.. _published documentation: https://setuptools.readthedocs.io/en/latest/
simply by creating an appropriate `IResourceProvider`_ implementation; see the
section below on `Supporting Custom Importers`_ for more details.
+.. _ResourceManager API:
``ResourceManager`` API
=======================
successful build of a tagged release per
`PyPI deployment <https://docs.travis-ci.com/user/deployment/pypi>`_.
-Prior to cutting a release, please check that the CHANGES.rst reflects
-the summary of changes since the last release.
-Ideally, these changelog entries would have been added
-along with the changes, but it's always good to check.
-Think about it from the
-perspective of a user not involved with the development--what would
-that person want to know about what has changed--or from the
-perspective of your future self wanting to know when a particular
-change landed.
-
-To cut a release, install and run ``bump2version {part}`` where ``part``
+Prior to cutting a release, please use `towncrier`_ to update
+``CHANGES.rst`` to summarize the changes since the last release.
+To update the changelog:
+
+1. Install towncrier via ``pip install towncrier`` if not already installed.
+2. Preview the new ``CHANGES.rst`` entry by running
+ ``towncrier --draft --version {new.version.number}`` (enter the desired
+ version number for the next release). If any changes are needed, make
+ them and generate a new preview until the output is acceptable. Run
+ ``git add`` for any modified files.
+3. Run ``towncrier --version {new.version.number}`` to stage the changelog
+ updates in git.
+
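+For example, for a hypothetical 40.0.0 release the two ``towncrier``
+invocations would be::
+
+    $ towncrier --draft --version 40.0.0   # preview the generated entry
+    $ towncrier --version 40.0.0           # stage the changelog updates
+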
+Once the changelog edits are staged and ready to commit, cut a release by
+installing and running ``bump2version {part}`` where ``part``
is major, minor, or patch based on the scope of the changes in the
-release. Then, push the commits to the master branch. If tests pass,
-the release will be uploaded to PyPI (from the Python 3.6 tests).
+release. Then, push the commits to the master branch::
+
+ $ git push origin master
+ $ git push --tags
+
+If tests pass, the release will be uploaded to PyPI (from the Python 3.6
+tests).
+
+.. _towncrier: https://pypi.org/project/towncrier/
Release Frequency
-----------------
pre- or post-release tags. See the following sections for more details:
* `Tagging and "Daily Build" or "Snapshot" Releases`_
-* `Managing "Continuous Releases" Using Subversion`_
* The `egg_info`_ command
Anyway, ``find_packages()`` walks the target directory, filtering by inclusion
patterns, and finds Python packages (any directory). Packages are only
-recognized if they include an ``__init__.py`` file. Finally, exclusion
+recognized if they include an ``__init__.py`` file. Finally, exclusion
patterns are applied to remove matching packages.
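
For instance, a minimal sketch (the package names here are purely
illustrative) of how inclusion and exclusion patterns combine::

    from setuptools import find_packages

    # keep everything under "mypkg", but drop its test subpackages
    packages = find_packages(include=['mypkg', 'mypkg.*'],
                             exclude=['mypkg.tests', 'mypkg.tests.*'])
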
Inclusion and exclusion patterns are package names, optionally including
flag, so that it will not be necessary for ``bdist_egg`` or ``EasyInstall`` to
try to guess whether your project can work as a zipfile.
+.. _Namespace Packages:
Namespace Packages
------------------
``Setuptools`` allows using configuration files (usually :file:`setup.cfg`)
to define a package’s metadata and other options that are normally supplied
-to the ``setup()`` function.
+to the ``setup()`` function (declarative config).
This approach not only allows automation scenarios but also reduces
boilerplate code in some cases.
scripts =
bin/first.py
bin/second.py
+ install_requires =
+ requests
+ importlib; python_version == "2.6"
[options.package_data]
* = *.txt, *.rst
Key Aliases Type
============================== ================= =====
name str
-version attr:, str
+version attr:, file:, str
url home-page str
download_url download-url str
project_urls dict
obsoletes list-comma
============================== ================= =====
+.. note::
+ A version loaded using the ``file:`` directive must comply with PEP 440.
+ It is easy to accidentally put something other than a valid version
+ string in such a file, so validation is stricter in this case.
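+
+For example, a project that keeps its version in a plain text file (the file
+name below is purely illustrative) could use::
+
+    [metadata]
+    version = file: VERSION
+
+where ``VERSION`` contains a single PEP 440-compliant version string such as
+``1.2.3``.
+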
Options
-------
--- /dev/null
+# Configuration for pull request documentation previews via Netlify
+
+[build]
+ publish = "docs/build/html"
+ command = "pip install tox && tox -e docs"
XXX Currently this is the same as ``distutils.util.get_platform()``, but it
needs some hacks for Linux and Mac OS X.
"""
- try:
- # Python 2.7 or >=3.2
- from sysconfig import get_platform
- except ImportError:
- from distutils.util import get_platform
+ from sysconfig import get_platform
plat = get_platform()
if sys.platform == "darwin" and not plat.startswith('macosx-'):
raise AttributeError(attr)
return getattr(self._provider, attr)
+ def __dir__(self):
+ return list(
+ set(super(Distribution, self).__dir__())
+ | set(
+ attr for attr in self._provider.__dir__()
+ if not attr.startswith('_')
+ )
+ )
+
+    if not hasattr(object, '__dir__'):
+        # object.__dir__ is not available on Python 2.7, so fall back
+        # to the default dir() there
+        del __dir__
+
@classmethod
def from_filename(cls, filename, metadata=None, **kw):
return cls.from_location(
for v in "Twisted>=1.5", "Twisted>=1.5\nZConfig>=2.0":
self.checkRequires(self.distRequires(v), v)
+ needs_object_dir = pytest.mark.skipif(
+ not hasattr(object, '__dir__'),
+ reason='object.__dir__ necessary for self.__dir__ implementation',
+ )
+
+ def test_distribution_dir(self):
+ d = pkg_resources.Distribution()
+ dir(d)
+
+ @needs_object_dir
+ def test_distribution_dir_includes_provider_dir(self):
+ d = pkg_resources.Distribution()
+ before = d.__dir__()
+ assert 'test_attr' not in before
+ d._provider.test_attr = None
+ after = d.__dir__()
+ assert len(after) == len(before) + 1
+ assert 'test_attr' in after
+
+ @needs_object_dir
+ def test_distribution_dir_ignores_provider_dir_leading_underscore(self):
+ d = pkg_resources.Distribution()
+ before = d.__dir__()
+ assert '_test_attr' not in before
+ d._provider._test_attr = None
+ after = d.__dir__()
+ assert len(after) == len(before)
+ assert '_test_attr' not in after
+
def testResolve(self):
ad = pkg_resources.Environment([])
ws = WorkingSet([])
--- /dev/null
+[tool.towncrier]
+ package = "setuptools"
+ package_dir = "setuptools"
+ filename = "CHANGES.rst"
+ directory = "changelog.d"
+ title_format = "v{version}"
+ issue_format = "#{issue}"
+ template = "towncrier_template.rst"
+ underlines = ["-"]
+
+ [[tool.towncrier.type]]
+ directory = "deprecation"
+ name = "Deprecations"
+ showcontent = true
+
+ [[tool.towncrier.type]]
+ directory = "breaking"
+ name = "Breaking Changes"
+ showcontent = true
+
+ [[tool.towncrier.type]]
+ directory = "change"
+ name = "Changes"
+ showcontent = true
+
+ [[tool.towncrier.type]]
+ directory = "doc"
+ name = "Documentation changes"
+ showcontent = true
+
+ [[tool.towncrier.type]]
+ directory = "misc"
+ name = "Misc"
+ showcontent = true
[bumpversion]
-current_version = 39.1.0
+current_version = 39.2.0
commit = True
tag = True
setup_params = dict(
name="setuptools",
- version="39.1.0",
+ version="39.2.0",
description=(
"Easily download, build, install, upgrade, and uninstall "
"Python packages"
from importlib import import_module
from distutils.errors import DistutilsOptionError, DistutilsFileError
+from setuptools.extern.packaging.version import LegacyVersion, parse
from setuptools.extern.six import string_types
If False exceptions are propagated as expected.
:rtype: list
"""
- meta = ConfigMetadataHandler(
- distribution.metadata, command_options, ignore_option_errors)
- meta.parse()
-
options = ConfigOptionsHandler(
distribution, command_options, ignore_option_errors)
options.parse()
+ meta = ConfigMetadataHandler(
+        distribution.metadata, command_options, ignore_option_errors,
+        distribution.package_dir)
+ meta.parse()
+
return meta, options
return f.read()
@classmethod
- def _parse_attr(cls, value):
+ def _parse_attr(cls, value, package_dir=None):
"""Represents value as a module attribute.
Examples:
module_name = '.'.join(attrs_path)
module_name = module_name or '__init__'
- sys.path.insert(0, os.getcwd())
+ parent_path = os.getcwd()
+ if package_dir:
+ if attrs_path[0] in package_dir:
+ # A custom path was specified for the module we want to import
+ custom_path = package_dir[attrs_path[0]]
+ parts = custom_path.rsplit('/', 1)
+ if len(parts) > 1:
+ parent_path = os.path.join(os.getcwd(), parts[0])
+ module_name = parts[1]
+ else:
+ module_name = custom_path
+ elif '' in package_dir:
+ # A custom parent directory was specified for all root modules
+ parent_path = os.path.join(os.getcwd(), package_dir[''])
+ sys.path.insert(0, parent_path)
try:
module = import_module(module_name)
value = getattr(module, attr_name)
"""
+ def __init__(self, target_obj, options, ignore_option_errors=False,
+ package_dir=None):
+ super(ConfigMetadataHandler, self).__init__(target_obj, options,
+ ignore_option_errors)
+ self.package_dir = package_dir
+
@property
def parsers(self):
"""Metadata item name to parser function mapping."""
:rtype: str
"""
- version = self._parse_attr(value)
+ version = self._parse_file(value)
+
+ if version != value:
+ version = version.strip()
+ # Be strict about versions loaded from file because it's easy to
+ # accidentally include newlines and other unintended content
+ if isinstance(parse(version), LegacyVersion):
+            raise DistutilsOptionError(
+                'Version loaded from %s does not comply with PEP 440: %s'
+                % (value, version))
+ return version
+
+ version = self._parse_attr(value, self.package_dir)
if callable(version):
version = version()
distribution for the included and excluded features.
"""
+ _DISTUTILS_UNSUPPORTED_METADATA = {
+ 'long_description_content_type': None,
+ 'project_urls': dict,
+ 'provides_extras': set,
+ }
+
_patched_dist = None
def patch_missing_pkg_info(self, attrs):
self.require_features = []
self.features = {}
self.dist_files = []
+        # Filter out setuptools-specific options.
self.src_root = attrs.pop("src_root", None)
self.patch_missing_pkg_info(attrs)
- self.project_urls = attrs.get('project_urls', {})
self.dependency_links = attrs.pop('dependency_links', [])
self.setup_requires = attrs.pop('setup_requires', [])
for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'):
vars(self).setdefault(ep.name, None)
- _Distribution.__init__(self, attrs)
-
- # The project_urls attribute may not be supported in distutils, so
- # prime it here from our value if not automatically set
- self.metadata.project_urls = getattr(
- self.metadata, 'project_urls', self.project_urls)
- self.metadata.long_description_content_type = attrs.get(
- 'long_description_content_type'
- )
- self.metadata.provides_extras = getattr(
- self.metadata, 'provides_extras', set()
- )
+ _Distribution.__init__(self, {
+ k: v for k, v in attrs.items()
+ if k not in self._DISTUTILS_UNSUPPORTED_METADATA
+ })
+
+        # Fill in missing metadata fields not supported by distutils.
+        # Note some fields may have been set by other tools (e.g. pbr)
+        # above; those values are preferred over setup() arguments.
+ for option, default in self._DISTUTILS_UNSUPPORTED_METADATA.items():
+ for source in self.metadata.__dict__, attrs:
+ if option in source:
+ value = source[option]
+ break
+ else:
+ value = default() if default else None
+ setattr(self.metadata, option, value)
if isinstance(self.metadata.version, numbers.Number):
# Some people apparently take "version number" too literally :)
import contextlib
import pytest
from distutils.errors import DistutilsOptionError, DistutilsFileError
-from setuptools.dist import Distribution
+from mock import patch
+from setuptools.dist import Distribution, _Distribution
from setuptools.config import ConfigHandler, read_configuration
def make_package_dir(name, base_dir):
- dir_package = base_dir.mkdir(name)
+ dir_package = base_dir
+ for dir_name in name.split('/'):
+ dir_package = dir_package.mkdir(dir_name)
init_file = dir_package.join('__init__.py')
init_file.write('')
return dir_package, init_file
-def fake_env(tmpdir, setup_cfg, setup_py=None):
+def fake_env(tmpdir, setup_cfg, setup_py=None, package_path='fake_package'):
if setup_py is None:
setup_py = (
config = tmpdir.join('setup.cfg')
config.write(setup_cfg)
- package_dir, init_file = make_package_dir('fake_package', tmpdir)
+ package_dir, init_file = make_package_dir(package_path, tmpdir)
init_file.write(
'VERSION = (1, 2, 3)\n'
with get_dist(tmpdir) as dist:
assert dist.metadata.version == '2016.11.26'
+ def test_version_file(self, tmpdir):
+
+ _, config = fake_env(
+ tmpdir,
+ '[metadata]\n'
+ 'version = file: fake_package/version.txt\n'
+ )
+ tmpdir.join('fake_package', 'version.txt').write('1.2.3\n')
+
+ with get_dist(tmpdir) as dist:
+ assert dist.metadata.version == '1.2.3'
+
+ tmpdir.join('fake_package', 'version.txt').write('1.2.3\n4.5.6\n')
+ with pytest.raises(DistutilsOptionError):
+ with get_dist(tmpdir) as dist:
+ _ = dist.metadata.version
+
+ def test_version_with_package_dir_simple(self, tmpdir):
+
+ _, config = fake_env(
+ tmpdir,
+ '[metadata]\n'
+ 'version = attr: fake_package_simple.VERSION\n'
+ '[options]\n'
+ 'package_dir =\n'
+ ' = src\n',
+ package_path='src/fake_package_simple'
+ )
+
+ with get_dist(tmpdir) as dist:
+ assert dist.metadata.version == '1.2.3'
+
+ def test_version_with_package_dir_rename(self, tmpdir):
+
+ _, config = fake_env(
+ tmpdir,
+ '[metadata]\n'
+ 'version = attr: fake_package_rename.VERSION\n'
+ '[options]\n'
+ 'package_dir =\n'
+ ' fake_package_rename = fake_dir\n',
+ package_path='fake_dir'
+ )
+
+ with get_dist(tmpdir) as dist:
+ assert dist.metadata.version == '1.2.3'
+
+ def test_version_with_package_dir_complex(self, tmpdir):
+
+ _, config = fake_env(
+ tmpdir,
+ '[metadata]\n'
+ 'version = attr: fake_package_complex.VERSION\n'
+ '[options]\n'
+ 'package_dir =\n'
+ ' fake_package_complex = src/fake_dir\n',
+ package_path='src/fake_dir'
+ )
+
+ with get_dist(tmpdir) as dist:
+ assert dist.metadata.version == '1.2.3'
+
def test_unknown_meta_item(self, tmpdir):
fake_env(
with get_dist(tmpdir) as dist:
assert dist.entry_points == expected
+
+saved_dist_init = _Distribution.__init__
+class TestExternalSetters:
+    # During creation of the setuptools Distribution() object, we call
+    # the init of the parent distutils Distribution object via
+    # _Distribution.__init__().
+    #
+    # It's possible distutils calls out to various keyword
+    # implementations (i.e. distutils.setup_keywords entry points)
+    # that may set a range of variables.
+    #
+    # This wraps distutils' Distribution.__init__ and simulates
+    # pbr or something else setting these values.
+ def _fake_distribution_init(self, dist, attrs):
+ saved_dist_init(dist, attrs)
+        # see self._DISTUTILS_UNSUPPORTED_METADATA
+ setattr(dist.metadata, 'long_description_content_type',
+ 'text/something')
+ # Test overwrite setup() args
+ setattr(dist.metadata, 'project_urls', {
+ 'Link One': 'https://example.com/one/',
+ 'Link Two': 'https://example.com/two/',
+ })
+ return None
+
+ @patch.object(_Distribution, '__init__', autospec=True)
+ def test_external_setters(self, mock_parent_init, tmpdir):
+ mock_parent_init.side_effect = self._fake_distribution_init
+
+ dist = Distribution(attrs={
+ 'project_urls': {
+ 'will_be': 'ignored'
+ }
+ })
+
+ assert dist.metadata.long_description_content_type == 'text/something'
+ assert dist.metadata.project_urls == {
+ 'Link One': 'https://example.com/one/',
+ 'Link Two': 'https://example.com/two/',
+ }
self._validate_content_order(content, expected_order)
- def test_egg_base_installed_egg_info(self, tmpdir_cwd, env):
+ def test_expected_files_produced(self, tmpdir_cwd, env):
self._create_project()
- self._run_install_command(tmpdir_cwd, env)
- actual = self._find_egg_info_files(env.paths['lib'])
+ self._run_egg_info_command(tmpdir_cwd, env)
+ actual = os.listdir('foo.egg-info')
expected = [
'PKG-INFO',
'usage.rst': "Run 'hi'",
}
})
- self._run_install_command(tmpdir_cwd, env)
- egg_info_dir = self._find_egg_info_files(env.paths['lib']).base
+ self._run_egg_info_command(tmpdir_cwd, env)
+ egg_info_dir = os.path.join('.', 'foo.egg-info')
sources_txt = os.path.join(egg_info_dir, 'SOURCES.txt')
with open(sources_txt) as f:
assert 'docs/usage.rst' in f.read().split('\n')
'''
install_requires_deterministic
- install_requires=["fake-factory==0.5.2", "pytz"]
+ install_requires=["wheel>=0.5", "pytest"]
[options]
install_requires =
- fake-factory==0.5.2
- pytz
+ wheel>=0.5
+ pytest
- fake-factory==0.5.2
- pytz
+ wheel>=0.5
+ pytest
''',
'''
install_requires_ordered
- install_requires=["fake-factory>=1.12.3,!=2.0"]
+ install_requires=["pytest>=3.0.2,!=10.9999"]
[options]
install_requires =
- fake-factory>=1.12.3,!=2.0
+ pytest>=3.0.2,!=10.9999
- fake-factory!=2.0,>=1.12.3
+ pytest!=10.9999,>=3.0.2
''',
'''
self, tmpdir_cwd, env, requires, use_setup_cfg,
expected_requires, install_cmd_kwargs):
self._setup_script_with_requires(requires, use_setup_cfg)
- self._run_install_command(tmpdir_cwd, env, **install_cmd_kwargs)
+ self._run_egg_info_command(tmpdir_cwd, env, **install_cmd_kwargs)
egg_info_dir = os.path.join('.', 'foo.egg-info')
requires_txt = os.path.join(egg_info_dir, 'requires.txt')
if os.path.exists(requires_txt):
req = 'install_requires={"fake-factory==0.5.2", "pytz"}'
self._setup_script_with_requires(req)
with pytest.raises(AssertionError):
- self._run_install_command(tmpdir_cwd, env)
+ self._run_egg_info_command(tmpdir_cwd, env)
def test_extras_require_with_invalid_marker(self, tmpdir_cwd, env):
tmpl = 'extras_require={{":{marker}": ["barbazquux"]}},'
req = tmpl.format(marker=self.invalid_marker)
self._setup_script_with_requires(req)
with pytest.raises(AssertionError):
- self._run_install_command(tmpdir_cwd, env)
+ self._run_egg_info_command(tmpdir_cwd, env)
assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == []
def test_extras_require_with_invalid_marker_in_req(self, tmpdir_cwd, env):
req = tmpl.format(marker=self.invalid_marker)
self._setup_script_with_requires(req)
with pytest.raises(AssertionError):
- self._run_install_command(tmpdir_cwd, env)
+ self._run_egg_info_command(tmpdir_cwd, env)
assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == []
def test_provides_extra(self, tmpdir_cwd, env):
assert 'Requires-Python: >=2.7.12' in pkg_info_lines
assert 'Metadata-Version: 1.2' in pkg_info_lines
- def test_python_requires_install(self, tmpdir_cwd, env):
- self._setup_script_with_requires(
- """python_requires='>=1.2.3',""")
- self._run_install_command(tmpdir_cwd, env)
- egg_info_dir = self._find_egg_info_files(env.paths['lib']).base
- pkginfo = os.path.join(egg_info_dir, 'PKG-INFO')
- with open(pkginfo) as f:
- assert 'Requires-Python: >=1.2.3' in f.read().split('\n')
-
def test_manifest_maker_warning_suppression(self):
fixtures = [
"standard file not found: should have one of foo.py, bar.py",
for msg in fixtures:
assert manifest_maker._should_suppress_warning(msg)
- def _run_install_command(self, tmpdir_cwd, env, cmd=None, output=None):
+ def _run_egg_info_command(self, tmpdir_cwd, env, cmd=None, output=None):
environ = os.environ.copy().update(
HOME=env.paths['home'],
)
if cmd is None:
cmd = [
- 'install',
- '--home', env.paths['home'],
- '--install-lib', env.paths['lib'],
- '--install-scripts', env.paths['scripts'],
- '--install-data', env.paths['data'],
+ 'egg_info',
]
code, data = environment.run_setup_py(
cmd=cmd,
raise AssertionError(data)
if output:
assert output in data
-
- def _find_egg_info_files(self, root):
- class DirList(list):
- def __init__(self, files, base):
- super(DirList, self).__init__(files)
- self.base = base
-
- results = (
- DirList(filenames, dirpath)
- for dirpath, dirnames, filenames in os.walk(root)
- if os.path.basename(dirpath) == 'EGG-INFO'
- )
- # expect exactly one result
- result, = results
- return result
--- /dev/null
+import warnings
+
+from setuptools.glibc import check_glibc_version
+
+
+class TestGlibc(object):
+ def test_manylinux1_check_glibc_version(self):
+ """
+ Test that the check_glibc_version function is robust against weird
+ glibc version strings.
+ """
+ for two_twenty in ["2.20",
+ # used by "linaro glibc", see gh-3588
+ "2.20-2014.11",
+ # weird possibilities that I just made up
+ "2.20+dev",
+ "2.20-custom",
+ "2.20.1",
+ ]:
+ assert check_glibc_version(two_twenty, 2, 15)
+ assert check_glibc_version(two_twenty, 2, 20)
+ assert not check_glibc_version(two_twenty, 2, 21)
+ assert not check_glibc_version(two_twenty, 3, 15)
+ assert not check_glibc_version(two_twenty, 1, 15)
+
+ # For strings that we just can't parse at all, we should warn and
+ # return false
+ for bad_string in ["asdf", "", "foo.bar"]:
+ with warnings.catch_warnings(record=True) as ws:
+ warnings.filterwarnings("always")
+ assert not check_glibc_version(bad_string, 2, 5)
+ for w in ws:
+ if "Expected glibc version with" in str(w.message):
+ break
+ else:
+ # Didn't find the warning we were expecting
+ assert False
--- /dev/null
+import sys
+
+from mock import patch
+
+from setuptools import pep425tags
+
+
+class TestPEP425Tags(object):
+
+ def mock_get_config_var(self, **kwd):
+ """
+ Patch sysconfig.get_config_var for arbitrary keys.
+ """
+ get_config_var = pep425tags.sysconfig.get_config_var
+
+ def _mock_get_config_var(var):
+ if var in kwd:
+ return kwd[var]
+ return get_config_var(var)
+ return _mock_get_config_var
+
+ def abi_tag_unicode(self, flags, config_vars):
+ """
+ Used to test ABI tags, verify correct use of the `u` flag
+ """
+ config_vars.update({'SOABI': None})
+ base = pep425tags.get_abbr_impl() + pep425tags.get_impl_ver()
+
+ if sys.version_info < (3, 3):
+ config_vars.update({'Py_UNICODE_SIZE': 2})
+ mock_gcf = self.mock_get_config_var(**config_vars)
+ with patch('setuptools.pep425tags.sysconfig.get_config_var', mock_gcf):
+ abi_tag = pep425tags.get_abi_tag()
+ assert abi_tag == base + flags
+
+ config_vars.update({'Py_UNICODE_SIZE': 4})
+ mock_gcf = self.mock_get_config_var(**config_vars)
+ with patch('setuptools.pep425tags.sysconfig.get_config_var',
+ mock_gcf):
+ abi_tag = pep425tags.get_abi_tag()
+ assert abi_tag == base + flags + 'u'
+
+ else:
+ # On Python >= 3.3, UCS-4 is essentially permanently enabled, and
+ # Py_UNICODE_SIZE is None. SOABI on these builds does not include
+ # the 'u' so manual SOABI detection should not do so either.
+ config_vars.update({'Py_UNICODE_SIZE': None})
+ mock_gcf = self.mock_get_config_var(**config_vars)
+ with patch('setuptools.pep425tags.sysconfig.get_config_var',
+ mock_gcf):
+ abi_tag = pep425tags.get_abi_tag()
+ assert abi_tag == base + flags
+
+ def test_broken_sysconfig(self):
+ """
+ Test that pep425tags still works when sysconfig is broken.
+ Can be a problem on Python 2.7
+ Issue #1074.
+ """
+ def raises_ioerror(var):
+ raise IOError("I have the wrong path!")
+
+ with patch('setuptools.pep425tags.sysconfig.get_config_var',
+ raises_ioerror):
+ assert len(pep425tags.get_supported())
+
+ def test_no_hyphen_tag(self):
+ """
+ Test that no tag contains a hyphen.
+ """
+ mock_gcf = self.mock_get_config_var(SOABI='cpython-35m-darwin')
+
+ with patch('setuptools.pep425tags.sysconfig.get_config_var',
+ mock_gcf):
+ supported = pep425tags.get_supported()
+
+ for (py, abi, plat) in supported:
+ assert '-' not in py
+ assert '-' not in abi
+ assert '-' not in plat
+
+ def test_manual_abi_noflags(self):
+ """
+ Test that no flags are set on a non-PyDebug, non-Pymalloc ABI tag.
+ """
+ self.abi_tag_unicode('', {'Py_DEBUG': False, 'WITH_PYMALLOC': False})
+
+ def test_manual_abi_d_flag(self):
+ """
+ Test that the `d` flag is set on a PyDebug, non-Pymalloc ABI tag.
+ """
+ self.abi_tag_unicode('d', {'Py_DEBUG': True, 'WITH_PYMALLOC': False})
+
+ def test_manual_abi_m_flag(self):
+ """
+ Test that the `m` flag is set on a non-PyDebug, Pymalloc ABI tag.
+ """
+ self.abi_tag_unicode('m', {'Py_DEBUG': False, 'WITH_PYMALLOC': True})
+
+ def test_manual_abi_dm_flags(self):
+ """
+ Test that the `dm` flags are set on a PyDebug, Pymalloc ABI tag.
+ """
+ self.abi_tag_unicode('dm', {'Py_DEBUG': True, 'WITH_PYMALLOC': True})
+
+
+class TestManylinux1Tags(object):
+
+ @patch('setuptools.pep425tags.get_platform', lambda: 'linux_x86_64')
+ @patch('setuptools.glibc.have_compatible_glibc',
+ lambda major, minor: True)
+ def test_manylinux1_compatible_on_linux_x86_64(self):
+ """
+ Test that manylinux1 is enabled on linux_x86_64
+ """
+ assert pep425tags.is_manylinux1_compatible()
+
+ @patch('setuptools.pep425tags.get_platform', lambda: 'linux_i686')
+ @patch('setuptools.glibc.have_compatible_glibc',
+ lambda major, minor: True)
+ def test_manylinux1_compatible_on_linux_i686(self):
+ """
+ Test that manylinux1 is enabled on linux_i686
+ """
+ assert pep425tags.is_manylinux1_compatible()
+
+ @patch('setuptools.pep425tags.get_platform', lambda: 'linux_x86_64')
+ @patch('setuptools.glibc.have_compatible_glibc',
+ lambda major, minor: False)
+ def test_manylinux1_2(self):
+ """
+ Test that manylinux1 is disabled with incompatible glibc
+ """
+ assert not pep425tags.is_manylinux1_compatible()
+
+    @patch('setuptools.pep425tags.get_platform', lambda: 'armv6l')
+ @patch('setuptools.glibc.have_compatible_glibc',
+ lambda major, minor: True)
+ def test_manylinux1_3(self):
+ """
+        Test that manylinux1 is disabled on armv6l
+ """
+ assert not pep425tags.is_manylinux1_compatible()
+
+ @patch('setuptools.pep425tags.get_platform', lambda: 'linux_x86_64')
+ @patch('setuptools.glibc.have_compatible_glibc',
+ lambda major, minor: True)
+ @patch('sys.platform', 'linux2')
+ def test_manylinux1_tag_is_first(self):
+ """
+ Test that the more specific tag manylinux1 comes first.
+ """
+ groups = {}
+ for pyimpl, abi, arch in pep425tags.get_supported():
+ groups.setdefault((pyimpl, abi), []).append(arch)
+
+ for arches in groups.values():
+ if arches == ['any']:
+ continue
+ # Expect the most specific arch first:
+ if len(arches) == 3:
+ assert arches == ['manylinux1_x86_64', 'linux_x86_64', 'any']
+ else:
+ assert arches == ['manylinux1_x86_64', 'linux_x86_64']
import glob
import inspect
import os
+import shutil
import subprocess
import sys
+import zipfile
import pytest
from pkg_resources import Distribution, PathMetadata, PY_MAJOR
+from setuptools.extern.packaging.utils import canonicalize_name
from setuptools.wheel import Wheel
from .contexts import tempdir
_check_wheel_install(filename, install_dir,
install_tree, project_name,
version, requires_txt)
+
+
+def test_wheel_install_pep_503():
+ project_name = 'Foo_Bar' # PEP 503 canonicalized name is "foo-bar"
+ version = '1.0'
+ with build_wheel(
+ name=project_name,
+ version=version,
+ ) as filename, tempdir() as install_dir:
+ new_filename = filename.replace(project_name,
+ canonicalize_name(project_name))
+ shutil.move(filename, new_filename)
+ _check_wheel_install(new_filename, install_dir, None,
+ canonicalize_name(project_name),
+ version, None)
+
+
+def test_wheel_no_dist_dir():
+ project_name = 'nodistinfo'
+ version = '1.0'
+ wheel_name = '{0}-{1}-py2.py3-none-any.whl'.format(project_name, version)
+ with tempdir() as source_dir:
+ wheel_path = os.path.join(source_dir, wheel_name)
+ # create an empty zip file
+ zipfile.ZipFile(wheel_path, 'w').close()
+ with tempdir() as install_dir:
+ with pytest.raises(ValueError):
+ _check_wheel_install(wheel_path, install_dir, None,
+ project_name,
+ version, None)
import email
import itertools
import os
+import posixpath
import re
import zipfile
from pkg_resources import Distribution, PathMetadata, parse_version
+from setuptools.extern.packaging.utils import canonicalize_name
from setuptools.extern.six import PY3
from setuptools import Distribution as SetuptoolsDistribution
from setuptools import pep425tags
platform=(None if self.platform == 'any' else get_platform()),
).egg_name() + '.egg'
+ def get_dist_info(self, zf):
+ # find the correct name of the .dist-info dir in the wheel file
+ for member in zf.namelist():
+ dirname = posixpath.dirname(member)
+ if (dirname.endswith('.dist-info') and
+ canonicalize_name(dirname).startswith(
+ canonicalize_name(self.project_name))):
+ return dirname
+ raise ValueError("unsupported wheel format. .dist-info not found")
+
def install_as_egg(self, destination_eggdir):
'''Install wheel as an egg directory.'''
with zipfile.ZipFile(self.filename) as zf:
dist_basename = '%s-%s' % (self.project_name, self.version)
- dist_info = '%s.dist-info' % dist_basename
+ dist_info = self.get_dist_info(zf)
dist_data = '%s.data' % dist_basename
def get_metadata(name):
- with zf.open('%s/%s' % (dist_info, name)) as fp:
+ with zf.open(posixpath.join(dist_info, name)) as fp:
value = fp.read().decode('utf-8') if PY3 else fp.read()
return email.parser.Parser().parsestr(value)
wheel_metadata = get_metadata('WHEEL')
importlib; python_version<"2.7"
mock
-pytest-flake8; python_version>="2.7"
+pytest-flake8<=1.0.0; python_version>="3.3" and python_version<"3.5"
+pytest-flake8; python_version>="2.7" and python_version!="3.3" and python_version!="3.4"
virtualenv>=13.0.0
pytest-virtualenv>=1.2.7
pytest>=3.0.2
--- /dev/null
+{% for section, _ in sections.items() %}
+{% set underline = underlines[0] %}{% if section %}{{section}}
+{{ underline * section|length }}
+{% endif %}
+{% if sections[section] %}
+{% for category, val in definitions.items() if category in sections[section]%}
+{% if definitions[category]['showcontent'] %}
+{% for text, values in sections[section][category].items() %}
+* {{ values|join(', ') }}: {{ text }}
+{% endfor %}
+{% else %}
+* {{ sections[section][category]['']|join(', ') }}
+
+{% endif %}
+{% if sections[section][category]|length == 0 %}
+No significant changes.
+{% else %}
+{% endif %}
+{% endfor %}
+
+{% else %}
+No significant changes.
+
+
+{% endif %}
+{% endfor %}
skip_install=True
commands=codecov --file {toxworkdir}/coverage.xml
+[testenv:docs]
+deps = -r{toxinidir}/docs/requirements.txt
+skip_install=True
+commands =
+ python {toxinidir}/bootstrap.py
+ sphinx-build -W -b html -d {envtmpdir}/doctrees docs docs/build/html
+ sphinx-build -W -b man -d {envtmpdir}/doctrees docs docs/build/man
+
[coverage:run]
source=
pkg_resources