From: DongHun Kwak Date: Tue, 4 Apr 2017 02:04:57 +0000 (+0900) Subject: Imported Upstream version 34.3.3 X-Git-Tag: upstream/34.3.3^0 X-Git-Url: http://review.tizen.org/git/?a=commitdiff_plain;h=962cf9195f3c806f70649bf1bb13e0adfcd39d8b;p=platform%2Fupstream%2Fpython-setuptools.git Imported Upstream version 34.3.3 Change-Id: I6bcc6839205cdaf0eadde132c78a3df1fd9f36e9 Signed-off-by: DongHun Kwak --- diff --git a/CHANGES.rst b/CHANGES.rst new file mode 100644 index 0000000..54eba03 --- /dev/null +++ b/CHANGES.rst @@ -0,0 +1,3294 @@ +v34.3.3 +------- + +* #967 (and #997): Explicitly import submodules of + packaging to account for environments where the imports + of those submodules is not implied by other behavior. + +v34.3.2 +------- + +* #993: Fix documentation upload by correcting + rendering of content-type in _build_multipart + on Python 3. + +v34.3.1 +------- + +* #988: Trap ``os.unlink`` same as ``os.remove`` in + ``auto_chmod`` error handler. + +* #983: Fixes to invalid escape sequence deprecations on + Python 3.6. + +v34.3.0 +------- + +* #941: In the upload command, if the username is blank, + default to ``getpass.getuser()``. + +* #971: Correct distutils findall monkeypatch to match + appropriate versions (namely Python 3.4.6). + +v34.2.0 +------- + +* #966: Add support for reading dist-info metadata and + thus locating Distributions from zip files. + +* #968: Allow '+' and '!' in egg fragments + so that it can take package names that contain + PEP 440 conforming version specifiers. + +v34.1.1 +------- + +* #953: More aggressively employ the compatibility issue + originally added in #706. + +v34.1.0 +------- + +* #930: ``build_info`` now accepts two new parameters + to optimize and customize the building of C libraries. + +v34.0.3 +------- + +* #947: Loosen restriction on the version of six required, + restoring compatibility with environments relying on + six 1.6.0 and later. + +v34.0.2 +------- + +* #882: Ensure extras are honored when building the + working set. +* #913: Fix issue in develop if package directory has + a trailing slash. + +v34.0.1 +------- + +* #935: Fix glob syntax in graft. + +v34.0.0 +------- + +* #581: Instead of vendoring the growing list of + dependencies that Setuptools requires to function, + Setuptools now requires these dependencies just like + any other project. Unlike other projects, however, + Setuptools cannot rely on ``setup_requires`` to + demand the dependencies it needs to install because + its own machinery would be necessary to pull those + dependencies if not present (a bootstrapping problem). + As a result, Setuptools no longer supports self upgrade or + installation in the general case. Instead, users are + directed to use pip to install and upgrade using the + ``wheel`` distributions of setuptools. + + Users are welcome to contrive other means to install + or upgrade Setuptools using other means, such as + pre-installing the Setuptools dependencies with pip + or a bespoke bootstrap tool, but such usage is not + recommended and is not supported. + + As discovered in #940, not all versions of pip will + successfully install Setuptools from its pre-built + wheel. If you encounter issues with "No module named + six" or "No module named packaging", especially + following a line "Running setup.py egg_info for package + setuptools", then your pip is not new enough. + + There's an additional issue in pip where setuptools + is upgraded concurrently with other source packages, + described in pip #4253. 
The proposed workaround is to + always upgrade Setuptools first prior to upgrading + other packages that would upgrade Setuptools. + +v33.1.1 +------- + +* #921: Correct issue where certifi fallback not being + reached on Windows. + +v33.1.0 +------- + +Installation via pip, as indicated in the `Python Packaging +User's Guide `_, +is the officially-supported mechanism for installing +Setuptools, and this recommendation is now explicit in the +much more concise README. + +Other edits and tweaks were made to the documentation. The +codebase is unchanged. + +v33.0.0 +------- + +* #619: Removed support for the ``tag_svn_revision`` + distribution option. If Subversion tagging support is + still desired, consider adding the functionality to + setuptools_svn in setuptools_svn #2. + +v32.3.1 +------- + +* #866: Use ``dis.Bytecode`` on Python 3.4 and later in + ``setuptools.depends``. + +v32.3.0 +------- + +* #889: Backport proposed fix for disabling interpolation in + distutils.Distribution.parse_config_files. + +v32.2.0 +------- + +* #884: Restore support for running the tests under + `pytest-runner `_ + by ensuring that PYTHONPATH is honored in tests invoking + a subprocess. + +v32.1.3 +------- + +* #706: Add rmtree compatibility shim for environments where + rmtree fails when passed a unicode string. + +v32.1.2 +------- + +* #893: Only release sdist in zip format as warehouse now + disallows releasing two different formats. + +v32.1.1 +------- + +* #704: More selectively ensure that 'rmtree' is not invoked with + a byte string, enabling it to remove files that are non-ascii, + even on Python 2. + +* #712: In 'sandbox.run_setup', ensure that ``__file__`` is + always a ``str``, modeling the behavior observed by the + interpreter when invoking scripts and modules. + +v32.1.0 +------- + +* #891: In 'test' command on test failure, raise DistutilsError, + suppression invocation of subsequent commands. + +v32.0.0 +------- + +* #890: Revert #849. ``global-exclude .foo`` will not match all + ``*.foo`` files any more. Package authors must add an explicit + wildcard, such as ``global-exclude *.foo``, to match all + ``.foo`` files. See #886, #849. + +v31.0.1 +------- + +* #885: Fix regression where 'pkg_resources._rebuild_mod_path' + would fail when a namespace package's '__path__' was not + a list with a sort attribute. + +v31.0.0 +------- + +* #250: Install '-nspkg.pth' files for packages installed + with 'setup.py develop'. These .pth files allow + namespace packages installed by pip or develop to + co-mingle. This change required the removal of the + change for #805 and pip #1924, introduced in 28.3.0 and implicated + in #870, but means that namespace packages not in a + site packages directory will no longer work on Python + earlier than 3.5, whereas before they would work on + Python not earlier than 3.3. + +v30.4.0 +------- + +* #879: For declarative config: + + - read_configuration() now accepts ignore_option_errors argument. This allows scraping tools to read metadata without a need to download entire packages. E.g. we can gather some stats right from GitHub repos just by downloading setup.cfg. + + - packages find: directive now supports fine tuning from a subsection. The same arguments as for find() are accepted. + +v30.3.0 +------- + +* #394 via #862: Added support for `declarative package + config in a setup.cfg file + `_. + +v30.2.1 +------- + +* #850: In test command, invoke unittest.main with + indication not to exit the process. 
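A hedged sketch of the declarative-config API referenced in the v30.3.0 and v30.4.0 entries above; the exact import path and signature may vary between setuptools releases, so treat this as illustrative only::

    from setuptools.config import read_configuration

    # Read metadata and options straight from a project's setup.cfg without
    # running its setup.py; ignore_option_errors lets scraping tools skip
    # directives they cannot resolve (e.g. ones needing the package on disk).
    conf = read_configuration('setup.cfg', ignore_option_errors=True)
    print(conf['metadata'].get('name'), conf['metadata'].get('version'))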
+ +v30.2.0 +------- + +* #854: Bump to vendored Packaging 16.8. + +v30.1.0 +------- + +* #846: Also trap 'socket.error' when opening URLs in + package_index. + +* #849: Manifest processing now matches the filename + pattern anywhere in the filename and not just at the + start. Restores behavior found prior to 28.5.0. + +v30.0.0 +------- + +* #864: Drop support for Python 3.2. Systems requiring + Python 3.2 support must use 'setuptools < 30'. + +* #825: Suppress warnings for single files. + +* #830 via #843: Once again restored inclusion of data + files to sdists, but now trap TypeError caused by + techniques employed rjsmin and similar. + +v29.0.1 +------- + +* #861: Re-release of v29.0.1 with the executable script + launchers bundled. Now, launchers are included by default + and users that want to disable this behavior must set the + environment variable + 'SETUPTOOLS_INSTALL_WINDOWS_SPECIFIC_FILES' to + a false value like "false" or "0". + +v29.0.0 +------- + +* #841: Drop special exception for packages invoking + win32com during the build/install process. See + Distribute #118 for history. + +v28.8.0 +------- + +* #629: Per the discussion, refine the sorting to use version + value order for more accurate detection of the latest + available version when scanning for packages. See also + #829. + +* #837: Rely on the config var "SO" for Python 3.3.0 only + when determining the ext filename. + +v28.7.1 +------- + +* #827: Update PyPI root for dependency links. + +* #833: Backed out changes from #830 as the implementation + seems to have problems in some cases. + +v28.7.0 +------- + +* #832: Moved much of the namespace package handling + functionality into a separate module for re-use in something + like #789. +* #830: ``sdist`` command no longer suppresses the inclusion + of data files, re-aligning with the expectation of distutils + and addressing #274 and #521. + +v28.6.1 +------- + +* #816: Fix manifest file list order in tests. + +v28.6.0 +------- + +* #629: When scanning for packages, ``pkg_resources`` now + ignores empty egg-info directories and gives precedence to + packages whose versions are lexicographically greatest, + a rough approximation for preferring the latest available + version. + +v28.5.0 +------- + +* #810: Tests are now invoked with tox and not setup.py test. +* #249 and #450 via #764: Avoid scanning the whole tree + when building the manifest. Also fixes a long-standing bug + where patterns in ``MANIFEST.in`` had implicit wildcard + matching. This caused ``global-exclude .foo`` to exclude + all ``*.foo`` files, but also ``global-exclude bar.py`` to + exclude ``foo_bar.py``. + +v28.4.0 +------- + +* #732: Now extras with a hyphen are honored per PEP 426. +* #811: Update to pyparsing 2.1.10. +* Updated ``setuptools.command.sdist`` to re-use most of + the functionality directly from ``distutils.command.sdist`` + for the ``add_defaults`` method with strategic overrides. + See #750 for rationale. +* #760 via #762: Look for certificate bundle where SUSE + Linux typically presents it. Use ``certifi.where()`` to locate + the bundle. + +v28.3.0 +------- + +* #809: In ``find_packages()``, restore support for excluding + a parent package without excluding a child package. + +* #805: Disable ``-nspkg.pth`` behavior on Python 3.3+ where + PEP-420 functionality is adequate. Fixes pip #1924. + +v28.1.0 +------- + +* #803: Bump certifi to 2016.9.26. + +v28.0.0 +------- + +* #733: Do not search excluded directories for packages. 
+ This introduced a backwards incompatible change in ``find_packages()`` + so that ``find_packages(exclude=['foo']) == []``, excluding subpackages of ``foo``. + Previously, ``find_packages(exclude=['foo']) == ['foo.bar']``, + even though the parent ``foo`` package was excluded. + +* #795: Bump certifi. + +* #719: Suppress decoding errors and instead log a warning + when metadata cannot be decoded. + +v27.3.1 +------- + +* #790: In MSVC monkeypatching, explicitly patch each + function by name in the target module instead of inferring + the module from the function's ``__module__``. Improves + compatibility with other packages that might have previously + patched distutils functions (i.e. NumPy). + +v27.3.0 +------- + +* #794: In test command, add installed eggs to PYTHONPATH + when invoking tests so that subprocesses will also have the + dependencies available. Fixes `tox 330 + `_. + +* #795: Update vendored pyparsing 2.1.9. + +v27.2.0 +------- + +* #520 and #513: Suppress ValueErrors in fixup_namespace_packages + when lookup fails. + +* Nicer, more consistent interfaces for msvc monkeypatching. + +v27.1.2 +------- + +* #779 via #781: Fix circular import. + +v27.1.1 +------- + +* #778: Fix MSVC monkeypatching. + +v27.1.0 +------- + +* Introduce the (private) ``monkey`` module to encapsulate + the distutils monkeypatching behavior. + +v27.0.0 +------- + +* Now use Warehouse by default for + ``upload``, patching ``distutils.config.PyPIRCCommand`` to + affect default behavior. + + Any config in .pypirc should be updated to replace + + https://pypi.python.org/pypi/ + + with + + https://upload.pypi.org/legacy/ + + Similarly, any passwords stored in the keyring should be + updated to use this new value for "system". + + The ``upload_docs`` command will continue to use the python.org + site, but the command is now deprecated. Users are urged to use + Read The Docs instead. + +* #776: Use EXT_SUFFIX for py_limited_api renaming. + +* #774 and #775: Use LegacyVersion from packaging when + detecting numpy versions. + +v26.1.1 +------- + +* Re-release of 26.1.0 with pytest pinned to allow for automated + deployment and thus proper packaging environment variables, + fixing issues with missing executable launchers. + +v26.1.0 +------- + +* #763: ``pkg_resources.get_default_cache`` now defers to the + `appdirs project `_ to + resolve the cache directory. Adds a vendored dependency on + appdirs to pkg_resources. + +v26.0.0 +------- + +* #748: By default, sdists are now produced in gzipped tarfile + format by default on all platforms, adding forward compatibility + for the same behavior in Python 3.6 (See Python #27819). + +* #459 via #736: On Windows with script launchers, + sys.argv[0] now reflects + the name of the entry point, consistent with the behavior in + distlib and pip wrappers. + +* #752 via #753: When indicating ``py_limited_api`` to Extension, + it must be passed as a keyword argument. + +v25.4.0 +------- + +* Add Extension(py_limited_api=True). When set to a truthy value, + that extension gets a filename appropriate for code using Py_LIMITED_API. + When used correctly this allows a single compiled extension to work on + all future versions of CPython 3. + The py_limited_api argument only controls the filename. To be + compatible with multiple versions of Python 3, the C extension + will also need to set -DPy_LIMITED_API=... and be modified to use + only the functions in the limited API. 
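A minimal sketch of the ``py_limited_api`` usage described in the v25.4.0 entry above; the package and source names are hypothetical, and the define mirrors the note that the C code must itself restrict to the limited API::

    from setuptools import setup, Extension

    # py_limited_api only changes the extension filename; the macro below is
    # what actually restricts the build to the stable ABI.
    ext = Extension(
        'mylib._speedups',
        sources=['src/speedups.c'],
        define_macros=[('Py_LIMITED_API', '0x03030000')],
        py_limited_api=True,
    )

    setup(name='mylib', version='1.0', ext_modules=[ext])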
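Relatedly, the v28.0.0 and v28.3.0 entries earlier in this changelog changed, and then partly restored, how ``find_packages()`` treats excluded parent packages; a hedged sketch of the robust idiom (package names are hypothetical)::

    from setuptools import find_packages

    # To exclude a package together with all of its subpackages regardless of
    # which of those behaviors is in effect, list both the parent and a
    # wildcard for its children.
    packages = find_packages(exclude=['tests', 'tests.*'])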
+ +v25.3.0 +------- + +* #739 Fix unquoted libpaths by fixing compatibility between `numpy.distutils` and `distutils._msvccompiler` for numpy < 1.11.2 (Fix issue #728, error also fixed in Numpy). + +* #731: Bump certifi. + +* Style updates. See #740, #741, #743, #744, #742, #747. + +* #735: include license file. + +v25.2.0 +------- + +* #612 via #730: Add a LICENSE file which needs to be provided by the terms of + the MIT license. + +v25.1.6 +------- + +* #725: revert `library_dir_option` patch (Error is related to `numpy.distutils` and make errors on non Numpy users). + +v25.1.5 +------- + +* #720 +* #723: Improve patch for `library_dir_option`. + +v25.1.4 +------- + +* #717 +* #713 +* #707: Fix Python 2 compatibility for MSVC by catching errors properly. +* #715: Fix unquoted libpaths by patching `library_dir_option`. + +v25.1.3 +------- + +* #714 and #704: Revert fix as it breaks other components + downstream that can't handle unicode. See #709, #710, + and #712. + +v25.1.2 +------- + +* #704: Fix errors when installing a zip sdist that contained + files named with non-ascii characters on Windows would + crash the install when it attempted to clean up the build. +* #646: MSVC compatibility - catch errors properly in + RegistryInfo.lookup. +* #702: Prevent UnboundLocalError when initial working_set + is empty. + +v25.1.1 +------- + +* #686: Fix issue in sys.path ordering by pkg_resources when + rewrite technique is "raw". +* #699: Fix typo in msvc support. + +v25.1.0 +------- + +* #609: Setuptools will now try to download a distribution from + the next possible download location if the first download fails. + This means you can now specify multiple links as ``dependency_links`` + and all links will be tried until a working download link is encountered. + +v25.0.2 +------- + +* #688: Fix AttributeError in setup.py when invoked not from + the current directory. + +v25.0.1 +------- + +* Cleanup of setup.py script. + +* Fixed documentation builders by allowing setup.py + to be imported without having bootstrapped the + metadata. + +* More style cleanup. See #677, #678, #679, #681, #685. + +v25.0.0 +------- + +* #674: Default ``sys.path`` manipulation by easy-install.pth + is now "raw", meaning that when writing easy-install.pth + during any install operation, the ``sys.path`` will not be + rewritten and will no longer give preference to easy_installed + packages. + + To retain the old behavior when using any easy_install + operation (including ``setup.py install`` when setuptools is + present), set the environment variable: + + SETUPTOOLS_SYS_PATH_TECHNIQUE=rewrite + + This project hopes that that few if any environments find it + necessary to retain the old behavior, and intends to drop + support for it altogether in a future release. Please report + any relevant concerns in the ticket for this change. + +v24.3.1 +------- + +* #398: Fix shebang handling on Windows in script + headers where spaces in ``sys.executable`` would + produce an improperly-formatted shebang header, + introduced in 12.0 with the fix for #188. + +* #663, #670: More style updates. + +v24.3.0 +------- + +* #516: Disable ``os.link`` to avoid hard linking + in ``sdist.make_distribution``, avoiding errors on + systems that support hard links but not on the + file system in which the build is occurring. + +v24.2.1 +------- + +* #667: Update Metadata-Version to 1.2 when + ``python_requires`` is supplied. + +v24.2.0 +------- + +* #631: Add support for ``python_requires`` keyword. + +v24.1.1 +------- + +* More style updates. 
See #660, #661, #641. + +v24.1.0 +------- + +* #659: ``setup.py`` now will fail fast and with a helpful + error message when the necessary metadata is missing. +* More style updates. See #656, #635, #640, + #644, #650, #652, and #655. + +v24.0.3 +------- + +* Updated style in much of the codebase to match + community expectations. See #632, #633, #634, + #637, #639, #638, #642, #648. + +v24.0.2 +------- + +* If MSVC++14 is needed ``setuptools.msvc`` now redirect + user to Visual C++ Build Tools web page. + +v24.0.1 +------- + +* #625 and #626: Fixes on ``setuptools.msvc`` mainly + for Python 2 and Linux. + +v24.0.0 +------- + +* Pull Request #174: Add more aggressive support for + standalone Microsoft Visual C++ compilers in + msvc9compiler patch. + Particularly : Windows SDK 6.1 and 7.0 + (MSVC++ 9.0), Windows SDK 7.1 (MSVC++ 10.0), + Visual C++ Build Tools 2015 (MSVC++14) +* Renamed ``setuptools.msvc9_support`` to + ``setuptools.msvc``. + +v23.2.1 +------- + +Re-release of v23.2.0, which was missing the intended +commits. + +* #623: Remove used of deprecated 'U' flag when reading + manifests. + +v23.1.0 +------- + +* #619: Deprecated ``tag_svn_revision`` distribution + option. + +v23.0.0 +------- + +* #611: Removed ARM executables for CLI and GUI script + launchers on Windows. If this was a feature you cared + about, please comment in the ticket. +* #604: Removed docs building support. The project + now relies on documentation hosted at + https://setuptools.readthedocs.io/. + +v22.0.5 +------- + +* #604: Restore repository for upload_docs command + to restore publishing of docs during release. + +v22.0.4 +------- + +* #589: Upload releases to pypi.io using the upload + hostname and legacy path. + +v22.0.3 +------- + +* #589: Releases are now uploaded to pypi.io (Warehouse) + even when releases are made on Twine via Travis. + +v22.0.2 +------- + +* #589: Releases are now uploaded to pypi.io (Warehouse). + +v22.0.1 +------- + +* #190: On Python 2, if unicode is passed for packages to + ``build_py`` command, it will be handled just as with + text on Python 3. + +v22.0.0 +------- + +Intended to be v21.3.0, but jaraco accidentally released as +a major bump. + +* #598: Setuptools now lists itself first in the User-Agent + for web requests, better following the guidelines in + `RFC 7231 + `_. + +v21.2.2 +------- + +* Minor fixes to changelog and docs. + +v21.2.1 +------- + +* #261: Exclude directories when resolving globs in + package_data. + +v21.2.0 +------- + +* #539: In the easy_install get_site_dirs, honor all + paths found in ``site.getsitepackages``. + +v21.1.0 +------- + +* #572: In build_ext, now always import ``_CONFIG_VARS`` + from ``distutils`` rather than from ``sysconfig`` + to allow ``distutils.sysconfig.customize_compiler`` + configure the OS X compiler for ``-dynamiclib``. + +v21.0.0 +------- + +* Removed ez_setup.py from Setuptools sdist. The + bootstrap script will be maintained in its own + branch and should be generally be retrieved from + its canonical location at + https://bootstrap.pypa.io/ez_setup.py. + +v20.10.0 +-------- + +* #553: egg_info section is now generated in a + deterministic order, matching the order generated + by earlier versions of Python. Except on Python 2.6, + order is preserved when existing settings are present. +* #556: Update to Packaging 16.7, restoring support + for deprecated ``python_implmentation`` marker. 
+* #555: Upload command now prompts for a password + when uploading to PyPI (or other repository) if no + password is present in .pypirc or in the keyring. + +v20.9.0 +------- + +* #548: Update certify version to 2016.2.28 +* #545: Safely handle deletion of non-zip eggs in rotate + command. + +v20.8.1 +------- + +* Issue #544: Fix issue with extra environment marker + processing in WorkingSet due to refactor in v20.7.0. + +v20.8.0 +------- + +* Issue #543: Re-release so that latest release doesn't + cause déjà vu with distribute and setuptools 0.7 in + older environments. + +v20.7.0 +------- + +* Refactored extra environment marker processing + in WorkingSet. +* Issue #533: Fixed intermittent test failures. +* Issue #536: In msvc9_support, trap additional exceptions + that might occur when importing + ``distutils.msvc9compiler`` in mingw environments. +* Issue #537: Provide better context when package + metadata fails to decode in UTF-8. + +v20.6.8 +------- + +* Issue #523: Restored support for environment markers, + now honoring 'extra' environment markers. + +v20.6.7 +------- + +* Issue #523: Disabled support for environment markers + introduced in v20.5. + +v20.6.6 +------- + +* Issue #503: Restore support for PEP 345 environment + markers by updating to Packaging 16.6. + +v20.6.0 +------- + +* New release process that relies on + `bumpversion `_ + and Travis CI for continuous deployment. +* Project versioning semantics now follow + `semver `_ precisely. + The 'v' prefix on version numbers now also allows + version numbers to be referenced in the changelog, + e.g. http://setuptools.readthedocs.io/en/latest/history.html#v20-6-0. + +20.5 +---- + +* BB Pull Request #185, #470: Add support for environment markers + in requirements in install_requires, setup_requires, + tests_require as well as adding a test for the existing + extra_requires machinery. + +20.4 +---- + +* Issue #422: Moved hosting to + `Github `_ + from `Bitbucket `_. + Issues have been migrated, though all issues and comments + are attributed to bb-migration. So if you have a particular + issue or issues to which you've been subscribed, you will + want to "watch" the equivalent issue in Github. + The Bitbucket project will be retained for the indefinite + future, but Github now hosts the canonical project repository. + +20.3.1 +------ + +* Issue #519: Remove import hook when reloading the + ``pkg_resources`` module. +* BB Pull Request #184: Update documentation in ``pkg_resources`` + around new ``Requirement`` implementation. + +20.3 +---- + +* BB Pull Request #179: ``pkg_resources.Requirement`` objects are + now a subclass of ``packaging.requirements.Requirement``, + allowing any environment markers and url (if any) to be + affiliated with the requirement +* BB Pull Request #179: Restore use of RequirementParseError + exception unintentionally dropped in 20.2. + +20.2.2 +------ + +* Issue #502: Correct regression in parsing of multiple + version specifiers separated by commas and spaces. + +20.2.1 +------ + +* Issue #499: Restore compatibility for legacy versions + by bumping to packaging 16.4. + +20.2 +---- + +* Changelog now includes release dates and links to PEPs. +* BB Pull Request #173: Replace dual PEP 345 _markerlib implementation + and PEP 426 implementation of environment marker support from + packaging 16.1 and PEP 508. Fixes Issue #122. + See also BB Pull Request #175, BB Pull Request #168, and + BB Pull Request #164. Additionally: + + - ``Requirement.parse`` no longer retains the order of extras. 
+ - ``parse_requirements`` now requires that all versions be + PEP-440 compliant, as revealed in #499. Packages released + with invalid local versions should be re-released using + the proper local version syntax, e.g. ``mypkg-1.0+myorg.1``. + +20.1.1 +------ + +* Update ``upload_docs`` command to also honor keyring + for password resolution. + +20.1 +---- + +* Added support for using passwords from keyring in the upload + command. See `the upload docs + `_ + for details. + +20.0 +---- + +* Issue #118: Once again omit the package metadata (egg-info) + from the list of outputs in ``--record``. This version of setuptools + can no longer be used to upgrade pip earlier than 6.0. + +19.7 +---- + +* `Off-project PR `_: + For FreeBSD, also honor root certificates from ca_root_nss. + +19.6.2 +------ + +* Issue #491: Correct regression incurred in 19.4 where + a double-namespace package installed using pip would + cause a TypeError. + +19.6.1 +------ + +* Restore compatibility for PyPy 3 compatibility lost in + 19.4.1 addressing Issue #487. +* ``setuptools.launch`` shim now loads scripts in a new + namespace, avoiding getting relative imports from + the setuptools package on Python 2. + +19.6 +---- + +* Added a new entry script ``setuptools.launch``, + implementing the shim found in + ``pip.util.setuptools_build``. Use this command to launch + distutils-only packages under setuptools in the same way that + pip does, causing the setuptools monkeypatching of distutils + to be invoked prior to invoking a script. Useful for debugging + or otherwise installing a distutils-only package under + setuptools when pip isn't available or otherwise does not + expose the desired functionality. For example:: + + $ python -m setuptools.launch setup.py develop + +* Issue #488: Fix dual manifestation of Extension class in + extension packages installed as dependencies when Cython + is present. + +19.5 +---- + +* Issue #486: Correct TypeError when getfilesystemencoding + returns None. +* Issue #139: Clarified the license as MIT. +* BB Pull Request #169: Removed special handling of command + spec in scripts for Jython. + +19.4.1 +------ + +* Issue #487: Use direct invocation of ``importlib.machinery`` + in ``pkg_resources`` to avoid missing detection on relevant + platforms. + +19.4 +---- + +* Issue #341: Correct error in path handling of package data + files in ``build_py`` command when package is empty. +* Distribute #323, Issue #141, Issue #207, and + BB Pull Request #167: Another implementation of + ``pkg_resources.WorkingSet`` and ``pkg_resources.Distribution`` + that supports replacing an extant package with a new one, + allowing for setup_requires dependencies to supersede installed + packages for the session. + +19.3 +---- + +* Issue #229: Implement new technique for readily incorporating + dependencies conditionally from vendored copies or primary + locations. Adds a new dependency on six. + +19.2 +---- + +* BB Pull Request #163: Add get_command_list method to Distribution. +* BB Pull Request #162: Add missing whitespace to multiline string + literals. + +19.1.1 +------ + +* Issue #476: Cast version to string (using default encoding) + to avoid creating Unicode types on Python 2 clients. +* Issue #477: In Powershell downloader, use explicit rendering + of strings, rather than rely on ``repr``, which can be + incorrect (especially on Python 2). 
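The 20.2 and 20.3 entries above describe rebasing ``pkg_resources.Requirement`` onto ``packaging``; a hedged sketch of what a parsed requirement now carries (the requirement string itself is only illustrative)::

    from pkg_resources import Requirement

    # A requirement may include extras and a PEP 508 environment marker; the
    # parsed marker is exposed on the resulting object.
    req = Requirement.parse('requests[security]>=2.8; python_version < "3.0"')
    print(req.project_name, req.extras, req.specs)
    print(req.marker)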
+ +19.1 +---- + +* Issue #215: The bootstrap script ``ez_setup.py`` now + automatically detects + the latest version of setuptools (using PyPI JSON API) rather + than hard-coding a particular value. +* Issue #475: Fix incorrect usage in _translate_metadata2. + +19.0 +---- + +* Issue #442: Use RawConfigParser for parsing .pypirc file. + Interpolated values are no longer honored in .pypirc files. + +18.8.1 +------ + +* Issue #440: Prevent infinite recursion when a SandboxViolation + or other UnpickleableException occurs in a sandbox context + with setuptools hidden. Fixes regression introduced in Setuptools + 12.0. + +18.8 +---- + +* Deprecated ``egg_info.get_pkg_info_revision``. +* Issue #471: Don't rely on repr for an HTML attribute value in + package_index. +* Issue #419: Avoid errors in FileMetadata when the metadata directory + is broken. +* Issue #472: Remove deprecated use of 'U' in mode parameter + when opening files. + +18.7.1 +------ + +* Issue #469: Refactored logic for Issue #419 fix to re-use metadata + loading from Provider. + +18.7 +---- + +* Update dependency on certify. +* BB Pull Request #160: Improve detection of gui script in + ``easy_install._adjust_header``. +* Made ``test.test_args`` a non-data property; alternate fix + for the issue reported in BB Pull Request #155. +* Issue #453: In ``ez_setup`` bootstrap module, unload all + ``pkg_resources`` modules following download. +* BB Pull Request #158: Honor PEP-488 when excluding + files for namespace packages. +* Issue #419 and BB Pull Request #144: Add experimental support for + reading the version info from distutils-installed metadata rather + than using the version in the filename. + +18.6.1 +------ + +* Issue #464: Correct regression in invocation of superclass on old-style + class on Python 2. + +18.6 +---- + +* Issue #439: When installing entry_point scripts under development, + omit the version number of the package, allowing any version of the + package to be used. + +18.5 +---- + +* In preparation for dropping support for Python 3.2, a warning is + now logged when pkg_resources is imported on Python 3.2 or earlier + Python 3 versions. +* `Add support for python_platform_implementation environment marker + `_. +* `Fix dictionary mutation during iteration + `_. + +18.4 +---- + +* Issue #446: Test command now always invokes unittest, even + if no test suite is supplied. + +18.3.2 +------ + +* Correct another regression in setuptools.findall + where the fix for Python #12885 was lost. + +18.3.1 +------ + +* Issue #425: Correct regression in setuptools.findall. + +18.3 +---- + +* BB Pull Request #135: Setuptools now allows disabling of + the manipulation of the sys.path + during the processing of the easy-install.pth file. To do so, set + the environment variable ``SETUPTOOLS_SYS_PATH_TECHNIQUE`` to + anything but "rewrite" (consider "raw"). During any install operation + with manipulation disabled, setuptools packages will be appended to + sys.path naturally. + + Future versions may change the default behavior to disable + manipulation. If so, the default behavior can be retained by setting + the variable to "rewrite". + +* Issue #257: ``easy_install --version`` now shows more detail + about the installation location and Python version. + +* Refactor setuptools.findall in preparation for re-submission + back to distutils. + +18.2 +---- + +* Issue #412: More efficient directory search in ``find_packages``. + +18.1 +---- + +* Upgrade to vendored packaging 15.3. 
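A hedged sketch of disabling the ``easy-install.pth`` path rewriting described in the 18.3 entry above (the same switch is referenced again in v25.0.0); the command being run is only illustrative::

    import os
    import subprocess

    # Any value other than "rewrite" (for example "raw") disables the
    # sys.path manipulation during this install operation.
    env = dict(os.environ, SETUPTOOLS_SYS_PATH_TECHNIQUE='raw')
    subprocess.check_call(['python', 'setup.py', 'develop'], env=env)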
+ +18.0.1 +------ + +* Issue #401: Fix failure in test suite. + +18.0 +---- + +* Dropped support for builds with Pyrex. Only Cython is supported. +* Issue #288: Detect Cython later in the build process, after + ``setup_requires`` dependencies are resolved. + Projects backed by Cython can now be readily built + with a ``setup_requires`` dependency. For example:: + + ext = setuptools.Extension('mylib', ['src/CythonStuff.pyx', 'src/CStuff.c']) + setuptools.setup( + ... + ext_modules=[ext], + setup_requires=['cython'], + ) + + For compatibility with older versions of setuptools, packagers should + still include ``src/CythonMod.c`` in the source distributions or + require that Cython be present before building source distributions. + However, for systems with this build of setuptools, Cython will be + downloaded on demand. +* Issue #396: Fixed test failure on OS X. +* BB Pull Request #136: Remove excessive quoting from shebang headers + for Jython. + +17.1.1 +------ + +* Backed out unintended changes to pkg_resources, restoring removal of + deprecated imp module (`ref + `_). + +17.1 +---- + +* Issue #380: Add support for range operators on environment + marker evaluation. + +17.0 +---- + +* Issue #378: Do not use internal importlib._bootstrap module. +* Issue #390: Disallow console scripts with path separators in + the name. Removes unintended functionality and brings behavior + into parity with pip. + +16.0 +---- + +* BB Pull Request #130: Better error messages for errors in + parsed requirements. +* BB Pull Request #133: Removed ``setuptools.tests`` from the + installed packages. +* BB Pull Request #129: Address deprecation warning due to usage + of imp module. + +15.2 +---- + +* Issue #373: Provisionally expose + ``pkg_resources._initialize_master_working_set``, allowing for + imperative re-initialization of the master working set. + +15.1 +---- + +* Updated to Packaging 15.1 to address Packaging #28. +* Fix ``setuptools.sandbox._execfile()`` with Python 3.1. + +15.0 +---- + +* BB Pull Request #126: DistributionNotFound message now lists the package or + packages that required it. E.g.:: + + pkg_resources.DistributionNotFound: The 'colorama>=0.3.1' distribution was not found and is required by smlib.log. + + Note that zc.buildout once dependended on the string rendering of this + message to determine the package that was not found. This expectation + has since been changed, but older versions of buildout may experience + problems. See Buildout #242 for details. + +14.3.1 +------ + +* Issue #307: Removed PEP-440 warning during parsing of versions + in ``pkg_resources.Distribution``. +* Issue #364: Replace deprecated usage with recommended usage of + ``EntryPoint.load``. + +14.3 +---- + +* Issue #254: When creating temporary egg cache on Unix, use mode 755 + for creating the directory to avoid the subsequent warning if + the directory is group writable. + +14.2 +---- + +* Issue #137: Update ``Distribution.hashcmp`` so that Distributions with + None for pyversion or platform can be compared against Distributions + defining those attributes. + +14.1.1 +------ + +* Issue #360: Removed undesirable behavior from test runs, preventing + write tests and installation to system site packages. + +14.1 +---- + +* BB Pull Request #125: Add ``__ne__`` to Requirement class. +* Various refactoring of easy_install. + +14.0 +---- + +* Bootstrap script now accepts ``--to-dir`` to customize save directory or + allow for re-use of existing repository of setuptools versions. 
See + BB Pull Request #112 for background. +* Issue #285: ``easy_install`` no longer will default to installing + packages to the "user site packages" directory if it is itself installed + there. Instead, the user must pass ``--user`` in all cases to install + packages to the user site packages. + This behavior now matches that of "pip install". To configure + an environment to always install to the user site packages, consider + using the "install-dir" and "scripts-dir" parameters to easy_install + through an appropriate distutils config file. + +13.0.2 +------ + +* Issue #359: Include pytest.ini in the sdist so invocation of py.test on the + sdist honors the pytest configuration. + +13.0.1 +------ + +Re-release of 13.0. Intermittent connectivity issues caused the release +process to fail and PyPI uploads no longer accept files for 13.0. + +13.0 +---- + +* Issue #356: Back out BB Pull Request #119 as it requires Setuptools 10 or later + as the source during an upgrade. +* Removed build_py class from setup.py. According to 892f439d216e, this + functionality was added to support upgrades from old Distribute versions, + 0.6.5 and 0.6.6. + +12.4 +---- + +* BB Pull Request #119: Restore writing of ``setup_requires`` to metadata + (previously added in 8.4 and removed in 9.0). + +12.3 +---- + +* Documentation is now linked using the rst.linker package. +* Fix ``setuptools.command.easy_install.extract_wininst_cfg()`` + with Python 2.6 and 2.7. +* Issue #354. Added documentation on building setuptools + documentation. + +12.2 +---- + +* Issue #345: Unload all modules under pkg_resources during + ``ez_setup.use_setuptools()``. +* Issue #336: Removed deprecation from ``ez_setup.use_setuptools``, + as it is clearly still used by buildout's bootstrap. ``ez_setup`` + remains deprecated for use by individual packages. +* Simplified implementation of ``ez_setup.use_setuptools``. + +12.1 +---- + +* BB Pull Request #118: Soften warning for non-normalized versions in + Distribution. + +12.0.5 +------ + +* Issue #339: Correct Attribute reference in ``cant_write_to_target``. +* Issue #336: Deprecated ``ez_setup.use_setuptools``. + +12.0.4 +------ + +* Issue #335: Fix script header generation on Windows. + +12.0.3 +------ + +* Fixed incorrect class attribute in ``install_scripts``. Tests would be nice. + +12.0.2 +------ + +* Issue #331: Fixed ``install_scripts`` command on Windows systems corrupting + the header. + +12.0.1 +------ + +* Restore ``setuptools.command.easy_install.sys_executable`` for pbr + compatibility. For the future, tools should construct a CommandSpec + explicitly. + +12.0 +---- + +* Issue #188: Setuptools now support multiple entities in the value for + ``build.executable``, such that an executable of "/usr/bin/env my-python" may + be specified. This means that systems with a specified executable whose name + has spaces in the path must be updated to escape or quote that value. +* Deprecated ``easy_install.ScriptWriter.get_writer``, replaced by ``.best()`` + with slightly different semantics (no force_windows flag). + +11.3.1 +------ + +* Issue #327: Formalize and restore support for any printable character in an + entry point name. + +11.3 +---- + +* Expose ``EntryPoint.resolve`` in place of EntryPoint._load, implementing the + simple, non-requiring load. Deprecated all uses of ``EntryPoint._load`` + except for calling with no parameters, which is just a shortcut for + ``ep.require(); ep.resolve();``. 
+ + Apps currently invoking ``ep.load(require=False)`` should instead do the + following if wanting to avoid the deprecating warning:: + + getattr(ep, "resolve", lambda: ep.load(require=False))() + +11.2 +---- + +* Pip #2326: Report deprecation warning at stacklevel 2 for easier diagnosis. + +11.1 +---- + +* Issue #281: Since Setuptools 6.1 (Issue #268), a ValueError would be raised + in certain cases where VersionConflict was raised with two arguments, which + occurred in ``pkg_resources.WorkingSet.find``. This release adds support + for indicating the dependent packages while maintaining support for + a VersionConflict when no dependent package context is known. New unit tests + now capture the expected interface. + +11.0 +---- + +* Interop #3: Upgrade to Packaging 15.0; updates to PEP 440 so that >1.7 does + not exclude 1.7.1 but does exclude 1.7.0 and 1.7.0.post1. + +10.2.1 +------ + +* Issue #323: Fix regression in entry point name parsing. + +10.2 +---- + +* Deprecated use of EntryPoint.load(require=False). Passing a boolean to a + function to select behavior is an anti-pattern. Instead use + ``Entrypoint._load()``. +* Substantial refactoring of all unit tests. Tests are now much leaner and + re-use a lot of fixtures and contexts for better clarity of purpose. + +10.1 +---- + +* Issue #320: Added a compatibility implementation of + ``sdist._default_revctrl`` + so that systems relying on that interface do not fail (namely, Ubuntu 12.04 + and similar Debian releases). + +10.0.1 +------ + +* Issue #319: Fixed issue installing pure distutils packages. + +10.0 +---- + +* Issue #313: Removed built-in support for subversion. Projects wishing to + retain support for subversion will need to use a third party library. The + extant implementation is being ported to `setuptools_svn + `_. +* Issue #315: Updated setuptools to hide its own loaded modules during + installation of another package. This change will enable setuptools to + upgrade (or downgrade) itself even when its own metadata and implementation + change. + +9.1 +--- + +* Prefer vendored packaging library `as recommended + `_. + +9.0.1 +----- + +* Issue #312: Restored presence of pkg_resources API tests (doctest) to sdist. + +9.0 +--- + +* Issue #314: Disabled support for ``setup_requires`` metadata to avoid issue + where Setuptools was unable to upgrade over earlier versions. + +8.4 +--- + +* BB Pull Request #106: Now write ``setup_requires`` metadata. + +8.3 +--- + +* Issue #311: Decoupled pkg_resources from setuptools once again. + ``pkg_resources`` is now a package instead of a module. + +8.2.1 +----- + +* Issue #306: Suppress warnings about Version format except in select scenarios + (such as installation). + +8.2 +--- + +* BB Pull Request #85: Search egg-base when adding egg-info to manifest. + +8.1 +--- + +* Upgrade ``packaging`` to 14.5, giving preference to "rc" as designator for + release candidates over "c". +* PEP-440 warnings are now raised as their own class, + ``pkg_resources.PEP440Warning``, instead of RuntimeWarning. +* Disabled warnings on empty versions. + +8.0.4 +----- + +* Upgrade ``packaging`` to 14.4, fixing an error where there is a + different result for if 2.0.5 is contained within >2.0dev and >2.0.dev even + though normalization rules should have made them equal. +* Issue #296: Add warning when a version is parsed as legacy. This warning will + make it easier for developers to recognize deprecated version numbers. + +8.0.3 +----- + +* Issue #296: Restored support for ``__hash__`` on parse_version results. 
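The 11.0 entry above and the 8.x entries below concern PEP 440 handling in the vendored ``packaging`` library; a small hedged sketch of the behavior they describe (``mypkg`` is hypothetical)::

    from pkg_resources import Requirement, parse_version

    # Per the 11.0 entry, >1.7 admits 1.7.1 but rejects 1.7.0 and its
    # post-releases.
    req = Requirement.parse('mypkg>1.7')
    print('1.7.1' in req)          # True
    print('1.7.0.post1' in req)    # False

    # parse_version now yields PEP 440 versions that compare and hash sanely.
    assert parse_version('1.7.1') > parse_version('1.7')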
+ +8.0.2 +----- + +* Issue #296: Restored support for ``__getitem__`` and sort operations on + parse_version result. + +8.0.1 +----- + +* Issue #296: Restore support for iteration over parse_version result, but + deprecated that usage with a warning. Fixes failure with buildout. + +8.0 +--- + +* Implement PEP 440 within + pkg_resources and setuptools. This change + deprecates some version numbers such that they will no longer be installable + without using the ``===`` escape hatch. See `the changes to test_resources + `_ + for specific examples of version numbers and specifiers that are no longer + supported. Setuptools now "vendors" the `packaging + `_ library. + +7.0 +--- + +* Issue #80, Issue #209: Eggs that are downloaded for ``setup_requires``, + ``test_requires``, etc. are now placed in a ``./.eggs`` directory instead of + directly in the current directory. This choice of location means the files + can be readily managed (removed, ignored). Additionally, + later phases or invocations of setuptools will not detect the package as + already installed and ignore it for permanent install (See #209). + + This change is indicated as backward-incompatible as installations that + depend on the installation in the current directory will need to account for + the new location. Systems that ignore ``*.egg`` will probably need to be + adapted to ignore ``.eggs``. The files will need to be manually moved or + will be retrieved again. Most use cases will require no attention. + +6.1 +--- + +* Issue #268: When resolving package versions, a VersionConflict now reports + which package previously required the conflicting version. + +6.0.2 +----- + +* Issue #262: Fixed regression in pip install due to egg-info directories + being omitted. Re-opens Issue #118. + +6.0.1 +----- + +* Issue #259: Fixed regression with namespace package handling on ``single + version, externally managed`` installs. + +6.0 +--- + +* Issue #100: When building a distribution, Setuptools will no longer match + default files using platform-dependent case sensitivity, but rather will + only match the files if their case matches exactly. As a result, on Windows + and other case-insensitive file systems, files with names such as + 'readme.txt' or 'README.TXT' will be omitted from the distribution and a + warning will be issued indicating that 'README.txt' was not found. Other + filenames affected are: + + - README.rst + - README + - setup.cfg + - setup.py (or the script name) + - test/test*.py + + Any users producing distributions with filenames that match those above + case-insensitively, but not case-sensitively, should rename those files in + their repository for better portability. +* BB Pull Request #72: When using ``single_version_externally_managed``, the + exclusion list now includes Python 3.2 ``__pycache__`` entries. +* BB Pull Request #76 and BB Pull Request #78: lines in top_level.txt are now + ordered deterministically. +* Issue #118: The egg-info directory is now no longer included in the list + of outputs. +* Issue #258: Setuptools now patches distutils msvc9compiler to + recognize the specially-packaged compiler package for easy extension module + support on Python 2.6, 2.7, and 3.2. + +5.8 +--- + +* Issue #237: ``pkg_resources`` now uses explicit detection of Python 2 vs. + Python 3, supporting environments where builtins have been patched to make + Python 3 look more like Python 2. + +5.7 +--- + +* Issue #240: Based on real-world performance measures against 5.4, zip + manifests are now cached in all circumstances. 
The + ``PKG_RESOURCES_CACHE_ZIP_MANIFESTS`` environment variable is no longer + relevant. The observed "memory increase" referenced in the 5.4 release + notes and detailed in Issue #154 was likely not an increase over the status + quo, but rather only an increase over not storing the zip info at all. + +5.6 +--- + +* Issue #242: Use absolute imports in svn_utils to avoid issues if the + installing package adds an xml module to the path. + +5.5.1 +----- + +* Issue #239: Fix typo in 5.5 such that fix did not take. + +5.5 +--- + +* Issue #239: Setuptools now includes the setup_requires directive on + Distribution objects and validates the syntax just like install_requires + and tests_require directives. + +5.4.2 +----- + +* Issue #236: Corrected regression in execfile implementation for Python 2.6. + +5.4.1 +----- + +* Python #7776: (ssl_support) Correct usage of host for validation when + tunneling for HTTPS. + +5.4 +--- + +* Issue #154: ``pkg_resources`` will now cache the zip manifests rather than + re-processing the same file from disk multiple times, but only if the + environment variable ``PKG_RESOURCES_CACHE_ZIP_MANIFESTS`` is set. Clients + that package many modules in the same zip file will see some improvement + in startup time by enabling this feature. This feature is not enabled by + default because it causes a substantial increase in memory usage. + +5.3 +--- + +* Issue #185: Make svn tagging work on the new style SVN metadata. + Thanks cazabon! +* Prune revision control directories (e.g .svn) from base path + as well as sub-directories. + +5.2 +--- + +* Added a `Developer Guide + `_ to the official + documentation. +* Some code refactoring and cleanup was done with no intended behavioral + changes. +* During install_egg_info, the generated lines for namespace package .pth + files are now processed even during a dry run. + +5.1 +--- + +* Issue #202: Implemented more robust cache invalidation for the ZipImporter, + building on the work in Issue #168. Special thanks to Jurko Gospodnetic and + PJE. + +5.0.2 +----- + +* Issue #220: Restored script templates. + +5.0.1 +----- + +* Renamed script templates to end with .tmpl now that they no longer need + to be processed by 2to3. Fixes spurious syntax errors during build/install. + +5.0 +--- + +* Issue #218: Re-release of 3.8.1 to signal that it supersedes 4.x. +* Incidentally, script templates were updated not to include the triple-quote + escaping. + +3.7.1 and 3.8.1 and 4.0.1 +------------------------- + +* Issue #213: Use legacy StringIO behavior for compatibility under pbr. +* Issue #218: Setuptools 3.8.1 superseded 4.0.1, and 4.x was removed + from the available versions to install. + +4.0 +--- + +* Issue #210: ``setup.py develop`` now copies scripts in binary mode rather + than text mode, matching the behavior of the ``install`` command. + +3.8 +--- + +* Extend Issue #197 workaround to include all Python 3 versions prior to + 3.2.2. + +3.7 +--- + +* Issue #193: Improved handling of Unicode filenames when building manifests. + +3.6 +--- + +* Issue #203: Honor proxy settings for Powershell downloader in the bootstrap + routine. + +3.5.2 +----- + +* Issue #168: More robust handling of replaced zip files and stale caches. + Fixes ZipImportError complaining about a 'bad local header'. + +3.5.1 +----- + +* Issue #199: Restored ``install._install`` for compatibility with earlier + NumPy versions. + +3.5 +--- + +* Issue #195: Follow symbolic links in find_packages (restoring behavior + broken in 3.4). 
+* Issue #197: On Python 3.1, PKG-INFO is now saved in a UTF-8 encoding instead + of ``sys.getpreferredencoding`` to match the behavior on Python 2.6-3.4. +* Issue #192: Preferred bootstrap location is now + https://bootstrap.pypa.io/ez_setup.py (mirrored from former location). + +3.4.4 +----- + +* Issue #184: Correct failure where find_package over-matched packages + when directory traversal isn't short-circuited. + +3.4.3 +----- + +* Issue #183: Really fix test command with Python 3.1. + +3.4.2 +----- + +* Issue #183: Fix additional regression in test command on Python 3.1. + +3.4.1 +----- + +* Issue #180: Fix regression in test command not caught by py.test-run tests. + +3.4 +--- + +* Issue #176: Add parameter to the test command to support a custom test + runner: --test-runner or -r. +* Issue #177: Now assume most common invocation to install command on + platforms/environments without stack support (issuing a warning). Setuptools + now installs naturally on IronPython. Behavior on CPython should be + unchanged. + +3.3 +--- + +* Add ``include`` parameter to ``setuptools.find_packages()``. + +3.2 +--- + +* BB Pull Request #39: Add support for C++ targets from Cython ``.pyx`` files. +* Issue #162: Update dependency on certifi to 1.0.1. +* Issue #164: Update dependency on wincertstore to 0.2. + +3.1 +--- + +* Issue #161: Restore Features functionality to allow backward compatibility + (for Features) until the uses of that functionality is sufficiently removed. + +3.0.2 +----- + +* Correct typo in previous bugfix. + +3.0.1 +----- + +* Issue #157: Restore support for Python 2.6 in bootstrap script where + ``zipfile.ZipFile`` does not yet have support for context managers. + +3.0 +--- + +* Issue #125: Prevent Subversion support from creating a ~/.subversion + directory just for checking the presence of a Subversion repository. +* Issue #12: Namespace packages are now imported lazily. That is, the mere + declaration of a namespace package in an egg on ``sys.path`` no longer + causes it to be imported when ``pkg_resources`` is imported. Note that this + change means that all of a namespace package's ``__init__.py`` files must + include a ``declare_namespace()`` call in order to ensure that they will be + handled properly at runtime. In 2.x it was possible to get away without + including the declaration, but only at the cost of forcing namespace + packages to be imported early, which 3.0 no longer does. +* Issue #148: When building (bdist_egg), setuptools no longer adds + ``__init__.py`` files to namespace packages. Any packages that rely on this + behavior will need to create ``__init__.py`` files and include the + ``declare_namespace()``. +* Issue #7: Setuptools itself is now distributed as a zip archive in addition to + tar archive. ez_setup.py now uses zip archive. This approach avoids the potential + security vulnerabilities presented by use of tar archives in ez_setup.py. + It also leverages the security features added to ZipFile.extract in Python 2.7.4. +* Issue #65: Removed deprecated Features functionality. +* BB Pull Request #28: Remove backport of ``_bytecode_filenames`` which is + available in Python 2.6 and later, but also has better compatibility with + Python 3 environments. +* Issue #156: Fix spelling of __PYVENV_LAUNCHER__ variable. + +2.2 +--- + +* Issue #141: Restored fix for allowing setup_requires dependencies to + override installed dependencies during setup. 
+* Issue #128: Fixed issue where only the first dependency link was honored + in a distribution where multiple dependency links were supplied. + +2.1.2 +----- + +* Issue #144: Read long_description using codecs module to avoid errors + installing on systems where LANG=C. + +2.1.1 +----- + +* Issue #139: Fix regression in re_finder for CVS repos (and maybe Git repos + as well). + +2.1 +--- + +* Issue #129: Suppress inspection of ``*.whl`` files when searching for files + in a zip-imported file. +* Issue #131: Fix RuntimeError when constructing an egg fetcher. + +2.0.2 +----- + +* Fix NameError during installation with Python implementations (e.g. Jython) + not containing parser module. +* Fix NameError in ``sdist:re_finder``. + +2.0.1 +----- + +* Issue #124: Fixed error in list detection in upload_docs. + +2.0 +--- + +* Issue #121: Exempt lib2to3 pickled grammars from DirectorySandbox. +* Issue #41: Dropped support for Python 2.4 and Python 2.5. Clients requiring + setuptools for those versions of Python should use setuptools 1.x. +* Removed ``setuptools.command.easy_install.HAS_USER_SITE``. Clients + expecting this boolean variable should use ``site.ENABLE_USER_SITE`` + instead. +* Removed ``pkg_resources.ImpWrapper``. Clients that expected this class + should use ``pkgutil.ImpImporter`` instead. + +1.4.2 +----- + +* Issue #116: Correct TypeError when reading a local package index on Python + 3. + +1.4.1 +----- + +* Issue #114: Use ``sys.getfilesystemencoding`` for decoding config in + ``bdist_wininst`` distributions. + +* Issue #105 and Issue #113: Establish a more robust technique for + determining the terminal encoding:: + + 1. Try ``getpreferredencoding`` + 2. If that returns US_ASCII or None, try the encoding from + ``getdefaultlocale``. If that encoding was a "fallback" because Python + could not figure it out from the environment or OS, encoding remains + unresolved. + 3. If the encoding is resolved, then make sure Python actually implements + the encoding. + 4. On the event of an error or unknown codec, revert to fallbacks + (UTF-8 on Darwin, ASCII on everything else). + 5. On the encoding is 'mac-roman' on Darwin, use UTF-8 as 'mac-roman' was + a bug on older Python releases. + + On a side note, it would seem that the encoding only matters for when SVN + does not yet support ``--xml`` and when getting repository and svn version + numbers. The ``--xml`` technique should yield UTF-8 according to some + messages on the SVN mailing lists. So if the version numbers are always + 7-bit ASCII clean, it may be best to only support the file parsing methods + for legacy SVN releases and support for SVN without the subprocess command + would simple go away as support for the older SVNs does. + +1.4 +--- + +* Issue #27: ``easy_install`` will now use credentials from .pypirc if + present for connecting to the package index. +* BB Pull Request #21: Omit unwanted newlines in ``package_index._encode_auth`` + when the username/password pair length indicates wrapping. + +1.3.2 +----- + +* Issue #99: Fix filename encoding issues in SVN support. + +1.3.1 +----- + +* Remove exuberant warning in SVN support when SVN is not used. + +1.3 +--- + +* Address security vulnerability in SSL match_hostname check as reported in + Python #17997. +* Prefer `backports.ssl_match_hostname + `_ for backport + implementation if present. +* Correct NameError in ``ssl_support`` module (``socket.error``). + +1.2 +--- + +* Issue #26: Add support for SVN 1.7. Special thanks to Philip Thiem for the + contribution. 
+* Issue #93: Wheels are now distributed with every release. Note that as + reported in Issue #108, as of Pip 1.4, scripts aren't installed properly + from wheels. Therefore, if using Pip to install setuptools from a wheel, + the ``easy_install`` command will not be available. +* Setuptools "natural" launcher support, introduced in 1.0, is now officially + supported. + +1.1.7 +----- + +* Fixed behavior of NameError handling in 'script template (dev).py' (script + launcher for 'develop' installs). +* ``ez_setup.py`` now ensures partial downloads are cleaned up following + a failed download. +* Distribute #363 and Issue #55: Skip an sdist test that fails on locales + other than UTF-8. + +1.1.6 +----- + +* Distribute #349: ``sandbox.execfile`` now opens the target file in binary + mode, thus honoring a BOM in the file when compiled. + +1.1.5 +----- + +* Issue #69: Second attempt at fix (logic was reversed). + +1.1.4 +----- + +* Issue #77: Fix error in upload command (Python 2.4). + +1.1.3 +----- + +* Fix NameError in previous patch. + +1.1.2 +----- + +* Issue #69: Correct issue where 404 errors are returned for URLs with + fragments in them (such as #egg=). + +1.1.1 +----- + +* Issue #75: Add ``--insecure`` option to ez_setup.py to accommodate + environments where a trusted SSL connection cannot be validated. +* Issue #76: Fix AttributeError in upload command with Python 2.4. + +1.1 +--- + +* Issue #71 (Distribute #333): EasyInstall now puts less emphasis on the + condition when a host is blocked via ``--allow-hosts``. +* Issue #72: Restored Python 2.4 compatibility in ``ez_setup.py``. + +1.0 +--- + +* Issue #60: On Windows, Setuptools supports deferring to another launcher, + such as Vinay Sajip's `pylauncher `_ + (included with Python 3.3) to launch console and GUI scripts and not install + its own launcher executables. This experimental functionality is currently + only enabled if the ``SETUPTOOLS_LAUNCHER`` environment variable is set to + "natural". In the future, this behavior may become default, but only after + it has matured and seen substantial adoption. The ``SETUPTOOLS_LAUNCHER`` + also accepts "executable" to force the default behavior of creating launcher + executables. +* Issue #63: Bootstrap script (ez_setup.py) now prefers Powershell, curl, or + wget for retrieving the Setuptools tarball for improved security of the + install. The script will still fall back to a simple ``urlopen`` on + platforms that do not have these tools. +* Issue #65: Deprecated the ``Features`` functionality. +* Issue #52: In ``VerifyingHTTPSConn``, handle a tunnelled (proxied) + connection. + +Backward-Incompatible Changes +============================= + +This release includes a couple of backward-incompatible changes, but most if +not all users will find 1.0 a drop-in replacement for 0.9. + +* Issue #50: Normalized API of environment marker support. Specifically, + removed line number and filename from SyntaxErrors when returned from + `pkg_resources.invalid_marker`. Any clients depending on the specific + string representation of exceptions returned by that function may need to + be updated to account for this change. +* Issue #50: SyntaxErrors generated by `pkg_resources.invalid_marker` are + normalized for cross-implementation consistency. +* Removed ``--ignore-conflicts-at-my-risk`` and ``--delete-conflicting`` + options to easy_install. These options have been deprecated since 0.6a11. + +0.9.8 +----- + +* Issue #53: Fix NameErrors in `_vcs_split_rev_from_url`. 
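A hedged sketch of the ``pkg_resources.invalid_marker`` behavior referenced in the backward-incompatible notes for 1.0 above (the marker strings are only illustrative)::

    from pkg_resources import invalid_marker

    # Returns False for a valid environment marker...
    print(invalid_marker('python_version < "3"'))
    # ...and returns (rather than raises) a SyntaxError for an invalid one.
    err = invalid_marker('python_version >< "3"')
    print(isinstance(err, SyntaxError))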
+ +0.9.7 +----- + +* Issue #49: Correct AttributeError on PyPy where a hashlib.HASH object does + not have a `.name` attribute. +* Issue #34: Documentation now refers to bootstrap script in code repository + referenced by bookmark. +* Add underscore-separated keys to environment markers (markerlib). + +0.9.6 +----- + +* Issue #44: Test failure on Python 2.4 when MD5 hash doesn't have a `.name` + attribute. + +0.9.5 +----- + +* Python #17980: Fix security vulnerability in SSL certificate validation. + +0.9.4 +----- + +* Issue #43: Fix issue (introduced in 0.9.1) with version resolution when + upgrading over other releases of Setuptools. + +0.9.3 +----- + +* Issue #42: Fix new ``AttributeError`` introduced in last fix. + +0.9.2 +----- + +* Issue #42: Fix regression where blank checksums would trigger an + ``AttributeError``. + +0.9.1 +----- + +* Distribute #386: Allow other positional and keyword arguments to os.open. +* Corrected dependency on certifi mis-referenced in 0.9. + +0.9 +--- + +* `package_index` now validates hashes other than MD5 in download links. + +0.8 +--- + +* Code base now runs on Python 2.4 - Python 3.3 without Python 2to3 + conversion. + +0.7.8 +----- + +* Distribute #375: Yet another fix for yet another regression. + +0.7.7 +----- + +* Distribute #375: Repair AttributeError created in last release (redo). +* Issue #30: Added test for get_cache_path. + +0.7.6 +----- + +* Distribute #375: Repair AttributeError created in last release. + +0.7.5 +----- + +* Issue #21: Restore Python 2.4 compatibility in ``test_easy_install``. +* Distribute #375: Merged additional warning from Distribute 0.6.46. +* Now honor the environment variable + ``SETUPTOOLS_DISABLE_VERSIONED_EASY_INSTALL_SCRIPT`` in addition to the now + deprecated ``DISTRIBUTE_DISABLE_VERSIONED_EASY_INSTALL_SCRIPT``. + +0.7.4 +----- + +* Issue #20: Fix comparison of parsed SVN version on Python 3. + +0.7.3 +----- + +* Issue #1: Disable installation of Windows-specific files on non-Windows systems. +* Use new sysconfig module with Python 2.7 or >=3.2. + +0.7.2 +----- + +* Issue #14: Use markerlib when the `parser` module is not available. +* Issue #10: ``ez_setup.py`` now uses HTTPS to download setuptools from PyPI. + +0.7.1 +----- + +* Fix NameError (Issue #3) again - broken in bad merge. + +0.7 +--- + +* Merged Setuptools and Distribute. See docs/merge.txt for details. + +Added several features that were slated for setuptools 0.6c12: + +* Index URL now defaults to HTTPS. +* Added experimental environment marker support. Now clients may designate a + PEP-426 environment marker for "extra" dependencies. Setuptools uses this + feature in ``setup.py`` for optional SSL and certificate validation support + on older platforms. Based on Distutils-SIG discussions, the syntax is + somewhat tentative. There should probably be a PEP with a firmer spec before + the feature should be considered suitable for use. +* Added support for SSL certificate validation when installing packages from + an HTTPS service. + +0.7b4 +----- + +* Issue #3: Fixed NameError in SSL support. + +0.6.49 +------ + +* Move warning check in ``get_cache_path`` to follow the directory creation + to avoid errors when the cache path does not yet exist. Fixes the error + reported in Distribute #375. + +0.6.48 +------ + +* Correct AttributeError in ``ResourceManager.get_cache_path`` introduced in + 0.6.46 (redo). + +0.6.47 +------ + +* Correct AttributeError in ``ResourceManager.get_cache_path`` introduced in + 0.6.46. 
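+
+As an illustration of the environment-marker support for "extra" dependencies
+described under 0.7 above, an extra in ``extras_require`` can be qualified with
+a marker using the ``name:marker`` key form; the project and package names
+below are purely illustrative::
+
+    from setuptools import setup
+
+    setup(
+        name='example-project',
+        version='1.0',
+        extras_require={
+            'ssl': ['certifi'],                                # plain extra
+            'ssl:sys_platform == "win32"': ['wincertstore'],   # extra gated by a marker
+        },
+    )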
+ +0.6.46 +------ + +* Distribute #375: Issue a warning if the PYTHON_EGG_CACHE or otherwise + customized egg cache location specifies a directory that's group- or + world-writable. + +0.6.45 +------ + +* Distribute #379: ``distribute_setup.py`` now traps VersionConflict as well, + restoring ability to upgrade from an older setuptools version. + +0.6.44 +------ + +* ``distribute_setup.py`` has been updated to allow Setuptools 0.7 to + satisfy use_setuptools. + +0.6.43 +------ + +* Distribute #378: Restore support for Python 2.4 Syntax (regression in 0.6.42). + +0.6.42 +------ + +* External links finder no longer yields duplicate links. +* Distribute #337: Moved site.py to setuptools/site-patch.py (graft of very old + patch from setuptools trunk which inspired PR #31). + +0.6.41 +------ + +* Distribute #27: Use public api for loading resources from zip files rather than + the private method `_zip_directory_cache`. +* Added a new function ``easy_install.get_win_launcher`` which may be used by + third-party libraries such as buildout to get a suitable script launcher. + +0.6.40 +------ + +* Distribute #376: brought back cli.exe and gui.exe that were deleted in the + previous release. + +0.6.39 +------ + +* Add support for console launchers on ARM platforms. +* Fix possible issue in GUI launchers where the subsystem was not supplied to + the linker. +* Launcher build script now refactored for robustness. +* Distribute #375: Resources extracted from a zip egg to the file system now also + check the contents of the file against the zip contents during each + invocation of get_resource_filename. + +0.6.38 +------ + +* Distribute #371: The launcher manifest file is now installed properly. + +0.6.37 +------ + +* Distribute #143: Launcher scripts, including easy_install itself, are now + accompanied by a manifest on 32-bit Windows environments to avoid the + Installer Detection Technology and thus undesirable UAC elevation described + in `this Microsoft article + `_. + +0.6.36 +------ + +* BB Pull Request #35: In Buildout #64, it was reported that + under Python 3, installation of distutils scripts could attempt to copy + the ``__pycache__`` directory as a file, causing an error, apparently only + under Windows. Easy_install now skips all directories when processing + metadata scripts. + +0.6.35 +------ + + +Note this release is backward-incompatible with distribute 0.6.23-0.6.34 in +how it parses version numbers. + +* Distribute #278: Restored compatibility with distribute 0.6.22 and setuptools + 0.6. Updated the documentation to match more closely with the version + parsing as intended in setuptools 0.6. + +0.6.34 +------ + +* Distribute #341: 0.6.33 fails to build under Python 2.4. + +0.6.33 +------ + +* Fix 2 errors with Jython 2.5. +* Fix 1 failure with Jython 2.5 and 2.7. +* Disable workaround for Jython scripts on Linux systems. +* Distribute #336: `setup.py` no longer masks failure exit code when tests fail. +* Fix issue in pkg_resources where try/except around a platform-dependent + import would trigger hook load failures on Mercurial. See pull request 32 + for details. +* Distribute #341: Fix a ResourceWarning. + +0.6.32 +------ + +* Fix test suite with Python 2.6. +* Fix some DeprecationWarnings and ResourceWarnings. +* Distribute #335: Backed out `setup_requires` superceding installed requirements + until regression can be addressed. + +0.6.31 +------ + +* Distribute #303: Make sure the manifest only ever contains UTF-8 in Python 3. 
+* Distribute #329: Properly close files created by tests for compatibility with
+  Jython.
+* Work around Jython #1980 and Jython #1981.
+* Distribute #334: Provide workaround for packages that reference `sys.__stdout__`
+  such as numpy does. This change should address
+  `virtualenv #359 `_ as long
+  as the system encoding is UTF-8 or the IO encoding is specified in the
+  environment, i.e.::
+
+    PYTHONIOENCODING=utf8 pip install numpy
+
+* Fix for encoding issue when installing from Windows executable on Python 3.
+* Distribute #323: Allow `setup_requires` requirements to supersede installed
+  requirements. Added some new keyword arguments to existing pkg_resources
+  methods. Also had to update how __path__ is handled for namespace packages
+  to ensure that when a new egg distribution containing a namespace package is
+  placed on sys.path, the entries in __path__ are found in the same order they
+  would have been in had that egg been on the path when pkg_resources was
+  first imported.
+
+0.6.30
+------
+
+* Distribute #328: Clean up temporary directories in distribute_setup.py.
+* Fix fatal bug in distribute_setup.py.
+
+0.6.29
+------
+
+* BB Pull Request #14: Honor file permissions in zip files.
+* Distribute #327: Merged pull request #24 to fix a dependency problem with pip.
+* Merged pull request #23 to fix https://github.com/pypa/virtualenv/issues/301.
+* If Sphinx is installed, the `upload_docs` command now runs `build_sphinx`
+  to produce uploadable documentation.
+* Distribute #326: `upload_docs` provided mangled auth credentials under Python 3.
+* Distribute #320: Fix check for "createable" in distribute_setup.py.
+* Distribute #305: Remove a warning that was triggered during normal operations.
+* Distribute #311: Print metadata in UTF-8 independent of platform.
+* Distribute #303: Read manifest file with UTF-8 encoding under Python 3.
+* Distribute #301: Allow running tests of namespace packages when using 2to3.
+* Distribute #304: Prevent import loop in site.py under Python 3.3.
+* Distribute #283: Reenable scanning of `*.pyc` / `*.pyo` files on Python 3.3.
+* Distribute #299: The develop command didn't work on Python 3, when using 2to3,
+  as the egg link would go to the Python 2 source. Linking to the 2to3'd code
+  in build/lib makes it work, although you will have to rebuild the module
+  before testing it.
+* Distribute #306: Even if 2to3 is used, we build in-place under Python 2.
+* Distribute #307: Prints the full path when .svn/entries is broken.
+* Distribute #313: Support for sdist subcommands (Python 2.7).
+* Distribute #314: test_local_index() would fail on OS X.
+* Distribute #310: Non-ascii characters in a namespace __init__.py cause errors.
+* Distribute #218: Improved documentation on behavior of `package_data` and
+  `include_package_data`. Files indicated by `package_data` are now included
+  in the manifest.
+* `distribute_setup.py` now allows a `--download-base` argument for retrieving
+  distribute from a specified location.
+
+0.6.28
+------
+
+* Distribute #294: setup.py can now be invoked from any directory.
+* Scripts are now installed honoring the umask.
+* Added support for .dist-info directories.
+* Distribute #283: Fix and disable scanning of `*.pyc` / `*.pyo` files on
+  Python 3.3.
+
+0.6.27
+------
+
+* Support current snapshots of CPython 3.3.
+* Distribute now recognizes README.rst as a standard, default readme file.
+* Exclude 'encodings' modules when removing modules from sys.modules.
+  Workaround for #285.
+* Distribute #231: Don't fiddle with the system Python when used with buildout
+  (bootstrap.py).
+
+0.6.26
+------
+
+* Distribute #183: Symlinked files are now extracted from source distributions.
+* Distribute #227: Easy_install fetch parameters are now passed during the
+  installation of a source distribution; fulfillment of setup_requires
+  dependencies will now honor the parameters passed to easy_install.
+
+0.6.25
+------
+
+* Distribute #258: Work around a cache issue.
+* Distribute #260: distribute_setup.py now accepts the --user parameter for
+  Python 2.6 and later.
+* Distribute #262: package_index.open_with_auth no longer throws LookupError
+  on Python 3.
+* Distribute #269: AttributeError when an exception occurs reading Manifest.in
+  on late releases of Python.
+* Distribute #272: Prevent TypeError when namespace package names are unicode
+  and single-install-externally-managed is used. Also fixes PIP issue
+  449.
+* Distribute #273: Legacy script launchers now install with Python2/3 support.
+
+0.6.24
+------
+
+* Distribute #249: Added options to exclude 2to3 fixers.
+
+0.6.23
+------
+
+* Distribute #244: Fixed a test
+* Distribute #243: Fixed a test
+* Distribute #239: Fixed a test
+* Distribute #240: Fixed a test
+* Distribute #241: Fixed a test
+* Distribute #237: Fixed a test
+* Distribute #238: easy_install now uses 64-bit executable wrappers on 64-bit Python
+* Distribute #208: Fixed parsed_versions; it now honors post-releases as noted in the documentation
+* Distribute #207: Windows cli and gui wrappers pass CTRL-C to the child Python process
+* Distribute #227: easy_install now passes its arguments to setup.py bdist_egg
+* Distribute #225: Fixed a NameError on Python 2.5, 2.4
+
+0.6.21
+------
+
+* Distribute #225: Fixed a regression on py2.4.
+
+0.6.20
+------
+
+* Distribute #135: Include url in warning when processing URLs in package_index.
+* Distribute #212: Fix issue where easy_install fails on Python 3 with the
+  Windows installer.
+* Distribute #213: Fix typo in documentation.
+
+0.6.19
+------
+
+* Distribute #206: AttributeError: 'HTTPMessage' object has no attribute 'getheaders'
+
+0.6.18
+------
+
+* Distribute #210: Fixed a regression introduced by the Distribute #204 fix.
+
+0.6.17
+------
+
+* Support the 'DISTRIBUTE_DISABLE_VERSIONED_EASY_INSTALL_SCRIPT' environment
+  variable to allow disabling installation of the easy_install-${version} script.
+* Support Python >=3.1.4 and >=3.2.1.
+* Distribute #204: Don't try to import the parent of a namespace package in
+  declare_namespace.
+* Distribute #196: Tolerate responses with multiple Content-Length headers.
+* Distribute #205: Sandboxing doesn't preserve working_set. Leads to setup_requires
+  problems.
+
+0.6.16
+------
+
+* Builds sdist gztar even on Windows (avoiding Distribute #193).
+* Distribute #192: Fixed metadata omitted on Windows when package_dir
+  specified with forward-slash.
+* Distribute #195: Cython build support.
+* Distribute #200: Issues with recognizing 64-bit packages on Windows.
+
+0.6.15
+------
+
+* Fixed typo in bdist_egg.
+* Several issues under Python 3 have been resolved.
+* Distribute #146: Fixed missing DLL files after easy_install of a Windows
+  exe package.
+
+0.6.14
+------
+
+* Distribute #170: Fixed unittest failure. Thanks to Toshio.
+* Distribute #171: Fixed a race condition in unittests that caused deadlocks
+  in the test suite.
+* Distribute #143: Fixed a lookup issue with easy_install.
+  Thanks to David and Zooko.
+* Distribute #174: Fixed the edit mode when it's used with setuptools itself.
+
+0.6.13
+------
+
+* Distribute #160: 2.7 gives ValueError("Invalid IPv6 URL")
+* Distribute #150: Fixed using ~/.local even in a --no-site-packages virtualenv
+* Distribute #163: scan index links before external links, and don't use the md5 when
+  comparing two distributions
+
+0.6.12
+------
+
+* Distribute #149: Fixed various failures on 2.3/2.4
+
+0.6.11
+------
+
+* Found another case of SandboxViolation - fixed
+* Distribute #15 and Distribute #48: Introduced a socket timeout of 15 seconds on URL openings
+* Added indexsidebar.html into MANIFEST.in
+* Distribute #108: Fixed TypeError with Python3.1
+* Distribute #121: Fixed --help install command trying to actually install.
+* Distribute #112: Added an os.makedirs so that Tarek's solution will work.
+* Distribute #133: Added --no-find-links to easy_install
+* Added easy_install --user
+* Distribute #100: Fixed develop --user not taking '.' in PYTHONPATH into account
+* Distribute #134: Removed spurious UserWarnings. Patch by VanLindberg
+* Distribute #138: cant_write_to_target error when setup_requires is used.
+* Distribute #147: Respect the sys.dont_write_bytecode flag
+
+0.6.10
+------
+
+* Reverted change made for the DistributionNotFound exception because
+  zc.buildout uses the exception message to get the name of the
+  distribution.
+
+0.6.9
+-----
+
+* Distribute #90: unknown setuptools version can be added in the working set
+* Distribute #87: setup.py doesn't try to convert distribute_setup.py anymore.
+  Initial patch by arfrever.
+* Distribute #89: added a side bar with a download link to the doc.
+* Distribute #86: fixed missing sentence in pkg_resources doc.
+* Added a nicer error message when a DistributionNotFound is raised.
+* Distribute #80: test_develop now works with Python 3.1
+* Distribute #93: upload_docs now works if there is an empty sub-directory.
+* Distribute #70: exec bit on non-exec files
+* Distribute #99: now the standalone easy_install command doesn't use a
+  "setup.cfg" if any exists in the working directory. It will use it
+  only if triggered by ``install_requires`` from a setup.py call
+  (install, develop, etc).
+* Distribute #101: Allowing ``os.devnull`` in Sandbox
+* Distribute #92: Fixed the "no eggs" found error with MacPort
+  (platform.mac_ver() fails)
+* Distribute #103: test_get_script_header_jython_workaround not run
+  anymore under py3 with C or POSIX locale. Contributed by Arfrever.
+* Distribute #104: removed the assertion when the installation fails,
+  with a nicer message for the end user.
+* Distribute #100: making sure there's no SandboxViolation when
+  the setup script patches setuptools.
+
+0.6.8
+-----
+
+* Added "check_packages" in dist. (added in Setuptools 0.6c11)
+* Fixed the DONT_PATCH_SETUPTOOLS state.
+
+0.6.7
+-----
+
+* Distribute #58: Added --user support to the develop command.
+* Distribute #11: Generated scripts now wrap their call to the script entry point
+  in the standard ``if __name__ == '__main__'`` block.
+* Added the 'DONT_PATCH_SETUPTOOLS' environment variable, so virtualenv
+  can drive an installation that doesn't patch a global setuptools.
+* Reviewed unladen-swallow specific change from
+  http://code.google.com/p/unladen-swallow/source/detail?spec=svn875&r=719
+  and determined that it no longer applies. Distribute should work fine with
+  Unladen Swallow 2009Q3.
+* Distribute #21: Allow PackageIndex.open_url to gracefully handle all cases of an
+  httplib.HTTPException instead of just InvalidURL and BadStatusLine.
+* Removed virtual-python.py from this distribution and updated documentation
+  to point to the actively maintained virtualenv instead.
+* Distribute #64: use_setuptools no longer rebuilds the distribute egg every
+  time it is run.
+* use_setuptools now properly respects the requested version.
+* use_setuptools will no longer try to import a distribute egg for the
+  wrong Python version.
+* Distribute #74: no_fake should be True by default.
+* Distribute #72: Avoid a bootstrapping issue with easy_install -U.
+
+0.6.6
+-----
+
+* Unified the bootstrap file so it works on both py2.x and py3k without 2to3
+  (patch by Holger Krekel).
+
+0.6.5
+-----
+
+* Distribute #65: cli.exe and gui.exe are now generated at build time,
+  depending on the platform in use.
+
+* Distribute #67: Fixed doc typo (PEP 381/PEP 382).
+
+* Distribute no longer shadows setuptools if we require a 0.7-series
+  setuptools, and an error is raised when installing a 0.7 setuptools with
+  distribute.
+
+* When run from within buildout, no attempt is made to modify an existing
+  setuptools egg, whether in a shared egg directory or a system setuptools.
+
+* Fixed a hole in sandboxing allowing builtin file to write outside of
+  the sandbox.
+
+0.6.4
+-----
+
+* Added the generation of `distribute_setup_3k.py` during the release.
+  This closes Distribute #52.
+
+* Added an upload_docs command to easily upload project documentation to
+  PyPI's https://pythonhosted.org. This closes issue Distribute #56.
+
+* Fixed a bootstrap bug on the use_setuptools() API.
+
+0.6.3
+-----
+
+setuptools
+==========
+
+* Fixed a bunch of calls to file() that caused crashes on Python 3.
+
+bootstrapping
+=============
+
+* Fixed a bug in sorting that caused bootstrap to fail on Python 3.
+
+0.6.2
+-----
+
+setuptools
+==========
+
+* Added Python 3 support; see docs/python3.txt.
+  This closes Old Setuptools #39.
+
+* Added option to run 2to3 automatically when installing on Python 3.
+  This closes issue Distribute #31.
+
+* Fixed invalid usage of requirement.parse that broke develop -d.
+  This closes Old Setuptools #44.
+
+* Fixed script launcher for 64-bit Windows.
+  This closes Old Setuptools #2.
+
+* Fixed a KeyError when compiling extensions.
+  This closes Old Setuptools #41.
+
+bootstrapping
+=============
+
+* Fixed bootstrap not working on Windows. This closes issue Distribute #49.
+
+* Fixed 2.6 dependencies. This closes issue Distribute #50.
+
+* Make sure setuptools is patched when running through easy_install.
+  This closes Old Setuptools #40.
+
+0.6.1
+-----
+
+setuptools
+==========
+
+* package_index.urlopen now catches BadStatusLine and malformed URL errors.
+  This closes Distribute #16 and Distribute #18.
+
+* zip_ok is now False by default. This closes Old Setuptools #33.
+
+* Fixed invalid URL error catching. Old Setuptools #20.
+
+* Fixed invalid bootstrapping with easy_install installation (Distribute #40).
+  Thanks to Florian Schulze for the help.
+
+* Removed buildout/bootstrap.py. A new repository will create a specific
+  bootstrap.py script.
+
+
+bootstrapping
+=============
+
+* The bootstrap process leaves setuptools alone if it is detected in the system
+  and --root or --prefix is provided but is not in the same location.
+  This closes Distribute #10.
+
+0.6
+---
+
+setuptools
+==========
+
+* Packages required at build time were not fully present at install time.
+ This closes Distribute #12. + +* Protected against failures in tarfile extraction. This closes Distribute #10. + +* Made Jython api_tests.txt doctest compatible. This closes Distribute #7. + +* sandbox.py replaced builtin type file with builtin function open. This + closes Distribute #6. + +* Immediately close all file handles. This closes Distribute #3. + +* Added compatibility with Subversion 1.6. This references Distribute #1. + +pkg_resources +============= + +* Avoid a call to /usr/bin/sw_vers on OSX and use the official platform API + instead. Based on a patch from ronaldoussoren. This closes issue #5. + +* Fixed a SandboxViolation for mkdir that could occur in certain cases. + This closes Distribute #13. + +* Allow to find_on_path on systems with tight permissions to fail gracefully. + This closes Distribute #9. + +* Corrected inconsistency between documentation and code of add_entry. + This closes Distribute #8. + +* Immediately close all file handles. This closes Distribute #3. + +easy_install +============ + +* Immediately close all file handles. This closes Distribute #3. + +0.6c9 +----- + + * Fixed a missing files problem when using Windows source distributions on + non-Windows platforms, due to distutils not handling manifest file line + endings correctly. + + * Updated Pyrex support to work with Pyrex 0.9.6 and higher. + + * Minor changes for Jython compatibility, including skipping tests that can't + work on Jython. + + * Fixed not installing eggs in ``install_requires`` if they were also used for + ``setup_requires`` or ``tests_require``. + + * Fixed not fetching eggs in ``install_requires`` when running tests. + + * Allow ``ez_setup.use_setuptools()`` to upgrade existing setuptools + installations when called from a standalone ``setup.py``. + + * Added a warning if a namespace package is declared, but its parent package + is not also declared as a namespace. + + * Support Subversion 1.5 + + * Removed use of deprecated ``md5`` module if ``hashlib`` is available + + * Fixed ``bdist_wininst upload`` trying to upload the ``.exe`` twice + + * Fixed ``bdist_egg`` putting a ``native_libs.txt`` in the source package's + ``.egg-info``, when it should only be in the built egg's ``EGG-INFO``. + + * Ensure that _full_name is set on all shared libs before extensions are + checked for shared lib usage. (Fixes a bug in the experimental shared + library build support.) + + * Fix to allow unpacked eggs containing native libraries to fail more + gracefully under Google App Engine (with an ``ImportError`` loading the + C-based module, instead of getting a ``NameError``). + +0.6c7 +----- + + * Fixed ``distutils.filelist.findall()`` crashing on broken symlinks, and + ``egg_info`` command failing on new, uncommitted SVN directories. + + * Fix import problems with nested namespace packages installed via + ``--root`` or ``--single-version-externally-managed``, due to the + parent package not having the child package as an attribute. + +0.6c6 +----- + + * Added ``--egg-path`` option to ``develop`` command, allowing you to force + ``.egg-link`` files to use relative paths (allowing them to be shared across + platforms on a networked drive). + + * Fix not building binary RPMs correctly. + + * Fix "eggsecutables" (such as setuptools' own egg) only being runnable with + bash-compatible shells. + + * Fix ``#!`` parsing problems in Windows ``.exe`` script wrappers, when there + was whitespace inside a quoted argument or at the end of the ``#!`` line + (a regression introduced in 0.6c4). 
+ + * Fix ``test`` command possibly failing if an older version of the project + being tested was installed on ``sys.path`` ahead of the test source + directory. + + * Fix ``find_packages()`` treating ``ez_setup`` and directories with ``.`` in + their names as packages. + +0.6c5 +----- + + * Fix uploaded ``bdist_rpm`` packages being described as ``bdist_egg`` + packages under Python versions less than 2.5. + + * Fix uploaded ``bdist_wininst`` packages being described as suitable for + "any" version by Python 2.5, even if a ``--target-version`` was specified. + +0.6c4 +----- + + * Overhauled Windows script wrapping to support ``bdist_wininst`` better. + Scripts installed with ``bdist_wininst`` will always use ``#!python.exe`` or + ``#!pythonw.exe`` as the executable name (even when built on non-Windows + platforms!), and the wrappers will look for the executable in the script's + parent directory (which should find the right version of Python). + + * Fix ``upload`` command not uploading files built by ``bdist_rpm`` or + ``bdist_wininst`` under Python 2.3 and 2.4. + + * Add support for "eggsecutable" headers: a ``#!/bin/sh`` script that is + prepended to an ``.egg`` file to allow it to be run as a script on Unix-ish + platforms. (This is mainly so that setuptools itself can have a single-file + installer on Unix, without doing multiple downloads, dealing with firewalls, + etc.) + + * Fix problem with empty revision numbers in Subversion 1.4 ``entries`` files + + * Use cross-platform relative paths in ``easy-install.pth`` when doing + ``develop`` and the source directory is a subdirectory of the installation + target directory. + + * Fix a problem installing eggs with a system packaging tool if the project + contained an implicit namespace package; for example if the ``setup()`` + listed a namespace package ``foo.bar`` without explicitly listing ``foo`` + as a namespace package. + +0.6c3 +----- + + * Fixed breakages caused by Subversion 1.4's new "working copy" format + +0.6c2 +----- + + * The ``ez_setup`` module displays the conflicting version of setuptools (and + its installation location) when a script requests a version that's not + available. + + * Running ``setup.py develop`` on a setuptools-using project will now install + setuptools if needed, instead of only downloading the egg. + +0.6c1 +----- + + * Fixed ``AttributeError`` when trying to download a ``setup_requires`` + dependency when a distribution lacks a ``dependency_links`` setting. + + * Made ``zip-safe`` and ``not-zip-safe`` flag files contain a single byte, so + as to play better with packaging tools that complain about zero-length + files. + + * Made ``setup.py develop`` respect the ``--no-deps`` option, which it + previously was ignoring. + + * Support ``extra_path`` option to ``setup()`` when ``install`` is run in + backward-compatibility mode. + + * Source distributions now always include a ``setup.cfg`` file that explicitly + sets ``egg_info`` options such that they produce an identical version number + to the source distribution's version number. (Previously, the default + version number could be different due to the use of ``--tag-date``, or if + the version was overridden on the command line that built the source + distribution.) + +0.6b4 +----- + + * Fix ``register`` not obeying name/version set by ``egg_info`` command, if + ``egg_info`` wasn't explicitly run first on the same command line. 
+ + * Added ``--no-date`` and ``--no-svn-revision`` options to ``egg_info`` + command, to allow suppressing tags configured in ``setup.cfg``. + + * Fixed redundant warnings about missing ``README`` file(s); it should now + appear only if you are actually a source distribution. + +0.6b3 +----- + + * Fix ``bdist_egg`` not including files in subdirectories of ``.egg-info``. + + * Allow ``.py`` files found by the ``include_package_data`` option to be + automatically included. Remove duplicate data file matches if both + ``include_package_data`` and ``package_data`` are used to refer to the same + files. + +0.6b1 +----- + + * Strip ``module`` from the end of compiled extension modules when computing + the name of a ``.py`` loader/wrapper. (Python's import machinery ignores + this suffix when searching for an extension module.) + +0.6a11 +------ + + * Added ``test_loader`` keyword to support custom test loaders + + * Added ``setuptools.file_finders`` entry point group to allow implementing + revision control plugins. + + * Added ``--identity`` option to ``upload`` command. + + * Added ``dependency_links`` to allow specifying URLs for ``--find-links``. + + * Enhanced test loader to scan packages as well as modules, and call + ``additional_tests()`` if present to get non-unittest tests. + + * Support namespace packages in conjunction with system packagers, by omitting + the installation of any ``__init__.py`` files for namespace packages, and + adding a special ``.pth`` file to create a working package in + ``sys.modules``. + + * Made ``--single-version-externally-managed`` automatic when ``--root`` is + used, so that most system packagers won't require special support for + setuptools. + + * Fixed ``setup_requires``, ``tests_require``, etc. not using ``setup.cfg`` or + other configuration files for their option defaults when installing, and + also made the install use ``--multi-version`` mode so that the project + directory doesn't need to support .pth files. + + * ``MANIFEST.in`` is now forcibly closed when any errors occur while reading + it. Previously, the file could be left open and the actual error would be + masked by problems trying to remove the open file on Windows systems. + +0.6a10 +------ + + * Fixed the ``develop`` command ignoring ``--find-links``. + +0.6a9 +----- + + * The ``sdist`` command no longer uses the traditional ``MANIFEST`` file to + create source distributions. ``MANIFEST.in`` is still read and processed, + as are the standard defaults and pruning. But the manifest is built inside + the project's ``.egg-info`` directory as ``SOURCES.txt``, and it is rebuilt + every time the ``egg_info`` command is run. + + * Added the ``include_package_data`` keyword to ``setup()``, allowing you to + automatically include any package data listed in revision control or + ``MANIFEST.in`` + + * Added the ``exclude_package_data`` keyword to ``setup()``, allowing you to + trim back files included via the ``package_data`` and + ``include_package_data`` options. + + * Fixed ``--tag-svn-revision`` not working when run from a source + distribution. + + * Added warning for namespace packages with missing ``declare_namespace()`` + + * Added ``tests_require`` keyword to ``setup()``, so that e.g. packages + requiring ``nose`` to run unit tests can make this dependency optional + unless the ``test`` command is run. + + * Made all commands that use ``easy_install`` respect its configuration + options, as this was causing some problems with ``setup.py install``. 
+ + * Added an ``unpack_directory()`` driver to ``setuptools.archive_util``, so + that you can process a directory tree through a processing filter as if it + were a zipfile or tarfile. + + * Added an internal ``install_egg_info`` command to use as part of old-style + ``install`` operations, that installs an ``.egg-info`` directory with the + package. + + * Added a ``--single-version-externally-managed`` option to the ``install`` + command so that you can more easily wrap a "flat" egg in a system package. + + * Enhanced ``bdist_rpm`` so that it installs single-version eggs that + don't rely on a ``.pth`` file. The ``--no-egg`` option has been removed, + since all RPMs are now built in a more backwards-compatible format. + + * Support full roundtrip translation of eggs to and from ``bdist_wininst`` + format. Running ``bdist_wininst`` on a setuptools-based package wraps the + egg in an .exe that will safely install it as an egg (i.e., with metadata + and entry-point wrapper scripts), and ``easy_install`` can turn the .exe + back into an ``.egg`` file or directory and install it as such. + + +0.6a8 +----- + + * Fixed some problems building extensions when Pyrex was installed, especially + with Python 2.4 and/or packages using SWIG. + + * Made ``develop`` command accept all the same options as ``easy_install``, + and use the ``easy_install`` command's configuration settings as defaults. + + * Made ``egg_info --tag-svn-revision`` fall back to extracting the revision + number from ``PKG-INFO`` in case it is being run on a source distribution of + a snapshot taken from a Subversion-based project. + + * Automatically detect ``.dll``, ``.so`` and ``.dylib`` files that are being + installed as data, adding them to ``native_libs.txt`` automatically. + + * Fixed some problems with fresh checkouts of projects that don't include + ``.egg-info/PKG-INFO`` under revision control and put the project's source + code directly in the project directory. If such a package had any + requirements that get processed before the ``egg_info`` command can be run, + the setup scripts would fail with a "Missing 'Version:' header and/or + PKG-INFO file" error, because the egg runtime interpreted the unbuilt + metadata in a directory on ``sys.path`` (i.e. the current directory) as + being a corrupted egg. Setuptools now monkeypatches the distribution + metadata cache to pretend that the egg has valid version information, until + it has a chance to make it actually be so (via the ``egg_info`` command). + +0.6a5 +----- + + * Fixed missing gui/cli .exe files in distribution. Fixed bugs in tests. + +0.6a3 +----- + + * Added ``gui_scripts`` entry point group to allow installing GUI scripts + on Windows and other platforms. (The special handling is only for Windows; + other platforms are treated the same as for ``console_scripts``.) + +0.6a2 +----- + + * Added ``console_scripts`` entry point group to allow installing scripts + without the need to create separate script files. On Windows, console + scripts get an ``.exe`` wrapper so you can just type their name. On other + platforms, the scripts are written without a file extension. + +0.6a1 +----- + + * Added support for building "old-style" RPMs that don't install an egg for + the target package, using a ``--no-egg`` option. + + * The ``build_ext`` command now works better when using the ``--inplace`` + option and multiple Python versions. It now makes sure that all extensions + match the current Python version, even if newer copies were built for a + different Python version. 
+ + * The ``upload`` command no longer attaches an extra ``.zip`` when uploading + eggs, as PyPI now supports egg uploads without trickery. + + * The ``ez_setup`` script/module now displays a warning before downloading + the setuptools egg, and attempts to check the downloaded egg against an + internal MD5 checksum table. + + * Fixed the ``--tag-svn-revision`` option of ``egg_info`` not finding the + latest revision number; it was using the revision number of the directory + containing ``setup.py``, not the highest revision number in the project. + + * Added ``eager_resources`` setup argument + + * The ``sdist`` command now recognizes Subversion "deleted file" entries and + does not include them in source distributions. + + * ``setuptools`` now embeds itself more thoroughly into the distutils, so that + other distutils extensions (e.g. py2exe, py2app) will subclass setuptools' + versions of things, rather than the native distutils ones. + + * Added ``entry_points`` and ``setup_requires`` arguments to ``setup()``; + ``setup_requires`` allows you to automatically find and download packages + that are needed in order to *build* your project (as opposed to running it). + + * ``setuptools`` now finds its commands, ``setup()`` argument validators, and + metadata writers using entry points, so that they can be extended by + third-party packages. See `Creating distutils Extensions + `_ + for more details. + + * The vestigial ``depends`` command has been removed. It was never finished + or documented, and never would have worked without EasyInstall - which it + pre-dated and was never compatible with. + +0.5a12 +------ + + * The zip-safety scanner now checks for modules that might be used with + ``python -m``, and marks them as unsafe for zipping, since Python 2.4 can't + handle ``-m`` on zipped modules. + +0.5a11 +------ + + * Fix breakage of the "develop" command that was caused by the addition of + ``--always-unzip`` to the ``easy_install`` command. + +0.5a9 +----- + + * Include ``svn:externals`` directories in source distributions as well as + normal subversion-controlled files and directories. + + * Added ``exclude=patternlist`` option to ``setuptools.find_packages()`` + + * Changed --tag-svn-revision to include an "r" in front of the revision number + for better readability. + + * Added ability to build eggs without including source files (except for any + scripts, of course), using the ``--exclude-source-files`` option to + ``bdist_egg``. + + * ``setup.py install`` now automatically detects when an "unmanaged" package + or module is going to be on ``sys.path`` ahead of a package being installed, + thereby preventing the newer version from being imported. If this occurs, + a warning message is output to ``sys.stderr``, but installation proceeds + anyway. The warning message informs the user what files or directories + need deleting, and advises them they can also use EasyInstall (with the + ``--delete-conflicting`` option) to do it automatically. + + * The ``egg_info`` command now adds a ``top_level.txt`` file to the metadata + directory that lists all top-level modules and packages in the distribution. + This is used by the ``easy_install`` command to find possibly-conflicting + "unmanaged" packages when installing the distribution. + + * Added ``zip_safe`` and ``namespace_packages`` arguments to ``setup()``. + Added package analysis to determine zip-safety if the ``zip_safe`` flag + is not given, and advise the author regarding what code might need changing. 
+ + * Fixed the swapped ``-d`` and ``-b`` options of ``bdist_egg``. + +0.5a8 +----- + + * The "egg_info" command now always sets the distribution metadata to "safe" + forms of the distribution name and version, so that distribution files will + be generated with parseable names (i.e., ones that don't include '-' in the + name or version). Also, this means that if you use the various ``--tag`` + options of "egg_info", any distributions generated will use the tags in the + version, not just egg distributions. + + * Added support for defining command aliases in distutils configuration files, + under the "[aliases]" section. To prevent recursion and to allow aliases to + call the command of the same name, a given alias can be expanded only once + per command-line invocation. You can define new aliases with the "alias" + command, either for the local, global, or per-user configuration. + + * Added "rotate" command to delete old distribution files, given a set of + patterns to match and the number of files to keep. (Keeps the most + recently-modified distribution files matching each pattern.) + + * Added "saveopts" command that saves all command-line options for the current + invocation to the local, global, or per-user configuration file. Useful for + setting defaults without having to hand-edit a configuration file. + + * Added a "setopt" command that sets a single option in a specified distutils + configuration file. + +0.5a7 +----- + + * Added "upload" support for egg and source distributions, including a bug + fix for "upload" and a temporary workaround for lack of .egg support in + PyPI. + +0.5a6 +----- + + * Beefed up the "sdist" command so that if you don't have a MANIFEST.in, it + will include all files under revision control (CVS or Subversion) in the + current directory, and it will regenerate the list every time you create a + source distribution, not just when you tell it to. This should make the + default "do what you mean" more often than the distutils' default behavior + did, while still retaining the old behavior in the presence of MANIFEST.in. + + * Fixed the "develop" command always updating .pth files, even if you + specified ``-n`` or ``--dry-run``. + + * Slightly changed the format of the generated version when you use + ``--tag-build`` on the "egg_info" command, so that you can make tagged + revisions compare *lower* than the version specified in setup.py (e.g. by + using ``--tag-build=dev``). + +0.5a5 +----- + + * Added ``develop`` command to ``setuptools``-based packages. This command + installs an ``.egg-link`` pointing to the package's source directory, and + script wrappers that ``execfile()`` the source versions of the package's + scripts. This lets you put your development checkout(s) on sys.path without + having to actually install them. (To uninstall the link, use + use ``setup.py develop --uninstall``.) + + * Added ``egg_info`` command to ``setuptools``-based packages. This command + just creates or updates the "projectname.egg-info" directory, without + building an egg. (It's used by the ``bdist_egg``, ``test``, and ``develop`` + commands.) + + * Enhanced the ``test`` command so that it doesn't install the package, but + instead builds any C extensions in-place, updates the ``.egg-info`` + metadata, adds the source directory to ``sys.path``, and runs the tests + directly on the source. This avoids an "unmanaged" installation of the + package to ``site-packages`` or elsewhere. 
+ + * Made ``easy_install`` a standard ``setuptools`` command, moving it from + the ``easy_install`` module to ``setuptools.command.easy_install``. Note + that if you were importing or extending it, you must now change your imports + accordingly. ``easy_install.py`` is still installed as a script, but not as + a module. + +0.5a4 +----- + + * Setup scripts using setuptools can now list their dependencies directly in + the setup.py file, without having to manually create a ``depends.txt`` file. + The ``install_requires`` and ``extras_require`` arguments to ``setup()`` + are used to create a dependencies file automatically. If you are manually + creating ``depends.txt`` right now, please switch to using these setup + arguments as soon as practical, because ``depends.txt`` support will be + removed in the 0.6 release cycle. For documentation on the new arguments, + see the ``setuptools.dist.Distribution`` class. + + * Setup scripts using setuptools now always install using ``easy_install`` + internally, for ease of uninstallation and upgrading. + +0.5a1 +----- + + * Added support for "self-installation" bootstrapping. Packages can now + include ``ez_setup.py`` in their source distribution, and add the following + to their ``setup.py``, in order to automatically bootstrap installation of + setuptools as part of their setup process:: + + from ez_setup import use_setuptools + use_setuptools() + + from setuptools import setup + # etc... + +0.4a2 +----- + + * Added ``ez_setup.py`` installer/bootstrap script to make initial setuptools + installation easier, and to allow distributions using setuptools to avoid + having to include setuptools in their source distribution. + + * All downloads are now managed by the ``PackageIndex`` class (which is now + subclassable and replaceable), so that embedders can more easily override + download logic, give download progress reports, etc. The class has also + been moved to the new ``setuptools.package_index`` module. + + * The ``Installer`` class no longer handles downloading, manages a temporary + directory, or tracks the ``zip_ok`` option. Downloading is now handled + by ``PackageIndex``, and ``Installer`` has become an ``easy_install`` + command class based on ``setuptools.Command``. + + * There is a new ``setuptools.sandbox.run_setup()`` API to invoke a setup + script in a directory sandbox, and a new ``setuptools.archive_util`` module + with an ``unpack_archive()`` API. These were split out of EasyInstall to + allow reuse by other tools and applications. + + * ``setuptools.Command`` now supports reinitializing commands using keyword + arguments to set/reset options. Also, ``Command`` subclasses can now set + their ``command_consumes_arguments`` attribute to ``True`` in order to + receive an ``args`` option containing the rest of the command line. + +0.3a2 +----- + + * Added new options to ``bdist_egg`` to allow tagging the egg's version number + with a subversion revision number, the current date, or an explicit tag + value. Run ``setup.py bdist_egg --help`` to get more information. + + * Misc. bug fixes + +0.3a1 +----- + + * Initial release. + diff --git a/EasyInstall.txt b/EasyInstall.txt deleted file mode 100755 index 2bf69e9..0000000 --- a/EasyInstall.txt +++ /dev/null @@ -1,1741 +0,0 @@ -============ -Easy Install -============ - -Easy Install is a python module (``easy_install``) bundled with ``setuptools`` -that lets you automatically download, build, install, and manage Python -packages. - -Please share your experiences with us! 
If you encounter difficulty installing
-a package, please contact us via the `distutils mailing list
-`_. (Note: please DO NOT send
-private email directly to the author of setuptools; it will be discarded. The
-mailing list is a searchable archive of previously-asked and answered
-questions; you should begin your research there before reporting something as a
-bug -- and then do so via list discussion first.)
-
-(Also, if you'd like to learn about how you can use ``setuptools`` to make your
-own packages work better with EasyInstall, or provide EasyInstall-like features
-without requiring your users to use EasyInstall directly, you'll probably want
-to check out the full `setuptools`_ documentation as well.)
-
-.. contents:: **Table of Contents**
-
-
-Using "Easy Install"
-====================
-
-
-.. _installation instructions:
-
-Installing "Easy Install"
--------------------------
-
-Please see the `setuptools PyPI page `_
-for download links and basic installation instructions for each of the
-supported platforms.
-
-You will need at least Python 2.3.5, or if you are on a 64-bit platform, Python
-2.4. An ``easy_install`` script will be installed in the normal location for
-Python scripts on your platform.
-
-Note that the instructions on the setuptools PyPI page assume that you are
-installing to Python's primary ``site-packages`` directory. If this is
-not the case, you should consult the section below on `Custom Installation
-Locations`_ before installing. (And, on Windows, you should not use the
-``.exe`` installer when installing to an alternate location.)
-
-Note that ``easy_install`` normally works by downloading files from the
-internet. If you are behind an NTLM-based firewall that prevents Python
-programs from accessing the net directly, you may wish to first install and use
-the `APS proxy server `_, which lets you get past such
-firewalls in the same way that your web browser(s) do.
-
-(Alternately, if you do not wish easy_install to actually download anything, you
-can restrict it from doing so with the ``--allow-hosts`` option; see the
-sections on `restricting downloads with --allow-hosts`_ and `command-line
-options`_ for more details.)
-
-
-Troubleshooting
-~~~~~~~~~~~~~~~
-
-If EasyInstall/setuptools appears to install correctly, and you can run the
-``easy_install`` command but it fails with an ``ImportError``, the most likely
-cause is that you installed to a location other than ``site-packages``,
-without taking any of the steps described in the `Custom Installation
-Locations`_ section below. Please see that section and follow the steps to
-make sure that your custom location will work correctly. Then re-install.
-
-Similarly, if you can run ``easy_install``, and it appears to be installing
-packages, but then you can't import them, the most likely issue is that you
-installed EasyInstall correctly but are using it to install packages to a
-non-standard location that hasn't been properly prepared. Again, see the
-section on `Custom Installation Locations`_ for more details.
-
-
-Windows Notes
-~~~~~~~~~~~~~
-
-On Windows, an ``easy_install.exe`` launcher will also be installed, so that
-you can just type ``easy_install`` as long as it's on your ``PATH``. If typing
-``easy_install`` at the command prompt doesn't work, check to make sure your
-``PATH`` includes the appropriate ``C:\\Python2X\\Scripts`` directory.
On -most current versions of Windows, you can change the ``PATH`` by right-clicking -"My Computer", choosing "Properties" and selecting the "Advanced" tab, then -clicking the "Environment Variables" button. ``PATH`` will be in the "System -Variables" section, and you will need to exit and restart your command shell -(command.com, cmd.exe, bash, or other) for the change to take effect. Be sure -to add a ``;`` after the last item on ``PATH`` before adding the scripts -directory to it. - -Note that instead of changing your ``PATH`` to include the Python scripts -directory, you can also retarget the installation location for scripts so they -go on a directory that's already on the ``PATH``. For more information see the -sections below on `Command-Line Options`_ and `Configuration Files`_. You -can pass command line options (such as ``--script-dir``) to ``ez_setup.py`` to -control where ``easy_install.exe`` will be installed. - - - -Downloading and Installing a Package ------------------------------------- - -For basic use of ``easy_install``, you need only supply the filename or URL of -a source distribution or .egg file (`Python Egg`__). - -__ http://peak.telecommunity.com/DevCenter/PythonEggs - -**Example 1**. Install a package by name, searching PyPI for the latest -version, and automatically downloading, building, and installing it:: - - easy_install SQLObject - -**Example 2**. Install or upgrade a package by name and version by finding -links on a given "download page":: - - easy_install -f http://pythonpaste.org/package_index.html SQLObject - -**Example 3**. Download a source distribution from a specified URL, -automatically building and installing it:: - - easy_install http://example.com/path/to/MyPackage-1.2.3.tgz - -**Example 4**. Install an already-downloaded .egg file:: - - easy_install /my_downloads/OtherPackage-3.2.1-py2.3.egg - -**Example 5**. Upgrade an already-installed package to the latest version -listed on PyPI:: - - easy_install --upgrade PyProtocols - -**Example 6**. Install a source distribution that's already downloaded and -extracted in the current directory (New in 0.5a9):: - - easy_install . - -**Example 7**. (New in 0.6a1) Find a source distribution or Subversion -checkout URL for a package, and extract it or check it out to -``~/projects/sqlobject`` (the name will always be in all-lowercase), where it -can be examined or edited. (The package will not be installed, but it can -easily be installed with ``easy_install ~/projects/sqlobject``. See `Editing -and Viewing Source Packages`_ below for more info.):: - - easy_install --editable --build-directory ~/projects SQLObject - -Easy Install accepts URLs, filenames, PyPI package names (i.e., ``distutils`` -"distribution" names), and package+version specifiers. In each case, it will -attempt to locate the latest available version that meets your criteria. - -When downloading or processing downloaded files, Easy Install recognizes -distutils source distribution files with extensions of .tgz, .tar, .tar.gz, -.tar.bz2, or .zip. And of course it handles already-built .egg -distributions as well as ``.win32.exe`` installers built using distutils. - -By default, packages are installed to the running Python installation's -``site-packages`` directory, unless you provide the ``-d`` or ``--install-dir`` -option to specify an alternative directory, or specify an alternate location -using distutils configuration files. (See `Configuration Files`_, below.) 
- -By default, any scripts included with the package are installed to the running -Python installation's standard script installation location. However, if you -specify an installation directory via the command line or a config file, then -the default directory for installing scripts will be the same as the package -installation directory, to ensure that the script will have access to the -installed package. You can override this using the ``-s`` or ``--script-dir`` -option. - -Installed packages are added to an ``easy-install.pth`` file in the install -directory, so that Python will always use the most-recently-installed version -of the package. If you would like to be able to select which version to use at -runtime, you should use the ``-m`` or ``--multi-version`` option. - - -Upgrading a Package -------------------- - -You don't need to do anything special to upgrade a package: just install the -new version, either by requesting a specific version, e.g.:: - - easy_install "SomePackage==2.0" - -a version greater than the one you have now:: - - easy_install "SomePackage>2.0" - -using the upgrade flag, to find the latest available version on PyPI:: - - easy_install --upgrade SomePackage - -or by using a download page, direct download URL, or package filename:: - - easy_install -f http://example.com/downloads ExamplePackage - - easy_install http://example.com/downloads/ExamplePackage-2.0-py2.4.egg - - easy_install my_downloads/ExamplePackage-2.0.tgz - -If you're using ``-m`` or ``--multi-version`` , using the ``require()`` -function at runtime automatically selects the newest installed version of a -package that meets your version criteria. So, installing a newer version is -the only step needed to upgrade such packages. - -If you're installing to a directory on PYTHONPATH, or a configured "site" -directory (and not using ``-m``), installing a package automatically replaces -any previous version in the ``easy-install.pth`` file, so that Python will -import the most-recently installed version by default. So, again, installing -the newer version is the only upgrade step needed. - -If you haven't suppressed script installation (using ``--exclude-scripts`` or -``-x``), then the upgraded version's scripts will be installed, and they will -be automatically patched to ``require()`` the corresponding version of the -package, so that you can use them even if they are installed in multi-version -mode. - -``easy_install`` never actually deletes packages (unless you're installing a -package with the same name and version number as an existing package), so if -you want to get rid of older versions of a package, please see `Uninstalling -Packages`_, below. - - -Changing the Active Version ---------------------------- - -If you've upgraded a package, but need to revert to a previously-installed -version, you can do so like this:: - - easy_install PackageName==1.2.3 - -Where ``1.2.3`` is replaced by the exact version number you wish to switch to. -If a package matching the requested name and version is not already installed -in a directory on ``sys.path``, it will be located via PyPI and installed. - -If you'd like to switch to the latest installed version of ``PackageName``, you -can do so like this:: - - easy_install PackageName - -This will activate the latest installed version. 
(Note: if you have set any -``find_links`` via distutils configuration files, those download pages will be -checked for the latest available version of the package, and it will be -downloaded and installed if it is newer than your current version.) - -Note that changing the active version of a package will install the newly -active version's scripts, unless the ``--exclude-scripts`` or ``-x`` option is -specified. - - -Uninstalling Packages ---------------------- - -If you have replaced a package with another version, then you can just delete -the package(s) you don't need by deleting the PackageName-versioninfo.egg file -or directory (found in the installation directory). - -If you want to delete the currently installed version of a package (or all -versions of a package), you should first run:: - - easy_install -mxN PackageName - -This will ensure that Python doesn't continue to search for a package you're -planning to remove. After you've done this, you can safely delete the .egg -files or directories, along with any scripts you wish to remove. - - -Managing Scripts ----------------- - -Whenever you install, upgrade, or change versions of a package, EasyInstall -automatically installs the scripts for the selected package version, unless -you tell it not to with ``-x`` or ``--exclude-scripts``. If any scripts in -the script directory have the same name, they are overwritten. - -Thus, you do not normally need to manually delete scripts for older versions of -a package, unless the newer version of the package does not include a script -of the same name. However, if you are completely uninstalling a package, you -may wish to manually delete its scripts. - -EasyInstall's default behavior means that you can normally only run scripts -from one version of a package at a time. If you want to keep multiple versions -of a script available, however, you can simply use the ``--multi-version`` or -``-m`` option, and rename the scripts that EasyInstall creates. This works -because EasyInstall installs scripts as short code stubs that ``require()`` the -matching version of the package the script came from, so renaming the script -has no effect on what it executes. - -For example, suppose you want to use two versions of the ``rst2html`` tool -provided by the `docutils `_ package. You might -first install one version:: - - easy_install -m docutils==0.3.9 - -then rename the ``rst2html.py`` to ``r2h_039``, and install another version:: - - easy_install -m docutils==0.3.10 - -This will create another ``rst2html.py`` script, this one using docutils -version 0.3.10 instead of 0.3.9. You now have two scripts, each using a -different version of the package. (Notice that we used ``-m`` for both -installations, so that Python won't lock us out of using anything but the most -recently-installed version of the package.) - - - -Tips & Techniques ------------------ - - -Multiple Python Versions -~~~~~~~~~~~~~~~~~~~~~~~~ - -As of version 0.6a11, EasyInstall installs itself under two names: -``easy_install`` and ``easy_install-N.N``, where ``N.N`` is the Python version -used to install it. Thus, if you install EasyInstall for both Python 2.3 and -2.4, you can use the ``easy_install-2.3`` or ``easy_install-2.4`` scripts to -install packages for Python 2.3 or 2.4, respectively. - -Also, if you're working with Python version 2.4 or higher, you can run Python -with ``-m easy_install`` to run that particular Python version's -``easy_install`` command. 
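-
-For packages installed in multi-version mode (``-m``), nothing is added to
-``easy-install.pth``, so a program selects the version it wants at runtime via
-``pkg_resources.require()``. A minimal sketch, where ``SomePackage`` stands in
-for any distribution installed this way::
-
-    import pkg_resources
-
-    # Activate the newest installed SomePackage satisfying the specifier,
-    # adding it to sys.path before the real import happens.
-    pkg_resources.require("SomePackage>=2.0")
-
-    import SomePackage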
- - -Restricting Downloads with ``--allow-hosts`` -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -You can use the ``--allow-hosts`` (``-H``) option to restrict what domains -EasyInstall will look for links and downloads on. ``--allow-hosts=None`` -prevents downloading altogether. You can also use wildcards, for example -to restrict downloading to hosts in your own intranet. See the section below -on `Command-Line Options`_ for more details on the ``--allow-hosts`` option. - -By default, there are no host restrictions in effect, but you can change this -default by editing the appropriate `configuration files`_ and adding:: - - [easy_install] - allow_hosts = *.myintranet.example.com,*.python.org - -The above example would then allow downloads only from hosts in the -``python.org`` and ``myintranet.example.com`` domains, unless overridden on the -command line. - - -Installing on Un-networked Machines -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Just copy the eggs or source packages you need to a directory on the target -machine, then use the ``-f`` or ``--find-links`` option to specify that -directory's location. For example:: - - easy_install -H None -f somedir SomePackage - -will attempt to install SomePackage using only eggs and source packages found -in ``somedir`` and disallowing all remote access. You should of course make -sure you have all of SomePackage's dependencies available in somedir. - -If you have another machine of the same operating system and library versions -(or if the packages aren't platform-specific), you can create the directory of -eggs using a command like this:: - - easy_install -zmaxd somedir SomePackage - -This will tell EasyInstall to put zipped eggs or source packages for -SomePackage and all its dependencies into ``somedir``, without creating any -scripts or .pth files. You can then copy the contents of ``somedir`` to the -target machine. (``-z`` means zipped eggs, ``-m`` means multi-version, which -prevents .pth files from being used, ``-a`` means to copy all the eggs needed, -even if they're installed elsewhere on the machine, and ``-d`` indicates the -directory to place the eggs in.) - -You can also build the eggs from local development packages that were installed -with the ``setup.py develop`` command, by including the ``-l`` option, e.g.:: - - easy_install -zmaxld somedir SomePackage - -This will use locally-available source distributions to build the eggs. - - -Packaging Others' Projects As Eggs -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Need to distribute a package that isn't published in egg form? You can use -EasyInstall to build eggs for a project. You'll want to use the ``--zip-ok``, -``--exclude-scripts``, and possibly ``--no-deps`` options (``-z``, ``-x`` and -``-N``, respectively). Use ``-d`` or ``--install-dir`` to specify the location -where you'd like the eggs placed. By placing them in a directory that is -published to the web, you can then make the eggs available for download, either -in an intranet or to the internet at large. - -If someone distributes a package in the form of a single ``.py`` file, you can -wrap it in an egg by tacking an ``#egg=name-version`` suffix on the file's URL. -So, something like this:: - - easy_install -f "http://some.example.com/downloads/foo.py#egg=foo-1.0" foo - -will install the package as an egg, and this:: - - easy_install -zmaxd. \ - -f "http://some.example.com/downloads/foo.py#egg=foo-1.0" foo - -will create a ``.egg`` file in the current directory. 
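
If you want to make a directory of eggs like this available to other machines
on an intranet, one minimal approach (a sketch; the host name and port below
are just placeholders) is to serve it with Python's built-in web server and
point EasyInstall at the resulting URL::

    cd somedir
    python -m SimpleHTTPServer 8000

    # then, on the machines that should install from it:
    easy_install -f http://buildhost.example.com:8000/ SomePackage

Any web server that produces a plain directory listing of the eggs works just
as well, as described in the next section.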
- - -Creating your own Package Index -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -In addition to local directories and the Python Package Index, EasyInstall can -find download links on most any web page whose URL is given to the ``-f`` -(``--find-links``) option. In the simplest case, you can simply have a web -page with links to eggs or Python source packages, even an automatically -generated directory listing (such as the Apache web server provides). - -If you are setting up an intranet site for package downloads, you may want to -configure the target machines to use your download site by default, adding -something like this to their `configuration files`_:: - - [easy_install] - find_links = http://mypackages.example.com/somedir/ - http://turbogears.org/download/ - http://peak.telecommunity.com/dist/ - -As you can see, you can list multiple URLs separated by whitespace, continuing -on multiple lines if necessary (as long as the subsequent lines are indented. - -If you are more ambitious, you can also create an entirely custom package index -or PyPI mirror. See the ``--index-url`` option under `Command-Line Options`_, -below, and also the section on the `Package Index "API"`_. - - -Password-Protected Sites -~~~~~~~~~~~~~~~~~~~~~~~~ - -If a site you want to download from is password-protected using HTTP "Basic" -authentication, you can specify your credentials in the URL, like so:: - - http://some_userid:some_password@some.example.com/some_path/ - -You can do this with both index page URLs and direct download URLs. As long -as any HTML pages read by easy_install use *relative* links to point to the -downloads, the same user ID and password will be used to do the downloading. - - -Controlling Build Options -~~~~~~~~~~~~~~~~~~~~~~~~~ - -EasyInstall respects standard distutils `Configuration Files`_, so you can use -them to configure build options for packages that it installs from source. For -example, if you are on Windows using the MinGW compiler, you can configure the -default compiler by putting something like this:: - - [build] - compiler = mingw32 - -into the appropriate distutils configuration file. In fact, since this is just -normal distutils configuration, it will affect any builds using that config -file, not just ones done by EasyInstall. For example, if you add those lines -to ``distutils.cfg`` in the ``distutils`` package directory, it will be the -default compiler for *all* packages you build. See `Configuration Files`_ -below for a list of the standard configuration file locations, and links to -more documentation on using distutils configuration files. - - -Editing and Viewing Source Packages -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Sometimes a package's source distribution contains additional documentation, -examples, configuration files, etc., that are not part of its actual code. If -you want to be able to examine these files, you can use the ``--editable`` -option to EasyInstall, and EasyInstall will look for a source distribution -or Subversion URL for the package, then download and extract it or check it out -as a subdirectory of the ``--build-directory`` you specify. If you then wish -to install the package after editing or configuring it, you can do so by -rerunning EasyInstall with that directory as the target. - -Note that using ``--editable`` stops EasyInstall from actually building or -installing the package; it just finds, obtains, and possibly unpacks it for -you. 
This allows you to make changes to the package if necessary, and to -either install it in development mode using ``setup.py develop`` (if the -package uses setuptools, that is), or by running ``easy_install projectdir`` -(where ``projectdir`` is the subdirectory EasyInstall created for the -downloaded package. - -In order to use ``--editable`` (``-e`` for short), you *must* also supply a -``--build-directory`` (``-b`` for short). The project will be placed in a -subdirectory of the build directory. The subdirectory will have the same -name as the project itself, but in all-lowercase. If a file or directory of -that name already exists, EasyInstall will print an error message and exit. - -Also, when using ``--editable``, you cannot use URLs or filenames as arguments. -You *must* specify project names (and optional version requirements) so that -EasyInstall knows what directory name(s) to create. If you need to force -EasyInstall to use a particular URL or filename, you should specify it as a -``--find-links`` item (``-f`` for short), and then also specify -the project name, e.g.:: - - easy_install -eb ~/projects \ - -fhttp://prdownloads.sourceforge.net/ctypes/ctypes-0.9.6.tar.gz?download \ - ctypes==0.9.6 - - -Dealing with Installation Conflicts -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -(NOTE: As of 0.6a11, this section is obsolete; it is retained here only so that -people using older versions of EasyInstall can consult it. As of version -0.6a11, installation conflicts are handled automatically without deleting the -old or system-installed packages, and without ignoring the issue. Instead, -eggs are automatically shifted to the front of ``sys.path`` using special -code added to the ``easy-install.pth`` file. So, if you are using version -0.6a11 or better of setuptools, you do not need to worry about conflicts, -and the following issues do not apply to you.) - -EasyInstall installs distributions in a "managed" way, such that each -distribution can be independently activated or deactivated on ``sys.path``. -However, packages that were not installed by EasyInstall are "unmanaged", -in that they usually live all in one directory and cannot be independently -activated or deactivated. - -As a result, if you are using EasyInstall to upgrade an existing package, or -to install a package with the same name as an existing package, EasyInstall -will warn you of the conflict. (This is an improvement over ``setup.py -install``, becuase the ``distutils`` just install new packages on top of old -ones, possibly combining two unrelated packages or leaving behind modules that -have been deleted in the newer version of the package.) - -By default, EasyInstall will stop the installation if it detects a conflict -between an existing, "unmanaged" package, and a module or package in any of -the distributions you're installing. It will display a list of all of the -existing files and directories that would need to be deleted for the new -package to be able to function correctly. You can then either delete these -conflicting files and directories yourself and re-run EasyInstall, or you can -just use the ``--delete-conflicting`` or ``--ignore-conflicts-at-my-risk`` -options, as described under `Command-Line Options`_, below. - -Of course, once you've replaced all of your existing "unmanaged" packages with -versions managed by EasyInstall, you won't have any more conflicts to worry -about! - - -Compressed Installation -~~~~~~~~~~~~~~~~~~~~~~~ - -EasyInstall tries to install packages in zipped form, if it can. 
Zipping -packages can improve Python's overall import performance if you're not using -the ``--multi-version`` option, because Python processes zipfile entries on -``sys.path`` much faster than it does directories. - -As of version 0.5a9, EasyInstall analyzes packages to determine whether they -can be safely installed as a zipfile, and then acts on its analysis. (Previous -versions would not install a package as a zipfile unless you used the -``--zip-ok`` option.) - -The current analysis approach is fairly conservative; it currenly looks for: - - * Any use of the ``__file__`` or ``__path__`` variables (which should be - replaced with ``pkg_resources`` API calls) - - * Possible use of ``inspect`` functions that expect to manipulate source files - (e.g. ``inspect.getsource()``) - - * Top-level modules that might be scripts used with ``python -m`` (Python 2.4) - -If any of the above are found in the package being installed, EasyInstall will -assume that the package cannot be safely run from a zipfile, and unzip it to -a directory instead. You can override this analysis with the ``-zip-ok`` flag, -which will tell EasyInstall to install the package as a zipfile anyway. Or, -you can use the ``--always-unzip`` flag, in which case EasyInstall will always -unzip, even if its analysis says the package is safe to run as a zipfile. - -Normally, however, it is simplest to let EasyInstall handle the determination -of whether to zip or unzip, and only specify overrides when needed to work -around a problem. If you find you need to override EasyInstall's guesses, you -may want to contact the package author and the EasyInstall maintainers, so that -they can make appropriate changes in future versions. - -(Note: If a package uses ``setuptools`` in its setup script, the package author -has the option to declare the package safe or unsafe for zipped usage via the -``zip_safe`` argument to ``setup()``. If the package author makes such a -declaration, EasyInstall believes the package's author and does not perform its -own analysis. However, your command-line option, if any, will still override -the package author's choice.) - - -Reference Manual -================ - -Configuration Files -------------------- - -(New in 0.4a2) - -You may specify default options for EasyInstall using the standard -distutils configuration files, under the command heading ``easy_install``. -EasyInstall will look first for a ``setup.cfg`` file in the current directory, -then a ``~/.pydistutils.cfg`` or ``$HOME\\pydistutils.cfg`` (on Unix-like OSes -and Windows, respectively), and finally a ``distutils.cfg`` file in the -``distutils`` package directory. Here's a simple example:: - - [easy_install] - - # set the default location to install packages - install_dir = /home/me/lib/python - - # Notice that indentation can be used to continue an option - # value; this is especially useful for the "--find-links" - # option, which tells easy_install to use download links on - # these pages before consulting PyPI: - # - find_links = http://sqlobject.org/ - http://peak.telecommunity.com/dist/ - -In addition to accepting configuration for its own options under -``[easy_install]``, EasyInstall also respects defaults specified for other -distutils commands. For example, if you don't set an ``install_dir`` for -``[easy_install]``, but *have* set an ``install_lib`` for the ``[install]`` -command, this will become EasyInstall's default installation directory. 
Thus, -if you are already using distutils configuration files to set default install -locations, build options, etc., EasyInstall will respect your existing settings -until and unless you override them explicitly in an ``[easy_install]`` section. - -For more information, see also the current Python documentation on the `use and -location of distutils configuration files `_. - - -Command-Line Options --------------------- - -``--zip-ok, -z`` - Install all packages as zip files, even if they are marked as unsafe for - running as a zipfile. This can be useful when EasyInstall's analysis - of a non-setuptools package is too conservative, but keep in mind that - the package may not work correctly. (Changed in 0.5a9; previously this - option was required in order for zipped installation to happen at all.) - -``--always-unzip, -Z`` - Don't install any packages as zip files, even if the packages are marked - as safe for running as a zipfile. This can be useful if a package does - something unsafe, but not in a way that EasyInstall can easily detect. - EasyInstall's default analysis is currently very conservative, however, so - you should only use this option if you've had problems with a particular - package, and *after* reporting the problem to the package's maintainer and - to the EasyInstall maintainers. - - (Note: the ``-z/-Z`` options only affect the installation of newly-built - or downloaded packages that are not already installed in the target - directory; if you want to convert an existing installed version from - zipped to unzipped or vice versa, you'll need to delete the existing - version first, and re-run EasyInstall.) - -``--multi-version, -m`` - "Multi-version" mode. Specifying this option prevents ``easy_install`` from - adding an ``easy-install.pth`` entry for the package being installed, and - if an entry for any version the package already exists, it will be removed - upon successful installation. In multi-version mode, no specific version of - the package is available for importing, unless you use - ``pkg_resources.require()`` to put it on ``sys.path``. This can be as - simple as:: - - from pkg_resources import require - require("SomePackage", "OtherPackage", "MyPackage") - - which will put the latest installed version of the specified packages on - ``sys.path`` for you. (For more advanced uses, like selecting specific - versions and enabling optional dependencies, see the ``pkg_resources`` API - doc.) - - Changed in 0.6a10: this option is no longer silently enabled when - installing to a non-PYTHONPATH, non-"site" directory. You must always - explicitly use this option if you want it to be active. - -``--upgrade, -U`` (New in 0.5a4) - By default, EasyInstall only searches online if a project/version - requirement can't be met by distributions already installed - on sys.path or the installation directory. However, if you supply the - ``--upgrade`` or ``-U`` flag, EasyInstall will always check the package - index and ``--find-links`` URLs before selecting a version to install. In - this way, you can force EasyInstall to use the latest available version of - any package it installs (subject to any version requirements that might - exclude such later versions). - -``--install-dir=DIR, -d DIR`` - Set the installation directory. It is up to you to ensure that this - directory is on ``sys.path`` at runtime, and to use - ``pkg_resources.require()`` to enable the installed package(s) that you - need. 
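
    For example, a program that relies on packages installed to a custom
    directory (the path below is purely illustrative) might enable them at
    startup like this::

        import os, site
        from pkg_resources import require

        # process easy-install.pth (and any other .pth files) found in the
        # custom installation directory
        site.addsitedir(os.path.expanduser('~/lib/python'))

        # activate the needed package(s) on sys.path
        require("SomePackage>=1.0")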
- - (New in 0.4a2) If this option is not directly specified on the command line - or in a distutils configuration file, the distutils default installation - location is used. Normally, this would be the ``site-packages`` directory, - but if you are using distutils configuration files, setting things like - ``prefix`` or ``install_lib``, then those settings are taken into - account when computing the default installation directory, as is the - ``--prefix`` option. - -``--script-dir=DIR, -s DIR`` - Set the script installation directory. If you don't supply this option - (via the command line or a configuration file), but you *have* supplied - an ``--install-dir`` (via command line or config file), then this option - defaults to the same directory, so that the scripts will be able to find - their associated package installation. Otherwise, this setting defaults - to the location where the distutils would normally install scripts, taking - any distutils configuration file settings into account. - -``--exclude-scripts, -x`` - Don't install scripts. This is useful if you need to install multiple - versions of a package, but do not want to reset the version that will be - run by scripts that are already installed. - -``--always-copy, -a`` (New in 0.5a4) - Copy all needed distributions to the installation directory, even if they - are already present in a directory on sys.path. In older versions of - EasyInstall, this was the default behavior, but now you must explicitly - request it. By default, EasyInstall will no longer copy such distributions - from other sys.path directories to the installation directory, unless you - explicitly gave the distribution's filename on the command line. - - Note that as of 0.6a10, using this option excludes "system" and - "development" eggs from consideration because they can't be reliably - copied. This may cause EasyInstall to choose an older version of a package - than what you expected, or it may cause downloading and installation of a - fresh copy of something that's already installed. You will see warning - messages for any eggs that EasyInstall skips, before it falls back to an - older version or attempts to download a fresh copy. - -``--find-links=URLS_OR_FILENAMES, -f URLS_OR_FILENAMES`` - Scan the specified "download pages" or directories for direct links to eggs - or other distributions. Any existing file or directory names or direct - download URLs are immediately added to EasyInstall's search cache, and any - indirect URLs (ones that don't point to eggs or other recognized archive - formats) are added to a list of additional places to search for download - links. As soon as EasyInstall has to go online to find a package (either - because it doesn't exist locally, or because ``--upgrade`` or ``-U`` was - used), the specified URLs will be downloaded and scanned for additional - direct links. - - Eggs and archives found by way of ``--find-links`` are only downloaded if - they are needed to meet a requirement specified on the command line; links - to unneeded packages are ignored. - - If all requested packages can be found using links on the specified - download pages, the Python Package Index will not be consulted unless you - also specified the ``--upgrade`` or ``-U`` option. - - (Note: if you want to refer to a local HTML file containing links, you must - use a ``file:`` URL, as filenames that do not refer to a directory, egg, or - archive are ignored.) 
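
    For instance, to combine a directory of downloaded eggs with a local HTML
    page of additional links (both paths here are hypothetical)::

        easy_install -f "/var/eggs file:///var/eggs/extra-links.html" SomePackage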
- - You may specify multiple URLs or file/directory names with this option, - separated by whitespace. Note that on the command line, you will probably - have to surround the URL list with quotes, so that it is recognized as a - single option value. You can also specify URLs in a configuration file; - see `Configuration Files`_, above. - - Changed in 0.6a10: previously all URLs and directories passed to this - option were scanned as early as possible, but from 0.6a10 on, only - directories and direct archive links are scanned immediately; URLs are not - retrieved unless a package search was already going to go online due to a - package not being available locally, or due to the use of the ``--update`` - or ``-U`` option. - -``--delete-conflicting, -D`` (Removed in 0.6a11) - (As of 0.6a11, this option is no longer necessary; please do not use it!) - - If you are replacing a package that was previously installed *without* - using EasyInstall, the old version may end up on ``sys.path`` before the - version being installed with EasyInstall. EasyInstall will normally abort - the installation of a package if it detects such a conflict, and ask you to - manually remove the conflicting files or directories. If you specify this - option, however, EasyInstall will attempt to delete the files or - directories itself, and then proceed with the installation. - -``--ignore-conflicts-at-my-risk`` (Removed in 0.6a11) - (As of 0.6a11, this option is no longer necessary; please do not use it!) - - Ignore conflicting packages and proceed with installation anyway, even - though it means the package probably won't work properly. If the - conflicting package is in a directory you can't write to, this may be your - only option, but you will need to take more invasive measures to get the - installed package to work, like manually adding it to ``PYTHONPATH`` or to - ``sys.path`` at runtime. - -``--index-url=URL, -i URL`` (New in 0.4a1; default changed in 0.6c7) - Specifies the base URL of the Python Package Index. The default is - http://pypi.python.org/simple if not specified. When a package is requested - that is not locally available or linked from a ``--find-links`` download - page, the package index will be searched for download pages for the needed - package, and those download pages will be searched for links to download - an egg or source distribution. - -``--editable, -e`` (New in 0.6a1) - Only find and download source distributions for the specified projects, - unpacking them to subdirectories of the specified ``--build-directory``. - EasyInstall will not actually build or install the requested projects or - their dependencies; it will just find and extract them for you. See - `Editing and Viewing Source Packages`_ above for more details. - -``--build-directory=DIR, -b DIR`` (UPDATED in 0.6a1) - Set the directory used to build source packages. If a package is built - from a source distribution or checkout, it will be extracted to a - subdirectory of the specified directory. The subdirectory will have the - same name as the extracted distribution's project, but in all-lowercase. - If a file or directory of that name already exists in the given directory, - a warning will be printed to the console, and the build will take place in - a temporary directory instead. - - This option is most useful in combination with the ``--editable`` option, - which forces EasyInstall to *only* find and extract (but not build and - install) source distributions. 
See `Editing and Viewing Source Packages`_, - above, for more information. - -``--verbose, -v, --quiet, -q`` (New in 0.4a4) - Control the level of detail of EasyInstall's progress messages. The - default detail level is "info", which prints information only about - relatively time-consuming operations like running a setup script, unpacking - an archive, or retrieving a URL. Using ``-q`` or ``--quiet`` drops the - detail level to "warn", which will only display installation reports, - warnings, and errors. Using ``-v`` or ``--verbose`` increases the detail - level to include individual file-level operations, link analysis messages, - and distutils messages from any setup scripts that get run. If you include - the ``-v`` option more than once, the second and subsequent uses are passed - down to any setup scripts, increasing the verbosity of their reporting as - well. - -``--dry-run, -n`` (New in 0.4a4) - Don't actually install the package or scripts. This option is passed down - to any setup scripts run, so packages should not actually build either. - This does *not* skip downloading, nor does it skip extracting source - distributions to a temporary/build directory. - -``--optimize=LEVEL``, ``-O LEVEL`` (New in 0.4a4) - If you are installing from a source distribution, and are *not* using the - ``--zip-ok`` option, this option controls the optimization level for - compiling installed ``.py`` files to ``.pyo`` files. It does not affect - the compilation of modules contained in ``.egg`` files, only those in - ``.egg`` directories. The optimization level can be set to 0, 1, or 2; - the default is 0 (unless it's set under ``install`` or ``install_lib`` in - one of your distutils configuration files). - -``--record=FILENAME`` (New in 0.5a4) - Write a record of all installed files to FILENAME. This is basically the - same as the same option for the standard distutils "install" command, and - is included for compatibility with tools that expect to pass this option - to "setup.py install". - -``--site-dirs=DIRLIST, -S DIRLIST`` (New in 0.6a1) - Specify one or more custom "site" directories (separated by commas). - "Site" directories are directories where ``.pth`` files are processed, such - as the main Python ``site-packages`` directory. As of 0.6a10, EasyInstall - automatically detects whether a given directory processes ``.pth`` files - (or can be made to do so), so you should not normally need to use this - option. It is is now only necessary if you want to override EasyInstall's - judgment and force an installation directory to be treated as if it - supported ``.pth`` files. - - (If you want to *make* a non-``PYTHONPATH`` directory support ``.pth`` - files, please see the `Administrator Installation`_ section below.) - -``--no-deps, -N`` (New in 0.6a6) - Don't install any dependencies. This is intended as a convenience for - tools that wrap eggs in a platform-specific packaging system. (We don't - recommend that you use it for anything else.) - -``--allow-hosts=PATTERNS, -H PATTERNS`` (New in 0.6a6) - Restrict downloading and spidering to hosts matching the specified glob - patterns. E.g. ``-H *.python.org`` restricts web access so that only - packages listed and downloadable from machines in the ``python.org`` - domain. The glob patterns must match the *entire* user/host/port section of - the target URL(s). For example, ``*.python.org`` will NOT accept a URL - like ``http://python.org/foo`` or ``http://www.python.org:8080/``. - Multiple patterns can be specified by separting them with commas. 
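
    For example, to allow downloads only from two trusted domains (the names
    are just placeholders)::

        easy_install -H "*.python.org,*.myintranet.example.com" SomePackage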
The - default pattern is ``*``, which matches anything. - - In general, this option is mainly useful for blocking EasyInstall's web - access altogether (e.g. ``-Hlocalhost``), or to restrict it to an intranet - or other trusted site. EasyInstall will do the best it can to satisfy - dependencies given your host restrictions, but of course can fail if it - can't find suitable packages. EasyInstall displays all blocked URLs, so - that you can adjust your ``--allow-hosts`` setting if it is more strict - than you intended. Some sites may wish to define a restrictive default - setting for this option in their `configuration files`_, and then manually - override the setting on the command line as needed. - -``--prefix=DIR`` (New in 0.6a10) - Use the specified directory as a base for computing the default - installation and script directories. On Windows, the resulting default - directories will be ``prefix\\Lib\\site-packages`` and ``prefix\\Scripts``, - while on other platforms the defaults will be - ``prefix/lib/python2.X/site-packages`` (with the appropriate version - substituted) for libraries and ``prefix/bin`` for scripts. - - Note that the ``--prefix`` option only sets the *default* installation and - script directories, and does not override the ones set on the command line - or in a configuration file. - -``--local-snapshots-ok, -l`` (New in 0.6c6) - Normally, EasyInstall prefers to only install *released* versions of - projects, not in-development ones, because such projects may not - have a currently-valid version number. So, it usually only installs them - when their ``setup.py`` directory is explicitly passed on the command line. - - However, if this option is used, then any in-development projects that were - installed using the ``setup.py develop`` command, will be used to build - eggs, effectively upgrading the "in-development" project to a snapshot - release. Normally, this option is used only in conjunction with the - ``--always-copy`` option to create a distributable snapshot of every egg - needed to run an application. - - Note that if you use this option, you must make sure that there is a valid - version number (such as an SVN revision number tag) for any in-development - projects that may be used, as otherwise EasyInstall may not be able to tell - what version of the project is "newer" when future installations or - upgrades are attempted. - - -.. _non-root installation: - -Custom Installation Locations ------------------------------ - -EasyInstall manages what packages are active using Python ``.pth`` files, which -are normally only usable in Python's main ``site-packages`` directory. On some -platforms (such as Mac OS X), there are additional ``site-packages`` -directories that you can use besides the main one, but usually there is only -one directory on the system where you can install packages without extra steps. - -There are many reasons, however, why you might want to install packages -somewhere other than the ``site-packages`` directory. For example, you might -not have write access to that directory. You may be working with unstable -versions of packages that you don't want to install system-wide. And so on. - -The following sections describe various approaches to custom installation; feel -free to choose which one best suits your system and needs. - -`Administrator Installation`_ - This approach is for when you have write access to ``site-packages`` (or - another directory where ``.pth`` files are processed), but don't want to - install packages there. 
This can also be used by a system administrator - to enable each user having their own private directories that EasyInstall - will use to install packages. - -`Mac OS X "User" Installation`_ - This approach produces a result similar to an administrator installation - that gives each user their own private package directory, but on Mac OS X - the hard part has already been done for you. This is probably the best - approach for Mac OS X users. - -`Creating a "Virtual" Python`_ - This approach is for when you don't have "root" or access to write to the - ``site-packages`` directory, and would like to be able to set up one or - more "virtual python" executables for your projects. This approach - gives you the benefits of multiple Python installations, but without having - to actually install Python more than once and use up lots of disk space. - (Only the Python executable is copied; the libraries will be symlinked - from the systemwide Python.) - - If you don't already have any ``PYTHONPATH`` customization or - special distutils configuration, and you can't use either of the preceding - approaches, this is probably the best one for you. - -`"Traditional" PYTHONPATH-based Installation`_ - If you already have a custom ``PYTHONPATH``, and/or a custom distutils - configuration, and don't want to change any of your existing setup, you may - be interested in this approach. (If you're using a custom ``.pth`` file to - point to your custom installation location, however, you should use - `Administrator Installation`_ to enable ``.pth`` processing in the custom - location instead, as that is easier and more flexible than this approach.) - - -Administrator Installation -~~~~~~~~~~~~~~~~~~~~~~~~~~ - -If you have root access to your machine, you can easily configure it to allow -each user to have their own directory where Python packages can be installed -and managed by EasyInstall. - -First, create an ``altinstall.pth`` file in Python's ``site-packages`` -directory, containing the following line (substituting the correct Python -version):: - - import os, site; site.addsitedir(os.path.expanduser('~/lib/python2.3')) - -This will automatically add each user's ``~/lib/python2.X`` directory to -``sys.path`` (if it exists), *and* it will process any ``.pth`` files in that -directory -- which is what makes it usable with EasyInstall. - -The next step is to create or modify ``distutils.cfg`` in the ``distutils`` -directory of your Python library. The correct directory will be something like -``/usr/lib/python2.X/distutils`` on most Posix systems and something like -``C:\\Python2X\Lib\distutils`` on Windows machines. Add the following lines -to the file, substituting the correct Python version if necessary:: - - [install] - install_lib = ~/lib/python2.3 - - # This next line is optional but often quite useful; it directs EasyInstall - # and the distutils to install scripts in the user's "bin" directory. For - # Mac OS X framework Python builds, you should use /usr/local/bin instead, - # because neither ~/bin nor the default script installation location are on - # the system PATH. - # - install_scripts = ~/bin - -This will configure the distutils and EasyInstall to install packages to the -user's home directory by default. - -Of course, you aren't limited to using a ``~/lib/python2.X`` directory with -this approach. You can substitute a specific systemwide directory if you like. -You can also edit ``~/.pydistutils.cfg`` (or ``~/pydistutils.cfg`` on Windows) -instead of changing the master ``distutils.cfg`` file. 
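
For example, a per-user ``~/.pydistutils.cfg`` with the same effect would look
like this (a sketch; adjust the directories and the Python version to suit
your setup)::

    [install]
    install_lib = ~/lib/python2.3
    install_scripts = ~/bin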
The true keys of this -approach are simply that: - -1. any custom installation directory must be added to ``sys.path`` using a - ``site.addsitedir()`` call from a working ``.pth`` file or - ``sitecustomize.py``. - -2. The active distutils configuration file(s) or ``easy_install`` command line - should include the custom directory in the ``--site-dirs`` option, so that - EasyInstall knows that ``.pth`` files will work in that location. (This is - because Python does not keep track of what directories are or aren't enabled - for ``.pth`` processing, in any way that EasyInstall can find out.) - -As long as both of these things have been done, your custom installation -location is good to go. - - -Mac OS X "User" Installation -~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -If you are on a Mac OS X machine, you should just use the -``~/Library/Python/2.x/site-packages`` directory as your custom installation -location, because it is already configured to process ``.pth`` files, and -EasyInstall already knows this. - -Before installing EasyInstall/setuptools, just create a ``~/.pydistutils.cfg`` -file with the following contents (or add this to the existing contents):: - - [install] - install_lib = ~/Library/Python/$py_version_short/site-packages - install_scripts = ~/bin - -This will tell the distutils and EasyInstall to always install packages in -your personal ``site-packages`` directory, and scripts to ``~/bin``. (Note: do -*not* replace ``$py_version_short`` with an actual Python version in the -configuration file! The distutils will substitute the correct value at -runtime, so that the above configuration file should work correctly no matter -what Python version you use, now or in the future.) - -Once you have done this, you can follow the normal `installation instructions`_ -and use ``easy_install`` without any other special options or steps. - -(Note, however, that ``~/bin`` is not in the default ``PATH``, so you may have -to refer to scripts by their full location. You may want to modify your shell -startup script (likely ``.bashrc`` or ``.profile``) or your -``~/.MacOSX/environment.plist`` to include ``~/bin`` in your ``PATH``. - - -Creating a "Virtual" Python -~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -If you are on a Linux, BSD, Cygwin, or other similar Unix-like operating -system, but don't have root access, you can create your own "virtual" -Python installation, which uses its own library directories and some symlinks -to the site-wide Python. - -In the simplest case, your virtual Python installation will live under the -``~/lib/python2.x``, ``~/include/python2.x``, and ``~/bin`` directories. Just -download `virtual-python.py`_ and run it using the site-wide Python. If you -want to customize the location, you can use the ``--prefix`` option to specify -an installation base directory in place of ``~``. (Use ``--help`` to get the -complete list of options.) - -.. _virtual-python.py: http://peak.telecommunity.com/dist/virtual-python.py - -When you're done, you'll have a ``~/bin/python`` executable that's linked to -the local Python installation and inherits all its current libraries, but which -allows you to add as many new libraries as you want. Simply use this new -Python in place of your system-defined one, and you can modify it as you like -without breaking anything that relies on the system Python. You'll also still -need to follow the standard `installation instructions`_ to install setuptools -and EasyInstall, using your new ``~/bin/python`` executable in place of the -system Python. 
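
A complete session might therefore look something like this (a sketch,
assuming you have already downloaded ``virtual-python.py`` and the
``ez_setup.py`` bootstrap script)::

    python virtual-python.py
    ~/bin/python ez_setup.py
    ~/bin/easy_install SomePackage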
- -Note that if you were previously setting a ``PYTHONPATH`` and/or had other -special configuration options in your ``~/.pydistutils.cfg``, you may need to -remove these settings *before* running ``virtual-python.py``. This is because -your new Python executable will not need *any* custom configuration for the -distutils or EasyInstall; everything will go to the correct ``~/lib`` and -``~/bin`` directories automatically. - -You should, however, also make sure that the ``bin`` subdirectory of your -installation prefix (e.g. ``~/bin``) is on your ``PATH``, because that is where -EasyInstall and the distutils will install new Python scripts. - - -"Traditional" ``PYTHONPATH``-based Installation -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -This installation method is not as robust or as flexible as `creating a -"virtual" python`_ installation, as it uses various tricks to fool Python into -processing ``.pth`` files where it normally wouldn't. We suggest you at least -consider using one of the other approaches, as they will generally result in -a cleaner, more usable Python configuration. However, if for some reason you -can't or won't use one of the other approaches, here's how to do it. - -Assuming that you want to install packages in a directory called ``~/py-lib``, -and scripts in ``~/bin``, here's what you need to do: - -First, edit ``~/.pydistutils.cfg`` to include these settings, if you don't -already have them:: - - [install] - install_lib = ~/py-lib - install_scripts = ~/bin - -Be sure to do this *before* you try to run the ``ez_setup.py`` installation -script. Then, follow the standard `installation instructions`_, but make -sure that ``~/py-lib`` is listed in your ``PYTHONPATH`` environment variable. - -Your library installation directory *must* be in listed in ``PYTHONPATH``, -not only when you install packages with EasyInstall, but also when you use -any packages that are installed using EasyInstall. You will probably want to -edit your ``~/.profile`` or other configuration file(s) to ensure that it is -set, if you haven't already got this set up on your machine. - - -Package Index "API" -------------------- - -Custom package indexes (and PyPI) must follow the following rules for -EasyInstall to be able to look up and download packages: - -1. Except where stated otherwise, "pages" are HTML or XHTML, and "links" - refer to ``href`` attributes. - -2. Individual project version pages' URLs must be of the form - ``base/projectname/version``, where ``base`` is the package index's base URL. - -3. Omitting the ``/version`` part of a project page's URL (but keeping the - trailing ``/``) should result in a page that is either: - - a) The single active version of that project, as though the version had been - explicitly included, OR - - b) A page with links to all of the active version pages for that project. - -4. Individual project version pages should contain direct links to downloadable - distributions where possible. It is explicitly permitted for a project's - "long_description" to include URLs, and these should be formatted as HTML - links by the package index, as EasyInstall does no special processing to - identify what parts of a page are index-specific and which are part of the - project's supplied description. - -5. Where available, MD5 information should be added to download URLs by - appending a fragment identifier of the form ``#md5=...``, where ``...`` is - the 32-character hex MD5 digest. 
EasyInstall will verify that the - downloaded file's MD5 digest matches the given value. - -6. Individual project version pages should identify any "homepage" or - "download" URLs using ``rel="homepage"`` and ``rel="download"`` attributes - on the HTML elements linking to those URLs. Use of these attributes will - cause EasyInstall to always follow the provided links, unless it can be - determined by inspection that they are downloadable distributions. If the - links are not to downloadable distributions, they are retrieved, and if they - are HTML, they are scanned for download links. They are *not* scanned for - additional "homepage" or "download" links, as these are only processed for - pages that are part of a package index site. - -7. The root URL of the index, if retrieved with a trailing ``/``, must result - in a page containing links to *all* projects' active version pages. - - (Note: This requirement is a workaround for the absence of case-insensitive - ``safe_name()`` matching of project names in URL paths. If project names are - matched in this fashion (e.g. via the PyPI server, mod_rewrite, or a similar - mechanism), then it is not necessary to include this all-packages listing - page.) - -8. If a package index is accessed via a ``file://`` URL, then EasyInstall will - automatically use ``index.html`` files, if present, when trying to read a - directory with a trailing ``/`` on the URL. - - -Backward Compatibility -~~~~~~~~~~~~~~~~~~~~~~ - -Package indexes that wish to support setuptools versions prior to 0.6b4 should -also follow these rules: - -* Homepage and download links must be preceded with ``"Home Page"`` or - ``"Download URL"``, in addition to (or instead of) the ``rel=""`` - attributes on the actual links. These marker strings do not need to be - visible, or uncommented, however! For example, the following is a valid - homepage link that will work with any version of setuptools:: - -
    <li>
     <!-- Home Page: -->
     <a href="http://sqlobject.org">http://sqlobject.org</a>
    </li>
  Even though the marker string is in an HTML comment, older versions of
  EasyInstall will still "see" it and know that the link that follows is the
  project's home page URL.

* The pages described by paragraph 3(b) of the preceding section *must*
  contain the string ``"Index of Packages"`` somewhere in their text.
  This can be inside of an HTML comment, if desired, and it can be anywhere
  in the page.  (Note: this string MUST NOT appear on normal project pages, as
  described in paragraphs 2 and 3(a)!)

In addition, for compatibility with PyPI versions that do not use ``#md5=``
fragment IDs, EasyInstall uses the following regular expression to match PyPI's
displayed MD5 info (broken onto two lines for readability)::

    <a href="([^"#]+)#md5=([0-9a-f]{32})">([^<]+)</a>
    \n\s+\(md5\)


Release Notes/Change History
============================

0.6c11
 * Fix installed script .exe files not working with 64-bit Python on Windows
   (wasn't actually released in 0.6c10 due to a lost checkin)

0.6c10
 * Fix easy_install.exe giving UAC errors on Windows Vista

 * Support for the most recent Sourceforge download link insanity

 * Stop crashing on certain types of HTTP error

 * Stop re-trying URLs that already failed retrieval once

 * Fixes for various dependency management problems such as looping builds,
   re-downloading packages already present on sys.path (but not in a
   registered "site" directory), and semi-randomly preferring local "-f"
   packages over locally installed packages

0.6c9
 * Fixed ``win32.exe`` support for .pth files, so unnecessary directory nesting
   is flattened out in the resulting egg.  (There was a case-sensitivity
   problem that affected some distributions, notably ``pywin32``.)

 * Prevent ``--help-commands`` and other junk from showing under Python 2.5
   when running ``easy_install --help``.

 * Fixed GUI scripts sometimes not executing on Windows

 * Fixed not picking up dependency links from recursive dependencies.

 * Only make ``.py``, ``.dll`` and ``.so`` files executable when unpacking eggs

 * Changes for Jython compatibility

 * Improved error message when a requirement is also a directory name, but the
   specified directory is not a source package.

 * Fixed ``--allow-hosts`` option blocking ``file:`` URLs

 * Fixed HTTP SVN detection failing when the page title included a project
   name (e.g. on SourceForge-hosted SVN)

 * Fix Jython script installation to handle ``#!`` lines better when
   ``sys.executable`` is a script.

 * Removed use of deprecated ``md5`` module if ``hashlib`` is available

 * Keep site directories (e.g. ``site-packages``) from being included in
   ``.pth`` files.

0.6c7
 * ``ftp:`` download URLs now work correctly.

 * The default ``--index-url`` is now ``http://pypi.python.org/simple``, to use
   the Python Package Index's new simpler (and faster!) REST API.

0.6c6
 * EasyInstall no longer aborts the installation process if a URL it wants to
   retrieve can't be downloaded, unless the URL is an actual package download.
   Instead, it issues a warning and tries to keep going.

 * Fixed distutils-style scripts originally built on Windows having their line
   endings doubled when installed on any platform.

 * Added ``--local-snapshots-ok`` flag, to allow building eggs from projects
   installed using ``setup.py develop``.

 * Fixed not HTML-decoding URLs scraped from web pages

0.6c5
 * Fixed ``.dll`` files on Cygwin not having executable permissions when an egg
   is installed unzipped.
- -0.6c4 - * Added support for HTTP "Basic" authentication using ``http://user:pass@host`` - URLs. If a password-protected page contains links to the same host (and - protocol), those links will inherit the credentials used to access the - original page. - - * Removed all special support for Sourceforge mirrors, as Sourceforge's - mirror system now works well for non-browser downloads. - - * Fixed not recognizing ``win32.exe`` installers that included a custom - bitmap. - - * Fixed not allowing ``os.open()`` of paths outside the sandbox, even if they - are opened read-only (e.g. reading ``/dev/urandom`` for random numbers, as - is done by ``os.urandom()`` on some platforms). - - * Fixed a problem with ``.pth`` testing on Windows when ``sys.executable`` - has a space in it (e.g., the user installed Python to a ``Program Files`` - directory). - -0.6c3 - * You can once again use "python -m easy_install" with Python 2.4 and above. - - * Python 2.5 compatibility fixes added. - -0.6c2 - * Windows script wrappers now support quoted arguments and arguments - containing spaces. (Patch contributed by Jim Fulton.) - - * The ``ez_setup.py`` script now actually works when you put a setuptools - ``.egg`` alongside it for bootstrapping an offline machine. - - * A writable installation directory on ``sys.path`` is no longer required to - download and extract a source distribution using ``--editable``. - - * Generated scripts now use ``-x`` on the ``#!`` line when ``sys.executable`` - contains non-ASCII characters, to prevent deprecation warnings about an - unspecified encoding when the script is run. - -0.6c1 - * EasyInstall now includes setuptools version information in the - ``User-Agent`` string sent to websites it visits. - -0.6b4 - * Fix creating Python wrappers for non-Python scripts - - * Fix ``ftp://`` directory listing URLs from causing a crash when used in the - "Home page" or "Download URL" slots on PyPI. - - * Fix ``sys.path_importer_cache`` not being updated when an existing zipfile - or directory is deleted/overwritten. - - * Fix not recognizing HTML 404 pages from package indexes. - - * Allow ``file://`` URLs to be used as a package index. URLs that refer to - directories will use an internally-generated directory listing if there is - no ``index.html`` file in the directory. - - * Allow external links in a package index to be specified using - ``rel="homepage"`` or ``rel="download"``, without needing the old - PyPI-specific visible markup. - - * Suppressed warning message about possibly-misspelled project name, if an egg - or link for that project name has already been seen. - -0.6b3 - * Fix local ``--find-links`` eggs not being copied except with - ``--always-copy``. - - * Fix sometimes not detecting local packages installed outside of "site" - directories. - - * Fix mysterious errors during initial ``setuptools`` install, caused by - ``ez_setup`` trying to run ``easy_install`` twice, due to a code fallthru - after deleting the egg from which it's running. - -0.6b2 - * Don't install or update a ``site.py`` patch when installing to a - ``PYTHONPATH`` directory with ``--multi-version``, unless an - ``easy-install.pth`` file is already in use there. - - * Construct ``.pth`` file paths in such a way that installing an egg whose - name begins with ``import`` doesn't cause a syntax error. - - * Fixed a bogus warning message that wasn't updated since the 0.5 versions. 
- -0.6b1 - * Better ambiguity management: accept ``#egg`` name/version even if processing - what appears to be a correctly-named distutils file, and ignore ``.egg`` - files with no ``-``, since valid Python ``.egg`` files always have a version - number (but Scheme eggs often don't). - - * Support ``file://`` links to directories in ``--find-links``, so that - easy_install can build packages from local source checkouts. - - * Added automatic retry for Sourceforge mirrors. The new download process is - to first just try dl.sourceforge.net, then randomly select mirror IPs and - remove ones that fail, until something works. The removed IPs stay removed - for the remainder of the run. - - * Ignore bdist_dumb distributions when looking at download URLs. - -0.6a11 - * Process ``dependency_links.txt`` if found in a distribution, by adding the - URLs to the list for scanning. - - * Use relative paths in ``.pth`` files when eggs are being installed to the - same directory as the ``.pth`` file. This maximizes portability of the - target directory when building applications that contain eggs. - - * Added ``easy_install-N.N`` script(s) for convenience when using multiple - Python versions. - - * Added automatic handling of installation conflicts. Eggs are now shifted to - the front of sys.path, in an order consistent with where they came from, - making EasyInstall seamlessly co-operate with system package managers. - - The ``--delete-conflicting`` and ``--ignore-conflicts-at-my-risk`` options - are now no longer necessary, and will generate warnings at the end of a - run if you use them. - - * Don't recursively traverse subdirectories given to ``--find-links``. - -0.6a10 - * Added exhaustive testing of the install directory, including a spawn test - for ``.pth`` file support, and directory writability/existence checks. This - should virtually eliminate the need to set or configure ``--site-dirs``. - - * Added ``--prefix`` option for more do-what-I-mean-ishness in the absence of - RTFM-ing. :) - - * Enhanced ``PYTHONPATH`` support so that you don't have to put any eggs on it - manually to make it work. ``--multi-version`` is no longer a silent - default; you must explicitly use it if installing to a non-PYTHONPATH, - non-"site" directory. - - * Expand ``$variables`` used in the ``--site-dirs``, ``--build-directory``, - ``--install-dir``, and ``--script-dir`` options, whether on the command line - or in configuration files. - - * Improved SourceForge mirror processing to work faster and be less affected - by transient HTML changes made by SourceForge. - - * PyPI searches now use the exact spelling of requirements specified on the - command line or in a project's ``install_requires``. Previously, a - normalized form of the name was used, which could lead to unnecessary - full-index searches when a project's name had an underscore (``_``) in it. - - * EasyInstall can now download bare ``.py`` files and wrap them in an egg, - as long as you include an ``#egg=name-version`` suffix on the URL, or if - the ``.py`` file is listed as the "Download URL" on the project's PyPI page. - This allows third parties to "package" trivial Python modules just by - linking to them (e.g. from within their own PyPI page or download links - page). - - * The ``--always-copy`` option now skips "system" and "development" eggs since - they can't be reliably copied. 
Note that this may cause EasyInstall to - choose an older version of a package than what you expected, or it may cause - downloading and installation of a fresh version of what's already installed. - - * The ``--find-links`` option previously scanned all supplied URLs and - directories as early as possible, but now only directories and direct - archive links are scanned immediately. URLs are not retrieved unless a - package search was already going to go online due to a package not being - available locally, or due to the use of the ``--update`` or ``-U`` option. - - * Fixed the annoying ``--help-commands`` wart. - -0.6a9 - * Fixed ``.pth`` file processing picking up nested eggs (i.e. ones inside - "baskets") when they weren't explicitly listed in the ``.pth`` file. - - * If more than one URL appears to describe the exact same distribution, prefer - the shortest one. This helps to avoid "table of contents" CGI URLs like the - ones on effbot.org. - - * Quote arguments to python.exe (including python's path) to avoid problems - when Python (or a script) is installed in a directory whose name contains - spaces on Windows. - - * Support full roundtrip translation of eggs to and from ``bdist_wininst`` - format. Running ``bdist_wininst`` on a setuptools-based package wraps the - egg in an .exe that will safely install it as an egg (i.e., with metadata - and entry-point wrapper scripts), and ``easy_install`` can turn the .exe - back into an ``.egg`` file or directory and install it as such. - -0.6a8 - * Update for changed SourceForge mirror format - - * Fixed not installing dependencies for some packages fetched via Subversion - - * Fixed dependency installation with ``--always-copy`` not using the same - dependency resolution procedure as other operations. - - * Fixed not fully removing temporary directories on Windows, if a Subversion - checkout left read-only files behind - - * Fixed some problems building extensions when Pyrex was installed, especially - with Python 2.4 and/or packages using SWIG. - -0.6a7 - * Fixed not being able to install Windows script wrappers using Python 2.3 - -0.6a6 - * Added support for "traditional" PYTHONPATH-based non-root installation, and - also the convenient ``virtual-python.py`` script, based on a contribution - by Ian Bicking. The setuptools egg now contains a hacked ``site`` module - that makes the PYTHONPATH-based approach work with .pth files, so that you - can get the full EasyInstall feature set on such installations. - - * Added ``--no-deps`` and ``--allow-hosts`` options. - - * Improved Windows ``.exe`` script wrappers so that the script can have the - same name as a module without confusing Python. - - * Changed dependency processing so that it's breadth-first, allowing a - depender's preferences to override those of a dependee, to prevent conflicts - when a lower version is acceptable to the dependee, but not the depender. - Also, ensure that currently installed/selected packages aren't given - precedence over ones desired by a package being installed, which could - cause conflict errors. - -0.6a3 - * Improved error message when trying to use old ways of running - ``easy_install``. Removed the ability to run via ``python -m`` or by - running ``easy_install.py``; ``easy_install`` is the command to run on all - supported platforms. - - * Improved wrapper script generation and runtime initialization so that a - VersionConflict doesn't occur if you later install a competing version of a - needed package as the default version of that package. 
- - * Fixed a problem parsing version numbers in ``#egg=`` links. - -0.6a2 - * EasyInstall can now install "console_scripts" defined by packages that use - ``setuptools`` and define appropriate entry points. On Windows, console - scripts get an ``.exe`` wrapper so you can just type their name. On other - platforms, the scripts are installed without a file extension. - - * Using ``python -m easy_install`` or running ``easy_install.py`` is now - DEPRECATED, since an ``easy_install`` wrapper is now available on all - platforms. - -0.6a1 - * EasyInstall now does MD5 validation of downloads from PyPI, or from any link - that has an "#md5=..." trailer with a 32-digit lowercase hex md5 digest. - - * EasyInstall now handles symlinks in target directories by removing the link, - rather than attempting to overwrite the link's destination. This makes it - easier to set up an alternate Python "home" directory (as described above in - the `Non-Root Installation`_ section). - - * Added support for handling MacOS platform information in ``.egg`` filenames, - based on a contribution by Kevin Dangoor. You may wish to delete and - reinstall any eggs whose filename includes "darwin" and "Power_Macintosh", - because the format for this platform information has changed so that minor - OS X upgrades (such as 10.4.1 to 10.4.2) do not cause eggs built with a - previous OS version to become obsolete. - - * easy_install's dependency processing algorithms have changed. When using - ``--always-copy``, it now ensures that dependencies are copied too. When - not using ``--always-copy``, it tries to use a single resolution loop, - rather than recursing. - - * Fixed installing extra ``.pyc`` or ``.pyo`` files for scripts with ``.py`` - extensions. - - * Added ``--site-dirs`` option to allow adding custom "site" directories. - Made ``easy-install.pth`` work in platform-specific alternate site - directories (e.g. ``~/Library/Python/2.x/site-packages`` on Mac OS X). - - * If you manually delete the current version of a package, the next run of - EasyInstall against the target directory will now remove the stray entry - from the ``easy-install.pth`` file. - - * EasyInstall now recognizes URLs with a ``#egg=project_name`` fragment ID - as pointing to the named project's source checkout. Such URLs have a lower - match precedence than any other kind of distribution, so they'll only be - used if they have a higher version number than any other available - distribution, or if you use the ``--editable`` option. The ``#egg`` - fragment can contain a version if it's formatted as ``#egg=proj-ver``, - where ``proj`` is the project name, and ``ver`` is the version number. You - *must* use the format for these values that the ``bdist_egg`` command uses; - i.e., all non-alphanumeric runs must be condensed to single underscore - characters. - - * Added the ``--editable`` option; see `Editing and Viewing Source Packages`_ - above for more info. Also, slightly changed the behavior of the - ``--build-directory`` option. - - * Fixed the setup script sandbox facility not recognizing certain paths as - valid on case-insensitive platforms. - -0.5a12 - * Fix ``python -m easy_install`` not working due to setuptools being installed - as a zipfile. Update safety scanner to check for modules that might be used - as ``python -m`` scripts. - - * Misc. fixes for win32.exe support, including changes to support Python 2.4's - changed ``bdist_wininst`` format. 
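The ``console_scripts`` support noted in the 0.6a2 entry above depends on a package declaring entry points in its setup script. As a minimal sketch (the project and module names here are hypothetical, chosen only for illustration), such a declaration might look like::

    # setup.py for a hypothetical package exposing one command-line script
    from setuptools import setup

    setup(
        name="example-tool",
        version="0.1",
        py_modules=["example_tool"],
        entry_points={
            "console_scripts": [
                # EasyInstall generates an ``example-tool`` wrapper
                # (an ``.exe`` on Windows) that invokes example_tool.main()
                "example-tool = example_tool:main",
            ],
        },
    )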
- -0.5a10 - * Put the ``easy_install`` module back in as a module, as it's needed for - ``python -m`` to run it! - - * Allow ``--find-links/-f`` to accept local directories or filenames as well - as URLs. - -0.5a9 - * EasyInstall now automatically detects when an "unmanaged" package or - module is going to be on ``sys.path`` ahead of a package you're installing, - thereby preventing the newer version from being imported. By default, it - will abort installation to alert you of the problem, but there are also - new options (``--delete-conflicting`` and ``--ignore-conflicts-at-my-risk``) - available to change the default behavior. (Note: this new feature doesn't - take effect for egg files that were built with older ``setuptools`` - versions, because they lack the new metadata file required to implement it.) - - * The ``easy_install`` distutils command now uses ``DistutilsError`` as its - base error type for errors that should just issue a message to stderr and - exit the program without a traceback. - - * EasyInstall can now be given a path to a directory containing a setup - script, and it will attempt to build and install the package there. - - * EasyInstall now performs a safety analysis on module contents to determine - whether a package is likely to run in zipped form, and displays - information about what modules may be doing introspection that would break - when running as a zipfile. - - * Added the ``--always-unzip/-Z`` option, to force unzipping of packages that - would ordinarily be considered safe to unzip, and changed the meaning of - ``--zip-ok/-z`` to "always leave everything zipped". - -0.5a8 - * There is now a separate documentation page for `setuptools`_; revision - history that's not specific to EasyInstall has been moved to that page. - - .. _setuptools: http://peak.telecommunity.com/DevCenter/setuptools - -0.5a5 - * Made ``easy_install`` a standard ``setuptools`` command, moving it from - the ``easy_install`` module to ``setuptools.command.easy_install``. Note - that if you were importing or extending it, you must now change your imports - accordingly. ``easy_install.py`` is still installed as a script, but not as - a module. - -0.5a4 - * Added ``--always-copy/-a`` option to always copy needed packages to the - installation directory, even if they're already present elsewhere on - sys.path. (In previous versions, this was the default behavior, but now - you must request it.) - - * Added ``--upgrade/-U`` option to force checking PyPI for latest available - version(s) of all packages requested by name and version, even if a matching - version is available locally. - - * Added automatic installation of dependencies declared by a distribution - being installed. These dependencies must be listed in the distribution's - ``EGG-INFO`` directory, so the distribution has to have declared its - dependencies by using setuptools. If a package has requirements it didn't - declare, you'll still have to deal with them yourself. (E.g., by asking - EasyInstall to find and install them.) - - * Added the ``--record`` option to ``easy_install`` for the benefit of tools - that run ``setup.py install --record=filename`` on behalf of another - packaging system.) - -0.5a3 - * Fixed not setting script permissions to allow execution. - - * Improved sandboxing so that setup scripts that want a temporary directory - (e.g. pychecker) can still run in the sandbox. - -0.5a2 - * Fix stupid stupid refactoring-at-the-last-minute typos. 
:( - -0.5a1 - * Added support for converting ``.win32.exe`` installers to eggs on the fly. - EasyInstall will now recognize such files by name and install them. - - * Fixed a problem with picking the "best" version to install (versions were - being sorted as strings, rather than as parsed values) - -0.4a4 - * Added support for the distutils "verbose/quiet" and "dry-run" options, as - well as the "optimize" flag. - - * Support downloading packages that were uploaded to PyPI (by scanning all - links on package pages, not just the homepage/download links). - -0.4a3 - * Add progress messages to the search/download process so that you can tell - what URLs it's reading to find download links. (Hopefully, this will help - people report out-of-date and broken links to package authors, and to tell - when they've asked for a package that doesn't exist.) - -0.4a2 - * Added support for installing scripts - - * Added support for setting options via distutils configuration files, and - using distutils' default options as a basis for EasyInstall's defaults. - - * Renamed ``--scan-url/-s`` to ``--find-links/-f`` to free up ``-s`` for the - script installation directory option. - - * Use ``urllib2`` instead of ``urllib``, to allow use of ``https:`` URLs if - Python includes SSL support. - -0.4a1 - * Added ``--scan-url`` and ``--index-url`` options, to scan download pages - and search PyPI for needed packages. - -0.3a4 - * Restrict ``--build-directory=DIR/-b DIR`` option to only be used with single - URL installs, to avoid running the wrong setup.py. - -0.3a3 - * Added ``--build-directory=DIR/-b DIR`` option. - - * Added "installation report" that explains how to use 'require()' when doing - a multiversion install or alternate installation directory. - - * Added SourceForge mirror auto-select (Contributed by Ian Bicking) - - * Added "sandboxing" that stops a setup script from running if it attempts to - write to the filesystem outside of the build area - - * Added more workarounds for packages with quirky ``install_data`` hacks - -0.3a2 - * Added subversion download support for ``svn:`` and ``svn+`` URLs, as well as - automatic recognition of HTTP subversion URLs (Contributed by Ian Bicking) - - * Misc. bug fixes - -0.3a1 - * Initial release. - - -Future Plans -============ - -* Additional utilities to list/remove/verify packages -* Signature checking? SSL? Ability to suppress PyPI search? -* Display byte progress meter when downloading distributions and long pages? -* Redirect stdout/stderr to log during run_setup? - diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..6e0693b --- /dev/null +++ b/LICENSE @@ -0,0 +1,19 @@ +Copyright (C) 2016 Jason R Coombs + +Permission is hereby granted, free of charge, to any person obtaining a copy of +this software and associated documentation files (the "Software"), to deal in +the Software without restriction, including without limitation the rights to +use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies +of the Software, and to permit persons to whom the Software is furnished to do +so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/MANIFEST.in b/MANIFEST.in new file mode 100644 index 0000000..325bbed --- /dev/null +++ b/MANIFEST.in @@ -0,0 +1,14 @@ +recursive-include setuptools *.py *.exe *.xml +recursive-include tests *.py +recursive-include setuptools/tests *.html +recursive-include docs *.py *.txt *.conf *.css *.css_t Makefile indexsidebar.html +recursive-include setuptools/_vendor * +recursive-include pkg_resources *.py *.txt +include *.py +include *.rst +include MANIFEST.in +include LICENSE +include launcher.c +include msvc-build-launcher.cmd +include pytest.ini +include tox.ini diff --git a/PKG-INFO b/PKG-INFO index f2bca0d..439d58b 100644 --- a/PKG-INFO +++ b/PKG-INFO @@ -1,183 +1,51 @@ -Metadata-Version: 1.0 +Metadata-Version: 1.2 Name: setuptools -Version: 0.6c11 -Summary: Download, build, install, upgrade, and uninstall Python packages -- easily! -Home-page: http://pypi.python.org/pypi/setuptools -Author: Phillip J. Eby +Version: 34.3.3 +Summary: Easily download, build, install, upgrade, and uninstall Python packages +Home-page: https://github.com/pypa/setuptools +Author: Python Packaging Authority Author-email: distutils-sig@python.org -License: PSF or ZPL -Description: =============================== - Installing and Using Setuptools - =============================== +License: UNKNOWN +Description: .. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest + :target: https://setuptools.readthedocs.io - .. contents:: **Table of Contents** + See the `Installation Instructions + `_ in the Python Packaging + User's Guide for instructions on installing, upgrading, and uninstalling + Setuptools. + The project is `maintained at GitHub `_. - ------------------------- - Installation Instructions - ------------------------- + Questions and comments should be directed to the `distutils-sig + mailing list `_. + Bug reports and especially tested patches may be + submitted directly to the `bug tracker + `_. - Windows - ======= - Install setuptools using the provided ``.exe`` installer. If you've previously - installed older versions of setuptools, please delete all ``setuptools*.egg`` - and ``setuptools.pth`` files from your system's ``site-packages`` directory - (and any other ``sys.path`` directories) FIRST. + Code of Conduct + --------------- - If you are upgrading a previous version of setuptools that was installed using - an ``.exe`` installer, please be sure to also *uninstall that older version* - via your system's "Add/Remove Programs" feature, BEFORE installing the newer - version. - - Once installation is complete, you will find an ``easy_install.exe`` program in - your Python ``Scripts`` subdirectory. Be sure to add this directory to your - ``PATH`` environment variable, if you haven't already done so. - - - RPM-Based Systems - ================= - - Install setuptools using the provided source RPM. The included ``.spec`` file - assumes you are installing using the default ``python`` executable, and is not - specific to a particular Python version. The ``easy_install`` executable will - be installed to a system ``bin`` directory such as ``/usr/bin``. 
- - If you wish to install to a location other than the default Python - installation's default ``site-packages`` directory (and ``$prefix/bin`` for - scripts), please use the ``.egg``-based installation approach described in the - following section. - - - Cygwin, Mac OS X, Linux, Other - ============================== - - 1. Download the appropriate egg for your version of Python (e.g. - ``setuptools-0.6c9-py2.4.egg``). Do NOT rename it. - - 2. Run it as if it were a shell script, e.g. ``sh setuptools-0.6c9-py2.4.egg``. - Setuptools will install itself using the matching version of Python (e.g. - ``python2.4``), and will place the ``easy_install`` executable in the - default location for installing Python scripts (as determined by the - standard distutils configuration files, or by the Python installation). - - If you want to install setuptools to somewhere other than ``site-packages`` or - your default distutils installation locations for libraries and scripts, you - may include EasyInstall command-line options such as ``--prefix``, - ``--install-dir``, and so on, following the ``.egg`` filename on the same - command line. For example:: - - sh setuptools-0.6c9-py2.4.egg --prefix=~ - - You can use ``--help`` to get a full options list, but we recommend consulting - the `EasyInstall manual`_ for detailed instructions, especially `the section - on custom installation locations`_. - - .. _EasyInstall manual: http://peak.telecommunity.com/DevCenter/EasyInstall - .. _the section on custom installation locations: http://peak.telecommunity.com/DevCenter/EasyInstall#custom-installation-locations - - - Cygwin Note - ----------- - - If you are trying to install setuptools for the **Windows** version of Python - (as opposed to the Cygwin version that lives in ``/usr/bin``), you must make - sure that an appropriate executable (``python2.3``, ``python2.4``, or - ``python2.5``) is on your **Cygwin** ``PATH`` when invoking the egg. For - example, doing the following at a Cygwin bash prompt will install setuptools - for the **Windows** Python found at ``C:\\Python24``:: - - ln -s /cygdrive/c/Python24/python.exe python2.4 - PATH=.:$PATH sh setuptools-0.6c9-py2.4.egg - rm python2.4 - - - Downloads - ========= - - All setuptools downloads can be found at `the project's home page in the Python - Package Index`_. Scroll to the very bottom of the page to find the links. - - .. _the project's home page in the Python Package Index: http://pypi.python.org/pypi/setuptools#files - - In addition to the PyPI downloads, the development version of ``setuptools`` - is available from the `Python SVN sandbox`_, and in-development versions of the - `0.6 branch`_ are available as well. - - .. _0.6 branch: http://svn.python.org/projects/sandbox/branches/setuptools-0.6/#egg=setuptools-dev06 - - .. _Python SVN sandbox: http://svn.python.org/projects/sandbox/trunk/setuptools/#egg=setuptools-dev - - -------------------------------- - Using Setuptools and EasyInstall - -------------------------------- - - Here are some of the available manuals, tutorials, and other resources for - learning about Setuptools, Python Eggs, and EasyInstall: - - * `The EasyInstall user's guide and reference manual`_ - * `The setuptools Developer's Guide`_ - * `The pkg_resources API reference`_ - * `Package Compatibility Notes`_ (user-maintained) - * `The Internal Structure of Python Eggs`_ - - Questions, comments, and bug reports should be directed to the `distutils-sig - mailing list`_. 
If you have written (or know of) any tutorials, documentation, - plug-ins, or other resources for setuptools users, please let us know about - them there, so this reference list can be updated. If you have working, - *tested* patches to correct problems or add features, you may submit them to - the `setuptools bug tracker`_. - - .. _setuptools bug tracker: http://bugs.python.org/setuptools/ - .. _Package Compatibility Notes: http://peak.telecommunity.com/DevCenter/PackageNotes - .. _The Internal Structure of Python Eggs: http://peak.telecommunity.com/DevCenter/EggFormats - .. _The setuptools Developer's Guide: http://peak.telecommunity.com/DevCenter/setuptools - .. _The pkg_resources API reference: http://peak.telecommunity.com/DevCenter/PkgResources - .. _The EasyInstall user's guide and reference manual: http://peak.telecommunity.com/DevCenter/EasyInstall - .. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/ - - - ------- - Credits - ------- - - * The original design for the ``.egg`` format and the ``pkg_resources`` API was - co-created by Phillip Eby and Bob Ippolito. Bob also implemented the first - version of ``pkg_resources``, and supplied the OS X operating system version - compatibility algorithm. - - * Ian Bicking implemented many early "creature comfort" features of - easy_install, including support for downloading via Sourceforge and - Subversion repositories. Ian's comments on the Web-SIG about WSGI - application deployment also inspired the concept of "entry points" in eggs, - and he has given talks at PyCon and elsewhere to inform and educate the - community about eggs and setuptools. - - * Jim Fulton contributed time and effort to build automated tests of various - aspects of ``easy_install``, and supplied the doctests for the command-line - ``.exe`` wrappers on Windows. - - * Phillip J. Eby is the principal author and maintainer of setuptools, and - first proposed the idea of an importable binary distribution format for - Python application plug-ins. - - * Significant parts of the implementation of setuptools were funded by the Open - Source Applications Foundation, to provide a plug-in infrastructure for the - Chandler PIM application. In addition, many OSAF staffers (such as Mike - "Code Bear" Taylor) contributed their time and stress as guinea pigs for the - use of eggs and setuptools, even before eggs were "cool". (Thanks, guys!) - - .. _files: + Everyone interacting in the setuptools project's codebases, issue trackers, + chat rooms, and mailing lists is expected to follow the + `PyPA Code of Conduct `_. 
Keywords: CPAN PyPI distutils eggs package management Platform: UNKNOWN -Classifier: Development Status :: 3 - Alpha +Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: Python Software Foundation License -Classifier: License :: OSI Approved :: Zope Public License +Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent -Classifier: Programming Language :: Python +Classifier: Programming Language :: Python :: 2 +Classifier: Programming Language :: Python :: 2.6 +Classifier: Programming Language :: Python :: 2.7 +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: 3.3 +Classifier: Programming Language :: Python :: 3.4 +Classifier: Programming Language :: Python :: 3.5 +Classifier: Programming Language :: Python :: 3.6 Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Topic :: System :: Archiving :: Packaging Classifier: Topic :: System :: Systems Administration Classifier: Topic :: Utilities +Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.* diff --git a/README.rst b/README.rst new file mode 100755 index 0000000..2f55bce --- /dev/null +++ b/README.rst @@ -0,0 +1,23 @@ +.. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest + :target: https://setuptools.readthedocs.io + +See the `Installation Instructions +`_ in the Python Packaging +User's Guide for instructions on installing, upgrading, and uninstalling +Setuptools. + +The project is `maintained at GitHub `_. + +Questions and comments should be directed to the `distutils-sig +mailing list `_. +Bug reports and especially tested patches may be +submitted directly to the `bug tracker +`_. + + +Code of Conduct +--------------- + +Everyone interacting in the setuptools project's codebases, issue trackers, +chat rooms, and mailing lists is expected to follow the +`PyPA Code of Conduct `_. diff --git a/README.txt b/README.txt deleted file mode 100755 index 03a2afc..0000000 --- a/README.txt +++ /dev/null @@ -1,162 +0,0 @@ -=============================== -Installing and Using Setuptools -=============================== - -.. contents:: **Table of Contents** - - -------------------------- -Installation Instructions -------------------------- - -Windows -======= - -Install setuptools using the provided ``.exe`` installer. If you've previously -installed older versions of setuptools, please delete all ``setuptools*.egg`` -and ``setuptools.pth`` files from your system's ``site-packages`` directory -(and any other ``sys.path`` directories) FIRST. - -If you are upgrading a previous version of setuptools that was installed using -an ``.exe`` installer, please be sure to also *uninstall that older version* -via your system's "Add/Remove Programs" feature, BEFORE installing the newer -version. - -Once installation is complete, you will find an ``easy_install.exe`` program in -your Python ``Scripts`` subdirectory. Be sure to add this directory to your -``PATH`` environment variable, if you haven't already done so. - - -RPM-Based Systems -================= - -Install setuptools using the provided source RPM. The included ``.spec`` file -assumes you are installing using the default ``python`` executable, and is not -specific to a particular Python version. The ``easy_install`` executable will -be installed to a system ``bin`` directory such as ``/usr/bin``. 
- -If you wish to install to a location other than the default Python -installation's default ``site-packages`` directory (and ``$prefix/bin`` for -scripts), please use the ``.egg``-based installation approach described in the -following section. - - -Cygwin, Mac OS X, Linux, Other -============================== - -1. Download the appropriate egg for your version of Python (e.g. - ``setuptools-0.6c9-py2.4.egg``). Do NOT rename it. - -2. Run it as if it were a shell script, e.g. ``sh setuptools-0.6c9-py2.4.egg``. - Setuptools will install itself using the matching version of Python (e.g. - ``python2.4``), and will place the ``easy_install`` executable in the - default location for installing Python scripts (as determined by the - standard distutils configuration files, or by the Python installation). - -If you want to install setuptools to somewhere other than ``site-packages`` or -your default distutils installation locations for libraries and scripts, you -may include EasyInstall command-line options such as ``--prefix``, -``--install-dir``, and so on, following the ``.egg`` filename on the same -command line. For example:: - - sh setuptools-0.6c9-py2.4.egg --prefix=~ - -You can use ``--help`` to get a full options list, but we recommend consulting -the `EasyInstall manual`_ for detailed instructions, especially `the section -on custom installation locations`_. - -.. _EasyInstall manual: http://peak.telecommunity.com/DevCenter/EasyInstall -.. _the section on custom installation locations: http://peak.telecommunity.com/DevCenter/EasyInstall#custom-installation-locations - - -Cygwin Note ------------ - -If you are trying to install setuptools for the **Windows** version of Python -(as opposed to the Cygwin version that lives in ``/usr/bin``), you must make -sure that an appropriate executable (``python2.3``, ``python2.4``, or -``python2.5``) is on your **Cygwin** ``PATH`` when invoking the egg. For -example, doing the following at a Cygwin bash prompt will install setuptools -for the **Windows** Python found at ``C:\\Python24``:: - - ln -s /cygdrive/c/Python24/python.exe python2.4 - PATH=.:$PATH sh setuptools-0.6c9-py2.4.egg - rm python2.4 - - -Downloads -========= - -All setuptools downloads can be found at `the project's home page in the Python -Package Index`_. Scroll to the very bottom of the page to find the links. - -.. _the project's home page in the Python Package Index: http://pypi.python.org/pypi/setuptools#files - -In addition to the PyPI downloads, the development version of ``setuptools`` -is available from the `Python SVN sandbox`_, and in-development versions of the -`0.6 branch`_ are available as well. - -.. _0.6 branch: http://svn.python.org/projects/sandbox/branches/setuptools-0.6/#egg=setuptools-dev06 - -.. _Python SVN sandbox: http://svn.python.org/projects/sandbox/trunk/setuptools/#egg=setuptools-dev - --------------------------------- -Using Setuptools and EasyInstall --------------------------------- - -Here are some of the available manuals, tutorials, and other resources for -learning about Setuptools, Python Eggs, and EasyInstall: - -* `The EasyInstall user's guide and reference manual`_ -* `The setuptools Developer's Guide`_ -* `The pkg_resources API reference`_ -* `Package Compatibility Notes`_ (user-maintained) -* `The Internal Structure of Python Eggs`_ - -Questions, comments, and bug reports should be directed to the `distutils-sig -mailing list`_. 
If you have written (or know of) any tutorials, documentation, -plug-ins, or other resources for setuptools users, please let us know about -them there, so this reference list can be updated. If you have working, -*tested* patches to correct problems or add features, you may submit them to -the `setuptools bug tracker`_. - -.. _setuptools bug tracker: http://bugs.python.org/setuptools/ -.. _Package Compatibility Notes: http://peak.telecommunity.com/DevCenter/PackageNotes -.. _The Internal Structure of Python Eggs: http://peak.telecommunity.com/DevCenter/EggFormats -.. _The setuptools Developer's Guide: http://peak.telecommunity.com/DevCenter/setuptools -.. _The pkg_resources API reference: http://peak.telecommunity.com/DevCenter/PkgResources -.. _The EasyInstall user's guide and reference manual: http://peak.telecommunity.com/DevCenter/EasyInstall -.. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/ - - -------- -Credits -------- - -* The original design for the ``.egg`` format and the ``pkg_resources`` API was - co-created by Phillip Eby and Bob Ippolito. Bob also implemented the first - version of ``pkg_resources``, and supplied the OS X operating system version - compatibility algorithm. - -* Ian Bicking implemented many early "creature comfort" features of - easy_install, including support for downloading via Sourceforge and - Subversion repositories. Ian's comments on the Web-SIG about WSGI - application deployment also inspired the concept of "entry points" in eggs, - and he has given talks at PyCon and elsewhere to inform and educate the - community about eggs and setuptools. - -* Jim Fulton contributed time and effort to build automated tests of various - aspects of ``easy_install``, and supplied the doctests for the command-line - ``.exe`` wrappers on Windows. - -* Phillip J. Eby is the principal author and maintainer of setuptools, and - first proposed the idea of an importable binary distribution format for - Python application plug-ins. - -* Significant parts of the implementation of setuptools were funded by the Open - Source Applications Foundation, to provide a plug-in infrastructure for the - Chandler PIM application. In addition, many OSAF staffers (such as Mike - "Code Bear" Taylor) contributed their time and stress as guinea pigs for the - use of eggs and setuptools, even before eggs were "cool". (Thanks, guys!) - -.. _files: diff --git a/api_tests.txt b/api_tests.txt deleted file mode 100755 index 735ad8d..0000000 --- a/api_tests.txt +++ /dev/null @@ -1,330 +0,0 @@ -Pluggable Distributions of Python Software -========================================== - -Distributions -------------- - -A "Distribution" is a collection of files that represent a "Release" of a -"Project" as of a particular point in time, denoted by a -"Version":: - - >>> import sys, pkg_resources - >>> from pkg_resources import Distribution - >>> Distribution(project_name="Foo", version="1.2") - Foo 1.2 - -Distributions have a location, which can be a filename, URL, or really anything -else you care to use:: - - >>> dist = Distribution( - ... location="http://example.com/something", - ... project_name="Bar", version="0.9" - ... 
) - - >>> dist - Bar 0.9 (http://example.com/something) - - -Distributions have various introspectable attributes:: - - >>> dist.location - 'http://example.com/something' - - >>> dist.project_name - 'Bar' - - >>> dist.version - '0.9' - - >>> dist.py_version == sys.version[:3] - True - - >>> print dist.platform - None - -Including various computed attributes:: - - >>> from pkg_resources import parse_version - >>> dist.parsed_version == parse_version(dist.version) - True - - >>> dist.key # case-insensitive form of the project name - 'bar' - -Distributions are compared (and hashed) by version first:: - - >>> Distribution(version='1.0') == Distribution(version='1.0') - True - >>> Distribution(version='1.0') == Distribution(version='1.1') - False - >>> Distribution(version='1.0') < Distribution(version='1.1') - True - -but also by project name (case-insensitive), platform, Python version, -location, etc.:: - - >>> Distribution(project_name="Foo",version="1.0") == \ - ... Distribution(project_name="Foo",version="1.0") - True - - >>> Distribution(project_name="Foo",version="1.0") == \ - ... Distribution(project_name="foo",version="1.0") - True - - >>> Distribution(project_name="Foo",version="1.0") == \ - ... Distribution(project_name="Foo",version="1.1") - False - - >>> Distribution(project_name="Foo",py_version="2.3",version="1.0") == \ - ... Distribution(project_name="Foo",py_version="2.4",version="1.0") - False - - >>> Distribution(location="spam",version="1.0") == \ - ... Distribution(location="spam",version="1.0") - True - - >>> Distribution(location="spam",version="1.0") == \ - ... Distribution(location="baz",version="1.0") - False - - - -Hash and compare distribution by prio/plat - -Get version from metadata -provider capabilities -egg_name() -as_requirement() -from_location, from_filename (w/path normalization) - -Releases may have zero or more "Requirements", which indicate -what releases of another project the release requires in order to -function. A Requirement names the other project, expresses some criteria -as to what releases of that project are acceptable, and lists any "Extras" -that the requiring release may need from that project. (An Extra is an -optional feature of a Release, that can only be used if its additional -Requirements are satisfied.) - - - -The Working Set ---------------- - -A collection of active distributions is called a Working Set. Note that a -Working Set can contain any importable distribution, not just pluggable ones. -For example, the Python standard library is an importable distribution that -will usually be part of the Working Set, even though it is not pluggable. -Similarly, when you are doing development work on a project, the files you are -editing are also a Distribution. (And, with a little attention to the -directory names used, and including some additional metadata, such a -"development distribution" can be made pluggable as well.) - - >>> from pkg_resources import WorkingSet - -A working set's entries are the sys.path entries that correspond to the active -distributions. 
By default, the working set's entries are the items on -``sys.path``:: - - >>> ws = WorkingSet() - >>> ws.entries == sys.path - True - -But you can also create an empty working set explicitly, and add distributions -to it:: - - >>> ws = WorkingSet([]) - >>> ws.add(dist) - >>> ws.entries - ['http://example.com/something'] - >>> dist in ws - True - >>> Distribution('foo',version="") in ws - False - -And you can iterate over its distributions:: - - >>> list(ws) - [Bar 0.9 (http://example.com/something)] - -Adding the same distribution more than once is a no-op:: - - >>> ws.add(dist) - >>> list(ws) - [Bar 0.9 (http://example.com/something)] - -For that matter, adding multiple distributions for the same project also does -nothing, because a working set can only hold one active distribution per -project -- the first one added to it:: - - >>> ws.add( - ... Distribution( - ... 'http://example.com/something', project_name="Bar", - ... version="7.2" - ... ) - ... ) - >>> list(ws) - [Bar 0.9 (http://example.com/something)] - -You can append a path entry to a working set using ``add_entry()``:: - - >>> ws.entries - ['http://example.com/something'] - >>> ws.add_entry(pkg_resources.__file__) - >>> ws.entries - ['http://example.com/something', '...pkg_resources.py...'] - -Multiple additions result in multiple entries, even if the entry is already in -the working set (because ``sys.path`` can contain the same entry more than -once):: - - >>> ws.add_entry(pkg_resources.__file__) - >>> ws.entries - ['...example.com...', '...pkg_resources...', '...pkg_resources...'] - -And you can specify the path entry a distribution was found under, using the -optional second parameter to ``add()``:: - - >>> ws = WorkingSet([]) - >>> ws.add(dist,"foo") - >>> ws.entries - ['foo'] - -But even if a distribution is found under multiple path entries, it still only -shows up once when iterating the working set: - - >>> ws.add_entry(ws.entries[0]) - >>> list(ws) - [Bar 0.9 (http://example.com/something)] - -You can ask a WorkingSet to ``find()`` a distribution matching a requirement:: - - >>> from pkg_resources import Requirement - >>> print ws.find(Requirement.parse("Foo==1.0")) # no match, return None - None - - >>> ws.find(Requirement.parse("Bar==0.9")) # match, return distribution - Bar 0.9 (http://example.com/something) - -Note that asking for a conflicting version of a distribution already in a -working set triggers a ``pkg_resources.VersionConflict`` error: - - >>> ws.find(Requirement.parse("Bar==1.0")) # doctest: +NORMALIZE_WHITESPACE - Traceback (most recent call last): - ... - VersionConflict: (Bar 0.9 (http://example.com/something), - Requirement.parse('Bar==1.0')) - -You can subscribe a callback function to receive notifications whenever a new -distribution is added to a working set. 
The callback is immediately invoked -once for each existing distribution in the working set, and then is called -again for new distributions added thereafter:: - - >>> def added(dist): print "Added", dist - >>> ws.subscribe(added) - Added Bar 0.9 - >>> foo12 = Distribution(project_name="Foo", version="1.2", location="f12") - >>> ws.add(foo12) - Added Foo 1.2 - -Note, however, that only the first distribution added for a given project name -will trigger a callback, even during the initial ``subscribe()`` callback:: - - >>> foo14 = Distribution(project_name="Foo", version="1.4", location="f14") - >>> ws.add(foo14) # no callback, because Foo 1.2 is already active - - >>> ws = WorkingSet([]) - >>> ws.add(foo12) - >>> ws.add(foo14) - >>> ws.subscribe(added) - Added Foo 1.2 - -And adding a callback more than once has no effect, either:: - - >>> ws.subscribe(added) # no callbacks - - # and no double-callbacks on subsequent additions, either - >>> just_a_test = Distribution(project_name="JustATest", version="0.99") - >>> ws.add(just_a_test) - Added JustATest 0.99 - - -Finding Plugins ---------------- - -``WorkingSet`` objects can be used to figure out what plugins in an -``Environment`` can be loaded without any resolution errors:: - - >>> from pkg_resources import Environment - - >>> plugins = Environment([]) # normally, a list of plugin directories - >>> plugins.add(foo12) - >>> plugins.add(foo14) - >>> plugins.add(just_a_test) - -In the simplest case, we just get the newest version of each distribution in -the plugin environment:: - - >>> ws = WorkingSet([]) - >>> ws.find_plugins(plugins) - ([JustATest 0.99, Foo 1.4 (f14)], {}) - -But if there's a problem with a version conflict or missing requirements, the -method falls back to older versions, and the error info dict will contain an -exception instance for each unloadable plugin:: - - >>> ws.add(foo12) # this will conflict with Foo 1.4 - >>> ws.find_plugins(plugins) - ([JustATest 0.99, Foo 1.2 (f12)], {Foo 1.4 (f14): VersionConflict(...)}) - -But if you disallow fallbacks, the failed plugin will be skipped instead of -trying older versions:: - - >>> ws.find_plugins(plugins, fallback=False) - ([JustATest 0.99], {Foo 1.4 (f14): VersionConflict(...)}) - - - -Platform Compatibility Rules ----------------------------- - -On the Mac, there are potential compatibility issues for modules compiled -on newer versions of Mac OS X than what the user is running. Additionally, -Mac OS X will soon have two platforms to contend with: Intel and PowerPC. - -Basic equality works as on other platforms:: - - >>> from pkg_resources import compatible_platforms as cp - >>> reqd = 'macosx-10.4-ppc' - >>> cp(reqd, reqd) - True - >>> cp("win32", reqd) - False - -Distributions made on other machine types are not compatible:: - - >>> cp("macosx-10.4-i386", reqd) - False - -Distributions made on earlier versions of the OS are compatible, as -long as they are from the same top-level version. 
The patchlevel version -number does not matter:: - - >>> cp("macosx-10.4-ppc", reqd) - True - >>> cp("macosx-10.3-ppc", reqd) - True - >>> cp("macosx-10.5-ppc", reqd) - False - >>> cp("macosx-9.5-ppc", reqd) - False - -Backwards compatibility for packages made via earlier versions of -setuptools is provided as well:: - - >>> cp("darwin-8.2.0-Power_Macintosh", reqd) - True - >>> cp("darwin-7.2.0-Power_Macintosh", reqd) - True - >>> cp("darwin-8.2.0-Power_Macintosh", "macosx-10.3-ppc") - False - diff --git a/bootstrap.py b/bootstrap.py new file mode 100644 index 0000000..ee3b53c --- /dev/null +++ b/bootstrap.py @@ -0,0 +1,104 @@ +""" +If setuptools is not already installed in the environment, it's not possible +to invoke setuptools' own commands. This routine will bootstrap this local +environment by creating a minimal egg-info directory and then invoking the +egg-info command to flesh out the egg-info directory. +""" + +from __future__ import unicode_literals + +import os +import io +import re +import contextlib +import tempfile +import shutil +import sys +import textwrap +import subprocess + + +minimal_egg_info = textwrap.dedent(""" + [distutils.commands] + egg_info = setuptools.command.egg_info:egg_info + + [distutils.setup_keywords] + include_package_data = setuptools.dist:assert_bool + install_requires = setuptools.dist:check_requirements + extras_require = setuptools.dist:check_extras + entry_points = setuptools.dist:check_entry_points + + [egg_info.writers] + dependency_links.txt = setuptools.command.egg_info:overwrite_arg + entry_points.txt = setuptools.command.egg_info:write_entries + requires.txt = setuptools.command.egg_info:write_requirements + """) + + +def ensure_egg_info(): + if os.path.exists('setuptools.egg-info'): + return + print("adding minimal entry_points") + build_egg_info() + + +def build_egg_info(): + """ + Build a minimal egg-info, enough to invoke egg_info + """ + + os.mkdir('setuptools.egg-info') + filename = 'setuptools.egg-info/entry_points.txt' + with io.open(filename, 'w', encoding='utf-8') as ep: + ep.write(minimal_egg_info) + + +def run_egg_info(): + cmd = [sys.executable, 'setup.py', 'egg_info'] + print("Regenerating egg_info") + subprocess.check_call(cmd) + print("...and again.") + subprocess.check_call(cmd) + + +def gen_deps(): + with io.open('setup.py', encoding='utf-8') as strm: + text = strm.read() + pattern = r'install_requires=\[(.*?)\]' + match = re.search(pattern, text, flags=re.M|re.DOTALL) + reqs = eval(match.group(1).replace('\n', '')) + with io.open('requirements.txt', 'w', encoding='utf-8') as reqs_file: + reqs_file.write('\n'.join(reqs)) + + +@contextlib.contextmanager +def install_deps(): + "Just in time make the deps available" + import pip + tmpdir = tempfile.mkdtemp() + args = [ + 'install', + '-t', tmpdir, + '-r', 'requirements.txt', + ] + pip.main(args) + os.environ['PYTHONPATH'] = tmpdir + try: + yield tmpdir + finally: + shutil.rmtree(tmpdir) + + +def main(): + ensure_egg_info() + gen_deps() + try: + # first assume dependencies are present + run_egg_info() + except Exception: + # but if that fails, try again with dependencies just in time + with install_deps(): + run_egg_info() + + +__name__ == '__main__' and main() diff --git a/conftest.py b/conftest.py new file mode 100644 index 0000000..3cccfe1 --- /dev/null +++ b/conftest.py @@ -0,0 +1,8 @@ +pytest_plugins = 'setuptools.tests.fixtures' + + +def pytest_addoption(parser): + parser.addoption( + "--package_name", action="append", default=[], + help="list of package_name to pass to 
test functions", + ) diff --git a/docs/Makefile b/docs/Makefile new file mode 100644 index 0000000..30bf10a --- /dev/null +++ b/docs/Makefile @@ -0,0 +1,75 @@ +# Makefile for Sphinx documentation +# + +# You can set these variables from the command line. +SPHINXOPTS = +SPHINXBUILD = sphinx-build +PAPER = + +# Internal variables. +PAPEROPT_a4 = -D latex_paper_size=a4 +PAPEROPT_letter = -D latex_paper_size=letter +ALLSPHINXOPTS = -d build/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . + +.PHONY: help clean html web pickle htmlhelp latex changes linkcheck + +help: + @echo "Please use \`make ' where is one of" + @echo " html to make standalone HTML files" + @echo " pickle to make pickle files" + @echo " json to make JSON files" + @echo " htmlhelp to make HTML files and a HTML help project" + @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" + @echo " changes to make an overview over all changed/added/deprecated items" + @echo " linkcheck to check all external links for integrity" + +clean: + -rm -rf build/* + +html: + mkdir -p build/html build/doctrees + $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) build/html + @echo + @echo "Build finished. The HTML pages are in build/html." + +pickle: + mkdir -p build/pickle build/doctrees + $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) build/pickle + @echo + @echo "Build finished; now you can process the pickle files." + +web: pickle + +json: + mkdir -p build/json build/doctrees + $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) build/json + @echo + @echo "Build finished; now you can process the JSON files." + +htmlhelp: + mkdir -p build/htmlhelp build/doctrees + $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) build/htmlhelp + @echo + @echo "Build finished; now you can run HTML Help Workshop with the" \ + ".hhp project file in build/htmlhelp." + +latex: + mkdir -p build/latex build/doctrees + $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) build/latex + @echo + @echo "Build finished; the LaTeX files are in build/latex." + @echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \ + "run these through (pdf)latex." + +changes: + mkdir -p build/changes build/doctrees + $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) build/changes + @echo + @echo "The overview file is in build/changes." + +linkcheck: + mkdir -p build/linkcheck build/doctrees + $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) build/linkcheck + @echo + @echo "Link check complete; look for any errors in the above output " \ + "or in build/linkcheck/output.txt." diff --git a/docs/_templates/indexsidebar.html b/docs/_templates/indexsidebar.html new file mode 100644 index 0000000..3b12760 --- /dev/null +++ b/docs/_templates/indexsidebar.html @@ -0,0 +1,8 @@ +

+Download
+
+Current version: {{ version }}
+Get Setuptools from the Python Package Index
+
+Questions? Suggestions? Contributions?
+
+Visit the Setuptools project page

    diff --git a/docs/_theme/nature/static/nature.css_t b/docs/_theme/nature/static/nature.css_t new file mode 100644 index 0000000..1a65426 --- /dev/null +++ b/docs/_theme/nature/static/nature.css_t @@ -0,0 +1,237 @@ +/** + * Sphinx stylesheet -- default theme + * ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + */ + +@import url("basic.css"); + +/* -- page layout ----------------------------------------------------------- */ + +body { + font-family: Arial, sans-serif; + font-size: 100%; + background-color: #111111; + color: #555555; + margin: 0; + padding: 0; +} + +div.documentwrapper { + float: left; + width: 100%; +} + +div.bodywrapper { + margin: 0 0 0 300px; +} + +hr{ + border: 1px solid #B1B4B6; +} + +div.document { + background-color: #fafafa; +} + +div.body { + background-color: #ffffff; + color: #3E4349; + padding: 1em 30px 30px 30px; + font-size: 0.9em; +} + +div.footer { + color: #555; + width: 100%; + padding: 13px 0; + text-align: center; + font-size: 75%; +} + +div.footer a { + color: #444444; +} + +div.related { + background-color: #6BA81E; + line-height: 36px; + color: #ffffff; + text-shadow: 0px 1px 0 #444444; + font-size: 1.1em; +} + +div.related a { + color: #E2F3CC; +} + +div.related .right { + font-size: 0.9em; +} + +div.sphinxsidebar { + font-size: 0.9em; + line-height: 1.5em; + width: 300px; +} + +div.sphinxsidebarwrapper{ + padding: 20px 0; +} + +div.sphinxsidebar h3, +div.sphinxsidebar h4 { + font-family: Arial, sans-serif; + color: #222222; + font-size: 1.2em; + font-weight: bold; + margin: 0; + padding: 5px 10px; + text-shadow: 1px 1px 0 white +} + +div.sphinxsidebar h3 a { + color: #444444; +} + +div.sphinxsidebar p { + color: #888888; + padding: 5px 20px; + margin: 0.5em 0px; +} + +div.sphinxsidebar p.topless { +} + +div.sphinxsidebar ul { + margin: 10px 10px 10px 20px; + padding: 0; + color: #000000; +} + +div.sphinxsidebar a { + color: #444444; +} + +div.sphinxsidebar a:hover { + color: #E32E00; +} + +div.sphinxsidebar input { + border: 1px solid #cccccc; + font-family: sans-serif; + font-size: 1.1em; + padding: 0.15em 0.3em; +} + +div.sphinxsidebar input[type=text]{ + margin-left: 20px; +} + +/* -- body styles ----------------------------------------------------------- */ + +a { + color: #005B81; + text-decoration: none; +} + +a:hover { + color: #E32E00; +} + +div.body h1, +div.body h2, +div.body h3, +div.body h4, +div.body h5, +div.body h6 { + font-family: Arial, sans-serif; + font-weight: normal; + color: #212224; + margin: 30px 0px 10px 0px; + padding: 5px 0 5px 0px; + text-shadow: 0px 1px 0 white; + border-bottom: 1px solid #C8D5E3; +} + +div.body h1 { margin-top: 0; font-size: 200%; } +div.body h2 { font-size: 150%; } +div.body h3 { font-size: 120%; } +div.body h4 { font-size: 110%; } +div.body h5 { font-size: 100%; } +div.body h6 { font-size: 100%; } + +a.headerlink { + color: #c60f0f; + font-size: 0.8em; + padding: 0 4px 0 4px; + text-decoration: none; +} + +a.headerlink:hover { + background-color: #c60f0f; + color: white; +} + +div.body p, div.body dd, div.body li { + line-height: 1.8em; +} + +div.admonition p.admonition-title + p { + display: inline; +} + +div.highlight{ + background-color: white; +} + +div.note { + background-color: #eeeeee; + border: 1px solid #cccccc; +} + +div.seealso { + background-color: #ffffcc; + border: 1px solid #ffff66; +} + +div.topic { + background-color: #fafafa; + border-width: 0; +} + +div.warning { + background-color: #ffe4e4; + border: 1px solid #ff6666; +} + +p.admonition-title { + display: inline; +} + 
+p.admonition-title:after { + content: ":"; +} + +pre { + padding: 10px; + background-color: #fafafa; + color: #222222; + line-height: 1.5em; + font-size: 1.1em; + margin: 1.5em 0 1.5em 0; + -webkit-box-shadow: 0px 0px 4px #d8d8d8; + -moz-box-shadow: 0px 0px 4px #d8d8d8; + box-shadow: 0px 0px 4px #d8d8d8; +} + +tt { + color: #222222; + padding: 1px 2px; + font-size: 1.2em; + font-family: monospace; +} + +#table-of-contents ul { + padding-left: 2em; +} + diff --git a/docs/_theme/nature/static/pygments.css b/docs/_theme/nature/static/pygments.css new file mode 100644 index 0000000..652b761 --- /dev/null +++ b/docs/_theme/nature/static/pygments.css @@ -0,0 +1,54 @@ +.c { color: #999988; font-style: italic } /* Comment */ +.k { font-weight: bold } /* Keyword */ +.o { font-weight: bold } /* Operator */ +.cm { color: #999988; font-style: italic } /* Comment.Multiline */ +.cp { color: #999999; font-weight: bold } /* Comment.preproc */ +.c1 { color: #999988; font-style: italic } /* Comment.Single */ +.gd { color: #000000; background-color: #ffdddd } /* Generic.Deleted */ +.ge { font-style: italic } /* Generic.Emph */ +.gr { color: #aa0000 } /* Generic.Error */ +.gh { color: #999999 } /* Generic.Heading */ +.gi { color: #000000; background-color: #ddffdd } /* Generic.Inserted */ +.go { color: #111 } /* Generic.Output */ +.gp { color: #555555 } /* Generic.Prompt */ +.gs { font-weight: bold } /* Generic.Strong */ +.gu { color: #aaaaaa } /* Generic.Subheading */ +.gt { color: #aa0000 } /* Generic.Traceback */ +.kc { font-weight: bold } /* Keyword.Constant */ +.kd { font-weight: bold } /* Keyword.Declaration */ +.kp { font-weight: bold } /* Keyword.Pseudo */ +.kr { font-weight: bold } /* Keyword.Reserved */ +.kt { color: #445588; font-weight: bold } /* Keyword.Type */ +.m { color: #009999 } /* Literal.Number */ +.s { color: #bb8844 } /* Literal.String */ +.na { color: #008080 } /* Name.Attribute */ +.nb { color: #999999 } /* Name.Builtin */ +.nc { color: #445588; font-weight: bold } /* Name.Class */ +.no { color: #ff99ff } /* Name.Constant */ +.ni { color: #800080 } /* Name.Entity */ +.ne { color: #990000; font-weight: bold } /* Name.Exception */ +.nf { color: #990000; font-weight: bold } /* Name.Function */ +.nn { color: #555555 } /* Name.Namespace */ +.nt { color: #000080 } /* Name.Tag */ +.nv { color: purple } /* Name.Variable */ +.ow { font-weight: bold } /* Operator.Word */ +.mf { color: #009999 } /* Literal.Number.Float */ +.mh { color: #009999 } /* Literal.Number.Hex */ +.mi { color: #009999 } /* Literal.Number.Integer */ +.mo { color: #009999 } /* Literal.Number.Oct */ +.sb { color: #bb8844 } /* Literal.String.Backtick */ +.sc { color: #bb8844 } /* Literal.String.Char */ +.sd { color: #bb8844 } /* Literal.String.Doc */ +.s2 { color: #bb8844 } /* Literal.String.Double */ +.se { color: #bb8844 } /* Literal.String.Escape */ +.sh { color: #bb8844 } /* Literal.String.Heredoc */ +.si { color: #bb8844 } /* Literal.String.Interpol */ +.sx { color: #bb8844 } /* Literal.String.Other */ +.sr { color: #808000 } /* Literal.String.Regex */ +.s1 { color: #bb8844 } /* Literal.String.Single */ +.ss { color: #bb8844 } /* Literal.String.Symbol */ +.bp { color: #999999 } /* Name.Builtin.Pseudo */ +.vc { color: #ff99ff } /* Name.Variable.Class */ +.vg { color: #ff99ff } /* Name.Variable.Global */ +.vi { color: #ff99ff } /* Name.Variable.Instance */ +.il { color: #009999 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/docs/_theme/nature/theme.conf b/docs/_theme/nature/theme.conf new file 
mode 100644 index 0000000..1cc4004 --- /dev/null +++ b/docs/_theme/nature/theme.conf @@ -0,0 +1,4 @@ +[theme] +inherit = basic +stylesheet = nature.css +pygments_style = tango diff --git a/docs/conf.py b/docs/conf.py new file mode 100644 index 0000000..fe68427 --- /dev/null +++ b/docs/conf.py @@ -0,0 +1,169 @@ +# -*- coding: utf-8 -*- +# +# Setuptools documentation build configuration file, created by +# sphinx-quickstart on Fri Jul 17 14:22:37 2009. +# +# This file is execfile()d with the current directory set to its containing dir. +# +# The contents of this file are pickled, so don't put values in the namespace +# that aren't pickleable (module imports are okay, they're removed automatically). +# +# Note that not all possible configuration values are present in this +# autogenerated file. +# +# All configuration values have a default; values that are commented out +# serve to show the default + +# If extensions (or modules to document with autodoc) are in another directory, +# add these directories to sys.path here. If the directory is relative to the +# documentation root, use os.path.abspath to make it absolute, like shown here. + +# Allow Sphinx to find the setup command that is imported below, as referenced above. +import os +import sys +sys.path.append(os.path.abspath('..')) + +import setup as setup_script + +# -- General configuration ----------------------------------------------------- + +# Add any Sphinx extension module names here, as strings. They can be extensions +# coming with Sphinx (named 'sphinx.ext.*') or your custom ones. +extensions = ['rst.linker', 'sphinx.ext.autosectionlabel'] + +# Add any paths that contain templates here, relative to this directory. +templates_path = ['_templates'] + +# The suffix of source filenames. +source_suffix = '.txt' + +# The master toctree document. +master_doc = 'index' + +# General information about the project. +project = 'Setuptools' +copyright = '2009-2014, The fellowship of the packaging' + +# The version info for the project you're documenting, acts as replacement for +# |version| and |release|, also used in various other places throughout the +# built documents. +# +# The short X.Y version. +version = setup_script.setup_params['version'] +# The full version, including alpha/beta/rc tags. +release = setup_script.setup_params['version'] + +# List of directories, relative to source directory, that shouldn't be searched +# for source files. +exclude_trees = [] + +# The name of the Pygments (syntax highlighting) style to use. +pygments_style = 'sphinx' + +# -- Options for HTML output --------------------------------------------------- + +# The theme to use for HTML and HTML Help pages. Major themes that come with +# Sphinx are currently 'default' and 'sphinxdoc'. +html_theme = 'nature' + +# Add any paths that contain custom themes here, relative to this directory. +html_theme_path = ['_theme'] + +# The name for this set of Sphinx documents. If None, it defaults to +# " v documentation". +html_title = "Setuptools documentation" + +# A shorter title for the navigation bar. Default is the same as html_title. +html_short_title = "Setuptools" + +# If true, SmartyPants will be used to convert quotes and dashes to +# typographically correct entities. +html_use_smartypants = True + +# Custom sidebar templates, maps document names to template names. +html_sidebars = {'index': 'indexsidebar.html'} + +# If false, no module index is generated. +html_use_modindex = False + +# If false, no index is generated. 
+html_use_index = False + +# Output file base name for HTML help builder. +htmlhelp_basename = 'Setuptoolsdoc' + +# -- Options for LaTeX output -------------------------------------------------- + +# Grouping the document tree into LaTeX files. List of tuples +# (source start file, target name, title, author, documentclass [howto/manual]). +latex_documents = [ + ('index', 'Setuptools.tex', 'Setuptools Documentation', + 'The fellowship of the packaging', 'manual'), +] + +link_files = { + '../CHANGES.rst': dict( + using=dict( + BB='https://bitbucket.org', + GH='https://github.com', + ), + replace=[ + dict( + pattern=r"(Issue )?#(?P\d+)", + url='{GH}/pypa/setuptools/issues/{issue}', + ), + dict( + pattern=r"BB Pull Request ?#(?P\d+)", + url='{BB}/pypa/setuptools/pull-request/{bb_pull_request}', + ), + dict( + pattern=r"Distribute #(?P\d+)", + url='{BB}/tarek/distribute/issue/{distribute}', + ), + dict( + pattern=r"Buildout #(?P\d+)", + url='{GH}/buildout/buildout/issues/{buildout}', + ), + dict( + pattern=r"Old Setuptools #(?P\d+)", + url='http://bugs.python.org/setuptools/issue{old_setuptools}', + ), + dict( + pattern=r"Jython #(?P\d+)", + url='http://bugs.jython.org/issue{jython}', + ), + dict( + pattern=r"Python #(?P\d+)", + url='http://bugs.python.org/issue{python}', + ), + dict( + pattern=r"Interop #(?P\d+)", + url='{GH}/pypa/interoperability-peps/issues/{interop}', + ), + dict( + pattern=r"Pip #(?P\d+)", + url='{GH}/pypa/pip/issues/{pip}', + ), + dict( + pattern=r"Packaging #(?P\d+)", + url='{GH}/pypa/packaging/issues/{packaging}', + ), + dict( + pattern=r"[Pp]ackaging (?P\d+(\.\d+)+)", + url='{GH}/pypa/packaging/blob/{packaging_ver}/CHANGELOG.rst', + ), + dict( + pattern=r"PEP[- ](?P\d+)", + url='https://www.python.org/dev/peps/pep-{pep_number:0>4}/', + ), + dict( + pattern=r"setuptools_svn #(?P\d+)", + url='{GH}/jaraco/setuptools_svn/issues/{setuptools_svn}', + ), + dict( + pattern=r"^(?m)((?Pv?\d+(\.\d+){1,2}))\n[-=]+\n", + with_scm="{text}\n{rev[timestamp]:%d %b %Y}\n", + ), + ], + ), +} diff --git a/docs/developer-guide.txt b/docs/developer-guide.txt new file mode 100644 index 0000000..8a13638 --- /dev/null +++ b/docs/developer-guide.txt @@ -0,0 +1,118 @@ +================================ +Developer's Guide for Setuptools +================================ + +If you want to know more about contributing on Setuptools, this is the place. + + +.. contents:: **Table of Contents** + + +------------------- +Recommended Reading +------------------- + +Please read `How to write the perfect pull request +`_ for some tips +on contributing to open source projects. Although the article is not +authoritative, it was authored by the maintainer of Setuptools, so reflects +his opinions and will improve the likelihood of acceptance and quality of +contribution. + +------------------ +Project Management +------------------ + +Setuptools is maintained primarily in Github at `this home +`_. Setuptools is maintained under the +Python Packaging Authority (PyPA) with several core contributors. All bugs +for Setuptools are filed and the canonical source is maintained in Github. + +User support and discussions are done through the issue tracker (for specific) +issues, through the distutils-sig mailing list, or on IRC (Freenode) at +#pypa. + +Discussions about development happen on the pypa-dev mailing list or on +`Gitter `_. + +----------------- +Authoring Tickets +----------------- + +Before authoring any source code, it's often prudent to file a ticket +describing the motivation behind making changes. 
First search to see if a +ticket already exists for your issue. If not, create one. Try to think from +the perspective of the reader. Explain what behavior you expected, what you +got instead, and what factors might have contributed to the unexpected +behavior. In Github, surround a block of code or traceback with the triple +backtick "\`\`\`" so that it is formatted nicely. + +Filing a ticket provides a forum for justification, discussion, and +clarification. The ticket provides a record of the purpose for the change and +any hard decisions that were made. It provides a single place for others to +reference when trying to understand why the software operates the way it does +or why certain changes were made. + +Setuptools makes extensive use of hyperlinks to tickets in the changelog so +that system integrators and other users can get a quick summary, but then +jump to the in-depth discussion about any subject referenced. + +----------- +Source Code +----------- + +Grab the code at Github:: + + $ git checkout https://github.com/pypa/setuptools + +If you want to contribute changes, we recommend you fork the repository on +Github, commit the changes to your repository, and then make a pull request +on Github. If you make some changes, don't forget to: + +- add a note in CHANGES.rst + +Please commit all changes in the 'master' branch against the latest available +commit or for bug-fixes, against an earlier commit or release in which the +bug occurred. + +If you find yourself working on more than one issue at a time, Setuptools +generally prefers Git-style branches, so use Mercurial bookmarks or Git +branches or multiple forks to maintain separate efforts. + +The Continuous Integration tests that validate every release are run +from this repository. + +For posterity, the old `Bitbucket mirror +`_ is available. + +------- +Testing +------- + +The primary tests are run using tox. To run the tests, first make +sure you have tox installed, then invoke it:: + + $ tox + +Under continuous integration, additional tests may be run. See the +``.travis.yml`` file for full details on the tests run under Travis-CI. + +------------------- +Semantic Versioning +------------------- + +Setuptools follows ``semver``. + +.. explain value of reflecting meaning in versions. + +---------------------- +Building Documentation +---------------------- + +Setuptools relies on the Sphinx system for building documentation. +To accommodate RTD, docs must be built from the docs/ directory. + +To build them, you need to have installed the requirements specified +in docs/requirements.txt. One way to do this is to use rwt: + + setuptools/docs$ python -m rwt -r requirements.txt -- -m sphinx . html diff --git a/docs/development.txt b/docs/development.txt new file mode 100644 index 0000000..455f038 --- /dev/null +++ b/docs/development.txt @@ -0,0 +1,35 @@ +------------------------- +Development on Setuptools +------------------------- + +Setuptools is maintained by the Python community under the Python Packaging +Authority (PyPA) and led by Jason R. Coombs. + +This document describes the process by which Setuptools is developed. +This document assumes the reader has some passing familiarity with +*using* setuptools, the ``pkg_resources`` module, and EasyInstall. It +does not attempt to explain basic concepts like inter-project +dependencies, nor does it contain detailed lexical syntax for most +file formats. 
Neither does it explain concepts like "namespace +packages" or "resources" in any detail, as all of these subjects are +covered at length in the setuptools developer's guide and the +``pkg_resources`` reference manual. + +Instead, this is **internal** documentation for how those concepts and +features are *implemented* in concrete terms. It is intended for people +who are working on the setuptools code base, who want to be able to +troubleshoot setuptools problems, want to write code that reads the file +formats involved, or want to otherwise tinker with setuptools-generated +files and directories. + +Note, however, that these are all internal implementation details and +are therefore subject to change; stick to the published API if you don't +want to be responsible for keeping your code from breaking when +setuptools changes. You have been warned. + +.. toctree:: + :maxdepth: 1 + + developer-guide + formats + releases diff --git a/docs/easy_install.txt b/docs/easy_install.txt new file mode 100644 index 0000000..bd9f0e8 --- /dev/null +++ b/docs/easy_install.txt @@ -0,0 +1,1625 @@ +============ +Easy Install +============ + +Easy Install is a python module (``easy_install``) bundled with ``setuptools`` +that lets you automatically download, build, install, and manage Python +packages. + +Please share your experiences with us! If you encounter difficulty installing +a package, please contact us via the `distutils mailing list +`_. (Note: please DO NOT send +private email directly to the author of setuptools; it will be discarded. The +mailing list is a searchable archive of previously-asked and answered +questions; you should begin your research there before reporting something as a +bug -- and then do so via list discussion first.) + +(Also, if you'd like to learn about how you can use ``setuptools`` to make your +own packages work better with EasyInstall, or provide EasyInstall-like features +without requiring your users to use EasyInstall directly, you'll probably want +to check out the full `setuptools`_ documentation as well.) + +.. contents:: **Table of Contents** + + +Using "Easy Install" +==================== + + +.. _installation instructions: + +Installing "Easy Install" +------------------------- + +Please see the `setuptools PyPI page `_ +for download links and basic installation instructions for each of the +supported platforms. + +You will need at least Python 2.6. An ``easy_install`` script will be +installed in the normal location for Python scripts on your platform. + +Note that the instructions on the setuptools PyPI page assume that you are +are installing to Python's primary ``site-packages`` directory. If this is +not the case, you should consult the section below on `Custom Installation +Locations`_ before installing. (And, on Windows, you should not use the +``.exe`` installer when installing to an alternate location.) + +Note that ``easy_install`` normally works by downloading files from the +internet. If you are behind an NTLM-based firewall that prevents Python +programs from accessing the net directly, you may wish to first install and use +the `APS proxy server `_, which lets you get past such +firewalls in the same way that your web browser(s) do. + +(Alternately, if you do not wish easy_install to actually download anything, you +can restrict it from doing so with the ``--allow-hosts`` option; see the +sections on `restricting downloads with --allow-hosts`_ and `command-line +options`_ for more details.) 
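Once installation completes, it can be worth confirming which copy of setuptools
Python is actually picking up before going further. A minimal check from the
interpreter (the paths printed will of course vary by platform and install
location)::

    import sys
    import pkg_resources

    # Where the active setuptools distribution lives, and the directories
    # Python searches for installed packages.
    print(pkg_resources.get_distribution('setuptools').location)
    print(sys.path)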
+ + +Troubleshooting +~~~~~~~~~~~~~~~ + +If EasyInstall/setuptools appears to install correctly, and you can run the +``easy_install`` command but it fails with an ``ImportError``, the most likely +cause is that you installed to a location other than ``site-packages``, +without taking any of the steps described in the `Custom Installation +Locations`_ section below. Please see that section and follow the steps to +make sure that your custom location will work correctly. Then re-install. + +Similarly, if you can run ``easy_install``, and it appears to be installing +packages, but then you can't import them, the most likely issue is that you +installed EasyInstall correctly but are using it to install packages to a +non-standard location that hasn't been properly prepared. Again, see the +section on `Custom Installation Locations`_ for more details. + + +Windows Notes +~~~~~~~~~~~~~ + +Installing setuptools will provide an ``easy_install`` command according to +the techniques described in `Executables and Launchers`_. If the +``easy_install`` command is not available after installation, that section +provides details on how to configure Windows to make the commands available. + + +Downloading and Installing a Package +------------------------------------ + +For basic use of ``easy_install``, you need only supply the filename or URL of +a source distribution or .egg file (`Python Egg`__). + +__ http://peak.telecommunity.com/DevCenter/PythonEggs + +**Example 1**. Install a package by name, searching PyPI for the latest +version, and automatically downloading, building, and installing it:: + + easy_install SQLObject + +**Example 2**. Install or upgrade a package by name and version by finding +links on a given "download page":: + + easy_install -f http://pythonpaste.org/package_index.html SQLObject + +**Example 3**. Download a source distribution from a specified URL, +automatically building and installing it:: + + easy_install http://example.com/path/to/MyPackage-1.2.3.tgz + +**Example 4**. Install an already-downloaded .egg file:: + + easy_install /my_downloads/OtherPackage-3.2.1-py2.3.egg + +**Example 5**. Upgrade an already-installed package to the latest version +listed on PyPI:: + + easy_install --upgrade PyProtocols + +**Example 6**. Install a source distribution that's already downloaded and +extracted in the current directory (New in 0.5a9):: + + easy_install . + +**Example 7**. (New in 0.6a1) Find a source distribution or Subversion +checkout URL for a package, and extract it or check it out to +``~/projects/sqlobject`` (the name will always be in all-lowercase), where it +can be examined or edited. (The package will not be installed, but it can +easily be installed with ``easy_install ~/projects/sqlobject``. See `Editing +and Viewing Source Packages`_ below for more info.):: + + easy_install --editable --build-directory ~/projects SQLObject + +**Example 7**. (New in 0.6.11) Install a distribution within your home dir:: + + easy_install --user SQLAlchemy + +Easy Install accepts URLs, filenames, PyPI package names (i.e., ``distutils`` +"distribution" names), and package+version specifiers. In each case, it will +attempt to locate the latest available version that meets your criteria. + +When downloading or processing downloaded files, Easy Install recognizes +distutils source distribution files with extensions of .tgz, .tar, .tar.gz, +.tar.bz2, or .zip. And of course it handles already-built .egg +distributions as well as ``.win32.exe`` installers built using distutils. 
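To make the filename handling above concrete, here is a small illustrative
sketch of the kind of classification involved; it is not EasyInstall's actual
code, and the helper name is invented purely for this example::

    # Source-archive extensions listed above; the mapping is illustrative only.
    SDIST_EXTS = ('.tgz', '.tar', '.tar.gz', '.tar.bz2', '.zip')

    def classify_download(filename):
        """Guess what kind of distribution a downloaded file is."""
        name = filename.lower()
        if name.endswith('.egg'):
            return 'built egg'
        if name.endswith('.win32.exe'):
            return 'distutils Windows installer'
        if name.endswith(SDIST_EXTS):
            return 'source distribution'
        return 'unrecognized'

    print(classify_download('MyPackage-1.2.3.tar.gz'))  # source distribution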
+ +By default, packages are installed to the running Python installation's +``site-packages`` directory, unless you provide the ``-d`` or ``--install-dir`` +option to specify an alternative directory, or specify an alternate location +using distutils configuration files. (See `Configuration Files`_, below.) + +By default, any scripts included with the package are installed to the running +Python installation's standard script installation location. However, if you +specify an installation directory via the command line or a config file, then +the default directory for installing scripts will be the same as the package +installation directory, to ensure that the script will have access to the +installed package. You can override this using the ``-s`` or ``--script-dir`` +option. + +Installed packages are added to an ``easy-install.pth`` file in the install +directory, so that Python will always use the most-recently-installed version +of the package. If you would like to be able to select which version to use at +runtime, you should use the ``-m`` or ``--multi-version`` option. + + +Upgrading a Package +------------------- + +You don't need to do anything special to upgrade a package: just install the +new version, either by requesting a specific version, e.g.:: + + easy_install "SomePackage==2.0" + +a version greater than the one you have now:: + + easy_install "SomePackage>2.0" + +using the upgrade flag, to find the latest available version on PyPI:: + + easy_install --upgrade SomePackage + +or by using a download page, direct download URL, or package filename:: + + easy_install -f http://example.com/downloads ExamplePackage + + easy_install http://example.com/downloads/ExamplePackage-2.0-py2.4.egg + + easy_install my_downloads/ExamplePackage-2.0.tgz + +If you're using ``-m`` or ``--multi-version`` , using the ``require()`` +function at runtime automatically selects the newest installed version of a +package that meets your version criteria. So, installing a newer version is +the only step needed to upgrade such packages. + +If you're installing to a directory on PYTHONPATH, or a configured "site" +directory (and not using ``-m``), installing a package automatically replaces +any previous version in the ``easy-install.pth`` file, so that Python will +import the most-recently installed version by default. So, again, installing +the newer version is the only upgrade step needed. + +If you haven't suppressed script installation (using ``--exclude-scripts`` or +``-x``), then the upgraded version's scripts will be installed, and they will +be automatically patched to ``require()`` the corresponding version of the +package, so that you can use them even if they are installed in multi-version +mode. + +``easy_install`` never actually deletes packages (unless you're installing a +package with the same name and version number as an existing package), so if +you want to get rid of older versions of a package, please see `Uninstalling +Packages`_, below. + + +Changing the Active Version +--------------------------- + +If you've upgraded a package, but need to revert to a previously-installed +version, you can do so like this:: + + easy_install PackageName==1.2.3 + +Where ``1.2.3`` is replaced by the exact version number you wish to switch to. +If a package matching the requested name and version is not already installed +in a directory on ``sys.path``, it will be located via PyPI and installed. 
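If you only need the older version for a single program rather than as the new
default, a script can also pin it at runtime with ``pkg_resources`` (a minimal
sketch, assuming the requested version is already installed)::

    import pkg_resources

    # Puts the requested, already-installed version on sys.path for this
    # process only; the default recorded in easy-install.pth is untouched.
    pkg_resources.require('PackageName==1.2.3')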
+ +If you'd like to switch to the latest installed version of ``PackageName``, you +can do so like this:: + + easy_install PackageName + +This will activate the latest installed version. (Note: if you have set any +``find_links`` via distutils configuration files, those download pages will be +checked for the latest available version of the package, and it will be +downloaded and installed if it is newer than your current version.) + +Note that changing the active version of a package will install the newly +active version's scripts, unless the ``--exclude-scripts`` or ``-x`` option is +specified. + + +Uninstalling Packages +--------------------- + +If you have replaced a package with another version, then you can just delete +the package(s) you don't need by deleting the PackageName-versioninfo.egg file +or directory (found in the installation directory). + +If you want to delete the currently installed version of a package (or all +versions of a package), you should first run:: + + easy_install -m PackageName + +This will ensure that Python doesn't continue to search for a package you're +planning to remove. After you've done this, you can safely delete the .egg +files or directories, along with any scripts you wish to remove. + + +Managing Scripts +---------------- + +Whenever you install, upgrade, or change versions of a package, EasyInstall +automatically installs the scripts for the selected package version, unless +you tell it not to with ``-x`` or ``--exclude-scripts``. If any scripts in +the script directory have the same name, they are overwritten. + +Thus, you do not normally need to manually delete scripts for older versions of +a package, unless the newer version of the package does not include a script +of the same name. However, if you are completely uninstalling a package, you +may wish to manually delete its scripts. + +EasyInstall's default behavior means that you can normally only run scripts +from one version of a package at a time. If you want to keep multiple versions +of a script available, however, you can simply use the ``--multi-version`` or +``-m`` option, and rename the scripts that EasyInstall creates. This works +because EasyInstall installs scripts as short code stubs that ``require()`` the +matching version of the package the script came from, so renaming the script +has no effect on what it executes. + +For example, suppose you want to use two versions of the ``rst2html`` tool +provided by the `docutils `_ package. You might +first install one version:: + + easy_install -m docutils==0.3.9 + +then rename the ``rst2html.py`` to ``r2h_039``, and install another version:: + + easy_install -m docutils==0.3.10 + +This will create another ``rst2html.py`` script, this one using docutils +version 0.3.10 instead of 0.3.9. You now have two scripts, each using a +different version of the package. (Notice that we used ``-m`` for both +installations, so that Python won't lock us out of using anything but the most +recently-installed version of the package.) + + +Executables and Launchers +------------------------- + +On Unix systems, scripts are installed with as natural files with a "#!" +header and no extension and they launch under the Python version indicated in +the header. + +On Windows, there is no mechanism to "execute" files without extensions, so +EasyInstall provides two techniques to mirror the Unix behavior. The behavior +is indicated by the SETUPTOOLS_LAUNCHER environment variable, which may be +"executable" (default) or "natural". 
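Either way, what ultimately runs is the same sort of short Python stub
described under `Managing Scripts`_ above. Roughly, a generated script looks
something like this (a simplified sketch, not the exact file EasyInstall
writes)::

    #!python
    # Pin the package version this script belongs to, then run the matching
    # script shipped with that distribution.
    __requires__ = 'docutils==0.3.9'
    import pkg_resources
    pkg_resources.run_script('docutils==0.3.9', 'rst2html.py')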
+ +Regardless of the technique used, the script(s) will be installed to a Scripts +directory (by default in the Python installation directory). It is recommended +for EasyInstall that you ensure this directory is in the PATH environment +variable. The easiest way to ensure the Scripts directory is in the PATH is +to run ``Tools\Scripts\win_add2path.py`` from the Python directory (requires +Python 2.6 or later). + +Note that instead of changing your ``PATH`` to include the Python scripts +directory, you can also retarget the installation location for scripts so they +go on a directory that's already on the ``PATH``. For more information see +`Command-Line Options`_ and `Configuration Files`_. During installation, +pass command line options (such as ``--script-dir``) to +``ez_setup.py`` to control where ``easy_install.exe`` will be installed. + + +Windows Executable Launcher +~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +If the "executable" launcher is used, EasyInstall will create a '.exe' +launcher of the same name beside each installed script (including +``easy_install`` itself). These small .exe files launch the script of the +same name using the Python version indicated in the '#!' header. + +This behavior is currently default. To force +the use of executable launchers, set ``SETUPTOOLS_LAUNCHER`` to "executable". + +Natural Script Launcher +~~~~~~~~~~~~~~~~~~~~~~~ + +EasyInstall also supports deferring to an external launcher such as +`pylauncher `_ for launching scripts. +Enable this experimental functionality by setting the +``SETUPTOOLS_LAUNCHER`` environment variable to "natural". EasyInstall will +then install scripts as simple +scripts with a .pya (or .pyw) extension appended. If these extensions are +associated with the pylauncher and listed in the PATHEXT environment variable, +these scripts can then be invoked simply and directly just like any other +executable. This behavior may become default in a future version. + +EasyInstall uses the .pya extension instead of simply +the typical '.py' extension. This distinct extension is necessary to prevent +Python +from treating the scripts as importable modules (where name conflicts exist). +Current releases of pylauncher do not yet associate with .pya files by +default, but future versions should do so. + + +Tips & Techniques +----------------- + +Multiple Python Versions +~~~~~~~~~~~~~~~~~~~~~~~~ + +EasyInstall installs itself under two names: +``easy_install`` and ``easy_install-N.N``, where ``N.N`` is the Python version +used to install it. Thus, if you install EasyInstall for both Python 3.2 and +2.7, you can use the ``easy_install-3.2`` or ``easy_install-2.7`` scripts to +install packages for the respective Python version. + +Setuptools also supplies easy_install as a runnable module which may be +invoked using ``python -m easy_install`` for any Python with Setuptools +installed. + +Restricting Downloads with ``--allow-hosts`` +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +You can use the ``--allow-hosts`` (``-H``) option to restrict what domains +EasyInstall will look for links and downloads on. ``--allow-hosts=None`` +prevents downloading altogether. You can also use wildcards, for example +to restrict downloading to hosts in your own intranet. See the section below +on `Command-Line Options`_ for more details on the ``--allow-hosts`` option. + +By default, there are no host restrictions in effect, but you can change this +default by editing the appropriate `configuration files`_ and adding: + +.. 
code-block:: ini + + [easy_install] + allow_hosts = *.myintranet.example.com,*.python.org + +The above example would then allow downloads only from hosts in the +``python.org`` and ``myintranet.example.com`` domains, unless overridden on the +command line. + + +Installing on Un-networked Machines +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Just copy the eggs or source packages you need to a directory on the target +machine, then use the ``-f`` or ``--find-links`` option to specify that +directory's location. For example:: + + easy_install -H None -f somedir SomePackage + +will attempt to install SomePackage using only eggs and source packages found +in ``somedir`` and disallowing all remote access. You should of course make +sure you have all of SomePackage's dependencies available in somedir. + +If you have another machine of the same operating system and library versions +(or if the packages aren't platform-specific), you can create the directory of +eggs using a command like this:: + + easy_install -zmaxd somedir SomePackage + +This will tell EasyInstall to put zipped eggs or source packages for +SomePackage and all its dependencies into ``somedir``, without creating any +scripts or .pth files. You can then copy the contents of ``somedir`` to the +target machine. (``-z`` means zipped eggs, ``-m`` means multi-version, which +prevents .pth files from being used, ``-a`` means to copy all the eggs needed, +even if they're installed elsewhere on the machine, and ``-d`` indicates the +directory to place the eggs in.) + +You can also build the eggs from local development packages that were installed +with the ``setup.py develop`` command, by including the ``-l`` option, e.g.:: + + easy_install -zmaxld somedir SomePackage + +This will use locally-available source distributions to build the eggs. + + +Packaging Others' Projects As Eggs +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Need to distribute a package that isn't published in egg form? You can use +EasyInstall to build eggs for a project. You'll want to use the ``--zip-ok``, +``--exclude-scripts``, and possibly ``--no-deps`` options (``-z``, ``-x`` and +``-N``, respectively). Use ``-d`` or ``--install-dir`` to specify the location +where you'd like the eggs placed. By placing them in a directory that is +published to the web, you can then make the eggs available for download, either +in an intranet or to the internet at large. + +If someone distributes a package in the form of a single ``.py`` file, you can +wrap it in an egg by tacking an ``#egg=name-version`` suffix on the file's URL. +So, something like this:: + + easy_install -f "http://some.example.com/downloads/foo.py#egg=foo-1.0" foo + +will install the package as an egg, and this:: + + easy_install -zmaxd. \ + -f "http://some.example.com/downloads/foo.py#egg=foo-1.0" foo + +will create a ``.egg`` file in the current directory. + + +Creating your own Package Index +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +In addition to local directories and the Python Package Index, EasyInstall can +find download links on most any web page whose URL is given to the ``-f`` +(``--find-links``) option. In the simplest case, you can simply have a web +page with links to eggs or Python source packages, even an automatically +generated directory listing (such as the Apache web server provides). + +If you are setting up an intranet site for package downloads, you may want to +configure the target machines to use your download site by default, adding +something like this to their `configuration files`_: + +.. 
code-block:: ini + + [easy_install] + find_links = http://mypackages.example.com/somedir/ + http://turbogears.org/download/ + http://peak.telecommunity.com/dist/ + +As you can see, you can list multiple URLs separated by whitespace, continuing +on multiple lines if necessary (as long as the subsequent lines are indented. + +If you are more ambitious, you can also create an entirely custom package index +or PyPI mirror. See the ``--index-url`` option under `Command-Line Options`_, +below, and also the section on `Package Index "API"`_. + + +Password-Protected Sites +------------------------ + +If a site you want to download from is password-protected using HTTP "Basic" +authentication, you can specify your credentials in the URL, like so:: + + http://some_userid:some_password@some.example.com/some_path/ + +You can do this with both index page URLs and direct download URLs. As long +as any HTML pages read by easy_install use *relative* links to point to the +downloads, the same user ID and password will be used to do the downloading. + +Using .pypirc Credentials +------------------------- + +In additional to supplying credentials in the URL, ``easy_install`` will also +honor credentials if present in the .pypirc file. Teams maintaining a private +repository of packages may already have defined access credentials for +uploading packages according to the distutils documentation. ``easy_install`` +will attempt to honor those if present. Refer to the distutils documentation +for Python 2.5 or later for details on the syntax. + +Controlling Build Options +~~~~~~~~~~~~~~~~~~~~~~~~~ + +EasyInstall respects standard distutils `Configuration Files`_, so you can use +them to configure build options for packages that it installs from source. For +example, if you are on Windows using the MinGW compiler, you can configure the +default compiler by putting something like this: + +.. code-block:: ini + + [build] + compiler = mingw32 + +into the appropriate distutils configuration file. In fact, since this is just +normal distutils configuration, it will affect any builds using that config +file, not just ones done by EasyInstall. For example, if you add those lines +to ``distutils.cfg`` in the ``distutils`` package directory, it will be the +default compiler for *all* packages you build. See `Configuration Files`_ +below for a list of the standard configuration file locations, and links to +more documentation on using distutils configuration files. + + +Editing and Viewing Source Packages +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Sometimes a package's source distribution contains additional documentation, +examples, configuration files, etc., that are not part of its actual code. If +you want to be able to examine these files, you can use the ``--editable`` +option to EasyInstall, and EasyInstall will look for a source distribution +or Subversion URL for the package, then download and extract it or check it out +as a subdirectory of the ``--build-directory`` you specify. If you then wish +to install the package after editing or configuring it, you can do so by +rerunning EasyInstall with that directory as the target. + +Note that using ``--editable`` stops EasyInstall from actually building or +installing the package; it just finds, obtains, and possibly unpacks it for +you. 
This allows you to make changes to the package if necessary, and to +either install it in development mode using ``setup.py develop`` (if the +package uses setuptools, that is), or by running ``easy_install projectdir`` +(where ``projectdir`` is the subdirectory EasyInstall created for the +downloaded package. + +In order to use ``--editable`` (``-e`` for short), you *must* also supply a +``--build-directory`` (``-b`` for short). The project will be placed in a +subdirectory of the build directory. The subdirectory will have the same +name as the project itself, but in all-lowercase. If a file or directory of +that name already exists, EasyInstall will print an error message and exit. + +Also, when using ``--editable``, you cannot use URLs or filenames as arguments. +You *must* specify project names (and optional version requirements) so that +EasyInstall knows what directory name(s) to create. If you need to force +EasyInstall to use a particular URL or filename, you should specify it as a +``--find-links`` item (``-f`` for short), and then also specify +the project name, e.g.:: + + easy_install -eb ~/projects \ + -fhttp://prdownloads.sourceforge.net/ctypes/ctypes-0.9.6.tar.gz?download \ + ctypes==0.9.6 + + +Dealing with Installation Conflicts +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +(NOTE: As of 0.6a11, this section is obsolete; it is retained here only so that +people using older versions of EasyInstall can consult it. As of version +0.6a11, installation conflicts are handled automatically without deleting the +old or system-installed packages, and without ignoring the issue. Instead, +eggs are automatically shifted to the front of ``sys.path`` using special +code added to the ``easy-install.pth`` file. So, if you are using version +0.6a11 or better of setuptools, you do not need to worry about conflicts, +and the following issues do not apply to you.) + +EasyInstall installs distributions in a "managed" way, such that each +distribution can be independently activated or deactivated on ``sys.path``. +However, packages that were not installed by EasyInstall are "unmanaged", +in that they usually live all in one directory and cannot be independently +activated or deactivated. + +As a result, if you are using EasyInstall to upgrade an existing package, or +to install a package with the same name as an existing package, EasyInstall +will warn you of the conflict. (This is an improvement over ``setup.py +install``, because the ``distutils`` just install new packages on top of old +ones, possibly combining two unrelated packages or leaving behind modules that +have been deleted in the newer version of the package.) + +EasyInstall will stop the installation if it detects a conflict +between an existing, "unmanaged" package, and a module or package in any of +the distributions you're installing. It will display a list of all of the +existing files and directories that would need to be deleted for the new +package to be able to function correctly. To proceed, you must manually +delete these conflicting files and directories and re-run EasyInstall. + +Of course, once you've replaced all of your existing "unmanaged" packages with +versions managed by EasyInstall, you won't have any more conflicts to worry +about! + + +Compressed Installation +~~~~~~~~~~~~~~~~~~~~~~~ + +EasyInstall tries to install packages in zipped form, if it can. 
Zipping +packages can improve Python's overall import performance if you're not using +the ``--multi-version`` option, because Python processes zipfile entries on +``sys.path`` much faster than it does directories. + +As of version 0.5a9, EasyInstall analyzes packages to determine whether they +can be safely installed as a zipfile, and then acts on its analysis. (Previous +versions would not install a package as a zipfile unless you used the +``--zip-ok`` option.) + +The current analysis approach is fairly conservative; it currently looks for: + + * Any use of the ``__file__`` or ``__path__`` variables (which should be + replaced with ``pkg_resources`` API calls) + + * Possible use of ``inspect`` functions that expect to manipulate source files + (e.g. ``inspect.getsource()``) + + * Top-level modules that might be scripts used with ``python -m`` (Python 2.4) + +If any of the above are found in the package being installed, EasyInstall will +assume that the package cannot be safely run from a zipfile, and unzip it to +a directory instead. You can override this analysis with the ``-zip-ok`` flag, +which will tell EasyInstall to install the package as a zipfile anyway. Or, +you can use the ``--always-unzip`` flag, in which case EasyInstall will always +unzip, even if its analysis says the package is safe to run as a zipfile. + +Normally, however, it is simplest to let EasyInstall handle the determination +of whether to zip or unzip, and only specify overrides when needed to work +around a problem. If you find you need to override EasyInstall's guesses, you +may want to contact the package author and the EasyInstall maintainers, so that +they can make appropriate changes in future versions. + +(Note: If a package uses ``setuptools`` in its setup script, the package author +has the option to declare the package safe or unsafe for zipped usage via the +``zip_safe`` argument to ``setup()``. If the package author makes such a +declaration, EasyInstall believes the package's author and does not perform its +own analysis. However, your command-line option, if any, will still override +the package author's choice.) + + +Reference Manual +================ + +Configuration Files +------------------- + +(New in 0.4a2) + +You may specify default options for EasyInstall using the standard +distutils configuration files, under the command heading ``easy_install``. +EasyInstall will look first for a ``setup.cfg`` file in the current directory, +then a ``~/.pydistutils.cfg`` or ``$HOME\\pydistutils.cfg`` (on Unix-like OSes +and Windows, respectively), and finally a ``distutils.cfg`` file in the +``distutils`` package directory. Here's a simple example: + +.. code-block:: ini + + [easy_install] + + # set the default location to install packages + install_dir = /home/me/lib/python + + # Notice that indentation can be used to continue an option + # value; this is especially useful for the "--find-links" + # option, which tells easy_install to use download links on + # these pages before consulting PyPI: + # + find_links = http://sqlobject.org/ + http://peak.telecommunity.com/dist/ + +In addition to accepting configuration for its own options under +``[easy_install]``, EasyInstall also respects defaults specified for other +distutils commands. For example, if you don't set an ``install_dir`` for +``[easy_install]``, but *have* set an ``install_lib`` for the ``[install]`` +command, this will become EasyInstall's default installation directory. 
Thus, +if you are already using distutils configuration files to set default install +locations, build options, etc., EasyInstall will respect your existing settings +until and unless you override them explicitly in an ``[easy_install]`` section. + +For more information, see also the current Python documentation on the `use and +location of distutils configuration files `_. + +Notice that ``easy_install`` will use the ``setup.cfg`` from the current +working directory only if it was triggered from ``setup.py`` through the +``install_requires`` option. The standalone command will not use that file. + +Command-Line Options +-------------------- + +``--zip-ok, -z`` + Install all packages as zip files, even if they are marked as unsafe for + running as a zipfile. This can be useful when EasyInstall's analysis + of a non-setuptools package is too conservative, but keep in mind that + the package may not work correctly. (Changed in 0.5a9; previously this + option was required in order for zipped installation to happen at all.) + +``--always-unzip, -Z`` + Don't install any packages as zip files, even if the packages are marked + as safe for running as a zipfile. This can be useful if a package does + something unsafe, but not in a way that EasyInstall can easily detect. + EasyInstall's default analysis is currently very conservative, however, so + you should only use this option if you've had problems with a particular + package, and *after* reporting the problem to the package's maintainer and + to the EasyInstall maintainers. + + (Note: the ``-z/-Z`` options only affect the installation of newly-built + or downloaded packages that are not already installed in the target + directory; if you want to convert an existing installed version from + zipped to unzipped or vice versa, you'll need to delete the existing + version first, and re-run EasyInstall.) + +``--multi-version, -m`` + "Multi-version" mode. Specifying this option prevents ``easy_install`` from + adding an ``easy-install.pth`` entry for the package being installed, and + if an entry for any version the package already exists, it will be removed + upon successful installation. In multi-version mode, no specific version of + the package is available for importing, unless you use + ``pkg_resources.require()`` to put it on ``sys.path``. This can be as + simple as:: + + from pkg_resources import require + require("SomePackage", "OtherPackage", "MyPackage") + + which will put the latest installed version of the specified packages on + ``sys.path`` for you. (For more advanced uses, like selecting specific + versions and enabling optional dependencies, see the ``pkg_resources`` API + doc.) + + Changed in 0.6a10: this option is no longer silently enabled when + installing to a non-PYTHONPATH, non-"site" directory. You must always + explicitly use this option if you want it to be active. + +``--upgrade, -U`` (New in 0.5a4) + By default, EasyInstall only searches online if a project/version + requirement can't be met by distributions already installed + on sys.path or the installation directory. However, if you supply the + ``--upgrade`` or ``-U`` flag, EasyInstall will always check the package + index and ``--find-links`` URLs before selecting a version to install. In + this way, you can force EasyInstall to use the latest available version of + any package it installs (subject to any version requirements that might + exclude such later versions). + +``--install-dir=DIR, -d DIR`` + Set the installation directory. 
It is up to you to ensure that this + directory is on ``sys.path`` at runtime, and to use + ``pkg_resources.require()`` to enable the installed package(s) that you + need. + + (New in 0.4a2) If this option is not directly specified on the command line + or in a distutils configuration file, the distutils default installation + location is used. Normally, this would be the ``site-packages`` directory, + but if you are using distutils configuration files, setting things like + ``prefix`` or ``install_lib``, then those settings are taken into + account when computing the default installation directory, as is the + ``--prefix`` option. + +``--script-dir=DIR, -s DIR`` + Set the script installation directory. If you don't supply this option + (via the command line or a configuration file), but you *have* supplied + an ``--install-dir`` (via command line or config file), then this option + defaults to the same directory, so that the scripts will be able to find + their associated package installation. Otherwise, this setting defaults + to the location where the distutils would normally install scripts, taking + any distutils configuration file settings into account. + +``--exclude-scripts, -x`` + Don't install scripts. This is useful if you need to install multiple + versions of a package, but do not want to reset the version that will be + run by scripts that are already installed. + +``--user`` (New in 0.6.11) + Use the user-site-packages as specified in :pep:`370` + instead of the global site-packages. + +``--always-copy, -a`` (New in 0.5a4) + Copy all needed distributions to the installation directory, even if they + are already present in a directory on sys.path. In older versions of + EasyInstall, this was the default behavior, but now you must explicitly + request it. By default, EasyInstall will no longer copy such distributions + from other sys.path directories to the installation directory, unless you + explicitly gave the distribution's filename on the command line. + + Note that as of 0.6a10, using this option excludes "system" and + "development" eggs from consideration because they can't be reliably + copied. This may cause EasyInstall to choose an older version of a package + than what you expected, or it may cause downloading and installation of a + fresh copy of something that's already installed. You will see warning + messages for any eggs that EasyInstall skips, before it falls back to an + older version or attempts to download a fresh copy. + +``--find-links=URLS_OR_FILENAMES, -f URLS_OR_FILENAMES`` + Scan the specified "download pages" or directories for direct links to eggs + or other distributions. Any existing file or directory names or direct + download URLs are immediately added to EasyInstall's search cache, and any + indirect URLs (ones that don't point to eggs or other recognized archive + formats) are added to a list of additional places to search for download + links. As soon as EasyInstall has to go online to find a package (either + because it doesn't exist locally, or because ``--upgrade`` or ``-U`` was + used), the specified URLs will be downloaded and scanned for additional + direct links. + + Eggs and archives found by way of ``--find-links`` are only downloaded if + they are needed to meet a requirement specified on the command line; links + to unneeded packages are ignored. 
+ + If all requested packages can be found using links on the specified + download pages, the Python Package Index will not be consulted unless you + also specified the ``--upgrade`` or ``-U`` option. + + (Note: if you want to refer to a local HTML file containing links, you must + use a ``file:`` URL, as filenames that do not refer to a directory, egg, or + archive are ignored.) + + You may specify multiple URLs or file/directory names with this option, + separated by whitespace. Note that on the command line, you will probably + have to surround the URL list with quotes, so that it is recognized as a + single option value. You can also specify URLs in a configuration file; + see `Configuration Files`_, above. + + Changed in 0.6a10: previously all URLs and directories passed to this + option were scanned as early as possible, but from 0.6a10 on, only + directories and direct archive links are scanned immediately; URLs are not + retrieved unless a package search was already going to go online due to a + package not being available locally, or due to the use of the ``--update`` + or ``-U`` option. + +``--no-find-links`` Blocks the addition of any link. + This parameter is useful if you want to avoid adding links defined in a + project easy_install is installing (whether it's a requested project or a + dependency). When used, ``--find-links`` is ignored. + + Added in Distribute 0.6.11 and Setuptools 0.7. + +``--index-url=URL, -i URL`` (New in 0.4a1; default changed in 0.6c7) + Specifies the base URL of the Python Package Index. The default is + https://pypi.python.org/simple if not specified. When a package is requested + that is not locally available or linked from a ``--find-links`` download + page, the package index will be searched for download pages for the needed + package, and those download pages will be searched for links to download + an egg or source distribution. + +``--editable, -e`` (New in 0.6a1) + Only find and download source distributions for the specified projects, + unpacking them to subdirectories of the specified ``--build-directory``. + EasyInstall will not actually build or install the requested projects or + their dependencies; it will just find and extract them for you. See + `Editing and Viewing Source Packages`_ above for more details. + +``--build-directory=DIR, -b DIR`` (UPDATED in 0.6a1) + Set the directory used to build source packages. If a package is built + from a source distribution or checkout, it will be extracted to a + subdirectory of the specified directory. The subdirectory will have the + same name as the extracted distribution's project, but in all-lowercase. + If a file or directory of that name already exists in the given directory, + a warning will be printed to the console, and the build will take place in + a temporary directory instead. + + This option is most useful in combination with the ``--editable`` option, + which forces EasyInstall to *only* find and extract (but not build and + install) source distributions. See `Editing and Viewing Source Packages`_, + above, for more information. + +``--verbose, -v, --quiet, -q`` (New in 0.4a4) + Control the level of detail of EasyInstall's progress messages. The + default detail level is "info", which prints information only about + relatively time-consuming operations like running a setup script, unpacking + an archive, or retrieving a URL. Using ``-q`` or ``--quiet`` drops the + detail level to "warn", which will only display installation reports, + warnings, and errors. 
Using ``-v`` or ``--verbose`` increases the detail + level to include individual file-level operations, link analysis messages, + and distutils messages from any setup scripts that get run. If you include + the ``-v`` option more than once, the second and subsequent uses are passed + down to any setup scripts, increasing the verbosity of their reporting as + well. + +``--dry-run, -n`` (New in 0.4a4) + Don't actually install the package or scripts. This option is passed down + to any setup scripts run, so packages should not actually build either. + This does *not* skip downloading, nor does it skip extracting source + distributions to a temporary/build directory. + +``--optimize=LEVEL``, ``-O LEVEL`` (New in 0.4a4) + If you are installing from a source distribution, and are *not* using the + ``--zip-ok`` option, this option controls the optimization level for + compiling installed ``.py`` files to ``.pyo`` files. It does not affect + the compilation of modules contained in ``.egg`` files, only those in + ``.egg`` directories. The optimization level can be set to 0, 1, or 2; + the default is 0 (unless it's set under ``install`` or ``install_lib`` in + one of your distutils configuration files). + +``--record=FILENAME`` (New in 0.5a4) + Write a record of all installed files to FILENAME. This is basically the + same as the same option for the standard distutils "install" command, and + is included for compatibility with tools that expect to pass this option + to "setup.py install". + +``--site-dirs=DIRLIST, -S DIRLIST`` (New in 0.6a1) + Specify one or more custom "site" directories (separated by commas). + "Site" directories are directories where ``.pth`` files are processed, such + as the main Python ``site-packages`` directory. As of 0.6a10, EasyInstall + automatically detects whether a given directory processes ``.pth`` files + (or can be made to do so), so you should not normally need to use this + option. It is is now only necessary if you want to override EasyInstall's + judgment and force an installation directory to be treated as if it + supported ``.pth`` files. + +``--no-deps, -N`` (New in 0.6a6) + Don't install any dependencies. This is intended as a convenience for + tools that wrap eggs in a platform-specific packaging system. (We don't + recommend that you use it for anything else.) + +``--allow-hosts=PATTERNS, -H PATTERNS`` (New in 0.6a6) + Restrict downloading and spidering to hosts matching the specified glob + patterns. E.g. ``-H *.python.org`` restricts web access so that only + packages listed and downloadable from machines in the ``python.org`` + domain. The glob patterns must match the *entire* user/host/port section of + the target URL(s). For example, ``*.python.org`` will NOT accept a URL + like ``http://python.org/foo`` or ``http://www.python.org:8080/``. + Multiple patterns can be specified by separating them with commas. The + default pattern is ``*``, which matches anything. + + In general, this option is mainly useful for blocking EasyInstall's web + access altogether (e.g. ``-Hlocalhost``), or to restrict it to an intranet + or other trusted site. EasyInstall will do the best it can to satisfy + dependencies given your host restrictions, but of course can fail if it + can't find suitable packages. EasyInstall displays all blocked URLs, so + that you can adjust your ``--allow-hosts`` setting if it is more strict + than you intended. 
Some sites may wish to define a restrictive default + setting for this option in their `configuration files`_, and then manually + override the setting on the command line as needed. + +``--prefix=DIR`` (New in 0.6a10) + Use the specified directory as a base for computing the default + installation and script directories. On Windows, the resulting default + directories will be ``prefix\\Lib\\site-packages`` and ``prefix\\Scripts``, + while on other platforms the defaults will be + ``prefix/lib/python2.X/site-packages`` (with the appropriate version + substituted) for libraries and ``prefix/bin`` for scripts. + + Note that the ``--prefix`` option only sets the *default* installation and + script directories, and does not override the ones set on the command line + or in a configuration file. + +``--local-snapshots-ok, -l`` (New in 0.6c6) + Normally, EasyInstall prefers to only install *released* versions of + projects, not in-development ones, because such projects may not + have a currently-valid version number. So, it usually only installs them + when their ``setup.py`` directory is explicitly passed on the command line. + + However, if this option is used, then any in-development projects that were + installed using the ``setup.py develop`` command, will be used to build + eggs, effectively upgrading the "in-development" project to a snapshot + release. Normally, this option is used only in conjunction with the + ``--always-copy`` option to create a distributable snapshot of every egg + needed to run an application. + + Note that if you use this option, you must make sure that there is a valid + version number (such as an SVN revision number tag) for any in-development + projects that may be used, as otherwise EasyInstall may not be able to tell + what version of the project is "newer" when future installations or + upgrades are attempted. + + +.. _non-root installation: + +Custom Installation Locations +----------------------------- + +By default, EasyInstall installs python packages into Python's main ``site-packages`` directory, +and manages them using a custom ``.pth`` file in that same directory. + +Very often though, a user or developer wants ``easy_install`` to install and manage python packages +in an alternative location, usually for one of 3 reasons: + +1. They don't have access to write to the main Python site-packages directory. + +2. They want a user-specific stash of packages, that is not visible to other users. + +3. They want to isolate a set of packages to a specific python application, usually to minimize + the possibility of version conflicts. + +Historically, there have been many approaches to achieve custom installation. +The following section lists only the easiest and most relevant approaches [1]_. + +`Use the "--user" option`_ + +`Use the "--user" option and customize "PYTHONUSERBASE"`_ + +`Use "virtualenv"`_ + +.. [1] There are older ways to achieve custom installation using various ``easy_install`` and ``setup.py install`` options, combined with ``PYTHONPATH`` and/or ``PYTHONUSERBASE`` alterations, but all of these are effectively deprecated by the User scheme brought in by `PEP-370`_ in Python 2.6. + +.. _PEP-370: http://www.python.org/dev/peps/pep-0370/ + + +Use the "--user" option +~~~~~~~~~~~~~~~~~~~~~~~ +With Python 2.6 came the User scheme for installation, which means that all +python distributions support an alternative install location that is specific to a user [2]_ [3]_. 
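You can see where that per-user location is on a given machine directly from
the interpreter (a quick check; the exact paths differ per OS and user)::

    import site

    # Per-user base directory and its site-packages directory (PEP 370).
    print(site.USER_BASE)
    print(site.USER_SITE)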
+The Default location for each OS is explained in the python documentation +for the ``site.USER_BASE`` variable. This mode of installation can be turned on by +specifying the ``--user`` option to ``setup.py install`` or ``easy_install``. +This approach serves the need to have a user-specific stash of packages. + +.. [2] Prior to Python2.6, Mac OS X offered a form of the User scheme. That is now subsumed into the User scheme introduced in Python 2.6. +.. [3] Prior to the User scheme, there was the Home scheme, which is still available, but requires more effort than the User scheme to get packages recognized. + +Use the "--user" option and customize "PYTHONUSERBASE" +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +The User scheme install location can be customized by setting the ``PYTHONUSERBASE`` environment +variable, which updates the value of ``site.USER_BASE``. To isolate packages to a specific +application, simply set the OS environment of that application to a specific value of +``PYTHONUSERBASE``, that contains just those packages. + +Use "virtualenv" +~~~~~~~~~~~~~~~~ +"virtualenv" is a 3rd-party python package that effectively "clones" a python installation, thereby +creating an isolated location to install packages. The evolution of "virtualenv" started before the existence +of the User installation scheme. "virtualenv" provides a version of ``easy_install`` that is +scoped to the cloned python install and is used in the normal way. "virtualenv" does offer various features +that the User installation scheme alone does not provide, e.g. the ability to hide the main python site-packages. + +Please refer to the `virtualenv`_ documentation for more details. + +.. _virtualenv: https://pypi.python.org/pypi/virtualenv + + + +Package Index "API" +------------------- + +Custom package indexes (and PyPI) must follow the following rules for +EasyInstall to be able to look up and download packages: + +1. Except where stated otherwise, "pages" are HTML or XHTML, and "links" + refer to ``href`` attributes. + +2. Individual project version pages' URLs must be of the form + ``base/projectname/version``, where ``base`` is the package index's base URL. + +3. Omitting the ``/version`` part of a project page's URL (but keeping the + trailing ``/``) should result in a page that is either: + + a) The single active version of that project, as though the version had been + explicitly included, OR + + b) A page with links to all of the active version pages for that project. + +4. Individual project version pages should contain direct links to downloadable + distributions where possible. It is explicitly permitted for a project's + "long_description" to include URLs, and these should be formatted as HTML + links by the package index, as EasyInstall does no special processing to + identify what parts of a page are index-specific and which are part of the + project's supplied description. + +5. Where available, MD5 information should be added to download URLs by + appending a fragment identifier of the form ``#md5=...``, where ``...`` is + the 32-character hex MD5 digest. EasyInstall will verify that the + downloaded file's MD5 digest matches the given value. + +6. Individual project version pages should identify any "homepage" or + "download" URLs using ``rel="homepage"`` and ``rel="download"`` attributes + on the HTML elements linking to those URLs. 
Use of these attributes will + cause EasyInstall to always follow the provided links, unless it can be + determined by inspection that they are downloadable distributions. If the + links are not to downloadable distributions, they are retrieved, and if they + are HTML, they are scanned for download links. They are *not* scanned for + additional "homepage" or "download" links, as these are only processed for + pages that are part of a package index site. + +7. The root URL of the index, if retrieved with a trailing ``/``, must result + in a page containing links to *all* projects' active version pages. + + (Note: This requirement is a workaround for the absence of case-insensitive + ``safe_name()`` matching of project names in URL paths. If project names are + matched in this fashion (e.g. via the PyPI server, mod_rewrite, or a similar + mechanism), then it is not necessary to include this all-packages listing + page.) + +8. If a package index is accessed via a ``file://`` URL, then EasyInstall will + automatically use ``index.html`` files, if present, when trying to read a + directory with a trailing ``/`` on the URL. + + +Backward Compatibility +~~~~~~~~~~~~~~~~~~~~~~ + +Package indexes that wish to support setuptools versions prior to 0.6b4 should +also follow these rules: + +* Homepage and download links must be preceded with ``"Home Page"`` or + ``"Download URL"``, in addition to (or instead of) the ``rel=""`` + attributes on the actual links. These marker strings do not need to be + visible, or uncommented, however! For example, the following is a valid + homepage link that will work with any version of setuptools:: + +
+    <li>
+     <strong>Home Page:</strong>
+     <!-- <th>Home Page -->
+     <a href="http://sqlobject.org">http://sqlobject.org</a>
+    </li>
+
+  Even though the marker string is in an HTML comment, older versions of
+  EasyInstall will still "see" it and know that the link that follows is the
+  project's home page URL.
+
+* The pages described by paragraph 3(b) of the preceding section *must*
+  contain the string ``"Index of Packages"`` somewhere in their text.
+  This can be inside of an HTML comment, if desired, and it can be anywhere
+  in the page. (Note: this string MUST NOT appear on normal project pages, as
+  described in paragraphs 2 and 3(a)!)
+
+In addition, for compatibility with PyPI versions that do not use ``#md5=``
+fragment IDs, EasyInstall uses the following regular expression to match PyPI's
+displayed MD5 info (broken onto two lines for readability)::
+
+    <a href="([^"#]+)">([^<]+)</a>\n\s+\(<a href="[^?]+\?:action=show_md5
+    &amp;digest=([0-9a-f]{32})">md5</a>\)
+
+History
+=======
+
+0.6c9
+ * Fixed ``win32.exe`` support for .pth files, so unnecessary directory nesting
+   is flattened out in the resulting egg. (There was a case-sensitivity
+   problem that affected some distributions, notably ``pywin32``.)
+
+ * Prevent ``--help-commands`` and other junk from showing under Python 2.5
+   when running ``easy_install --help``.
+
+ * Fixed GUI scripts sometimes not executing on Windows
+
+ * Fixed not picking up dependency links from recursive dependencies.
+
+ * Only make ``.py``, ``.dll`` and ``.so`` files executable when unpacking eggs
+
+ * Changes for Jython compatibility
+
+ * Improved error message when a requirement is also a directory name, but the
+   specified directory is not a source package.
+
+ * Fixed ``--allow-hosts`` option blocking ``file:`` URLs
+
+ * Fixed HTTP SVN detection failing when the page title included a project
+   name (e.g. on SourceForge-hosted SVN)
+
+ * Fix Jython script installation to handle ``#!`` lines better when
+   ``sys.executable`` is a script.
+
+ * Removed use of deprecated ``md5`` module if ``hashlib`` is available
+
+ * Keep site directories (e.g. ``site-packages``) from being included in
+   ``.pth`` files.
+
+0.6c7
+ * ``ftp:`` download URLs now work correctly.
+
+ * The default ``--index-url`` is now ``https://pypi.python.org/simple``, to use
+   the Python Package Index's new simpler (and faster!) REST API.
+
+0.6c6
+ * EasyInstall no longer aborts the installation process if a URL it wants to
+   retrieve can't be downloaded, unless the URL is an actual package download.
+   Instead, it issues a warning and tries to keep going.
+
+ * Fixed distutils-style scripts originally built on Windows having their line
+   endings doubled when installed on any platform.
+
+ * Added ``--local-snapshots-ok`` flag, to allow building eggs from projects
+   installed using ``setup.py develop``.
+
+ * Fixed not HTML-decoding URLs scraped from web pages
+
+0.6c5
+ * Fixed ``.dll`` files on Cygwin not having executable permissions when an egg
+   is installed unzipped.
+
+0.6c4
+ * Added support for HTTP "Basic" authentication using ``http://user:pass@host``
+   URLs. If a password-protected page contains links to the same host (and
+   protocol), those links will inherit the credentials used to access the
+   original page.
+
+ * Removed all special support for Sourceforge mirrors, as Sourceforge's
+   mirror system now works well for non-browser downloads.
+
+ * Fixed not recognizing ``win32.exe`` installers that included a custom
+   bitmap.
+
+ * Fixed not allowing ``os.open()`` of paths outside the sandbox, even if they
+   are opened read-only (e.g. reading ``/dev/urandom`` for random numbers, as
+   is done by ``os.urandom()`` on some platforms).
+ + * Fixed a problem with ``.pth`` testing on Windows when ``sys.executable`` + has a space in it (e.g., the user installed Python to a ``Program Files`` + directory). + +0.6c3 + * You can once again use "python -m easy_install" with Python 2.4 and above. + + * Python 2.5 compatibility fixes added. + +0.6c2 + * Windows script wrappers now support quoted arguments and arguments + containing spaces. (Patch contributed by Jim Fulton.) + + * The ``ez_setup.py`` script now actually works when you put a setuptools + ``.egg`` alongside it for bootstrapping an offline machine. + + * A writable installation directory on ``sys.path`` is no longer required to + download and extract a source distribution using ``--editable``. + + * Generated scripts now use ``-x`` on the ``#!`` line when ``sys.executable`` + contains non-ASCII characters, to prevent deprecation warnings about an + unspecified encoding when the script is run. + +0.6c1 + * EasyInstall now includes setuptools version information in the + ``User-Agent`` string sent to websites it visits. + +0.6b4 + * Fix creating Python wrappers for non-Python scripts + + * Fix ``ftp://`` directory listing URLs from causing a crash when used in the + "Home page" or "Download URL" slots on PyPI. + + * Fix ``sys.path_importer_cache`` not being updated when an existing zipfile + or directory is deleted/overwritten. + + * Fix not recognizing HTML 404 pages from package indexes. + + * Allow ``file://`` URLs to be used as a package index. URLs that refer to + directories will use an internally-generated directory listing if there is + no ``index.html`` file in the directory. + + * Allow external links in a package index to be specified using + ``rel="homepage"`` or ``rel="download"``, without needing the old + PyPI-specific visible markup. + + * Suppressed warning message about possibly-misspelled project name, if an egg + or link for that project name has already been seen. + +0.6b3 + * Fix local ``--find-links`` eggs not being copied except with + ``--always-copy``. + + * Fix sometimes not detecting local packages installed outside of "site" + directories. + + * Fix mysterious errors during initial ``setuptools`` install, caused by + ``ez_setup`` trying to run ``easy_install`` twice, due to a code fallthru + after deleting the egg from which it's running. + +0.6b2 + * Don't install or update a ``site.py`` patch when installing to a + ``PYTHONPATH`` directory with ``--multi-version``, unless an + ``easy-install.pth`` file is already in use there. + + * Construct ``.pth`` file paths in such a way that installing an egg whose + name begins with ``import`` doesn't cause a syntax error. + + * Fixed a bogus warning message that wasn't updated since the 0.5 versions. + +0.6b1 + * Better ambiguity management: accept ``#egg`` name/version even if processing + what appears to be a correctly-named distutils file, and ignore ``.egg`` + files with no ``-``, since valid Python ``.egg`` files always have a version + number (but Scheme eggs often don't). + + * Support ``file://`` links to directories in ``--find-links``, so that + easy_install can build packages from local source checkouts. + + * Added automatic retry for Sourceforge mirrors. The new download process is + to first just try dl.sourceforge.net, then randomly select mirror IPs and + remove ones that fail, until something works. The removed IPs stay removed + for the remainder of the run. + + * Ignore bdist_dumb distributions when looking at download URLs. 
+ +0.6a11 + * Process ``dependency_links.txt`` if found in a distribution, by adding the + URLs to the list for scanning. + + * Use relative paths in ``.pth`` files when eggs are being installed to the + same directory as the ``.pth`` file. This maximizes portability of the + target directory when building applications that contain eggs. + + * Added ``easy_install-N.N`` script(s) for convenience when using multiple + Python versions. + + * Added automatic handling of installation conflicts. Eggs are now shifted to + the front of sys.path, in an order consistent with where they came from, + making EasyInstall seamlessly co-operate with system package managers. + + The ``--delete-conflicting`` and ``--ignore-conflicts-at-my-risk`` options + are now no longer necessary, and will generate warnings at the end of a + run if you use them. + + * Don't recursively traverse subdirectories given to ``--find-links``. + +0.6a10 + * Added exhaustive testing of the install directory, including a spawn test + for ``.pth`` file support, and directory writability/existence checks. This + should virtually eliminate the need to set or configure ``--site-dirs``. + + * Added ``--prefix`` option for more do-what-I-mean-ishness in the absence of + RTFM-ing. :) + + * Enhanced ``PYTHONPATH`` support so that you don't have to put any eggs on it + manually to make it work. ``--multi-version`` is no longer a silent + default; you must explicitly use it if installing to a non-PYTHONPATH, + non-"site" directory. + + * Expand ``$variables`` used in the ``--site-dirs``, ``--build-directory``, + ``--install-dir``, and ``--script-dir`` options, whether on the command line + or in configuration files. + + * Improved SourceForge mirror processing to work faster and be less affected + by transient HTML changes made by SourceForge. + + * PyPI searches now use the exact spelling of requirements specified on the + command line or in a project's ``install_requires``. Previously, a + normalized form of the name was used, which could lead to unnecessary + full-index searches when a project's name had an underscore (``_``) in it. + + * EasyInstall can now download bare ``.py`` files and wrap them in an egg, + as long as you include an ``#egg=name-version`` suffix on the URL, or if + the ``.py`` file is listed as the "Download URL" on the project's PyPI page. + This allows third parties to "package" trivial Python modules just by + linking to them (e.g. from within their own PyPI page or download links + page). + + * The ``--always-copy`` option now skips "system" and "development" eggs since + they can't be reliably copied. Note that this may cause EasyInstall to + choose an older version of a package than what you expected, or it may cause + downloading and installation of a fresh version of what's already installed. + + * The ``--find-links`` option previously scanned all supplied URLs and + directories as early as possible, but now only directories and direct + archive links are scanned immediately. URLs are not retrieved unless a + package search was already going to go online due to a package not being + available locally, or due to the use of the ``--update`` or ``-U`` option. + + * Fixed the annoying ``--help-commands`` wart. + +0.6a9 + * Fixed ``.pth`` file processing picking up nested eggs (i.e. ones inside + "baskets") when they weren't explicitly listed in the ``.pth`` file. + + * If more than one URL appears to describe the exact same distribution, prefer + the shortest one. 
This helps to avoid "table of contents" CGI URLs like the + ones on effbot.org. + + * Quote arguments to python.exe (including python's path) to avoid problems + when Python (or a script) is installed in a directory whose name contains + spaces on Windows. + + * Support full roundtrip translation of eggs to and from ``bdist_wininst`` + format. Running ``bdist_wininst`` on a setuptools-based package wraps the + egg in an .exe that will safely install it as an egg (i.e., with metadata + and entry-point wrapper scripts), and ``easy_install`` can turn the .exe + back into an ``.egg`` file or directory and install it as such. + +0.6a8 + * Update for changed SourceForge mirror format + + * Fixed not installing dependencies for some packages fetched via Subversion + + * Fixed dependency installation with ``--always-copy`` not using the same + dependency resolution procedure as other operations. + + * Fixed not fully removing temporary directories on Windows, if a Subversion + checkout left read-only files behind + + * Fixed some problems building extensions when Pyrex was installed, especially + with Python 2.4 and/or packages using SWIG. + +0.6a7 + * Fixed not being able to install Windows script wrappers using Python 2.3 + +0.6a6 + * Added support for "traditional" PYTHONPATH-based non-root installation, and + also the convenient ``virtual-python.py`` script, based on a contribution + by Ian Bicking. The setuptools egg now contains a hacked ``site`` module + that makes the PYTHONPATH-based approach work with .pth files, so that you + can get the full EasyInstall feature set on such installations. + + * Added ``--no-deps`` and ``--allow-hosts`` options. + + * Improved Windows ``.exe`` script wrappers so that the script can have the + same name as a module without confusing Python. + + * Changed dependency processing so that it's breadth-first, allowing a + depender's preferences to override those of a dependee, to prevent conflicts + when a lower version is acceptable to the dependee, but not the depender. + Also, ensure that currently installed/selected packages aren't given + precedence over ones desired by a package being installed, which could + cause conflict errors. + +0.6a3 + * Improved error message when trying to use old ways of running + ``easy_install``. Removed the ability to run via ``python -m`` or by + running ``easy_install.py``; ``easy_install`` is the command to run on all + supported platforms. + + * Improved wrapper script generation and runtime initialization so that a + VersionConflict doesn't occur if you later install a competing version of a + needed package as the default version of that package. + + * Fixed a problem parsing version numbers in ``#egg=`` links. + +0.6a2 + * EasyInstall can now install "console_scripts" defined by packages that use + ``setuptools`` and define appropriate entry points. On Windows, console + scripts get an ``.exe`` wrapper so you can just type their name. On other + platforms, the scripts are installed without a file extension. + + * Using ``python -m easy_install`` or running ``easy_install.py`` is now + DEPRECATED, since an ``easy_install`` wrapper is now available on all + platforms. + +0.6a1 + * EasyInstall now does MD5 validation of downloads from PyPI, or from any link + that has an "#md5=..." trailer with a 32-digit lowercase hex md5 digest. + + * EasyInstall now handles symlinks in target directories by removing the link, + rather than attempting to overwrite the link's destination. 
This makes it + easier to set up an alternate Python "home" directory (as described above in + the `Non-Root Installation`_ section). + + * Added support for handling MacOS platform information in ``.egg`` filenames, + based on a contribution by Kevin Dangoor. You may wish to delete and + reinstall any eggs whose filename includes "darwin" and "Power_Macintosh", + because the format for this platform information has changed so that minor + OS X upgrades (such as 10.4.1 to 10.4.2) do not cause eggs built with a + previous OS version to become obsolete. + + * easy_install's dependency processing algorithms have changed. When using + ``--always-copy``, it now ensures that dependencies are copied too. When + not using ``--always-copy``, it tries to use a single resolution loop, + rather than recursing. + + * Fixed installing extra ``.pyc`` or ``.pyo`` files for scripts with ``.py`` + extensions. + + * Added ``--site-dirs`` option to allow adding custom "site" directories. + Made ``easy-install.pth`` work in platform-specific alternate site + directories (e.g. ``~/Library/Python/2.x/site-packages`` on Mac OS X). + + * If you manually delete the current version of a package, the next run of + EasyInstall against the target directory will now remove the stray entry + from the ``easy-install.pth`` file. + + * EasyInstall now recognizes URLs with a ``#egg=project_name`` fragment ID + as pointing to the named project's source checkout. Such URLs have a lower + match precedence than any other kind of distribution, so they'll only be + used if they have a higher version number than any other available + distribution, or if you use the ``--editable`` option. The ``#egg`` + fragment can contain a version if it's formatted as ``#egg=proj-ver``, + where ``proj`` is the project name, and ``ver`` is the version number. You + *must* use the format for these values that the ``bdist_egg`` command uses; + i.e., all non-alphanumeric runs must be condensed to single underscore + characters. + + * Added the ``--editable`` option; see `Editing and Viewing Source Packages`_ + above for more info. Also, slightly changed the behavior of the + ``--build-directory`` option. + + * Fixed the setup script sandbox facility not recognizing certain paths as + valid on case-insensitive platforms. + +0.5a12 + * Fix ``python -m easy_install`` not working due to setuptools being installed + as a zipfile. Update safety scanner to check for modules that might be used + as ``python -m`` scripts. + + * Misc. fixes for win32.exe support, including changes to support Python 2.4's + changed ``bdist_wininst`` format. + +0.5a10 + * Put the ``easy_install`` module back in as a module, as it's needed for + ``python -m`` to run it! + + * Allow ``--find-links/-f`` to accept local directories or filenames as well + as URLs. + +0.5a9 + * EasyInstall now automatically detects when an "unmanaged" package or + module is going to be on ``sys.path`` ahead of a package you're installing, + thereby preventing the newer version from being imported. By default, it + will abort installation to alert you of the problem, but there are also + new options (``--delete-conflicting`` and ``--ignore-conflicts-at-my-risk``) + available to change the default behavior. (Note: this new feature doesn't + take effect for egg files that were built with older ``setuptools`` + versions, because they lack the new metadata file required to implement it.) 
+ + * The ``easy_install`` distutils command now uses ``DistutilsError`` as its + base error type for errors that should just issue a message to stderr and + exit the program without a traceback. + + * EasyInstall can now be given a path to a directory containing a setup + script, and it will attempt to build and install the package there. + + * EasyInstall now performs a safety analysis on module contents to determine + whether a package is likely to run in zipped form, and displays + information about what modules may be doing introspection that would break + when running as a zipfile. + + * Added the ``--always-unzip/-Z`` option, to force unzipping of packages that + would ordinarily be considered safe to unzip, and changed the meaning of + ``--zip-ok/-z`` to "always leave everything zipped". + +0.5a8 + * There is now a separate documentation page for `setuptools`_; revision + history that's not specific to EasyInstall has been moved to that page. + + .. _setuptools: http://peak.telecommunity.com/DevCenter/setuptools + +0.5a5 + * Made ``easy_install`` a standard ``setuptools`` command, moving it from + the ``easy_install`` module to ``setuptools.command.easy_install``. Note + that if you were importing or extending it, you must now change your imports + accordingly. ``easy_install.py`` is still installed as a script, but not as + a module. + +0.5a4 + * Added ``--always-copy/-a`` option to always copy needed packages to the + installation directory, even if they're already present elsewhere on + sys.path. (In previous versions, this was the default behavior, but now + you must request it.) + + * Added ``--upgrade/-U`` option to force checking PyPI for latest available + version(s) of all packages requested by name and version, even if a matching + version is available locally. + + * Added automatic installation of dependencies declared by a distribution + being installed. These dependencies must be listed in the distribution's + ``EGG-INFO`` directory, so the distribution has to have declared its + dependencies by using setuptools. If a package has requirements it didn't + declare, you'll still have to deal with them yourself. (E.g., by asking + EasyInstall to find and install them.) + + * Added the ``--record`` option to ``easy_install`` for the benefit of tools + that run ``setup.py install --record=filename`` on behalf of another + packaging system.) + +0.5a3 + * Fixed not setting script permissions to allow execution. + + * Improved sandboxing so that setup scripts that want a temporary directory + (e.g. pychecker) can still run in the sandbox. + +0.5a2 + * Fix stupid stupid refactoring-at-the-last-minute typos. :( + +0.5a1 + * Added support for converting ``.win32.exe`` installers to eggs on the fly. + EasyInstall will now recognize such files by name and install them. + + * Fixed a problem with picking the "best" version to install (versions were + being sorted as strings, rather than as parsed values) + +0.4a4 + * Added support for the distutils "verbose/quiet" and "dry-run" options, as + well as the "optimize" flag. + + * Support downloading packages that were uploaded to PyPI (by scanning all + links on package pages, not just the homepage/download links). + +0.4a3 + * Add progress messages to the search/download process so that you can tell + what URLs it's reading to find download links. (Hopefully, this will help + people report out-of-date and broken links to package authors, and to tell + when they've asked for a package that doesn't exist.) 
+ +0.4a2 + * Added support for installing scripts + + * Added support for setting options via distutils configuration files, and + using distutils' default options as a basis for EasyInstall's defaults. + + * Renamed ``--scan-url/-s`` to ``--find-links/-f`` to free up ``-s`` for the + script installation directory option. + + * Use ``urllib2`` instead of ``urllib``, to allow use of ``https:`` URLs if + Python includes SSL support. + +0.4a1 + * Added ``--scan-url`` and ``--index-url`` options, to scan download pages + and search PyPI for needed packages. + +0.3a4 + * Restrict ``--build-directory=DIR/-b DIR`` option to only be used with single + URL installs, to avoid running the wrong setup.py. + +0.3a3 + * Added ``--build-directory=DIR/-b DIR`` option. + + * Added "installation report" that explains how to use 'require()' when doing + a multiversion install or alternate installation directory. + + * Added SourceForge mirror auto-select (Contributed by Ian Bicking) + + * Added "sandboxing" that stops a setup script from running if it attempts to + write to the filesystem outside of the build area + + * Added more workarounds for packages with quirky ``install_data`` hacks + +0.3a2 + * Added subversion download support for ``svn:`` and ``svn+`` URLs, as well as + automatic recognition of HTTP subversion URLs (Contributed by Ian Bicking) + + * Misc. bug fixes + +0.3a1 + * Initial release. + + +Future Plans +============ + +* Additional utilities to list/remove/verify packages +* Signature checking? SSL? Ability to suppress PyPI search? +* Display byte progress meter when downloading distributions and long pages? +* Redirect stdout/stderr to log during run_setup? + diff --git a/docs/formats.txt b/docs/formats.txt new file mode 100644 index 0000000..9e6fe72 --- /dev/null +++ b/docs/formats.txt @@ -0,0 +1,682 @@ +===================================== +The Internal Structure of Python Eggs +===================================== + +STOP! This is not the first document you should read! + + + +.. contents:: **Table of Contents** + + +---------------------- +Eggs and their Formats +---------------------- + +A "Python egg" is a logical structure embodying the release of a +specific version of a Python project, comprising its code, resources, +and metadata. There are multiple formats that can be used to physically +encode a Python egg, and others can be developed. However, a key +principle of Python eggs is that they should be discoverable and +importable. That is, it should be possible for a Python application to +easily and efficiently find out what eggs are present on a system, and +to ensure that the desired eggs' contents are importable. + +There are two basic formats currently implemented for Python eggs: + +1. ``.egg`` format: a directory or zipfile *containing* the project's + code and resources, along with an ``EGG-INFO`` subdirectory that + contains the project's metadata + +2. ``.egg-info`` format: a file or directory placed *adjacent* to the + project's code and resources, that directly contains the project's + metadata. + +Both formats can include arbitrary Python code and resources, including +static data files, package and non-package directories, Python +modules, C extension modules, and so on. But each format is optimized +for different purposes. + +The ``.egg`` format is well-suited to distribution and the easy +uninstallation or upgrades of code, since the project is essentially +self-contained within a single directory or file, unmingled with any +other projects' code or resources. 
It also makes it possible to have +multiple versions of a project simultaneously installed, such that +individual programs can select the versions they wish to use. + +The ``.egg-info`` format, on the other hand, was created to support +backward-compatibility, performance, and ease of installation for system +packaging tools that expect to install all projects' code and resources +to a single directory (e.g. ``site-packages``). Placing the metadata +in that same directory simplifies the installation process, since it +isn't necessary to create ``.pth`` files or otherwise modify +``sys.path`` to include each installed egg. + +Its disadvantage, however, is that it provides no support for clean +uninstallation or upgrades, and of course only a single version of a +project can be installed to a given directory. Thus, support from a +package management tool is required. (This is why setuptools' "install" +command refers to this type of egg installation as "single-version, +externally managed".) Also, they lack sufficient data to allow them to +be copied from their installation source. easy_install can "ship" an +application by copying ``.egg`` files or directories to a target +location, but it cannot do this for ``.egg-info`` installs, because +there is no way to tell what code and resources belong to a particular +egg -- there may be several eggs "scrambled" together in a single +installation location, and the ``.egg-info`` format does not currently +include a way to list the files that were installed. (This may change +in a future version.) + + +Code and Resources +================== + +The layout of the code and resources is dictated by Python's normal +import layout, relative to the egg's "base location". + +For the ``.egg`` format, the base location is the ``.egg`` itself. That +is, adding the ``.egg`` filename or directory name to ``sys.path`` +makes its contents importable. + +For the ``.egg-info`` format, however, the base location is the +directory that *contains* the ``.egg-info``, and thus it is the +directory that must be added to ``sys.path`` to make the egg importable. +(Note that this means that the "normal" installation of a package to a +``sys.path`` directory is sufficient to make it an "egg" if it has an +``.egg-info`` file or directory installed alongside of it.) + + +Project Metadata +================= + +If eggs contained only code and resources, there would of course be +no difference between them and any other directory or zip file on +``sys.path``. Thus, metadata must also be included, using a metadata +file or directory. + +For the ``.egg`` format, the metadata is placed in an ``EGG-INFO`` +subdirectory, directly within the ``.egg`` file or directory. For the +``.egg-info`` format, metadata is stored directly within the +``.egg-info`` directory itself. + +The minimum project metadata that all eggs must have is a standard +Python ``PKG-INFO`` file, named ``PKG-INFO`` and placed within the +metadata directory appropriate to the format. Because it's possible for +this to be the only metadata file included, ``.egg-info`` format eggs +are not required to be a directory; they can just be a ``.egg-info`` +file that directly contains the ``PKG-INFO`` metadata. This eliminates +the need to create a directory just to store one file. This option is +*not* available for ``.egg`` formats, since setuptools always includes +other metadata. 
(In fact, setuptools itself never generates +``.egg-info`` files, either; the support for using files was added so +that the requirement could easily be satisfied by other tools, such +as the distutils in Python 2.5). + +In addition to the ``PKG-INFO`` file, an egg's metadata directory may +also include files and directories representing various forms of +optional standard metadata (see the section on `Standard Metadata`_, +below) or user-defined metadata required by the project. For example, +some projects may define a metadata format to describe their application +plugins, and metadata in this format would then be included by plugin +creators in their projects' metadata directories. + + +Filename-Embedded Metadata +========================== + +To allow introspection of installed projects and runtime resolution of +inter-project dependencies, a certain amount of information is embedded +in egg filenames. At a minimum, this includes the project name, and +ideally will also include the project version number. Optionally, it +can also include the target Python version and required runtime +platform if platform-specific C code is included. The syntax of an +egg filename is as follows:: + + name ["-" version ["-py" pyver ["-" required_platform]]] "." ext + +The "name" and "version" should be escaped using the ``to_filename()`` +function provided by ``pkg_resources``, after first processing them with +``safe_name()`` and ``safe_version()`` respectively. These latter two +functions can also be used to later "unescape" these parts of the +filename. (For a detailed description of these transformations, please +see the "Parsing Utilities" section of the ``pkg_resources`` manual.) + +The "pyver" string is the Python major version, as found in the first +3 characters of ``sys.version``. "required_platform" is essentially +a distutils ``get_platform()`` string, but with enhancements to properly +distinguish Mac OS versions. (See the ``get_build_platform()`` +documentation in the "Platform Utilities" section of the +``pkg_resources`` manual for more details.) + +Finally, the "ext" is either ``.egg`` or ``.egg-info``, as appropriate +for the egg's format. + +Normally, an egg's filename should include at least the project name and +version, as this allows the runtime system to find desired project +versions without having to read the egg's PKG-INFO to determine its +version number. + +Setuptools, however, only includes the version number in the filename +when an ``.egg`` file is built using the ``bdist_egg`` command, or when +an ``.egg-info`` directory is being installed by the +``install_egg_info`` command. When generating metadata for use with the +original source tree, it only includes the project name, so that the +directory will not have to be renamed each time the project's version +changes. + +This is especially important when version numbers change frequently, and +the source metadata directory is kept under version control with the +rest of the project. (As would be the case when the project's source +includes project-defined metadata that is not generated from by +setuptools from data in the setup script.) + + +Egg Links +========= + +In addition to the ``.egg`` and ``.egg-info`` formats, there is a third +egg-related extension that you may encounter on occasion: ``.egg-link`` +files. + +These files are not eggs, strictly speaking. They simply provide a way +to reference an egg that is not physically installed in the desired +location. 
They exist primarily as a cross-platform alternative to +symbolic links, to support "installing" code that is being developed in +a different location than the desired installation location. For +example, if a user is developing an application plugin in their home +directory, but the plugin needs to be "installed" in an application +plugin directory, running "setup.py develop -md /path/to/app/plugins" +will install an ``.egg-link`` file in ``/path/to/app/plugins``, that +tells the egg runtime system where to find the actual egg (the user's +project source directory and its ``.egg-info`` subdirectory). + +``.egg-link`` files are named following the format for ``.egg`` and +``.egg-info`` names, but only the project name is included; no version, +Python version, or platform information is included. When the runtime +searches for available eggs, ``.egg-link`` files are opened and the +actual egg file/directory name is read from them. + +Each ``.egg-link`` file should contain a single file or directory name, +with no newlines. This filename should be the base location of one or +more eggs. That is, the name must either end in ``.egg``, or else it +should be the parent directory of one or more ``.egg-info`` format eggs. + +As of setuptools 0.6c6, the path may be specified as a platform-independent +(i.e. ``/``-separated) relative path from the directory containing the +``.egg-link`` file, and a second line may appear in the file, specifying a +platform-independent relative path from the egg's base directory to its +setup script directory. This allows installation tools such as EasyInstall +to find the project's setup directory and build eggs or perform other setup +commands on it. + + +----------------- +Standard Metadata +----------------- + +In addition to the minimum required ``PKG-INFO`` metadata, projects can +include a variety of standard metadata files or directories, as +described below. Except as otherwise noted, these files and directories +are automatically generated by setuptools, based on information supplied +in the setup script or through analysis of the project's code and +resources. + +Most of these files and directories are generated via "egg-info +writers" during execution of the setuptools ``egg_info`` command, and +are listed in the ``egg_info.writers`` entry point group defined by +setuptools' own ``setup.py`` file. + +Project authors can register their own metadata writers as entry points +in this group (as described in the setuptools manual under "Adding new +EGG-INFO Files") to cause setuptools to generate project-specific +metadata files or directories during execution of the ``egg_info`` +command. It is up to project authors to document these new metadata +formats, if they create any. + + +``.txt`` File Formats +===================== + +Files described in this section that have ``.txt`` extensions have a +simple lexical format consisting of a sequence of text lines, each line +terminated by a linefeed character (regardless of platform). Leading +and trailing whitespace on each line is ignored, as are blank lines and +lines whose first nonblank character is a ``#`` (comment symbol). (This +is the parsing format defined by the ``yield_lines()`` function of +the ``pkg_resources`` module.) + +All ``.txt`` files defined by this section follow this format, but some +are also "sectioned" files, meaning that their contents are divided into +sections, using square-bracketed section headers akin to Windows +``.ini`` format. 
Note that this does *not* imply that the lines within +the sections follow an ``.ini`` format, however. Please see an +individual metadata file's documentation for a description of what the +lines and section names mean in that particular file. + +Sectioned files can be parsed using the ``split_sections()`` function; +see the "Parsing Utilities" section of the ``pkg_resources`` manual for +for details. + + +Dependency Metadata +=================== + + +``requires.txt`` +---------------- + +This is a "sectioned" text file. Each section is a sequence of +"requirements", as parsed by the ``parse_requirements()`` function; +please see the ``pkg_resources`` manual for the complete requirement +parsing syntax. + +The first, unnamed section (i.e., before the first section header) in +this file is the project's core requirements, which must be installed +for the project to function. (Specified using the ``install_requires`` +keyword to ``setup()``). + +The remaining (named) sections describe the project's "extra" +requirements, as specified using the ``extras_require`` keyword to +``setup()``. The section name is the name of the optional feature, and +the section body lists that feature's dependencies. + +Note that it is not normally necessary to inspect this file directly; +``pkg_resources.Distribution`` objects have a ``requires()`` method +that can be used to obtain ``Requirement`` objects describing the +project's core and optional dependencies. + + +``setup_requires.txt`` +---------------------- + +Much like ``requires.txt`` except represents the requirements +specified by the ``setup_requires`` parameter to the Distribution. + + +``dependency_links.txt`` +------------------------ + +A list of dependency URLs, one per line, as specified using the +``dependency_links`` keyword to ``setup()``. These may be direct +download URLs, or the URLs of web pages containing direct download +links, and will be used by EasyInstall to find dependencies, as though +the user had manually provided them via the ``--find-links`` command +line option. Please see the setuptools manual and EasyInstall manual +for more information on specifying this option, and for information on +how EasyInstall processes ``--find-links`` URLs. + + +``depends.txt`` -- Obsolete, do not create! +------------------------------------------- + +This file follows an identical format to ``requires.txt``, but is +obsolete and should not be used. The earliest versions of setuptools +required users to manually create and maintain this file, so the runtime +still supports reading it, if it exists. The new filename was created +so that it could be automatically generated from ``setup()`` information +without overwriting an existing hand-created ``depends.txt``, if one +was already present in the project's source ``.egg-info`` directory. + + +``namespace_packages.txt`` -- Namespace Package Metadata +======================================================== + +A list of namespace package names, one per line, as supplied to the +``namespace_packages`` keyword to ``setup()``. Please see the manuals +for setuptools and ``pkg_resources`` for more information about +namespace packages. + + +``entry_points.txt`` -- "Entry Point"/Plugin Metadata +===================================================== + +This is a "sectioned" text file, whose contents encode the +``entry_points`` keyword supplied to ``setup()``. All sections are +named, as the section names specify the entry point groups in which the +corresponding section's entry points are registered. 
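+
+For illustration only (the project, module, and script names here are
+hypothetical), a project that registers a single console script would carry
+an ``entry_points.txt`` along these lines::
+
+    [console_scripts]
+    mytool = mypackage.cli:main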
+ +Each section is a sequence of "entry point" lines, each parseable using +the ``EntryPoint.parse`` classmethod; please see the ``pkg_resources`` +manual for the complete entry point parsing syntax. + +Note that it is not necessary to parse this file directly; the +``pkg_resources`` module provides a variety of APIs to locate and load +entry points automatically. Please see the setuptools and +``pkg_resources`` manuals for details on the nature and uses of entry +points. + + +The ``scripts`` Subdirectory +============================ + +This directory is currently only created for ``.egg`` files built by +the setuptools ``bdist_egg`` command. It will contain copies of all +of the project's "traditional" scripts (i.e., those specified using the +``scripts`` keyword to ``setup()``). This is so that they can be +reconstituted when an ``.egg`` file is installed. + +The scripts are placed here using the distutils' standard +``install_scripts`` command, so any ``#!`` lines reflect the Python +installation where the egg was built. But instead of copying the +scripts to the local script installation directory, EasyInstall writes +short wrapper scripts that invoke the original scripts from inside the +egg, after ensuring that sys.path includes the egg and any eggs it +depends on. For more about `script wrappers`_, see the section below on +`Installation and Path Management Issues`_. + + +Zip Support Metadata +==================== + + +``native_libs.txt`` +------------------- + +A list of C extensions and other dynamic link libraries contained in +the egg, one per line. Paths are ``/``-separated and relative to the +egg's base location. + +This file is generated as part of ``bdist_egg`` processing, and as such +only appears in ``.egg`` files (and ``.egg`` directories created by +unpacking them). It is used to ensure that all libraries are extracted +from a zipped egg at the same time, in case there is any direct linkage +between them. Please see the `Zip File Issues`_ section below for more +information on library and resource extraction from ``.egg`` files. + + +``eager_resources.txt`` +----------------------- + +A list of resource files and/or directories, one per line, as specified +via the ``eager_resources`` keyword to ``setup()``. Paths are +``/``-separated and relative to the egg's base location. + +Resource files or directories listed here will be extracted +simultaneously, if any of the named resources are extracted, or if any +native libraries listed in ``native_libs.txt`` are extracted. Please +see the setuptools manual for details on what this feature is used for +and how it works, as well as the `Zip File Issues`_ section below. + + +``zip-safe`` and ``not-zip-safe`` +--------------------------------- + +These are zero-length files, and either one or the other should exist. +If ``zip-safe`` exists, it means that the project will work properly +when installed as an ``.egg`` zipfile, and conversely the existence of +``not-zip-safe`` means the project should not be installed as an +``.egg`` file. The ``zip_safe`` option to setuptools' ``setup()`` +determines which file will be written. If the option isn't provided, +setuptools attempts to make its own assessment of whether the package +can work, based on code and content analysis. + +If neither file is present at installation time, EasyInstall defaults +to assuming that the project should be unzipped. (Command-line options +to EasyInstall, however, take precedence even over an existing +``zip-safe`` or ``not-zip-safe`` file.) 
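+
+Project authors who prefer not to rely on the automatic analysis can state the
+intent explicitly in the setup script; a minimal sketch, with a hypothetical
+project name and package::
+
+    from setuptools import setup
+
+    setup(
+        name="example-project",
+        version="1.0",
+        packages=["example"],
+        # Ask bdist_egg to write the not-zip-safe flag file rather than
+        # guessing; zip_safe=True would write the zip-safe file instead.
+        zip_safe=False,
+    )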
+ +Note that these flag files appear only in ``.egg`` files generated by +``bdist_egg``, and in ``.egg`` directories created by unpacking such an +``.egg`` file. + + + +``top_level.txt`` -- Conflict Management Metadata +================================================= + +This file is a list of the top-level module or package names provided +by the project, one Python identifier per line. + +Subpackages are not included; a project containing both a ``foo.bar`` +and a ``foo.baz`` would include only one line, ``foo``, in its +``top_level.txt``. + +This data is used by ``pkg_resources`` at runtime to issue a warning if +an egg is added to ``sys.path`` when its contained packages may have +already been imported. + +(It was also once used to detect conflicts with non-egg packages at +installation time, but in more recent versions, setuptools installs eggs +in such a way that they always override non-egg packages, thus +preventing a problem from arising.) + + +``SOURCES.txt`` -- Source Files Manifest +======================================== + +This file is roughly equivalent to the distutils' ``MANIFEST`` file. +The differences are as follows: + +* The filenames always use ``/`` as a path separator, which must be + converted back to a platform-specific path whenever they are read. + +* The file is automatically generated by setuptools whenever the + ``egg_info`` or ``sdist`` commands are run, and it is *not* + user-editable. + +Although this metadata is included with distributed eggs, it is not +actually used at runtime for any purpose. Its function is to ensure +that setuptools-built *source* distributions can correctly discover +what files are part of the project's source, even if the list had been +generated using revision control metadata on the original author's +system. + +In other words, ``SOURCES.txt`` has little or no runtime value for being +included in distributed eggs, and it is possible that future versions of +the ``bdist_egg`` and ``install_egg_info`` commands will strip it before +installation or distribution. Therefore, do not rely on its being +available outside of an original source directory or source +distribution. + + +------------------------------ +Other Technical Considerations +------------------------------ + + +Zip File Issues +=============== + +Although zip files resemble directories, they are not fully +substitutable for them. Most platforms do not support loading dynamic +link libraries contained in zipfiles, so it is not possible to directly +import C extensions from ``.egg`` zipfiles. Similarly, there are many +existing libraries -- whether in Python or C -- that require actual +operating system filenames, and do not work with arbitrary "file-like" +objects or in-memory strings, and thus cannot operate directly on the +contents of zip files. + +To address these issues, the ``pkg_resources`` module provides a +"resource API" to support obtaining either the contents of a resource, +or a true operating system filename for the resource. If the egg +containing the resource is a directory, the resource's real filename +is simply returned. However, if the egg is a zipfile, then the +resource is first extracted to a cache directory, and the filename +within the cache is returned. + +The cache directory is determined by the ``pkg_resources`` API; please +see the ``set_cache_path()`` and ``get_default_cache()`` documentation +for details. 
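+
+As a brief sketch of how client code typically uses this API (the package and
+resource names here are hypothetical), callers ask for either the contents of
+a resource or a real filename for it, and never open the zipfile themselves::
+
+    import pkg_resources
+
+    # Contents of the resource as a (byte) string, whether the containing
+    # egg is a directory or a zipfile.
+    data = pkg_resources.resource_string("mypackage", "templates/page.html")
+
+    # A real operating-system filename for the same resource; for a zipped
+    # egg, the resource is first extracted to the cache directory.
+    path = pkg_resources.resource_filename("mypackage", "templates/page.html")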
+ + +The Extraction Process +---------------------- + +Resources are extracted to a cache subdirectory whose name is based +on the enclosing ``.egg`` filename and the path to the resource. If +there is already a file of the correct name, size, and timestamp, its +filename is returned to the requester. Otherwise, the desired file is +extracted first to a temporary name generated using +``mkstemp(".$extract",target_dir)``, and then its timestamp is set to +match the one in the zip file, before renaming it to its final name. +(Some collision detection and resolution code is used to handle the +fact that Windows doesn't overwrite files when renaming.) + +If a resource directory is requested, all of its contents are +recursively extracted in this fashion, to ensure that the directory +name can be used as if it were valid all along. + +If the resource requested for extraction is listed in the +``native_libs.txt`` or ``eager_resources.txt`` metadata files, then +*all* resources listed in *either* file will be extracted before the +requested resource's filename is returned, thus ensuring that all +C extensions and data used by them will be simultaneously available. + + +Extension Import Wrappers +------------------------- + +Since Python's built-in zip import feature does not support loading +C extension modules from zipfiles, the setuptools ``bdist_egg`` command +generates special import wrappers to make it work. + +The wrappers are ``.py`` files (along with corresponding ``.pyc`` +and/or ``.pyo`` files) that have the same module name as the +corresponding C extension. These wrappers are located in the same +package directory (or top-level directory) within the zipfile, so that +say, ``foomodule.so`` will get a corresponding ``foo.py``, while +``bar/baz.pyd`` will get a corresponding ``bar/baz.py``. + +These wrapper files contain a short stanza of Python code that asks +``pkg_resources`` for the filename of the corresponding C extension, +then reloads the module using the obtained filename. This will cause +``pkg_resources`` to first ensure that all of the egg's C extensions +(and any accompanying "eager resources") are extracted to the cache +before attempting to link to the C library. + +Note, by the way, that ``.egg`` directories will also contain these +wrapper files. However, Python's default import priority is such that +C extensions take precedence over same-named Python modules, so the +import wrappers are ignored unless the egg is a zipfile. + + +Installation and Path Management Issues +======================================= + +Python's initial setup of ``sys.path`` is very dependent on the Python +version and installation platform, as well as how Python was started +(i.e., script vs. ``-c`` vs. ``-m`` vs. interactive interpreter). +In fact, Python also provides only two relatively robust ways to affect +``sys.path`` outside of direct manipulation in code: the ``PYTHONPATH`` +environment variable, and ``.pth`` files. + +However, with no cross-platform way to safely and persistently change +environment variables, this leaves ``.pth`` files as EasyInstall's only +real option for persistent configuration of ``sys.path``. + +But ``.pth`` files are rather strictly limited in what they are allowed +to do normally. They add directories only to the *end* of ``sys.path``, +after any locally-installed ``site-packages`` directory, and they are +only processed *in* the ``site-packages`` directory to start with. 
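+
+For reference, a ``.pth`` file is just a text file read by the ``site`` module
+at startup: ordinary lines name directories to append to ``sys.path``, lines
+beginning with ``import`` are executed, and ``#`` lines are ignored. The paths
+in the following sketch are purely illustrative::
+
+    # a hypothetical foo.pth
+    /home/user/lib/python
+    import sys; sys.path.insert(0, '/home/user/overrides')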
+ +This is a double whammy for users who lack write access to that +directory, because they can't create a ``.pth`` file that Python will +read, and even if a sympathetic system administrator adds one for them +that calls ``site.addsitedir()`` to allow some other directory to +contain ``.pth`` files, they won't be able to install newer versions of +anything that's installed in the systemwide ``site-packages``, because +their paths will still be added *after* ``site-packages``. + +So EasyInstall applies two workarounds to solve these problems. + +The first is that EasyInstall leverages ``.pth`` files' "import" feature +to manipulate ``sys.path`` and ensure that anything EasyInstall adds +to a ``.pth`` file will always appear before both the standard library +and the local ``site-packages`` directories. Thus, it is always +possible for a user who can write a Python-read ``.pth`` file to ensure +that their packages come first in their own environment. + +Second, when installing to a ``PYTHONPATH`` directory (as opposed to +a "site" directory like ``site-packages``) EasyInstall will also install +a special version of the ``site`` module. Because it's in a +``PYTHONPATH`` directory, this module will get control before the +standard library version of ``site`` does. It will record the state of +``sys.path`` before invoking the "real" ``site`` module, and then +afterwards it processes any ``.pth`` files found in ``PYTHONPATH`` +directories, including all the fixups needed to ensure that eggs always +appear before the standard library in sys.path, but are in a relative +order to one another that is defined by their ``PYTHONPATH`` and +``.pth``-prescribed sequence. + +The net result of these changes is that ``sys.path`` order will be +as follows at runtime: + +1. The ``sys.argv[0]`` directory, or an empty string if no script + is being executed. + +2. All eggs installed by EasyInstall in any ``.pth`` file in each + ``PYTHONPATH`` directory, in order first by ``PYTHONPATH`` order, + then normal ``.pth`` processing order (which is to say alphabetical + by ``.pth`` filename, then by the order of listing within each + ``.pth`` file). + +3. All eggs installed by EasyInstall in any ``.pth`` file in each "site" + directory (such as ``site-packages``), following the same ordering + rules as for the ones on ``PYTHONPATH``. + +4. The ``PYTHONPATH`` directories themselves, in their original order + +5. Any paths from ``.pth`` files found on ``PYTHONPATH`` that were *not* + eggs installed by EasyInstall, again following the same relative + ordering rules. + +6. The standard library and "site" directories, along with the contents + of any ``.pth`` files found in the "site" directories. + +Notice that sections 1, 4, and 6 comprise the "normal" Python setup for +``sys.path``. Sections 2 and 3 are inserted to support eggs, and +section 5 emulates what the "normal" semantics of ``.pth`` files on +``PYTHONPATH`` would be if Python natively supported them. + +For further discussion of the tradeoffs that went into this design, as +well as notes on the actual magic inserted into ``.pth`` files to make +them do these things, please see also the following messages to the +distutils-SIG mailing list: + +* http://mail.python.org/pipermail/distutils-sig/2006-February/006026.html +* http://mail.python.org/pipermail/distutils-sig/2006-March/006123.html + + +Script Wrappers +--------------- + +EasyInstall never directly installs a project's original scripts to +a script installation directory. 
Instead, it writes short wrapper +scripts that first ensure that the project's dependencies are active +on sys.path, before invoking the original script. These wrappers +have a #! line that points to the version of Python that was used to +install them, and their second line is always a comment that indicates +the type of script wrapper, the project version required for the script +to run, and information identifying the script to be invoked. + +The format of this marker line is:: + + "# EASY-INSTALL-" script_type ": " tuple_of_strings "\n" + +The ``script_type`` is one of ``SCRIPT``, ``DEV-SCRIPT``, or +``ENTRY-SCRIPT``. The ``tuple_of_strings`` is a comma-separated +sequence of Python string constants. For ``SCRIPT`` and ``DEV-SCRIPT`` +wrappers, there are two strings: the project version requirement, and +the script name (as a filename within the ``scripts`` metadata +directory). For ``ENTRY-SCRIPT`` wrappers, there are three: +the project version requirement, the entry point group name, and the +entry point name. (See the "Automatic Script Creation" section in the +setuptools manual for more information about entry point scripts.) + +In each case, the project version requirement string will be a string +parseable with the ``pkg_resources`` modules' ``Requirement.parse()`` +classmethod. The only difference between a ``SCRIPT`` wrapper and a +``DEV-SCRIPT`` is that a ``DEV-SCRIPT`` actually executes the original +source script in the project's source tree, and is created when the +"setup.py develop" command is run. A ``SCRIPT`` wrapper, on the other +hand, uses the "installed" script written to the ``EGG-INFO/scripts`` +subdirectory of the corresponding ``.egg`` zipfile or directory. +(``.egg-info`` eggs do not have script wrappers associated with them, +except in the "setup.py develop" case.) + +The purpose of including the marker line in generated script wrappers is +to facilitate introspection of installed scripts, and their relationship +to installed eggs. For example, an uninstallation tool could use this +data to identify what scripts can safely be removed, and/or identify +what scripts would stop working if a particular egg is uninstalled. + diff --git a/docs/history.txt b/docs/history.txt new file mode 100644 index 0000000..8fd1dc6 --- /dev/null +++ b/docs/history.txt @@ -0,0 +1,46 @@ +:tocdepth: 2 + +.. _changes: + +History +******* + +.. include:: ../CHANGES (links).rst + +Credits +******* + +* The original design for the ``.egg`` format and the ``pkg_resources`` API was + co-created by Phillip Eby and Bob Ippolito. Bob also implemented the first + version of ``pkg_resources``, and supplied the OS X operating system version + compatibility algorithm. + +* Ian Bicking implemented many early "creature comfort" features of + easy_install, including support for downloading via Sourceforge and + Subversion repositories. Ian's comments on the Web-SIG about WSGI + application deployment also inspired the concept of "entry points" in eggs, + and he has given talks at PyCon and elsewhere to inform and educate the + community about eggs and setuptools. + +* Jim Fulton contributed time and effort to build automated tests of various + aspects of ``easy_install``, and supplied the doctests for the command-line + ``.exe`` wrappers on Windows. + +* Phillip J. Eby is the seminal author of setuptools, and + first proposed the idea of an importable binary distribution format for + Python application plug-ins. 
+ +* Significant parts of the implementation of setuptools were funded by the Open + Source Applications Foundation, to provide a plug-in infrastructure for the + Chandler PIM application. In addition, many OSAF staffers (such as Mike + "Code Bear" Taylor) contributed their time and stress as guinea pigs for the + use of eggs and setuptools, even before eggs were "cool". (Thanks, guys!) + +* Tarek Ziadé is the principal author of the Distribute fork, which + re-invigorated the community on the project, encouraged renewed innovation, + and addressed many defects. + +* Since the merge with Distribute, Jason R. Coombs is the + maintainer of setuptools. The project is maintained in coordination with + the Python Packaging Authority (PyPA) and the larger Python community. + diff --git a/docs/index.txt b/docs/index.txt new file mode 100644 index 0000000..74aabb5 --- /dev/null +++ b/docs/index.txt @@ -0,0 +1,25 @@ +Welcome to Setuptools' documentation! +===================================== + +Setuptools is a fully-featured, actively-maintained, and stable library +designed to facilitate packaging Python projects, where packaging includes: + + - Python package and module definitions + - Distribution package metadata + - Test hooks + - Project installation + - Platform-specific details + - Python 3 support + +Documentation content: + +.. toctree:: + :maxdepth: 2 + + setuptools + easy_install + pkg_resources + python3 + development + roadmap + history diff --git a/docs/pkg_resources.txt b/docs/pkg_resources.txt new file mode 100644 index 0000000..487320c --- /dev/null +++ b/docs/pkg_resources.txt @@ -0,0 +1,1953 @@ +============================================================= +Package Discovery and Resource Access using ``pkg_resources`` +============================================================= + +The ``pkg_resources`` module distributed with ``setuptools`` provides an API +for Python libraries to access their resource files, and for extensible +applications and frameworks to automatically discover plugins. It also +provides runtime support for using C extensions that are inside zipfile-format +eggs, support for merging packages that have separately-distributed modules or +subpackages, and APIs for managing Python's current "working set" of active +packages. + + +.. contents:: **Table of Contents** + + +-------- +Overview +-------- + +The ``pkg_resources`` module provides runtime facilities for finding, +introspecting, activating and using installed Python distributions. Some +of the more advanced features (notably the support for parallel installation +of multiple versions) rely specifically on the "egg" format (either as a +zip archive or subdirectory), while others (such as plugin discovery) will +work correctly so long as "egg-info" metadata directories are available for +relevant distributions. + +Eggs are a distribution format for Python modules, similar in concept to +Java's "jars" or Ruby's "gems", or the "wheel" format defined in PEP 427. +However, unlike a pure distribution format, eggs can also be installed and +added directly to ``sys.path`` as an import location. When installed in +this way, eggs are *discoverable*, meaning that they carry metadata that +unambiguously identifies their contents and dependencies. This means that +an installed egg can be *automatically* found and added to ``sys.path`` in +response to simple requests of the form, "get me everything I need to use +docutils' PDF support". 
This feature allows mutually conflicting versions of +a distribution to co-exist in the same Python installation, with individual +applications activating the desired version at runtime by manipulating the +contents of ``sys.path`` (this differs from the virtual environment +approach, which involves creating isolated environments for each +application). + +The following terms are needed in order to explain the capabilities offered +by this module: + +project + A library, framework, script, plugin, application, or collection of data + or other resources, or some combination thereof. Projects are assumed to + have "relatively unique" names, e.g. names registered with PyPI. + +release + A snapshot of a project at a particular point in time, denoted by a version + identifier. + +distribution + A file or files that represent a particular release. + +importable distribution + A file or directory that, if placed on ``sys.path``, allows Python to + import any modules contained within it. + +pluggable distribution + An importable distribution whose filename unambiguously identifies its + release (i.e. project and version), and whose contents unambiguously + specify what releases of other projects will satisfy its runtime + requirements. + +extra + An "extra" is an optional feature of a release, that may impose additional + runtime requirements. For example, if docutils PDF support required a + PDF support library to be present, docutils could define its PDF support as + an "extra", and list what other project releases need to be available in + order to provide it. + +environment + A collection of distributions potentially available for importing, but not + necessarily active. More than one distribution (i.e. release version) for + a given project may be present in an environment. + +working set + A collection of distributions actually available for importing, as on + ``sys.path``. At most one distribution (release version) of a given + project may be present in a working set, as otherwise there would be + ambiguity as to what to import. + +eggs + Eggs are pluggable distributions in one of the three formats currently + supported by ``pkg_resources``. There are built eggs, development eggs, + and egg links. Built eggs are directories or zipfiles whose name ends + with ``.egg`` and follows the egg naming conventions, and contain an + ``EGG-INFO`` subdirectory (zipped or otherwise). Development eggs are + normal directories of Python code with one or more ``ProjectName.egg-info`` + subdirectories. The development egg format is also used to provide a + default version of a distribution that is available to software that + doesn't use ``pkg_resources`` to request specific versions. Egg links + are ``*.egg-link`` files that contain the name of a built or + development egg, to support symbolic linking on platforms that do not + have native symbolic links (or where the symbolic link support is + limited). + +(For more information about these terms and concepts, see also this +`architectural overview`_ of ``pkg_resources`` and Python Eggs in general.) + +.. _architectural overview: http://mail.python.org/pipermail/distutils-sig/2005-June/004652.html + + +.. ----------------- +.. Developer's Guide +.. ----------------- + +.. This section isn't written yet. 
Currently planned topics include + Accessing Resources + Finding and Activating Package Distributions + get_provider() + require() + WorkingSet + iter_distributions + Running Scripts + Configuration + Namespace Packages + Extensible Applications and Frameworks + Locating entry points + Activation listeners + Metadata access + Extended Discovery and Installation + Supporting Custom PEP 302 Implementations +.. For now, please check out the extensive `API Reference`_ below. + + +------------- +API Reference +------------- + +Namespace Package Support +========================= + +A namespace package is a package that only contains other packages and modules, +with no direct contents of its own. Such packages can be split across +multiple, separately-packaged distributions. They are normally used to split +up large packages produced by a single organization, such as in the ``zope`` +namespace package for Zope Corporation packages, and the ``peak`` namespace +package for the Python Enterprise Application Kit. + +To create a namespace package, you list it in the ``namespace_packages`` +argument to ``setup()``, in your project's ``setup.py``. (See the +:ref:`setuptools documentation on namespace packages ` for +more information on this.) Also, you must add a ``declare_namespace()`` call +in the package's ``__init__.py`` file(s): + +``declare_namespace(name)`` + Declare that the dotted package name `name` is a "namespace package" whose + contained packages and modules may be spread across multiple distributions. + The named package's ``__path__`` will be extended to include the + corresponding package in all distributions on ``sys.path`` that contain a + package of that name. (More precisely, if an importer's + ``find_module(name)`` returns a loader, then it will also be searched for + the package's contents.) Whenever a Distribution's ``activate()`` method + is invoked, it checks for the presence of namespace packages and updates + their ``__path__`` contents accordingly. + +Applications that manipulate namespace packages or directly alter ``sys.path`` +at runtime may also need to use this API function: + +``fixup_namespace_packages(path_item)`` + Declare that `path_item` is a newly added item on ``sys.path`` that may + need to be used to update existing namespace packages. Ordinarily, this is + called for you when an egg is automatically added to ``sys.path``, but if + your application modifies ``sys.path`` to include locations that may + contain portions of a namespace package, you will need to call this + function to ensure they are added to the existing namespace packages. + +Although by default ``pkg_resources`` only supports namespace packages for +filesystem and zip importers, you can extend its support to other "importers" +compatible with PEP 302 using the ``register_namespace_handler()`` function. +See the section below on `Supporting Custom Importers`_ for details. + + +``WorkingSet`` Objects +====================== + +The ``WorkingSet`` class provides access to a collection of "active" +distributions. In general, there is only one meaningful ``WorkingSet`` +instance: the one that represents the distributions that are currently active +on ``sys.path``. This global instance is available under the name +``working_set`` in the ``pkg_resources`` module. However, specialized +tools may wish to manipulate working sets that don't correspond to +``sys.path``, and therefore may wish to create other ``WorkingSet`` instances. 
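+
+For illustration only, here is a minimal sketch of how a specialized tool
+might build its own ``WorkingSet`` from a custom path list rather than
+``sys.path`` (the plugin directory below is a purely hypothetical example)::
+
+    import pkg_resources
+
+    # Scan a hypothetical plugin directory instead of sys.path.
+    plugin_set = pkg_resources.WorkingSet(['/opt/example-app/plugins'])
+
+    # Iterating a WorkingSet yields one Distribution per active project.
+    for dist in plugin_set:
+        print(dist.project_name, dist.version)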
+ +It's important to note that the global ``working_set`` object is initialized +from ``sys.path`` when ``pkg_resources`` is first imported, but is only updated +if you do all future ``sys.path`` manipulation via ``pkg_resources`` APIs. If +you manually modify ``sys.path``, you must invoke the appropriate methods on +the ``working_set`` instance to keep it in sync. Unfortunately, Python does +not provide any way to detect arbitrary changes to a list object like +``sys.path``, so ``pkg_resources`` cannot automatically update the +``working_set`` based on changes to ``sys.path``. + +``WorkingSet(entries=None)`` + Create a ``WorkingSet`` from an iterable of path entries. If `entries` + is not supplied, it defaults to the value of ``sys.path`` at the time + the constructor is called. + + Note that you will not normally construct ``WorkingSet`` instances + yourself, but instead you will implicitly or explicitly use the global + ``working_set`` instance. For the most part, the ``pkg_resources`` API + is designed so that the ``working_set`` is used by default, such that you + don't have to explicitly refer to it most of the time. + +All distributions available directly on ``sys.path`` will be activated +automatically when ``pkg_resources`` is imported. This behaviour can cause +version conflicts for applications which require non-default versions of +those distributions. To handle this situation, ``pkg_resources`` checks for a +``__requires__`` attribute in the ``__main__`` module when initializing the +default working set, and uses this to ensure a suitable version of each +affected distribution is activated. For example:: + + __requires__ = ["CherryPy < 3"] # Must be set before pkg_resources import + import pkg_resources + + +Basic ``WorkingSet`` Methods +---------------------------- + +The following methods of ``WorkingSet`` objects are also available as module- +level functions in ``pkg_resources`` that apply to the default ``working_set`` +instance. Thus, you can use e.g. ``pkg_resources.require()`` as an +abbreviation for ``pkg_resources.working_set.require()``: + + +``require(*requirements)`` + Ensure that distributions matching `requirements` are activated + + `requirements` must be a string or a (possibly-nested) sequence + thereof, specifying the distributions and versions required. The + return value is a sequence of the distributions that needed to be + activated to fulfill the requirements; all relevant distributions are + included, even if they were already activated in this working set. + + For the syntax of requirement specifiers, see the section below on + `Requirements Parsing`_. + + In general, it should not be necessary for you to call this method + directly. It's intended more for use in quick-and-dirty scripting and + interactive interpreter hacking than for production use. If you're creating + an actual library or application, it's strongly recommended that you create + a "setup.py" script using ``setuptools``, and declare all your requirements + there. That way, tools like EasyInstall can automatically detect what + requirements your package has, and deal with them accordingly. + + Note that calling ``require('SomePackage')`` will not install + ``SomePackage`` if it isn't already present. If you need to do this, you + should use the ``resolve()`` method instead, which allows you to pass an + ``installer`` callback that will be invoked when a needed distribution + can't be found on the local machine. 
You can then have this callback + display a dialog, automatically download the needed distribution, or + whatever else is appropriate for your application. See the documentation + below on the ``resolve()`` method for more information, and also on the + ``obtain()`` method of ``Environment`` objects. + +``run_script(requires, script_name)`` + Locate distribution specified by `requires` and run its `script_name` + script. `requires` must be a string containing a requirement specifier. + (See `Requirements Parsing`_ below for the syntax.) + + The script, if found, will be executed in *the caller's globals*. That's + because this method is intended to be called from wrapper scripts that + act as a proxy for the "real" scripts in a distribution. A wrapper script + usually doesn't need to do anything but invoke this function with the + correct arguments. + + If you need more control over the script execution environment, you + probably want to use the ``run_script()`` method of a ``Distribution`` + object's `Metadata API`_ instead. + +``iter_entry_points(group, name=None)`` + Yield entry point objects from `group` matching `name` + + If `name` is None, yields all entry points in `group` from all + distributions in the working set, otherwise only ones matching both + `group` and `name` are yielded. Entry points are yielded from the active + distributions in the order that the distributions appear in the working + set. (For the global ``working_set``, this should be the same as the order + that they are listed in ``sys.path``.) Note that within the entry points + advertised by an individual distribution, there is no particular ordering. + + Please see the section below on `Entry Points`_ for more information. + + +``WorkingSet`` Methods and Attributes +------------------------------------- + +These methods are used to query or manipulate the contents of a specific +working set, so they must be explicitly invoked on a particular ``WorkingSet`` +instance: + +``add_entry(entry)`` + Add a path item to the ``entries``, finding any distributions on it. You + should use this when you add additional items to ``sys.path`` and you want + the global ``working_set`` to reflect the change. This method is also + called by the ``WorkingSet()`` constructor during initialization. + + This method uses ``find_distributions(entry,True)`` to find distributions + corresponding to the path entry, and then ``add()`` them. `entry` is + always appended to the ``entries`` attribute, even if it is already + present, however. (This is because ``sys.path`` can contain the same value + more than once, and the ``entries`` attribute should be able to reflect + this.) + +``__contains__(dist)`` + True if `dist` is active in this ``WorkingSet``. Note that only one + distribution for a given project can be active in a given ``WorkingSet``. + +``__iter__()`` + Yield distributions for non-duplicate projects in the working set. + The yield order is the order in which the items' path entries were + added to the working set. + +``find(req)`` + Find a distribution matching `req` (a ``Requirement`` instance). + If there is an active distribution for the requested project, this + returns it, as long as it meets the version requirement specified by + `req`. But, if there is an active distribution for the project and it + does *not* meet the `req` requirement, ``VersionConflict`` is raised. + If there is no active distribution for the requested project, ``None`` + is returned. 
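+
+    A minimal usage sketch (``SomeLib`` is a hypothetical project name, not
+    part of the API)::
+
+        from pkg_resources import working_set, Requirement
+
+        req = Requirement.parse("SomeLib>=1.0")
+        # find() returns the active Distribution or None; it raises
+        # VersionConflict if an incompatible version of SomeLib is active.
+        dist = working_set.find(req)
+        if dist is not None:
+            print(dist.project_name, dist.version)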
+ +``resolve(requirements, env=None, installer=None)`` + List all distributions needed to (recursively) meet `requirements` + + `requirements` must be a sequence of ``Requirement`` objects. `env`, + if supplied, should be an ``Environment`` instance. If + not supplied, an ``Environment`` is created from the working set's + ``entries``. `installer`, if supplied, will be invoked with each + requirement that cannot be met by an already-installed distribution; it + should return a ``Distribution`` or ``None``. (See the ``obtain()`` method + of `Environment Objects`_, below, for more information on the `installer` + argument.) + +``add(dist, entry=None)`` + Add `dist` to working set, associated with `entry` + + If `entry` is unspecified, it defaults to ``dist.location``. On exit from + this routine, `entry` is added to the end of the working set's ``.entries`` + (if it wasn't already present). + + `dist` is only added to the working set if it's for a project that + doesn't already have a distribution active in the set. If it's + successfully added, any callbacks registered with the ``subscribe()`` + method will be called. (See `Receiving Change Notifications`_, below.) + + Note: ``add()`` is automatically called for you by the ``require()`` + method, so you don't normally need to use this method directly. + +``entries`` + This attribute represents a "shadow" ``sys.path``, primarily useful for + debugging. If you are experiencing import problems, you should check + the global ``working_set`` object's ``entries`` against ``sys.path``, to + ensure that they match. If they do not, then some part of your program + is manipulating ``sys.path`` without updating the ``working_set`` + accordingly. IMPORTANT NOTE: do not directly manipulate this attribute! + Setting it equal to ``sys.path`` will not fix your problem, any more than + putting black tape over an "engine warning" light will fix your car! If + this attribute is out of sync with ``sys.path``, it's merely an *indicator* + of the problem, not the cause of it. + + +Receiving Change Notifications +------------------------------ + +Extensible applications and frameworks may need to receive notification when +a new distribution (such as a plug-in component) has been added to a working +set. This is what the ``subscribe()`` method and ``add_activation_listener()`` +function are for. + +``subscribe(callback)`` + Invoke ``callback(distribution)`` once for each active distribution that is + in the set now, or gets added later. Because the callback is invoked for + already-active distributions, you do not need to loop over the working set + yourself to deal with the existing items; just register the callback and + be prepared for the fact that it will be called immediately by this method. + + Note that callbacks *must not* allow exceptions to propagate, or they will + interfere with the operation of other callbacks and possibly result in an + inconsistent working set state. Callbacks should use a try/except block + to ignore, log, or otherwise process any errors, especially since the code + that caused the callback to be invoked is unlikely to be able to handle + the errors any better than the callback itself. + +``pkg_resources.add_activation_listener()`` is an alternate spelling of +``pkg_resources.working_set.subscribe()``. + + +Locating Plugins +---------------- + +Extensible applications will sometimes have a "plugin directory" or a set of +plugin directories, from which they want to load entry points or other +metadata. 
The ``find_plugins()`` method allows you to do this, by scanning an
+environment for the newest version of each project that can be safely loaded
+without conflicts or missing requirements.
+
+``find_plugins(plugin_env, full_env=None, fallback=True)``
+    Scan `plugin_env` and identify which distributions could be added to this
+    working set without version conflicts or missing requirements.
+
+    Example usage::
+
+        distributions, errors = working_set.find_plugins(
+            Environment(plugin_dirlist)
+        )
+        for dist in distributions:
+            working_set.add(dist)         # add plugins+libs to sys.path
+        print("Couldn't load", errors)    # display errors
+
+    The `plugin_env` should be an ``Environment`` instance that contains only
+    distributions that are in the project's "plugin directory" or directories.
+    The `full_env`, if supplied, should be an ``Environment`` instance that
+    contains all currently-available distributions.
+
+    If `full_env` is not supplied, one is created automatically from the
+    ``WorkingSet`` this method is called on, which will typically mean that
+    every directory on ``sys.path`` will be scanned for distributions.
+
+    This method returns a 2-tuple: (`distributions`, `error_info`), where
+    `distributions` is a list of the distributions found in `plugin_env` that
+    were loadable, along with any other distributions that are needed to resolve
+    their dependencies. `error_info` is a dictionary mapping unloadable plugin
+    distributions to an exception instance describing the error that occurred.
+    Usually this will be a ``DistributionNotFound`` or ``VersionConflict``
+    instance.
+
+    Most applications will use this method mainly on the master ``working_set``
+    instance in ``pkg_resources``, and then immediately add the returned
+    distributions to the working set so that they are available on sys.path.
+    This will make it possible to find any entry points, and allow any other
+    metadata tracking and hooks to be activated.
+
+    The resolution algorithm used by ``find_plugins()`` is as follows. First,
+    the project names of the distributions present in `plugin_env` are sorted.
+    Then, each project's eggs are tried in descending version order (i.e.,
+    newest version first).
+
+    An attempt is made to resolve each egg's dependencies. If the attempt is
+    successful, the egg and its dependencies are added to the output list and to
+    a temporary copy of the working set. The resolution process continues with
+    the next project name, and no older eggs for that project are tried.
+
+    If the resolution attempt fails, however, the error is added to the error
+    dictionary. If the `fallback` flag is true, the next older version of the
+    plugin is tried, until a working version is found. If false, the resolution
+    process continues with the next plugin project name.
+
+    Some applications may have stricter fallback requirements than others. For
+    example, an application that has a database schema or persistent objects
+    may not be able to safely downgrade a version of a package. Others may want
+    to ensure that a new plugin configuration is either 100% good or else
+    revert to a known-good configuration. (That is, they may wish to revert to
+    a known configuration if the `error_info` return value is non-empty.)
+
+    Note that this algorithm gives precedence to satisfying the dependencies of
+    alphabetically prior project names in case of version conflicts.
If two + projects named "AaronsPlugin" and "ZekesPlugin" both need different versions + of "TomsLibrary", then "AaronsPlugin" will win and "ZekesPlugin" will be + disabled due to version conflict. + + +``Environment`` Objects +======================= + +An "environment" is a collection of ``Distribution`` objects, usually ones +that are present and potentially importable on the current platform. +``Environment`` objects are used by ``pkg_resources`` to index available +distributions during dependency resolution. + +``Environment(search_path=None, platform=get_supported_platform(), python=PY_MAJOR)`` + Create an environment snapshot by scanning `search_path` for distributions + compatible with `platform` and `python`. `search_path` should be a + sequence of strings such as might be used on ``sys.path``. If a + `search_path` isn't supplied, ``sys.path`` is used. + + `platform` is an optional string specifying the name of the platform + that platform-specific distributions must be compatible with. If + unspecified, it defaults to the current platform. `python` is an + optional string naming the desired version of Python (e.g. ``'2.4'``); + it defaults to the currently-running version. + + You may explicitly set `platform` (and/or `python`) to ``None`` if you + wish to include *all* distributions, not just those compatible with the + running platform or Python version. + + Note that `search_path` is scanned immediately for distributions, and the + resulting ``Environment`` is a snapshot of the found distributions. It + is not automatically updated if the system's state changes due to e.g. + installation or removal of distributions. + +``__getitem__(project_name)`` + Returns a list of distributions for the given project name, ordered + from newest to oldest version. (And highest to lowest format precedence + for distributions that contain the same version of the project.) If there + are no distributions for the project, returns an empty list. + +``__iter__()`` + Yield the unique project names of the distributions in this environment. + The yielded names are always in lower case. + +``add(dist)`` + Add `dist` to the environment if it matches the platform and python version + specified at creation time, and only if the distribution hasn't already + been added. (i.e., adding the same distribution more than once is a no-op.) + +``remove(dist)`` + Remove `dist` from the environment. + +``can_add(dist)`` + Is distribution `dist` acceptable for this environment? If it's not + compatible with the ``platform`` and ``python`` version values specified + when the environment was created, a false value is returned. + +``__add__(dist_or_env)`` (``+`` operator) + Add a distribution or environment to an ``Environment`` instance, returning + a *new* environment object that contains all the distributions previously + contained by both. The new environment will have a ``platform`` and + ``python`` of ``None``, meaning that it will not reject any distributions + from being added to it; it will simply accept whatever is added. If you + want the added items to be filtered for platform and Python version, or + you want to add them to the *same* environment instance, you should use + in-place addition (``+=``) instead. + +``__iadd__(dist_or_env)`` (``+=`` operator) + Add a distribution or environment to an ``Environment`` instance + *in-place*, updating the existing instance and returning it. 
The + ``platform`` and ``python`` filter attributes take effect, so distributions + in the source that do not have a suitable platform string or Python version + are silently ignored. + +``best_match(req, working_set, installer=None)`` + Find distribution best matching `req` and usable on `working_set` + + This calls the ``find(req)`` method of the `working_set` to see if a + suitable distribution is already active. (This may raise + ``VersionConflict`` if an unsuitable version of the project is already + active in the specified `working_set`.) If a suitable distribution isn't + active, this method returns the newest distribution in the environment + that meets the ``Requirement`` in `req`. If no suitable distribution is + found, and `installer` is supplied, then the result of calling + the environment's ``obtain(req, installer)`` method will be returned. + +``obtain(requirement, installer=None)`` + Obtain a distro that matches requirement (e.g. via download). In the + base ``Environment`` class, this routine just returns + ``installer(requirement)``, unless `installer` is None, in which case + None is returned instead. This method is a hook that allows subclasses + to attempt other ways of obtaining a distribution before falling back + to the `installer` argument. + +``scan(search_path=None)`` + Scan `search_path` for distributions usable on `platform` + + Any distributions found are added to the environment. `search_path` should + be a sequence of strings such as might be used on ``sys.path``. If not + supplied, ``sys.path`` is used. Only distributions conforming to + the platform/python version defined at initialization are added. This + method is a shortcut for using the ``find_distributions()`` function to + find the distributions from each item in `search_path`, and then calling + ``add()`` to add each one to the environment. + + +``Requirement`` Objects +======================= + +``Requirement`` objects express what versions of a project are suitable for +some purpose. These objects (or their string form) are used by various +``pkg_resources`` APIs in order to find distributions that a script or +distribution needs. + + +Requirements Parsing +-------------------- + +``parse_requirements(s)`` + Yield ``Requirement`` objects for a string or iterable of lines. Each + requirement must start on a new line. See below for syntax. + +``Requirement.parse(s)`` + Create a ``Requirement`` object from a string or iterable of lines. A + ``ValueError`` is raised if the string or lines do not contain a valid + requirement specifier, or if they contain more than one specifier. (To + parse multiple specifiers from a string or iterable of strings, use + ``parse_requirements()`` instead.) + + The syntax of a requirement specifier is defined in full in PEP 508. + + Some examples of valid requirement specifiers:: + + FooProject >= 1.2 + Fizzy [foo, bar] + PickyThing<1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1 + SomethingWhoseVersionIDontCareAbout + SomethingWithMarker[foo]>1.0;python_version<"2.7" + + The project name is the only required portion of a requirement string, and + if it's the only thing supplied, the requirement will accept any version + of that project. + + The "extras" in a requirement are used to request optional features of a + project, that may require additional project distributions in order to + function. For example, if the hypothetical "Report-O-Rama" project offered + optional PDF support, it might require an additional library in order to + provide that support. 
Thus, a project needing Report-O-Rama's PDF features
+    could use a requirement of ``Report-O-Rama[PDF]`` to request installation
+    or activation of both Report-O-Rama and any libraries it needs in order to
+    provide PDF support. For example, you could use::
+
+        easy_install.py Report-O-Rama[PDF]
+
+    To install the necessary packages using the EasyInstall program, or call
+    ``pkg_resources.require('Report-O-Rama[PDF]')`` to add the necessary
+    distributions to sys.path at runtime.
+
+    The "markers" in a requirement are used to specify when a requirement
+    should be installed -- the requirement will be installed if the marker
+    evaluates as true in the current environment. For example, specifying
+    ``argparse;python_version<"2.7"`` will not install in a Python 2.7 or 3.3
+    environment, but will in a Python 2.6 environment.
+
+``Requirement`` Methods and Attributes
+--------------------------------------
+
+``__contains__(dist_or_version)``
+    Return true if `dist_or_version` fits the criteria for this requirement.
+    If `dist_or_version` is a ``Distribution`` object, its project name must
+    match the requirement's project name, and its version must meet the
+    requirement's version criteria. If `dist_or_version` is a string, it is
+    parsed using the ``parse_version()`` utility function. Otherwise, it is
+    assumed to be an already-parsed version.
+
+    The ``Requirement`` object's version specifiers (``.specs``) are internally
+    sorted into ascending version order, and used to establish what ranges of
+    versions are acceptable. Adjacent redundant conditions are effectively
+    consolidated (e.g. ``">1, >2"`` produces the same results as ``">2"``, and
+    ``"<2,<3"`` produces the same results as ``"<2"``). ``"!="`` versions are
+    excised from the ranges they fall within. The version being tested for
+    acceptability is then checked for membership in the resulting ranges.
+
+``__eq__(other_requirement)``
+    A requirement compares equal to another requirement if they have
+    case-insensitively equal project names, version specifiers, and "extras".
+    (The order that extras and version specifiers are in is also ignored.)
+    Equal requirements also have equal hashes, so that requirements can be
+    used in sets or as dictionary keys.
+
+``__str__()``
+    The string form of a ``Requirement`` is a string that, if passed to
+    ``Requirement.parse()``, would return an equal ``Requirement`` object.
+
+``project_name``
+    The name of the required project.
+
+``key``
+    An all-lowercase version of the ``project_name``, useful for comparison
+    or indexing.
+
+``extras``
+    A tuple of names of "extras" that this requirement calls for. (These will
+    be all-lowercase and normalized using the ``safe_extra()`` parsing utility
+    function, so they may not exactly equal the extras the requirement was
+    created with.)
+
+``specs``
+    A list of ``(op,version)`` tuples, sorted in ascending parsed-version
+    order. The `op` in each tuple is a comparison operator, represented as
+    a string. The `version` is the (unparsed) version number.
+
+``marker``
+    An instance of ``packaging.markers.Marker`` that allows evaluation
+    against the current environment. May be None if no marker specified.
+
+``url``
+    The location to download the requirement from if specified.
+
+Entry Points
+============
+
+Entry points are a simple way for distributions to "advertise" Python objects
+(such as functions or classes) for use by other distributions.
Extensible +applications and frameworks can search for entry points with a particular name +or group, either from a specific distribution or from all active distributions +on sys.path, and then inspect or load the advertised objects at will. + +Entry points belong to "groups" which are named with a dotted name similar to +a Python package or module name. For example, the ``setuptools`` package uses +an entry point named ``distutils.commands`` in order to find commands defined +by distutils extensions. ``setuptools`` treats the names of entry points +defined in that group as the acceptable commands for a setup script. + +In a similar way, other packages can define their own entry point groups, +either using dynamic names within the group (like ``distutils.commands``), or +possibly using predefined names within the group. For example, a blogging +framework that offers various pre- or post-publishing hooks might define an +entry point group and look for entry points named "pre_process" and +"post_process" within that group. + +To advertise an entry point, a project needs to use ``setuptools`` and provide +an ``entry_points`` argument to ``setup()`` in its setup script, so that the +entry points will be included in the distribution's metadata. For more +details, see the ``setuptools`` documentation. (XXX link here to setuptools) + +Each project distribution can advertise at most one entry point of a given +name within the same entry point group. For example, a distutils extension +could advertise two different ``distutils.commands`` entry points, as long as +they had different names. However, there is nothing that prevents *different* +projects from advertising entry points of the same name in the same group. In +some cases, this is a desirable thing, since the application or framework that +uses the entry points may be calling them as hooks, or in some other way +combining them. It is up to the application or framework to decide what to do +if multiple distributions advertise an entry point; some possibilities include +using both entry points, displaying an error message, using the first one found +in sys.path order, etc. + + +Convenience API +--------------- + +In the following functions, the `dist` argument can be a ``Distribution`` +instance, a ``Requirement`` instance, or a string specifying a requirement +(i.e. project name, version, etc.). If the argument is a string or +``Requirement``, the specified distribution is located (and added to sys.path +if not already present). An error will be raised if a matching distribution is +not available. + +The `group` argument should be a string containing a dotted identifier, +identifying an entry point group. If you are defining an entry point group, +you should include some portion of your package's name in the group name so as +to avoid collision with other packages' entry point groups. + +``load_entry_point(dist, group, name)`` + Load the named entry point from the specified distribution, or raise + ``ImportError``. + +``get_entry_info(dist, group, name)`` + Return an ``EntryPoint`` object for the given `group` and `name` from + the specified distribution. Returns ``None`` if the distribution has not + advertised a matching entry point. + +``get_entry_map(dist, group=None)`` + Return the distribution's entry point map for `group`, or the full entry + map for the distribution. This function always returns a dictionary, + even if the distribution advertises no entry points. 
If `group` is given, + the dictionary maps entry point names to the corresponding ``EntryPoint`` + object. If `group` is None, the dictionary maps group names to + dictionaries that then map entry point names to the corresponding + ``EntryPoint`` instance in that group. + +``iter_entry_points(group, name=None)`` + Yield entry point objects from `group` matching `name`. + + If `name` is None, yields all entry points in `group` from all + distributions in the working set on sys.path, otherwise only ones matching + both `group` and `name` are yielded. Entry points are yielded from + the active distributions in the order that the distributions appear on + sys.path. (Within entry points for a particular distribution, however, + there is no particular ordering.) + + (This API is actually a method of the global ``working_set`` object; see + the section above on `Basic WorkingSet Methods`_ for more information.) + + +Creating and Parsing +-------------------- + +``EntryPoint(name, module_name, attrs=(), extras=(), dist=None)`` + Create an ``EntryPoint`` instance. `name` is the entry point name. The + `module_name` is the (dotted) name of the module containing the advertised + object. `attrs` is an optional tuple of names to look up from the + module to obtain the advertised object. For example, an `attrs` of + ``("foo","bar")`` and a `module_name` of ``"baz"`` would mean that the + advertised object could be obtained by the following code:: + + import baz + advertised_object = baz.foo.bar + + The `extras` are an optional tuple of "extra feature" names that the + distribution needs in order to provide this entry point. When the + entry point is loaded, these extra features are looked up in the `dist` + argument to find out what other distributions may need to be activated + on sys.path; see the ``load()`` method for more details. The `extras` + argument is only meaningful if `dist` is specified. `dist` must be + a ``Distribution`` instance. + +``EntryPoint.parse(src, dist=None)`` (classmethod) + Parse a single entry point from string `src` + + Entry point syntax follows the form:: + + name = some.module:some.attr [extra1,extra2] + + The entry name and module name are required, but the ``:attrs`` and + ``[extras]`` parts are optional, as is the whitespace shown between + some of the items. The `dist` argument is passed through to the + ``EntryPoint()`` constructor, along with the other values parsed from + `src`. + +``EntryPoint.parse_group(group, lines, dist=None)`` (classmethod) + Parse `lines` (a string or sequence of lines) to create a dictionary + mapping entry point names to ``EntryPoint`` objects. ``ValueError`` is + raised if entry point names are duplicated, if `group` is not a valid + entry point group name, or if there are any syntax errors. (Note: the + `group` parameter is used only for validation and to create more + informative error messages.) If `dist` is provided, it will be used to + set the ``dist`` attribute of the created ``EntryPoint`` objects. + +``EntryPoint.parse_map(data, dist=None)`` (classmethod) + Parse `data` into a dictionary mapping group names to dictionaries mapping + entry point names to ``EntryPoint`` objects. If `data` is a dictionary, + then the keys are used as group names and the values are passed to + ``parse_group()`` as the `lines` argument. If `data` is a string or + sequence of lines, it is first split into .ini-style sections (using + the ``split_sections()`` utility function) and the section names are used + as group names. 
In either case, the `dist` argument is passed through to + ``parse_group()`` so that the entry points will be linked to the specified + distribution. + + +``EntryPoint`` Objects +---------------------- + +For simple introspection, ``EntryPoint`` objects have attributes that +correspond exactly to the constructor argument names: ``name``, +``module_name``, ``attrs``, ``extras``, and ``dist`` are all available. In +addition, the following methods are provided: + +``load()`` + Load the entry point, returning the advertised Python object. Effectively + calls ``self.require()`` then returns ``self.resolve()``. + +``require(env=None, installer=None)`` + Ensure that any "extras" needed by the entry point are available on + sys.path. ``UnknownExtra`` is raised if the ``EntryPoint`` has ``extras``, + but no ``dist``, or if the named extras are not defined by the + distribution. If `env` is supplied, it must be an ``Environment``, and it + will be used to search for needed distributions if they are not already + present on sys.path. If `installer` is supplied, it must be a callable + taking a ``Requirement`` instance and returning a matching importable + ``Distribution`` instance or None. + +``resolve()`` + Resolve the entry point from its module and attrs, returning the advertised + Python object. Raises ``ImportError`` if it cannot be obtained. + +``__str__()`` + The string form of an ``EntryPoint`` is a string that could be passed to + ``EntryPoint.parse()`` to produce an equivalent ``EntryPoint``. + + +``Distribution`` Objects +======================== + +``Distribution`` objects represent collections of Python code that may or may +not be importable, and may or may not have metadata and resources associated +with them. Their metadata may include information such as what other projects +the distribution depends on, what entry points the distribution advertises, and +so on. + + +Getting or Creating Distributions +--------------------------------- + +Most commonly, you'll obtain ``Distribution`` objects from a ``WorkingSet`` or +an ``Environment``. (See the sections above on `WorkingSet Objects`_ and +`Environment Objects`_, which are containers for active distributions and +available distributions, respectively.) You can also obtain ``Distribution`` +objects from one of these high-level APIs: + +``find_distributions(path_item, only=False)`` + Yield distributions accessible via `path_item`. If `only` is true, yield + only distributions whose ``location`` is equal to `path_item`. In other + words, if `only` is true, this yields any distributions that would be + importable if `path_item` were on ``sys.path``. If `only` is false, this + also yields distributions that are "in" or "under" `path_item`, but would + not be importable unless their locations were also added to ``sys.path``. + +``get_distribution(dist_spec)`` + Return a ``Distribution`` object for a given ``Requirement`` or string. + If `dist_spec` is already a ``Distribution`` instance, it is returned. + If it is a ``Requirement`` object or a string that can be parsed into one, + it is used to locate and activate a matching distribution, which is then + returned. + +However, if you're creating specialized tools for working with distributions, +or creating a new distribution format, you may also need to create +``Distribution`` objects directly, using one of the three constructors below. + +These constructors all take an optional `metadata` argument, which is used to +access any resources or metadata associated with the distribution. 
`metadata`
+must be an object that implements the ``IResourceProvider`` interface, or None.
+If it is None, an ``EmptyProvider`` is used instead. ``Distribution`` objects
+implement both the `IResourceProvider`_ and `IMetadataProvider Methods`_ by
+delegating them to the `metadata` object.
+
+``Distribution.from_location(location, basename, metadata=None, **kw)`` (classmethod)
+    Create a distribution for `location`, which must be a string such as a
+    URL, filename, or other string that might be used on ``sys.path``.
+    `basename` is a string naming the distribution, like ``Foo-1.2-py2.4.egg``.
+    If `basename` ends with ``.egg``, then the project's name, version, python
+    version and platform are extracted from the filename and used to set those
+    properties of the created distribution. Any additional keyword arguments
+    are forwarded to the ``Distribution()`` constructor.
+
+``Distribution.from_filename(filename, metadata=None, **kw)`` (classmethod)
+    Create a distribution by parsing a local filename. This is a shorter way
+    of saying ``Distribution.from_location(normalize_path(filename),
+    os.path.basename(filename), metadata)``. In other words, it creates a
+    distribution whose location is the normalized form of the filename, parsing
+    name and version information from the base portion of the filename. Any
+    additional keyword arguments are forwarded to the ``Distribution()``
+    constructor.
+
+``Distribution(location,metadata,project_name,version,py_version,platform,precedence)``
+    Create a distribution by setting its properties. All arguments are
+    optional and default to None, except for `py_version` (which defaults to
+    the current Python version) and `precedence` (which defaults to
+    ``EGG_DIST``; for more details see ``precedence`` under `Distribution
+    Attributes`_ below). Note that it's usually easier to use the
+    ``from_filename()`` or ``from_location()`` constructors than to specify
+    all these arguments individually.
+
+
+``Distribution`` Attributes
+---------------------------
+
+location
+    A string indicating the distribution's location. For an importable
+    distribution, this is the string that would be added to ``sys.path`` to
+    make it actively importable. For non-importable distributions, this is
+    simply a filename, URL, or other way of locating the distribution.
+
+project_name
+    A string, naming the project that this distribution is for. Project names
+    are defined by a project's setup script, and they are used to identify
+    projects on PyPI. When a ``Distribution`` is constructed, the
+    `project_name` argument is passed through the ``safe_name()`` utility
+    function to filter out any unacceptable characters.
+
+key
+    ``dist.key`` is short for ``dist.project_name.lower()``. It's used for
+    case-insensitive comparison and indexing of distributions by project name.
+
+extras
+    A list of strings, giving the names of extra features defined by the
+    project's dependency list (the ``extras_require`` argument specified in
+    the project's setup script).
+
+version
+    A string denoting what release of the project this distribution contains.
+    When a ``Distribution`` is constructed, the `version` argument is passed
+    through the ``safe_version()`` utility function to filter out any
+    unacceptable characters. If no `version` is specified at construction
+    time, then attempting to access this attribute later will cause the
+    ``Distribution`` to try to discover its version by reading its ``PKG-INFO``
+    metadata file.
If ``PKG-INFO`` is unavailable or can't be parsed, + ``ValueError`` is raised. + +parsed_version + The ``parsed_version`` is an object representing a "parsed" form of the + distribution's ``version``. ``dist.parsed_version`` is a shortcut for + calling ``parse_version(dist.version)``. It is used to compare or sort + distributions by version. (See the `Parsing Utilities`_ section below for + more information on the ``parse_version()`` function.) Note that accessing + ``parsed_version`` may result in a ``ValueError`` if the ``Distribution`` + was constructed without a `version` and without `metadata` capable of + supplying the missing version info. + +py_version + The major/minor Python version the distribution supports, as a string. + For example, "2.7" or "3.4". The default is the current version of Python. + +platform + A string representing the platform the distribution is intended for, or + ``None`` if the distribution is "pure Python" and therefore cross-platform. + See `Platform Utilities`_ below for more information on platform strings. + +precedence + A distribution's ``precedence`` is used to determine the relative order of + two distributions that have the same ``project_name`` and + ``parsed_version``. The default precedence is ``pkg_resources.EGG_DIST``, + which is the highest (i.e. most preferred) precedence. The full list + of predefined precedences, from most preferred to least preferred, is: + ``EGG_DIST``, ``BINARY_DIST``, ``SOURCE_DIST``, ``CHECKOUT_DIST``, and + ``DEVELOP_DIST``. Normally, precedences other than ``EGG_DIST`` are used + only by the ``setuptools.package_index`` module, when sorting distributions + found in a package index to determine their suitability for installation. + "System" and "Development" eggs (i.e., ones that use the ``.egg-info`` + format), however, are automatically given a precedence of ``DEVELOP_DIST``. + + + +``Distribution`` Methods +------------------------ + +``activate(path=None)`` + Ensure distribution is importable on `path`. If `path` is None, + ``sys.path`` is used instead. This ensures that the distribution's + ``location`` is in the `path` list, and it also performs any necessary + namespace package fixups or declarations. (That is, if the distribution + contains namespace packages, this method ensures that they are declared, + and that the distribution's contents for those namespace packages are + merged with the contents provided by any other active distributions. See + the section above on `Namespace Package Support`_ for more information.) + + ``pkg_resources`` adds a notification callback to the global ``working_set`` + that ensures this method is called whenever a distribution is added to it. + Therefore, you should not normally need to explicitly call this method. + (Note that this means that namespace packages on ``sys.path`` are always + imported as soon as ``pkg_resources`` is, which is another reason why + namespace packages should not contain any code or import statements.) + +``as_requirement()`` + Return a ``Requirement`` instance that matches this distribution's project + name and version. + +``requires(extras=())`` + List the ``Requirement`` objects that specify this distribution's + dependencies. If `extras` is specified, it should be a sequence of names + of "extras" defined by the distribution, and the list returned will then + include any dependencies needed to support the named "extras". + +``clone(**kw)`` + Create a copy of the distribution. 
Any supplied keyword arguments override + the corresponding argument to the ``Distribution()`` constructor, allowing + you to change some of the copied distribution's attributes. + +``egg_name()`` + Return what this distribution's standard filename should be, not including + the ".egg" extension. For example, a distribution for project "Foo" + version 1.2 that runs on Python 2.3 for Windows would have an ``egg_name()`` + of ``Foo-1.2-py2.3-win32``. Any dashes in the name or version are + converted to underscores. (``Distribution.from_location()`` will convert + them back when parsing a ".egg" file name.) + +``__cmp__(other)``, ``__hash__()`` + Distribution objects are hashed and compared on the basis of their parsed + version and precedence, followed by their key (lowercase project name), + location, Python version, and platform. + +The following methods are used to access ``EntryPoint`` objects advertised +by the distribution. See the section above on `Entry Points`_ for more +detailed information about these operations: + +``get_entry_info(group, name)`` + Return the ``EntryPoint`` object for `group` and `name`, or None if no + such point is advertised by this distribution. + +``get_entry_map(group=None)`` + Return the entry point map for `group`. If `group` is None, return + a dictionary mapping group names to entry point maps for all groups. + (An entry point map is a dictionary of entry point names to ``EntryPoint`` + objects.) + +``load_entry_point(group, name)`` + Short for ``get_entry_info(group, name).load()``. Returns the object + advertised by the named entry point, or raises ``ImportError`` if + the entry point isn't advertised by this distribution, or there is some + other import problem. + +In addition to the above methods, ``Distribution`` objects also implement all +of the `IResourceProvider`_ and `IMetadataProvider Methods`_ (which are +documented in later sections): + +* ``has_metadata(name)`` +* ``metadata_isdir(name)`` +* ``metadata_listdir(name)`` +* ``get_metadata(name)`` +* ``get_metadata_lines(name)`` +* ``run_script(script_name, namespace)`` +* ``get_resource_filename(manager, resource_name)`` +* ``get_resource_stream(manager, resource_name)`` +* ``get_resource_string(manager, resource_name)`` +* ``has_resource(resource_name)`` +* ``resource_isdir(resource_name)`` +* ``resource_listdir(resource_name)`` + +If the distribution was created with a `metadata` argument, these resource and +metadata access methods are all delegated to that `metadata` provider. +Otherwise, they are delegated to an ``EmptyProvider``, so that the distribution +will appear to have no resources or metadata. This delegation approach is used +so that supporting custom importers or new distribution formats can be done +simply by creating an appropriate `IResourceProvider`_ implementation; see the +section below on `Supporting Custom Importers`_ for more details. + + +``ResourceManager`` API +======================= + +The ``ResourceManager`` class provides uniform access to package resources, +whether those resources exist as files and directories or are compressed in +an archive of some kind. + +Normally, you do not need to create or explicitly manage ``ResourceManager`` +instances, as the ``pkg_resources`` module creates a global instance for you, +and makes most of its methods available as top-level names in the +``pkg_resources`` module namespace. 
So, for example, this code actually
+calls the ``resource_string()`` method of the global ``ResourceManager``::
+
+    import pkg_resources
+    my_data = pkg_resources.resource_string(__name__, "foo.dat")
+
+Thus, you can use the APIs below without needing an explicit
+``ResourceManager`` instance; just import and use them as needed.
+
+
+Basic Resource Access
+---------------------
+
+In the following methods, the `package_or_requirement` argument may be either
+a Python package/module name (e.g. ``foo.bar``) or a ``Requirement`` instance.
+If it is a package or module name, the named module or package must be
+importable (i.e., be in a distribution or directory on ``sys.path``), and the
+`resource_name` argument is interpreted relative to the named package. (Note
+that if a module name is used, then the resource name is relative to the
+package immediately containing the named module. Also, you should not use
+a namespace package name, because a namespace package can be spread across
+multiple distributions, and is therefore ambiguous as to which distribution
+should be searched for the resource.)
+
+If it is a ``Requirement``, then the requirement is automatically resolved
+(searching the current ``Environment`` if necessary) and a matching
+distribution is added to the ``WorkingSet`` and ``sys.path`` if one was not
+already present. (Unless the ``Requirement`` can't be satisfied, in which
+case an exception is raised.) The `resource_name` argument is then interpreted
+relative to the root of the identified distribution; i.e. its first path
+segment will be treated as a peer of the top-level modules or packages in the
+distribution.
+
+Note that resource names must be ``/``-separated paths and cannot be absolute
+(i.e. no leading ``/``) or contain relative names like ``".."``. Do *not* use
+``os.path`` routines to manipulate resource paths, as they are *not* filesystem
+paths.
+
+``resource_exists(package_or_requirement, resource_name)``
+    Does the named resource exist? Return ``True`` or ``False`` accordingly.
+
+``resource_stream(package_or_requirement, resource_name)``
+    Return a readable file-like object for the specified resource; it may be
+    an actual file, a ``StringIO``, or some similar object. The stream is
+    in "binary mode", in the sense that whatever bytes are in the resource
+    will be read as-is.
+
+``resource_string(package_or_requirement, resource_name)``
+    Return the specified resource as a string. The resource is read in
+    binary fashion, such that the returned string contains exactly the bytes
+    that are stored in the resource.
+
+``resource_isdir(package_or_requirement, resource_name)``
+    Is the named resource a directory? Return ``True`` or ``False``
+    accordingly.
+
+``resource_listdir(package_or_requirement, resource_name)``
+    List the contents of the named resource directory, just like ``os.listdir``
+    except that it works even if the resource is in a zipfile.
+
+Note that only ``resource_exists()`` and ``resource_isdir()`` are insensitive
+to the resource type. You cannot use ``resource_listdir()`` on a file
+resource, and you can't use ``resource_string()`` or ``resource_stream()`` on
+directory resources. Using an inappropriate method for the resource type may
+result in an exception or undefined behavior, depending on the platform and
+distribution format involved.
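+
+As a short, hedged illustration (``mypackage`` and ``data/defaults.cfg`` are
+hypothetical names, not part of the API), typical read-only resource access
+looks like this::
+
+    import pkg_resources
+
+    # Resource names are '/'-separated and relative to the named package.
+    if pkg_resources.resource_exists('mypackage', 'data/defaults.cfg'):
+        raw = pkg_resources.resource_string('mypackage', 'data/defaults.cfg')
+        # resource_string() returns the raw bytes; decode if text is expected.
+        text = raw.decode('utf-8')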
+ + +Resource Extraction +------------------- + +``resource_filename(package_or_requirement, resource_name)`` + Sometimes, it is not sufficient to access a resource in string or stream + form, and a true filesystem filename is needed. In such cases, you can + use this method (or module-level function) to obtain a filename for a + resource. If the resource is in an archive distribution (such as a zipped + egg), it will be extracted to a cache directory, and the filename within + the cache will be returned. If the named resource is a directory, then + all resources within that directory (including subdirectories) are also + extracted. If the named resource is a C extension or "eager resource" + (see the ``setuptools`` documentation for details), then all C extensions + and eager resources are extracted at the same time. + + Archived resources are extracted to a cache location that can be managed by + the following two methods: + +``set_extraction_path(path)`` + Set the base path where resources will be extracted to, if needed. + + If you do not call this routine before any extractions take place, the + path defaults to the return value of ``get_default_cache()``. (Which is + based on the ``PYTHON_EGG_CACHE`` environment variable, with various + platform-specific fallbacks. See that routine's documentation for more + details.) + + Resources are extracted to subdirectories of this path based upon + information given by the resource provider. You may set this to a + temporary directory, but then you must call ``cleanup_resources()`` to + delete the extracted files when done. There is no guarantee that + ``cleanup_resources()`` will be able to remove all extracted files. (On + Windows, for example, you can't unlink .pyd or .dll files that are still + in use.) + + Note that you may not change the extraction path for a given resource + manager once resources have been extracted, unless you first call + ``cleanup_resources()``. + +``cleanup_resources(force=False)`` + Delete all extracted resource files and directories, returning a list + of the file and directory names that could not be successfully removed. + This function does not have any concurrency protection, so it should + generally only be called when the extraction path is a temporary + directory exclusive to a single process. This method is not + automatically called; you must call it explicitly or register it as an + ``atexit`` function if you wish to ensure cleanup of a temporary + directory used for extractions. + + +"Provider" Interface +-------------------- + +If you are implementing an ``IResourceProvider`` and/or ``IMetadataProvider`` +for a new distribution archive format, you may need to use the following +``IResourceManager`` methods to co-ordinate extraction of resources to the +filesystem. If you're not implementing an archive format, however, you have +no need to use these methods. Unlike the other methods listed above, they are +*not* available as top-level functions tied to the global ``ResourceManager``; +you must therefore have an explicit ``ResourceManager`` instance to use them. + +``get_cache_path(archive_name, names=())`` + Return absolute location in cache for `archive_name` and `names` + + The parent directory of the resulting path will be created if it does + not already exist. `archive_name` should be the base filename of the + enclosing egg (which may not be the name of the enclosing zipfile!), + including its ".egg" extension. 
`names`, if provided, should be a + sequence of path name parts "under" the egg's extraction location. + + This method should only be called by resource providers that need to + obtain an extraction location, and only for names they intend to + extract, as it tracks the generated names for possible cleanup later. + +``extraction_error()`` + Raise an ``ExtractionError`` describing the active exception as interfering + with the extraction process. You should call this if you encounter any + OS errors extracting the file to the cache path; it will format the + operating system exception for you, and add other information to the + ``ExtractionError`` instance that may be needed by programs that want to + wrap or handle extraction errors themselves. + +``postprocess(tempname, filename)`` + Perform any platform-specific postprocessing of `tempname`. + Resource providers should call this method ONLY after successfully + extracting a compressed resource. They must NOT call it on resources + that are already in the filesystem. + + `tempname` is the current (temporary) name of the file, and `filename` + is the name it will be renamed to by the caller after this routine + returns. + + +Metadata API +============ + +The metadata API is used to access metadata resources bundled in a pluggable +distribution. Metadata resources are virtual files or directories containing +information about the distribution, such as might be used by an extensible +application or framework to connect "plugins". Like other kinds of resources, +metadata resource names are ``/``-separated and should not contain ``..`` or +begin with a ``/``. You should not use ``os.path`` routines to manipulate +resource paths. + +The metadata API is provided by objects implementing the ``IMetadataProvider`` +or ``IResourceProvider`` interfaces. ``Distribution`` objects implement this +interface, as do objects returned by the ``get_provider()`` function: + +``get_provider(package_or_requirement)`` + If a package name is supplied, return an ``IResourceProvider`` for the + package. If a ``Requirement`` is supplied, resolve it by returning a + ``Distribution`` from the current working set (searching the current + ``Environment`` if necessary and adding the newly found ``Distribution`` + to the working set). If the named package can't be imported, or the + ``Requirement`` can't be satisfied, an exception is raised. + + NOTE: if you use a package name rather than a ``Requirement``, the object + you get back may not be a pluggable distribution, depending on the method + by which the package was installed. In particular, "development" packages + and "single-version externally-managed" packages do not have any way to + map from a package name to the corresponding project's metadata. Do not + write code that passes a package name to ``get_provider()`` and then tries + to retrieve project metadata from the returned object. It may appear to + work when the named package is in an ``.egg`` file or directory, but + it will fail in other installation scenarios. If you want project + metadata, you need to ask for a *project*, not a package. + + +``IMetadataProvider`` Methods +----------------------------- + +The methods provided by objects (such as ``Distribution`` instances) that +implement the ``IMetadataProvider`` or ``IResourceProvider`` interfaces are: + +``has_metadata(name)`` + Does the named metadata resource exist? + +``metadata_isdir(name)`` + Is the named metadata resource a directory? 
+ +``metadata_listdir(name)`` + List of metadata names in the directory (like ``os.listdir()``) + +``get_metadata(name)`` + Return the named metadata resource as a string. The data is read in binary + mode; i.e., the exact bytes of the resource file are returned. + +``get_metadata_lines(name)`` + Yield named metadata resource as list of non-blank non-comment lines. This + is short for calling ``yield_lines(provider.get_metadata(name))``. See the + section on `yield_lines()`_ below for more information on the syntax it + recognizes. + +``run_script(script_name, namespace)`` + Execute the named script in the supplied namespace dictionary. Raises + ``ResolutionError`` if there is no script by that name in the ``scripts`` + metadata directory. `namespace` should be a Python dictionary, usually + a module dictionary if the script is being run as a module. + + +Exceptions +========== + +``pkg_resources`` provides a simple exception hierarchy for problems that may +occur when processing requests to locate and activate packages:: + + ResolutionError + DistributionNotFound + VersionConflict + UnknownExtra + + ExtractionError + +``ResolutionError`` + This class is used as a base class for the other three exceptions, so that + you can catch all of them with a single "except" clause. It is also raised + directly for miscellaneous requirement-resolution problems like trying to + run a script that doesn't exist in the distribution it was requested from. + +``DistributionNotFound`` + A distribution needed to fulfill a requirement could not be found. + +``VersionConflict`` + The requested version of a project conflicts with an already-activated + version of the same project. + +``UnknownExtra`` + One of the "extras" requested was not recognized by the distribution it + was requested from. + +``ExtractionError`` + A problem occurred extracting a resource to the Python Egg cache. The + following attributes are available on instances of this exception: + + manager + The resource manager that raised this exception + + cache_path + The base directory for resource extraction + + original_error + The exception instance that caused extraction to fail + + +Supporting Custom Importers +=========================== + +By default, ``pkg_resources`` supports normal filesystem imports, and +``zipimport`` importers. If you wish to use the ``pkg_resources`` features +with other (PEP 302-compatible) importers or module loaders, you may need to +register various handlers and support functions using these APIs: + +``register_finder(importer_type, distribution_finder)`` + Register `distribution_finder` to find distributions in ``sys.path`` items. + `importer_type` is the type or class of a PEP 302 "Importer" (``sys.path`` + item handler), and `distribution_finder` is a callable that, when passed a + path item, the importer instance, and an `only` flag, yields + ``Distribution`` instances found under that path item. (The `only` flag, + if true, means the finder should yield only ``Distribution`` objects whose + ``location`` is equal to the path item provided.) + + See the source of the ``pkg_resources.find_on_path`` function for an + example finder function. + +``register_loader_type(loader_type, provider_factory)`` + Register `provider_factory` to make ``IResourceProvider`` objects for + `loader_type`. 
`loader_type` is the type or class of a PEP 302 + ``module.__loader__``, and `provider_factory` is a function that, when + passed a module object, returns an `IResourceProvider`_ for that module, + allowing it to be used with the `ResourceManager API`_. + +``register_namespace_handler(importer_type, namespace_handler)`` + Register `namespace_handler` to declare namespace packages for the given + `importer_type`. `importer_type` is the type or class of a PEP 302 + "importer" (sys.path item handler), and `namespace_handler` is a callable + with a signature like this:: + + def namespace_handler(importer, path_entry, moduleName, module): + # return a path_entry to use for child packages + + Namespace handlers are only called if the relevant importer object has + already agreed that it can handle the relevant path item. The handler + should only return a subpath if the module ``__path__`` does not already + contain an equivalent subpath. Otherwise, it should return None. + + For an example namespace handler, see the source of the + ``pkg_resources.file_ns_handler`` function, which is used for both zipfile + importing and regular importing. + + +IResourceProvider +----------------- + +``IResourceProvider`` is an abstract class that documents what methods are +required of objects returned by a `provider_factory` registered with +``register_loader_type()``. ``IResourceProvider`` is a subclass of +``IMetadataProvider``, so objects that implement this interface must also +implement all of the `IMetadataProvider Methods`_ as well as the methods +shown here. The `manager` argument to the methods below must be an object +that supports the full `ResourceManager API`_ documented above. + +``get_resource_filename(manager, resource_name)`` + Return a true filesystem path for `resource_name`, coordinating the + extraction with `manager`, if the resource must be unpacked to the + filesystem. + +``get_resource_stream(manager, resource_name)`` + Return a readable file-like object for `resource_name`. + +``get_resource_string(manager, resource_name)`` + Return a string containing the contents of `resource_name`. + +``has_resource(resource_name)`` + Does the package contain the named resource? + +``resource_isdir(resource_name)`` + Is the named resource a directory? Return a false value if the resource + does not exist or is not a directory. + +``resource_listdir(resource_name)`` + Return a list of the contents of the resource directory, ala + ``os.listdir()``. Requesting the contents of a non-existent directory may + raise an exception. + +Note, by the way, that your provider classes need not (and should not) subclass +``IResourceProvider`` or ``IMetadataProvider``! These classes exist solely +for documentation purposes and do not provide any useful implementation code. +You may instead wish to subclass one of the `built-in resource providers`_. + + +Built-in Resource Providers +--------------------------- + +``pkg_resources`` includes several provider classes that are automatically used +where appropriate. Their inheritance tree looks like this:: + + NullProvider + EggProvider + DefaultProvider + PathMetadata + ZipProvider + EggMetadata + EmptyProvider + FileMetadata + + +``NullProvider`` + This provider class is just an abstract base that provides for common + provider behaviors (such as running scripts), given a definition for just + a few abstract methods. + +``EggProvider`` + This provider class adds in some egg-specific features that are common + to zipped and unzipped eggs. 
+ +``DefaultProvider`` + This provider class is used for unpacked eggs and "plain old Python" + filesystem modules. + +``ZipProvider`` + This provider class is used for all zipped modules, whether they are eggs + or not. + +``EmptyProvider`` + This provider class always returns answers consistent with a provider that + has no metadata or resources. ``Distribution`` objects created without + a ``metadata`` argument use an instance of this provider class instead. + Since all ``EmptyProvider`` instances are equivalent, there is no need + to have more than one instance. ``pkg_resources`` therefore creates a + global instance of this class under the name ``empty_provider``, and you + may use it if you have need of an ``EmptyProvider`` instance. + +``PathMetadata(path, egg_info)`` + Create an ``IResourceProvider`` for a filesystem-based distribution, where + `path` is the filesystem location of the importable modules, and `egg_info` + is the filesystem location of the distribution's metadata directory. + `egg_info` should usually be the ``EGG-INFO`` subdirectory of `path` for an + "unpacked egg", and a ``ProjectName.egg-info`` subdirectory of `path` for + a "development egg". However, other uses are possible for custom purposes. + +``EggMetadata(zipimporter)`` + Create an ``IResourceProvider`` for a zipfile-based distribution. The + `zipimporter` should be a ``zipimport.zipimporter`` instance, and may + represent a "basket" (a zipfile containing multiple ".egg" subdirectories) + a specific egg *within* a basket, or a zipfile egg (where the zipfile + itself is a ".egg"). It can also be a combination, such as a zipfile egg + that also contains other eggs. + +``FileMetadata(path_to_pkg_info)`` + Create an ``IResourceProvider`` that provides exactly one metadata + resource: ``PKG-INFO``. The supplied path should be a distutils PKG-INFO + file. This is basically the same as an ``EmptyProvider``, except that + requests for ``PKG-INFO`` will be answered using the contents of the + designated file. (This provider is used to wrap ``.egg-info`` files + installed by vendor-supplied system packages.) + + +Utility Functions +================= + +In addition to its high-level APIs, ``pkg_resources`` also includes several +generally-useful utility routines. These routines are used to implement the +high-level APIs, but can also be quite useful by themselves. + + +Parsing Utilities +----------------- + +``parse_version(version)`` + Parsed a project's version string as defined by PEP 440. The returned + value will be an object that represents the version. These objects may + be compared to each other and sorted. The sorting algorithm is as defined + by PEP 440 with the addition that any version which is not a valid PEP 440 + version will be considered less than any valid PEP 440 version and the + invalid versions will continue sorting using the original algorithm. + +.. _yield_lines(): + +``yield_lines(strs)`` + Yield non-empty/non-comment lines from a string/unicode or a possibly- + nested sequence thereof. If `strs` is an instance of ``basestring``, it + is split into lines, and each non-blank, non-comment line is yielded after + stripping leading and trailing whitespace. (Lines whose first non-blank + character is ``#`` are considered comment lines.) 
+
+    If `strs` is not an instance of ``basestring``, it is iterated over, and
+    each item is passed recursively to ``yield_lines()``, so that an arbitrarily
+    nested sequence of strings, or sequences of sequences of strings, can be
+    flattened out to the lines contained therein.  So, for example, passing
+    a file object or a list of strings to ``yield_lines`` will both work.
+    (Note that between each string in a sequence of strings there is assumed to
+    be an implicit line break, so lines cannot bridge two strings in a
+    sequence.)
+
+    This routine is used extensively by ``pkg_resources`` to parse metadata
+    and file formats of various kinds, and most other ``pkg_resources``
+    parsing functions that yield multiple values will use it to break up their
+    input.  However, this routine is idempotent, so calling ``yield_lines()``
+    on the output of another call to ``yield_lines()`` is completely harmless.
+
+``split_sections(strs)``
+    Split a string (or possibly-nested iterable thereof), yielding ``(section,
+    content)`` pairs found using an ``.ini``-like syntax.  Each ``section`` is
+    a whitespace-stripped version of the section name ("``[section]``"),
+    and each ``content`` is a list of stripped lines excluding blank lines and
+    comment-only lines.  If there are any non-blank, non-comment lines before
+    the first section header, they're yielded in a first ``section`` of
+    ``None``.
+
+    This routine uses ``yield_lines()`` as its front end, so you can pass in
+    anything that ``yield_lines()`` accepts, such as an open text file, string,
+    or sequence of strings.  ``ValueError`` is raised if a malformed section
+    header is found (i.e. a line starting with ``[`` but not ending with
+    ``]``).
+
+    Note that this simplistic parser assumes that any line whose first nonblank
+    character is ``[`` is a section heading, so it can't support .ini format
+    variations that allow ``[`` as the first nonblank character on other lines.
+
+``safe_name(name)``
+    Return a "safe" form of a project's name, suitable for use in a
+    ``Requirement`` string, as a distribution name, or a PyPI project name.
+    All non-alphanumeric runs are condensed to single "-" characters, such that
+    a name like "The $$$ Tree" becomes "The-Tree".  Note that if you are
+    generating a filename from this value you should combine it with a call to
+    ``to_filename()`` so all dashes ("-") are replaced by underscores ("_").
+    See ``to_filename()``.
+
+``safe_version(version)``
+    Return the normalized form of a PEP 440-compatible version string.  If the
+    version string is not PEP 440 compatible, this routine behaves similarly to
+    ``safe_name()``, except that spaces in the input become dots, and dots are
+    allowed to exist in the output.  As with ``safe_name()``, if you are
+    generating a filename from this you should replace any "-" characters in
+    the output with underscores.
+
+``safe_extra(extra)``
+    Return a "safe" form of an extra's name, suitable for use in a requirement
+    string or a setup script's ``extras_require`` keyword.  This routine is
+    similar to ``safe_name()`` except that non-alphanumeric runs are replaced
+    by a single underbar (``_``), and the result is lowercased.
+
+``to_filename(name_or_version)``
+    Escape a name or version string so it can be used in a dash-separated
+    filename (or ``#egg=name-version`` tag) without ambiguity.  You
+    should only pass in values that were returned by ``safe_name()`` or
+    ``safe_version()``.
+
+
+Platform Utilities
+------------------
+
+``get_build_platform()``
+    Return this platform's identifier string.
For Windows, the return value
+    is ``"win32"``, and for Mac OS X it is a string of the form
+    ``"macosx-10.4-ppc"``.  All other platforms return the same uname-based
+    string that the ``distutils.util.get_platform()`` function returns.
+    This string is the minimum platform version required by distributions built
+    on the local machine.  (Backward compatibility note: setuptools versions
+    prior to 0.6b1 called this function ``get_platform()``, and the function is
+    still available under that name for backward compatibility reasons.)
+
+``get_supported_platform()`` (New in 0.6b1)
+    This is similar to ``get_build_platform()``, but is the maximum
+    platform version that the local machine supports.  You will usually want
+    to use this value as the ``provided`` argument to the
+    ``compatible_platforms()`` function.
+
+``compatible_platforms(provided, required)``
+    Return true if a distribution built on the `provided` platform may be used
+    on the `required` platform.  If either platform value is ``None``, it is
+    considered a wildcard, and the platforms are therefore compatible.
+    Likewise, if the platform strings are equal, they're also considered
+    compatible, and ``True`` is returned.  Currently, the only non-equal
+    platform strings that are considered compatible are Mac OS X platform
+    strings with the same hardware type (e.g. ``ppc``) and major version
+    (e.g. ``10``), with the `provided` platform's minor version being less than
+    or equal to the `required` platform's minor version.
+
+``get_default_cache()``
+    Determine the default cache location for extracting resources from zipped
+    eggs.  This routine returns the ``PYTHON_EGG_CACHE`` environment variable,
+    if set.  Otherwise, on Windows, it returns a "Python-Eggs" subdirectory of
+    the user's "Application Data" directory, and on all other systems it
+    returns ``os.path.expanduser("~/.python-eggs")``.
+
+
+PEP 302 Utilities
+-----------------
+
+``get_importer(path_item)``
+    Retrieve a PEP 302 "importer" for the given path item (which need not
+    actually be on ``sys.path``).  This routine simulates the PEP 302 protocol
+    for obtaining an "importer" object.  It first checks for an importer for
+    the path item in ``sys.path_importer_cache``, and if not found it calls
+    each of the ``sys.path_hooks`` and caches the result if a good importer is
+    found.  If no importer is found, this routine returns an ``ImpWrapper``
+    instance that wraps the builtin import machinery as a PEP 302-compliant
+    "importer" object.  This ``ImpWrapper`` is *not* cached; instead a new
+    instance is returned each time.
+
+    (Note: When run under Python 2.5, this function is simply an alias for
+    ``pkgutil.get_importer()``, and instead of ``pkg_resources.ImpWrapper``
+    instances, it may return ``pkgutil.ImpImporter`` instances.)
+
+
+File/Path Utilities
+-------------------
+
+``ensure_directory(path)``
+    Ensure that the parent directory (``os.path.dirname``) of `path` actually
+    exists, using ``os.makedirs()`` if necessary.
+
+``normalize_path(path)``
+    Return a "normalized" version of `path`, such that two paths represent
+    the same filesystem location if they have equal ``normalize_path()``
+    values.  Specifically, this is a shortcut for calling ``os.path.realpath``
+    and ``os.path.normcase`` on `path`.
Unfortunately, on certain platforms + (notably Cygwin and Mac OS X) the ``normcase`` function does not accurately + reflect the platform's case-sensitivity, so there is always the possibility + of two apparently-different paths being equal on such platforms. + +History +------- + +0.6c9 + * Fix ``resource_listdir('')`` always returning an empty list for zipped eggs. + +0.6c7 + * Fix package precedence problem where single-version eggs installed in + ``site-packages`` would take precedence over ``.egg`` files (or directories) + installed in ``site-packages``. + +0.6c6 + * Fix extracted C extensions not having executable permissions under Cygwin. + + * Allow ``.egg-link`` files to contain relative paths. + + * Fix cache dir defaults on Windows when multiple environment vars are needed + to construct a path. + +0.6c4 + * Fix "dev" versions being considered newer than release candidates. + +0.6c3 + * Python 2.5 compatibility fixes. + +0.6c2 + * Fix a problem with eggs specified directly on ``PYTHONPATH`` on + case-insensitive filesystems possibly not showing up in the default + working set, due to differing normalizations of ``sys.path`` entries. + +0.6b3 + * Fixed a duplicate path insertion problem on case-insensitive filesystems. + +0.6b1 + * Split ``get_platform()`` into ``get_supported_platform()`` and + ``get_build_platform()`` to work around a Mac versioning problem that caused + the behavior of ``compatible_platforms()`` to be platform specific. + + * Fix entry point parsing when a standalone module name has whitespace + between it and the extras. + +0.6a11 + * Added ``ExtractionError`` and ``ResourceManager.extraction_error()`` so that + cache permission problems get a more user-friendly explanation of the + problem, and so that programs can catch and handle extraction errors if they + need to. + +0.6a10 + * Added the ``extras`` attribute to ``Distribution``, the ``find_plugins()`` + method to ``WorkingSet``, and the ``__add__()`` and ``__iadd__()`` methods + to ``Environment``. + + * ``safe_name()`` now allows dots in project names. + + * There is a new ``to_filename()`` function that escapes project names and + versions for safe use in constructing egg filenames from a Distribution + object's metadata. + + * Added ``Distribution.clone()`` method, and keyword argument support to other + ``Distribution`` constructors. + + * Added the ``DEVELOP_DIST`` precedence, and automatically assign it to + eggs using ``.egg-info`` format. + +0.6a9 + * Don't raise an error when an invalid (unfinished) distribution is found + unless absolutely necessary. Warn about skipping invalid/unfinished eggs + when building an Environment. + + * Added support for ``.egg-info`` files or directories with version/platform + information embedded in the filename, so that system packagers have the + option of including ``PKG-INFO`` files to indicate the presence of a + system-installed egg, without needing to use ``.egg`` directories, zipfiles, + or ``.pth`` manipulation. + + * Changed ``parse_version()`` to remove dashes before pre-release tags, so + that ``0.2-rc1`` is considered an *older* version than ``0.2``, and is equal + to ``0.2rc1``. The idea that a dash *always* meant a post-release version + was highly non-intuitive to setuptools users and Python developers, who + seem to want to use ``-rc`` version numbers a lot. + +0.6a8 + * Fixed a problem with ``WorkingSet.resolve()`` that prevented version + conflicts from being detected at runtime. 
+ + * Improved runtime conflict warning message to identify a line in the user's + program, rather than flagging the ``warn()`` call in ``pkg_resources``. + + * Avoid giving runtime conflict warnings for namespace packages, even if they + were declared by a different package than the one currently being activated. + + * Fix path insertion algorithm for case-insensitive filesystems. + + * Fixed a problem with nested namespace packages (e.g. ``peak.util``) not + being set as an attribute of their parent package. + +0.6a6 + * Activated distributions are now inserted in ``sys.path`` (and the working + set) just before the directory that contains them, instead of at the end. + This allows e.g. eggs in ``site-packages`` to override unmanaged modules in + the same location, and allows eggs found earlier on ``sys.path`` to override + ones found later. + + * When a distribution is activated, it now checks whether any contained + non-namespace modules have already been imported and issues a warning if + a conflicting module has already been imported. + + * Changed dependency processing so that it's breadth-first, allowing a + depender's preferences to override those of a dependee, to prevent conflicts + when a lower version is acceptable to the dependee, but not the depender. + + * Fixed a problem extracting zipped files on Windows, when the egg in question + has had changed contents but still has the same version number. + +0.6a4 + * Fix a bug in ``WorkingSet.resolve()`` that was introduced in 0.6a3. + +0.6a3 + * Added ``safe_extra()`` parsing utility routine, and use it for Requirement, + EntryPoint, and Distribution objects' extras handling. + +0.6a1 + * Enhanced performance of ``require()`` and related operations when all + requirements are already in the working set, and enhanced performance of + directory scanning for distributions. + + * Fixed some problems using ``pkg_resources`` w/PEP 302 loaders other than + ``zipimport``, and the previously-broken "eager resource" support. + + * Fixed ``pkg_resources.resource_exists()`` not working correctly, along with + some other resource API bugs. + + * Many API changes and enhancements: + + * Added ``EntryPoint``, ``get_entry_map``, ``load_entry_point``, and + ``get_entry_info`` APIs for dynamic plugin discovery. + + * ``list_resources`` is now ``resource_listdir`` (and it actually works) + + * Resource API functions like ``resource_string()`` that accepted a package + name and resource name, will now also accept a ``Requirement`` object in + place of the package name (to allow access to non-package data files in + an egg). + + * ``get_provider()`` will now accept a ``Requirement`` instance or a module + name. If it is given a ``Requirement``, it will return a corresponding + ``Distribution`` (by calling ``require()`` if a suitable distribution + isn't already in the working set), rather than returning a metadata and + resource provider for a specific module. (The difference is in how + resource paths are interpreted; supplying a module name means resources + path will be module-relative, rather than relative to the distribution's + root.) + + * ``Distribution`` objects now implement the ``IResourceProvider`` and + ``IMetadataProvider`` interfaces, so you don't need to reference the (no + longer available) ``metadata`` attribute to get at these interfaces. + + * ``Distribution`` and ``Requirement`` both have a ``project_name`` + attribute for the project name they refer to. (Previously these were + ``name`` and ``distname`` attributes.) 
+ + * The ``path`` attribute of ``Distribution`` objects is now ``location``, + because it isn't necessarily a filesystem path (and hasn't been for some + time now). The ``location`` of ``Distribution`` objects in the filesystem + should always be normalized using ``pkg_resources.normalize_path()``; all + of the setuptools and EasyInstall code that generates distributions from + the filesystem (including ``Distribution.from_filename()``) ensure this + invariant, but if you use a more generic API like ``Distribution()`` or + ``Distribution.from_location()`` you should take care that you don't + create a distribution with an un-normalized filesystem path. + + * ``Distribution`` objects now have an ``as_requirement()`` method that + returns a ``Requirement`` for the distribution's project name and version. + + * Distribution objects no longer have an ``installed_on()`` method, and the + ``install_on()`` method is now ``activate()`` (but may go away altogether + soon). The ``depends()`` method has also been renamed to ``requires()``, + and ``InvalidOption`` is now ``UnknownExtra``. + + * ``find_distributions()`` now takes an additional argument called ``only``, + that tells it to only yield distributions whose location is the passed-in + path. (It defaults to False, so that the default behavior is unchanged.) + + * ``AvailableDistributions`` is now called ``Environment``, and the + ``get()``, ``__len__()``, and ``__contains__()`` methods were removed, + because they weren't particularly useful. ``__getitem__()`` no longer + raises ``KeyError``; it just returns an empty list if there are no + distributions for the named project. + + * The ``resolve()`` method of ``Environment`` is now a method of + ``WorkingSet`` instead, and the ``best_match()`` method now uses a working + set instead of a path list as its second argument. + + * There is a new ``pkg_resources.add_activation_listener()`` API that lets + you register a callback for notifications about distributions added to + ``sys.path`` (including the distributions already on it). This is + basically a hook for extensible applications and frameworks to be able to + search for plugin metadata in distributions added at runtime. + +0.5a13 + * Fixed a bug in resource extraction from nested packages in a zipped egg. + +0.5a12 + * Updated extraction/cache mechanism for zipped resources to avoid inter- + process and inter-thread races during extraction. The default cache + location can now be set via the ``PYTHON_EGGS_CACHE`` environment variable, + and the default Windows cache is now a ``Python-Eggs`` subdirectory of the + current user's "Application Data" directory, if the ``PYTHON_EGGS_CACHE`` + variable isn't set. + +0.5a10 + * Fix a problem with ``pkg_resources`` being confused by non-existent eggs on + ``sys.path`` (e.g. if a user deletes an egg without removing it from the + ``easy-install.pth`` file). + + * Fix a problem with "basket" support in ``pkg_resources``, where egg-finding + never actually went inside ``.egg`` files. + + * Made ``pkg_resources`` import the module you request resources from, if it's + not already imported. + +0.5a4 + * ``pkg_resources.AvailableDistributions.resolve()`` and related methods now + accept an ``installer`` argument: a callable taking one argument, a + ``Requirement`` instance. The callable must return a ``Distribution`` + object, or ``None`` if no distribution is found. This feature is used by + EasyInstall to resolve dependencies by recursively invoking itself. 
+
+0.4a4
+ * Fix problems with ``resource_listdir()``, ``resource_isdir()`` and resource
+   directory extraction for zipped eggs.
+
+0.4a3
+ * Fixed scripts not being able to see a ``__file__`` variable in ``__main__``.
+
+ * Fixed a problem with ``resource_isdir()`` implementation that was introduced
+   in 0.4a2.
+
+0.4a1
+ * Fixed a bug in requirements processing for exact versions (i.e. ``==`` and
+   ``!=``) when only one condition was included.
+
+ * Added ``safe_name()`` and ``safe_version()`` APIs to clean up handling of
+   arbitrary distribution names and versions found on PyPI.
+
+0.3a4
+ * ``pkg_resources`` now supports resource directories, not just the resources
+   in them.  In particular, there are ``resource_listdir()`` and
+   ``resource_isdir()`` APIs.
+
+ * ``pkg_resources`` now supports "egg baskets" -- .egg zipfiles which contain
+   multiple distributions in subdirectories whose names end with ``.egg``.
+   Having such a "basket" in a directory on ``sys.path`` is equivalent to
+   having the individual eggs in that directory, but the contained eggs can
+   be individually added (or not) to ``sys.path``.  Currently, however, there
+   is no automated way to create baskets.
+
+ * Namespace package manipulation is now protected by the Python import lock.
+
+0.3a1
+ * Initial release.
+
diff --git a/docs/python3.txt b/docs/python3.txt
new file mode 100644
index 0000000..d550cb6
--- /dev/null
+++ b/docs/python3.txt
@@ -0,0 +1,94 @@
+=====================================================
+Supporting both Python 2 and Python 3 with Setuptools
+=====================================================
+
+Starting with Distribute version 0.6.2 and Setuptools 0.7, the Setuptools
+project has supported Python 3. Installing and
+using setuptools for Python 3 code works exactly the same as for Python 2
+code.
+
+Setuptools provides a facility to invoke 2to3 on the code as a part of the
+build process, by setting the keyword parameter ``use_2to3`` to True, but
+the Setuptools project strongly recommends instead developing a unified
+codebase using `six `_,
+`future `_, or another compatibility
+library.
+
+
+Using 2to3
+==========
+
+Setuptools attempts to make the porting process easier by automatically
+running 2to3 as a part of running tests. To do so, you need to configure the
+setup.py so that you can run the unit tests with ``python setup.py test``.
+
+See :ref:`test` for more information on this.
+
+Once you have the tests running under Python 2, you can add the use_2to3
+keyword parameter to setup(), and start running the tests under Python 3.
+The test command will now first run the build command, during which the code
+will be converted with 2to3, and the tests will then be run from the build
+directory, as opposed to the source directory as is normally done.
+
+Setuptools will convert all Python files, and also all doctests in Python
+files. However, if you have doctests located in separate text files, these
+will not automatically be converted. If you add them to the
+``convert_2to3_doctests`` keyword parameter, Setuptools will convert them as
+well.
+
+By default, the conversion uses all fixers in the ``lib2to3.fixers`` package.
+To use additional fixers, the parameter ``use_2to3_fixers`` can be set
+to a list of names of packages containing fixers. To exclude fixers, the
+parameter ``use_2to3_exclude_fixers`` can be set to a list of fixer names to be
+skipped.
+ +An example setup.py might look something like this:: + + from setuptools import setup + + setup( + name='your.module', + version='1.0', + description='This is your awesome module', + author='You', + author_email='your@email', + package_dir={'': 'src'}, + packages=['your', 'you.module'], + test_suite='your.module.tests', + use_2to3=True, + convert_2to3_doctests=['src/your/module/README.txt'], + use_2to3_fixers=['your.fixers'], + use_2to3_exclude_fixers=['lib2to3.fixes.fix_import'], + ) + +Differential conversion +----------------------- + +Note that a file will only be copied and converted during the build process +if the source file has been changed. If you add a file to the doctests +that should be converted, it will not be converted the next time you run +the tests, since it hasn't been modified. You need to remove it from the +build directory. Also if you run the build, install or test commands before +adding the use_2to3 parameter, you will have to remove the build directory +before you run the test command, as the files otherwise will seem updated, +and no conversion will happen. + +In general, if code doesn't seem to be converted, deleting the build directory +and trying again is a good safeguard against the build directory getting +"out of sync" with the source directory. + +Distributing Python 3 modules +============================= + +You can distribute your modules with Python 3 support in different ways. A +normal source distribution will work, but can be slow in installing, as the +2to3 process will be run during the install. But you can also distribute +the module in binary format, such as a binary egg. That egg will contain the +already converted code, and hence no 2to3 conversion is needed during install. + +Advanced features +================= + +If you don't want to run the 2to3 conversion on the doctests in Python files, +you can turn that off by setting ``setuptools.use_2to3_on_doctests = False``. diff --git a/docs/releases.txt b/docs/releases.txt new file mode 100644 index 0000000..c84ddd7 --- /dev/null +++ b/docs/releases.txt @@ -0,0 +1,53 @@ +=============== +Release Process +=============== + +In order to allow for rapid, predictable releases, Setuptools uses a +mechanical technique for releases, enacted by Travis following a +successful build of a tagged release per +`PyPI deployment `_. + +To cut a release, install and run ``bumpversion {part}`` where ``part`` +is major, minor, or patch based on the scope of the changes in the +release. Then, push the commits to the master branch. If tests pass, +the release will be uploaded to PyPI (from the Python 3.5 tests). + +Bootstrap Branch +---------------- + +Setuptools has a bootstrap script (ez_setup.py), which is hosted in the +repository in the ``bootstrap`` branch. + +Therefore, the latest bootstrap script can be retrieved by checking out +that branch. + +The officially-published location of the bootstrap script is hosted on Python +infrastructure (#python-infra on freenode) at https://bootstrap.pypa.io and +is updated every fifteen minutes from the bootstrap branch. Sometimes, +especially when the bootstrap script is rolled back, this +process doesn't work as expected and requires manual intervention. + +Release Frequency +----------------- + +Some have asked why Setuptools is released so frequently. Because Setuptools +uses a mechanical release process, it's very easy to make releases whenever the +code is stable (tests are passing). As a result, the philosophy is to release +early and often. 
+ +While some find the frequent releases somewhat surprising, they only empower +the user. Although releases are made frequently, users can choose the frequency +at which they use those releases. If instead Setuptools contributions were only +released in batches, the user would be constrained to only use Setuptools when +those official releases were made. With frequent releases, the user can govern +exactly how often he wishes to update. + +Frequent releases also then obviate the need for dev or beta releases in most +cases. Because releases are made early and often, bugs are discovered and +corrected quickly, in many cases before other users have yet to encounter them. + +Release Managers +---------------- + +Additionally, anyone with push access to the master branch has access to cut +releases. diff --git a/docs/requirements.txt b/docs/requirements.txt new file mode 100644 index 0000000..4be4188 --- /dev/null +++ b/docs/requirements.txt @@ -0,0 +1,2 @@ +rst.linker>=1.6.1 +sphinx>=1.4 diff --git a/docs/roadmap.txt b/docs/roadmap.txt new file mode 100644 index 0000000..8f175b9 --- /dev/null +++ b/docs/roadmap.txt @@ -0,0 +1,6 @@ +======= +Roadmap +======= + +Setuptools is primarily in maintenance mode. The project attempts to address +user issues, concerns, and feature requests in a timely fashion. diff --git a/docs/setuptools.txt b/docs/setuptools.txt new file mode 100644 index 0000000..f0da6e1 --- /dev/null +++ b/docs/setuptools.txt @@ -0,0 +1,2746 @@ +================================================== +Building and Distributing Packages with Setuptools +================================================== + +``Setuptools`` is a collection of enhancements to the Python ``distutils`` +(for Python 2.6 and up) that allow developers to more easily build and +distribute Python packages, especially ones that have dependencies on other +packages. + +Packages built and distributed using ``setuptools`` look to the user like +ordinary Python packages based on the ``distutils``. Your users don't need to +install or even know about setuptools in order to use them, and you don't +have to include the entire setuptools package in your distributions. By +including just a single `bootstrap module`_ (a 12K .py file), your package will +automatically download and install ``setuptools`` if the user is building your +package from source and doesn't have a suitable version already installed. + +.. _bootstrap module: https://bootstrap.pypa.io/ez_setup.py + +Feature Highlights: + +* Automatically find/download/install/upgrade dependencies at build time using + the `EasyInstall tool `_, + which supports downloading via HTTP, FTP, Subversion, and SourceForge, and + automatically scans web pages linked from PyPI to find download links. (It's + the closest thing to CPAN currently available for Python.) + +* Create `Python Eggs `_ - + a single-file importable distribution format + +* Enhanced support for accessing data files hosted in zipped packages. + +* Automatically include all packages in your source tree, without listing them + individually in setup.py + +* Automatically include all relevant files in your source distributions, + without needing to create a ``MANIFEST.in`` file, and without having to force + regeneration of the ``MANIFEST`` file when your source tree changes. + +* Automatically generate wrapper scripts or Windows (console and GUI) .exe + files for any number of "main" functions in your project. (Note: this is not + a py2exe replacement; the .exe files rely on the local Python installation.) 
+ +* Transparent Pyrex support, so that your setup.py can list ``.pyx`` files and + still work even when the end-user doesn't have Pyrex installed (as long as + you include the Pyrex-generated C in your source distribution) + +* Command aliases - create project-specific, per-user, or site-wide shortcut + names for commonly used commands and options + +* PyPI upload support - upload your source distributions and eggs to PyPI + +* Deploy your project in "development mode", such that it's available on + ``sys.path``, yet can still be edited directly from its source checkout. + +* Easily extend the distutils with new commands or ``setup()`` arguments, and + distribute/reuse your extensions for multiple projects, without copying code. + +* Create extensible applications and frameworks that automatically discover + extensions, using simple "entry points" declared in a project's setup script. + +.. contents:: **Table of Contents** + +.. _ez_setup.py: `bootstrap module`_ + + +----------------- +Developer's Guide +----------------- + + +Installing ``setuptools`` +========================= + +Please follow the `EasyInstall Installation Instructions`_ to install the +current stable version of setuptools. In particular, be sure to read the +section on `Custom Installation Locations`_ if you are installing anywhere +other than Python's ``site-packages`` directory. + +.. _EasyInstall Installation Instructions: easy_install.html#installation-instructions + +.. _Custom Installation Locations: easy_install.html#custom-installation-locations + +If you want the current in-development version of setuptools, you should first +install a stable version, and then run:: + + ez_setup.py setuptools==dev + +This will download and install the latest development (i.e. unstable) version +of setuptools from the Python Subversion sandbox. + + +Basic Use +========= + +For basic use of setuptools, just import things from setuptools instead of +the distutils. Here's a minimal setup script using setuptools:: + + from setuptools import setup, find_packages + setup( + name="HelloWorld", + version="0.1", + packages=find_packages(), + ) + +As you can see, it doesn't take much to use setuptools in a project. +Run that script in your project folder, alongside the Python packages +you have developed. + +Invoke that script to produce eggs, upload to +PyPI, and automatically include all packages in the directory where the +setup.py lives. See the `Command Reference`_ section below to see what +commands you can give to this setup script. For example, +to produce a source distribution, simply invoke:: + + python setup.py sdist + +Of course, before you release your project to PyPI, you'll want to add a bit +more information to your setup script to help people find or learn about your +project. 
And maybe your project will have grown by then to include a few +dependencies, and perhaps some data files and scripts:: + + from setuptools import setup, find_packages + setup( + name="HelloWorld", + version="0.1", + packages=find_packages(), + scripts=['say_hello.py'], + + # Project uses reStructuredText, so ensure that the docutils get + # installed or upgraded on the target machine + install_requires=['docutils>=0.3'], + + package_data={ + # If any package contains *.txt or *.rst files, include them: + '': ['*.txt', '*.rst'], + # And include any *.msg files found in the 'hello' package, too: + 'hello': ['*.msg'], + }, + + # metadata for upload to PyPI + author="Me", + author_email="me@example.com", + description="This is an Example Package", + license="PSF", + keywords="hello world example examples", + url="http://example.com/HelloWorld/", # project home page, if any + + # could also include long_description, download_url, classifiers, etc. + ) + +In the sections that follow, we'll explain what most of these ``setup()`` +arguments do (except for the metadata ones), and the various ways you might use +them in your own project(s). + + +Specifying Your Project's Version +--------------------------------- + +Setuptools can work well with most versioning schemes; there are, however, a +few special things to watch out for, in order to ensure that setuptools and +EasyInstall can always tell what version of your package is newer than another +version. Knowing these things will also help you correctly specify what +versions of other projects your project depends on. + +A version consists of an alternating series of release numbers and pre-release +or post-release tags. A release number is a series of digits punctuated by +dots, such as ``2.4`` or ``0.5``. Each series of digits is treated +numerically, so releases ``2.1`` and ``2.1.0`` are different ways to spell the +same release number, denoting the first subrelease of release 2. But ``2.10`` +is the *tenth* subrelease of release 2, and so is a different and newer release +from ``2.1`` or ``2.1.0``. Leading zeros within a series of digits are also +ignored, so ``2.01`` is the same as ``2.1``, and different from ``2.0.1``. + +Following a release number, you can have either a pre-release or post-release +tag. Pre-release tags make a version be considered *older* than the version +they are appended to. So, revision ``2.4`` is *newer* than revision ``2.4c1``, +which in turn is newer than ``2.4b1`` or ``2.4a1``. Postrelease tags make +a version be considered *newer* than the version they are appended to. So, +revisions like ``2.4-1`` and ``2.4pl3`` are newer than ``2.4``, but are *older* +than ``2.4.1`` (which has a higher release number). + +A pre-release tag is a series of letters that are alphabetically before +"final". Some examples of prerelease tags would include ``alpha``, ``beta``, +``a``, ``c``, ``dev``, and so on. You do not have to place a dot or dash +before the prerelease tag if it's immediately after a number, but it's okay to +do so if you prefer. Thus, ``2.4c1`` and ``2.4.c1`` and ``2.4-c1`` all +represent release candidate 1 of version ``2.4``, and are treated as identical +by setuptools. + +In addition, there are three special prerelease tags that are treated as if +they were the letter ``c``: ``pre``, ``preview``, and ``rc``. So, version +``2.4rc1``, ``2.4pre1`` and ``2.4preview1`` are all the exact same version as +``2.4c1``, and are treated as identical by setuptools. 
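+
+You can verify these equivalences and orderings directly with the
+``pkg_resources.parse_version()`` function (the same helper used in the
+comparison examples later in this section); for example::
+
+    >>> from pkg_resources import parse_version
+    >>> parse_version('2.4rc1') == parse_version('2.4c1')
+    True
+    >>> parse_version('2.4c1') < parse_version('2.4')
+    True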
+ +A post-release tag is either a series of letters that are alphabetically +greater than or equal to "final", or a dash (``-``). Post-release tags are +generally used to separate patch numbers, port numbers, build numbers, revision +numbers, or date stamps from the release number. For example, the version +``2.4-r1263`` might denote Subversion revision 1263 of a post-release patch of +version ``2.4``. Or you might use ``2.4-20051127`` to denote a date-stamped +post-release. + +Notice that after each pre or post-release tag, you are free to place another +release number, followed again by more pre- or post-release tags. For example, +``0.6a9.dev-r41475`` could denote Subversion revision 41475 of the in- +development version of the ninth alpha of release 0.6. Notice that ``dev`` is +a pre-release tag, so this version is a *lower* version number than ``0.6a9``, +which would be the actual ninth alpha of release 0.6. But the ``-r41475`` is +a post-release tag, so this version is *newer* than ``0.6a9.dev``. + +For the most part, setuptools' interpretation of version numbers is intuitive, +but here are a few tips that will keep you out of trouble in the corner cases: + +* Don't stick adjoining pre-release tags together without a dot or number + between them. Version ``1.9adev`` is the ``adev`` prerelease of ``1.9``, + *not* a development pre-release of ``1.9a``. Use ``.dev`` instead, as in + ``1.9a.dev``, or separate the prerelease tags with a number, as in + ``1.9a0dev``. ``1.9a.dev``, ``1.9a0dev``, and even ``1.9.a.dev`` are + identical versions from setuptools' point of view, so you can use whatever + scheme you prefer. + +* If you want to be certain that your chosen numbering scheme works the way + you think it will, you can use the ``pkg_resources.parse_version()`` function + to compare different version numbers:: + + >>> from pkg_resources import parse_version + >>> parse_version('1.9.a.dev') == parse_version('1.9a0dev') + True + >>> parse_version('2.1-rc2') < parse_version('2.1') + True + >>> parse_version('0.6a9dev-r41475') < parse_version('0.6a9') + True + +Once you've decided on a version numbering scheme for your project, you can +have setuptools automatically tag your in-development releases with various +pre- or post-release tags. See the following sections for more details: + +* `Tagging and "Daily Build" or "Snapshot" Releases`_ +* `Managing "Continuous Releases" Using Subversion`_ +* The `egg_info`_ command + + +New and Changed ``setup()`` Keywords +==================================== + +The following keyword arguments to ``setup()`` are added or changed by +``setuptools``. All of them are optional; you do not have to supply them +unless you need the associated ``setuptools`` feature. + +``include_package_data`` + If set to ``True``, this tells ``setuptools`` to automatically include any + data files it finds inside your package directories that are specified by + your ``MANIFEST.in`` file. For more information, see the section below on + `Including Data Files`_. + +``exclude_package_data`` + A dictionary mapping package names to lists of glob patterns that should + be *excluded* from your package directories. You can use this to trim back + any excess files included by ``include_package_data``. For a complete + description and examples, see the section below on `Including Data Files`_. + +``package_data`` + A dictionary mapping package names to lists of glob patterns. For a + complete description and examples, see the section below on `Including + Data Files`_. 
You do not need to use this option if you are using + ``include_package_data``, unless you need to add e.g. files that are + generated by your setup script and build process. (And are therefore not + in source control or are files that you don't want to include in your + source distribution.) + +``zip_safe`` + A boolean (True or False) flag specifying whether the project can be + safely installed and run from a zip file. If this argument is not + supplied, the ``bdist_egg`` command will have to analyze all of your + project's contents for possible problems each time it builds an egg. + +``install_requires`` + A string or list of strings specifying what other distributions need to + be installed when this one is. See the section below on `Declaring + Dependencies`_ for details and examples of the format of this argument. + +``entry_points`` + A dictionary mapping entry point group names to strings or lists of strings + defining the entry points. Entry points are used to support dynamic + discovery of services or plugins provided by a project. See `Dynamic + Discovery of Services and Plugins`_ for details and examples of the format + of this argument. In addition, this keyword is used to support `Automatic + Script Creation`_. + +``extras_require`` + A dictionary mapping names of "extras" (optional features of your project) + to strings or lists of strings specifying what other distributions must be + installed to support those features. See the section below on `Declaring + Dependencies`_ for details and examples of the format of this argument. + +``python_requires`` + A string corresponding to a version specifier (as defined in PEP 440) for + the Python version, used to specify the Requires-Python defined in PEP 345. + +``setup_requires`` + A string or list of strings specifying what other distributions need to + be present in order for the *setup script* to run. ``setuptools`` will + attempt to obtain these (even going so far as to download them using + ``EasyInstall``) before processing the rest of the setup script or commands. + This argument is needed if you are using distutils extensions as part of + your build process; for example, extensions that process setup() arguments + and turn them into EGG-INFO metadata files. + + (Note: projects listed in ``setup_requires`` will NOT be automatically + installed on the system where the setup script is being run. They are + simply downloaded to the ./.eggs directory if they're not locally available + already. If you want them to be installed, as well as being available + when the setup script is run, you should add them to ``install_requires`` + **and** ``setup_requires``.) + +``dependency_links`` + A list of strings naming URLs to be searched when satisfying dependencies. + These links will be used if needed to install packages specified by + ``setup_requires`` or ``tests_require``. They will also be written into + the egg's metadata for use by tools like EasyInstall to use when installing + an ``.egg`` file. + +``namespace_packages`` + A list of strings naming the project's "namespace packages". A namespace + package is a package that may be split across multiple project + distributions. For example, Zope 3's ``zope`` package is a namespace + package, because subpackages like ``zope.interface`` and ``zope.publisher`` + may be distributed separately. 
The egg runtime system can automatically + merge such subpackages into a single parent package at runtime, as long + as you declare them in each project that contains any subpackages of the + namespace package, and as long as the namespace package's ``__init__.py`` + does not contain any code other than a namespace declaration. See the + section below on `Namespace Packages`_ for more information. + +``test_suite`` + A string naming a ``unittest.TestCase`` subclass (or a package or module + containing one or more of them, or a method of such a subclass), or naming + a function that can be called with no arguments and returns a + ``unittest.TestSuite``. If the named suite is a module, and the module + has an ``additional_tests()`` function, it is called and the results are + added to the tests to be run. If the named suite is a package, any + submodules and subpackages are recursively added to the overall test suite. + + Specifying this argument enables use of the `test`_ command to run the + specified test suite, e.g. via ``setup.py test``. See the section on the + `test`_ command below for more details. + +``tests_require`` + If your project's tests need one or more additional packages besides those + needed to install it, you can use this option to specify them. It should + be a string or list of strings specifying what other distributions need to + be present for the package's tests to run. When you run the ``test`` + command, ``setuptools`` will attempt to obtain these (even going + so far as to download them using ``EasyInstall``). Note that these + required projects will *not* be installed on the system where the tests + are run, but only downloaded to the project's setup directory if they're + not already installed locally. + +.. _test_loader: + +``test_loader`` + If you would like to use a different way of finding tests to run than what + setuptools normally uses, you can specify a module name and class name in + this argument. The named class must be instantiable with no arguments, and + its instances must support the ``loadTestsFromNames()`` method as defined + in the Python ``unittest`` module's ``TestLoader`` class. Setuptools will + pass only one test "name" in the `names` argument: the value supplied for + the ``test_suite`` argument. The loader you specify may interpret this + string in any way it likes, as there are no restrictions on what may be + contained in a ``test_suite`` string. + + The module name and class name must be separated by a ``:``. The default + value of this argument is ``"setuptools.command.test:ScanningLoader"``. If + you want to use the default ``unittest`` behavior, you can specify + ``"unittest:TestLoader"`` as your ``test_loader`` argument instead. This + will prevent automatic scanning of submodules and subpackages. + + The module and class you specify here may be contained in another package, + as long as you use the ``tests_require`` option to ensure that the package + containing the loader class is available when the ``test`` command is run. + +``eager_resources`` + A list of strings naming resources that should be extracted together, if + any of them is needed, or if any C extensions included in the project are + imported. This argument is only useful if the project will be installed as + a zipfile, and there is a need to have all of the listed resources be + extracted to the filesystem *as a unit*. 
Resources listed here + should be '/'-separated paths, relative to the source root, so to list a + resource ``foo.png`` in package ``bar.baz``, you would include the string + ``bar/baz/foo.png`` in this argument. + + If you only need to obtain resources one at a time, or you don't have any C + extensions that access other files in the project (such as data files or + shared libraries), you probably do NOT need this argument and shouldn't + mess with it. For more details on how this argument works, see the section + below on `Automatic Resource Extraction`_. + +``use_2to3`` + Convert the source code from Python 2 to Python 3 with 2to3 during the + build process. See :doc:`python3` for more details. + +``convert_2to3_doctests`` + List of doctest source files that need to be converted with 2to3. + See :doc:`python3` for more details. + +``use_2to3_fixers`` + A list of modules to search for additional fixers to be used during + the 2to3 conversion. See :doc:`python3` for more details. + + +Using ``find_packages()`` +------------------------- + +For simple projects, it's usually easy enough to manually add packages to +the ``packages`` argument of ``setup()``. However, for very large projects +(Twisted, PEAK, Zope, Chandler, etc.), it can be a big burden to keep the +package list updated. That's what ``setuptools.find_packages()`` is for. + +``find_packages()`` takes a source directory and two lists of package name +patterns to exclude and include. If omitted, the source directory defaults to +the same +directory as the setup script. Some projects use a ``src`` or ``lib`` +directory as the root of their source tree, and those projects would of course +use ``"src"`` or ``"lib"`` as the first argument to ``find_packages()``. (And +such projects also need something like ``package_dir={'':'src'}`` in their +``setup()`` arguments, but that's just a normal distutils thing.) + +Anyway, ``find_packages()`` walks the target directory, filtering by inclusion +patterns, and finds Python packages (any directory). On Python 3.2 and +earlier, packages are only recognized if they include an ``__init__.py`` file. +Finally, exclusion patterns are applied to remove matching packages. + +Inclusion and exclusion patterns are package names, optionally including +wildcards. For +example, ``find_packages(exclude=["*.tests"])`` will exclude all packages whose +last name part is ``tests``. Or, ``find_packages(exclude=["*.tests", +"*.tests.*"])`` will also exclude any subpackages of packages named ``tests``, +but it still won't exclude a top-level ``tests`` package or the children +thereof. In fact, if you really want no ``tests`` packages at all, you'll need +something like this:: + + find_packages(exclude=["*.tests", "*.tests.*", "tests.*", "tests"]) + +in order to cover all the bases. Really, the exclusion patterns are intended +to cover simpler use cases than this, like excluding a single, specified +package and its subpackages. + +Regardless of the parameters, the ``find_packages()`` +function returns a list of package names suitable for use as the ``packages`` +argument to ``setup()``, and so is usually the easiest way to set that +argument in your setup script. Especially since it frees you from having to +remember to modify your setup script whenever your project grows additional +top-level packages or subpackages. + + +Automatic Script Creation +========================= + +Packaging and installing scripts can be a bit awkward with the distutils. 
For +one thing, there's no easy way to have a script's filename match local +conventions on both Windows and POSIX platforms. For another, you often have +to create a separate file just for the "main" script, when your actual "main" +is a function in a module somewhere. And even in Python 2.4, using the ``-m`` +option only works for actual ``.py`` files that aren't installed in a package. + +``setuptools`` fixes all of these problems by automatically generating scripts +for you with the correct extension, and on Windows it will even create an +``.exe`` file so that users don't have to change their ``PATHEXT`` settings. +The way to use this feature is to define "entry points" in your setup script +that indicate what function the generated script should import and run. For +example, to create two console scripts called ``foo`` and ``bar``, and a GUI +script called ``baz``, you might do something like this:: + + setup( + # other arguments here... + entry_points={ + 'console_scripts': [ + 'foo = my_package.some_module:main_func', + 'bar = other_module:some_func', + ], + 'gui_scripts': [ + 'baz = my_package_gui:start_func', + ] + } + ) + +When this project is installed on non-Windows platforms (using "setup.py +install", "setup.py develop", or by using EasyInstall), a set of ``foo``, +``bar``, and ``baz`` scripts will be installed that import ``main_func`` and +``some_func`` from the specified modules. The functions you specify are called +with no arguments, and their return value is passed to ``sys.exit()``, so you +can return an errorlevel or message to print to stderr. + +On Windows, a set of ``foo.exe``, ``bar.exe``, and ``baz.exe`` launchers are +created, alongside a set of ``foo.py``, ``bar.py``, and ``baz.pyw`` files. The +``.exe`` wrappers find and execute the right version of Python to run the +``.py`` or ``.pyw`` file. + +You may define as many "console script" and "gui script" entry points as you +like, and each one can optionally specify "extras" that it depends on, that +will be added to ``sys.path`` when the script is run. For more information on +"extras", see the section below on `Declaring Extras`_. For more information +on "entry points" in general, see the section below on `Dynamic Discovery of +Services and Plugins`_. + + +"Eggsecutable" Scripts +---------------------- + +Occasionally, there are situations where it's desirable to make an ``.egg`` +file directly executable. You can do this by including an entry point such +as the following:: + + setup( + # other arguments here... + entry_points={ + 'setuptools.installation': [ + 'eggsecutable = my_package.some_module:main_func', + ] + } + ) + +Any eggs built from the above setup script will include a short executable +prelude that imports and calls ``main_func()`` from ``my_package.some_module``. +The prelude can be run on Unix-like platforms (including Mac and Linux) by +invoking the egg with ``/bin/sh``, or by enabling execute permissions on the +``.egg`` file. For the executable prelude to run, the appropriate version of +Python must be available via the ``PATH`` environment variable, under its +"long" name. That is, if the egg is built for Python 2.3, there must be a +``python2.3`` executable present in a directory on ``PATH``. + +This feature is primarily intended to support ez_setup the installation of +setuptools itself on non-Windows platforms, but may also be useful for other +projects as well. + +IMPORTANT NOTE: Eggs with an "eggsecutable" header cannot be renamed, or +invoked via symlinks. 
They *must* be invoked using their original filename, in +order to ensure that, once running, ``pkg_resources`` will know what project +and version is in use. The header script will check this and exit with an +error if the ``.egg`` file has been renamed or is invoked via a symlink that +changes its base name. + + +Declaring Dependencies +====================== + +``setuptools`` supports automatically installing dependencies when a package is +installed, and including information about dependencies in Python Eggs (so that +package management tools like EasyInstall can use the information). + +``setuptools`` and ``pkg_resources`` use a common syntax for specifying a +project's required dependencies. This syntax consists of a project's PyPI +name, optionally followed by a comma-separated list of "extras" in square +brackets, optionally followed by a comma-separated list of version +specifiers. A version specifier is one of the operators ``<``, ``>``, ``<=``, +``>=``, ``==`` or ``!=``, followed by a version identifier. Tokens may be +separated by whitespace, but any whitespace or nonstandard characters within a +project name or version identifier must be replaced with ``-``. + +Version specifiers for a given project are internally sorted into ascending +version order, and used to establish what ranges of versions are acceptable. +Adjacent redundant conditions are also consolidated (e.g. ``">1, >2"`` becomes +``">1"``, and ``"<2,<3"`` becomes ``"<3"``). ``"!="`` versions are excised from +the ranges they fall within. A project's version is then checked for +membership in the resulting ranges. (Note that providing conflicting conditions +for the same version (e.g. "<2,>=2" or "==2,!=2") is meaningless and may +therefore produce bizarre results.) + +Here are some example requirement specifiers:: + + docutils >= 0.3 + + # comment lines and \ continuations are allowed in requirement strings + BazSpam ==1.1, ==1.2, ==1.3, ==1.4, ==1.5, \ + ==1.6, ==1.7 # and so are line-end comments + + PEAK[FastCGI, reST]>=0.5a4 + + setuptools==0.5a7 + +The simplest way to include requirement specifiers is to use the +``install_requires`` argument to ``setup()``. It takes a string or list of +strings containing requirement specifiers. If you include more than one +requirement in a string, each requirement must begin on a new line. + +This has three effects: + +1. When your project is installed, either by using EasyInstall, ``setup.py + install``, or ``setup.py develop``, all of the dependencies not already + installed will be located (via PyPI), downloaded, built (if necessary), + and installed. + +2. Any scripts in your project will be installed with wrappers that verify + the availability of the specified dependencies at runtime, and ensure that + the correct versions are added to ``sys.path`` (e.g. if multiple versions + have been installed). + +3. Python Egg distributions will include a metadata file listing the + dependencies. + +Note, by the way, that if you declare your dependencies in ``setup.py``, you do +*not* need to use the ``require()`` function in your scripts or modules, as +long as you either install the project or use ``setup.py develop`` to do +development work on it. (See `"Development Mode"`_ below for more details on +using ``setup.py develop``.) 
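Putting this together, here is a minimal sketch of a setup script that declares the example requirements shown above (the project name and version are hypothetical; the requirement strings are the ones from the examples)::

    from setuptools import setup, find_packages

    setup(
        name="Example-Project",       # hypothetical project name
        version="0.1",
        packages=find_packages(),
        install_requires=[
            "docutils >= 0.3",
            "BazSpam ==1.1",                  # an exact-version pin
            "PEAK[FastCGI, reST] >= 0.5a4",   # another project's extras
        ],
    )

Each string in ``install_requires`` is an ordinary requirement specifier, so
everything described above (version ranges, extras, comments, and so on)
applies to these strings as well.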
+ + +Dependencies that aren't in PyPI +-------------------------------- + +If your project depends on packages that aren't registered in PyPI, you may +still be able to depend on them, as long as they are available for download +as: + +- an egg, in the standard distutils ``sdist`` format, +- a single ``.py`` file, or +- a VCS repository (Subversion, Mercurial, or Git). + +You just need to add some URLs to the ``dependency_links`` argument to +``setup()``. + +The URLs must be either: + +1. direct download URLs, +2. the URLs of web pages that contain direct download links, or +3. the repository's URL + +In general, it's better to link to web pages, because it is usually less +complex to update a web page than to release a new version of your project. +You can also use a SourceForge ``showfiles.php`` link in the case where a +package you depend on is distributed via SourceForge. + +If you depend on a package that's distributed as a single ``.py`` file, you +must include an ``"#egg=project-version"`` suffix to the URL, to give a project +name and version number. (Be sure to escape any dashes in the name or version +by replacing them with underscores.) EasyInstall will recognize this suffix +and automatically create a trivial ``setup.py`` to wrap the single ``.py`` file +as an egg. + +In the case of a VCS checkout, you should also append ``#egg=project-version`` +in order to identify for what package that checkout should be used. You can +append ``@REV`` to the URL's path (before the fragment) to specify a revision. +Additionally, you can also force the VCS being used by prepending the URL with +a certain prefix. Currently available are: + +- ``svn+URL`` for Subversion, +- ``git+URL`` for Git, and +- ``hg+URL`` for Mercurial + +A more complete example would be: + + ``vcs+proto://host/path@revision#egg=project-version`` + +Be careful with the version. It should match the one inside the project files. +If you want to disregard the version, you have to omit it both in the +``requires`` and in the URL's fragment. + +This will do a checkout (or a clone, in Git and Mercurial parlance) to a +temporary folder and run ``setup.py bdist_egg``. + +The ``dependency_links`` option takes the form of a list of URL strings. For +example, the below will cause EasyInstall to search the specified page for +eggs or source distributions, if the package's dependencies aren't already +installed:: + + setup( + ... + dependency_links=[ + "http://peak.telecommunity.com/snapshots/" + ], + ) + + +.. _Declaring Extras: + + +Declaring "Extras" (optional features with their own dependencies) +------------------------------------------------------------------ + +Sometimes a project has "recommended" dependencies, that are not required for +all uses of the project. For example, a project might offer optional PDF +output if ReportLab is installed, and reStructuredText support if docutils is +installed. These optional features are called "extras", and setuptools allows +you to define their requirements as well. In this way, other projects that +require these optional features can force the additional requirements to be +installed, by naming the desired extras in their ``install_requires``. + +For example, let's say that Project A offers optional PDF and reST support:: + + setup( + name="Project-A", + ... 
        extras_require={
            'PDF': ["ReportLab>=1.2", "RXP"],
            'reST': ["docutils>=0.3"],
        }
    )

As you can see, the ``extras_require`` argument takes a dictionary mapping
names of "extra" features to strings or lists of strings describing those
features' requirements.  These requirements will *not* be automatically
installed unless another package depends on them (directly or indirectly) by
including the desired "extras" in square brackets after the associated project
name.  (Or if the extras were listed in a requirement spec on the EasyInstall
command line.)

Extras can be used by a project's `entry points`_ to specify dynamic
dependencies.  For example, if Project A includes a "rst2pdf" script, it might
declare it like this, so that the "PDF" requirements are only resolved if the
"rst2pdf" script is run::

    setup(
        name="Project-A",
        ...
        entry_points={
            'console_scripts': [
                'rst2pdf = project_a.tools.pdfgen [PDF]',
                'rst2html = project_a.tools.htmlgen',
                # more script entry points ...
            ],
        }
    )

Projects can also use another project's extras when specifying dependencies.
For example, if project B needs "project A" with PDF support installed, it
might declare the dependency like this::

    setup(
        name="Project-B",
        install_requires=["Project-A[PDF]"],
        ...
    )

This will cause ReportLab to be installed along with project A, if project B is
installed -- even if project A was already installed.  In this way, a project
can encapsulate groups of optional "downstream dependencies" under a feature
name, so that packages that depend on it don't have to know what the downstream
dependencies are.  If a later version of Project A builds in PDF support and
no longer needs ReportLab, or if it ends up needing other dependencies besides
ReportLab in order to provide PDF support, Project B's setup information does
not need to change, but the right packages will still be installed if needed.

Note, by the way, that if a project ends up not needing any other packages to
support a feature, it should keep an empty requirements list for that feature
in its ``extras_require`` argument, so that packages depending on that feature
don't break (due to an invalid feature name).  For example, if Project A above
builds in PDF support and no longer needs ReportLab, it could change its
setup to this::

    setup(
        name="Project-A",
        ...
        extras_require={
            'PDF': [],
            'reST': ["docutils>=0.3"],
        }
    )

so that Project B doesn't have to remove the ``[PDF]`` from its requirement
specifier.


.. _Platform Specific Dependencies:


Declaring platform specific dependencies
----------------------------------------

Sometimes a project might require a dependency to run on a specific platform.
This could be a package that backports a module so that it can be used in
older Python versions, or a package that is required to run on a specific
operating system.  Declaring such dependencies conditionally allows a project
to work on multiple platforms without installing dependencies that are not
required for the platform on which the project is being installed.

For example, here is a project that uses the ``enum`` module and ``pywin32``::

    setup(
        name="Project",
        ...
        install_requires=[
            'enum34;python_version<"3.4"',
            'pywin32 >= 1.0;platform_system=="Windows"'
        ]
    )

Since the ``enum`` module was added in Python 3.4, it should only be installed
if the Python version is earlier.
Since ``pywin32`` will only be used on +windows, it should only be installed when the operating system is Windows. +Specifying version requirements for the dependencies is supported as normal. + +The environmental markers that may be used for testing platform types are +detailed in `PEP 508`_. + +.. _PEP 508: https://www.python.org/dev/peps/pep-0508/ + +Including Data Files +==================== + +The distutils have traditionally allowed installation of "data files", which +are placed in a platform-specific location. However, the most common use case +for data files distributed with a package is for use *by* the package, usually +by including the data files in the package directory. + +Setuptools offers three ways to specify data files to be included in your +packages. First, you can simply use the ``include_package_data`` keyword, +e.g.:: + + from setuptools import setup, find_packages + setup( + ... + include_package_data=True + ) + +This tells setuptools to install any data files it finds in your packages. +The data files must be specified via the distutils' ``MANIFEST.in`` file. +(They can also be tracked by a revision control system, using an appropriate +plugin. See the section below on `Adding Support for Revision Control +Systems`_ for information on how to write such plugins.) + +If you want finer-grained control over what files are included (for example, +if you have documentation files in your package directories and want to exclude +them from installation), then you can also use the ``package_data`` keyword, +e.g.:: + + from setuptools import setup, find_packages + setup( + ... + package_data={ + # If any package contains *.txt or *.rst files, include them: + '': ['*.txt', '*.rst'], + # And include any *.msg files found in the 'hello' package, too: + 'hello': ['*.msg'], + } + ) + +The ``package_data`` argument is a dictionary that maps from package names to +lists of glob patterns. The globs may include subdirectory names, if the data +files are contained in a subdirectory of the package. For example, if the +package tree looks like this:: + + setup.py + src/ + mypkg/ + __init__.py + mypkg.txt + data/ + somefile.dat + otherdata.dat + +The setuptools setup file might look like this:: + + from setuptools import setup, find_packages + setup( + ... + packages=find_packages('src'), # include all packages under src + package_dir={'':'src'}, # tell distutils packages are under src + + package_data={ + # If any package contains *.txt files, include them: + '': ['*.txt'], + # And include any *.dat files found in the 'data' subdirectory + # of the 'mypkg' package, also: + 'mypkg': ['data/*.dat'], + } + ) + +Notice that if you list patterns in ``package_data`` under the empty string, +these patterns are used to find files in every package, even ones that also +have their own patterns listed. Thus, in the above example, the ``mypkg.txt`` +file gets included even though it's not listed in the patterns for ``mypkg``. + +Also notice that if you use paths, you *must* use a forward slash (``/``) as +the path separator, even if you are on Windows. Setuptools automatically +converts slashes to appropriate platform-specific separators at build time. + +(Note: although the ``package_data`` argument was previously only available in +``setuptools``, it was also added to the Python ``distutils`` package as of +Python 2.4; there is `some documentation for the feature`__ available on the +python.org website. 
If using the setuptools-specific ``include_package_data`` +argument, files specified by ``package_data`` will *not* be automatically +added to the manifest unless they are listed in the MANIFEST.in file.) + +__ http://docs.python.org/dist/node11.html + +Sometimes, the ``include_package_data`` or ``package_data`` options alone +aren't sufficient to precisely define what files you want included. For +example, you may want to include package README files in your revision control +system and source distributions, but exclude them from being installed. So, +setuptools offers an ``exclude_package_data`` option as well, that allows you +to do things like this:: + + from setuptools import setup, find_packages + setup( + ... + packages=find_packages('src'), # include all packages under src + package_dir={'':'src'}, # tell distutils packages are under src + + include_package_data=True, # include everything in source control + + # ...but exclude README.txt from all packages + exclude_package_data={'': ['README.txt']}, + ) + +The ``exclude_package_data`` option is a dictionary mapping package names to +lists of wildcard patterns, just like the ``package_data`` option. And, just +as with that option, a key of ``''`` will apply the given pattern(s) to all +packages. However, any files that match these patterns will be *excluded* +from installation, even if they were listed in ``package_data`` or were +included as a result of using ``include_package_data``. + +In summary, the three options allow you to: + +``include_package_data`` + Accept all data files and directories matched by ``MANIFEST.in``. + +``package_data`` + Specify additional patterns to match files and directories that may or may + not be matched by ``MANIFEST.in`` or found in source control. + +``exclude_package_data`` + Specify patterns for data files and directories that should *not* be + included when a package is installed, even if they would otherwise have + been included due to the use of the preceding options. + +NOTE: Due to the way the distutils build process works, a data file that you +include in your project and then stop including may be "orphaned" in your +project's build directories, requiring you to run ``setup.py clean --all`` to +fully remove them. This may also be important for your users and contributors +if they track intermediate revisions of your project using Subversion; be sure +to let them know when you make changes that remove files from inclusion so they +can run ``setup.py clean --all``. + + +Accessing Data Files at Runtime +------------------------------- + +Typically, existing programs manipulate a package's ``__file__`` attribute in +order to find the location of data files. However, this manipulation isn't +compatible with PEP 302-based import hooks, including importing from zip files +and Python Eggs. It is strongly recommended that, if you are using data files, +you should use the :ref:`ResourceManager API` of ``pkg_resources`` to access +them. The ``pkg_resources`` module is distributed as part of setuptools, so if +you're using setuptools to distribute your package, there is no reason not to +use its resource management API. See also `Accessing Package Resources`_ for +a quick example of converting code that uses ``__file__`` to use +``pkg_resources`` instead. + +.. 
_Accessing Package Resources: http://peak.telecommunity.com/DevCenter/PythonEggs#accessing-package-resources


Non-Package Data Files
----------------------

The ``distutils`` normally install general "data files" to a platform-specific
location (e.g. ``/usr/share``).  This feature is intended to be used for things
like documentation, example configuration files, and the like.  ``setuptools``
does not install these data files in a separate location, however.  They are
bundled inside the egg file or directory, alongside the Python modules and
packages.  The data files can also be accessed using the :ref:`ResourceManager
API`, by specifying a ``Requirement`` instead of a package name::

    from pkg_resources import Requirement, resource_filename
    filename = resource_filename(Requirement.parse("MyProject"),"sample.conf")

The above code will obtain the filename of the "sample.conf" file in the data
root of the "MyProject" distribution.

Note, by the way, that this encapsulation of data files means that you can't
actually install data files to some arbitrary location on a user's machine;
this is a feature, not a bug.  You can always include a script in your
distribution that extracts and copies your documentation or data files to
a user-specified location, at their discretion.  If you put related data files
in a single directory, you can use ``resource_filename()`` with the directory
name to get a filesystem directory that then can be copied with the ``shutil``
module.  (Even if your package is installed as a zipfile, calling
``resource_filename()`` on a directory will return an actual filesystem
directory, whose contents will be that entire subtree of your distribution.)

(Of course, if you're writing a new package, you can just as easily place your
data files or directories inside one of your packages, rather than using the
distutils' approach.  However, if you're updating an existing application, it
may be simpler not to change the way it currently specifies these data files.)


Automatic Resource Extraction
-----------------------------

If you are using tools that expect your resources to be "real" files, or your
project includes non-extension native libraries or other files that your C
extensions expect to be able to access, you may need to list those files in
the ``eager_resources`` argument to ``setup()``, so that the files will be
extracted together, whenever a C extension in the project is imported.

This is especially important if your project includes shared libraries *other*
than distutils-built C extensions, and those shared libraries use file
extensions other than ``.dll``, ``.so``, or ``.dylib``, which are the
extensions that setuptools 0.6a8 and higher automatically detects as shared
libraries and adds to the ``native_libs.txt`` file for you.  Any shared
libraries whose names do not end with one of those extensions should be listed
as ``eager_resources``, because they need to be present in the filesystem when
the C extensions that link to them are used.

The ``pkg_resources`` runtime for compressed packages will automatically
extract *all* C extensions and ``eager_resources`` at the same time, whenever
*any* C extension or eager resource is requested via the ``resource_filename()``
API.  (C extensions are imported using ``resource_filename()`` internally.)
This ensures that C extensions will see all of the "real" files that they
expect to see.
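As a purely illustrative sketch (the project, package, and file names below are
hypothetical), a project whose C extension links against a bundled shared
library with a non-standard file extension might declare it like this::

    setup(
        name="Example-Project",
        ...
        eager_resources=[
            # not auto-detected, since the name doesn't end in .dll/.so/.dylib
            "example_pkg/libhelper.so.1",
            # a data file the library reads at runtime
            "example_pkg/helper.dat",
        ],
    )

As described earlier, the paths are '/'-separated and relative to the source
root, and listing them only matters if the project is installed as a zipfile.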
+ +Note also that you can list directory resource names in ``eager_resources`` as +well, in which case the directory's contents (including subdirectories) will be +extracted whenever any C extension or eager resource is requested. + +Please note that if you're not sure whether you need to use this argument, you +don't! It's really intended to support projects with lots of non-Python +dependencies and as a last resort for crufty projects that can't otherwise +handle being compressed. If your package is pure Python, Python plus data +files, or Python plus C, you really don't need this. You've got to be using +either C or an external program that needs "real" files in your project before +there's any possibility of ``eager_resources`` being relevant to your project. + + +Extensible Applications and Frameworks +====================================== + + +.. _Entry Points: + +Dynamic Discovery of Services and Plugins +----------------------------------------- + +``setuptools`` supports creating libraries that "plug in" to extensible +applications and frameworks, by letting you register "entry points" in your +project that can be imported by the application or framework. + +For example, suppose that a blogging tool wants to support plugins +that provide translation for various file types to the blog's output format. +The framework might define an "entry point group" called ``blogtool.parsers``, +and then allow plugins to register entry points for the file extensions they +support. + +This would allow people to create distributions that contain one or more +parsers for different file types, and then the blogging tool would be able to +find the parsers at runtime by looking up an entry point for the file +extension (or mime type, or however it wants to). + +Note that if the blogging tool includes parsers for certain file formats, it +can register these as entry points in its own setup script, which means it +doesn't have to special-case its built-in formats. They can just be treated +the same as any other plugin's entry points would be. + +If you're creating a project that plugs in to an existing application or +framework, you'll need to know what entry points or entry point groups are +defined by that application or framework. Then, you can register entry points +in your setup script. Here are a few examples of ways you might register an +``.rst`` file parser entry point in the ``blogtool.parsers`` entry point group, +for our hypothetical blogging tool:: + + setup( + # ... + entry_points={'blogtool.parsers': '.rst = some_module:SomeClass'} + ) + + setup( + # ... + entry_points={'blogtool.parsers': ['.rst = some_module:a_func']} + ) + + setup( + # ... + entry_points=""" + [blogtool.parsers] + .rst = some.nested.module:SomeClass.some_classmethod [reST] + """, + extras_require=dict(reST="Docutils>=0.3.5") + ) + +The ``entry_points`` argument to ``setup()`` accepts either a string with +``.ini``-style sections, or a dictionary mapping entry point group names to +either strings or lists of strings containing entry point specifiers. An +entry point specifier consists of a name and value, separated by an ``=`` +sign. The value consists of a dotted module name, optionally followed by a +``:`` and a dotted identifier naming an object within the module. It can +also include a bracketed list of "extras" that are required for the entry +point to be used. 
When the invoking application or framework requests loading +of an entry point, any requirements implied by the associated extras will be +passed to ``pkg_resources.require()``, so that an appropriate error message +can be displayed if the needed package(s) are missing. (Of course, the +invoking app or framework can ignore such errors if it wants to make an entry +point optional if a requirement isn't installed.) + + +Defining Additional Metadata +---------------------------- + +Some extensible applications and frameworks may need to define their own kinds +of metadata to include in eggs, which they can then access using the +``pkg_resources`` metadata APIs. Ordinarily, this is done by having plugin +developers include additional files in their ``ProjectName.egg-info`` +directory. However, since it can be tedious to create such files by hand, you +may want to create a distutils extension that will create the necessary files +from arguments to ``setup()``, in much the same way that ``setuptools`` does +for many of the ``setup()`` arguments it adds. See the section below on +`Creating distutils Extensions`_ for more details, especially the subsection on +`Adding new EGG-INFO Files`_. + + +"Development Mode" +================== + +Under normal circumstances, the ``distutils`` assume that you are going to +build a distribution of your project, not use it in its "raw" or "unbuilt" +form. If you were to use the ``distutils`` that way, you would have to rebuild +and reinstall your project every time you made a change to it during +development. + +Another problem that sometimes comes up with the ``distutils`` is that you may +need to do development on two related projects at the same time. You may need +to put both projects' packages in the same directory to run them, but need to +keep them separate for revision control purposes. How can you do this? + +Setuptools allows you to deploy your projects for use in a common directory or +staging area, but without copying any files. Thus, you can edit each project's +code in its checkout directory, and only need to run build commands when you +change a project's C extensions or similarly compiled files. You can even +deploy a project into another project's checkout directory, if that's your +preferred way of working (as opposed to using a common independent staging area +or the site-packages directory). + +To do this, use the ``setup.py develop`` command. It works very similarly to +``setup.py install`` or the EasyInstall tool, except that it doesn't actually +install anything. Instead, it creates a special ``.egg-link`` file in the +deployment directory, that links to your project's source code. And, if your +deployment directory is Python's ``site-packages`` directory, it will also +update the ``easy-install.pth`` file to include your project's source code, +thereby making it available on ``sys.path`` for all programs using that Python +installation. + +If you have enabled the ``use_2to3`` flag, then of course the ``.egg-link`` +will not link directly to your source code when run under Python 3, since +that source code would be made for Python 2 and not work under Python 3. +Instead the ``setup.py develop`` will build Python 3 code under the ``build`` +directory, and link there. This means that after doing code changes you will +have to run ``setup.py build`` before these changes are picked up by your +Python 3 installation. 
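For the common case, the whole cycle is just a pair of commands run from the
project's checkout directory (a sketch assuming the default deployment
target)::

    python setup.py develop              # deploy the checkout in place
    python setup.py develop --uninstall  # remove it again when you're done

The ``--uninstall`` form, and the options for choosing a different staging
area, are discussed in more detail below.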
+ +In addition, the ``develop`` command creates wrapper scripts in the target +script directory that will run your in-development scripts after ensuring that +all your ``install_requires`` packages are available on ``sys.path``. + +You can deploy the same project to multiple staging areas, e.g. if you have +multiple projects on the same machine that are sharing the same project you're +doing development work. + +When you're done with a given development task, you can remove the project +source from a staging area using ``setup.py develop --uninstall``, specifying +the desired staging area if it's not the default. + +There are several options to control the precise behavior of the ``develop`` +command; see the section on the `develop`_ command below for more details. + +Note that you can also apply setuptools commands to non-setuptools projects, +using commands like this:: + + python -c "import setuptools; execfile('setup.py')" develop + +That is, you can simply list the normal setup commands and options following +the quoted part. + + +Distributing a ``setuptools``-based project +=========================================== + +Using ``setuptools``... Without bundling it! +--------------------------------------------- + +Your users might not have ``setuptools`` installed on their machines, or even +if they do, it might not be the right version. Fixing this is easy; just +download `ez_setup.py`_, and put it in the same directory as your ``setup.py`` +script. (Be sure to add it to your revision control system, too.) Then add +these two lines to the very top of your setup script, before the script imports +anything from setuptools: + +.. code-block:: python + + import ez_setup + ez_setup.use_setuptools() + +That's it. The ``ez_setup`` module will automatically download a matching +version of ``setuptools`` from PyPI, if it isn't present on the target system. +Whenever you install an updated version of setuptools, you should also update +your projects' ``ez_setup.py`` files, so that a matching version gets installed +on the target machine(s). + +By the way, setuptools supports the new PyPI "upload" command, so you can use +``setup.py sdist upload`` or ``setup.py bdist_egg upload`` to upload your +source or egg distributions respectively. Your project's current version must +be registered with PyPI first, of course; you can use ``setup.py register`` to +do that. Or you can do it all in one step, e.g. ``setup.py register sdist +bdist_egg upload`` will register the package, build source and egg +distributions, and then upload them both to PyPI, where they'll be easily +found by other projects that depend on them. + +(By the way, if you need to distribute a specific version of ``setuptools``, +you can specify the exact version and base download URL as parameters to the +``use_setuptools()`` function. See the function's docstring for details.) + + +What Your Users Should Know +--------------------------- + +In general, a setuptools-based project looks just like any distutils-based +project -- as long as your users have an internet connection and are installing +to ``site-packages``, that is. But for some users, these conditions don't +apply, and they may become frustrated if this is their first encounter with +a setuptools-based project. To keep these users happy, you should review the +following topics in your project's installation instructions, if they are +relevant to your project and your target audience isn't already familiar with +setuptools and ``easy_install``. 
+ +Network Access + If your project is using ``ez_setup``, you should inform users of the + need to either have network access, or to preinstall the correct version of + setuptools using the `EasyInstall installation instructions`_. Those + instructions also have tips for dealing with firewalls as well as how to + manually download and install setuptools. + +Custom Installation Locations + You should inform your users that if they are installing your project to + somewhere other than the main ``site-packages`` directory, they should + first install setuptools using the instructions for `Custom Installation + Locations`_, before installing your project. + +Your Project's Dependencies + If your project depends on other projects that may need to be downloaded + from PyPI or elsewhere, you should list them in your installation + instructions, or tell users how to find out what they are. While most + users will not need this information, any users who don't have unrestricted + internet access may have to find, download, and install the other projects + manually. (Note, however, that they must still install those projects + using ``easy_install``, or your project will not know they are installed, + and your setup script will try to download them again.) + + If you want to be especially friendly to users with limited network access, + you may wish to build eggs for your project and its dependencies, making + them all available for download from your site, or at least create a page + with links to all of the needed eggs. In this way, users with limited + network access can manually download all the eggs to a single directory, + then use the ``-f`` option of ``easy_install`` to specify the directory + to find eggs in. Users who have full network access can just use ``-f`` + with the URL of your download page, and ``easy_install`` will find all the + needed eggs using your links directly. This is also useful when your + target audience isn't able to compile packages (e.g. most Windows users) + and your package or some of its dependencies include C code. + +Revision Control System Users and Co-Developers + Users and co-developers who are tracking your in-development code using + a revision control system should probably read this manual's sections + regarding such development. Alternately, you may wish to create a + quick-reference guide containing the tips from this manual that apply to + your particular situation. For example, if you recommend that people use + ``setup.py develop`` when tracking your in-development code, you should let + them know that this needs to be run after every update or commit. + + Similarly, if you remove modules or data files from your project, you + should remind them to run ``setup.py clean --all`` and delete any obsolete + ``.pyc`` or ``.pyo``. (This tip applies to the distutils in general, not + just setuptools, but not everybody knows about them; be kind to your users + by spelling out your project's best practices rather than leaving them + guessing.) + +Creating System Packages + Some users want to manage all Python packages using a single package + manager, and sometimes that package manager isn't ``easy_install``! + Setuptools currently supports ``bdist_rpm``, ``bdist_wininst``, and + ``bdist_dumb`` formats for system packaging. If a user has a locally- + installed "bdist" packaging tool that internally uses the distutils + ``install`` command, it should be able to work with ``setuptools``. 
    Some examples of "bdist" formats that this should work with include the
    ``bdist_nsi`` and ``bdist_msi`` formats for Windows.

    However, packaging tools that build binary distributions by running
    ``setup.py install`` on the command line or as a subprocess will require
    modification to work with setuptools.  They should use the
    ``--single-version-externally-managed`` option to the ``install`` command,
    combined with the standard ``--root`` or ``--record`` options.
    See the `install command`_ documentation below for more details.  The
    ``bdist_deb`` command is an example of a command that currently requires
    this kind of patching to work with setuptools.

    If you or your users have a problem building a usable system package for
    your project, please report the problem via the mailing list so that
    either the "bdist" tool in question or setuptools can be modified to
    resolve the issue.


Setting the ``zip_safe`` flag
-----------------------------

For some use cases (such as bundling as part of a larger application), Python
packages may be run directly from a zip file.
Not all packages, however, are capable of running in compressed form, because
they may expect to be able to access either source code or data files as
normal operating system files.  So, ``setuptools`` can install your project
as a zipfile or a directory, and its default choice is determined by the
project's ``zip_safe`` flag.

You can pass a True or False value for the ``zip_safe`` argument to the
``setup()`` function, or you can omit it.  If you omit it, the ``bdist_egg``
command will analyze your project's contents to see if it can detect any
conditions that would prevent it from working in a zipfile.  It will output
notices to the console about any such conditions that it finds.

Currently, this analysis is extremely conservative: it will consider the
project unsafe if it contains any C extensions or data files whatsoever.  This
does *not* mean that the project can't or won't work as a zipfile!  It just
means that the ``bdist_egg`` authors aren't yet comfortable asserting that
the project *will* work.  If the project contains no C or data files, and does
no ``__file__`` or ``__path__`` introspection or source code manipulation, then
there is an extremely solid chance the project will work when installed as a
zipfile.  (And if the project uses ``pkg_resources`` for all its data file
access, then C extensions and other data files shouldn't be a problem at all.
See the `Accessing Data Files at Runtime`_ section above for more information.)

However, if ``bdist_egg`` can't be *sure* that your package will work, but
you've checked over all the warnings it issued, and you are either satisfied it
*will* work (or if you want to try it for yourself), then you should set
``zip_safe`` to ``True`` in your ``setup()`` call.  If it turns out that it
doesn't work, you can always change it to ``False``, which will force
``setuptools`` to install your project as a directory rather than as a zipfile.

Of course, the end-user can still override either decision, if they are using
EasyInstall to install your package.  And, if you want to override for testing
purposes, you can just run ``setup.py easy_install --zip-ok .`` or ``setup.py
easy_install --always-unzip .`` in your project directory, to install the
package as a zipfile or directory, respectively.
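For example, once you have satisfied yourself that your project runs correctly
from a zipfile, the declaration itself is just one more keyword argument (the
project name here is hypothetical)::

    setup(
        name="Example-Project",
        ...
        zip_safe=True,    # use False to force installation as a directory
    )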
+ +In the future, as we gain more experience with different packages and become +more satisfied with the robustness of the ``pkg_resources`` runtime, the +"zip safety" analysis may become less conservative. However, we strongly +recommend that you determine for yourself whether your project functions +correctly when installed as a zipfile, correct any problems if you can, and +then make an explicit declaration of ``True`` or ``False`` for the ``zip_safe`` +flag, so that it will not be necessary for ``bdist_egg`` or ``EasyInstall`` to +try to guess whether your project can work as a zipfile. + + +Namespace Packages +------------------ + +Sometimes, a large package is more useful if distributed as a collection of +smaller eggs. However, Python does not normally allow the contents of a +package to be retrieved from more than one location. "Namespace packages" +are a solution for this problem. When you declare a package to be a namespace +package, it means that the package has no meaningful contents in its +``__init__.py``, and that it is merely a container for modules and subpackages. + +The ``pkg_resources`` runtime will then automatically ensure that the contents +of namespace packages that are spread over multiple eggs or directories are +combined into a single "virtual" package. + +The ``namespace_packages`` argument to ``setup()`` lets you declare your +project's namespace packages, so that they will be included in your project's +metadata. The argument should list the namespace packages that the egg +participates in. For example, the ZopeInterface project might do this:: + + setup( + # ... + namespace_packages=['zope'] + ) + +because it contains a ``zope.interface`` package that lives in the ``zope`` +namespace package. Similarly, a project for a standalone ``zope.publisher`` +would also declare the ``zope`` namespace package. When these projects are +installed and used, Python will see them both as part of a "virtual" ``zope`` +package, even though they will be installed in different locations. + +Namespace packages don't have to be top-level packages. For example, Zope 3's +``zope.app`` package is a namespace package, and in the future PEAK's +``peak.util`` package will be too. + +Note, by the way, that your project's source tree must include the namespace +packages' ``__init__.py`` files (and the ``__init__.py`` of any parent +packages), in a normal Python package layout. These ``__init__.py`` files +*must* contain the line:: + + __import__('pkg_resources').declare_namespace(__name__) + +This code ensures that the namespace package machinery is operating and that +the current package is registered as a namespace package. + +You must NOT include any other code and data in a namespace package's +``__init__.py``. Even though it may appear to work during development, or when +projects are installed as ``.egg`` files, it will not work when the projects +are installed using "system" packaging tools -- in such cases the +``__init__.py`` files will not be installed, let alone executed. + +You must include the ``declare_namespace()`` line in the ``__init__.py`` of +*every* project that has contents for the namespace package in question, in +order to ensure that the namespace will be declared regardless of which +project's copy of ``__init__.py`` is loaded first. If the first loaded +``__init__.py`` doesn't declare it, it will never *be* declared, because no +other copies will ever be loaded! 
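Putting the two pieces together, a minimal sketch for a hypothetical project
that distributes only the standalone ``zope.publisher`` subpackage might look
like this::

    # zope/__init__.py -- contains *only* the namespace declaration
    __import__('pkg_resources').declare_namespace(__name__)

    # setup.py (abridged)
    setup(
        name="zope.publisher",            # hypothetical standalone project
        ...
        packages=['zope', 'zope.publisher'],
        namespace_packages=['zope'],
    )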
+ + +TRANSITIONAL NOTE +~~~~~~~~~~~~~~~~~ + +Setuptools automatically calls ``declare_namespace()`` for you at runtime, +but future versions may *not*. This is because the automatic declaration +feature has some negative side effects, such as needing to import all namespace +packages during the initialization of the ``pkg_resources`` runtime, and also +the need for ``pkg_resources`` to be explicitly imported before any namespace +packages work at all. In some future releases, you'll be responsible +for including your own declaration lines, and the automatic declaration feature +will be dropped to get rid of the negative side effects. + +During the remainder of the current development cycle, therefore, setuptools +will warn you about missing ``declare_namespace()`` calls in your +``__init__.py`` files, and you should correct these as soon as possible +before the compatibility support is removed. +Namespace packages without declaration lines will not work +correctly once a user has upgraded to a later version, so it's important that +you make this change now in order to avoid having your code break in the field. +Our apologies for the inconvenience, and thank you for your patience. + + + +Tagging and "Daily Build" or "Snapshot" Releases +------------------------------------------------ + +When a set of related projects are under development, it may be important to +track finer-grained version increments than you would normally use for e.g. +"stable" releases. While stable releases might be measured in dotted numbers +with alpha/beta/etc. status codes, development versions of a project often +need to be tracked by revision or build number or even build date. This is +especially true when projects in development need to refer to one another, and +therefore may literally need an up-to-the-minute version of something! + +To support these scenarios, ``setuptools`` allows you to "tag" your source and +egg distributions by adding one or more of the following to the project's +"official" version identifier: + +* A manually-specified pre-release tag, such as "build" or "dev", or a + manually-specified post-release tag, such as a build or revision number + (``--tag-build=STRING, -bSTRING``) + +* An 8-character representation of the build date (``--tag-date, -d``), as + a postrelease tag + +You can add these tags by adding ``egg_info`` and the desired options to +the command line ahead of the ``sdist`` or ``bdist`` commands that you want +to generate a daily build or snapshot for. See the section below on the +`egg_info`_ command for more details. + +(Also, before you release your project, be sure to see the section above on +`Specifying Your Project's Version`_ for more information about how pre- and +post-release tags affect how setuptools and EasyInstall interpret version +numbers. This is important in order to make sure that dependency processing +tools will know which versions of your project are newer than others.) + +Finally, if you are creating builds frequently, and either building them in a +downloadable location or are copying them to a distribution server, you should +probably also check out the `rotate`_ command, which lets you automatically +delete all but the N most-recently-modified distributions matching a glob +pattern. 
So, you can use a command line like:: + + setup.py egg_info -rbDEV bdist_egg rotate -m.egg -k3 + +to build an egg whose version info includes 'DEV-rNNNN' (where NNNN is the +most recent Subversion revision that affected the source tree), and then +delete any egg files from the distribution directory except for the three +that were built most recently. + +If you have to manage automated builds for multiple packages, each with +different tagging and rotation policies, you may also want to check out the +`alias`_ command, which would let each package define an alias like ``daily`` +that would perform the necessary tag, build, and rotate commands. Then, a +simpler script or cron job could just run ``setup.py daily`` in each project +directory. (And, you could also define sitewide or per-user default versions +of the ``daily`` alias, so that projects that didn't define their own would +use the appropriate defaults.) + + +Generating Source Distributions +------------------------------- + +``setuptools`` enhances the distutils' default algorithm for source file +selection with pluggable endpoints for looking up files to include. If you are +using a revision control system, and your source distributions only need to +include files that you're tracking in revision control, use a corresponding +plugin instead of writing a ``MANIFEST.in`` file. See the section below on +`Adding Support for Revision Control Systems`_ for information on plugins. + +If you need to include automatically generated files, or files that are kept in +an unsupported revision control system, you'll need to create a ``MANIFEST.in`` +file to specify any files that the default file location algorithm doesn't +catch. See the distutils documentation for more information on the format of +the ``MANIFEST.in`` file. + +But, be sure to ignore any part of the distutils documentation that deals with +``MANIFEST`` or how it's generated from ``MANIFEST.in``; setuptools shields you +from these issues and doesn't work the same way in any case. Unlike the +distutils, setuptools regenerates the source distribution manifest file +every time you build a source distribution, and it builds it inside the +project's ``.egg-info`` directory, out of the way of your main project +directory. You therefore need not worry about whether it is up-to-date or not. + +Indeed, because setuptools' approach to determining the contents of a source +distribution is so much simpler, its ``sdist`` command omits nearly all of +the options that the distutils' more complex ``sdist`` process requires. For +all practical purposes, you'll probably use only the ``--formats`` option, if +you use any option at all. + + +Making your package available for EasyInstall +--------------------------------------------- + +If you use the ``register`` command (``setup.py register``) to register your +package with PyPI, that's most of the battle right there. (See the +`docs for the register command`_ for more details.) + +.. _docs for the register command: http://docs.python.org/dist/package-index.html + +If you also use the `upload`_ command to upload actual distributions of your +package, that's even better, because EasyInstall will be able to find and +download them directly from your project's PyPI page. + +However, there may be reasons why you don't want to upload distributions to +PyPI, and just want your existing distributions (or perhaps a Subversion +checkout) to be used instead. + +So here's what you need to do before running the ``register`` command. 
There are three ``setup()`` arguments that affect EasyInstall:

``url`` and ``download_url``
  These become links on your project's PyPI page.  EasyInstall will examine
  them to see if they link to a package ("primary links"), or whether they are
  HTML pages.  If they're HTML pages, EasyInstall scans all HREFs on the
  page for primary links.

``long_description``
  EasyInstall will check any URLs contained in this argument to see if they
  are primary links.

A URL is considered a "primary link" if it is a link to a .tar.gz, .tgz, .zip,
.egg, .egg.zip, .tar.bz2, or .exe file, or if it has an ``#egg=project`` or
``#egg=project-version`` fragment identifier attached to it.  EasyInstall
attempts to determine a project name and optional version number from the text
of a primary link *without* downloading it.  When it has found all the primary
links, EasyInstall will select the best match based on requested version,
platform compatibility, and other criteria.

So, if your ``url`` or ``download_url`` point either directly to a downloadable
source distribution, or to HTML page(s) that have direct links to such, then
EasyInstall will be able to locate downloads automatically.  If you want to
make Subversion checkouts available, then you should create links with either
``#egg=project`` or ``#egg=project-version`` added to the URL.  You should
replace ``project`` and ``version`` with the values they would have in an egg
filename.  (Be sure to actually generate an egg and then use the initial part
of the filename, rather than trying to guess what the escaped form of the
project name and version number will be.)

Note that Subversion checkout links are of lower precedence than other kinds
of distributions, so EasyInstall will not select a Subversion checkout for
downloading unless it has a version included in the ``#egg=`` suffix, and
it's a higher version than EasyInstall has seen in any other links for your
project.

As a result, it's a common practice to mark checkout URLs with a version of
"dev" (i.e., ``#egg=projectname-dev``), so that users can do something like
this::

    easy_install --editable projectname==dev

in order to check out the in-development version of ``projectname``.


Making "Official" (Non-Snapshot) Releases
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

When you make an official release, creating source or binary distributions,
you will need to override the tag settings from ``setup.cfg``, so that you
don't end up registering versions like ``foobar-0.7a1.dev-r34832``.  This is
easy to do if you are developing on the trunk and using tags or branches for
your releases - just make the change to ``setup.cfg`` after branching or
tagging the release, so the trunk will still produce development snapshots.

Alternately, if you are not branching for releases, you can override the
default version options on the command line, using something like::

    python setup.py egg_info -Db "" sdist bdist_egg register upload

The first part of this command (``egg_info -Db ""``) will override the
configured tag information, before creating source and binary eggs, registering
the project with PyPI, and uploading the files.  Thus, these commands will use
the plain version from your ``setup.py``, without adding the build designation
string.
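For reference, the tag settings being overridden here normally live in an
``[egg_info]`` section of ``setup.cfg``; a sketch of such a configuration (the
particular values are only an illustration) might look like::

    [egg_info]
    tag_build = .dev
    tag_date = 1

Running ``egg_info -Db ""`` on the command line, as shown above, overrides
these file-based settings for that invocation only.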
+ +Of course, if you will be doing this a lot, you may wish to create a personal +alias for this operation, e.g.:: + + python setup.py alias -u release egg_info -Db "" + +You can then use it like this:: + + python setup.py release sdist bdist_egg register upload + +Or of course you can create more elaborate aliases that do all of the above. +See the sections below on the `egg_info`_ and `alias`_ commands for more ideas. + + + +Distributing Extensions compiled with Pyrex +------------------------------------------- + +``setuptools`` includes transparent support for building Pyrex extensions, as +long as you define your extensions using ``setuptools.Extension``, *not* +``distutils.Extension``. You must also not import anything from Pyrex in +your setup script. + +If you follow these rules, you can safely list ``.pyx`` files as the source +of your ``Extension`` objects in the setup script. ``setuptools`` will detect +at build time whether Pyrex is installed or not. If it is, then ``setuptools`` +will use it. If not, then ``setuptools`` will silently change the +``Extension`` objects to refer to the ``.c`` counterparts of the ``.pyx`` +files, so that the normal distutils C compilation process will occur. + +Of course, for this to work, your source distributions must include the C +code generated by Pyrex, as well as your original ``.pyx`` files. This means +that you will probably want to include current ``.c`` files in your revision +control system, rebuilding them whenever you check changes in for the ``.pyx`` +source files. This will ensure that people tracking your project in a revision +control system will be able to build it even if they don't have Pyrex +installed, and that your source releases will be similarly usable with or +without Pyrex. + + +----------------- +Command Reference +----------------- + +.. _alias: + +``alias`` - Define shortcuts for commonly used commands +======================================================= + +Sometimes, you need to use the same commands over and over, but you can't +necessarily set them as defaults. For example, if you produce both development +snapshot releases and "stable" releases of a project, you may want to put +the distributions in different places, or use different ``egg_info`` tagging +options, etc. In these cases, it doesn't make sense to set the options in +a distutils configuration file, because the values of the options changed based +on what you're trying to do. + +Setuptools therefore allows you to define "aliases" - shortcut names for +an arbitrary string of commands and options, using ``setup.py alias aliasname +expansion``, where aliasname is the name of the new alias, and the remainder of +the command line supplies its expansion. For example, this command defines +a sitewide alias called "daily", that sets various ``egg_info`` tagging +options:: + + setup.py alias --global-config daily egg_info --tag-build=development + +Once the alias is defined, it can then be used with other setup commands, +e.g.:: + + setup.py daily bdist_egg # generate a daily-build .egg file + setup.py daily sdist # generate a daily-build source distro + setup.py daily sdist bdist_egg # generate both + +The above commands are interpreted as if the word ``daily`` were replaced with +``egg_info --tag-build=development``. + +Note that setuptools will expand each alias *at most once* in a given command +line. This serves two purposes. 
First, if you accidentally create an alias +loop, it will have no effect; you'll instead get an error message about an +unknown command. Second, it allows you to define an alias for a command, that +uses that command. For example, this (project-local) alias:: + + setup.py alias bdist_egg bdist_egg rotate -k1 -m.egg + +redefines the ``bdist_egg`` command so that it always runs the ``rotate`` +command afterwards to delete all but the newest egg file. It doesn't loop +indefinitely on ``bdist_egg`` because the alias is only expanded once when +used. + +You can remove a defined alias with the ``--remove`` (or ``-r``) option, e.g.:: + + setup.py alias --global-config --remove daily + +would delete the "daily" alias we defined above. + +Aliases can be defined on a project-specific, per-user, or sitewide basis. The +default is to define or remove a project-specific alias, but you can use any of +the `configuration file options`_ (listed under the `saveopts`_ command, below) +to determine which distutils configuration file an aliases will be added to +(or removed from). + +Note that if you omit the "expansion" argument to the ``alias`` command, +you'll get output showing that alias' current definition (and what +configuration file it's defined in). If you omit the alias name as well, +you'll get a listing of all current aliases along with their configuration +file locations. + + +``bdist_egg`` - Create a Python Egg for the project +=================================================== + +This command generates a Python Egg (``.egg`` file) for the project. Python +Eggs are the preferred binary distribution format for EasyInstall, because they +are cross-platform (for "pure" packages), directly importable, and contain +project metadata including scripts and information about the project's +dependencies. They can be simply downloaded and added to ``sys.path`` +directly, or they can be placed in a directory on ``sys.path`` and then +automatically discovered by the egg runtime system. + +This command runs the `egg_info`_ command (if it hasn't already run) to update +the project's metadata (``.egg-info``) directory. If you have added any extra +metadata files to the ``.egg-info`` directory, those files will be included in +the new egg file's metadata directory, for use by the egg runtime system or by +any applications or frameworks that use that metadata. + +You won't usually need to specify any special options for this command; just +use ``bdist_egg`` and you're done. But there are a few options that may +be occasionally useful: + +``--dist-dir=DIR, -d DIR`` + Set the directory where the ``.egg`` file will be placed. If you don't + supply this, then the ``--dist-dir`` setting of the ``bdist`` command + will be used, which is usually a directory named ``dist`` in the project + directory. + +``--plat-name=PLATFORM, -p PLATFORM`` + Set the platform name string that will be embedded in the egg's filename + (assuming the egg contains C extensions). This can be used to override + the distutils default platform name with something more meaningful. Keep + in mind, however, that the egg runtime system expects to see eggs with + distutils platform names, so it may ignore or reject eggs with non-standard + platform names. Similarly, the EasyInstall program may ignore them when + searching web pages for download links. However, if you are + cross-compiling or doing some other unusual things, you might find a use + for this option. 
+ +``--exclude-source-files`` + Don't include any modules' ``.py`` files in the egg, just compiled Python, + C, and data files. (Note that this doesn't affect any ``.py`` files in the + EGG-INFO directory or its subdirectories, since for example there may be + scripts with a ``.py`` extension which must still be retained.) We don't + recommend that you use this option except for packages that are being + bundled for proprietary end-user applications, or for "embedded" scenarios + where space is at an absolute premium. On the other hand, if your package + is going to be installed and used in compressed form, you might as well + exclude the source because Python's ``traceback`` module doesn't currently + understand how to display zipped source code anyway, or how to deal with + files that are in a different place from where their code was compiled. + +There are also some options you will probably never need, but which are there +because they were copied from similar ``bdist`` commands used as an example for +creating this one. They may be useful for testing and debugging, however, +which is why we kept them: + +``--keep-temp, -k`` + Keep the contents of the ``--bdist-dir`` tree around after creating the + ``.egg`` file. + +``--bdist-dir=DIR, -b DIR`` + Set the temporary directory for creating the distribution. The entire + contents of this directory are zipped to create the ``.egg`` file, after + running various installation commands to copy the package's modules, data, + and extensions here. + +``--skip-build`` + Skip doing any "build" commands; just go straight to the + install-and-compress phases. + + +.. _develop: + +``develop`` - Deploy the project source in "Development Mode" +============================================================= + +This command allows you to deploy your project's source for use in one or more +"staging areas" where it will be available for importing. This deployment is +done in such a way that changes to the project source are immediately available +in the staging area(s), without needing to run a build or install step after +each change. + +The ``develop`` command works by creating an ``.egg-link`` file (named for the +project) in the given staging area. If the staging area is Python's +``site-packages`` directory, it also updates an ``easy-install.pth`` file so +that the project is on ``sys.path`` by default for all programs run using that +Python installation. + +The ``develop`` command also installs wrapper scripts in the staging area (or +a separate directory, as specified) that will ensure the project's dependencies +are available on ``sys.path`` before running the project's source scripts. +And, it ensures that any missing project dependencies are available in the +staging area, by downloading and installing them if necessary. + +Last, but not least, the ``develop`` command invokes the ``build_ext -i`` +command to ensure any C extensions in the project have been built and are +up-to-date, and the ``egg_info`` command to ensure the project's metadata is +updated (so that the runtime and wrappers know what the project's dependencies +are). If you make any changes to the project's setup script or C extensions, +you should rerun the ``develop`` command against all relevant staging areas to +keep the project's scripts, metadata and extensions up-to-date. Most other +kinds of changes to your project should not require any build operations or +rerunning ``develop``, but keep in mind that even minor changes to the setup +script (e.g. 
changing an entry point definition) require you to re-run the +``develop`` or ``test`` commands to keep the distribution updated. + +Here are some of the options that the ``develop`` command accepts. Note that +they affect the project's dependencies as well as the project itself, so if you +have dependencies that need to be installed and you use ``--exclude-scripts`` +(for example), the dependencies' scripts will not be installed either! For +this reason, you may want to use EasyInstall to install the project's +dependencies before using the ``develop`` command, if you need finer control +over the installation options for dependencies. + +``--uninstall, -u`` + Un-deploy the current project. You may use the ``--install-dir`` or ``-d`` + option to designate the staging area. The created ``.egg-link`` file will + be removed, if present and it is still pointing to the project directory. + The project directory will be removed from ``easy-install.pth`` if the + staging area is Python's ``site-packages`` directory. + + Note that this option currently does *not* uninstall script wrappers! You + must uninstall them yourself, or overwrite them by using EasyInstall to + activate a different version of the package. You can also avoid installing + script wrappers in the first place, if you use the ``--exclude-scripts`` + (aka ``-x``) option when you run ``develop`` to deploy the project. + +``--multi-version, -m`` + "Multi-version" mode. Specifying this option prevents ``develop`` from + adding an ``easy-install.pth`` entry for the project(s) being deployed, and + if an entry for any version of a project already exists, the entry will be + removed upon successful deployment. In multi-version mode, no specific + version of the package is available for importing, unless you use + ``pkg_resources.require()`` to put it on ``sys.path``, or you are running + a wrapper script generated by ``setuptools`` or EasyInstall. (In which + case the wrapper script calls ``require()`` for you.) + + Note that if you install to a directory other than ``site-packages``, + this option is automatically in effect, because ``.pth`` files can only be + used in ``site-packages`` (at least in Python 2.3 and 2.4). So, if you use + the ``--install-dir`` or ``-d`` option (or they are set via configuration + file(s)) your project and its dependencies will be deployed in multi- + version mode. + +``--install-dir=DIR, -d DIR`` + Set the installation directory (staging area). If this option is not + directly specified on the command line or in a distutils configuration + file, the distutils default installation location is used. Normally, this + will be the ``site-packages`` directory, but if you are using distutils + configuration files, setting things like ``prefix`` or ``install_lib``, + then those settings are taken into account when computing the default + staging area. + +``--script-dir=DIR, -s DIR`` + Set the script installation directory. If you don't supply this option + (via the command line or a configuration file), but you *have* supplied + an ``--install-dir`` (via command line or config file), then this option + defaults to the same directory, so that the scripts will be able to find + their associated package installation. Otherwise, this setting defaults + to the location where the distutils would normally install scripts, taking + any distutils configuration file settings into account. + +``--exclude-scripts, -x`` + Don't deploy script wrappers. 
This is useful if you don't want to disturb + existing versions of the scripts in the staging area. + +``--always-copy, -a`` + Copy all needed distributions to the staging area, even if they + are already present in another directory on ``sys.path``. By default, if + a requirement can be met using a distribution that is already available in + a directory on ``sys.path``, it will not be copied to the staging area. + +``--egg-path=DIR`` + Force the generated ``.egg-link`` file to use a specified relative path + to the source directory. This can be useful in circumstances where your + installation directory is being shared by code running under multiple + platforms (e.g. Mac and Windows) which have different absolute locations + for the code under development, but the same *relative* locations with + respect to the installation directory. If you use this option when + installing, you must supply the same relative path when uninstalling. + +In addition to the above options, the ``develop`` command also accepts all of +the same options accepted by ``easy_install``. If you've configured any +``easy_install`` settings in your ``setup.cfg`` (or other distutils config +files), the ``develop`` command will use them as defaults, unless you override +them in a ``[develop]`` section or on the command line. + + +``easy_install`` - Find and install packages +============================================ + +This command runs the `EasyInstall tool +`_ for you. It is exactly +equivalent to running the ``easy_install`` command. All command line arguments +following this command are consumed and not processed further by the distutils, +so this must be the last command listed on the command line. Please see +the EasyInstall documentation for the options reference and usage examples. +Normally, there is no reason to use this command via the command line, as you +can just use ``easy_install`` directly. It's only listed here so that you know +it's a distutils command, which means that you can: + +* create command aliases that use it, +* create distutils extensions that invoke it as a subcommand, and +* configure options for it in your ``setup.cfg`` or other distutils config + files. + + +.. _egg_info: + +``egg_info`` - Create egg metadata and set build tags +===================================================== + +This command performs two operations: it updates a project's ``.egg-info`` +metadata directory (used by the ``bdist_egg``, ``develop``, and ``test`` +commands), and it allows you to temporarily change a project's version string, +to support "daily builds" or "snapshot" releases. It is run automatically by +the ``sdist``, ``bdist_egg``, ``develop``, ``register``, and ``test`` commands +in order to update the project's metadata, but you can also specify it +explicitly in order to temporarily change the project's version string while +executing other commands. (It also generates the``.egg-info/SOURCES.txt`` +manifest file, which is used when you are building source distributions.) + +In addition to writing the core egg metadata defined by ``setuptools`` and +required by ``pkg_resources``, this command can be extended to write other +metadata files as well, by defining entry points in the ``egg_info.writers`` +group. See the section on `Adding new EGG-INFO Files`_ below for more details. +Note that using additional metadata writers may require you to include a +``setup_requires`` argument to ``setup()`` in order to ensure that the desired +writers are available on ``sys.path``. 
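+
+For example, a project relying on a third-party metadata writer might declare
+it like this (``my_metadata_writer`` is a hypothetical plugin name, used here
+purely for illustration)::
+
+    setup(
+        # ...
+        setup_requires=["my_metadata_writer"],
+    )
+
+Setuptools will then fetch the plugin, if it is not already available, before
+the ``egg_info`` command runs, so the plugin's ``egg_info.writers`` entry
+points can be used when the metadata is written.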
+ + +Release Tagging Options +----------------------- + +The following options can be used to modify the project's version string for +all remaining commands on the setup command line. The options are processed +in the order shown, so if you use more than one, the requested tags will be +added in the following order: + +``--tag-build=NAME, -b NAME`` + Append NAME to the project's version string. Due to the way setuptools + processes "pre-release" version suffixes beginning with the letters "a" + through "e" (like "alpha", "beta", and "candidate"), you will usually want + to use a tag like ".build" or ".dev", as this will cause the version number + to be considered *lower* than the project's default version. (If you + want to make the version number *higher* than the default version, you can + always leave off --tag-build and then use one or both of the following + options.) + + If you have a default build tag set in your ``setup.cfg``, you can suppress + it on the command line using ``-b ""`` or ``--tag-build=""`` as an argument + to the ``egg_info`` command. + +``--tag-date, -d`` + Add a date stamp of the form "-YYYYMMDD" (e.g. "-20050528") to the + project's version number. + +``--no-date, -D`` + Don't include a date stamp in the version number. This option is included + so you can override a default setting in ``setup.cfg``. + + +(Note: Because these options modify the version number used for source and +binary distributions of your project, you should first make sure that you know +how the resulting version numbers will be interpreted by automated tools +like EasyInstall. See the section above on `Specifying Your Project's +Version`_ for an explanation of pre- and post-release tags, as well as tips on +how to choose and verify a versioning scheme for your your project.) + +For advanced uses, there is one other option that can be set, to change the +location of the project's ``.egg-info`` directory. Commands that need to find +the project's source directory or metadata should get it from this setting: + + +Other ``egg_info`` Options +-------------------------- + +``--egg-base=SOURCEDIR, -e SOURCEDIR`` + Specify the directory that should contain the .egg-info directory. This + should normally be the root of your project's source tree (which is not + necessarily the same as your project directory; some projects use a ``src`` + or ``lib`` subdirectory as the source root). You should not normally need + to specify this directory, as it is normally determined from the + ``package_dir`` argument to the ``setup()`` function, if any. If there is + no ``package_dir`` set, this option defaults to the current directory. + + +``egg_info`` Examples +--------------------- + +Creating a dated "nightly build" snapshot egg:: + + python setup.py egg_info --tag-date --tag-build=DEV bdist_egg + +Creating and uploading a release with no version tags, even if some default +tags are specified in ``setup.cfg``:: + + python setup.py egg_info -RDb "" sdist bdist_egg register upload + +(Notice that ``egg_info`` must always appear on the command line *before* any +commands that you want the version changes to apply to.) + + +.. _install command: + +``install`` - Run ``easy_install`` or old-style installation +============================================================ + +The setuptools ``install`` command is basically a shortcut to run the +``easy_install`` command on the current project. 
However, for convenience +in creating "system packages" of setuptools-based projects, you can also +use this option: + +``--single-version-externally-managed`` + This boolean option tells the ``install`` command to perform an "old style" + installation, with the addition of an ``.egg-info`` directory so that the + installed project will still have its metadata available and operate + normally. If you use this option, you *must* also specify the ``--root`` + or ``--record`` options (or both), because otherwise you will have no way + to identify and remove the installed files. + +This option is automatically in effect when ``install`` is invoked by another +distutils command, so that commands like ``bdist_wininst`` and ``bdist_rpm`` +will create system packages of eggs. It is also automatically in effect if +you specify the ``--root`` option. + + +``install_egg_info`` - Install an ``.egg-info`` directory in ``site-packages`` +============================================================================== + +Setuptools runs this command as part of ``install`` operations that use the +``--single-version-externally-managed`` options. You should not invoke it +directly; it is documented here for completeness and so that distutils +extensions such as system package builders can make use of it. This command +has only one option: + +``--install-dir=DIR, -d DIR`` + The parent directory where the ``.egg-info`` directory will be placed. + Defaults to the same as the ``--install-dir`` option specified for the + ``install_lib`` command, which is usually the system ``site-packages`` + directory. + +This command assumes that the ``egg_info`` command has been given valid options +via the command line or ``setup.cfg``, as it will invoke the ``egg_info`` +command and use its options to locate the project's source ``.egg-info`` +directory. + + +.. _rotate: + +``rotate`` - Delete outdated distribution files +=============================================== + +As you develop new versions of your project, your distribution (``dist``) +directory will gradually fill up with older source and/or binary distribution +files. The ``rotate`` command lets you automatically clean these up, keeping +only the N most-recently modified files matching a given pattern. + +``--match=PATTERNLIST, -m PATTERNLIST`` + Comma-separated list of glob patterns to match. This option is *required*. + The project name and ``-*`` is prepended to the supplied patterns, in order + to match only distributions belonging to the current project (in case you + have a shared distribution directory for multiple projects). Typically, + you will use a glob pattern like ``.zip`` or ``.egg`` to match files of + the specified type. Note that each supplied pattern is treated as a + distinct group of files for purposes of selecting files to delete. + +``--keep=COUNT, -k COUNT`` + Number of matching distributions to keep. For each group of files + identified by a pattern specified with the ``--match`` option, delete all + but the COUNT most-recently-modified files in that group. This option is + *required*. + +``--dist-dir=DIR, -d DIR`` + Directory where the distributions are. This defaults to the value of the + ``bdist`` command's ``--dist-dir`` option, which will usually be the + project's ``dist`` subdirectory. 
+ +**Example 1**: Delete all .tar.gz files from the distribution directory, except +for the 3 most recently modified ones:: + + setup.py rotate --match=.tar.gz --keep=3 + +**Example 2**: Delete all Python 2.3 or Python 2.4 eggs from the distribution +directory, except the most recently modified one for each Python version:: + + setup.py rotate --match=-py2.3*.egg,-py2.4*.egg --keep=1 + + +.. _saveopts: + +``saveopts`` - Save used options to a configuration file +======================================================== + +Finding and editing ``distutils`` configuration files can be a pain, especially +since you also have to translate the configuration options from command-line +form to the proper configuration file format. You can avoid these hassles by +using the ``saveopts`` command. Just add it to the command line to save the +options you used. For example, this command builds the project using +the ``mingw32`` C compiler, then saves the --compiler setting as the default +for future builds (even those run implicitly by the ``install`` command):: + + setup.py build --compiler=mingw32 saveopts + +The ``saveopts`` command saves all options for every command specified on the +command line to the project's local ``setup.cfg`` file, unless you use one of +the `configuration file options`_ to change where the options are saved. For +example, this command does the same as above, but saves the compiler setting +to the site-wide (global) distutils configuration:: + + setup.py build --compiler=mingw32 saveopts -g + +Note that it doesn't matter where you place the ``saveopts`` command on the +command line; it will still save all the options specified for all commands. +For example, this is another valid way to spell the last example:: + + setup.py saveopts -g build --compiler=mingw32 + +Note, however, that all of the commands specified are always run, regardless of +where ``saveopts`` is placed on the command line. + + +Configuration File Options +-------------------------- + +Normally, settings such as options and aliases are saved to the project's +local ``setup.cfg`` file. But you can override this and save them to the +global or per-user configuration files, or to a manually-specified filename. + +``--global-config, -g`` + Save settings to the global ``distutils.cfg`` file inside the ``distutils`` + package directory. You must have write access to that directory to use + this option. You also can't combine this option with ``-u`` or ``-f``. + +``--user-config, -u`` + Save settings to the current user's ``~/.pydistutils.cfg`` (POSIX) or + ``$HOME/pydistutils.cfg`` (Windows) file. You can't combine this option + with ``-g`` or ``-f``. + +``--filename=FILENAME, -f FILENAME`` + Save settings to the specified configuration file to use. You can't + combine this option with ``-g`` or ``-u``. Note that if you specify a + non-standard filename, the ``distutils`` and ``setuptools`` will not + use the file's contents. This option is mainly included for use in + testing. + +These options are used by other ``setuptools`` commands that modify +configuration files, such as the `alias`_ and `setopt`_ commands. + + +.. _setopt: + +``setopt`` - Set a distutils or setuptools option in a config file +================================================================== + +This command is mainly for use by scripts, but it can also be used as a quick +and dirty way to change a distutils configuration option without having to +remember what file the options are in and then open an editor. + +**Example 1**. 
Set the default C compiler to ``mingw32`` (using long option +names):: + + setup.py setopt --command=build --option=compiler --set-value=mingw32 + +**Example 2**. Remove any setting for the distutils default package +installation directory (short option names):: + + setup.py setopt -c install -o install_lib -r + + +Options for the ``setopt`` command: + +``--command=COMMAND, -c COMMAND`` + Command to set the option for. This option is required. + +``--option=OPTION, -o OPTION`` + The name of the option to set. This option is required. + +``--set-value=VALUE, -s VALUE`` + The value to set the option to. Not needed if ``-r`` or ``--remove`` is + set. + +``--remove, -r`` + Remove (unset) the option, instead of setting it. + +In addition to the above options, you may use any of the `configuration file +options`_ (listed under the `saveopts`_ command, above) to determine which +distutils configuration file the option will be added to (or removed from). + + +.. _test: + +``test`` - Build package and run a unittest suite +================================================= + +When doing test-driven development, or running automated builds that need +testing before they are deployed for downloading or use, it's often useful +to be able to run a project's unit tests without actually deploying the project +anywhere, even using the ``develop`` command. The ``test`` command runs a +project's unit tests without actually deploying it, by temporarily putting the +project's source on ``sys.path``, after first running ``build_ext -i`` and +``egg_info`` to ensure that any C extensions and project metadata are +up-to-date. + +To use this command, your project's tests must be wrapped in a ``unittest`` +test suite by either a function, a ``TestCase`` class or method, or a module +or package containing ``TestCase`` classes. If the named suite is a module, +and the module has an ``additional_tests()`` function, it is called and the +result (which must be a ``unittest.TestSuite``) is added to the tests to be +run. If the named suite is a package, any submodules and subpackages are +recursively added to the overall test suite. (Note: if your project specifies +a ``test_loader``, the rules for processing the chosen ``test_suite`` may +differ; see the `test_loader`_ documentation for more details.) + +Note that many test systems including ``doctest`` support wrapping their +non-``unittest`` tests in ``TestSuite`` objects. So, if you are using a test +package that does not support this, we suggest you encourage its developers to +implement test suite support, as this is a convenient and standard way to +aggregate a collection of tests to be run under a common test harness. + +By default, tests will be run in the "verbose" mode of the ``unittest`` +package's text test runner, but you can get the "quiet" mode (just dots) if +you supply the ``-q`` or ``--quiet`` option, either as a global option to +the setup script (e.g. ``setup.py -q test``) or as an option for the ``test`` +command itself (e.g. ``setup.py test -q``). There is one other option +available: + +``--test-suite=NAME, -s NAME`` + Specify the test suite (or module, class, or method) to be run + (e.g. ``some_module.test_suite``). The default for this option can be + set by giving a ``test_suite`` argument to the ``setup()`` function, e.g.:: + + setup( + # ... + test_suite="my_package.tests.test_all" + ) + + If you did not set a ``test_suite`` in your ``setup()`` call, and do not + provide a ``--test-suite`` option, an error will occur. + + +.. 
_upload: + +``upload`` - Upload source and/or egg distributions to PyPI +=========================================================== + +The ``upload`` command is implemented and `documented +`_ +in distutils. + +Setuptools augments the ``upload`` command with support +for `keyring `_, +allowing the password to be stored in a secure +location and not in plaintext in the .pypirc file. To use +keyring, first install keyring and set the password for +the relevant repository, e.g.:: + + python -m keyring set + Password for '' in '': ******** + +Then, in .pypirc, set the repository configuration as normal, +but omit the password. Thereafter, uploads will use the +password from the keyring. + +New in 20.1: Added keyring support. + + +----------------------------------------- +Configuring setup() using setup.cfg files +----------------------------------------- + +``Setuptools`` allows using configuration files (usually `setup.cfg`) +to define package’s metadata and other options which are normally supplied +to ``setup()`` function. + +This approach not only allows automation scenarios, but also reduces +boilerplate code in some cases. + +.. note:: + Implementation presents limited compatibility with distutils2-like + ``setup.cfg`` sections (used by ``pbr`` and ``d2to1`` packages). + + Namely: only metadata related keys from ``metadata`` section are supported + (except for ``description-file``); keys from ``files``, ``entry_points`` + and ``backwards_compat`` are not supported. + + +.. code-block:: ini + + [metadata] + name = my_package + version = attr: src.VERSION + description = My package description + long_description = file: README.rst + keywords = one, two + license = BSD 3-Clause License + classifiers = + Framework :: Django + Programming Language :: Python :: 3 + Programming Language :: Python :: 3.5 + + [options] + zip_safe = False + include_package_data = True + packages = find: + scripts = + bin/first.py + bin/second.py + + [options.package_data] + * = *.txt, *.rst + hello = *.msg + + [options.extras_require] + pdf = ReportLab>=1.2; RXP + rest = docutils>=0.3; pack ==1.1, ==1.3 + + [options.packages.find] + exclude = + src.subpackage1 + src.subpackage2 + + +Metadata and options could be set in sections with the same names. + +* Keys are the same as keyword arguments one provides to ``setup()`` function. + +* Complex values could be placed comma-separated or one per line + in *dangling* sections. The following are the same: + + .. code-block:: ini + + [metadata] + keywords = one, two + + [metadata] + keywords = + one + two + +* In some cases complex values could be provided in subsections for clarity. + +* Some keys allow ``file:``, ``attr:`` and ``find:`` directives to cover + common usecases. + +* Unknown keys are ignored. + + +Specifying values +================= + +Some values are treated as simple strings, some allow more logic. + +Type names used below: + +* ``str`` - simple string +* ``list-comma`` - dangling list or comma-separated values string +* ``list-semi`` - dangling list or semicolon-separated values string +* ``bool`` - ``True`` is 1, yes, true +* ``dict`` - list-comma where keys from values are separated by = +* ``section`` - values could be read from a dedicated (sub)section + + +Special directives: + +* ``attr:`` - value could be read from module attribute +* ``file:`` - value could be read from a file + + +.. note:: + ``file:`` directive is sandboxed and won't reach anything outside + directory with ``setup.py``. + + +Metadata +-------- + +.. 
note:: + Aliases given below are supported for compatibility reasons, + but not advised. + +================= ================= ===== +Key Aliases Accepted value type +================= ================= ===== +name str +version attr:, str +url home-page str +download_url download-url str +author str +author_email author-email str +maintainer str +maintainer_email maintainer-email str +classifiers classifier file:, list-comma +license file:, str +description summary file:, str +long_description long-description file:, str +keywords list-comma +platforms platform list-comma +provides list-comma +requires list-comma +obsoletes list-comma +================= ================= ===== + +.. note:: + + **version** - ``attr:`` supports callables; supports iterables; + unsupported types are casted using ``str()``. + + +Options +------- + +======================= ===== +Key Accepted value type +======================= ===== +zip_safe bool +setup_requires list-semi +install_requires list-semi +extras_require section +entry_points file:, section +use_2to3 bool +use_2to3_fixers list-comma +use_2to3_exclude_fixers list-comma +convert_2to3_doctests list-comma +scripts list-comma +eager_resources list-comma +dependency_links list-comma +tests_require list-semi +include_package_data bool +packages find:, list-comma +package_dir dict +package_data section +exclude_package_data section +namespace_packages list-comma +======================= ===== + +.. note:: + + **packages** - ``find:`` directive can be further configured + in a dedicated subsection `options.packages.find`. This subsection + accepts the same keys as `setuptools.find` function: + `where`, `include`, `exclude`. + + +Configuration API +================= + +Some automation tools may wish to access data from a configuration file. + +``Setuptools`` exposes ``read_configuration()`` function allowing +parsing ``metadata`` and ``options`` sections into a dictionary. + + +.. code-block:: python + + from setuptools.config import read_configuration + + conf_dict = read_configuration('/home/user/dev/package/setup.cfg') + + +By default ``read_configuration()`` will read only file provided +in the first argument. To include values from other configuration files +which could be in various places set `find_others` function argument +to ``True``. + +If you have only a configuration file but not the whole package you can still +try to get data out of it with the help of `ignore_option_errors` function +argument. When it is set to ``True`` all options with errors possibly produced +by directives, such as ``attr:`` and others will be silently ignored. +As a consequence the resulting dictionary will include no such options. + + +-------------------------------- +Extending and Reusing Setuptools +-------------------------------- + +Creating ``distutils`` Extensions +================================= + +It can be hard to add new commands or setup arguments to the distutils. But +the ``setuptools`` package makes it a bit easier, by allowing you to distribute +a distutils extension as a separate project, and then have projects that need +the extension just refer to it in their ``setup_requires`` argument. + +With ``setuptools``, your distutils extension projects can hook in new +commands and ``setup()`` arguments just by defining "entry points". These +are mappings from command or argument names to a specification of where to +import a handler from. (See the section on `Dynamic Discovery of Services and +Plugins`_ above for some more background on entry points.) 
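+
+The handlers that these entry points refer to are ordinary importable objects.
+For a new command, the handler is a ``setuptools.Command`` subclass; here is a
+minimal sketch of one (it matches the ``foo`` name used in the entry point
+example in the next section, and the behavior is purely illustrative)::
+
+    from distutils import log
+    from setuptools import Command
+
+    class foo(Command):
+        """Hypothetical command that just logs a message."""
+
+        description = "demonstrate a custom distutils command"
+        user_options = []   # no command-line options of its own
+
+        def initialize_options(self):
+            pass
+
+        def finalize_options(self):
+            pass
+
+        def run(self):
+            log.info("running the hypothetical foo command")
+
+The next section shows how an entry point makes a class like this available to
+any setuptools-based setup script as ``setup.py foo``.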
+ + +Adding Commands +--------------- + +You can add new ``setup`` commands by defining entry points in the +``distutils.commands`` group. For example, if you wanted to add a ``foo`` +command, you might add something like this to your distutils extension +project's setup script:: + + setup( + # ... + entry_points={ + "distutils.commands": [ + "foo = mypackage.some_module:foo", + ], + }, + ) + +(Assuming, of course, that the ``foo`` class in ``mypackage.some_module`` is +a ``setuptools.Command`` subclass.) + +Once a project containing such entry points has been activated on ``sys.path``, +(e.g. by running "install" or "develop" with a site-packages installation +directory) the command(s) will be available to any ``setuptools``-based setup +scripts. It is not necessary to use the ``--command-packages`` option or +to monkeypatch the ``distutils.command`` package to install your commands; +``setuptools`` automatically adds a wrapper to the distutils to search for +entry points in the active distributions on ``sys.path``. In fact, this is +how setuptools' own commands are installed: the setuptools project's setup +script defines entry points for them! + + +Adding ``setup()`` Arguments +---------------------------- + +Sometimes, your commands may need additional arguments to the ``setup()`` +call. You can enable this by defining entry points in the +``distutils.setup_keywords`` group. For example, if you wanted a ``setup()`` +argument called ``bar_baz``, you might add something like this to your +distutils extension project's setup script:: + + setup( + # ... + entry_points={ + "distutils.commands": [ + "foo = mypackage.some_module:foo", + ], + "distutils.setup_keywords": [ + "bar_baz = mypackage.some_module:validate_bar_baz", + ], + }, + ) + +The idea here is that the entry point defines a function that will be called +to validate the ``setup()`` argument, if it's supplied. The ``Distribution`` +object will have the initial value of the attribute set to ``None``, and the +validation function will only be called if the ``setup()`` call sets it to +a non-None value. Here's an example validation function:: + + def assert_bool(dist, attr, value): + """Verify that value is True, False, 0, or 1""" + if bool(value) != value: + raise DistutilsSetupError( + "%r must be a boolean value (got %r)" % (attr,value) + ) + +Your function should accept three arguments: the ``Distribution`` object, +the attribute name, and the attribute value. It should raise a +``DistutilsSetupError`` (from the ``distutils.errors`` module) if the argument +is invalid. Remember, your function will only be called with non-None values, +and the default value of arguments defined this way is always None. So, your +commands should always be prepared for the possibility that the attribute will +be ``None`` when they access it later. + +If more than one active distribution defines an entry point for the same +``setup()`` argument, *all* of them will be called. This allows multiple +distutils extensions to define a common argument, as long as they agree on +what values of that argument are valid. + +Also note that as with commands, it is not necessary to subclass or monkeypatch +the distutils ``Distribution`` class in order to add your arguments; it is +sufficient to define the entry points in your extension, as long as any setup +script using your extension lists your project in its ``setup_requires`` +argument. 
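+
+On the consuming side, a command (or any other code with access to the
+``Distribution`` object) should read the argument defensively, since it may
+never have been supplied. Here is a sketch of such a ``run()`` method
+(illustrative only), reusing the ``bar_baz`` argument defined above::
+
+    def run(self):
+        # ``bar_baz`` defaults to None; the validation hook registered in
+        # ``distutils.setup_keywords`` only runs for non-None values.
+        value = getattr(self.distribution, "bar_baz", None)
+        if value is None:
+            return  # the setup script didn't use the argument
+        self.announce("bar_baz is %r" % (value,))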
+ + +Adding new EGG-INFO Files +------------------------- + +Some extensible applications or frameworks may want to allow third parties to +develop plugins with application or framework-specific metadata included in +the plugins' EGG-INFO directory, for easy access via the ``pkg_resources`` +metadata API. The easiest way to allow this is to create a distutils extension +to be used from the plugin projects' setup scripts (via ``setup_requires``) +that defines a new setup keyword, and then uses that data to write an EGG-INFO +file when the ``egg_info`` command is run. + +The ``egg_info`` command looks for extension points in an ``egg_info.writers`` +group, and calls them to write the files. Here's a simple example of a +distutils extension defining a setup argument ``foo_bar``, which is a list of +lines that will be written to ``foo_bar.txt`` in the EGG-INFO directory of any +project that uses the argument:: + + setup( + # ... + entry_points={ + "distutils.setup_keywords": [ + "foo_bar = setuptools.dist:assert_string_list", + ], + "egg_info.writers": [ + "foo_bar.txt = setuptools.command.egg_info:write_arg", + ], + }, + ) + +This simple example makes use of two utility functions defined by setuptools +for its own use: a routine to validate that a setup keyword is a sequence of +strings, and another one that looks up a setup argument and writes it to +a file. Here's what the writer utility looks like:: + + def write_arg(cmd, basename, filename): + argname = os.path.splitext(basename)[0] + value = getattr(cmd.distribution, argname, None) + if value is not None: + value = '\n'.join(value) + '\n' + cmd.write_or_delete_file(argname, filename, value) + +As you can see, ``egg_info.writers`` entry points must be a function taking +three arguments: a ``egg_info`` command instance, the basename of the file to +write (e.g. ``foo_bar.txt``), and the actual full filename that should be +written to. + +In general, writer functions should honor the command object's ``dry_run`` +setting when writing files, and use the ``distutils.log`` object to do any +console output. The easiest way to conform to this requirement is to use +the ``cmd`` object's ``write_file()``, ``delete_file()``, and +``write_or_delete_file()`` methods exclusively for your file operations. See +those methods' docstrings for more details. + + +Adding Support for Revision Control Systems +------------------------------------------------- + +If the files you want to include in the source distribution are tracked using +Git, Mercurial or SVN, you can use the following packages to achieve that: + +- Git and Mercurial: `setuptools_scm `_ +- SVN: `setuptools_svn `_ + +If you would like to create a plugin for ``setuptools`` to find files tracked +by another revision control system, you can do so by adding an entry point to +the ``setuptools.file_finders`` group. The entry point should be a function +accepting a single directory name, and should yield all the filenames within +that directory (and any subdirectories thereof) that are under revision +control. + +For example, if you were going to create a plugin for a revision control system +called "foobar", you would write a function something like this: + +.. 
code-block:: python + + def find_files_for_foobar(dirname): + # loop to yield paths that start with `dirname` + +And you would register it in a setup script using something like this:: + + entry_points={ + "setuptools.file_finders": [ + "foobar = my_foobar_module:find_files_for_foobar", + ] + } + +Then, anyone who wants to use your plugin can simply install it, and their +local setuptools installation will be able to find the necessary files. + +It is not necessary to distribute source control plugins with projects that +simply use the other source control system, or to specify the plugins in +``setup_requires``. When you create a source distribution with the ``sdist`` +command, setuptools automatically records what files were found in the +``SOURCES.txt`` file. That way, recipients of source distributions don't need +to have revision control at all. However, if someone is working on a package +by checking out with that system, they will need the same plugin(s) that the +original author is using. + +A few important points for writing revision control file finders: + +* Your finder function MUST return relative paths, created by appending to the + passed-in directory name. Absolute paths are NOT allowed, nor are relative + paths that reference a parent directory of the passed-in directory. + +* Your finder function MUST accept an empty string as the directory name, + meaning the current directory. You MUST NOT convert this to a dot; just + yield relative paths. So, yielding a subdirectory named ``some/dir`` under + the current directory should NOT be rendered as ``./some/dir`` or + ``/somewhere/some/dir``, but *always* as simply ``some/dir`` + +* Your finder function SHOULD NOT raise any errors, and SHOULD deal gracefully + with the absence of needed programs (i.e., ones belonging to the revision + control system itself. It *may*, however, use ``distutils.log.warn()`` to + inform the user of the missing program(s). + + +Subclassing ``Command`` +----------------------- + +Sorry, this section isn't written yet, and neither is a lot of what's below +this point. + +XXX + + +Reusing ``setuptools`` Code +=========================== + +``ez_setup`` +------------ + +XXX + + +``setuptools.archive_util`` +--------------------------- + +XXX + + +``setuptools.sandbox`` +---------------------- + +XXX + + +``setuptools.package_index`` +---------------------------- + +XXX + + +Mailing List and Bug Tracker +============================ + +Please use the `distutils-sig mailing list`_ for questions and discussion about +setuptools, and the `setuptools bug tracker`_ ONLY for issues you have +confirmed via the list are actual bugs, and which you have reduced to a minimal +set of steps to reproduce. + +.. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/ +.. _setuptools bug tracker: https://github.com/pypa/setuptools/ diff --git a/ez_setup.py b/ez_setup.py deleted file mode 100755 index 7aab12d..0000000 --- a/ez_setup.py +++ /dev/null @@ -1,280 +0,0 @@ -#!python -"""Bootstrap setuptools installation - -If you want to use setuptools in your package's setup.py, just include this -file in the same directory with it, and add this to the top of your setup.py:: - - from ez_setup import use_setuptools - use_setuptools() - -If you want to require a specific version of setuptools, set a download -mirror, or use an alternate download directory, you can do so by supplying -the appropriate options to ``use_setuptools()``. - -This file can also be run as a script to install or upgrade setuptools. 
-""" -import sys -DEFAULT_VERSION = "0.6c11" -DEFAULT_URL = "http://pypi.python.org/packages/%s/s/setuptools/" % sys.version[:3] - -md5_data = { - 'setuptools-0.6b1-py2.3.egg': '8822caf901250d848b996b7f25c6e6ca', - 'setuptools-0.6b1-py2.4.egg': 'b79a8a403e4502fbb85ee3f1941735cb', - 'setuptools-0.6b2-py2.3.egg': '5657759d8a6d8fc44070a9d07272d99b', - 'setuptools-0.6b2-py2.4.egg': '4996a8d169d2be661fa32a6e52e4f82a', - 'setuptools-0.6b3-py2.3.egg': 'bb31c0fc7399a63579975cad9f5a0618', - 'setuptools-0.6b3-py2.4.egg': '38a8c6b3d6ecd22247f179f7da669fac', - 'setuptools-0.6b4-py2.3.egg': '62045a24ed4e1ebc77fe039aa4e6f7e5', - 'setuptools-0.6b4-py2.4.egg': '4cb2a185d228dacffb2d17f103b3b1c4', - 'setuptools-0.6c1-py2.3.egg': 'b3f2b5539d65cb7f74ad79127f1a908c', - 'setuptools-0.6c1-py2.4.egg': 'b45adeda0667d2d2ffe14009364f2a4b', - 'setuptools-0.6c10-py2.3.egg': 'ce1e2ab5d3a0256456d9fc13800a7090', - 'setuptools-0.6c10-py2.4.egg': '57d6d9d6e9b80772c59a53a8433a5dd4', - 'setuptools-0.6c10-py2.5.egg': 'de46ac8b1c97c895572e5e8596aeb8c7', - 'setuptools-0.6c10-py2.6.egg': '58ea40aef06da02ce641495523a0b7f5', - 'setuptools-0.6c2-py2.3.egg': 'f0064bf6aa2b7d0f3ba0b43f20817c27', - 'setuptools-0.6c2-py2.4.egg': '616192eec35f47e8ea16cd6a122b7277', - 'setuptools-0.6c3-py2.3.egg': 'f181fa125dfe85a259c9cd6f1d7b78fa', - 'setuptools-0.6c3-py2.4.egg': 'e0ed74682c998bfb73bf803a50e7b71e', - 'setuptools-0.6c3-py2.5.egg': 'abef16fdd61955514841c7c6bd98965e', - 'setuptools-0.6c4-py2.3.egg': 'b0b9131acab32022bfac7f44c5d7971f', - 'setuptools-0.6c4-py2.4.egg': '2a1f9656d4fbf3c97bf946c0a124e6e2', - 'setuptools-0.6c4-py2.5.egg': '8f5a052e32cdb9c72bcf4b5526f28afc', - 'setuptools-0.6c5-py2.3.egg': 'ee9fd80965da04f2f3e6b3576e9d8167', - 'setuptools-0.6c5-py2.4.egg': 'afe2adf1c01701ee841761f5bcd8aa64', - 'setuptools-0.6c5-py2.5.egg': 'a8d3f61494ccaa8714dfed37bccd3d5d', - 'setuptools-0.6c6-py2.3.egg': '35686b78116a668847237b69d549ec20', - 'setuptools-0.6c6-py2.4.egg': '3c56af57be3225019260a644430065ab', - 'setuptools-0.6c6-py2.5.egg': 'b2f8a7520709a5b34f80946de5f02f53', - 'setuptools-0.6c7-py2.3.egg': '209fdf9adc3a615e5115b725658e13e2', - 'setuptools-0.6c7-py2.4.egg': '5a8f954807d46a0fb67cf1f26c55a82e', - 'setuptools-0.6c7-py2.5.egg': '45d2ad28f9750e7434111fde831e8372', - 'setuptools-0.6c8-py2.3.egg': '50759d29b349db8cfd807ba8303f1902', - 'setuptools-0.6c8-py2.4.egg': 'cba38d74f7d483c06e9daa6070cce6de', - 'setuptools-0.6c8-py2.5.egg': '1721747ee329dc150590a58b3e1ac95b', - 'setuptools-0.6c9-py2.3.egg': 'a83c4020414807b496e4cfbe08507c03', - 'setuptools-0.6c9-py2.4.egg': '260a2be2e5388d66bdaee06abec6342a', - 'setuptools-0.6c9-py2.5.egg': 'fe67c3e5a17b12c0e7c541b7ea43a8e6', - 'setuptools-0.6c9-py2.6.egg': 'ca37b1ff16fa2ede6e19383e7b59245a', -} - -import sys, os -try: from hashlib import md5 -except ImportError: from md5 import md5 - -def _validate_md5(egg_name, data): - if egg_name in md5_data: - digest = md5(data).hexdigest() - if digest != md5_data[egg_name]: - print >>sys.stderr, ( - "md5 validation of %s failed! (Possible download problem?)" - % egg_name - ) - sys.exit(2) - return data - -def use_setuptools( - version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir, - download_delay=15 -): - """Automatically find/download setuptools and make it available on sys.path - - `version` should be a valid setuptools version number that is available - as an egg for download under the `download_base` URL (which should end with - a '/'). `to_dir` is the directory where setuptools will be downloaded, if - it is not already available. 
If `download_delay` is specified, it should - be the number of seconds that will be paused before initiating a download, - should one be required. If an older version of setuptools is installed, - this routine will print a message to ``sys.stderr`` and raise SystemExit in - an attempt to abort the calling script. - """ - was_imported = 'pkg_resources' in sys.modules or 'setuptools' in sys.modules - def do_download(): - egg = download_setuptools(version, download_base, to_dir, download_delay) - sys.path.insert(0, egg) - import setuptools; setuptools.bootstrap_install_from = egg - try: - import pkg_resources - except ImportError: - return do_download() - try: - pkg_resources.require("setuptools>="+version); return - except pkg_resources.VersionConflict, e: - if was_imported: - print >>sys.stderr, ( - "The required version of setuptools (>=%s) is not available, and\n" - "can't be installed while this script is running. Please install\n" - " a more recent version first, using 'easy_install -U setuptools'." - "\n\n(Currently using %r)" - ) % (version, e.args[0]) - sys.exit(2) - else: - del pkg_resources, sys.modules['pkg_resources'] # reload ok - return do_download() - except pkg_resources.DistributionNotFound: - return do_download() - -def download_setuptools( - version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir, - delay = 15 -): - """Download setuptools from a specified location and return its filename - - `version` should be a valid setuptools version number that is available - as an egg for download under the `download_base` URL (which should end - with a '/'). `to_dir` is the directory where the egg will be downloaded. - `delay` is the number of seconds to pause before an actual download attempt. - """ - import urllib2, shutil - egg_name = "setuptools-%s-py%s.egg" % (version,sys.version[:3]) - url = download_base + egg_name - saveto = os.path.join(to_dir, egg_name) - src = dst = None - if not os.path.exists(saveto): # Avoid repeated downloads - try: - from distutils import log - if delay: - log.warn(""" ---------------------------------------------------------------------------- -This script requires setuptools version %s to run (even to display -help). I will attempt to download it for you (from -%s), but -you may need to enable firewall access for this script first. -I will start the download in %d seconds. - -(Note: if this machine does not have network access, please obtain the file - - %s - -and place it in this directory before rerunning this script.) ----------------------------------------------------------------------------""", - version, download_base, delay, url - ); from time import sleep; sleep(delay) - log.warn("Downloading %s", url) - src = urllib2.urlopen(url) - # Read/write all in one block, so we don't create a corrupt file - # if the download is interrupted. 
- data = _validate_md5(egg_name, src.read()) - dst = open(saveto,"wb"); dst.write(data) - finally: - if src: src.close() - if dst: dst.close() - return os.path.realpath(saveto) - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -def main(argv, version=DEFAULT_VERSION): - """Install or upgrade setuptools and EasyInstall""" - try: - import setuptools - except ImportError: - egg = None - try: - egg = download_setuptools(version, delay=0) - sys.path.insert(0,egg) - from setuptools.command.easy_install import main - return main(list(argv)+[egg]) # we're done here - finally: - if egg and os.path.exists(egg): - os.unlink(egg) - else: - if setuptools.__version__ == '0.0.1': - print >>sys.stderr, ( - "You have an obsolete version of setuptools installed. Please\n" - "remove it from your system entirely before rerunning this script." - ) - sys.exit(2) - - req = "setuptools>="+version - import pkg_resources - try: - pkg_resources.require(req) - except pkg_resources.VersionConflict: - try: - from setuptools.command.easy_install import main - except ImportError: - from easy_install import main - main(list(argv)+[download_setuptools(delay=0)]) - sys.exit(0) # try to force an exit - else: - if argv: - from setuptools.command.easy_install import main - main(argv) - else: - print "Setuptools version",version,"or greater has been installed." - print '(Run "ez_setup.py -U setuptools" to reinstall or upgrade.)' - -def update_md5(filenames): - """Update our built-in md5 registry""" - - import re - - for name in filenames: - base = os.path.basename(name) - f = open(name,'rb') - md5_data[base] = md5(f.read()).hexdigest() - f.close() - - data = [" %r: %r,\n" % it for it in md5_data.items()] - data.sort() - repl = "".join(data) - - import inspect - srcfile = inspect.getsourcefile(sys.modules[__name__]) - f = open(srcfile, 'rb'); src = f.read(); f.close() - - match = re.search("\nmd5_data = {\n([^}]+)}", src) - if not match: - print >>sys.stderr, "Internal error!" - sys.exit(2) - - src = src[:match.start(1)] + repl + src[match.end(1):] - f = open(srcfile,'w') - f.write(src) - f.close() - - -if __name__=='__main__': - if len(sys.argv)>2 and sys.argv[1]=='--md5update': - update_md5(sys.argv[2:]) - else: - main(sys.argv[1:]) - - - - - - diff --git a/launcher.c b/launcher.c index 201219c..be69f0c 100755 --- a/launcher.c +++ b/launcher.c @@ -14,6 +14,14 @@ gcc -DGUI=0 -mno-cygwin -O -s -o setuptools/cli.exe launcher.c gcc -DGUI=1 -mwindows -mno-cygwin -O -s -o setuptools/gui.exe launcher.c + To build for Windows RT, install both Visual Studio Express for Windows 8 + and for Windows Desktop (both freeware), create "win32" application using + "Windows Desktop" version, create new "ARM" target via + "Configuration Manager" menu and modify ".vcxproj" file by adding + "true" tag + as child of "PropertyGroup" tags that has "Debug|ARM" and "Release|ARM" + properties. + It links to msvcrt.dll, but this shouldn't be a problem since it doesn't actually run Python in the same process. 
Note that using 'exec' instead of 'spawn' doesn't work, because on Windows this leads to the Python @@ -25,9 +33,12 @@ #include #include -#include +#include +#include +#include #include -#include "windows.h" + +int child_pid=0; int fail(char *format, char *data) { /* Print error message to stderr and return 2 */ @@ -35,10 +46,6 @@ int fail(char *format, char *data) { return 2; } - - - - char *quoted(char *data) { int i, ln = strlen(data), nb; @@ -90,7 +97,7 @@ char *loadable_exe(char *exename) { /* Return the absolute filename for spawnv */ result = calloc(MAX_PATH, sizeof(char)); strncpy(result, exename, MAX_PATH); - /*if (result) GetModuleFileName(hPython, result, MAX_PATH); + /*if (result) GetModuleFileNameA(hPython, result, MAX_PATH); FreeLibrary(hPython); */ return result; @@ -160,8 +167,82 @@ char **parse_argv(char *cmdline, int *argc) } while (1); } +void pass_control_to_child(DWORD control_type) { + /* + * distribute-issue207 + * passes the control event to child process (Python) + */ + if (!child_pid) { + return; + } + GenerateConsoleCtrlEvent(child_pid,0); +} +BOOL control_handler(DWORD control_type) { + /* + * distribute-issue207 + * control event handler callback function + */ + switch (control_type) { + case CTRL_C_EVENT: + pass_control_to_child(0); + break; + } + return TRUE; +} +int create_and_wait_for_subprocess(char* command) { + /* + * distribute-issue207 + * launches child process (Python) + */ + DWORD return_value = 0; + LPSTR commandline = command; + STARTUPINFOA s_info; + PROCESS_INFORMATION p_info; + ZeroMemory(&p_info, sizeof(p_info)); + ZeroMemory(&s_info, sizeof(s_info)); + s_info.cb = sizeof(STARTUPINFO); + // set-up control handler callback funciotn + SetConsoleCtrlHandler((PHANDLER_ROUTINE) control_handler, TRUE); + if (!CreateProcessA(NULL, commandline, NULL, NULL, TRUE, 0, NULL, NULL, &s_info, &p_info)) { + fprintf(stderr, "failed to create process.\n"); + return 0; + } + child_pid = p_info.dwProcessId; + // wait for Python to exit + WaitForSingleObject(p_info.hProcess, INFINITE); + if (!GetExitCodeProcess(p_info.hProcess, &return_value)) { + fprintf(stderr, "failed to get exit code from process.\n"); + return 0; + } + return return_value; +} + +char* join_executable_and_args(char *executable, char **args, int argc) +{ + /* + * distribute-issue207 + * CreateProcess needs a long string of the executable and command-line arguments, + * so we need to convert it from the args that was built + */ + int len,counter; + char* cmdline; + + len=strlen(executable)+2; + for (counter=1; counterscript && *end != '.') *end-- = '\0'; @@ -236,12 +318,18 @@ int run(int argc, char **argv, int is_gui) { return fail("Could not exec %s", ptr); /* shouldn't get here! */ } - /* We *do* need to wait for a CLI to finish, so use spawn */ - return spawnv(P_WAIT, ptr, (const char * const *)(newargs)); + /* + * distribute-issue207: using CreateProcessA instead of spawnv + */ + cmdline = join_executable_and_args(ptr, newargs, parsedargc + argc); + return create_and_wait_for_subprocess(cmdline); } - int WINAPI WinMain(HINSTANCE hI, HINSTANCE hP, LPSTR lpCmd, int nShow) { return run(__argc, __argv, GUI); } +int main(int argc, char** argv) { + return run(argc, argv, GUI); +} + diff --git a/msvc-build-launcher.cmd b/msvc-build-launcher.cmd new file mode 100644 index 0000000..92da290 --- /dev/null +++ b/msvc-build-launcher.cmd @@ -0,0 +1,39 @@ +@echo off + +REM Use old Windows SDK 6.1 so created .exe will be compatible with +REM old Windows versions. 
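On the Python side, the behaviour the patched launcher implements with ``CreateProcessA`` (build one command line, start the interpreter as a child, wait for it, and hand back its exit code, with Ctrl+C reaching the child through the shared console) corresponds roughly to a ``subprocess`` call like the Python 3 sketch below; the names are illustrative only::

    import subprocess

    def run_child(executable, args):
        # On Windows, subprocess joins this list into a single command line
        # using quoting rules equivalent to the launcher's quoted() helper.
        completed = subprocess.run([executable] + list(args))
        # Propagate the child's exit status, as create_and_wait_for_subprocess() does.
        return completed.returncode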
+REM Windows SDK 6.1 may be downloaded at: +REM http://www.microsoft.com/en-us/download/details.aspx?id=11310 +set PATH_OLD=%PATH% + +REM The SDK creates a false install of Visual Studio at one of these locations +set PATH=C:\Program Files\Microsoft Visual Studio 9.0\VC\bin;%PATH% +set PATH=C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\bin;%PATH% + +REM set up the environment to compile to x86 +call VCVARS32 +if "%ERRORLEVEL%"=="0" ( + cl /D "GUI=0" /D "WIN32_LEAN_AND_MEAN" launcher.c /O2 /link /MACHINE:x86 /SUBSYSTEM:CONSOLE /out:setuptools/cli-32.exe + cl /D "GUI=1" /D "WIN32_LEAN_AND_MEAN" launcher.c /O2 /link /MACHINE:x86 /SUBSYSTEM:WINDOWS /out:setuptools/gui-32.exe +) else ( + echo Windows SDK 6.1 not found to build Windows 32-bit version +) + +REM buildout (and possibly other implementations) currently depend on +REM the 32-bit launcher scripts without the -32 in the filename, so copy them +REM there for now. +copy setuptools/cli-32.exe setuptools/cli.exe +copy setuptools/gui-32.exe setuptools/gui.exe + +REM now for 64-bit +REM Use the x86_amd64 profile, which is the 32-bit cross compiler for amd64 +call VCVARSx86_amd64 +if "%ERRORLEVEL%"=="0" ( + cl /D "GUI=0" /D "WIN32_LEAN_AND_MEAN" launcher.c /O2 /link /MACHINE:x64 /SUBSYSTEM:CONSOLE /out:setuptools/cli-64.exe + cl /D "GUI=1" /D "WIN32_LEAN_AND_MEAN" launcher.c /O2 /link /MACHINE:x64 /SUBSYSTEM:WINDOWS /out:setuptools/gui-64.exe +) else ( + echo Windows SDK 6.1 not found to build Windows 64-bit version +) + +set PATH=%PATH_OLD% + diff --git a/packaging/psfl.txt b/packaging/psfl.txt deleted file mode 100644 index e3cb43e..0000000 --- a/packaging/psfl.txt +++ /dev/null @@ -1,216 +0,0 @@ -Python Software Foundation License -Python 2.1.1 license - -This is the official license for the Python 2.1.1 release: -A. HISTORY OF THE SOFTWARE -========================== - -Python was created in the early 1990s by Guido van Rossum at Stichting -Mathematisch Centrum (CWI) in the Netherlands as a successor of a language -called ABC. Guido is Python's principal author, although it includes many -contributions from others. The last version released from CWI was Python 1.2. -In 1995, Guido continued his work on Python at the Corporation for National -Research Initiatives (CNRI) in Reston, Virginia where he released several -versions of the software. Python 1.6 was the last of the versions released by -CNRI. In 2000, Guido and the Python core development team moved to BeOpen.com -to form the BeOpen PythonLabs team. Python 2.0 was the first and only release -from BeOpen.com. - -Following the release of Python 1.6, and after Guido van Rossum left CNRI to -work with commercial software developers, it became clear that the ability to -use Python with software available under the GNU Public License (GPL) was very -desirable. CNRI and the Free Software Foundation (FSF) interacted to develop -enabling wording changes to the Python license. Python 1.6.1 is essentially the -same as Python 1.6, with a few minor bug fixes, and with a different license -that enables later versions to be GPL-compatible. Python 2.1 is a derivative -work of Python 1.6.1, as well as of Python 2.0. - -After Python 2.0 was released by BeOpen.com, Guido van Rossum and the other -PythonLabs developers joined Digital Creations. All intellectual property added -from this point on, starting with Python 2.1 and its alpha and beta releases, -is owned by the Python Software Foundation (PSF), a non-profit modeled after -the Apache Software Foundation. 
See http://www.python.org/psf/ for more -information about the PSF. - -Thanks to the many outside volunteers who have worked under Guido's direction -to make these releases possible. - -B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON -=============================================================== - -PSF LICENSE AGREEMENT ---------------------- - -1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), -and the Individual or Organization ("Licensee") accessing and otherwise using -Python 2.1.1 software in source or binary form and its associated -documentation. - -2. Subject to the terms and conditions of this License Agreement, PSF hereby -grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, -analyze, test, perform and/or display publicly, prepare derivative works, -distribute, and otherwise use Python 2.1.1 alone or in any derivative version, -provided, however, that PSF's License Agreement and PSF's notice of copyright, -i.e., "Copyright (c) 2001 Python Software Foundation; All Rights Reserved" are -retained in Python 2.1.1 alone or in any derivative version prepared by -Licensee. - -3. In the event Licensee prepares a derivative work that is based on or -incorporates Python 2.1.1 or any part thereof, and wants to make the derivative -work available to others as provided herein, then Licensee hereby agrees to -include in any such work a brief summary of the changes made to Python 2.1.1. - -4. PSF is making Python 2.1.1 available to Licensee on an "AS IS" basis. PSF -MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF EXAMPLE, -BUT NOT LIMITATION, PSF MAKES NO AND DISCLAIMS ANY REPRESENTATION OR WARRANTY -OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF -PYTHON 2.1.1 WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. - -5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON 2.1.1 FOR -ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF -MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 2.1.1, OR ANY DERIVATIVE -THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. - -6. This License Agreement will automatically terminate upon a material breach -of its terms and conditions. - -7. Nothing in this License Agreement shall be deemed to create any relationship -of agency, partnership, or joint venture between PSF and Licensee. This License -Agreement does not grant permission to use PSF trademarks or trade name in a -trademark sense to endorse or promote products or services of Licensee, or any -third party. - -8. By copying, installing or otherwise using Python 2.1.1, Licensee agrees to -be bound by the terms and conditions of this License Agreement. - -BEOPEN.COM TERMS AND CONDITIONS FOR PYTHON 2.0 ----------------------------------------------- - -BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1 - -1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an office at -160 Saratoga Avenue, Santa Clara, CA 95051, and the Individual or Organization -("Licensee") accessing and otherwise using this software in source or binary -form and its associated documentation ("the Software"). - -2. 
Subject to the terms and conditions of this BeOpen Python License Agreement, -BeOpen hereby grants Licensee a non-exclusive, royalty-free, world-wide license -to reproduce, analyze, test, perform and/or display publicly, prepare -derivative works, distribute, and otherwise use the Software alone or in any -derivative version, provided, however, that the BeOpen Python License is -retained in the Software, alone or in any derivative version prepared by -Licensee. - -3. BeOpen is making the Software available to Licensee on an "AS IS" basis. -BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF -EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND DISCLAIMS ANY REPRESENTATION -OR WARRANTY OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT -THE USE OF THE SOFTWARE WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. - -4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE SOFTWARE -FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF -USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY DERIVATIVE THEREOF, EVEN -IF ADVISED OF THE POSSIBILITY THEREOF. - -5. This License Agreement will automatically terminate upon a material breach -of its terms and conditions. - -6. This License Agreement shall be governed by and interpreted in all respects -by the law of the State of California, excluding conflict of law provisions. -Nothing in this License Agreement shall be deemed to create any relationship of -agency, partnership, or joint venture between BeOpen and Licensee. This License -Agreement does not grant permission to use BeOpen trademarks or trade names in -a trademark sense to endorse or promote products or services of Licensee, or -any third party. As an exception, the "BeOpen Python" logos available at -http://www.pythonlabs.com/logos.html may be used according to the permissions -granted on that web page. - -7. By copying, installing or otherwise using the software, Licensee agrees to -be bound by the terms and conditions of this License Agreement. - -CNRI OPEN SOURCE GPL-COMPATIBLE LICENSE AGREEMENT -------------------------------------------------- - -1. This LICENSE AGREEMENT is between the Corporation for National Research -Initiatives, having an office at 1895 Preston White Drive, Reston, VA 20191 -("CNRI"), and the Individual or Organization ("Licensee") accessing and -otherwise using Python 1.6.1 software in source or binary form and its -associated documentation. - -2. Subject to the terms and conditions of this License Agreement, CNRI hereby -grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, -analyze, test, perform and/or display publicly, prepare derivative works, -distribute, and otherwise use Python 1.6.1 alone or in any derivative version, -provided, however, that CNRI's License Agreement and CNRI's notice of -copyright, i.e., "Copyright (c) 1995-2001 Corporation for National Research -Initiatives; All Rights Reserved" are retained in Python 1.6.1 alone or in any -derivative version prepared by Licensee. Alternately, in lieu of CNRI's License -Agreement, Licensee may substitute the following text (omitting the quotes): -"Python 1.6.1 is made available subject to the terms and conditions in CNRI's -License Agreement. This Agreement together with Python 1.6.1 may be located on -the Internet using the following unique, persistent identifier (known as a -handle): 1895.22/1013. 
This Agreement may also be obtained from a proxy server -on the Internet using the following URL: http://hdl.handle.net/1895.22/1013". - -3. In the event Licensee prepares a derivative work that is based on or -incorporates Python 1.6.1 or any part thereof, and wants to make the derivative -work available to others as provided herein, then Licensee hereby agrees to -include in any such work a brief summary of the changes made to Python 1.6.1. - -4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS" basis. CNRI -MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF EXAMPLE, -BUT NOT LIMITATION, CNRI MAKES NO AND DISCLAIMS ANY REPRESENTATION OR WARRANTY -OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF -PYTHON 1.6.1 WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. - -5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON 1.6.1 FOR -ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF -MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1, OR ANY DERIVATIVE -THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. - -6. This License Agreement will automatically terminate upon a material breach -of its terms and conditions. - -7. This License Agreement shall be governed by the federal intellectual -property law of the United States, including without limitation the federal -copyright law, and, to the extent such U.S. federal law does not apply, by the -law of the Commonwealth of Virginia, excluding Virginia's conflict of law -provisions. Notwithstanding the foregoing, with regard to derivative works -based on Python 1.6.1 that incorporate non-separable material that was -previously distributed under the GNU General Public License (GPL), the law of -the Commonwealth of Virginia shall govern this License Agreement only as to -issues arising under or with respect to Paragraphs 4, 5, and 7 of this License -Agreement. Nothing in this License Agreement shall be deemed to create any -relationship of agency, partnership, or joint venture between CNRI and -Licensee. This License Agreement does not grant permission to use CNRI -trademarks or trade name in a trademark sense to endorse or promote products or -services of Licensee, or any third party. - -8. By clicking on the "ACCEPT" button where indicated, or by copying, -installing or otherwise using Python 1.6.1, Licensee agrees to be bound by the -terms and conditions of this License Agreement. - - ACCEPT - -CWI PERMISSIONS STATEMENT AND DISCLAIMER ----------------------------------------- - -Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam, The -Netherlands. All rights reserved. - -Permission to use, copy, modify, and distribute this software and its -documentation for any purpose and without fee is hereby granted, provided that -the above copyright notice appear in all copies and that both that copyright -notice and this permission notice appear in supporting documentation, and that -the name of Stichting Mathematisch Centrum or CWI not be used in advertising or -publicity pertaining to distribution of the software without specific, written -prior permission. 
- -STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS -SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS, IN -NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE FOR ANY SPECIAL, -INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS -OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER -TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF -THIS SOFTWARE. - diff --git a/packaging/python-setuptools.changes b/packaging/python-setuptools.changes deleted file mode 100644 index 8686fb5..0000000 --- a/packaging/python-setuptools.changes +++ /dev/null @@ -1,57 +0,0 @@ -* Sat Jan 30 2010 Jian-feng Ding 0.6c11 -- Upgrade to 0.6c11 and enable spectacle - -* Fri Feb 13 2009 Xu Li - 0.6c9 -- Upgrade to 0.6c9 - -* Mon Sep 24 2007 Konstantin Ryabitsev - 0.6c7-2 -- Move pretty much everything back into runtime in order to avoid more - brokenness than we're trying to address with these fixes. - -* Fri Sep 14 2007 Konstantin Ryabitsev - 0.6c7-1 -- Upstream 0.6c7 -- Move some things from devel into runtime, in order to not break other - projects. - -* Sat Aug 18 2007 Konstantin Ryabitsev - 0.6c6-2 -- Make license tag conform to the new Licensing Guidelines -- Move everything except pkg_resources.py into a separate -devel package - so we avoid bundling python-devel when it's not required (#251645) -- Do not package tests - -* Sun Jun 10 2007 Konstantin Ryabitsev - 0.6c6-1 -- Upstream 0.6c6 -- Require python-devel (#240707) - -* Sun Jan 28 2007 Konstantin Ryabitsev - 0.6c5-1 -- Upstream 0.6c5 (known bugs, but the promised 0.6c6 is taking too long) - -* Tue Dec 05 2006 Konstantin Ryabitsev - 0.6c3-1 -- Upstream 0.6c3 (#218540, thanks to Michel Alexandre Salim for the patch) - -* Tue Sep 12 2006 Konstantin Ryabitsev - 0.6c2-1 -- Upstream 0.6c2 -- Ghostbusting - -* Mon Jul 31 2006 Konstantin Ryabitsev - 0.6c1-2 -- Set perms on license files (#200768) - -* Sat Jul 22 2006 Konstantin Ryabitsev - 0.6c1-1 -- Version 0.6c1 - -* Wed Jun 28 2006 Konstantin Ryabitsev - 0.6b3-1 -- Taking over from Ignacio -- Version 0.6b3 -- Ghost .pyo files in sitelib -- Add license files -- Remove manual python-abi, since we're building FC4 and up -- Kill .exe files - -* Wed Feb 15 2006 Ignacio Vazquez-Abrams 0.6a10-1 -- Upstream update - -* Mon Jan 16 2006 Ignacio Vazquez-Abrams 0.6a9-1 -- Upstream update - -* Sat Dec 24 2005 Ignacio Vazquez-Abrams 0.6a8-1 -- Initial RPM release diff --git a/packaging/python-setuptools.manifest b/packaging/python-setuptools.manifest deleted file mode 100644 index 017d22d..0000000 --- a/packaging/python-setuptools.manifest +++ /dev/null @@ -1,5 +0,0 @@ - - - - - diff --git a/packaging/python-setuptools.spec b/packaging/python-setuptools.spec deleted file mode 100644 index 3808d99..0000000 --- a/packaging/python-setuptools.spec +++ /dev/null @@ -1,74 +0,0 @@ -Name: python-setuptools -Summary: Easily build and distribute Python packages -Version: 0.6c11 -Release: 2 -Group: Applications/System -License: Python or ZPLv2.0 -BuildArch: noarch -URL: http://pypi.python.org/pypi/setuptools -Source0: http://pypi.python.org/packages/source/s/setuptools/setuptools-%{version}.tar.gz -Source1: psfl.txt -Source2: zpl.txt -Source1001: python-setuptools.manifest -BuildRequires: python-devel - - -%description -Setuptools is a collection of enhancements to the Python distutils that allow -you to more easily build and distribute Python packages, especially ones that -have dependencies on 
other packages. - -This package contains the runtime components of setuptools, necessary to -execute the software that requires pkg_resources.py. - - - -%package devel -Summary: Download, install, upgrade, and uninstall Python packages -Group: Development/Languages -Requires: %{name} = %{version}-%{release} -Requires: python-devel - -%description devel -setuptools is a collection of enhancements to the Python distutils that allow -you to more easily build and distribute Python packages, especially ones that -have dependencies on other packages. - -This package contains the components necessary to build and install software -requiring setuptools. - - - -%prep -%setup -q -n setuptools-%{version} - -%build -cp %{SOURCE1001} . -find -name '*.txt' | xargs chmod -x -find -name '*.py' | xargs sed -i '1s|^#!python|#!%{__python}|' -CFLAGS="$RPM_OPT_FLAGS" %{__python} setup.py build - -%install -%{__python} setup.py install -O1 --skip-build \ - --root $RPM_BUILD_ROOT \ - --prefix %{_prefix} \ - --single-version-externally-managed - -rm -rf $RPM_BUILD_ROOT%{python_sitelib}/setuptools/tests - -install -p -m 0644 %{SOURCE1} %{SOURCE2} . -find $RPM_BUILD_ROOT%{python_sitelib} -name '*.exe' | xargs rm -f -chmod +x $RPM_BUILD_ROOT%{python_sitelib}/setuptools/command/easy_install.py - - -%files -%manifest %{name}.manifest -%{python_sitelib}/* -%exclude %{python_sitelib}/easy_install* - - -%files devel -%manifest %{name}.manifest -%{python_sitelib}/easy_install* -%{_bindir}/* - diff --git a/packaging/zpl.txt b/packaging/zpl.txt deleted file mode 100644 index 44e0648..0000000 --- a/packaging/zpl.txt +++ /dev/null @@ -1,59 +0,0 @@ -Zope Public License (ZPL) Version 2.0 ------------------------------------------------ - -This software is Copyright (c) Zope Corporation (tm) and -Contributors. All rights reserved. - -This license has been certified as open source. It has also -been designated as GPL compatible by the Free Software -Foundation (FSF). - -Redistribution and use in source and binary forms, with or -without modification, are permitted provided that the -following conditions are met: - -1. Redistributions in source code must retain the above - copyright notice, this list of conditions, and the following - disclaimer. - -2. Redistributions in binary form must reproduce the above - copyright notice, this list of conditions, and the following - disclaimer in the documentation and/or other materials - provided with the distribution. - -3. The name Zope Corporation (tm) must not be used to - endorse or promote products derived from this software - without prior written permission from Zope Corporation. - -4. The right to distribute this software or to use it for - any purpose does not give you the right to use Servicemarks - (sm) or Trademarks (tm) of Zope Corporation. Use of them is - covered in a separate agreement (see - http://www.zope.com/Marks). - -5. If any files are modified, you must cause the modified - files to carry prominent notices stating that you changed - the files and the date of any change. - -Disclaimer - - THIS SOFTWARE IS PROVIDED BY ZOPE CORPORATION ``AS IS'' - AND ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT - NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY - AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN - NO EVENT SHALL ZOPE CORPORATION OR ITS CONTRIBUTORS BE - LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, - EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT - LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; - LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) - HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN - CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE - OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS - SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH - DAMAGE. - - -This software consists of contributions made by Zope -Corporation and many individuals on behalf of Zope -Corporation. Specific attributions are listed in the -accompanying credits file. diff --git a/pkg_resources.py b/pkg_resources.py deleted file mode 100644 index 79db00b..0000000 --- a/pkg_resources.py +++ /dev/null @@ -1,2625 +0,0 @@ -"""Package resource API --------------------- - -A resource is a logical file contained within a package, or a logical -subdirectory thereof. The package resource API expects resource names -to have their path parts separated with ``/``, *not* whatever the local -path separator is. Do not use os.path operations to manipulate resource -names being passed into the API. - -The package resource API is designed to work with normal filesystem packages, -.egg files, and unpacked .egg files. It can also work in a limited way with -.zip files and with custom PEP 302 loaders that support the ``get_data()`` -method. -""" - -import sys, os, zipimport, time, re, imp - -try: - frozenset -except NameError: - from sets import ImmutableSet as frozenset - -# capture these to bypass sandboxing -from os import utime, rename, unlink, mkdir -from os import open as os_open -from os.path import isdir, split - - -def _bypass_ensure_directory(name, mode=0777): - # Sandbox-bypassing version of ensure_directory() - dirname, filename = split(name) - if dirname and filename and not isdir(dirname): - _bypass_ensure_directory(dirname) - mkdir(dirname, mode) - - - - - - - -_state_vars = {} - -def _declare_state(vartype, **kw): - g = globals() - for name, val in kw.iteritems(): - g[name] = val - _state_vars[name] = vartype - -def __getstate__(): - state = {} - g = globals() - for k, v in _state_vars.iteritems(): - state[k] = g['_sget_'+v](g[k]) - return state - -def __setstate__(state): - g = globals() - for k, v in state.iteritems(): - g['_sset_'+_state_vars[k]](k, g[k], v) - return state - -def _sget_dict(val): - return val.copy() - -def _sset_dict(key, ob, state): - ob.clear() - ob.update(state) - -def _sget_object(val): - return val.__getstate__() - -def _sset_object(key, ob, state): - ob.__setstate__(state) - -_sget_none = _sset_none = lambda *args: None - - - - - - -def get_supported_platform(): - """Return this platform's maximum compatible version. - - distutils.util.get_platform() normally reports the minimum version - of Mac OS X that would be required to *use* extensions produced by - distutils. But what we want when checking compatibility is to know the - version of Mac OS X that we are *running*. To allow usage of packages that - explicitly require a newer version of Mac OS X, we must also know the - current version of the OS. - - If this condition occurs for any other platform with a version in its - platform strings, this function should be extended accordingly. 
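As the module docstring above stresses, resource names are always ``/``-separated paths relative to the package, never values built with ``os.path``. A typical call might look like the following (the ``mypkg`` package and its ``data/defaults.cfg`` resource are hypothetical)::

    import pkg_resources

    # '/' is the separator on every platform; do not use os.path.join() here.
    raw = pkg_resources.resource_string("mypkg", "data/defaults.cfg")
    have_data_dir = pkg_resources.resource_isdir("mypkg", "data")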
- """ - plat = get_build_platform(); m = macosVersionString.match(plat) - if m is not None and sys.platform == "darwin": - try: - plat = 'macosx-%s-%s' % ('.'.join(_macosx_vers()[:2]), m.group(3)) - except ValueError: - pass # not Mac OS X - return plat - - - - - - - - - - - - - - - - - - - - - -__all__ = [ - # Basic resource access and distribution/entry point discovery - 'require', 'run_script', 'get_provider', 'get_distribution', - 'load_entry_point', 'get_entry_map', 'get_entry_info', 'iter_entry_points', - 'resource_string', 'resource_stream', 'resource_filename', - 'resource_listdir', 'resource_exists', 'resource_isdir', - - # Environmental control - 'declare_namespace', 'working_set', 'add_activation_listener', - 'find_distributions', 'set_extraction_path', 'cleanup_resources', - 'get_default_cache', - - # Primary implementation classes - 'Environment', 'WorkingSet', 'ResourceManager', - 'Distribution', 'Requirement', 'EntryPoint', - - # Exceptions - 'ResolutionError','VersionConflict','DistributionNotFound','UnknownExtra', - 'ExtractionError', - - # Parsing functions and string utilities - 'parse_requirements', 'parse_version', 'safe_name', 'safe_version', - 'get_platform', 'compatible_platforms', 'yield_lines', 'split_sections', - 'safe_extra', 'to_filename', - - # filesystem utilities - 'ensure_directory', 'normalize_path', - - # Distribution "precedence" constants - 'EGG_DIST', 'BINARY_DIST', 'SOURCE_DIST', 'CHECKOUT_DIST', 'DEVELOP_DIST', - - # "Provider" interfaces, implementations, and registration/lookup APIs - 'IMetadataProvider', 'IResourceProvider', 'FileMetadata', - 'PathMetadata', 'EggMetadata', 'EmptyProvider', 'empty_provider', - 'NullProvider', 'EggProvider', 'DefaultProvider', 'ZipProvider', - 'register_finder', 'register_namespace_handler', 'register_loader_type', - 'fixup_namespace_packages', 'get_importer', - - # Deprecated/backward compatibility only - 'run_main', 'AvailableDistributions', -] -class ResolutionError(Exception): - """Abstract base for dependency resolution errors""" - def __repr__(self): return self.__class__.__name__+repr(self.args) - -class VersionConflict(ResolutionError): - """An already-installed version conflicts with the requested version""" - -class DistributionNotFound(ResolutionError): - """A requested distribution was not found""" - -class UnknownExtra(ResolutionError): - """Distribution doesn't have an "extra feature" of the given name""" -_provider_factories = {} -PY_MAJOR = sys.version[:3] -EGG_DIST = 3 -BINARY_DIST = 2 -SOURCE_DIST = 1 -CHECKOUT_DIST = 0 -DEVELOP_DIST = -1 - -def register_loader_type(loader_type, provider_factory): - """Register `provider_factory` to make providers for `loader_type` - - `loader_type` is the type or class of a PEP 302 ``module.__loader__``, - and `provider_factory` is a function that, passed a *module* object, - returns an ``IResourceProvider`` for that module. 
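``register_loader_type`` is how the provider classes defined later in this module get wired up to PEP 302 loaders. A hypothetical registration for a custom loader could look like::

    import pkg_resources

    class MyLoader(object):
        """Hypothetical PEP 302 loader with a get_data() method."""

    class MyProvider(pkg_resources.NullProvider):
        """Resource/metadata provider for modules imported via MyLoader."""

    # get_provider() will now hand back a MyProvider for any module whose
    # __loader__ is a MyLoader instance.
    pkg_resources.register_loader_type(MyLoader, MyProvider)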
- """ - _provider_factories[loader_type] = provider_factory - -def get_provider(moduleOrReq): - """Return an IResourceProvider for the named module or requirement""" - if isinstance(moduleOrReq,Requirement): - return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] - try: - module = sys.modules[moduleOrReq] - except KeyError: - __import__(moduleOrReq) - module = sys.modules[moduleOrReq] - loader = getattr(module, '__loader__', None) - return _find_adapter(_provider_factories, loader)(module) - -def _macosx_vers(_cache=[]): - if not _cache: - from platform import mac_ver - _cache.append(mac_ver()[0].split('.')) - return _cache[0] - -def _macosx_arch(machine): - return {'PowerPC':'ppc', 'Power_Macintosh':'ppc'}.get(machine,machine) - -def get_build_platform(): - """Return this platform's string for platform-specific distributions - - XXX Currently this is the same as ``distutils.util.get_platform()``, but it - needs some hacks for Linux and Mac OS X. - """ - from distutils.util import get_platform - plat = get_platform() - if sys.platform == "darwin" and not plat.startswith('macosx-'): - try: - version = _macosx_vers() - machine = os.uname()[4].replace(" ", "_") - return "macosx-%d.%d-%s" % (int(version[0]), int(version[1]), - _macosx_arch(machine)) - except ValueError: - # if someone is running a non-Mac darwin system, this will fall - # through to the default implementation - pass - return plat - -macosVersionString = re.compile(r"macosx-(\d+)\.(\d+)-(.*)") -darwinVersionString = re.compile(r"darwin-(\d+)\.(\d+)\.(\d+)-(.*)") -get_platform = get_build_platform # XXX backward compat - - - - - - - - - -def compatible_platforms(provided,required): - """Can code for the `provided` platform run on the `required` platform? - - Returns true if either platform is ``None``, or the platforms are equal. - - XXX Needs compatibility checks for Linux and other unixy OSes. - """ - if provided is None or required is None or provided==required: - return True # easy case - - # Mac OS X special cases - reqMac = macosVersionString.match(required) - if reqMac: - provMac = macosVersionString.match(provided) - - # is this a Mac package? - if not provMac: - # this is backwards compatibility for packages built before - # setuptools 0.6. All packages built after this point will - # use the new macosx designation. - provDarwin = darwinVersionString.match(provided) - if provDarwin: - dversion = int(provDarwin.group(1)) - macosversion = "%s.%s" % (reqMac.group(1), reqMac.group(2)) - if dversion == 7 and macosversion >= "10.3" or \ - dversion == 8 and macosversion >= "10.4": - - #import warnings - #warnings.warn("Mac eggs should be rebuilt to " - # "use the macosx designation instead of darwin.", - # category=DeprecationWarning) - return True - return False # egg isn't macosx or legacy darwin - - # are they the same major version and machine type? - if provMac.group(1) != reqMac.group(1) or \ - provMac.group(3) != reqMac.group(3): - return False - - - - # is the required OS major update >= the provided one? 
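Given the checks above, a macosx egg is usable when the machine type and major version match and the provided minor version is no newer than the required one; for example, the following should hold::

    from pkg_resources import compatible_platforms

    # An egg built for 10.3 may run where 10.4 is the target...
    assert compatible_platforms("macosx-10.3-ppc", "macosx-10.4-ppc")
    # ...but not the reverse, and never across machine types.
    assert not compatible_platforms("macosx-10.4-ppc", "macosx-10.3-ppc")
    assert not compatible_platforms("macosx-10.4-ppc", "macosx-10.4-i386")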
- if int(provMac.group(2)) > int(reqMac.group(2)): - return False - - return True - - # XXX Linux and other platforms' special cases should go here - return False - - -def run_script(dist_spec, script_name): - """Locate distribution `dist_spec` and run its `script_name` script""" - ns = sys._getframe(1).f_globals - name = ns['__name__'] - ns.clear() - ns['__name__'] = name - require(dist_spec)[0].run_script(script_name, ns) - -run_main = run_script # backward compatibility - -def get_distribution(dist): - """Return a current distribution object for a Requirement or string""" - if isinstance(dist,basestring): dist = Requirement.parse(dist) - if isinstance(dist,Requirement): dist = get_provider(dist) - if not isinstance(dist,Distribution): - raise TypeError("Expected string, Requirement, or Distribution", dist) - return dist - -def load_entry_point(dist, group, name): - """Return `name` entry point of `group` for `dist` or raise ImportError""" - return get_distribution(dist).load_entry_point(group, name) - -def get_entry_map(dist, group=None): - """Return the entry point map for `group`, or the full entry map""" - return get_distribution(dist).get_entry_map(group) - -def get_entry_info(dist, group, name): - """Return the EntryPoint object for `group`+`name`, or ``None``""" - return get_distribution(dist).get_entry_info(group, name) - - -class IMetadataProvider: - - def has_metadata(name): - """Does the package's distribution contain the named metadata?""" - - def get_metadata(name): - """The named metadata resource as a string""" - - def get_metadata_lines(name): - """Yield named metadata resource as list of non-blank non-comment lines - - Leading and trailing whitespace is stripped from each line, and lines - with ``#`` as the first non-blank character are omitted.""" - - def metadata_isdir(name): - """Is the named metadata a directory? (like ``os.path.isdir()``)""" - - def metadata_listdir(name): - """List of metadata names in the directory (like ``os.listdir()``)""" - - def run_script(script_name, namespace): - """Execute the named script in the supplied namespace dictionary""" - - - - - - - - - - - - - - - - - - - -class IResourceProvider(IMetadataProvider): - """An object that provides access to package resources""" - - def get_resource_filename(manager, resource_name): - """Return a true filesystem path for `resource_name` - - `manager` must be an ``IResourceManager``""" - - def get_resource_stream(manager, resource_name): - """Return a readable file-like object for `resource_name` - - `manager` must be an ``IResourceManager``""" - - def get_resource_string(manager, resource_name): - """Return a string containing the contents of `resource_name` - - `manager` must be an ``IResourceManager``""" - - def has_resource(resource_name): - """Does the package contain the named resource?""" - - def resource_isdir(resource_name): - """Is the named resource a directory? 
(like ``os.path.isdir()``)""" - - def resource_listdir(resource_name): - """List of resource names in the directory (like ``os.listdir()``)""" - - - - - - - - - - - - - - - -class WorkingSet(object): - """A collection of active distributions on sys.path (or a similar list)""" - - def __init__(self, entries=None): - """Create working set from list of path entries (default=sys.path)""" - self.entries = [] - self.entry_keys = {} - self.by_key = {} - self.callbacks = [] - - if entries is None: - entries = sys.path - - for entry in entries: - self.add_entry(entry) - - - def add_entry(self, entry): - """Add a path item to ``.entries``, finding any distributions on it - - ``find_distributions(entry, True)`` is used to find distributions - corresponding to the path entry, and they are added. `entry` is - always appended to ``.entries``, even if it is already present. - (This is because ``sys.path`` can contain the same value more than - once, and the ``.entries`` of the ``sys.path`` WorkingSet should always - equal ``sys.path``.) - """ - self.entry_keys.setdefault(entry, []) - self.entries.append(entry) - for dist in find_distributions(entry, True): - self.add(dist, entry, False) - - - def __contains__(self,dist): - """True if `dist` is the active distribution for its project""" - return self.by_key.get(dist.key) == dist - - - - - - def find(self, req): - """Find a distribution matching requirement `req` - - If there is an active distribution for the requested project, this - returns it as long as it meets the version requirement specified by - `req`. But, if there is an active distribution for the project and it - does *not* meet the `req` requirement, ``VersionConflict`` is raised. - If there is no active distribution for the requested project, ``None`` - is returned. - """ - dist = self.by_key.get(req.key) - if dist is not None and dist not in req: - raise VersionConflict(dist,req) # XXX add more info - else: - return dist - - def iter_entry_points(self, group, name=None): - """Yield entry point objects from `group` matching `name` - - If `name` is None, yields all entry points in `group` from all - distributions in the working set, otherwise only ones matching - both `group` and `name` are yielded (in distribution order). - """ - for dist in self: - entries = dist.get_entry_map(group) - if name is None: - for ep in entries.values(): - yield ep - elif name in entries: - yield entries[name] - - def run_script(self, requires, script_name): - """Locate distribution for `requires` and run `script_name` script""" - ns = sys._getframe(1).f_globals - name = ns['__name__'] - ns.clear() - ns['__name__'] = name - self.require(requires)[0].run_script(script_name, ns) - - - - def __iter__(self): - """Yield distributions for non-duplicate projects in the working set - - The yield order is the order in which the items' path entries were - added to the working set. - """ - seen = {} - for item in self.entries: - for key in self.entry_keys[item]: - if key not in seen: - seen[key]=1 - yield self.by_key[key] - - def add(self, dist, entry=None, insert=True): - """Add `dist` to working set, associated with `entry` - - If `entry` is unspecified, it defaults to the ``.location`` of `dist`. - On exit from this routine, `entry` is added to the end of the working - set's ``.entries`` (if it wasn't already present). - - `dist` is only added to the working set if it's for a project that - doesn't already have a distribution in the set. 
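In practice the interesting instance is the global ``working_set`` built from ``sys.path``; a typical interaction might look like::

    import pkg_resources

    ws = pkg_resources.working_set

    # The active distribution satisfying a requirement, or None.
    dist = ws.find(pkg_resources.Requirement.parse("setuptools"))

    # Every console_scripts entry point advertised on sys.path.
    names = [ep.name for ep in ws.iter_entry_points("console_scripts")]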
If it's added, any - callbacks registered with the ``subscribe()`` method will be called. - """ - if insert: - dist.insert_on(self.entries, entry) - - if entry is None: - entry = dist.location - keys = self.entry_keys.setdefault(entry,[]) - keys2 = self.entry_keys.setdefault(dist.location,[]) - if dist.key in self.by_key: - return # ignore hidden distros - - self.by_key[dist.key] = dist - if dist.key not in keys: - keys.append(dist.key) - if dist.key not in keys2: - keys2.append(dist.key) - self._added_new(dist) - - def resolve(self, requirements, env=None, installer=None): - """List all distributions needed to (recursively) meet `requirements` - - `requirements` must be a sequence of ``Requirement`` objects. `env`, - if supplied, should be an ``Environment`` instance. If - not supplied, it defaults to all distributions available within any - entry or distribution in the working set. `installer`, if supplied, - will be invoked with each requirement that cannot be met by an - already-installed distribution; it should return a ``Distribution`` or - ``None``. - """ - - requirements = list(requirements)[::-1] # set up the stack - processed = {} # set of processed requirements - best = {} # key -> dist - to_activate = [] - - while requirements: - req = requirements.pop(0) # process dependencies breadth-first - if req in processed: - # Ignore cyclic or redundant dependencies - continue - dist = best.get(req.key) - if dist is None: - # Find the best distribution and add it to the map - dist = self.by_key.get(req.key) - if dist is None: - if env is None: - env = Environment(self.entries) - dist = best[req.key] = env.best_match(req, self, installer) - if dist is None: - raise DistributionNotFound(req) # XXX put more info here - to_activate.append(dist) - if dist not in req: - # Oops, the "best" so far conflicts with a dependency - raise VersionConflict(dist,req) # XXX put more info here - requirements.extend(dist.requires(req.extras)[::-1]) - processed[req] = True - - return to_activate # return list of distros to activate - - def find_plugins(self, - plugin_env, full_env=None, installer=None, fallback=True - ): - """Find all activatable distributions in `plugin_env` - - Example usage:: - - distributions, errors = working_set.find_plugins( - Environment(plugin_dirlist) - ) - map(working_set.add, distributions) # add plugins+libs to sys.path - print "Couldn't load", errors # display errors - - The `plugin_env` should be an ``Environment`` instance that contains - only distributions that are in the project's "plugin directory" or - directories. The `full_env`, if supplied, should be an ``Environment`` - contains all currently-available distributions. If `full_env` is not - supplied, one is created automatically from the ``WorkingSet`` this - method is called on, which will typically mean that every directory on - ``sys.path`` will be scanned for distributions. - - `installer` is a standard installer callback as used by the - ``resolve()`` method. The `fallback` flag indicates whether we should - attempt to resolve older versions of a plugin if the newest version - cannot be resolved. - - This method returns a 2-tuple: (`distributions`, `error_info`), where - `distributions` is a list of the distributions found in `plugin_env` - that were loadable, along with any other distributions that are needed - to resolve their dependencies. `error_info` is a dictionary mapping - unloadable plugin distributions to an exception instance describing the - error that occurred. 
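``resolve()`` is the workhorse behind ``require()``: it walks the requirement graph breadth-first and returns the distributions that still need activating, raising ``DistributionNotFound`` or ``VersionConflict`` when it cannot. Roughly::

    import pkg_resources

    ws = pkg_resources.working_set
    reqs = pkg_resources.parse_requirements(["setuptools>=0.6"])
    try:
        for dist in ws.resolve(reqs):
            ws.add(dist)      # activate, firing any subscribe() callbacks
    except pkg_resources.DistributionNotFound:
        pass                  # nothing on the search path satisfies the request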
Usually this will be a ``DistributionNotFound`` or - ``VersionConflict`` instance. - """ - - plugin_projects = list(plugin_env) - plugin_projects.sort() # scan project names in alphabetic order - - error_info = {} - distributions = {} - - if full_env is None: - env = Environment(self.entries) - env += plugin_env - else: - env = full_env + plugin_env - - shadow_set = self.__class__([]) - map(shadow_set.add, self) # put all our entries in shadow_set - - for project_name in plugin_projects: - - for dist in plugin_env[project_name]: - - req = [dist.as_requirement()] - - try: - resolvees = shadow_set.resolve(req, env, installer) - - except ResolutionError,v: - error_info[dist] = v # save error info - if fallback: - continue # try the next older version of project - else: - break # give up on this project, keep going - - else: - map(shadow_set.add, resolvees) - distributions.update(dict.fromkeys(resolvees)) - - # success, no need to try any more versions of this project - break - - distributions = list(distributions) - distributions.sort() - - return distributions, error_info - - - - - - def require(self, *requirements): - """Ensure that distributions matching `requirements` are activated - - `requirements` must be a string or a (possibly-nested) sequence - thereof, specifying the distributions and versions required. The - return value is a sequence of the distributions that needed to be - activated to fulfill the requirements; all relevant distributions are - included, even if they were already activated in this working set. - """ - needed = self.resolve(parse_requirements(requirements)) - - for dist in needed: - self.add(dist) - - return needed - - def subscribe(self, callback): - """Invoke `callback` for all distributions (including existing ones)""" - if callback in self.callbacks: - return - self.callbacks.append(callback) - for dist in self: - callback(dist) - - def _added_new(self, dist): - for callback in self.callbacks: - callback(dist) - - def __getstate__(self): - return ( - self.entries[:], self.entry_keys.copy(), self.by_key.copy(), - self.callbacks[:] - ) - - def __setstate__(self, (entries, keys, by_key, callbacks)): - self.entries = entries[:] - self.entry_keys = keys.copy() - self.by_key = by_key.copy() - self.callbacks = callbacks[:] - - -class Environment(object): - """Searchable snapshot of distributions on a search path""" - - def __init__(self, search_path=None, platform=get_supported_platform(), python=PY_MAJOR): - """Snapshot distributions available on a search path - - Any distributions found on `search_path` are added to the environment. - `search_path` should be a sequence of ``sys.path`` items. If not - supplied, ``sys.path`` is used. - - `platform` is an optional string specifying the name of the platform - that platform-specific distributions must be compatible with. If - unspecified, it defaults to the current platform. `python` is an - optional string naming the desired version of Python (e.g. ``'2.4'``); - it defaults to the current version. - - You may explicitly set `platform` (and/or `python`) to ``None`` if you - wish to map *all* distributions, not just those compatible with the - running platform or Python version. - """ - self._distmap = {} - self._cache = {} - self.platform = platform - self.python = python - self.scan(search_path) - - def can_add(self, dist): - """Is distribution `dist` acceptable for this environment? 
- - The distribution must match the platform and python version - requirements specified when this environment was created, or False - is returned. - """ - return (self.python is None or dist.py_version is None - or dist.py_version==self.python) \ - and compatible_platforms(dist.platform,self.platform) - - def remove(self, dist): - """Remove `dist` from the environment""" - self._distmap[dist.key].remove(dist) - - def scan(self, search_path=None): - """Scan `search_path` for distributions usable in this environment - - Any distributions found are added to the environment. - `search_path` should be a sequence of ``sys.path`` items. If not - supplied, ``sys.path`` is used. Only distributions conforming to - the platform/python version defined at initialization are added. - """ - if search_path is None: - search_path = sys.path - - for item in search_path: - for dist in find_distributions(item): - self.add(dist) - - def __getitem__(self,project_name): - """Return a newest-to-oldest list of distributions for `project_name` - """ - try: - return self._cache[project_name] - except KeyError: - project_name = project_name.lower() - if project_name not in self._distmap: - return [] - - if project_name not in self._cache: - dists = self._cache[project_name] = self._distmap[project_name] - _sort_dists(dists) - - return self._cache[project_name] - - def add(self,dist): - """Add `dist` if we ``can_add()`` it and it isn't already added""" - if self.can_add(dist) and dist.has_version(): - dists = self._distmap.setdefault(dist.key,[]) - if dist not in dists: - dists.append(dist) - if dist.key in self._cache: - _sort_dists(self._cache[dist.key]) - - - def best_match(self, req, working_set, installer=None): - """Find distribution best matching `req` and usable on `working_set` - - This calls the ``find(req)`` method of the `working_set` to see if a - suitable distribution is already active. (This may raise - ``VersionConflict`` if an unsuitable version of the project is already - active in the specified `working_set`.) If a suitable distribution - isn't active, this method returns the newest distribution in the - environment that meets the ``Requirement`` in `req`. If no suitable - distribution is found, and `installer` is supplied, then the result of - calling the environment's ``obtain(req, installer)`` method will be - returned. - """ - dist = working_set.find(req) - if dist is not None: - return dist - for dist in self[req.key]: - if dist in req: - return dist - return self.obtain(req, installer) # try and download/install - - def obtain(self, requirement, installer=None): - """Obtain a distribution matching `requirement` (e.g. via download) - - Obtain a distro that matches requirement (e.g. via download). In the - base ``Environment`` class, this routine just returns - ``installer(requirement)``, unless `installer` is None, in which case - None is returned instead. 
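``best_match`` therefore prefers an already-active distribution, then the newest acceptable one in the environment, and only then falls back to ``obtain()`` and the installer callback. A hypothetical hook::

    import pkg_resources

    env = pkg_resources.Environment()      # snapshot of sys.path

    def installer(requirement):
        # Called only when neither the working set nor the environment can
        # satisfy `requirement`; returning None means "could not obtain it".
        return None

    req = pkg_resources.Requirement.parse("setuptools")
    dist = env.best_match(req, pkg_resources.working_set, installer)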
This method is a hook that allows subclasses - to attempt other ways of obtaining a distribution before falling back - to the `installer` argument.""" - if installer is not None: - return installer(requirement) - - def __iter__(self): - """Yield the unique project names of the available distributions""" - for key in self._distmap.keys(): - if self[key]: yield key - - - - - def __iadd__(self, other): - """In-place addition of a distribution or environment""" - if isinstance(other,Distribution): - self.add(other) - elif isinstance(other,Environment): - for project in other: - for dist in other[project]: - self.add(dist) - else: - raise TypeError("Can't add %r to environment" % (other,)) - return self - - def __add__(self, other): - """Add an environment or distribution to an environment""" - new = self.__class__([], platform=None, python=None) - for env in self, other: - new += env - return new - - -AvailableDistributions = Environment # XXX backward compatibility - - -class ExtractionError(RuntimeError): - """An error occurred extracting a resource - - The following attributes are available from instances of this exception: - - manager - The resource manager that raised this exception - - cache_path - The base directory for resource extraction - - original_error - The exception instance that caused extraction to fail - """ - - - - -class ResourceManager: - """Manage resource extraction and packages""" - extraction_path = None - - def __init__(self): - self.cached_files = {} - - def resource_exists(self, package_or_requirement, resource_name): - """Does the named resource exist?""" - return get_provider(package_or_requirement).has_resource(resource_name) - - def resource_isdir(self, package_or_requirement, resource_name): - """Is the named resource an existing directory?""" - return get_provider(package_or_requirement).resource_isdir( - resource_name - ) - - def resource_filename(self, package_or_requirement, resource_name): - """Return a true filesystem path for specified resource""" - return get_provider(package_or_requirement).get_resource_filename( - self, resource_name - ) - - def resource_stream(self, package_or_requirement, resource_name): - """Return a readable file-like object for specified resource""" - return get_provider(package_or_requirement).get_resource_stream( - self, resource_name - ) - - def resource_string(self, package_or_requirement, resource_name): - """Return specified resource as a string""" - return get_provider(package_or_requirement).get_resource_string( - self, resource_name - ) - - def resource_listdir(self, package_or_requirement, resource_name): - """List the contents of the named resource directory""" - return get_provider(package_or_requirement).resource_listdir( - resource_name - ) - - def extraction_error(self): - """Give an error message for problems extracting file(s)""" - - old_exc = sys.exc_info()[1] - cache_path = self.extraction_path or get_default_cache() - - err = ExtractionError("""Can't extract file(s) to egg cache - -The following error occurred while trying to extract file(s) to the Python egg -cache: - - %s - -The Python egg cache directory is currently set to: - - %s - -Perhaps your account does not have write access to this directory? You can -change the cache directory by setting the PYTHON_EGG_CACHE environment -variable to point to an accessible directory. 
-""" % (old_exc, cache_path) - ) - err.manager = self - err.cache_path = cache_path - err.original_error = old_exc - raise err - - - - - - - - - - - - - - - - def get_cache_path(self, archive_name, names=()): - """Return absolute location in cache for `archive_name` and `names` - - The parent directory of the resulting path will be created if it does - not already exist. `archive_name` should be the base filename of the - enclosing egg (which may not be the name of the enclosing zipfile!), - including its ".egg" extension. `names`, if provided, should be a - sequence of path name parts "under" the egg's extraction location. - - This method should only be called by resource providers that need to - obtain an extraction location, and only for names they intend to - extract, as it tracks the generated names for possible cleanup later. - """ - extract_path = self.extraction_path or get_default_cache() - target_path = os.path.join(extract_path, archive_name+'-tmp', *names) - try: - _bypass_ensure_directory(target_path) - except: - self.extraction_error() - - self.cached_files[target_path] = 1 - return target_path - - - - - - - - - - - - - - - - - - - - def postprocess(self, tempname, filename): - """Perform any platform-specific postprocessing of `tempname` - - This is where Mac header rewrites should be done; other platforms don't - have anything special they should do. - - Resource providers should call this method ONLY after successfully - extracting a compressed resource. They must NOT call it on resources - that are already in the filesystem. - - `tempname` is the current (temporary) name of the file, and `filename` - is the name it will be renamed to by the caller after this routine - returns. - """ - - if os.name == 'posix': - # Make the resource executable - mode = ((os.stat(tempname).st_mode) | 0555) & 07777 - os.chmod(tempname, mode) - - - - - - - - - - - - - - - - - - - - - - - def set_extraction_path(self, path): - """Set the base path where resources will be extracted to, if needed. - - If you do not call this routine before any extractions take place, the - path defaults to the return value of ``get_default_cache()``. (Which - is based on the ``PYTHON_EGG_CACHE`` environment variable, with various - platform-specific fallbacks. See that routine's documentation for more - details.) - - Resources are extracted to subdirectories of this path based upon - information given by the ``IResourceProvider``. You may set this to a - temporary directory, but then you must call ``cleanup_resources()`` to - delete the extracted files when done. There is no guarantee that - ``cleanup_resources()`` will be able to remove all extracted files. - - (Note: you may not change the extraction path for a given resource - manager once resources have been extracted, unless you first call - ``cleanup_resources()``.) - """ - if self.cached_files: - raise ValueError( - "Can't change extraction path, files already extracted" - ) - - self.extraction_path = path - - def cleanup_resources(self, force=False): - """ - Delete all extracted resource files and directories, returning a list - of the file and directory names that could not be successfully removed. - This function does not have any concurrency protection, so it should - generally only be called when the extraction path is a temporary - directory exclusive to a single process. 
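As the docstrings above suggest, a temporary extraction directory should be paired with an explicit ``cleanup_resources()`` call, for example registered with ``atexit``; with the module-level API that is roughly::

    import atexit
    import tempfile
    import pkg_resources

    # Extract resources under a throwaway directory instead of the default
    # PYTHON_EGG_CACHE location, and try to remove them at interpreter exit.
    tmpdir = tempfile.mkdtemp()
    pkg_resources.set_extraction_path(tmpdir)
    atexit.register(pkg_resources.cleanup_resources)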
This method is not - automatically called; you must call it explicitly or register it as an - ``atexit`` function if you wish to ensure cleanup of a temporary - directory used for extractions. - """ - # XXX - - - -def get_default_cache(): - """Determine the default cache location - - This returns the ``PYTHON_EGG_CACHE`` environment variable, if set. - Otherwise, on Windows, it returns a "Python-Eggs" subdirectory of the - "Application Data" directory. On all other systems, it's "~/.python-eggs". - """ - try: - return os.environ['PYTHON_EGG_CACHE'] - except KeyError: - pass - - if os.name!='nt': - return os.path.expanduser('~/.python-eggs') - - app_data = 'Application Data' # XXX this may be locale-specific! - app_homes = [ - (('APPDATA',), None), # best option, should be locale-safe - (('USERPROFILE',), app_data), - (('HOMEDRIVE','HOMEPATH'), app_data), - (('HOMEPATH',), app_data), - (('HOME',), None), - (('WINDIR',), app_data), # 95/98/ME - ] - - for keys, subdir in app_homes: - dirname = '' - for key in keys: - if key in os.environ: - dirname = os.path.join(dirname, os.environ[key]) - else: - break - else: - if subdir: - dirname = os.path.join(dirname,subdir) - return os.path.join(dirname, 'Python-Eggs') - else: - raise RuntimeError( - "Please set the PYTHON_EGG_CACHE enviroment variable" - ) - -def safe_name(name): - """Convert an arbitrary string to a standard distribution name - - Any runs of non-alphanumeric/. characters are replaced with a single '-'. - """ - return re.sub('[^A-Za-z0-9.]+', '-', name) - - -def safe_version(version): - """Convert an arbitrary string to a standard version string - - Spaces become dots, and all other non-alphanumeric characters become - dashes, with runs of multiple dashes condensed to a single dash. - """ - version = version.replace(' ','.') - return re.sub('[^A-Za-z0-9.]+', '-', version) - - -def safe_extra(extra): - """Convert an arbitrary string to a standard 'extra' name - - Any runs of non-alphanumeric characters are replaced with a single '_', - and the result is always lowercased. - """ - return re.sub('[^A-Za-z0-9.]+', '_', extra).lower() - - -def to_filename(name): - """Convert a project or version name to its filename-escaped form - - Any '-' characters are currently replaced with '_'. 
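Together these helpers normalize arbitrary project and version strings before they are used as requirement keys and in egg filenames; for example, given the implementations above::

    from pkg_resources import safe_name, safe_version, to_filename

    assert safe_name("my bad/project!!name") == "my-bad-project-name"
    assert safe_version("1.0 beta 3") == "1.0.beta.3"
    assert to_filename("zope.interface-plugin") == "zope.interface_plugin"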
- """ - return name.replace('-','_') - - - - - - - - -class NullProvider: - """Try to implement resources and metadata for arbitrary PEP 302 loaders""" - - egg_name = None - egg_info = None - loader = None - - def __init__(self, module): - self.loader = getattr(module, '__loader__', None) - self.module_path = os.path.dirname(getattr(module, '__file__', '')) - - def get_resource_filename(self, manager, resource_name): - return self._fn(self.module_path, resource_name) - - def get_resource_stream(self, manager, resource_name): - return StringIO(self.get_resource_string(manager, resource_name)) - - def get_resource_string(self, manager, resource_name): - return self._get(self._fn(self.module_path, resource_name)) - - def has_resource(self, resource_name): - return self._has(self._fn(self.module_path, resource_name)) - - def has_metadata(self, name): - return self.egg_info and self._has(self._fn(self.egg_info,name)) - - def get_metadata(self, name): - if not self.egg_info: - return "" - return self._get(self._fn(self.egg_info,name)) - - def get_metadata_lines(self, name): - return yield_lines(self.get_metadata(name)) - - def resource_isdir(self,resource_name): - return self._isdir(self._fn(self.module_path, resource_name)) - - def metadata_isdir(self,name): - return self.egg_info and self._isdir(self._fn(self.egg_info,name)) - - - def resource_listdir(self,resource_name): - return self._listdir(self._fn(self.module_path,resource_name)) - - def metadata_listdir(self,name): - if self.egg_info: - return self._listdir(self._fn(self.egg_info,name)) - return [] - - def run_script(self,script_name,namespace): - script = 'scripts/'+script_name - if not self.has_metadata(script): - raise ResolutionError("No script named %r" % script_name) - script_text = self.get_metadata(script).replace('\r\n','\n') - script_text = script_text.replace('\r','\n') - script_filename = self._fn(self.egg_info,script) - namespace['__file__'] = script_filename - if os.path.exists(script_filename): - execfile(script_filename, namespace, namespace) - else: - from linecache import cache - cache[script_filename] = ( - len(script_text), 0, script_text.split('\n'), script_filename - ) - script_code = compile(script_text,script_filename,'exec') - exec script_code in namespace, namespace - - def _has(self, path): - raise NotImplementedError( - "Can't perform this operation for unregistered loader type" - ) - - def _isdir(self, path): - raise NotImplementedError( - "Can't perform this operation for unregistered loader type" - ) - - def _listdir(self, path): - raise NotImplementedError( - "Can't perform this operation for unregistered loader type" - ) - - def _fn(self, base, resource_name): - if resource_name: - return os.path.join(base, *resource_name.split('/')) - return base - - def _get(self, path): - if hasattr(self.loader, 'get_data'): - return self.loader.get_data(path) - raise NotImplementedError( - "Can't perform this operation for loaders without 'get_data()'" - ) - -register_loader_type(object, NullProvider) - - -class EggProvider(NullProvider): - """Provider based on a virtual filesystem""" - - def __init__(self,module): - NullProvider.__init__(self,module) - self._setup_prefix() - - def _setup_prefix(self): - # we assume here that our metadata may be nested inside a "basket" - # of multiple eggs; that's why we use module_path instead of .archive - path = self.module_path - old = None - while path!=old: - if path.lower().endswith('.egg'): - self.egg_name = os.path.basename(path) - self.egg_info = os.path.join(path, 
'EGG-INFO') - self.egg_root = path - break - old = path - path, base = os.path.split(path) - - - - - - -class DefaultProvider(EggProvider): - """Provides access to package resources in the filesystem""" - - def _has(self, path): - return os.path.exists(path) - - def _isdir(self,path): - return os.path.isdir(path) - - def _listdir(self,path): - return os.listdir(path) - - def get_resource_stream(self, manager, resource_name): - return open(self._fn(self.module_path, resource_name), 'rb') - - def _get(self, path): - stream = open(path, 'rb') - try: - return stream.read() - finally: - stream.close() - -register_loader_type(type(None), DefaultProvider) - - -class EmptyProvider(NullProvider): - """Provider that returns nothing for all requests""" - - _isdir = _has = lambda self,path: False - _get = lambda self,path: '' - _listdir = lambda self,path: [] - module_path = None - - def __init__(self): - pass - -empty_provider = EmptyProvider() - - - - -class ZipProvider(EggProvider): - """Resource support for zips and eggs""" - - eagers = None - - def __init__(self, module): - EggProvider.__init__(self,module) - self.zipinfo = zipimport._zip_directory_cache[self.loader.archive] - self.zip_pre = self.loader.archive+os.sep - - def _zipinfo_name(self, fspath): - # Convert a virtual filename (full path to file) into a zipfile subpath - # usable with the zipimport directory cache for our target archive - if fspath.startswith(self.zip_pre): - return fspath[len(self.zip_pre):] - raise AssertionError( - "%s is not a subpath of %s" % (fspath,self.zip_pre) - ) - - def _parts(self,zip_path): - # Convert a zipfile subpath into an egg-relative path part list - fspath = self.zip_pre+zip_path # pseudo-fs path - if fspath.startswith(self.egg_root+os.sep): - return fspath[len(self.egg_root)+1:].split(os.sep) - raise AssertionError( - "%s is not a subpath of %s" % (fspath,self.egg_root) - ) - - def get_resource_filename(self, manager, resource_name): - if not self.egg_name: - raise NotImplementedError( - "resource_filename() only supported for .egg, not .zip" - ) - # no need to lock for extraction, since we use temp names - zip_path = self._resource_to_zip(resource_name) - eagers = self._get_eager_resources() - if '/'.join(self._parts(zip_path)) in eagers: - for name in eagers: - self._extract_resource(manager, self._eager_to_zip(name)) - return self._extract_resource(manager, zip_path) - - def _extract_resource(self, manager, zip_path): - - if zip_path in self._index(): - for name in self._index()[zip_path]: - last = self._extract_resource( - manager, os.path.join(zip_path, name) - ) - return os.path.dirname(last) # return the extracted directory name - - zip_stat = self.zipinfo[zip_path] - t,d,size = zip_stat[5], zip_stat[6], zip_stat[3] - date_time = ( - (d>>9)+1980, (d>>5)&0xF, d&0x1F, # ymd - (t&0xFFFF)>>11, (t>>5)&0x3F, (t&0x1F) * 2, 0, 0, -1 # hms, etc. 
- ) - timestamp = time.mktime(date_time) - - try: - real_path = manager.get_cache_path( - self.egg_name, self._parts(zip_path) - ) - - if os.path.isfile(real_path): - stat = os.stat(real_path) - if stat.st_size==size and stat.st_mtime==timestamp: - # size and stamp match, don't bother extracting - return real_path - - outf, tmpnam = _mkstemp(".$extract", dir=os.path.dirname(real_path)) - os.write(outf, self.loader.get_data(zip_path)) - os.close(outf) - utime(tmpnam, (timestamp,timestamp)) - manager.postprocess(tmpnam, real_path) - - try: - rename(tmpnam, real_path) - - except os.error: - if os.path.isfile(real_path): - stat = os.stat(real_path) - - if stat.st_size==size and stat.st_mtime==timestamp: - # size and stamp match, somebody did it just ahead of - # us, so we're done - return real_path - elif os.name=='nt': # Windows, del old file and retry - unlink(real_path) - rename(tmpnam, real_path) - return real_path - raise - - except os.error: - manager.extraction_error() # report a user-friendly error - - return real_path - - def _get_eager_resources(self): - if self.eagers is None: - eagers = [] - for name in ('native_libs.txt', 'eager_resources.txt'): - if self.has_metadata(name): - eagers.extend(self.get_metadata_lines(name)) - self.eagers = eagers - return self.eagers - - def _index(self): - try: - return self._dirindex - except AttributeError: - ind = {} - for path in self.zipinfo: - parts = path.split(os.sep) - while parts: - parent = os.sep.join(parts[:-1]) - if parent in ind: - ind[parent].append(parts[-1]) - break - else: - ind[parent] = [parts.pop()] - self._dirindex = ind - return ind - - def _has(self, fspath): - zip_path = self._zipinfo_name(fspath) - return zip_path in self.zipinfo or zip_path in self._index() - - def _isdir(self,fspath): - return self._zipinfo_name(fspath) in self._index() - - def _listdir(self,fspath): - return list(self._index().get(self._zipinfo_name(fspath), ())) - - def _eager_to_zip(self,resource_name): - return self._zipinfo_name(self._fn(self.egg_root,resource_name)) - - def _resource_to_zip(self,resource_name): - return self._zipinfo_name(self._fn(self.module_path,resource_name)) - -register_loader_type(zipimport.zipimporter, ZipProvider) - - - - - - - - - - - - - - - - - - - - - - - - -class FileMetadata(EmptyProvider): - """Metadata handler for standalone PKG-INFO files - - Usage:: - - metadata = FileMetadata("/path/to/PKG-INFO") - - This provider rejects all data and metadata requests except for PKG-INFO, - which is treated as existing, and will be the contents of the file at - the provided location. 
- """ - - def __init__(self,path): - self.path = path - - def has_metadata(self,name): - return name=='PKG-INFO' - - def get_metadata(self,name): - if name=='PKG-INFO': - return open(self.path,'rU').read() - raise KeyError("No metadata except PKG-INFO is available") - - def get_metadata_lines(self,name): - return yield_lines(self.get_metadata(name)) - - - - - - - - - - - - - - - - -class PathMetadata(DefaultProvider): - """Metadata provider for egg directories - - Usage:: - - # Development eggs: - - egg_info = "/path/to/PackageName.egg-info" - base_dir = os.path.dirname(egg_info) - metadata = PathMetadata(base_dir, egg_info) - dist_name = os.path.splitext(os.path.basename(egg_info))[0] - dist = Distribution(basedir,project_name=dist_name,metadata=metadata) - - # Unpacked egg directories: - - egg_path = "/path/to/PackageName-ver-pyver-etc.egg" - metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO')) - dist = Distribution.from_filename(egg_path, metadata=metadata) - """ - - def __init__(self, path, egg_info): - self.module_path = path - self.egg_info = egg_info - - -class EggMetadata(ZipProvider): - """Metadata provider for .egg files""" - - def __init__(self, importer): - """Create a metadata provider from a zipimporter""" - - self.zipinfo = zipimport._zip_directory_cache[importer.archive] - self.zip_pre = importer.archive+os.sep - self.loader = importer - if importer.prefix: - self.module_path = os.path.join(importer.archive, importer.prefix) - else: - self.module_path = importer.archive - self._setup_prefix() - - -class ImpWrapper: - """PEP 302 Importer that wraps Python's "normal" import algorithm""" - - def __init__(self, path=None): - self.path = path - - def find_module(self, fullname, path=None): - subname = fullname.split(".")[-1] - if subname != fullname and self.path is None: - return None - if self.path is None: - path = None - else: - path = [self.path] - try: - file, filename, etc = imp.find_module(subname, path) - except ImportError: - return None - return ImpLoader(file, filename, etc) - - -class ImpLoader: - """PEP 302 Loader that wraps Python's "normal" import algorithm""" - - def __init__(self, file, filename, etc): - self.file = file - self.filename = filename - self.etc = etc - - def load_module(self, fullname): - try: - mod = imp.load_module(fullname, self.file, self.filename, self.etc) - finally: - if self.file: self.file.close() - # Note: we don't set __loader__ because we want the module to look - # normal; i.e. this is just a wrapper for standard import machinery - return mod - - - - -def get_importer(path_item): - """Retrieve a PEP 302 "importer" for the given path item - - If there is no importer, this returns a wrapper around the builtin import - machinery. The returned importer is only cached if it was created by a - path hook. 
- """ - try: - importer = sys.path_importer_cache[path_item] - except KeyError: - for hook in sys.path_hooks: - try: - importer = hook(path_item) - except ImportError: - pass - else: - break - else: - importer = None - - sys.path_importer_cache.setdefault(path_item,importer) - if importer is None: - try: - importer = ImpWrapper(path_item) - except ImportError: - pass - return importer - -try: - from pkgutil import get_importer, ImpImporter -except ImportError: - pass # Python 2.3 or 2.4, use our own implementation -else: - ImpWrapper = ImpImporter # Python 2.5, use pkgutil's implementation - del ImpLoader, ImpImporter - - - - - - -_declare_state('dict', _distribution_finders = {}) - -def register_finder(importer_type, distribution_finder): - """Register `distribution_finder` to find distributions in sys.path items - - `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item - handler), and `distribution_finder` is a callable that, passed a path - item and the importer instance, yields ``Distribution`` instances found on - that path item. See ``pkg_resources.find_on_path`` for an example.""" - _distribution_finders[importer_type] = distribution_finder - - -def find_distributions(path_item, only=False): - """Yield distributions accessible via `path_item`""" - importer = get_importer(path_item) - finder = _find_adapter(_distribution_finders, importer) - return finder(importer, path_item, only) - -def find_in_zip(importer, path_item, only=False): - metadata = EggMetadata(importer) - if metadata.has_metadata('PKG-INFO'): - yield Distribution.from_filename(path_item, metadata=metadata) - if only: - return # don't yield nested distros - for subitem in metadata.resource_listdir('/'): - if subitem.endswith('.egg'): - subpath = os.path.join(path_item, subitem) - for dist in find_in_zip(zipimport.zipimporter(subpath), subpath): - yield dist - -register_finder(zipimport.zipimporter, find_in_zip) - -def StringIO(*args, **kw): - """Thunk to load the real StringIO on demand""" - global StringIO - try: - from cStringIO import StringIO - except ImportError: - from StringIO import StringIO - return StringIO(*args,**kw) - -def find_nothing(importer, path_item, only=False): - return () -register_finder(object,find_nothing) - -def find_on_path(importer, path_item, only=False): - """Yield distributions accessible on a sys.path directory""" - path_item = _normalize_cached(path_item) - - if os.path.isdir(path_item) and os.access(path_item, os.R_OK): - if path_item.lower().endswith('.egg'): - # unpacked egg - yield Distribution.from_filename( - path_item, metadata=PathMetadata( - path_item, os.path.join(path_item,'EGG-INFO') - ) - ) - else: - # scan for .egg and .egg-info in directory - for entry in os.listdir(path_item): - lower = entry.lower() - if lower.endswith('.egg-info'): - fullpath = os.path.join(path_item, entry) - if os.path.isdir(fullpath): - # egg-info directory, allow getting metadata - metadata = PathMetadata(path_item, fullpath) - else: - metadata = FileMetadata(fullpath) - yield Distribution.from_location( - path_item,entry,metadata,precedence=DEVELOP_DIST - ) - elif not only and lower.endswith('.egg'): - for dist in find_distributions(os.path.join(path_item, entry)): - yield dist - elif not only and lower.endswith('.egg-link'): - for line in file(os.path.join(path_item, entry)): - if not line.strip(): continue - for item in find_distributions(os.path.join(path_item,line.rstrip())): - yield item - break -register_finder(ImpWrapper,find_on_path) - -_declare_state('dict', 
_namespace_handlers = {}) -_declare_state('dict', _namespace_packages = {}) - -def register_namespace_handler(importer_type, namespace_handler): - """Register `namespace_handler` to declare namespace packages - - `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item - handler), and `namespace_handler` is a callable like this:: - - def namespace_handler(importer,path_entry,moduleName,module): - # return a path_entry to use for child packages - - Namespace handlers are only called if the importer object has already - agreed that it can handle the relevant path item, and they should only - return a subpath if the module __path__ does not already contain an - equivalent subpath. For an example namespace handler, see - ``pkg_resources.file_ns_handler``. - """ - _namespace_handlers[importer_type] = namespace_handler - -def _handle_ns(packageName, path_item): - """Ensure that named package includes a subpath of path_item (if needed)""" - importer = get_importer(path_item) - if importer is None: - return None - loader = importer.find_module(packageName) - if loader is None: - return None - module = sys.modules.get(packageName) - if module is None: - module = sys.modules[packageName] = imp.new_module(packageName) - module.__path__ = []; _set_parent_ns(packageName) - elif not hasattr(module,'__path__'): - raise TypeError("Not a package:", packageName) - handler = _find_adapter(_namespace_handlers, importer) - subpath = handler(importer,path_item,packageName,module) - if subpath is not None: - path = module.__path__; path.append(subpath) - loader.load_module(packageName); module.__path__ = path - return subpath - -def declare_namespace(packageName): - """Declare that package 'packageName' is a namespace package""" - - imp.acquire_lock() - try: - if packageName in _namespace_packages: - return - - path, parent = sys.path, None - if '.' 
in packageName: - parent = '.'.join(packageName.split('.')[:-1]) - declare_namespace(parent) - __import__(parent) - try: - path = sys.modules[parent].__path__ - except AttributeError: - raise TypeError("Not a package:", parent) - - # Track what packages are namespaces, so when new path items are added, - # they can be updated - _namespace_packages.setdefault(parent,[]).append(packageName) - _namespace_packages.setdefault(packageName,[]) - - for path_item in path: - # Ensure all the parent's path items are reflected in the child, - # if they apply - _handle_ns(packageName, path_item) - - finally: - imp.release_lock() - -def fixup_namespace_packages(path_item, parent=None): - """Ensure that previously-declared namespace packages include path_item""" - imp.acquire_lock() - try: - for package in _namespace_packages.get(parent,()): - subpath = _handle_ns(package, path_item) - if subpath: fixup_namespace_packages(subpath,package) - finally: - imp.release_lock() - -def file_ns_handler(importer, path_item, packageName, module): - """Compute an ns-package subpath for a filesystem or zipfile importer""" - - subpath = os.path.join(path_item, packageName.split('.')[-1]) - normalized = _normalize_cached(subpath) - for item in module.__path__: - if _normalize_cached(item)==normalized: - break - else: - # Only return the path if it's not already there - return subpath - -register_namespace_handler(ImpWrapper,file_ns_handler) -register_namespace_handler(zipimport.zipimporter,file_ns_handler) - - -def null_ns_handler(importer, path_item, packageName, module): - return None - -register_namespace_handler(object,null_ns_handler) - - -def normalize_path(filename): - """Normalize a file/dir name for comparison purposes""" - return os.path.normcase(os.path.realpath(filename)) - -def _normalize_cached(filename,_cache={}): - try: - return _cache[filename] - except KeyError: - _cache[filename] = result = normalize_path(filename) - return result - -def _set_parent_ns(packageName): - parts = packageName.split('.') - name = parts.pop() - if parts: - parent = '.'.join(parts) - setattr(sys.modules[parent], name, sys.modules[packageName]) - - -def yield_lines(strs): - """Yield non-empty/non-comment lines of a ``basestring`` or sequence""" - if isinstance(strs,basestring): - for s in strs.splitlines(): - s = s.strip() - if s and not s.startswith('#'): # skip blank lines/comments - yield s - else: - for ss in strs: - for s in yield_lines(ss): - yield s - -LINE_END = re.compile(r"\s*(#.*)?$").match # whitespace and comment -CONTINUE = re.compile(r"\s*\\\s*(#.*)?$").match # line continuation -DISTRO = re.compile(r"\s*((\w|[-.])+)").match # Distribution or extra -VERSION = re.compile(r"\s*(<=?|>=?|==|!=)\s*((\w|[-.])+)").match # ver. info -COMMA = re.compile(r"\s*,").match # comma between items -OBRACKET = re.compile(r"\s*\[").match -CBRACKET = re.compile(r"\s*\]").match -MODULE = re.compile(r"\w+(\.\w+)*$").match -EGG_NAME = re.compile( - r"(?P[^-]+)" - r"( -(?P[^-]+) (-py(?P[^-]+) (-(?P.+))? )? 
)?", - re.VERBOSE | re.IGNORECASE -).match - -component_re = re.compile(r'(\d+ | [a-z]+ | \.| -)', re.VERBOSE) -replace = {'pre':'c', 'preview':'c','-':'final-','rc':'c','dev':'@'}.get - -def _parse_version_parts(s): - for part in component_re.split(s): - part = replace(part,part) - if not part or part=='.': - continue - if part[:1] in '0123456789': - yield part.zfill(8) # pad for numeric comparison - else: - yield '*'+part - - yield '*final' # ensure that alpha/beta/candidate are before final - -def parse_version(s): - """Convert a version string to a chronologically-sortable key - - This is a rough cross between distutils' StrictVersion and LooseVersion; - if you give it versions that would work with StrictVersion, then it behaves - the same; otherwise it acts like a slightly-smarter LooseVersion. It is - *possible* to create pathological version coding schemes that will fool - this parser, but they should be very rare in practice. - - The returned value will be a tuple of strings. Numeric portions of the - version are padded to 8 digits so they will compare numerically, but - without relying on how numbers compare relative to strings. Dots are - dropped, but dashes are retained. Trailing zeros between alpha segments - or dashes are suppressed, so that e.g. "2.4.0" is considered the same as - "2.4". Alphanumeric parts are lower-cased. - - The algorithm assumes that strings like "-" and any alpha string that - alphabetically follows "final" represents a "patch level". So, "2.4-1" - is assumed to be a branch or patch of "2.4", and therefore "2.4.1" is - considered newer than "2.4-1", which in turn is newer than "2.4". - - Strings like "a", "b", "c", "alpha", "beta", "candidate" and so on (that - come before "final" alphabetically) are assumed to be pre-release versions, - so that the version "2.4" is considered newer than "2.4a1". - - Finally, to handle miscellaneous cases, the strings "pre", "preview", and - "rc" are treated as if they were "c", i.e. as though they were release - candidates, and therefore are not as new as a version string that does not - contain them, and "dev" is replaced with an '@' so that it sorts lower than - than any other pre-release tag. 
- """ - parts = [] - for part in _parse_version_parts(s.lower()): - if part.startswith('*'): - if part<'*final': # remove '-' before a prerelease tag - while parts and parts[-1]=='*final-': parts.pop() - # remove trailing zeros from each series of numeric parts - while parts and parts[-1]=='00000000': - parts.pop() - parts.append(part) - return tuple(parts) - -class EntryPoint(object): - """Object representing an advertised importable object""" - - def __init__(self, name, module_name, attrs=(), extras=(), dist=None): - if not MODULE(module_name): - raise ValueError("Invalid module name", module_name) - self.name = name - self.module_name = module_name - self.attrs = tuple(attrs) - self.extras = Requirement.parse(("x[%s]" % ','.join(extras))).extras - self.dist = dist - - def __str__(self): - s = "%s = %s" % (self.name, self.module_name) - if self.attrs: - s += ':' + '.'.join(self.attrs) - if self.extras: - s += ' [%s]' % ','.join(self.extras) - return s - - def __repr__(self): - return "EntryPoint.parse(%r)" % str(self) - - def load(self, require=True, env=None, installer=None): - if require: self.require(env, installer) - entry = __import__(self.module_name, globals(),globals(), ['__name__']) - for attr in self.attrs: - try: - entry = getattr(entry,attr) - except AttributeError: - raise ImportError("%r has no %r attribute" % (entry,attr)) - return entry - - def require(self, env=None, installer=None): - if self.extras and not self.dist: - raise UnknownExtra("Can't require() without a distribution", self) - map(working_set.add, - working_set.resolve(self.dist.requires(self.extras),env,installer)) - - - - #@classmethod - def parse(cls, src, dist=None): - """Parse a single entry point from string `src` - - Entry point syntax follows the form:: - - name = some.module:some.attr [extra1,extra2] - - The entry name and module name are required, but the ``:attrs`` and - ``[extras]`` parts are optional - """ - try: - attrs = extras = () - name,value = src.split('=',1) - if '[' in value: - value,extras = value.split('[',1) - req = Requirement.parse("x["+extras) - if req.specs: raise ValueError - extras = req.extras - if ':' in value: - value,attrs = value.split(':',1) - if not MODULE(attrs.rstrip()): - raise ValueError - attrs = attrs.rstrip().split('.') - except ValueError: - raise ValueError( - "EntryPoint must be in 'name=module:attrs [extras]' format", - src - ) - else: - return cls(name.strip(), value.strip(), attrs, extras, dist) - - parse = classmethod(parse) - - - - - - - - - #@classmethod - def parse_group(cls, group, lines, dist=None): - """Parse an entry point group""" - if not MODULE(group): - raise ValueError("Invalid group name", group) - this = {} - for line in yield_lines(lines): - ep = cls.parse(line, dist) - if ep.name in this: - raise ValueError("Duplicate entry point", group, ep.name) - this[ep.name]=ep - return this - - parse_group = classmethod(parse_group) - - #@classmethod - def parse_map(cls, data, dist=None): - """Parse a map of entry point groups""" - if isinstance(data,dict): - data = data.items() - else: - data = split_sections(data) - maps = {} - for group, lines in data: - if group is None: - if not lines: - continue - raise ValueError("Entry points must be listed in groups") - group = group.strip() - if group in maps: - raise ValueError("Duplicate group name", group) - maps[group] = cls.parse_group(group, lines, dist) - return maps - - parse_map = classmethod(parse_map) - - - - - - -class Distribution(object): - """Wrap an actual or potential sys.path entry 
w/metadata""" - def __init__(self, - location=None, metadata=None, project_name=None, version=None, - py_version=PY_MAJOR, platform=None, precedence = EGG_DIST - ): - self.project_name = safe_name(project_name or 'Unknown') - if version is not None: - self._version = safe_version(version) - self.py_version = py_version - self.platform = platform - self.location = location - self.precedence = precedence - self._provider = metadata or empty_provider - - #@classmethod - def from_location(cls,location,basename,metadata=None,**kw): - project_name, version, py_version, platform = [None]*4 - basename, ext = os.path.splitext(basename) - if ext.lower() in (".egg",".egg-info"): - match = EGG_NAME(basename) - if match: - project_name, version, py_version, platform = match.group( - 'name','ver','pyver','plat' - ) - return cls( - location, metadata, project_name=project_name, version=version, - py_version=py_version, platform=platform, **kw - ) - from_location = classmethod(from_location) - - hashcmp = property( - lambda self: ( - getattr(self,'parsed_version',()), self.precedence, self.key, - -len(self.location or ''), self.location, self.py_version, - self.platform - ) - ) - def __cmp__(self, other): return cmp(self.hashcmp, other) - def __hash__(self): return hash(self.hashcmp) - - # These properties have to be lazy so that we don't have to load any - # metadata until/unless it's actually needed. (i.e., some distributions - # may not know their name or version without loading PKG-INFO) - - #@property - def key(self): - try: - return self._key - except AttributeError: - self._key = key = self.project_name.lower() - return key - key = property(key) - - #@property - def parsed_version(self): - try: - return self._parsed_version - except AttributeError: - self._parsed_version = pv = parse_version(self.version) - return pv - - parsed_version = property(parsed_version) - - #@property - def version(self): - try: - return self._version - except AttributeError: - for line in self._get_metadata('PKG-INFO'): - if line.lower().startswith('version:'): - self._version = safe_version(line.split(':',1)[1].strip()) - return self._version - else: - raise ValueError( - "Missing 'Version:' header and/or PKG-INFO file", self - ) - version = property(version) - - - - - #@property - def _dep_map(self): - try: - return self.__dep_map - except AttributeError: - dm = self.__dep_map = {None: []} - for name in 'requires.txt', 'depends.txt': - for extra,reqs in split_sections(self._get_metadata(name)): - if extra: extra = safe_extra(extra) - dm.setdefault(extra,[]).extend(parse_requirements(reqs)) - return dm - _dep_map = property(_dep_map) - - def requires(self,extras=()): - """List of Requirements needed for this distro if `extras` are used""" - dm = self._dep_map - deps = [] - deps.extend(dm.get(None,())) - for ext in extras: - try: - deps.extend(dm[safe_extra(ext)]) - except KeyError: - raise UnknownExtra( - "%s has no such extra feature %r" % (self, ext) - ) - return deps - - def _get_metadata(self,name): - if self.has_metadata(name): - for line in self.get_metadata_lines(name): - yield line - - def activate(self,path=None): - """Ensure distribution is importable on `path` (default=sys.path)""" - if path is None: path = sys.path - self.insert_on(path) - if path is sys.path: - fixup_namespace_packages(self.location) - map(declare_namespace, self._get_metadata('namespace_packages.txt')) - - - def egg_name(self): - """Return what this distribution's standard .egg filename should be""" - filename = "%s-%s-py%s" % ( - 
to_filename(self.project_name), to_filename(self.version), - self.py_version or PY_MAJOR - ) - - if self.platform: - filename += '-'+self.platform - return filename - - def __repr__(self): - if self.location: - return "%s (%s)" % (self,self.location) - else: - return str(self) - - def __str__(self): - try: version = getattr(self,'version',None) - except ValueError: version = None - version = version or "[unknown version]" - return "%s %s" % (self.project_name,version) - - def __getattr__(self,attr): - """Delegate all unrecognized public attributes to .metadata provider""" - if attr.startswith('_'): - raise AttributeError,attr - return getattr(self._provider, attr) - - #@classmethod - def from_filename(cls,filename,metadata=None, **kw): - return cls.from_location( - _normalize_cached(filename), os.path.basename(filename), metadata, - **kw - ) - from_filename = classmethod(from_filename) - - def as_requirement(self): - """Return a ``Requirement`` that matches this distribution exactly""" - return Requirement.parse('%s==%s' % (self.project_name, self.version)) - - def load_entry_point(self, group, name): - """Return the `name` entry point of `group` or raise ImportError""" - ep = self.get_entry_info(group,name) - if ep is None: - raise ImportError("Entry point %r not found" % ((group,name),)) - return ep.load() - - def get_entry_map(self, group=None): - """Return the entry point map for `group`, or the full entry map""" - try: - ep_map = self._ep_map - except AttributeError: - ep_map = self._ep_map = EntryPoint.parse_map( - self._get_metadata('entry_points.txt'), self - ) - if group is not None: - return ep_map.get(group,{}) - return ep_map - - def get_entry_info(self, group, name): - """Return the EntryPoint object for `group`+`name`, or ``None``""" - return self.get_entry_map(group).get(name) - - - - - - - - - - - - - - - - - - - - def insert_on(self, path, loc = None): - """Insert self.location in path before its nearest parent directory""" - - loc = loc or self.location - if not loc: - return - - nloc = _normalize_cached(loc) - bdir = os.path.dirname(nloc) - npath= [(p and _normalize_cached(p) or p) for p in path] - - bp = None - for p, item in enumerate(npath): - if item==nloc: - break - elif item==bdir and self.precedence==EGG_DIST: - # if it's an .egg, give it precedence over its directory - if path is sys.path: - self.check_version_conflict() - path.insert(p, loc) - npath.insert(p, nloc) - break - else: - if path is sys.path: - self.check_version_conflict() - path.append(loc) - return - - # p is the spot where we found or inserted loc; now remove duplicates - while 1: - try: - np = npath.index(nloc, p+1) - except ValueError: - break - else: - del npath[np], path[np] - p = np # ha! 
- - return - - - def check_version_conflict(self): - if self.key=='setuptools': - return # ignore the inevitable setuptools self-conflicts :( - - nsp = dict.fromkeys(self._get_metadata('namespace_packages.txt')) - loc = normalize_path(self.location) - for modname in self._get_metadata('top_level.txt'): - if (modname not in sys.modules or modname in nsp - or modname in _namespace_packages - ): - continue - - fn = getattr(sys.modules[modname], '__file__', None) - if fn and (normalize_path(fn).startswith(loc) or fn.startswith(loc)): - continue - issue_warning( - "Module %s was already imported from %s, but %s is being added" - " to sys.path" % (modname, fn, self.location), - ) - - def has_version(self): - try: - self.version - except ValueError: - issue_warning("Unbuilt egg for "+repr(self)) - return False - return True - - def clone(self,**kw): - """Copy this distribution, substituting in any changed keyword args""" - for attr in ( - 'project_name', 'version', 'py_version', 'platform', 'location', - 'precedence' - ): - kw.setdefault(attr, getattr(self,attr,None)) - kw.setdefault('metadata', self._provider) - return self.__class__(**kw) - - - - - #@property - def extras(self): - return [dep for dep in self._dep_map if dep] - extras = property(extras) - - -def issue_warning(*args,**kw): - level = 1 - g = globals() - try: - # find the first stack frame that is *not* code in - # the pkg_resources module, to use for the warning - while sys._getframe(level).f_globals is g: - level += 1 - except ValueError: - pass - from warnings import warn - warn(stacklevel = level+1, *args, **kw) - - - - - - - - - - - - - - - - - - - - - - - -def parse_requirements(strs): - """Yield ``Requirement`` objects for each specification in `strs` - - `strs` must be an instance of ``basestring``, or a (possibly-nested) - iterable thereof. 
- """ - # create a steppable iterator, so we can handle \-continuations - lines = iter(yield_lines(strs)) - - def scan_list(ITEM,TERMINATOR,line,p,groups,item_name): - - items = [] - - while not TERMINATOR(line,p): - if CONTINUE(line,p): - try: - line = lines.next(); p = 0 - except StopIteration: - raise ValueError( - "\\ must not appear on the last nonblank line" - ) - - match = ITEM(line,p) - if not match: - raise ValueError("Expected "+item_name+" in",line,"at",line[p:]) - - items.append(match.group(*groups)) - p = match.end() - - match = COMMA(line,p) - if match: - p = match.end() # skip the comma - elif not TERMINATOR(line,p): - raise ValueError( - "Expected ',' or end-of-list in",line,"at",line[p:] - ) - - match = TERMINATOR(line,p) - if match: p = match.end() # skip the terminator, if any - return line, p, items - - for line in lines: - match = DISTRO(line) - if not match: - raise ValueError("Missing distribution spec", line) - project_name = match.group(1) - p = match.end() - extras = [] - - match = OBRACKET(line,p) - if match: - p = match.end() - line, p, extras = scan_list( - DISTRO, CBRACKET, line, p, (1,), "'extra' name" - ) - - line, p, specs = scan_list(VERSION,LINE_END,line,p,(1,2),"version spec") - specs = [(op,safe_version(val)) for op,val in specs] - yield Requirement(project_name, specs, extras) - - -def _sort_dists(dists): - tmp = [(dist.hashcmp,dist) for dist in dists] - tmp.sort() - dists[::-1] = [d for hc,d in tmp] - - - - - - - - - - - - - - - - - -class Requirement: - def __init__(self, project_name, specs, extras): - """DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!""" - self.unsafe_name, project_name = project_name, safe_name(project_name) - self.project_name, self.key = project_name, project_name.lower() - index = [(parse_version(v),state_machine[op],op,v) for op,v in specs] - index.sort() - self.specs = [(op,ver) for parsed,trans,op,ver in index] - self.index, self.extras = index, tuple(map(safe_extra,extras)) - self.hashCmp = ( - self.key, tuple([(op,parsed) for parsed,trans,op,ver in index]), - frozenset(self.extras) - ) - self.__hash = hash(self.hashCmp) - - def __str__(self): - specs = ','.join([''.join(s) for s in self.specs]) - extras = ','.join(self.extras) - if extras: extras = '[%s]' % extras - return '%s%s%s' % (self.project_name, extras, specs) - - def __eq__(self,other): - return isinstance(other,Requirement) and self.hashCmp==other.hashCmp - - def __contains__(self,item): - if isinstance(item,Distribution): - if item.key != self.key: return False - if self.index: item = item.parsed_version # only get if we need it - elif isinstance(item,basestring): - item = parse_version(item) - last = None - for parsed,trans,op,ver in self.index: - action = trans[cmp(item,parsed)] - if action=='F': return False - elif action=='T': return True - elif action=='+': last = True - elif action=='-' or last is None: last = False - if last is None: last = True # no rules encountered - return last - - - def __hash__(self): - return self.__hash - - def __repr__(self): return "Requirement.parse(%r)" % str(self) - - #@staticmethod - def parse(s): - reqs = list(parse_requirements(s)) - if reqs: - if len(reqs)==1: - return reqs[0] - raise ValueError("Expected only one requirement", s) - raise ValueError("No requirements found", s) - - parse = staticmethod(parse) - -state_machine = { - # =>< - '<' : '--T', - '<=': 'T-T', - '>' : 'F+F', - '>=': 'T+F', - '==': 'T..', - '!=': 'F++', -} - - -def _get_mro(cls): - """Get an mro for a type or classic class""" - if not 
isinstance(cls,type): - class cls(cls,object): pass - return cls.__mro__[1:] - return cls.__mro__ - -def _find_adapter(registry, ob): - """Return an adapter factory for `ob` from `registry`""" - for t in _get_mro(getattr(ob, '__class__', type(ob))): - if t in registry: - return registry[t] - - -def ensure_directory(path): - """Ensure that the parent directory of `path` exists""" - dirname = os.path.dirname(path) - if not os.path.isdir(dirname): - os.makedirs(dirname) - -def split_sections(s): - """Split a string or iterable thereof into (section,content) pairs - - Each ``section`` is a stripped version of the section header ("[section]") - and each ``content`` is a list of stripped lines excluding blank lines and - comment-only lines. If there are any such lines before the first section - header, they're returned in a first ``section`` of ``None``. - """ - section = None - content = [] - for line in yield_lines(s): - if line.startswith("["): - if line.endswith("]"): - if section or content: - yield section, content - section = line[1:-1].strip() - content = [] - else: - raise ValueError("Invalid section heading", line) - else: - content.append(line) - - # wrap up last segment - yield section, content - -def _mkstemp(*args,**kw): - from tempfile import mkstemp - old_open = os.open - try: - os.open = os_open # temporarily bypass sandboxing - return mkstemp(*args,**kw) - finally: - os.open = old_open # and then put it back - - -# Set up global resource manager (deliberately not state-saved) -_manager = ResourceManager() -def _initialize(g): - for name in dir(_manager): - if not name.startswith('_'): - g[name] = getattr(_manager, name) -_initialize(globals()) - -# Prepare the master working set and make the ``require()`` API available -_declare_state('object', working_set = WorkingSet()) -try: - # Does the main program list any requirements? - from __main__ import __requires__ -except ImportError: - pass # No: just use the default working set based on sys.path -else: - # Yes: ensure the requirements are met, by prefixing sys.path if necessary - try: - working_set.require(__requires__) - except VersionConflict: # try it without defaults already on sys.path - working_set = WorkingSet([]) # by starting with an empty path - for dist in working_set.resolve( - parse_requirements(__requires__), Environment() - ): - working_set.add(dist) - for entry in sys.path: # add any missing entries from sys.path - if entry not in working_set.entries: - working_set.add_entry(entry) - sys.path[:] = working_set.entries # then copy back to sys.path - -require = working_set.require -iter_entry_points = working_set.iter_entry_points -add_activation_listener = working_set.subscribe -run_script = working_set.run_script -run_main = run_script # backward compatibility -# Activate all distributions already on sys.path, and ensure that -# all distributions added to the working set in the future (e.g. by -# calling ``require()``) will get activated as well. 
-add_activation_listener(lambda dist: dist.activate()) -working_set.entries=[]; map(working_set.add_entry,sys.path) # match order - diff --git a/pkg_resources.txt b/pkg_resources.txt deleted file mode 100755 index 03793b6..0000000 --- a/pkg_resources.txt +++ /dev/null @@ -1,1953 +0,0 @@ -============================================================= -Package Discovery and Resource Access using ``pkg_resources`` -============================================================= - -The ``pkg_resources`` module distributed with ``setuptools`` provides an API -for Python libraries to access their resource files, and for extensible -applications and frameworks to automatically discover plugins. It also -provides runtime support for using C extensions that are inside zipfile-format -eggs, support for merging packages that have separately-distributed modules or -subpackages, and APIs for managing Python's current "working set" of active -packages. - - -.. contents:: **Table of Contents** - - --------- -Overview --------- - -Eggs are a distribution format for Python modules, similar in concept to Java's -"jars" or Ruby's "gems". They differ from previous Python distribution formats -in that they are importable (i.e. they can be added to ``sys.path``), and they -are *discoverable*, meaning that they carry metadata that unambiguously -identifies their contents and dependencies, and thus can be *automatically* -found and added to ``sys.path`` in response to simple requests of the form, -"get me everything I need to use docutils' PDF support". - -The ``pkg_resources`` module provides runtime facilities for finding, -introspecting, activating and using eggs and other "pluggable" distribution -formats. Because these are new concepts in Python (and not that well- -established in other languages either), it helps to have a few special terms -for talking about eggs and how they can be used: - -project - A library, framework, script, plugin, application, or collection of data - or other resources, or some combination thereof. Projects are assumed to - have "relatively unique" names, e.g. names registered with PyPI. - -release - A snapshot of a project at a particular point in time, denoted by a version - identifier. - -distribution - A file or files that represent a particular release. - -importable distribution - A file or directory that, if placed on ``sys.path``, allows Python to - import any modules contained within it. - -pluggable distribution - An importable distribution whose filename unambiguously identifies its - release (i.e. project and version), and whose contents unamabiguously - specify what releases of other projects will satisfy its runtime - requirements. - -extra - An "extra" is an optional feature of a release, that may impose additional - runtime requirements. For example, if docutils PDF support required a - PDF support library to be present, docutils could define its PDF support as - an "extra", and list what other project releases need to be available in - order to provide it. - -environment - A collection of distributions potentially available for importing, but not - necessarily active. More than one distribution (i.e. release version) for - a given project may be present in an environment. - -working set - A collection of distributions actually available for importing, as on - ``sys.path``. At most one distribution (release version) of a given - project may be present in a working set, as otherwise there would be - ambiguity as to what to import. 
- -eggs - Eggs are pluggable distributions in one of the three formats currently - supported by ``pkg_resources``. There are built eggs, development eggs, - and egg links. Built eggs are directories or zipfiles whose name ends - with ``.egg`` and follows the egg naming conventions, and contain an - ``EGG-INFO`` subdirectory (zipped or otherwise). Development eggs are - normal directories of Python code with one or more ``ProjectName.egg-info`` - subdirectories. And egg links are ``*.egg-link`` files that contain the - name of a built or development egg, to support symbolic linking on - platforms that do not have native symbolic links. - -(For more information about these terms and concepts, see also this -`architectural overview`_ of ``pkg_resources`` and Python Eggs in general.) - -.. _architectural overview: http://mail.python.org/pipermail/distutils-sig/2005-June/004652.html - - -.. ----------------- -.. Developer's Guide -.. ----------------- - -.. This section isn't written yet. Currently planned topics include - Accessing Resources - Finding and Activating Package Distributions - get_provider() - require() - WorkingSet - iter_distributions - Running Scripts - Configuration - Namespace Packages - Extensible Applications and Frameworks - Locating entry points - Activation listeners - Metadata access - Extended Discovery and Installation - Supporting Custom PEP 302 Implementations -.. For now, please check out the extensive `API Reference`_ below. - - -------------- -API Reference -------------- - -Namespace Package Support -========================= - -A namespace package is a package that only contains other packages and modules, -with no direct contents of its own. Such packages can be split across -multiple, separately-packaged distributions. Normally, you do not need to use -the namespace package APIs directly; instead you should supply the -``namespace_packages`` argument to ``setup()`` in your project's ``setup.py``. -See the `setuptools documentation on namespace packages`_ for more information. - -However, if for some reason you need to manipulate namespace packages or -directly alter ``sys.path`` at runtime, you may find these APIs useful: - -``declare_namespace(name)`` - Declare that the dotted package name `name` is a "namespace package" whose - contained packages and modules may be spread across multiple distributions. - The named package's ``__path__`` will be extended to include the - corresponding package in all distributions on ``sys.path`` that contain a - package of that name. (More precisely, if an importer's - ``find_module(name)`` returns a loader, then it will also be searched for - the package's contents.) Whenever a Distribution's ``activate()`` method - is invoked, it checks for the presence of namespace packages and updates - their ``__path__`` contents accordingly. - -``fixup_namespace_packages(path_item)`` - Declare that `path_item` is a newly added item on ``sys.path`` that may - need to be used to update existing namespace packages. Ordinarily, this is - called for you when an egg is automatically added to ``sys.path``, but if - your application modifies ``sys.path`` to include locations that may - contain portions of a namespace package, you will need to call this - function to ensure they are added to the existing namespace packages. 
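[Editor's note: the following is an illustrative sketch only, not part of the removed file. It uses the two legacy namespace APIs described just above; the package name and plugin directory are hypothetical.] A distribution participating in a namespace package calls ``declare_namespace()`` from its ``__init__.py``, and an application that later extends ``sys.path`` by hand calls ``fixup_namespace_packages()`` on the new entry::

    # Hypothetical myns/__init__.py shipped by each participating
    # distribution:
    #
    #     import pkg_resources
    #     pkg_resources.declare_namespace(__name__)

    # Application code that adds a path entry manually, then asks
    # pkg_resources to extend any already-declared namespace packages:
    import sys
    import pkg_resources

    plugin_dir = '/opt/myapp/plugins'   # hypothetical plugin location
    sys.path.append(plugin_dir)
    pkg_resources.fixup_namespace_packages(plugin_dir)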
- -Although by default ``pkg_resources`` only supports namespace packages for -filesystem and zip importers, you can extend its support to other "importers" -compatible with PEP 302 using the ``register_namespace_handler()`` function. -See the section below on `Supporting Custom Importers`_ for details. - -.. _setuptools documentation on namespace packages: http://peak.telecommunity.com/DevCenter/setuptools#namespace-packages - - -``WorkingSet`` Objects -====================== - -The ``WorkingSet`` class provides access to a collection of "active" -distributions. In general, there is only one meaningful ``WorkingSet`` -instance: the one that represents the distributions that are currently active -on ``sys.path``. This global instance is available under the name -``working_set`` in the ``pkg_resources`` module. However, specialized -tools may wish to manipulate working sets that don't correspond to -``sys.path``, and therefore may wish to create other ``WorkingSet`` instances. - -It's important to note that the global ``working_set`` object is initialized -from ``sys.path`` when ``pkg_resources`` is first imported, but is only updated -if you do all future ``sys.path`` manipulation via ``pkg_resources`` APIs. If -you manually modify ``sys.path``, you must invoke the appropriate methods on -the ``working_set`` instance to keep it in sync. Unfortunately, Python does -not provide any way to detect arbitrary changes to a list object like -``sys.path``, so ``pkg_resources`` cannot automatically update the -``working_set`` based on changes to ``sys.path``. - -``WorkingSet(entries=None)`` - Create a ``WorkingSet`` from an iterable of path entries. If `entries` - is not supplied, it defaults to the value of ``sys.path`` at the time - the constructor is called. - - Note that you will not normally construct ``WorkingSet`` instances - yourbut instead you will implicitly or explicitly use the global - ``working_set`` instance. For the most part, the ``pkg_resources`` API - is designed so that the ``working_set`` is used by default, such that you - don't have to explicitly refer to it most of the time. - - -Basic ``WorkingSet`` Methods ----------------------------- - -The following methods of ``WorkingSet`` objects are also available as module- -level functions in ``pkg_resources`` that apply to the default ``working_set`` -instance. Thus, you can use e.g. ``pkg_resources.require()`` as an -abbreviation for ``pkg_resources.working_set.require()``: - - -``require(*requirements)`` - Ensure that distributions matching `requirements` are activated - - `requirements` must be a string or a (possibly-nested) sequence - thereof, specifying the distributions and versions required. The - return value is a sequence of the distributions that needed to be - activated to fulfill the requirements; all relevant distributions are - included, even if they were already activated in this working set. - - For the syntax of requirement specifiers, see the section below on - `Requirements Parsing`_. - - In general, it should not be necessary for you to call this method - directly. It's intended more for use in quick-and-dirty scripting and - interactive interpreter hacking than for production use. If you're creating - an actual library or application, it's strongly recommended that you create - a "setup.py" script using ``setuptools``, and declare all your requirements - there. That way, tools like EasyInstall can automatically detect what - requirements your package has, and deal with them accordingly. 
- - Note that calling ``require('SomePackage')`` will not install - ``SomePackage`` if it isn't already present. If you need to do this, you - should use the ``resolve()`` method instead, which allows you to pass an - ``installer`` callback that will be invoked when a needed distribution - can't be found on the local machine. You can then have this callback - display a dialog, automatically download the needed distribution, or - whatever else is appropriate for your application. See the documentation - below on the ``resolve()`` method for more information, and also on the - ``obtain()`` method of ``Environment`` objects. - -``run_script(requires, script_name)`` - Locate distribution specified by `requires` and run its `script_name` - script. `requires` must be a string containing a requirement specifier. - (See `Requirements Parsing`_ below for the syntax.) - - The script, if found, will be executed in *the caller's globals*. That's - because this method is intended to be called from wrapper scripts that - act as a proxy for the "real" scripts in a distribution. A wrapper script - usually doesn't need to do anything but invoke this function with the - correct arguments. - - If you need more control over the script execution environment, you - probably want to use the ``run_script()`` method of a ``Distribution`` - object's `Metadata API`_ instead. - -``iter_entry_points(group, name=None)`` - Yield entry point objects from `group` matching `name` - - If `name` is None, yields all entry points in `group` from all - distributions in the working set, otherwise only ones matching both - `group` and `name` are yielded. Entry points are yielded from the active - distributions in the order that the distributions appear in the working - set. (For the global ``working_set``, this should be the same as the order - that they are listed in ``sys.path``.) Note that within the entry points - advertised by an individual distribution, there is no particular ordering. - - Please see the section below on `Entry Points`_ for more information. - - -``WorkingSet`` Methods and Attributes -------------------------------------- - -These methods are used to query or manipulate the contents of a specific -working set, so they must be explicitly invoked on a particular ``WorkingSet`` -instance: - -``add_entry(entry)`` - Add a path item to the ``entries``, finding any distributions on it. You - should use this when you add additional items to ``sys.path`` and you want - the global ``working_set`` to reflect the change. This method is also - called by the ``WorkingSet()`` constructor during initialization. - - This method uses ``find_distributions(entry, True)`` to find distributions - corresponding to the path entry, and then ``add()`` them. `entry` is - always appended to the ``entries`` attribute, even if it is already - present, however. (This is because ``sys.path`` can contain the same value - more than once, and the ``entries`` attribute should be able to reflect - this.) - -``__contains__(dist)`` - True if `dist` is active in this ``WorkingSet``. Note that only one - distribution for a given project can be active in a given ``WorkingSet``. - -``__iter__()`` - Yield distributions for non-duplicate projects in the working set. - The yield order is the order in which the items' path entries were - added to the working set. - -``find(req)`` - Find a distribution matching `req` (a ``Requirement`` instance). 
- If there is an active distribution for the requested project, this - returns it, as long as it meets the version requirement specified by - `req`. But, if there is an active distribution for the project and it - does *not* meet the `req` requirement, ``VersionConflict`` is raised. - If there is no active distribution for the requested project, ``None`` - is returned. - -``resolve(requirements, env=None, installer=None)`` - List all distributions needed to (recursively) meet `requirements` - - `requirements` must be a sequence of ``Requirement`` objects. `env`, - if supplied, should be an ``Environment`` instance. If - not supplied, an ``Environment`` is created from the working set's - ``entries``. `installer`, if supplied, will be invoked with each - requirement that cannot be met by an already-installed distribution; it - should return a ``Distribution`` or ``None``. (See the ``obtain()`` method - of `Environment Objects`_, below, for more information on the `installer` - argument.) - -``add(dist, entry=None)`` - Add `dist` to working set, associated with `entry` - - If `entry` is unspecified, it defaults to ``dist.location``. On exit from - this routine, `entry` is added to the end of the working set's ``.entries`` - (if it wasn't already present). - - `dist` is only added to the working set if it's for a project that - doesn't already have a distribution active in the set. If it's - successfully added, any callbacks registered with the ``subscribe()`` - method will be called. (See `Receiving Change Notifications`_, below.) - - Note: ``add()`` is automatically called for you by the ``require()`` - method, so you don't normally need to use this method directly. - -``entries`` - This attribute represents a "shadow" ``sys.path``, primarily useful for - debugging. If you are experiencing import problems, you should check - the global ``working_set`` object's ``entries`` against ``sys.path``, to - ensure that they match. If they do not, then some part of your program - is manipulating ``sys.path`` without updating the ``working_set`` - accordingly. IMPORTANT NOTE: do not directly manipulate this attribute! - Setting it equal to ``sys.path`` will not fix your problem, any more than - putting black tape over an "engine warning" light will fix your car! If - this attribute is out of sync with ``sys.path``, it's merely an *indicator* - of the problem, not the cause of it. - - -Receiving Change Notifications ------------------------------- - -Extensible applications and frameworks may need to receive notification when -a new distribution (such as a plug-in component) has been added to a working -set. This is what the ``subscribe()`` method and ``add_activation_listener()`` -function are for. - -``subscribe(callback)`` - Invoke ``callback(distribution)`` once for each active distribution that is - in the set now, or gets added later. Because the callback is invoked for - already-active distributions, you do not need to loop over the working set - yourself to deal with the existing items; just register the callback and - be prepared for the fact that it will be called immediately by this method. - - Note that callbacks *must not* allow exceptions to propagate, or they will - interfere with the operation of other callbacks and possibly result in an - inconsistent working set state. 
Callbacks should use a try/except block
-    to ignore, log, or otherwise process any errors, especially since the code
-    that caused the callback to be invoked is unlikely to be able to handle
-    the errors any better than the callback itself.
-
-``pkg_resources.add_activation_listener()`` is an alternate spelling of
-``pkg_resources.working_set.subscribe()``.
-
-
-Locating Plugins
-----------------
-
-Extensible applications will sometimes have a "plugin directory" or a set of
-plugin directories, from which they want to load entry points or other
-metadata.  The ``find_plugins()`` method allows you to do this, by scanning an
-environment for the newest version of each project that can be safely loaded
-without conflicts or missing requirements.
-
-``find_plugins(plugin_env, full_env=None, fallback=True)``
-    Scan `plugin_env` and identify which distributions could be added to this
-    working set without version conflicts or missing requirements.
-
-    Example usage::
-
-        distributions, errors = working_set.find_plugins(
-            Environment(plugin_dirlist)
-        )
-        for dist in distributions:
-            working_set.add(dist)        # add plugins+libs to sys.path
-        print("Couldn't load:", errors)  # display errors
-
-    The `plugin_env` should be an ``Environment`` instance that contains only
-    distributions that are in the project's "plugin directory" or directories.
-    The `full_env`, if supplied, should be an ``Environment`` instance that
-    contains all currently-available distributions.
-
-    If `full_env` is not supplied, one is created automatically from the
-    ``WorkingSet`` this method is called on, which will typically mean that
-    every directory on ``sys.path`` will be scanned for distributions.
-
-    This method returns a 2-tuple: (`distributions`, `error_info`), where
-    `distributions` is a list of the distributions found in `plugin_env` that
-    were loadable, along with any other distributions that are needed to resolve
-    their dependencies.  `error_info` is a dictionary mapping unloadable plugin
-    distributions to an exception instance describing the error that occurred.
-    Usually this will be a ``DistributionNotFound`` or ``VersionConflict``
-    instance.
-
-    Most applications will use this method mainly on the master ``working_set``
-    instance in ``pkg_resources``, and then immediately add the returned
-    distributions to the working set so that they are available on sys.path.
-    This will make it possible to find any entry points, and allow any other
-    metadata tracking and hooks to be activated.
-
-    The resolution algorithm used by ``find_plugins()`` is as follows.  First,
-    the project names of the distributions present in `plugin_env` are sorted.
-    Then, each project's eggs are tried in descending version order (i.e.,
-    newest version first).
-
-    An attempt is made to resolve each egg's dependencies.  If the attempt is
-    successful, the egg and its dependencies are added to the output list and to
-    a temporary copy of the working set.  The resolution process continues with
-    the next project name, and no older eggs for that project are tried.
-
-    If the resolution attempt fails, however, the error is added to the error
-    dictionary.  If the `fallback` flag is true, the next older version of the
-    plugin is tried, until a working version is found.  If false, the resolution
-    process continues with the next plugin project name.
-
-    Some applications may have stricter fallback requirements than others.  For
-    example, an application that has a database schema or persistent objects
-    may not be able to safely downgrade a version of a package.
Others may want - to ensure that a new plugin configuration is either 100% good or else - revert to a known-good configuration. (That is, they may wish to revert to - a known configuration if the `error_info` return value is non-empty.) - - Note that this algorithm gives precedence to satisfying the dependencies of - alphabetically prior project names in case of version conflicts. If two - projects named "AaronsPlugin" and "ZekesPlugin" both need different versions - of "TomsLibrary", then "AaronsPlugin" will win and "ZekesPlugin" will be - disabled due to version conflict. - - -``Environment`` Objects -======================= - -An "environment" is a collection of ``Distribution`` objects, usually ones -that are present and potentially importable on the current platform. -``Environment`` objects are used by ``pkg_resources`` to index available -distributions during dependency resolution. - -``Environment(search_path=None, platform=get_supported_platform(), python=PY_MAJOR)`` - Create an environment snapshot by scanning `search_path` for distributions - compatible with `platform` and `python`. `search_path` should be a - sequence of strings such as might be used on ``sys.path``. If a - `search_path` isn't supplied, ``sys.path`` is used. - - `platform` is an optional string specifying the name of the platform - that platform-specific distributions must be compatible with. If - unspecified, it defaults to the current platform. `python` is an - optional string naming the desired version of Python (e.g. ``'2.4'``); - it defaults to the currently-running version. - - You may explicitly set `platform` (and/or `python`) to ``None`` if you - wish to include *all* distributions, not just those compatible with the - running platform or Python version. - - Note that `search_path` is scanned immediately for distributions, and the - resulting ``Environment`` is a snapshot of the found distributions. It - is not automatically updated if the system's state changes due to e.g. - installation or removal of distributions. - -``__getitem__(project_name)`` - Returns a list of distributions for the given project name, ordered - from newest to oldest version. (And highest to lowest format precedence - for distributions that contain the same version of the project.) If there - are no distributions for the project, returns an empty list. - -``__iter__()`` - Yield the unique project names of the distributions in this environment. - The yielded names are always in lower case. - -``add(dist)`` - Add `dist` to the environment if it matches the platform and python version - specified at creation time, and only if the distribution hasn't already - been added. (i.e., adding the same distribution more than once is a no-op.) - -``remove(dist)`` - Remove `dist` from the environment. - -``can_add(dist)`` - Is distribution `dist` acceptable for this environment? If it's not - compatible with the ``platform`` and ``python`` version values specified - when the environment was created, a false value is returned. - -``__add__(dist_or_env)`` (``+`` operator) - Add a distribution or environment to an ``Environment`` instance, returning - a *new* environment object that contains all the distributions previously - contained by both. The new environment will have a ``platform`` and - ``python`` of ``None``, meaning that it will not reject any distributions - from being added to it; it will simply accept whatever is added. 
If you - want the added items to be filtered for platform and Python version, or - you want to add them to the *same* environment instance, you should use - in-place addition (``+=``) instead. - -``__iadd__(dist_or_env)`` (``+=`` operator) - Add a distribution or environment to an ``Environment`` instance - *in-place*, updating the existing instance and returning it. The - ``platform`` and ``python`` filter attributes take effect, so distributions - in the source that do not have a suitable platform string or Python version - are silently ignored. - -``best_match(req, working_set, installer=None)`` - Find distribution best matching `req` and usable on `working_set` - - This calls the ``find(req)`` method of the `working_set` to see if a - suitable distribution is already active. (This may raise - ``VersionConflict`` if an unsuitable version of the project is already - active in the specified `working_set`.) If a suitable distribution isn't - active, this method returns the newest distribution in the environment - that meets the ``Requirement`` in `req`. If no suitable distribution is - found, and `installer` is supplied, then the result of calling - the environment's ``obtain(req, installer)`` method will be returned. - -``obtain(requirement, installer=None)`` - Obtain a distro that matches requirement (e.g. via download). In the - base ``Environment`` class, this routine just returns - ``installer(requirement)``, unless `installer` is None, in which case - None is returned instead. This method is a hook that allows subclasses - to attempt other ways of obtaining a distribution before falling back - to the `installer` argument. - -``scan(search_path=None)`` - Scan `search_path` for distributions usable on `platform` - - Any distributions found are added to the environment. `search_path` should - be a sequence of strings such as might be used on ``sys.path``. If not - supplied, ``sys.path`` is used. Only distributions conforming to - the platform/python version defined at initialization are added. This - method is a shortcut for using the ``find_distributions()`` function to - find the distributions from each item in `search_path`, and then calling - ``add()`` to add each one to the environment. - - -``Requirement`` Objects -======================= - -``Requirement`` objects express what versions of a project are suitable for -some purpose. These objects (or their string form) are used by various -``pkg_resources`` APIs in order to find distributions that a script or -distribution needs. - - -Requirements Parsing --------------------- - -``parse_requirements(s)`` - Yield ``Requirement`` objects for a string or iterable of lines. Each - requirement must start on a new line. See below for syntax. - -``Requirement.parse(s)`` - Create a ``Requirement`` object from a string or iterable of lines. A - ``ValueError`` is raised if the string or lines do not contain a valid - requirement specifier, or if they contain more than one specifier. (To - parse multiple specifiers from a string or iterable of strings, use - ``parse_requirements()`` instead.) - - The syntax of a requirement specifier can be defined in EBNF as follows:: - - requirement ::= project_name versionspec? extras? - versionspec ::= comparison version (',' comparison version)* - comparison ::= '<' | '<=' | '!=' | '==' | '>=' | '>' - extras ::= '[' extralist? 
']' - extralist ::= identifier (',' identifier)* - project_name ::= identifier - identifier ::= [-A-Za-z0-9_]+ - version ::= [-A-Za-z0-9_.]+ - - Tokens can be separated by whitespace, and a requirement can be continued - over multiple lines using a backslash (``\\``). Line-end comments (using - ``#``) are also allowed. - - Some examples of valid requirement specifiers:: - - FooProject >= 1.2 - Fizzy [foo, bar] - PickyThing<1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1 - SomethingWhoseVersionIDontCareAbout - - The project name is the only required portion of a requirement string, and - if it's the only thing supplied, the requirement will accept any version - of that project. - - The "extras" in a requirement are used to request optional features of a - project, that may require additional project distributions in order to - function. For example, if the hypothetical "Report-O-Rama" project offered - optional PDF support, it might require an additional library in order to - provide that support. Thus, a project needing Report-O-Rama's PDF features - could use a requirement of ``Report-O-Rama[PDF]`` to request installation - or activation of both Report-O-Rama and any libraries it needs in order to - provide PDF support. For example, you could use:: - - easy_install.py Report-O-Rama[PDF] - - To install the necessary packages using the EasyInstall program, or call - ``pkg_resources.require('Report-O-Rama[PDF]')`` to add the necessary - distributions to sys.path at runtime. - - -``Requirement`` Methods and Attributes --------------------------------------- - -``__contains__(dist_or_version)`` - Return true if `dist_or_version` fits the criteria for this requirement. - If `dist_or_version` is a ``Distribution`` object, its project name must - match the requirement's project name, and its version must meet the - requirement's version criteria. If `dist_or_version` is a string, it is - parsed using the ``parse_version()`` utility function. Otherwise, it is - assumed to be an already-parsed version. - - The ``Requirement`` object's version specifiers (``.specs``) are internally - sorted into ascending version order, and used to establish what ranges of - versions are acceptable. Adjacent redundant conditions are effectively - consolidated (e.g. ``">1, >2"`` produces the same results as ``">1"``, and - ``"<2,<3"`` produces the same results as``"<3"``). ``"!="`` versions are - excised from the ranges they fall within. The version being tested for - acceptability is then checked for membership in the resulting ranges. - (Note that providing conflicting conditions for the same version (e.g. - ``"<2,>=2"`` or ``"==2,!=2"``) is meaningless and may therefore produce - bizarre results when compared with actual version number(s).) - -``__eq__(other_requirement)`` - A requirement compares equal to another requirement if they have - case-insensitively equal project names, version specifiers, and "extras". - (The order that extras and version specifiers are in is also ignored.) - Equal requirements also have equal hashes, so that requirements can be - used in sets or as dictionary keys. - -``__str__()`` - The string form of a ``Requirement`` is a string that, if passed to - ``Requirement.parse()``, would return an equal ``Requirement`` object. - -``project_name`` - The name of the required project - -``key`` - An all-lowercase version of the ``project_name``, useful for comparison - or indexing. - -``extras`` - A tuple of names of "extras" that this requirement calls for. 
(These will - be all-lowercase and normalized using the ``safe_extra()`` parsing utility - function, so they may not exactly equal the extras the requirement was - created with.) - -``specs`` - A list of ``(op,version)`` tuples, sorted in ascending parsed-version - order. The `op` in each tuple is a comparison operator, represented as - a string. The `version` is the (unparsed) version number. The relative - order of tuples containing the same version numbers is undefined, since - having more than one operator for a given version is either redundant or - self-contradictory. - - -Entry Points -============ - -Entry points are a simple way for distributions to "advertise" Python objects -(such as functions or classes) for use by other distributions. Extensible -applications and frameworks can search for entry points with a particular name -or group, either from a specific distribution or from all active distributions -on sys.path, and then inspect or load the advertised objects at will. - -Entry points belong to "groups" which are named with a dotted name similar to -a Python package or module name. For example, the ``setuptools`` package uses -an entry point named ``distutils.commands`` in order to find commands defined -by distutils extensions. ``setuptools`` treats the names of entry points -defined in that group as the acceptable commands for a setup script. - -In a similar way, other packages can define their own entry point groups, -either using dynamic names within the group (like ``distutils.commands``), or -possibly using predefined names within the group. For example, a blogging -framework that offers various pre- or post-publishing hooks might define an -entry point group and look for entry points named "pre_process" and -"post_process" within that group. - -To advertise an entry point, a project needs to use ``setuptools`` and provide -an ``entry_points`` argument to ``setup()`` in its setup script, so that the -entry points will be included in the distribution's metadata. For more -details, see the ``setuptools`` documentation. (XXX link here to setuptools) - -Each project distribution can advertise at most one entry point of a given -name within the same entry point group. For example, a distutils extension -could advertise two different ``distutils.commands`` entry points, as long as -they had different names. However, there is nothing that prevents *different* -projects from advertising entry points of the same name in the same group. In -some cases, this is a desirable thing, since the application or framework that -uses the entry points may be calling them as hooks, or in some other way -combining them. It is up to the application or framework to decide what to do -if multiple distributions advertise an entry point; some possibilities include -using both entry points, displaying an error message, using the first one found -in sys.path order, etc. - - -Convenience API ---------------- - -In the following functions, the `dist` argument can be a ``Distribution`` -instance, a ``Requirement`` instance, or a string specifying a requirement -(i.e. project name, version, etc.). If the argument is a string or -``Requirement``, the specified distribution is located (and added to sys.path -if not already present). An error will be raised if a matching distribution is -not available. - -The `group` argument should be a string containing a dotted identifier, -identifying an entry point group. 
If you are defining an entry point group, -you should include some portion of your package's name in the group name so as -to avoid collision with other packages' entry point groups. - -``load_entry_point(dist, group, name)`` - Load the named entry point from the specified distribution, or raise - ``ImportError``. - -``get_entry_info(dist, group, name)`` - Return an ``EntryPoint`` object for the given `group` and `name` from - the specified distribution. Returns ``None`` if the distribution has not - advertised a matching entry point. - -``get_entry_map(dist, group=None)`` - Return the distribution's entry point map for `group`, or the full entry - map for the distribution. This function always returns a dictionary, - even if the distribution advertises no entry points. If `group` is given, - the dictionary maps entry point names to the corresponding ``EntryPoint`` - object. If `group` is None, the dictionary maps group names to - dictionaries that then map entry point names to the corresponding - ``EntryPoint`` instance in that group. - -``iter_entry_points(group, name=None)`` - Yield entry point objects from `group` matching `name`. - - If `name` is None, yields all entry points in `group` from all - distributions in the working set on sys.path, otherwise only ones matching - both `group` and `name` are yielded. Entry points are yielded from - the active distributions in the order that the distributions appear on - sys.path. (Within entry points for a particular distribution, however, - there is no particular ordering.) - - (This API is actually a method of the global ``working_set`` object; see - the section above on `Basic WorkingSet Methods`_ for more information.) - - -Creating and Parsing --------------------- - -``EntryPoint(name, module_name, attrs=(), extras=(), dist=None)`` - Create an ``EntryPoint`` instance. `name` is the entry point name. The - `module_name` is the (dotted) name of the module containing the advertised - object. `attrs` is an optional tuple of names to look up from the - module to obtain the advertised object. For example, an `attrs` of - ``("foo","bar")`` and a `module_name` of ``"baz"`` would mean that the - advertised object could be obtained by the following code:: - - import baz - advertised_object = baz.foo.bar - - The `extras` are an optional tuple of "extra feature" names that the - distribution needs in order to provide this entry point. When the - entry point is loaded, these extra features are looked up in the `dist` - argument to find out what other distributions may need to be activated - on sys.path; see the ``load()`` method for more details. The `extras` - argument is only meaningful if `dist` is specified. `dist` must be - a ``Distribution`` instance. - -``EntryPoint.parse(src, dist=None)`` (classmethod) - Parse a single entry point from string `src` - - Entry point syntax follows the form:: - - name = some.module:some.attr [extra1,extra2] - - The entry name and module name are required, but the ``:attrs`` and - ``[extras]`` parts are optional, as is the whitespace shown between - some of the items. The `dist` argument is passed through to the - ``EntryPoint()`` constructor, along with the other values parsed from - `src`. - -``EntryPoint.parse_group(group, lines, dist=None)`` (classmethod) - Parse `lines` (a string or sequence of lines) to create a dictionary - mapping entry point names to ``EntryPoint`` objects. 
``ValueError`` is - raised if entry point names are duplicated, if `group` is not a valid - entry point group name, or if there are any syntax errors. (Note: the - `group` parameter is used only for validation and to create more - informative error messages.) If `dist` is provided, it will be used to - set the ``dist`` attribute of the created ``EntryPoint`` objects. - -``EntryPoint.parse_map(data, dist=None)`` (classmethod) - Parse `data` into a dictionary mapping group names to dictionaries mapping - entry point names to ``EntryPoint`` objects. If `data` is a dictionary, - then the keys are used as group names and the values are passed to - ``parse_group()`` as the `lines` argument. If `data` is a string or - sequence of lines, it is first split into .ini-style sections (using - the ``split_sections()`` utility function) and the section names are used - as group names. In either case, the `dist` argument is passed through to - ``parse_group()`` so that the entry points will be linked to the specified - distribution. - - -``EntryPoint`` Objects ----------------------- - -For simple introspection, ``EntryPoint`` objects have attributes that -correspond exactly to the constructor argument names: ``name``, -``module_name``, ``attrs``, ``extras``, and ``dist`` are all available. In -addition, the following methods are provided: - -``load(require=True, env=None, installer=None)`` - Load the entry point, returning the advertised Python object, or raise - ``ImportError`` if it cannot be obtained. If `require` is a true value, - then ``require(env, installer)`` is called before attempting the import. - -``require(env=None, installer=None)`` - Ensure that any "extras" needed by the entry point are available on - sys.path. ``UnknownExtra`` is raised if the ``EntryPoint`` has ``extras``, - but no ``dist``, or if the named extras are not defined by the - distribution. If `env` is supplied, it must be an ``Environment``, and it - will be used to search for needed distributions if they are not already - present on sys.path. If `installer` is supplied, it must be a callable - taking a ``Requirement`` instance and returning a matching importable - ``Distribution`` instance or None. - -``__str__()`` - The string form of an ``EntryPoint`` is a string that could be passed to - ``EntryPoint.parse()`` to produce an equivalent ``EntryPoint``. - - -``Distribution`` Objects -======================== - -``Distribution`` objects represent collections of Python code that may or may -not be importable, and may or may not have metadata and resources associated -with them. Their metadata may include information such as what other projects -the distribution depends on, what entry points the distribution advertises, and -so on. - - -Getting or Creating Distributions ---------------------------------- - -Most commonly, you'll obtain ``Distribution`` objects from a ``WorkingSet`` or -an ``Environment``. (See the sections above on `WorkingSet Objects`_ and -`Environment Objects`_, which are containers for active distributions and -available distributions, respectively.) You can also obtain ``Distribution`` -objects from one of these high-level APIs: - -``find_distributions(path_item, only=False)`` - Yield distributions accessible via `path_item`. If `only` is true, yield - only distributions whose ``location`` is equal to `path_item`. In other - words, if `only` is true, this yields any distributions that would be - importable if `path_item` were on ``sys.path``. 
If `only` is false, this
-    also yields distributions that are "in" or "under" `path_item`, but would
-    not be importable unless their locations were also added to ``sys.path``.
-
-``get_distribution(dist_spec)``
-    Return a ``Distribution`` object for a given ``Requirement`` or string.
-    If `dist_spec` is already a ``Distribution`` instance, it is returned.
-    If it is a ``Requirement`` object or a string that can be parsed into one,
-    it is used to locate and activate a matching distribution, which is then
-    returned.
-
-However, if you're creating specialized tools for working with distributions,
-or creating a new distribution format, you may also need to create
-``Distribution`` objects directly, using one of the three constructors below.
-
-These constructors all take an optional `metadata` argument, which is used to
-access any resources or metadata associated with the distribution.  `metadata`
-must be an object that implements the ``IResourceProvider`` interface, or None.
-If it is None, an ``EmptyProvider`` is used instead.  ``Distribution`` objects
-implement both the `IResourceProvider`_ and `IMetadataProvider Methods`_ by
-delegating them to the `metadata` object.
-
-``Distribution.from_location(location, basename, metadata=None, **kw)`` (classmethod)
-    Create a distribution for `location`, which must be a string such as a
-    URL, filename, or other string that might be used on ``sys.path``.
-    `basename` is a string naming the distribution, like ``Foo-1.2-py2.4.egg``.
-    If `basename` ends with ``.egg``, then the project's name, version, python
-    version and platform are extracted from the filename and used to set those
-    properties of the created distribution.  Any additional keyword arguments
-    are forwarded to the ``Distribution()`` constructor.
-
-``Distribution.from_filename(filename, metadata=None, **kw)`` (classmethod)
-    Create a distribution by parsing a local filename.  This is a shorter way
-    of saying ``Distribution.from_location(normalize_path(filename),
-    os.path.basename(filename), metadata)``.  In other words, it creates a
-    distribution whose location is the normalized form of the filename, parsing
-    name and version information from the base portion of the filename.  Any
-    additional keyword arguments are forwarded to the ``Distribution()``
-    constructor.
-
-``Distribution(location, metadata, project_name, version, py_version, platform, precedence)``
-    Create a distribution by setting its properties.  All arguments are
-    optional and default to None, except for `py_version` (which defaults to
-    the current Python version) and `precedence` (which defaults to
-    ``EGG_DIST``; for more details see ``precedence`` under `Distribution
-    Attributes`_ below).  Note that it's usually easier to use the
-    ``from_filename()`` or ``from_location()`` constructors than to specify
-    all these arguments individually.
-
-
-``Distribution`` Attributes
----------------------------
-
-location
-    A string indicating the distribution's location.  For an importable
-    distribution, this is the string that would be added to ``sys.path`` to
-    make it actively importable.  For non-importable distributions, this is
-    simply a filename, URL, or other way of locating the distribution.
-
-project_name
-    A string, naming the project that this distribution is for.  Project names
-    are defined by a project's setup script, and they are used to identify
-    projects on PyPI.
When a ``Distribution`` is constructed, the - `project_name` argument is passed through the ``safe_name()`` utility - function to filter out any unacceptable characters. - -key - ``dist.key`` is short for ``dist.project_name.lower()``. It's used for - case-insensitive comparison and indexing of distributions by project name. - -extras - A list of strings, giving the names of extra features defined by the - project's dependency list (the ``extras_require`` argument specified in - the project's setup script). - -version - A string denoting what release of the project this distribution contains. - When a ``Distribution`` is constructed, the `version` argument is passed - through the ``safe_version()`` utility function to filter out any - unacceptable characters. If no `version` is specified at construction - time, then attempting to access this attribute later will cause the - ``Distribution`` to try to discover its version by reading its ``PKG-INFO`` - metadata file. If ``PKG-INFO`` is unavailable or can't be parsed, - ``ValueError`` is raised. - -parsed_version - The ``parsed_version`` is a tuple representing a "parsed" form of the - distribution's ``version``. ``dist.parsed_version`` is a shortcut for - calling ``parse_version(dist.version)``. It is used to compare or sort - distributions by version. (See the `Parsing Utilities`_ section below for - more information on the ``parse_version()`` function.) Note that accessing - ``parsed_version`` may result in a ``ValueError`` if the ``Distribution`` - was constructed without a `version` and without `metadata` capable of - supplying the missing version info. - -py_version - The major/minor Python version the distribution supports, as a string. - For example, "2.3" or "2.4". The default is the current version of Python. - -platform - A string representing the platform the distribution is intended for, or - ``None`` if the distribution is "pure Python" and therefore cross-platform. - See `Platform Utilities`_ below for more information on platform strings. - -precedence - A distribution's ``precedence`` is used to determine the relative order of - two distributions that have the same ``project_name`` and - ``parsed_version``. The default precedence is ``pkg_resources.EGG_DIST``, - which is the highest (i.e. most preferred) precedence. The full list - of predefined precedences, from most preferred to least preferred, is: - ``EGG_DIST``, ``BINARY_DIST``, ``SOURCE_DIST``, ``CHECKOUT_DIST``, and - ``DEVELOP_DIST``. Normally, precedences other than ``EGG_DIST`` are used - only by the ``setuptools.package_index`` module, when sorting distributions - found in a package index to determine their suitability for installation. - "System" and "Development" eggs (i.e., ones that use the ``.egg-info`` - format), however, are automatically given a precedence of ``DEVELOP_DIST``. - - - -``Distribution`` Methods ------------------------- - -``activate(path=None)`` - Ensure distribution is importable on `path`. If `path` is None, - ``sys.path`` is used instead. This ensures that the distribution's - ``location`` is in the `path` list, and it also performs any necessary - namespace package fixups or declarations. (That is, if the distribution - contains namespace packages, this method ensures that they are declared, - and that the distribution's contents for those namespace packages are - merged with the contents provided by any other active distributions. See - the section above on `Namespace Package Support`_ for more information.) 
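-
-    If you do need to call it directly, for example for a distribution
-    discovered in a directory that is not yet on ``sys.path``, a minimal
-    sketch might look like this (the plugin directory below is hypothetical)::
-
-        import pkg_resources
-
-        for dist in pkg_resources.find_distributions("/opt/myapp/plugins"):
-            dist.activate()  # put dist.location on sys.path and declare any
-                             # namespace packages the distribution provides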
- - ``pkg_resources`` adds a notification callback to the global ``working_set`` - that ensures this method is called whenever a distribution is added to it. - Therefore, you should not normally need to explicitly call this method. - (Note that this means that namespace packages on ``sys.path`` are always - imported as soon as ``pkg_resources`` is, which is another reason why - namespace packages should not contain any code or import statements.) - -``as_requirement()`` - Return a ``Requirement`` instance that matches this distribution's project - name and version. - -``requires(extras=())`` - List the ``Requirement`` objects that specify this distribution's - dependencies. If `extras` is specified, it should be a sequence of names - of "extras" defined by the distribution, and the list returned will then - include any dependencies needed to support the named "extras". - -``clone(**kw)`` - Create a copy of the distribution. Any supplied keyword arguments override - the corresponding argument to the ``Distribution()`` constructor, allowing - you to change some of the copied distribution's attributes. - -``egg_name()`` - Return what this distribution's standard filename should be, not including - the ".egg" extension. For example, a distribution for project "Foo" - version 1.2 that runs on Python 2.3 for Windows would have an ``egg_name()`` - of ``Foo-1.2-py2.3-win32``. Any dashes in the name or version are - converted to underscores. (``Distribution.from_location()`` will convert - them back when parsing a ".egg" file name.) - -``__cmp__(other)``, ``__hash__()`` - Distribution objects are hashed and compared on the basis of their parsed - version and precedence, followed by their key (lowercase project name), - location, Python version, and platform. - -The following methods are used to access ``EntryPoint`` objects advertised -by the distribution. See the section above on `Entry Points`_ for more -detailed information about these operations: - -``get_entry_info(group, name)`` - Return the ``EntryPoint`` object for `group` and `name`, or None if no - such point is advertised by this distribution. - -``get_entry_map(group=None)`` - Return the entry point map for `group`. If `group` is None, return - a dictionary mapping group names to entry point maps for all groups. - (An entry point map is a dictionary of entry point names to ``EntryPoint`` - objects.) - -``load_entry_point(group, name)`` - Short for ``get_entry_info(group, name).load()``. Returns the object - advertised by the named entry point, or raises ``ImportError`` if - the entry point isn't advertised by this distribution, or there is some - other import problem. - -In addition to the above methods, ``Distribution`` objects also implement all -of the `IResourceProvider`_ and `IMetadataProvider Methods`_ (which are -documented in later sections): - -* ``has_metadata(name)`` -* ``metadata_isdir(name)`` -* ``metadata_listdir(name)`` -* ``get_metadata(name)`` -* ``get_metadata_lines(name)`` -* ``run_script(script_name, namespace)`` -* ``get_resource_filename(manager, resource_name)`` -* ``get_resource_stream(manager, resource_name)`` -* ``get_resource_string(manager, resource_name)`` -* ``has_resource(resource_name)`` -* ``resource_isdir(resource_name)`` -* ``resource_listdir(resource_name)`` - -If the distribution was created with a `metadata` argument, these resource and -metadata access methods are all delegated to that `metadata` provider. 
-
-Otherwise, they are delegated to an ``EmptyProvider``, so that the distribution
-will appear to have no resources or metadata.  This delegation approach is used
-so that supporting custom importers or new distribution formats can be done
-simply by creating an appropriate `IResourceProvider`_ implementation; see the
-section below on `Supporting Custom Importers`_ for more details.
-
-
-``ResourceManager`` API
-=======================
-
-The ``ResourceManager`` class provides uniform access to package resources,
-whether those resources exist as files and directories or are compressed in
-an archive of some kind.
-
-Normally, you do not need to create or explicitly manage ``ResourceManager``
-instances, as the ``pkg_resources`` module creates a global instance for you,
-and makes most of its methods available as top-level names in the
-``pkg_resources`` module namespace.  So, for example, this code actually
-calls the ``resource_string()`` method of the global ``ResourceManager``::
-
-    import pkg_resources
-    my_data = pkg_resources.resource_string(__name__, "foo.dat")
-
-Thus, you can use the APIs below without needing an explicit
-``ResourceManager`` instance; just import and use them as needed.
-
-
-Basic Resource Access
----------------------
-
-In the following methods, the `package_or_requirement` argument may be either
-a Python package/module name (e.g. ``foo.bar``) or a ``Requirement`` instance.
-If it is a package or module name, the named module or package must be
-importable (i.e., be in a distribution or directory on ``sys.path``), and the
-`resource_name` argument is interpreted relative to the named package.  (Note
-that if a module name is used, then the resource name is relative to the
-package immediately containing the named module.  Also, you should not use
-a namespace package name, because a namespace package can be spread across
-multiple distributions, and is therefore ambiguous as to which distribution
-should be searched for the resource.)
-
-If it is a ``Requirement``, then the requirement is automatically resolved
-(searching the current ``Environment`` if necessary) and a matching
-distribution is added to the ``WorkingSet`` and ``sys.path`` if one was not
-already present.  (Unless the ``Requirement`` can't be satisfied, in which
-case an exception is raised.)  The `resource_name` argument is then interpreted
-relative to the root of the identified distribution; i.e. its first path
-segment will be treated as a peer of the top-level modules or packages in the
-distribution.
-
-Note that resource names must be ``/``-separated paths and cannot be absolute
-(i.e. no leading ``/``) or contain relative names like ``".."``.  Do *not* use
-``os.path`` routines to manipulate resource paths, as they are *not* filesystem
-paths.
-
-``resource_exists(package_or_requirement, resource_name)``
-    Does the named resource exist?  Return ``True`` or ``False`` accordingly.
-
-``resource_stream(package_or_requirement, resource_name)``
-    Return a readable file-like object for the specified resource; it may be
-    an actual file, a ``StringIO``, or some similar object.  The stream is
-    in "binary mode", in the sense that whatever bytes are in the resource
-    will be read as-is.
-
-``resource_string(package_or_requirement, resource_name)``
-    Return the specified resource as a string.  The resource is read in
-    binary fashion, such that the returned string contains exactly the bytes
-    that are stored in the resource.
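-
-    For example, a minimal sketch that reads packaged data both by package
-    name and by requirement (the distribution and resource names below are
-    hypothetical)::
-
-        import pkg_resources
-
-        # Interpreted relative to the package containing this module:
-        raw = pkg_resources.resource_string(__name__, "data/defaults.cfg")
-
-        # Interpreted relative to the root of whatever distribution satisfies
-        # the requirement (activating it first if necessary):
-        req = pkg_resources.Requirement.parse("MyFramework")
-        stream = pkg_resources.resource_stream(req, "templates/base.html")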
- -``resource_isdir(package_or_requirement, resource_name)`` - Is the named resource a directory? Return ``True`` or ``False`` - accordingly. - -``resource_listdir(package_or_requirement, resource_name)`` - List the contents of the named resource directory, just like ``os.listdir`` - except that it works even if the resource is in a zipfile. - -Note that only ``resource_exists()`` and ``resource_isdir()`` are insensitive -as to the resource type. You cannot use ``resource_listdir()`` on a file -resource, and you can't use ``resource_string()`` or ``resource_stream()`` on -directory resources. Using an inappropriate method for the resource type may -result in an exception or undefined behavior, depending on the platform and -distribution format involved. - - -Resource Extraction -------------------- - -``resource_filename(package_or_requirement, resource_name)`` - Sometimes, it is not sufficient to access a resource in string or stream - form, and a true filesystem filename is needed. In such cases, you can - use this method (or module-level function) to obtain a filename for a - resource. If the resource is in an archive distribution (such as a zipped - egg), it will be extracted to a cache directory, and the filename within - the cache will be returned. If the named resource is a directory, then - all resources within that directory (including subdirectories) are also - extracted. If the named resource is a C extension or "eager resource" - (see the ``setuptools`` documentation for details), then all C extensions - and eager resources are extracted at the same time. - - Archived resources are extracted to a cache location that can be managed by - the following two methods: - -``set_extraction_path(path)`` - Set the base path where resources will be extracted to, if needed. - - If you do not call this routine before any extractions take place, the - path defaults to the return value of ``get_default_cache()``. (Which is - based on the ``PYTHON_EGG_CACHE`` environment variable, with various - platform-specific fallbacks. See that routine's documentation for more - details.) - - Resources are extracted to subdirectories of this path based upon - information given by the resource provider. You may set this to a - temporary directory, but then you must call ``cleanup_resources()`` to - delete the extracted files when done. There is no guarantee that - ``cleanup_resources()`` will be able to remove all extracted files. (On - Windows, for example, you can't unlink .pyd or .dll files that are still - in use.) - - Note that you may not change the extraction path for a given resource - manager once resources have been extracted, unless you first call - ``cleanup_resources()``. - -``cleanup_resources(force=False)`` - Delete all extracted resource files and directories, returning a list - of the file and directory names that could not be successfully removed. - This function does not have any concurrency protection, so it should - generally only be called when the extraction path is a temporary - directory exclusive to a single process. This method is not - automatically called; you must call it explicitly or register it as an - ``atexit`` function if you wish to ensure cleanup of a temporary - directory used for extractions. 
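-
-Putting these together, a minimal sketch that extracts resources into a
-private temporary directory and cleans it up at interpreter exit might look
-like this (the cache prefix and resource name below are hypothetical)::
-
-    import atexit
-    import tempfile
-
-    import pkg_resources
-
-    # Must be called before the first extraction takes place:
-    pkg_resources.set_extraction_path(tempfile.mkdtemp(prefix="myapp-cache-"))
-    atexit.register(pkg_resources.cleanup_resources)
-
-    # Extracted to the directory above if the package is zipped:
-    schema_path = pkg_resources.resource_filename(__name__, "data/schema.sql")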
- - -"Provider" Interface --------------------- - -If you are implementing an ``IResourceProvider`` and/or ``IMetadataProvider`` -for a new distribution archive format, you may need to use the following -``IResourceManager`` methods to co-ordinate extraction of resources to the -filesystem. If you're not implementing an archive format, however, you have -no need to use these methods. Unlike the other methods listed above, they are -*not* available as top-level functions tied to the global ``ResourceManager``; -you must therefore have an explicit ``ResourceManager`` instance to use them. - -``get_cache_path(archive_name, names=())`` - Return absolute location in cache for `archive_name` and `names` - - The parent directory of the resulting path will be created if it does - not already exist. `archive_name` should be the base filename of the - enclosing egg (which may not be the name of the enclosing zipfile!), - including its ".egg" extension. `names`, if provided, should be a - sequence of path name parts "under" the egg's extraction location. - - This method should only be called by resource providers that need to - obtain an extraction location, and only for names they intend to - extract, as it tracks the generated names for possible cleanup later. - -``extraction_error()`` - Raise an ``ExtractionError`` describing the active exception as interfering - with the extraction process. You should call this if you encounter any - OS errors extracting the file to the cache path; it will format the - operating system exception for you, and add other information to the - ``ExtractionError`` instance that may be needed by programs that want to - wrap or handle extraction errors themselves. - -``postprocess(tempname, filename)`` - Perform any platform-specific postprocessing of `tempname`. - Resource providers should call this method ONLY after successfully - extracting a compressed resource. They must NOT call it on resources - that are already in the filesystem. - - `tempname` is the current (temporary) name of the file, and `filename` - is the name it will be renamed to by the caller after this routine - returns. - - -Metadata API -============ - -The metadata API is used to access metadata resources bundled in a pluggable -distribution. Metadata resources are virtual files or directories containing -information about the distribution, such as might be used by an extensible -application or framework to connect "plugins". Like other kinds of resources, -metadata resource names are ``/``-separated and should not contain ``..`` or -begin with a ``/``. You should not use ``os.path`` routines to manipulate -resource paths. - -The metadata API is provided by objects implementing the ``IMetadataProvider`` -or ``IResourceProvider`` interfaces. ``Distribution`` objects implement this -interface, as do objects returned by the ``get_provider()`` function: - -``get_provider(package_or_requirement)`` - If a package name is supplied, return an ``IResourceProvider`` for the - package. If a ``Requirement`` is supplied, resolve it by returning a - ``Distribution`` from the current working set (searching the current - ``Environment`` if necessary and adding the newly found ``Distribution`` - to the working set). If the named package can't be imported, or the - ``Requirement`` can't be satisfied, an exception is raised. - - NOTE: if you use a package name rather than a ``Requirement``, the object - you get back may not be a pluggable distribution, depending on the method - by which the package was installed. 
In particular, "development" packages - and "single-version externally-managed" packages do not have any way to - map from a package name to the corresponding project's metadata. Do not - write code that passes a package name to ``get_provider()`` and then tries - to retrieve project metadata from the returned object. It may appear to - work when the named package is in an ``.egg`` file or directory, but - it will fail in other installation scenarios. If you want project - metadata, you need to ask for a *project*, not a package. - - -``IMetadataProvider`` Methods ------------------------------ - -The methods provided by objects (such as ``Distribution`` instances) that -implement the ``IMetadataProvider`` or ``IResourceProvider`` interfaces are: - -``has_metadata(name)`` - Does the named metadata resource exist? - -``metadata_isdir(name)`` - Is the named metadata resource a directory? - -``metadata_listdir(name)`` - List of metadata names in the directory (like ``os.listdir()``) - -``get_metadata(name)`` - Return the named metadata resource as a string. The data is read in binary - mode; i.e., the exact bytes of the resource file are returned. - -``get_metadata_lines(name)`` - Yield named metadata resource as list of non-blank non-comment lines. This - is short for calling ``yield_lines(provider.get_metadata(name))``. See the - section on `yield_lines()`_ below for more information on the syntax it - recognizes. - -``run_script(script_name, namespace)`` - Execute the named script in the supplied namespace dictionary. Raises - ``ResolutionError`` if there is no script by that name in the ``scripts`` - metadata directory. `namespace` should be a Python dictionary, usually - a module dictionary if the script is being run as a module. - - -Exceptions -========== - -``pkg_resources`` provides a simple exception hierarchy for problems that may -occur when processing requests to locate and activate packages:: - - ResolutionError - DistributionNotFound - VersionConflict - UnknownExtra - - ExtractionError - -``ResolutionError`` - This class is used as a base class for the other three exceptions, so that - you can catch all of them with a single "except" clause. It is also raised - directly for miscellaneous requirement-resolution problems like trying to - run a script that doesn't exist in the distribution it was requested from. - -``DistributionNotFound`` - A distribution needed to fulfill a requirement could not be found. - -``VersionConflict`` - The requested version of a project conflicts with an already-activated - version of the same project. - -``UnknownExtra`` - One of the "extras" requested was not recognized by the distribution it - was requested from. - -``ExtractionError`` - A problem occurred extracting a resource to the Python Egg cache. The - following attributes are available on instances of this exception: - - manager - The resource manager that raised this exception - - cache_path - The base directory for resource extraction - - original_error - The exception instance that caused extraction to fail - - -Supporting Custom Importers -=========================== - -By default, ``pkg_resources`` supports normal filesystem imports, and -``zipimport`` importers. 
If you wish to use the ``pkg_resources`` features -with other (PEP 302-compatible) importers or module loaders, you may need to -register various handlers and support functions using these APIs: - -``register_finder(importer_type, distribution_finder)`` - Register `distribution_finder` to find distributions in ``sys.path`` items. - `importer_type` is the type or class of a PEP 302 "Importer" (``sys.path`` - item handler), and `distribution_finder` is a callable that, when passed a - path item, the importer instance, and an `only` flag, yields - ``Distribution`` instances found under that path item. (The `only` flag, - if true, means the finder should yield only ``Distribution`` objects whose - ``location`` is equal to the path item provided.) - - See the source of the ``pkg_resources.find_on_path`` function for an - example finder function. - -``register_loader_type(loader_type, provider_factory)`` - Register `provider_factory` to make ``IResourceProvider`` objects for - `loader_type`. `loader_type` is the type or class of a PEP 302 - ``module.__loader__``, and `provider_factory` is a function that, when - passed a module object, returns an `IResourceProvider`_ for that module, - allowing it to be used with the `ResourceManager API`_. - -``register_namespace_handler(importer_type, namespace_handler)`` - Register `namespace_handler` to declare namespace packages for the given - `importer_type`. `importer_type` is the type or class of a PEP 302 - "importer" (sys.path item handler), and `namespace_handler` is a callable - with a signature like this:: - - def namespace_handler(importer, path_entry, moduleName, module): - # return a path_entry to use for child packages - - Namespace handlers are only called if the relevant importer object has - already agreed that it can handle the relevant path item. The handler - should only return a subpath if the module ``__path__`` does not already - contain an equivalent subpath. Otherwise, it should return None. - - For an example namespace handler, see the source of the - ``pkg_resources.file_ns_handler`` function, which is used for both zipfile - importing and regular importing. - - -IResourceProvider ------------------ - -``IResourceProvider`` is an abstract class that documents what methods are -required of objects returned by a `provider_factory` registered with -``register_loader_type()``. ``IResourceProvider`` is a subclass of -``IMetadataProvider``, so objects that implement this interface must also -implement all of the `IMetadataProvider Methods`_ as well as the methods -shown here. The `manager` argument to the methods below must be an object -that supports the full `ResourceManager API`_ documented above. - -``get_resource_filename(manager, resource_name)`` - Return a true filesystem path for `resource_name`, co-ordinating the - extraction with `manager`, if the resource must be unpacked to the - filesystem. - -``get_resource_stream(manager, resource_name)`` - Return a readable file-like object for `resource_name`. - -``get_resource_string(manager, resource_name)`` - Return a string containing the contents of `resource_name`. - -``has_resource(resource_name)`` - Does the package contain the named resource? - -``resource_isdir(resource_name)`` - Is the named resource a directory? Return a false value if the resource - does not exist or is not a directory. - -``resource_listdir(resource_name)`` - Return a list of the contents of the resource directory, ala - ``os.listdir()``. 
Requesting the contents of a non-existent directory may - raise an exception. - -Note, by the way, that your provider classes need not (and should not) subclass -``IResourceProvider`` or ``IMetadataProvider``! These classes exist solely -for documentation purposes and do not provide any useful implementation code. -You may instead wish to subclass one of the `built-in resource providers`_. - - -Built-in Resource Providers ---------------------------- - -``pkg_resources`` includes several provider classes that are automatically used -where appropriate. Their inheritance tree looks like this:: - - NullProvider - EggProvider - DefaultProvider - PathMetadata - ZipProvider - EggMetadata - EmptyProvider - FileMetadata - - -``NullProvider`` - This provider class is just an abstract base that provides for common - provider behaviors (such as running scripts), given a definition for just - a few abstract methods. - -``EggProvider`` - This provider class adds in some egg-specific features that are common - to zipped and unzipped eggs. - -``DefaultProvider`` - This provider class is used for unpacked eggs and "plain old Python" - filesystem modules. - -``ZipProvider`` - This provider class is used for all zipped modules, whether they are eggs - or not. - -``EmptyProvider`` - This provider class always returns answers consistent with a provider that - has no metadata or resources. ``Distribution`` objects created without - a ``metadata`` argument use an instance of this provider class instead. - Since all ``EmptyProvider`` instances are equivalent, there is no need - to have more than one instance. ``pkg_resources`` therefore creates a - global instance of this class under the name ``empty_provider``, and you - may use it if you have need of an ``EmptyProvider`` instance. - -``PathMetadata(path, egg_info)`` - Create an ``IResourceProvider`` for a filesystem-based distribution, where - `path` is the filesystem location of the importable modules, and `egg_info` - is the filesystem location of the distribution's metadata directory. - `egg_info` should usually be the ``EGG-INFO`` subdirectory of `path` for an - "unpacked egg", and a ``ProjectName.egg-info`` subdirectory of `path` for - a "development egg". However, other uses are possible for custom purposes. - -``EggMetadata(zipimporter)`` - Create an ``IResourceProvider`` for a zipfile-based distribution. The - `zipimporter` should be a ``zipimport.zipimporter`` instance, and may - represent a "basket" (a zipfile containing multiple ".egg" subdirectories) - a specific egg *within* a basket, or a zipfile egg (where the zipfile - itself is a ".egg"). It can also be a combination, such as a zipfile egg - that also contains other eggs. - -``FileMetadata(path_to_pkg_info)`` - Create an ``IResourceProvider`` that provides exactly one metadata - resource: ``PKG-INFO``. The supplied path should be a distutils PKG-INFO - file. This is basically the same as an ``EmptyProvider``, except that - requests for ``PKG-INFO`` will be answered using the contents of the - designated file. (This provider is used to wrap ``.egg-info`` files - installed by vendor-supplied system packages.) - - -Utility Functions -================= - -In addition to its high-level APIs, ``pkg_resources`` also includes several -generally-useful utility routines. These routines are used to implement the -high-level APIs, but can also be quite useful by themselves. 
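-
-For example, a few of the routines documented below can be exercised directly;
-this minimal sketch just illustrates behavior that is described in the
-following entries::
-
-    from pkg_resources import parse_version, safe_name, to_filename
-
-    # Parsed versions compare in release order, with pre-releases first:
-    assert parse_version("2.4a1") < parse_version("2.4") < parse_version("2.4.1")
-
-    # Project names are normalized for use in requirements and filenames:
-    assert safe_name("The $$$ Tree") == "The-Tree"
-    assert to_filename("The-Tree") == "The_Tree"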
-
-
-Parsing Utilities
------------------
-
-``parse_version(version)``
-    Parse a project's version string, returning a value that can be used to
-    compare versions by chronological order.  Semantically, the format is a
-    rough cross between distutils' ``StrictVersion`` and ``LooseVersion``
-    classes; if you give it versions that would work with ``StrictVersion``,
-    then they will compare the same way.  Otherwise, comparisons are more like
-    a "smarter" form of ``LooseVersion``.  It is *possible* to create
-    pathological version coding schemes that will fool this parser, but they
-    should be very rare in practice.
-
-    The returned value will be a tuple of strings.  Numeric portions of the
-    version are padded to 8 digits so they will compare numerically, but
-    without relying on how numbers compare relative to strings.  Dots are
-    dropped, but dashes are retained.  Trailing zeros between alpha segments
-    or dashes are suppressed, so that e.g. "2.4.0" is considered the same as
-    "2.4".  Alphanumeric parts are lower-cased.
-
-    The algorithm assumes that strings like "-" and any alpha string that
-    alphabetically follows "final" represent a "patch level".  So, "2.4-1"
-    is assumed to be a branch or patch of "2.4", and therefore "2.4.1" is
-    considered newer than "2.4-1", which in turn is newer than "2.4".
-
-    Strings like "a", "b", "c", "alpha", "beta", "candidate" and so on (that
-    come before "final" alphabetically) are assumed to be pre-release versions,
-    so that the version "2.4" is considered newer than "2.4a1".  Any "-"
-    characters preceding a pre-release indicator are removed.  (In versions of
-    setuptools prior to 0.6a9, "-" characters were not removed, leading to the
-    unintuitive result that "0.2-rc1" was considered a newer version than
-    "0.2".)
-
-    Finally, to handle miscellaneous cases, the strings "pre", "preview", and
-    "rc" are treated as if they were "c", i.e. as though they were release
-    candidates, and therefore are not as new as a version string that does not
-    contain them.  And the string "dev" is treated as if it were an "@" sign;
-    that is, a version coming before even "a" or "alpha".
-
-.. _yield_lines():
-
-``yield_lines(strs)``
-    Yield non-empty/non-comment lines from a string/unicode or a
-    possibly-nested sequence thereof.  If `strs` is an instance of
-    ``basestring``, it is split into lines, and each non-blank, non-comment
-    line is yielded after stripping leading and trailing whitespace.  (Lines
-    whose first non-blank character is ``#`` are considered comment lines.)
-
-    If `strs` is not an instance of ``basestring``, it is iterated over, and
-    each item is passed recursively to ``yield_lines()``, so that an arbitrarily
-    nested sequence of strings, or sequences of sequences of strings can be
-    flattened out to the lines contained therein.  So for example, passing
-    a file object or a list of strings to ``yield_lines`` will both work.
-    (Note that between each string in a sequence of strings there is assumed to
-    be an implicit line break, so lines cannot bridge two strings in a
-    sequence.)
-
-    This routine is used extensively by ``pkg_resources`` to parse metadata
-    and file formats of various kinds, and most other ``pkg_resources``
-    parsing functions that yield multiple values will use it to break up their
-    input.  However, this routine is idempotent, so calling ``yield_lines()``
-    on the output of another call to ``yield_lines()`` is completely harmless.
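-
-    For instance, a minimal sketch (the sample data is arbitrary)::
-
-        from pkg_resources import yield_lines
-
-        sample = "# a comment\nfirst\n\n  second  \n"
-        assert list(yield_lines(sample)) == ["first", "second"]
-
-        # Nested sequences are flattened as well:
-        assert list(yield_lines(["a\nb", ["c"]])) == ["a", "b", "c"]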
-
-``split_sections(strs)``
-    Split a string (or possibly-nested iterable thereof), yielding ``(section,
-    content)`` pairs found using an ``.ini``-like syntax.  Each ``section`` is
-    a whitespace-stripped version of the section name ("``[section]``"),
-    and each ``content`` is a list of stripped lines excluding blank lines and
-    comment-only lines.  If there are any non-blank, non-comment lines before
-    the first section header, they're yielded in a first ``section`` of
-    ``None``.
-
-    This routine uses ``yield_lines()`` as its front end, so you can pass in
-    anything that ``yield_lines()`` accepts, such as an open text file, string,
-    or sequence of strings.  ``ValueError`` is raised if a malformed section
-    header is found (i.e. a line starting with ``[`` but not ending with
-    ``]``).
-
-    Note that this simplistic parser assumes that any line whose first
-    non-blank character is ``[`` is a section heading, so it can't support
-    .ini format variations that allow ``[`` as the first non-blank character
-    on other lines.
-
-``safe_name(name)``
-    Return a "safe" form of a project's name, suitable for use in a
-    ``Requirement`` string, as a distribution name, or a PyPI project name.
-    All non-alphanumeric runs are condensed to single "-" characters, such that
-    a name like "The $$$ Tree" becomes "The-Tree".  Note that if you are
-    generating a filename from this value you should replace the "-"
-    characters with underscores ("_"), because setuptools and the distutils
-    use "-" to separate a project's name from its version in filenames; the
-    ``to_filename()`` function described below performs this escaping for you.
-
-``safe_version(version)``
-    Similar to ``safe_name()`` except that spaces in the input become dots, and
-    dots are allowed to exist in the output.  As with ``safe_name()``, if you
-    are generating a filename from this you should replace any "-" characters
-    in the output with underscores.
-
-``safe_extra(extra)``
-    Return a "safe" form of an extra's name, suitable for use in a requirement
-    string or a setup script's ``extras_require`` keyword.  This routine is
-    similar to ``safe_name()`` except that non-alphanumeric runs are replaced
-    by a single underscore (``_``), and the result is lowercased.
-
-``to_filename(name_or_version)``
-    Escape a name or version string so it can be used in a dash-separated
-    filename (or ``#egg=name-version`` tag) without ambiguity.  You
-    should only pass in values that were returned by ``safe_name()`` or
-    ``safe_version()``.
-
-
-Platform Utilities
-------------------
-
-``get_build_platform()``
-    Return this platform's identifier string.  For Windows, the return value
-    is ``"win32"``, and for Mac OS X it is a string of the form
-    ``"macosx-10.4-ppc"``.  All other platforms return the same uname-based
-    string that the ``distutils.util.get_platform()`` function returns.
-    This string is the minimum platform version required by distributions built
-    on the local machine.  (Backward compatibility note: setuptools versions
-    prior to 0.6b1 called this function ``get_platform()``, and the function is
-    still available under that name for backward compatibility reasons.)
-
-``get_supported_platform()`` (New in 0.6b1)
-    This is similar to ``get_build_platform()``, but returns the maximum
-    platform version that the local machine supports.  You will usually want
-    to use this value as the ``required`` argument to the
-    ``compatible_platforms()`` function.
-
-``compatible_platforms(provided, required)``
-    Return true if a distribution built on the `provided` platform may be used
-    on the `required` platform.  If either platform value is ``None``, it is
-    considered a wildcard, and the platforms are therefore compatible.
-    Likewise, if the platform strings are equal, they're also considered
-    compatible, and ``True`` is returned.  Currently, the only non-equal
-    platform strings that are considered compatible are Mac OS X platform
-    strings with the same hardware type (e.g. ``ppc``) and major version
-    (e.g. ``10``), with the `provided` platform's minor version being less
-    than or equal to the `required` platform's minor version.
-
-``get_default_cache()``
-    Determine the default cache location for extracting resources from zipped
-    eggs.  This routine returns the ``PYTHON_EGG_CACHE`` environment variable,
-    if set.  Otherwise, on Windows, it returns a "Python-Eggs" subdirectory of
-    the user's "Application Data" directory; on all other systems, it returns
-    ``os.path.expanduser("~/.python-eggs")``.
-
-
-PEP 302 Utilities
------------------
-
-``get_importer(path_item)``
-    Retrieve a PEP 302 "importer" for the given path item (which need not
-    actually be on ``sys.path``).  This routine simulates the PEP 302 protocol
-    for obtaining an "importer" object.  It first checks for an importer for
-    the path item in ``sys.path_importer_cache``, and if not found it calls
-    each of the ``sys.path_hooks`` and caches the result if a good importer is
-    found.  If no importer is found, this routine returns a wrapper object
-    that wraps the builtin import machinery as a PEP 302-compliant "importer"
-    object.  This wrapper object is *not* cached; instead a new instance is
-    returned each time.
-
-
-File/Path Utilities
--------------------
-
-``ensure_directory(path)``
-    Ensure that the parent directory (``os.path.dirname``) of `path` actually
-    exists, using ``os.makedirs()`` if necessary.
-
-``normalize_path(path)``
-    Return a "normalized" version of `path`, such that two paths represent
-    the same filesystem location if they have equal ``normalize_path()``
-    values.  Specifically, this is a shortcut for calling ``os.path.realpath``
-    and ``os.path.normcase`` on `path`.  Unfortunately, on certain platforms
-    (notably Cygwin and Mac OS X) the ``normcase`` function does not accurately
-    reflect the platform's case-sensitivity, so there is always the possibility
-    of two apparently-different paths being equal on such platforms.
-
-
-----------------------------
-Release Notes/Change History
-----------------------------
-
-0.6c10
- * Prevent lots of spurious "already imported from another path" warnings (e.g.
-   when pkg_resources is imported late).
-
-0.6c9
- * Fix ``resource_listdir('')`` always returning an empty list for zipped eggs.
-
-0.6c7
- * Fix package precedence problem where single-version eggs installed in
-   ``site-packages`` would take precedence over ``.egg`` files (or directories)
-   installed in ``site-packages``.
-
-0.6c6
- * Fix extracted C extensions not having executable permissions under Cygwin.
-
- * Allow ``.egg-link`` files to contain relative paths.
-
- * Fix cache dir defaults on Windows when multiple environment vars are needed
-   to construct a path.
-
-0.6c4
- * Fix "dev" versions being considered newer than release candidates.
-
-0.6c3
- * Python 2.5 compatibility fixes.
-
-0.6c2
- * Fix a problem with eggs specified directly on ``PYTHONPATH`` on
-   case-insensitive filesystems possibly not showing up in the default
-   working set, due to differing normalizations of ``sys.path`` entries.
-
-0.6b3
- * Fixed a duplicate path insertion problem on case-insensitive filesystems.
- -0.6b1 - * Split ``get_platform()`` into ``get_supported_platform()`` and - ``get_build_platform()`` to work around a Mac versioning problem that caused - the behavior of ``compatible_platforms()`` to be platform specific. - - * Fix entry point parsing when a standalone module name has whitespace - between it and the extras. - -0.6a11 - * Added ``ExtractionError`` and ``ResourceManager.extraction_error()`` so that - cache permission problems get a more user-friendly explanation of the - problem, and so that programs can catch and handle extraction errors if they - need to. - -0.6a10 - * Added the ``extras`` attribute to ``Distribution``, the ``find_plugins()`` - method to ``WorkingSet``, and the ``__add__()`` and ``__iadd__()`` methods - to ``Environment``. - - * ``safe_name()`` now allows dots in project names. - - * There is a new ``to_filename()`` function that escapes project names and - versions for safe use in constructing egg filenames from a Distribution - object's metadata. - - * Added ``Distribution.clone()`` method, and keyword argument support to other - ``Distribution`` constructors. - - * Added the ``DEVELOP_DIST`` precedence, and automatically assign it to - eggs using ``.egg-info`` format. - -0.6a9 - * Don't raise an error when an invalid (unfinished) distribution is found - unless absolutely necessary. Warn about skipping invalid/unfinished eggs - when building an Environment. - - * Added support for ``.egg-info`` files or directories with version/platform - information embedded in the filename, so that system packagers have the - option of including ``PKG-INFO`` files to indicate the presence of a - system-installed egg, without needing to use ``.egg`` directories, zipfiles, - or ``.pth`` manipulation. - - * Changed ``parse_version()`` to remove dashes before pre-release tags, so - that ``0.2-rc1`` is considered an *older* version than ``0.2``, and is equal - to ``0.2rc1``. The idea that a dash *always* meant a post-release version - was highly non-intuitive to setuptools users and Python developers, who - seem to want to use ``-rc`` version numbers a lot. - -0.6a8 - * Fixed a problem with ``WorkingSet.resolve()`` that prevented version - conflicts from being detected at runtime. - - * Improved runtime conflict warning message to identify a line in the user's - program, rather than flagging the ``warn()`` call in ``pkg_resources``. - - * Avoid giving runtime conflict warnings for namespace packages, even if they - were declared by a different package than the one currently being activated. - - * Fix path insertion algorithm for case-insensitive filesystems. - - * Fixed a problem with nested namespace packages (e.g. ``peak.util``) not - being set as an attribute of their parent package. - -0.6a6 - * Activated distributions are now inserted in ``sys.path`` (and the working - set) just before the directory that contains them, instead of at the end. - This allows e.g. eggs in ``site-packages`` to override unmanaged modules in - the same location, and allows eggs found earlier on ``sys.path`` to override - ones found later. - - * When a distribution is activated, it now checks whether any contained - non-namespace modules have already been imported and issues a warning if - a conflicting module has already been imported. - - * Changed dependency processing so that it's breadth-first, allowing a - depender's preferences to override those of a dependee, to prevent conflicts - when a lower version is acceptable to the dependee, but not the depender. 
- - * Fixed a problem extracting zipped files on Windows, when the egg in question - has had changed contents but still has the same version number. - -0.6a4 - * Fix a bug in ``WorkingSet.resolve()`` that was introduced in 0.6a3. - -0.6a3 - * Added ``safe_extra()`` parsing utility routine, and use it for Requirement, - EntryPoint, and Distribution objects' extras handling. - -0.6a1 - * Enhanced performance of ``require()`` and related operations when all - requirements are already in the working set, and enhanced performance of - directory scanning for distributions. - - * Fixed some problems using ``pkg_resources`` w/PEP 302 loaders other than - ``zipimport``, and the previously-broken "eager resource" support. - - * Fixed ``pkg_resources.resource_exists()`` not working correctly, along with - some other resource API bugs. - - * Many API changes and enhancements: - - * Added ``EntryPoint``, ``get_entry_map``, ``load_entry_point``, and - ``get_entry_info`` APIs for dynamic plugin discovery. - - * ``list_resources`` is now ``resource_listdir`` (and it actually works) - - * Resource API functions like ``resource_string()`` that accepted a package - name and resource name, will now also accept a ``Requirement`` object in - place of the package name (to allow access to non-package data files in - an egg). - - * ``get_provider()`` will now accept a ``Requirement`` instance or a module - name. If it is given a ``Requirement``, it will return a corresponding - ``Distribution`` (by calling ``require()`` if a suitable distribution - isn't already in the working set), rather than returning a metadata and - resource provider for a specific module. (The difference is in how - resource paths are interpreted; supplying a module name means resources - path will be module-relative, rather than relative to the distribution's - root.) - - * ``Distribution`` objects now implement the ``IResourceProvider`` and - ``IMetadataProvider`` interfaces, so you don't need to reference the (no - longer available) ``metadata`` attribute to get at these interfaces. - - * ``Distribution`` and ``Requirement`` both have a ``project_name`` - attribute for the project name they refer to. (Previously these were - ``name`` and ``distname`` attributes.) - - * The ``path`` attribute of ``Distribution`` objects is now ``location``, - because it isn't necessarily a filesystem path (and hasn't been for some - time now). The ``location`` of ``Distribution`` objects in the filesystem - should always be normalized using ``pkg_resources.normalize_path()``; all - of the setuptools and EasyInstall code that generates distributions from - the filesystem (including ``Distribution.from_filename()``) ensure this - invariant, but if you use a more generic API like ``Distribution()`` or - ``Distribution.from_location()`` you should take care that you don't - create a distribution with an un-normalized filesystem path. - - * ``Distribution`` objects now have an ``as_requirement()`` method that - returns a ``Requirement`` for the distribution's project name and version. - - * Distribution objects no longer have an ``installed_on()`` method, and the - ``install_on()`` method is now ``activate()`` (but may go away altogether - soon). The ``depends()`` method has also been renamed to ``requires()``, - and ``InvalidOption`` is now ``UnknownExtra``. - - * ``find_distributions()`` now takes an additional argument called ``only``, - that tells it to only yield distributions whose location is the passed-in - path. 
(It defaults to False, so that the default behavior is unchanged.) - - * ``AvailableDistributions`` is now called ``Environment``, and the - ``get()``, ``__len__()``, and ``__contains__()`` methods were removed, - because they weren't particularly useful. ``__getitem__()`` no longer - raises ``KeyError``; it just returns an empty list if there are no - distributions for the named project. - - * The ``resolve()`` method of ``Environment`` is now a method of - ``WorkingSet`` instead, and the ``best_match()`` method now uses a working - set instead of a path list as its second argument. - - * There is a new ``pkg_resources.add_activation_listener()`` API that lets - you register a callback for notifications about distributions added to - ``sys.path`` (including the distributions already on it). This is - basically a hook for extensible applications and frameworks to be able to - search for plugin metadata in distributions added at runtime. - -0.5a13 - * Fixed a bug in resource extraction from nested packages in a zipped egg. - -0.5a12 - * Updated extraction/cache mechanism for zipped resources to avoid inter- - process and inter-thread races during extraction. The default cache - location can now be set via the ``PYTHON_EGGS_CACHE`` environment variable, - and the default Windows cache is now a ``Python-Eggs`` subdirectory of the - current user's "Application Data" directory, if the ``PYTHON_EGGS_CACHE`` - variable isn't set. - -0.5a10 - * Fix a problem with ``pkg_resources`` being confused by non-existent eggs on - ``sys.path`` (e.g. if a user deletes an egg without removing it from the - ``easy-install.pth`` file). - - * Fix a problem with "basket" support in ``pkg_resources``, where egg-finding - never actually went inside ``.egg`` files. - - * Made ``pkg_resources`` import the module you request resources from, if it's - not already imported. - -0.5a4 - * ``pkg_resources.AvailableDistributions.resolve()`` and related methods now - accept an ``installer`` argument: a callable taking one argument, a - ``Requirement`` instance. The callable must return a ``Distribution`` - object, or ``None`` if no distribution is found. This feature is used by - EasyInstall to resolve dependencies by recursively invoking itself. - -0.4a4 - * Fix problems with ``resource_listdir()``, ``resource_isdir()`` and resource - directory extraction for zipped eggs. - -0.4a3 - * Fixed scripts not being able to see a ``__file__`` variable in ``__main__`` - - * Fixed a problem with ``resource_isdir()`` implementation that was introduced - in 0.4a2. - -0.4a1 - * Fixed a bug in requirements processing for exact versions (i.e. ``==`` and - ``!=``) when only one condition was included. - - * Added ``safe_name()`` and ``safe_version()`` APIs to clean up handling of - arbitrary distribution names and versions found on PyPI. - -0.3a4 - * ``pkg_resources`` now supports resource directories, not just the resources - in them. In particular, there are ``resource_listdir()`` and - ``resource_isdir()`` APIs. - - * ``pkg_resources`` now supports "egg baskets" -- .egg zipfiles which contain - multiple distributions in subdirectories whose names end with ``.egg``. - Having such a "basket" in a directory on ``sys.path`` is equivalent to - having the individual eggs in that directory, but the contained eggs can - be individually added (or not) to ``sys.path``. Currently, however, there - is no automated way to create baskets. - - * Namespace package manipulation is now protected by the Python import lock. - -0.3a1 - * Initial release. 
- diff --git a/pkg_resources/__init__.py b/pkg_resources/__init__.py new file mode 100644 index 0000000..220a7cc --- /dev/null +++ b/pkg_resources/__init__.py @@ -0,0 +1,3070 @@ +# coding: utf-8 +""" +Package resource API +-------------------- + +A resource is a logical file contained within a package, or a logical +subdirectory thereof. The package resource API expects resource names +to have their path parts separated with ``/``, *not* whatever the local +path separator is. Do not use os.path operations to manipulate resource +names being passed into the API. + +The package resource API is designed to work with normal filesystem packages, +.egg files, and unpacked .egg files. It can also work in a limited way with +.zip files and with custom PEP 302 loaders that support the ``get_data()`` +method. +""" + +from __future__ import absolute_import + +import sys +import os +import io +import time +import re +import types +import zipfile +import zipimport +import warnings +import stat +import functools +import pkgutil +import operator +import platform +import collections +import plistlib +import email.parser +import tempfile +import textwrap +import itertools +from pkgutil import get_importer + +try: + import _imp +except ImportError: + # Python 3.2 compatibility + import imp as _imp + +import six +from six.moves import urllib, map, filter + +# capture these to bypass sandboxing +from os import utime +try: + from os import mkdir, rename, unlink + WRITE_SUPPORT = True +except ImportError: + # no write support, probably under GAE + WRITE_SUPPORT = False + +from os import open as os_open +from os.path import isdir, split + +try: + import importlib.machinery as importlib_machinery + # access attribute to force import under delayed import mechanisms. + importlib_machinery.__name__ +except ImportError: + importlib_machinery = None + +import packaging.version +import packaging.specifiers +import packaging.requirements +import packaging.markers +import appdirs + +if (3, 0) < sys.version_info < (3, 3): + raise RuntimeError("Python 3.3 or later is required") + +# declare some globals that will be defined later to +# satisfy the linters. +require = None +working_set = None + + +class PEP440Warning(RuntimeWarning): + """ + Used when there is an issue with a version or specifier not complying with + PEP 440. 
+ """ + + +class _SetuptoolsVersionMixin(object): + def __hash__(self): + return super(_SetuptoolsVersionMixin, self).__hash__() + + def __lt__(self, other): + if isinstance(other, tuple): + return tuple(self) < other + else: + return super(_SetuptoolsVersionMixin, self).__lt__(other) + + def __le__(self, other): + if isinstance(other, tuple): + return tuple(self) <= other + else: + return super(_SetuptoolsVersionMixin, self).__le__(other) + + def __eq__(self, other): + if isinstance(other, tuple): + return tuple(self) == other + else: + return super(_SetuptoolsVersionMixin, self).__eq__(other) + + def __ge__(self, other): + if isinstance(other, tuple): + return tuple(self) >= other + else: + return super(_SetuptoolsVersionMixin, self).__ge__(other) + + def __gt__(self, other): + if isinstance(other, tuple): + return tuple(self) > other + else: + return super(_SetuptoolsVersionMixin, self).__gt__(other) + + def __ne__(self, other): + if isinstance(other, tuple): + return tuple(self) != other + else: + return super(_SetuptoolsVersionMixin, self).__ne__(other) + + def __getitem__(self, key): + return tuple(self)[key] + + def __iter__(self): + component_re = re.compile(r'(\d+ | [a-z]+ | \.| -)', re.VERBOSE) + replace = { + 'pre': 'c', + 'preview': 'c', + '-': 'final-', + 'rc': 'c', + 'dev': '@', + }.get + + def _parse_version_parts(s): + for part in component_re.split(s): + part = replace(part, part) + if not part or part == '.': + continue + if part[:1] in '0123456789': + # pad for numeric comparison + yield part.zfill(8) + else: + yield '*' + part + + # ensure that alpha/beta/candidate are before final + yield '*final' + + def old_parse_version(s): + parts = [] + for part in _parse_version_parts(s.lower()): + if part.startswith('*'): + # remove '-' before a prerelease tag + if part < '*final': + while parts and parts[-1] == '*final-': + parts.pop() + # remove trailing zeros from each series of numeric parts + while parts and parts[-1] == '00000000': + parts.pop() + parts.append(part) + return tuple(parts) + + # Warn for use of this function + warnings.warn( + "You have iterated over the result of " + "pkg_resources.parse_version. This is a legacy behavior which is " + "inconsistent with the new version class introduced in setuptools " + "8.0. In most cases, conversion to a tuple is unnecessary. For " + "comparison of versions, sort the Version instances directly. 
If " + "you have another use case requiring the tuple, please file a " + "bug with the setuptools project describing that need.", + RuntimeWarning, + stacklevel=1, + ) + + for part in old_parse_version(str(self)): + yield part + + +class SetuptoolsVersion(_SetuptoolsVersionMixin, packaging.version.Version): + pass + + +class SetuptoolsLegacyVersion(_SetuptoolsVersionMixin, + packaging.version.LegacyVersion): + pass + + +def parse_version(v): + try: + return SetuptoolsVersion(v) + except packaging.version.InvalidVersion: + return SetuptoolsLegacyVersion(v) + + +_state_vars = {} + + +def _declare_state(vartype, **kw): + globals().update(kw) + _state_vars.update(dict.fromkeys(kw, vartype)) + + +def __getstate__(): + state = {} + g = globals() + for k, v in _state_vars.items(): + state[k] = g['_sget_' + v](g[k]) + return state + + +def __setstate__(state): + g = globals() + for k, v in state.items(): + g['_sset_' + _state_vars[k]](k, g[k], v) + return state + + +def _sget_dict(val): + return val.copy() + + +def _sset_dict(key, ob, state): + ob.clear() + ob.update(state) + + +def _sget_object(val): + return val.__getstate__() + + +def _sset_object(key, ob, state): + ob.__setstate__(state) + + +_sget_none = _sset_none = lambda *args: None + + +def get_supported_platform(): + """Return this platform's maximum compatible version. + + distutils.util.get_platform() normally reports the minimum version + of Mac OS X that would be required to *use* extensions produced by + distutils. But what we want when checking compatibility is to know the + version of Mac OS X that we are *running*. To allow usage of packages that + explicitly require a newer version of Mac OS X, we must also know the + current version of the OS. + + If this condition occurs for any other platform with a version in its + platform strings, this function should be extended accordingly. 
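+
+    A small sketch of the intended usage together with
+    ``compatible_platforms()`` (the platform strings shown are illustrative
+    and vary by OS and interpreter build)::
+
+        plat = get_supported_platform()   # e.g. 'macosx-10.9-x86_64'
+        # with the example value above, a distribution built for an older
+        # Mac OS X release is still considered usable on this machine:
+        compatible_platforms('macosx-10.4-x86_64', plat)   # -> True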
+ """ + plat = get_build_platform() + m = macosVersionString.match(plat) + if m is not None and sys.platform == "darwin": + try: + plat = 'macosx-%s-%s' % ('.'.join(_macosx_vers()[:2]), m.group(3)) + except ValueError: + # not Mac OS X + pass + return plat + + +__all__ = [ + # Basic resource access and distribution/entry point discovery + 'require', 'run_script', 'get_provider', 'get_distribution', + 'load_entry_point', 'get_entry_map', 'get_entry_info', + 'iter_entry_points', + 'resource_string', 'resource_stream', 'resource_filename', + 'resource_listdir', 'resource_exists', 'resource_isdir', + + # Environmental control + 'declare_namespace', 'working_set', 'add_activation_listener', + 'find_distributions', 'set_extraction_path', 'cleanup_resources', + 'get_default_cache', + + # Primary implementation classes + 'Environment', 'WorkingSet', 'ResourceManager', + 'Distribution', 'Requirement', 'EntryPoint', + + # Exceptions + 'ResolutionError', 'VersionConflict', 'DistributionNotFound', + 'UnknownExtra', 'ExtractionError', + + # Warnings + 'PEP440Warning', + + # Parsing functions and string utilities + 'parse_requirements', 'parse_version', 'safe_name', 'safe_version', + 'get_platform', 'compatible_platforms', 'yield_lines', 'split_sections', + 'safe_extra', 'to_filename', 'invalid_marker', 'evaluate_marker', + + # filesystem utilities + 'ensure_directory', 'normalize_path', + + # Distribution "precedence" constants + 'EGG_DIST', 'BINARY_DIST', 'SOURCE_DIST', 'CHECKOUT_DIST', 'DEVELOP_DIST', + + # "Provider" interfaces, implementations, and registration/lookup APIs + 'IMetadataProvider', 'IResourceProvider', 'FileMetadata', + 'PathMetadata', 'EggMetadata', 'EmptyProvider', 'empty_provider', + 'NullProvider', 'EggProvider', 'DefaultProvider', 'ZipProvider', + 'register_finder', 'register_namespace_handler', 'register_loader_type', + 'fixup_namespace_packages', 'get_importer', + + # Deprecated/backward compatibility only + 'run_main', 'AvailableDistributions', +] + + +class ResolutionError(Exception): + """Abstract base for dependency resolution errors""" + + def __repr__(self): + return self.__class__.__name__ + repr(self.args) + + +class VersionConflict(ResolutionError): + """ + An already-installed version conflicts with the requested version. + + Should be initialized with the installed Distribution and the requested + Requirement. + """ + + _template = "{self.dist} is installed but {self.req} is required" + + @property + def dist(self): + return self.args[0] + + @property + def req(self): + return self.args[1] + + def report(self): + return self._template.format(**locals()) + + def with_context(self, required_by): + """ + If required_by is non-empty, return a version of self that is a + ContextualVersionConflict. + """ + if not required_by: + return self + args = self.args + (required_by,) + return ContextualVersionConflict(*args) + + +class ContextualVersionConflict(VersionConflict): + """ + A VersionConflict that accepts a third parameter, the set of the + requirements that required the installed Distribution. 
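+
+    An illustrative sketch of how these conflicts render (the project name,
+    versions and requirer shown are hypothetical)::
+
+        # dist is an installed Distribution for "example-pkg 1.0" and
+        # req is Requirement.parse("example-pkg>=2.0")
+        VersionConflict(dist, req).report()
+        # -> "example-pkg 1.0 is installed but example-pkg>=2.0 is required"
+        VersionConflict(dist, req).with_context({'a-consumer'}).report()
+        # -> the same message, followed by " by " and the set of requirers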
+ """ + + _template = VersionConflict._template + ' by {self.required_by}' + + @property + def required_by(self): + return self.args[2] + + +class DistributionNotFound(ResolutionError): + """A requested distribution was not found""" + + _template = ("The '{self.req}' distribution was not found " + "and is required by {self.requirers_str}") + + @property + def req(self): + return self.args[0] + + @property + def requirers(self): + return self.args[1] + + @property + def requirers_str(self): + if not self.requirers: + return 'the application' + return ', '.join(self.requirers) + + def report(self): + return self._template.format(**locals()) + + def __str__(self): + return self.report() + + +class UnknownExtra(ResolutionError): + """Distribution doesn't have an "extra feature" of the given name""" + + +_provider_factories = {} + +PY_MAJOR = sys.version[:3] +EGG_DIST = 3 +BINARY_DIST = 2 +SOURCE_DIST = 1 +CHECKOUT_DIST = 0 +DEVELOP_DIST = -1 + + +def register_loader_type(loader_type, provider_factory): + """Register `provider_factory` to make providers for `loader_type` + + `loader_type` is the type or class of a PEP 302 ``module.__loader__``, + and `provider_factory` is a function that, passed a *module* object, + returns an ``IResourceProvider`` for that module. + """ + _provider_factories[loader_type] = provider_factory + + +def get_provider(moduleOrReq): + """Return an IResourceProvider for the named module or requirement""" + if isinstance(moduleOrReq, Requirement): + return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] + try: + module = sys.modules[moduleOrReq] + except KeyError: + __import__(moduleOrReq) + module = sys.modules[moduleOrReq] + loader = getattr(module, '__loader__', None) + return _find_adapter(_provider_factories, loader)(module) + + +def _macosx_vers(_cache=[]): + if not _cache: + version = platform.mac_ver()[0] + # fallback for MacPorts + if version == '': + plist = '/System/Library/CoreServices/SystemVersion.plist' + if os.path.exists(plist): + if hasattr(plistlib, 'readPlist'): + plist_content = plistlib.readPlist(plist) + if 'ProductVersion' in plist_content: + version = plist_content['ProductVersion'] + + _cache.append(version.split('.')) + return _cache[0] + + +def _macosx_arch(machine): + return {'PowerPC': 'ppc', 'Power_Macintosh': 'ppc'}.get(machine, machine) + + +def get_build_platform(): + """Return this platform's string for platform-specific distributions + + XXX Currently this is the same as ``distutils.util.get_platform()``, but it + needs some hacks for Linux and Mac OS X. + """ + try: + # Python 2.7 or >=3.2 + from sysconfig import get_platform + except ImportError: + from distutils.util import get_platform + + plat = get_platform() + if sys.platform == "darwin" and not plat.startswith('macosx-'): + try: + version = _macosx_vers() + machine = os.uname()[4].replace(" ", "_") + return "macosx-%d.%d-%s" % (int(version[0]), int(version[1]), + _macosx_arch(machine)) + except ValueError: + # if someone is running a non-Mac darwin system, this will fall + # through to the default implementation + pass + return plat + + +macosVersionString = re.compile(r"macosx-(\d+)\.(\d+)-(.*)") +darwinVersionString = re.compile(r"darwin-(\d+)\.(\d+)\.(\d+)-(.*)") +# XXX backward compat +get_platform = get_build_platform + + +def compatible_platforms(provided, required): + """Can code for the `provided` platform run on the `required` platform? + + Returns true if either platform is ``None``, or the platforms are equal. 
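+
+    Illustrative examples of the Mac OS X handling implemented below (these
+    mirror the current rules and are not a compatibility guarantee)::
+
+        compatible_platforms('macosx-10.3-ppc', 'macosx-10.4-ppc')    # True
+        compatible_platforms('macosx-10.5-ppc', 'macosx-10.4-ppc')    # False: needs newer OS
+        compatible_platforms('macosx-10.4-ppc', 'macosx-10.4-i386')   # False: different machine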
+ + XXX Needs compatibility checks for Linux and other unixy OSes. + """ + if provided is None or required is None or provided == required: + # easy case + return True + + # Mac OS X special cases + reqMac = macosVersionString.match(required) + if reqMac: + provMac = macosVersionString.match(provided) + + # is this a Mac package? + if not provMac: + # this is backwards compatibility for packages built before + # setuptools 0.6. All packages built after this point will + # use the new macosx designation. + provDarwin = darwinVersionString.match(provided) + if provDarwin: + dversion = int(provDarwin.group(1)) + macosversion = "%s.%s" % (reqMac.group(1), reqMac.group(2)) + if dversion == 7 and macosversion >= "10.3" or \ + dversion == 8 and macosversion >= "10.4": + return True + # egg isn't macosx or legacy darwin + return False + + # are they the same major version and machine type? + if provMac.group(1) != reqMac.group(1) or \ + provMac.group(3) != reqMac.group(3): + return False + + # is the required OS major update >= the provided one? + if int(provMac.group(2)) > int(reqMac.group(2)): + return False + + return True + + # XXX Linux and other platforms' special cases should go here + return False + + +def run_script(dist_spec, script_name): + """Locate distribution `dist_spec` and run its `script_name` script""" + ns = sys._getframe(1).f_globals + name = ns['__name__'] + ns.clear() + ns['__name__'] = name + require(dist_spec)[0].run_script(script_name, ns) + + +# backward compatibility +run_main = run_script + + +def get_distribution(dist): + """Return a current distribution object for a Requirement or string""" + if isinstance(dist, six.string_types): + dist = Requirement.parse(dist) + if isinstance(dist, Requirement): + dist = get_provider(dist) + if not isinstance(dist, Distribution): + raise TypeError("Expected string, Requirement, or Distribution", dist) + return dist + + +def load_entry_point(dist, group, name): + """Return `name` entry point of `group` for `dist` or raise ImportError""" + return get_distribution(dist).load_entry_point(group, name) + + +def get_entry_map(dist, group=None): + """Return the entry point map for `group`, or the full entry map""" + return get_distribution(dist).get_entry_map(group) + + +def get_entry_info(dist, group, name): + """Return the EntryPoint object for `group`+`name`, or ``None``""" + return get_distribution(dist).get_entry_info(group, name) + + +class IMetadataProvider: + def has_metadata(name): + """Does the package's distribution contain the named metadata?""" + + def get_metadata(name): + """The named metadata resource as a string""" + + def get_metadata_lines(name): + """Yield named metadata resource as list of non-blank non-comment lines + + Leading and trailing whitespace is stripped from each line, and lines + with ``#`` as the first non-blank character are omitted.""" + + def metadata_isdir(name): + """Is the named metadata a directory? 
(like ``os.path.isdir()``)""" + + def metadata_listdir(name): + """List of metadata names in the directory (like ``os.listdir()``)""" + + def run_script(script_name, namespace): + """Execute the named script in the supplied namespace dictionary""" + + +class IResourceProvider(IMetadataProvider): + """An object that provides access to package resources""" + + def get_resource_filename(manager, resource_name): + """Return a true filesystem path for `resource_name` + + `manager` must be an ``IResourceManager``""" + + def get_resource_stream(manager, resource_name): + """Return a readable file-like object for `resource_name` + + `manager` must be an ``IResourceManager``""" + + def get_resource_string(manager, resource_name): + """Return a string containing the contents of `resource_name` + + `manager` must be an ``IResourceManager``""" + + def has_resource(resource_name): + """Does the package contain the named resource?""" + + def resource_isdir(resource_name): + """Is the named resource a directory? (like ``os.path.isdir()``)""" + + def resource_listdir(resource_name): + """List of resource names in the directory (like ``os.listdir()``)""" + + +class WorkingSet(object): + """A collection of active distributions on sys.path (or a similar list)""" + + def __init__(self, entries=None): + """Create working set from list of path entries (default=sys.path)""" + self.entries = [] + self.entry_keys = {} + self.by_key = {} + self.callbacks = [] + + if entries is None: + entries = sys.path + + for entry in entries: + self.add_entry(entry) + + @classmethod + def _build_master(cls): + """ + Prepare the master working set. + """ + ws = cls() + try: + from __main__ import __requires__ + except ImportError: + # The main program does not list any requirements + return ws + + # ensure the requirements are met + try: + ws.require(__requires__) + except VersionConflict: + return cls._build_from_requirements(__requires__) + + return ws + + @classmethod + def _build_from_requirements(cls, req_spec): + """ + Build a working set from a requirement spec. Rewrites sys.path. + """ + # try it without defaults already on sys.path + # by starting with an empty path + ws = cls([]) + reqs = parse_requirements(req_spec) + dists = ws.resolve(reqs, Environment()) + for dist in dists: + ws.add(dist) + + # add any missing entries from sys.path + for entry in sys.path: + if entry not in ws.entries: + ws.add_entry(entry) + + # then copy back to sys.path + sys.path[:] = ws.entries + return ws + + def add_entry(self, entry): + """Add a path item to ``.entries``, finding any distributions on it + + ``find_distributions(entry, True)`` is used to find distributions + corresponding to the path entry, and they are added. `entry` is + always appended to ``.entries``, even if it is already present. + (This is because ``sys.path`` can contain the same value more than + once, and the ``.entries`` of the ``sys.path`` WorkingSet should always + equal ``sys.path``.) + """ + self.entry_keys.setdefault(entry, []) + self.entries.append(entry) + for dist in find_distributions(entry, True): + self.add(dist, entry, False) + + def __contains__(self, dist): + """True if `dist` is the active distribution for its project""" + return self.by_key.get(dist.key) == dist + + def find(self, req): + """Find a distribution matching requirement `req` + + If there is an active distribution for the requested project, this + returns it as long as it meets the version requirement specified by + `req`. 
But, if there is an active distribution for the project and it + does *not* meet the `req` requirement, ``VersionConflict`` is raised. + If there is no active distribution for the requested project, ``None`` + is returned. + """ + dist = self.by_key.get(req.key) + if dist is not None and dist not in req: + # XXX add more info + raise VersionConflict(dist, req) + return dist + + def iter_entry_points(self, group, name=None): + """Yield entry point objects from `group` matching `name` + + If `name` is None, yields all entry points in `group` from all + distributions in the working set, otherwise only ones matching + both `group` and `name` are yielded (in distribution order). + """ + for dist in self: + entries = dist.get_entry_map(group) + if name is None: + for ep in entries.values(): + yield ep + elif name in entries: + yield entries[name] + + def run_script(self, requires, script_name): + """Locate distribution for `requires` and run `script_name` script""" + ns = sys._getframe(1).f_globals + name = ns['__name__'] + ns.clear() + ns['__name__'] = name + self.require(requires)[0].run_script(script_name, ns) + + def __iter__(self): + """Yield distributions for non-duplicate projects in the working set + + The yield order is the order in which the items' path entries were + added to the working set. + """ + seen = {} + for item in self.entries: + if item not in self.entry_keys: + # workaround a cache issue + continue + + for key in self.entry_keys[item]: + if key not in seen: + seen[key] = 1 + yield self.by_key[key] + + def add(self, dist, entry=None, insert=True, replace=False): + """Add `dist` to working set, associated with `entry` + + If `entry` is unspecified, it defaults to the ``.location`` of `dist`. + On exit from this routine, `entry` is added to the end of the working + set's ``.entries`` (if it wasn't already present). + + `dist` is only added to the working set if it's for a project that + doesn't already have a distribution in the set, unless `replace=True`. + If it's added, any callbacks registered with the ``subscribe()`` method + will be called. + """ + if insert: + dist.insert_on(self.entries, entry, replace=replace) + + if entry is None: + entry = dist.location + keys = self.entry_keys.setdefault(entry, []) + keys2 = self.entry_keys.setdefault(dist.location, []) + if not replace and dist.key in self.by_key: + # ignore hidden distros + return + + self.by_key[dist.key] = dist + if dist.key not in keys: + keys.append(dist.key) + if dist.key not in keys2: + keys2.append(dist.key) + self._added_new(dist) + + def resolve(self, requirements, env=None, installer=None, + replace_conflicting=False, extras=None): + """List all distributions needed to (recursively) meet `requirements` + + `requirements` must be a sequence of ``Requirement`` objects. `env`, + if supplied, should be an ``Environment`` instance. If + not supplied, it defaults to all distributions available within any + entry or distribution in the working set. `installer`, if supplied, + will be invoked with each requirement that cannot be met by an + already-installed distribution; it should return a ``Distribution`` or + ``None``. + + Unless `replace_conflicting=True`, raises a VersionConflict exception if + any requirements are found on the path that have the correct name but + the wrong version. Otherwise, if an `installer` is supplied it will be + invoked to obtain the correct version of the requirement and activate + it. + + `extras` is a list of the extras to be used with these requirements. 
+ This is important because extra requirements may look like `my_req; + extra = "my_extra"`, which would otherwise be interpreted as a purely + optional requirement. Instead, we want to be able to assert that these + requirements are truly required. + """ + + # set up the stack + requirements = list(requirements)[::-1] + # set of processed requirements + processed = {} + # key -> dist + best = {} + to_activate = [] + + req_extras = _ReqExtras() + + # Mapping of requirement to set of distributions that required it; + # useful for reporting info about conflicts. + required_by = collections.defaultdict(set) + + while requirements: + # process dependencies breadth-first + req = requirements.pop(0) + if req in processed: + # Ignore cyclic or redundant dependencies + continue + + if not req_extras.markers_pass(req, extras): + continue + + dist = best.get(req.key) + if dist is None: + # Find the best distribution and add it to the map + dist = self.by_key.get(req.key) + if dist is None or (dist not in req and replace_conflicting): + ws = self + if env is None: + if dist is None: + env = Environment(self.entries) + else: + # Use an empty environment and workingset to avoid + # any further conflicts with the conflicting + # distribution + env = Environment([]) + ws = WorkingSet([]) + dist = best[req.key] = env.best_match(req, ws, installer) + if dist is None: + requirers = required_by.get(req, None) + raise DistributionNotFound(req, requirers) + to_activate.append(dist) + if dist not in req: + # Oops, the "best" so far conflicts with a dependency + dependent_req = required_by[req] + raise VersionConflict(dist, req).with_context(dependent_req) + + # push the new requirements onto the stack + new_requirements = dist.requires(req.extras)[::-1] + requirements.extend(new_requirements) + + # Register the new requirements needed by req + for new_requirement in new_requirements: + required_by[new_requirement].add(req.project_name) + req_extras[new_requirement] = req.extras + + processed[req] = True + + # return list of distros to activate + return to_activate + + def find_plugins(self, plugin_env, full_env=None, installer=None, + fallback=True): + """Find all activatable distributions in `plugin_env` + + Example usage:: + + distributions, errors = working_set.find_plugins( + Environment(plugin_dirlist) + ) + # add plugins+libs to sys.path + map(working_set.add, distributions) + # display errors + print('Could not load', errors) + + The `plugin_env` should be an ``Environment`` instance that contains + only distributions that are in the project's "plugin directory" or + directories. The `full_env`, if supplied, should be an ``Environment`` + contains all currently-available distributions. If `full_env` is not + supplied, one is created automatically from the ``WorkingSet`` this + method is called on, which will typically mean that every directory on + ``sys.path`` will be scanned for distributions. + + `installer` is a standard installer callback as used by the + ``resolve()`` method. The `fallback` flag indicates whether we should + attempt to resolve older versions of a plugin if the newest version + cannot be resolved. + + This method returns a 2-tuple: (`distributions`, `error_info`), where + `distributions` is a list of the distributions found in `plugin_env` + that were loadable, along with any other distributions that are needed + to resolve their dependencies. `error_info` is a dictionary mapping + unloadable plugin distributions to an exception instance describing the + error that occurred. 
Usually this will be a ``DistributionNotFound`` or + ``VersionConflict`` instance. + """ + + plugin_projects = list(plugin_env) + # scan project names in alphabetic order + plugin_projects.sort() + + error_info = {} + distributions = {} + + if full_env is None: + env = Environment(self.entries) + env += plugin_env + else: + env = full_env + plugin_env + + shadow_set = self.__class__([]) + # put all our entries in shadow_set + list(map(shadow_set.add, self)) + + for project_name in plugin_projects: + + for dist in plugin_env[project_name]: + + req = [dist.as_requirement()] + + try: + resolvees = shadow_set.resolve(req, env, installer) + + except ResolutionError as v: + # save error info + error_info[dist] = v + if fallback: + # try the next older version of project + continue + else: + # give up on this project, keep going + break + + else: + list(map(shadow_set.add, resolvees)) + distributions.update(dict.fromkeys(resolvees)) + + # success, no need to try any more versions of this project + break + + distributions = list(distributions) + distributions.sort() + + return distributions, error_info + + def require(self, *requirements): + """Ensure that distributions matching `requirements` are activated + + `requirements` must be a string or a (possibly-nested) sequence + thereof, specifying the distributions and versions required. The + return value is a sequence of the distributions that needed to be + activated to fulfill the requirements; all relevant distributions are + included, even if they were already activated in this working set. + """ + needed = self.resolve(parse_requirements(requirements)) + + for dist in needed: + self.add(dist) + + return needed + + def subscribe(self, callback, existing=True): + """Invoke `callback` for all distributions + + If `existing=True` (default), + call on all existing ones, as well. + """ + if callback in self.callbacks: + return + self.callbacks.append(callback) + if not existing: + return + for dist in self: + callback(dist) + + def _added_new(self, dist): + for callback in self.callbacks: + callback(dist) + + def __getstate__(self): + return ( + self.entries[:], self.entry_keys.copy(), self.by_key.copy(), + self.callbacks[:] + ) + + def __setstate__(self, e_k_b_c): + entries, keys, by_key, callbacks = e_k_b_c + self.entries = entries[:] + self.entry_keys = keys.copy() + self.by_key = by_key.copy() + self.callbacks = callbacks[:] + + +class _ReqExtras(dict): + """ + Map each requirement to the extras that demanded it. + """ + + def markers_pass(self, req, extras=None): + """ + Evaluate markers for req against each extra that + demanded it. + + Return False if the req has a marker and fails + evaluation. Otherwise, return True. + """ + extra_evals = ( + req.marker.evaluate({'extra': extra}) + for extra in self.get(req, ()) + (extras or (None,)) + ) + return not req.marker or any(extra_evals) + + +class Environment(object): + """Searchable snapshot of distributions on a search path""" + + def __init__(self, search_path=None, platform=get_supported_platform(), + python=PY_MAJOR): + """Snapshot distributions available on a search path + + Any distributions found on `search_path` are added to the environment. + `search_path` should be a sequence of ``sys.path`` items. If not + supplied, ``sys.path`` is used. + + `platform` is an optional string specifying the name of the platform + that platform-specific distributions must be compatible with. If + unspecified, it defaults to the current platform. 
`python` is an + optional string naming the desired version of Python (e.g. ``'3.3'``); + it defaults to the current version. + + You may explicitly set `platform` (and/or `python`) to ``None`` if you + wish to map *all* distributions, not just those compatible with the + running platform or Python version. + """ + self._distmap = {} + self.platform = platform + self.python = python + self.scan(search_path) + + def can_add(self, dist): + """Is distribution `dist` acceptable for this environment? + + The distribution must match the platform and python version + requirements specified when this environment was created, or False + is returned. + """ + return (self.python is None or dist.py_version is None + or dist.py_version == self.python) \ + and compatible_platforms(dist.platform, self.platform) + + def remove(self, dist): + """Remove `dist` from the environment""" + self._distmap[dist.key].remove(dist) + + def scan(self, search_path=None): + """Scan `search_path` for distributions usable in this environment + + Any distributions found are added to the environment. + `search_path` should be a sequence of ``sys.path`` items. If not + supplied, ``sys.path`` is used. Only distributions conforming to + the platform/python version defined at initialization are added. + """ + if search_path is None: + search_path = sys.path + + for item in search_path: + for dist in find_distributions(item): + self.add(dist) + + def __getitem__(self, project_name): + """Return a newest-to-oldest list of distributions for `project_name` + + Uses case-insensitive `project_name` comparison, assuming all the + project's distributions use their project's name converted to all + lowercase as their key. + + """ + distribution_key = project_name.lower() + return self._distmap.get(distribution_key, []) + + def add(self, dist): + """Add `dist` if we ``can_add()`` it and it has not already been added + """ + if self.can_add(dist) and dist.has_version(): + dists = self._distmap.setdefault(dist.key, []) + if dist not in dists: + dists.append(dist) + dists.sort(key=operator.attrgetter('hashcmp'), reverse=True) + + def best_match(self, req, working_set, installer=None): + """Find distribution best matching `req` and usable on `working_set` + + This calls the ``find(req)`` method of the `working_set` to see if a + suitable distribution is already active. (This may raise + ``VersionConflict`` if an unsuitable version of the project is already + active in the specified `working_set`.) If a suitable distribution + isn't active, this method returns the newest distribution in the + environment that meets the ``Requirement`` in `req`. If no suitable + distribution is found, and `installer` is supplied, then the result of + calling the environment's ``obtain(req, installer)`` method will be + returned. + """ + dist = working_set.find(req) + if dist is not None: + return dist + for dist in self[req.key]: + if dist in req: + return dist + # try to download/install + return self.obtain(req, installer) + + def obtain(self, requirement, installer=None): + """Obtain a distribution matching `requirement` (e.g. via download) + + Obtain a distro that matches requirement (e.g. via download). In the + base ``Environment`` class, this routine just returns + ``installer(requirement)``, unless `installer` is None, in which case + None is returned instead. 
This method is a hook that allows subclasses + to attempt other ways of obtaining a distribution before falling back + to the `installer` argument.""" + if installer is not None: + return installer(requirement) + + def __iter__(self): + """Yield the unique project names of the available distributions""" + for key in self._distmap.keys(): + if self[key]: + yield key + + def __iadd__(self, other): + """In-place addition of a distribution or environment""" + if isinstance(other, Distribution): + self.add(other) + elif isinstance(other, Environment): + for project in other: + for dist in other[project]: + self.add(dist) + else: + raise TypeError("Can't add %r to environment" % (other,)) + return self + + def __add__(self, other): + """Add an environment or distribution to an environment""" + new = self.__class__([], platform=None, python=None) + for env in self, other: + new += env + return new + + +# XXX backward compatibility +AvailableDistributions = Environment + + +class ExtractionError(RuntimeError): + """An error occurred extracting a resource + + The following attributes are available from instances of this exception: + + manager + The resource manager that raised this exception + + cache_path + The base directory for resource extraction + + original_error + The exception instance that caused extraction to fail + """ + + +class ResourceManager: + """Manage resource extraction and packages""" + extraction_path = None + + def __init__(self): + self.cached_files = {} + + def resource_exists(self, package_or_requirement, resource_name): + """Does the named resource exist?""" + return get_provider(package_or_requirement).has_resource(resource_name) + + def resource_isdir(self, package_or_requirement, resource_name): + """Is the named resource an existing directory?""" + return get_provider(package_or_requirement).resource_isdir( + resource_name + ) + + def resource_filename(self, package_or_requirement, resource_name): + """Return a true filesystem path for specified resource""" + return get_provider(package_or_requirement).get_resource_filename( + self, resource_name + ) + + def resource_stream(self, package_or_requirement, resource_name): + """Return a readable file-like object for specified resource""" + return get_provider(package_or_requirement).get_resource_stream( + self, resource_name + ) + + def resource_string(self, package_or_requirement, resource_name): + """Return specified resource as a string""" + return get_provider(package_or_requirement).get_resource_string( + self, resource_name + ) + + def resource_listdir(self, package_or_requirement, resource_name): + """List the contents of the named resource directory""" + return get_provider(package_or_requirement).resource_listdir( + resource_name + ) + + def extraction_error(self): + """Give an error message for problems extracting file(s)""" + + old_exc = sys.exc_info()[1] + cache_path = self.extraction_path or get_default_cache() + + tmpl = textwrap.dedent(""" + Can't extract file(s) to egg cache + + The following error occurred while trying to extract file(s) to the Python egg + cache: + + {old_exc} + + The Python egg cache directory is currently set to: + + {cache_path} + + Perhaps your account does not have write access to this directory? You can + change the cache directory by setting the PYTHON_EGG_CACHE environment + variable to point to an accessible directory. 
+ """).lstrip() + err = ExtractionError(tmpl.format(**locals())) + err.manager = self + err.cache_path = cache_path + err.original_error = old_exc + raise err + + def get_cache_path(self, archive_name, names=()): + """Return absolute location in cache for `archive_name` and `names` + + The parent directory of the resulting path will be created if it does + not already exist. `archive_name` should be the base filename of the + enclosing egg (which may not be the name of the enclosing zipfile!), + including its ".egg" extension. `names`, if provided, should be a + sequence of path name parts "under" the egg's extraction location. + + This method should only be called by resource providers that need to + obtain an extraction location, and only for names they intend to + extract, as it tracks the generated names for possible cleanup later. + """ + extract_path = self.extraction_path or get_default_cache() + target_path = os.path.join(extract_path, archive_name + '-tmp', *names) + try: + _bypass_ensure_directory(target_path) + except: + self.extraction_error() + + self._warn_unsafe_extraction_path(extract_path) + + self.cached_files[target_path] = 1 + return target_path + + @staticmethod + def _warn_unsafe_extraction_path(path): + """ + If the default extraction path is overridden and set to an insecure + location, such as /tmp, it opens up an opportunity for an attacker to + replace an extracted file with an unauthorized payload. Warn the user + if a known insecure location is used. + + See Distribute #375 for more details. + """ + if os.name == 'nt' and not path.startswith(os.environ['windir']): + # On Windows, permissions are generally restrictive by default + # and temp directories are not writable by other users, so + # bypass the warning. + return + mode = os.stat(path).st_mode + if mode & stat.S_IWOTH or mode & stat.S_IWGRP: + msg = ("%s is writable by group/others and vulnerable to attack " + "when " + "used with get_resource_filename. Consider a more secure " + "location (set with .set_extraction_path or the " + "PYTHON_EGG_CACHE environment variable)." % path) + warnings.warn(msg, UserWarning) + + def postprocess(self, tempname, filename): + """Perform any platform-specific postprocessing of `tempname` + + This is where Mac header rewrites should be done; other platforms don't + have anything special they should do. + + Resource providers should call this method ONLY after successfully + extracting a compressed resource. They must NOT call it on resources + that are already in the filesystem. + + `tempname` is the current (temporary) name of the file, and `filename` + is the name it will be renamed to by the caller after this routine + returns. + """ + + if os.name == 'posix': + # Make the resource executable + mode = ((os.stat(tempname).st_mode) | 0o555) & 0o7777 + os.chmod(tempname, mode) + + def set_extraction_path(self, path): + """Set the base path where resources will be extracted to, if needed. + + If you do not call this routine before any extractions take place, the + path defaults to the return value of ``get_default_cache()``. (Which + is based on the ``PYTHON_EGG_CACHE`` environment variable, with various + platform-specific fallbacks. See that routine's documentation for more + details.) + + Resources are extracted to subdirectories of this path based upon + information given by the ``IResourceProvider``. You may set this to a + temporary directory, but then you must call ``cleanup_resources()`` to + delete the extracted files when done. 
There is no guarantee that + ``cleanup_resources()`` will be able to remove all extracted files. + + (Note: you may not change the extraction path for a given resource + manager once resources have been extracted, unless you first call + ``cleanup_resources()``.) + """ + if self.cached_files: + raise ValueError( + "Can't change extraction path, files already extracted" + ) + + self.extraction_path = path + + def cleanup_resources(self, force=False): + """ + Delete all extracted resource files and directories, returning a list + of the file and directory names that could not be successfully removed. + This function does not have any concurrency protection, so it should + generally only be called when the extraction path is a temporary + directory exclusive to a single process. This method is not + automatically called; you must call it explicitly or register it as an + ``atexit`` function if you wish to ensure cleanup of a temporary + directory used for extractions. + """ + # XXX + + +def get_default_cache(): + """ + Return the ``PYTHON_EGG_CACHE`` environment variable + or a platform-relevant user cache dir for an app + named "Python-Eggs". + """ + return ( + os.environ.get('PYTHON_EGG_CACHE') + or appdirs.user_cache_dir(appname='Python-Eggs') + ) + + +def safe_name(name): + """Convert an arbitrary string to a standard distribution name + + Any runs of non-alphanumeric/. characters are replaced with a single '-'. + """ + return re.sub('[^A-Za-z0-9.]+', '-', name) + + +def safe_version(version): + """ + Convert an arbitrary string to a standard version string + """ + try: + # normalize the version + return str(packaging.version.Version(version)) + except packaging.version.InvalidVersion: + version = version.replace(' ', '.') + return re.sub('[^A-Za-z0-9.]+', '-', version) + + +def safe_extra(extra): + """Convert an arbitrary string to a standard 'extra' name + + Any runs of non-alphanumeric characters are replaced with a single '_', + and the result is always lowercased. + """ + return re.sub('[^A-Za-z0-9.-]+', '_', extra).lower() + + +def to_filename(name): + """Convert a project or version name to its filename-escaped form + + Any '-' characters are currently replaced with '_'. + """ + return name.replace('-', '_') + + +def invalid_marker(text): + """ + Validate text as a PEP 508 environment marker; return an exception + if invalid or False otherwise. + """ + try: + evaluate_marker(text) + except SyntaxError as e: + e.filename = None + e.lineno = None + return e + return False + + +def evaluate_marker(text, extra=None): + """ + Evaluate a PEP 508 environment marker. + Return a boolean indicating the marker result in this environment. + Raise SyntaxError if marker is invalid. + + This implementation uses the 'pyparsing' module. 
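For example, mirroring the checks exercised in ``api_tests.txt`` below::

    from pkg_resources import evaluate_marker, invalid_marker

    evaluate_marker("python_version >= '2.6'")   # True on any supported interpreter
    invalid_marker("sys_platform=='win32'")      # False -- the marker is well formed
    invalid_marker("sys_platform==")             # a SyntaxError instance describing the parse error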
+ """ + try: + marker = packaging.markers.Marker(text) + return marker.evaluate() + except packaging.markers.InvalidMarker as e: + raise SyntaxError(e) + + +class NullProvider: + """Try to implement resources and metadata for arbitrary PEP 302 loaders""" + + egg_name = None + egg_info = None + loader = None + + def __init__(self, module): + self.loader = getattr(module, '__loader__', None) + self.module_path = os.path.dirname(getattr(module, '__file__', '')) + + def get_resource_filename(self, manager, resource_name): + return self._fn(self.module_path, resource_name) + + def get_resource_stream(self, manager, resource_name): + return io.BytesIO(self.get_resource_string(manager, resource_name)) + + def get_resource_string(self, manager, resource_name): + return self._get(self._fn(self.module_path, resource_name)) + + def has_resource(self, resource_name): + return self._has(self._fn(self.module_path, resource_name)) + + def has_metadata(self, name): + return self.egg_info and self._has(self._fn(self.egg_info, name)) + + def get_metadata(self, name): + if not self.egg_info: + return "" + value = self._get(self._fn(self.egg_info, name)) + return value.decode('utf-8') if six.PY3 else value + + def get_metadata_lines(self, name): + return yield_lines(self.get_metadata(name)) + + def resource_isdir(self, resource_name): + return self._isdir(self._fn(self.module_path, resource_name)) + + def metadata_isdir(self, name): + return self.egg_info and self._isdir(self._fn(self.egg_info, name)) + + def resource_listdir(self, resource_name): + return self._listdir(self._fn(self.module_path, resource_name)) + + def metadata_listdir(self, name): + if self.egg_info: + return self._listdir(self._fn(self.egg_info, name)) + return [] + + def run_script(self, script_name, namespace): + script = 'scripts/' + script_name + if not self.has_metadata(script): + raise ResolutionError("No script named %r" % script_name) + script_text = self.get_metadata(script).replace('\r\n', '\n') + script_text = script_text.replace('\r', '\n') + script_filename = self._fn(self.egg_info, script) + namespace['__file__'] = script_filename + if os.path.exists(script_filename): + source = open(script_filename).read() + code = compile(source, script_filename, 'exec') + exec(code, namespace, namespace) + else: + from linecache import cache + cache[script_filename] = ( + len(script_text), 0, script_text.split('\n'), script_filename + ) + script_code = compile(script_text, script_filename, 'exec') + exec(script_code, namespace, namespace) + + def _has(self, path): + raise NotImplementedError( + "Can't perform this operation for unregistered loader type" + ) + + def _isdir(self, path): + raise NotImplementedError( + "Can't perform this operation for unregistered loader type" + ) + + def _listdir(self, path): + raise NotImplementedError( + "Can't perform this operation for unregistered loader type" + ) + + def _fn(self, base, resource_name): + if resource_name: + return os.path.join(base, *resource_name.split('/')) + return base + + def _get(self, path): + if hasattr(self.loader, 'get_data'): + return self.loader.get_data(path) + raise NotImplementedError( + "Can't perform this operation for loaders without 'get_data()'" + ) + + +register_loader_type(object, NullProvider) + + +class EggProvider(NullProvider): + """Provider based on a virtual filesystem""" + + def __init__(self, module): + NullProvider.__init__(self, module) + self._setup_prefix() + + def _setup_prefix(self): + # we assume here that our metadata may be nested inside a 
"basket" + # of multiple eggs; that's why we use module_path instead of .archive + path = self.module_path + old = None + while path != old: + if _is_unpacked_egg(path): + self.egg_name = os.path.basename(path) + self.egg_info = os.path.join(path, 'EGG-INFO') + self.egg_root = path + break + old = path + path, base = os.path.split(path) + + +class DefaultProvider(EggProvider): + """Provides access to package resources in the filesystem""" + + def _has(self, path): + return os.path.exists(path) + + def _isdir(self, path): + return os.path.isdir(path) + + def _listdir(self, path): + return os.listdir(path) + + def get_resource_stream(self, manager, resource_name): + return open(self._fn(self.module_path, resource_name), 'rb') + + def _get(self, path): + with open(path, 'rb') as stream: + return stream.read() + + @classmethod + def _register(cls): + loader_cls = getattr(importlib_machinery, 'SourceFileLoader', + type(None)) + register_loader_type(loader_cls, cls) + + +DefaultProvider._register() + + +class EmptyProvider(NullProvider): + """Provider that returns nothing for all requests""" + + _isdir = _has = lambda self, path: False + _get = lambda self, path: '' + _listdir = lambda self, path: [] + module_path = None + + def __init__(self): + pass + + +empty_provider = EmptyProvider() + + +class ZipManifests(dict): + """ + zip manifest builder + """ + + @classmethod + def build(cls, path): + """ + Build a dictionary similar to the zipimport directory + caches, except instead of tuples, store ZipInfo objects. + + Use a platform-specific path separator (os.sep) for the path keys + for compatibility with pypy on Windows. + """ + with ContextualZipFile(path) as zfile: + items = ( + ( + name.replace('/', os.sep), + zfile.getinfo(name), + ) + for name in zfile.namelist() + ) + return dict(items) + + load = build + + +class MemoizedZipManifests(ZipManifests): + """ + Memoized zipfile manifests. + """ + manifest_mod = collections.namedtuple('manifest_mod', 'manifest mtime') + + def load(self, path): + """ + Load a manifest at path or return a suitable manifest already loaded. + """ + path = os.path.normpath(path) + mtime = os.stat(path).st_mtime + + if path not in self or self[path].mtime != mtime: + manifest = self.build(path) + self[path] = self.manifest_mod(manifest, mtime) + + return self[path].manifest + + +class ContextualZipFile(zipfile.ZipFile): + """ + Supplement ZipFile class to support context manager for Python 2.6 + """ + + def __enter__(self): + return self + + def __exit__(self, type, value, traceback): + self.close() + + def __new__(cls, *args, **kwargs): + """ + Construct a ZipFile or ContextualZipFile as appropriate + """ + if hasattr(zipfile.ZipFile, '__exit__'): + return zipfile.ZipFile(*args, **kwargs) + return super(ContextualZipFile, cls).__new__(cls) + + +class ZipProvider(EggProvider): + """Resource support for zips and eggs""" + + eagers = None + _zip_manifests = MemoizedZipManifests() + + def __init__(self, module): + EggProvider.__init__(self, module) + self.zip_pre = self.loader.archive + os.sep + + def _zipinfo_name(self, fspath): + # Convert a virtual filename (full path to file) into a zipfile subpath + # usable with the zipimport directory cache for our target archive + if fspath.startswith(self.zip_pre): + return fspath[len(self.zip_pre):] + raise AssertionError( + "%s is not a subpath of %s" % (fspath, self.zip_pre) + ) + + def _parts(self, zip_path): + # Convert a zipfile subpath into an egg-relative path part list. 
+ # pseudo-fs path + fspath = self.zip_pre + zip_path + if fspath.startswith(self.egg_root + os.sep): + return fspath[len(self.egg_root) + 1:].split(os.sep) + raise AssertionError( + "%s is not a subpath of %s" % (fspath, self.egg_root) + ) + + @property + def zipinfo(self): + return self._zip_manifests.load(self.loader.archive) + + def get_resource_filename(self, manager, resource_name): + if not self.egg_name: + raise NotImplementedError( + "resource_filename() only supported for .egg, not .zip" + ) + # no need to lock for extraction, since we use temp names + zip_path = self._resource_to_zip(resource_name) + eagers = self._get_eager_resources() + if '/'.join(self._parts(zip_path)) in eagers: + for name in eagers: + self._extract_resource(manager, self._eager_to_zip(name)) + return self._extract_resource(manager, zip_path) + + @staticmethod + def _get_date_and_size(zip_stat): + size = zip_stat.file_size + # ymdhms+wday, yday, dst + date_time = zip_stat.date_time + (0, 0, -1) + # 1980 offset already done + timestamp = time.mktime(date_time) + return timestamp, size + + def _extract_resource(self, manager, zip_path): + + if zip_path in self._index(): + for name in self._index()[zip_path]: + last = self._extract_resource( + manager, os.path.join(zip_path, name) + ) + # return the extracted directory name + return os.path.dirname(last) + + timestamp, size = self._get_date_and_size(self.zipinfo[zip_path]) + + if not WRITE_SUPPORT: + raise IOError('"os.rename" and "os.unlink" are not supported ' + 'on this platform') + try: + + real_path = manager.get_cache_path( + self.egg_name, self._parts(zip_path) + ) + + if self._is_current(real_path, zip_path): + return real_path + + outf, tmpnam = _mkstemp(".$extract", dir=os.path.dirname(real_path)) + os.write(outf, self.loader.get_data(zip_path)) + os.close(outf) + utime(tmpnam, (timestamp, timestamp)) + manager.postprocess(tmpnam, real_path) + + try: + rename(tmpnam, real_path) + + except os.error: + if os.path.isfile(real_path): + if self._is_current(real_path, zip_path): + # the file became current since it was checked above, + # so proceed. 
+ return real_path + # Windows, del old file and retry + elif os.name == 'nt': + unlink(real_path) + rename(tmpnam, real_path) + return real_path + raise + + except os.error: + # report a user-friendly error + manager.extraction_error() + + return real_path + + def _is_current(self, file_path, zip_path): + """ + Return True if the file_path is current for this zip_path + """ + timestamp, size = self._get_date_and_size(self.zipinfo[zip_path]) + if not os.path.isfile(file_path): + return False + stat = os.stat(file_path) + if stat.st_size != size or stat.st_mtime != timestamp: + return False + # check that the contents match + zip_contents = self.loader.get_data(zip_path) + with open(file_path, 'rb') as f: + file_contents = f.read() + return zip_contents == file_contents + + def _get_eager_resources(self): + if self.eagers is None: + eagers = [] + for name in ('native_libs.txt', 'eager_resources.txt'): + if self.has_metadata(name): + eagers.extend(self.get_metadata_lines(name)) + self.eagers = eagers + return self.eagers + + def _index(self): + try: + return self._dirindex + except AttributeError: + ind = {} + for path in self.zipinfo: + parts = path.split(os.sep) + while parts: + parent = os.sep.join(parts[:-1]) + if parent in ind: + ind[parent].append(parts[-1]) + break + else: + ind[parent] = [parts.pop()] + self._dirindex = ind + return ind + + def _has(self, fspath): + zip_path = self._zipinfo_name(fspath) + return zip_path in self.zipinfo or zip_path in self._index() + + def _isdir(self, fspath): + return self._zipinfo_name(fspath) in self._index() + + def _listdir(self, fspath): + return list(self._index().get(self._zipinfo_name(fspath), ())) + + def _eager_to_zip(self, resource_name): + return self._zipinfo_name(self._fn(self.egg_root, resource_name)) + + def _resource_to_zip(self, resource_name): + return self._zipinfo_name(self._fn(self.module_path, resource_name)) + + +register_loader_type(zipimport.zipimporter, ZipProvider) + + +class FileMetadata(EmptyProvider): + """Metadata handler for standalone PKG-INFO files + + Usage:: + + metadata = FileMetadata("/path/to/PKG-INFO") + + This provider rejects all data and metadata requests except for PKG-INFO, + which is treated as existing, and will be the contents of the file at + the provided location. 
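A sketch of how this provider is typically paired with a ``Distribution``, essentially the pattern ``find_on_path`` applies to ``*.egg-info`` files further below (which additionally passes ``precedence=DEVELOP_DIST``); the paths are placeholders::

    from pkg_resources import Distribution, FileMetadata

    metadata = FileMetadata('/path/to/Foo.egg-info')
    dist = Distribution.from_location('/path/to', 'Foo.egg-info', metadata=metadata)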
+ """ + + def __init__(self, path): + self.path = path + + def has_metadata(self, name): + return name == 'PKG-INFO' and os.path.isfile(self.path) + + def get_metadata(self, name): + if name != 'PKG-INFO': + raise KeyError("No metadata except PKG-INFO is available") + + with io.open(self.path, encoding='utf-8', errors="replace") as f: + metadata = f.read() + self._warn_on_replacement(metadata) + return metadata + + def _warn_on_replacement(self, metadata): + # Python 2.6 and 3.2 compat for: replacement_char = '�' + replacement_char = b'\xef\xbf\xbd'.decode('utf-8') + if replacement_char in metadata: + tmpl = "{self.path} could not be properly decoded in UTF-8" + msg = tmpl.format(**locals()) + warnings.warn(msg) + + def get_metadata_lines(self, name): + return yield_lines(self.get_metadata(name)) + + +class PathMetadata(DefaultProvider): + """Metadata provider for egg directories + + Usage:: + + # Development eggs: + + egg_info = "/path/to/PackageName.egg-info" + base_dir = os.path.dirname(egg_info) + metadata = PathMetadata(base_dir, egg_info) + dist_name = os.path.splitext(os.path.basename(egg_info))[0] + dist = Distribution(basedir, project_name=dist_name, metadata=metadata) + + # Unpacked egg directories: + + egg_path = "/path/to/PackageName-ver-pyver-etc.egg" + metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO')) + dist = Distribution.from_filename(egg_path, metadata=metadata) + """ + + def __init__(self, path, egg_info): + self.module_path = path + self.egg_info = egg_info + + +class EggMetadata(ZipProvider): + """Metadata provider for .egg files""" + + def __init__(self, importer): + """Create a metadata provider from a zipimporter""" + + self.zip_pre = importer.archive + os.sep + self.loader = importer + if importer.prefix: + self.module_path = os.path.join(importer.archive, importer.prefix) + else: + self.module_path = importer.archive + self._setup_prefix() + + +_declare_state('dict', _distribution_finders={}) + + +def register_finder(importer_type, distribution_finder): + """Register `distribution_finder` to find distributions in sys.path items + + `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item + handler), and `distribution_finder` is a callable that, passed a path + item and the importer instance, yields ``Distribution`` instances found on + that path item. See ``pkg_resources.find_on_path`` for an example.""" + _distribution_finders[importer_type] = distribution_finder + + +def find_distributions(path_item, only=False): + """Yield distributions accessible via `path_item`""" + importer = get_importer(path_item) + finder = _find_adapter(_distribution_finders, importer) + return finder(importer, path_item, only) + + +def find_eggs_in_zip(importer, path_item, only=False): + """ + Find eggs in zip files; possibly multiple nested eggs. 
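Callers normally reach this finder through ``find_distributions`` above, which dispatches on the zipimporter type registered just below. A sketch with a hypothetical egg path::

    import pkg_resources

    for dist in pkg_resources.find_distributions('/path/to/plugins/Foo-1.0-py2.7.egg'):
        print('%s %s' % (dist.project_name, dist.version))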
+ """ + if importer.archive.endswith('.whl'): + # wheels are not supported with this finder + # they don't have PKG-INFO metadata, and won't ever contain eggs + return + metadata = EggMetadata(importer) + if metadata.has_metadata('PKG-INFO'): + yield Distribution.from_filename(path_item, metadata=metadata) + if only: + # don't yield nested distros + return + for subitem in metadata.resource_listdir('/'): + if _is_unpacked_egg(subitem): + subpath = os.path.join(path_item, subitem) + for dist in find_eggs_in_zip(zipimport.zipimporter(subpath), subpath): + yield dist + elif subitem.lower().endswith('.dist-info'): + subpath = os.path.join(path_item, subitem) + submeta = EggMetadata(zipimport.zipimporter(subpath)) + submeta.egg_info = subpath + yield Distribution.from_location(path_item, subitem, submeta) + + + +register_finder(zipimport.zipimporter, find_eggs_in_zip) + + +def find_nothing(importer, path_item, only=False): + return () + + +register_finder(object, find_nothing) + + +def _by_version_descending(names): + """ + Given a list of filenames, return them in descending order + by version number. + + >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg' + >>> _by_version_descending(names) + ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar'] + >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg' + >>> _by_version_descending(names) + ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg'] + >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg' + >>> _by_version_descending(names) + ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg'] + """ + def _by_version(name): + """ + Parse each component of the filename + """ + name, ext = os.path.splitext(name) + parts = itertools.chain(name.split('-'), [ext]) + return [packaging.version.parse(part) for part in parts] + + return sorted(names, key=_by_version, reverse=True) + + +def find_on_path(importer, path_item, only=False): + """Yield distributions accessible on a sys.path directory""" + path_item = _normalize_cached(path_item) + + if os.path.isdir(path_item) and os.access(path_item, os.R_OK): + if _is_unpacked_egg(path_item): + yield Distribution.from_filename( + path_item, metadata=PathMetadata( + path_item, os.path.join(path_item, 'EGG-INFO') + ) + ) + else: + # scan for .egg and .egg-info in directory + path_item_entries = _by_version_descending(os.listdir(path_item)) + for entry in path_item_entries: + lower = entry.lower() + if lower.endswith('.egg-info') or lower.endswith('.dist-info'): + fullpath = os.path.join(path_item, entry) + if os.path.isdir(fullpath): + # egg-info directory, allow getting metadata + if len(os.listdir(fullpath)) == 0: + # Empty egg directory, skip. 
+ continue + metadata = PathMetadata(path_item, fullpath) + else: + metadata = FileMetadata(fullpath) + yield Distribution.from_location( + path_item, entry, metadata, precedence=DEVELOP_DIST + ) + elif not only and _is_unpacked_egg(entry): + dists = find_distributions(os.path.join(path_item, entry)) + for dist in dists: + yield dist + elif not only and lower.endswith('.egg-link'): + with open(os.path.join(path_item, entry)) as entry_file: + entry_lines = entry_file.readlines() + for line in entry_lines: + if not line.strip(): + continue + path = os.path.join(path_item, line.rstrip()) + dists = find_distributions(path) + for item in dists: + yield item + break + + +register_finder(pkgutil.ImpImporter, find_on_path) + +if hasattr(importlib_machinery, 'FileFinder'): + register_finder(importlib_machinery.FileFinder, find_on_path) + +_declare_state('dict', _namespace_handlers={}) +_declare_state('dict', _namespace_packages={}) + + +def register_namespace_handler(importer_type, namespace_handler): + """Register `namespace_handler` to declare namespace packages + + `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item + handler), and `namespace_handler` is a callable like this:: + + def namespace_handler(importer, path_entry, moduleName, module): + # return a path_entry to use for child packages + + Namespace handlers are only called if the importer object has already + agreed that it can handle the relevant path item, and they should only + return a subpath if the module __path__ does not already contain an + equivalent subpath. For an example namespace handler, see + ``pkg_resources.file_ns_handler``. + """ + _namespace_handlers[importer_type] = namespace_handler + + +def _handle_ns(packageName, path_item): + """Ensure that named package includes a subpath of path_item (if needed)""" + + importer = get_importer(path_item) + if importer is None: + return None + loader = importer.find_module(packageName) + if loader is None: + return None + module = sys.modules.get(packageName) + if module is None: + module = sys.modules[packageName] = types.ModuleType(packageName) + module.__path__ = [] + _set_parent_ns(packageName) + elif not hasattr(module, '__path__'): + raise TypeError("Not a package:", packageName) + handler = _find_adapter(_namespace_handlers, importer) + subpath = handler(importer, path_item, packageName, module) + if subpath is not None: + path = module.__path__ + path.append(subpath) + loader.load_module(packageName) + _rebuild_mod_path(path, packageName, module) + return subpath + + +def _rebuild_mod_path(orig_path, package_name, module): + """ + Rebuild module.__path__ ensuring that all entries are ordered + corresponding to their sys.path order + """ + sys_path = [_normalize_cached(p) for p in sys.path] + + def safe_sys_path_index(entry): + """ + Workaround for #520 and #513. + """ + try: + return sys_path.index(entry) + except ValueError: + return float('inf') + + def position_in_sys_path(path): + """ + Return the ordinal of the path based on its position in sys.path + """ + path_parts = path.split(os.sep) + module_parts = package_name.count('.') + 1 + parts = path_parts[:-module_parts] + return safe_sys_path_index(_normalize_cached(os.sep.join(parts))) + + if not isinstance(orig_path, list): + # Is this behavior useful when module.__path__ is not a list? 
+ return + + orig_path.sort(key=position_in_sys_path) + module.__path__[:] = [_normalize_cached(p) for p in orig_path] + + +def declare_namespace(packageName): + """Declare that package 'packageName' is a namespace package""" + + _imp.acquire_lock() + try: + if packageName in _namespace_packages: + return + + path, parent = sys.path, None + if '.' in packageName: + parent = '.'.join(packageName.split('.')[:-1]) + declare_namespace(parent) + if parent not in _namespace_packages: + __import__(parent) + try: + path = sys.modules[parent].__path__ + except AttributeError: + raise TypeError("Not a package:", parent) + + # Track what packages are namespaces, so when new path items are added, + # they can be updated + _namespace_packages.setdefault(parent, []).append(packageName) + _namespace_packages.setdefault(packageName, []) + + for path_item in path: + # Ensure all the parent's path items are reflected in the child, + # if they apply + _handle_ns(packageName, path_item) + + finally: + _imp.release_lock() + + +def fixup_namespace_packages(path_item, parent=None): + """Ensure that previously-declared namespace packages include path_item""" + _imp.acquire_lock() + try: + for package in _namespace_packages.get(parent, ()): + subpath = _handle_ns(package, path_item) + if subpath: + fixup_namespace_packages(subpath, package) + finally: + _imp.release_lock() + + +def file_ns_handler(importer, path_item, packageName, module): + """Compute an ns-package subpath for a filesystem or zipfile importer""" + + subpath = os.path.join(path_item, packageName.split('.')[-1]) + normalized = _normalize_cached(subpath) + for item in module.__path__: + if _normalize_cached(item) == normalized: + break + else: + # Only return the path if it's not already there + return subpath + + +register_namespace_handler(pkgutil.ImpImporter, file_ns_handler) +register_namespace_handler(zipimport.zipimporter, file_ns_handler) + +if hasattr(importlib_machinery, 'FileFinder'): + register_namespace_handler(importlib_machinery.FileFinder, file_ns_handler) + + +def null_ns_handler(importer, path_item, packageName, module): + return None + + +register_namespace_handler(object, null_ns_handler) + + +def normalize_path(filename): + """Normalize a file/dir name for comparison purposes""" + return os.path.normcase(os.path.realpath(filename)) + + +def _normalize_cached(filename, _cache={}): + try: + return _cache[filename] + except KeyError: + _cache[filename] = result = normalize_path(filename) + return result + + +def _is_unpacked_egg(path): + """ + Determine if given path appears to be an unpacked egg. + """ + return ( + path.lower().endswith('.egg') + ) + + +def _set_parent_ns(packageName): + parts = packageName.split('.') + name = parts.pop() + if parts: + parent = '.'.join(parts) + setattr(sys.modules[parent], name, sys.modules[packageName]) + + +def yield_lines(strs): + """Yield non-empty/non-comment lines of a string or sequence""" + if isinstance(strs, six.string_types): + for s in strs.splitlines(): + s = s.strip() + # skip blank lines/comments + if s and not s.startswith('#'): + yield s + else: + for ss in strs: + for s in yield_lines(ss): + yield s + + +MODULE = re.compile(r"\w+(\.\w+)*$").match +EGG_NAME = re.compile( + r""" + (?P[^-]+) ( + -(?P[^-]+) ( + -py(?P[^-]+) ( + -(?P.+) + )? + )? + )? 
+ """, + re.VERBOSE | re.IGNORECASE, +).match + + +class EntryPoint(object): + """Object representing an advertised importable object""" + + def __init__(self, name, module_name, attrs=(), extras=(), dist=None): + if not MODULE(module_name): + raise ValueError("Invalid module name", module_name) + self.name = name + self.module_name = module_name + self.attrs = tuple(attrs) + self.extras = Requirement.parse(("x[%s]" % ','.join(extras))).extras + self.dist = dist + + def __str__(self): + s = "%s = %s" % (self.name, self.module_name) + if self.attrs: + s += ':' + '.'.join(self.attrs) + if self.extras: + s += ' [%s]' % ','.join(self.extras) + return s + + def __repr__(self): + return "EntryPoint.parse(%r)" % str(self) + + def load(self, require=True, *args, **kwargs): + """ + Require packages for this EntryPoint, then resolve it. + """ + if not require or args or kwargs: + warnings.warn( + "Parameters to load are deprecated. Call .resolve and " + ".require separately.", + DeprecationWarning, + stacklevel=2, + ) + if require: + self.require(*args, **kwargs) + return self.resolve() + + def resolve(self): + """ + Resolve the entry point from its module and attrs. + """ + module = __import__(self.module_name, fromlist=['__name__'], level=0) + try: + return functools.reduce(getattr, self.attrs, module) + except AttributeError as exc: + raise ImportError(str(exc)) + + def require(self, env=None, installer=None): + if self.extras and not self.dist: + raise UnknownExtra("Can't require() without a distribution", self) + + # Get the requirements for this entry point with all its extras and + # then resolve them. We have to pass `extras` along when resolving so + # that the working set knows what extras we want. Otherwise, for + # dist-info distributions, the working set will assume that the + # requirements for that extra are purely optional and skip over them. 
+ reqs = self.dist.requires(self.extras) + items = working_set.resolve(reqs, env, installer, extras=self.extras) + list(map(working_set.add, items)) + + pattern = re.compile( + r'\s*' + r'(?P.+?)\s*' + r'=\s*' + r'(?P[\w.]+)\s*' + r'(:\s*(?P[\w.]+))?\s*' + r'(?P\[.*\])?\s*$' + ) + + @classmethod + def parse(cls, src, dist=None): + """Parse a single entry point from string `src` + + Entry point syntax follows the form:: + + name = some.module:some.attr [extra1, extra2] + + The entry name and module name are required, but the ``:attrs`` and + ``[extras]`` parts are optional + """ + m = cls.pattern.match(src) + if not m: + msg = "EntryPoint must be in 'name=module:attrs [extras]' format" + raise ValueError(msg, src) + res = m.groupdict() + extras = cls._parse_extras(res['extras']) + attrs = res['attr'].split('.') if res['attr'] else () + return cls(res['name'], res['module'], attrs, extras, dist) + + @classmethod + def _parse_extras(cls, extras_spec): + if not extras_spec: + return () + req = Requirement.parse('x' + extras_spec) + if req.specs: + raise ValueError() + return req.extras + + @classmethod + def parse_group(cls, group, lines, dist=None): + """Parse an entry point group""" + if not MODULE(group): + raise ValueError("Invalid group name", group) + this = {} + for line in yield_lines(lines): + ep = cls.parse(line, dist) + if ep.name in this: + raise ValueError("Duplicate entry point", group, ep.name) + this[ep.name] = ep + return this + + @classmethod + def parse_map(cls, data, dist=None): + """Parse a map of entry point groups""" + if isinstance(data, dict): + data = data.items() + else: + data = split_sections(data) + maps = {} + for group, lines in data: + if group is None: + if not lines: + continue + raise ValueError("Entry points must be listed in groups") + group = group.strip() + if group in maps: + raise ValueError("Duplicate group name", group) + maps[group] = cls.parse_group(group, lines, dist) + return maps + + +def _remove_md5_fragment(location): + if not location: + return '' + parsed = urllib.parse.urlparse(location) + if parsed[-1].startswith('md5='): + return urllib.parse.urlunparse(parsed[:-1] + ('',)) + return location + + +def _version_from_file(lines): + """ + Given an iterable of lines from a Metadata file, return + the value of the Version field, if present, or None otherwise. 
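This is an internal helper; the following sketch shows the contract with made-up metadata lines::

    _version_from_file(['Metadata-Version: 1.0', 'Name: foo', 'Version: 1.0'])
    # -> '1.0'
    _version_from_file(['Name: foo'])
    # -> None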
+ """ + is_version_line = lambda line: line.lower().startswith('version:') + version_lines = filter(is_version_line, lines) + line = next(iter(version_lines), '') + _, _, value = line.partition(':') + return safe_version(value.strip()) or None + + +class Distribution(object): + """Wrap an actual or potential sys.path entry w/metadata""" + PKG_INFO = 'PKG-INFO' + + def __init__(self, location=None, metadata=None, project_name=None, + version=None, py_version=PY_MAJOR, platform=None, + precedence=EGG_DIST): + self.project_name = safe_name(project_name or 'Unknown') + if version is not None: + self._version = safe_version(version) + self.py_version = py_version + self.platform = platform + self.location = location + self.precedence = precedence + self._provider = metadata or empty_provider + + @classmethod + def from_location(cls, location, basename, metadata=None, **kw): + project_name, version, py_version, platform = [None] * 4 + basename, ext = os.path.splitext(basename) + if ext.lower() in _distributionImpl: + cls = _distributionImpl[ext.lower()] + + match = EGG_NAME(basename) + if match: + project_name, version, py_version, platform = match.group( + 'name', 'ver', 'pyver', 'plat' + ) + return cls( + location, metadata, project_name=project_name, version=version, + py_version=py_version, platform=platform, **kw + )._reload_version() + + def _reload_version(self): + return self + + @property + def hashcmp(self): + return ( + self.parsed_version, + self.precedence, + self.key, + _remove_md5_fragment(self.location), + self.py_version or '', + self.platform or '', + ) + + def __hash__(self): + return hash(self.hashcmp) + + def __lt__(self, other): + return self.hashcmp < other.hashcmp + + def __le__(self, other): + return self.hashcmp <= other.hashcmp + + def __gt__(self, other): + return self.hashcmp > other.hashcmp + + def __ge__(self, other): + return self.hashcmp >= other.hashcmp + + def __eq__(self, other): + if not isinstance(other, self.__class__): + # It's not a Distribution, so they are not equal + return False + return self.hashcmp == other.hashcmp + + def __ne__(self, other): + return not self == other + + # These properties have to be lazy so that we don't have to load any + # metadata until/unless it's actually needed. (i.e., some distributions + # may not know their name or version without loading PKG-INFO) + + @property + def key(self): + try: + return self._key + except AttributeError: + self._key = key = self.project_name.lower() + return key + + @property + def parsed_version(self): + if not hasattr(self, "_parsed_version"): + self._parsed_version = parse_version(self.version) + + return self._parsed_version + + def _warn_legacy_version(self): + LV = packaging.version.LegacyVersion + is_legacy = isinstance(self._parsed_version, LV) + if not is_legacy: + return + + # While an empty version is technically a legacy version and + # is not a valid PEP 440 version, it's also unlikely to + # actually come from someone and instead it is more likely that + # it comes from setuptools attempting to parse a filename and + # including it in the list. So for that we'll gate this warning + # on if the version is anything at all or not. + if not self.version: + return + + tmpl = textwrap.dedent(""" + '{project_name} ({version})' is being parsed as a legacy, + non PEP 440, + version. You may find odd behavior and sort order. + In particular it will be sorted as less than 0.0. It + is recommended to migrate to PEP 440 compatible + versions. 
+ """).strip().replace('\n', ' ') + + warnings.warn(tmpl.format(**vars(self)), PEP440Warning) + + @property + def version(self): + try: + return self._version + except AttributeError: + version = _version_from_file(self._get_metadata(self.PKG_INFO)) + if version is None: + tmpl = "Missing 'Version:' header and/or %s file" + raise ValueError(tmpl % self.PKG_INFO, self) + return version + + @property + def _dep_map(self): + try: + return self.__dep_map + except AttributeError: + dm = self.__dep_map = {None: []} + for name in 'requires.txt', 'depends.txt': + for extra, reqs in split_sections(self._get_metadata(name)): + if extra: + if ':' in extra: + extra, marker = extra.split(':', 1) + if invalid_marker(marker): + # XXX warn + reqs = [] + elif not evaluate_marker(marker): + reqs = [] + extra = safe_extra(extra) or None + dm.setdefault(extra, []).extend(parse_requirements(reqs)) + return dm + + def requires(self, extras=()): + """List of Requirements needed for this distro if `extras` are used""" + dm = self._dep_map + deps = [] + deps.extend(dm.get(None, ())) + for ext in extras: + try: + deps.extend(dm[safe_extra(ext)]) + except KeyError: + raise UnknownExtra( + "%s has no such extra feature %r" % (self, ext) + ) + return deps + + def _get_metadata(self, name): + if self.has_metadata(name): + for line in self.get_metadata_lines(name): + yield line + + def activate(self, path=None, replace=False): + """Ensure distribution is importable on `path` (default=sys.path)""" + if path is None: + path = sys.path + self.insert_on(path, replace=replace) + if path is sys.path: + fixup_namespace_packages(self.location) + for pkg in self._get_metadata('namespace_packages.txt'): + if pkg in sys.modules: + declare_namespace(pkg) + + def egg_name(self): + """Return what this distribution's standard .egg filename should be""" + filename = "%s-%s-py%s" % ( + to_filename(self.project_name), to_filename(self.version), + self.py_version or PY_MAJOR + ) + + if self.platform: + filename += '-' + self.platform + return filename + + def __repr__(self): + if self.location: + return "%s (%s)" % (self, self.location) + else: + return str(self) + + def __str__(self): + try: + version = getattr(self, 'version', None) + except ValueError: + version = None + version = version or "[unknown version]" + return "%s %s" % (self.project_name, version) + + def __getattr__(self, attr): + """Delegate all unrecognized public attributes to .metadata provider""" + if attr.startswith('_'): + raise AttributeError(attr) + return getattr(self._provider, attr) + + @classmethod + def from_filename(cls, filename, metadata=None, **kw): + return cls.from_location( + _normalize_cached(filename), os.path.basename(filename), metadata, + **kw + ) + + def as_requirement(self): + """Return a ``Requirement`` that matches this distribution exactly""" + if isinstance(self.parsed_version, packaging.version.Version): + spec = "%s==%s" % (self.project_name, self.parsed_version) + else: + spec = "%s===%s" % (self.project_name, self.parsed_version) + + return Requirement.parse(spec) + + def load_entry_point(self, group, name): + """Return the `name` entry point of `group` or raise ImportError""" + ep = self.get_entry_info(group, name) + if ep is None: + raise ImportError("Entry point %r not found" % ((group, name),)) + return ep.load() + + def get_entry_map(self, group=None): + """Return the entry point map for `group`, or the full entry map""" + try: + ep_map = self._ep_map + except AttributeError: + ep_map = self._ep_map = EntryPoint.parse_map( + 
self._get_metadata('entry_points.txt'), self + ) + if group is not None: + return ep_map.get(group, {}) + return ep_map + + def get_entry_info(self, group, name): + """Return the EntryPoint object for `group`+`name`, or ``None``""" + return self.get_entry_map(group).get(name) + + def insert_on(self, path, loc=None, replace=False): + """Ensure self.location is on path + + If replace=False (default): + - If location is already in path anywhere, do nothing. + - Else: + - If it's an egg and its parent directory is on path, + insert just ahead of the parent. + - Else: add to the end of path. + If replace=True: + - If location is already on path anywhere (not eggs) + or higher priority than its parent (eggs) + do nothing. + - Else: + - If it's an egg and its parent directory is on path, + insert just ahead of the parent, + removing any lower-priority entries. + - Else: add it to the front of path. + """ + + loc = loc or self.location + if not loc: + return + + nloc = _normalize_cached(loc) + bdir = os.path.dirname(nloc) + npath = [(p and _normalize_cached(p) or p) for p in path] + + for p, item in enumerate(npath): + if item == nloc: + if replace: + break + else: + # don't modify path (even removing duplicates) if found and not replace + return + elif item == bdir and self.precedence == EGG_DIST: + # if it's an .egg, give it precedence over its directory + # UNLESS it's already been added to sys.path and replace=False + if (not replace) and nloc in npath[p:]: + return + if path is sys.path: + self.check_version_conflict() + path.insert(p, loc) + npath.insert(p, nloc) + break + else: + if path is sys.path: + self.check_version_conflict() + if replace: + path.insert(0, loc) + else: + path.append(loc) + return + + # p is the spot where we found or inserted loc; now remove duplicates + while True: + try: + np = npath.index(nloc, p + 1) + except ValueError: + break + else: + del npath[np], path[np] + # ha! + p = np + + return + + def check_version_conflict(self): + if self.key == 'setuptools': + # ignore the inevitable setuptools self-conflicts :( + return + + nsp = dict.fromkeys(self._get_metadata('namespace_packages.txt')) + loc = normalize_path(self.location) + for modname in self._get_metadata('top_level.txt'): + if (modname not in sys.modules or modname in nsp + or modname in _namespace_packages): + continue + if modname in ('pkg_resources', 'setuptools', 'site'): + continue + fn = getattr(sys.modules[modname], '__file__', None) + if fn and (normalize_path(fn).startswith(loc) or + fn.startswith(self.location)): + continue + issue_warning( + "Module %s was already imported from %s, but %s is being added" + " to sys.path" % (modname, fn, self.location), + ) + + def has_version(self): + try: + self.version + except ValueError: + issue_warning("Unbuilt egg for " + repr(self)) + return False + return True + + def clone(self, **kw): + """Copy this distribution, substituting in any changed keyword args""" + names = 'project_name version py_version platform location precedence' + for attr in names.split(): + kw.setdefault(attr, getattr(self, attr, None)) + kw.setdefault('metadata', self._provider) + return self.__class__(**kw) + + @property + def extras(self): + return [dep for dep in self._dep_map if dep] + + +class EggInfoDistribution(Distribution): + def _reload_version(self): + """ + Packages installed by distutils (e.g. numpy or scipy), + which uses an old safe_version, and so + their version numbers can get mangled when + converted to filenames (e.g., 1.11.0.dev0+2329eae to + 1.11.0.dev0_2329eae). 
These distributions will not be + parsed properly + downstream by Distribution and safe_version, so + take an extra step and try to get the version number from + the metadata file itself instead of the filename. + """ + md_version = _version_from_file(self._get_metadata(self.PKG_INFO)) + if md_version: + self._version = md_version + return self + + +class DistInfoDistribution(Distribution): + """Wrap an actual or potential sys.path entry w/metadata, .dist-info style""" + PKG_INFO = 'METADATA' + EQEQ = re.compile(r"([\(,])\s*(\d.*?)\s*([,\)])") + + @property + def _parsed_pkg_info(self): + """Parse and cache metadata""" + try: + return self._pkg_info + except AttributeError: + metadata = self.get_metadata(self.PKG_INFO) + self._pkg_info = email.parser.Parser().parsestr(metadata) + return self._pkg_info + + @property + def _dep_map(self): + try: + return self.__dep_map + except AttributeError: + self.__dep_map = self._compute_dependencies() + return self.__dep_map + + def _compute_dependencies(self): + """Recompute this distribution's dependencies.""" + dm = self.__dep_map = {None: []} + + reqs = [] + # Including any condition expressions + for req in self._parsed_pkg_info.get_all('Requires-Dist') or []: + reqs.extend(parse_requirements(req)) + + def reqs_for_extra(extra): + for req in reqs: + if not req.marker or req.marker.evaluate({'extra': extra}): + yield req + + common = frozenset(reqs_for_extra(None)) + dm[None].extend(common) + + for extra in self._parsed_pkg_info.get_all('Provides-Extra') or []: + s_extra = safe_extra(extra.strip()) + dm[s_extra] = list(frozenset(reqs_for_extra(extra)) - common) + + return dm + + +_distributionImpl = { + '.egg': Distribution, + '.egg-info': EggInfoDistribution, + '.dist-info': DistInfoDistribution, + } + + +def issue_warning(*args, **kw): + level = 1 + g = globals() + try: + # find the first stack frame that is *not* code in + # the pkg_resources module, to use for the warning + while sys._getframe(level).f_globals is g: + level += 1 + except ValueError: + pass + warnings.warn(stacklevel=level + 1, *args, **kw) + + +class RequirementParseError(ValueError): + def __str__(self): + return ' '.join(self.args) + + +def parse_requirements(strs): + """Yield ``Requirement`` objects for each specification in `strs` + + `strs` must be a string, or a (possibly-nested) iterable thereof. + """ + # create a steppable iterator, so we can handle \-continuations + lines = iter(yield_lines(strs)) + + for line in lines: + # Drop comments -- a hash without a space may be in a URL. + if ' #' in line: + line = line[:line.find(' #')] + # If there is a line continuation, drop it, and append the next line. 
+ if line.endswith('\\'): + line = line[:-2].strip() + line += next(lines) + yield Requirement(line) + + +class Requirement(packaging.requirements.Requirement): + def __init__(self, requirement_string): + """DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!""" + try: + super(Requirement, self).__init__(requirement_string) + except packaging.requirements.InvalidRequirement as e: + raise RequirementParseError(str(e)) + self.unsafe_name = self.name + project_name = safe_name(self.name) + self.project_name, self.key = project_name, project_name.lower() + self.specs = [ + (spec.operator, spec.version) for spec in self.specifier] + self.extras = tuple(map(safe_extra, self.extras)) + self.hashCmp = ( + self.key, + self.specifier, + frozenset(self.extras), + str(self.marker) if self.marker else None, + ) + self.__hash = hash(self.hashCmp) + + def __eq__(self, other): + return ( + isinstance(other, Requirement) and + self.hashCmp == other.hashCmp + ) + + def __ne__(self, other): + return not self == other + + def __contains__(self, item): + if isinstance(item, Distribution): + if item.key != self.key: + return False + + item = item.version + + # Allow prereleases always in order to match the previous behavior of + # this method. In the future this should be smarter and follow PEP 440 + # more accurately. + return self.specifier.contains(item, prereleases=True) + + def __hash__(self): + return self.__hash + + def __repr__(self): return "Requirement.parse(%r)" % str(self) + + @staticmethod + def parse(s): + req, = parse_requirements(s) + return req + + +def _get_mro(cls): + """Get an mro for a type or classic class""" + if not isinstance(cls, type): + + class cls(cls, object): + pass + + return cls.__mro__[1:] + return cls.__mro__ + + +def _find_adapter(registry, ob): + """Return an adapter factory for `ob` from `registry`""" + for t in _get_mro(getattr(ob, '__class__', type(ob))): + if t in registry: + return registry[t] + + +def ensure_directory(path): + """Ensure that the parent directory of `path` exists""" + dirname = os.path.dirname(path) + if not os.path.isdir(dirname): + os.makedirs(dirname) + + +def _bypass_ensure_directory(path): + """Sandbox-bypassing version of ensure_directory()""" + if not WRITE_SUPPORT: + raise IOError('"os.mkdir" not supported on this platform.') + dirname, filename = split(path) + if dirname and filename and not isdir(dirname): + _bypass_ensure_directory(dirname) + mkdir(dirname, 0o755) + + +def split_sections(s): + """Split a string or iterable thereof into (section, content) pairs + + Each ``section`` is a stripped version of the section header ("[section]") + and each ``content`` is a list of stripped lines excluding blank lines and + comment-only lines. If there are any such lines before the first section + header, they're returned in a first ``section`` of ``None``. 
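For example, given the kind of content a ``requires.txt`` with an extra carries (illustrative requirements)::

    list(split_sections(['docutils>=0.3', '[pdf]', 'reportlab>=1.2']))
    # -> [(None, ['docutils>=0.3']), ('pdf', ['reportlab>=1.2'])]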
+ """ + section = None + content = [] + for line in yield_lines(s): + if line.startswith("["): + if line.endswith("]"): + if section or content: + yield section, content + section = line[1:-1].strip() + content = [] + else: + raise ValueError("Invalid section heading", line) + else: + content.append(line) + + # wrap up last segment + yield section, content + + +def _mkstemp(*args, **kw): + old_open = os.open + try: + # temporarily bypass sandboxing + os.open = os_open + return tempfile.mkstemp(*args, **kw) + finally: + # and then put it back + os.open = old_open + + +# Silence the PEP440Warning by default, so that end users don't get hit by it +# randomly just because they use pkg_resources. We want to append the rule +# because we want earlier uses of filterwarnings to take precedence over this +# one. +warnings.filterwarnings("ignore", category=PEP440Warning, append=True) + + +# from jaraco.functools 1.3 +def _call_aside(f, *args, **kwargs): + f(*args, **kwargs) + return f + + +@_call_aside +def _initialize(g=globals()): + "Set up global resource manager (deliberately not state-saved)" + manager = ResourceManager() + g['_manager'] = manager + g.update( + (name, getattr(manager, name)) + for name in dir(manager) + if not name.startswith('_') + ) + + +@_call_aside +def _initialize_master_working_set(): + """ + Prepare the master working set and make the ``require()`` + API available. + + This function has explicit effects on the global state + of pkg_resources. It is intended to be invoked once at + the initialization of this module. + + Invocation by other packages is unsupported and done + at their own risk. + """ + working_set = WorkingSet._build_master() + _declare_state('object', working_set=working_set) + + require = working_set.require + iter_entry_points = working_set.iter_entry_points + add_activation_listener = working_set.subscribe + run_script = working_set.run_script + # backward compatibility + run_main = run_script + # Activate all distributions already on sys.path with replace=False and + # ensure that all distributions added to the working set in the future + # (e.g. by calling ``require()``) will get activated as well, + # with higher priority (replace=True). + tuple( + dist.activate(replace=False) + for dist in working_set + ) + add_activation_listener(lambda dist: dist.activate(replace=True), existing=False) + working_set.entries = [] + # match order + list(map(working_set.add_entry, sys.path)) + globals().update(locals()) diff --git a/pkg_resources/api_tests.txt b/pkg_resources/api_tests.txt new file mode 100644 index 0000000..4fbd3d2 --- /dev/null +++ b/pkg_resources/api_tests.txt @@ -0,0 +1,401 @@ +Pluggable Distributions of Python Software +========================================== + +Distributions +------------- + +A "Distribution" is a collection of files that represent a "Release" of a +"Project" as of a particular point in time, denoted by a +"Version":: + + >>> import sys, pkg_resources + >>> from pkg_resources import Distribution + >>> Distribution(project_name="Foo", version="1.2") + Foo 1.2 + +Distributions have a location, which can be a filename, URL, or really anything +else you care to use:: + + >>> dist = Distribution( + ... location="http://example.com/something", + ... project_name="Bar", version="0.9" + ... 
) + + >>> dist + Bar 0.9 (http://example.com/something) + + +Distributions have various introspectable attributes:: + + >>> dist.location + 'http://example.com/something' + + >>> dist.project_name + 'Bar' + + >>> dist.version + '0.9' + + >>> dist.py_version == sys.version[:3] + True + + >>> print(dist.platform) + None + +Including various computed attributes:: + + >>> from pkg_resources import parse_version + >>> dist.parsed_version == parse_version(dist.version) + True + + >>> dist.key # case-insensitive form of the project name + 'bar' + +Distributions are compared (and hashed) by version first:: + + >>> Distribution(version='1.0') == Distribution(version='1.0') + True + >>> Distribution(version='1.0') == Distribution(version='1.1') + False + >>> Distribution(version='1.0') < Distribution(version='1.1') + True + +but also by project name (case-insensitive), platform, Python version, +location, etc.:: + + >>> Distribution(project_name="Foo",version="1.0") == \ + ... Distribution(project_name="Foo",version="1.0") + True + + >>> Distribution(project_name="Foo",version="1.0") == \ + ... Distribution(project_name="foo",version="1.0") + True + + >>> Distribution(project_name="Foo",version="1.0") == \ + ... Distribution(project_name="Foo",version="1.1") + False + + >>> Distribution(project_name="Foo",py_version="2.3",version="1.0") == \ + ... Distribution(project_name="Foo",py_version="2.4",version="1.0") + False + + >>> Distribution(location="spam",version="1.0") == \ + ... Distribution(location="spam",version="1.0") + True + + >>> Distribution(location="spam",version="1.0") == \ + ... Distribution(location="baz",version="1.0") + False + + + +Hash and compare distribution by prio/plat + +Get version from metadata +provider capabilities +egg_name() +as_requirement() +from_location, from_filename (w/path normalization) + +Releases may have zero or more "Requirements", which indicate +what releases of another project the release requires in order to +function. A Requirement names the other project, expresses some criteria +as to what releases of that project are acceptable, and lists any "Extras" +that the requiring release may need from that project. (An Extra is an +optional feature of a Release, that can only be used if its additional +Requirements are satisfied.) + + + +The Working Set +--------------- + +A collection of active distributions is called a Working Set. Note that a +Working Set can contain any importable distribution, not just pluggable ones. +For example, the Python standard library is an importable distribution that +will usually be part of the Working Set, even though it is not pluggable. +Similarly, when you are doing development work on a project, the files you are +editing are also a Distribution. (And, with a little attention to the +directory names used, and including some additional metadata, such a +"development distribution" can be made pluggable as well.) + + >>> from pkg_resources import WorkingSet + +A working set's entries are the sys.path entries that correspond to the active +distributions. 
By default, the working set's entries are the items on +``sys.path``:: + + >>> ws = WorkingSet() + >>> ws.entries == sys.path + True + +But you can also create an empty working set explicitly, and add distributions +to it:: + + >>> ws = WorkingSet([]) + >>> ws.add(dist) + >>> ws.entries + ['http://example.com/something'] + >>> dist in ws + True + >>> Distribution('foo',version="") in ws + False + +And you can iterate over its distributions:: + + >>> list(ws) + [Bar 0.9 (http://example.com/something)] + +Adding the same distribution more than once is a no-op:: + + >>> ws.add(dist) + >>> list(ws) + [Bar 0.9 (http://example.com/something)] + +For that matter, adding multiple distributions for the same project also does +nothing, because a working set can only hold one active distribution per +project -- the first one added to it:: + + >>> ws.add( + ... Distribution( + ... 'http://example.com/something', project_name="Bar", + ... version="7.2" + ... ) + ... ) + >>> list(ws) + [Bar 0.9 (http://example.com/something)] + +You can append a path entry to a working set using ``add_entry()``:: + + >>> ws.entries + ['http://example.com/something'] + >>> ws.add_entry(pkg_resources.__file__) + >>> ws.entries + ['http://example.com/something', '...pkg_resources...'] + +Multiple additions result in multiple entries, even if the entry is already in +the working set (because ``sys.path`` can contain the same entry more than +once):: + + >>> ws.add_entry(pkg_resources.__file__) + >>> ws.entries + ['...example.com...', '...pkg_resources...', '...pkg_resources...'] + +And you can specify the path entry a distribution was found under, using the +optional second parameter to ``add()``:: + + >>> ws = WorkingSet([]) + >>> ws.add(dist,"foo") + >>> ws.entries + ['foo'] + +But even if a distribution is found under multiple path entries, it still only +shows up once when iterating the working set: + + >>> ws.add_entry(ws.entries[0]) + >>> list(ws) + [Bar 0.9 (http://example.com/something)] + +You can ask a WorkingSet to ``find()`` a distribution matching a requirement:: + + >>> from pkg_resources import Requirement + >>> print(ws.find(Requirement.parse("Foo==1.0"))) # no match, return None + None + + >>> ws.find(Requirement.parse("Bar==0.9")) # match, return distribution + Bar 0.9 (http://example.com/something) + +Note that asking for a conflicting version of a distribution already in a +working set triggers a ``pkg_resources.VersionConflict`` error: + + >>> try: + ... ws.find(Requirement.parse("Bar==1.0")) + ... except pkg_resources.VersionConflict as exc: + ... print(str(exc)) + ... else: + ... raise AssertionError("VersionConflict was not raised") + (Bar 0.9 (http://example.com/something), Requirement.parse('Bar==1.0')) + +You can subscribe a callback function to receive notifications whenever a new +distribution is added to a working set. 
The callback is immediately invoked +once for each existing distribution in the working set, and then is called +again for new distributions added thereafter:: + + >>> def added(dist): print("Added %s" % dist) + >>> ws.subscribe(added) + Added Bar 0.9 + >>> foo12 = Distribution(project_name="Foo", version="1.2", location="f12") + >>> ws.add(foo12) + Added Foo 1.2 + +Note, however, that only the first distribution added for a given project name +will trigger a callback, even during the initial ``subscribe()`` callback:: + + >>> foo14 = Distribution(project_name="Foo", version="1.4", location="f14") + >>> ws.add(foo14) # no callback, because Foo 1.2 is already active + + >>> ws = WorkingSet([]) + >>> ws.add(foo12) + >>> ws.add(foo14) + >>> ws.subscribe(added) + Added Foo 1.2 + +And adding a callback more than once has no effect, either:: + + >>> ws.subscribe(added) # no callbacks + + # and no double-callbacks on subsequent additions, either + >>> just_a_test = Distribution(project_name="JustATest", version="0.99") + >>> ws.add(just_a_test) + Added JustATest 0.99 + + +Finding Plugins +--------------- + +``WorkingSet`` objects can be used to figure out what plugins in an +``Environment`` can be loaded without any resolution errors:: + + >>> from pkg_resources import Environment + + >>> plugins = Environment([]) # normally, a list of plugin directories + >>> plugins.add(foo12) + >>> plugins.add(foo14) + >>> plugins.add(just_a_test) + +In the simplest case, we just get the newest version of each distribution in +the plugin environment:: + + >>> ws = WorkingSet([]) + >>> ws.find_plugins(plugins) + ([JustATest 0.99, Foo 1.4 (f14)], {}) + +But if there's a problem with a version conflict or missing requirements, the +method falls back to older versions, and the error info dict will contain an +exception instance for each unloadable plugin:: + + >>> ws.add(foo12) # this will conflict with Foo 1.4 + >>> ws.find_plugins(plugins) + ([JustATest 0.99, Foo 1.2 (f12)], {Foo 1.4 (f14): VersionConflict(...)}) + +But if you disallow fallbacks, the failed plugin will be skipped instead of +trying older versions:: + + >>> ws.find_plugins(plugins, fallback=False) + ([JustATest 0.99], {Foo 1.4 (f14): VersionConflict(...)}) + + + +Platform Compatibility Rules +---------------------------- + +On the Mac, there are potential compatibility issues for modules compiled +on newer versions of Mac OS X than what the user is running. Additionally, +Mac OS X will soon have two platforms to contend with: Intel and PowerPC. + +Basic equality works as on other platforms:: + + >>> from pkg_resources import compatible_platforms as cp + >>> reqd = 'macosx-10.4-ppc' + >>> cp(reqd, reqd) + True + >>> cp("win32", reqd) + False + +Distributions made on other machine types are not compatible:: + + >>> cp("macosx-10.4-i386", reqd) + False + +Distributions made on earlier versions of the OS are compatible, as +long as they are from the same top-level version. 
The patchlevel version +number does not matter:: + + >>> cp("macosx-10.4-ppc", reqd) + True + >>> cp("macosx-10.3-ppc", reqd) + True + >>> cp("macosx-10.5-ppc", reqd) + False + >>> cp("macosx-9.5-ppc", reqd) + False + +Backwards compatibility for packages made via earlier versions of +setuptools is provided as well:: + + >>> cp("darwin-8.2.0-Power_Macintosh", reqd) + True + >>> cp("darwin-7.2.0-Power_Macintosh", reqd) + True + >>> cp("darwin-8.2.0-Power_Macintosh", "macosx-10.3-ppc") + False + + +Environment Markers +------------------- + + >>> from pkg_resources import invalid_marker as im, evaluate_marker as em + >>> import os + + >>> print(im("sys_platform")) + Invalid marker: 'sys_platform', parse error at '' + + >>> print(im("sys_platform==")) + Invalid marker: 'sys_platform==', parse error at '' + + >>> print(im("sys_platform=='win32'")) + False + + >>> print(im("sys=='x'")) + Invalid marker: "sys=='x'", parse error at "sys=='x'" + + >>> print(im("(extra)")) + Invalid marker: '(extra)', parse error at ')' + + >>> print(im("(extra")) + Invalid marker: '(extra', parse error at '' + + >>> print(im("os.open('foo')=='y'")) + Invalid marker: "os.open('foo')=='y'", parse error at 'os.open(' + + >>> print(im("'x'=='y' and os.open('foo')=='y'")) # no short-circuit! + Invalid marker: "'x'=='y' and os.open('foo')=='y'", parse error at 'and os.o' + + >>> print(im("'x'=='x' or os.open('foo')=='y'")) # no short-circuit! + Invalid marker: "'x'=='x' or os.open('foo')=='y'", parse error at 'or os.op' + + >>> print(im("'x' < 'y' < 'z'")) + Invalid marker: "'x' < 'y' < 'z'", parse error at "< 'z'" + + >>> print(im("r'x'=='x'")) + Invalid marker: "r'x'=='x'", parse error at "r'x'=='x" + + >>> print(im("'''x'''=='x'")) + Invalid marker: "'''x'''=='x'", parse error at "'x'''=='" + + >>> print(im('"""x"""=="x"')) + Invalid marker: '"""x"""=="x"', parse error at '"x"""=="' + + >>> print(im(r"x\n=='x'")) + Invalid marker: "x\\n=='x'", parse error at "x\\n=='x'" + + >>> print(im("os.open=='y'")) + Invalid marker: "os.open=='y'", parse error at 'os.open=' + + >>> em("sys_platform=='win32'") == (sys.platform=='win32') + True + + >>> em("python_version >= '2.6'") + True + + >>> em("python_version > '2.5'") + True + + >>> im("implementation_name=='cpython'") + False + + >>> im("platform_python_implementation=='CPython'") + False + + >>> im("implementation_version=='3.5.1'") + False diff --git a/pkg_resources/tests/__init__.py b/pkg_resources/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/pkg_resources/tests/test_markers.py b/pkg_resources/tests/test_markers.py new file mode 100644 index 0000000..9306d5b --- /dev/null +++ b/pkg_resources/tests/test_markers.py @@ -0,0 +1,8 @@ +from unittest import mock + +from pkg_resources import evaluate_marker + + +@mock.patch('platform.python_version', return_value='2.7.10') +def test_ordering(python_version_mock): + assert evaluate_marker("python_full_version > '2.7.3'") is True diff --git a/pkg_resources/tests/test_pkg_resources.py b/pkg_resources/tests/test_pkg_resources.py new file mode 100644 index 0000000..bef914a --- /dev/null +++ b/pkg_resources/tests/test_pkg_resources.py @@ -0,0 +1,172 @@ +# coding: utf-8 +from __future__ import unicode_literals + +import sys +import tempfile +import os +import zipfile +import datetime +import time +import subprocess +import stat +import distutils.dist +import distutils.command.install_egg_info + +from six.moves import map + +import pytest + +import pkg_resources + +try: + unicode +except NameError: + 
unicode = str + + +def timestamp(dt): + """ + Return a timestamp for a local, naive datetime instance. + """ + try: + return dt.timestamp() + except AttributeError: + # Python 3.2 and earlier + return time.mktime(dt.timetuple()) + + +class EggRemover(unicode): + def __call__(self): + if self in sys.path: + sys.path.remove(self) + if os.path.exists(self): + os.remove(self) + + +class TestZipProvider(object): + finalizers = [] + + ref_time = datetime.datetime(2013, 5, 12, 13, 25, 0) + "A reference time for a file modification" + + @classmethod + def setup_class(cls): + "create a zip egg and add it to sys.path" + egg = tempfile.NamedTemporaryFile(suffix='.egg', delete=False) + zip_egg = zipfile.ZipFile(egg, 'w') + zip_info = zipfile.ZipInfo() + zip_info.filename = 'mod.py' + zip_info.date_time = cls.ref_time.timetuple() + zip_egg.writestr(zip_info, 'x = 3\n') + zip_info = zipfile.ZipInfo() + zip_info.filename = 'data.dat' + zip_info.date_time = cls.ref_time.timetuple() + zip_egg.writestr(zip_info, 'hello, world!') + zip_egg.close() + egg.close() + + sys.path.append(egg.name) + cls.finalizers.append(EggRemover(egg.name)) + + @classmethod + def teardown_class(cls): + for finalizer in cls.finalizers: + finalizer() + + def test_resource_filename_rewrites_on_change(self): + """ + If a previous call to get_resource_filename has saved the file, but + the file has been subsequently mutated with different file of the + same size and modification time, it should not be overwritten on a + subsequent call to get_resource_filename. + """ + import mod + manager = pkg_resources.ResourceManager() + zp = pkg_resources.ZipProvider(mod) + filename = zp.get_resource_filename(manager, 'data.dat') + actual = datetime.datetime.fromtimestamp(os.stat(filename).st_mtime) + assert actual == self.ref_time + f = open(filename, 'w') + f.write('hello, world?') + f.close() + ts = timestamp(self.ref_time) + os.utime(filename, (ts, ts)) + filename = zp.get_resource_filename(manager, 'data.dat') + f = open(filename) + assert f.read() == 'hello, world!' + manager.cleanup_resources() + + +class TestResourceManager(object): + def test_get_cache_path(self): + mgr = pkg_resources.ResourceManager() + path = mgr.get_cache_path('foo') + type_ = str(type(path)) + message = "Unexpected type from get_cache_path: " + type_ + assert isinstance(path, (unicode, str)), message + + +class TestIndependence: + """ + Tests to ensure that pkg_resources runs independently from setuptools. + """ + + def test_setuptools_not_imported(self): + """ + In a separate Python environment, import pkg_resources and assert + that action doesn't cause setuptools to be imported. + """ + lines = ( + 'import pkg_resources', + 'import sys', + 'assert "setuptools" not in sys.modules, ' + '"setuptools was imported"', + ) + cmd = [sys.executable, '-c', '; '.join(lines)] + subprocess.check_call(cmd) + + +class TestDeepVersionLookupDistutils(object): + @pytest.fixture + def env(self, tmpdir): + """ + Create a package environment, similar to a virtualenv, + in which packages are installed. + """ + + class Environment(str): + pass + + env = Environment(tmpdir) + tmpdir.chmod(stat.S_IRWXU) + subs = 'home', 'lib', 'scripts', 'data', 'egg-base' + env.paths = dict( + (dirname, str(tmpdir / dirname)) + for dirname in subs + ) + list(map(os.mkdir, env.paths.values())) + return env + + def create_foo_pkg(self, env, version): + """ + Create a foo package installed (distutils-style) to env.paths['lib'] + as version. + """ + ld = "This package has unicode metadata! 
❄" + attrs = dict(name='foo', version=version, long_description=ld) + dist = distutils.dist.Distribution(attrs) + iei_cmd = distutils.command.install_egg_info.install_egg_info(dist) + iei_cmd.initialize_options() + iei_cmd.install_dir = env.paths['lib'] + iei_cmd.finalize_options() + iei_cmd.run() + + def test_version_resolved_from_egg_info(self, env): + version = '1.11.0.dev0+2329eae' + self.create_foo_pkg(env, version) + + # this requirement parsing will raise a VersionConflict unless the + # .egg-info file is parsed (see #419 on BitBucket) + req = pkg_resources.Requirement.parse('foo>=1.9') + dist = pkg_resources.WorkingSet([env.paths['lib']]).find(req) + assert dist.version == version diff --git a/pkg_resources/tests/test_resources.py b/pkg_resources/tests/test_resources.py new file mode 100644 index 0000000..b997aaa --- /dev/null +++ b/pkg_resources/tests/test_resources.py @@ -0,0 +1,858 @@ +from __future__ import unicode_literals + +import os +import sys +import string +import platform + +from six.moves import map + +import pytest +import packaging + +import pkg_resources +from pkg_resources import (parse_requirements, VersionConflict, parse_version, + Distribution, EntryPoint, Requirement, safe_version, safe_name, + WorkingSet) + + +class Metadata(pkg_resources.EmptyProvider): + """Mock object to return metadata as if from an on-disk distribution""" + + def __init__(self, *pairs): + self.metadata = dict(pairs) + + def has_metadata(self, name): + return name in self.metadata + + def get_metadata(self, name): + return self.metadata[name] + + def get_metadata_lines(self, name): + return pkg_resources.yield_lines(self.get_metadata(name)) + + +dist_from_fn = pkg_resources.Distribution.from_filename + + +class TestDistro: + def testCollection(self): + # empty path should produce no distributions + ad = pkg_resources.Environment([], platform=None, python=None) + assert list(ad) == [] + assert ad['FooPkg'] == [] + ad.add(dist_from_fn("FooPkg-1.3_1.egg")) + ad.add(dist_from_fn("FooPkg-1.4-py2.4-win32.egg")) + ad.add(dist_from_fn("FooPkg-1.2-py2.4.egg")) + + # Name is in there now + assert ad['FooPkg'] + # But only 1 package + assert list(ad) == ['foopkg'] + + # Distributions sort by version + assert [dist.version for dist in ad['FooPkg']] == ['1.4', '1.3-1', '1.2'] + + # Removing a distribution leaves sequence alone + ad.remove(ad['FooPkg'][1]) + assert [dist.version for dist in ad['FooPkg']] == ['1.4', '1.2'] + + # And inserting adds them in order + ad.add(dist_from_fn("FooPkg-1.9.egg")) + assert [dist.version for dist in ad['FooPkg']] == ['1.9', '1.4', '1.2'] + + ws = WorkingSet([]) + foo12 = dist_from_fn("FooPkg-1.2-py2.4.egg") + foo14 = dist_from_fn("FooPkg-1.4-py2.4-win32.egg") + req, = parse_requirements("FooPkg>=1.3") + + # Nominal case: no distros on path, should yield all applicable + assert ad.best_match(req, ws).version == '1.9' + # If a matching distro is already installed, should return only that + ws.add(foo14) + assert ad.best_match(req, ws).version == '1.4' + + # If the first matching distro is unsuitable, it's a version conflict + ws = WorkingSet([]) + ws.add(foo12) + ws.add(foo14) + with pytest.raises(VersionConflict): + ad.best_match(req, ws) + + # If more than one match on the path, the first one takes precedence + ws = WorkingSet([]) + ws.add(foo14) + ws.add(foo12) + ws.add(foo14) + assert ad.best_match(req, ws).version == '1.4' + + def checkFooPkg(self, d): + assert d.project_name == "FooPkg" + assert d.key == "foopkg" + assert d.version == "1.3.post1" + assert 
d.py_version == "2.4" + assert d.platform == "win32" + assert d.parsed_version == parse_version("1.3-1") + + def testDistroBasics(self): + d = Distribution( + "/some/path", + project_name="FooPkg", version="1.3-1", py_version="2.4", platform="win32" + ) + self.checkFooPkg(d) + + d = Distribution("/some/path") + assert d.py_version == sys.version[:3] + assert d.platform is None + + def testDistroParse(self): + d = dist_from_fn("FooPkg-1.3.post1-py2.4-win32.egg") + self.checkFooPkg(d) + d = dist_from_fn("FooPkg-1.3.post1-py2.4-win32.egg-info") + self.checkFooPkg(d) + + def testDistroMetadata(self): + d = Distribution( + "/some/path", project_name="FooPkg", py_version="2.4", platform="win32", + metadata=Metadata( + ('PKG-INFO', "Metadata-Version: 1.0\nVersion: 1.3-1\n") + ) + ) + self.checkFooPkg(d) + + def distRequires(self, txt): + return Distribution("/foo", metadata=Metadata(('depends.txt', txt))) + + def checkRequires(self, dist, txt, extras=()): + assert list(dist.requires(extras)) == list(parse_requirements(txt)) + + def testDistroDependsSimple(self): + for v in "Twisted>=1.5", "Twisted>=1.5\nZConfig>=2.0": + self.checkRequires(self.distRequires(v), v) + + def testResolve(self): + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + # Resolving no requirements -> nothing to install + assert list(ws.resolve([], ad)) == [] + # Request something not in the collection -> DistributionNotFound + with pytest.raises(pkg_resources.DistributionNotFound): + ws.resolve(parse_requirements("Foo"), ad) + + Foo = Distribution.from_filename( + "/foo_dir/Foo-1.2.egg", + metadata=Metadata(('depends.txt', "[bar]\nBaz>=2.0")) + ) + ad.add(Foo) + ad.add(Distribution.from_filename("Foo-0.9.egg")) + + # Request thing(s) that are available -> list to activate + for i in range(3): + targets = list(ws.resolve(parse_requirements("Foo"), ad)) + assert targets == [Foo] + list(map(ws.add, targets)) + with pytest.raises(VersionConflict): + ws.resolve(parse_requirements("Foo==0.9"), ad) + ws = WorkingSet([]) # reset + + # Request an extra that causes an unresolved dependency for "Baz" + with pytest.raises(pkg_resources.DistributionNotFound): + ws.resolve(parse_requirements("Foo[bar]"), ad) + Baz = Distribution.from_filename( + "/foo_dir/Baz-2.1.egg", metadata=Metadata(('depends.txt', "Foo")) + ) + ad.add(Baz) + + # Activation list now includes resolved dependency + assert list(ws.resolve(parse_requirements("Foo[bar]"), ad)) == [Foo, Baz] + # Requests for conflicting versions produce VersionConflict + with pytest.raises(VersionConflict) as vc: + ws.resolve(parse_requirements("Foo==1.2\nFoo!=1.2"), ad) + + msg = 'Foo 0.9 is installed but Foo==1.2 is required' + assert vc.value.report() == msg + + def test_environment_marker_evaluation_negative(self): + """Environment markers are evaluated at resolution time.""" + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + res = ws.resolve(parse_requirements("Foo;python_version<'2'"), ad) + assert list(res) == [] + + def test_environment_marker_evaluation_positive(self): + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + Foo = Distribution.from_filename("/foo_dir/Foo-1.2.dist-info") + ad.add(Foo) + res = ws.resolve(parse_requirements("Foo;python_version>='2'"), ad) + assert list(res) == [Foo] + + def test_environment_marker_evaluation_called(self): + """ + If one package foo requires bar without any extras, + markers should pass for bar without extras. 
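+        Below, the requirement bar;python_version>='2' is associated with
+        the extras of a plain 'foo' (and of 'foo[]'), and markers_pass()
+        must return True in both cases.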
+ """ + parent_req, = parse_requirements("foo") + req, = parse_requirements("bar;python_version>='2'") + req_extras = pkg_resources._ReqExtras({req: parent_req.extras}) + assert req_extras.markers_pass(req) + + parent_req, = parse_requirements("foo[]") + req, = parse_requirements("bar;python_version>='2'") + req_extras = pkg_resources._ReqExtras({req: parent_req.extras}) + assert req_extras.markers_pass(req) + + def test_marker_evaluation_with_extras(self): + """Extras are also evaluated as markers at resolution time.""" + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + # Metadata needs to be native strings due to cStringIO behaviour in + # 2.6, so use str(). + Foo = Distribution.from_filename( + "/foo_dir/Foo-1.2.dist-info", + metadata=Metadata(("METADATA", str("Provides-Extra: baz\n" + "Requires-Dist: quux; extra=='baz'"))) + ) + ad.add(Foo) + assert list(ws.resolve(parse_requirements("Foo"), ad)) == [Foo] + quux = Distribution.from_filename("/foo_dir/quux-1.0.dist-info") + ad.add(quux) + res = list(ws.resolve(parse_requirements("Foo[baz]"), ad)) + assert res == [Foo, quux] + + def test_marker_evaluation_with_extras_normlized(self): + """Extras are also evaluated as markers at resolution time.""" + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + # Metadata needs to be native strings due to cStringIO behaviour in + # 2.6, so use str(). + Foo = Distribution.from_filename( + "/foo_dir/Foo-1.2.dist-info", + metadata=Metadata(("METADATA", str("Provides-Extra: baz-lightyear\n" + "Requires-Dist: quux; extra=='baz-lightyear'"))) + ) + ad.add(Foo) + assert list(ws.resolve(parse_requirements("Foo"), ad)) == [Foo] + quux = Distribution.from_filename("/foo_dir/quux-1.0.dist-info") + ad.add(quux) + res = list(ws.resolve(parse_requirements("Foo[baz-lightyear]"), ad)) + assert res == [Foo, quux] + + def test_marker_evaluation_with_multiple_extras(self): + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + # Metadata needs to be native strings due to cStringIO behaviour in + # 2.6, so use str(). + Foo = Distribution.from_filename( + "/foo_dir/Foo-1.2.dist-info", + metadata=Metadata(("METADATA", str("Provides-Extra: baz\n" + "Requires-Dist: quux; extra=='baz'\n" + "Provides-Extra: bar\n" + "Requires-Dist: fred; extra=='bar'\n"))) + ) + ad.add(Foo) + quux = Distribution.from_filename("/foo_dir/quux-1.0.dist-info") + ad.add(quux) + fred = Distribution.from_filename("/foo_dir/fred-0.1.dist-info") + ad.add(fred) + res = list(ws.resolve(parse_requirements("Foo[baz,bar]"), ad)) + assert sorted(res) == [fred, quux, Foo] + + def test_marker_evaluation_with_extras_loop(self): + ad = pkg_resources.Environment([]) + ws = WorkingSet([]) + # Metadata needs to be native strings due to cStringIO behaviour in + # 2.6, so use str(). 
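+        # Dependency chain exercised here: a requires c[a], extra 'a' of c
+        # requires b, b requires c[b], and extra 'b' of c requires foo.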
+ a = Distribution.from_filename( + "/foo_dir/a-0.2.dist-info", + metadata=Metadata(("METADATA", str("Requires-Dist: c[a]"))) + ) + b = Distribution.from_filename( + "/foo_dir/b-0.3.dist-info", + metadata=Metadata(("METADATA", str("Requires-Dist: c[b]"))) + ) + c = Distribution.from_filename( + "/foo_dir/c-1.0.dist-info", + metadata=Metadata(("METADATA", str("Provides-Extra: a\n" + "Requires-Dist: b;extra=='a'\n" + "Provides-Extra: b\n" + "Requires-Dist: foo;extra=='b'"))) + ) + foo = Distribution.from_filename("/foo_dir/foo-0.1.dist-info") + for dist in (a, b, c, foo): + ad.add(dist) + res = list(ws.resolve(parse_requirements("a"), ad)) + assert res == [a, c, b, foo] + + def testDistroDependsOptions(self): + d = self.distRequires(""" + Twisted>=1.5 + [docgen] + ZConfig>=2.0 + docutils>=0.3 + [fastcgi] + fcgiapp>=0.1""") + self.checkRequires(d, "Twisted>=1.5") + self.checkRequires( + d, "Twisted>=1.5 ZConfig>=2.0 docutils>=0.3".split(), ["docgen"] + ) + self.checkRequires( + d, "Twisted>=1.5 fcgiapp>=0.1".split(), ["fastcgi"] + ) + self.checkRequires( + d, "Twisted>=1.5 ZConfig>=2.0 docutils>=0.3 fcgiapp>=0.1".split(), + ["docgen", "fastcgi"] + ) + self.checkRequires( + d, "Twisted>=1.5 fcgiapp>=0.1 ZConfig>=2.0 docutils>=0.3".split(), + ["fastcgi", "docgen"] + ) + with pytest.raises(pkg_resources.UnknownExtra): + d.requires(["foo"]) + + +class TestWorkingSet: + def test_find_conflicting(self): + ws = WorkingSet([]) + Foo = Distribution.from_filename("/foo_dir/Foo-1.2.egg") + ws.add(Foo) + + # create a requirement that conflicts with Foo 1.2 + req = next(parse_requirements("Foo<1.2")) + + with pytest.raises(VersionConflict) as vc: + ws.find(req) + + msg = 'Foo 1.2 is installed but Foo<1.2 is required' + assert vc.value.report() == msg + + def test_resolve_conflicts_with_prior(self): + """ + A ContextualVersionConflict should be raised when a requirement + conflicts with a prior requirement for a different package. + """ + # Create installation where Foo depends on Baz 1.0 and Bar depends on + # Baz 2.0. 
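+        # Both Baz 1.0 and Baz 2.0 are present; resolving Foo first activates
+        # Baz 1.0, so Bar's Baz==2.0 requirement must then raise a conflict
+        # whose report names Bar as the requirer.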
+ ws = WorkingSet([]) + md = Metadata(('depends.txt', "Baz==1.0")) + Foo = Distribution.from_filename("/foo_dir/Foo-1.0.egg", metadata=md) + ws.add(Foo) + md = Metadata(('depends.txt', "Baz==2.0")) + Bar = Distribution.from_filename("/foo_dir/Bar-1.0.egg", metadata=md) + ws.add(Bar) + Baz = Distribution.from_filename("/foo_dir/Baz-1.0.egg") + ws.add(Baz) + Baz = Distribution.from_filename("/foo_dir/Baz-2.0.egg") + ws.add(Baz) + + with pytest.raises(VersionConflict) as vc: + ws.resolve(parse_requirements("Foo\nBar\n")) + + msg = "Baz 1.0 is installed but Baz==2.0 is required by " + msg += repr(set(['Bar'])) + assert vc.value.report() == msg + + +class TestEntryPoints: + def assertfields(self, ep): + assert ep.name == "foo" + assert ep.module_name == "pkg_resources.tests.test_resources" + assert ep.attrs == ("TestEntryPoints",) + assert ep.extras == ("x",) + assert ep.load() is TestEntryPoints + expect = "foo = pkg_resources.tests.test_resources:TestEntryPoints [x]" + assert str(ep) == expect + + def setup_method(self, method): + self.dist = Distribution.from_filename( + "FooPkg-1.2-py2.4.egg", metadata=Metadata(('requires.txt', '[x]'))) + + def testBasics(self): + ep = EntryPoint( + "foo", "pkg_resources.tests.test_resources", ["TestEntryPoints"], + ["x"], self.dist + ) + self.assertfields(ep) + + def testParse(self): + s = "foo = pkg_resources.tests.test_resources:TestEntryPoints [x]" + ep = EntryPoint.parse(s, self.dist) + self.assertfields(ep) + + ep = EntryPoint.parse("bar baz= spammity[PING]") + assert ep.name == "bar baz" + assert ep.module_name == "spammity" + assert ep.attrs == () + assert ep.extras == ("ping",) + + ep = EntryPoint.parse(" fizzly = wocka:foo") + assert ep.name == "fizzly" + assert ep.module_name == "wocka" + assert ep.attrs == ("foo",) + assert ep.extras == () + + # plus in the name + spec = "html+mako = mako.ext.pygmentplugin:MakoHtmlLexer" + ep = EntryPoint.parse(spec) + assert ep.name == 'html+mako' + + reject_specs = "foo", "x=a:b:c", "q=x/na", "fez=pish:tush-z", "x=f[a]>2" + + @pytest.mark.parametrize("reject_spec", reject_specs) + def test_reject_spec(self, reject_spec): + with pytest.raises(ValueError): + EntryPoint.parse(reject_spec) + + def test_printable_name(self): + """ + Allow any printable character in the name. + """ + # Create a name with all printable characters; strip the whitespace. 
+ name = string.printable.strip() + spec = "{name} = module:attr".format(**locals()) + ep = EntryPoint.parse(spec) + assert ep.name == name + + def checkSubMap(self, m): + assert len(m) == len(self.submap_expect) + for key, ep in self.submap_expect.items(): + assert m.get(key).name == ep.name + assert m.get(key).module_name == ep.module_name + assert sorted(m.get(key).attrs) == sorted(ep.attrs) + assert sorted(m.get(key).extras) == sorted(ep.extras) + + submap_expect = dict( + feature1=EntryPoint('feature1', 'somemodule', ['somefunction']), + feature2=EntryPoint('feature2', 'another.module', ['SomeClass'], ['extra1', 'extra2']), + feature3=EntryPoint('feature3', 'this.module', extras=['something']) + ) + submap_str = """ + # define features for blah blah + feature1 = somemodule:somefunction + feature2 = another.module:SomeClass [extra1,extra2] + feature3 = this.module [something] + """ + + def testParseList(self): + self.checkSubMap(EntryPoint.parse_group("xyz", self.submap_str)) + with pytest.raises(ValueError): + EntryPoint.parse_group("x a", "foo=bar") + with pytest.raises(ValueError): + EntryPoint.parse_group("x", ["foo=baz", "foo=bar"]) + + def testParseMap(self): + m = EntryPoint.parse_map({'xyz': self.submap_str}) + self.checkSubMap(m['xyz']) + assert list(m.keys()) == ['xyz'] + m = EntryPoint.parse_map("[xyz]\n" + self.submap_str) + self.checkSubMap(m['xyz']) + assert list(m.keys()) == ['xyz'] + with pytest.raises(ValueError): + EntryPoint.parse_map(["[xyz]", "[xyz]"]) + with pytest.raises(ValueError): + EntryPoint.parse_map(self.submap_str) + + +class TestRequirements: + def testBasics(self): + r = Requirement.parse("Twisted>=1.2") + assert str(r) == "Twisted>=1.2" + assert repr(r) == "Requirement.parse('Twisted>=1.2')" + assert r == Requirement("Twisted>=1.2") + assert r == Requirement("twisTed>=1.2") + assert r != Requirement("Twisted>=2.0") + assert r != Requirement("Zope>=1.2") + assert r != Requirement("Zope>=3.0") + assert r != Requirement("Twisted[extras]>=1.2") + + def testOrdering(self): + r1 = Requirement("Twisted==1.2c1,>=1.2") + r2 = Requirement("Twisted>=1.2,==1.2c1") + assert r1 == r2 + assert str(r1) == str(r2) + assert str(r2) == "Twisted==1.2c1,>=1.2" + + def testBasicContains(self): + r = Requirement("Twisted>=1.2") + foo_dist = Distribution.from_filename("FooPkg-1.3_1.egg") + twist11 = Distribution.from_filename("Twisted-1.1.egg") + twist12 = Distribution.from_filename("Twisted-1.2.egg") + assert parse_version('1.2') in r + assert parse_version('1.1') not in r + assert '1.2' in r + assert '1.1' not in r + assert foo_dist not in r + assert twist11 not in r + assert twist12 in r + + def testOptionsAndHashing(self): + r1 = Requirement.parse("Twisted[foo,bar]>=1.2") + r2 = Requirement.parse("Twisted[bar,FOO]>=1.2") + assert r1 == r2 + assert set(r1.extras) == set(("foo", "bar")) + assert set(r2.extras) == set(("foo", "bar")) + assert hash(r1) == hash(r2) + assert ( + hash(r1) + == + hash(( + "twisted", + packaging.specifiers.SpecifierSet(">=1.2"), + frozenset(["foo", "bar"]), + None + )) + ) + + def testVersionEquality(self): + r1 = Requirement.parse("foo==0.3a2") + r2 = Requirement.parse("foo!=0.3a4") + d = Distribution.from_filename + + assert d("foo-0.3a4.egg") not in r1 + assert d("foo-0.3a1.egg") not in r1 + assert d("foo-0.3a4.egg") not in r2 + + assert d("foo-0.3a2.egg") in r1 + assert d("foo-0.3a2.egg") in r2 + assert d("foo-0.3a3.egg") in r2 + assert d("foo-0.3a5.egg") in r2 + + def testSetuptoolsProjectName(self): + """ + The setuptools project should 
implement the setuptools package. + """ + + assert ( + Requirement.parse('setuptools').project_name == 'setuptools') + # setuptools 0.7 and higher means setuptools. + assert ( + Requirement.parse('setuptools == 0.7').project_name == 'setuptools') + assert ( + Requirement.parse('setuptools == 0.7a1').project_name == 'setuptools') + assert ( + Requirement.parse('setuptools >= 0.7').project_name == 'setuptools') + + +class TestParsing: + def testEmptyParse(self): + assert list(parse_requirements('')) == [] + + def testYielding(self): + for inp, out in [ + ([], []), ('x', ['x']), ([[]], []), (' x\n y', ['x', 'y']), + (['x\n\n', 'y'], ['x', 'y']), + ]: + assert list(pkg_resources.yield_lines(inp)) == out + + def testSplitting(self): + sample = """ + x + [Y] + z + + a + [b ] + # foo + c + [ d] + [q] + v + """ + assert ( + list(pkg_resources.split_sections(sample)) + == + [ + (None, ["x"]), + ("Y", ["z", "a"]), + ("b", ["c"]), + ("d", []), + ("q", ["v"]), + ] + ) + with pytest.raises(ValueError): + list(pkg_resources.split_sections("[foo")) + + def testSafeName(self): + assert safe_name("adns-python") == "adns-python" + assert safe_name("WSGI Utils") == "WSGI-Utils" + assert safe_name("WSGI Utils") == "WSGI-Utils" + assert safe_name("Money$$$Maker") == "Money-Maker" + assert safe_name("peak.web") != "peak-web" + + def testSafeVersion(self): + assert safe_version("1.2-1") == "1.2.post1" + assert safe_version("1.2 alpha") == "1.2.alpha" + assert safe_version("2.3.4 20050521") == "2.3.4.20050521" + assert safe_version("Money$$$Maker") == "Money-Maker" + assert safe_version("peak.web") == "peak.web" + + def testSimpleRequirements(self): + assert ( + list(parse_requirements('Twis-Ted>=1.2-1')) + == + [Requirement('Twis-Ted>=1.2-1')] + ) + assert ( + list(parse_requirements('Twisted >=1.2, \\ # more\n<2.0')) + == + [Requirement('Twisted>=1.2,<2.0')] + ) + assert ( + Requirement.parse("FooBar==1.99a3") + == + Requirement("FooBar==1.99a3") + ) + with pytest.raises(ValueError): + Requirement.parse(">=2.3") + with pytest.raises(ValueError): + Requirement.parse("x\\") + with pytest.raises(ValueError): + Requirement.parse("x==2 q") + with pytest.raises(ValueError): + Requirement.parse("X==1\nY==2") + with pytest.raises(ValueError): + Requirement.parse("#") + + def test_requirements_with_markers(self): + assert ( + Requirement.parse("foobar;os_name=='a'") + == + Requirement.parse("foobar;os_name=='a'") + ) + assert ( + Requirement.parse("name==1.1;python_version=='2.7'") + != + Requirement.parse("name==1.1;python_version=='3.3'") + ) + assert ( + Requirement.parse("name==1.0;python_version=='2.7'") + != + Requirement.parse("name==1.2;python_version=='2.7'") + ) + assert ( + Requirement.parse("name[foo]==1.0;python_version=='3.3'") + != + Requirement.parse("name[foo,bar]==1.0;python_version=='3.3'") + ) + + def test_local_version(self): + req, = parse_requirements('foo==1.0.org1') + + def test_spaces_between_multiple_versions(self): + req, = parse_requirements('foo>=1.0, <3') + req, = parse_requirements('foo >= 1.0, < 3') + + def testVersionEquality(self): + def c(s1, s2): + p1, p2 = parse_version(s1), parse_version(s2) + assert p1 == p2, (s1, s2, p1, p2) + + c('1.2-rc1', '1.2rc1') + c('0.4', '0.4.0') + c('0.4.0.0', '0.4.0') + c('0.4.0-0', '0.4-0') + c('0post1', '0.0post1') + c('0pre1', '0.0c1') + c('0.0.0preview1', '0c1') + c('0.0c1', '0-rc1') + c('1.2a1', '1.2.a.1') + c('1.2.a', '1.2a') + + def testVersionOrdering(self): + def c(s1, s2): + p1, p2 = parse_version(s1), parse_version(s2) + assert p1 < p2, (s1, 
s2, p1, p2) + + c('2.1', '2.1.1') + c('2a1', '2b0') + c('2a1', '2.1') + c('2.3a1', '2.3') + c('2.1-1', '2.1-2') + c('2.1-1', '2.1.1') + c('2.1', '2.1post4') + c('2.1a0-20040501', '2.1') + c('1.1', '02.1') + c('3.2', '3.2.post0') + c('3.2post1', '3.2post2') + c('0.4', '4.0') + c('0.0.4', '0.4.0') + c('0post1', '0.4post1') + c('2.1.0-rc1', '2.1.0') + c('2.1dev', '2.1a0') + + torture = """ + 0.80.1-3 0.80.1-2 0.80.1-1 0.79.9999+0.80.0pre4-1 + 0.79.9999+0.80.0pre2-3 0.79.9999+0.80.0pre2-2 + 0.77.2-1 0.77.1-1 0.77.0-1 + """.split() + + for p, v1 in enumerate(torture): + for v2 in torture[p + 1:]: + c(v2, v1) + + def testVersionBuildout(self): + """ + Buildout has a function in it's bootstrap.py that inspected the return + value of parse_version. The new parse_version returns a Version class + which needs to support this behavior, at least for now. + """ + + def buildout(parsed_version): + _final_parts = '*final-', '*final' + + def _final_version(parsed_version): + for part in parsed_version: + if (part[:1] == '*') and (part not in _final_parts): + return False + return True + + return _final_version(parsed_version) + + assert buildout(parse_version("1.0")) + assert not buildout(parse_version("1.0a1")) + + def testVersionIndexable(self): + """ + Some projects were doing things like parse_version("v")[0], so we'll + support indexing the same as we support iterating. + """ + assert parse_version("1.0")[0] == "00000001" + + def testVersionTupleSort(self): + """ + Some projects expected to be able to sort tuples against the return + value of parse_version. So again we'll add a warning enabled shim to + make this possible. + """ + assert parse_version("1.0") < tuple(parse_version("2.0")) + assert parse_version("1.0") <= tuple(parse_version("2.0")) + assert parse_version("1.0") == tuple(parse_version("1.0")) + assert parse_version("3.0") > tuple(parse_version("2.0")) + assert parse_version("3.0") >= tuple(parse_version("2.0")) + assert parse_version("3.0") != tuple(parse_version("2.0")) + assert not (parse_version("3.0") != tuple(parse_version("3.0"))) + + def testVersionHashable(self): + """ + Ensure that our versions stay hashable even though we've subclassed + them and added some shim code to them. + """ + assert ( + hash(parse_version("1.0")) + == + hash(parse_version("1.0")) + ) + + +class TestNamespaces: + + ns_str = "__import__('pkg_resources').declare_namespace(__name__)\n" + + @pytest.yield_fixture + def symlinked_tmpdir(self, tmpdir): + """ + Where available, return the tempdir as a symlink, + which as revealed in #231 is more fragile than + a natural tempdir. + """ + if not hasattr(os, 'symlink'): + yield str(tmpdir) + return + + link_name = str(tmpdir) + '-linked' + os.symlink(str(tmpdir), link_name) + try: + yield type(tmpdir)(link_name) + finally: + os.unlink(link_name) + + @pytest.yield_fixture(autouse=True) + def patched_path(self, tmpdir): + """ + Patch sys.path to include the 'site-pkgs' dir. Also + restore pkg_resources._namespace_packages to its + former state. 
+ """ + saved_ns_pkgs = pkg_resources._namespace_packages.copy() + saved_sys_path = sys.path[:] + site_pkgs = tmpdir.mkdir('site-pkgs') + sys.path.append(str(site_pkgs)) + try: + yield + finally: + pkg_resources._namespace_packages = saved_ns_pkgs + sys.path = saved_sys_path + + issue591 = pytest.mark.xfail(platform.system() == 'Windows', reason="#591") + + @issue591 + def test_two_levels_deep(self, symlinked_tmpdir): + """ + Test nested namespace packages + Create namespace packages in the following tree : + site-packages-1/pkg1/pkg2 + site-packages-2/pkg1/pkg2 + Check both are in the _namespace_packages dict and that their __path__ + is correct + """ + real_tmpdir = symlinked_tmpdir.realpath() + tmpdir = symlinked_tmpdir + sys.path.append(str(tmpdir / 'site-pkgs2')) + site_dirs = tmpdir / 'site-pkgs', tmpdir / 'site-pkgs2' + for site in site_dirs: + pkg1 = site / 'pkg1' + pkg2 = pkg1 / 'pkg2' + pkg2.ensure_dir() + (pkg1 / '__init__.py').write_text(self.ns_str, encoding='utf-8') + (pkg2 / '__init__.py').write_text(self.ns_str, encoding='utf-8') + import pkg1 + assert "pkg1" in pkg_resources._namespace_packages + # attempt to import pkg2 from site-pkgs2 + import pkg1.pkg2 + # check the _namespace_packages dict + assert "pkg1.pkg2" in pkg_resources._namespace_packages + assert pkg_resources._namespace_packages["pkg1"] == ["pkg1.pkg2"] + # check the __path__ attribute contains both paths + expected = [ + str(real_tmpdir / "site-pkgs" / "pkg1" / "pkg2"), + str(real_tmpdir / "site-pkgs2" / "pkg1" / "pkg2"), + ] + assert pkg1.pkg2.__path__ == expected + + @issue591 + def test_path_order(self, symlinked_tmpdir): + """ + Test that if multiple versions of the same namespace package subpackage + are on different sys.path entries, that only the one earliest on + sys.path is imported, and that the namespace package's __path__ is in + the correct order. + + Regression test for https://github.com/pypa/setuptools/issues/207 + """ + + tmpdir = symlinked_tmpdir + site_dirs = ( + tmpdir / "site-pkgs", + tmpdir / "site-pkgs2", + tmpdir / "site-pkgs3", + ) + + vers_str = "__version__ = %r" + + for number, site in enumerate(site_dirs, 1): + if number > 1: + sys.path.append(str(site)) + nspkg = site / 'nspkg' + subpkg = nspkg / 'subpkg' + subpkg.ensure_dir() + (nspkg / '__init__.py').write_text(self.ns_str, encoding='utf-8') + (subpkg / '__init__.py').write_text(vers_str % number, encoding='utf-8') + + import nspkg.subpkg + import nspkg + expected = [ + str(site.realpath() / 'nspkg') + for site in site_dirs + ] + assert nspkg.__path__ == expected + assert nspkg.subpkg.__version__ == 1 diff --git a/pytest.ini b/pytest.ini new file mode 100755 index 0000000..11a213d --- /dev/null +++ b/pytest.ini @@ -0,0 +1,6 @@ +[pytest] +addopts=--doctest-modules --ignore release.py --ignore setuptools/lib2to3_ex.py --ignore tests/manual_test.py --ignore tests/test_pypi.py --ignore tests/shlib_test --doctest-glob=pkg_resources/api_tests.txt --ignore scripts/upload-old-releases-as-zip.py --ignore pavement.py --ignore setuptools/tests/mod_with_constant.py +norecursedirs=dist build *.egg setuptools/extern pkg_resources/extern .* +flake8-ignore = + setuptools/site-patch.py F821 + setuptools/py*compat.py F811 diff --git a/release.sh b/release.sh deleted file mode 100755 index e184d02..0000000 --- a/release.sh +++ /dev/null @@ -1,25 +0,0 @@ -#!/bin/sh - -# This script is for PJE's working environment only, to upload -# releases to PyPI, telecommunity, eby-sarna SVN, update local -# project checkouts, etc. 
-# -# If your initials aren't PJE, don't run it. :) -# - -export VERSION="0.6c11" -python2.3 setup.py -q egg_info # force upload to be available -python2.3 setup.py -q release source --target-version=2.3 upload && \ -python2.4 setup.py -q release binary --target-version=2.4 upload && \ -python2.5 setup.py -q release binary --target-version=2.5 upload && \ -python2.6 setup.py -q release binary --target-version=2.6 -p win32 upload && \ -python2.3 ez_setup.py --md5update dist/setuptools-$VERSION*-py2.?.egg && \ - cp ez_setup.py virtual-python.py ~/distrib/ && \ - cp ez_setup.py ~/projects/ez_setup/__init__.py && \ - svn ci -m "Update ez_setup for setuptools $VERSION" \ - ~/projects/ez_setup/__init__.py #&& \ - #svn up ~/projects/*/ez_setup - -# update wiki pages from EasyInstall.txt, setuptools.txt, & -# pkg_resources.txt -python2.5 setup.py wikiup -c "Released version: $VERSION" diff --git a/setup.cfg b/setup.cfg index ddab119..b8bba8a 100755 --- a/setup.cfg +++ b/setup.cfg @@ -1,18 +1,26 @@ -[bdist_rpm] -source_only = 1 -requires = python-devel -doc_files = setuptools.txt EasyInstall.txt pkg_resources.txt +[bumpversion] +current_version = 34.3.3 +commit = True +tag = True [egg_info] tag_build = tag_date = 0 -tag_svn_revision = 0 + +[aliases] +clean_egg_info = egg_info -Db '' +release = clean_egg_info sdist bdist_wheel +source = register sdist binary +binary = bdist_egg upload --show-response [upload] -show_response = 1 +repository = https://upload.pypi.org/legacy/ -[aliases] -release = egg_info -RDb '' -binary = bdist_egg bdist_wininst -source = bdist_rpm register binary +[sdist] +formats = zip + +[wheel] +universal = 1 + +[bumpversion:file:setup.py] diff --git a/setup.py b/setup.py index b9a4076..100acfa 100755 --- a/setup.py +++ b/setup.py @@ -1,59 +1,133 @@ #!/usr/bin/env python -"""Distutils setup file, used to install or test 'setuptools'""" +""" +Distutils setup file, used to install or test 'setuptools' +""" -from distutils.util import convert_path +import io +import os +import sys +import textwrap -d = {} -execfile(convert_path('setuptools/command/__init__.py'), d) +import setuptools -SETUP_COMMANDS = d['__all__'] -VERSION = "0.6c11" +here = os.path.dirname(__file__) -from setuptools import setup, find_packages -import sys -scripts = [] -setup( - name="setuptools", - version=VERSION, - description="Download, build, install, upgrade, and uninstall Python " - "packages -- easily!", - author="Phillip J. Eby", - author_email="distutils-sig@python.org", - license="PSF or ZPL", - long_description = open('README.txt').read(), - keywords = "CPAN PyPI distutils eggs package management", - url = "http://pypi.python.org/pypi/setuptools", - test_suite = 'setuptools.tests', - packages = find_packages(), - package_data = {'setuptools':['*.exe']}, +def require_metadata(): + "Prevent improper installs without necessary metadata. See #659" + if not os.path.exists('setuptools.egg-info'): + msg = ( + "Cannot build setuptools without metadata. " + "Install rwt and run `rwt -- bootstrap.py`." 
+ ) + raise RuntimeError(msg) + + +def read_commands(): + command_ns = {} + cmd_module_path = 'setuptools/command/__init__.py' + init_path = os.path.join(here, cmd_module_path) + with open(init_path) as init_file: + exec(init_file.read(), command_ns) + return command_ns['__all__'] + - py_modules = ['pkg_resources', 'easy_install', 'site'], +def _gen_console_scripts(): + yield "easy_install = setuptools.command.easy_install:main" - zip_safe = (sys.version>="2.5"), # <2.5 needs unzipped for -m to work + # Gentoo distributions manage the python-version-specific scripts + # themselves, so those platforms define an environment variable to + # suppress the creation of the version-specific scripts. + var_names = ( + 'SETUPTOOLS_DISABLE_VERSIONED_EASY_INSTALL_SCRIPT', + 'DISTRIBUTE_DISABLE_VERSIONED_EASY_INSTALL_SCRIPT', + ) + if any(os.environ.get(var) not in (None, "", "0") for var in var_names): + return + yield ("easy_install-{shortver} = setuptools.command.easy_install:main" + .format(shortver=sys.version[:3])) - entry_points = { - "distutils.commands" : [ +readme_path = os.path.join(here, 'README.rst') +with io.open(readme_path, encoding='utf-8') as readme_file: + long_description = readme_file.read() + +package_data = dict( + setuptools=['script (dev).tmpl', 'script.tmpl', 'site-patch.py'], +) + +force_windows_specific_files = ( + os.environ.get("SETUPTOOLS_INSTALL_WINDOWS_SPECIFIC_FILES", "1").lower() + not in ("", "0", "false", "no") +) + +include_windows_files = ( + sys.platform == 'win32' or + os.name == 'java' and os._name == 'nt' or + force_windows_specific_files +) + +if include_windows_files: + package_data.setdefault('setuptools', []).extend(['*.exe']) + package_data.setdefault('setuptools.command', []).extend(['*.xml']) + +needs_wheel = set(['release', 'bdist_wheel']).intersection(sys.argv) +wheel = ['wheel'] if needs_wheel else [] + + +def pypi_link(pkg_filename): + """ + Given the filename, including md5 fragment, construct the + dependency link for PyPI. 
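+    For example, 'certifi-2016.9.26.tar.gz#md5=...' becomes
+    'https://files.pythonhosted.org/packages/source/c/certifi/certifi-2016.9.26.tar.gz#md5=...'.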
+ """ + root = 'https://files.pythonhosted.org/packages/source' + name, sep, rest = pkg_filename.partition('-') + parts = root, name[0], name, pkg_filename + return '/'.join(parts) + + +setup_params = dict( + name="setuptools", + version="34.3.3", + description="Easily download, build, install, upgrade, and uninstall " + "Python packages", + author="Python Packaging Authority", + author_email="distutils-sig@python.org", + long_description=long_description, + keywords="CPAN PyPI distutils eggs package management", + url="https://github.com/pypa/setuptools", + src_root=None, + packages=setuptools.find_packages(exclude=['*.tests']), + package_data=package_data, + py_modules=['easy_install'], + zip_safe=True, + entry_points={ + "distutils.commands": [ "%(cmd)s = setuptools.command.%(cmd)s:%(cmd)s" % locals() - for cmd in SETUP_COMMANDS + for cmd in read_commands() ], - "distutils.setup_keywords": [ - "eager_resources = setuptools.dist:assert_string_list", - "namespace_packages = setuptools.dist:check_nsp", - "extras_require = setuptools.dist:check_extras", - "install_requires = setuptools.dist:check_requirements", - "tests_require = setuptools.dist:check_requirements", - "entry_points = setuptools.dist:check_entry_points", - "test_suite = setuptools.dist:check_test_suite", - "zip_safe = setuptools.dist:assert_bool", - "package_data = setuptools.dist:check_package_data", - "exclude_package_data = setuptools.dist:check_package_data", - "include_package_data = setuptools.dist:assert_bool", - "dependency_links = setuptools.dist:assert_string_list", - "test_loader = setuptools.dist:check_importable", - "packages = setuptools.dist:check_packages", + "eager_resources = setuptools.dist:assert_string_list", + "namespace_packages = setuptools.dist:check_nsp", + "extras_require = setuptools.dist:check_extras", + "install_requires = setuptools.dist:check_requirements", + "tests_require = setuptools.dist:check_requirements", + "setup_requires = setuptools.dist:check_requirements", + "python_requires = setuptools.dist:check_specifier", + "entry_points = setuptools.dist:check_entry_points", + "test_suite = setuptools.dist:check_test_suite", + "zip_safe = setuptools.dist:assert_bool", + "package_data = setuptools.dist:check_package_data", + "exclude_package_data = setuptools.dist:check_package_data", + "include_package_data = setuptools.dist:assert_bool", + "packages = setuptools.dist:check_packages", + "dependency_links = setuptools.dist:assert_string_list", + "test_loader = setuptools.dist:check_importable", + "test_runner = setuptools.dist:check_importable", + "use_2to3 = setuptools.dist:assert_bool", + "convert_2to3_doctests = setuptools.dist:assert_string_list", + "use_2to3_fixers = setuptools.dist:assert_string_list", + "use_2to3_exclude_fixers = setuptools.dist:assert_string_list", ], "egg_info.writers": [ "PKG-INFO = setuptools.command.egg_info:write_pkg_info", @@ -65,59 +139,53 @@ setup( "depends.txt = setuptools.command.egg_info:warn_depends_obsolete", "dependency_links.txt = setuptools.command.egg_info:overwrite_arg", ], - - "console_scripts": [ - "easy_install = setuptools.command.easy_install:main", - "easy_install-%s = setuptools.command.easy_install:main" - % sys.version[:3] - ], - - "setuptools.file_finders": - ["svn_cvs = setuptools.command.sdist:_default_revctrl"], - + "console_scripts": list(_gen_console_scripts()), "setuptools.installation": ['eggsecutable = setuptools.command.easy_install:bootstrap'], - }, - - - classifiers = [f.strip() for f in """ - Development Status :: 3 - Alpha - 
Intended Audience :: Developers - License :: OSI Approved :: Python Software Foundation License - License :: OSI Approved :: Zope Public License - Operating System :: OS Independent - Programming Language :: Python - Topic :: Software Development :: Libraries :: Python Modules - Topic :: System :: Archiving :: Packaging - Topic :: System :: Systems Administration - Topic :: Utilities""".splitlines() if f.strip()], - scripts = scripts, - - # uncomment for testing - # setup_requires = ['setuptools>=0.6a0'], + }, + classifiers=textwrap.dedent(""" + Development Status :: 5 - Production/Stable + Intended Audience :: Developers + License :: OSI Approved :: MIT License + Operating System :: OS Independent + Programming Language :: Python :: 2 + Programming Language :: Python :: 2.6 + Programming Language :: Python :: 2.7 + Programming Language :: Python :: 3 + Programming Language :: Python :: 3.3 + Programming Language :: Python :: 3.4 + Programming Language :: Python :: 3.5 + Programming Language :: Python :: 3.6 + Topic :: Software Development :: Libraries :: Python Modules + Topic :: System :: Archiving :: Packaging + Topic :: System :: Systems Administration + Topic :: Utilities + """).strip().splitlines(), + python_requires='>=2.6,!=3.0.*,!=3.1.*,!=3.2.*', + install_requires=[ + 'packaging>=16.8', + 'six>=1.6.0', + 'appdirs>=1.4.0', + ], + extras_require={ + "ssl:sys_platform=='win32'": "wincertstore==0.2", + "certs": "certifi==2016.9.26", + }, + dependency_links=[ + pypi_link( + 'certifi-2016.9.26.tar.gz#md5=baa81e951a29958563689d868ef1064d', + ), + pypi_link( + 'wincertstore-0.2.zip#md5=ae728f2f007185648d0c7a8679b361e2', + ), + ], + scripts=[], + setup_requires=[ + ] + wheel, ) - - - - - - - - - - - - - - - - - - - - - - - - +if __name__ == '__main__': + # allow setup.py to run from another directory + here and os.chdir(here) + require_metadata() + dist = setuptools.setup(**setup_params) diff --git a/setuptools.egg-info/PKG-INFO b/setuptools.egg-info/PKG-INFO index f2bca0d..439d58b 100644 --- a/setuptools.egg-info/PKG-INFO +++ b/setuptools.egg-info/PKG-INFO @@ -1,183 +1,51 @@ -Metadata-Version: 1.0 +Metadata-Version: 1.2 Name: setuptools -Version: 0.6c11 -Summary: Download, build, install, upgrade, and uninstall Python packages -- easily! -Home-page: http://pypi.python.org/pypi/setuptools -Author: Phillip J. Eby +Version: 34.3.3 +Summary: Easily download, build, install, upgrade, and uninstall Python packages +Home-page: https://github.com/pypa/setuptools +Author: Python Packaging Authority Author-email: distutils-sig@python.org -License: PSF or ZPL -Description: =============================== - Installing and Using Setuptools - =============================== +License: UNKNOWN +Description: .. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest + :target: https://setuptools.readthedocs.io - .. contents:: **Table of Contents** + See the `Installation Instructions + `_ in the Python Packaging + User's Guide for instructions on installing, upgrading, and uninstalling + Setuptools. + The project is `maintained at GitHub `_. - ------------------------- - Installation Instructions - ------------------------- + Questions and comments should be directed to the `distutils-sig + mailing list `_. + Bug reports and especially tested patches may be + submitted directly to the `bug tracker + `_. - Windows - ======= - Install setuptools using the provided ``.exe`` installer. 
If you've previously - installed older versions of setuptools, please delete all ``setuptools*.egg`` - and ``setuptools.pth`` files from your system's ``site-packages`` directory - (and any other ``sys.path`` directories) FIRST. + Code of Conduct + --------------- - If you are upgrading a previous version of setuptools that was installed using - an ``.exe`` installer, please be sure to also *uninstall that older version* - via your system's "Add/Remove Programs" feature, BEFORE installing the newer - version. - - Once installation is complete, you will find an ``easy_install.exe`` program in - your Python ``Scripts`` subdirectory. Be sure to add this directory to your - ``PATH`` environment variable, if you haven't already done so. - - - RPM-Based Systems - ================= - - Install setuptools using the provided source RPM. The included ``.spec`` file - assumes you are installing using the default ``python`` executable, and is not - specific to a particular Python version. The ``easy_install`` executable will - be installed to a system ``bin`` directory such as ``/usr/bin``. - - If you wish to install to a location other than the default Python - installation's default ``site-packages`` directory (and ``$prefix/bin`` for - scripts), please use the ``.egg``-based installation approach described in the - following section. - - - Cygwin, Mac OS X, Linux, Other - ============================== - - 1. Download the appropriate egg for your version of Python (e.g. - ``setuptools-0.6c9-py2.4.egg``). Do NOT rename it. - - 2. Run it as if it were a shell script, e.g. ``sh setuptools-0.6c9-py2.4.egg``. - Setuptools will install itself using the matching version of Python (e.g. - ``python2.4``), and will place the ``easy_install`` executable in the - default location for installing Python scripts (as determined by the - standard distutils configuration files, or by the Python installation). - - If you want to install setuptools to somewhere other than ``site-packages`` or - your default distutils installation locations for libraries and scripts, you - may include EasyInstall command-line options such as ``--prefix``, - ``--install-dir``, and so on, following the ``.egg`` filename on the same - command line. For example:: - - sh setuptools-0.6c9-py2.4.egg --prefix=~ - - You can use ``--help`` to get a full options list, but we recommend consulting - the `EasyInstall manual`_ for detailed instructions, especially `the section - on custom installation locations`_. - - .. _EasyInstall manual: http://peak.telecommunity.com/DevCenter/EasyInstall - .. _the section on custom installation locations: http://peak.telecommunity.com/DevCenter/EasyInstall#custom-installation-locations - - - Cygwin Note - ----------- - - If you are trying to install setuptools for the **Windows** version of Python - (as opposed to the Cygwin version that lives in ``/usr/bin``), you must make - sure that an appropriate executable (``python2.3``, ``python2.4``, or - ``python2.5``) is on your **Cygwin** ``PATH`` when invoking the egg. For - example, doing the following at a Cygwin bash prompt will install setuptools - for the **Windows** Python found at ``C:\\Python24``:: - - ln -s /cygdrive/c/Python24/python.exe python2.4 - PATH=.:$PATH sh setuptools-0.6c9-py2.4.egg - rm python2.4 - - - Downloads - ========= - - All setuptools downloads can be found at `the project's home page in the Python - Package Index`_. Scroll to the very bottom of the page to find the links. - - .. 
_the project's home page in the Python Package Index: http://pypi.python.org/pypi/setuptools#files - - In addition to the PyPI downloads, the development version of ``setuptools`` - is available from the `Python SVN sandbox`_, and in-development versions of the - `0.6 branch`_ are available as well. - - .. _0.6 branch: http://svn.python.org/projects/sandbox/branches/setuptools-0.6/#egg=setuptools-dev06 - - .. _Python SVN sandbox: http://svn.python.org/projects/sandbox/trunk/setuptools/#egg=setuptools-dev - - -------------------------------- - Using Setuptools and EasyInstall - -------------------------------- - - Here are some of the available manuals, tutorials, and other resources for - learning about Setuptools, Python Eggs, and EasyInstall: - - * `The EasyInstall user's guide and reference manual`_ - * `The setuptools Developer's Guide`_ - * `The pkg_resources API reference`_ - * `Package Compatibility Notes`_ (user-maintained) - * `The Internal Structure of Python Eggs`_ - - Questions, comments, and bug reports should be directed to the `distutils-sig - mailing list`_. If you have written (or know of) any tutorials, documentation, - plug-ins, or other resources for setuptools users, please let us know about - them there, so this reference list can be updated. If you have working, - *tested* patches to correct problems or add features, you may submit them to - the `setuptools bug tracker`_. - - .. _setuptools bug tracker: http://bugs.python.org/setuptools/ - .. _Package Compatibility Notes: http://peak.telecommunity.com/DevCenter/PackageNotes - .. _The Internal Structure of Python Eggs: http://peak.telecommunity.com/DevCenter/EggFormats - .. _The setuptools Developer's Guide: http://peak.telecommunity.com/DevCenter/setuptools - .. _The pkg_resources API reference: http://peak.telecommunity.com/DevCenter/PkgResources - .. _The EasyInstall user's guide and reference manual: http://peak.telecommunity.com/DevCenter/EasyInstall - .. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/ - - - ------- - Credits - ------- - - * The original design for the ``.egg`` format and the ``pkg_resources`` API was - co-created by Phillip Eby and Bob Ippolito. Bob also implemented the first - version of ``pkg_resources``, and supplied the OS X operating system version - compatibility algorithm. - - * Ian Bicking implemented many early "creature comfort" features of - easy_install, including support for downloading via Sourceforge and - Subversion repositories. Ian's comments on the Web-SIG about WSGI - application deployment also inspired the concept of "entry points" in eggs, - and he has given talks at PyCon and elsewhere to inform and educate the - community about eggs and setuptools. - - * Jim Fulton contributed time and effort to build automated tests of various - aspects of ``easy_install``, and supplied the doctests for the command-line - ``.exe`` wrappers on Windows. - - * Phillip J. Eby is the principal author and maintainer of setuptools, and - first proposed the idea of an importable binary distribution format for - Python application plug-ins. - - * Significant parts of the implementation of setuptools were funded by the Open - Source Applications Foundation, to provide a plug-in infrastructure for the - Chandler PIM application. In addition, many OSAF staffers (such as Mike - "Code Bear" Taylor) contributed their time and stress as guinea pigs for the - use of eggs and setuptools, even before eggs were "cool". (Thanks, guys!) - - .. 
_files: + Everyone interacting in the setuptools project's codebases, issue trackers, + chat rooms, and mailing lists is expected to follow the + `PyPA Code of Conduct `_. Keywords: CPAN PyPI distutils eggs package management Platform: UNKNOWN -Classifier: Development Status :: 3 - Alpha +Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: Python Software Foundation License -Classifier: License :: OSI Approved :: Zope Public License +Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent -Classifier: Programming Language :: Python +Classifier: Programming Language :: Python :: 2 +Classifier: Programming Language :: Python :: 2.6 +Classifier: Programming Language :: Python :: 2.7 +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: 3.3 +Classifier: Programming Language :: Python :: 3.4 +Classifier: Programming Language :: Python :: 3.5 +Classifier: Programming Language :: Python :: 3.6 Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Topic :: System :: Archiving :: Packaging Classifier: Topic :: System :: Systems Administration Classifier: Topic :: Utilities +Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.* diff --git a/setuptools.egg-info/SOURCES.txt b/setuptools.egg-info/SOURCES.txt index 443c5ae..069d01a 100644 --- a/setuptools.egg-info/SOURCES.txt +++ b/setuptools.egg-info/SOURCES.txt @@ -1,40 +1,86 @@ -EasyInstall.txt -README.txt -api_tests.txt +CHANGES.rst +LICENSE +MANIFEST.in +README.rst +bootstrap.py +conftest.py easy_install.py -ez_setup.py launcher.c -pkg_resources.py -pkg_resources.txt -release.sh +msvc-build-launcher.cmd +pytest.ini setup.cfg setup.py -setuptools.txt -site.py -version -version.dat -virtual-python.py -wikiup.cfg +tox.ini +docs/Makefile +docs/conf.py +docs/developer-guide.txt +docs/development.txt +docs/easy_install.txt +docs/formats.txt +docs/history.txt +docs/index.txt +docs/pkg_resources.txt +docs/python3.txt +docs/releases.txt +docs/requirements.txt +docs/roadmap.txt +docs/setuptools.txt +docs/_templates/indexsidebar.html +docs/_theme/nature/theme.conf +docs/_theme/nature/static/nature.css_t +docs/_theme/nature/static/pygments.css +pkg_resources/__init__.py +pkg_resources/api_tests.txt +pkg_resources/tests/__init__.py +pkg_resources/tests/test_markers.py +pkg_resources/tests/test_pkg_resources.py +pkg_resources/tests/test_resources.py setuptools/__init__.py setuptools/archive_util.py +setuptools/cli-32.exe +setuptools/cli-64.exe setuptools/cli.exe +setuptools/config.py +setuptools/dep_util.py setuptools/depends.py setuptools/dist.py setuptools/extension.py +setuptools/glob.py +setuptools/gui-32.exe +setuptools/gui-64.exe setuptools/gui.exe +setuptools/launch.py +setuptools/lib2to3_ex.py +setuptools/monkey.py +setuptools/msvc.py +setuptools/namespaces.py setuptools/package_index.py +setuptools/py26compat.py +setuptools/py27compat.py +setuptools/py31compat.py +setuptools/py33compat.py +setuptools/py36compat.py setuptools/sandbox.py +setuptools/script (dev).tmpl +setuptools/script.tmpl +setuptools/site-patch.py +setuptools/ssl_support.py +setuptools/unicode_utils.py +setuptools/version.py +setuptools/windows_support.py setuptools.egg-info/PKG-INFO setuptools.egg-info/SOURCES.txt setuptools.egg-info/dependency_links.txt setuptools.egg-info/entry_points.txt -setuptools.egg-info/not-zip-safe +setuptools.egg-info/requires.txt setuptools.egg-info/top_level.txt 
+setuptools.egg-info/zip-safe setuptools/command/__init__.py setuptools/command/alias.py setuptools/command/bdist_egg.py setuptools/command/bdist_rpm.py setuptools/command/bdist_wininst.py +setuptools/command/build_clib.py setuptools/command/build_ext.py setuptools/command/build_py.py setuptools/command/develop.py @@ -44,6 +90,8 @@ setuptools/command/install.py setuptools/command/install_egg_info.py setuptools/command/install_lib.py setuptools/command/install_scripts.py +setuptools/command/launcher manifest.xml +setuptools/command/py36compat.py setuptools/command/register.py setuptools/command/rotate.py setuptools/command/saveopts.py @@ -51,13 +99,45 @@ setuptools/command/sdist.py setuptools/command/setopt.py setuptools/command/test.py setuptools/command/upload.py +setuptools/command/upload_docs.py setuptools/tests/__init__.py -setuptools/tests/doctest.py +setuptools/tests/contexts.py +setuptools/tests/environment.py +setuptools/tests/files.py +setuptools/tests/fixtures.py +setuptools/tests/mod_with_constant.py +setuptools/tests/namespaces.py +setuptools/tests/py26compat.py +setuptools/tests/script-with-bom.py +setuptools/tests/server.py +setuptools/tests/test_archive_util.py +setuptools/tests/test_bdist_egg.py +setuptools/tests/test_build_clib.py +setuptools/tests/test_build_ext.py +setuptools/tests/test_build_py.py +setuptools/tests/test_config.py +setuptools/tests/test_dep_util.py +setuptools/tests/test_depends.py +setuptools/tests/test_develop.py +setuptools/tests/test_dist_info.py +setuptools/tests/test_easy_install.py +setuptools/tests/test_egg_info.py +setuptools/tests/test_find_packages.py +setuptools/tests/test_install_scripts.py +setuptools/tests/test_integration.py +setuptools/tests/test_manifest.py +setuptools/tests/test_msvc.py +setuptools/tests/test_namespaces.py setuptools/tests/test_packageindex.py -setuptools/tests/test_resources.py -setuptools/tests/win_script_wrapper.txt -tests/shlib_test/hello.c -tests/shlib_test/hello.pyx -tests/shlib_test/hellolib.c -tests/shlib_test/setup.py -tests/shlib_test/test_hello.py \ No newline at end of file +setuptools/tests/test_sandbox.py +setuptools/tests/test_sdist.py +setuptools/tests/test_setuptools.py +setuptools/tests/test_test.py +setuptools/tests/test_unicode_utils.py +setuptools/tests/test_upload_docs.py +setuptools/tests/test_windows_wrappers.py +setuptools/tests/textwrap.py +setuptools/tests/indexes/test_links_priority/external.html +setuptools/tests/indexes/test_links_priority/simple/foobar/index.html +tests/manual_test.py +tests/test_pypi.py \ No newline at end of file diff --git a/setuptools.egg-info/dependency_links.txt b/setuptools.egg-info/dependency_links.txt index 8b13789..e87d021 100644 --- a/setuptools.egg-info/dependency_links.txt +++ b/setuptools.egg-info/dependency_links.txt @@ -1 +1,2 @@ - +https://files.pythonhosted.org/packages/source/c/certifi/certifi-2016.9.26.tar.gz#md5=baa81e951a29958563689d868ef1064d +https://files.pythonhosted.org/packages/source/w/wincertstore/wincertstore-0.2.zip#md5=ae728f2f007185648d0c7a8679b361e2 diff --git a/setuptools.egg-info/entry_points.txt b/setuptools.egg-info/entry_points.txt old mode 100755 new mode 100644 index 04fbe4b..3f29bd3 --- a/setuptools.egg-info/entry_points.txt +++ b/setuptools.egg-info/entry_points.txt @@ -1,57 +1,63 @@ +[console_scripts] +easy_install = setuptools.command.easy_install:main +easy_install-3.6 = setuptools.command.easy_install:main + [distutils.commands] +alias = setuptools.command.alias:alias +bdist_egg = setuptools.command.bdist_egg:bdist_egg 
bdist_rpm = setuptools.command.bdist_rpm:bdist_rpm -rotate = setuptools.command.rotate:rotate -develop = setuptools.command.develop:develop -setopt = setuptools.command.setopt:setopt +bdist_wininst = setuptools.command.bdist_wininst:bdist_wininst +build_clib = setuptools.command.build_clib:build_clib +build_ext = setuptools.command.build_ext:build_ext build_py = setuptools.command.build_py:build_py -saveopts = setuptools.command.saveopts:saveopts -egg_info = setuptools.command.egg_info:egg_info -register = setuptools.command.register:register -upload = setuptools.command.upload:upload -install_egg_info = setuptools.command.install_egg_info:install_egg_info -alias = setuptools.command.alias:alias +develop = setuptools.command.develop:develop easy_install = setuptools.command.easy_install:easy_install -install_scripts = setuptools.command.install_scripts:install_scripts -bdist_wininst = setuptools.command.bdist_wininst:bdist_wininst -bdist_egg = setuptools.command.bdist_egg:bdist_egg +egg_info = setuptools.command.egg_info:egg_info install = setuptools.command.install:install -test = setuptools.command.test:test +install_egg_info = setuptools.command.install_egg_info:install_egg_info install_lib = setuptools.command.install_lib:install_lib -build_ext = setuptools.command.build_ext:build_ext +install_scripts = setuptools.command.install_scripts:install_scripts +register = setuptools.command.register:register +rotate = setuptools.command.rotate:rotate +saveopts = setuptools.command.saveopts:saveopts sdist = setuptools.command.sdist:sdist - -[egg_info.writers] -dependency_links.txt = setuptools.command.egg_info:overwrite_arg -requires.txt = setuptools.command.egg_info:write_requirements -PKG-INFO = setuptools.command.egg_info:write_pkg_info -eager_resources.txt = setuptools.command.egg_info:overwrite_arg -top_level.txt = setuptools.command.egg_info:write_toplevel_names -namespace_packages.txt = setuptools.command.egg_info:overwrite_arg -entry_points.txt = setuptools.command.egg_info:write_entries -depends.txt = setuptools.command.egg_info:warn_depends_obsolete - -[console_scripts] -easy_install = setuptools.command.easy_install:main -easy_install-2.3 = setuptools.command.easy_install:main - -[setuptools.file_finders] -svn_cvs = setuptools.command.sdist:_default_revctrl +setopt = setuptools.command.setopt:setopt +test = setuptools.command.test:test +upload = setuptools.command.upload:upload +upload_docs = setuptools.command.upload_docs:upload_docs [distutils.setup_keywords] +convert_2to3_doctests = setuptools.dist:assert_string_list dependency_links = setuptools.dist:assert_string_list +eager_resources = setuptools.dist:assert_string_list entry_points = setuptools.dist:check_entry_points +exclude_package_data = setuptools.dist:check_package_data extras_require = setuptools.dist:check_extras -package_data = setuptools.dist:check_package_data -install_requires = setuptools.dist:check_requirements include_package_data = setuptools.dist:assert_bool -exclude_package_data = setuptools.dist:check_package_data +install_requires = setuptools.dist:check_requirements namespace_packages = setuptools.dist:check_nsp -test_suite = setuptools.dist:check_test_suite -eager_resources = setuptools.dist:assert_string_list -zip_safe = setuptools.dist:assert_bool -test_loader = setuptools.dist:check_importable +package_data = setuptools.dist:check_package_data packages = setuptools.dist:check_packages +python_requires = setuptools.dist:check_specifier +setup_requires = setuptools.dist:check_requirements +test_loader = 
setuptools.dist:check_importable +test_runner = setuptools.dist:check_importable +test_suite = setuptools.dist:check_test_suite tests_require = setuptools.dist:check_requirements +use_2to3 = setuptools.dist:assert_bool +use_2to3_exclude_fixers = setuptools.dist:assert_string_list +use_2to3_fixers = setuptools.dist:assert_string_list +zip_safe = setuptools.dist:assert_bool + +[egg_info.writers] +PKG-INFO = setuptools.command.egg_info:write_pkg_info +dependency_links.txt = setuptools.command.egg_info:overwrite_arg +depends.txt = setuptools.command.egg_info:warn_depends_obsolete +eager_resources.txt = setuptools.command.egg_info:overwrite_arg +entry_points.txt = setuptools.command.egg_info:write_entries +namespace_packages.txt = setuptools.command.egg_info:overwrite_arg +requires.txt = setuptools.command.egg_info:write_requirements +top_level.txt = setuptools.command.egg_info:write_toplevel_names [setuptools.installation] eggsecutable = setuptools.command.easy_install:bootstrap diff --git a/setuptools.egg-info/not-zip-safe b/setuptools.egg-info/not-zip-safe deleted file mode 100644 index 8b13789..0000000 --- a/setuptools.egg-info/not-zip-safe +++ /dev/null @@ -1 +0,0 @@ - diff --git a/setuptools.egg-info/requires.txt b/setuptools.egg-info/requires.txt new file mode 100644 index 0000000..b5b5680 --- /dev/null +++ b/setuptools.egg-info/requires.txt @@ -0,0 +1,9 @@ +packaging>=16.8 +six>=1.6.0 +appdirs>=1.4.0 + +[certs] +certifi==2016.9.26 + +[ssl:sys_platform=='win32'] +wincertstore==0.2 diff --git a/setuptools.egg-info/top_level.txt b/setuptools.egg-info/top_level.txt index ef77c7c..4577c6a 100644 --- a/setuptools.egg-info/top_level.txt +++ b/setuptools.egg-info/top_level.txt @@ -1,4 +1,3 @@ easy_install pkg_resources setuptools -site diff --git a/setuptools.egg-info/zip-safe b/setuptools.egg-info/zip-safe new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/setuptools.egg-info/zip-safe @@ -0,0 +1 @@ + diff --git a/setuptools.txt b/setuptools.txt deleted file mode 100755 index c709a90..0000000 --- a/setuptools.txt +++ /dev/null @@ -1,3149 +0,0 @@ -====================================================== -Building and Distributing Packages with ``setuptools`` -====================================================== - -``setuptools`` is a collection of enhancements to the Python ``distutils`` -(for Python 2.3.5 and up on most platforms; 64-bit platforms require a minimum -of Python 2.4) that allow you to more easily build and distribute Python -packages, especially ones that have dependencies on other packages. - -Packages built and distributed using ``setuptools`` look to the user like -ordinary Python packages based on the ``distutils``. Your users don't need to -install or even know about setuptools in order to use them, and you don't -have to include the entire setuptools package in your distributions. By -including just a single `bootstrap module`_ (an 8K .py file), your package will -automatically download and install ``setuptools`` if the user is building your -package from source and doesn't have a suitable version already installed. - -.. _bootstrap module: http://peak.telecommunity.com/dist/ez_setup.py - -Feature Highlights: - -* Automatically find/download/install/upgrade dependencies at build time using - the `EasyInstall tool `_, - which supports downloading via HTTP, FTP, Subversion, and SourceForge, and - automatically scans web pages linked from PyPI to find download links. (It's - the closest thing to CPAN currently available for Python.) 
- -* Create `Python Eggs `_ - - a single-file importable distribution format - -* Include data files inside your package directories, where your code can - actually use them. (Python 2.4 distutils also supports this feature, but - setuptools provides the feature for Python 2.3 packages also, and supports - accessing data files in zipped packages too.) - -* Automatically include all packages in your source tree, without listing them - individually in setup.py - -* Automatically include all relevant files in your source distributions, - without needing to create a ``MANIFEST.in`` file, and without having to force - regeneration of the ``MANIFEST`` file when your source tree changes. - -* Automatically generate wrapper scripts or Windows (console and GUI) .exe - files for any number of "main" functions in your project. (Note: this is not - a py2exe replacement; the .exe files rely on the local Python installation.) - -* Transparent Pyrex support, so that your setup.py can list ``.pyx`` files and - still work even when the end-user doesn't have Pyrex installed (as long as - you include the Pyrex-generated C in your source distribution) - -* Command aliases - create project-specific, per-user, or site-wide shortcut - names for commonly used commands and options - -* PyPI upload support - upload your source distributions and eggs to PyPI - -* Deploy your project in "development mode", such that it's available on - ``sys.path``, yet can still be edited directly from its source checkout. - -* Easily extend the distutils with new commands or ``setup()`` arguments, and - distribute/reuse your extensions for multiple projects, without copying code. - -* Create extensible applications and frameworks that automatically discover - extensions, using simple "entry points" declared in a project's setup script. - -In addition to the PyPI downloads, the development version of ``setuptools`` -is available from the `Python SVN sandbox`_, and in-development versions of the -`0.6 branch`_ are available as well. - -.. _0.6 branch: http://svn.python.org/projects/sandbox/branches/setuptools-0.6/#egg=setuptools-dev06 - -.. _Python SVN sandbox: http://svn.python.org/projects/sandbox/trunk/setuptools/#egg=setuptools-dev - -.. contents:: **Table of Contents** - -.. _ez_setup.py: `bootstrap module`_ - - ------------------ -Developer's Guide ------------------ - - -Installing ``setuptools`` -========================= - -Please follow the `EasyInstall Installation Instructions`_ to install the -current stable version of setuptools. In particular, be sure to read the -section on `Custom Installation Locations`_ if you are installing anywhere -other than Python's ``site-packages`` directory. - -.. _EasyInstall Installation Instructions: http://peak.telecommunity.com/DevCenter/EasyInstall#installation-instructions - -.. _Custom Installation Locations: http://peak.telecommunity.com/DevCenter/EasyInstall#custom-installation-locations - -If you want the current in-development version of setuptools, you should first -install a stable version, and then run:: - - ez_setup.py setuptools==dev - -This will download and install the latest development (i.e. unstable) version -of setuptools from the Python Subversion sandbox. - - -Basic Use -========= - -For basic use of setuptools, just import things from setuptools instead of -the distutils. 
Here's a minimal setup script using setuptools:: - - from setuptools import setup, find_packages - setup( - name = "HelloWorld", - version = "0.1", - packages = find_packages(), - ) - -As you can see, it doesn't take much to use setuptools in a project. -Just by doing the above, this project will be able to produce eggs, upload to -PyPI, and automatically include all packages in the directory where the -setup.py lives. See the `Command Reference`_ section below to see what -commands you can give to this setup script. - -Of course, before you release your project to PyPI, you'll want to add a bit -more information to your setup script to help people find or learn about your -project. And maybe your project will have grown by then to include a few -dependencies, and perhaps some data files and scripts:: - - from setuptools import setup, find_packages - setup( - name = "HelloWorld", - version = "0.1", - packages = find_packages(), - scripts = ['say_hello.py'], - - # Project uses reStructuredText, so ensure that the docutils get - # installed or upgraded on the target machine - install_requires = ['docutils>=0.3'], - - package_data = { - # If any package contains *.txt or *.rst files, include them: - '': ['*.txt', '*.rst'], - # And include any *.msg files found in the 'hello' package, too: - 'hello': ['*.msg'], - }, - - # metadata for upload to PyPI - author = "Me", - author_email = "me@example.com", - description = "This is an Example Package", - license = "PSF", - keywords = "hello world example examples", - url = "http://example.com/HelloWorld/", # project home page, if any - - # could also include long_description, download_url, classifiers, etc. - ) - -In the sections that follow, we'll explain what most of these ``setup()`` -arguments do (except for the metadata ones), and the various ways you might use -them in your own project(s). - - -Specifying Your Project's Version ---------------------------------- - -Setuptools can work well with most versioning schemes; there are, however, a -few special things to watch out for, in order to ensure that setuptools and -EasyInstall can always tell what version of your package is newer than another -version. Knowing these things will also help you correctly specify what -versions of other projects your project depends on. - -A version consists of an alternating series of release numbers and pre-release -or post-release tags. A release number is a series of digits punctuated by -dots, such as ``2.4`` or ``0.5``. Each series of digits is treated -numerically, so releases ``2.1`` and ``2.1.0`` are different ways to spell the -same release number, denoting the first subrelease of release 2. But ``2.10`` -is the *tenth* subrelease of release 2, and so is a different and newer release -from ``2.1`` or ``2.1.0``. Leading zeros within a series of digits are also -ignored, so ``2.01`` is the same as ``2.1``, and different from ``2.0.1``. - -Following a release number, you can have either a pre-release or post-release -tag. Pre-release tags make a version be considered *older* than the version -they are appended to. So, revision ``2.4`` is *newer* than revision ``2.4c1``, -which in turn is newer than ``2.4b1`` or ``2.4a1``. Postrelease tags make -a version be considered *newer* than the version they are appended to. So, -revisions like ``2.4-1`` and ``2.4pl3`` are newer than ``2.4``, but are *older* -than ``2.4.1`` (which has a higher release number). - -A pre-release tag is a series of letters that are alphabetically before -"final".
Some examples of prerelease tags would include ``alpha``, ``beta``, -``a``, ``c``, ``dev``, and so on. You do not have to place a dot before -the prerelease tag if it's immediately after a number, but it's okay to do -so if you prefer. Thus, ``2.4c1`` and ``2.4.c1`` both represent release -candidate 1 of version ``2.4``, and are treated as identical by setuptools. - -In addition, there are three special prerelease tags that are treated as if -they were the letter ``c``: ``pre``, ``preview``, and ``rc``. So, version -``2.4rc1``, ``2.4pre1`` and ``2.4preview1`` are all the exact same version as -``2.4c1``, and are treated as identical by setuptools. - -A post-release tag is either a series of letters that are alphabetically -greater than or equal to "final", or a dash (``-``). Post-release tags are -generally used to separate patch numbers, port numbers, build numbers, revision -numbers, or date stamps from the release number. For example, the version -``2.4-r1263`` might denote Subversion revision 1263 of a post-release patch of -version ``2.4``. Or you might use ``2.4-20051127`` to denote a date-stamped -post-release. - -Notice that after each pre or post-release tag, you are free to place another -release number, followed again by more pre- or post-release tags. For example, -``0.6a9.dev-r41475`` could denote Subversion revision 41475 of the in- -development version of the ninth alpha of release 0.6. Notice that ``dev`` is -a pre-release tag, so this version is a *lower* version number than ``0.6a9``, -which would be the actual ninth alpha of release 0.6. But the ``-r41475`` is -a post-release tag, so this version is *newer* than ``0.6a9.dev``. - -For the most part, setuptools' interpretation of version numbers is intuitive, -but here are a few tips that will keep you out of trouble in the corner cases: - -* Don't use ``-`` or any other character than ``.`` as a separator, unless you - really want a post-release. Remember that ``2.1-rc2`` means you've - *already* released ``2.1``, whereas ``2.1rc2`` and ``2.1.c2`` are candidates - you're putting out *before* ``2.1``. If you accidentally distribute copies - of a post-release that you meant to be a pre-release, the only safe fix is to - bump your main release number (e.g. to ``2.1.1``) and re-release the project. - -* Don't stick adjoining pre-release tags together without a dot or number - between them. Version ``1.9adev`` is the ``adev`` prerelease of ``1.9``, - *not* a development pre-release of ``1.9a``. Use ``.dev`` instead, as in - ``1.9a.dev``, or separate the prerelease tags with a number, as in - ``1.9a0dev``. ``1.9a.dev``, ``1.9a0dev``, and even ``1.9.a.dev`` are - identical versions from setuptools' point of view, so you can use whatever - scheme you prefer. - -* If you want to be certain that your chosen numbering scheme works the way - you think it will, you can use the ``pkg_resources.parse_version()`` function - to compare different version numbers:: - - >>> from pkg_resources import parse_version - >>> parse_version('1.9.a.dev') == parse_version('1.9a0dev') - True - >>> parse_version('2.1-rc2') < parse_version('2.1') - False - >>> parse_version('0.6a9dev-r41475') < parse_version('0.6a9') - True - -Once you've decided on a version numbering scheme for your project, you can -have setuptools automatically tag your in-development releases with various -pre- or post-release tags. 
See the following sections for more details: - -* `Tagging and "Daily Build" or "Snapshot" Releases`_ -* `Managing "Continuous Releases" Using Subversion`_ -* The `egg_info`_ command - - -New and Changed ``setup()`` Keywords -==================================== - -The following keyword arguments to ``setup()`` are added or changed by -``setuptools``. All of them are optional; you do not have to supply them -unless you need the associated ``setuptools`` feature. - -``include_package_data`` - If set to ``True``, this tells ``setuptools`` to automatically include any - data files it finds inside your package directories, that are either under - CVS or Subversion control, or which are specified by your ``MANIFEST.in`` - file. For more information, see the section below on `Including Data - Files`_. - -``exclude_package_data`` - A dictionary mapping package names to lists of glob patterns that should - be *excluded* from your package directories. You can use this to trim back - any excess files included by ``include_package_data``. For a complete - description and examples, see the section below on `Including Data Files`_. - -``package_data`` - A dictionary mapping package names to lists of glob patterns. For a - complete description and examples, see the section below on `Including - Data Files`_. You do not need to use this option if you are using - ``include_package_data``, unless you need to add e.g. files that are - generated by your setup script and build process. (And are therefore not - in source control or are files that you don't want to include in your - source distribution.) - -``zip_safe`` - A boolean (True or False) flag specifying whether the project can be - safely installed and run from a zip file. If this argument is not - supplied, the ``bdist_egg`` command will have to analyze all of your - project's contents for possible problems each time it builds an egg. - -``install_requires`` - A string or list of strings specifying what other distributions need to - be installed when this one is. See the section below on `Declaring - Dependencies`_ for details and examples of the format of this argument. - -``entry_points`` - A dictionary mapping entry point group names to strings or lists of strings - defining the entry points. Entry points are used to support dynamic - discovery of services or plugins provided by a project. See `Dynamic - Discovery of Services and Plugins`_ for details and examples of the format - of this argument. In addition, this keyword is used to support `Automatic - Script Creation`_. - -``extras_require`` - A dictionary mapping names of "extras" (optional features of your project) - to strings or lists of strings specifying what other distributions must be - installed to support those features. See the section below on `Declaring - Dependencies`_ for details and examples of the format of this argument. - -``setup_requires`` - A string or list of strings specifying what other distributions need to - be present in order for the *setup script* to run. ``setuptools`` will - attempt to obtain these (even going so far as to download them using - ``EasyInstall``) before processing the rest of the setup script or commands. - This argument is needed if you are using distutils extensions as part of - your build process; for example, extensions that process setup() arguments - and turn them into EGG-INFO metadata files. - - (Note: projects listed in ``setup_requires`` will NOT be automatically - installed on the system where the setup script is being run.
They are - simply downloaded to the setup directory if they're not locally available - already. If you want them to be installed, as well as being available - when the setup script is run, you should add them to ``install_requires`` - **and** ``setup_requires``.) - -``dependency_links`` - A list of strings naming URLs to be searched when satisfying dependencies. - These links will be used if needed to install packages specified by - ``setup_requires`` or ``tests_require``. They will also be written into - the egg's metadata for use by tools like EasyInstall to use when installing - an ``.egg`` file. - -``namespace_packages`` - A list of strings naming the project's "namespace packages". A namespace - package is a package that may be split across multiple project - distributions. For example, Zope 3's ``zope`` package is a namespace - package, because subpackages like ``zope.interface`` and ``zope.publisher`` - may be distributed separately. The egg runtime system can automatically - merge such subpackages into a single parent package at runtime, as long - as you declare them in each project that contains any subpackages of the - namespace package, and as long as the namespace package's ``__init__.py`` - does not contain any code. See the section below on `Namespace Packages`_ - for more information. - -``test_suite`` - A string naming a ``unittest.TestCase`` subclass (or a package or module - containing one or more of them, or a method of such a subclass), or naming - a function that can be called with no arguments and returns a - ``unittest.TestSuite``. If the named suite is a module, and the module - has an ``additional_tests()`` function, it is called and the results are - added to the tests to be run. If the named suite is a package, any - submodules and subpackages are recursively added to the overall test suite. - - Specifying this argument enables use of the `test`_ command to run the - specified test suite, e.g. via ``setup.py test``. See the section on the - `test`_ command below for more details. - -``tests_require`` - If your project's tests need one or more additional packages besides those - needed to install it, you can use this option to specify them. It should - be a string or list of strings specifying what other distributions need to - be present for the package's tests to run. When you run the ``test`` - command, ``setuptools`` will attempt to obtain these (even going - so far as to download them using ``EasyInstall``). Note that these - required projects will *not* be installed on the system where the tests - are run, but only downloaded to the project's setup directory if they're - not already installed locally. - -.. _test_loader: - -``test_loader`` - If you would like to use a different way of finding tests to run than what - setuptools normally uses, you can specify a module name and class name in - this argument. The named class must be instantiable with no arguments, and - its instances must support the ``loadTestsFromNames()`` method as defined - in the Python ``unittest`` module's ``TestLoader`` class. Setuptools will - pass only one test "name" in the `names` argument: the value supplied for - the ``test_suite`` argument. The loader you specify may interpret this - string in any way it likes, as there are no restrictions on what may be - contained in a ``test_suite`` string. - - The module name and class name must be separated by a ``:``. The default - value of this argument is ``"setuptools.command.test:ScanningLoader"``. 
If - you want to use the default ``unittest`` behavior, you can specify - ``"unittest:TestLoader"`` as your ``test_loader`` argument instead. This - will prevent automatic scanning of submodules and subpackages. - - The module and class you specify here may be contained in another package, - as long as you use the ``tests_require`` option to ensure that the package - containing the loader class is available when the ``test`` command is run. - -``eager_resources`` - A list of strings naming resources that should be extracted together, if - any of them is needed, or if any C extensions included in the project are - imported. This argument is only useful if the project will be installed as - a zipfile, and there is a need to have all of the listed resources be - extracted to the filesystem *as a unit*. Resources listed here - should be '/'-separated paths, relative to the source root, so to list a - resource ``foo.png`` in package ``bar.baz``, you would include the string - ``bar/baz/foo.png`` in this argument. - - If you only need to obtain resources one at a time, or you don't have any C - extensions that access other files in the project (such as data files or - shared libraries), you probably do NOT need this argument and shouldn't - mess with it. For more details on how this argument works, see the section - below on `Automatic Resource Extraction`_. - - -Using ``find_packages()`` -------------------------- - -For simple projects, it's usually easy enough to manually add packages to -the ``packages`` argument of ``setup()``. However, for very large projects -(Twisted, PEAK, Zope, Chandler, etc.), it can be a big burden to keep the -package list updated. That's what ``setuptools.find_packages()`` is for. - -``find_packages()`` takes a source directory, and a list of package names or -patterns to exclude. If omitted, the source directory defaults to the same -directory as the setup script. Some projects use a ``src`` or ``lib`` -directory as the root of their source tree, and those projects would of course -use ``"src"`` or ``"lib"`` as the first argument to ``find_packages()``. (And -such projects also need something like ``package_dir = {'':'src'}`` in their -``setup()`` arguments, but that's just a normal distutils thing.) - -Anyway, ``find_packages()`` walks the target directory, and finds Python -packages by looking for ``__init__.py`` files. It then filters the list of -packages using the exclusion patterns. - -Exclusion patterns are package names, optionally including wildcards. For -example, ``find_packages(exclude=["*.tests"])`` will exclude all packages whose -last name part is ``tests``. Or, ``find_packages(exclude=["*.tests", -"*.tests.*"])`` will also exclude any subpackages of packages named ``tests``, -but it still won't exclude a top-level ``tests`` package or the children -thereof. In fact, if you really want no ``tests`` packages at all, you'll need -something like this:: - - find_packages(exclude=["*.tests", "*.tests.*", "tests.*", "tests"]) - -in order to cover all the bases. Really, the exclusion patterns are intended -to cover simpler use cases than this, like excluding a single, specified -package and its subpackages. - -Regardless of the target directory or exclusions, the ``find_packages()`` -function returns a list of package names suitable for use as the ``packages`` -argument to ``setup()``, and so is usually the easiest way to set that -argument in your setup script. 
Especially since it frees you from having to -remember to modify your setup script whenever your project grows additional -top-level packages or subpackages. - - -Automatic Script Creation -========================= - -Packaging and installing scripts can be a bit awkward with the distutils. For -one thing, there's no easy way to have a script's filename match local -conventions on both Windows and POSIX platforms. For another, you often have -to create a separate file just for the "main" script, when your actual "main" -is a function in a module somewhere. And even in Python 2.4, using the ``-m`` -option only works for actual ``.py`` files that aren't installed in a package. - -``setuptools`` fixes all of these problems by automatically generating scripts -for you with the correct extension, and on Windows it will even create an -``.exe`` file so that users don't have to change their ``PATHEXT`` settings. -The way to use this feature is to define "entry points" in your setup script -that indicate what function the generated script should import and run. For -example, to create two console scripts called ``foo`` and ``bar``, and a GUI -script called ``baz``, you might do something like this:: - - setup( - # other arguments here... - entry_points = { - 'console_scripts': [ - 'foo = my_package.some_module:main_func', - 'bar = other_module:some_func', - ], - 'gui_scripts': [ - 'baz = my_package_gui:start_func', - ] - } - ) - -When this project is installed on non-Windows platforms (using "setup.py -install", "setup.py develop", or by using EasyInstall), a set of ``foo``, -``bar``, and ``baz`` scripts will be installed that import ``main_func`` and -``some_func`` from the specified modules. The functions you specify are called -with no arguments, and their return value is passed to ``sys.exit()``, so you -can return an errorlevel or message to print to stderr. - -On Windows, a set of ``foo.exe``, ``bar.exe``, and ``baz.exe`` launchers are -created, alongside a set of ``foo.py``, ``bar.py``, and ``baz.pyw`` files. The -``.exe`` wrappers find and execute the right version of Python to run the -``.py`` or ``.pyw`` file. - -You may define as many "console script" and "gui script" entry points as you -like, and each one can optionally specify "extras" that it depends on, that -will be added to ``sys.path`` when the script is run. For more information on -"extras", see the section below on `Declaring Extras`_. For more information -on "entry points" in general, see the section below on `Dynamic Discovery of -Services and Plugins`_. - - -"Eggsecutable" Scripts ----------------------- - -Occasionally, there are situations where it's desirable to make an ``.egg`` -file directly executable. You can do this by including an entry point such -as the following:: - - setup( - # other arguments here... - entry_points = { - 'setuptools.installation': [ - 'eggsecutable = my_package.some_module:main_func', - ] - } - ) - -Any eggs built from the above setup script will include a short executable -prelude that imports and calls ``main_func()`` from ``my_package.some_module``. -The prelude can be run on Unix-like platforms (including Mac and Linux) by -invoking the egg with ``/bin/sh``, or by enabling execute permissions on the -``.egg`` file. For the executable prelude to run, the appropriate version of -Python must be available via the ``PATH`` environment variable, under its -"long" name. That is, if the egg is built for Python 2.3, there must be a -``python2.3`` executable present in a directory on ``PATH``.
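As a minimal sketch (the module and function names are hypothetical, mirroring the ``my_package.some_module:main_func`` spec used in the console-script and eggsecutable examples above), such a target function is called with no arguments, and its return value is passed to ``sys.exit()`` by the generated wrapper::

    # my_package/some_module.py -- hypothetical module named in the examples above
    import sys

    def main_func():
        # The generated wrapper calls this function with no arguments,
        # so read any command-line input directly from sys.argv.
        name = sys.argv[1] if len(sys.argv) > 1 else "world"
        print("Hello, %s!" % name)
        return 0  # handed to sys.exit() as the process exit status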
- -This feature is primarily intended to support bootstrapping the installation of -setuptools itself on non-Windows platforms, but may also be useful for other -projects as well. - -IMPORTANT NOTE: Eggs with an "eggsecutable" header cannot be renamed, or -invoked via symlinks. They *must* be invoked using their original filename, in -order to ensure that, once running, ``pkg_resources`` will know what project -and version is in use. The header script will check this and exit with an -error if the ``.egg`` file has been renamed or is invoked via a symlink that -changes its base name. - - -Declaring Dependencies -====================== - -``setuptools`` supports automatically installing dependencies when a package is -installed, and including information about dependencies in Python Eggs (so that -package management tools like EasyInstall can use the information). - -``setuptools`` and ``pkg_resources`` use a common syntax for specifying a -project's required dependencies. This syntax consists of a project's PyPI -name, optionally followed by a comma-separated list of "extras" in square -brackets, optionally followed by a comma-separated list of version -specifiers. A version specifier is one of the operators ``<``, ``>``, ``<=``, -``>=``, ``==`` or ``!=``, followed by a version identifier. Tokens may be -separated by whitespace, but any whitespace or nonstandard characters within a -project name or version identifier must be replaced with ``-``. - -Version specifiers for a given project are internally sorted into ascending -version order, and used to establish what ranges of versions are acceptable. -Adjacent redundant conditions are also consolidated (e.g. ``">1, >2"`` becomes -``">1"``, and ``"<2,<3"`` becomes ``"<3"``). ``"!="`` versions are excised from -the ranges they fall within. A project's version is then checked for -membership in the resulting ranges. (Note that providing conflicting conditions -for the same version (e.g. "<2,>=2" or "==2,!=2") is meaningless and may -therefore produce bizarre results.) - -Here are some example requirement specifiers:: - - docutils >= 0.3 - - # comment lines and \ continuations are allowed in requirement strings - BazSpam ==1.1, ==1.2, ==1.3, ==1.4, ==1.5, \ - ==1.6, ==1.7 # and so are line-end comments - - PEAK[FastCGI, reST]>=0.5a4 - - setuptools==0.5a7 - -The simplest way to include requirement specifiers is to use the -``install_requires`` argument to ``setup()``. It takes a string or list of -strings containing requirement specifiers. If you include more than one -requirement in a string, each requirement must begin on a new line. - -This has three effects: - -1. When your project is installed, either by using EasyInstall, ``setup.py - install``, or ``setup.py develop``, all of the dependencies not already - installed will be located (via PyPI), downloaded, built (if necessary), - and installed. - -2. Any scripts in your project will be installed with wrappers that verify - the availability of the specified dependencies at runtime, and ensure that - the correct versions are added to ``sys.path`` (e.g. if multiple versions - have been installed). - -3. Python Egg distributions will include a metadata file listing the - dependencies. - -Note, by the way, that if you declare your dependencies in ``setup.py``, you do -*not* need to use the ``require()`` function in your scripts or modules, as -long as you either install the project or use ``setup.py develop`` to do -development work on it. 
(See `"Development Mode"`_ below for more details on -using ``setup.py develop``.) - - -Dependencies that aren't in PyPI --------------------------------- - -If your project depends on packages that aren't registered in PyPI, you may -still be able to depend on them, as long as they are available for download -as an egg, in the standard distutils ``sdist`` format, or as a single ``.py`` -file. You just need to add some URLs to the ``dependency_links`` argument to -``setup()``. - -The URLs must be either: - -1. direct download URLs, or -2. the URLs of web pages that contain direct download links - -In general, it's better to link to web pages, because it is usually less -complex to update a web page than to release a new version of your project. -You can also use a SourceForge ``showfiles.php`` link in the case where a -package you depend on is distributed via SourceForge. - -If you depend on a package that's distributed as a single ``.py`` file, you -must include an ``"#egg=project-version"`` suffix to the URL, to give a project -name and version number. (Be sure to escape any dashes in the name or version -by replacing them with underscores.) EasyInstall will recognize this suffix -and automatically create a trivial ``setup.py`` to wrap the single ``.py`` file -as an egg. - -The ``dependency_links`` option takes the form of a list of URL strings. For -example, the below will cause EasyInstall to search the specified page for -eggs or source distributions, if the package's dependencies aren't already -installed:: - - setup( - ... - dependency_links = [ - "http://peak.telecommunity.com/snapshots/" - ], - ) - - -.. _Declaring Extras: - - -Declaring "Extras" (optional features with their own dependencies) -------------------------------------------------------------------- - -Sometimes a project has "recommended" dependencies, that are not required for -all uses of the project. For example, a project might offer optional PDF -output if ReportLab is installed, and reStructuredText support if docutils is -installed. These optional features are called "extras", and setuptools allows -you to define their requirements as well. In this way, other projects that -require these optional features can force the additional requirements to be -installed, by naming the desired extras in their ``install_requires``. - -For example, let's say that Project A offers optional PDF and reST support:: - - setup( - name="Project-A", - ... - extras_require = { - 'PDF': ["ReportLab>=1.2", "RXP"], - 'reST': ["docutils>=0.3"], - } - ) - -As you can see, the ``extras_require`` argument takes a dictionary mapping -names of "extra" features, to strings or lists of strings describing those -features' requirements. These requirements will *not* be automatically -installed unless another package depends on them (directly or indirectly) by -including the desired "extras" in square brackets after the associated project -name. (Or if the extras were listed in a requirement spec on the EasyInstall -command line.) - -Extras can be used by a project's `entry points`_ to specify dynamic -dependencies. For example, if Project A includes a "rst2pdf" script, it might -declare it like this, so that the "PDF" requirements are only resolved if the -"rst2pdf" script is run:: - - setup( - name="Project-A", - ... - entry_points = { - 'console_scripts': [ - 'rst2pdf = project_a.tools.pdfgen [PDF]', - 'rst2html = project_a.tools.htmlgen'], - # more script entry points ...
- } - ) - -Projects can also use another project's extras when specifying dependencies. -For example, if project B needs "project A" with PDF support installed, it -might declare the dependency like this:: - - setup( - name="Project-B", - install_requires = ["Project-A[PDF]"], - ... - ) - -This will cause ReportLab to be installed along with project A, if project B is -installed -- even if project A was already installed. In this way, a project -can encapsulate groups of optional "downstream dependencies" under a feature -name, so that packages that depend on it don't have to know what the downstream -dependencies are. If a later version of Project A builds in PDF support and -no longer needs ReportLab, or if it ends up needing other dependencies besides -ReportLab in order to provide PDF support, Project B's setup information does -not need to change, but the right packages will still be installed if needed. - -Note, by the way, that if a project ends up not needing any other packages to -support a feature, it should keep an empty requirements list for that feature -in its ``extras_require`` argument, so that packages depending on that feature -don't break (due to an invalid feature name). For example, if Project A above -builds in PDF support and no longer needs ReportLab, it could change its -setup to this:: - - setup( - name="Project-A", - ... - extras_require = { - 'PDF': [], - 'reST': ["docutils>=0.3"], - } - ) - -so that Package B doesn't have to remove the ``[PDF]`` from its requirement -specifier. - - -Including Data Files -==================== - -The distutils have traditionally allowed installation of "data files", which -are placed in a platform-specific location. However, the most common use case -for data files distributed with a package is for use *by* the package, usually -by including the data files in the package directory. - -Setuptools offers three ways to specify data files to be included in your -packages. First, you can simply use the ``include_package_data`` keyword, -e.g.:: - - from setuptools import setup, find_packages - setup( - ... - include_package_data = True - ) - -This tells setuptools to install any data files it finds in your packages. The -data files must be under CVS or Subversion control, or else they must be -specified via the distutils' ``MANIFEST.in`` file. (They can also be tracked -by another revision control system, using an appropriate plugin. See the -section below on `Adding Support for Other Revision Control Systems`_ for -information on how to write such plugins.) - -If you want finer-grained control over what files are included (for example, if -you have documentation files in your package directories and want to exclude -them from installation), then you can also use the ``package_data`` keyword, -e.g.:: - - from setuptools import setup, find_packages - setup( - ... - package_data = { - # If any package contains *.txt or *.rst files, include them: - '': ['*.txt', '*.rst'], - # And include any *.msg files found in the 'hello' package, too: - 'hello': ['*.msg'], - } - ) - -The ``package_data`` argument is a dictionary that maps from package names to -lists of glob patterns. The globs may include subdirectory names, if the data -files are contained in a subdirectory of the package. For example, if the -package tree looks like this:: - - setup.py - src/ - mypkg/ - __init__.py - mypkg.txt - data/ - somefile.dat - otherdata.dat - -The setuptools setup file might look like this:: - - from setuptools import setup, find_packages - setup( - ... 
- packages = find_packages('src'), # include all packages under src - package_dir = {'':'src'}, # tell distutils packages are under src - - package_data = { - # If any package contains *.txt files, include them: - '': ['*.txt'], - # And include any *.dat files found in the 'data' subdirectory - # of the 'mypkg' package, also: - 'mypkg': ['data/*.dat'], - } - ) - -Notice that if you list patterns in ``package_data`` under the empty string, -these patterns are used to find files in every package, even ones that also -have their own patterns listed. Thus, in the above example, the ``mypkg.txt`` -file gets included even though it's not listed in the patterns for ``mypkg``. - -Also notice that if you use paths, you *must* use a forward slash (``/``) as -the path separator, even if you are on Windows. Setuptools automatically -converts slashes to appropriate platform-specific separators at build time. - -(Note: although the ``package_data`` argument was previously only available in -``setuptools``, it was also added to the Python ``distutils`` package as of -Python 2.4; there is `some documentation for the feature`__ available on the -python.org website.) - -__ http://docs.python.org/dist/node11.html - -Sometimes, the ``include_package_data`` or ``package_data`` options alone -aren't sufficient to precisely define what files you want included. For -example, you may want to include package README files in your revision control -system and source distributions, but exclude them from being installed. So, -setuptools offers an ``exclude_package_data`` option as well, that allows you -to do things like this:: - - from setuptools import setup, find_packages - setup( - ... - packages = find_packages('src'), # include all packages under src - package_dir = {'':'src'}, # tell distutils packages are under src - - include_package_data = True, # include everything in source control - - # ...but exclude README.txt from all packages - exclude_package_data = { '': ['README.txt'] }, - ) - -The ``exclude_package_data`` option is a dictionary mapping package names to -lists of wildcard patterns, just like the ``package_data`` option. And, just -as with that option, a key of ``''`` will apply the given pattern(s) to all -packages. However, any files that match these patterns will be *excluded* -from installation, even if they were listed in ``package_data`` or were -included as a result of using ``include_package_data``. - -In summary, the three options allow you to: - -``include_package_data`` - Accept all data files and directories matched by ``MANIFEST.in`` or found - in source control. - -``package_data`` - Specify additional patterns to match files and directories that may or may - not be matched by ``MANIFEST.in`` or found in source control. - -``exclude_package_data`` - Specify patterns for data files and directories that should *not* be - included when a package is installed, even if they would otherwise have - been included due to the use of the preceding options. - -NOTE: Due to the way the distutils build process works, a data file that you -include in your project and then stop including may be "orphaned" in your -project's build directories, requiring you to run ``setup.py clean --all`` to -fully remove them. This may also be important for your users and contributors -if they track intermediate revisions of your project using Subversion; be sure -to let them know when you make changes that remove files from inclusion so they -can run ``setup.py clean --all``. 
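Pulling the three options together, a minimal sketch (the project, package, and file names here are hypothetical) might accept everything tracked by revision control or ``MANIFEST.in``, add generated data files, and still keep READMEs out of the installed packages::

    from setuptools import setup, find_packages

    setup(
        name = "DataFilesExample",   # hypothetical project name
        version = "0.1",
        packages = find_packages('src'),
        package_dir = {'': 'src'},
        include_package_data = True,   # everything from MANIFEST.in or source control
        package_data = {'mypkg': ['data/*.dat']},        # plus generated *.dat files
        exclude_package_data = {'': ['README.txt']},     # but never install README.txt
    )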
- - -Accessing Data Files at Runtime -------------------------------- - -Typically, existing programs manipulate a package's ``__file__`` attribute in -order to find the location of data files. However, this manipulation isn't -compatible with PEP 302-based import hooks, including importing from zip files -and Python Eggs. It is strongly recommended that, if you are using data files, -you should use the `Resource Management API`_ of ``pkg_resources`` to access -them. The ``pkg_resources`` module is distributed as part of setuptools, so if -you're using setuptools to distribute your package, there is no reason not to -use its resource management API. See also `Accessing Package Resources`_ for -a quick example of converting code that uses ``__file__`` to use -``pkg_resources`` instead. - -.. _Resource Management API: http://peak.telecommunity.com/DevCenter/PythonEggs#resource-management -.. _Accessing Package Resources: http://peak.telecommunity.com/DevCenter/PythonEggs#accessing-package-resources - - -Non-Package Data Files ----------------------- - -The ``distutils`` normally install general "data files" to a platform-specific -location (e.g. ``/usr/share``). This feature is intended to be used for things -like documentation, example configuration files, and the like. ``setuptools`` -does not install these data files in a separate location, however. They are -bundled inside the egg file or directory, alongside the Python modules and -packages. The data files can also be accessed using the `Resource Management -API`_, by specifying a ``Requirement`` instead of a package name:: - - from pkg_resources import Requirement, resource_filename - filename = resource_filename(Requirement.parse("MyProject"),"sample.conf") - -The above code will obtain the filename of the "sample.conf" file in the data -root of the "MyProject" distribution. - -Note, by the way, that this encapsulation of data files means that you can't -actually install data files to some arbitrary location on a user's machine; -this is a feature, not a bug. You can always include a script in your -distribution that extracts and copies your documentation or data files to -a user-specified location, at their discretion. If you put related data files -in a single directory, you can use ``resource_filename()`` with the directory -name to get a filesystem directory that then can be copied with the ``shutil`` -module. (Even if your package is installed as a zipfile, calling -``resource_filename()`` on a directory will return an actual filesystem -directory, whose contents will be that entire subtree of your distribution.) - -(Of course, if you're writing a new package, you can just as easily place your -data files or directories inside one of your packages, rather than using the -distutils' approach. However, if you're updating an existing application, it -may be simpler not to change the way it currently specifies these data files.) - - -Automatic Resource Extraction ------------------------------ - -If you are using tools that expect your resources to be "real" files, or your -project includes non-extension native libraries or other files that your C -extensions expect to be able to access, you may need to list those files in -the ``eager_resources`` argument to ``setup()``, so that the files will be -extracted together, whenever a C extension in the project is imported.
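For instance, a minimal sketch (with a hypothetical project, package, and library name) for a project that bundles a shared library whose file extension is not auto-detected might declare::

    from setuptools import setup, find_packages

    setup(
        name = "NativeLibExample",   # hypothetical project name
        version = "0.1",
        packages = find_packages(),
        # '/'-separated paths relative to the source root; these files are
        # extracted together whenever any C extension in the project is imported.
        eager_resources = ['mypkg/libhelper.so.1', 'mypkg/helper.cfg'],
    )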
- -This is especially important if your project includes shared libraries *other* -than distutils-built C extensions, and those shared libraries use file -extensions other than ``.dll``, ``.so``, or ``.dylib``, which are the -extensions that setuptools 0.6a8 and higher automatically detects as shared -libraries and adds to the ``native_libs.txt`` file for you. Any shared -libraries whose names do not end with one of those extensions should be listed -as ``eager_resources``, because they need to be present in the filesystem when -the C extensions that link to them are used. - -The ``pkg_resources`` runtime for compressed packages will automatically -extract *all* C extensions and ``eager_resources`` at the same time, whenever -*any* C extension or eager resource is requested via the ``resource_filename()`` -API. (C extensions are imported using ``resource_filename()`` internally.) -This ensures that C extensions will see all of the "real" files that they -expect to see. - -Note also that you can list directory resource names in ``eager_resources`` as -well, in which case the directory's contents (including subdirectories) will be -extracted whenever any C extension or eager resource is requested. - -Please note that if you're not sure whether you need to use this argument, you -don't! It's really intended to support projects with lots of non-Python -dependencies and as a last resort for crufty projects that can't otherwise -handle being compressed. If your package is pure Python, Python plus data -files, or Python plus C, you really don't need this. You've got to be using -either C or an external program that needs "real" files in your project before -there's any possibility of ``eager_resources`` being relevant to your project. - - -Extensible Applications and Frameworks -====================================== - - -.. _Entry Points: - -Dynamic Discovery of Services and Plugins ------------------------------------------ - -``setuptools`` supports creating libraries that "plug in" to extensible -applications and frameworks, by letting you register "entry points" in your -project that can be imported by the application or framework. - -For example, suppose that a blogging tool wants to support plugins -that provide translation for various file types to the blog's output format. -The framework might define an "entry point group" called ``blogtool.parsers``, -and then allow plugins to register entry points for the file extensions they -support. - -This would allow people to create distributions that contain one or more -parsers for different file types, and then the blogging tool would be able to -find the parsers at runtime by looking up an entry point for the file -extension (or mime type, or however it wants to). - -Note that if the blogging tool includes parsers for certain file formats, it -can register these as entry points in its own setup script, which means it -doesn't have to special-case its built-in formats. They can just be treated -the same as any other plugin's entry points would be. - -If you're creating a project that plugs in to an existing application or -framework, you'll need to know what entry points or entry point groups are -defined by that application or framework. Then, you can register entry points -in your setup script. Here are a few examples of ways you might register an -``.rst`` file parser entry point in the ``blogtool.parsers`` entry point group, -for our hypothetical blogging tool::
- entry_points = {'blogtool.parsers': '.rst = some_module:SomeClass'} - ) - - setup( - # ... - entry_points = {'blogtool.parsers': ['.rst = some_module:a_func']} - ) - - setup( - # ... - entry_points = """ - [blogtool.parsers] - .rst = some.nested.module:SomeClass.some_classmethod [reST] - """, - extras_require = dict(reST = "Docutils>=0.3.5") - ) - -The ``entry_points`` argument to ``setup()`` accepts either a string with -``.ini``-style sections, or a dictionary mapping entry point group names to -either strings or lists of strings containing entry point specifiers. An -entry point specifier consists of a name and value, separated by an ``=`` -sign. The value consists of a dotted module name, optionally followed by a -``:`` and a dotted identifier naming an object within the module. It can -also include a bracketed list of "extras" that are required for the entry -point to be used. When the invoking application or framework requests loading -of an entry point, any requirements implied by the associated extras will be -passed to ``pkg_resources.require()``, so that an appropriate error message -can be displayed if the needed package(s) are missing. (Of course, the -invoking app or framework can ignore such errors if it wants to make an entry -point optional if a requirement isn't installed.) - - -Defining Additional Metadata ----------------------------- - -Some extensible applications and frameworks may need to define their own kinds -of metadata to include in eggs, which they can then access using the -``pkg_resources`` metadata APIs. Ordinarily, this is done by having plugin -developers include additional files in their ``ProjectName.egg-info`` -directory. However, since it can be tedious to create such files by hand, you -may want to create a distutils extension that will create the necessary files -from arguments to ``setup()``, in much the same way that ``setuptools`` does -for many of the ``setup()`` arguments it adds. See the section below on -`Creating distutils Extensions`_ for more details, especially the subsection on -`Adding new EGG-INFO Files`_. - - -"Development Mode" -================== - -Under normal circumstances, the ``distutils`` assume that you are going to -build a distribution of your project, not use it in its "raw" or "unbuilt" -form. If you were to use the ``distutils`` that way, you would have to rebuild -and reinstall your project every time you made a change to it during -development. - -Another problem that sometimes comes up with the ``distutils`` is that you may -need to do development on two related projects at the same time. You may need -to put both projects' packages in the same directory to run them, but need to -keep them separate for revision control purposes. How can you do this? - -Setuptools allows you to deploy your projects for use in a common directory or -staging area, but without copying any files. Thus, you can edit each project's -code in its checkout directory, and only need to run build commands when you -change a project's C extensions or similarly compiled files. You can even -deploy a project into another project's checkout directory, if that's your -preferred way of working (as opposed to using a common independent staging area -or the site-packages directory). - -To do this, use the ``setup.py develop`` command. It works very similarly to -``setup.py install`` or the EasyInstall tool, except that it doesn't actually -install anything. 
Instead, it creates a special ``.egg-link`` file in the
-deployment directory that links to your project's source code. And, if your
-deployment directory is Python's ``site-packages`` directory, it will also
-update the ``easy-install.pth`` file to include your project's source code,
-thereby making it available on ``sys.path`` for all programs using that Python
-installation.
-
-In addition, the ``develop`` command creates wrapper scripts in the target
-script directory that will run your in-development scripts after ensuring that
-all your ``install_requires`` packages are available on ``sys.path``.
-
-You can deploy the same project to multiple staging areas, e.g. if you have
-multiple projects on the same machine that all depend on the project you're
-doing development work on.
-
-When you're done with a given development task, you can remove the project
-source from a staging area using ``setup.py develop --uninstall``, specifying
-the desired staging area if it's not the default.
-
-There are several options to control the precise behavior of the ``develop``
-command; see the section on the `develop`_ command below for more details.
-
-Note that you can also apply setuptools commands to non-setuptools projects,
-using commands like this::
-
-    python -c "import setuptools; execfile('setup.py')" develop
-
-That is, you can simply list the normal setup commands and options following
-the quoted part.
-
-
-Distributing a ``setuptools``-based project
-===========================================
-
-Using ``setuptools``... Without bundling it!
---------------------------------------------
-
-Your users might not have ``setuptools`` installed on their machines, or even
-if they do, it might not be the right version. Fixing this is easy; just
-download `ez_setup.py`_, and put it in the same directory as your ``setup.py``
-script. (Be sure to add it to your revision control system, too.) Then add
-these two lines to the very top of your setup script, before the script imports
-anything from setuptools::
-
-    import ez_setup
-    ez_setup.use_setuptools()
-
-That's it. The ``ez_setup`` module will automatically download a matching
-version of ``setuptools`` from PyPI, if it isn't present on the target system.
-Whenever you install an updated version of setuptools, you should also update
-your projects' ``ez_setup.py`` files, so that a matching version gets installed
-on the target machine(s).
-
-By the way, setuptools supports the new PyPI "upload" command, so you can use
-``setup.py sdist upload`` or ``setup.py bdist_egg upload`` to upload your
-source or egg distributions, respectively. Your project's current version must
-be registered with PyPI first, of course; you can use ``setup.py register`` to
-do that. Or you can do it all in one step, e.g. ``setup.py register sdist
-bdist_egg upload`` will register the package, build source and egg
-distributions, and then upload them both to PyPI, where they'll be easily
-found by other projects that depend on them.
-
-(By the way, if you need to distribute a specific version of ``setuptools``,
-you can specify the exact version and base download URL as parameters to the
-``use_setuptools()`` function. See the function's docstring for details.)
-
-
-What Your Users Should Know
----------------------------
-
-In general, a setuptools-based project looks just like any distutils-based
-project -- as long as your users have an internet connection and are installing
-to ``site-packages``, that is.
But for some users, these conditions don't -apply, and they may become frustrated if this is their first encounter with -a setuptools-based project. To keep these users happy, you should review the -following topics in your project's installation instructions, if they are -relevant to your project and your target audience isn't already familiar with -setuptools and ``easy_install``. - -Network Access - If your project is using ``ez_setup``, you should inform users of the need - to either have network access, or to preinstall the correct version of - setuptools using the `EasyInstall installation instructions`_. Those - instructions also have tips for dealing with firewalls as well as how to - manually download and install setuptools. - -Custom Installation Locations - You should inform your users that if they are installing your project to - somewhere other than the main ``site-packages`` directory, they should - first install setuptools using the instructions for `Custom Installation - Locations`_, before installing your project. - -Your Project's Dependencies - If your project depends on other projects that may need to be downloaded - from PyPI or elsewhere, you should list them in your installation - instructions, or tell users how to find out what they are. While most - users will not need this information, any users who don't have unrestricted - internet access may have to find, download, and install the other projects - manually. (Note, however, that they must still install those projects - using ``easy_install``, or your project will not know they are installed, - and your setup script will try to download them again.) - - If you want to be especially friendly to users with limited network access, - you may wish to build eggs for your project and its dependencies, making - them all available for download from your site, or at least create a page - with links to all of the needed eggs. In this way, users with limited - network access can manually download all the eggs to a single directory, - then use the ``-f`` option of ``easy_install`` to specify the directory - to find eggs in. Users who have full network access can just use ``-f`` - with the URL of your download page, and ``easy_install`` will find all the - needed eggs using your links directly. This is also useful when your - target audience isn't able to compile packages (e.g. most Windows users) - and your package or some of its dependencies include C code. - -Subversion or CVS Users and Co-Developers - Users and co-developers who are tracking your in-development code using - CVS, Subversion, or some other revision control system should probably read - this manual's sections regarding such development. Alternately, you may - wish to create a quick-reference guide containing the tips from this manual - that apply to your particular situation. For example, if you recommend - that people use ``setup.py develop`` when tracking your in-development - code, you should let them know that this needs to be run after every update - or commit. - - Similarly, if you remove modules or data files from your project, you - should remind them to run ``setup.py clean --all`` and delete any obsolete - ``.pyc`` or ``.pyo``. (This tip applies to the distutils in general, not - just setuptools, but not everybody knows about them; be kind to your users - by spelling out your project's best practices rather than leaving them - guessing.) 
- -Creating System Packages - Some users want to manage all Python packages using a single package - manager, and sometimes that package manager isn't ``easy_install``! - Setuptools currently supports ``bdist_rpm``, ``bdist_wininst``, and - ``bdist_dumb`` formats for system packaging. If a user has a locally- - installed "bdist" packaging tool that internally uses the distutils - ``install`` command, it should be able to work with ``setuptools``. Some - examples of "bdist" formats that this should work with include the - ``bdist_nsi`` and ``bdist_msi`` formats for Windows. - - However, packaging tools that build binary distributions by running - ``setup.py install`` on the command line or as a subprocess will require - modification to work with setuptools. They should use the - ``--single-version-externally-managed`` option to the ``install`` command, - combined with the standard ``--root`` or ``--record`` options. - See the `install command`_ documentation below for more details. The - ``bdist_deb`` command is an example of a command that currently requires - this kind of patching to work with setuptools. - - If you or your users have a problem building a usable system package for - your project, please report the problem via the `mailing list`_ so that - either the "bdist" tool in question or setuptools can be modified to - resolve the issue. - - - -Managing Multiple Projects --------------------------- - -If you're managing several projects that need to use ``ez_setup``, and you are -using Subversion as your revision control system, you can use the -"svn:externals" property to share a single copy of ``ez_setup`` between -projects, so that it will always be up-to-date whenever you check out or update -an individual project, without having to manually update each project to use -a new version. - -However, because Subversion only supports using directories as externals, you -have to turn ``ez_setup.py`` into ``ez_setup/__init__.py`` in order to do this, -then create "externals" definitions that map the ``ez_setup`` directory into -each project. Also, if any of your projects use ``find_packages()`` on their -setup directory, you will need to exclude the resulting ``ez_setup`` package, -to keep it from being included in your distributions, e.g.:: - - setup( - ... - packages = find_packages(exclude=['ez_setup']), - ) - -Of course, the ``ez_setup`` package will still be included in your packages' -source distributions, as it needs to be. - -For your convenience, you may use the following external definition, which will -track the latest version of setuptools:: - - ez_setup svn://svn.eby-sarna.com/svnroot/ez_setup - -You can set this by executing this command in your project directory:: - - svn propedit svn:externals . - -And then adding the line shown above to the file that comes up for editing. - - -Setting the ``zip_safe`` flag ------------------------------ - -For maximum performance, Python packages are best installed as zip files. -Not all packages, however, are capable of running in compressed form, because -they may expect to be able to access either source code or data files as -normal operating system files. So, ``setuptools`` can install your project -as a zipfile or a directory, and its default choice is determined by the -project's ``zip_safe`` flag. - -You can pass a True or False value for the ``zip_safe`` argument to the -``setup()`` function, or you can omit it. 
If you omit it, the ``bdist_egg`` -command will analyze your project's contents to see if it can detect any -conditions that would prevent it from working in a zipfile. It will output -notices to the console about any such conditions that it finds. - -Currently, this analysis is extremely conservative: it will consider the -project unsafe if it contains any C extensions or datafiles whatsoever. This -does *not* mean that the project can't or won't work as a zipfile! It just -means that the ``bdist_egg`` authors aren't yet comfortable asserting that -the project *will* work. If the project contains no C or data files, and does -no ``__file__`` or ``__path__`` introspection or source code manipulation, then -there is an extremely solid chance the project will work when installed as a -zipfile. (And if the project uses ``pkg_resources`` for all its data file -access, then C extensions and other data files shouldn't be a problem at all. -See the `Accessing Data Files at Runtime`_ section above for more information.) - -However, if ``bdist_egg`` can't be *sure* that your package will work, but -you've checked over all the warnings it issued, and you are either satisfied it -*will* work (or if you want to try it for yourself), then you should set -``zip_safe`` to ``True`` in your ``setup()`` call. If it turns out that it -doesn't work, you can always change it to ``False``, which will force -``setuptools`` to install your project as a directory rather than as a zipfile. - -Of course, the end-user can still override either decision, if they are using -EasyInstall to install your package. And, if you want to override for testing -purposes, you can just run ``setup.py easy_install --zip-ok .`` or ``setup.py -easy_install --always-unzip .`` in your project directory. to install the -package as a zipfile or directory, respectively. - -In the future, as we gain more experience with different packages and become -more satisfied with the robustness of the ``pkg_resources`` runtime, the -"zip safety" analysis may become less conservative. However, we strongly -recommend that you determine for yourself whether your project functions -correctly when installed as a zipfile, correct any problems if you can, and -then make an explicit declaration of ``True`` or ``False`` for the ``zip_safe`` -flag, so that it will not be necessary for ``bdist_egg`` or ``EasyInstall`` to -try to guess whether your project can work as a zipfile. - - -Namespace Packages ------------------- - -Sometimes, a large package is more useful if distributed as a collection of -smaller eggs. However, Python does not normally allow the contents of a -package to be retrieved from more than one location. "Namespace packages" -are a solution for this problem. When you declare a package to be a namespace -package, it means that the package has no meaningful contents in its -``__init__.py``, and that it is merely a container for modules and subpackages. - -The ``pkg_resources`` runtime will then automatically ensure that the contents -of namespace packages that are spread over multiple eggs or directories are -combined into a single "virtual" package. - -The ``namespace_packages`` argument to ``setup()`` lets you declare your -project's namespace packages, so that they will be included in your project's -metadata. The argument should list the namespace packages that the egg -participates in. For example, the ZopeInterface project might do this:: - - setup( - # ... 
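-        # List every namespace package this distribution contributes modules
-        # to; here the project provides a subpackage of the 'zope' namespace.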
- namespace_packages = ['zope'] - ) - -because it contains a ``zope.interface`` package that lives in the ``zope`` -namespace package. Similarly, a project for a standalone ``zope.publisher`` -would also declare the ``zope`` namespace package. When these projects are -installed and used, Python will see them both as part of a "virtual" ``zope`` -package, even though they will be installed in different locations. - -Namespace packages don't have to be top-level packages. For example, Zope 3's -``zope.app`` package is a namespace package, and in the future PEAK's -``peak.util`` package will be too. - -Note, by the way, that your project's source tree must include the namespace -packages' ``__init__.py`` files (and the ``__init__.py`` of any parent -packages), in a normal Python package layout. These ``__init__.py`` files -*must* contain the line:: - - __import__('pkg_resources').declare_namespace(__name__) - -This code ensures that the namespace package machinery is operating and that -the current package is registered as a namespace package. - -You must NOT include any other code and data in a namespace package's -``__init__.py``. Even though it may appear to work during development, or when -projects are installed as ``.egg`` files, it will not work when the projects -are installed using "system" packaging tools -- in such cases the -``__init__.py`` files will not be installed, let alone executed. - -You must include the ``declare_namespace()`` line in the ``__init__.py`` of -*every* project that has contents for the namespace package in question, in -order to ensure that the namespace will be declared regardless of which -project's copy of ``__init__.py`` is loaded first. If the first loaded -``__init__.py`` doesn't declare it, it will never *be* declared, because no -other copies will ever be loaded!) - - -TRANSITIONAL NOTE -~~~~~~~~~~~~~~~~~ - -Setuptools 0.6a automatically calls ``declare_namespace()`` for you at runtime, -but the 0.7a versions will *not*. This is because the automatic declaration -feature has some negative side effects, such as needing to import all namespace -packages during the initialization of the ``pkg_resources`` runtime, and also -the need for ``pkg_resources`` to be explicitly imported before any namespace -packages work at all. Beginning with the 0.7a releases, you'll be responsible -for including your own declaration lines, and the automatic declaration feature -will be dropped to get rid of the negative side effects. - -During the remainder of the 0.6 development cycle, therefore, setuptools will -warn you about missing ``declare_namespace()`` calls in your ``__init__.py`` -files, and you should correct these as soon as possible before setuptools 0.7a1 -is released. Namespace packages without declaration lines will not work -correctly once a user has upgraded to setuptools 0.7a1, so it's important that -you make this change now in order to avoid having your code break in the field. -Our apologies for the inconvenience, and thank you for your patience. - - - -Tagging and "Daily Build" or "Snapshot" Releases ------------------------------------------------- - -When a set of related projects are under development, it may be important to -track finer-grained version increments than you would normally use for e.g. -"stable" releases. While stable releases might be measured in dotted numbers -with alpha/beta/etc. status codes, development versions of a project often -need to be tracked by revision or build number or even build date. 
This is -especially true when projects in development need to refer to one another, and -therefore may literally need an up-to-the-minute version of something! - -To support these scenarios, ``setuptools`` allows you to "tag" your source and -egg distributions by adding one or more of the following to the project's -"official" version identifier: - -* A manually-specified pre-release tag, such as "build" or "dev", or a - manually-specified post-release tag, such as a build or revision number - (``--tag-build=STRING, -bSTRING``) - -* A "last-modified revision number" string generated automatically from - Subversion's metadata (assuming your project is being built from a Subversion - "working copy") (``--tag-svn-revision, -r``) - -* An 8-character representation of the build date (``--tag-date, -d``), as - a postrelease tag - -You can add these tags by adding ``egg_info`` and the desired options to -the command line ahead of the ``sdist`` or ``bdist`` commands that you want -to generate a daily build or snapshot for. See the section below on the -`egg_info`_ command for more details. - -(Also, before you release your project, be sure to see the section above on -`Specifying Your Project's Version`_ for more information about how pre- and -post-release tags affect how setuptools and EasyInstall interpret version -numbers. This is important in order to make sure that dependency processing -tools will know which versions of your project are newer than others.) - -Finally, if you are creating builds frequently, and either building them in a -downloadable location or are copying them to a distribution server, you should -probably also check out the `rotate`_ command, which lets you automatically -delete all but the N most-recently-modified distributions matching a glob -pattern. So, you can use a command line like:: - - setup.py egg_info -rbDEV bdist_egg rotate -m.egg -k3 - -to build an egg whose version info includes 'DEV-rNNNN' (where NNNN is the -most recent Subversion revision that affected the source tree), and then -delete any egg files from the distribution directory except for the three -that were built most recently. - -If you have to manage automated builds for multiple packages, each with -different tagging and rotation policies, you may also want to check out the -`alias`_ command, which would let each package define an alias like ``daily`` -that would perform the necessary tag, build, and rotate commands. Then, a -simpler script or cron job could just run ``setup.py daily`` in each project -directory. (And, you could also define sitewide or per-user default versions -of the ``daily`` alias, so that projects that didn't define their own would -use the appropriate defaults.) - - -Generating Source Distributions -------------------------------- - -``setuptools`` enhances the distutils' default algorithm for source file -selection, so that all files managed by CVS or Subversion in your project tree -are included in any source distribution you build. This is a big improvement -over having to manually write a ``MANIFEST.in`` file and try to keep it in -sync with your project. So, if you are using CVS or Subversion, and your -source distributions only need to include files that you're tracking in -revision control, don't create a a ``MANIFEST.in`` file for your project. -(And, if you already have one, you might consider deleting it the next time -you would otherwise have to change it.) 
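-
-For example, in a Subversion or CVS working copy, building a source
-distribution is normally just a matter of running::
-
-    python setup.py sdist
-
-and you can then inspect the generated ``ProjectName.egg-info/SOURCES.txt``
-(where ``ProjectName`` stands in for your project's name) to see exactly which
-revision-controlled files were picked up.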
- -(NOTE: other revision control systems besides CVS and Subversion can be -supported using plugins; see the section below on `Adding Support for Other -Revision Control Systems`_ for information on how to write such plugins.) - -If you need to include automatically generated files, or files that are kept in -an unsupported revision control system, you'll need to create a ``MANIFEST.in`` -file to specify any files that the default file location algorithm doesn't -catch. See the distutils documentation for more information on the format of -the ``MANIFEST.in`` file. - -But, be sure to ignore any part of the distutils documentation that deals with -``MANIFEST`` or how it's generated from ``MANIFEST.in``; setuptools shields you -from these issues and doesn't work the same way in any case. Unlike the -distutils, setuptools regenerates the source distribution manifest file -every time you build a source distribution, and it builds it inside the -project's ``.egg-info`` directory, out of the way of your main project -directory. You therefore need not worry about whether it is up-to-date or not. - -Indeed, because setuptools' approach to determining the contents of a source -distribution is so much simpler, its ``sdist`` command omits nearly all of -the options that the distutils' more complex ``sdist`` process requires. For -all practical purposes, you'll probably use only the ``--formats`` option, if -you use any option at all. - -(By the way, if you're using some other revision control system, you might -consider creating and publishing a `revision control plugin for setuptools`_.) - - -.. _revision control plugin for setuptools: `Adding Support for Other Revision Control Systems`_ - - -Making your package available for EasyInstall ---------------------------------------------- - -If you use the ``register`` command (``setup.py register``) to register your -package with PyPI, that's most of the battle right there. (See the -`docs for the register command`_ for more details.) - -.. _docs for the register command: http://docs.python.org/dist/package-index.html - -If you also use the `upload`_ command to upload actual distributions of your -package, that's even better, because EasyInstall will be able to find and -download them directly from your project's PyPI page. - -However, there may be reasons why you don't want to upload distributions to -PyPI, and just want your existing distributions (or perhaps a Subversion -checkout) to be used instead. - -So here's what you need to do before running the ``register`` command. There -are three ``setup()`` arguments that affect EasyInstall: - -``url`` and ``download_url`` - These become links on your project's PyPI page. EasyInstall will examine - them to see if they link to a package ("primary links"), or whether they are - HTML pages. If they're HTML pages, EasyInstall scans all HREF's on the - page for primary links - -``long_description`` - EasyInstall will check any URLs contained in this argument to see if they - are primary links. - -A URL is considered a "primary link" if it is a link to a .tar.gz, .tgz, .zip, -.egg, .egg.zip, .tar.bz2, or .exe file, or if it has an ``#egg=project`` or -``#egg=project-version`` fragment identifier attached to it. EasyInstall -attempts to determine a project name and optional version number from the text -of a primary link *without* downloading it. When it has found all the primary -links, EasyInstall will select the best match based on requested version, -platform compatibility, and other criteria. 
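-
-For illustration, here is a sketch of how the ``url``, ``download_url``, and
-``long_description`` arguments might carry primary links; the project name and
-URLs below are made up::
-
-    setup(
-        name = "ProjectName",
-        version = "1.0",
-        # an HTML page; EasyInstall scans its HREF's for primary links
-        url = "http://example.com/ProjectName/",
-        # a direct ("primary") link to a source distribution
-        download_url = "http://example.com/dist/ProjectName-1.0.tar.gz",
-        # URLs in the long description are also checked for primary links
-        long_description = "Development snapshots: "
-            "http://svn.example.com/ProjectName/trunk#egg=ProjectName-dev",
-    )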
- -So, if your ``url`` or ``download_url`` point either directly to a downloadable -source distribution, or to HTML page(s) that have direct links to such, then -EasyInstall will be able to locate downloads automatically. If you want to -make Subversion checkouts available, then you should create links with either -``#egg=project`` or ``#egg=project-version`` added to the URL. You should -replace ``project`` and ``version`` with the values they would have in an egg -filename. (Be sure to actually generate an egg and then use the initial part -of the filename, rather than trying to guess what the escaped form of the -project name and version number will be.) - -Note that Subversion checkout links are of lower precedence than other kinds -of distributions, so EasyInstall will not select a Subversion checkout for -downloading unless it has a version included in the ``#egg=`` suffix, and -it's a higher version than EasyInstall has seen in any other links for your -project. - -As a result, it's a common practice to use mark checkout URLs with a version of -"dev" (i.e., ``#egg=projectname-dev``), so that users can do something like -this:: - - easy_install --editable projectname==dev - -in order to check out the in-development version of ``projectname``. - - -Managing "Continuous Releases" Using Subversion ------------------------------------------------ - -If you expect your users to track in-development versions of your project via -Subversion, there are a few additional steps you should take to ensure that -things work smoothly with EasyInstall. First, you should add the following -to your project's ``setup.cfg`` file:: - - [egg_info] - tag_build = .dev - tag_svn_revision = 1 - -This will tell ``setuptools`` to generate package version numbers like -``1.0a1.dev-r1263``, which will be considered to be an *older* release than -``1.0a1``. Thus, when you actually release ``1.0a1``, the entire egg -infrastructure (including ``setuptools``, ``pkg_resources`` and EasyInstall) -will know that ``1.0a1`` supersedes any interim snapshots from Subversion, and -handle upgrades accordingly. - -(Note: the project version number you specify in ``setup.py`` should always be -the *next* version of your software, not the last released version. -Alternately, you can leave out the ``tag_build=.dev``, and always use the -*last* release as a version number, so that your post-1.0 builds are labelled -``1.0-r1263``, indicating a post-1.0 patchlevel. Most projects so far, -however, seem to prefer to think of their project as being a future version -still under development, rather than a past version being patched. It is of -course possible for a single project to have both situations, using -post-release numbering on release branches, and pre-release numbering on the -trunk. But you don't have to make things this complex if you don't want to.) - -Commonly, projects releasing code from Subversion will include a PyPI link to -their checkout URL (as described in the previous section) with an -``#egg=projectname-dev`` suffix. This allows users to request EasyInstall -to download ``projectname==dev`` in order to get the latest in-development -code. 
Note that if your project depends on such in-progress code, you may wish -to specify your ``install_requires`` (or other requirements) to include -``==dev``, e.g.:: - - install_requires = ["OtherProject>=0.2a1.dev-r143,==dev"] - -The above example says, "I really want at least this particular development -revision number, but feel free to follow and use an ``#egg=OtherProject-dev`` -link if you find one". This avoids the need to have actual source or binary -distribution snapshots of in-development code available, just to be able to -depend on the latest and greatest a project has to offer. - -A final note for Subversion development: if you are using SVN revision tags -as described in this section, it's a good idea to run ``setup.py develop`` -after each Subversion checkin or update, because your project's version number -will be changing, and your script wrappers need to be updated accordingly. - -Also, if the project's requirements have changed, the ``develop`` command will -take care of fetching the updated dependencies, building changed extensions, -etc. Be sure to also remind any of your users who check out your project -from Subversion that they need to run ``setup.py develop`` after every update -in order to keep their checkout completely in sync. - - -Making "Official" (Non-Snapshot) Releases -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -When you make an official release, creating source or binary distributions, -you will need to override the tag settings from ``setup.cfg``, so that you -don't end up registering versions like ``foobar-0.7a1.dev-r34832``. This is -easy to do if you are developing on the trunk and using tags or branches for -your releases - just make the change to ``setup.cfg`` after branching or -tagging the release, so the trunk will still produce development snapshots. - -Alternately, if you are not branching for releases, you can override the -default version options on the command line, using something like:: - - python setup.py egg_info -RDb "" sdist bdist_egg register upload - -The first part of this command (``egg_info -RDb ""``) will override the -configured tag information, before creating source and binary eggs, registering -the project with PyPI, and uploading the files. Thus, these commands will use -the plain version from your ``setup.py``, without adding the Subversion -revision number or build designation string. - -Of course, if you will be doing this a lot, you may wish to create a personal -alias for this operation, e.g.:: - - python setup.py alias -u release egg_info -RDb "" - -You can then use it like this:: - - python setup.py release sdist bdist_egg register upload - -Or of course you can create more elaborate aliases that do all of the above. -See the sections below on the `egg_info`_ and `alias`_ commands for more ideas. - - - -Distributing Extensions compiled with Pyrex -------------------------------------------- - -``setuptools`` includes transparent support for building Pyrex extensions, as -long as you define your extensions using ``setuptools.Extension``, *not* -``distutils.Extension``. You must also not import anything from Pyrex in -your setup script. - -If you follow these rules, you can safely list ``.pyx`` files as the source -of your ``Extension`` objects in the setup script. ``setuptools`` will detect -at build time whether Pyrex is installed or not. If it is, then ``setuptools`` -will use it. 
If not, then ``setuptools`` will silently change the
-``Extension`` objects to refer to the ``.c`` counterparts of the ``.pyx``
-files, so that the normal distutils C compilation process will occur.
-
-Of course, for this to work, your source distributions must include the C
-code generated by Pyrex, as well as your original ``.pyx`` files. This means
-that you will probably want to include current ``.c`` files in your revision
-control system, rebuilding them whenever you check changes in for the ``.pyx``
-source files. This will ensure that people tracking your project in CVS or
-Subversion will be able to build it even if they don't have Pyrex installed,
-and that your source releases will be similarly usable with or without Pyrex.
-
-
------------------
-Command Reference
------------------
-
-.. _alias:
-
-``alias`` - Define shortcuts for commonly used commands
-=======================================================
-
-Sometimes, you need to use the same commands over and over, but you can't
-necessarily set them as defaults. For example, if you produce both development
-snapshot releases and "stable" releases of a project, you may want to put
-the distributions in different places, or use different ``egg_info`` tagging
-options, etc. In these cases, it doesn't make sense to set the options in
-a distutils configuration file, because the values of the options change based
-on what you're trying to do.
-
-Setuptools therefore allows you to define "aliases" - shortcut names for
-an arbitrary string of commands and options, using ``setup.py alias aliasname
-expansion``, where aliasname is the name of the new alias, and the remainder of
-the command line supplies its expansion. For example, this command defines
-a sitewide alias called "daily" that sets various ``egg_info`` tagging
-options::
-
-    setup.py alias --global-config daily egg_info --tag-svn-revision \
-        --tag-build=development
-
-Once the alias is defined, it can then be used with other setup commands,
-e.g.::
-
-    setup.py daily bdist_egg        # generate a daily-build .egg file
-    setup.py daily sdist            # generate a daily-build source distro
-    setup.py daily sdist bdist_egg  # generate both
-
-The above commands are interpreted as if the word ``daily`` were replaced with
-``egg_info --tag-svn-revision --tag-build=development``.
-
-Note that setuptools will expand each alias *at most once* in a given command
-line. This serves two purposes. First, if you accidentally create an alias
-loop, it will have no effect; you'll instead get an error message about an
-unknown command. Second, it allows you to define an alias for a command that
-uses that command. For example, this (project-local) alias::
-
-    setup.py alias bdist_egg bdist_egg rotate -k1 -m.egg
-
-redefines the ``bdist_egg`` command so that it always runs the ``rotate``
-command afterwards to delete all but the newest egg file. It doesn't loop
-indefinitely on ``bdist_egg`` because the alias is only expanded once when
-used.
-
-You can remove a defined alias with the ``--remove`` (or ``-r``) option, e.g.::
-
-    setup.py alias --global-config --remove daily
-
-would delete the "daily" alias we defined above.
-
-Aliases can be defined on a project-specific, per-user, or sitewide basis. The
-default is to define or remove a project-specific alias, but you can use any of
-the `configuration file options`_ (listed under the `saveopts`_ command, below)
-to determine which distutils configuration file an alias will be added to
-(or removed from).
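-
-For example, to define the same "daily" alias shown above for your own user
-account only, rather than site-wide, you could combine it with the per-user
-configuration option (described under the `saveopts`_ command, below)::
-
-    setup.py alias --user-config daily egg_info --tag-svn-revision \
-        --tag-build=development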
- -Note that if you omit the "expansion" argument to the ``alias`` command, -you'll get output showing that alias' current definition (and what -configuration file it's defined in). If you omit the alias name as well, -you'll get a listing of all current aliases along with their configuration -file locations. - - -``bdist_egg`` - Create a Python Egg for the project -=================================================== - -This command generates a Python Egg (``.egg`` file) for the project. Python -Eggs are the preferred binary distribution format for EasyInstall, because they -are cross-platform (for "pure" packages), directly importable, and contain -project metadata including scripts and information about the project's -dependencies. They can be simply downloaded and added to ``sys.path`` -directly, or they can be placed in a directory on ``sys.path`` and then -automatically discovered by the egg runtime system. - -This command runs the `egg_info`_ command (if it hasn't already run) to update -the project's metadata (``.egg-info``) directory. If you have added any extra -metadata files to the ``.egg-info`` directory, those files will be included in -the new egg file's metadata directory, for use by the egg runtime system or by -any applications or frameworks that use that metadata. - -You won't usually need to specify any special options for this command; just -use ``bdist_egg`` and you're done. But there are a few options that may -be occasionally useful: - -``--dist-dir=DIR, -d DIR`` - Set the directory where the ``.egg`` file will be placed. If you don't - supply this, then the ``--dist-dir`` setting of the ``bdist`` command - will be used, which is usually a directory named ``dist`` in the project - directory. - -``--plat-name=PLATFORM, -p PLATFORM`` - Set the platform name string that will be embedded in the egg's filename - (assuming the egg contains C extensions). This can be used to override - the distutils default platform name with something more meaningful. Keep - in mind, however, that the egg runtime system expects to see eggs with - distutils platform names, so it may ignore or reject eggs with non-standard - platform names. Similarly, the EasyInstall program may ignore them when - searching web pages for download links. However, if you are - cross-compiling or doing some other unusual things, you might find a use - for this option. - -``--exclude-source-files`` - Don't include any modules' ``.py`` files in the egg, just compiled Python, - C, and data files. (Note that this doesn't affect any ``.py`` files in the - EGG-INFO directory or its subdirectories, since for example there may be - scripts with a ``.py`` extension which must still be retained.) We don't - recommend that you use this option except for packages that are being - bundled for proprietary end-user applications, or for "embedded" scenarios - where space is at an absolute premium. On the other hand, if your package - is going to be installed and used in compressed form, you might as well - exclude the source because Python's ``traceback`` module doesn't currently - understand how to display zipped source code anyway, or how to deal with - files that are in a different place from where their code was compiled. - -There are also some options you will probably never need, but which are there -because they were copied from similar ``bdist`` commands used as an example for -creating this one. 
They may be useful for testing and debugging, however, -which is why we kept them: - -``--keep-temp, -k`` - Keep the contents of the ``--bdist-dir`` tree around after creating the - ``.egg`` file. - -``--bdist-dir=DIR, -b DIR`` - Set the temporary directory for creating the distribution. The entire - contents of this directory are zipped to create the ``.egg`` file, after - running various installation commands to copy the package's modules, data, - and extensions here. - -``--skip-build`` - Skip doing any "build" commands; just go straight to the - install-and-compress phases. - - -.. _develop: - -``develop`` - Deploy the project source in "Development Mode" -============================================================= - -This command allows you to deploy your project's source for use in one or more -"staging areas" where it will be available for importing. This deployment is -done in such a way that changes to the project source are immediately available -in the staging area(s), without needing to run a build or install step after -each change. - -The ``develop`` command works by creating an ``.egg-link`` file (named for the -project) in the given staging area. If the staging area is Python's -``site-packages`` directory, it also updates an ``easy-install.pth`` file so -that the project is on ``sys.path`` by default for all programs run using that -Python installation. - -The ``develop`` command also installs wrapper scripts in the staging area (or -a separate directory, as specified) that will ensure the project's dependencies -are available on ``sys.path`` before running the project's source scripts. -And, it ensures that any missing project dependencies are available in the -staging area, by downloading and installing them if necessary. - -Last, but not least, the ``develop`` command invokes the ``build_ext -i`` -command to ensure any C extensions in the project have been built and are -up-to-date, and the ``egg_info`` command to ensure the project's metadata is -updated (so that the runtime and wrappers know what the project's dependencies -are). If you make any changes to the project's setup script or C extensions, -you should rerun the ``develop`` command against all relevant staging areas to -keep the project's scripts, metadata and extensions up-to-date. Most other -kinds of changes to your project should not require any build operations or -rerunning ``develop``, but keep in mind that even minor changes to the setup -script (e.g. changing an entry point definition) require you to re-run the -``develop`` or ``test`` commands to keep the distribution updated. - -Here are some of the options that the ``develop`` command accepts. Note that -they affect the project's dependencies as well as the project itself, so if you -have dependencies that need to be installed and you use ``--exclude-scripts`` -(for example), the dependencies' scripts will not be installed either! For -this reason, you may want to use EasyInstall to install the project's -dependencies before using the ``develop`` command, if you need finer control -over the installation options for dependencies. - -``--uninstall, -u`` - Un-deploy the current project. You may use the ``--install-dir`` or ``-d`` - option to designate the staging area. The created ``.egg-link`` file will - be removed, if present and it is still pointing to the project directory. - The project directory will be removed from ``easy-install.pth`` if the - staging area is Python's ``site-packages`` directory. 
- - Note that this option currently does *not* uninstall script wrappers! You - must uninstall them yourself, or overwrite them by using EasyInstall to - activate a different version of the package. You can also avoid installing - script wrappers in the first place, if you use the ``--exclude-scripts`` - (aka ``-x``) option when you run ``develop`` to deploy the project. - -``--multi-version, -m`` - "Multi-version" mode. Specifying this option prevents ``develop`` from - adding an ``easy-install.pth`` entry for the project(s) being deployed, and - if an entry for any version of a project already exists, the entry will be - removed upon successful deployment. In multi-version mode, no specific - version of the package is available for importing, unless you use - ``pkg_resources.require()`` to put it on ``sys.path``, or you are running - a wrapper script generated by ``setuptools`` or EasyInstall. (In which - case the wrapper script calls ``require()`` for you.) - - Note that if you install to a directory other than ``site-packages``, - this option is automatically in effect, because ``.pth`` files can only be - used in ``site-packages`` (at least in Python 2.3 and 2.4). So, if you use - the ``--install-dir`` or ``-d`` option (or they are set via configuration - file(s)) your project and its dependencies will be deployed in multi- - version mode. - -``--install-dir=DIR, -d DIR`` - Set the installation directory (staging area). If this option is not - directly specified on the command line or in a distutils configuration - file, the distutils default installation location is used. Normally, this - will be the ``site-packages`` directory, but if you are using distutils - configuration files, setting things like ``prefix`` or ``install_lib``, - then those settings are taken into account when computing the default - staging area. - -``--script-dir=DIR, -s DIR`` - Set the script installation directory. If you don't supply this option - (via the command line or a configuration file), but you *have* supplied - an ``--install-dir`` (via command line or config file), then this option - defaults to the same directory, so that the scripts will be able to find - their associated package installation. Otherwise, this setting defaults - to the location where the distutils would normally install scripts, taking - any distutils configuration file settings into account. - -``--exclude-scripts, -x`` - Don't deploy script wrappers. This is useful if you don't want to disturb - existing versions of the scripts in the staging area. - -``--always-copy, -a`` - Copy all needed distributions to the staging area, even if they - are already present in another directory on ``sys.path``. By default, if - a requirement can be met using a distribution that is already available in - a directory on ``sys.path``, it will not be copied to the staging area. - -``--egg-path=DIR`` - Force the generated ``.egg-link`` file to use a specified relative path - to the source directory. This can be useful in circumstances where your - installation directory is being shared by code running under multiple - platforms (e.g. Mac and Windows) which have different absolute locations - for the code under development, but the same *relative* locations with - respect to the installation directory. If you use this option when - installing, you must supply the same relative path when uninstalling. - -In addition to the above options, the ``develop`` command also accepts all of -the same options accepted by ``easy_install``. 
If you've configured any -``easy_install`` settings in your ``setup.cfg`` (or other distutils config -files), the ``develop`` command will use them as defaults, unless you override -them in a ``[develop]`` section or on the command line. - - -``easy_install`` - Find and install packages -============================================ - -This command runs the `EasyInstall tool -`_ for you. It is exactly -equivalent to running the ``easy_install`` command. All command line arguments -following this command are consumed and not processed further by the distutils, -so this must be the last command listed on the command line. Please see -the EasyInstall documentation for the options reference and usage examples. -Normally, there is no reason to use this command via the command line, as you -can just use ``easy_install`` directly. It's only listed here so that you know -it's a distutils command, which means that you can: - -* create command aliases that use it, -* create distutils extensions that invoke it as a subcommand, and -* configure options for it in your ``setup.cfg`` or other distutils config - files. - - -.. _egg_info: - -``egg_info`` - Create egg metadata and set build tags -===================================================== - -This command performs two operations: it updates a project's ``.egg-info`` -metadata directory (used by the ``bdist_egg``, ``develop``, and ``test`` -commands), and it allows you to temporarily change a project's version string, -to support "daily builds" or "snapshot" releases. It is run automatically by -the ``sdist``, ``bdist_egg``, ``develop``, ``register``, and ``test`` commands -in order to update the project's metadata, but you can also specify it -explicitly in order to temporarily change the project's version string while -executing other commands. (It also generates the``.egg-info/SOURCES.txt`` -manifest file, which is used when you are building source distributions.) - -In addition to writing the core egg metadata defined by ``setuptools`` and -required by ``pkg_resources``, this command can be extended to write other -metadata files as well, by defining entry points in the ``egg_info.writers`` -group. See the section on `Adding new EGG-INFO Files`_ below for more details. -Note that using additional metadata writers may require you to include a -``setup_requires`` argument to ``setup()`` in order to ensure that the desired -writers are available on ``sys.path``. - - -Release Tagging Options ------------------------ - -The following options can be used to modify the project's version string for -all remaining commands on the setup command line. The options are processed -in the order shown, so if you use more than one, the requested tags will be -added in the following order: - -``--tag-build=NAME, -b NAME`` - Append NAME to the project's version string. Due to the way setuptools - processes "pre-release" version suffixes beginning with the letters "a" - through "e" (like "alpha", "beta", and "candidate"), you will usually want - to use a tag like ".build" or ".dev", as this will cause the version number - to be considered *lower* than the project's default version. (If you - want to make the version number *higher* than the default version, you can - always leave off --tag-build and then use one or both of the following - options.) - - If you have a default build tag set in your ``setup.cfg``, you can suppress - it on the command line using ``-b ""`` or ``--tag-build=""`` as an argument - to the ``egg_info`` command. 
- -``--tag-svn-revision, -r`` - If the current directory is a Subversion checkout (i.e. has a ``.svn`` - subdirectory, this appends a string of the form "-rNNNN" to the project's - version string, where NNNN is the revision number of the most recent - modification to the current directory, as obtained from the ``svn info`` - command. - - If the current directory is not a Subversion checkout, the command will - look for a ``PKG-INFO`` file instead, and try to find the revision number - from that, by looking for a "-rNNNN" string at the end of the version - number. (This is so that building a package from a source distribution of - a Subversion snapshot will produce a binary with the correct version - number.) - - If there is no ``PKG-INFO`` file, or the version number contained therein - does not end with ``-r`` and a number, then ``-r0`` is used. - -``--no-svn-revision, -R`` - Don't include the Subversion revision in the version number. This option - is included so you can override a default setting put in ``setup.cfg``. - -``--tag-date, -d`` - Add a date stamp of the form "-YYYYMMDD" (e.g. "-20050528") to the - project's version number. - -``--no-date, -D`` - Don't include a date stamp in the version number. This option is included - so you can override a default setting in ``setup.cfg``. - - -(Note: Because these options modify the version number used for source and -binary distributions of your project, you should first make sure that you know -how the resulting version numbers will be interpreted by automated tools -like EasyInstall. See the section above on `Specifying Your Project's -Version`_ for an explanation of pre- and post-release tags, as well as tips on -how to choose and verify a versioning scheme for your your project.) - -For advanced uses, there is one other option that can be set, to change the -location of the project's ``.egg-info`` directory. Commands that need to find -the project's source directory or metadata should get it from this setting: - - -Other ``egg_info`` Options --------------------------- - -``--egg-base=SOURCEDIR, -e SOURCEDIR`` - Specify the directory that should contain the .egg-info directory. This - should normally be the root of your project's source tree (which is not - necessarily the same as your project directory; some projects use a ``src`` - or ``lib`` subdirectory as the source root). You should not normally need - to specify this directory, as it is normally determined from the - ``package_dir`` argument to the ``setup()`` function, if any. If there is - no ``package_dir`` set, this option defaults to the current directory. - - -``egg_info`` Examples ---------------------- - -Creating a dated "nightly build" snapshot egg:: - - python setup.py egg_info --tag-date --tag-build=DEV bdist_egg - -Creating and uploading a release with no version tags, even if some default -tags are specified in ``setup.cfg``:: - - python setup.py egg_info -RDb "" sdist bdist_egg register upload - -(Notice that ``egg_info`` must always appear on the command line *before* any -commands that you want the version changes to apply to.) - - -.. _install command: - -``install`` - Run ``easy_install`` or old-style installation -============================================================ - -The setuptools ``install`` command is basically a shortcut to run the -``easy_install`` command on the current project. 
However, for convenience -in creating "system packages" of setuptools-based projects, you can also -use this option: - -``--single-version-externally-managed`` - This boolean option tells the ``install`` command to perform an "old style" - installation, with the addition of an ``.egg-info`` directory so that the - installed project will still have its metadata available and operate - normally. If you use this option, you *must* also specify the ``--root`` - or ``--record`` options (or both), because otherwise you will have no way - to identify and remove the installed files. - -This option is automatically in effect when ``install`` is invoked by another -distutils command, so that commands like ``bdist_wininst`` and ``bdist_rpm`` -will create system packages of eggs. It is also automatically in effect if -you specify the ``--root`` option. - - -``install_egg_info`` - Install an ``.egg-info`` directory in ``site-packages`` -============================================================================== - -Setuptools runs this command as part of ``install`` operations that use the -``--single-version-externally-managed`` options. You should not invoke it -directly; it is documented here for completeness and so that distutils -extensions such as system package builders can make use of it. This command -has only one option: - -``--install-dir=DIR, -d DIR`` - The parent directory where the ``.egg-info`` directory will be placed. - Defaults to the same as the ``--install-dir`` option specified for the - ``install_lib`` command, which is usually the system ``site-packages`` - directory. - -This command assumes that the ``egg_info`` command has been given valid options -via the command line or ``setup.cfg``, as it will invoke the ``egg_info`` -command and use its options to locate the project's source ``.egg-info`` -directory. - - -.. _rotate: - -``rotate`` - Delete outdated distribution files -=============================================== - -As you develop new versions of your project, your distribution (``dist``) -directory will gradually fill up with older source and/or binary distribution -files. The ``rotate`` command lets you automatically clean these up, keeping -only the N most-recently modified files matching a given pattern. - -``--match=PATTERNLIST, -m PATTERNLIST`` - Comma-separated list of glob patterns to match. This option is *required*. - The project name and ``-*`` is prepended to the supplied patterns, in order - to match only distributions belonging to the current project (in case you - have a shared distribution directory for multiple projects). Typically, - you will use a glob pattern like ``.zip`` or ``.egg`` to match files of - the specified type. Note that each supplied pattern is treated as a - distinct group of files for purposes of selecting files to delete. - -``--keep=COUNT, -k COUNT`` - Number of matching distributions to keep. For each group of files - identified by a pattern specified with the ``--match`` option, delete all - but the COUNT most-recently-modified files in that group. This option is - *required*. - -``--dist-dir=DIR, -d DIR`` - Directory where the distributions are. This defaults to the value of the - ``bdist`` command's ``--dist-dir`` option, which will usually be the - project's ``dist`` subdirectory. 
- -**Example 1**: Delete all .tar.gz files from the distribution directory, except -for the 3 most recently modified ones:: - - setup.py rotate --match=.tar.gz --keep=3 - -**Example 2**: Delete all Python 2.3 or Python 2.4 eggs from the distribution -directory, except the most recently modified one for each Python version:: - - setup.py rotate --match=-py2.3*.egg,-py2.4*.egg --keep=1 - - -.. _saveopts: - -``saveopts`` - Save used options to a configuration file -======================================================== - -Finding and editing ``distutils`` configuration files can be a pain, especially -since you also have to translate the configuration options from command-line -form to the proper configuration file format. You can avoid these hassles by -using the ``saveopts`` command. Just add it to the command line to save the -options you used. For example, this command builds the project using -the ``mingw32`` C compiler, then saves the --compiler setting as the default -for future builds (even those run implicitly by the ``install`` command):: - - setup.py build --compiler=mingw32 saveopts - -The ``saveopts`` command saves all options for every commmand specified on the -command line to the project's local ``setup.cfg`` file, unless you use one of -the `configuration file options`_ to change where the options are saved. For -example, this command does the same as above, but saves the compiler setting -to the site-wide (global) distutils configuration:: - - setup.py build --compiler=mingw32 saveopts -g - -Note that it doesn't matter where you place the ``saveopts`` command on the -command line; it will still save all the options specified for all commands. -For example, this is another valid way to spell the last example:: - - setup.py saveopts -g build --compiler=mingw32 - -Note, however, that all of the commands specified are always run, regardless of -where ``saveopts`` is placed on the command line. - - -Configuration File Options --------------------------- - -Normally, settings such as options and aliases are saved to the project's -local ``setup.cfg`` file. But you can override this and save them to the -global or per-user configuration files, or to a manually-specified filename. - -``--global-config, -g`` - Save settings to the global ``distutils.cfg`` file inside the ``distutils`` - package directory. You must have write access to that directory to use - this option. You also can't combine this option with ``-u`` or ``-f``. - -``--user-config, -u`` - Save settings to the current user's ``~/.pydistutils.cfg`` (POSIX) or - ``$HOME/pydistutils.cfg`` (Windows) file. You can't combine this option - with ``-g`` or ``-f``. - -``--filename=FILENAME, -f FILENAME`` - Save settings to the specified configuration file to use. You can't - combine this option with ``-g`` or ``-u``. Note that if you specify a - non-standard filename, the ``distutils`` and ``setuptools`` will not - use the file's contents. This option is mainly included for use in - testing. - -These options are used by other ``setuptools`` commands that modify -configuration files, such as the `alias`_ and `setopt`_ commands. - - -.. _setopt: - -``setopt`` - Set a distutils or setuptools option in a config file -================================================================== - -This command is mainly for use by scripts, but it can also be used as a quick -and dirty way to change a distutils configuration option without having to -remember what file the options are in and then open an editor. - -**Example 1**. 
Set the default C compiler to ``mingw32`` (using long option -names):: - - setup.py setopt --command=build --option=compiler --set-value=mingw32 - -**Example 2**. Remove any setting for the distutils default package -installation directory (short option names):: - - setup.py setopt -c install -o install_lib -r - - -Options for the ``setopt`` command: - -``--command=COMMAND, -c COMMAND`` - Command to set the option for. This option is required. - -``--option=OPTION, -o OPTION`` - The name of the option to set. This option is required. - -``--set-value=VALUE, -s VALUE`` - The value to set the option to. Not needed if ``-r`` or ``--remove`` is - set. - -``--remove, -r`` - Remove (unset) the option, instead of setting it. - -In addition to the above options, you may use any of the `configuration file -options`_ (listed under the `saveopts`_ command, above) to determine which -distutils configuration file the option will be added to (or removed from). - - -.. _test: - -``test`` - Build package and run a unittest suite -================================================= - -When doing test-driven development, or running automated builds that need -testing before they are deployed for downloading or use, it's often useful -to be able to run a project's unit tests without actually deploying the project -anywhere, even using the ``develop`` command. The ``test`` command runs a -project's unit tests without actually deploying it, by temporarily putting the -project's source on ``sys.path``, after first running ``build_ext -i`` and -``egg_info`` to ensure that any C extensions and project metadata are -up-to-date. - -To use this command, your project's tests must be wrapped in a ``unittest`` -test suite by either a function, a ``TestCase`` class or method, or a module -or package containing ``TestCase`` classes. If the named suite is a module, -and the module has an ``additional_tests()`` function, it is called and the -result (which must be a ``unittest.TestSuite``) is added to the tests to be -run. If the named suite is a package, any submodules and subpackages are -recursively added to the overall test suite. (Note: if your project specifies -a ``test_loader``, the rules for processing the chosen ``test_suite`` may -differ; see the `test_loader`_ documentation for more details.) - -Note that many test systems including ``doctest`` support wrapping their -non-``unittest`` tests in ``TestSuite`` objects. So, if you are using a test -package that does not support this, we suggest you encourage its developers to -implement test suite support, as this is a convenient and standard way to -aggregate a collection of tests to be run under a common test harness. - -By default, tests will be run in the "verbose" mode of the ``unittest`` -package's text test runner, but you can get the "quiet" mode (just dots) if -you supply the ``-q`` or ``--quiet`` option, either as a global option to -the setup script (e.g. ``setup.py -q test``) or as an option for the ``test`` -command itself (e.g. ``setup.py test -q``). There is one other option -available: - -``--test-suite=NAME, -s NAME`` - Specify the test suite (or module, class, or method) to be run - (e.g. ``some_module.test_suite``). The default for this option can be - set by giving a ``test_suite`` argument to the ``setup()`` function, e.g.:: - - setup( - # ... - test_suite = "my_package.tests.test_all" - ) - - If you did not set a ``test_suite`` in your ``setup()`` call, and do not - provide a ``--test-suite`` option, an error will occur. - - -.. 
_upload: - -``upload`` - Upload source and/or egg distributions to PyPI -=========================================================== - -PyPI now supports uploading project files for redistribution; uploaded files -are easily found by EasyInstall, even if you don't have download links on your -project's home page. - -Although Python 2.5 will support uploading all types of distributions to PyPI, -setuptools only supports source distributions and eggs. (This is partly -because PyPI's upload support is currently broken for various other file -types.) To upload files, you must include the ``upload`` command *after* the -``sdist`` or ``bdist_egg`` commands on the setup command line. For example:: - - setup.py bdist_egg upload # create an egg and upload it - setup.py sdist upload # create a source distro and upload it - setup.py sdist bdist_egg upload # create and upload both - -Note that to upload files for a project, the corresponding version must already -be registered with PyPI, using the distutils ``register`` command. It's -usually a good idea to include the ``register`` command at the start of the -command line, so that any registration problems can be found and fixed before -building and uploading the distributions, e.g.:: - - setup.py register sdist bdist_egg upload - -This will update PyPI's listing for your project's current version. - -Note, by the way, that the metadata in your ``setup()`` call determines what -will be listed in PyPI for your package. Try to fill out as much of it as -possible, as it will save you a lot of trouble manually adding and updating -your PyPI listings. Just put it in ``setup.py`` and use the ``register`` -comamnd to keep PyPI up to date. - -The ``upload`` command has a few options worth noting: - -``--sign, -s`` - Sign each uploaded file using GPG (GNU Privacy Guard). The ``gpg`` program - must be available for execution on the system ``PATH``. - -``--identity=NAME, -i NAME`` - Specify the identity or key name for GPG to use when signing. The value of - this option will be passed through the ``--local-user`` option of the - ``gpg`` program. - -``--show-response`` - Display the full response text from server; this is useful for debugging - PyPI problems. - -``--repository=URL, -r URL`` - The URL of the repository to upload to. Defaults to - http://pypi.python.org/pypi (i.e., the main PyPI installation). - - ------------------------------------- -Extending and Reusing ``setuptools`` ------------------------------------- - -Creating ``distutils`` Extensions -================================= - -It can be hard to add new commands or setup arguments to the distutils. But -the ``setuptools`` package makes it a bit easier, by allowing you to distribute -a distutils extension as a separate project, and then have projects that need -the extension just refer to it in their ``setup_requires`` argument. - -With ``setuptools``, your distutils extension projects can hook in new -commands and ``setup()`` arguments just by defining "entry points". These -are mappings from command or argument names to a specification of where to -import a handler from. (See the section on `Dynamic Discovery of Services and -Plugins`_ above for some more background on entry points.) - - -Adding Commands ---------------- - -You can add new ``setup`` commands by defining entry points in the -``distutils.commands`` group. For example, if you wanted to add a ``foo`` -command, you might add something like this to your distutils extension -project's setup script:: - - setup( - # ... 
- entry_points = { - "distutils.commands": [ - "foo = mypackage.some_module:foo", - ], - }, - ) - -(Assuming, of course, that the ``foo`` class in ``mypackage.some_module`` is -a ``setuptools.Command`` subclass.) - -Once a project containing such entry points has been activated on ``sys.path``, -(e.g. by running "install" or "develop" with a site-packages installation -directory) the command(s) will be available to any ``setuptools``-based setup -scripts. It is not necessary to use the ``--command-packages`` option or -to monkeypatch the ``distutils.command`` package to install your commands; -``setuptools`` automatically adds a wrapper to the distutils to search for -entry points in the active distributions on ``sys.path``. In fact, this is -how setuptools' own commands are installed: the setuptools project's setup -script defines entry points for them! - - -Adding ``setup()`` Arguments ----------------------------- - -Sometimes, your commands may need additional arguments to the ``setup()`` -call. You can enable this by defining entry points in the -``distutils.setup_keywords`` group. For example, if you wanted a ``setup()`` -argument called ``bar_baz``, you might add something like this to your -distutils extension project's setup script:: - - setup( - # ... - entry_points = { - "distutils.commands": [ - "foo = mypackage.some_module:foo", - ], - "distutils.setup_keywords": [ - "bar_baz = mypackage.some_module:validate_bar_baz", - ], - }, - ) - -The idea here is that the entry point defines a function that will be called -to validate the ``setup()`` argument, if it's supplied. The ``Distribution`` -object will have the initial value of the attribute set to ``None``, and the -validation function will only be called if the ``setup()`` call sets it to -a non-None value. Here's an example validation function:: - - def assert_bool(dist, attr, value): - """Verify that value is True, False, 0, or 1""" - if bool(value) != value: - raise DistutilsSetupError( - "%r must be a boolean value (got %r)" % (attr,value) - ) - -Your function should accept three arguments: the ``Distribution`` object, -the attribute name, and the attribute value. It should raise a -``DistutilsSetupError`` (from the ``distutils.error`` module) if the argument -is invalid. Remember, your function will only be called with non-None values, -and the default value of arguments defined this way is always None. So, your -commands should always be prepared for the possibility that the attribute will -be ``None`` when they access it later. - -If more than one active distribution defines an entry point for the same -``setup()`` argument, *all* of them will be called. This allows multiple -distutils extensions to define a common argument, as long as they agree on -what values of that argument are valid. - -Also note that as with commands, it is not necessary to subclass or monkeypatch -the distutils ``Distribution`` class in order to add your arguments; it is -sufficient to define the entry points in your extension, as long as any setup -script using your extension lists your project in its ``setup_requires`` -argument. - - -Adding new EGG-INFO Files -------------------------- - -Some extensible applications or frameworks may want to allow third parties to -develop plugins with application or framework-specific metadata included in -the plugins' EGG-INFO directory, for easy access via the ``pkg_resources`` -metadata API. 
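Before looking at how such files get written, it may help to see how a
framework might read them back at runtime through the ``pkg_resources``
metadata API.  The following is only a sketch: ``plugin_info.txt`` and
``iter_plugin_metadata`` are placeholder names invented for illustration, not
part of setuptools itself::

    import pkg_resources

    def iter_plugin_metadata(name='plugin_info.txt'):
        # 'plugin_info.txt' is a hypothetical EGG-INFO filename; substitute
        # whatever file your extension writes.  Scan every distribution on
        # sys.path and yield the file's lines for plugins that provide it.
        for dist in pkg_resources.working_set:
            if dist.has_metadata(name):
                yield dist, list(dist.get_metadata_lines(name))

The rest of this section shows how such a file gets into the EGG-INFO
directory in the first place.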
The easiest way to allow this is to create a distutils extension -to be used from the plugin projects' setup scripts (via ``setup_requires``) -that defines a new setup keyword, and then uses that data to write an EGG-INFO -file when the ``egg_info`` command is run. - -The ``egg_info`` command looks for extension points in an ``egg_info.writers`` -group, and calls them to write the files. Here's a simple example of a -distutils extension defining a setup argument ``foo_bar``, which is a list of -lines that will be written to ``foo_bar.txt`` in the EGG-INFO directory of any -project that uses the argument:: - - setup( - # ... - entry_points = { - "distutils.setup_keywords": [ - "foo_bar = setuptools.dist:assert_string_list", - ], - "egg_info.writers": [ - "foo_bar.txt = setuptools.command.egg_info:write_arg", - ], - }, - ) - -This simple example makes use of two utility functions defined by setuptools -for its own use: a routine to validate that a setup keyword is a sequence of -strings, and another one that looks up a setup argument and writes it to -a file. Here's what the writer utility looks like:: - - def write_arg(cmd, basename, filename): - argname = os.path.splitext(basename)[0] - value = getattr(cmd.distribution, argname, None) - if value is not None: - value = '\n'.join(value)+'\n' - cmd.write_or_delete_file(argname, filename, value) - -As you can see, ``egg_info.writers`` entry points must be a function taking -three arguments: a ``egg_info`` command instance, the basename of the file to -write (e.g. ``foo_bar.txt``), and the actual full filename that should be -written to. - -In general, writer functions should honor the command object's ``dry_run`` -setting when writing files, and use the ``distutils.log`` object to do any -console output. The easiest way to conform to this requirement is to use -the ``cmd`` object's ``write_file()``, ``delete_file()``, and -``write_or_delete_file()`` methods exclusively for your file operations. See -those methods' docstrings for more details. - - -Adding Support for Other Revision Control Systems -------------------------------------------------- - -If you would like to create a plugin for ``setuptools`` to find files in other -source control systems besides CVS and Subversion, you can do so by adding an -entry point to the ``setuptools.file_finders`` group. The entry point should -be a function accepting a single directory name, and should yield -all the filenames within that directory (and any subdirectories thereof) that -are under revision control. - -For example, if you were going to create a plugin for a revision control system -called "foobar", you would write a function something like this:: - - def find_files_for_foobar(dirname): - # loop to yield paths that start with `dirname` - -And you would register it in a setup script using something like this:: - - entry_points = { - "setuptools.file_finders": [ - "foobar = my_foobar_module:find_files_for_foobar" - ] - } - -Then, anyone who wants to use your plugin can simply install it, and their -local setuptools installation will be able to find the necessary files. - -It is not necessary to distribute source control plugins with projects that -simply use the other source control system, or to specify the plugins in -``setup_requires``. When you create a source distribution with the ``sdist`` -command, setuptools automatically records what files were found in the -``SOURCES.txt`` file. That way, recipients of source distributions don't need -to have revision control at all. 
However, if someone is working on a package -by checking out with that system, they will need the same plugin(s) that the -original author is using. - -A few important points for writing revision control file finders: - -* Your finder function MUST return relative paths, created by appending to the - passed-in directory name. Absolute paths are NOT allowed, nor are relative - paths that reference a parent directory of the passed-in directory. - -* Your finder function MUST accept an empty string as the directory name, - meaning the current directory. You MUST NOT convert this to a dot; just - yield relative paths. So, yielding a subdirectory named ``some/dir`` under - the current directory should NOT be rendered as ``./some/dir`` or - ``/somewhere/some/dir``, but *always* as simply ``some/dir`` - -* Your finder function SHOULD NOT raise any errors, and SHOULD deal gracefully - with the absence of needed programs (i.e., ones belonging to the revision - control system itself. It *may*, however, use ``distutils.log.warn()`` to - inform the user of the missing program(s). - - -A Note Regarding Dependencies ------------------------------ - -If the project *containing* your distutils/setuptools extension(s) depends on -any projects other than setuptools, you *must* also declare those dependencies -as part of your project's ``setup_requires`` keyword, so that they will -already be built (and at least temprorarily installed) before your extension -project is built. - -So, if for example you create a project Foo that includes a new file finder -plugin, and Foo depends on Bar, then you *must* list Bar in both the -``install_requires`` **and** ``setup_requires`` arguments to ``setup()``. - -If you don't do this, then in certain edge cases you may cause setuptools to -try to go into infinite recursion, trying to build your dependencies to resolve -your dependencies, while still building your dependencies. (It probably won't -happen on your development machine, but it *will* happen in a full build -pulling everything from revision control on a clean machine, and then you or -your users will be scratching their heads trying to figure it out!) - - -Subclassing ``Command`` ------------------------ - -Sorry, this section isn't written yet, and neither is a lot of what's below -this point, except for the change log. You might want to `subscribe to changes -in this page `_ to see when new documentation is -added or updated. - -XXX - - -Reusing ``setuptools`` Code -=========================== - -``ez_setup`` ------------- - -XXX - - -``setuptools.archive_util`` ---------------------------- - -XXX - - -``setuptools.sandbox`` ----------------------- - -XXX - - -``setuptools.package_index`` ----------------------------- - -XXX - - ----------------------------- -Release Notes/Change History ----------------------------- - -0.6c11 - * Fix "bdist_wininst upload" trying to upload same file twice - -0.6c10 - * Fix for the Python 2.6.3 build_ext API change - - * Ensure C libraries (as opposed to extensions) are also built when doing - bdist_egg - - * Support for SVN 1.6 - -0.6c9 - * Fixed a missing files problem when using Windows source distributions on - non-Windows platforms, due to distutils not handling manifest file line - endings correctly. - - * Updated Pyrex support to work with Pyrex 0.9.6 and higher. - - * Minor changes for Jython compatibility, including skipping tests that can't - work on Jython. 
- - * Fixed not installing eggs in ``install_requires`` if they were also used for - ``setup_requires`` or ``tests_require``. - - * Fixed not fetching eggs in ``install_requires`` when running tests. - - * Allow ``ez_setup.use_setuptools()`` to upgrade existing setuptools - installations when called from a standalone ``setup.py``. - - * Added a warning if a namespace package is declared, but its parent package - is not also declared as a namespace. - - * Support Subversion 1.5 - - * Removed use of deprecated ``md5`` module if ``hashlib`` is available - - * Fixed ``bdist_wininst upload`` trying to upload the ``.exe`` twice - - * Fixed ``bdist_egg`` putting a ``native_libs.txt`` in the source package's - ``.egg-info``, when it should only be in the built egg's ``EGG-INFO``. - - * Ensure that _full_name is set on all shared libs before extensions are - checked for shared lib usage. (Fixes a bug in the experimental shared - library build support.) - - * Fix to allow unpacked eggs containing native libraries to fail more - gracefully under Google App Engine (with an ``ImportError`` loading the - C-based module, instead of getting a ``NameError``). - -0.6c7 - * Fixed ``distutils.filelist.findall()`` crashing on broken symlinks, and - ``egg_info`` command failing on new, uncommitted SVN directories. - - * Fix import problems with nested namespace packages installed via - ``--root`` or ``--single-version-externally-managed``, due to the - parent package not having the child package as an attribute. - -0.6c6 - * Added ``--egg-path`` option to ``develop`` command, allowing you to force - ``.egg-link`` files to use relative paths (allowing them to be shared across - platforms on a networked drive). - - * Fix not building binary RPMs correctly. - - * Fix "eggsecutables" (such as setuptools' own egg) only being runnable with - bash-compatible shells. - - * Fix ``#!`` parsing problems in Windows ``.exe`` script wrappers, when there - was whitespace inside a quoted argument or at the end of the ``#!`` line - (a regression introduced in 0.6c4). - - * Fix ``test`` command possibly failing if an older version of the project - being tested was installed on ``sys.path`` ahead of the test source - directory. - - * Fix ``find_packages()`` treating ``ez_setup`` and directories with ``.`` in - their names as packages. - -0.6c5 - * Fix uploaded ``bdist_rpm`` packages being described as ``bdist_egg`` - packages under Python versions less than 2.5. - - * Fix uploaded ``bdist_wininst`` packages being described as suitable for - "any" version by Python 2.5, even if a ``--target-version`` was specified. - -0.6c4 - * Overhauled Windows script wrapping to support ``bdist_wininst`` better. - Scripts installed with ``bdist_wininst`` will always use ``#!python.exe`` or - ``#!pythonw.exe`` as the executable name (even when built on non-Windows - platforms!), and the wrappers will look for the executable in the script's - parent directory (which should find the right version of Python). - - * Fix ``upload`` command not uploading files built by ``bdist_rpm`` or - ``bdist_wininst`` under Python 2.3 and 2.4. - - * Add support for "eggsecutable" headers: a ``#!/bin/sh`` script that is - prepended to an ``.egg`` file to allow it to be run as a script on Unix-ish - platforms. (This is mainly so that setuptools itself can have a single-file - installer on Unix, without doing multiple downloads, dealing with firewalls, - etc.) 
- - * Fix problem with empty revision numbers in Subversion 1.4 ``entries`` files - - * Use cross-platform relative paths in ``easy-install.pth`` when doing - ``develop`` and the source directory is a subdirectory of the installation - target directory. - - * Fix a problem installing eggs with a system packaging tool if the project - contained an implicit namespace package; for example if the ``setup()`` - listed a namespace package ``foo.bar`` without explicitly listing ``foo`` - as a namespace package. - -0.6c3 - * Fixed breakages caused by Subversion 1.4's new "working copy" format - -0.6c2 - * The ``ez_setup`` module displays the conflicting version of setuptools (and - its installation location) when a script requests a version that's not - available. - - * Running ``setup.py develop`` on a setuptools-using project will now install - setuptools if needed, instead of only downloading the egg. - -0.6c1 - * Fixed ``AttributeError`` when trying to download a ``setup_requires`` - dependency when a distribution lacks a ``dependency_links`` setting. - - * Made ``zip-safe`` and ``not-zip-safe`` flag files contain a single byte, so - as to play better with packaging tools that complain about zero-length - files. - - * Made ``setup.py develop`` respect the ``--no-deps`` option, which it - previously was ignoring. - - * Support ``extra_path`` option to ``setup()`` when ``install`` is run in - backward-compatibility mode. - - * Source distributions now always include a ``setup.cfg`` file that explicitly - sets ``egg_info`` options such that they produce an identical version number - to the source distribution's version number. (Previously, the default - version number could be different due to the use of ``--tag-date``, or if - the version was overridden on the command line that built the source - distribution.) - -0.6b4 - * Fix ``register`` not obeying name/version set by ``egg_info`` command, if - ``egg_info`` wasn't explicitly run first on the same command line. - - * Added ``--no-date`` and ``--no-svn-revision`` options to ``egg_info`` - command, to allow suppressing tags configured in ``setup.cfg``. - - * Fixed redundant warnings about missing ``README`` file(s); it should now - appear only if you are actually a source distribution. - -0.6b3 - * Fix ``bdist_egg`` not including files in subdirectories of ``.egg-info``. - - * Allow ``.py`` files found by the ``include_package_data`` option to be - automatically included. Remove duplicate data file matches if both - ``include_package_data`` and ``package_data`` are used to refer to the same - files. - -0.6b1 - * Strip ``module`` from the end of compiled extension modules when computing - the name of a ``.py`` loader/wrapper. (Python's import machinery ignores - this suffix when searching for an extension module.) - -0.6a11 - * Added ``test_loader`` keyword to support custom test loaders - - * Added ``setuptools.file_finders`` entry point group to allow implementing - revision control plugins. - - * Added ``--identity`` option to ``upload`` command. - - * Added ``dependency_links`` to allow specifying URLs for ``--find-links``. - - * Enhanced test loader to scan packages as well as modules, and call - ``additional_tests()`` if present to get non-unittest tests. - - * Support namespace packages in conjunction with system packagers, by omitting - the installation of any ``__init__.py`` files for namespace packages, and - adding a special ``.pth`` file to create a working package in - ``sys.modules``. 
- - * Made ``--single-version-externally-managed`` automatic when ``--root`` is - used, so that most system packagers won't require special support for - setuptools. - - * Fixed ``setup_requires``, ``tests_require``, etc. not using ``setup.cfg`` or - other configuration files for their option defaults when installing, and - also made the install use ``--multi-version`` mode so that the project - directory doesn't need to support .pth files. - - * ``MANIFEST.in`` is now forcibly closed when any errors occur while reading - it. Previously, the file could be left open and the actual error would be - masked by problems trying to remove the open file on Windows systems. - -0.6a10 - * Fixed the ``develop`` command ignoring ``--find-links``. - -0.6a9 - * The ``sdist`` command no longer uses the traditional ``MANIFEST`` file to - create source distributions. ``MANIFEST.in`` is still read and processed, - as are the standard defaults and pruning. But the manifest is built inside - the project's ``.egg-info`` directory as ``SOURCES.txt``, and it is rebuilt - every time the ``egg_info`` command is run. - - * Added the ``include_package_data`` keyword to ``setup()``, allowing you to - automatically include any package data listed in revision control or - ``MANIFEST.in`` - - * Added the ``exclude_package_data`` keyword to ``setup()``, allowing you to - trim back files included via the ``package_data`` and - ``include_package_data`` options. - - * Fixed ``--tag-svn-revision`` not working when run from a source - distribution. - - * Added warning for namespace packages with missing ``declare_namespace()`` - - * Added ``tests_require`` keyword to ``setup()``, so that e.g. packages - requiring ``nose`` to run unit tests can make this dependency optional - unless the ``test`` command is run. - - * Made all commands that use ``easy_install`` respect its configuration - options, as this was causing some problems with ``setup.py install``. - - * Added an ``unpack_directory()`` driver to ``setuptools.archive_util``, so - that you can process a directory tree through a processing filter as if it - were a zipfile or tarfile. - - * Added an internal ``install_egg_info`` command to use as part of old-style - ``install`` operations, that installs an ``.egg-info`` directory with the - package. - - * Added a ``--single-version-externally-managed`` option to the ``install`` - command so that you can more easily wrap a "flat" egg in a system package. - - * Enhanced ``bdist_rpm`` so that it installs single-version eggs that - don't rely on a ``.pth`` file. The ``--no-egg`` option has been removed, - since all RPMs are now built in a more backwards-compatible format. - - * Support full roundtrip translation of eggs to and from ``bdist_wininst`` - format. Running ``bdist_wininst`` on a setuptools-based package wraps the - egg in an .exe that will safely install it as an egg (i.e., with metadata - and entry-point wrapper scripts), and ``easy_install`` can turn the .exe - back into an ``.egg`` file or directory and install it as such. - - -0.6a8 - * Fixed some problems building extensions when Pyrex was installed, especially - with Python 2.4 and/or packages using SWIG. - - * Made ``develop`` command accept all the same options as ``easy_install``, - and use the ``easy_install`` command's configuration settings as defaults. 
- - * Made ``egg_info --tag-svn-revision`` fall back to extracting the revision - number from ``PKG-INFO`` in case it is being run on a source distribution of - a snapshot taken from a Subversion-based project. - - * Automatically detect ``.dll``, ``.so`` and ``.dylib`` files that are being - installed as data, adding them to ``native_libs.txt`` automatically. - - * Fixed some problems with fresh checkouts of projects that don't include - ``.egg-info/PKG-INFO`` under revision control and put the project's source - code directly in the project directory. If such a package had any - requirements that get processed before the ``egg_info`` command can be run, - the setup scripts would fail with a "Missing 'Version:' header and/or - PKG-INFO file" error, because the egg runtime interpreted the unbuilt - metadata in a directory on ``sys.path`` (i.e. the current directory) as - being a corrupted egg. Setuptools now monkeypatches the distribution - metadata cache to pretend that the egg has valid version information, until - it has a chance to make it actually be so (via the ``egg_info`` command). - -0.6a5 - * Fixed missing gui/cli .exe files in distribution. Fixed bugs in tests. - -0.6a3 - * Added ``gui_scripts`` entry point group to allow installing GUI scripts - on Windows and other platforms. (The special handling is only for Windows; - other platforms are treated the same as for ``console_scripts``.) - -0.6a2 - * Added ``console_scripts`` entry point group to allow installing scripts - without the need to create separate script files. On Windows, console - scripts get an ``.exe`` wrapper so you can just type their name. On other - platforms, the scripts are written without a file extension. - -0.6a1 - * Added support for building "old-style" RPMs that don't install an egg for - the target package, using a ``--no-egg`` option. - - * The ``build_ext`` command now works better when using the ``--inplace`` - option and multiple Python versions. It now makes sure that all extensions - match the current Python version, even if newer copies were built for a - different Python version. - - * The ``upload`` command no longer attaches an extra ``.zip`` when uploading - eggs, as PyPI now supports egg uploads without trickery. - - * The ``ez_setup`` script/module now displays a warning before downloading - the setuptools egg, and attempts to check the downloaded egg against an - internal MD5 checksum table. - - * Fixed the ``--tag-svn-revision`` option of ``egg_info`` not finding the - latest revision number; it was using the revision number of the directory - containing ``setup.py``, not the highest revision number in the project. - - * Added ``eager_resources`` setup argument - - * The ``sdist`` command now recognizes Subversion "deleted file" entries and - does not include them in source distributions. - - * ``setuptools`` now embeds itself more thoroughly into the distutils, so that - other distutils extensions (e.g. py2exe, py2app) will subclass setuptools' - versions of things, rather than the native distutils ones. - - * Added ``entry_points`` and ``setup_requires`` arguments to ``setup()``; - ``setup_requires`` allows you to automatically find and download packages - that are needed in order to *build* your project (as opposed to running it). - - * ``setuptools`` now finds its commands, ``setup()`` argument validators, and - metadata writers using entry points, so that they can be extended by - third-party packages. See `Creating distutils Extensions`_ above for more - details. 
- - * The vestigial ``depends`` command has been removed. It was never finished - or documented, and never would have worked without EasyInstall - which it - pre-dated and was never compatible with. - -0.5a12 - * The zip-safety scanner now checks for modules that might be used with - ``python -m``, and marks them as unsafe for zipping, since Python 2.4 can't - handle ``-m`` on zipped modules. - -0.5a11 - * Fix breakage of the "develop" command that was caused by the addition of - ``--always-unzip`` to the ``easy_install`` command. - -0.5a9 - * Include ``svn:externals`` directories in source distributions as well as - normal subversion-controlled files and directories. - - * Added ``exclude=patternlist`` option to ``setuptools.find_packages()`` - - * Changed --tag-svn-revision to include an "r" in front of the revision number - for better readability. - - * Added ability to build eggs without including source files (except for any - scripts, of course), using the ``--exclude-source-files`` option to - ``bdist_egg``. - - * ``setup.py install`` now automatically detects when an "unmanaged" package - or module is going to be on ``sys.path`` ahead of a package being installed, - thereby preventing the newer version from being imported. If this occurs, - a warning message is output to ``sys.stderr``, but installation proceeds - anyway. The warning message informs the user what files or directories - need deleting, and advises them they can also use EasyInstall (with the - ``--delete-conflicting`` option) to do it automatically. - - * The ``egg_info`` command now adds a ``top_level.txt`` file to the metadata - directory that lists all top-level modules and packages in the distribution. - This is used by the ``easy_install`` command to find possibly-conflicting - "unmanaged" packages when installing the distribution. - - * Added ``zip_safe`` and ``namespace_packages`` arguments to ``setup()``. - Added package analysis to determine zip-safety if the ``zip_safe`` flag - is not given, and advise the author regarding what code might need changing. - - * Fixed the swapped ``-d`` and ``-b`` options of ``bdist_egg``. - -0.5a8 - * The "egg_info" command now always sets the distribution metadata to "safe" - forms of the distribution name and version, so that distribution files will - be generated with parseable names (i.e., ones that don't include '-' in the - name or version). Also, this means that if you use the various ``--tag`` - options of "egg_info", any distributions generated will use the tags in the - version, not just egg distributions. - - * Added support for defining command aliases in distutils configuration files, - under the "[aliases]" section. To prevent recursion and to allow aliases to - call the command of the same name, a given alias can be expanded only once - per command-line invocation. You can define new aliases with the "alias" - command, either for the local, global, or per-user configuration. - - * Added "rotate" command to delete old distribution files, given a set of - patterns to match and the number of files to keep. (Keeps the most - recently-modified distribution files matching each pattern.) - - * Added "saveopts" command that saves all command-line options for the current - invocation to the local, global, or per-user configuration file. Useful for - setting defaults without having to hand-edit a configuration file. - - * Added a "setopt" command that sets a single option in a specified distutils - configuration file. 
- -0.5a7 - * Added "upload" support for egg and source distributions, including a bug - fix for "upload" and a temporary workaround for lack of .egg support in - PyPI. - -0.5a6 - * Beefed up the "sdist" command so that if you don't have a MANIFEST.in, it - will include all files under revision control (CVS or Subversion) in the - current directory, and it will regenerate the list every time you create a - source distribution, not just when you tell it to. This should make the - default "do what you mean" more often than the distutils' default behavior - did, while still retaining the old behavior in the presence of MANIFEST.in. - - * Fixed the "develop" command always updating .pth files, even if you - specified ``-n`` or ``--dry-run``. - - * Slightly changed the format of the generated version when you use - ``--tag-build`` on the "egg_info" command, so that you can make tagged - revisions compare *lower* than the version specified in setup.py (e.g. by - using ``--tag-build=dev``). - -0.5a5 - * Added ``develop`` command to ``setuptools``-based packages. This command - installs an ``.egg-link`` pointing to the package's source directory, and - script wrappers that ``execfile()`` the source versions of the package's - scripts. This lets you put your development checkout(s) on sys.path without - having to actually install them. (To uninstall the link, use - use ``setup.py develop --uninstall``.) - - * Added ``egg_info`` command to ``setuptools``-based packages. This command - just creates or updates the "projectname.egg-info" directory, without - building an egg. (It's used by the ``bdist_egg``, ``test``, and ``develop`` - commands.) - - * Enhanced the ``test`` command so that it doesn't install the package, but - instead builds any C extensions in-place, updates the ``.egg-info`` - metadata, adds the source directory to ``sys.path``, and runs the tests - directly on the source. This avoids an "unmanaged" installation of the - package to ``site-packages`` or elsewhere. - - * Made ``easy_install`` a standard ``setuptools`` command, moving it from - the ``easy_install`` module to ``setuptools.command.easy_install``. Note - that if you were importing or extending it, you must now change your imports - accordingly. ``easy_install.py`` is still installed as a script, but not as - a module. - -0.5a4 - * Setup scripts using setuptools can now list their dependencies directly in - the setup.py file, without having to manually create a ``depends.txt`` file. - The ``install_requires`` and ``extras_require`` arguments to ``setup()`` - are used to create a dependencies file automatically. If you are manually - creating ``depends.txt`` right now, please switch to using these setup - arguments as soon as practical, because ``depends.txt`` support will be - removed in the 0.6 release cycle. For documentation on the new arguments, - see the ``setuptools.dist.Distribution`` class. - - * Setup scripts using setuptools now always install using ``easy_install`` - internally, for ease of uninstallation and upgrading. - -0.5a1 - * Added support for "self-installation" bootstrapping. Packages can now - include ``ez_setup.py`` in their source distribution, and add the following - to their ``setup.py``, in order to automatically bootstrap installation of - setuptools as part of their setup process:: - - from ez_setup import use_setuptools - use_setuptools() - - from setuptools import setup - # etc... 
- -0.4a2 - * Added ``ez_setup.py`` installer/bootstrap script to make initial setuptools - installation easier, and to allow distributions using setuptools to avoid - having to include setuptools in their source distribution. - - * All downloads are now managed by the ``PackageIndex`` class (which is now - subclassable and replaceable), so that embedders can more easily override - download logic, give download progress reports, etc. The class has also - been moved to the new ``setuptools.package_index`` module. - - * The ``Installer`` class no longer handles downloading, manages a temporary - directory, or tracks the ``zip_ok`` option. Downloading is now handled - by ``PackageIndex``, and ``Installer`` has become an ``easy_install`` - command class based on ``setuptools.Command``. - - * There is a new ``setuptools.sandbox.run_setup()`` API to invoke a setup - script in a directory sandbox, and a new ``setuptools.archive_util`` module - with an ``unpack_archive()`` API. These were split out of EasyInstall to - allow reuse by other tools and applications. - - * ``setuptools.Command`` now supports reinitializing commands using keyword - arguments to set/reset options. Also, ``Command`` subclasses can now set - their ``command_consumes_arguments`` attribute to ``True`` in order to - receive an ``args`` option containing the rest of the command line. - -0.3a2 - * Added new options to ``bdist_egg`` to allow tagging the egg's version number - with a subversion revision number, the current date, or an explicit tag - value. Run ``setup.py bdist_egg --help`` to get more information. - - * Misc. bug fixes - -0.3a1 - * Initial release. - - -Mailing List and Bug Tracker -============================ - -Please use the `distutils-sig mailing list`_ for questions and discussion about -setuptools, and the `setuptools bug tracker`_ ONLY for issues you have -confirmed via the list are actual bugs, and which you have reduced to a minimal -set of steps to reproduce. - -.. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/ -.. _setuptools bug tracker: http://bugs.python.org/setuptools/ - diff --git a/setuptools/__init__.py b/setuptools/__init__.py index a314ca7..d01918e 100644 --- a/setuptools/__init__.py +++ b/setuptools/__init__.py @@ -1,47 +1,118 @@ """Extensions to the 'distutils' for large or complex distributions""" -from setuptools.extension import Extension, Library -from setuptools.dist import Distribution, Feature, _get_unpatched -import distutils.core, setuptools.command -from setuptools.depends import Require -from distutils.core import Command as _Command + +import os +import functools +import distutils.core +import distutils.filelist from distutils.util import convert_path -import os.path +from fnmatch import fnmatchcase + +from six.moves import filter, map + +import setuptools.version +from setuptools.extension import Extension +from setuptools.dist import Distribution, Feature +from setuptools.depends import Require +from . import monkey -__version__ = '0.6c11' __all__ = [ 'setup', 'Distribution', 'Feature', 'Command', 'Extension', 'Require', - 'find_packages' + 'find_packages', ] +__version__ = setuptools.version.__version__ + bootstrap_install_from = None -def find_packages(where='.', exclude=()): - """Return a list all Python packages found within directory 'where' +# If we run 2to3 on .py files, should we also convert docstrings? 
+# Default: yes; assume that we can detect doctests reliably +run_2to3_on_doctests = True +# Standard package names for fixer packages +lib2to3_fixer_packages = ['lib2to3.fixes'] + - 'where' should be supplied as a "cross-platform" (i.e. URL-style) path; it - will be converted to the appropriate local path syntax. 'exclude' is a - sequence of package names to exclude; '*' can be used as a wildcard in the - names, such that 'foo.*' will exclude all subpackages of 'foo' (but not - 'foo' itself). +class PackageFinder(object): """ - out = [] - stack=[(convert_path(where), '')] - while stack: - where,prefix = stack.pop(0) - for name in os.listdir(where): - fn = os.path.join(where,name) - if ('.' not in name and os.path.isdir(fn) and - os.path.isfile(os.path.join(fn,'__init__.py')) - ): - out.append(prefix+name); stack.append((fn,prefix+name+'.')) - for pat in list(exclude)+['ez_setup']: - from fnmatch import fnmatchcase - out = [item for item in out if not fnmatchcase(item,pat)] - return out + Generate a list of all Python packages found within a directory + """ + + @classmethod + def find(cls, where='.', exclude=(), include=('*',)): + """Return a list all Python packages found within directory 'where' + + 'where' is the root directory which will be searched for packages. It + should be supplied as a "cross-platform" (i.e. URL-style) path; it will + be converted to the appropriate local path syntax. + + 'exclude' is a sequence of package names to exclude; '*' can be used + as a wildcard in the names, such that 'foo.*' will exclude all + subpackages of 'foo' (but not 'foo' itself). + + 'include' is a sequence of package names to include. If it's + specified, only the named packages will be included. If it's not + specified, all found packages will be included. 'include' can contain + shell style wildcard patterns just like 'exclude'. + """ + + return list(cls._find_packages_iter( + convert_path(where), + cls._build_filter('ez_setup', '*__pycache__', *exclude), + cls._build_filter(*include))) + + @classmethod + def _find_packages_iter(cls, where, exclude, include): + """ + All the packages found in 'where' that pass the 'include' filter, but + not the 'exclude' filter. + """ + for root, dirs, files in os.walk(where, followlinks=True): + # Copy dirs to iterate over it, then empty dirs. + all_dirs = dirs[:] + dirs[:] = [] + + for dir in all_dirs: + full_path = os.path.join(root, dir) + rel_path = os.path.relpath(full_path, where) + package = rel_path.replace(os.path.sep, '.') + + # Skip directory trees that are not valid packages + if ('.' in dir or not cls._looks_like_package(full_path)): + continue + + # Should this package be included? + if include(package) and not exclude(package): + yield package + + # Keep searching subdirectories, as there may be more packages + # down there, even if the parent was excluded. + dirs.append(dir) + + @staticmethod + def _looks_like_package(path): + """Does a directory look like a package?""" + return os.path.isfile(os.path.join(path, '__init__.py')) + + @staticmethod + def _build_filter(*patterns): + """ + Given a list of patterns, return a callable that will be true only if + the input matches at least one of the patterns. 
+ """ + return lambda name: any(fnmatchcase(name, pat=pat) for pat in patterns) + + +class PEP420PackageFinder(PackageFinder): + @staticmethod + def _looks_like_package(path): + return True + + +find_packages = PackageFinder.find setup = distutils.core.setup - -_Command = _get_unpatched(_Command) + +_Command = monkey.get_unpatched(distutils.core.Command) + class Command(_Command): __doc__ = _Command.__doc__ @@ -49,34 +120,41 @@ class Command(_Command): command_consumes_arguments = False def __init__(self, dist, **kw): - # Add support for keyword arguments - _Command.__init__(self,dist) - for k,v in kw.items(): - setattr(self,k,v) - + """ + Construct the command for dist, updating + vars(self) with any keyword parameters. + """ + _Command.__init__(self, dist) + vars(self).update(kw) + def reinitialize_command(self, command, reinit_subcommands=0, **kw): cmd = _Command.reinitialize_command(self, command, reinit_subcommands) - for k,v in kw.items(): - setattr(cmd,k,v) # update command with keywords + vars(cmd).update(kw) return cmd -import distutils.core -distutils.core.Command = Command # we can't patch distutils.cmd, alas -def findall(dir = os.curdir): - """Find all files under 'dir' and return the list of full filenames - (relative to 'dir'). +def _find_all_simple(path): + """ + Find all files under 'path' """ - all_files = [] - for base, dirs, files in os.walk(dir): - if base==os.curdir or base.startswith(os.curdir+os.sep): - base = base[2:] - if base: - files = [os.path.join(base, f) for f in files] - all_files.extend(filter(os.path.isfile, files)) - return all_files + results = ( + os.path.join(base, file) + for base, dirs, files in os.walk(path, followlinks=True) + for file in files + ) + return filter(os.path.isfile, results) -import distutils.filelist -distutils.filelist.findall = findall # fix findall bug in distutils. + +def findall(dir=os.curdir): + """ + Find all files under 'dir' and return the list of full filenames. + Unless dir is '.', return full filenames with dir prepended. 
+ """ + files = _find_all_simple(dir) + if dir == os.curdir: + make_rel = functools.partial(os.path.relpath, start=dir) + files = map(make_rel, files) + return list(files) +monkey.patch_all() diff --git a/setuptools/archive_util.py b/setuptools/archive_util.py index 5d72e7e..cc82b3d 100755 --- a/setuptools/archive_util.py +++ b/setuptools/archive_util.py @@ -1,47 +1,32 @@ """Utilities for extracting common archive formats""" +import zipfile +import tarfile +import os +import shutil +import posixpath +import contextlib +from distutils.errors import DistutilsError + +from pkg_resources import ensure_directory, ContextualZipFile __all__ = [ "unpack_archive", "unpack_zipfile", "unpack_tarfile", "default_filter", "UnrecognizedFormat", "extraction_drivers", "unpack_directory", ] -import zipfile, tarfile, os, shutil -from pkg_resources import ensure_directory -from distutils.errors import DistutilsError class UnrecognizedFormat(DistutilsError): """Couldn't recognize the archive type""" -def default_filter(src,dst): - """The default progress/filter callback; returns True for all files""" - return dst - - - - - - - - - - - - - - - - - - - - +def default_filter(src, dst): + """The default progress/filter callback; returns True for all files""" + return dst def unpack_archive(filename, extract_dir, progress_filter=default_filter, - drivers=None -): + drivers=None): """Unpack `filename` to `extract_dir`, or raise ``UnrecognizedFormat`` `progress_filter` is a function taking two arguments: a source path @@ -75,52 +60,33 @@ def unpack_archive(filename, extract_dir, progress_filter=default_filter, ) - - - - - def unpack_directory(filename, extract_dir, progress_filter=default_filter): """"Unpack" a directory, using the same interface as for archives Raises ``UnrecognizedFormat`` if `filename` is not a directory """ if not os.path.isdir(filename): - raise UnrecognizedFormat("%s is not a directory" % (filename,)) + raise UnrecognizedFormat("%s is not a directory" % filename) - paths = {filename:('',extract_dir)} + paths = { + filename: ('', extract_dir), + } for base, dirs, files in os.walk(filename): - src,dst = paths[base] + src, dst = paths[base] for d in dirs: - paths[os.path.join(base,d)] = src+d+'/', os.path.join(dst,d) + paths[os.path.join(base, d)] = src + d + '/', os.path.join(dst, d) for f in files: - name = src+f - target = os.path.join(dst,f) - target = progress_filter(src+f, target) + target = os.path.join(dst, f) + target = progress_filter(src + f, target) if not target: - continue # skip non-files + # skip non-files + continue ensure_directory(target) - f = os.path.join(base,f) + f = os.path.join(base, f) shutil.copyfile(f, target) shutil.copystat(f, target) - - - - - - - - - - - - - - - - def unpack_zipfile(filename, extract_dir, progress_filter=default_filter): """Unpack zip `filename` to `extract_dir` @@ -132,13 +98,12 @@ def unpack_zipfile(filename, extract_dir, progress_filter=default_filter): if not zipfile.is_zipfile(filename): raise UnrecognizedFormat("%s is not a zip file" % (filename,)) - z = zipfile.ZipFile(filename) - try: + with ContextualZipFile(filename) as z: for info in z.infolist(): name = info.filename # don't extract absolute paths or ones with .. in them - if name.startswith('/') or '..' in name: + if name.startswith('/') or '..' 
in name.split('/'): continue target = os.path.join(extract_dir, *name.split('/')) @@ -152,14 +117,11 @@ def unpack_zipfile(filename, extract_dir, progress_filter=default_filter): # file ensure_directory(target) data = z.read(info.filename) - f = open(target,'wb') - try: + with open(target, 'wb') as f: f.write(data) - finally: - f.close() - del data - finally: - z.close() + unix_attributes = info.external_attr >> 16 + if unix_attributes: + os.chmod(target, unix_attributes) def unpack_tarfile(filename, extract_dir, progress_filter=default_filter): @@ -169,37 +131,43 @@ def unpack_tarfile(filename, extract_dir, progress_filter=default_filter): by ``tarfile.open()``). See ``unpack_archive()`` for an explanation of the `progress_filter` argument. """ - try: tarobj = tarfile.open(filename) except tarfile.TarError: raise UnrecognizedFormat( "%s is not a compressed or uncompressed tar file" % (filename,) ) - - try: - tarobj.chown = lambda *args: None # don't do any chowning! + with contextlib.closing(tarobj): + # don't do any chowning! + tarobj.chown = lambda *args: None for member in tarobj: - if member.isfile() or member.isdir(): - name = member.name - # don't extract absolute paths or ones with .. in them - if not name.startswith('/') and '..' not in name: - dst = os.path.join(extract_dir, *name.split('/')) - dst = progress_filter(name, dst) - if dst: - if dst.endswith(os.sep): - dst = dst[:-1] + name = member.name + # don't extract absolute paths or ones with .. in them + if not name.startswith('/') and '..' not in name.split('/'): + prelim_dst = os.path.join(extract_dir, *name.split('/')) + + # resolve any links and to extract the link targets as normal + # files + while member is not None and (member.islnk() or member.issym()): + linkpath = member.linkname + if member.issym(): + base = posixpath.dirname(member.name) + linkpath = posixpath.join(base, linkpath) + linkpath = posixpath.normpath(linkpath) + member = tarobj._getmember(linkpath) + + if member is not None and (member.isfile() or member.isdir()): + final_dst = progress_filter(name, prelim_dst) + if final_dst: + if final_dst.endswith(os.sep): + final_dst = final_dst[:-1] try: - tarobj._extract_member(member,dst) # XXX Ugh + # XXX Ugh + tarobj._extract_member(member, final_dst) except tarfile.ExtractError: - pass # chown/chmod/mkfifo/mknode/makedev failed + # chown/chmod/mkfifo/mknode/makedev failed + pass return True - finally: - tarobj.close() - - extraction_drivers = unpack_directory, unpack_zipfile, unpack_tarfile - - diff --git a/setuptools/cli-32.exe b/setuptools/cli-32.exe new file mode 100644 index 0000000..b1487b7 Binary files /dev/null and b/setuptools/cli-32.exe differ diff --git a/setuptools/cli-64.exe b/setuptools/cli-64.exe new file mode 100644 index 0000000..675e6bf Binary files /dev/null and b/setuptools/cli-64.exe differ diff --git a/setuptools/cli.exe b/setuptools/cli.exe old mode 100755 new mode 100644 index 8906ff7..b1487b7 Binary files a/setuptools/cli.exe and b/setuptools/cli.exe differ diff --git a/setuptools/command/__init__.py b/setuptools/command/__init__.py index f898822..c96d33c 100644 --- a/setuptools/command/__init__.py +++ b/setuptools/command/__init__.py @@ -1,16 +1,14 @@ __all__ = [ 'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop', 'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts', - 'sdist', 'setopt', 'test', 'upload', 'install_egg_info', 'install_scripts', - 'register', 'bdist_wininst', + 'sdist', 'setopt', 'test', 'install_egg_info', 'install_scripts', + 
'register', 'bdist_wininst', 'upload_docs', 'upload', 'build_clib', ] +from distutils.command.bdist import bdist import sys -if sys.version>='2.5': - # In Python 2.5 and above, distutils includes its own upload command - __all__.remove('upload') -from distutils.command.bdist import bdist +from setuptools.command import install_scripts if 'egg' not in bdist.format_commands: bdist.format_command['egg'] = ('bdist_egg', "Python .egg file") diff --git a/setuptools/command/alias.py b/setuptools/command/alias.py index 40c00b5..35ece78 100755 --- a/setuptools/command/alias.py +++ b/setuptools/command/alias.py @@ -1,27 +1,28 @@ -import distutils, os -from setuptools import Command -from distutils.util import convert_path -from distutils import log -from distutils.errors import * +from distutils.errors import DistutilsOptionError + +from six.moves import map + from setuptools.command.setopt import edit_config, option_base, config_file + def shquote(arg): """Quote an argument for later parsing by shlex.split()""" for c in '"', "'", "\\", "#": - if c in arg: return repr(arg) - if arg.split()!=[arg]: + if c in arg: + return repr(arg) + if arg.split() != [arg]: return repr(arg) - return arg + return arg class alias(option_base): """Define a shortcut that invokes one or more commands""" - + description = "define a shortcut to invoke one or more commands" command_consumes_arguments = True user_options = [ - ('remove', 'r', 'remove (unset) the alias'), + ('remove', 'r', 'remove (unset) the alias'), ] + option_base.user_options boolean_options = option_base.boolean_options + ['remove'] @@ -33,7 +34,7 @@ class alias(option_base): def finalize_options(self): option_base.finalize_options(self) - if self.remove and len(self.args)!=1: + if self.remove and len(self.args) != 1: raise DistutilsOptionError( "Must specify exactly one argument (the alias name) when " "using --remove" @@ -43,27 +44,27 @@ class alias(option_base): aliases = self.distribution.get_option_dict('aliases') if not self.args: - print "Command Aliases" - print "---------------" + print("Command Aliases") + print("---------------") for alias in aliases: - print "setup.py alias", format_alias(alias, aliases) + print("setup.py alias", format_alias(alias, aliases)) return - elif len(self.args)==1: + elif len(self.args) == 1: alias, = self.args if self.remove: command = None elif alias in aliases: - print "setup.py alias", format_alias(alias, aliases) + print("setup.py alias", format_alias(alias, aliases)) return else: - print "No alias definition found for %r" % alias + print("No alias definition found for %r" % alias) return else: alias = self.args[0] - command = ' '.join(map(shquote,self.args[1:])) + command = ' '.join(map(shquote, self.args[1:])) - edit_config(self.filename, {'aliases': {alias:command}}, self.dry_run) + edit_config(self.filename, {'aliases': {alias: command}}, self.dry_run) def format_alias(name, aliases): @@ -76,7 +77,4 @@ def format_alias(name, aliases): source = '' else: source = '--filename=%r' % source - return source+name+' '+command - - - + return source + name + ' ' + command diff --git a/setuptools/command/bdist_egg.py b/setuptools/command/bdist_egg.py index 7e5a379..ae344cd 100644 --- a/setuptools/command/bdist_egg.py +++ b/setuptools/command/bdist_egg.py @@ -2,17 +2,34 @@ Build .egg distributions""" -# This module should be kept compatible with Python 2.3 -import sys, os, marshal -from setuptools import Command +from distutils.errors import DistutilsSetupError from distutils.dir_util import remove_tree, mkpath -from 
distutils.sysconfig import get_python_version, get_python_lib from distutils import log -from distutils.errors import DistutilsSetupError +from types import CodeType +import sys +import os +import textwrap +import marshal + +import six + from pkg_resources import get_build_platform, Distribution, ensure_directory from pkg_resources import EntryPoint -from types import CodeType from setuptools.extension import Library +from setuptools import Command + +try: + # Python 2.7 or >=3.2 + from sysconfig import get_path, get_python_version + + def _get_purelib(): + return get_path("purelib") +except ImportError: + from distutils.sysconfig import get_python_lib, get_python_version + + def _get_purelib(): + return get_python_lib(False) + def strip_module(filename): if '.' in filename: @@ -21,66 +38,45 @@ def strip_module(filename): filename = filename[:-6] return filename + def write_stub(resource, pyfile): - f = open(pyfile,'w') - f.write('\n'.join([ - "def __bootstrap__():", - " global __bootstrap__, __loader__, __file__", - " import sys, pkg_resources, imp", - " __file__ = pkg_resources.resource_filename(__name__,%r)" - % resource, - " __loader__ = None; del __bootstrap__, __loader__", - " imp.load_dynamic(__name__,__file__)", - "__bootstrap__()", - "" # terminal \n - ])) - f.close() + _stub_template = textwrap.dedent(""" + def __bootstrap__(): + global __bootstrap__, __loader__, __file__ + import sys, pkg_resources, imp + __file__ = pkg_resources.resource_filename(__name__, %r) + __loader__ = None; del __bootstrap__, __loader__ + imp.load_dynamic(__name__,__file__) + __bootstrap__() + """).lstrip() + with open(pyfile, 'w') as f: + f.write(_stub_template % resource) -# stub __init__.py for packages distributed without one -NS_PKG_STUB = '__import__("pkg_resources").declare_namespace(__name__)' class bdist_egg(Command): - description = "create an \"egg\" distribution" user_options = [ ('bdist-dir=', 'b', - "temporary directory for creating the distribution"), - ('plat-name=', 'p', - "platform name to embed in generated filenames " - "(default: %s)" % get_build_platform()), + "temporary directory for creating the distribution"), + ('plat-name=', 'p', "platform name to embed in generated filenames " + "(default: %s)" % get_build_platform()), ('exclude-source-files', None, - "remove all .py files from the generated egg"), + "remove all .py files from the generated egg"), ('keep-temp', 'k', - "keep the pseudo-installation tree around after " + - "creating the distribution archive"), + "keep the pseudo-installation tree around after " + + "creating the distribution archive"), ('dist-dir=', 'd', - "directory to put final built distributions in"), + "directory to put final built distributions in"), ('skip-build', None, - "skip rebuilding everything (for testing/debugging)"), + "skip rebuilding everything (for testing/debugging)"), ] boolean_options = [ 'keep-temp', 'skip-build', 'exclude-source-files' ] - - - - - - - - - - - - - - - - - def initialize_options (self): + def initialize_options(self): self.bdist_dir = None self.plat_name = None self.keep_temp = 0 @@ -89,7 +85,6 @@ class bdist_egg(Command): self.egg_output = None self.exclude_source_files = None - def finalize_options(self): ei_cmd = self.ei_cmd = self.get_finalized_command("egg_info") self.egg_info = ei_cmd.egg_info @@ -101,7 +96,7 @@ class bdist_egg(Command): if self.plat_name is None: self.plat_name = get_build_platform() - self.set_undefined_options('bdist',('dist_dir', 'dist_dir')) + self.set_undefined_options('bdist', ('dist_dir', 
'dist_dir')) if self.egg_output is None: @@ -112,64 +107,55 @@ class bdist_egg(Command): self.distribution.has_ext_modules() and self.plat_name ).egg_name() - self.egg_output = os.path.join(self.dist_dir, basename+'.egg') - - - - - - - + self.egg_output = os.path.join(self.dist_dir, basename + '.egg') def do_install_data(self): # Hack for packages that install data to install's --install-lib self.get_finalized_command('install').install_lib = self.bdist_dir - site_packages = os.path.normcase(os.path.realpath(get_python_lib())) - old, self.distribution.data_files = self.distribution.data_files,[] + site_packages = os.path.normcase(os.path.realpath(_get_purelib())) + old, self.distribution.data_files = self.distribution.data_files, [] for item in old: - if isinstance(item,tuple) and len(item)==2: + if isinstance(item, tuple) and len(item) == 2: if os.path.isabs(item[0]): realpath = os.path.realpath(item[0]) normalized = os.path.normcase(realpath) - if normalized==site_packages or normalized.startswith( - site_packages+os.sep + if normalized == site_packages or normalized.startswith( + site_packages + os.sep ): - item = realpath[len(site_packages)+1:], item[1] - # XXX else: raise ??? + item = realpath[len(site_packages) + 1:], item[1] + # XXX else: raise ??? self.distribution.data_files.append(item) try: - log.info("installing package data to %s" % self.bdist_dir) + log.info("installing package data to %s", self.bdist_dir) self.call_command('install_data', force=0, root=None) finally: self.distribution.data_files = old - def get_outputs(self): return [self.egg_output] - - def call_command(self,cmdname,**kw): + def call_command(self, cmdname, **kw): """Invoke reinitialized command `cmdname` with keyword args""" for dirname in INSTALL_DIRECTORY_ATTRS: - kw.setdefault(dirname,self.bdist_dir) - kw.setdefault('skip_build',self.skip_build) + kw.setdefault(dirname, self.bdist_dir) + kw.setdefault('skip_build', self.skip_build) kw.setdefault('dry_run', self.dry_run) cmd = self.reinitialize_command(cmdname, **kw) self.run_command(cmdname) return cmd - def run(self): # Generate metadata first self.run_command("egg_info") # We run install_lib before install_data, because some data hacks # pull their data path from the install_lib command. 
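# Rough worked example (made-up paths) of the do_install_data rewrite above:
# absolute data_files targets that live under site-packages are re-expressed
# relative to it, so they land inside the egg rather than the real install tree.
import os

site_packages = os.path.normcase(os.path.realpath('/usr/lib/python2.7/site-packages'))
target = '/usr/lib/python2.7/site-packages/mypkg/data'
normalized = os.path.normcase(os.path.realpath(target))
if normalized == site_packages or normalized.startswith(site_packages + os.sep):
    relative = normalized[len(site_packages) + 1:]   # 'mypkg/data'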
- log.info("installing library code to %s" % self.bdist_dir) + log.info("installing library code to %s", self.bdist_dir) instcmd = self.get_finalized_command('install') - old_root = instcmd.root; instcmd.root = None + old_root = instcmd.root + instcmd.root = None if self.distribution.has_c_libraries() and not self.skip_build: self.run_command('build_clib') cmd = self.call_command('install_lib', warn_dir=0) @@ -178,17 +164,17 @@ class bdist_egg(Command): all_outputs, ext_outputs = self.get_ext_outputs() self.stubs = [] to_compile = [] - for (p,ext_name) in enumerate(ext_outputs): - filename,ext = os.path.splitext(ext_name) - pyfile = os.path.join(self.bdist_dir, strip_module(filename)+'.py') + for (p, ext_name) in enumerate(ext_outputs): + filename, ext = os.path.splitext(ext_name) + pyfile = os.path.join(self.bdist_dir, strip_module(filename) + + '.py') self.stubs.append(pyfile) - log.info("creating stub loader for %s" % ext_name) + log.info("creating stub loader for %s", ext_name) if not self.dry_run: write_stub(os.path.basename(ext_name), pyfile) to_compile.append(pyfile) - ext_outputs[p] = ext_name.replace(os.sep,'/') + ext_outputs[p] = ext_name.replace(os.sep, '/') - to_compile.extend(self.make_init_files()) if to_compile: cmd.byte_compile(to_compile) if self.distribution.data_files: @@ -196,17 +182,18 @@ class bdist_egg(Command): # Make the EGG-INFO directory archive_root = self.bdist_dir - egg_info = os.path.join(archive_root,'EGG-INFO') + egg_info = os.path.join(archive_root, 'EGG-INFO') self.mkpath(egg_info) if self.distribution.scripts: script_dir = os.path.join(egg_info, 'scripts') - log.info("installing scripts to %s" % script_dir) - self.call_command('install_scripts',install_dir=script_dir,no_ep=1) + log.info("installing scripts to %s", script_dir) + self.call_command('install_scripts', install_dir=script_dir, + no_ep=1) self.copy_metadata_to(egg_info) native_libs = os.path.join(egg_info, "native_libs.txt") if all_outputs: - log.info("writing %s" % native_libs) + log.info("writing %s", native_libs) if not self.dry_run: ensure_directory(native_libs) libs_file = open(native_libs, 'wt') @@ -214,15 +201,15 @@ class bdist_egg(Command): libs_file.write('\n') libs_file.close() elif os.path.isfile(native_libs): - log.info("removing %s" % native_libs) + log.info("removing %s", native_libs) if not self.dry_run: os.unlink(native_libs) write_safety_flag( - os.path.join(archive_root,'EGG-INFO'), self.zip_safe() + os.path.join(archive_root, 'EGG-INFO'), self.zip_safe() ) - if os.path.exists(os.path.join(self.egg_info,'depends.txt')): + if os.path.exists(os.path.join(self.egg_info, 'depends.txt')): log.warn( "WARNING: 'depends.txt' will not be used by setuptools 0.6!\n" "Use the install_requires/extras_require setup() args instead." 
@@ -233,61 +220,33 @@ class bdist_egg(Command): # Make the archive make_zipfile(self.egg_output, archive_root, verbose=self.verbose, - dry_run=self.dry_run, mode=self.gen_header()) + dry_run=self.dry_run, mode=self.gen_header()) if not self.keep_temp: remove_tree(self.bdist_dir, dry_run=self.dry_run) # Add to 'Distribution.dist_files' so that the "upload" command works - getattr(self.distribution,'dist_files',[]).append( - ('bdist_egg',get_python_version(),self.egg_output)) - - - + getattr(self.distribution, 'dist_files', []).append( + ('bdist_egg', get_python_version(), self.egg_output)) def zap_pyfiles(self): log.info("Removing .py files from temporary directory") - for base,dirs,files in walk_egg(self.bdist_dir): + for base, dirs, files in walk_egg(self.bdist_dir): for name in files: if name.endswith('.py'): - path = os.path.join(base,name) + path = os.path.join(base, name) log.debug("Deleting %s", path) os.unlink(path) def zip_safe(self): - safe = getattr(self.distribution,'zip_safe',None) + safe = getattr(self.distribution, 'zip_safe', None) if safe is not None: return safe log.warn("zip_safe flag not set; analyzing archive contents...") return analyze_egg(self.bdist_dir, self.stubs) - def make_init_files(self): - """Create missing package __init__ files""" - init_files = [] - for base,dirs,files in walk_egg(self.bdist_dir): - if base==self.bdist_dir: - # don't put an __init__ in the root - continue - for name in files: - if name.endswith('.py'): - if '__init__.py' not in files: - pkg = base[len(self.bdist_dir)+1:].replace(os.sep,'.') - if self.distribution.has_contents_for(pkg): - log.warn("Creating missing __init__.py for %s",pkg) - filename = os.path.join(base,'__init__.py') - if not self.dry_run: - f = open(filename,'w'); f.write(NS_PKG_STUB) - f.close() - init_files.append(filename) - break - else: - # not a package, don't traverse to subdirectories - dirs[:] = [] - - return init_files - def gen_header(self): epm = EntryPoint.parse_map(self.distribution.entry_points or '') - ep = epm.get('setuptools.installation',{}).get('eggsecutable') + ep = epm.get('setuptools.installation', {}).get('eggsecutable') if ep is None: return 'w' # not an eggsecutable, do it the usual way. 
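# Sketch of the lookup gen_header performs above, using a hypothetical
# entry_points string; pkg_resources.EntryPoint.parse_map returns a dict of
# dicts keyed by section name and entry-point name.
from pkg_resources import EntryPoint

entry_points = """
[setuptools.installation]
eggsecutable = mypkg.cli:main
"""
epm = EntryPoint.parse_map(entry_points)
ep = epm.get('setuptools.installation', {}).get('eggsecutable')
# ep.module_name == 'mypkg.cli'; ep.attrs == ('main',)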
@@ -315,7 +274,6 @@ class bdist_egg(Command): ' echo Please rename it back to %(basename)s and try again.\n' ' exec false\n' 'fi\n' - ) % locals() if not self.dry_run: @@ -325,9 +283,12 @@ class bdist_egg(Command): f.close() return 'a' - def copy_metadata_to(self, target_dir): - prefix = os.path.join(self.egg_info,'') + "Copy metadata (egg info) to the target_dir" + # normalize the path (so that a forward-slash in egg_info will + # match using startswith below) + norm_egg_info = os.path.normpath(self.egg_info) + prefix = os.path.join(norm_egg_info, '') for path in self.ei_cmd.filelist.files: if path.startswith(prefix): target = os.path.join(target_dir, path[len(prefix):]) @@ -340,23 +301,24 @@ class bdist_egg(Command): all_outputs = [] ext_outputs = [] - paths = {self.bdist_dir:''} + paths = {self.bdist_dir: ''} for base, dirs, files in os.walk(self.bdist_dir): for filename in files: if os.path.splitext(filename)[1].lower() in NATIVE_EXTENSIONS: - all_outputs.append(paths[base]+filename) + all_outputs.append(paths[base] + filename) for filename in dirs: - paths[os.path.join(base,filename)] = paths[base]+filename+'/' + paths[os.path.join(base, filename)] = (paths[base] + + filename + '/') if self.distribution.has_ext_modules(): build_cmd = self.get_finalized_command('build_ext') for ext in build_cmd.extensions: - if isinstance(ext,Library): + if isinstance(ext, Library): continue fullname = build_cmd.get_ext_fullname(ext.name) filename = build_cmd.get_ext_filename(fullname) if not os.path.basename(filename).startswith('dl-'): - if os.path.exists(os.path.join(self.bdist_dir,filename)): + if os.path.exists(os.path.join(self.bdist_dir, filename)): ext_outputs.append(filename) return all_outputs, ext_outputs @@ -365,24 +327,24 @@ class bdist_egg(Command): NATIVE_EXTENSIONS = dict.fromkeys('.dll .so .dylib .pyd'.split()) - - def walk_egg(egg_dir): """Walk an unpacked egg's contents, skipping the metadata directory""" walker = os.walk(egg_dir) - base,dirs,files = walker.next() + base, dirs, files = next(walker) if 'EGG-INFO' in dirs: dirs.remove('EGG-INFO') - yield base,dirs,files + yield base, dirs, files for bdf in walker: yield bdf + def analyze_egg(egg_dir, stubs): # check for existing flag in EGG-INFO - for flag,fn in safety_flags.items(): - if os.path.exists(os.path.join(egg_dir,'EGG-INFO',fn)): + for flag, fn in safety_flags.items(): + if os.path.exists(os.path.join(egg_dir, 'EGG-INFO', fn)): return flag - if not can_scan(): return False + if not can_scan(): + return False safe = True for base, dirs, files in walk_egg(egg_dir): for name in files: @@ -393,31 +355,42 @@ def analyze_egg(egg_dir, stubs): safe = scan_module(egg_dir, base, name, stubs) and safe return safe + def write_safety_flag(egg_dir, safe): # Write or remove zip safety flag file(s) - for flag,fn in safety_flags.items(): + for flag, fn in safety_flags.items(): fn = os.path.join(egg_dir, fn) if os.path.exists(fn): - if safe is None or bool(safe)!=flag: + if safe is None or bool(safe) != flag: os.unlink(fn) - elif safe is not None and bool(safe)==flag: - f=open(fn,'wb'); f.write('\n'); f.close() + elif safe is not None and bool(safe) == flag: + f = open(fn, 'wt') + f.write('\n') + f.close() + safety_flags = { True: 'zip-safe', False: 'not-zip-safe', } + def scan_module(egg_dir, base, name, stubs): """Check whether module possibly uses unsafe-for-zipfile stuff""" - filename = os.path.join(base,name) + filename = os.path.join(base, name) if filename[:-1] in stubs: - return True # Extension module - pkg = 
base[len(egg_dir)+1:].replace(os.sep,'.') - module = pkg+(pkg and '.' or '')+os.path.splitext(name)[0] - f = open(filename,'rb'); f.read(8) # skip magic & date - code = marshal.load(f); f.close() + return True # Extension module + pkg = base[len(egg_dir) + 1:].replace(os.sep, '.') + module = pkg + (pkg and '.' or '') + os.path.splitext(name)[0] + if sys.version_info < (3, 3): + skip = 8 # skip magic & date + else: + skip = 12 # skip magic & date & file size + f = open(filename, 'rb') + f.read(skip) + code = marshal.load(f) + f.close() safe = True symbols = dict.fromkeys(iter_symbols(code)) for bad in ['__file__', '__path__']: @@ -433,22 +406,21 @@ def scan_module(egg_dir, base, name, stubs): if bad in symbols: log.warn("%s: module MAY be using inspect.%s", module, bad) safe = False - if '__name__' in symbols and '__main__' in symbols and '.' not in module: - if sys.version[:3]=="2.4": # -m works w/zipfiles in 2.5 - log.warn("%s: top-level module may be 'python -m' script", module) - safe = False return safe + def iter_symbols(code): """Yield names and strings used by `code` and its nested code objects""" - for name in code.co_names: yield name + for name in code.co_names: + yield name for const in code.co_consts: - if isinstance(const,basestring): + if isinstance(const, six.string_types): yield const - elif isinstance(const,CodeType): + elif isinstance(const, CodeType): for name in iter_symbols(const): yield name + def can_scan(): if not sys.platform.startswith('java') and sys.platform != 'cli': # CPython, PyPy, etc. @@ -458,38 +430,6 @@ def can_scan(): " setting (either True or False) in the package's setup.py") - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - # Attribute names of options for commands that might need to be convinced to # install to the egg build directory @@ -497,9 +437,9 @@ INSTALL_DIRECTORY_ATTRS = [ 'install_lib', 'install_dir', 'install_data', 'install_base' ] -def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=None, - mode='w' -): + +def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=True, + mode='w'): """Create a zip file from all the files under 'base_dir'. The output zip file will be named 'base_dir' + ".zip". Uses either the "zipfile" Python module (if available) or the InfoZIP "zip" utility (if installed @@ -507,6 +447,7 @@ def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=None, raises DistutilsExecError. Returns the name of the output zip file. 
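# Minimal sketch of the header skip in scan_module above: before Python 3.3 a
# .pyc starts with an 8-byte header (magic + mtime); 3.3+ adds a source-size
# field, making it 12 bytes, after which marshal.load() yields the code object.
# (The path is hypothetical, and later Python versions grew the header further.)
import sys
import marshal

def load_pyc_code(pyc_path):
    skip = 8 if sys.version_info < (3, 3) else 12
    with open(pyc_path, 'rb') as f:
        f.read(skip)
        return marshal.load(f)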
""" import zipfile + mkpath(os.path.dirname(zip_filename), dry_run=dry_run) log.info("creating '%s' and adding '%s' to it", zip_filename, base_dir) @@ -514,20 +455,18 @@ def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=None, for name in names: path = os.path.normpath(os.path.join(dirname, name)) if os.path.isfile(path): - p = path[len(base_dir)+1:] + p = path[len(base_dir) + 1:] if not dry_run: z.write(path, p) - log.debug("adding '%s'" % p) - - if compress is None: - compress = (sys.version>="2.4") # avoid 2.3 zipimport bug when 64 bits + log.debug("adding '%s'", p) - compression = [zipfile.ZIP_STORED, zipfile.ZIP_DEFLATED][bool(compress)] + compression = zipfile.ZIP_DEFLATED if compress else zipfile.ZIP_STORED if not dry_run: z = zipfile.ZipFile(zip_filename, mode, compression=compression) - os.path.walk(base_dir, visit, z) + for dirname, dirs, files in os.walk(base_dir): + visit(z, dirname, files) z.close() else: - os.path.walk(base_dir, visit, None) + for dirname, dirs, files in os.walk(base_dir): + visit(None, dirname, files) return zip_filename -# diff --git a/setuptools/command/bdist_rpm.py b/setuptools/command/bdist_rpm.py index 8c48da3..7073092 100755 --- a/setuptools/command/bdist_rpm.py +++ b/setuptools/command/bdist_rpm.py @@ -1,51 +1,30 @@ -# This is just a kludge so that bdist_rpm doesn't guess wrong about the -# distribution name and version, if the egg_info command is going to alter -# them, another kludge to allow you to build old-style non-egg RPMs, and -# finally, a kludge to track .rpm files for uploading when run on Python <2.5. +import distutils.command.bdist_rpm as orig -from distutils.command.bdist_rpm import bdist_rpm as _bdist_rpm -import sys, os -class bdist_rpm(_bdist_rpm): +class bdist_rpm(orig.bdist_rpm): + """ + Override the default bdist_rpm behavior to do the following: - def initialize_options(self): - _bdist_rpm.initialize_options(self) - self.no_egg = None - - if sys.version<"2.5": - # Track for uploading any .rpm file(s) moved to self.dist_dir - def move_file(self, src, dst, level=1): - _bdist_rpm.move_file(self, src, dst, level) - if dst==self.dist_dir and src.endswith('.rpm'): - getattr(self.distribution,'dist_files',[]).append( - ('bdist_rpm', - src.endswith('.src.rpm') and 'any' or sys.version[:3], - os.path.join(dst, os.path.basename(src))) - ) + 1. Run egg_info to ensure the name and version are properly calculated. + 2. Always run 'install' using --single-version-externally-managed to + disable eggs in RPM distributions. + 3. Replace dash with underscore in the version numbers for better RPM + compatibility. 
+ """ def run(self): - self.run_command('egg_info') # ensure distro name is up-to-date - _bdist_rpm.run(self) - - - - - - - - - - - + # ensure distro name is up-to-date + self.run_command('egg_info') + orig.bdist_rpm.run(self) def _make_spec_file(self): version = self.distribution.get_version() - rpmversion = version.replace('-','_') - spec = _bdist_rpm._make_spec_file(self) - line23 = '%define version '+version - line24 = '%define version '+rpmversion - spec = [ + rpmversion = version.replace('-', '_') + spec = orig.bdist_rpm._make_spec_file(self) + line23 = '%define version ' + version + line24 = '%define version ' + rpmversion + spec = [ line.replace( "Source0: %{name}-%{version}.tar", "Source0: %{name}-%{unmangled_version}.tar" @@ -55,28 +34,10 @@ class bdist_rpm(_bdist_rpm): ).replace( "%setup", "%setup -n %{name}-%{unmangled_version}" - ).replace(line23,line24) + ).replace(line23, line24) for line in spec ] - spec.insert(spec.index(line24)+1, "%define unmangled_version "+version) + insert_loc = spec.index(line24) + 1 + unmangled_version = "%define unmangled_version " + version + spec.insert(insert_loc, unmangled_version) return spec - - - - - - - - - - - - - - - - - - - - diff --git a/setuptools/command/bdist_wininst.py b/setuptools/command/bdist_wininst.py index e8521f8..073de97 100755 --- a/setuptools/command/bdist_wininst.py +++ b/setuptools/command/bdist_wininst.py @@ -1,82 +1,21 @@ -from distutils.command.bdist_wininst import bdist_wininst as _bdist_wininst -import os, sys +import distutils.command.bdist_wininst as orig -class bdist_wininst(_bdist_wininst): - _good_upload = _bad_upload = None - def create_exe(self, arcname, fullname, bitmap=None): - _bdist_wininst.create_exe(self, arcname, fullname, bitmap) - installer_name = self.get_installer_filename(fullname) - if self.target_version: - pyversion = self.target_version - # fix 2.5+ bdist_wininst ignoring --target-version spec - self._bad_upload = ('bdist_wininst', 'any', installer_name) - else: - pyversion = 'any' - self._good_upload = ('bdist_wininst', pyversion, installer_name) - - def _fix_upload_names(self): - good, bad = self._good_upload, self._bad_upload - dist_files = getattr(self.distribution, 'dist_files', []) - if bad in dist_files: - dist_files.remove(bad) - if good not in dist_files: - dist_files.append(good) - - def reinitialize_command (self, command, reinit_subcommands=0): +class bdist_wininst(orig.bdist_wininst): + def reinitialize_command(self, command, reinit_subcommands=0): + """ + Supplement reinitialize_command to work around + http://bugs.python.org/issue20819 + """ cmd = self.distribution.reinitialize_command( command, reinit_subcommands) if command in ('install', 'install_lib'): - cmd.install_lib = None # work around distutils bug + cmd.install_lib = None return cmd def run(self): self._is_running = True try: - _bdist_wininst.run(self) - self._fix_upload_names() + orig.bdist_wininst.run(self) finally: self._is_running = False - - - if not hasattr(_bdist_wininst, 'get_installer_filename'): - def get_installer_filename(self, fullname): - # Factored out to allow overriding in subclasses - if self.target_version: - # if we create an installer for a specific python version, - # it's better to include this in the name - installer_name = os.path.join(self.dist_dir, - "%s.win32-py%s.exe" % - (fullname, self.target_version)) - else: - installer_name = os.path.join(self.dist_dir, - "%s.win32.exe" % fullname) - return installer_name - # get_installer_filename() - - - - - - - - - - - - - - - - - - - - - - - - - - 
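# Toy illustration of point 3 in the bdist_rpm docstring and the
# _make_spec_file hunk above: RPM forbids dashes in Version, so the spec uses a
# mangled value plus an %define unmangled_version line carrying the original
# (the version shown is invented).
version = '1.0-2'
rpmversion = version.replace('-', '_')
spec_lines = [
    '%define version ' + rpmversion,          # 1.0_2, RPM-safe
    '%define unmangled_version ' + version,   # 1.0-2, used for Source0 and %setup
]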
- diff --git a/setuptools/command/build_clib.py b/setuptools/command/build_clib.py new file mode 100644 index 0000000..09caff6 --- /dev/null +++ b/setuptools/command/build_clib.py @@ -0,0 +1,98 @@ +import distutils.command.build_clib as orig +from distutils.errors import DistutilsSetupError +from distutils import log +from setuptools.dep_util import newer_pairwise_group + + +class build_clib(orig.build_clib): + """ + Override the default build_clib behaviour to do the following: + + 1. Implement a rudimentary timestamp-based dependency system + so 'compile()' doesn't run every time. + 2. Add more keys to the 'build_info' dictionary: + * obj_deps - specify dependencies for each object compiled. + this should be a dictionary mapping a key + with the source filename to a list of + dependencies. Use an empty string for global + dependencies. + * cflags - specify a list of additional flags to pass to + the compiler. + """ + + def build_libraries(self, libraries): + for (lib_name, build_info) in libraries: + sources = build_info.get('sources') + if sources is None or not isinstance(sources, (list, tuple)): + raise DistutilsSetupError( + "in 'libraries' option (library '%s'), " + "'sources' must be present and must be " + "a list of source filenames" % lib_name) + sources = list(sources) + + log.info("building '%s' library", lib_name) + + # Make sure everything is the correct type. + # obj_deps should be a dictionary of keys as sources + # and a list/tuple of files that are its dependencies. + obj_deps = build_info.get('obj_deps', dict()) + if not isinstance(obj_deps, dict): + raise DistutilsSetupError( + "in 'libraries' option (library '%s'), " + "'obj_deps' must be a dictionary of " + "type 'source: list'" % lib_name) + dependencies = [] + + # Get the global dependencies that are specified by the '' key. + # These will go into every source's dependency list. + global_deps = obj_deps.get('', list()) + if not isinstance(global_deps, (list, tuple)): + raise DistutilsSetupError( + "in 'libraries' option (library '%s'), " + "'obj_deps' must be a dictionary of " + "type 'source: list'" % lib_name) + + # Build the list to be used by newer_pairwise_group + # each source will be auto-added to its dependencies. + for source in sources: + src_deps = [source] + src_deps.extend(global_deps) + extra_deps = obj_deps.get(source, list()) + if not isinstance(extra_deps, (list, tuple)): + raise DistutilsSetupError( + "in 'libraries' option (library '%s'), " + "'obj_deps' must be a dictionary of " + "type 'source: list'" % lib_name) + src_deps.extend(extra_deps) + dependencies.append(src_deps) + + expected_objects = self.compiler.object_filenames( + sources, + output_dir=self.build_temp + ) + + if newer_pairwise_group(dependencies, expected_objects) != ([], []): + # First, compile the source code to object files in the library + # directory. (This should probably change to putting object + # files in a temporary build directory.) + macros = build_info.get('macros') + include_dirs = build_info.get('include_dirs') + cflags = build_info.get('cflags') + objects = self.compiler.compile( + sources, + output_dir=self.build_temp, + macros=macros, + include_dirs=include_dirs, + extra_postargs=cflags, + debug=self.debug + ) + + # Now "link" the object files together into a static library. + # (On Unix at least, this isn't really linking -- it just + # builds an archive. Whatever.) 
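# Hypothetical setup.py fragment exercising the extra build_info keys the
# build_clib docstring above describes; every project name and path here is
# invented for illustration only.
from setuptools import setup

setup(
    name='example',
    libraries=[
        ('foo', {
            'sources': ['foo/foo.c', 'foo/util.c'],
            'obj_deps': {
                '': ['foo/common.h'],         # global deps, added to every source
                'foo/foo.c': ['foo/foo.h'],   # per-source dependencies
            },
            'cflags': ['-O2'],
            'include_dirs': ['foo'],
        }),
    ],
)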
+ self.compiler.create_static_lib( + expected_objects, + lib_name, + output_dir=self.build_clib, + debug=self.debug + ) diff --git a/setuptools/command/build_ext.py b/setuptools/command/build_ext.py index f6f3355..c2fd870 100644 --- a/setuptools/command/build_ext.py +++ b/setuptools/command/build_ext.py @@ -1,19 +1,47 @@ +import os +import sys +import itertools +import imp from distutils.command.build_ext import build_ext as _du_build_ext -try: - # Attempt to use Pyrex for building extensions, if available - from Pyrex.Distutils.build_ext import build_ext as _build_ext -except ImportError: - _build_ext = _du_build_ext - -import os, sys from distutils.file_util import copy_file -from setuptools.extension import Library from distutils.ccompiler import new_compiler from distutils.sysconfig import customize_compiler, get_config_var -get_config_var("LDSHARED") # make sure _config_vars is initialized -from distutils.sysconfig import _config_vars +from distutils.errors import DistutilsError from distutils import log -from distutils.errors import * + +from setuptools.extension import Library +import six + +try: + # Attempt to use Cython for building extensions, if available + from Cython.Distutils.build_ext import build_ext as _build_ext +except ImportError: + _build_ext = _du_build_ext + +# make sure _config_vars is initialized +get_config_var("LDSHARED") +from distutils.sysconfig import _config_vars as _CONFIG_VARS + + +def _customize_compiler_for_shlib(compiler): + if sys.platform == "darwin": + # building .dylib requires additional compiler flags on OSX; here we + # temporarily substitute the pyconfig.h variables so that distutils' + # 'customize_compiler' uses them before we build the shared libraries. + tmp = _CONFIG_VARS.copy() + try: + # XXX Help! I don't have any idea whether these are right... 
+ _CONFIG_VARS['LDSHARED'] = ( + "gcc -Wl,-x -dynamiclib -undefined dynamic_lookup") + _CONFIG_VARS['CCSHARED'] = " -dynamiclib" + _CONFIG_VARS['SO'] = ".dylib" + customize_compiler(compiler) + finally: + _CONFIG_VARS.clear() + _CONFIG_VARS.update(tmp) + else: + customize_compiler(compiler) + have_rtld = False use_stubs = False @@ -23,20 +51,21 @@ if sys.platform == "darwin": use_stubs = True elif os.name != 'nt': try: - from dl import RTLD_NOW - have_rtld = True - use_stubs = True + import dl + use_stubs = have_rtld = hasattr(dl, 'RTLD_NOW') except ImportError: pass -def if_dl(s): - if have_rtld: - return s - return '' - - +if_dl = lambda s: s if have_rtld else '' +def get_abi3_suffix(): + """Return the file extension for an abi3-compliant Extension()""" + for suffix, _, _ in (s for s in imp.get_suffixes() if s[2] == imp.C_EXTENSION): + if '.abi3' in suffix: # Unix + return suffix + elif suffix == '.pyd': # Windows + return suffix class build_ext(_build_ext): @@ -56,8 +85,9 @@ class build_ext(_build_ext): modpath = fullname.split('.') package = '.'.join(modpath[:-1]) package_dir = build_py.get_package_dir(package) - dest_filename = os.path.join(package_dir,os.path.basename(filename)) - src_filename = os.path.join(self.build_lib,filename) + dest_filename = os.path.join(package_dir, + os.path.basename(filename)) + src_filename = os.path.join(self.build_lib, filename) # Always copy, even if source is older than destination, to ensure # that the right extensions for the current Python/platform are @@ -69,27 +99,25 @@ class build_ext(_build_ext): if ext._needs_stub: self.write_stub(package_dir or os.curdir, ext, True) - - if _build_ext is not _du_build_ext and not hasattr(_build_ext,'pyrex_sources'): - # Workaround for problems using some Pyrex versions w/SWIG and/or 2.4 - def swig_sources(self, sources, *otherargs): - # first do any Pyrex processing - sources = _build_ext.swig_sources(self, sources) or sources - # Then do any actual SWIG stuff on the remainder - return _du_build_ext.swig_sources(self, sources, *otherargs) - - - def get_ext_filename(self, fullname): - filename = _build_ext.get_ext_filename(self,fullname) + filename = _build_ext.get_ext_filename(self, fullname) if fullname in self.ext_map: ext = self.ext_map[fullname] - if isinstance(ext,Library): + use_abi3 = ( + six.PY3 + and getattr(ext, 'py_limited_api') + and get_abi3_suffix() + ) + if use_abi3: + so_ext = _get_config_var_837('EXT_SUFFIX') + filename = filename[:-len(so_ext)] + filename = filename + get_abi3_suffix() + if isinstance(ext, Library): fn, ext = os.path.splitext(filename) - return self.shlib_compiler.library_filename(fn,libtype) + return self.shlib_compiler.library_filename(fn, libtype) elif use_stubs and ext._links_to_dynamic: - d,fn = os.path.split(filename) - return os.path.join(d,'dl-'+fn) + d, fn = os.path.split(filename) + return os.path.join(d, 'dl-' + fn) return filename def initialize_options(self): @@ -103,7 +131,7 @@ class build_ext(_build_ext): self.extensions = self.extensions or [] self.check_extensions_list(self.extensions) self.shlibs = [ext for ext in self.extensions - if isinstance(ext,Library)] + if isinstance(ext, Library)] if self.shlibs: self.setup_shlib_compiler() for ext in self.extensions: @@ -111,11 +139,17 @@ class build_ext(_build_ext): for ext in self.extensions: fullname = ext._full_name self.ext_map[fullname] = ext - ltd = ext._links_to_dynamic = \ - self.shlibs and self.links_to_dynamic(ext) or False - ext._needs_stub = ltd and use_stubs and not isinstance(ext,Library) + + # 
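# Rough illustration of the get_ext_filename change above for py_limited_api
# extensions: drop the interpreter-specific EXT_SUFFIX and append the stable
# ABI suffix instead (the suffix values shown are typical examples only).
filename = '_speedups.cpython-36m-x86_64-linux-gnu.so'
so_ext = '.cpython-36m-x86_64-linux-gnu.so'   # sysconfig EXT_SUFFIX
abi3_suffix = '.abi3.so'                      # as reported by imp.get_suffixes()
filename = filename[:-len(so_ext)] + abi3_suffix   # '_speedups.abi3.so'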
distutils 3.1 will also ask for module names + # XXX what to do with conflicts? + self.ext_map[fullname.split('.')[-1]] = ext + + ltd = self.shlibs and self.links_to_dynamic(ext) or False + ns = ltd and use_stubs and not isinstance(ext, Library) + ext._links_to_dynamic = ltd + ext._needs_stub = ns filename = ext._file_name = self.get_ext_filename(fullname) - libdir = os.path.dirname(os.path.join(self.build_lib,filename)) + libdir = os.path.dirname(os.path.join(self.build_lib, filename)) if ltd and libdir not in ext.library_dirs: ext.library_dirs.append(libdir) if ltd and use_stubs and os.curdir not in ext.runtime_library_dirs: @@ -125,25 +159,13 @@ class build_ext(_build_ext): compiler = self.shlib_compiler = new_compiler( compiler=self.compiler, dry_run=self.dry_run, force=self.force ) - if sys.platform == "darwin": - tmp = _config_vars.copy() - try: - # XXX Help! I don't have any idea whether these are right... - _config_vars['LDSHARED'] = "gcc -Wl,-x -dynamiclib -undefined dynamic_lookup" - _config_vars['CCSHARED'] = " -dynamiclib" - _config_vars['SO'] = ".dylib" - customize_compiler(compiler) - finally: - _config_vars.clear() - _config_vars.update(tmp) - else: - customize_compiler(compiler) + _customize_compiler_for_shlib(compiler) if self.include_dirs is not None: compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples - for (name,value) in self.define: + for (name, value) in self.define: compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: @@ -160,23 +182,21 @@ class build_ext(_build_ext): # hack so distutils' build_extension() builds a library instead compiler.link_shared_object = link_shared_object.__get__(compiler) - - def get_export_symbols(self, ext): - if isinstance(ext,Library): + if isinstance(ext, Library): return ext.export_symbols - return _build_ext.get_export_symbols(self,ext) + return _build_ext.get_export_symbols(self, ext) def build_extension(self, ext): + ext._convert_pyx_sources_to_lang() _compiler = self.compiler try: - if isinstance(ext,Library): + if isinstance(ext, Library): self.compiler = self.shlib_compiler - _build_ext.build_extension(self,ext) + _build_ext.build_extension(self, ext) if ext._needs_stub: - self.write_stub( - self.get_finalized_command('build_py').build_lib, ext - ) + cmd = self.get_finalized_command('build_py').build_lib + self.write_stub(cmd, ext) finally: self.compiler = _compiler @@ -186,54 +206,66 @@ class build_ext(_build_ext): # XXX as dynamic, and not just using a locally-found version or a # XXX static-compiled version libnames = dict.fromkeys([lib._full_name for lib in self.shlibs]) - pkg = '.'.join(ext._full_name.split('.')[:-1]+['']) - for libname in ext.libraries: - if pkg+libname in libnames: return True - return False + pkg = '.'.join(ext._full_name.split('.')[:-1] + ['']) + return any(pkg + libname in libnames for libname in ext.libraries) def get_outputs(self): - outputs = _build_ext.get_outputs(self) - optimize = self.get_finalized_command('build_py').optimize - for ext in self.extensions: - if ext._needs_stub: - base = os.path.join(self.build_lib, *ext._full_name.split('.')) - outputs.append(base+'.py') - outputs.append(base+'.pyc') - if optimize: - outputs.append(base+'.pyo') - return outputs + return _build_ext.get_outputs(self) + self.__get_stubs_outputs() + + def __get_stubs_outputs(self): + # assemble the base name for each extension that needs a stub + ns_ext_bases = ( + os.path.join(self.build_lib, 
*ext._full_name.split('.')) + for ext in self.extensions + if ext._needs_stub + ) + # pair each base with the extension + pairs = itertools.product(ns_ext_bases, self.__get_output_extensions()) + return list(base + fnext for base, fnext in pairs) + + def __get_output_extensions(self): + yield '.py' + yield '.pyc' + if self.get_finalized_command('build_py').optimize: + yield '.pyo' def write_stub(self, output_dir, ext, compile=False): - log.info("writing stub loader for %s to %s",ext._full_name, output_dir) - stub_file = os.path.join(output_dir, *ext._full_name.split('.'))+'.py' + log.info("writing stub loader for %s to %s", ext._full_name, + output_dir) + stub_file = (os.path.join(output_dir, *ext._full_name.split('.')) + + '.py') if compile and os.path.exists(stub_file): - raise DistutilsError(stub_file+" already exists! Please delete.") + raise DistutilsError(stub_file + " already exists! Please delete.") if not self.dry_run: - f = open(stub_file,'w') - f.write('\n'.join([ - "def __bootstrap__():", - " global __bootstrap__, __file__, __loader__", - " import sys, os, pkg_resources, imp"+if_dl(", dl"), - " __file__ = pkg_resources.resource_filename(__name__,%r)" - % os.path.basename(ext._file_name), - " del __bootstrap__", - " if '__loader__' in globals():", - " del __loader__", - if_dl(" old_flags = sys.getdlopenflags()"), - " old_dir = os.getcwd()", - " try:", - " os.chdir(os.path.dirname(__file__))", - if_dl(" sys.setdlopenflags(dl.RTLD_NOW)"), - " imp.load_dynamic(__name__,__file__)", - " finally:", - if_dl(" sys.setdlopenflags(old_flags)"), - " os.chdir(old_dir)", - "__bootstrap__()", - "" # terminal \n - ])) + f = open(stub_file, 'w') + f.write( + '\n'.join([ + "def __bootstrap__():", + " global __bootstrap__, __file__, __loader__", + " import sys, os, pkg_resources, imp" + if_dl(", dl"), + " __file__ = pkg_resources.resource_filename" + "(__name__,%r)" + % os.path.basename(ext._file_name), + " del __bootstrap__", + " if '__loader__' in globals():", + " del __loader__", + if_dl(" old_flags = sys.getdlopenflags()"), + " old_dir = os.getcwd()", + " try:", + " os.chdir(os.path.dirname(__file__))", + if_dl(" sys.setdlopenflags(dl.RTLD_NOW)"), + " imp.load_dynamic(__name__,__file__)", + " finally:", + if_dl(" sys.setdlopenflags(old_flags)"), + " os.chdir(old_dir)", + "__bootstrap__()", + "" # terminal \n + ]) + ) f.close() if compile: from distutils.util import byte_compile + byte_compile([stub_file], optimize=0, force=True, dry_run=self.dry_run) optimize = self.get_finalized_command('install_lib').optimize @@ -244,14 +276,15 @@ class build_ext(_build_ext): os.unlink(stub_file) -if use_stubs or os.name=='nt': +if use_stubs or os.name == 'nt': # Build shared libraries # - def link_shared_object(self, objects, output_libname, output_dir=None, - libraries=None, library_dirs=None, runtime_library_dirs=None, - export_symbols=None, debug=0, extra_preargs=None, - extra_postargs=None, build_temp=None, target_lang=None - ): self.link( + def link_shared_object( + self, objects, output_libname, output_dir=None, libraries=None, + library_dirs=None, runtime_library_dirs=None, export_symbols=None, + debug=0, extra_preargs=None, extra_postargs=None, build_temp=None, + target_lang=None): + self.link( self.SHARED_LIBRARY, objects, output_libname, output_dir, libraries, library_dirs, runtime_library_dirs, export_symbols, debug, extra_preargs, extra_postargs, @@ -261,19 +294,19 @@ else: # Build static libraries everywhere else libtype = 'static' - def link_shared_object(self, objects, output_libname, 
output_dir=None, - libraries=None, library_dirs=None, runtime_library_dirs=None, - export_symbols=None, debug=0, extra_preargs=None, - extra_postargs=None, build_temp=None, target_lang=None - ): + def link_shared_object( + self, objects, output_libname, output_dir=None, libraries=None, + library_dirs=None, runtime_library_dirs=None, export_symbols=None, + debug=0, extra_preargs=None, extra_postargs=None, build_temp=None, + target_lang=None): # XXX we need to either disallow these attrs on Library instances, - # or warn/abort here if set, or something... - #libraries=None, library_dirs=None, runtime_library_dirs=None, - #export_symbols=None, extra_preargs=None, extra_postargs=None, - #build_temp=None + # or warn/abort here if set, or something... + # libraries=None, library_dirs=None, runtime_library_dirs=None, + # export_symbols=None, extra_preargs=None, extra_postargs=None, + # build_temp=None - assert output_dir is None # distutils build_ext doesn't pass this - output_dir,filename = os.path.split(output_libname) + assert output_dir is None # distutils build_ext doesn't pass this + output_dir, filename = os.path.split(output_libname) basename, ext = os.path.splitext(filename) if self.library_filename("x").startswith('lib'): # strip 'lib' prefix; this is kludgy if some platform uses @@ -285,3 +318,11 @@ else: ) +def _get_config_var_837(name): + """ + In https://github.com/pypa/setuptools/pull/837, we discovered + Python 3.3.0 exposes the extension suffix under the name 'SO'. + """ + if sys.version_info < (3, 3, 1): + name = 'SO' + return get_config_var(name) diff --git a/setuptools/command/build_py.py b/setuptools/command/build_py.py index 79570bc..56daa2b 100644 --- a/setuptools/command/build_py.py +++ b/setuptools/command/build_py.py @@ -1,9 +1,26 @@ -import os.path, sys, fnmatch -from distutils.command.build_py import build_py as _build_py -from distutils.util import convert_path from glob import glob +from distutils.util import convert_path +import distutils.command.build_py as orig +import os +import fnmatch +import textwrap +import io +import distutils.errors +import itertools + +import six +from six.moves import map, filter, filterfalse + +try: + from setuptools.lib2to3_ex import Mixin2to3 +except ImportError: -class build_py(_build_py): + class Mixin2to3: + def run_2to3(self, files, doctests=True): + "do nothing" + + +class build_py(orig.build_py, Mixin2to3): """Enhanced 'build_py' command that includes data files with packages The data files are specified via a 'package_data' argument to 'setup()'. @@ -12,11 +29,16 @@ class build_py(_build_py): Also, this version of the 'build_py' command allows you to specify both 'py_modules' and 'packages' in the same setup operation. 
""" + def finalize_options(self): - _build_py.finalize_options(self) + orig.build_py.finalize_options(self) self.package_data = self.distribution.package_data - self.exclude_package_data = self.distribution.exclude_package_data or {} - if 'data_files' in self.__dict__: del self.__dict__['data_files'] + self.exclude_package_data = (self.distribution.exclude_package_data or + {}) + if 'data_files' in self.__dict__: + del self.__dict__['data_files'] + self.__updated_files = [] + self.__doctests_2to3 = [] def run(self): """Build modules, packages, and copy data files to build directory""" @@ -30,55 +52,79 @@ class build_py(_build_py): self.build_packages() self.build_package_data() + self.run_2to3(self.__updated_files, False) + self.run_2to3(self.__updated_files, True) + self.run_2to3(self.__doctests_2to3, True) + # Only compile actual .py files, using our base class' idea of what our # output files are. - self.byte_compile(_build_py.get_outputs(self, include_bytecode=0)) - - def __getattr__(self,attr): - if attr=='data_files': # lazily compute data files - self.data_files = files = self._get_data_files(); return files - return _build_py.__getattr__(self,attr) + self.byte_compile(orig.build_py.get_outputs(self, include_bytecode=0)) + + def __getattr__(self, attr): + "lazily compute data files" + if attr == 'data_files': + self.data_files = self._get_data_files() + return self.data_files + return orig.build_py.__getattr__(self, attr) + + def build_module(self, module, module_file, package): + if six.PY2 and isinstance(package, six.string_types): + # avoid errors on Python 2 when unicode is passed (#190) + package = package.split('.') + outfile, copied = orig.build_py.build_module(self, module, module_file, + package) + if copied: + self.__updated_files.append(outfile) + return outfile, copied def _get_data_files(self): """Generate list of '(package,src_dir,build_dir,filenames)' tuples""" self.analyze_manifest() - data = [] - for package in self.packages or (): - # Locate package source directory - src_dir = self.get_package_dir(package) + return list(map(self._get_pkg_data_files, self.packages or ())) - # Compute package build directory - build_dir = os.path.join(*([self.build_lib] + package.split('.'))) + def _get_pkg_data_files(self, package): + # Locate package source directory + src_dir = self.get_package_dir(package) - # Length of path to strip from found files - plen = len(src_dir)+1 + # Compute package build directory + build_dir = os.path.join(*([self.build_lib] + package.split('.'))) - # Strip directory from globbed filenames - filenames = [ - file[plen:] for file in self.find_data_files(package, src_dir) - ] - data.append( (package, src_dir, build_dir, filenames) ) - return data + # Strip directory from globbed filenames + filenames = [ + os.path.relpath(file, src_dir) + for file in self.find_data_files(package, src_dir) + ] + return package, src_dir, build_dir, filenames def find_data_files(self, package, src_dir): """Return filenames for package's data files in 'src_dir'""" - globs = (self.package_data.get('', []) - + self.package_data.get(package, [])) - files = self.manifest_files.get(package, [])[:] - for pattern in globs: - # Each pattern has to be converted to a platform-specific path - files.extend(glob(os.path.join(src_dir, convert_path(pattern)))) + patterns = self._get_platform_patterns( + self.package_data, + package, + src_dir, + ) + globs_expanded = map(glob, patterns) + # flatten the expanded globs into an iterable of matches + globs_matches = 
itertools.chain.from_iterable(globs_expanded) + glob_files = filter(os.path.isfile, globs_matches) + files = itertools.chain( + self.manifest_files.get(package, []), + glob_files, + ) return self.exclude_data_files(package, src_dir, files) def build_package_data(self): """Copy data files into build directory""" - lastdir = None for package, src_dir, build_dir, filenames in self.data_files: for filename in filenames: target = os.path.join(build_dir, filename) self.mkpath(os.path.dirname(target)) - self.copy_file(os.path.join(src_dir, filename), target) - + srcfile = os.path.join(src_dir, filename) + outf, copied = self.copy_file(srcfile, target) + srcfile = os.path.abspath(srcfile) + if (copied and + srcfile in self.distribution.convert_2to3_doctests): + self.__doctests_2to3.append(outf) def analyze_manifest(self): self.manifest_files = mf = {} @@ -92,34 +138,20 @@ class build_py(_build_py): self.run_command('egg_info') ei_cmd = self.get_finalized_command('egg_info') for path in ei_cmd.filelist.files: - d,f = os.path.split(assert_relative(path)) + d, f = os.path.split(assert_relative(path)) prev = None oldf = f - while d and d!=prev and d not in src_dirs: + while d and d != prev and d not in src_dirs: prev = d d, df = os.path.split(d) f = os.path.join(df, f) if d in src_dirs: - if path.endswith('.py') and f==oldf: - continue # it's a module, not data - mf.setdefault(src_dirs[d],[]).append(path) - - def get_data_files(self): pass # kludge 2.4 for lazy computation - - if sys.version<"2.4": # Python 2.4 already has this code - def get_outputs(self, include_bytecode=1): - """Return complete list of files copied to the build directory - - This includes both '.py' files and data files, as well as '.pyc' - and '.pyo' files if 'include_bytecode' is true. (This method is - needed for the 'install_lib' command to do its job properly, and to - generate a correct installation manifest.) - """ - return _build_py.get_outputs(self, include_bytecode) + [ - os.path.join(build_dir, filename) - for package, src_dir, build_dir,filenames in self.data_files - for filename in filenames - ] + if path.endswith('.py') and f == oldf: + continue # it's a module, not data + mf.setdefault(src_dirs[d], []).append(path) + + def get_data_files(self): + pass # Lazily compute data files in _get_data_files() function. def check_package(self, package, package_dir): """Check namespace packages' __init__ for declare_namespace""" @@ -128,78 +160,111 @@ class build_py(_build_py): except KeyError: pass - init_py = _build_py.check_package(self, package, package_dir) + init_py = orig.build_py.check_package(self, package, package_dir) self.packages_checked[package] = init_py if not init_py or not self.distribution.namespace_packages: return init_py for pkg in self.distribution.namespace_packages: - if pkg==package or pkg.startswith(package+'.'): + if pkg == package or pkg.startswith(package + '.'): break else: return init_py - f = open(init_py,'rU') - if 'declare_namespace' not in f.read(): - from distutils import log - log.warn( - "WARNING: %s is a namespace package, but its __init__.py does\n" - "not declare_namespace(); setuptools 0.7 will REQUIRE this!\n" - '(See the setuptools manual under "Namespace Packages" for ' - "details.)\n", package + with io.open(init_py, 'rb') as f: + contents = f.read() + if b'declare_namespace' not in contents: + raise distutils.errors.DistutilsError( + "Namespace package problem: %s is a namespace package, but " + "its\n__init__.py does not call declare_namespace()! 
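# Stripped-down sketch of the lazy-attribute trick build_py.__getattr__ uses
# above: 'data_files' is computed on first access, cached on the instance, and
# never recomputed (the class and values here are illustrative only).
class LazyDataFiles(object):
    def _get_data_files(self):
        return ['computed', 'once']

    def __getattr__(self, attr):
        if attr == 'data_files':
            self.data_files = self._get_data_files()
            return self.data_files
        raise AttributeError(attr)

obj = LazyDataFiles()
obj.data_files   # first access triggers _get_data_files()
obj.data_files   # now a plain instance attribute; __getattr__ is not called again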
Please " + 'fix it.\n(See the setuptools manual under ' + '"Namespace Packages" for details.)\n"' % (package,) ) - f.close() return init_py def initialize_options(self): - self.packages_checked={} - _build_py.initialize_options(self) - - - - - + self.packages_checked = {} + orig.build_py.initialize_options(self) + def get_package_dir(self, package): + res = orig.build_py.get_package_dir(self, package) + if self.distribution.src_root is not None: + return os.path.join(self.distribution.src_root, res) + return res def exclude_data_files(self, package, src_dir, files): """Filter filenames for package's data files in 'src_dir'""" - globs = (self.exclude_package_data.get('', []) - + self.exclude_package_data.get(package, [])) - bad = [] - for pattern in globs: - bad.extend( - fnmatch.filter( - files, os.path.join(src_dir, convert_path(pattern)) - ) - ) - bad = dict.fromkeys(bad) - seen = {} - return [ - f for f in files if f not in bad - and f not in seen and seen.setdefault(f,1) # ditch dupes - ] + files = list(files) + patterns = self._get_platform_patterns( + self.exclude_package_data, + package, + src_dir, + ) + match_groups = ( + fnmatch.filter(files, pattern) + for pattern in patterns + ) + # flatten the groups of matches into an iterable of matches + matches = itertools.chain.from_iterable(match_groups) + bad = set(matches) + keepers = ( + fn + for fn in files + if fn not in bad + ) + # ditch dupes + return list(_unique_everseen(keepers)) + + @staticmethod + def _get_platform_patterns(spec, package, src_dir): + """ + yield platform-specific path patterns (suitable for glob + or fn_match) from a glob-based spec (such as + self.package_data or self.exclude_package_data) + matching package in src_dir. + """ + raw_patterns = itertools.chain( + spec.get('', []), + spec.get(package, []), + ) + return ( + # Each pattern has to be converted to a platform-specific path + os.path.join(src_dir, convert_path(pattern)) + for pattern in raw_patterns + ) + + +# from Python docs +def _unique_everseen(iterable, key=None): + "List unique elements, preserving order. Remember all elements ever seen." + # unique_everseen('AAAABBBCCDAABBB') --> A B C D + # unique_everseen('ABBCcAD', str.lower) --> A B C D + seen = set() + seen_add = seen.add + if key is None: + for element in filterfalse(seen.__contains__, iterable): + seen_add(element) + yield element + else: + for element in iterable: + k = key(element) + if k not in seen: + seen_add(k) + yield element def assert_relative(path): if not os.path.isabs(path): return path from distutils.errors import DistutilsSetupError - raise DistutilsSetupError( -"""Error: setup script specifies an absolute path: - - %s - -setup() arguments must *always* be /-separated paths relative to the -setup.py directory, *never* absolute paths. -""" % path - ) - - - - - - + msg = textwrap.dedent(""" + Error: setup script specifies an absolute path: + %s + setup() arguments must *always* be /-separated paths relative to the + setup.py directory, *never* absolute paths. 
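# Hypothetical setup() arguments showing the spec format _get_platform_patterns
# above consumes: the '' key applies to every package, named keys only to that
# package, and exclude_package_data matches are filtered out afterwards.
from setuptools import setup

setup(
    name='example',
    packages=['mypkg'],
    package_data={'': ['*.txt'], 'mypkg': ['data/*.json']},
    exclude_package_data={'mypkg': ['data/private.json']},
)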
+ """).lstrip() % path + raise DistutilsSetupError(msg) diff --git a/setuptools/command/develop.py b/setuptools/command/develop.py index f128b80..ddfdc66 100755 --- a/setuptools/command/develop.py +++ b/setuptools/command/develop.py @@ -1,11 +1,19 @@ -from setuptools.command.easy_install import easy_install from distutils.util import convert_path -from pkg_resources import Distribution, PathMetadata, normalize_path from distutils import log -from distutils.errors import * -import sys, os, setuptools, glob +from distutils.errors import DistutilsError, DistutilsOptionError +import os +import glob +import io + +import six + +from pkg_resources import Distribution, PathMetadata, normalize_path +from setuptools.command.easy_install import easy_install +from setuptools import namespaces +import setuptools -class develop(easy_install): + +class develop(namespaces.DevelopInstaller, easy_install): """Set up package for development""" description = "install package in 'development mode'" @@ -23,6 +31,7 @@ class develop(easy_install): if self.uninstall: self.multi_version = True self.uninstall_link() + self.uninstall_namespaces() else: self.install_for_development() self.warn_deprecated_options() @@ -32,81 +41,122 @@ class develop(easy_install): self.egg_path = None easy_install.initialize_options(self) self.setup_path = None - self.always_copy_from = '.' # always copy eggs installed in curdir - - - - - + self.always_copy_from = '.' # always copy eggs installed in curdir def finalize_options(self): ei = self.get_finalized_command("egg_info") if ei.broken_egg_info: - raise DistutilsError( - "Please rename %r to %r before using 'develop'" - % (ei.egg_info, ei.broken_egg_info) - ) - self.args = [ei.egg_name] + template = "Please rename %r to %r before using 'develop'" + args = ei.egg_info, ei.broken_egg_info + raise DistutilsError(template % args) + self.args = [ei.egg_name] + easy_install.finalize_options(self) + self.expand_basedirs() + self.expand_dirs() # pick up setup-dir .egg files only: no .egg-info self.package_index.scan(glob.glob('*.egg')) - self.egg_link = os.path.join(self.install_dir, ei.egg_name+'.egg-link') + egg_link_fn = ei.egg_name + '.egg-link' + self.egg_link = os.path.join(self.install_dir, egg_link_fn) self.egg_base = ei.egg_base if self.egg_path is None: self.egg_path = os.path.abspath(ei.egg_base) target = normalize_path(self.egg_base) - if normalize_path(os.path.join(self.install_dir, self.egg_path)) != target: + egg_path = normalize_path(os.path.join(self.install_dir, + self.egg_path)) + if egg_path != target: raise DistutilsOptionError( "--egg-path must be a relative path from the install" - " directory to "+target - ) - + " directory to " + target + ) + # Make a distribution for the package's source self.dist = Distribution( target, PathMetadata(target, os.path.abspath(ei.egg_info)), - project_name = ei.egg_name + project_name=ei.egg_name ) - p = self.egg_base.replace(os.sep,'/') - if p!= os.curdir: - p = '../' * (p.count('/')+1) - self.setup_path = p - p = normalize_path(os.path.join(self.install_dir, self.egg_path, p)) - if p != normalize_path(os.curdir): + self.setup_path = self._resolve_setup_path( + self.egg_base, + self.install_dir, + self.egg_path, + ) + + @staticmethod + def _resolve_setup_path(egg_base, install_dir, egg_path): + """ + Generate a path from egg_base back to '.' where the + setup script resides and ensure that path points to the + setup path from $install_dir/$egg_path. 
+ """ + path_to_setup = egg_base.replace(os.sep, '/').rstrip('/') + if path_to_setup != os.curdir: + path_to_setup = '../' * (path_to_setup.count('/') + 1) + resolved = normalize_path(os.path.join(install_dir, egg_path, path_to_setup)) + if resolved != normalize_path(os.curdir): raise DistutilsOptionError( "Can't get a consistent path to setup script from" - " installation directory", p, normalize_path(os.curdir)) + " installation directory", resolved, normalize_path(os.curdir)) + return path_to_setup def install_for_development(self): - # Ensure metadata is up-to-date - self.run_command('egg_info') - # Build extensions in-place - self.reinitialize_command('build_ext', inplace=1) - self.run_command('build_ext') + if six.PY3 and getattr(self.distribution, 'use_2to3', False): + # If we run 2to3 we can not do this inplace: + + # Ensure metadata is up-to-date + self.reinitialize_command('build_py', inplace=0) + self.run_command('build_py') + bpy_cmd = self.get_finalized_command("build_py") + build_path = normalize_path(bpy_cmd.build_lib) + + # Build extensions + self.reinitialize_command('egg_info', egg_base=build_path) + self.run_command('egg_info') + + self.reinitialize_command('build_ext', inplace=0) + self.run_command('build_ext') + + # Fixup egg-link and easy-install.pth + ei_cmd = self.get_finalized_command("egg_info") + self.egg_path = build_path + self.dist.location = build_path + # XXX + self.dist._provider = PathMetadata(build_path, ei_cmd.egg_info) + else: + # Without 2to3 inplace works fine: + self.run_command('egg_info') + + # Build extensions in-place + self.reinitialize_command('build_ext', inplace=1) + self.run_command('build_ext') + self.install_site_py() # ensure that target dir is site-safe if setuptools.bootstrap_install_from: self.easy_install(setuptools.bootstrap_install_from) setuptools.bootstrap_install_from = None + self.install_namespaces() + # create an .egg-link in the installation dir, pointing to our egg log.info("Creating %s (link to %s)", self.egg_link, self.egg_base) if not self.dry_run: - f = open(self.egg_link,"w") - f.write(self.egg_path + "\n" + self.setup_path) - f.close() + with open(self.egg_link, "w") as f: + f.write(self.egg_path + "\n" + self.setup_path) # postprocess the installed distro, fixing up .pth, installing scripts, # and handling requirements self.process_distribution(None, self.dist, not self.no_deps) - def uninstall_link(self): if os.path.exists(self.egg_link): log.info("Removing %s (link to %s)", self.egg_link, self.egg_base) - contents = [line.rstrip() for line in file(self.egg_link)] - if contents not in ([self.egg_path], [self.egg_path, self.setup_path]): + egg_link_file = open(self.egg_link) + contents = [line.rstrip() for line in egg_link_file] + egg_link_file.close() + if contents not in ([self.egg_path], + [self.egg_path, self.setup_path]): log.warn("Link points to %s: uninstall aborted", contents) return if not self.dry_run: @@ -117,48 +167,48 @@ class develop(easy_install): # XXX should also check for entry point scripts! log.warn("Note: you must uninstall or replace scripts manually!") - - - - def install_egg_scripts(self, dist): if dist is not self.dist: # Installing a dependency, so fall back to normal behavior - return easy_install.install_egg_scripts(self,dist) + return easy_install.install_egg_scripts(self, dist) # create wrapper scripts in the script dir, pointing to dist.scripts # new-style... 
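# Sketch of the .egg-link file written above: the first line is the project's
# egg path, the optional second line is the relative path from there back to
# the directory holding setup.py (file name and contents are made-up examples
# for an egg_base one level below the setup script).
with open('Example.egg-link', 'w') as f:
    f.write('/home/user/src/example' + '\n' + '../')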
- self.install_wrapper_scripts(dist) + self.install_wrapper_scripts(dist) # ...and old-style for script_name in self.distribution.scripts or []: script_path = os.path.abspath(convert_path(script_name)) script_name = os.path.basename(script_path) - f = open(script_path,'rU') - script_text = f.read() - f.close() + with io.open(script_path) as strm: + script_text = strm.read() self.install_script(dist, script_name, script_text, script_path) + def install_wrapper_scripts(self, dist): + dist = VersionlessRequirement(dist) + return easy_install.install_wrapper_scripts(self, dist) +class VersionlessRequirement(object): + """ + Adapt a pkg_resources.Distribution to simply return the project + name as the 'requirement' so that scripts will work across + multiple versions. + >>> dist = Distribution(project_name='foo', version='1.0') + >>> str(dist.as_requirement()) + 'foo==1.0' + >>> adapted_dist = VersionlessRequirement(dist) + >>> str(adapted_dist.as_requirement()) + 'foo' + """ + def __init__(self, dist): + self.__dist = dist + def __getattr__(self, name): + return getattr(self.__dist, name) - - - - - - - - - - - - - - - - + def as_requirement(self): + return self.project_name diff --git a/setuptools/command/easy_install.py b/setuptools/command/easy_install.py index af4e349..ef83f7a 100755 --- a/setuptools/command/easy_install.py +++ b/setuptools/command/easy_install.py @@ -1,5 +1,5 @@ -#!python -"""\ +#!/usr/bin/env python +""" Easy Install ------------ @@ -7,37 +7,115 @@ A tool for doing automatic download/extract/build of distutils-based Python packages. For detailed documentation, see the accompanying EasyInstall.txt file, or visit the `EasyInstall home page`__. -__ http://peak.telecommunity.com/DevCenter/EasyInstall +__ https://setuptools.readthedocs.io/en/latest/easy_install.html + """ -import sys, os.path, zipimport, shutil, tempfile, zipfile, re, stat, random + from glob import glob +from distutils.util import get_platform +from distutils.util import convert_path, subst_vars +from distutils.errors import ( + DistutilsArgError, DistutilsOptionError, + DistutilsError, DistutilsPlatformError, +) +from distutils.command.install import INSTALL_SCHEMES, SCHEME_KEYS +from distutils import log, dir_util +from distutils.command.build_scripts import first_line_re +from distutils.spawn import find_executable +import sys +import os +import zipimport +import shutil +import tempfile +import zipfile +import re +import stat +import random +import textwrap +import warnings +import site +import struct +import contextlib +import subprocess +import shlex +import io + +import six +from six.moves import configparser, map + from setuptools import Command from setuptools.sandbox import run_setup -from distutils import log, dir_util -from distutils.sysconfig import get_python_lib -from distutils.errors import DistutilsArgError, DistutilsOptionError, \ - DistutilsError +from setuptools.py31compat import get_path, get_config_vars +from setuptools.py27compat import rmtree_safe +from setuptools.command import setopt from setuptools.archive_util import unpack_archive -from setuptools.package_index import PackageIndex, parse_bdist_wininst -from setuptools.package_index import URL_SCHEME +from setuptools.package_index import ( + PackageIndex, parse_requirement_arg, URL_SCHEME, +) from setuptools.command import bdist_egg, egg_info -from pkg_resources import * -sys_executable = os.path.normpath(sys.executable) +from pkg_resources import ( + yield_lines, normalize_path, resource_string, ensure_directory, + 
get_distribution, find_distributions, Environment, Requirement, + Distribution, PathMetadata, EggMetadata, WorkingSet, DistributionNotFound, + VersionConflict, DEVELOP_DIST, +) +import pkg_resources + +# Turn on PEP440Warnings +warnings.filterwarnings("default", category=pkg_resources.PEP440Warning) __all__ = [ 'samefile', 'easy_install', 'PthDistributions', 'extract_wininst_cfg', 'main', 'get_exe_prefixes', ] -def samefile(p1,p2): - if hasattr(os.path,'samefile') and ( - os.path.exists(p1) and os.path.exists(p2) - ): - return os.path.samefile(p1,p2) - return ( - os.path.normpath(os.path.normcase(p1)) == - os.path.normpath(os.path.normcase(p2)) - ) + +def is_64bit(): + return struct.calcsize("P") == 8 + + +def samefile(p1, p2): + """ + Determine if two paths reference the same file. + + Augments os.path.samefile to work on Windows and + suppresses errors if the path doesn't exist. + """ + both_exist = os.path.exists(p1) and os.path.exists(p2) + use_samefile = hasattr(os.path, 'samefile') and both_exist + if use_samefile: + return os.path.samefile(p1, p2) + norm_p1 = os.path.normpath(os.path.normcase(p1)) + norm_p2 = os.path.normpath(os.path.normcase(p2)) + return norm_p1 == norm_p2 + + +if six.PY2: + + def _to_ascii(s): + return s + + def isascii(s): + try: + six.text_type(s, 'ascii') + return True + except UnicodeError: + return False +else: + + def _to_ascii(s): + return s.encode('ascii') + + def isascii(s): + try: + s.encode('ascii') + return True + except UnicodeError: + return False + + +_one_liner = lambda text: textwrap.dedent(text).strip().replace('\n', '; ') + class easy_install(Command): """Manage a download/build/install process""" @@ -55,32 +133,42 @@ class easy_install(Command): ("always-copy", "a", "Copy all needed packages to install dir"), ("index-url=", "i", "base URL of Python Package Index"), ("find-links=", "f", "additional URL(s) to search for packages"), - ("delete-conflicting", "D", "no longer needed; don't use this"), - ("ignore-conflicts-at-my-risk", None, - "no longer needed; don't use this"), ("build-directory=", "b", - "download/extract/build in DIR; keep the results"), + "download/extract/build in DIR; keep the results"), ('optimize=', 'O', "also compile with optimization: -O1 for \"python -O\", " "-O2 for \"python -OO\", and -O0 to disable [default: -O0]"), ('record=', None, "filename in which to record list of installed files"), ('always-unzip', 'Z', "don't install as a zipfile, no matter what"), - ('site-dirs=','S',"list of directories where .pth files work"), + ('site-dirs=', 'S', "list of directories where .pth files work"), ('editable', 'e', "Install specified packages in editable form"), ('no-deps', 'N', "don't install dependencies"), ('allow-hosts=', 'H', "pattern(s) that hostnames must match"), - ('local-snapshots-ok', 'l', "allow building eggs from local checkouts"), + ('local-snapshots-ok', 'l', + "allow building eggs from local checkouts"), + ('version', None, "print version information and exit"), + ('no-find-links', None, + "Don't load find-links defined in packages being installed") ] boolean_options = [ 'zip-ok', 'multi-version', 'exclude-scripts', 'upgrade', 'always-copy', - 'delete-conflicting', 'ignore-conflicts-at-my-risk', 'editable', - 'no-deps', 'local-snapshots-ok', + 'editable', + 'no-deps', 'local-snapshots-ok', 'version' ] + + if site.ENABLE_USER_SITE: + help_msg = "install in user site-package '%s'" % site.USER_SITE + user_options.append(('user', None, help_msg)) + boolean_options.append('user') + negative_opt = {'always-unzip': 
'zip-ok'} create_index = PackageIndex def initialize_options(self): + # the --user option seems to be an opt-in one, + # so the default should be False. + self.user = 0 self.zip_ok = self.local_snapshots_ok = None self.install_dir = self.script_dir = self.exclude_scripts = None self.index_url = None @@ -91,12 +179,26 @@ class easy_install(Command): self.upgrade = self.always_copy = self.multi_version = None self.editable = self.no_deps = self.allow_hosts = None self.root = self.prefix = self.no_report = None + self.version = None + self.install_purelib = None # for pure module distributions + self.install_platlib = None # non-pure (dists w/ extensions) + self.install_headers = None # for C/C++ headers + self.install_lib = None # set to either purelib or platlib + self.install_scripts = None + self.install_data = None + self.install_base = None + self.install_platbase = None + if site.ENABLE_USER_SITE: + self.install_userbase = site.USER_BASE + self.install_usersite = site.USER_SITE + else: + self.install_userbase = None + self.install_usersite = None + self.no_find_links = None # Options not specifiable via command line self.package_index = None self.pth_file = self.always_copy_from = None - self.delete_conflicting = None - self.ignore_conflicts_at_my_risk = None self.site_dirs = None self.installed_projects = {} self.sitepy_installed = False @@ -112,51 +214,111 @@ class easy_install(Command): ) def delete_blockers(self, blockers): - for filename in blockers: - if os.path.exists(filename) or os.path.islink(filename): - log.info("Deleting %s", filename) - if not self.dry_run: - if os.path.isdir(filename) and not os.path.islink(filename): - rmtree(filename) - else: - os.unlink(filename) + extant_blockers = ( + filename for filename in blockers + if os.path.exists(filename) or os.path.islink(filename) + ) + list(map(self._delete_path, extant_blockers)) + + def _delete_path(self, path): + log.info("Deleting %s", path) + if self.dry_run: + return + + is_tree = os.path.isdir(path) and not os.path.islink(path) + remover = rmtree if is_tree else os.unlink + remover(path) + + @staticmethod + def _render_version(): + """ + Render the Setuptools version and installation details, then exit. 
+ """ + ver = sys.version[:3] + dist = get_distribution('setuptools') + tmpl = 'setuptools {dist.version} from {dist.location} (Python {ver})' + print(tmpl.format(**locals())) + raise SystemExit() def finalize_options(self): - self._expand('install_dir','script_dir','build_directory','site_dirs') + self.version and self._render_version() + + py_version = sys.version.split()[0] + prefix, exec_prefix = get_config_vars('prefix', 'exec_prefix') + + self.config_vars = { + 'dist_name': self.distribution.get_name(), + 'dist_version': self.distribution.get_version(), + 'dist_fullname': self.distribution.get_fullname(), + 'py_version': py_version, + 'py_version_short': py_version[0:3], + 'py_version_nodot': py_version[0] + py_version[2], + 'sys_prefix': prefix, + 'prefix': prefix, + 'sys_exec_prefix': exec_prefix, + 'exec_prefix': exec_prefix, + # Only python 3.2+ has abiflags + 'abiflags': getattr(sys, 'abiflags', ''), + } + + if site.ENABLE_USER_SITE: + self.config_vars['userbase'] = self.install_userbase + self.config_vars['usersite'] = self.install_usersite + + self._fix_install_dir_for_user_site() + + self.expand_basedirs() + self.expand_dirs() + + self._expand( + 'install_dir', 'script_dir', 'build_directory', + 'site_dirs', + ) # If a non-default installation directory was specified, default the # script directory to match it. if self.script_dir is None: self.script_dir = self.install_dir + if self.no_find_links is None: + self.no_find_links = False + # Let install_dir get set by install_lib command, which in turn # gets its info from the install command, and takes into account # --prefix and --home and all that other crud. - self.set_undefined_options('install_lib', - ('install_dir','install_dir') + self.set_undefined_options( + 'install_lib', ('install_dir', 'install_dir') ) # Likewise, set default script_dir from 'install_scripts.install_dir' - self.set_undefined_options('install_scripts', - ('install_dir', 'script_dir') + self.set_undefined_options( + 'install_scripts', ('install_dir', 'script_dir') ) + + if self.user and self.install_purelib: + self.install_dir = self.install_purelib + self.script_dir = self.install_scripts # default --record from the install command self.set_undefined_options('install', ('record', 'record')) + # Should this be moved to the if statement below? 
It's not used + # elsewhere normpath = map(normalize_path, sys.path) self.all_site_dirs = get_site_dirs() if self.site_dirs is not None: site_dirs = [ - os.path.expanduser(s.strip()) for s in self.site_dirs.split(',') + os.path.expanduser(s.strip()) for s in + self.site_dirs.split(',') ] for d in site_dirs: if not os.path.isdir(d): log.warn("%s (in --site-dirs) does not exist", d) elif normalize_path(d) not in normpath: raise DistutilsOptionError( - d+" (in --site-dirs) is not on sys.path" + d + " (in --site-dirs) is not on sys.path" ) else: self.all_site_dirs.append(normalize_path(d)) - if not self.editable: self.check_site_dir() - self.index_url = self.index_url or "http://pypi.python.org/simple" + if not self.editable: + self.check_site_dir() + self.index_url = self.index_url or "https://pypi.python.org/simple" self.shadow_path = self.all_site_dirs[:] for path_item in self.install_dir, normalize_path(self.script_dir): if path_item not in self.shadow_path: @@ -168,31 +330,28 @@ class easy_install(Command): hosts = ['*'] if self.package_index is None: self.package_index = self.create_index( - self.index_url, search_path = self.shadow_path, hosts=hosts, + self.index_url, search_path=self.shadow_path, hosts=hosts, ) - self.local_index = Environment(self.shadow_path+sys.path) + self.local_index = Environment(self.shadow_path + sys.path) if self.find_links is not None: - if isinstance(self.find_links, basestring): + if isinstance(self.find_links, six.string_types): self.find_links = self.find_links.split() else: self.find_links = [] if self.local_snapshots_ok: - self.package_index.scan_egg_links(self.shadow_path+sys.path) - self.package_index.add_find_links(self.find_links) - self.set_undefined_options('install_lib', ('optimize','optimize')) - if not isinstance(self.optimize,int): + self.package_index.scan_egg_links(self.shadow_path + sys.path) + if not self.no_find_links: + self.package_index.add_find_links(self.find_links) + self.set_undefined_options('install_lib', ('optimize', 'optimize')) + if not isinstance(self.optimize, int): try: self.optimize = int(self.optimize) - if not (0 <= self.optimize <= 2): raise ValueError + if not (0 <= self.optimize <= 2): + raise ValueError except ValueError: raise DistutilsOptionError("--optimize must be 0, 1, or 2") - if self.delete_conflicting and self.ignore_conflicts_at_my_risk: - raise DistutilsOptionError( - "Can't use both --delete-conflicting and " - "--ignore-conflicts-at-my-risk at the same time" - ) if self.editable and not self.build_directory: raise DistutilsArgError( "Must specify a build directory (-b) when using --editable" @@ -203,19 +362,61 @@ class easy_install(Command): self.outputs = [] + def _fix_install_dir_for_user_site(self): + """ + Fix the install_dir if "--user" was used. 
+ """ + if not self.user or not site.ENABLE_USER_SITE: + return + + self.create_home_path() + if self.install_userbase is None: + msg = "User base directory is not specified" + raise DistutilsPlatformError(msg) + self.install_base = self.install_platbase = self.install_userbase + scheme_name = os.name.replace('posix', 'unix') + '_user' + self.select_scheme(scheme_name) + + def _expand_attrs(self, attrs): + for attr in attrs: + val = getattr(self, attr) + if val is not None: + if os.name == 'posix' or os.name == 'nt': + val = os.path.expanduser(val) + val = subst_vars(val, self.config_vars) + setattr(self, attr, val) + + def expand_basedirs(self): + """Calls `os.path.expanduser` on install_base, install_platbase and + root.""" + self._expand_attrs(['install_base', 'install_platbase', 'root']) + + def expand_dirs(self): + """Calls `os.path.expanduser` on install dirs.""" + dirs = [ + 'install_purelib', + 'install_platlib', + 'install_lib', + 'install_headers', + 'install_scripts', + 'install_data', + ] + self._expand_attrs(dirs) + def run(self): - if self.verbose!=self.distribution.verbose: + if self.verbose != self.distribution.verbose: log.set_verbosity(self.verbose) try: for spec in self.args: self.easy_install(spec, not self.no_deps) if self.record: outputs = self.outputs - if self.root: # strip any package prefix + if self.root: # strip any package prefix root_len = len(self.root) - for counter in xrange(len(outputs)): + for counter in range(len(outputs)): outputs[counter] = outputs[counter][root_len:] from distutils import file_util + self.execute( file_util.write_file, (self.record, outputs), "writing list of installed files to '%s'" % @@ -232,22 +433,18 @@ class easy_install(Command): """ try: pid = os.getpid() - except: - pid = random.randint(0,sys.maxint) + except Exception: + pid = random.randint(0, sys.maxsize) return os.path.join(self.install_dir, "test-easy-install-%s" % pid) def warn_deprecated_options(self): - if self.delete_conflicting or self.ignore_conflicts_at_my_risk: - log.warn( - "Note: The -D, --delete-conflicting and" - " --ignore-conflicts-at-my-risk no longer have any purpose" - " and should not be used." - ) + pass def check_site_dir(self): """Verify that self.install_dir is .pth-capable dir, if needed""" + instdir = normalize_path(self.install_dir) - pth_file = os.path.join(instdir,'easy-install.pth') + pth_file = os.path.join(instdir, 'easy-install.pth') # Is it a configured, PYTHONPATH, implicit, or explicit site dir? 
is_site_dir = instdir in self.all_site_dirs @@ -257,13 +454,14 @@ class easy_install(Command): is_site_dir = self.check_pth_processing() else: # make sure we can write to target dir - testfile = self.pseudo_tempname()+'.write-test' + testfile = self.pseudo_tempname() + '.write-test' test_exists = os.path.exists(testfile) try: - if test_exists: os.unlink(testfile) - open(testfile,'w').close() + if test_exists: + os.unlink(testfile) + open(testfile, 'w').close() os.unlink(testfile) - except (OSError,IOError): + except (OSError, IOError): self.cant_write_to_target() if not is_site_dir and not self.multi_version: @@ -276,82 +474,102 @@ class easy_install(Command): else: self.pth_file = None - PYTHONPATH = os.environ.get('PYTHONPATH','').split(os.pathsep) - if instdir not in map(normalize_path, filter(None,PYTHONPATH)): + PYTHONPATH = os.environ.get('PYTHONPATH', '').split(os.pathsep) + if instdir not in map(normalize_path, filter(None, PYTHONPATH)): # only PYTHONPATH dirs need a site.py, so pretend it's there self.sitepy_installed = True elif self.multi_version and not os.path.exists(pth_file): - self.sitepy_installed = True # don't need site.py in this case - self.pth_file = None # and don't create a .pth file + self.sitepy_installed = True # don't need site.py in this case + self.pth_file = None # and don't create a .pth file self.install_dir = instdir - def cant_write_to_target(self): - msg = """can't create or remove files in install directory + __cant_write_msg = textwrap.dedent(""" + can't create or remove files in install directory -The following error occurred while trying to add or remove files in the -installation directory: + The following error occurred while trying to add or remove files in the + installation directory: - %s + %s -The installation directory you specified (via --install-dir, --prefix, or -the distutils default setting) was: + The installation directory you specified (via --install-dir, --prefix, or + the distutils default setting) was: - %s -""" % (sys.exc_info()[1], self.install_dir,) + %s + """).lstrip() - if not os.path.exists(self.install_dir): - msg += """ -This directory does not currently exist. Please create it and try again, or -choose a different installation directory (using the -d or --install-dir -option). -""" - else: - msg += """ -Perhaps your account does not have write access to this directory? If the -installation directory is a system-owned directory, you may need to sign in -as the administrator or "root" account. If you do not have administrative -access to this machine, you may wish to choose a different installation -directory, preferably one that is listed in your PYTHONPATH environment -variable. + __not_exists_id = textwrap.dedent(""" + This directory does not currently exist. Please create it and try again, or + choose a different installation directory (using the -d or --install-dir + option). + """).lstrip() -For information on other options, you may wish to consult the -documentation at: + __access_msg = textwrap.dedent(""" + Perhaps your account does not have write access to this directory? If the + installation directory is a system-owned directory, you may need to sign in + as the administrator or "root" account. If you do not have administrative + access to this machine, you may wish to choose a different installation + directory, preferably one that is listed in your PYTHONPATH environment + variable. 
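``check_site_dir`` decides writability empirically rather than by inspecting permissions: create a throw-away file, remove it, and treat any ``OSError``/``IOError`` as "can't write". The essence, with a hypothetical helper name::

    import os

    def can_write(instdir):
        testfile = os.path.join(instdir, 'write-test')
        try:
            if os.path.exists(testfile):
                os.unlink(testfile)
            open(testfile, 'w').close()
            os.unlink(testfile)
            return True
        except (OSError, IOError):
            return False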
- http://peak.telecommunity.com/EasyInstall.html + For information on other options, you may wish to consult the + documentation at: -Please make the appropriate changes for your system and try again. -""" - raise DistutilsError(msg) + https://setuptools.readthedocs.io/en/latest/easy_install.html + Please make the appropriate changes for your system and try again. + """).lstrip() + def cant_write_to_target(self): + msg = self.__cant_write_msg % (sys.exc_info()[1], self.install_dir,) + if not os.path.exists(self.install_dir): + msg += '\n' + self.__not_exists_id + else: + msg += '\n' + self.__access_msg + raise DistutilsError(msg) def check_pth_processing(self): """Empirically verify whether .pth files are supported in inst. dir""" instdir = self.install_dir log.info("Checking .pth file support in %s", instdir) - pth_file = self.pseudo_tempname()+".pth" - ok_file = pth_file+'.ok' + pth_file = self.pseudo_tempname() + ".pth" + ok_file = pth_file + '.ok' ok_exists = os.path.exists(ok_file) + tmpl = _one_liner(""" + import os + f = open({ok_file!r}, 'w') + f.write('OK') + f.close() + """) + '\n' try: - if ok_exists: os.unlink(ok_file) - f = open(pth_file,'w') - except (OSError,IOError): + if ok_exists: + os.unlink(ok_file) + dirname = os.path.dirname(ok_file) + if not os.path.exists(dirname): + os.makedirs(dirname) + f = open(pth_file, 'w') + except (OSError, IOError): self.cant_write_to_target() else: try: - f.write("import os;open(%r,'w').write('OK')\n" % (ok_file,)) - f.close(); f=None + f.write(tmpl.format(**locals())) + f.close() + f = None executable = sys.executable - if os.name=='nt': - dirname,basename = os.path.split(executable) - alt = os.path.join(dirname,'pythonw.exe') - if basename.lower()=='python.exe' and os.path.exists(alt): + if os.name == 'nt': + dirname, basename = os.path.split(executable) + alt = os.path.join(dirname, 'pythonw.exe') + use_alt = ( + basename.lower() == 'python.exe' and + os.path.exists(alt) + ) + if use_alt: # use pythonw.exe to avoid opening a console window executable = alt from distutils.spawn import spawn - spawn([executable,'-E','-c','pass'],0) + + spawn([executable, '-E', '-c', 'pass'], 0) if os.path.exists(ok_file): log.info( @@ -360,9 +578,12 @@ Please make the appropriate changes for your system and try again. ) return True finally: - if f: f.close() - if os.path.exists(ok_file): os.unlink(ok_file) - if os.path.exists(pth_file): os.unlink(pth_file) + if f: + f.close() + if os.path.exists(ok_file): + os.unlink(ok_file) + if os.path.exists(pth_file): + os.unlink(pth_file) if not self.multi_version: log.warn("TEST FAILED: %s does NOT support .pth files", instdir) return False @@ -371,9 +592,13 @@ Please make the appropriate changes for your system and try again. """Write all the scripts for `dist`, unless scripts are excluded""" if not self.exclude_scripts and dist.metadata_isdir('scripts'): for script_name in dist.metadata_listdir('scripts'): + if dist.metadata_isdir('scripts/' + script_name): + # The "script" is a directory, likely a Python 3 + # __pycache__ directory, so skip it. + continue self.install_script( dist, script_name, - dist.get_metadata('scripts/'+script_name) + dist.get_metadata('scripts/' + script_name) ) self.install_wrapper_scripts(dist) @@ -381,7 +606,7 @@ Please make the appropriate changes for your system and try again. 
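``check_pth_processing`` applies the same empirical approach to ``.pth`` support: write a probe ``.pth`` file whose single ``import`` line (built with the ``_one_liner`` helper defined earlier in this file) creates a marker file, start a fresh interpreter, and see whether the marker appeared, since ``site.py`` executes ``import`` lines from ``.pth`` files at startup. A condensed sketch using ``subprocess`` instead of ``distutils.spawn``, with made-up file names::

    import os
    import subprocess
    import sys

    def pth_processed(instdir):
        pth_file = os.path.join(instdir, 'probe.pth')
        ok_file = os.path.join(instdir, 'probe.ok')
        line = "import os; open({!r}, 'w').write('OK')\n".format(ok_file)
        with open(pth_file, 'w') as f:
            f.write(line)
        try:
            subprocess.check_call([sys.executable, '-E', '-c', 'pass'])
            return os.path.exists(ok_file)
        finally:
            for path in (pth_file, ok_file):
                if os.path.exists(path):
                    os.unlink(path)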
if os.path.isdir(path): for base, dirs, files in os.walk(path): for filename in files: - self.outputs.append(os.path.join(base,filename)) + self.outputs.append(os.path.join(base, filename)) else: self.outputs.append(path) @@ -393,7 +618,7 @@ Please make the appropriate changes for your system and try again. % (spec,) ) - def check_editable(self,spec): + def check_editable(self, spec): if not self.editable: return @@ -403,23 +628,26 @@ Please make the appropriate changes for your system and try again. (spec.key, self.build_directory) ) - - - - + @contextlib.contextmanager + def _tmpdir(self): + tmpdir = tempfile.mkdtemp(prefix=six.u("easy_install-")) + try: + # cast to str as workaround for #709 and #710 and #712 + yield str(tmpdir) + finally: + os.path.exists(tmpdir) and rmtree(rmtree_safe(tmpdir)) def easy_install(self, spec, deps=False): - tmpdir = tempfile.mkdtemp(prefix="easy_install-") - download = None - if not self.editable: self.install_site_py() + if not self.editable: + self.install_site_py() - try: - if not isinstance(spec,Requirement): + with self._tmpdir() as tmpdir: + if not isinstance(spec, Requirement): if URL_SCHEME(spec): # It's a url, download it to tmpdir and process self.not_editable(spec) - download = self.package_index.download(spec, tmpdir) - return self.install_item(None, download, tmpdir, deps, True) + dl = self.package_index.download(spec, tmpdir) + return self.install_item(None, dl, tmpdir, deps, True) elif os.path.exists(spec): # Existing file or directory, just process it directly @@ -430,25 +658,21 @@ Please make the appropriate changes for your system and try again. self.check_editable(spec) dist = self.package_index.fetch_distribution( - spec, tmpdir, self.upgrade, self.editable, not self.always_copy, - self.local_index + spec, tmpdir, self.upgrade, self.editable, + not self.always_copy, self.local_index ) if dist is None: msg = "Could not find suitable distribution for %r" % spec if self.always_copy: - msg+=" (--always-copy skips system and development eggs)" + msg += " (--always-copy skips system and development eggs)" raise DistutilsError(msg) - elif dist.precedence==DEVELOP_DIST: + elif dist.precedence == DEVELOP_DIST: # .egg-info dists don't need installing, just process deps self.process_distribution(spec, dist, deps, "Using") return dist else: return self.install_item(spec, dist.location, tmpdir, deps) - finally: - if os.path.exists(tmpdir): - rmtree(tmpdir) - def install_item(self, spec, download, tmpdir, deps, install_needed=False): # Installation is also needed if file in tmpdir or is not an egg @@ -465,10 +689,10 @@ Please make the appropriate changes for your system and try again. # at this point, we know it's a local .egg, we just don't know if # it's already installed. for dist in self.local_index[spec.project_name]: - if dist.location==download: + if dist.location == download: break else: - install_needed = True # it's not in the local index + install_needed = True # it's not in the local index log.info("Processing %s", os.path.basename(download)) @@ -477,7 +701,7 @@ Please make the appropriate changes for your system and try again. for dist in dists: self.process_distribution(spec, dist, deps) else: - dists = [self.check_conflicts(self.egg_distribution(download))] + dists = [self.egg_distribution(download)] self.process_distribution(spec, dists[0], deps, "Using") if spec is not None: @@ -485,19 +709,26 @@ Please make the appropriate changes for your system and try again. 
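The new ``_tmpdir`` helper replaces the old manual ``try``/``finally`` cleanup around ``tempfile.mkdtemp`` with a context manager; stripped of the ``str()`` cast workaround noted in the comment, the pattern is roughly::

    import contextlib
    import os
    import shutil
    import tempfile

    @contextlib.contextmanager
    def tmpdir(prefix='easy_install-'):
        path = tempfile.mkdtemp(prefix=prefix)
        try:
            yield path
        finally:
            if os.path.exists(path):
                shutil.rmtree(path)

    # with tmpdir() as td:
    #     ...download/extract/build inside td...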
if dist in spec: return dist - - - - + def select_scheme(self, name): + """Sets the install directories by applying the install schemes.""" + # it's the caller's problem if they supply a bad name! + scheme = INSTALL_SCHEMES[name] + for key in SCHEME_KEYS: + attrname = 'install_' + key + if getattr(self, attrname) is None: + setattr(self, attrname, scheme[key]) def process_distribution(self, requirement, dist, deps=True, *info): self.update_pth(dist) self.package_index.add(dist) + if dist in self.local_index[dist.key]: + self.local_index.remove(dist) self.local_index.add(dist) self.install_egg_scripts(dist) self.installed_projects[dist.key] = dist log.info(self.installation_report(requirement, dist, *info)) - if dist.has_metadata('dependency_links.txt'): + if (dist.has_metadata('dependency_links.txt') and + not self.no_find_links): self.package_index.add_find_links( dist.get_metadata_lines('dependency_links.txt') ) @@ -509,24 +740,16 @@ Please make the appropriate changes for your system and try again. elif requirement is None or dist not in requirement: # if we wound up with a different version, resolve what we've got distreq = dist.as_requirement() - requirement = requirement or distreq - requirement = Requirement( - distreq.project_name, distreq.specs, requirement.extras - ) + requirement = Requirement(str(distreq)) log.info("Processing dependencies for %s", requirement) try: distros = WorkingSet([]).resolve( [requirement], self.local_index, self.easy_install ) - except DistributionNotFound, e: - raise DistutilsError( - "Could not find required distribution %s" % e.args - ) - except VersionConflict, e: - raise DistutilsError( - "Installed distribution %s conflicts with requirement %s" - % e.args - ) + except DistributionNotFound as e: + raise DistutilsError(str(e)) + except VersionConflict as e: + raise DistutilsError(e.report()) if self.always_copy or self.always_copy_from: # Force all the relevant distros to be copied or activated for dist in distros: @@ -546,72 +769,74 @@ Please make the appropriate changes for your system and try again. 
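The dependency-resolution rewrite above rebuilds the requirement by re-parsing the installed distribution instead of poking at ``Requirement`` internals; the round trip looks roughly like this (``Requirement.parse`` used here, hypothetical project name)::

    from pkg_resources import Distribution, Requirement

    dist = Distribution(project_name='example', version='1.0')
    distreq = dist.as_requirement()
    requirement = Requirement.parse(str(distreq))
    print(str(requirement))   # example==1.0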
def maybe_move(self, spec, dist_filename, setup_base): dst = os.path.join(self.build_directory, spec.key) if os.path.exists(dst): - log.warn( - "%r already exists in %s; build directory %s will not be kept", - spec.key, self.build_directory, setup_base + msg = ( + "%r already exists in %s; build directory %s will not be kept" ) + log.warn(msg, spec.key, self.build_directory, setup_base) return setup_base if os.path.isdir(dist_filename): setup_base = dist_filename else: - if os.path.dirname(dist_filename)==setup_base: - os.unlink(dist_filename) # get it out of the tmp dir + if os.path.dirname(dist_filename) == setup_base: + os.unlink(dist_filename) # get it out of the tmp dir contents = os.listdir(setup_base) - if len(contents)==1: - dist_filename = os.path.join(setup_base,contents[0]) + if len(contents) == 1: + dist_filename = os.path.join(setup_base, contents[0]) if os.path.isdir(dist_filename): # if the only thing there is a directory, move it instead setup_base = dist_filename - ensure_directory(dst); shutil.move(setup_base, dst) + ensure_directory(dst) + shutil.move(setup_base, dst) return dst def install_wrapper_scripts(self, dist): - if not self.exclude_scripts: - for args in get_script_args(dist): - self.write_script(*args) - - + if self.exclude_scripts: + return + for args in ScriptWriter.best().get_args(dist): + self.write_script(*args) def install_script(self, dist, script_name, script_text, dev_path=None): """Generate a legacy script wrapper and install it""" spec = str(dist.as_requirement()) is_script = is_python_script(script_text, script_name) - if is_script and dev_path: - script_text = get_script_header(script_text) + ( - "# EASY-INSTALL-DEV-SCRIPT: %(spec)r,%(script_name)r\n" - "__requires__ = %(spec)r\n" - "from pkg_resources import require; require(%(spec)r)\n" - "del require\n" - "__file__ = %(dev_path)r\n" - "execfile(__file__)\n" - ) % locals() - elif is_script: - script_text = get_script_header(script_text) + ( - "# EASY-INSTALL-SCRIPT: %(spec)r,%(script_name)r\n" - "__requires__ = %(spec)r\n" - "import pkg_resources\n" - "pkg_resources.run_script(%(spec)r, %(script_name)r)\n" - ) % locals() - self.write_script(script_name, script_text, 'b') + if is_script: + body = self._load_template(dev_path) % locals() + script_text = ScriptWriter.get_header(script_text) + body + self.write_script(script_name, _to_ascii(script_text), 'b') + + @staticmethod + def _load_template(dev_path): + """ + There are a couple of template scripts in the package. This + function loads one of them and prepares it for use. 
+ """ + # See https://github.com/pypa/setuptools/issues/134 for info + # on script file naming and downstream issues with SVR4 + name = 'script.tmpl' + if dev_path: + name = name.replace('.tmpl', ' (dev).tmpl') + + raw_bytes = resource_string('setuptools', name) + return raw_bytes.decode('utf-8') def write_script(self, script_name, contents, mode="t", blockers=()): """Write an executable file to the scripts directory""" - self.delete_blockers( # clean up old .py/.pyw w/o a script - [os.path.join(self.script_dir,x) for x in blockers]) + self.delete_blockers( # clean up old .py/.pyw w/o a script + [os.path.join(self.script_dir, x) for x in blockers] + ) log.info("Installing %s script to %s", script_name, self.script_dir) target = os.path.join(self.script_dir, script_name) self.add_output(target) + mask = current_umask() if not self.dry_run: ensure_directory(target) - f = open(target,"w"+mode) - f.write(contents) - f.close() - chmod(target,0755) - - - + if os.path.exists(target): + os.unlink(target) + with open(target, "w" + mode) as f: + f.write(contents) + chmod(target, 0o777 - mask) def install_eggs(self, spec, dist_filename, tmpdir): # .egg dirs or files are already built, so just return them @@ -627,9 +852,8 @@ Please make the appropriate changes for your system and try again. elif os.path.isdir(dist_filename): setup_base = os.path.abspath(dist_filename) - if (setup_base.startswith(tmpdir) # something we downloaded - and self.build_directory and spec is not None - ): + if (setup_base.startswith(tmpdir) # something we downloaded + and self.build_directory and spec is not None): setup_base = self.maybe_move(spec, dist_filename, setup_base) # Find the setup.py file @@ -639,11 +863,13 @@ Please make the appropriate changes for your system and try again. setups = glob(os.path.join(setup_base, '*', 'setup.py')) if not setups: raise DistutilsError( - "Couldn't find a setup script in %s" % os.path.abspath(dist_filename) + "Couldn't find a setup script in %s" % + os.path.abspath(dist_filename) ) - if len(setups)>1: + if len(setups) > 1: raise DistutilsError( - "Multiple setup scripts in %s" % os.path.abspath(dist_filename) + "Multiple setup scripts in %s" % + os.path.abspath(dist_filename) ) setup_script = setups[0] @@ -656,41 +882,62 @@ Please make the appropriate changes for your system and try again. 
def egg_distribution(self, egg_path): if os.path.isdir(egg_path): - metadata = PathMetadata(egg_path,os.path.join(egg_path,'EGG-INFO')) + metadata = PathMetadata(egg_path, os.path.join(egg_path, + 'EGG-INFO')) else: metadata = EggMetadata(zipimport.zipimporter(egg_path)) - return Distribution.from_filename(egg_path,metadata=metadata) + return Distribution.from_filename(egg_path, metadata=metadata) def install_egg(self, egg_path, tmpdir): - destination = os.path.join(self.install_dir,os.path.basename(egg_path)) + destination = os.path.join( + self.install_dir, + os.path.basename(egg_path), + ) destination = os.path.abspath(destination) if not self.dry_run: ensure_directory(destination) dist = self.egg_distribution(egg_path) - self.check_conflicts(dist) if not samefile(egg_path, destination): if os.path.isdir(destination) and not os.path.islink(destination): dir_util.remove_tree(destination, dry_run=self.dry_run) elif os.path.exists(destination): - self.execute(os.unlink,(destination,),"Removing "+destination) - uncache_zipdir(destination) - if os.path.isdir(egg_path): - if egg_path.startswith(tmpdir): - f,m = shutil.move, "Moving" + self.execute( + os.unlink, + (destination,), + "Removing " + destination, + ) + try: + new_dist_is_zipped = False + if os.path.isdir(egg_path): + if egg_path.startswith(tmpdir): + f, m = shutil.move, "Moving" + else: + f, m = shutil.copytree, "Copying" + elif self.should_unzip(dist): + self.mkpath(destination) + f, m = self.unpack_and_compile, "Extracting" else: - f,m = shutil.copytree, "Copying" - elif self.should_unzip(dist): - self.mkpath(destination) - f,m = self.unpack_and_compile, "Extracting" - elif egg_path.startswith(tmpdir): - f,m = shutil.move, "Moving" - else: - f,m = shutil.copy2, "Copying" - - self.execute(f, (egg_path, destination), - (m+" %s to %s") % - (os.path.basename(egg_path),os.path.dirname(destination))) + new_dist_is_zipped = True + if egg_path.startswith(tmpdir): + f, m = shutil.move, "Moving" + else: + f, m = shutil.copy2, "Copying" + self.execute( + f, + (egg_path, destination), + (m + " %s to %s") % ( + os.path.basename(egg_path), + os.path.dirname(destination) + ), + ) + update_dist_caches( + destination, + fix_zipimporter_caches=new_dist_is_zipped, + ) + except Exception: + update_dist_caches(destination, fix_zipimporter_caches=False) + raise self.add_output(destination) return self.egg_distribution(destination) @@ -703,35 +950,39 @@ Please make the appropriate changes for your system and try again. 
"%s is not a valid distutils Windows .exe" % dist_filename ) # Create a dummy distribution object until we build the real distro - dist = Distribution(None, - project_name=cfg.get('metadata','name'), - version=cfg.get('metadata','version'), platform="win32" + dist = Distribution( + None, + project_name=cfg.get('metadata', 'name'), + version=cfg.get('metadata', 'version'), platform=get_platform(), ) # Convert the .exe to an unpacked egg - egg_path = dist.location = os.path.join(tmpdir, dist.egg_name()+'.egg') - egg_tmp = egg_path+'.tmp' - egg_info = os.path.join(egg_tmp, 'EGG-INFO') - pkg_inf = os.path.join(egg_info, 'PKG-INFO') - ensure_directory(pkg_inf) # make sure EGG-INFO dir exists - dist._provider = PathMetadata(egg_tmp, egg_info) # XXX + egg_path = os.path.join(tmpdir, dist.egg_name() + '.egg') + dist.location = egg_path + egg_tmp = egg_path + '.tmp' + _egg_info = os.path.join(egg_tmp, 'EGG-INFO') + pkg_inf = os.path.join(_egg_info, 'PKG-INFO') + ensure_directory(pkg_inf) # make sure EGG-INFO dir exists + dist._provider = PathMetadata(egg_tmp, _egg_info) # XXX self.exe_to_egg(dist_filename, egg_tmp) # Write EGG-INFO/PKG-INFO if not os.path.exists(pkg_inf): - f = open(pkg_inf,'w') + f = open(pkg_inf, 'w') f.write('Metadata-Version: 1.0\n') - for k,v in cfg.items('metadata'): - if k!='target_version': - f.write('%s: %s\n' % (k.replace('_','-').title(), v)) + for k, v in cfg.items('metadata'): + if k != 'target_version': + f.write('%s: %s\n' % (k.replace('_', '-').title(), v)) f.close() - script_dir = os.path.join(egg_info,'scripts') - self.delete_blockers( # delete entry-point scripts to avoid duping - [os.path.join(script_dir,args[0]) for args in get_script_args(dist)] - ) + script_dir = os.path.join(_egg_info, 'scripts') + # delete entry-point scripts to avoid duping + self.delete_blockers([ + os.path.join(script_dir, args[0]) + for args in ScriptWriter.get_args(dist) + ]) # Build .egg file from tmpdir bdist_egg.make_zipfile( - egg_path, egg_tmp, verbose=self.verbose, dry_run=self.dry_run + egg_path, egg_tmp, verbose=self.verbose, dry_run=self.dry_run, ) # install the .egg return self.install_egg(egg_path, tmpdir) @@ -743,11 +994,12 @@ Please make the appropriate changes for your system and try again. to_compile = [] native_libs = [] top_level = {} - def process(src,dst): + + def process(src, dst): s = src.lower() - for old,new in prefixes: + for old, new in prefixes: if s.startswith(old): - src = new+src[len(old):] + src = new + src[len(old):] parts = src.split('/') dst = os.path.join(egg_tmp, *parts) dl = dst.lower() @@ -755,178 +1007,116 @@ Please make the appropriate changes for your system and try again. 
parts[-1] = bdist_egg.strip_module(parts[-1]) top_level[os.path.splitext(parts[0])[0]] = 1 native_libs.append(src) - elif dl.endswith('.py') and old!='SCRIPTS/': + elif dl.endswith('.py') and old != 'SCRIPTS/': top_level[os.path.splitext(parts[0])[0]] = 1 to_compile.append(dst) return dst if not src.endswith('.pth'): log.warn("WARNING: can't process %s", src) return None + # extract, tracking .pyd/.dll->native_libs and .py -> to_compile unpack_archive(dist_filename, egg_tmp, process) stubs = [] for res in native_libs: - if res.lower().endswith('.pyd'): # create stubs for .pyd's + if res.lower().endswith('.pyd'): # create stubs for .pyd's parts = res.split('/') resource = parts[-1] - parts[-1] = bdist_egg.strip_module(parts[-1])+'.py' + parts[-1] = bdist_egg.strip_module(parts[-1]) + '.py' pyfile = os.path.join(egg_tmp, *parts) - to_compile.append(pyfile); stubs.append(pyfile) + to_compile.append(pyfile) + stubs.append(pyfile) bdist_egg.write_stub(resource, pyfile) - self.byte_compile(to_compile) # compile .py's - bdist_egg.write_safety_flag(os.path.join(egg_tmp,'EGG-INFO'), + self.byte_compile(to_compile) # compile .py's + bdist_egg.write_safety_flag( + os.path.join(egg_tmp, 'EGG-INFO'), bdist_egg.analyze_egg(egg_tmp, stubs)) # write zip-safety flag - for name in 'top_level','native_libs': + for name in 'top_level', 'native_libs': if locals()[name]: - txt = os.path.join(egg_tmp, 'EGG-INFO', name+'.txt') + txt = os.path.join(egg_tmp, 'EGG-INFO', name + '.txt') if not os.path.exists(txt): - open(txt,'w').write('\n'.join(locals()[name])+'\n') - - def check_conflicts(self, dist): - """Verify that there are no conflicting "old-style" packages""" - - return dist # XXX temporarily disable until new strategy is stable - from imp import find_module, get_suffixes - from glob import glob - - blockers = [] - names = dict.fromkeys(dist._get_metadata('top_level.txt')) # XXX private attr - - exts = {'.pyc':1, '.pyo':1} # get_suffixes() might leave one out - for ext,mode,typ in get_suffixes(): - exts[ext] = 1 - - for path,files in expand_paths([self.install_dir]+self.all_site_dirs): - for filename in files: - base,ext = os.path.splitext(filename) - if base in names: - if not ext: - # no extension, check for package - try: - f, filename, descr = find_module(base, [path]) - except ImportError: - continue - else: - if f: f.close() - if filename not in blockers: - blockers.append(filename) - elif ext in exts and base!='site': # XXX ugh - blockers.append(os.path.join(path,filename)) - if blockers: - self.found_conflicts(dist, blockers) - - return dist - - def found_conflicts(self, dist, blockers): - if self.delete_conflicting: - log.warn("Attempting to delete conflicting packages:") - return self.delete_blockers(blockers) - - msg = """\ -------------------------------------------------------------------------- -CONFLICT WARNING: - -The following modules or packages have the same names as modules or -packages being installed, and will be *before* the installed packages in -Python's search path. You MUST remove all of the relevant files and -directories before you will be able to use the package(s) you are -installing: - - %s - -""" % '\n '.join(blockers) - - if self.ignore_conflicts_at_my_risk: - msg += """\ -(Note: you can run EasyInstall on '%s' with the ---delete-conflicting option to attempt deletion of the above files -and/or directories.) 
-""" % dist.project_name - else: - msg += """\ -Note: you can attempt this installation again with EasyInstall, and use -either the --delete-conflicting (-D) option or the ---ignore-conflicts-at-my-risk option, to either delete the above files -and directories, or to ignore the conflicts, respectively. Note that if -you ignore the conflicts, the installed package(s) may not work. -""" - msg += """\ -------------------------------------------------------------------------- -""" - sys.stderr.write(msg) - sys.stderr.flush() - if not self.ignore_conflicts_at_my_risk: - raise DistutilsError("Installation aborted due to conflicts") + f = open(txt, 'w') + f.write('\n'.join(locals()[name]) + '\n') + f.close() + + __mv_warning = textwrap.dedent(""" + Because this distribution was installed --multi-version, before you can + import modules from this package in an application, you will need to + 'import pkg_resources' and then use a 'require()' call similar to one of + these examples, in order to select the desired version: + + pkg_resources.require("%(name)s") # latest installed version + pkg_resources.require("%(name)s==%(version)s") # this exact version + pkg_resources.require("%(name)s>=%(version)s") # this version or higher + """).lstrip() + + __id_warning = textwrap.dedent(""" + Note also that the installation directory must be on sys.path at runtime for + this to work. (e.g. by being the application's script directory, by being on + PYTHONPATH, or by being added to sys.path by your code.) + """) def installation_report(self, req, dist, what="Installed"): """Helpful installation message for display to package users""" msg = "\n%(what)s %(eggloc)s%(extras)s" if self.multi_version and not self.no_report: - msg += """ - -Because this distribution was installed --multi-version, before you can -import modules from this package in an application, you will need to -'import pkg_resources' and then use a 'require()' call similar to one of -these examples, in order to select the desired version: - - pkg_resources.require("%(name)s") # latest installed version - pkg_resources.require("%(name)s==%(version)s") # this exact version - pkg_resources.require("%(name)s>=%(version)s") # this version or higher -""" - if self.install_dir not in map(normalize_path,sys.path): - msg += """ + msg += '\n' + self.__mv_warning + if self.install_dir not in map(normalize_path, sys.path): + msg += '\n' + self.__id_warning -Note also that the installation directory must be on sys.path at runtime for -this to work. (e.g. by being the application's script directory, by being on -PYTHONPATH, or by being added to sys.path by your code.) -""" eggloc = dist.location name = dist.project_name version = dist.version - extras = '' # TODO: self.report_extras(req, dist) + extras = '' # TODO: self.report_extras(req, dist) return msg % locals() - def report_editable(self, spec, setup_script): - dirname = os.path.dirname(setup_script) - python = sys.executable - return """\nExtracted editable version of %(spec)s to %(dirname)s + __editable_msg = textwrap.dedent(""" + Extracted editable version of %(spec)s to %(dirname)s + + If it uses setuptools in its setup script, you can activate it in + "development" mode by going to that directory and running:: -If it uses setuptools in its setup script, you can activate it in -"development" mode by going to that directory and running:: + %(python)s setup.py develop - %(python)s setup.py develop + See the setuptools documentation for the "develop" command for more info. 
+ """).lstrip() -See the setuptools documentation for the "develop" command for more info. -""" % locals() + def report_editable(self, spec, setup_script): + dirname = os.path.dirname(setup_script) + python = sys.executable + return '\n' + self.__editable_msg % locals() def run_setup(self, setup_script, setup_base, args): sys.modules.setdefault('distutils.command.bdist_egg', bdist_egg) sys.modules.setdefault('distutils.command.egg_info', egg_info) args = list(args) - if self.verbose>2: + if self.verbose > 2: v = 'v' * (self.verbose - 1) - args.insert(0,'-'+v) - elif self.verbose<2: - args.insert(0,'-q') + args.insert(0, '-' + v) + elif self.verbose < 2: + args.insert(0, '-q') if self.dry_run: - args.insert(0,'-n') + args.insert(0, '-n') log.info( - "Running %s %s", setup_script[len(setup_base)+1:], ' '.join(args) + "Running %s %s", setup_script[len(setup_base) + 1:], ' '.join(args) ) try: run_setup(setup_script, args) - except SystemExit, v: + except SystemExit as v: raise DistutilsError("Setup script exited with %s" % (v.args[0],)) def build_and_install(self, setup_script, setup_base): args = ['bdist_egg', '--dist-dir'] + dist_dir = tempfile.mkdtemp( prefix='egg-dist-tmp-', dir=os.path.dirname(setup_script) ) try: + self._set_fetcher_options(os.path.dirname(setup_script)) args.append(dist_dir) + self.run_setup(setup_script, setup_base, args) all_eggs = Environment([dist_dir]) eggs = [] @@ -935,17 +1125,41 @@ See the setuptools documentation for the "develop" command for more info. eggs.append(self.install_egg(dist.location, setup_base)) if not eggs and not self.dry_run: log.warn("No eggs found in %s (setup script problem?)", - dist_dir) + dist_dir) return eggs finally: rmtree(dist_dir) - log.set_verbosity(self.verbose) # restore our log verbosity + log.set_verbosity(self.verbose) # restore our log verbosity + + def _set_fetcher_options(self, base): + """ + When easy_install is about to run bdist_egg on a source dist, that + source dist might have 'setup_requires' directives, requiring + additional fetching. Ensure the fetcher options given to easy_install + are available to that command as well. + """ + # find the fetch options from easy_install and write them out + # to the setup.cfg file. + ei_opts = self.distribution.get_option_dict('easy_install').copy() + fetch_directives = ( + 'find_links', 'site_dirs', 'index_url', 'optimize', + 'site_dirs', 'allow_hosts', + ) + fetch_options = {} + for key, val in ei_opts.items(): + if key not in fetch_directives: + continue + fetch_options[key.replace('_', '-')] = val[1] + # create a settings dictionary suitable for `edit_config` + settings = dict(easy_install=fetch_options) + cfg_filename = os.path.join(base, 'setup.cfg') + setopt.edit_config(cfg_filename, settings) - def update_pth(self,dist): + def update_pth(self, dist): if self.pth_file is None: return - for d in self.pth_file[dist.key]: # drop old entries + for d in self.pth_file[dist.key]: # drop old entries if self.multi_version or d.location != dist.location: log.info("Removing %s from easy-install.pth file", d) self.pth_file.remove(d) @@ -956,11 +1170,11 @@ See the setuptools documentation for the "develop" command for more info. 
if dist.location in self.pth_file.paths: log.info( "%s is already the active version in easy-install.pth", - dist + dist, ) else: log.info("Adding %s to easy-install.pth file", dist) - self.pth_file.add(dist) # add new entry + self.pth_file.add(dist) # add new entry if dist.location not in self.shadow_path: self.shadow_path.append(dist.location) @@ -968,40 +1182,47 @@ See the setuptools documentation for the "develop" command for more info. self.pth_file.save() - if dist.key=='setuptools': + if dist.key == 'setuptools': # Ensure that setuptools itself never becomes unavailable! # XXX should this check for latest version? - filename = os.path.join(self.install_dir,'setuptools.pth') - if os.path.islink(filename): os.unlink(filename) + filename = os.path.join(self.install_dir, 'setuptools.pth') + if os.path.islink(filename): + os.unlink(filename) f = open(filename, 'wt') - f.write(self.pth_file.make_relative(dist.location)+'\n') + f.write(self.pth_file.make_relative(dist.location) + '\n') f.close() def unpack_progress(self, src, dst): # Progress filter for unpacking log.debug("Unpacking %s to %s", src, dst) - return dst # only unpack-and-compile skips files for dry run + return dst # only unpack-and-compile skips files for dry run def unpack_and_compile(self, egg_path, destination): - to_compile = []; to_chmod = [] + to_compile = [] + to_chmod = [] - def pf(src,dst): + def pf(src, dst): if dst.endswith('.py') and not src.startswith('EGG-INFO/'): to_compile.append(dst) elif dst.endswith('.dll') or dst.endswith('.so'): to_chmod.append(dst) - self.unpack_progress(src,dst) + self.unpack_progress(src, dst) return not self.dry_run and dst or None unpack_archive(egg_path, destination, pf) self.byte_compile(to_compile) if not self.dry_run: for f in to_chmod: - mode = ((os.stat(f)[stat.ST_MODE]) | 0555) & 07755 + mode = ((os.stat(f)[stat.ST_MODE]) | 0o555) & 0o7755 chmod(f, mode) def byte_compile(self, to_compile): + if sys.dont_write_bytecode: + self.warn('byte-compiling is disabled, skipping.') + return + from distutils.util import byte_compile + try: # try to make the byte compile messages quieter log.set_verbosity(self.verbose - 1) @@ -1010,59 +1231,45 @@ See the setuptools documentation for the "develop" command for more info. if self.optimize: byte_compile( to_compile, optimize=self.optimize, force=1, - dry_run=self.dry_run + dry_run=self.dry_run, ) finally: - log.set_verbosity(self.verbose) # restore original verbosity - - - - - - + log.set_verbosity(self.verbose) # restore original verbosity + __no_default_msg = textwrap.dedent(""" + bad install directory or PYTHONPATH + You are attempting to install a package to a directory that is not + on PYTHONPATH and which Python does not read ".pth" files from. The + installation directory you specified (via --install-dir, --prefix, or + the distutils default setting) was: - def no_default_version_msg(self): - return """bad install directory or PYTHONPATH - -You are attempting to install a package to a directory that is not -on PYTHONPATH and which Python does not read ".pth" files from. The -installation directory you specified (via --install-dir, --prefix, or -the distutils default setting) was: - - %s - -and your PYTHONPATH environment variable currently contains: - - %r + %s -Here are some of your options for correcting the problem: - -* You can choose a different installation directory, i.e., one that is - on PYTHONPATH or supports .pth files - -* You can add the installation directory to the PYTHONPATH environment - variable. 
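The permission fix-up in ``unpack_and_compile`` ORs read/execute for everyone into the extracted ``.dll``/``.so`` files and then masks off the group/other write bits; numerically::

    existing = 0o666                      # hypothetical mode from os.stat()
    mode = (existing | 0o555) & 0o7755
    print(oct(mode))                      # 0o755, i.e. rwxr-xr-x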
(It must then also be on PYTHONPATH whenever you run - Python and want to use the package(s) you are installing.) - -* You can set up the installation directory to support ".pth" files by - using one of the approaches described here: - - http://peak.telecommunity.com/EasyInstall.html#custom-installation-locations - -Please make the appropriate changes for your system and try again.""" % ( - self.install_dir, os.environ.get('PYTHONPATH','') - ) + and your PYTHONPATH environment variable currently contains: + %r + Here are some of your options for correcting the problem: + * You can choose a different installation directory, i.e., one that is + on PYTHONPATH or supports .pth files + * You can add the installation directory to the PYTHONPATH environment + variable. (It must then also be on PYTHONPATH whenever you run + Python and want to use the package(s) you are installing.) + * You can set up the installation directory to support ".pth" files by + using one of the approaches described here: + https://setuptools.readthedocs.io/en/latest/easy_install.html#custom-installation-locations + Please make the appropriate changes for your system and try again.""").lstrip() + def no_default_version_msg(self): + template = self.__no_default_msg + return template % (self.install_dir, os.environ.get('PYTHONPATH', '')) def install_site_py(self): """Make sure there's a site.py in the target dir, if needed""" @@ -1071,12 +1278,15 @@ Please make the appropriate changes for your system and try again.""" % ( return # already did it, or don't need to sitepy = os.path.join(self.install_dir, "site.py") - source = resource_string(Requirement.parse("setuptools"), "site.py") + source = resource_string("setuptools", "site-patch.py") + source = source.decode('utf-8') current = "" if os.path.exists(sitepy): log.debug("Checking existing site.py in %s", self.install_dir) - current = open(sitepy,'rb').read() + with io.open(sitepy) as strm: + current = strm.read() + if not current.startswith('def __boot():'): raise DistutilsError( "%s is not a setuptools-generated site.py; please" @@ -1087,34 +1297,32 @@ Please make the appropriate changes for your system and try again.""" % ( log.info("Creating %s", sitepy) if not self.dry_run: ensure_directory(sitepy) - f = open(sitepy,'wb') - f.write(source) - f.close() + with io.open(sitepy, 'w', encoding='utf-8') as strm: + strm.write(source) self.byte_compile([sitepy]) self.sitepy_installed = True - - - - - - - - - - + def create_home_path(self): + """Create directories under ~.""" + if not self.user: + return + home = convert_path(os.path.expanduser("~")) + for name, path in six.iteritems(self.config_vars): + if path.startswith(home) and not os.path.isdir(path): + self.debug_print("os.makedirs('%s', 0o700)" % path) + os.makedirs(path, 0o700) INSTALL_SCHEMES = dict( - posix = dict( - install_dir = '$base/lib/python$py_version_short/site-packages', - script_dir = '$base/bin', + posix=dict( + install_dir='$base/lib/python$py_version_short/site-packages', + script_dir='$base/bin', ), ) DEFAULT_SCHEME = dict( - install_dir = '$base/Lib/site-packages', - script_dir = '$base/Scripts', + install_dir='$base/Lib/site-packages', + script_dir='$base/Scripts', ) def _expand(self, *attrs): @@ -1124,12 +1332,13 @@ Please make the appropriate changes for your system and try again.""" % ( # Set default install_dir/scripts from --prefix config_vars = config_vars.copy() config_vars['base'] = self.prefix - scheme = self.INSTALL_SCHEMES.get(os.name,self.DEFAULT_SCHEME) - for attr,val in scheme.items(): 
- if getattr(self,attr,None) is None: - setattr(self,attr,val) + scheme = self.INSTALL_SCHEMES.get(os.name, self.DEFAULT_SCHEME) + for attr, val in scheme.items(): + if getattr(self, attr, None) is None: + setattr(self, attr, val) from distutils.util import subst_vars + for attr in attrs: val = getattr(self, attr) if val is not None: @@ -1139,16 +1348,10 @@ Please make the appropriate changes for your system and try again.""" % ( setattr(self, attr, val) - - - - - - - def get_site_dirs(): # return a list of 'site' dirs - sitedirs = filter(None,os.environ.get('PYTHONPATH','').split(os.pathsep)) + sitedirs = [_f for _f in os.environ.get('PYTHONPATH', + '').split(os.pathsep) if _f] prefixes = [sys.prefix] if sys.exec_prefix != sys.prefix: prefixes.append(sys.exec_prefix) @@ -1157,15 +1360,20 @@ def get_site_dirs(): if sys.platform in ('os2emx', 'riscos'): sitedirs.append(os.path.join(prefix, "Lib", "site-packages")) elif os.sep == '/': - sitedirs.extend([os.path.join(prefix, - "lib", - "python" + sys.version[:3], - "site-packages"), - os.path.join(prefix, "lib", "site-python")]) + sitedirs.extend([ + os.path.join( + prefix, + "lib", + "python" + sys.version[:3], + "site-packages", + ), + os.path.join(prefix, "lib", "site-python"), + ]) else: - sitedirs.extend( - [prefix, os.path.join(prefix, "lib", "site-packages")] - ) + sitedirs.extend([ + prefix, + os.path.join(prefix, "lib", "site-packages"), + ]) if sys.platform == 'darwin': # for framework builds *only* we add the standard Apple # locations. Currently only per-user, but /Library and @@ -1173,17 +1381,29 @@ def get_site_dirs(): if 'Python.framework' in prefix: home = os.environ.get('HOME') if home: - sitedirs.append( - os.path.join(home, - 'Library', - 'Python', - sys.version[:3], - 'site-packages')) - for plat_specific in (0,1): - site_lib = get_python_lib(plat_specific) - if site_lib not in sitedirs: sitedirs.append(site_lib) - - sitedirs = map(normalize_path, sitedirs) + home_sp = os.path.join( + home, + 'Library', + 'Python', + sys.version[:3], + 'site-packages', + ) + sitedirs.append(home_sp) + lib_paths = get_path('purelib'), get_path('platlib') + for site_lib in lib_paths: + if site_lib not in sitedirs: + sitedirs.append(site_lib) + + if site.ENABLE_USER_SITE: + sitedirs.append(site.USER_SITE) + + try: + sitedirs.extend(site.getsitepackages()) + except AttributeError: + pass + + sitedirs = list(map(normalize_path, sitedirs)) + return sitedirs @@ -1208,12 +1428,12 @@ def expand_paths(inputs): if not name.endswith('.pth'): # We only care about the .pth files continue - if name in ('easy-install.pth','setuptools.pth'): + if name in ('easy-install.pth', 'setuptools.pth'): # Ignore .pth files that we control continue # Read the .pth file - f = open(os.path.join(dirname,name)) + f = open(os.path.join(dirname, name)) lines = list(yield_lines(f)) f.close() @@ -1231,9 +1451,9 @@ def expand_paths(inputs): def extract_wininst_cfg(dist_filename): """Extract configuration data from a bdist_wininst .exe - Returns a ConfigParser.RawConfigParser, or None + Returns a configparser.RawConfigParser, or None """ - f = open(dist_filename,'rb') + f = open(dist_filename, 'rb') try: endrec = zipfile._EndRecData(f) if endrec is None: @@ -1242,18 +1462,24 @@ def extract_wininst_cfg(dist_filename): prepended = (endrec[9] - endrec[5]) - endrec[6] if prepended < 12: # no wininst data here return None - f.seek(prepended-12) + f.seek(prepended - 12) - import struct, StringIO, ConfigParser - tag, cfglen, bmlen = struct.unpack("egg path translations for a given 
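``get_site_dirs`` now also asks the interpreter directly for its site directories instead of reconstructing everything from ``sys.prefix``: ``get_path('purelib')``/``get_path('platlib')`` (via the ``py31compat`` shim, which on current Pythons appears to defer to ``sysconfig``), the user site when enabled, and ``site.getsitepackages()``, which is wrapped in ``try``/``except AttributeError`` because some virtualenv-supplied ``site`` modules omit it. The stdlib calls involved::

    import site
    import sysconfig

    print(sysconfig.get_path('purelib'))
    print(sysconfig.get_path('platlib'))
    if site.ENABLE_USER_SITE:
        print(site.USER_SITE)
    try:
        print(site.getsitepackages())
    except AttributeError:
        pass   # not provided by some virtualenv site modules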
.exe file""" prefixes = [ - ('PURELIB/', ''), ('PLATLIB/pywin32_system32', ''), + ('PURELIB/', ''), + ('PLATLIB/pywin32_system32', ''), ('PLATLIB/', ''), - ('SCRIPTS/', 'EGG-INFO/scripts/') + ('SCRIPTS/', 'EGG-INFO/scripts/'), + ('DATA/lib/site-packages', ''), ] z = zipfile.ZipFile(exe_filename) try: for info in z.infolist(): name = info.filename parts = name.split('/') - if len(parts)==3 and parts[2]=='PKG-INFO': + if len(parts) == 3 and parts[2] == 'PKG-INFO': if parts[1].endswith('.egg-info'): - prefixes.insert(0,('/'.join(parts[:2]), 'EGG-INFO/')) + prefixes.insert(0, ('/'.join(parts[:2]), 'EGG-INFO/')) break - if len(parts)!=2 or not name.endswith('.pth'): + if len(parts) != 2 or not name.endswith('.pth'): continue if name.endswith('-nspkg.pth'): continue - if parts[0].upper() in ('PURELIB','PLATLIB'): - for pth in yield_lines(z.read(name)): - pth = pth.strip().replace('\\','/') + if parts[0].upper() in ('PURELIB', 'PLATLIB'): + contents = z.read(name) + if six.PY3: + contents = contents.decode() + for pth in yield_lines(contents): + pth = pth.strip().replace('\\', '/') if not pth.startswith('import'): - prefixes.append((('%s/%s/' % (parts[0],pth)), '')) + prefixes.append((('%s/%s/' % (parts[0], pth)), '')) finally: z.close() - prefixes = [(x.lower(),y) for x, y in prefixes] - prefixes.sort(); prefixes.reverse() + prefixes = [(x.lower(), y) for x, y in prefixes] + prefixes.sort() + prefixes.reverse() return prefixes -def parse_requirement_arg(spec): - try: - return Requirement.parse(spec) - except ValueError: - raise DistutilsError( - "Not a URL, existing file, or requirement spec: %r" % (spec,) - ) - class PthDistributions(Environment): """A .pth file with Distribution paths in it""" dirty = False def __init__(self, filename, sitedirs=()): - self.filename = filename; self.sitedirs=map(normalize_path, sitedirs) + self.filename = filename + self.sitedirs = list(map(normalize_path, sitedirs)) self.basedir = normalize_path(os.path.dirname(self.filename)) - self._load(); Environment.__init__(self, [], None, None) + self._load() + Environment.__init__(self, [], None, None) for path in yield_lines(self.paths): - map(self.add, find_distributions(path, True)) + list(map(self.add, find_distributions(path, True))) def _load(self): self.paths = [] saw_import = False seen = dict.fromkeys(self.sitedirs) if os.path.isfile(self.filename): - for line in open(self.filename,'rt'): + f = open(self.filename, 'rt') + for line in f: if line.startswith('import'): saw_import = True continue @@ -1338,16 +1559,17 @@ class PthDistributions(Environment): # skip non-existent paths, in case somebody deleted a package # manually, and duplicate paths as well path = self.paths[-1] = normalize_path( - os.path.join(self.basedir,path) + os.path.join(self.basedir, path) ) if not os.path.exists(path) or path in seen: - self.paths.pop() # skip it - self.dirty = True # we cleaned up, so we're dirty now :) + self.paths.pop() # skip it + self.dirty = True # we cleaned up, so we're dirty now :) continue seen[path] = 1 + f.close() if self.paths and not saw_import: - self.dirty = True # ensure anything we touch has import wrappers + self.dirty = True # ensure anything we touch has import wrappers while self.paths and not self.paths[-1].strip(): self.paths.pop() @@ -1356,22 +1578,16 @@ class PthDistributions(Environment): if not self.dirty: return - data = '\n'.join(map(self.make_relative,self.paths)) - if data: + rel_paths = list(map(self.make_relative, self.paths)) + if rel_paths: log.debug("Saving %s", self.filename) - data = 
( - "import sys; sys.__plen = len(sys.path)\n" - "%s\n" - "import sys; new=sys.path[sys.__plen:];" - " del sys.path[sys.__plen:];" - " p=getattr(sys,'__egginsert',0); sys.path[p:p]=new;" - " sys.__egginsert = p+len(new)\n" - ) % data + lines = self._wrap_lines(rel_paths) + data = '\n'.join(lines) + '\n' if os.path.islink(self.filename): os.unlink(self.filename) - f = open(self.filename,'wb') - f.write(data); f.close() + with open(self.filename, 'wt') as f: + f.write(data) elif os.path.exists(self.filename): log.debug("Deleting empty %s", self.filename) @@ -1379,26 +1595,38 @@ class PthDistributions(Environment): self.dirty = False - def add(self,dist): + @staticmethod + def _wrap_lines(lines): + return lines + + def add(self, dist): """Add `dist` to the distribution map""" - if dist.location not in self.paths and dist.location not in self.sitedirs: - self.paths.append(dist.location); self.dirty = True - Environment.add(self,dist) + new_path = ( + dist.location not in self.paths and ( + dist.location not in self.sitedirs or + # account for '.' being in PYTHONPATH + dist.location == os.getcwd() + ) + ) + if new_path: + self.paths.append(dist.location) + self.dirty = True + Environment.add(self, dist) - def remove(self,dist): + def remove(self, dist): """Remove `dist` from the distribution map""" while dist.location in self.paths: - self.paths.remove(dist.location); self.dirty = True - Environment.remove(self,dist) + self.paths.remove(dist.location) + self.dirty = True + Environment.remove(self, dist) - - def make_relative(self,path): + def make_relative(self, path): npath, last = os.path.split(normalize_path(path)) baselen = len(self.basedir) parts = [last] - sep = os.altsep=='/' and '/' or os.sep - while len(npath)>=baselen: - if npath==self.basedir: + sep = os.altsep == '/' and '/' or os.sep + while len(npath) >= baselen: + if npath == self.basedir: parts.append(os.curdir) parts.reverse() return sep.join(parts) @@ -1407,54 +1635,231 @@ class PthDistributions(Environment): else: return path -def get_script_header(script_text, executable=sys_executable, wininst=False): - """Create a #! 
line, getting options (if any) from script_text""" - from distutils.command.build_scripts import first_line_re - first = (script_text+'\n').splitlines()[0] - match = first_line_re.match(first) - options = '' - if match: - options = match.group(1) or '' - if options: options = ' '+options - if wininst: - executable = "python.exe" - else: - executable = nt_quote_arg(executable) - hdr = "#!%(executable)s%(options)s\n" % locals() - if unicode(hdr,'ascii','ignore').encode('ascii') != hdr: - # Non-ascii path to sys.executable, use -x to prevent warnings - if options: - if options.strip().startswith('-'): - options = ' -x'+options.strip()[1:] - # else: punt, we can't do it, let the warning happen anyway - else: - options = ' -x' - executable = fix_jython_executable(executable, options) - hdr = "#!%(executable)s%(options)s\n" % locals() - return hdr + +class RewritePthDistributions(PthDistributions): + @classmethod + def _wrap_lines(cls, lines): + yield cls.prelude + for line in lines: + yield line + yield cls.postlude + + prelude = _one_liner(""" + import sys + sys.__plen = len(sys.path) + """) + postlude = _one_liner(""" + import sys + new = sys.path[sys.__plen:] + del sys.path[sys.__plen:] + p = getattr(sys, '__egginsert', 0) + sys.path[p:p] = new + sys.__egginsert = p + len(new) + """) + + +if os.environ.get('SETUPTOOLS_SYS_PATH_TECHNIQUE', 'raw') == 'rewrite': + PthDistributions = RewritePthDistributions + + +def _first_line_re(): + """ + Return a regular expression based on first_line_re suitable for matching + strings. + """ + if isinstance(first_line_re.pattern, str): + return first_line_re + + # first_line_re in Python >=3.1.4 and >=3.2.1 is a bytes pattern. + return re.compile(first_line_re.pattern.decode()) + def auto_chmod(func, arg, exc): - if func is os.remove and os.name=='nt': + if func in [os.unlink, os.remove] and os.name == 'nt': chmod(arg, stat.S_IWRITE) return func(arg) - exc = sys.exc_info() - raise exc[0], (exc[1][0], exc[1][1] + (" %s %s" % (func,arg))) - -def uncache_zipdir(path): - """Ensure that the importer caches dont have stale info for `path`""" - from zipimport import _zip_directory_cache as zdc - _uncache(path, zdc) - _uncache(path, sys.path_importer_cache) - -def _uncache(path, cache): - if path in cache: - del cache[path] + et, ev, _ = sys.exc_info() + six.reraise(et, (ev[0], ev[1] + (" %s %s" % (func, arg)))) + + +def update_dist_caches(dist_path, fix_zipimporter_caches): + """ + Fix any globally cached `dist_path` related data + + `dist_path` should be a path of a newly installed egg distribution (zipped + or unzipped). + + sys.path_importer_cache contains finder objects that have been cached when + importing data from the original distribution. Any such finders need to be + cleared since the replacement distribution might be packaged differently, + e.g. a zipped egg distribution might get replaced with an unzipped egg + folder or vice versa. Having the old finders cached may then cause Python + to attempt loading modules from the replacement distribution using an + incorrect loader. + + zipimport.zipimporter objects are Python loaders charged with importing + data packaged inside zip archives. If stale loaders referencing the + original distribution, are left behind, they can fail to load modules from + the replacement distribution. E.g. 
if an old zipimport.zipimporter instance + is used to load data from a new zipped egg archive, it may cause the + operation to attempt to locate the requested data in the wrong location - + one indicated by the original distribution's zip archive directory + information. Such an operation may then fail outright, e.g. report having + read a 'bad local file header', or even worse, it may fail silently & + return invalid data. + + zipimport._zip_directory_cache contains cached zip archive directory + information for all existing zipimport.zipimporter instances and all such + instances connected to the same archive share the same cached directory + information. + + If asked, and the underlying Python implementation allows it, we can fix + all existing zipimport.zipimporter instances instead of having to track + them down and remove them one by one, by updating their shared cached zip + archive directory information. This, of course, assumes that the + replacement distribution is packaged as a zipped egg. + + If not asked to fix existing zipimport.zipimporter instances, we still do + our best to clear any remaining zipimport.zipimporter related cached data + that might somehow later get used when attempting to load data from the new + distribution and thus cause such load operations to fail. Note that when + tracking down such remaining stale data, we can not catch every conceivable + usage from here, and we clear only those that we know of and have found to + cause problems if left alive. Any remaining caches should be updated by + whomever is in charge of maintaining them, i.e. they should be ready to + handle us replacing their zip archives with new distributions at runtime. + + """ + # There are several other known sources of stale zipimport.zipimporter + # instances that we do not clear here, but might if ever given a reason to + # do so: + # * Global setuptools pkg_resources.working_set (a.k.a. 'master working + # set') may contain distributions which may in turn contain their + # zipimport.zipimporter loaders. + # * Several zipimport.zipimporter loaders held by local variables further + # up the function call stack when running the setuptools installation. + # * Already loaded modules may have their __loader__ attribute set to the + # exact loader instance used when importing them. Python 3.4 docs state + # that this information is intended mostly for introspection and so is + # not expected to cause us problems. + normalized_path = normalize_path(dist_path) + _uncache(normalized_path, sys.path_importer_cache) + if fix_zipimporter_caches: + _replace_zip_directory_cache_data(normalized_path) else: - path = normalize_path(path) - for p in cache: - if normalize_path(p)==path: - del cache[p] - return + # Here, even though we do not want to fix existing and now stale + # zipimporter cache information, we still want to remove it. Related to + # Python's zip archive directory information cache, we clear each of + # its stale entries in two phases: + # 1. Clear the entry so attempting to access zip archive information + # via any existing stale zipimport.zipimporter instances fails. + # 2. Remove the entry from the cache so any newly constructed + # zipimport.zipimporter instances do not end up using old stale + # zip archive directory information. 
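# --- Illustrative sketch, not part of the upstream patch ---
# The comments above describe when update_dist_caches() is meant to run:
# immediately after an installed egg at `dist_path` has been replaced on
# disk. A minimal hypothetical call would look like the following; the egg
# path is invented purely for illustration.

import os
from setuptools.command.easy_install import update_dist_caches

replaced_egg = os.path.expanduser('~/eggs/demo-1.0-py2.7.egg')  # hypothetical path
# Ask for the shared zip directory cache to be patched in place where the
# interpreter allows it; on PyPy the same call falls back to clearing the
# stale entries instead (see the module-level branch further below).
update_dist_caches(replaced_egg, fix_zipimporter_caches=True)
# ------------------------------------------------------------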
+ # This whole stale data removal step does not seem strictly necessary, + # but has been left in because it was done before we started replacing + # the zip archive directory information cache content if possible, and + # there are no relevant unit tests that we can depend on to tell us if + # this is really needed. + _remove_and_clear_zip_directory_cache_data(normalized_path) + + +def _collect_zipimporter_cache_entries(normalized_path, cache): + """ + Return zipimporter cache entry keys related to a given normalized path. + + Alternative path spellings (e.g. those using different character case or + those using alternative path separators) related to the same path are + included. Any sub-path entries are included as well, i.e. those + corresponding to zip archives embedded in other zip archives. + + """ + result = [] + prefix_len = len(normalized_path) + for p in cache: + np = normalize_path(p) + if (np.startswith(normalized_path) and + np[prefix_len:prefix_len + 1] in (os.sep, '')): + result.append(p) + return result + + +def _update_zipimporter_cache(normalized_path, cache, updater=None): + """ + Update zipimporter cache data for a given normalized path. + + Any sub-path entries are processed as well, i.e. those corresponding to zip + archives embedded in other zip archives. + + Given updater is a callable taking a cache entry key and the original entry + (after already removing the entry from the cache), and expected to update + the entry and possibly return a new one to be inserted in its place. + Returning None indicates that the entry should not be replaced with a new + one. If no updater is given, the cache entries are simply removed without + any additional processing, the same as if the updater simply returned None. + + """ + for p in _collect_zipimporter_cache_entries(normalized_path, cache): + # N.B. pypy's custom zipimport._zip_directory_cache implementation does + # not support the complete dict interface: + # * Does not support item assignment, thus not allowing this function + # to be used only for removing existing cache entries. + # * Does not support the dict.pop() method, forcing us to use the + # get/del patterns instead. For more detailed information see the + # following links: + # https://github.com/pypa/setuptools/issues/202#issuecomment-202913420 + # https://bitbucket.org/pypy/pypy/src/dd07756a34a41f674c0cacfbc8ae1d4cc9ea2ae4/pypy/module/zipimport/interp_zipimport.py#cl-99 + old_entry = cache[p] + del cache[p] + new_entry = updater and updater(p, old_entry) + if new_entry is not None: + cache[p] = new_entry + + +def _uncache(normalized_path, cache): + _update_zipimporter_cache(normalized_path, cache) + + +def _remove_and_clear_zip_directory_cache_data(normalized_path): + def clear_and_remove_cached_zip_archive_directory_data(path, old_entry): + old_entry.clear() + + _update_zipimporter_cache( + normalized_path, zipimport._zip_directory_cache, + updater=clear_and_remove_cached_zip_archive_directory_data) + + +# PyPy Python implementation does not allow directly writing to the +# zipimport._zip_directory_cache and so prevents us from attempting to correct +# its content. The best we can do there is clear the problematic cache content +# and have PyPy repopulate it as needed. 
The downside is that if there are any +# stale zipimport.zipimporter instances laying around, attempting to use them +# will fail due to not having its zip archive directory information available +# instead of being automatically corrected to use the new correct zip archive +# directory information. +if '__pypy__' in sys.builtin_module_names: + _replace_zip_directory_cache_data = \ + _remove_and_clear_zip_directory_cache_data +else: + + def _replace_zip_directory_cache_data(normalized_path): + def replace_cached_zip_archive_directory_data(path, old_entry): + # N.B. In theory, we could load the zip directory information just + # once for all updated path spellings, and then copy it locally and + # update its contained path strings to contain the correct + # spelling, but that seems like a way too invasive move (this cache + # structure is not officially documented anywhere and could in + # theory change with new Python releases) for no significant + # benefit. + old_entry.clear() + zipimport.zipimporter(path) + old_entry.update(zipimport._zip_directory_cache[path]) + return old_entry + + _update_zipimporter_cache( + normalized_path, zipimport._zip_directory_cache, + updater=replace_cached_zip_archive_directory_data) + def is_python(text, filename=''): "Is this string a valid Python script?" @@ -1465,258 +1870,423 @@ def is_python(text, filename=''): else: return True + def is_sh(executable): """Determine if the specified executable is a .sh (contains a #! line)""" try: - fp = open(executable) - magic = fp.read(2) - fp.close() - except (OSError,IOError): return executable + with io.open(executable, encoding='latin-1') as fp: + magic = fp.read(2) + except (OSError, IOError): + return executable return magic == '#!' + def nt_quote_arg(arg): """Quote a command line argument according to Windows parsing rules""" - - result = [] - needquote = False - nb = 0 - - needquote = (" " in arg) or ("\t" in arg) - if needquote: - result.append('"') - - for c in arg: - if c == '\\': - nb += 1 - elif c == '"': - # double preceding backslashes, then add a \" - result.append('\\' * (nb*2) + '\\"') - nb = 0 - else: - if nb: - result.append('\\' * nb) - nb = 0 - result.append(c) - - if nb: - result.append('\\' * nb) - - if needquote: - result.append('\\' * nb) # double the trailing backslashes - result.append('"') - - return ''.join(result) - - - - - - - + return subprocess.list2cmdline([arg]) def is_python_script(script_text, filename): """Is this text, as a whole, a Python script? (as opposed to shell/bat/etc. """ if filename.endswith('.py') or filename.endswith('.pyw'): - return True # extension says it's Python + return True # extension says it's Python if is_python(script_text, filename): - return True # it's syntactically valid Python + return True # it's syntactically valid Python if script_text.startswith('#!'): # It begins with a '#!' 
line, so check if 'python' is in it somewhere return 'python' in script_text.splitlines()[0].lower() - return False # Not any Python I can recognize + return False # Not any Python I can recognize + try: from os import chmod as _chmod except ImportError: # Jython compatibility - def _chmod(*args): pass + def _chmod(*args): + pass + def chmod(path, mode): log.debug("changing mode of %s to %o", path, mode) try: _chmod(path, mode) - except os.error, e: + except os.error as e: log.debug("chmod failed: %s", e) -def fix_jython_executable(executable, options): - if sys.platform.startswith('java') and is_sh(executable): - # Workaround Jython's sys.executable being a .sh (an invalid - # shebang line interpreter) - if options: - # Can't apply the workaround, leave it broken - log.warn("WARNING: Unable to adapt shebang line for Jython," - " the following script is NOT executable\n" - " see http://bugs.jython.org/issue1112 for" - " more information.") - else: - return '/usr/bin/env %s' % executable - return executable - - -def get_script_args(dist, executable=sys_executable, wininst=False): - """Yield write_script() argument tuples for a distribution's entrypoints""" - spec = str(dist.as_requirement()) - header = get_script_header("", executable, wininst) - for group in 'console_scripts', 'gui_scripts': - for name,ep in dist.get_entry_map(group).items(): - script_text = ( - "# EASY-INSTALL-ENTRY-SCRIPT: %(spec)r,%(group)r,%(name)r\n" - "__requires__ = %(spec)r\n" - "import sys\n" - "from pkg_resources import load_entry_point\n" - "\n" - "sys.exit(\n" - " load_entry_point(%(spec)r, %(group)r, %(name)r)()\n" - ")\n" - ) % locals() - if sys.platform=='win32' or wininst: - # On Windows/wininst, add a .py extension and an .exe launcher - if group=='gui_scripts': - ext, launcher = '-script.pyw', 'gui.exe' - old = ['.pyw'] - new_header = re.sub('(?i)python.exe','pythonw.exe',header) - else: - ext, launcher = '-script.py', 'cli.exe' - old = ['.py','.pyc','.pyo'] - new_header = re.sub('(?i)pythonw.exe','python.exe',header) - if os.path.exists(new_header[2:-1]) or sys.platform!='win32': - hdr = new_header - else: - hdr = header - yield (name+ext, hdr+script_text, 't', [name+x for x in old]) - yield ( - name+'.exe', resource_string('setuptools', launcher), - 'b') # write in binary mode - yield (name+'.exe.manifest', _launcher_manifest % (name,), 't') - else: - # On other platforms, we assume the right thing to do is to - # just write the stub with no extension. - yield (name, header+script_text) +class CommandSpec(list): + """ + A command spec for a #! header, specified as a list of arguments akin to + those passed to Popen. + """ + + options = [] + split_args = dict() + + @classmethod + def best(cls): + """ + Choose the best CommandSpec class based on environmental conditions. + """ + return cls + + @classmethod + def _sys_executable(cls): + _default = os.path.normpath(sys.executable) + return os.environ.get('__PYVENV_LAUNCHER__', _default) + + @classmethod + def from_param(cls, param): + """ + Construct a CommandSpec from a parameter to build_scripts, which may + be None. + """ + if isinstance(param, cls): + return param + if isinstance(param, list): + return cls(param) + if param is None: + return cls.from_environment() + # otherwise, assume it's a string. 
+ return cls.from_string(param) + + @classmethod + def from_environment(cls): + return cls([cls._sys_executable()]) + + @classmethod + def from_string(cls, string): + """ + Construct a command spec from a simple string representing a command + line parseable by shlex.split. + """ + items = shlex.split(string, **cls.split_args) + return cls(items) + + def install_options(self, script_text): + self.options = shlex.split(self._extract_options(script_text)) + cmdline = subprocess.list2cmdline(self) + if not isascii(cmdline): + self.options[:0] = ['-x'] + + @staticmethod + def _extract_options(orig_script): + """ + Extract any options from the first line of the script. + """ + first = (orig_script + '\n').splitlines()[0] + match = _first_line_re().match(first) + options = match.group(1) or '' if match else '' + return options.strip() + + def as_header(self): + return self._render(self + list(self.options)) + + @staticmethod + def _strip_quotes(item): + _QUOTES = '"\'' + for q in _QUOTES: + if item.startswith(q) and item.endswith(q): + return item[1:-1] + return item -_launcher_manifest = """ - - - + @staticmethod + def _render(items): + cmdline = subprocess.list2cmdline( + CommandSpec._strip_quotes(item.strip()) for item in items) + return '#!' + cmdline + '\n' - - - - - - - - -""" +# For pbr compat; will be removed in a future version. +sys_executable = CommandSpec._sys_executable() +class WindowsCommandSpec(CommandSpec): + split_args = dict(posix=False) +class ScriptWriter(object): + """ + Encapsulates behavior around writing entry point scripts for console and + gui apps. + """ + + template = textwrap.dedent(r""" + # EASY-INSTALL-ENTRY-SCRIPT: %(spec)r,%(group)r,%(name)r + __requires__ = %(spec)r + import re + import sys + from pkg_resources import load_entry_point + + if __name__ == '__main__': + sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0]) + sys.exit( + load_entry_point(%(spec)r, %(group)r, %(name)r)() + ) + """).lstrip() + + command_spec_class = CommandSpec + + @classmethod + def get_script_args(cls, dist, executable=None, wininst=False): + # for backward compatibility + warnings.warn("Use get_args", DeprecationWarning) + writer = (WindowsScriptWriter if wininst else ScriptWriter).best() + header = cls.get_script_header("", executable, wininst) + return writer.get_args(dist, header) + + @classmethod + def get_script_header(cls, script_text, executable=None, wininst=False): + # for backward compatibility + warnings.warn("Use get_header", DeprecationWarning) + if wininst: + executable = "python.exe" + cmd = cls.command_spec_class.best().from_param(executable) + cmd.install_options(script_text) + return cmd.as_header() + + @classmethod + def get_args(cls, dist, header=None): + """ + Yield write_script() argument tuples for a distribution's + console_scripts and gui_scripts entry points. + """ + if header is None: + header = cls.get_header() + spec = str(dist.as_requirement()) + for type_ in 'console', 'gui': + group = type_ + '_scripts' + for name, ep in dist.get_entry_map(group).items(): + cls._ensure_safe_name(name) + script_text = cls.template % locals() + args = cls._get_script_args(type_, name, header, script_text) + for res in args: + yield res + + @staticmethod + def _ensure_safe_name(name): + """ + Prevent paths in *_scripts entry point names. 
+ """ + has_path_sep = re.search(r'[\\/]', name) + if has_path_sep: + raise ValueError("Path separators not allowed in script names") + + @classmethod + def get_writer(cls, force_windows): + # for backward compatibility + warnings.warn("Use best", DeprecationWarning) + return WindowsScriptWriter.best() if force_windows else cls.best() + + @classmethod + def best(cls): + """ + Select the best ScriptWriter for this environment. + """ + if sys.platform == 'win32' or (os.name == 'java' and os._name == 'nt'): + return WindowsScriptWriter.best() + else: + return cls + @classmethod + def _get_script_args(cls, type_, name, header, script_text): + # Simply write the stub with no extension. + yield (name, header + script_text) + @classmethod + def get_header(cls, script_text="", executable=None): + """Create a #! line, getting options (if any) from script_text""" + cmd = cls.command_spec_class.best().from_param(executable) + cmd.install_options(script_text) + return cmd.as_header() +class WindowsScriptWriter(ScriptWriter): + command_spec_class = WindowsCommandSpec + @classmethod + def get_writer(cls): + # for backward compatibility + warnings.warn("Use best", DeprecationWarning) + return cls.best() + @classmethod + def best(cls): + """ + Select the best ScriptWriter suitable for Windows + """ + writer_lookup = dict( + executable=WindowsExecutableLauncherWriter, + natural=cls, + ) + # for compatibility, use the executable launcher by default + launcher = os.environ.get('SETUPTOOLS_LAUNCHER', 'executable') + return writer_lookup[launcher] + + @classmethod + def _get_script_args(cls, type_, name, header, script_text): + "For Windows, add a .py extension" + ext = dict(console='.pya', gui='.pyw')[type_] + if ext not in os.environ['PATHEXT'].lower().split(';'): + msg = ( + "{ext} not listed in PATHEXT; scripts will not be " + "recognized as executables." + ).format(**locals()) + warnings.warn(msg, UserWarning) + old = ['.pya', '.py', '-script.py', '.pyc', '.pyo', '.pyw', '.exe'] + old.remove(ext) + header = cls._adjust_header(type_, header) + blockers = [name + x for x in old] + yield name + ext, header + script_text, 't', blockers + + @classmethod + def _adjust_header(cls, type_, orig_header): + """ + Make sure 'pythonw' is used for gui and and 'python' is used for + console (regardless of what sys.executable is). + """ + pattern = 'pythonw.exe' + repl = 'python.exe' + if type_ == 'gui': + pattern, repl = repl, pattern + pattern_ob = re.compile(re.escape(pattern), re.IGNORECASE) + new_header = pattern_ob.sub(string=orig_header, repl=repl) + return new_header if cls._use_header(new_header) else orig_header + + @staticmethod + def _use_header(new_header): + """ + Should _adjust_header use the replaced header? + On non-windows systems, always use. On + Windows systems, only use the replaced header if it resolves + to an executable on the system. 
+ """ + clean_header = new_header[2:-1].strip('"') + return sys.platform != 'win32' or find_executable(clean_header) +class WindowsExecutableLauncherWriter(WindowsScriptWriter): + @classmethod + def _get_script_args(cls, type_, name, header, script_text): + """ + For Windows, add a .py extension and an .exe launcher + """ + if type_ == 'gui': + launcher_type = 'gui' + ext = '-script.pyw' + old = ['.pyw'] + else: + launcher_type = 'cli' + ext = '-script.py' + old = ['.py', '.pyc', '.pyo'] + hdr = cls._adjust_header(type_, header) + blockers = [name + x for x in old] + yield (name + ext, hdr + script_text, 't', blockers) + yield ( + name + '.exe', get_win_launcher(launcher_type), + 'b' # write in binary mode + ) + if not is_64bit(): + # install a manifest for the launcher to prevent Windows + # from detecting it as an installer (which it will for + # launchers like easy_install.exe). Consider only + # adding a manifest for launchers detected as installers. + # See Distribute #143 for details. + m_name = name + '.exe.manifest' + yield (m_name, load_launcher_manifest(name), 't') +# for backward-compatibility +get_script_args = ScriptWriter.get_script_args +get_script_header = ScriptWriter.get_script_header +def get_win_launcher(type): + """ + Load the Windows launcher (executable) suitable for launching a script. + `type` should be either 'cli' or 'gui' + Returns the executable as a byte string. + """ + launcher_fn = '%s.exe' % type + if is_64bit(): + launcher_fn = launcher_fn.replace(".", "-64.") + else: + launcher_fn = launcher_fn.replace(".", "-32.") + return resource_string('setuptools', launcher_fn) +def load_launcher_manifest(name): + manifest = pkg_resources.resource_string(__name__, 'launcher manifest.xml') + if six.PY2: + return manifest % vars() + else: + return manifest.decode('utf-8') % vars() def rmtree(path, ignore_errors=False, onerror=auto_chmod): - """Recursively delete a directory tree. + return shutil.rmtree(path, ignore_errors, onerror) + + +def current_umask(): + tmp = os.umask(0o022) + os.umask(tmp) + return tmp - This code is taken from the Python 2.4 version of 'shutil', because - the 2.3 version doesn't really work right. - """ - if ignore_errors: - def onerror(*args): - pass - elif onerror is None: - def onerror(*args): - raise - names = [] - try: - names = os.listdir(path) - except os.error, err: - onerror(os.listdir, path, sys.exc_info()) - for name in names: - fullname = os.path.join(path, name) - try: - mode = os.lstat(fullname).st_mode - except os.error: - mode = 0 - if stat.S_ISDIR(mode): - rmtree(fullname, ignore_errors, onerror) - else: - try: - os.remove(fullname) - except os.error, err: - onerror(os.remove, fullname, sys.exc_info()) - try: - os.rmdir(path) - except os.error: - onerror(os.rmdir, path, sys.exc_info()) def bootstrap(): # This function is called when setuptools*.egg is run using /bin/sh - import setuptools; argv0 = os.path.dirname(setuptools.__path__[0]) - sys.argv[0] = argv0; sys.argv.append(argv0); main() + import setuptools + + argv0 = os.path.dirname(setuptools.__path__[0]) + sys.argv[0] = argv0 + sys.argv.append(argv0) + main() def main(argv=None, **kw): from setuptools import setup from setuptools.dist import Distribution - import distutils.core - - USAGE = """\ -usage: %(script)s [options] requirement_or_url ... 
- or: %(script)s --help -""" - - def gen_usage (script_name): - script = os.path.basename(script_name) - return USAGE % vars() - - def with_ei_usage(f): - old_gen_usage = distutils.core.gen_usage - try: - distutils.core.gen_usage = gen_usage - return f() - finally: - distutils.core.gen_usage = old_gen_usage class DistributionWithoutHelpCommands(Distribution): common_usage = "" - def _show_help(self,*args,**kw): - with_ei_usage(lambda: Distribution._show_help(self,*args,**kw)) + + def _show_help(self, *args, **kw): + with _patch_usage(): + Distribution._show_help(self, *args, **kw) if argv is None: argv = sys.argv[1:] - with_ei_usage(lambda: + with _patch_usage(): setup( - script_args = ['-q','easy_install', '-v']+argv, - script_name = sys.argv[0] or 'easy_install', - distclass=DistributionWithoutHelpCommands, **kw + script_args=['-q', 'easy_install', '-v'] + argv, + script_name=sys.argv[0] or 'easy_install', + distclass=DistributionWithoutHelpCommands, + **kw ) - ) - +@contextlib.contextmanager +def _patch_usage(): + import distutils.core + USAGE = textwrap.dedent(""" + usage: %(script)s [options] requirement_or_url ... + or: %(script)s --help + """).lstrip() + + def gen_usage(script_name): + return USAGE % dict( + script=os.path.basename(script_name), + ) + saved = distutils.core.gen_usage + distutils.core.gen_usage = gen_usage + try: + yield + finally: + distutils.core.gen_usage = saved diff --git a/setuptools/command/egg_info.py b/setuptools/command/egg_info.py index 5a8b2db..1a6ea9c 100755 --- a/setuptools/command/egg_info.py +++ b/setuptools/command/egg_info.py @@ -2,17 +2,118 @@ Create a distribution's .egg-info directory and contents""" -# This module should be kept compatible with Python 2.3 -import os, re -from setuptools import Command -from distutils.errors import * +from distutils.filelist import FileList as _FileList +from distutils.errors import DistutilsInternalError +from distutils.util import convert_path from distutils import log +import distutils.errors +import distutils.filelist +import os +import re +import sys +import io +import warnings +import time +import collections + +import six +from six.moves import map + +from setuptools import Command from setuptools.command.sdist import sdist -from distutils.util import convert_path -from distutils.filelist import FileList -from pkg_resources import parse_requirements, safe_name, parse_version, \ - safe_version, yield_lines, EntryPoint, iter_entry_points, to_filename -from sdist import walk_revctrl +from setuptools.command.sdist import walk_revctrl +from setuptools.command.setopt import edit_config +from setuptools.command import bdist_egg +from pkg_resources import ( + parse_requirements, safe_name, parse_version, + safe_version, yield_lines, EntryPoint, iter_entry_points, to_filename) +import setuptools.unicode_utils as unicode_utils +from setuptools.glob import glob + +import packaging + + +def translate_pattern(glob): + """ + Translate a file path glob like '*.txt' in to a regular expression. + This differs from fnmatch.translate which allows wildcards to match + directory separators. It also knows about '**/' which matches any number of + directories. + """ + pat = '' + + # This will split on '/' within [character classes]. This is deliberate. + chunks = glob.split(os.path.sep) + + sep = re.escape(os.sep) + valid_char = '[^%s]' % (sep,) + + for c, chunk in enumerate(chunks): + last_chunk = c == len(chunks) - 1 + + # Chunks that are a literal ** are globstars. They match anything. 
+ if chunk == '**': + if last_chunk: + # Match anything if this is the last component + pat += '.*' + else: + # Match '(name/)*' + pat += '(?:%s+%s)*' % (valid_char, sep) + continue # Break here as the whole path component has been handled + + # Find any special characters in the remainder + i = 0 + chunk_len = len(chunk) + while i < chunk_len: + char = chunk[i] + if char == '*': + # Match any number of name characters + pat += valid_char + '*' + elif char == '?': + # Match a name character + pat += valid_char + elif char == '[': + # Character class + inner_i = i + 1 + # Skip initial !/] chars + if inner_i < chunk_len and chunk[inner_i] == '!': + inner_i = inner_i + 1 + if inner_i < chunk_len and chunk[inner_i] == ']': + inner_i = inner_i + 1 + + # Loop till the closing ] is found + while inner_i < chunk_len and chunk[inner_i] != ']': + inner_i = inner_i + 1 + + if inner_i >= chunk_len: + # Got to the end of the string without finding a closing ] + # Do not treat this as a matching group, but as a literal [ + pat += re.escape(char) + else: + # Grab the insides of the [brackets] + inner = chunk[i + 1:inner_i] + char_class = '' + + # Class negation + if inner[0] == '!': + char_class = '^' + inner = inner[1:] + + char_class += re.escape(inner) + pat += '[%s]' % (char_class,) + + # Skip to the end ] + i = inner_i + else: + pat += re.escape(char) + i += 1 + + # Join each chunk with the dir separator + if not last_chunk: + pat += sep + + return re.compile(pat + r'\Z(?ms)') + class egg_info(Command): description = "create a distribution's .egg-info directory" @@ -20,24 +121,15 @@ class egg_info(Command): user_options = [ ('egg-base=', 'e', "directory containing .egg-info directories" " (default: top of the source tree)"), - ('tag-svn-revision', 'r', - "Add subversion revision ID to version number"), ('tag-date', 'd', "Add date stamp (e.g. 20050528) to version number"), ('tag-build=', 'b', "Specify explicit tag to add to version number"), - ('no-svn-revision', 'R', - "Don't add subversion revision ID [default]"), ('no-date', 'D', "Don't include date stamp [default]"), ] - boolean_options = ['tag-date', 'tag-svn-revision'] - negative_opt = {'no-svn-revision': 'tag-svn-revision', - 'no-date': 'tag-date'} - - - - - - + boolean_options = ['tag-date'] + negative_opt = { + 'no-date': 'tag-date', + } def initialize_options(self): self.egg_name = None @@ -45,65 +137,68 @@ class egg_info(Command): self.egg_base = None self.egg_info = None self.tag_build = None - self.tag_svn_revision = 0 self.tag_date = 0 self.broken_egg_info = False self.vtags = None - def save_version_info(self, filename): - from setopt import edit_config - edit_config( - filename, - {'egg_info': - {'tag_svn_revision':0, 'tag_date': 0, 'tag_build': self.tags()} - } - ) - - - - - - - - - - - - - - - - - - - + #################################### + # allow the 'tag_svn_revision' to be detected and + # set, supporting sdists built on older Setuptools. + @property + def tag_svn_revision(self): + pass + @tag_svn_revision.setter + def tag_svn_revision(self, value): + pass + #################################### + def save_version_info(self, filename): + """ + Materialize the value of date into the + build tag. Install build keys in a deterministic order + to avoid arbitrary reordering on subsequent builds. 
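# --- Illustrative sketch, not part of the upstream patch ---
# A usage sketch for translate_pattern() defined above: '*' never crosses a
# directory separator, while '**' matches any number of directories. The
# example assumes a POSIX-style os.sep of '/'.

from setuptools.command.egg_info import translate_pattern

pat = translate_pattern('docs/**/*.txt')
print(bool(pat.match('docs/index.txt')))        # True: zero intermediate dirs
print(bool(pat.match('docs/api/index.txt')))    # True: '**' spans subdirectories
print(bool(pat.match('docs/api/readme.md')))    # False: suffix must match
print(bool(pat.match('src/docs/notes.txt')))    # False: pattern anchored at start
# ------------------------------------------------------------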
+ """ + # python 2.6 compatibility + odict = getattr(collections, 'OrderedDict', dict) + egg_info = odict() + # follow the order these keys would have been added + # when PYTHONHASHSEED=0 + egg_info['tag_build'] = self.tags() + egg_info['tag_date'] = 0 + edit_config(filename, dict(egg_info=egg_info)) - def finalize_options (self): + def finalize_options(self): self.egg_name = safe_name(self.distribution.get_name()) self.vtags = self.tags() self.egg_version = self.tagged_version() + parsed_version = parse_version(self.egg_version) + try: + is_version = isinstance(parsed_version, packaging.version.Version) + spec = ( + "%s==%s" if is_version else "%s===%s" + ) list( - parse_requirements('%s==%s' % (self.egg_name,self.egg_version)) + parse_requirements(spec % (self.egg_name, self.egg_version)) ) except ValueError: - raise DistutilsOptionError( + raise distutils.errors.DistutilsOptionError( "Invalid distribution name or version syntax: %s-%s" % - (self.egg_name,self.egg_version) + (self.egg_name, self.egg_version) ) if self.egg_base is None: dirs = self.distribution.package_dir - self.egg_base = (dirs or {}).get('',os.curdir) + self.egg_base = (dirs or {}).get('', os.curdir) self.ensure_dirname('egg_base') - self.egg_info = to_filename(self.egg_name)+'.egg-info' + self.egg_info = to_filename(self.egg_name) + '.egg-info' if self.egg_base != os.curdir: self.egg_info = os.path.join(self.egg_base, self.egg_info) - if '-' in self.egg_name: self.check_broken_egg_info() + if '-' in self.egg_name: + self.check_broken_egg_info() # Set package version for the benefit of dumber commands # (e.g. sdist, bdist_wininst, etc.) @@ -115,12 +210,11 @@ class egg_info(Command): # to the version info # pd = self.distribution._patched_dist - if pd is not None and pd.key==self.egg_name.lower(): + if pd is not None and pd.key == self.egg_name.lower(): pd._version = self.egg_version pd._parsed_version = parse_version(self.egg_version) self.distribution._patched_dist = None - def write_or_delete_file(self, what, filename, data, force=False): """Write `data` to `filename` or delete if empty @@ -148,6 +242,8 @@ class egg_info(Command): to the file. """ log.info("writing %s to %s", what, filename) + if six.PY3: + data = data.encode("utf-8") if not self.dry_run: f = open(filename, 'wb') f.write(data) @@ -160,14 +256,20 @@ class egg_info(Command): os.unlink(filename) def tagged_version(self): - return safe_version(self.distribution.get_version() + self.vtags) + version = self.distribution.get_version() + # egg_info may be called more than once for a distribution, + # in which case the version string already contains all tags. 
+ if self.vtags and version.endswith(self.vtags): + return safe_version(version) + return safe_version(version + self.vtags) def run(self): self.mkpath(self.egg_info) installer = self.distribution.fetch_build_egg for ep in iter_entry_points('egg_info.writers'): - writer = ep.load(installer=installer) - writer(self, ep.name, os.path.join(self.egg_info,ep.name)) + ep.require(installer=installer) + writer = ep.resolve() + writer(self, ep.name, os.path.join(self.egg_info, ep.name)) # Get rid of native_libs.txt if it was put there by older bdist_egg nl = os.path.join(self.egg_info, "native_libs.txt") @@ -179,117 +281,235 @@ class egg_info(Command): def tags(self): version = '' if self.tag_build: - version+=self.tag_build - if self.tag_svn_revision and ( - os.path.exists('.svn') or os.path.exists('PKG-INFO') - ): version += '-r%s' % self.get_svn_revision() + version += self.tag_build if self.tag_date: - import time; version += time.strftime("-%Y%m%d") + version += time.strftime("-%Y%m%d") return version - - - - - - - - - - - - - - - - - def get_svn_revision(self): - revision = 0 - urlre = re.compile('url="([^"]+)"') - revre = re.compile('committed-rev="(\d+)"') - - for base,dirs,files in os.walk(os.curdir): - if '.svn' not in dirs: - dirs[:] = [] - continue # no sense walking uncontrolled subdirs - dirs.remove('.svn') - f = open(os.path.join(base,'.svn','entries')) - data = f.read() - f.close() - - if data.startswith('9 and d[9]]+[0]) - if base==os.curdir: - base_url = dirurl+'/' # save the root url - elif not dirurl.startswith(base_url): - dirs[:] = [] - continue # not part of the same svn tree, skip it - revision = max(revision, localrev) - - return str(revision or get_pkg_info_revision()) - - - - def find_sources(self): """Generate SOURCES.txt manifest file""" - manifest_filename = os.path.join(self.egg_info,"SOURCES.txt") + manifest_filename = os.path.join(self.egg_info, "SOURCES.txt") mm = manifest_maker(self.distribution) mm.manifest = manifest_filename mm.run() self.filelist = mm.filelist def check_broken_egg_info(self): - bei = self.egg_name+'.egg-info' + bei = self.egg_name + '.egg-info' if self.egg_base != os.curdir: bei = os.path.join(self.egg_base, bei) if os.path.exists(bei): log.warn( - "-"*78+'\n' + "-" * 78 + '\n' "Note: Your current .egg-info directory has a '-' in its name;" '\nthis will not work correctly with "setup.py develop".\n\n' - 'Please rename %s to %s to correct this problem.\n'+'-'*78, + 'Please rename %s to %s to correct this problem.\n' + '-' * 78, bei, self.egg_info ) self.broken_egg_info = self.egg_info - self.egg_info = bei # make it work for now + self.egg_info = bei # make it work for now + + +class FileList(_FileList): + # Implementations of the various MANIFEST.in commands + + def process_template_line(self, line): + # Parse the line: split it up, make sure the right number of words + # is there, and return the relevant words. 'action' is always + # defined: it's the first word of the line. Which of the other + # three are defined depends on the action; it'll be either + # patterns, (dir and patterns), or (dir_pattern). + (action, patterns, dir, dir_pattern) = self._parse_template_line(line) + + # OK, now we know that the action is valid and we have the + # right number of words on the line for that action -- so we + # can proceed with minimal error-checking. 
+ if action == 'include': + self.debug_print("include " + ' '.join(patterns)) + for pattern in patterns: + if not self.include(pattern): + log.warn("warning: no files found matching '%s'", pattern) + + elif action == 'exclude': + self.debug_print("exclude " + ' '.join(patterns)) + for pattern in patterns: + if not self.exclude(pattern): + log.warn(("warning: no previously-included files " + "found matching '%s'"), pattern) + + elif action == 'global-include': + self.debug_print("global-include " + ' '.join(patterns)) + for pattern in patterns: + if not self.global_include(pattern): + log.warn(("warning: no files found matching '%s' " + "anywhere in distribution"), pattern) + + elif action == 'global-exclude': + self.debug_print("global-exclude " + ' '.join(patterns)) + for pattern in patterns: + if not self.global_exclude(pattern): + log.warn(("warning: no previously-included files matching " + "'%s' found anywhere in distribution"), + pattern) + + elif action == 'recursive-include': + self.debug_print("recursive-include %s %s" % + (dir, ' '.join(patterns))) + for pattern in patterns: + if not self.recursive_include(dir, pattern): + log.warn(("warning: no files found matching '%s' " + "under directory '%s'"), + pattern, dir) + + elif action == 'recursive-exclude': + self.debug_print("recursive-exclude %s %s" % + (dir, ' '.join(patterns))) + for pattern in patterns: + if not self.recursive_exclude(dir, pattern): + log.warn(("warning: no previously-included files matching " + "'%s' found under directory '%s'"), + pattern, dir) + + elif action == 'graft': + self.debug_print("graft " + dir_pattern) + if not self.graft(dir_pattern): + log.warn("warning: no directories found matching '%s'", + dir_pattern) + + elif action == 'prune': + self.debug_print("prune " + dir_pattern) + if not self.prune(dir_pattern): + log.warn(("no previously-included directories found " + "matching '%s'"), dir_pattern) + + else: + raise DistutilsInternalError( + "this cannot happen: invalid action '%s'" % action) + + def _remove_files(self, predicate): + """ + Remove all files from the file list that match the predicate. + Return True if any matching files were removed + """ + found = False + for i in range(len(self.files) - 1, -1, -1): + if predicate(self.files[i]): + self.debug_print(" removing " + self.files[i]) + del self.files[i] + found = True + return found + + def include(self, pattern): + """Include files that match 'pattern'.""" + found = [f for f in glob(pattern) if not os.path.isdir(f)] + self.extend(found) + return bool(found) + + def exclude(self, pattern): + """Exclude files that match 'pattern'.""" + match = translate_pattern(pattern) + return self._remove_files(match.match) + + def recursive_include(self, dir, pattern): + """ + Include all files anywhere in 'dir/' that match the pattern. + """ + full_pattern = os.path.join(dir, '**', pattern) + found = [f for f in glob(full_pattern, recursive=True) + if not os.path.isdir(f)] + self.extend(found) + return bool(found) + + def recursive_exclude(self, dir, pattern): + """ + Exclude any file anywhere in 'dir/' that match the pattern. 
+ """ + match = translate_pattern(os.path.join(dir, '**', pattern)) + return self._remove_files(match.match) + + def graft(self, dir): + """Include all files from 'dir/'.""" + found = [ + item + for match_dir in glob(dir) + for item in distutils.filelist.findall(match_dir) + ] + self.extend(found) + return bool(found) + + def prune(self, dir): + """Filter out files from 'dir/'.""" + match = translate_pattern(os.path.join(dir, '**')) + return self._remove_files(match.match) -class FileList(FileList): - """File list that accepts only existing, platform-independent paths""" + def global_include(self, pattern): + """ + Include all files anywhere in the current directory that match the + pattern. This is very inefficient on large file trees. + """ + if self.allfiles is None: + self.findall() + match = translate_pattern(os.path.join('**', pattern)) + found = [f for f in self.allfiles if match.match(f)] + self.extend(found) + return bool(found) + + def global_exclude(self, pattern): + """ + Exclude all files anywhere that match the pattern. + """ + match = translate_pattern(os.path.join('**', pattern)) + return self._remove_files(match.match) def append(self, item): - if item.endswith('\r'): # Fix older sdists built on Windows + if item.endswith('\r'): # Fix older sdists built on Windows item = item[:-1] path = convert_path(item) - if os.path.exists(path): + + if self._safe_path(path): self.files.append(path) + def extend(self, paths): + self.files.extend(filter(self._safe_path, paths)) + def _repair(self): + """ + Replace self.files with only safe paths + Because some owners of FileList manipulate the underlying + ``files`` attribute directly, this method must be called to + repair those paths. + """ + self.files = list(filter(self._safe_path, self.files)) + def _safe_path(self, path): + enc_warn = "'%s' not %s encodable -- skipping" + # To avoid accidental trans-codings errors, first to unicode + u_path = unicode_utils.filesys_decode(path) + if u_path is None: + log.warn("'%s' in unexpected encoding -- skipping" % path) + return False + # Must ensure utf-8 encodability + utf8_path = unicode_utils.try_encode(u_path, "utf-8") + if utf8_path is None: + log.warn(enc_warn, path, 'utf-8') + return False + try: + # accept is either way checks out + if os.path.exists(u_path) or os.path.exists(utf8_path): + return True + # this will catch any encode errors decoding u_path + except UnicodeEncodeError: + log.warn(enc_warn, path, sys.getfilesystemencoding()) class manifest_maker(sdist): - template = "MANIFEST.in" - def initialize_options (self): + def initialize_options(self): self.use_defaults = 1 self.prune = 1 self.manifest_only = 1 @@ -301,8 +521,7 @@ class manifest_maker(sdist): def run(self): self.filelist = FileList() if not os.path.exists(self.manifest): - self.write_manifest() # it must exist so it'll get in the list - self.filelist.findall() + self.write_manifest() # it must exist so it'll get in the list self.add_defaults() if os.path.exists(self.template): self.read_template() @@ -311,21 +530,33 @@ class manifest_maker(sdist): self.filelist.remove_duplicates() self.write_manifest() - def write_manifest (self): - """Write the file list in 'self.filelist' (presumably as filled in - by 'add_defaults()' and 'read_template()') to the manifest file + def _manifest_normalize(self, path): + path = unicode_utils.filesys_decode(path) + return path.replace(os.sep, '/') + + def write_manifest(self): + """ + Write the file list in 'self.filelist' to the manifest file named by 'self.manifest'. 
""" - files = self.filelist.files - if os.sep!='/': - files = [f.replace(os.sep,'/') for f in files] - self.execute(write_file, (self.manifest, files), - "writing manifest file '%s'" % self.manifest) - - def warn(self, msg): # suppress missing-file warnings from sdist - if not msg.startswith("standard file not found:"): + self.filelist._repair() + + # Now _repairs should encodability, but not unicode + files = [self._manifest_normalize(f) for f in self.filelist.files] + msg = "writing manifest file '%s'" % self.manifest + self.execute(write_file, (self.manifest, files), msg) + + def warn(self, msg): + if not self._should_suppress_warning(msg): sdist.warn(self, msg) + @staticmethod + def _should_suppress_warning(msg): + """ + suppress missing-file warnings from sdist + """ + return re.match(r"standard file .*not found", msg) + def add_defaults(self): sdist.add_defaults(self) self.filelist.append(self.template) @@ -336,35 +567,29 @@ class manifest_maker(sdist): elif os.path.exists(self.manifest): self.read_manifest() ei_cmd = self.get_finalized_command('egg_info') - self.filelist.include_pattern("*", prefix=ei_cmd.egg_info) + self.filelist.graft(ei_cmd.egg_info) - def prune_file_list (self): + def prune_file_list(self): build = self.get_finalized_command('build') base_dir = self.distribution.get_fullname() - self.filelist.exclude_pattern(None, prefix=build.build_base) - self.filelist.exclude_pattern(None, prefix=base_dir) + self.filelist.prune(build.build_base) + self.filelist.prune(base_dir) sep = re.escape(os.sep) - self.filelist.exclude_pattern(sep+r'(RCS|CVS|\.svn)'+sep, is_regex=1) + self.filelist.exclude_pattern(r'(^|' + sep + r')(RCS|CVS|\.svn)' + sep, + is_regex=1) -def write_file (filename, contents): +def write_file(filename, contents): """Create a file with the specified name and write 'contents' (a sequence of strings without line terminators) to it. 
""" - f = open(filename, "wb") # always write POSIX-style manifest - f.write("\n".join(contents)) - f.close() - - - - - - - - - + contents = "\n".join(contents) + # assuming the contents has been vetted for utf-8 encoding + contents = contents.encode("utf-8") + with open(filename, "wb") as f: # always write POSIX-style manifest + f.write(contents) def write_pkg_info(cmd, basename, filename): @@ -372,7 +597,7 @@ def write_pkg_info(cmd, basename, filename): if not cmd.dry_run: metadata = cmd.distribution.metadata metadata.version, oldver = cmd.egg_version, metadata.version - metadata.name, oldname = cmd.egg_name, metadata.name + metadata.name, oldname = cmd.egg_name, metadata.name try: # write unescaped data to PKG-INFO, so older pkg_resources # can still parse it @@ -380,8 +605,10 @@ def write_pkg_info(cmd, basename, filename): finally: metadata.name, metadata.version = oldname, oldver - safe = getattr(cmd.distribution,'zip_safe',None) - import bdist_egg; bdist_egg.write_safety_flag(cmd.egg_info, safe) + safe = getattr(cmd.distribution, 'zip_safe', None) + + bdist_egg.write_safety_flag(cmd.egg_info, safe) + def warn_depends_obsolete(cmd, basename, filename): if os.path.exists(filename): @@ -391,61 +618,79 @@ def warn_depends_obsolete(cmd, basename, filename): ) +def _write_requirements(stream, reqs): + lines = yield_lines(reqs or ()) + append_cr = lambda line: line + '\n' + lines = map(append_cr, lines) + stream.writelines(lines) + + def write_requirements(cmd, basename, filename): dist = cmd.distribution - data = ['\n'.join(yield_lines(dist.install_requires or ()))] - for extra,reqs in (dist.extras_require or {}).items(): - data.append('\n\n[%s]\n%s' % (extra, '\n'.join(yield_lines(reqs)))) - cmd.write_or_delete_file("requirements", filename, ''.join(data)) + data = six.StringIO() + _write_requirements(data, dist.install_requires) + extras_require = dist.extras_require or {} + for extra in sorted(extras_require): + data.write('\n[{extra}]\n'.format(**vars())) + _write_requirements(data, extras_require[extra]) + cmd.write_or_delete_file("requirements", filename, data.getvalue()) + + +def write_setup_requirements(cmd, basename, filename): + data = StringIO() + _write_requirements(data, cmd.distribution.setup_requires) + cmd.write_or_delete_file("setup-requirements", filename, data.getvalue()) + def write_toplevel_names(cmd, basename, filename): pkgs = dict.fromkeys( - [k.split('.',1)[0] + [ + k.split('.', 1)[0] for k in cmd.distribution.iter_distribution_names() ] ) - cmd.write_file("top-level names", filename, '\n'.join(pkgs)+'\n') - + cmd.write_file("top-level names", filename, '\n'.join(sorted(pkgs)) + '\n') def overwrite_arg(cmd, basename, filename): write_arg(cmd, basename, filename, True) + def write_arg(cmd, basename, filename, force=False): argname = os.path.splitext(basename)[0] value = getattr(cmd.distribution, argname, None) if value is not None: - value = '\n'.join(value)+'\n' + value = '\n'.join(value) + '\n' cmd.write_or_delete_file(argname, filename, value, force) + def write_entries(cmd, basename, filename): ep = cmd.distribution.entry_points - if isinstance(ep,basestring) or ep is None: + if isinstance(ep, six.string_types) or ep is None: data = ep elif ep is not None: data = [] - for section, contents in ep.items(): - if not isinstance(contents,basestring): + for section, contents in sorted(ep.items()): + if not isinstance(contents, six.string_types): contents = EntryPoint.parse_group(section, contents) - contents = '\n'.join(map(str,contents.values())) - 
data.append('[%s]\n%s\n\n' % (section,contents)) + contents = '\n'.join(sorted(map(str, contents.values()))) + data.append('[%s]\n%s\n\n' % (section, contents)) data = ''.join(data) cmd.write_or_delete_file('entry points', filename, data, True) + def get_pkg_info_revision(): - # See if we can get a -r### off of PKG-INFO, in case this is an sdist of - # a subversion revision - # + """ + Get a -r### off of PKG-INFO Version in case this is an sdist of + a subversion revision. + """ + warnings.warn("get_pkg_info_revision is deprecated.", DeprecationWarning) if os.path.exists('PKG-INFO'): - f = open('PKG-INFO','rU') - for line in f: - match = re.match(r"Version:.*-r(\d+)\s*$", line) - if match: - return int(match.group(1)) + with io.open('PKG-INFO') as f: + for line in f: + match = re.match(r"Version:.*-r(\d+)\s*$", line) + if match: + return int(match.group(1)) return 0 - - - -# diff --git a/setuptools/command/install.py b/setuptools/command/install.py index a150c43..31a5ddb 100644 --- a/setuptools/command/install.py +++ b/setuptools/command/install.py @@ -1,35 +1,41 @@ -import setuptools, sys, glob -from distutils.command.install import install as _install from distutils.errors import DistutilsArgError +import inspect +import glob +import warnings +import platform +import distutils.command.install as orig -class install(_install): +import setuptools + +# Prior to numpy 1.9, NumPy relies on the '_install' name, so provide it for +# now. See https://github.com/pypa/setuptools/issues/199/ +_install = orig.install + + +class install(orig.install): """Use easy_install to install the package, w/dependencies""" - user_options = _install.user_options + [ + user_options = orig.install.user_options + [ ('old-and-unmanageable', None, "Try not to use this!"), ('single-version-externally-managed', None, - "used by system package builders to create 'flat' eggs"), + "used by system package builders to create 'flat' eggs"), ] - boolean_options = _install.boolean_options + [ + boolean_options = orig.install.boolean_options + [ 'old-and-unmanageable', 'single-version-externally-managed', ] new_commands = [ ('install_egg_info', lambda self: True), - ('install_scripts', lambda self: True), + ('install_scripts', lambda self: True), ] _nc = dict(new_commands) - sub_commands = [ - cmd for cmd in _install.sub_commands if cmd[0] not in _nc - ] + new_commands def initialize_options(self): - _install.initialize_options(self) + orig.install.initialize_options(self) self.old_and_unmanageable = None self.single_version_externally_managed = None - self.no_compile = None # make DISTUTILS_DEBUG work right! def finalize_options(self): - _install.finalize_options(self) + orig.install.finalize_options(self) if self.root: self.single_version_externally_managed = True elif self.single_version_externally_managed: @@ -42,43 +48,50 @@ class install(_install): def handle_extra_path(self): if self.root or self.single_version_externally_managed: # explicit backward-compatibility mode, allow extra_path to work - return _install.handle_extra_path(self) + return orig.install.handle_extra_path(self) # Ignore extra_path when installing an egg (or being run by another # command without --root or --single-version-externally-managed self.path_file = None self.extra_dirs = '' - def run(self): # Explicit request for old-style install? Just do it if self.old_and_unmanageable or self.single_version_externally_managed: - return _install.run(self) - - # Attempt to detect whether we were called from setup() or by another - # command. 
If we were called by setup(), our caller will be the - # 'run_command' method in 'distutils.dist', and *its* caller will be - # the 'run_commands' method. If we were called any other way, our - # immediate caller *might* be 'run_command', but it won't have been - # called by 'run_commands'. This is slightly kludgy, but seems to - # work. - # - caller = sys._getframe(2) - caller_module = caller.f_globals.get('__name__','') - caller_name = caller.f_code.co_name - - if caller_module != 'distutils.dist' or caller_name!='run_commands': - # We weren't called from the command line or setup(), so we - # should run in backward-compatibility mode to support bdist_* - # commands. - _install.run(self) + return orig.install.run(self) + + if not self._called_from_setup(inspect.currentframe()): + # Run in backward-compatibility mode to support bdist_* commands. + orig.install.run(self) else: self.do_egg_install() - - - - + @staticmethod + def _called_from_setup(run_frame): + """ + Attempt to detect whether run() was called from setup() or by another + command. If called by setup(), the parent caller will be the + 'run_command' method in 'distutils.dist', and *its* caller will be + the 'run_commands' method. If called any other way, the + immediate caller *might* be 'run_command', but it won't have been + called by 'run_commands'. Return True in that case or if a call stack + is unavailable. Return False otherwise. + """ + if run_frame is None: + msg = "Call stack not available. bdist_* commands may fail." + warnings.warn(msg) + if platform.python_implementation() == 'IronPython': + msg = "For best results, pass -X:Frames to enable call stack." + warnings.warn(msg) + return True + res = inspect.getouterframes(run_frame)[2] + caller, = res[:1] + info = inspect.getframeinfo(caller) + caller_module = caller.f_globals.get('__name__', '') + return ( + caller_module == 'distutils.dist' + and info.function == 'run_commands' + ) def do_egg_install(self): @@ -105,19 +118,8 @@ class install(_install): setuptools.bootstrap_install_from = None - - - - - - - - - - - - - - - -# +# XXX Python 3.1 doesn't see _nc if this is inside the class +install.sub_commands = ( + [cmd for cmd in orig.install.sub_commands if cmd[0] not in install._nc] + + install.new_commands +) diff --git a/setuptools/command/install_egg_info.py b/setuptools/command/install_egg_info.py index 939340c..edc4718 100755 --- a/setuptools/command/install_egg_info.py +++ b/setuptools/command/install_egg_info.py @@ -1,9 +1,13 @@ +from distutils import log, dir_util +import os + from setuptools import Command +from setuptools import namespaces from setuptools.archive_util import unpack_archive -from distutils import log, dir_util -import os, shutil, pkg_resources +import pkg_resources -class install_egg_info(Command): + +class install_egg_info(namespaces.Installer, Command): """Install an .egg-info directory for the package""" description = "Install an .egg-info directory for the package" @@ -16,26 +20,26 @@ class install_egg_info(Command): self.install_dir = None def finalize_options(self): - self.set_undefined_options('install_lib',('install_dir','install_dir')) + self.set_undefined_options('install_lib', + ('install_dir', 'install_dir')) ei_cmd = self.get_finalized_command("egg_info") basename = pkg_resources.Distribution( None, None, ei_cmd.egg_name, ei_cmd.egg_version - ).egg_name()+'.egg-info' + ).egg_name() + '.egg-info' self.source = ei_cmd.egg_info self.target = os.path.join(self.install_dir, basename) - self.outputs = [self.target] + self.outputs = 
[] def run(self): self.run_command('egg_info') - target = self.target if os.path.isdir(self.target) and not os.path.islink(self.target): dir_util.remove_tree(self.target, dry_run=self.dry_run) elif os.path.exists(self.target): - self.execute(os.unlink,(self.target,),"Removing "+self.target) + self.execute(os.unlink, (self.target,), "Removing " + self.target) if not self.dry_run: pkg_resources.ensure_directory(self.target) - self.execute(self.copytree, (), - "Copying %s to %s" % (self.source, self.target) + self.execute( + self.copytree, (), "Copying %s to %s" % (self.source, self.target) ) self.install_namespaces() @@ -44,80 +48,15 @@ class install_egg_info(Command): def copytree(self): # Copy the .egg-info tree to site-packages - def skimmer(src,dst): + def skimmer(src, dst): # filter out source-control directories; note that 'src' is always # a '/'-separated path, regardless of platform. 'dst' is a # platform-specific path. - for skip in '.svn/','CVS/': - if src.startswith(skip) or '/'+skip in src: + for skip in '.svn/', 'CVS/': + if src.startswith(skip) or '/' + skip in src: return None self.outputs.append(dst) log.debug("Copying %s to %s", src, dst) return dst - unpack_archive(self.source, self.target, skimmer) - - - - - - - - - - - - - - - - - - - - - - - - - - def install_namespaces(self): - nsp = self._get_all_ns_packages() - if not nsp: return - filename,ext = os.path.splitext(self.target) - filename += '-nspkg.pth'; self.outputs.append(filename) - log.info("Installing %s",filename) - if not self.dry_run: - f = open(filename,'wb') - for pkg in nsp: - pth = tuple(pkg.split('.')) - trailer = '\n' - if '.' in pkg: - trailer = ( - "; m and setattr(sys.modules[%r], %r, m)\n" - % ('.'.join(pth[:-1]), pth[-1]) - ) - f.write( - "import sys,new,os; " - "p = os.path.join(sys._getframe(1).f_locals['sitedir'], " - "*%(pth)r); " - "ie = os.path.exists(os.path.join(p,'__init__.py')); " - "m = not ie and " - "sys.modules.setdefault(%(pkg)r,new.module(%(pkg)r)); " - "mp = (m or []) and m.__dict__.setdefault('__path__',[]); " - "(p not in mp) and mp.append(p)%(trailer)s" - % locals() - ) - f.close() - - def _get_all_ns_packages(self): - nsp = {} - for pkg in self.distribution.namespace_packages or []: - pkg = pkg.split('.') - while pkg: - nsp['.'.join(pkg)] = 1 - pkg.pop() - nsp=list(nsp) - nsp.sort() # set up shorter names first - return nsp - + unpack_archive(self.source, self.target, skimmer) diff --git a/setuptools/command/install_lib.py b/setuptools/command/install_lib.py index 82afa14..2b31c3e 100644 --- a/setuptools/command/install_lib.py +++ b/setuptools/command/install_lib.py @@ -1,20 +1,11 @@ -from distutils.command.install_lib import install_lib as _install_lib import os +import imp +from itertools import product, starmap +import distutils.command.install_lib as orig -class install_lib(_install_lib): - """Don't add compiled flags to filenames of non-Python files""" - - def _bytecode_filenames (self, py_filenames): - bytecode_files = [] - for py_file in py_filenames: - if not py_file.endswith('.py'): - continue - if self.compile: - bytecode_files.append(py_file + "c") - if self.optimize > 0: - bytecode_files.append(py_file + "o") - return bytecode_files +class install_lib(orig.install_lib): + """Don't add compiled flags to filenames of non-Python files""" def run(self): self.build() @@ -24,30 +15,83 @@ class install_lib(_install_lib): self.byte_compile(outfiles) def get_exclusions(self): - exclude = {} - nsp = self.distribution.namespace_packages - - if (nsp and 
self.get_finalized_command('install') - .single_version_externally_managed - ): - for pkg in nsp: - parts = pkg.split('.') - while parts: - pkgdir = os.path.join(self.install_dir, *parts) - for f in '__init__.py', '__init__.pyc', '__init__.pyo': - exclude[os.path.join(pkgdir,f)] = 1 - parts.pop() - return exclude + """ + Return a collections.Sized collections.Container of paths to be + excluded for single_version_externally_managed installations. + """ + all_packages = ( + pkg + for ns_pkg in self._get_SVEM_NSPs() + for pkg in self._all_packages(ns_pkg) + ) + + excl_specs = product(all_packages, self._gen_exclusion_paths()) + return set(starmap(self._exclude_pkg_path, excl_specs)) + + def _exclude_pkg_path(self, pkg, exclusion_path): + """ + Given a package name and exclusion path within that package, + compute the full exclusion path. + """ + parts = pkg.split('.') + [exclusion_path] + return os.path.join(self.install_dir, *parts) + + @staticmethod + def _all_packages(pkg_name): + """ + >>> list(install_lib._all_packages('foo.bar.baz')) + ['foo.bar.baz', 'foo.bar', 'foo'] + """ + while pkg_name: + yield pkg_name + pkg_name, sep, child = pkg_name.rpartition('.') + + def _get_SVEM_NSPs(self): + """ + Get namespace packages (list) but only for + single_version_externally_managed installations and empty otherwise. + """ + # TODO: is it necessary to short-circuit here? i.e. what's the cost + # if get_finalized_command is called even when namespace_packages is + # False? + if not self.distribution.namespace_packages: + return [] + + install_cmd = self.get_finalized_command('install') + svem = install_cmd.single_version_externally_managed + + return self.distribution.namespace_packages if svem else [] + + @staticmethod + def _gen_exclusion_paths(): + """ + Generate file paths to be excluded for namespace packages (bytecode + cache files). + """ + # always exclude the package module itself + yield '__init__.py' + + yield '__init__.pyc' + yield '__init__.pyo' + + if not hasattr(imp, 'get_tag'): + return + + base = os.path.join('__pycache__', '__init__.' 
+ imp.get_tag()) + yield base + '.pyc' + yield base + '.pyo' + yield base + '.opt-1.pyc' + yield base + '.opt-2.pyc' def copy_tree( - self, infile, outfile, - preserve_mode=1, preserve_times=1, preserve_symlinks=0, level=1 + self, infile, outfile, + preserve_mode=1, preserve_times=1, preserve_symlinks=0, level=1 ): assert preserve_mode and preserve_times and not preserve_symlinks exclude = self.get_exclusions() if not exclude: - return _install_lib.copy_tree(self, infile, outfile) + return orig.install_lib.copy_tree(self, infile, outfile) # Exclude namespace package __init__.py* files from the output @@ -58,7 +102,8 @@ class install_lib(_install_lib): def pf(src, dst): if dst in exclude: - log.warn("Skipping installation of %s (namespace package)",dst) + log.warn("Skipping installation of %s (namespace package)", + dst) return False log.info("copying %s -> %s", src, os.path.dirname(dst)) @@ -69,14 +114,8 @@ class install_lib(_install_lib): return outfiles def get_outputs(self): - outputs = _install_lib.get_outputs(self) + outputs = orig.install_lib.get_outputs(self) exclude = self.get_exclusions() if exclude: return [f for f in outputs if f not in exclude] return outputs - - - - - - diff --git a/setuptools/command/install_scripts.py b/setuptools/command/install_scripts.py index c2dc2d5..1623427 100755 --- a/setuptools/command/install_scripts.py +++ b/setuptools/command/install_scripts.py @@ -1,82 +1,65 @@ -from distutils.command.install_scripts import install_scripts \ - as _install_scripts -from easy_install import get_script_args, sys_executable, chmod -from pkg_resources import Distribution, PathMetadata, ensure_directory -import os from distutils import log +import distutils.command.install_scripts as orig +import os +import sys + +from pkg_resources import Distribution, PathMetadata, ensure_directory + -class install_scripts(_install_scripts): +class install_scripts(orig.install_scripts): """Do normal script install, plus any egg_info wrapper scripts""" def initialize_options(self): - _install_scripts.initialize_options(self) + orig.install_scripts.initialize_options(self) self.no_ep = False - + def run(self): + import setuptools.command.easy_install as ei + self.run_command("egg_info") if self.distribution.scripts: - _install_scripts.run(self) # run first to set up self.outfiles + orig.install_scripts.run(self) # run first to set up self.outfiles else: self.outfiles = [] if self.no_ep: # don't install entry point scripts into .egg file! - return + return - ei_cmd = self.get_finalized_command("egg_info") + ei_cmd = self.get_finalized_command("egg_info") dist = Distribution( ei_cmd.egg_base, PathMetadata(ei_cmd.egg_base, ei_cmd.egg_info), ei_cmd.egg_name, ei_cmd.egg_version, ) bs_cmd = self.get_finalized_command('build_scripts') - executable = getattr(bs_cmd,'executable',sys_executable) - is_wininst = getattr( - self.get_finalized_command("bdist_wininst"), '_is_running', False - ) - for args in get_script_args(dist, executable, is_wininst): + exec_param = getattr(bs_cmd, 'executable', None) + bw_cmd = self.get_finalized_command("bdist_wininst") + is_wininst = getattr(bw_cmd, '_is_running', False) + writer = ei.ScriptWriter + if is_wininst: + exec_param = "python.exe" + writer = ei.WindowsScriptWriter + if exec_param == sys.executable: + # In case the path to the Python executable contains a space, wrap + # it so it's not split up. 
+ exec_param = [exec_param] + # resolve the writer to the environment + writer = writer.best() + cmd = writer.command_spec_class.best().from_param(exec_param) + for args in writer.get_args(dist, cmd.as_header()): self.write_script(*args) - - - - def write_script(self, script_name, contents, mode="t", *ignored): """Write an executable file to the scripts directory""" + from setuptools.command.easy_install import chmod, current_umask + log.info("Installing %s script to %s", script_name, self.install_dir) target = os.path.join(self.install_dir, script_name) self.outfiles.append(target) + mask = current_umask() if not self.dry_run: ensure_directory(target) - f = open(target,"w"+mode) + f = open(target, "w" + mode) f.write(contents) f.close() - chmod(target,0755) - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + chmod(target, 0o777 - mask) diff --git a/setuptools/command/launcher manifest.xml b/setuptools/command/launcher manifest.xml new file mode 100644 index 0000000..5972a96 --- /dev/null +++ b/setuptools/command/launcher manifest.xml @@ -0,0 +1,15 @@ + + + + + + + + + + + + diff --git a/setuptools/command/py36compat.py b/setuptools/command/py36compat.py new file mode 100644 index 0000000..a2c74b2 --- /dev/null +++ b/setuptools/command/py36compat.py @@ -0,0 +1,136 @@ +import os +from glob import glob +from distutils.util import convert_path +from distutils.command import sdist + +from six.moves import filter + + +class sdist_add_defaults: + """ + Mix-in providing forward-compatibility for functionality as found in + distutils on Python 3.7. + + Do not edit the code in this class except to update functionality + as implemented in distutils. Instead, override in the subclass. + """ + + def add_defaults(self): + """Add all the default files to self.filelist: + - README or README.txt + - setup.py + - test/test*.py + - all pure Python modules mentioned in setup script + - all files pointed by package_data (build_py) + - all files defined in data_files. + - all files defined as scripts. + - all C sources listed as part of extensions or C libraries + in the setup script (doesn't catch C headers!) + Warns if (README or README.txt) or setup.py are missing; everything + else is optional. 
+ """ + self._add_defaults_standards() + self._add_defaults_optional() + self._add_defaults_python() + self._add_defaults_data_files() + self._add_defaults_ext() + self._add_defaults_c_libs() + self._add_defaults_scripts() + + @staticmethod + def _cs_path_exists(fspath): + """ + Case-sensitive path existence check + + >>> sdist_add_defaults._cs_path_exists(__file__) + True + >>> sdist_add_defaults._cs_path_exists(__file__.upper()) + False + """ + if not os.path.exists(fspath): + return False + # make absolute so we always have a directory + abspath = os.path.abspath(fspath) + directory, filename = os.path.split(abspath) + return filename in os.listdir(directory) + + def _add_defaults_standards(self): + standards = [self.READMES, self.distribution.script_name] + for fn in standards: + if isinstance(fn, tuple): + alts = fn + got_it = False + for fn in alts: + if self._cs_path_exists(fn): + got_it = True + self.filelist.append(fn) + break + + if not got_it: + self.warn("standard file not found: should have one of " + + ', '.join(alts)) + else: + if self._cs_path_exists(fn): + self.filelist.append(fn) + else: + self.warn("standard file '%s' not found" % fn) + + def _add_defaults_optional(self): + optional = ['test/test*.py', 'setup.cfg'] + for pattern in optional: + files = filter(os.path.isfile, glob(pattern)) + self.filelist.extend(files) + + def _add_defaults_python(self): + # build_py is used to get: + # - python modules + # - files defined in package_data + build_py = self.get_finalized_command('build_py') + + # getting python files + if self.distribution.has_pure_modules(): + self.filelist.extend(build_py.get_source_files()) + + # getting package_data files + # (computed in build_py.data_files by build_py.finalize_options) + for pkg, src_dir, build_dir, filenames in build_py.data_files: + for filename in filenames: + self.filelist.append(os.path.join(src_dir, filename)) + + def _add_defaults_data_files(self): + # getting distribution.data_files + if self.distribution.has_data_files(): + for item in self.distribution.data_files: + if isinstance(item, str): + # plain file + item = convert_path(item) + if os.path.isfile(item): + self.filelist.append(item) + else: + # a (dirname, filenames) tuple + dirname, filenames = item + for f in filenames: + f = convert_path(f) + if os.path.isfile(f): + self.filelist.append(f) + + def _add_defaults_ext(self): + if self.distribution.has_ext_modules(): + build_ext = self.get_finalized_command('build_ext') + self.filelist.extend(build_ext.get_source_files()) + + def _add_defaults_c_libs(self): + if self.distribution.has_c_libraries(): + build_clib = self.get_finalized_command('build_clib') + self.filelist.extend(build_clib.get_source_files()) + + def _add_defaults_scripts(self): + if self.distribution.has_scripts(): + build_scripts = self.get_finalized_command('build_scripts') + self.filelist.extend(build_scripts.get_source_files()) + + +if hasattr(sdist.sdist, '_add_defaults_standards'): + # disable the functionality already available upstream + class sdist_add_defaults: + pass diff --git a/setuptools/command/register.py b/setuptools/command/register.py index 3b2e085..8d6336a 100755 --- a/setuptools/command/register.py +++ b/setuptools/command/register.py @@ -1,10 +1,10 @@ -from distutils.command.register import register as _register +import distutils.command.register as orig -class register(_register): - __doc__ = _register.__doc__ + +class register(orig.register): + __doc__ = orig.register.__doc__ def run(self): # Make sure that we are using valid 
current name/version info self.run_command('egg_info') - _register.run(self) - + orig.register.run(self) diff --git a/setuptools/command/rotate.py b/setuptools/command/rotate.py index 11b6eae..7ea36e9 100755 --- a/setuptools/command/rotate.py +++ b/setuptools/command/rotate.py @@ -1,17 +1,22 @@ -import distutils, os -from setuptools import Command from distutils.util import convert_path from distutils import log -from distutils.errors import * +from distutils.errors import DistutilsOptionError +import os +import shutil + +import six + +from setuptools import Command + class rotate(Command): """Delete older distributions""" description = "delete older distributions, keeping N newest files" user_options = [ - ('match=', 'm', "patterns to match (required)"), + ('match=', 'm', "patterns to match (required)"), ('dist-dir=', 'd', "directory where the distributions are"), - ('keep=', 'k', "number of matching distributions to keep"), + ('keep=', 'k', "number of matching distributions to keep"), ] boolean_options = [] @@ -28,55 +33,34 @@ class rotate(Command): "(e.g. '.zip' or '.egg')" ) if self.keep is None: - raise DistutilsOptionError("Must specify number of files to keep") + raise DistutilsOptionError("Must specify number of files to keep") try: self.keep = int(self.keep) except ValueError: raise DistutilsOptionError("--keep must be an integer") - if isinstance(self.match, basestring): + if isinstance(self.match, six.string_types): self.match = [ convert_path(p.strip()) for p in self.match.split(',') ] - self.set_undefined_options('bdist',('dist_dir', 'dist_dir')) + self.set_undefined_options('bdist', ('dist_dir', 'dist_dir')) def run(self): self.run_command("egg_info") from glob import glob + for pattern in self.match: - pattern = self.distribution.get_name()+'*'+pattern - files = glob(os.path.join(self.dist_dir,pattern)) - files = [(os.path.getmtime(f),f) for f in files] + pattern = self.distribution.get_name() + '*' + pattern + files = glob(os.path.join(self.dist_dir, pattern)) + files = [(os.path.getmtime(f), f) for f in files] files.sort() files.reverse() log.info("%d file(s) matching %s", len(files), pattern) files = files[self.keep:] - for (t,f) in files: + for (t, f) in files: log.info("Deleting %s", f) if not self.dry_run: - os.unlink(f) - - - - - - - - - - - - - - - - - - - - - - - - - + if os.path.isdir(f): + shutil.rmtree(f) + else: + os.unlink(f) diff --git a/setuptools/command/saveopts.py b/setuptools/command/saveopts.py index 1180a44..611cec5 100755 --- a/setuptools/command/saveopts.py +++ b/setuptools/command/saveopts.py @@ -1,7 +1,6 @@ -import distutils, os -from setuptools import Command from setuptools.command.setopt import edit_config, option_base + class saveopts(option_base): """Save command-line options to a file""" @@ -9,17 +8,15 @@ class saveopts(option_base): def run(self): dist = self.distribution - commands = dist.command_options.keys() settings = {} - for cmd in commands: + for cmd in dist.command_options: - if cmd=='saveopts': - continue # don't save our own options! + if cmd == 'saveopts': + continue # don't save our own options! 
- for opt,(src,val) in dist.get_option_dict(cmd).items(): - if src=="command line": - settings.setdefault(cmd,{})[opt] = val + for opt, (src, val) in dist.get_option_dict(cmd).items(): + if src == "command line": + settings.setdefault(cmd, {})[opt] = val edit_config(self.filename, settings, self.dry_run) - diff --git a/setuptools/command/sdist.py b/setuptools/command/sdist.py index d84afdb..2c2d88a 100755 --- a/setuptools/command/sdist.py +++ b/setuptools/command/sdist.py @@ -1,42 +1,17 @@ -from distutils.command.sdist import sdist as _sdist -from distutils.util import convert_path from distutils import log -from glob import glob -import os, re, sys, pkg_resources - -entities = [ - ("<","<"), (">", ">"), (""", '"'), ("'", "'"), - ("&", "&") -] - -def unescape(data): - for old,new in entities: - data = data.replace(old,new) - return data - -def re_finder(pattern, postproc=None): - def find(dirname, filename): - f = open(filename,'rU') - data = f.read() - f.close() - for match in pattern.finditer(data): - path = match.group(1) - if postproc: - path = postproc(path) - yield joinpath(dirname,path) - return find - -def joinpath(prefix,suffix): - if not prefix: - return suffix - return os.path.join(prefix,suffix) - - - +import distutils.command.sdist as orig +import os +import sys +import io +import contextlib +import six +from .py36compat import sdist_add_defaults +import pkg_resources +_default_revctrl = list def walk_revctrl(dirname=''): @@ -45,83 +20,8 @@ def walk_revctrl(dirname=''): for item in ep.load()(dirname): yield item -def _default_revctrl(dirname=''): - for path, finder in finders: - path = joinpath(dirname,path) - if os.path.isfile(path): - for path in finder(dirname,path): - if os.path.isfile(path): - yield path - elif os.path.isdir(path): - for item in _default_revctrl(path): - yield item - -def externals_finder(dirname, filename): - """Find any 'svn:externals' directories""" - found = False - f = open(filename,'rb') - for line in iter(f.readline, ''): # can't use direct iter! 
- parts = line.split() - if len(parts)==2: - kind,length = parts - data = f.read(int(length)) - if kind=='K' and data=='svn:externals': - found = True - elif kind=='V' and found: - f.close() - break - else: - f.close() - return - - for line in data.splitlines(): - parts = line.split() - if parts: - yield joinpath(dirname, parts[0]) - - -entries_pattern = re.compile(r'name="([^"]+)"(?![^>]+deleted="true")', re.I) - -def entries_finder(dirname, filename): - f = open(filename,'rU') - data = f.read() - f.close() - if data.startswith('=6 and record[5]=="delete": - continue # skip deleted - yield joinpath(dirname, record[0]) - - -finders = [ - (convert_path('CVS/Entries'), - re_finder(re.compile(r"^\w?/([^/]+)/", re.M))), - (convert_path('.svn/entries'), entries_finder), - (convert_path('.svn/dir-props'), externals_finder), - (convert_path('.svn/dir-prop-base'), externals_finder), # svn 1.4 -] - - - - - - - - - - - -class sdist(_sdist): +class sdist(sdist_add_defaults, orig.sdist): """Smart sdist that finds anything supported by revision control""" user_options = [ @@ -133,102 +33,139 @@ class sdist(_sdist): ('dist-dir=', 'd', "directory to put the source distribution archive(s) in " "[default: dist]"), - ] + ] negative_opt = {} + READMES = 'README', 'README.rst', 'README.txt' + def run(self): self.run_command('egg_info') ei_cmd = self.get_finalized_command('egg_info') self.filelist = ei_cmd.filelist - self.filelist.append(os.path.join(ei_cmd.egg_info,'SOURCES.txt')) + self.filelist.append(os.path.join(ei_cmd.egg_info, 'SOURCES.txt')) self.check_readme() - self.check_metadata() + + # Run sub commands + for cmd_name in self.get_sub_commands(): + self.run_command(cmd_name) + + # Call check_metadata only if no 'check' command + # (distutils <= 2.6) + import distutils.command + + if 'check' not in distutils.command.__all__: + self.check_metadata() + self.make_distribution() - dist_files = getattr(self.distribution,'dist_files',[]) + dist_files = getattr(self.distribution, 'dist_files', []) for file in self.archive_files: data = ('sdist', '', file) if data not in dist_files: dist_files.append(data) - def read_template(self): - try: - _sdist.read_template(self) - except: - # grody hack to close the template file (MANIFEST.in) - # this prevents easy_install's attempt at deleting the file from - # dying and thus masking the real error - sys.exc_info()[2].tb_next.tb_frame.f_locals['template'].close() - raise + def initialize_options(self): + orig.sdist.initialize_options(self) - # Cribbed from old distutils code, to work around new distutils code - # that tries to do some of the same stuff as we do, in a way that makes - # us loop. 
- - def add_defaults (self): - standards = [('README', 'README.txt'), self.distribution.script_name] - - for fn in standards: - if type(fn) is tuple: - alts = fn - got_it = 0 - for fn in alts: - if os.path.exists(fn): - got_it = 1 - self.filelist.append(fn) - break - - if not got_it: - self.warn("standard file not found: should have one of " + - ', '.join(alts)) - else: - if os.path.exists(fn): - self.filelist.append(fn) - else: - self.warn("standard file '%s' not found" % fn) - - optional = ['test/test*.py', 'setup.cfg'] - - for pattern in optional: - files = filter(os.path.isfile, glob(pattern)) - if files: - self.filelist.extend(files) + self._default_to_gztar() - if self.distribution.has_pure_modules(): - build_py = self.get_finalized_command('build_py') - self.filelist.extend(build_py.get_source_files()) + def _default_to_gztar(self): + # only needed on Python prior to 3.6. + if sys.version_info >= (3, 6, 0, 'beta', 1): + return + self.formats = ['gztar'] + + def make_distribution(self): + """ + Workaround for #516 + """ + with self._remove_os_link(): + orig.sdist.make_distribution(self) - if self.distribution.has_ext_modules(): - build_ext = self.get_finalized_command('build_ext') - self.filelist.extend(build_ext.get_source_files()) + @staticmethod + @contextlib.contextmanager + def _remove_os_link(): + """ + In a context, remove and restore os.link if it exists + """ - if self.distribution.has_c_libraries(): - build_clib = self.get_finalized_command('build_clib') - self.filelist.extend(build_clib.get_source_files()) + class NoValue: + pass - if self.distribution.has_scripts(): - build_scripts = self.get_finalized_command('build_scripts') - self.filelist.extend(build_scripts.get_source_files()) + orig_val = getattr(os, 'link', NoValue) + try: + del os.link + except Exception: + pass + try: + yield + finally: + if orig_val is not NoValue: + setattr(os, 'link', orig_val) + + def __read_template_hack(self): + # This grody hack closes the template file (MANIFEST.in) if an + # exception occurs during read_template. + # Doing so prevents an error when easy_install attempts to delete the + # file. + try: + orig.sdist.read_template(self) + except Exception: + _, _, tb = sys.exc_info() + tb.tb_next.tb_frame.f_locals['template'].close() + raise + # Beginning with Python 2.7.2, 3.1.4, and 3.2.1, this leaky file handle + # has been fixed, so only override the method if we're using an earlier + # Python. + has_leaky_handle = ( + sys.version_info < (2, 7, 2) + or (3, 0) <= sys.version_info < (3, 1, 4) + or (3, 2) <= sys.version_info < (3, 2, 1) + ) + if has_leaky_handle: + read_template = __read_template_hack + + def _add_defaults_python(self): + """getting python files""" + if self.distribution.has_pure_modules(): + build_py = self.get_finalized_command('build_py') + self.filelist.extend(build_py.get_source_files()) + # This functionality is incompatible with include_package_data, and + # will in fact create an infinite recursion if include_package_data + # is True. 
Use of include_package_data will imply that + # distutils-style automatic handling of package_data is disabled + if not self.distribution.include_package_data: + for _, src_dir, _, filenames in build_py.data_files: + self.filelist.extend([os.path.join(src_dir, filename) + for filename in filenames]) + + def _add_defaults_data_files(self): + try: + if six.PY2: + sdist_add_defaults._add_defaults_data_files(self) + else: + super()._add_defaults_data_files() + except TypeError: + log.warn("data_files contains unexpected objects") def check_readme(self): - alts = ("README", "README.txt") - for f in alts: + for f in self.READMES: if os.path.exists(f): return else: self.warn( - "standard file not found: should have one of " +', '.join(alts) + "standard file not found: should have one of " + + ', '.join(self.READMES) ) - def make_release_tree(self, base_dir, files): - _sdist.make_release_tree(self, base_dir, files) + orig.sdist.make_release_tree(self, base_dir, files) # Save any egg_info command line options used to create this sdist dest = os.path.join(base_dir, 'setup.cfg') - if hasattr(os,'link') and os.path.exists(dest): + if hasattr(os, 'link') and os.path.exists(dest): # unlink and re-copy, since it might be hard-linked, and # we don't want to change the source version os.unlink(dest) @@ -236,11 +173,34 @@ class sdist(_sdist): self.get_finalized_command('egg_info').save_version_info(dest) - - - - - - - -# + def _manifest_is_not_generated(self): + # check for special comment used in 2.7.1 and higher + if not os.path.isfile(self.manifest): + return False + + with io.open(self.manifest, 'rb') as fp: + first_line = fp.readline() + return (first_line != + '# file GENERATED by distutils, do NOT edit\n'.encode()) + + def read_manifest(self): + """Read the manifest file (named by 'self.manifest') and use it to + fill in 'self.filelist', the list of files to include in the source + distribution. + """ + log.info("reading manifest file '%s'", self.manifest) + manifest = open(self.manifest, 'rb') + for line in manifest: + # The manifest must contain UTF-8. See #303. + if six.PY3: + try: + line = line.decode('UTF-8') + except UnicodeDecodeError: + log.warn("%r not UTF-8 decodable -- skipping" % line) + continue + # ignore comments and blank lines + line = line.strip() + if line.startswith('#') or not line: + continue + self.filelist.append(line) + manifest.close() diff --git a/setuptools/command/setopt.py b/setuptools/command/setopt.py index dbf3a94..6f6298c 100755 --- a/setuptools/command/setopt.py +++ b/setuptools/command/setopt.py @@ -1,8 +1,12 @@ -import distutils, os -from setuptools import Command from distutils.util import convert_path from distutils import log -from distutils.errors import * +from distutils.errors import DistutilsOptionError +import distutils +import os + +from six.moves import configparser + +from setuptools import Command __all__ = ['config_file', 'edit_config', 'option_base', 'setopt'] @@ -12,33 +16,20 @@ def config_file(kind="local"): `kind` must be one of "local", "global", or "user" """ - if kind=='local': + if kind == 'local': return 'setup.cfg' - if kind=='global': + if kind == 'global': return os.path.join( - os.path.dirname(distutils.__file__),'distutils.cfg' + os.path.dirname(distutils.__file__), 'distutils.cfg' ) - if kind=='user': - dot = os.name=='posix' and '.' or '' + if kind == 'user': + dot = os.name == 'posix' and '.' 
or '' return os.path.expanduser(convert_path("~/%spydistutils.cfg" % dot)) raise ValueError( "config_file() type must be 'local', 'global', or 'user'", kind ) - - - - - - - - - - - - - def edit_config(filename, settings, dry_run=False): """Edit a configuration file to include `settings` @@ -47,9 +38,8 @@ def edit_config(filename, settings, dry_run=False): while a dictionary lists settings to be changed or deleted in that section. A setting of ``None`` means to delete that setting. """ - from ConfigParser import RawConfigParser log.debug("Reading configuration from %s", filename) - opts = RawConfigParser() + opts = configparser.RawConfigParser() opts.read([filename]) for section, options in settings.items(): if options is None: @@ -59,46 +49,49 @@ def edit_config(filename, settings, dry_run=False): if not opts.has_section(section): log.debug("Adding new section [%s] to %s", section, filename) opts.add_section(section) - for option,value in options.items(): + for option, value in options.items(): if value is None: - log.debug("Deleting %s.%s from %s", + log.debug( + "Deleting %s.%s from %s", section, option, filename ) - opts.remove_option(section,option) + opts.remove_option(section, option) if not opts.options(section): log.info("Deleting empty [%s] section from %s", - section, filename) + section, filename) opts.remove_section(section) else: log.debug( "Setting %s.%s to %r in %s", section, option, value, filename ) - opts.set(section,option,value) + opts.set(section, option, value) log.info("Writing %s", filename) if not dry_run: - f = open(filename,'w'); opts.write(f); f.close() + with open(filename, 'w') as f: + opts.write(f) + class option_base(Command): """Abstract base class for commands that mess with config files""" - + user_options = [ ('global-config', 'g', - "save options to the site-wide distutils.cfg file"), + "save options to the site-wide distutils.cfg file"), ('user-config', 'u', - "save options to the current user's pydistutils.cfg file"), + "save options to the current user's pydistutils.cfg file"), ('filename=', 'f', - "configuration file to use (default=setup.cfg)"), + "configuration file to use (default=setup.cfg)"), ] boolean_options = [ 'global-config', 'user-config', - ] + ] def initialize_options(self): self.global_config = None - self.user_config = None + self.user_config = None self.filename = None def finalize_options(self): @@ -111,14 +104,12 @@ class option_base(Command): filenames.append(self.filename) if not filenames: filenames.append(config_file('local')) - if len(filenames)>1: + if len(filenames) > 1: raise DistutilsOptionError( "Must specify only one configuration file option", filenames ) - self.filename, = filenames - - + self.filename, = filenames class setopt(option_base): @@ -128,9 +119,9 @@ class setopt(option_base): user_options = [ ('command=', 'c', 'command to set an option for'), - ('option=', 'o', 'option to set'), - ('set-value=', 's', 'value of the option'), - ('remove', 'r', 'remove (unset) the value'), + ('option=', 'o', 'option to set'), + ('set-value=', 's', 'value of the option'), + ('remove', 'r', 'remove (unset) the value'), ] + option_base.user_options boolean_options = option_base.boolean_options + ['remove'] @@ -152,13 +143,7 @@ class setopt(option_base): def run(self): edit_config( self.filename, { - self.command: {self.option.replace('-','_'):self.set_value} + self.command: {self.option.replace('-', '_'): self.set_value} }, self.dry_run ) - - - - - - diff --git a/setuptools/command/test.py b/setuptools/command/test.py index 
db918da..e7a386d 100644 --- a/setuptools/command/test.py +++ b/setuptools/command/test.py @@ -1,12 +1,24 @@ -from setuptools import Command -from distutils.errors import DistutilsOptionError +import os +import operator import sys -from pkg_resources import * -from unittest import TestLoader, main +import contextlib +import itertools +from distutils.errors import DistutilsError, DistutilsOptionError +from distutils import log +from unittest import TestLoader + +import six +from six.moves import map, filter + +from pkg_resources import (resource_listdir, resource_exists, normalize_path, + working_set, _namespace_packages, + add_activation_listener, require, EntryPoint) +from setuptools import Command +from setuptools.py31compat import unittest_main -class ScanningLoader(TestLoader): - def loadTestsFromModule(self, module): +class ScanningLoader(TestLoader): + def loadTestsFromModule(self, module, pattern=None): """Return a suite of all tests cases contained in the given module If the module is a package, load tests from all the modules in it. @@ -14,79 +26,119 @@ class ScanningLoader(TestLoader): the return value to the tests. """ tests = [] - if module.__name__!='setuptools.tests.doctest': # ugh - tests.append(TestLoader.loadTestsFromModule(self,module)) + tests.append(TestLoader.loadTestsFromModule(self, module)) if hasattr(module, "additional_tests"): tests.append(module.additional_tests()) if hasattr(module, '__path__'): for file in resource_listdir(module.__name__, ''): - if file.endswith('.py') and file!='__init__.py': - submodule = module.__name__+'.'+file[:-3] + if file.endswith('.py') and file != '__init__.py': + submodule = module.__name__ + '.' + file[:-3] else: - if resource_exists( - module.__name__, file+'/__init__.py' - ): - submodule = module.__name__+'.'+file + if resource_exists(module.__name__, file + '/__init__.py'): + submodule = module.__name__ + '.' + file else: continue tests.append(self.loadTestsFromName(submodule)) - if len(tests)!=1: + if len(tests) != 1: return self.suiteClass(tests) else: - return tests[0] # don't create a nested suite for only one return + return tests[0] # don't create a nested suite for only one return -class test(Command): +# adapted from jaraco.classes.properties:NonDataProperty +class NonDataProperty(object): + def __init__(self, fget): + self.fget = fget + def __get__(self, obj, objtype=None): + if obj is None: + return self + return self.fget(obj) + + +class test(Command): """Command to run unit tests after in-place build""" description = "run unit tests after in-place build" user_options = [ - ('test-module=','m', "Run 'test_suite' in specified module"), - ('test-suite=','s', - "Test suite to run (e.g. 'some_module.test_suite')"), + ('test-module=', 'm', "Run 'test_suite' in specified module"), + ('test-suite=', 's', + "Test suite to run (e.g. 
'some_module.test_suite')"), + ('test-runner=', 'r', "Test runner to use"), ] def initialize_options(self): self.test_suite = None self.test_module = None self.test_loader = None - + self.test_runner = None def finalize_options(self): + if self.test_suite and self.test_module: + msg = "You may specify a module or a suite, but not both" + raise DistutilsOptionError(msg) + if self.test_suite is None: if self.test_module is None: self.test_suite = self.distribution.test_suite else: - self.test_suite = self.test_module+".test_suite" - elif self.test_module: - raise DistutilsOptionError( - "You may specify a module or a suite, but not both" - ) - - self.test_args = [self.test_suite] + self.test_suite = self.test_module + ".test_suite" - if self.verbose: - self.test_args.insert(0,'--verbose') if self.test_loader is None: - self.test_loader = getattr(self.distribution,'test_loader',None) + self.test_loader = getattr(self.distribution, 'test_loader', None) if self.test_loader is None: self.test_loader = "setuptools.command.test:ScanningLoader" + if self.test_runner is None: + self.test_runner = getattr(self.distribution, 'test_runner', None) + @NonDataProperty + def test_args(self): + return list(self._test_args()) + def _test_args(self): + if self.verbose: + yield '--verbose' + if self.test_suite: + yield self.test_suite def with_project_on_sys_path(self, func): - # Ensure metadata is up-to-date - self.run_command('egg_info') + """ + Backward compatibility for project_on_sys_path context. + """ + with self.project_on_sys_path(): + func() - # Build extensions in-place - self.reinitialize_command('build_ext', inplace=1) - self.run_command('build_ext') + @contextlib.contextmanager + def project_on_sys_path(self, include_dists=[]): + with_2to3 = six.PY3 and getattr(self.distribution, 'use_2to3', False) + + if with_2to3: + # If we run 2to3 we can not do this inplace: + + # Ensure metadata is up-to-date + self.reinitialize_command('build_py', inplace=0) + self.run_command('build_py') + bpy_cmd = self.get_finalized_command("build_py") + build_path = normalize_path(bpy_cmd.build_lib) + + # Build extensions + self.reinitialize_command('egg_info', egg_base=build_path) + self.run_command('egg_info') + + self.reinitialize_command('build_ext', inplace=0) + self.run_command('build_ext') + else: + # Without 2to3 inplace works fine: + self.run_command('egg_info') + + # Build extensions in-place + self.reinitialize_command('build_ext', inplace=1) + self.run_command('build_ext') ei_cmd = self.get_finalized_command("egg_info") @@ -94,71 +146,109 @@ class test(Command): old_modules = sys.modules.copy() try: - sys.path.insert(0, normalize_path(ei_cmd.egg_base)) + project_path = normalize_path(ei_cmd.egg_base) + sys.path.insert(0, project_path) working_set.__init__() add_activation_listener(lambda dist: dist.activate()) require('%s==%s' % (ei_cmd.egg_name, ei_cmd.egg_version)) - func() + with self.paths_on_pythonpath([project_path]): + yield finally: sys.path[:] = old_path sys.modules.clear() sys.modules.update(old_modules) working_set.__init__() + @staticmethod + @contextlib.contextmanager + def paths_on_pythonpath(paths): + """ + Add the indicated paths to the head of the PYTHONPATH environment + variable so that subprocesses will also see the packages at + these paths. 
- def run(self): - if self.distribution.install_requires: - self.distribution.fetch_build_eggs(self.distribution.install_requires) - if self.distribution.tests_require: - self.distribution.fetch_build_eggs(self.distribution.tests_require) - - if self.test_suite: - cmd = ' '.join(self.test_args) - if self.dry_run: - self.announce('skipping "unittest %s" (dry run)' % cmd) + Do this in a context that restores the value on exit. + """ + nothing = object() + orig_pythonpath = os.environ.get('PYTHONPATH', nothing) + current_pythonpath = os.environ.get('PYTHONPATH', '') + try: + prefix = os.pathsep.join(paths) + to_join = filter(None, [prefix, current_pythonpath]) + new_path = os.pathsep.join(to_join) + if new_path: + os.environ['PYTHONPATH'] = new_path + yield + finally: + if orig_pythonpath is nothing: + os.environ.pop('PYTHONPATH', None) else: - self.announce('running "unittest %s"' % cmd) - self.with_project_on_sys_path(self.run_tests) - - - def run_tests(self): - import unittest - loader_ep = EntryPoint.parse("x="+self.test_loader) - loader_class = loader_ep.load(require=False) - unittest.main( - None, None, [unittest.__file__]+self.test_args, - testLoader = loader_class() - ) - - - - - - - - - - - - - - - - - - - - - - - - - + os.environ['PYTHONPATH'] = orig_pythonpath + @staticmethod + def install_dists(dist): + """ + Install the requirements indicated by self.distribution and + return an iterable of the dists that were built. + """ + ir_d = dist.fetch_build_eggs(dist.install_requires or []) + tr_d = dist.fetch_build_eggs(dist.tests_require or []) + return itertools.chain(ir_d, tr_d) + def run(self): + installed_dists = self.install_dists(self.distribution) + cmd = ' '.join(self._argv) + if self.dry_run: + self.announce('skipping "%s" (dry run)' % cmd) + return + self.announce('running "%s"' % cmd) + paths = map(operator.attrgetter('location'), installed_dists) + with self.paths_on_pythonpath(paths): + with self.project_on_sys_path(): + self.run_tests() + def run_tests(self): + # Purge modules under test from sys.modules. The test loader will + # re-import them from the build location. Required when 2to3 is used + # with namespace packages. + if six.PY3 and getattr(self.distribution, 'use_2to3', False): + module = self.test_suite.split('.')[0] + if module in _namespace_packages: + del_modules = [] + if module in sys.modules: + del_modules.append(module) + module += '.' + for name in sys.modules: + if name.startswith(module): + del_modules.append(name) + list(map(sys.modules.__delitem__, del_modules)) + + exit_kwarg = {} if sys.version_info < (2, 7) else {"exit": False} + test = unittest_main( + None, None, self._argv, + testLoader=self._resolve_as_ep(self.test_loader), + testRunner=self._resolve_as_ep(self.test_runner), + **exit_kwarg + ) + if not test.result.wasSuccessful(): + msg = 'Test failed: %s' % test.result + self.announce(msg, log.ERROR) + raise DistutilsError(msg) + @property + def _argv(self): + return ['unittest'] + self.test_args + @staticmethod + def _resolve_as_ep(val): + """ + Load the indicated attribute value, called, as a as if it were + specified as an entry point. 
+ """ + if val is None: + return + parsed = EntryPoint.parse("x=" + val) + return parsed.resolve()() diff --git a/setuptools/command/upload.py b/setuptools/command/upload.py old mode 100755 new mode 100644 index 7ac08c2..a44173a --- a/setuptools/command/upload.py +++ b/setuptools/command/upload.py @@ -1,181 +1,42 @@ -"""distutils.command.upload +import getpass +from distutils.command import upload as orig -Implements the Distutils 'upload' subcommand (upload package to PyPI).""" -from distutils.errors import * -from distutils.core import Command -from distutils.spawn import spawn -from distutils import log -try: - from hashlib import md5 -except ImportError: - from md5 import md5 -import os -import socket -import platform -import ConfigParser -import httplib -import base64 -import urlparse -import cStringIO as StringIO - -class upload(Command): - - description = "upload binary package to PyPI" - - DEFAULT_REPOSITORY = 'http://pypi.python.org/pypi' - - user_options = [ - ('repository=', 'r', - "url of repository [default: %s]" % DEFAULT_REPOSITORY), - ('show-response', None, - 'display full response text from server'), - ('sign', 's', - 'sign files to upload using gpg'), - ('identity=', 'i', 'GPG identity used to sign files'), - ] - boolean_options = ['show-response', 'sign'] - - def initialize_options(self): - self.username = '' - self.password = '' - self.repository = '' - self.show_response = 0 - self.sign = False - self.identity = None +class upload(orig.upload): + """ + Override default upload behavior to obtain password + in a variety of different ways. + """ def finalize_options(self): - if self.identity and not self.sign: - raise DistutilsOptionError( - "Must use --sign for --identity to have meaning" - ) - if os.environ.has_key('HOME'): - rc = os.path.join(os.environ['HOME'], '.pypirc') - if os.path.exists(rc): - self.announce('Using PyPI login from %s' % rc) - config = ConfigParser.ConfigParser({ - 'username':'', - 'password':'', - 'repository':''}) - config.read(rc) - if not self.repository: - self.repository = config.get('server-login', 'repository') - if not self.username: - self.username = config.get('server-login', 'username') - if not self.password: - self.password = config.get('server-login', 'password') - if not self.repository: - self.repository = self.DEFAULT_REPOSITORY - - def run(self): - if not self.distribution.dist_files: - raise DistutilsOptionError("No dist file created in earlier command") - for command, pyversion, filename in self.distribution.dist_files: - self.upload_file(command, pyversion, filename) - - def upload_file(self, command, pyversion, filename): - # Sign if requested - if self.sign: - gpg_args = ["gpg", "--detach-sign", "-a", filename] - if self.identity: - gpg_args[2:2] = ["--local-user", self.identity] - spawn(gpg_args, - dry_run=self.dry_run) - - # Fill in the data - content = open(filename,'rb').read() - basename = os.path.basename(filename) - comment = '' - if command=='bdist_egg' and self.distribution.has_ext_modules(): - comment = "built on %s" % platform.platform(terse=1) - data = { - ':action':'file_upload', - 'protcol_version':'1', - 'name':self.distribution.get_name(), - 'version':self.distribution.get_version(), - 'content':(basename,content), - 'filetype':command, - 'pyversion':pyversion, - 'md5_digest':md5(content).hexdigest(), - } - if command == 'bdist_rpm': - dist, version, id = platform.dist() - if dist: - comment = 'built for %s %s' % (dist, version) - elif command == 'bdist_dumb': - comment = 'built for %s' % 
platform.platform(terse=1) - data['comment'] = comment - - if self.sign: - data['gpg_signature'] = (os.path.basename(filename) + ".asc", - open(filename+".asc").read()) - - # set up the authentication - auth = "Basic " + base64.encodestring(self.username + ":" + self.password).strip() - - # Build up the MIME payload for the POST data - boundary = '--------------GHSKFJDLGDS7543FJKLFHRE75642756743254' - sep_boundary = '\n--' + boundary - end_boundary = sep_boundary + '--' - body = StringIO.StringIO() - for key, value in data.items(): - # handle multiple entries for the same name - if type(value) != type([]): - value = [value] - for value in value: - if type(value) is tuple: - fn = ';filename="%s"' % value[0] - value = value[1] - else: - fn = "" - value = str(value) - body.write(sep_boundary) - body.write('\nContent-Disposition: form-data; name="%s"'%key) - body.write(fn) - body.write("\n\n") - body.write(value) - if value and value[-1] == '\r': - body.write('\n') # write an extra newline (lurve Macs) - body.write(end_boundary) - body.write("\n") - body = body.getvalue() - - self.announce("Submitting %s to %s" % (filename, self.repository), log.INFO) - - # build the Request - # We can't use urllib2 since we need to send the Basic - # auth right with the first request - schema, netloc, url, params, query, fragments = \ - urlparse.urlparse(self.repository) - assert not params and not query and not fragments - if schema == 'http': - http = httplib.HTTPConnection(netloc) - elif schema == 'https': - http = httplib.HTTPSConnection(netloc) - else: - raise AssertionError, "unsupported schema "+schema - - data = '' - loglevel = log.INFO + orig.upload.finalize_options(self) + self.username = ( + self.username or + getpass.getuser() + ) + # Attempt to obtain password. Short circuit evaluation at the first + # sign of success. + self.password = ( + self.password or + self._load_password_from_keyring() or + self._prompt_for_password() + ) + + def _load_password_from_keyring(self): + """ + Attempt to load password from keyring. Suppress Exceptions. + """ try: - http.connect() - http.putrequest("POST", url) - http.putheader('Content-type', - 'multipart/form-data; boundary=%s'%boundary) - http.putheader('Content-length', str(len(body))) - http.putheader('Authorization', auth) - http.endheaders() - http.send(body) - except socket.error, e: - self.announce(str(e), log.ERROR) - return - - r = http.getresponse() - if r.status == 200: - self.announce('Server response (%s): %s' % (r.status, r.reason), - log.INFO) - else: - self.announce('Upload failed (%s): %s' % (r.status, r.reason), - log.ERROR) - if self.show_response: - print '-'*75, r.read(), '-'*75 + keyring = __import__('keyring') + return keyring.get_password(self.repository, self.username) + except Exception: + pass + + def _prompt_for_password(self): + """ + Prompt for a password on the tty. Suppress Exceptions. + """ + try: + return getpass.getpass() + except (Exception, KeyboardInterrupt): + pass diff --git a/setuptools/command/upload_docs.py b/setuptools/command/upload_docs.py new file mode 100644 index 0000000..468cb37 --- /dev/null +++ b/setuptools/command/upload_docs.py @@ -0,0 +1,205 @@ +# -*- coding: utf-8 -*- +"""upload_docs + +Implements a Distutils 'upload_docs' subcommand (upload documentation to +PyPI's pythonhosted.org). 
+""" + +from base64 import standard_b64encode +from distutils import log +from distutils.errors import DistutilsOptionError +import os +import socket +import zipfile +import tempfile +import shutil +import itertools +import functools + +import six +from six.moves import http_client, urllib + +from pkg_resources import iter_entry_points +from .upload import upload + + +def _encode(s): + errors = 'surrogateescape' if six.PY3 else 'strict' + return s.encode('utf-8', errors) + + +class upload_docs(upload): + # override the default repository as upload_docs isn't + # supported by Warehouse (and won't be). + DEFAULT_REPOSITORY = 'https://pypi.python.org/pypi/' + + description = 'Upload documentation to PyPI' + + user_options = [ + ('repository=', 'r', + "url of repository [default: %s]" % upload.DEFAULT_REPOSITORY), + ('show-response', None, + 'display full response text from server'), + ('upload-dir=', None, 'directory to upload'), + ] + boolean_options = upload.boolean_options + + def has_sphinx(self): + if self.upload_dir is None: + for ep in iter_entry_points('distutils.commands', 'build_sphinx'): + return True + + sub_commands = [('build_sphinx', has_sphinx)] + + def initialize_options(self): + upload.initialize_options(self) + self.upload_dir = None + self.target_dir = None + + def finalize_options(self): + log.warn("Upload_docs command is deprecated. Use RTD instead.") + upload.finalize_options(self) + if self.upload_dir is None: + if self.has_sphinx(): + build_sphinx = self.get_finalized_command('build_sphinx') + self.target_dir = build_sphinx.builder_target_dir + else: + build = self.get_finalized_command('build') + self.target_dir = os.path.join(build.build_base, 'docs') + else: + self.ensure_dirname('upload_dir') + self.target_dir = self.upload_dir + self.announce('Using upload directory %s' % self.target_dir) + + def create_zipfile(self, filename): + zip_file = zipfile.ZipFile(filename, "w") + try: + self.mkpath(self.target_dir) # just in case + for root, dirs, files in os.walk(self.target_dir): + if root == self.target_dir and not files: + tmpl = "no files found in upload directory '%s'" + raise DistutilsOptionError(tmpl % self.target_dir) + for name in files: + full = os.path.join(root, name) + relative = root[len(self.target_dir):].lstrip(os.path.sep) + dest = os.path.join(relative, name) + zip_file.write(full, dest) + finally: + zip_file.close() + + def run(self): + # Run sub commands + for cmd_name in self.get_sub_commands(): + self.run_command(cmd_name) + + tmp_dir = tempfile.mkdtemp() + name = self.distribution.metadata.get_name() + zip_file = os.path.join(tmp_dir, "%s.zip" % name) + try: + self.create_zipfile(zip_file) + self.upload_file(zip_file) + finally: + shutil.rmtree(tmp_dir) + + @staticmethod + def _build_part(item, sep_boundary): + key, values = item + title = '\nContent-Disposition: form-data; name="%s"' % key + # handle multiple entries for the same name + if not isinstance(values, list): + values = [values] + for value in values: + if isinstance(value, tuple): + title += '; filename="%s"' % value[0] + value = value[1] + else: + value = _encode(value) + yield sep_boundary + yield _encode(title) + yield b"\n\n" + yield value + if value and value[-1:] == b'\r': + yield b'\n' # write an extra newline (lurve Macs) + + @classmethod + def _build_multipart(cls, data): + """ + Build up the MIME payload for the POST data + """ + boundary = b'--------------GHSKFJDLGDS7543FJKLFHRE75642756743254' + sep_boundary = b'\n--' + boundary + end_boundary = sep_boundary + b'--' + 
end_items = end_boundary, b"\n", + builder = functools.partial( + cls._build_part, + sep_boundary=sep_boundary, + ) + part_groups = map(builder, data.items()) + parts = itertools.chain.from_iterable(part_groups) + body_items = itertools.chain(parts, end_items) + content_type = 'multipart/form-data; boundary=%s' % boundary.decode('ascii') + return b''.join(body_items), content_type + + def upload_file(self, filename): + with open(filename, 'rb') as f: + content = f.read() + meta = self.distribution.metadata + data = { + ':action': 'doc_upload', + 'name': meta.get_name(), + 'content': (os.path.basename(filename), content), + } + # set up the authentication + credentials = _encode(self.username + ':' + self.password) + credentials = standard_b64encode(credentials) + if six.PY3: + credentials = credentials.decode('ascii') + auth = "Basic " + credentials + + body, ct = self._build_multipart(data) + + msg = "Submitting documentation to %s" % (self.repository) + self.announce(msg, log.INFO) + + # build the Request + # We can't use urllib2 since we need to send the Basic + # auth right with the first request + schema, netloc, url, params, query, fragments = \ + urllib.parse.urlparse(self.repository) + assert not params and not query and not fragments + if schema == 'http': + conn = http_client.HTTPConnection(netloc) + elif schema == 'https': + conn = http_client.HTTPSConnection(netloc) + else: + raise AssertionError("unsupported schema " + schema) + + data = '' + try: + conn.connect() + conn.putrequest("POST", url) + content_type = ct + conn.putheader('Content-type', content_type) + conn.putheader('Content-length', str(len(body))) + conn.putheader('Authorization', auth) + conn.endheaders() + conn.send(body) + except socket.error as e: + self.announce(str(e), log.ERROR) + return + + r = conn.getresponse() + if r.status == 200: + msg = 'Server response (%s): %s' % (r.status, r.reason) + self.announce(msg, log.INFO) + elif r.status == 301: + location = r.getheader('Location') + if location is None: + location = 'https://pythonhosted.org/%s/' % meta.get_name() + msg = 'Upload successful. Visit %s' % location + self.announce(msg, log.INFO) + else: + msg = 'Upload failed (%s): %s' % (r.status, r.reason) + self.announce(msg, log.ERROR) + if self.show_response: + print('-' * 75, r.read(), '-' * 75) diff --git a/setuptools/config.py b/setuptools/config.py new file mode 100644 index 0000000..39a01f8 --- /dev/null +++ b/setuptools/config.py @@ -0,0 +1,547 @@ +from __future__ import absolute_import, unicode_literals +import io +import os +import sys +from collections import defaultdict +from functools import partial + +from distutils.errors import DistutilsOptionError, DistutilsFileError +from setuptools.py26compat import import_module +from six import string_types + + +def read_configuration( + filepath, find_others=False, ignore_option_errors=False): + """Read given configuration file and returns options from it as a dict. + + :param str|unicode filepath: Path to configuration file + to get options from. + + :param bool find_others: Whether to search for other configuration files + which could be on in various places. + + :param bool ignore_option_errors: Whether to silently ignore + options, values of which could not be resolved (e.g. due to exceptions + in directives such as file:, attr:, etc.). + If False exceptions are propagated as expected. 
+ + :rtype: dict + """ + from setuptools.dist import Distribution, _Distribution + + filepath = os.path.abspath(filepath) + + if not os.path.isfile(filepath): + raise DistutilsFileError( + 'Configuration file %s does not exist.' % filepath) + + current_directory = os.getcwd() + os.chdir(os.path.dirname(filepath)) + + try: + dist = Distribution() + + filenames = dist.find_config_files() if find_others else [] + if filepath not in filenames: + filenames.append(filepath) + + _Distribution.parse_config_files(dist, filenames=filenames) + + handlers = parse_configuration( + dist, dist.command_options, + ignore_option_errors=ignore_option_errors) + + finally: + os.chdir(current_directory) + + return configuration_to_dict(handlers) + + +def configuration_to_dict(handlers): + """Returns configuration data gathered by given handlers as a dict. + + :param list[ConfigHandler] handlers: Handlers list, + usually from parse_configuration() + + :rtype: dict + """ + config_dict = defaultdict(dict) + + for handler in handlers: + + obj_alias = handler.section_prefix + target_obj = handler.target_obj + + for option in handler.set_options: + getter = getattr(target_obj, 'get_%s' % option, None) + + if getter is None: + value = getattr(target_obj, option) + + else: + value = getter() + + config_dict[obj_alias][option] = value + + return config_dict + + +def parse_configuration( + distribution, command_options, ignore_option_errors=False): + """Performs additional parsing of configuration options + for a distribution. + + Returns a list of used option handlers. + + :param Distribution distribution: + :param dict command_options: + :param bool ignore_option_errors: Whether to silently ignore + options, values of which could not be resolved (e.g. due to exceptions + in directives such as file:, attr:, etc.). + If False exceptions are propagated as expected. + :rtype: list + """ + meta = ConfigMetadataHandler( + distribution.metadata, command_options, ignore_option_errors) + meta.parse() + + options = ConfigOptionsHandler( + distribution, command_options, ignore_option_errors) + options.parse() + + return [meta, options] + + +class ConfigHandler(object): + """Handles metadata supplied in configuration files.""" + + section_prefix = None + """Prefix for config sections handled by this handler. + Must be provided by class heirs. + + """ + + aliases = {} + """Options aliases. + For compatibility with various packages. E.g.: d2to1 and pbr. + Note: `-` in keys is replaced with `_` by config parser. + + """ + + def __init__(self, target_obj, options, ignore_option_errors=False): + sections = {} + + section_prefix = self.section_prefix + for section_name, section_options in options.items(): + if not section_name.startswith(section_prefix): + continue + + section_name = section_name.replace(section_prefix, '').strip('.') + sections[section_name] = section_options + + self.ignore_option_errors = ignore_option_errors + self.target_obj = target_obj + self.sections = sections + self.set_options = [] + + @property + def parsers(self): + """Metadata item name to parser function mapping.""" + raise NotImplementedError( + '%s must provide .parsers property' % self.__class__.__name__) + + def __setitem__(self, option_name, value): + unknown = tuple() + target_obj = self.target_obj + + # Translate alias into real name. 
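+        # (e.g. the metadata handler maps the 'summary' alias to 'description')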
+ option_name = self.aliases.get(option_name, option_name) + + current_value = getattr(target_obj, option_name, unknown) + + if current_value is unknown: + raise KeyError(option_name) + + if current_value: + # Already inhabited. Skipping. + return + + skip_option = False + parser = self.parsers.get(option_name) + if parser: + try: + value = parser(value) + + except Exception: + skip_option = True + if not self.ignore_option_errors: + raise + + if skip_option: + return + + setter = getattr(target_obj, 'set_%s' % option_name, None) + if setter is None: + setattr(target_obj, option_name, value) + else: + setter(value) + + self.set_options.append(option_name) + + @classmethod + def _parse_list(cls, value, separator=','): + """Represents value as a list. + + Value is split either by separator (defaults to comma) or by lines. + + :param value: + :param separator: List items separator character. + :rtype: list + """ + if isinstance(value, list): # _get_parser_compound case + return value + + if '\n' in value: + value = value.splitlines() + else: + value = value.split(separator) + + return [chunk.strip() for chunk in value if chunk.strip()] + + @classmethod + def _parse_dict(cls, value): + """Represents value as a dict. + + :param value: + :rtype: dict + """ + separator = '=' + result = {} + for line in cls._parse_list(value): + key, sep, val = line.partition(separator) + if sep != separator: + raise DistutilsOptionError( + 'Unable to parse option value to dict: %s' % value) + result[key.strip()] = val.strip() + + return result + + @classmethod + def _parse_bool(cls, value): + """Represents value as boolean. + + :param value: + :rtype: bool + """ + value = value.lower() + return value in ('1', 'true', 'yes') + + @classmethod + def _parse_file(cls, value): + """Represents value as a string, allowing including text + from nearest files using `file:` directive. + + Directive is sandboxed and won't reach anything outside + directory with setup.py. + + Examples: + include: LICENSE + include: src/file.txt + + :param str value: + :rtype: str + """ + if not isinstance(value, string_types): + return value + + include_directive = 'file:' + if not value.startswith(include_directive): + return value + + current_directory = os.getcwd() + + filepath = value.replace(include_directive, '').strip() + filepath = os.path.abspath(filepath) + + if not filepath.startswith(current_directory): + raise DistutilsOptionError( + '`file:` directive can not access %s' % filepath) + + if os.path.isfile(filepath): + with io.open(filepath, encoding='utf-8') as f: + value = f.read() + + return value + + @classmethod + def _parse_attr(cls, value): + """Represents value as a module attribute. + + Examples: + attr: package.attr + attr: package.module.attr + + :param str value: + :rtype: str + """ + attr_directive = 'attr:' + if not value.startswith(attr_directive): + return value + + attrs_path = value.replace(attr_directive, '').strip().split('.') + attr_name = attrs_path.pop() + + module_name = '.'.join(attrs_path) + module_name = module_name or '__init__' + + sys.path.insert(0, os.getcwd()) + try: + module = import_module(module_name) + value = getattr(module, attr_name) + + finally: + sys.path = sys.path[1:] + + return value + + @classmethod + def _get_parser_compound(cls, *parse_methods): + """Returns parser function to represents value as a list. + + Parses a value applying given methods one after another. 
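+
+        For example, ``classifiers`` is parsed with
+        ``_get_parser_compound(parse_file, parse_list)``, so a ``file:``
+        reference is read first and the result is then split into a list.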
+ + :param parse_methods: + :rtype: callable + """ + def parse(value): + parsed = value + + for method in parse_methods: + parsed = method(parsed) + + return parsed + + return parse + + @classmethod + def _parse_section_to_dict(cls, section_options, values_parser=None): + """Parses section options into a dictionary. + + Optionally applies a given parser to values. + + :param dict section_options: + :param callable values_parser: + :rtype: dict + """ + value = {} + values_parser = values_parser or (lambda val: val) + for key, (_, val) in section_options.items(): + value[key] = values_parser(val) + return value + + def parse_section(self, section_options): + """Parses configuration file section. + + :param dict section_options: + """ + for (name, (_, value)) in section_options.items(): + try: + self[name] = value + + except KeyError: + pass # Keep silent for a new option may appear anytime. + + def parse(self): + """Parses configuration file items from one + or more related sections. + + """ + for section_name, section_options in self.sections.items(): + + method_postfix = '' + if section_name: # [section.option] variant + method_postfix = '_%s' % section_name + + section_parser_method = getattr( + self, + # Dots in section names are tranlsated into dunderscores. + ('parse_section%s' % method_postfix).replace('.', '__'), + None) + + if section_parser_method is None: + raise DistutilsOptionError( + 'Unsupported distribution option section: [%s.%s]' % ( + self.section_prefix, section_name)) + + section_parser_method(section_options) + + +class ConfigMetadataHandler(ConfigHandler): + + section_prefix = 'metadata' + + aliases = { + 'home_page': 'url', + 'summary': 'description', + 'classifier': 'classifiers', + 'platform': 'platforms', + } + + strict_mode = False + """We need to keep it loose, to be partially compatible with + `pbr` and `d2to1` packages which also uses `metadata` section. + + """ + + @property + def parsers(self): + """Metadata item name to parser function mapping.""" + parse_list = self._parse_list + parse_file = self._parse_file + + return { + 'platforms': parse_list, + 'keywords': parse_list, + 'provides': parse_list, + 'requires': parse_list, + 'obsoletes': parse_list, + 'classifiers': self._get_parser_compound(parse_file, parse_list), + 'license': parse_file, + 'description': parse_file, + 'long_description': parse_file, + 'version': self._parse_version, + } + + def _parse_version(self, value): + """Parses `version` option value. 
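+
+        For example, ``version = attr: mypkg.VERSION`` (illustrative module
+        name) resolves the attribute, which may be a plain string, a callable
+        returning one, or an iterable of version parts.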
+ + :param value: + :rtype: str + + """ + version = self._parse_attr(value) + + if callable(version): + version = version() + + if not isinstance(version, string_types): + if hasattr(version, '__iter__'): + version = '.'.join(map(str, version)) + else: + version = '%s' % version + + return version + + +class ConfigOptionsHandler(ConfigHandler): + + section_prefix = 'options' + + @property + def parsers(self): + """Metadata item name to parser function mapping.""" + parse_list = self._parse_list + parse_list_semicolon = partial(self._parse_list, separator=';') + parse_bool = self._parse_bool + parse_dict = self._parse_dict + + return { + 'zip_safe': parse_bool, + 'use_2to3': parse_bool, + 'include_package_data': parse_bool, + 'package_dir': parse_dict, + 'use_2to3_fixers': parse_list, + 'use_2to3_exclude_fixers': parse_list, + 'convert_2to3_doctests': parse_list, + 'scripts': parse_list, + 'eager_resources': parse_list, + 'dependency_links': parse_list, + 'namespace_packages': parse_list, + 'install_requires': parse_list_semicolon, + 'setup_requires': parse_list_semicolon, + 'tests_require': parse_list_semicolon, + 'packages': self._parse_packages, + 'entry_points': self._parse_file, + } + + def _parse_packages(self, value): + """Parses `packages` option value. + + :param value: + :rtype: list + """ + find_directive = 'find:' + + if not value.startswith(find_directive): + return self._parse_list(value) + + # Read function arguments from a dedicated section. + find_kwargs = self.parse_section_packages__find( + self.sections.get('packages.find', {})) + + from setuptools import find_packages + + return find_packages(**find_kwargs) + + def parse_section_packages__find(self, section_options): + """Parses `packages.find` configuration file section. + + To be used in conjunction with _parse_packages(). + + :param dict section_options: + """ + section_data = self._parse_section_to_dict( + section_options, self._parse_list) + + valid_keys = ['where', 'include', 'exclude'] + + find_kwargs = dict( + [(k, v) for k, v in section_data.items() if k in valid_keys and v]) + + where = find_kwargs.get('where') + if where is not None: + find_kwargs['where'] = where[0] # cast list to single val + + return find_kwargs + + def parse_section_entry_points(self, section_options): + """Parses `entry_points` configuration file section. + + :param dict section_options: + """ + parsed = self._parse_section_to_dict(section_options, self._parse_list) + self['entry_points'] = parsed + + def _parse_package_data(self, section_options): + parsed = self._parse_section_to_dict(section_options, self._parse_list) + + root = parsed.get('*') + if root: + parsed[''] = root + del parsed['*'] + + return parsed + + def parse_section_package_data(self, section_options): + """Parses `package_data` configuration file section. + + :param dict section_options: + """ + self['package_data'] = self._parse_package_data(section_options) + + def parse_section_exclude_package_data(self, section_options): + """Parses `exclude_package_data` configuration file section. + + :param dict section_options: + """ + self['exclude_package_data'] = self._parse_package_data( + section_options) + + def parse_section_extras_require(self, section_options): + """Parses `extras_require` configuration file section. 
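+
+        For example (illustrative names), an ``[options.extras_require]``
+        section containing ``tests = pytest; mock`` is parsed into
+        ``{'tests': ['pytest', 'mock']}``.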
+ + :param dict section_options: + """ + parse_list = partial(self._parse_list, separator=';') + self['extras_require'] = self._parse_section_to_dict( + section_options, parse_list) diff --git a/setuptools/dep_util.py b/setuptools/dep_util.py new file mode 100644 index 0000000..2931c13 --- /dev/null +++ b/setuptools/dep_util.py @@ -0,0 +1,23 @@ +from distutils.dep_util import newer_group + +# yes, this is was almost entirely copy-pasted from +# 'newer_pairwise()', this is just another convenience +# function. +def newer_pairwise_group(sources_groups, targets): + """Walk both arguments in parallel, testing if each source group is newer + than its corresponding target. Returns a pair of lists (sources_groups, + targets) where sources is newer than target, according to the semantics + of 'newer_group()'. + """ + if len(sources_groups) != len(targets): + raise ValueError("'sources_group' and 'targets' must be the same length") + + # build a pair of lists (sources_groups, targets) where source is newer + n_sources = [] + n_targets = [] + for i in range(len(sources_groups)): + if newer_group(sources_groups[i], targets[i]): + n_sources.append(sources_groups[i]) + n_targets.append(targets[i]) + + return n_sources, n_targets diff --git a/setuptools/depends.py b/setuptools/depends.py index 5fdf2d7..45e7052 100644 --- a/setuptools/depends.py +++ b/setuptools/depends.py @@ -1,18 +1,22 @@ -from __future__ import generators -import sys, imp, marshal +import sys +import imp +import marshal +from distutils.version import StrictVersion from imp import PKG_DIRECTORY, PY_COMPILED, PY_SOURCE, PY_FROZEN -from distutils.version import StrictVersion, LooseVersion + +from .py33compat import Bytecode + __all__ = [ 'Require', 'find_module', 'get_module_constant', 'extract_constant' ] + class Require: """A prerequisite to building or installing a distribution""" - def __init__(self,name,requested_version,module,homepage='', - attribute=None,format=None - ): + def __init__(self, name, requested_version, module, homepage='', + attribute=None, format=None): if format is None and requested_version is not None: format = StrictVersion @@ -25,22 +29,18 @@ class Require: self.__dict__.update(locals()) del self.self - def full_name(self): """Return full package/distribution name, w/version""" if self.requested_version is not None: - return '%s-%s' % (self.name,self.requested_version) + return '%s-%s' % (self.name, self.requested_version) return self.name - - def version_ok(self,version): + def version_ok(self, version): """Is 'version' sufficiently up-to-date?""" return self.attribute is None or self.format is None or \ - str(version)!="unknown" and version >= self.requested_version - + str(version) != "unknown" and version >= self.requested_version def get_version(self, paths=None, default="unknown"): - """Get version number of installed module, 'None', or 'default' Search 'paths' for module. If not found, return 'None'. 
If found, @@ -53,26 +53,25 @@ class Require: if self.attribute is None: try: - f,p,i = find_module(self.module,paths) - if f: f.close() + f, p, i = find_module(self.module, paths) + if f: + f.close() return default except ImportError: return None - v = get_module_constant(self.module,self.attribute,default,paths) + v = get_module_constant(self.module, self.attribute, default, paths) if v is not None and v is not default and self.format is not None: return self.format(v) return v - - def is_present(self,paths=None): + def is_present(self, paths=None): """Return true if dependency is present on 'paths'""" return self.get_version(paths) is not None - - def is_current(self,paths=None): + def is_current(self, paths=None): """Return true if dependency is present and up-to-date on 'paths'""" version = self.get_version(paths) if version is None: @@ -80,47 +79,6 @@ class Require: return self.version_ok(version) -def _iter_code(code): - - """Yield '(op,arg)' pair for each operation in code object 'code'""" - - from array import array - from dis import HAVE_ARGUMENT, EXTENDED_ARG - - bytes = array('b',code.co_code) - eof = len(code.co_code) - - ptr = 0 - extended_arg = 0 - - while ptr=HAVE_ARGUMENT: - - arg = bytes[ptr+1] + bytes[ptr+2]*256 + extended_arg - ptr += 3 - - if op==EXTENDED_ARG: - extended_arg = arg * 65536L - continue - - else: - arg = None - ptr += 1 - - yield op,arg - - - - - - - - - - def find_module(module, paths=None): """Just like 'imp.find_module()', but with package support""" @@ -128,42 +86,19 @@ def find_module(module, paths=None): while parts: part = parts.pop(0) - f, path, (suffix,mode,kind) = info = imp.find_module(part, paths) + f, path, (suffix, mode, kind) = info = imp.find_module(part, paths) - if kind==PKG_DIRECTORY: + if kind == PKG_DIRECTORY: parts = parts or ['__init__'] paths = [path] elif parts: - raise ImportError("Can't find %r in %s" % (parts,module)) + raise ImportError("Can't find %r in %s" % (parts, module)) return info - - - - - - - - - - - - - - - - - - - - - - def get_module_constant(module, symbol, default=-1, paths=None): - """Find 'module' by searching 'paths', and extract 'symbol' Return 'None' if 'module' does not exist on 'paths', or it does not define @@ -171,39 +106,33 @@ def get_module_constant(module, symbol, default=-1, paths=None): constant. Otherwise, return 'default'.""" try: - f, path, (suffix,mode,kind) = find_module(module,paths) + f, path, (suffix, mode, kind) = find_module(module, paths) except ImportError: # Module doesn't exist return None try: - if kind==PY_COMPILED: - f.read(8) # skip magic & date + if kind == PY_COMPILED: + f.read(8) # skip magic & date code = marshal.load(f) - elif kind==PY_FROZEN: + elif kind == PY_FROZEN: code = imp.get_frozen_object(module) - elif kind==PY_SOURCE: + elif kind == PY_SOURCE: code = compile(f.read(), path, 'exec') else: # Not something we can parse; we'll have to import it. 
:( if module not in sys.modules: - imp.load_module(module,f,path,(suffix,mode,kind)) - return getattr(sys.modules[module],symbol,None) + imp.load_module(module, f, path, (suffix, mode, kind)) + return getattr(sys.modules[module], symbol, None) finally: if f: f.close() - return extract_constant(code,symbol,default) - - - - - + return extract_constant(code, symbol, default) - -def extract_constant(code,symbol,default=-1): +def extract_constant(code, symbol, default=-1): """Extract the constant value of 'symbol' from 'code' If the name 'symbol' is bound to a constant value by the Python code @@ -215,9 +144,8 @@ def extract_constant(code,symbol,default=-1): only 'STORE_NAME' and 'STORE_GLOBAL' opcodes are checked, and 'symbol' must be present in 'code.co_names'. """ - if symbol not in code.co_names: - # name's not there, can't possibly be an assigment + # name's not there, can't possibly be an assignment return None name_idx = list(code.co_names).index(symbol) @@ -228,19 +156,31 @@ def extract_constant(code,symbol,default=-1): const = default - for op, arg in _iter_code(code): + for byte_code in Bytecode(code): + op = byte_code.opcode + arg = byte_code.arg - if op==LOAD_CONST: + if op == LOAD_CONST: const = code.co_consts[arg] - elif arg==name_idx and (op==STORE_NAME or op==STORE_GLOBAL): + elif arg == name_idx and (op == STORE_NAME or op == STORE_GLOBAL): return const else: const = default - -if sys.platform.startswith('java') or sys.platform == 'cli': - # XXX it'd be better to test assertions about bytecode instead... - del extract_constant, get_module_constant - __all__.remove('extract_constant') - __all__.remove('get_module_constant') +def _update_globals(): + """ + Patch the globals to remove the objects not available on some platforms. + + XXX it'd be better to test assertions about bytecode instead. 
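+
+    (on Jython or IronPython, ``extract_constant`` and ``get_module_constant``
+    are deleted from the module globals and removed from ``__all__``)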
+ """ + + if not sys.platform.startswith('java') and sys.platform != 'cli': + return + incompatible = 'extract_constant', 'get_module_constant' + for name in incompatible: + del globals()[name] + __all__.remove(name) + + +_update_globals() diff --git a/setuptools/dist.py b/setuptools/dist.py index c1218ef..71c6c28 100644 --- a/setuptools/dist.py +++ b/setuptools/dist.py @@ -1,76 +1,139 @@ __all__ = ['Distribution'] -from distutils.core import Distribution as _Distribution +import re +import os +import warnings +import numbers +import distutils.log +import distutils.core +import distutils.cmd +import distutils.dist +from distutils.errors import (DistutilsOptionError, DistutilsPlatformError, + DistutilsSetupError) +from distutils.util import rfc822_escape + +import six +from six.moves import map +import packaging.specifiers +import packaging.version + from setuptools.depends import Require -from setuptools.command.install import install -from setuptools.command.sdist import sdist -from setuptools.command.install_lib import install_lib -from distutils.errors import DistutilsOptionError, DistutilsPlatformError -from distutils.errors import DistutilsSetupError -import setuptools, pkg_resources, distutils.core, distutils.dist, distutils.cmd -import os, distutils.log, re +from setuptools import windows_support +from setuptools.monkey import get_unpatched +from setuptools.config import parse_configuration +import pkg_resources +from .py36compat import Distribution_parse_config_files + def _get_unpatched(cls): - """Protect against re-patching the distutils if reloaded + warnings.warn("Do not call this function", DeprecationWarning) + return get_unpatched(cls) + - Also ensures that no other distutils extension monkeypatched the distutils - first. +# Based on Python 3.5 version +def write_pkg_file(self, file): + """Write the PKG-INFO format data to a file object. 
""" - while cls.__module__.startswith('setuptools'): - cls, = cls.__bases__ - if not cls.__module__.startswith('distutils'): - raise AssertionError( - "distutils has already been patched by %r" % cls - ) - return cls + version = '1.0' + if (self.provides or self.requires or self.obsoletes or + self.classifiers or self.download_url): + version = '1.1' + # Setuptools specific for PEP 345 + if hasattr(self, 'python_requires'): + version = '1.2' + + file.write('Metadata-Version: %s\n' % version) + file.write('Name: %s\n' % self.get_name()) + file.write('Version: %s\n' % self.get_version()) + file.write('Summary: %s\n' % self.get_description()) + file.write('Home-page: %s\n' % self.get_url()) + file.write('Author: %s\n' % self.get_contact()) + file.write('Author-email: %s\n' % self.get_contact_email()) + file.write('License: %s\n' % self.get_license()) + if self.download_url: + file.write('Download-URL: %s\n' % self.download_url) + + long_desc = rfc822_escape(self.get_long_description()) + file.write('Description: %s\n' % long_desc) + + keywords = ','.join(self.get_keywords()) + if keywords: + file.write('Keywords: %s\n' % keywords) + + self._write_list(file, 'Platform', self.get_platforms()) + self._write_list(file, 'Classifier', self.get_classifiers()) + + # PEP 314 + self._write_list(file, 'Requires', self.get_requires()) + self._write_list(file, 'Provides', self.get_provides()) + self._write_list(file, 'Obsoletes', self.get_obsoletes()) + + # Setuptools specific for PEP 345 + if hasattr(self, 'python_requires'): + file.write('Requires-Python: %s\n' % self.python_requires) + + +# from Python 3.4 +def write_pkg_info(self, base_dir): + """Write the PKG-INFO file into the release tree. + """ + with open(os.path.join(base_dir, 'PKG-INFO'), 'w', + encoding='UTF-8') as pkg_info: + self.write_pkg_file(pkg_info) -_Distribution = _get_unpatched(_Distribution) sequence = tuple, list + def check_importable(dist, attr, value): try: - ep = pkg_resources.EntryPoint.parse('x='+value) + ep = pkg_resources.EntryPoint.parse('x=' + value) assert not ep.extras - except (TypeError,ValueError,AttributeError,AssertionError): + except (TypeError, ValueError, AttributeError, AssertionError): raise DistutilsSetupError( "%r must be importable 'module:attrs' string (got %r)" - % (attr,value) + % (attr, value) ) def assert_string_list(dist, attr, value): """Verify that value is a string list or None""" try: - assert ''.join(value)!=value - except (TypeError,ValueError,AttributeError,AssertionError): + assert ''.join(value) != value + except (TypeError, ValueError, AttributeError, AssertionError): raise DistutilsSetupError( - "%r must be a list of strings (got %r)" % (attr,value) + "%r must be a list of strings (got %r)" % (attr, value) ) + def check_nsp(dist, attr, value): """Verify that namespace packages are valid""" - assert_string_list(dist,attr,value) - for nsp in value: + ns_packages = value + assert_string_list(dist, attr, ns_packages) + for nsp in ns_packages: if not dist.has_contents_for(nsp): raise DistutilsSetupError( "Distribution contains no modules or packages for " + "namespace package %r" % nsp ) - if '.' 
in nsp: - parent = '.'.join(nsp.split('.')[:-1]) - if parent not in value: - distutils.log.warn( - "WARNING: %r is declared as a package namespace, but %r" - " is not: please correct this in setup.py", nsp, parent - ) + parent, sep, child = nsp.rpartition('.') + if parent and parent not in ns_packages: + distutils.log.warn( + "WARNING: %r is declared as a package namespace, but %r" + " is not: please correct this in setup.py", nsp, parent + ) + def check_extras(dist, attr, value): """Verify that extras_require mapping is valid""" try: - for k,v in value.items(): + for k, v in value.items(): + if ':' in k: + k, m = k.split(':', 1) + if pkg_resources.invalid_marker(m): + raise DistutilsSetupError("Invalid environment marker: " + m) list(pkg_resources.parse_requirements(v)) - except (TypeError,ValueError,AttributeError): + except (TypeError, ValueError, AttributeError): raise DistutilsSetupError( "'extras_require' must be a dictionary whose values are " "strings or lists of strings containing valid project/version " @@ -78,91 +141,81 @@ def check_extras(dist, attr, value): ) - - def assert_bool(dist, attr, value): """Verify that value is True, False, 0, or 1""" if bool(value) != value: - raise DistutilsSetupError( - "%r must be a boolean value (got %r)" % (attr,value) - ) + tmpl = "{attr!r} must be a boolean value (got {value!r})" + raise DistutilsSetupError(tmpl.format(attr=attr, value=value)) + + def check_requirements(dist, attr, value): """Verify that install_requires is a valid requirements list""" try: list(pkg_resources.parse_requirements(value)) - except (TypeError,ValueError): - raise DistutilsSetupError( - "%r must be a string or list of strings " - "containing valid project/version requirement specifiers" % (attr,) + except (TypeError, ValueError) as error: + tmpl = ( + "{attr!r} must be a string or list of strings " + "containing valid project/version requirement specifiers; {error}" ) + raise DistutilsSetupError(tmpl.format(attr=attr, error=error)) + + +def check_specifier(dist, attr, value): + """Verify that value is a valid version specifier""" + try: + packaging.specifiers.SpecifierSet(value) + except packaging.specifiers.InvalidSpecifier as error: + tmpl = ( + "{attr!r} must be a string or list of strings " + "containing valid version specifiers; {error}" + ) + raise DistutilsSetupError(tmpl.format(attr=attr, error=error)) + + def check_entry_points(dist, attr, value): """Verify that entry_points map is parseable""" try: pkg_resources.EntryPoint.parse_map(value) - except ValueError, e: + except ValueError as e: raise DistutilsSetupError(e) + def check_test_suite(dist, attr, value): - if not isinstance(value,basestring): + if not isinstance(value, six.string_types): raise DistutilsSetupError("test_suite must be a string") + def check_package_data(dist, attr, value): """Verify that value is a dictionary of package names to glob lists""" - if isinstance(value,dict): - for k,v in value.items(): - if not isinstance(k,str): break - try: iter(v) + if isinstance(value, dict): + for k, v in value.items(): + if not isinstance(k, str): + break + try: + iter(v) except TypeError: break else: return raise DistutilsSetupError( - attr+" must be a dictionary mapping package names to lists of " + attr + " must be a dictionary mapping package names to lists of " "wildcard patterns" ) + def check_packages(dist, attr, value): for pkgname in value: if not re.match(r'\w+(\.\w+)*', pkgname): distutils.log.warn( - "WARNING: %r not a valid package name; please use only" + "WARNING: %r not a valid 
package name; please use only " ".-separated package names in setup.py", pkgname ) - - - - - - - - - - - - - - - - - - +_Distribution = get_unpatched(distutils.core.Distribution) - - - - - - - - - - - - -class Distribution(_Distribution): +class Distribution(Distribution_parse_config_files, _Distribution): """Distribution with support for features, tests, and package data This is an enhanced version of 'distutils.dist.Distribution' that @@ -190,7 +243,8 @@ class Distribution(_Distribution): EasyInstall and requests one of your extras, the corresponding additional requirements will be installed if needed. - 'features' -- a dictionary mapping option names to 'setuptools.Feature' + 'features' **deprecated** -- a dictionary mapping option names to + 'setuptools.Feature' objects. Features are a portion of the distribution that can be included or excluded based on user options, inter-feature dependencies, and availability on the current system. Excluded features are omitted @@ -244,28 +298,62 @@ class Distribution(_Distribution): dist._version = pkg_resources.safe_version(str(attrs['version'])) self._patched_dist = dist - def __init__ (self, attrs=None): + def __init__(self, attrs=None): have_package_data = hasattr(self, "package_data") if not have_package_data: self.package_data = {} + _attrs_dict = attrs or {} + if 'features' in _attrs_dict or 'require_features' in _attrs_dict: + Feature.warn_deprecated() self.require_features = [] self.features = {} self.dist_files = [] + self.src_root = attrs and attrs.pop("src_root", None) self.patch_missing_pkg_info(attrs) # Make sure we have any eggs needed to interpret 'attrs' if attrs is not None: self.dependency_links = attrs.pop('dependency_links', []) - assert_string_list(self,'dependency_links',self.dependency_links) + assert_string_list(self, 'dependency_links', self.dependency_links) if attrs and 'setup_requires' in attrs: - self.fetch_build_eggs(attrs.pop('setup_requires')) + self.fetch_build_eggs(attrs['setup_requires']) for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'): - if not hasattr(self,ep.name): - setattr(self,ep.name,None) - _Distribution.__init__(self,attrs) - if isinstance(self.metadata.version, (int,long,float)): + vars(self).setdefault(ep.name, None) + _Distribution.__init__(self, attrs) + if isinstance(self.metadata.version, numbers.Number): # Some people apparently take "version number" too literally :) self.metadata.version = str(self.metadata.version) + if self.metadata.version is not None: + try: + ver = packaging.version.Version(self.metadata.version) + normalized_version = str(ver) + if self.metadata.version != normalized_version: + warnings.warn( + "Normalizing '%s' to '%s'" % ( + self.metadata.version, + normalized_version, + ) + ) + self.metadata.version = normalized_version + except (packaging.version.InvalidVersion, TypeError): + warnings.warn( + "The version specified (%r) is an invalid version, this " + "may not work as expected with newer versions of " + "setuptools, pip, and PyPI. Please see PEP 440 for more " + "details." % self.metadata.version + ) + if getattr(self, 'python_requires', None): + self.metadata.python_requires = self.python_requires + + def parse_config_files(self, filenames=None): + """Parses configuration files from various levels + and loads configuration. 
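+        (distutils-style parsing runs first; the declarative ``[metadata]`` and
+        ``[options]`` sections are then handled by ``parse_configuration()``).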
+ + """ + _Distribution.parse_config_files(self, filenames=filenames) + + parse_configuration(self, self.command_options) + def parse_command_line(self): """Process features after parsing command line options""" result = _Distribution.parse_command_line(self) @@ -273,17 +361,20 @@ class Distribution(_Distribution): self._finalize_features() return result - def _feature_attrname(self,name): + def _feature_attrname(self, name): """Convert feature name to corresponding option attribute name""" - return 'with_'+name.replace('-','_') + return 'with_' + name.replace('-', '_') def fetch_build_eggs(self, requires): """Resolve pre-setup requirements""" - from pkg_resources import working_set, parse_requirements - for dist in working_set.resolve( - parse_requirements(requires), installer=self.fetch_build_egg - ): - working_set.add(dist) + resolved_dists = pkg_resources.working_set.resolve( + pkg_resources.parse_requirements(requires), + installer=self.fetch_build_egg, + replace_conflicting=True, + ) + for dist in resolved_dists: + pkg_resources.working_set.add(dist, replace=True) + return resolved_dists def finalize_options(self): _Distribution.finalize_options(self) @@ -291,36 +382,59 @@ class Distribution(_Distribution): self._set_global_opts_from_features() for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'): - value = getattr(self,ep.name,None) + value = getattr(self, ep.name, None) if value is not None: ep.require(installer=self.fetch_build_egg) ep.load()(self, ep.name, value) + if getattr(self, 'convert_2to3_doctests', None): + # XXX may convert to set here when we can rely on set being builtin + self.convert_2to3_doctests = [os.path.abspath(p) for p in self.convert_2to3_doctests] + else: + self.convert_2to3_doctests = [] + + def get_egg_cache_dir(self): + egg_cache_dir = os.path.join(os.curdir, '.eggs') + if not os.path.exists(egg_cache_dir): + os.mkdir(egg_cache_dir) + windows_support.hide_file(egg_cache_dir) + readme_txt_filename = os.path.join(egg_cache_dir, 'README.txt') + with open(readme_txt_filename, 'w') as f: + f.write('This directory contains eggs that were downloaded ' + 'by setuptools to build, test, and run plug-ins.\n\n') + f.write('This directory caches those eggs to prevent ' + 'repeated downloads.\n\n') + f.write('However, it is safe to delete this directory.\n\n') + + return egg_cache_dir def fetch_build_egg(self, req): """Fetch an egg needed for building""" + try: cmd = self._egg_fetcher + cmd.package_index.to_scan = [] except AttributeError: from setuptools.command.easy_install import easy_install - dist = self.__class__({'script_args':['easy_install']}) + dist = self.__class__({'script_args': ['easy_install']}) dist.parse_config_files() opts = dist.get_option_dict('easy_install') keep = ( 'find_links', 'site_dirs', 'index_url', 'optimize', 'site_dirs', 'allow_hosts' ) - for key in opts.keys(): + for key in list(opts): if key not in keep: - del opts[key] # don't use any other settings + del opts[key] # don't use any other settings if self.dependency_links: links = self.dependency_links[:] if 'find_links' in opts: links = opts['find_links'][1].split() + links opts['find_links'] = ('setup', links) + install_dir = self.get_egg_cache_dir() cmd = easy_install( - dist, args=["x"], install_dir=os.curdir, exclude_scripts=True, + dist, args=["x"], install_dir=install_dir, exclude_scripts=True, always_copy=False, build_directory=None, editable=False, - upgrade=False, multi_version=True, no_report = True + upgrade=False, multi_version=True, no_report=True, 
user=False ) cmd.ensure_finalized() self._egg_fetcher = cmd @@ -332,65 +446,47 @@ class Distribution(_Distribution): go = [] no = self.negative_opt.copy() - for name,feature in self.features.items(): - self._set_feature(name,None) + for name, feature in self.features.items(): + self._set_feature(name, None) feature.validate(self) if feature.optional: descr = feature.description incdef = ' (default)' - excdef='' + excdef = '' if not feature.include_by_default(): excdef, incdef = incdef, excdef - go.append(('with-'+name, None, 'include '+descr+incdef)) - go.append(('without-'+name, None, 'exclude '+descr+excdef)) - no['without-'+name] = 'with-'+name + go.append(('with-' + name, None, 'include ' + descr + incdef)) + go.append(('without-' + name, None, 'exclude ' + descr + excdef)) + no['without-' + name] = 'with-' + name self.global_options = self.feature_options = go + self.global_options self.negative_opt = self.feature_negopt = no - - - - - - - - - - - - - - - - - def _finalize_features(self): """Add/remove features and resolve dependencies between them""" # First, flag all the enabled items (and thus their dependencies) - for name,feature in self.features.items(): + for name, feature in self.features.items(): enabled = self.feature_is_included(name) if enabled or (enabled is None and feature.include_by_default()): feature.include_in(self) - self._set_feature(name,1) + self._set_feature(name, 1) # Then disable the rest, so that off-by-default features don't # get flagged as errors when they're required by an enabled feature - for name,feature in self.features.items(): + for name, feature in self.features.items(): if not self.feature_is_included(name): feature.exclude_from(self) - self._set_feature(name,0) - + self._set_feature(name, 0) def get_command_class(self, command): """Pluggable version of get_command_class()""" if command in self.cmdclass: return self.cmdclass[command] - for ep in pkg_resources.iter_entry_points('distutils.commands',command): + for ep in pkg_resources.iter_entry_points('distutils.commands', command): ep.require(installer=self.fetch_build_egg) self.cmdclass[command] = cmdclass = ep.load() return cmdclass @@ -400,34 +496,39 @@ class Distribution(_Distribution): def print_commands(self): for ep in pkg_resources.iter_entry_points('distutils.commands'): if ep.name not in self.cmdclass: - cmdclass = ep.load(False) # don't require extras, we're not running + # don't require extras as the commands won't be invoked + cmdclass = ep.resolve() self.cmdclass[ep.name] = cmdclass return _Distribution.print_commands(self) + def get_command_list(self): + for ep in pkg_resources.iter_entry_points('distutils.commands'): + if ep.name not in self.cmdclass: + # don't require extras as the commands won't be invoked + cmdclass = ep.resolve() + self.cmdclass[ep.name] = cmdclass + return _Distribution.get_command_list(self) - - - - def _set_feature(self,name,status): + def _set_feature(self, name, status): """Set feature's inclusion status""" - setattr(self,self._feature_attrname(name),status) + setattr(self, self._feature_attrname(name), status) - def feature_is_included(self,name): + def feature_is_included(self, name): """Return 1 if feature is included, 0 if excluded, 'None' if unknown""" - return getattr(self,self._feature_attrname(name)) + return getattr(self, self._feature_attrname(name)) - def include_feature(self,name): + def include_feature(self, name): """Request inclusion of feature named 'name'""" - if self.feature_is_included(name)==0: + if self.feature_is_included(name) == 0: 
descr = self.features[name].description raise DistutilsOptionError( - descr + " is required, but was excluded or is not available" - ) + descr + " is required, but was excluded or is not available" + ) self.features[name].include_in(self) - self._set_feature(name,1) + self._set_feature(name, 1) - def include(self,**attrs): + def include(self, **attrs): """Add items to distribution that are named in keyword arguments For example, 'dist.exclude(py_modules=["x"])' would add 'x' to @@ -442,96 +543,86 @@ class Distribution(_Distribution): will try to call 'dist._include_foo({"bar":"baz"})', which can then handle whatever special inclusion logic is needed. """ - for k,v in attrs.items(): - include = getattr(self, '_include_'+k, None) + for k, v in attrs.items(): + include = getattr(self, '_include_' + k, None) if include: include(v) else: - self._include_misc(k,v) + self._include_misc(k, v) - def exclude_package(self,package): + def exclude_package(self, package): """Remove packages, modules, and extensions in named package""" - pfx = package+'.' + pfx = package + '.' if self.packages: self.packages = [ p for p in self.packages - if p!=package and not p.startswith(pfx) + if p != package and not p.startswith(pfx) ] if self.py_modules: self.py_modules = [ p for p in self.py_modules - if p!=package and not p.startswith(pfx) + if p != package and not p.startswith(pfx) ] if self.ext_modules: self.ext_modules = [ p for p in self.ext_modules - if p.name!=package and not p.name.startswith(pfx) + if p.name != package and not p.name.startswith(pfx) ] - - def has_contents_for(self,package): + def has_contents_for(self, package): """Return true if 'exclude_package(package)' would do something""" - pfx = package+'.' + pfx = package + '.' for p in self.iter_distribution_names(): - if p==package or p.startswith(pfx): + if p == package or p.startswith(pfx): return True - - - - - - - - - - def _exclude_misc(self,name,value): + def _exclude_misc(self, name, value): """Handle 'exclude()' for list/tuple attrs without a special handler""" - if not isinstance(value,sequence): + if not isinstance(value, sequence): raise DistutilsSetupError( "%s: setting must be a list or tuple (%r)" % (name, value) ) try: - old = getattr(self,name) + old = getattr(self, name) except AttributeError: raise DistutilsSetupError( "%s: No such distribution setting" % name ) - if old is not None and not isinstance(old,sequence): + if old is not None and not isinstance(old, sequence): raise DistutilsSetupError( - name+": this setting cannot be changed via include/exclude" + name + ": this setting cannot be changed via include/exclude" ) elif old: - setattr(self,name,[item for item in old if item not in value]) + setattr(self, name, [item for item in old if item not in value]) - def _include_misc(self,name,value): + def _include_misc(self, name, value): """Handle 'include()' for list/tuple attrs without a special handler""" - if not isinstance(value,sequence): + if not isinstance(value, sequence): raise DistutilsSetupError( "%s: setting must be a list (%r)" % (name, value) ) try: - old = getattr(self,name) + old = getattr(self, name) except AttributeError: raise DistutilsSetupError( "%s: No such distribution setting" % name ) if old is None: - setattr(self,name,value) - elif not isinstance(old,sequence): + setattr(self, name, value) + elif not isinstance(old, sequence): raise DistutilsSetupError( - name+": this setting cannot be changed via include/exclude" + name + ": this setting cannot be changed via include/exclude" ) else: - 
setattr(self,name,old+[item for item in value if item not in old]) + setattr(self, name, old + [item for item in value if item not in old]) - def exclude(self,**attrs): + def exclude(self, **attrs): """Remove items from distribution that are named in keyword arguments For example, 'dist.exclude(py_modules=["x"])' would remove 'x' from @@ -547,30 +638,19 @@ class Distribution(_Distribution): will try to call 'dist._exclude_foo({"bar":"baz"})', which can then handle whatever special exclusion logic is needed. """ - for k,v in attrs.items(): - exclude = getattr(self, '_exclude_'+k, None) + for k, v in attrs.items(): + exclude = getattr(self, '_exclude_' + k, None) if exclude: exclude(v) else: - self._exclude_misc(k,v) + self._exclude_misc(k, v) - def _exclude_packages(self,packages): - if not isinstance(packages,sequence): + def _exclude_packages(self, packages): + if not isinstance(packages, sequence): raise DistutilsSetupError( "packages: setting must be a list or tuple (%r)" % (packages,) ) - map(self.exclude_package, packages) - - - - - - - - - - - + list(map(self.exclude_package, packages)) def _parse_command_opts(self, parser, args): # Remove --with-X/--without-X options when processing command args @@ -581,38 +661,23 @@ class Distribution(_Distribution): command = args[0] aliases = self.get_option_dict('aliases') while command in aliases: - src,alias = aliases[command] - del aliases[command] # ensure each alias can expand only once! + src, alias = aliases[command] + del aliases[command] # ensure each alias can expand only once! import shlex - args[:1] = shlex.split(alias,True) + args[:1] = shlex.split(alias, True) command = args[0] nargs = _Distribution._parse_command_opts(self, parser, args) # Handle commands that want to consume all remaining arguments cmd_class = self.get_command_class(command) - if getattr(cmd_class,'command_consumes_arguments',None): + if getattr(cmd_class, 'command_consumes_arguments', None): self.get_option_dict(command)['args'] = ("command line", nargs) if nargs is not None: return [] return nargs - - - - - - - - - - - - - - - def get_cmdline_options(self): """Return a '{cmd: {opt:val}}' map of all command-line options @@ -625,35 +690,34 @@ class Distribution(_Distribution): d = {} - for cmd,opts in self.command_options.items(): + for cmd, opts in self.command_options.items(): - for opt,(src,val) in opts.items(): + for opt, (src, val) in opts.items(): if src != "command line": continue - opt = opt.replace('_','-') + opt = opt.replace('_', '-') - if val==0: + if val == 0: cmdobj = self.get_command_obj(cmd) neg_opt = self.negative_opt.copy() - neg_opt.update(getattr(cmdobj,'negative_opt',{})) - for neg,pos in neg_opt.items(): - if pos==opt: - opt=neg - val=None + neg_opt.update(getattr(cmdobj, 'negative_opt', {})) + for neg, pos in neg_opt.items(): + if pos == opt: + opt = neg + val = None break else: raise AssertionError("Shouldn't be able to get here") - elif val==1: + elif val == 1: val = None - d.setdefault(cmd,{})[opt] = val + d.setdefault(cmd, {})[opt] = val return d - def iter_distribution_names(self): """Yield all packages, modules, and extension names in distribution""" @@ -664,7 +728,7 @@ class Distribution(_Distribution): yield module for ext in self.ext_modules or (): - if isinstance(ext,tuple): + if isinstance(ext, tuple): name, buildinfo = ext else: name = ext.name @@ -672,31 +736,50 @@ class Distribution(_Distribution): name = name[:-6] yield name -# Install it throughout the distutils -for module in distutils.dist, distutils.core, distutils.cmd: - 
module.Distribution = Distribution - - - - - - - - - - - - - + def handle_display_options(self, option_order): + """If there were any non-global "display-only" options + (--help-commands or the metadata display options) on the command + line, display the requested info and return true; else return + false. + """ + import sys + if six.PY2 or self.help_commands: + return _Distribution.handle_display_options(self, option_order) + # Stdout may be StringIO (e.g. in tests) + import io + if not isinstance(sys.stdout, io.TextIOWrapper): + return _Distribution.handle_display_options(self, option_order) + # Don't wrap stdout if utf-8 is already the encoding. Provides + # workaround for #334. + if sys.stdout.encoding.lower() in ('utf-8', 'utf8'): + return _Distribution.handle_display_options(self, option_order) + # Print metadata in UTF-8 no matter the platform + encoding = sys.stdout.encoding + errors = sys.stdout.errors + newline = sys.platform != 'win32' and '\n' or None + line_buffering = sys.stdout.line_buffering + sys.stdout = io.TextIOWrapper( + sys.stdout.detach(), 'utf-8', errors, newline, line_buffering) + try: + return _Distribution.handle_display_options(self, option_order) + finally: + sys.stdout = io.TextIOWrapper( + sys.stdout.detach(), encoding, errors, newline, line_buffering) class Feature: - """A subset of the distribution that can be excluded if unneeded/wanted + """ + **deprecated** -- The `Feature` facility was never completely implemented + or supported, `has reported issues + `_ and will be removed in + a future version. + + A subset of the distribution that can be excluded if unneeded/wanted Features are created using these keyword arguments: @@ -745,24 +828,35 @@ class Feature: Aside from the methods, the only feature attributes that distributions look at are 'description' and 'optional'. """ + + @staticmethod + def warn_deprecated(): + warnings.warn( + "Features are deprecated and will be removed in a future " + "version. 
See https://github.com/pypa/setuptools/issues/65.", + DeprecationWarning, + stacklevel=3, + ) + def __init__(self, description, standard=False, available=True, - optional=True, require_features=(), remove=(), **extras - ): + optional=True, require_features=(), remove=(), **extras): + self.warn_deprecated() self.description = description self.standard = standard self.available = available self.optional = optional - if isinstance(require_features,(str,Require)): + if isinstance(require_features, (str, Require)): require_features = require_features, self.require_features = [ - r for r in require_features if isinstance(r,str) + r for r in require_features if isinstance(r, str) ] - er = [r for r in require_features if not isinstance(r,str)] - if er: extras['require_features'] = er + er = [r for r in require_features if not isinstance(r, str)] + if er: + extras['require_features'] = er - if isinstance(remove,str): + if isinstance(remove, str): remove = remove, self.remove = remove self.extras = extras @@ -777,8 +871,7 @@ class Feature: """Should this feature be included by default?""" return self.available and self.standard - def include_in(self,dist): - + def include_in(self, dist): """Ensure feature and its requirements are included in distribution You may override this in a subclass to perform additional operations on @@ -789,7 +882,7 @@ class Feature: if not self.available: raise DistutilsPlatformError( - self.description+" is required," + self.description + " is required, " "but is not available on this platform" ) @@ -798,10 +891,7 @@ class Feature: for f in self.require_features: dist.include_feature(f) - - - def exclude_from(self,dist): - + def exclude_from(self, dist): """Ensure feature is excluded from distribution You may override this in a subclass to perform additional operations on @@ -816,10 +906,7 @@ class Feature: for item in self.remove: dist.exclude_package(item) - - - def validate(self,dist): - + def validate(self, dist): """Verify that feature makes sense in context of distribution This method is called by the distribution just before it parses its @@ -837,25 +924,3 @@ class Feature: " doesn't contain any packages or modules under %s" % (self.description, item, item) ) - - - - - - - - - - - - - - - - - - - - - - diff --git a/setuptools/extension.py b/setuptools/extension.py index 2bef84e..34a36df 100644 --- a/setuptools/extension.py +++ b/setuptools/extension.py @@ -1,36 +1,57 @@ -from distutils.core import Extension as _Extension -from dist import _get_unpatched -_Extension = _get_unpatched(_Extension) +import re +import functools +import distutils.core +import distutils.errors +import distutils.extension -try: - from Pyrex.Distutils.build_ext import build_ext -except ImportError: - have_pyrex = False -else: - have_pyrex = True +from six.moves import map + +from .monkey import get_unpatched + + +def _have_cython(): + """ + Return True if Cython can be imported. 
+ """ + cython_impl = 'Cython.Distutils.build_ext' + try: + # from (cython_impl) import build_ext + __import__(cython_impl, fromlist=['build_ext']).build_ext + return True + except Exception: + pass + return False + + +# for compatibility +have_pyrex = _have_cython + +_Extension = get_unpatched(distutils.core.Extension) class Extension(_Extension): """Extension that uses '.c' files in place of '.pyx' files""" - if not have_pyrex: - # convert .pyx extensions to .c - def __init__(self,*args,**kw): - _Extension.__init__(self,*args,**kw) - sources = [] - for s in self.sources: - if s.endswith('.pyx'): - sources.append(s[:-3]+'c') - else: - sources.append(s) - self.sources = sources + def __init__(self, name, sources, *args, **kw): + # The *args is needed for compatibility as calls may use positional + # arguments. py_limited_api may be set only via keyword. + self.py_limited_api = kw.pop("py_limited_api", False) + _Extension.__init__(self, name, sources, *args, **kw) -class Library(Extension): - """Just like a regular Extension, but built as a library instead""" + def _convert_pyx_sources_to_lang(self): + """ + Replace sources with .pyx extensions to sources with the target + language extension. This mechanism allows language authors to supply + pre-converted sources but to prefer the .pyx sources. + """ + if _have_cython(): + # the build has Cython, so allow it to compile the .pyx files + return + lang = self.language or '' + target_ext = '.cpp' if lang.lower() == 'c++' else '.c' + sub = functools.partial(re.sub, '.pyx$', target_ext) + self.sources = list(map(sub, self.sources)) -import sys, distutils.core, distutils.extension -distutils.core.Extension = Extension -distutils.extension.Extension = Extension -if 'distutils.command.build_ext' in sys.modules: - sys.modules['distutils.command.build_ext'].Extension = Extension +class Library(Extension): + """Just like a regular Extension, but built as a library instead""" diff --git a/setuptools/glob.py b/setuptools/glob.py new file mode 100644 index 0000000..f264402 --- /dev/null +++ b/setuptools/glob.py @@ -0,0 +1,176 @@ +""" +Filename globbing utility. Mostly a copy of `glob` from Python 3.5. + +Changes include: + * `yield from` and PEP3102 `*` removed. + * `bytes` changed to `six.binary_type`. + * Hidden files are not ignored. +""" + +import os +import re +import fnmatch +from six import binary_type + +__all__ = ["glob", "iglob", "escape"] + + +def glob(pathname, recursive=False): + """Return a list of paths matching a pathname pattern. + + The pattern may contain simple shell-style wildcards a la + fnmatch. However, unlike fnmatch, filenames starting with a + dot are special cases that are not matched by '*' and '?' + patterns. + + If recursive is true, the pattern '**' will match any files and + zero or more directories and subdirectories. + """ + return list(iglob(pathname, recursive=recursive)) + + +def iglob(pathname, recursive=False): + """Return an iterator which yields the paths matching a pathname pattern. + + The pattern may contain simple shell-style wildcards a la + fnmatch. However, unlike fnmatch, filenames starting with a + dot are special cases that are not matched by '*' and '?' + patterns. + + If recursive is true, the pattern '**' will match any files and + zero or more directories and subdirectories. 
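+
+    For example, ``iglob('src/**/*.py', recursive=True)`` (an illustrative
+    pattern) lazily yields every matching ``.py`` path under ``src``.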
+ """ + it = _iglob(pathname, recursive) + if recursive and _isrecursive(pathname): + s = next(it) # skip empty string + assert not s + return it + + +def _iglob(pathname, recursive): + dirname, basename = os.path.split(pathname) + if not has_magic(pathname): + if basename: + if os.path.lexists(pathname): + yield pathname + else: + # Patterns ending with a slash should match only directories + if os.path.isdir(dirname): + yield pathname + return + if not dirname: + if recursive and _isrecursive(basename): + for x in glob2(dirname, basename): + yield x + else: + for x in glob1(dirname, basename): + yield x + return + # `os.path.split()` returns the argument itself as a dirname if it is a + # drive or UNC path. Prevent an infinite recursion if a drive or UNC path + # contains magic characters (i.e. r'\\?\C:'). + if dirname != pathname and has_magic(dirname): + dirs = _iglob(dirname, recursive) + else: + dirs = [dirname] + if has_magic(basename): + if recursive and _isrecursive(basename): + glob_in_dir = glob2 + else: + glob_in_dir = glob1 + else: + glob_in_dir = glob0 + for dirname in dirs: + for name in glob_in_dir(dirname, basename): + yield os.path.join(dirname, name) + + +# These 2 helper functions non-recursively glob inside a literal directory. +# They return a list of basenames. `glob1` accepts a pattern while `glob0` +# takes a literal basename (so it only has to check for its existence). + + +def glob1(dirname, pattern): + if not dirname: + if isinstance(pattern, binary_type): + dirname = os.curdir.encode('ASCII') + else: + dirname = os.curdir + try: + names = os.listdir(dirname) + except OSError: + return [] + return fnmatch.filter(names, pattern) + + +def glob0(dirname, basename): + if not basename: + # `os.path.split()` returns an empty basename for paths ending with a + # directory separator. 'q*x/' should match only directories. + if os.path.isdir(dirname): + return [basename] + else: + if os.path.lexists(os.path.join(dirname, basename)): + return [basename] + return [] + + +# This helper function recursively yields relative pathnames inside a literal +# directory. + + +def glob2(dirname, pattern): + assert _isrecursive(pattern) + yield pattern[:0] + for x in _rlistdir(dirname): + yield x + + +# Recursively yields relative pathnames inside a literal directory. +def _rlistdir(dirname): + if not dirname: + if isinstance(dirname, binary_type): + dirname = binary_type(os.curdir, 'ASCII') + else: + dirname = os.curdir + try: + names = os.listdir(dirname) + except os.error: + return + for x in names: + yield x + path = os.path.join(dirname, x) if dirname else x + for y in _rlistdir(path): + yield os.path.join(x, y) + + +magic_check = re.compile('([*?[])') +magic_check_bytes = re.compile(b'([*?[])') + + +def has_magic(s): + if isinstance(s, binary_type): + match = magic_check_bytes.search(s) + else: + match = magic_check.search(s) + return match is not None + + +def _isrecursive(pattern): + if isinstance(pattern, binary_type): + return pattern == b'**' + else: + return pattern == '**' + + +def escape(pathname): + """Escape all special characters. + """ + # Escaping is done by wrapping any of "*?[" between square brackets. + # Metacharacters do not work in the drive part and shouldn't be escaped. 
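+    # For example (illustrative name): escape('how?.txt') -> 'how[?].txt'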
+ drive, pathname = os.path.splitdrive(pathname) + if isinstance(pathname, binary_type): + pathname = magic_check_bytes.sub(br'[\1]', pathname) + else: + pathname = magic_check.sub(r'[\1]', pathname) + return drive + pathname diff --git a/setuptools/gui-32.exe b/setuptools/gui-32.exe new file mode 100644 index 0000000..f8d3509 Binary files /dev/null and b/setuptools/gui-32.exe differ diff --git a/setuptools/gui-64.exe b/setuptools/gui-64.exe new file mode 100644 index 0000000..330c51a Binary files /dev/null and b/setuptools/gui-64.exe differ diff --git a/setuptools/gui.exe b/setuptools/gui.exe old mode 100755 new mode 100644 index 474838d..f8d3509 Binary files a/setuptools/gui.exe and b/setuptools/gui.exe differ diff --git a/setuptools/launch.py b/setuptools/launch.py new file mode 100644 index 0000000..308283e --- /dev/null +++ b/setuptools/launch.py @@ -0,0 +1,35 @@ +""" +Launch the Python script on the command line after +setuptools is bootstrapped via import. +""" + +# Note that setuptools gets imported implicitly by the +# invocation of this script using python -m setuptools.launch + +import tokenize +import sys + + +def run(): + """ + Run the script in sys.argv[1] as if it had + been invoked naturally. + """ + __builtins__ + script_name = sys.argv[1] + namespace = dict( + __file__=script_name, + __name__='__main__', + __doc__=None, + ) + sys.argv[:] = sys.argv[1:] + + open_ = getattr(tokenize, 'open', open) + script = open_(script_name).read() + norm_script = script.replace('\\r\\n', '\\n') + code = compile(norm_script, script_name, 'exec') + exec(code, namespace) + + +if __name__ == '__main__': + run() diff --git a/setuptools/lib2to3_ex.py b/setuptools/lib2to3_ex.py new file mode 100644 index 0000000..4b1a73f --- /dev/null +++ b/setuptools/lib2to3_ex.py @@ -0,0 +1,62 @@ +""" +Customized Mixin2to3 support: + + - adds support for converting doctests + + +This module raises an ImportError on Python 2. +""" + +from distutils.util import Mixin2to3 as _Mixin2to3 +from distutils import log +from lib2to3.refactor import RefactoringTool, get_fixers_from_package + +import setuptools + + +class DistutilsRefactoringTool(RefactoringTool): + def log_error(self, msg, *args, **kw): + log.error(msg, *args) + + def log_message(self, msg, *args): + log.info(msg, *args) + + def log_debug(self, msg, *args): + log.debug(msg, *args) + + +class Mixin2to3(_Mixin2to3): + def run_2to3(self, files, doctests=False): + # See of the distribution option has been set, otherwise check the + # setuptools default. 
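+ # Conversion is opt-in: do nothing unless use_2to3 was explicitly enabled on the distribution.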
+ if self.distribution.use_2to3 is not True: + return + if not files: + return + log.info("Fixing " + " ".join(files)) + self.__build_fixer_names() + self.__exclude_fixers() + if doctests: + if setuptools.run_2to3_on_doctests: + r = DistutilsRefactoringTool(self.fixer_names) + r.refactor(files, write=True, doctests_only=True) + else: + _Mixin2to3.run_2to3(self, files) + + def __build_fixer_names(self): + if self.fixer_names: + return + self.fixer_names = [] + for p in setuptools.lib2to3_fixer_packages: + self.fixer_names.extend(get_fixers_from_package(p)) + if self.distribution.use_2to3_fixers is not None: + for p in self.distribution.use_2to3_fixers: + self.fixer_names.extend(get_fixers_from_package(p)) + + def __exclude_fixers(self): + excluded_fixers = getattr(self, 'exclude_fixers', []) + if self.distribution.use_2to3_exclude_fixers is not None: + excluded_fixers.extend(self.distribution.use_2to3_exclude_fixers) + for fixer_name in excluded_fixers: + if fixer_name in self.fixer_names: + self.fixer_names.remove(fixer_name) diff --git a/setuptools/monkey.py b/setuptools/monkey.py new file mode 100644 index 0000000..68fad9d --- /dev/null +++ b/setuptools/monkey.py @@ -0,0 +1,183 @@ +""" +Monkey patching of distutils. +""" + +import sys +import distutils.filelist +import platform +import types +import functools +import inspect + +from .py26compat import import_module +import six + +import setuptools + +__all__ = [] +""" +Everything is private. Contact the project team +if you think you need this functionality. +""" + + +def get_unpatched(item): + lookup = ( + get_unpatched_class if isinstance(item, six.class_types) else + get_unpatched_function if isinstance(item, types.FunctionType) else + lambda item: None + ) + return lookup(item) + + +def get_unpatched_class(cls): + """Protect against re-patching the distutils if reloaded + + Also ensures that no other distutils extension monkeypatched the distutils + first. 
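+ For example, get_unpatched_class(setuptools.dist.Distribution) returns distutils.dist.Distribution, the first non-setuptools base in the MRO.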
+ """ + external_bases = ( + cls + for cls in inspect.getmro(cls) + if not cls.__module__.startswith('setuptools') + ) + base = next(external_bases) + if not base.__module__.startswith('distutils'): + msg = "distutils has already been patched by %r" % cls + raise AssertionError(msg) + return base + + +def patch_all(): + # we can't patch distutils.cmd, alas + distutils.core.Command = setuptools.Command + + has_issue_12885 = sys.version_info <= (3, 5, 3) + + if has_issue_12885: + # fix findall bug in distutils (http://bugs.python.org/issue12885) + distutils.filelist.findall = setuptools.findall + + needs_warehouse = ( + sys.version_info < (2, 7, 13) + or + (3, 0) < sys.version_info < (3, 3, 7) + or + (3, 4) < sys.version_info < (3, 4, 6) + or + (3, 5) < sys.version_info <= (3, 5, 3) + ) + + if needs_warehouse: + warehouse = 'https://upload.pypi.org/legacy/' + distutils.config.PyPIRCCommand.DEFAULT_REPOSITORY = warehouse + + _patch_distribution_metadata_write_pkg_file() + _patch_distribution_metadata_write_pkg_info() + + # Install Distribution throughout the distutils + for module in distutils.dist, distutils.core, distutils.cmd: + module.Distribution = setuptools.dist.Distribution + + # Install the patched Extension + distutils.core.Extension = setuptools.extension.Extension + distutils.extension.Extension = setuptools.extension.Extension + if 'distutils.command.build_ext' in sys.modules: + sys.modules['distutils.command.build_ext'].Extension = ( + setuptools.extension.Extension + ) + + patch_for_msvc_specialized_compiler() + + +def _patch_distribution_metadata_write_pkg_file(): + """Patch write_pkg_file to also write Requires-Python/Requires-External""" + distutils.dist.DistributionMetadata.write_pkg_file = ( + setuptools.dist.write_pkg_file + ) + + +def _patch_distribution_metadata_write_pkg_info(): + """ + Workaround issue #197 - Python 3 prior to 3.2.2 uses an environment-local + encoding to save the pkg_info. Monkey-patch its write_pkg_info method to + correct this undesirable behavior. + """ + environment_local = (3,) <= sys.version_info[:3] < (3, 2, 2) + if not environment_local: + return + + distutils.dist.DistributionMetadata.write_pkg_info = ( + setuptools.dist.write_pkg_info + ) + + +def patch_func(replacement, target_mod, func_name): + """ + Patch func_name in target_mod with replacement + + Important - original must be resolved by name to avoid + patching an already patched function. + """ + original = getattr(target_mod, func_name) + + # set the 'unpatched' attribute on the replacement to + # point to the original. + vars(replacement).setdefault('unpatched', original) + + # replace the function in the original module + setattr(target_mod, func_name, replacement) + + +def get_unpatched_function(candidate): + return getattr(candidate, 'unpatched') + + +def patch_for_msvc_specialized_compiler(): + """ + Patch functions in distutils to use standalone Microsoft Visual C++ + compilers. + """ + # import late to avoid circular imports on Python < 3.5 + msvc = import_module('setuptools.msvc') + + if platform.system() != 'Windows': + # Compilers only availables on Microsoft Windows + return + + def patch_params(mod_name, func_name): + """ + Prepare the parameters for patch_func to patch indicated function. 
+ """ + repl_prefix = 'msvc9_' if 'msvc9' in mod_name else 'msvc14_' + repl_name = repl_prefix + func_name.lstrip('_') + repl = getattr(msvc, repl_name) + mod = import_module(mod_name) + if not hasattr(mod, func_name): + raise ImportError(func_name) + return repl, mod, func_name + + # Python 2.7 to 3.4 + msvc9 = functools.partial(patch_params, 'distutils.msvc9compiler') + + # Python 3.5+ + msvc14 = functools.partial(patch_params, 'distutils._msvccompiler') + + try: + # Patch distutils.msvc9compiler + patch_func(*msvc9('find_vcvarsall')) + patch_func(*msvc9('query_vcvarsall')) + except ImportError: + pass + + try: + # Patch distutils._msvccompiler._get_vc_env + patch_func(*msvc14('_get_vc_env')) + except ImportError: + pass + + try: + # Patch distutils._msvccompiler.gen_lib_options for Numpy + patch_func(*msvc14('gen_lib_options')) + except ImportError: + pass diff --git a/setuptools/msvc.py b/setuptools/msvc.py new file mode 100644 index 0000000..d41daec --- /dev/null +++ b/setuptools/msvc.py @@ -0,0 +1,1193 @@ +""" +Improved support for Microsoft Visual C++ compilers. + +Known supported compilers: +-------------------------- +Microsoft Visual C++ 9.0: + Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64); + Microsoft Windows SDK 7.0 (x86, x64, ia64); + Microsoft Windows SDK 6.1 (x86, x64, ia64) + +Microsoft Visual C++ 10.0: + Microsoft Windows SDK 7.1 (x86, x64, ia64) + +Microsoft Visual C++ 14.0: + Microsoft Visual C++ Build Tools 2015 (x86, x64, arm) +""" + +import os +import sys +import platform +import itertools +import distutils.errors +from packaging.version import LegacyVersion + +from six.moves import filterfalse + +from .monkey import get_unpatched + +if platform.system() == 'Windows': + from six.moves import winreg + safe_env = os.environ +else: + """ + Mock winreg and environ so the module can be imported + on this platform. + """ + + class winreg: + HKEY_USERS = None + HKEY_CURRENT_USER = None + HKEY_LOCAL_MACHINE = None + HKEY_CLASSES_ROOT = None + + safe_env = dict() + +try: + from distutils.msvc9compiler import Reg +except ImportError: + pass + + +def msvc9_find_vcvarsall(version): + """ + Patched "distutils.msvc9compiler.find_vcvarsall" to use the standalone + compiler build for Python (VCForPython). Fall back to original behavior + when the standalone compiler is not available. + + Redirect the path of "vcvarsall.bat". + + Known supported compilers + ------------------------- + Microsoft Visual C++ 9.0: + Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64) + + Parameters + ---------- + version: float + Required Microsoft Visual C++ version. + + Return + ------ + vcvarsall.bat path: str + """ + VC_BASE = r'Software\%sMicrosoft\DevDiv\VCForPython\%0.1f' + key = VC_BASE % ('', version) + try: + # Per-user installs register the compiler path here + productdir = Reg.get_value(key, "installdir") + except KeyError: + try: + # All-user installs on a 64-bit system register here + key = VC_BASE % ('Wow6432Node\\', version) + productdir = Reg.get_value(key, "installdir") + except KeyError: + productdir = None + + if productdir: + vcvarsall = os.path.os.path.join(productdir, "vcvarsall.bat") + if os.path.isfile(vcvarsall): + return vcvarsall + + return get_unpatched(msvc9_find_vcvarsall)(version) + + +def msvc9_query_vcvarsall(ver, arch='x86', *args, **kwargs): + """ + Patched "distutils.msvc9compiler.query_vcvarsall" for support standalones + compilers. + + Set environment without use of "vcvarsall.bat". 
+ + Known supported compilers + ------------------------- + Microsoft Visual C++ 9.0: + Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64); + Microsoft Windows SDK 7.0 (x86, x64, ia64); + Microsoft Windows SDK 6.1 (x86, x64, ia64) + + Microsoft Visual C++ 10.0: + Microsoft Windows SDK 7.1 (x86, x64, ia64) + + Parameters + ---------- + ver: float + Required Microsoft Visual C++ version. + arch: str + Target architecture. + + Return + ------ + environment: dict + """ + # Try to get environement from vcvarsall.bat (Classical way) + try: + orig = get_unpatched(msvc9_query_vcvarsall) + return orig(ver, arch, *args, **kwargs) + except distutils.errors.DistutilsPlatformError: + # Pass error if Vcvarsall.bat is missing + pass + except ValueError: + # Pass error if environment not set after executing vcvarsall.bat + pass + + # If error, try to set environment directly + try: + return EnvironmentInfo(arch, ver).return_env() + except distutils.errors.DistutilsPlatformError as exc: + _augment_exception(exc, ver, arch) + raise + + +def msvc14_get_vc_env(plat_spec): + """ + Patched "distutils._msvccompiler._get_vc_env" for support standalones + compilers. + + Set environment without use of "vcvarsall.bat". + + Known supported compilers + ------------------------- + Microsoft Visual C++ 14.0: + Microsoft Visual C++ Build Tools 2015 (x86, x64, arm) + + Parameters + ---------- + plat_spec: str + Target architecture. + + Return + ------ + environment: dict + """ + # Try to get environment from vcvarsall.bat (Classical way) + try: + return get_unpatched(msvc14_get_vc_env)(plat_spec) + except distutils.errors.DistutilsPlatformError: + # Pass error Vcvarsall.bat is missing + pass + + # If error, try to set environment directly + try: + return EnvironmentInfo(plat_spec, vc_min_ver=14.0).return_env() + except distutils.errors.DistutilsPlatformError as exc: + _augment_exception(exc, 14.0) + raise + + +def msvc14_gen_lib_options(*args, **kwargs): + """ + Patched "distutils._msvccompiler.gen_lib_options" for fix + compatibility between "numpy.distutils" and "distutils._msvccompiler" + (for Numpy < 1.11.2) + """ + if "numpy.distutils" in sys.modules: + import numpy as np + if LegacyVersion(np.__version__) < LegacyVersion('1.11.2'): + return np.distutils.ccompiler.gen_lib_options(*args, **kwargs) + return get_unpatched(msvc14_gen_lib_options)(*args, **kwargs) + + +def _augment_exception(exc, version, arch=''): + """ + Add details to the exception message to help guide the user + as to what action will resolve it. + """ + # Error if MSVC++ directory not found or environment not set + message = exc.args[0] + + if "vcvarsall" in message.lower() or "visual c" in message.lower(): + # Special error message if MSVC++ not installed + tmpl = 'Microsoft Visual C++ {version:0.1f} is required.' + message = tmpl.format(**locals()) + msdownload = 'www.microsoft.com/download/details.aspx?id=%d' + if version == 9.0: + if arch.lower().find('ia64') > -1: + # For VC++ 9.0, if IA64 support is needed, redirect user + # to Windows SDK 7.0 + message += ' Get it with "Microsoft Windows SDK 7.0": ' + message += msdownload % 3138 + else: + # For VC++ 9.0 redirect user to Vc++ for Python 2.7 : + # This redirection link is maintained by Microsoft. + # Contact vspython@microsoft.com if it needs updating. 
+ message += ' Get it from http://aka.ms/vcpython27' + elif version == 10.0: + # For VC++ 10.0 Redirect user to Windows SDK 7.1 + message += ' Get it with "Microsoft Windows SDK 7.1": ' + message += msdownload % 8279 + elif version >= 14.0: + # For VC++ 14.0 Redirect user to Visual C++ Build Tools + message += (' Get it with "Microsoft Visual C++ Build Tools": ' + r'http://landinghub.visualstudio.com/' + 'visual-cpp-build-tools') + + exc.args = (message, ) + + +class PlatformInfo: + """ + Current and Target Architectures informations. + + Parameters + ---------- + arch: str + Target architecture. + """ + current_cpu = safe_env.get('processor_architecture', '').lower() + + def __init__(self, arch): + self.arch = arch.lower().replace('x64', 'amd64') + + @property + def target_cpu(self): + return self.arch[self.arch.find('_') + 1:] + + def target_is_x86(self): + return self.target_cpu == 'x86' + + def current_is_x86(self): + return self.current_cpu == 'x86' + + def current_dir(self, hidex86=False, x64=False): + """ + Current platform specific subfolder. + + Parameters + ---------- + hidex86: bool + return '' and not '\x86' if architecture is x86. + x64: bool + return '\x64' and not '\amd64' if architecture is amd64. + + Return + ------ + subfolder: str + '\target', or '' (see hidex86 parameter) + """ + return ( + '' if (self.current_cpu == 'x86' and hidex86) else + r'\x64' if (self.current_cpu == 'amd64' and x64) else + r'\%s' % self.current_cpu + ) + + def target_dir(self, hidex86=False, x64=False): + r""" + Target platform specific subfolder. + + Parameters + ---------- + hidex86: bool + return '' and not '\x86' if architecture is x86. + x64: bool + return '\x64' and not '\amd64' if architecture is amd64. + + Return + ------ + subfolder: str + '\current', or '' (see hidex86 parameter) + """ + return ( + '' if (self.target_cpu == 'x86' and hidex86) else + r'\x64' if (self.target_cpu == 'amd64' and x64) else + r'\%s' % self.target_cpu + ) + + def cross_dir(self, forcex86=False): + r""" + Cross platform specific subfolder. + + Parameters + ---------- + forcex86: bool + Use 'x86' as current architecture even if current acritecture is + not x86. + + Return + ------ + subfolder: str + '' if target architecture is current architecture, + '\current_target' if not. + """ + current = 'x86' if forcex86 else self.current_cpu + return ( + '' if self.target_cpu == current else + self.target_dir().replace('\\', '\\%s_' % current) + ) + + +class RegistryInfo: + """ + Microsoft Visual Studio related registry informations. + + Parameters + ---------- + platform_info: PlatformInfo + "PlatformInfo" instance. + """ + HKEYS = (winreg.HKEY_USERS, + winreg.HKEY_CURRENT_USER, + winreg.HKEY_LOCAL_MACHINE, + winreg.HKEY_CLASSES_ROOT) + + def __init__(self, platform_info): + self.pi = platform_info + + @property + def visualstudio(self): + """ + Microsoft Visual Studio root registry key. + """ + return 'VisualStudio' + + @property + def sxs(self): + """ + Microsoft Visual Studio SxS registry key. + """ + return os.path.join(self.visualstudio, 'SxS') + + @property + def vc(self): + """ + Microsoft Visual C++ VC7 registry key. + """ + return os.path.join(self.sxs, 'VC7') + + @property + def vs(self): + """ + Microsoft Visual Studio VS7 registry key. + """ + return os.path.join(self.sxs, 'VS7') + + @property + def vc_for_python(self): + """ + Microsoft Visual C++ for Python registry key. + """ + return r'DevDiv\VCForPython' + + @property + def microsoft_sdk(self): + """ + Microsoft SDK registry key. 
+ """ + return 'Microsoft SDKs' + + @property + def windows_sdk(self): + """ + Microsoft Windows/Platform SDK registry key. + """ + return os.path.join(self.microsoft_sdk, 'Windows') + + @property + def netfx_sdk(self): + """ + Microsoft .NET Framework SDK registry key. + """ + return os.path.join(self.microsoft_sdk, 'NETFXSDK') + + @property + def windows_kits_roots(self): + """ + Microsoft Windows Kits Roots registry key. + """ + return r'Windows Kits\Installed Roots' + + def microsoft(self, key, x86=False): + """ + Return key in Microsoft software registry. + + Parameters + ---------- + key: str + Registry key path where look. + x86: str + Force x86 software registry. + + Return + ------ + str: value + """ + node64 = '' if self.pi.current_is_x86() or x86 else r'\Wow6432Node' + return os.path.join('Software', node64, 'Microsoft', key) + + def lookup(self, key, name): + """ + Look for values in registry in Microsoft software registry. + + Parameters + ---------- + key: str + Registry key path where look. + name: str + Value name to find. + + Return + ------ + str: value + """ + KEY_READ = winreg.KEY_READ + openkey = winreg.OpenKey + ms = self.microsoft + for hkey in self.HKEYS: + try: + bkey = openkey(hkey, ms(key), 0, KEY_READ) + except (OSError, IOError): + if not self.pi.current_is_x86(): + try: + bkey = openkey(hkey, ms(key, True), 0, KEY_READ) + except (OSError, IOError): + continue + else: + continue + try: + return winreg.QueryValueEx(bkey, name)[0] + except (OSError, IOError): + pass + + +class SystemInfo: + """ + Microsoft Windows and Visual Studio related system inormations. + + Parameters + ---------- + registry_info: RegistryInfo + "RegistryInfo" instance. + vc_ver: float + Required Microsoft Visual C++ version. + """ + + # Variables and properties in this class use originals CamelCase variables + # names from Microsoft source files for more easy comparaison. + WinDir = safe_env.get('WinDir', '') + ProgramFiles = safe_env.get('ProgramFiles', '') + ProgramFilesx86 = safe_env.get('ProgramFiles(x86)', ProgramFiles) + + def __init__(self, registry_info, vc_ver=None): + self.ri = registry_info + self.pi = self.ri.pi + if vc_ver: + self.vc_ver = vc_ver + else: + try: + self.vc_ver = self.find_available_vc_vers()[-1] + except IndexError: + err = 'No Microsoft Visual C++ version found' + raise distutils.errors.DistutilsPlatformError(err) + + def find_available_vc_vers(self): + """ + Find all available Microsoft Visual C++ versions. + """ + vckeys = (self.ri.vc, self.ri.vc_for_python) + vc_vers = [] + for hkey in self.ri.HKEYS: + for key in vckeys: + try: + bkey = winreg.OpenKey(hkey, key, 0, winreg.KEY_READ) + except (OSError, IOError): + continue + subkeys, values, _ = winreg.QueryInfoKey(bkey) + for i in range(values): + try: + ver = float(winreg.EnumValue(bkey, i)[0]) + if ver not in vc_vers: + vc_vers.append(ver) + except ValueError: + pass + for i in range(subkeys): + try: + ver = float(winreg.EnumKey(bkey, i)) + if ver not in vc_vers: + vc_vers.append(ver) + except ValueError: + pass + return sorted(vc_vers) + + @property + def VSInstallDir(self): + """ + Microsoft Visual Studio directory. + """ + # Default path + name = 'Microsoft Visual Studio %0.1f' % self.vc_ver + default = os.path.join(self.ProgramFilesx86, name) + + # Try to get path from registry, if fail use default path + return self.ri.lookup(self.ri.vs, '%0.1f' % self.vc_ver) or default + + @property + def VCInstallDir(self): + """ + Microsoft Visual C++ directory. 
+ """ + # Default path + default = r'Microsoft Visual Studio %0.1f\VC' % self.vc_ver + guess_vc = os.path.join(self.ProgramFilesx86, default) + + # Try to get "VC++ for Python" path from registry as default path + reg_path = os.path.join(self.ri.vc_for_python, '%0.1f' % self.vc_ver) + python_vc = self.ri.lookup(reg_path, 'installdir') + default_vc = os.path.join(python_vc, 'VC') if python_vc else guess_vc + + # Try to get path from registry, if fail use default path + path = self.ri.lookup(self.ri.vc, '%0.1f' % self.vc_ver) or default_vc + + if not os.path.isdir(path): + msg = 'Microsoft Visual C++ directory not found' + raise distutils.errors.DistutilsPlatformError(msg) + + return path + + @property + def WindowsSdkVersion(self): + """ + Microsoft Windows SDK versions. + """ + # Set Windows SDK versions for specified MSVC++ version + if self.vc_ver <= 9.0: + return ('7.0', '6.1', '6.0a') + elif self.vc_ver == 10.0: + return ('7.1', '7.0a') + elif self.vc_ver == 11.0: + return ('8.0', '8.0a') + elif self.vc_ver == 12.0: + return ('8.1', '8.1a') + elif self.vc_ver >= 14.0: + return ('10.0', '8.1') + + @property + def WindowsSdkDir(self): + """ + Microsoft Windows SDK directory. + """ + sdkdir = '' + for ver in self.WindowsSdkVersion: + # Try to get it from registry + loc = os.path.join(self.ri.windows_sdk, 'v%s' % ver) + sdkdir = self.ri.lookup(loc, 'installationfolder') + if sdkdir: + break + if not sdkdir or not os.path.isdir(sdkdir): + # Try to get "VC++ for Python" version from registry + path = os.path.join(self.ri.vc_for_python, '%0.1f' % self.vc_ver) + install_base = self.ri.lookup(path, 'installdir') + if install_base: + sdkdir = os.path.join(install_base, 'WinSDK') + if not sdkdir or not os.path.isdir(sdkdir): + # If fail, use default new path + for ver in self.WindowsSdkVersion: + intver = ver[:ver.rfind('.')] + path = r'Microsoft SDKs\Windows Kits\%s' % (intver) + d = os.path.join(self.ProgramFiles, path) + if os.path.isdir(d): + sdkdir = d + if not sdkdir or not os.path.isdir(sdkdir): + # If fail, use default old path + for ver in self.WindowsSdkVersion: + path = r'Microsoft SDKs\Windows\v%s' % ver + d = os.path.join(self.ProgramFiles, path) + if os.path.isdir(d): + sdkdir = d + if not sdkdir: + # If fail, use Platform SDK + sdkdir = os.path.join(self.VCInstallDir, 'PlatformSDK') + return sdkdir + + @property + def WindowsSDKExecutablePath(self): + """ + Microsoft Windows SDK executable directory. + """ + # Find WinSDK NetFx Tools registry dir name + if self.vc_ver <= 11.0: + netfxver = 35 + arch = '' + else: + netfxver = 40 + hidex86 = True if self.vc_ver <= 12.0 else False + arch = self.pi.current_dir(x64=True, hidex86=hidex86) + fx = 'WinSDK-NetFx%dTools%s' % (netfxver, arch.replace('\\', '-')) + + # liste all possibles registry paths + regpaths = [] + if self.vc_ver >= 14.0: + for ver in self.NetFxSdkVersion: + regpaths += [os.path.join(self.ri.netfx_sdk, ver, fx)] + + for ver in self.WindowsSdkVersion: + regpaths += [os.path.join(self.ri.windows_sdk, 'v%sA' % ver, fx)] + + # Return installation folder from the more recent path + for path in regpaths: + execpath = self.ri.lookup(path, 'installationfolder') + if execpath: + break + return execpath + + @property + def FSharpInstallDir(self): + """ + Microsoft Visual F# directory. + """ + path = r'%0.1f\Setup\F#' % self.vc_ver + path = os.path.join(self.ri.visualstudio, path) + return self.ri.lookup(path, 'productdir') or '' + + @property + def UniversalCRTSdkDir(self): + """ + Microsoft Universal CRT SDK directory. 
+ """ + # Set Kit Roots versions for specified MSVC++ version + if self.vc_ver >= 14.0: + vers = ('10', '81') + else: + vers = () + + # Find path of the more recent Kit + for ver in vers: + sdkdir = self.ri.lookup(self.ri.windows_kits_roots, + 'kitsroot%s' % ver) + if sdkdir: + break + return sdkdir or '' + + @property + def NetFxSdkVersion(self): + """ + Microsoft .NET Framework SDK versions. + """ + # Set FxSdk versions for specified MSVC++ version + if self.vc_ver >= 14.0: + return ('4.6.1', '4.6') + else: + return () + + @property + def NetFxSdkDir(self): + """ + Microsoft .NET Framework SDK directory. + """ + for ver in self.NetFxSdkVersion: + loc = os.path.join(self.ri.netfx_sdk, ver) + sdkdir = self.ri.lookup(loc, 'kitsinstallationfolder') + if sdkdir: + break + return sdkdir or '' + + @property + def FrameworkDir32(self): + """ + Microsoft .NET Framework 32bit directory. + """ + # Default path + guess_fw = os.path.join(self.WinDir, r'Microsoft.NET\Framework') + + # Try to get path from registry, if fail use default path + return self.ri.lookup(self.ri.vc, 'frameworkdir32') or guess_fw + + @property + def FrameworkDir64(self): + """ + Microsoft .NET Framework 64bit directory. + """ + # Default path + guess_fw = os.path.join(self.WinDir, r'Microsoft.NET\Framework64') + + # Try to get path from registry, if fail use default path + return self.ri.lookup(self.ri.vc, 'frameworkdir64') or guess_fw + + @property + def FrameworkVersion32(self): + """ + Microsoft .NET Framework 32bit versions. + """ + return self._find_dot_net_versions(32) + + @property + def FrameworkVersion64(self): + """ + Microsoft .NET Framework 64bit versions. + """ + return self._find_dot_net_versions(64) + + def _find_dot_net_versions(self, bits=32): + """ + Find Microsoft .NET Framework versions. + + Parameters + ---------- + bits: int + Platform number of bits: 32 or 64. + """ + # Find actual .NET version + ver = self.ri.lookup(self.ri.vc, 'frameworkver%d' % bits) or '' + + # Set .NET versions for specified MSVC++ version + if self.vc_ver >= 12.0: + frameworkver = (ver, 'v4.0') + elif self.vc_ver >= 10.0: + frameworkver = ('v4.0.30319' if ver.lower()[:2] != 'v4' else ver, + 'v3.5') + elif self.vc_ver == 9.0: + frameworkver = ('v3.5', 'v2.0.50727') + if self.vc_ver == 8.0: + frameworkver = ('v3.0', 'v2.0.50727') + return frameworkver + + +class EnvironmentInfo: + """ + Return environment variables for specified Microsoft Visual C++ version + and platform : Lib, Include, Path and libpath. + + This function is compatible with Microsoft Visual C++ 9.0 to 14.0. + + Script created by analysing Microsoft environment configuration files like + "vcvars[...].bat", "SetEnv.Cmd", "vcbuildtools.bat", ... + + Parameters + ---------- + arch: str + Target architecture. + vc_ver: float + Required Microsoft Visual C++ version. If not set, autodetect the last + version. + vc_min_ver: float + Minimum Microsoft Visual C++ version. + """ + + # Variables and properties in this class use originals CamelCase variables + # names from Microsoft source files for more easy comparaison. + + def __init__(self, arch, vc_ver=None, vc_min_ver=None): + self.pi = PlatformInfo(arch) + self.ri = RegistryInfo(self.pi) + self.si = SystemInfo(self.ri, vc_ver) + + if vc_min_ver: + if self.vc_ver < vc_min_ver: + err = 'No suitable Microsoft Visual C++ version found' + raise distutils.errors.DistutilsPlatformError(err) + + @property + def vc_ver(self): + """ + Microsoft Visual C++ version. 
+ """ + return self.si.vc_ver + + @property + def VSTools(self): + """ + Microsoft Visual Studio Tools + """ + paths = [r'Common7\IDE', r'Common7\Tools'] + + if self.vc_ver >= 14.0: + arch_subdir = self.pi.current_dir(hidex86=True, x64=True) + paths += [r'Common7\IDE\CommonExtensions\Microsoft\TestWindow'] + paths += [r'Team Tools\Performance Tools'] + paths += [r'Team Tools\Performance Tools%s' % arch_subdir] + + return [os.path.join(self.si.VSInstallDir, path) for path in paths] + + @property + def VCIncludes(self): + """ + Microsoft Visual C++ & Microsoft Foundation Class Includes + """ + return [os.path.join(self.si.VCInstallDir, 'Include'), + os.path.join(self.si.VCInstallDir, r'ATLMFC\Include')] + + @property + def VCLibraries(self): + """ + Microsoft Visual C++ & Microsoft Foundation Class Libraries + """ + arch_subdir = self.pi.target_dir(hidex86=True) + paths = ['Lib%s' % arch_subdir, r'ATLMFC\Lib%s' % arch_subdir] + + if self.vc_ver >= 14.0: + paths += [r'Lib\store%s' % arch_subdir] + + return [os.path.join(self.si.VCInstallDir, path) for path in paths] + + @property + def VCStoreRefs(self): + """ + Microsoft Visual C++ store references Libraries + """ + if self.vc_ver < 14.0: + return [] + return [os.path.join(self.si.VCInstallDir, r'Lib\store\references')] + + @property + def VCTools(self): + """ + Microsoft Visual C++ Tools + """ + si = self.si + tools = [os.path.join(si.VCInstallDir, 'VCPackages')] + + forcex86 = True if self.vc_ver <= 10.0 else False + arch_subdir = self.pi.cross_dir(forcex86) + if arch_subdir: + tools += [os.path.join(si.VCInstallDir, 'Bin%s' % arch_subdir)] + + if self.vc_ver >= 14.0: + path = 'Bin%s' % self.pi.current_dir(hidex86=True) + tools += [os.path.join(si.VCInstallDir, path)] + + else: + tools += [os.path.join(si.VCInstallDir, 'Bin')] + + return tools + + @property + def OSLibraries(self): + """ + Microsoft Windows SDK Libraries + """ + if self.vc_ver <= 10.0: + arch_subdir = self.pi.target_dir(hidex86=True, x64=True) + return [os.path.join(self.si.WindowsSdkDir, 'Lib%s' % arch_subdir)] + + else: + arch_subdir = self.pi.target_dir(x64=True) + lib = os.path.join(self.si.WindowsSdkDir, 'lib') + libver = self._get_content_dirname(lib) + return [os.path.join(lib, '%sum%s' % (libver, arch_subdir))] + + @property + def OSIncludes(self): + """ + Microsoft Windows SDK Include + """ + include = os.path.join(self.si.WindowsSdkDir, 'include') + + if self.vc_ver <= 10.0: + return [include, os.path.join(include, 'gl')] + + else: + if self.vc_ver >= 14.0: + sdkver = self._get_content_dirname(include) + else: + sdkver = '' + return [os.path.join(include, '%sshared' % sdkver), + os.path.join(include, '%sum' % sdkver), + os.path.join(include, '%swinrt' % sdkver)] + + @property + def OSLibpath(self): + """ + Microsoft Windows SDK Libraries Paths + """ + ref = os.path.join(self.si.WindowsSdkDir, 'References') + libpath = [] + + if self.vc_ver <= 9.0: + libpath += self.OSLibraries + + if self.vc_ver >= 11.0: + libpath += [os.path.join(ref, r'CommonConfiguration\Neutral')] + + if self.vc_ver >= 14.0: + libpath += [ + ref, + os.path.join(self.si.WindowsSdkDir, 'UnionMetadata'), + os.path.join( + ref, + 'Windows.Foundation.UniversalApiContract', + '1.0.0.0', + ), + os.path.join( + ref, + 'Windows.Foundation.FoundationContract', + '1.0.0.0', + ), + os.path.join( + ref, + 'Windows.Networking.Connectivity.WwanContract', + '1.0.0.0', + ), + os.path.join( + self.si.WindowsSdkDir, + 'ExtensionSDKs', + 'Microsoft.VCLibs', + '%0.1f' % self.vc_ver, + 'References', + 
'CommonConfiguration', + 'neutral', + ), + ] + return libpath + + @property + def SdkTools(self): + """ + Microsoft Windows SDK Tools + """ + bin_dir = 'Bin' if self.vc_ver <= 11.0 else r'Bin\x86' + tools = [os.path.join(self.si.WindowsSdkDir, bin_dir)] + + if not self.pi.current_is_x86(): + arch_subdir = self.pi.current_dir(x64=True) + path = 'Bin%s' % arch_subdir + tools += [os.path.join(self.si.WindowsSdkDir, path)] + + if self.vc_ver == 10.0 or self.vc_ver == 11.0: + if self.pi.target_is_x86(): + arch_subdir = '' + else: + arch_subdir = self.pi.current_dir(hidex86=True, x64=True) + path = r'Bin\NETFX 4.0 Tools%s' % arch_subdir + tools += [os.path.join(self.si.WindowsSdkDir, path)] + + if self.si.WindowsSDKExecutablePath: + tools += [self.si.WindowsSDKExecutablePath] + + return tools + + @property + def SdkSetup(self): + """ + Microsoft Windows SDK Setup + """ + if self.vc_ver > 9.0: + return [] + + return [os.path.join(self.si.WindowsSdkDir, 'Setup')] + + @property + def FxTools(self): + """ + Microsoft .NET Framework Tools + """ + pi = self.pi + si = self.si + + if self.vc_ver <= 10.0: + include32 = True + include64 = not pi.target_is_x86() and not pi.current_is_x86() + else: + include32 = pi.target_is_x86() or pi.current_is_x86() + include64 = pi.current_cpu == 'amd64' or pi.target_cpu == 'amd64' + + tools = [] + if include32: + tools += [os.path.join(si.FrameworkDir32, ver) + for ver in si.FrameworkVersion32] + if include64: + tools += [os.path.join(si.FrameworkDir64, ver) + for ver in si.FrameworkVersion64] + return tools + + @property + def NetFxSDKLibraries(self): + """ + Microsoft .Net Framework SDK Libraries + """ + if self.vc_ver < 14.0 or not self.si.NetFxSdkDir: + return [] + + arch_subdir = self.pi.target_dir(x64=True) + return [os.path.join(self.si.NetFxSdkDir, r'lib\um%s' % arch_subdir)] + + @property + def NetFxSDKIncludes(self): + """ + Microsoft .Net Framework SDK Includes + """ + if self.vc_ver < 14.0 or not self.si.NetFxSdkDir: + return [] + + return [os.path.join(self.si.NetFxSdkDir, r'include\um')] + + @property + def VsTDb(self): + """ + Microsoft Visual Studio Team System Database + """ + return [os.path.join(self.si.VSInstallDir, r'VSTSDB\Deploy')] + + @property + def MSBuild(self): + """ + Microsoft Build Engine + """ + if self.vc_ver < 12.0: + return [] + + arch_subdir = self.pi.current_dir(hidex86=True) + path = r'MSBuild\%0.1f\bin%s' % (self.vc_ver, arch_subdir) + return [os.path.join(self.si.ProgramFilesx86, path)] + + @property + def HTMLHelpWorkshop(self): + """ + Microsoft HTML Help Workshop + """ + if self.vc_ver < 11.0: + return [] + + return [os.path.join(self.si.ProgramFilesx86, 'HTML Help Workshop')] + + @property + def UCRTLibraries(self): + """ + Microsoft Universal CRT Libraries + """ + if self.vc_ver < 14.0: + return [] + + arch_subdir = self.pi.target_dir(x64=True) + lib = os.path.join(self.si.UniversalCRTSdkDir, 'lib') + ucrtver = self._get_content_dirname(lib) + return [os.path.join(lib, '%sucrt%s' % (ucrtver, arch_subdir))] + + @property + def UCRTIncludes(self): + """ + Microsoft Universal CRT Include + """ + if self.vc_ver < 14.0: + return [] + + include = os.path.join(self.si.UniversalCRTSdkDir, 'include') + ucrtver = self._get_content_dirname(include) + return [os.path.join(include, '%sucrt' % ucrtver)] + + @property + def FSharp(self): + """ + Microsoft Visual F# + """ + if self.vc_ver < 11.0 and self.vc_ver > 12.0: + return [] + + return self.si.FSharpInstallDir + + @property + def VCRuntimeRedist(self): + """ + Microsoft Visual C++ 
runtime redistribuable dll + """ + arch_subdir = self.pi.target_dir(x64=True) + vcruntime = 'redist%s\\Microsoft.VC%d0.CRT\\vcruntime%d0.dll' + vcruntime = vcruntime % (arch_subdir, self.vc_ver, self.vc_ver) + return os.path.join(self.si.VCInstallDir, vcruntime) + + def return_env(self, exists=True): + """ + Return environment dict. + + Parameters + ---------- + exists: bool + It True, only return existing paths. + """ + env = dict( + include=self._build_paths('include', + [self.VCIncludes, + self.OSIncludes, + self.UCRTIncludes, + self.NetFxSDKIncludes], + exists), + lib=self._build_paths('lib', + [self.VCLibraries, + self.OSLibraries, + self.FxTools, + self.UCRTLibraries, + self.NetFxSDKLibraries], + exists), + libpath=self._build_paths('libpath', + [self.VCLibraries, + self.FxTools, + self.VCStoreRefs, + self.OSLibpath], + exists), + path=self._build_paths('path', + [self.VCTools, + self.VSTools, + self.VsTDb, + self.SdkTools, + self.SdkSetup, + self.FxTools, + self.MSBuild, + self.HTMLHelpWorkshop, + self.FSharp], + exists), + ) + if self.vc_ver >= 14 and os.path.isfile(self.VCRuntimeRedist): + env['py_vcruntime_redist'] = self.VCRuntimeRedist + return env + + def _build_paths(self, name, spec_path_lists, exists): + """ + Given an environment variable name and specified paths, + return a pathsep-separated string of paths containing + unique, extant, directories from those paths and from + the environment variable. Raise an error if no paths + are resolved. + """ + # flatten spec_path_lists + spec_paths = itertools.chain.from_iterable(spec_path_lists) + env_paths = safe_env.get(name, '').split(os.pathsep) + paths = itertools.chain(spec_paths, env_paths) + extant_paths = list(filter(os.path.isdir, paths)) if exists else paths + if not extant_paths: + msg = "%s environment variable is empty" % name.upper() + raise distutils.errors.DistutilsPlatformError(msg) + unique_paths = self._unique_everseen(extant_paths) + return os.pathsep.join(unique_paths) + + # from Python docs + def _unique_everseen(self, iterable, key=None): + """ + List unique elements, preserving order. + Remember all elements ever seen. + + _unique_everseen('AAAABBBCCDAABBB') --> A B C D + + _unique_everseen('ABBCcAD', str.lower) --> A B C D + """ + seen = set() + seen_add = seen.add + if key is None: + for element in filterfalse(seen.__contains__, iterable): + seen_add(element) + yield element + else: + for element in iterable: + k = key(element) + if k not in seen: + seen_add(k) + yield element + + def _get_content_dirname(self, path): + """ + Return name of the first dir in path or '' if no dir found. + + Parameters + ---------- + path: str + Path where search dir. 
+ + Return + ------ + foldername: str + "name\" or "" + """ + try: + name = os.listdir(path) + if name: + return '%s\\' % name[0] + return '' + except (OSError, IOError): + return '' diff --git a/setuptools/namespaces.py b/setuptools/namespaces.py new file mode 100755 index 0000000..7c24a56 --- /dev/null +++ b/setuptools/namespaces.py @@ -0,0 +1,107 @@ +import os +from distutils import log +import itertools + +from six.moves import map + + +flatten = itertools.chain.from_iterable + + +class Installer: + + nspkg_ext = '-nspkg.pth' + + def install_namespaces(self): + nsp = self._get_all_ns_packages() + if not nsp: + return + filename, ext = os.path.splitext(self._get_target()) + filename += self.nspkg_ext + self.outputs.append(filename) + log.info("Installing %s", filename) + lines = map(self._gen_nspkg_line, nsp) + + if self.dry_run: + # always generate the lines, even in dry run + list(lines) + return + + with open(filename, 'wt') as f: + f.writelines(lines) + + def uninstall_namespaces(self): + filename, ext = os.path.splitext(self._get_target()) + filename += self.nspkg_ext + if not os.path.exists(filename): + return + log.info("Removing %s", filename) + os.remove(filename) + + def _get_target(self): + return self.target + + _nspkg_tmpl = ( + "import sys, types, os", + "has_mfs = sys.version_info > (3, 5)", + "p = os.path.join(%(root)s, *%(pth)r)", + "importlib = has_mfs and __import__('importlib.util')", + "has_mfs and __import__('importlib.machinery')", + "m = has_mfs and " + "sys.modules.setdefault(%(pkg)r, " + "importlib.util.module_from_spec(" + "importlib.machinery.PathFinder.find_spec(%(pkg)r, " + "[os.path.dirname(p)])))", + "m = m or " + "sys.modules.setdefault(%(pkg)r, types.ModuleType(%(pkg)r))", + "mp = (m or []) and m.__dict__.setdefault('__path__',[])", + "(p not in mp) and mp.append(p)", + ) + "lines for the namespace installer" + + _nspkg_tmpl_multi = ( + 'm and setattr(sys.modules[%(parent)r], %(child)r, m)', + ) + "additional line(s) when a parent package is indicated" + + def _get_root(self): + return "sys._getframe(1).f_locals['sitedir']" + + def _gen_nspkg_line(self, pkg): + # ensure pkg is not a unicode string under Python 2.7 + pkg = str(pkg) + pth = tuple(pkg.split('.')) + root = self._get_root() + tmpl_lines = self._nspkg_tmpl + parent, sep, child = pkg.rpartition('.') + if parent: + tmpl_lines += self._nspkg_tmpl_multi + return ';'.join(tmpl_lines) % locals() + '\n' + + def _get_all_ns_packages(self): + """Return sorted list of all package namespaces""" + pkgs = self.distribution.namespace_packages or [] + return sorted(flatten(map(self._pkg_names, pkgs))) + + @staticmethod + def _pkg_names(pkg): + """ + Given a namespace package, yield the components of that + package. 
+ + >>> names = Installer._pkg_names('a.b.c') + >>> set(names) == set(['a', 'a.b', 'a.b.c']) + True + """ + parts = pkg.split('.') + while parts: + yield '.'.join(parts) + parts.pop() + + +class DevelopInstaller(Installer): + def _get_root(self): + return repr(str(self.egg_path)) + + def _get_target(self): + return self.egg_link diff --git a/setuptools/package_index.py b/setuptools/package_index.py index 70b75a6..5d397b6 100755 --- a/setuptools/package_index.py +++ b/setuptools/package_index.py @@ -1,22 +1,43 @@ """PyPI and direct package downloading""" -import sys, os.path, re, urlparse, urllib2, shutil, random, socket, cStringIO -import httplib -from pkg_resources import * -from distutils import log -from distutils.errors import DistutilsError +import sys +import os +import re +import shutil +import socket +import base64 +import hashlib +import itertools +from functools import wraps + try: - from hashlib import md5 + from urllib.parse import splituser except ImportError: - from md5 import md5 + from urllib2 import splituser + +import six +from six.moves import urllib, http_client, configparser, map + +import setuptools +from pkg_resources import ( + CHECKOUT_DIST, Distribution, BINARY_DIST, normalize_path, SOURCE_DIST, + Environment, find_distributions, safe_name, safe_version, + to_filename, Requirement, DEVELOP_DIST, +) +from setuptools import ssl_support +from distutils import log +from distutils.errors import DistutilsError from fnmatch import translate -EGG_FRAGMENT = re.compile(r'^egg=([-A-Za-z0-9_.]+)$') +from setuptools.py26compat import strip_fragment +from setuptools.py27compat import get_all_headers + +EGG_FRAGMENT = re.compile(r'^egg=([-A-Za-z0-9_.+!]+)$') HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I) # this is here to fix emacs' cruddy broken syntax highlighting PYPI_MD5 = re.compile( - '([^<]+)\n\s+\\(md5\\)' + '([^<]+)\n\\s+\\(md5\\)' ) -URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):',re.I).match +URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):', re.I).match EXTENSIONS = ".tar.gz .tar.bz2 .tar .zip .tgz".split() __all__ = [ @@ -24,53 +45,82 @@ __all__ = [ 'interpret_distro_name', ] +_SOCKET_TIMEOUT = 15 + +_tmpl = "setuptools/{setuptools.__version__} Python-urllib/{py_major}" +user_agent = _tmpl.format(py_major=sys.version[:3], setuptools=setuptools) + + +def parse_requirement_arg(spec): + try: + return Requirement.parse(spec) + except ValueError: + raise DistutilsError( + "Not a URL, existing file, or requirement spec: %r" % (spec,) + ) + + def parse_bdist_wininst(name): """Return (base,pyversion) or (None,None) for possible .exe name""" lower = name.lower() - base, py_ver = None, None + base, py_ver, plat = None, None, None if lower.endswith('.exe'): if lower.endswith('.win32.exe'): base = name[:-10] - elif lower.startswith('.win32-py',-16): + plat = 'win32' + elif lower.startswith('.win32-py', -16): py_ver = name[-7:-4] base = name[:-16] + plat = 'win32' + elif lower.endswith('.win-amd64.exe'): + base = name[:-14] + plat = 'win-amd64' + elif lower.startswith('.win-amd64-py', -20): + py_ver = name[-7:-4] + base = name[:-20] + plat = 'win-amd64' + return base, py_ver, plat - return base,py_ver def egg_info_for_url(url): - scheme, server, path, parameters, query, fragment = urlparse.urlparse(url) - base = urllib2.unquote(path.split('/')[-1]) - if server=='sourceforge.net' and base=='download': # XXX Yuck - base = urllib2.unquote(path.split('/')[-2]) - if '#' in base: base, fragment = base.split('#',1) - return base,fragment + parts = urllib.parse.urlparse(url) + 
scheme, server, path, parameters, query, fragment = parts + base = urllib.parse.unquote(path.split('/')[-1]) + if server == 'sourceforge.net' and base == 'download': # XXX Yuck + base = urllib.parse.unquote(path.split('/')[-2]) + if '#' in base: + base, fragment = base.split('#', 1) + return base, fragment + def distros_for_url(url, metadata=None): """Yield egg or source distribution objects that might be found at a URL""" base, fragment = egg_info_for_url(url) - for dist in distros_for_location(url, base, metadata): yield dist + for dist in distros_for_location(url, base, metadata): + yield dist if fragment: match = EGG_FRAGMENT.match(fragment) if match: for dist in interpret_distro_name( - url, match.group(1), metadata, precedence = CHECKOUT_DIST + url, match.group(1), metadata, precedence=CHECKOUT_DIST ): yield dist + def distros_for_location(location, basename, metadata=None): """Yield egg or source distribution objects based on basename""" if basename.endswith('.egg.zip'): - basename = basename[:-4] # strip the .zip + basename = basename[:-4] # strip the .zip if basename.endswith('.egg') and '-' in basename: # only one, unambiguous interpretation return [Distribution.from_location(location, basename, metadata)] if basename.endswith('.exe'): - win_base, py_ver = parse_bdist_wininst(basename) + win_base, py_ver, platform = parse_bdist_wininst(basename) if win_base is not None: return interpret_distro_name( - location, win_base, metadata, py_ver, BINARY_DIST, "win32" + location, win_base, metadata, py_ver, BINARY_DIST, platform ) # Try source distro extensions (.zip, .tgz, etc.) # @@ -80,6 +130,7 @@ def distros_for_location(location, basename, metadata=None): return interpret_distro_name(location, basename, metadata) return [] # no extension matched + def distros_for_filename(filename, metadata=None): """Yield possible egg or source distribution objects based on a filename""" return distros_for_location( @@ -87,9 +138,10 @@ def distros_for_filename(filename, metadata=None): ) -def interpret_distro_name(location, basename, metadata, - py_version=None, precedence=SOURCE_DIST, platform=None -): +def interpret_distro_name( + location, basename, metadata, py_version=None, precedence=SOURCE_DIST, + platform=None + ): """Generate alternative interpretations of a source distro name Note: if `location` is a filesystem filename, you should call @@ -109,58 +161,154 @@ def interpret_distro_name(location, basename, metadata, # versions in distribution archive names (sdist and bdist). parts = basename.split('-') - if not py_version: - for i,p in enumerate(parts[2:]): - if len(p)==5 and p.startswith('py2.'): - return # It's a bdist_dumb, not an sdist -- bail out + if not py_version and any(re.match(r'py\d\.\d$', p) for p in parts[2:]): + # it is a bdist_dumb, not an sdist -- bail out + return - for p in range(1,len(parts)+1): + for p in range(1, len(parts) + 1): yield Distribution( location, metadata, '-'.join(parts[:p]), '-'.join(parts[p:]), - py_version=py_version, precedence = precedence, - platform = platform + py_version=py_version, precedence=precedence, + platform=platform ) -REL = re.compile("""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I) + +# From Python 2.7 docs +def unique_everseen(iterable, key=None): + "List unique elements, preserving order. Remember all elements ever seen." 
+ # unique_everseen('AAAABBBCCDAABBB') --> A B C D + # unique_everseen('ABBCcAD', str.lower) --> A B C D + seen = set() + seen_add = seen.add + if key is None: + for element in six.moves.filterfalse(seen.__contains__, iterable): + seen_add(element) + yield element + else: + for element in iterable: + k = key(element) + if k not in seen: + seen_add(k) + yield element + + +def unique_values(func): + """ + Wrap a function returning an iterable such that the resulting iterable + only ever yields unique items. + """ + + @wraps(func) + def wrapper(*args, **kwargs): + return unique_everseen(func(*args, **kwargs)) + + return wrapper + + +REL = re.compile(r"""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I) # this line is here to fix emacs' cruddy broken syntax highlighting + +@unique_values def find_external_links(url, page): """Find rel="homepage" and rel="download" links in `page`, yielding URLs""" for match in REL.finditer(page): tag, rel = match.groups() - rels = map(str.strip, rel.lower().split(',')) + rels = set(map(str.strip, rel.lower().split(','))) if 'homepage' in rels or 'download' in rels: for match in HREF.finditer(tag): - yield urlparse.urljoin(url, htmldecode(match.group(1))) + yield urllib.parse.urljoin(url, htmldecode(match.group(1))) for tag in ("Home Page", "Download URL"): pos = page.find(tag) - if pos!=-1: - match = HREF.search(page,pos) + if pos != -1: + match = HREF.search(page, pos) if match: - yield urlparse.urljoin(url, htmldecode(match.group(1))) + yield urllib.parse.urljoin(url, htmldecode(match.group(1))) -user_agent = "Python-urllib/%s setuptools/%s" % ( - urllib2.__version__, require('setuptools')[0].version -) + +class ContentChecker(object): + """ + A null content checker that defines the interface for checking content + """ + + def feed(self, block): + """ + Feed a block of data to the hash. + """ + return + + def is_valid(self): + """ + Check the hash. Return False if validation fails. + """ + return True + + def report(self, reporter, template): + """ + Call reporter with information about the checker (hash name) + substituted into the template. 
+ """ + return + + +class HashChecker(ContentChecker): + pattern = re.compile( + r'(?Psha1|sha224|sha384|sha256|sha512|md5)=' + r'(?P[a-f0-9]+)' + ) + + def __init__(self, hash_name, expected): + self.hash_name = hash_name + self.hash = hashlib.new(hash_name) + self.expected = expected + + @classmethod + def from_url(cls, url): + "Construct a (possibly null) ContentChecker from a URL" + fragment = urllib.parse.urlparse(url)[-1] + if not fragment: + return ContentChecker() + match = cls.pattern.search(fragment) + if not match: + return ContentChecker() + return cls(**match.groupdict()) + + def feed(self, block): + self.hash.update(block) + + def is_valid(self): + return self.hash.hexdigest() == self.expected + + def report(self, reporter, template): + msg = template % self.hash_name + return reporter(msg) class PackageIndex(Environment): """A distribution index that scans web pages for download URLs""" - def __init__(self, index_url="http://pypi.python.org/simple", hosts=('*',), - *args, **kw - ): - Environment.__init__(self,*args,**kw) - self.index_url = index_url + "/"[:not index_url.endswith('/')] + def __init__( + self, index_url="https://pypi.python.org/simple", hosts=('*',), + ca_bundle=None, verify_ssl=True, *args, **kw + ): + Environment.__init__(self, *args, **kw) + self.index_url = index_url + "/" [:not index_url.endswith('/')] self.scanned_urls = {} self.fetched_urls = {} self.package_pages = {} - self.allows = re.compile('|'.join(map(translate,hosts))).match + self.allows = re.compile('|'.join(map(translate, hosts))).match self.to_scan = [] - - + use_ssl = ( + verify_ssl + and ssl_support.is_available + and (ca_bundle or ssl_support.find_ca_bundle()) + ) + if use_ssl: + self.opener = ssl_support.opener_for(ca_bundle) + else: + self.opener = urllib.request.urlopen def process_url(self, url, retrieve=False): """Evaluate a URL as a possible download, and maybe retrieve it""" @@ -178,7 +326,7 @@ class PackageIndex(Environment): self.debug("Found link: %s", url) if dists or not retrieve or url in self.fetched_urls: - map(self.add, dists) + list(map(self.add, dists)) return # don't need the actual page if not self.url_ok(url): @@ -186,22 +334,31 @@ class PackageIndex(Environment): return self.info("Reading %s", url) - self.fetched_urls[url] = True # prevent multiple fetch attempts - f = self.open_url(url, "Download error: %s -- Some packages may not be found!") - if f is None: return + self.fetched_urls[url] = True # prevent multiple fetch attempts + tmpl = "Download error on %s: %%s -- Some packages may not be found!" + f = self.open_url(url, tmpl % url) + if f is None: + return self.fetched_urls[f.url] = True if 'html' not in f.headers.get('content-type', '').lower(): - f.close() # not html, we can't process it + f.close() # not html, we can't process it return - base = f.url # handle redirects + base = f.url # handle redirects page = f.read() + if not isinstance(page, str): # We are in Python 3 and got bytes. We want str. 
+ if isinstance(f, urllib.error.HTTPError): + # Errors have no charset, assume latin1: + charset = 'latin-1' + else: + charset = f.headers.get_param('charset') or 'latin-1' + page = page.decode(charset, "ignore") f.close() - if url.startswith(self.index_url) and getattr(f,'code',None)!=404: - page = self.process_index(url, page) for match in HREF.finditer(page): - link = urlparse.urljoin(base, htmldecode(match.group(1))) + link = urllib.parse.urljoin(base, htmldecode(match.group(1))) self.process_url(link) + if url.startswith(self.index_url) and getattr(f, 'code', None) != 404: + page = self.process_index(url, page) def process_filename(self, fn, nested=False): # process filenames or directories @@ -212,59 +369,76 @@ class PackageIndex(Environment): if os.path.isdir(fn) and not nested: path = os.path.realpath(fn) for item in os.listdir(path): - self.process_filename(os.path.join(path,item), True) + self.process_filename(os.path.join(path, item), True) dists = distros_for_filename(fn) if dists: self.debug("Found: %s", fn) - map(self.add, dists) + list(map(self.add, dists)) def url_ok(self, url, fatal=False): s = URL_SCHEME(url) - if (s and s.group(1).lower()=='file') or self.allows(urlparse.urlparse(url)[1]): + is_file = s and s.group(1).lower() == 'file' + if is_file or self.allows(urllib.parse.urlparse(url)[1]): return True - msg = "\nLink to % s ***BLOCKED*** by --allow-hosts\n" + msg = ("\nNote: Bypassing %s (disallowed host; see " + "http://bit.ly/1dg9ijs for details).\n") if fatal: raise DistutilsError(msg % url) else: self.warn(msg, url) def scan_egg_links(self, search_path): - for item in search_path: - if os.path.isdir(item): - for entry in os.listdir(item): - if entry.endswith('.egg-link'): - self.scan_egg_link(item, entry) + dirs = filter(os.path.isdir, search_path) + egg_links = ( + (path, entry) + for path in dirs + for entry in os.listdir(path) + if entry.endswith('.egg-link') + ) + list(itertools.starmap(self.scan_egg_link, egg_links)) def scan_egg_link(self, path, entry): - lines = filter(None, map(str.strip, file(os.path.join(path, entry)))) - if len(lines)==2: - for dist in find_distributions(os.path.join(path, lines[0])): - dist.location = os.path.join(path, *lines) - dist.precedence = SOURCE_DIST - self.add(dist) - - def process_index(self,url,page): + with open(os.path.join(path, entry)) as raw_lines: + # filter non-empty lines + lines = list(filter(None, map(str.strip, raw_lines))) + + if len(lines) != 2: + # format is not recognized; punt + return + + egg_path, setup_path = lines + + for dist in find_distributions(os.path.join(path, egg_path)): + dist.location = os.path.join(path, *lines) + dist.precedence = SOURCE_DIST + self.add(dist) + + def process_index(self, url, page): """Process the contents of a PyPI page""" + def scan(link): # Process a URL to see if it's for a package page if link.startswith(self.index_url): - parts = map( - urllib2.unquote, link[len(self.index_url):].split('/') - ) - if len(parts)==2 and '#' not in parts[1]: + parts = list(map( + urllib.parse.unquote, link[len(self.index_url):].split('/') + )) + if len(parts) == 2 and '#' not in parts[1]: # it's a package page, sanitize and index it pkg = safe_name(parts[0]) ver = safe_version(parts[1]) - self.package_pages.setdefault(pkg.lower(),{})[link] = True + self.package_pages.setdefault(pkg.lower(), {})[link] = True return to_filename(pkg), to_filename(ver) return None, None # process an index page into the package-page index for match in HREF.finditer(page): - scan( urlparse.urljoin(url, 
htmldecode(match.group(1))) ) + try: + scan(urllib.parse.urljoin(url, htmldecode(match.group(1)))) + except ValueError: + pass - pkg, ver = scan(url) # ensure this page is in the page index + pkg, ver = scan(url) # ensure this page is in the page index if pkg: # process individual package page for new_url in find_external_links(url, page): @@ -272,18 +446,16 @@ class PackageIndex(Environment): base, frag = egg_info_for_url(new_url) if base.endswith('.py') and not frag: if ver: - new_url+='#egg=%s-%s' % (pkg,ver) + new_url += '#egg=%s-%s' % (pkg, ver) else: self.need_version_info(url) self.scan_url(new_url) return PYPI_MD5.sub( - lambda m: '%s' % m.group(1,3,2), page + lambda m: '%s' % m.group(1, 3, 2), page ) else: - return "" # no sense double-scanning non-package pages - - + return "" # no sense double-scanning non-package pages def need_version_info(self, url): self.scan_all( @@ -293,58 +465,60 @@ class PackageIndex(Environment): def scan_all(self, msg=None, *args): if self.index_url not in self.fetched_urls: - if msg: self.warn(msg,*args) + if msg: + self.warn(msg, *args) self.info( "Scanning index of all packages (this may take a while)" ) self.scan_url(self.index_url) def find_packages(self, requirement): - self.scan_url(self.index_url + requirement.unsafe_name+'/') + self.scan_url(self.index_url + requirement.unsafe_name + '/') if not self.package_pages.get(requirement.key): # Fall back to safe version of the name - self.scan_url(self.index_url + requirement.project_name+'/') + self.scan_url(self.index_url + requirement.project_name + '/') if not self.package_pages.get(requirement.key): # We couldn't find the target package, so search the index page too self.not_found_in_index(requirement) - for url in list(self.package_pages.get(requirement.key,())): + for url in list(self.package_pages.get(requirement.key, ())): # scan each page that might be related to the desired package self.scan_url(url) def obtain(self, requirement, installer=None): - self.prescan(); self.find_packages(requirement) + self.prescan() + self.find_packages(requirement) for dist in self[requirement.key]: if dist in requirement: return dist self.debug("%s does not match %s", requirement, dist) - return super(PackageIndex, self).obtain(requirement,installer) - + return super(PackageIndex, self).obtain(requirement, installer) - - - - def check_md5(self, cs, info, filename, tfp): - if re.match('md5=[0-9a-f]{32}$', info): - self.debug("Validating md5 checksum for %s", filename) - if cs.hexdigest()!=info[4:]: - tfp.close() - os.unlink(filename) - raise DistutilsError( - "MD5 validation failed for "+os.path.basename(filename)+ - "; possible download problem?" - ) + def check_hash(self, checker, filename, tfp): + """ + checker is a ContentChecker + """ + checker.report(self.debug, + "Validating %%s checksum for %s" % filename) + if not checker.is_valid(): + tfp.close() + os.unlink(filename) + raise DistutilsError( + "%s validation failed for %s; " + "possible download problem?" 
% ( + checker.hash.name, os.path.basename(filename)) + ) def add_find_links(self, urls): """Add `urls` to the list that will be prescanned for searches""" for url in urls: if ( - self.to_scan is None # if we have already "gone online" - or not URL_SCHEME(url) # or it's a local file/directory + self.to_scan is None # if we have already "gone online" + or not URL_SCHEME(url) # or it's a local file/directory or url.startswith('file:') - or list(distros_for_url(url)) # or a direct package link + or list(distros_for_url(url)) # or a direct package link ): # then go ahead and process it now self.scan_url(url) @@ -355,13 +529,13 @@ class PackageIndex(Environment): def prescan(self): """Scan urls scheduled for prescanning (e.g. --find-links)""" if self.to_scan: - map(self.scan_url, self.to_scan) - self.to_scan = None # from now on, go ahead and process immediately + list(map(self.scan_url, self.to_scan)) + self.to_scan = None # from now on, go ahead and process immediately def not_found_in_index(self, requirement): - if self[requirement.key]: # we've seen at least one distro + if self[requirement.key]: # we've seen at least one distro meth, msg = self.info, "Couldn't retrieve index page for %r" - else: # no distros seen for this name, might be misspelled + else: # no distros seen for this name, might be misspelled meth, msg = (self.warn, "Couldn't find index page for %r (maybe misspelled?)") meth(msg, requirement.unsafe_name) @@ -385,33 +559,26 @@ class PackageIndex(Environment): of `tmpdir`, and the local filename is returned. Various errors may be raised if a problem occurs during downloading. """ - if not isinstance(spec,Requirement): + if not isinstance(spec, Requirement): scheme = URL_SCHEME(spec) if scheme: # It's a url, download it to tmpdir found = self._download_url(scheme.group(1), spec, tmpdir) base, fragment = egg_info_for_url(spec) if base.endswith('.py'): - found = self.gen_setup(found,fragment,tmpdir) + found = self.gen_setup(found, fragment, tmpdir) return found elif os.path.exists(spec): # Existing file or directory, just return it return spec else: - try: - spec = Requirement.parse(spec) - except ValueError: - raise DistutilsError( - "Not a URL, existing file, or requirement spec: %r" % - (spec,) - ) - return getattr(self.fetch_distribution(spec, tmpdir),'location',None) + spec = parse_requirement_arg(spec) + return getattr(self.fetch_distribution(spec, tmpdir), 'location', None) - - def fetch_distribution(self, - requirement, tmpdir, force_scan=False, source=False, develop_ok=False, - local_index=None, - ): + def fetch_distribution( + self, requirement, tmpdir, force_scan=False, source=False, + develop_ok=False, local_index=None + ): """Obtain a distribution suitable for fulfilling `requirement` `requirement` must be a ``pkg_resources.Requirement`` instance. 
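# A minimal usage sketch of the fetch_distribution()/download() API that the
# surrounding hunks modify, assuming a default PyPI index; the requirement
# string and the temporary directory below are hypothetical, not taken from
# the patch.
from pkg_resources import Requirement
from setuptools.package_index import PackageIndex

index = PackageIndex()
req = Requirement.parse('example-dist>=1.0')   # hypothetical project name
dist = index.fetch_distribution(req, '/tmp/dl', source=True)
if dist is not None:
    print(dist.location)  # local path of the downloaded source distribution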
@@ -433,47 +600,50 @@ class PackageIndex(Environment): skipped = {} dist = None - def find(env, req): + def find(req, env=None): + if env is None: + env = self # Find a matching distribution; may be called more than once for dist in env[req.key]: - if dist.precedence==DEVELOP_DIST and not develop_ok: + if dist.precedence == DEVELOP_DIST and not develop_ok: if dist not in skipped: - self.warn("Skipping development or system egg: %s",dist) + self.warn("Skipping development or system egg: %s", dist) skipped[dist] = 1 continue - if dist in req and (dist.precedence<=SOURCE_DIST or not source): - return dist - - + if dist in req and (dist.precedence <= SOURCE_DIST or not source): + dist.download_location = self.download(dist.location, tmpdir) + if os.path.exists(dist.download_location): + return dist if force_scan: self.prescan() self.find_packages(requirement) - dist = find(self, requirement) - - if local_index is not None: - dist = dist or find(local_index, requirement) + dist = find(requirement) - if dist is None and self.to_scan is not None: - self.prescan() - dist = find(self, requirement) + if not dist and local_index is not None: + dist = find(requirement, local_index) + + if dist is None: + if self.to_scan is not None: + self.prescan() + dist = find(requirement) if dist is None and not force_scan: self.find_packages(requirement) - dist = find(self, requirement) + dist = find(requirement) if dist is None: self.warn( - "No local packages or download links found for %s%s", + "No local packages or working download links found for %s%s", (source and "a source distribution of " or ""), requirement, ) - self.info("Best match: %s", dist) - return dist.clone(location=self.download(dist.location, tmpdir)) - + else: + self.info("Best match: %s", dist) + return dist.clone(location=dist.download_location) def fetch(self, requirement, tmpdir, force_scan=False, source=False): """Obtain a file suitable for fulfilling `requirement` @@ -483,20 +653,19 @@ class PackageIndex(Environment): ``location`` of the downloaded distribution instead of a distribution object. """ - dist = self.fetch_distribution(requirement,tmpdir,force_scan,source) + dist = self.fetch_distribution(requirement, tmpdir, force_scan, source) if dist is not None: return dist.location return None - - def gen_setup(self, filename, fragment, tmpdir): match = EGG_FRAGMENT.match(fragment) - dists = match and [d for d in + dists = match and [ + d for d in interpret_distro_name(filename, match.group(1), None) if d.version ] or [] - if len(dists)==1: # unambiguous ``#egg`` fragment + if len(dists) == 1: # unambiguous ``#egg`` fragment basename = os.path.basename(filename) # Make sure the file has been downloaded to the temp dir. 
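# A small sketch of the ``#egg`` fragment convention that gen_setup() above
# relies on: a bare .py link carries the project name and version in its URL
# fragment.  The URL is hypothetical, and the exact return values of
# egg_info_for_url() are an assumption inferred from the surrounding code.
from setuptools.package_index import egg_info_for_url

url = 'https://example.com/dl/example_module.py#egg=example_module-1.0'
base, fragment = egg_info_for_url(url)
# expected (approximately): base == 'example_module.py',
#                           fragment == 'egg=example_module-1.0'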
@@ -505,25 +674,24 @@ class PackageIndex(Environment): from setuptools.command.easy_install import samefile if not samefile(filename, dst): shutil.copy2(filename, dst) - filename=dst - - file = open(os.path.join(tmpdir, 'setup.py'), 'w') - file.write( - "from setuptools import setup\n" - "setup(name=%r, version=%r, py_modules=[%r])\n" - % ( - dists[0].project_name, dists[0].version, - os.path.splitext(basename)[0] + filename = dst + + with open(os.path.join(tmpdir, 'setup.py'), 'w') as file: + file.write( + "from setuptools import setup\n" + "setup(name=%r, version=%r, py_modules=[%r])\n" + % ( + dists[0].project_name, dists[0].version, + os.path.splitext(basename)[0] + ) ) - ) - file.close() return filename elif match: raise DistutilsError( "Can't unambiguously interpret project/version identifier %r; " "any dashes in the name or version should be escaped using " - "underscores. %r" % (fragment,dists) + "underscores. %r" % (fragment, dists) ) else: raise DistutilsError( @@ -532,94 +700,116 @@ class PackageIndex(Environment): ) dl_blocksize = 8192 + def _download_to(self, url, filename): self.info("Downloading %s", url) # Download the file - fp, tfp, info = None, None, None + fp, info = None, None try: - if '#' in url: - url, info = url.split('#', 1) - fp = self.open_url(url) - if isinstance(fp, urllib2.HTTPError): + checker = HashChecker.from_url(url) + fp = self.open_url(strip_fragment(url)) + if isinstance(fp, urllib.error.HTTPError): raise DistutilsError( - "Can't download %s: %s %s" % (url, fp.code,fp.msg) + "Can't download %s: %s %s" % (url, fp.code, fp.msg) ) - cs = md5() headers = fp.info() blocknum = 0 bs = self.dl_blocksize size = -1 if "content-length" in headers: - size = int(headers["Content-Length"]) + # Some servers return multiple Content-Length headers :( + sizes = get_all_headers(headers, 'Content-Length') + size = max(map(int, sizes)) self.reporthook(url, filename, blocknum, bs, size) - tfp = open(filename,'wb') - while True: - block = fp.read(bs) - if block: - cs.update(block) - tfp.write(block) - blocknum += 1 - self.reporthook(url, filename, blocknum, bs, size) - else: - break - if info: self.check_md5(cs, info, filename, tfp) + with open(filename, 'wb') as tfp: + while True: + block = fp.read(bs) + if block: + checker.feed(block) + tfp.write(block) + blocknum += 1 + self.reporthook(url, filename, blocknum, bs, size) + else: + break + self.check_hash(checker, filename, tfp) return headers finally: - if fp: fp.close() - if tfp: tfp.close() + if fp: + fp.close() def reporthook(self, url, filename, blocknum, blksize, size): - pass # no-op - + pass # no-op def open_url(self, url, warning=None): - if url.startswith('file:'): return local_open(url) + if url.startswith('file:'): + return local_open(url) try: - return open_with_auth(url) - except urllib2.HTTPError, v: + return open_with_auth(url, self.opener) + except (ValueError, http_client.InvalidURL) as v: + msg = ' '.join([str(arg) for arg in v.args]) + if warning: + self.warn(warning, msg) + else: + raise DistutilsError('%s %s' % (url, msg)) + except urllib.error.HTTPError as v: return v - except urllib2.URLError, v: - reason = v.reason - except httplib.HTTPException, v: - reason = "%s: %s" % (v.__doc__ or v.__class__.__name__, v) - if warning: - self.warn(warning, reason) - else: - raise DistutilsError("Download error for %s: %s" % (url, reason)) + except urllib.error.URLError as v: + if warning: + self.warn(warning, v.reason) + else: + raise DistutilsError("Download error for %s: %s" + % (url, v.reason)) + except 
http_client.BadStatusLine as v: + if warning: + self.warn(warning, v.line) + else: + raise DistutilsError( + '%s returned a bad status line. The server might be ' + 'down, %s' % + (url, v.line) + ) + except (http_client.HTTPException, socket.error) as v: + if warning: + self.warn(warning, v) + else: + raise DistutilsError("Download error for %s: %s" + % (url, v)) def _download_url(self, scheme, url, tmpdir): # Determine download filename # - name = filter(None,urlparse.urlparse(url)[2].split('/')) + name, fragment = egg_info_for_url(url) if name: - name = name[-1] while '..' in name: - name = name.replace('..','.').replace('\\','_') + name = name.replace('..', '.').replace('\\', '_') else: - name = "__downloaded__" # default if URL has no path contents + name = "__downloaded__" # default if URL has no path contents if name.endswith('.egg.zip'): - name = name[:-4] # strip the extra .zip before download + name = name[:-4] # strip the extra .zip before download - filename = os.path.join(tmpdir,name) + filename = os.path.join(tmpdir, name) # Download the file # - if scheme=='svn' or scheme.startswith('svn+'): + if scheme == 'svn' or scheme.startswith('svn+'): return self._download_svn(url, filename) - elif scheme=='file': - return urllib2.url2pathname(urlparse.urlparse(url)[2]) + elif scheme == 'git' or scheme.startswith('git+'): + return self._download_git(url, filename) + elif scheme.startswith('hg+'): + return self._download_hg(url, filename) + elif scheme == 'file': + return urllib.request.url2pathname(urllib.parse.urlparse(url)[2]) else: - self.url_ok(url, True) # raises error if not allowed + self.url_ok(url, True) # raises error if not allowed return self._attempt_download(url, filename) def scan_url(self, url): self.process_url(url, True) - def _attempt_download(self, url, filename): headers = self._download_to(url, filename) - if 'html' in headers.get('content-type','').lower(): + if 'html' in headers.get('content-type', '').lower(): return self._download_html(url, headers, filename) else: return filename @@ -634,15 +824,80 @@ class PackageIndex(Environment): file.close() os.unlink(filename) return self._download_svn(url, filename) - break # not an index page + break # not an index page file.close() os.unlink(filename) - raise DistutilsError("Unexpected HTML page found at "+url) + raise DistutilsError("Unexpected HTML page found at " + url) def _download_svn(self, url, filename): - url = url.split('#',1)[0] # remove any fragment for svn's sake + url = url.split('#', 1)[0] # remove any fragment for svn's sake + creds = '' + if url.lower().startswith('svn:') and '@' in url: + scheme, netloc, path, p, q, f = urllib.parse.urlparse(url) + if not netloc and path.startswith('//') and '/' in path[2:]: + netloc, path = path[2:].split('/', 1) + auth, host = splituser(netloc) + if auth: + if ':' in auth: + user, pw = auth.split(':', 1) + creds = " --username=%s --password=%s" % (user, pw) + else: + creds = " --username=" + auth + netloc = host + parts = scheme, netloc, url, p, q, f + url = urllib.parse.urlunparse(parts) self.info("Doing subversion checkout from %s to %s", url, filename) - os.system("svn checkout -q %s %s" % (url, filename)) + os.system("svn checkout%s -q %s %s" % (creds, url, filename)) + return filename + + @staticmethod + def _vcs_split_rev_from_url(url, pop_prefix=False): + scheme, netloc, path, query, frag = urllib.parse.urlsplit(url) + + scheme = scheme.split('+', 1)[-1] + + # Some fragment identification fails + path = path.split('#', 1)[0] + + rev = None + if '@' in path: 
+ path, rev = path.rsplit('@', 1) + + # Also, discard fragment + url = urllib.parse.urlunsplit((scheme, netloc, path, query, '')) + + return url, rev + + def _download_git(self, url, filename): + filename = filename.split('#', 1)[0] + url, rev = self._vcs_split_rev_from_url(url, pop_prefix=True) + + self.info("Doing git clone from %s to %s", url, filename) + os.system("git clone --quiet %s %s" % (url, filename)) + + if rev is not None: + self.info("Checking out %s", rev) + os.system("(cd %s && git checkout --quiet %s)" % ( + filename, + rev, + )) + + return filename + + def _download_hg(self, url, filename): + filename = filename.split('#', 1)[0] + url, rev = self._vcs_split_rev_from_url(url, pop_prefix=True) + + self.info("Doing hg clone from %s to %s", url, filename) + os.system("hg clone --quiet %s %s" % (url, filename)) + + if rev is not None: + self.info("Updating to %s", rev) + os.system("(cd %s && hg up -C -r %s >&-)" % ( + filename, + rev, + )) + return filename def debug(self, msg, *args): @@ -654,16 +909,20 @@ class PackageIndex(Environment): def warn(self, msg, *args): log.warn(msg, *args) + # This pattern matches a character entity reference (a decimal numeric # references, a hexadecimal numeric reference, or a named reference). entity_sub = re.compile(r'&(#(\d+|x[\da-fA-F]+)|[\w.:-]+);?').sub + def uchr(c): if not isinstance(c, int): return c - if c>255: return unichr(c) + if c > 255: + return six.unichr(c) return chr(c) + def decode_entity(match): what = match.group(1) if what.startswith('#x'): @@ -671,109 +930,186 @@ def decode_entity(match): elif what.startswith('#'): what = int(what[1:]) else: - from htmlentitydefs import name2codepoint - what = name2codepoint.get(what, match.group(0)) + what = six.moves.html_entities.name2codepoint.get(what, match.group(0)) return uchr(what) + def htmldecode(text): """Decode HTML entities in the given text.""" return entity_sub(decode_entity, text) +def socket_timeout(timeout=15): + def _socket_timeout(func): + def _socket_timeout(*args, **kwargs): + old_timeout = socket.getdefaulttimeout() + socket.setdefaulttimeout(timeout) + try: + return func(*args, **kwargs) + finally: + socket.setdefaulttimeout(old_timeout) + return _socket_timeout + return _socket_timeout +def _encode_auth(auth): + """ + A function compatible with Python 2.3-3.3 that will encode + auth from a URL suitable for an HTTP header. + >>> str(_encode_auth('username%3Apassword')) + 'dXNlcm5hbWU6cGFzc3dvcmQ=' + + Long auth strings should not cause a newline to be inserted. + >>> long_auth = 'username:' + 'password'*10 + >>> chr(10) in str(_encode_auth(long_auth)) + False + """ + auth_s = urllib.parse.unquote(auth) + # convert to bytes + auth_bytes = auth_s.encode() + # use the legacy interface for Python 2.3 support + encoded_bytes = base64.encodestring(auth_bytes) + # convert back to a string + encoded = encoded_bytes.decode() + # strip the trailing carriage return + return encoded.replace('\n', '') + + +class Credential(object): + """ + A username/password pair. Use like a namedtuple. 
+ """ + def __init__(self, username, password): + self.username = username + self.password = password + def __iter__(self): + yield self.username + yield self.password + def __str__(self): + return '%(username)s:%(password)s' % vars(self) +class PyPIConfig(configparser.RawConfigParser): + def __init__(self): + """ + Load from ~/.pypirc + """ + defaults = dict.fromkeys(['username', 'password', 'repository'], '') + configparser.RawConfigParser.__init__(self, defaults) + + rc = os.path.join(os.path.expanduser('~'), '.pypirc') + if os.path.exists(rc): + self.read(rc) + + @property + def creds_by_repository(self): + sections_with_repositories = [ + section for section in self.sections() + if self.get(section, 'repository').strip() + ] + + return dict(map(self._get_repo_cred, sections_with_repositories)) + + def _get_repo_cred(self, section): + repo = self.get(section, 'repository').strip() + return repo, Credential( + self.get(section, 'username').strip(), + self.get(section, 'password').strip(), + ) + def find_credential(self, url): + """ + If the URL indicated appears to be a repository defined in this + config, return the credential for that repository. + """ + for repository, cred in self.creds_by_repository.items(): + if url.startswith(repository): + return cred - - - -def open_with_auth(url): +def open_with_auth(url, opener=urllib.request.urlopen): """Open a urllib2 request, handling HTTP authentication""" - scheme, netloc, path, params, query, frag = urlparse.urlparse(url) + scheme, netloc, path, params, query, frag = urllib.parse.urlparse(url) + + # Double scheme does not raise on Mac OS X as revealed by a + # failing test. We would expect "nonnumeric port". Refs #20. + if netloc.endswith(':'): + raise http_client.InvalidURL("nonnumeric port: ''") if scheme in ('http', 'https'): - auth, host = urllib2.splituser(netloc) + auth, host = splituser(netloc) else: auth = None + if not auth: + cred = PyPIConfig().find_credential(url) + if cred: + auth = str(cred) + info = cred.username, url + log.info('Authenticating as %s for %s (from .pypirc)', *info) + if auth: - auth = "Basic " + urllib2.unquote(auth).encode('base64').strip() - new_url = urlparse.urlunparse((scheme,host,path,params,query,frag)) - request = urllib2.Request(new_url) + auth = "Basic " + _encode_auth(auth) + parts = scheme, host, path, params, query, frag + new_url = urllib.parse.urlunparse(parts) + request = urllib.request.Request(new_url) request.add_header("Authorization", auth) else: - request = urllib2.Request(url) + request = urllib.request.Request(url) request.add_header('User-Agent', user_agent) - fp = urllib2.urlopen(request) + fp = opener(request) if auth: # Put authentication info back into request URL if same host, # so that links found on the page will work - s2, h2, path2, param2, query2, frag2 = urlparse.urlparse(fp.url) - if s2==scheme and h2==host: - fp.url = urlparse.urlunparse((s2,netloc,path2,param2,query2,frag2)) + s2, h2, path2, param2, query2, frag2 = urllib.parse.urlparse(fp.url) + if s2 == scheme and h2 == host: + parts = s2, netloc, path2, param2, query2, frag2 + fp.url = urllib.parse.urlunparse(parts) return fp - - - - - - - - +# adding a timeout to avoid freezing package_index +open_with_auth = socket_timeout(_SOCKET_TIMEOUT)(open_with_auth) def fix_sf_url(url): - return url # backward compatibility + return url # backward compatibility + def local_open(url): """Read a local path, with special support for directories""" - scheme, server, path, param, query, frag = urlparse.urlparse(url) - filename = 
urllib2.url2pathname(path) + scheme, server, path, param, query, frag = urllib.parse.urlparse(url) + filename = urllib.request.url2pathname(path) if os.path.isfile(filename): - return urllib2.urlopen(url) + return urllib.request.urlopen(url) elif path.endswith('/') and os.path.isdir(filename): files = [] for f in os.listdir(filename): - if f=='index.html': - body = open(os.path.join(filename,f),'rb').read() + filepath = os.path.join(filename, f) + if f == 'index.html': + with open(filepath, 'r') as fp: + body = fp.read() break - elif os.path.isdir(os.path.join(filename,f)): - f+='/' - files.append("%s" % (f,f)) + elif os.path.isdir(filepath): + f += '/' + files.append('{name}'.format(name=f)) else: - body = ("%s" % url) + \ - "%s" % '\n'.join(files) + tmpl = ("{url}" + "{files}") + body = tmpl.format(url=url, files='\n'.join(files)) status, message = 200, "OK" else: status, message, body = 404, "Path not found", "Not found" - return urllib2.HTTPError(url, status, message, - {'content-type':'text/html'}, cStringIO.StringIO(body)) - - - - - - - - - - - - - -# this line is a kludge to keep the trailing blank lines for pje's editor + headers = {'content-type': 'text/html'} + body_stream = six.StringIO(body) + return urllib.error.HTTPError(url, status, message, headers, body_stream) diff --git a/setuptools/py26compat.py b/setuptools/py26compat.py new file mode 100644 index 0000000..4d3add8 --- /dev/null +++ b/setuptools/py26compat.py @@ -0,0 +1,31 @@ +""" +Compatibility Support for Python 2.6 and earlier +""" + +import sys + +try: + from urllib.parse import splittag +except ImportError: + from urllib import splittag + + +def strip_fragment(url): + """ + In `Python 8280 `_, Python 2.7 and + later was patched to disregard the fragment when making URL requests. + Do the same for Python 2.6 and earlier. + """ + url, fragment = splittag(url) + return url + + +if sys.version_info >= (2, 7): + strip_fragment = lambda x: x + +try: + from importlib import import_module +except ImportError: + + def import_module(module_name): + return __import__(module_name, fromlist=['__name__']) diff --git a/setuptools/py27compat.py b/setuptools/py27compat.py new file mode 100644 index 0000000..f0a80a8 --- /dev/null +++ b/setuptools/py27compat.py @@ -0,0 +1,28 @@ +""" +Compatibility Support for Python 2.7 and earlier +""" + +import sys +import platform + + +def get_all_headers(message, key): + """ + Given an HTTPMessage, return all headers matching a given key. 
+ """ + return message.get_all(key) + + +if sys.version_info < (3,): + + def get_all_headers(message, key): + return message.getheaders(key) + + +linux_py2_ascii = ( + platform.system() == 'Linux' and + sys.version_info < (3,) +) + +rmtree_safe = str if linux_py2_ascii else lambda x: x +"""Workaround for http://bugs.python.org/issue24672""" diff --git a/setuptools/py31compat.py b/setuptools/py31compat.py new file mode 100644 index 0000000..44b025d --- /dev/null +++ b/setuptools/py31compat.py @@ -0,0 +1,56 @@ +import sys +import unittest + +__all__ = ['get_config_vars', 'get_path'] + +try: + # Python 2.7 or >=3.2 + from sysconfig import get_config_vars, get_path +except ImportError: + from distutils.sysconfig import get_config_vars, get_python_lib + + def get_path(name): + if name not in ('platlib', 'purelib'): + raise ValueError("Name must be purelib or platlib") + return get_python_lib(name == 'platlib') + + +try: + # Python >=3.2 + from tempfile import TemporaryDirectory +except ImportError: + import shutil + import tempfile + + class TemporaryDirectory(object): + """ + Very simple temporary directory context manager. + Will try to delete afterward, but will also ignore OS and similar + errors on deletion. + """ + + def __init__(self): + self.name = None # Handle mkdtemp raising an exception + self.name = tempfile.mkdtemp() + + def __enter__(self): + return self.name + + def __exit__(self, exctype, excvalue, exctrace): + try: + shutil.rmtree(self.name, True) + except OSError: # removal errors are not the only possible + pass + self.name = None + + +unittest_main = unittest.main + +_PY31 = (3, 1) <= sys.version_info[:2] < (3, 2) +if _PY31: + # on Python 3.1, translate testRunner==None to TextTestRunner + # for compatibility with Python 2.6, 2.7, and 3.2+ + def unittest_main(*args, **kwargs): + if 'testRunner' in kwargs and kwargs['testRunner'] is None: + kwargs['testRunner'] = unittest.TextTestRunner + return unittest.main(*args, **kwargs) diff --git a/setuptools/py33compat.py b/setuptools/py33compat.py new file mode 100644 index 0000000..0caa200 --- /dev/null +++ b/setuptools/py33compat.py @@ -0,0 +1,45 @@ +import dis +import array +import collections + +import six + + +OpArg = collections.namedtuple('OpArg', 'opcode arg') + + +class Bytecode_compat(object): + def __init__(self, code): + self.code = code + + def __iter__(self): + """Yield '(op,arg)' pair for each operation in code object 'code'""" + + bytes = array.array('b', self.code.co_code) + eof = len(self.code.co_code) + + ptr = 0 + extended_arg = 0 + + while ptr < eof: + + op = bytes[ptr] + + if op >= dis.HAVE_ARGUMENT: + + arg = bytes[ptr + 1] + bytes[ptr + 2] * 256 + extended_arg + ptr += 3 + + if op == dis.EXTENDED_ARG: + long_type = six.integer_types[-1] + extended_arg = arg * long_type(65536) + continue + + else: + arg = None + ptr += 1 + + yield OpArg(op, arg) + + +Bytecode = getattr(dis, 'Bytecode', Bytecode_compat) diff --git a/setuptools/py36compat.py b/setuptools/py36compat.py new file mode 100644 index 0000000..f527969 --- /dev/null +++ b/setuptools/py36compat.py @@ -0,0 +1,82 @@ +import sys +from distutils.errors import DistutilsOptionError +from distutils.util import strtobool +from distutils.debug import DEBUG + + +class Distribution_parse_config_files: + """ + Mix-in providing forward-compatibility for functionality to be + included by default on Python 3.7. + + Do not edit the code in this class except to update functionality + as implemented in distutils. 
+ """ + def parse_config_files(self, filenames=None): + from configparser import ConfigParser + + # Ignore install directory options if we have a venv + if sys.prefix != sys.base_prefix: + ignore_options = [ + 'install-base', 'install-platbase', 'install-lib', + 'install-platlib', 'install-purelib', 'install-headers', + 'install-scripts', 'install-data', 'prefix', 'exec-prefix', + 'home', 'user', 'root'] + else: + ignore_options = [] + + ignore_options = frozenset(ignore_options) + + if filenames is None: + filenames = self.find_config_files() + + if DEBUG: + self.announce("Distribution.parse_config_files():") + + parser = ConfigParser(interpolation=None) + for filename in filenames: + if DEBUG: + self.announce(" reading %s" % filename) + parser.read(filename) + for section in parser.sections(): + options = parser.options(section) + opt_dict = self.get_option_dict(section) + + for opt in options: + if opt != '__name__' and opt not in ignore_options: + val = parser.get(section,opt) + opt = opt.replace('-', '_') + opt_dict[opt] = (filename, val) + + # Make the ConfigParser forget everything (so we retain + # the original filenames that options come from) + parser.__init__() + + # If there was a "global" section in the config file, use it + # to set Distribution options. + + if 'global' in self.command_options: + for (opt, (src, val)) in self.command_options['global'].items(): + alias = self.negative_opt.get(opt) + try: + if alias: + setattr(self, alias, not strtobool(val)) + elif opt in ('verbose', 'dry_run'): # ugh! + setattr(self, opt, strtobool(val)) + else: + setattr(self, opt, val) + except ValueError as msg: + raise DistutilsOptionError(msg) + + +if sys.version_info < (3,): + # Python 2 behavior is sufficient + class Distribution_parse_config_files: + pass + + +if False: + # When updated behavior is available upstream, + # disable override here. + class Distribution_parse_config_files: + pass diff --git a/setuptools/sandbox.py b/setuptools/sandbox.py index 00eb012..41c1c3b 100755 --- a/setuptools/sandbox.py +++ b/setuptools/sandbox.py @@ -1,8 +1,27 @@ -import os, sys, __builtin__, tempfile, operator, pkg_resources -_os = sys.modules[os.name] +import os +import sys +import tempfile +import operator +import functools +import itertools +import re +import contextlib +import pickle + +import six +from six.moves import builtins, map + +import pkg_resources + +if sys.platform.startswith('java'): + import org.python.modules.posix.PosixModule as _os +else: + _os = sys.modules[os.name] +try: + _file = file +except NameError: + _file = None _open = open -_file = file - from distutils.errors import DistutilsError from pkg_resources import working_set @@ -11,73 +30,233 @@ __all__ = [ ] +def _execfile(filename, globals, locals=None): + """ + Python 3 implementation of execfile. + """ + mode = 'rb' + with open(filename, mode) as stream: + script = stream.read() + # compile() function in Python 2.6 and 3.1 requires LF line endings. 
+ if sys.version_info[:2] < (2, 7) or sys.version_info[:2] >= (3, 0) and sys.version_info[:2] < (3, 2): + script = script.replace(b'\r\n', b'\n') + script = script.replace(b'\r', b'\n') + if locals is None: + locals = globals + code = compile(script, filename, 'exec') + exec(code, globals, locals) + + +@contextlib.contextmanager +def save_argv(repl=None): + saved = sys.argv[:] + if repl is not None: + sys.argv[:] = repl + try: + yield saved + finally: + sys.argv[:] = saved + + +@contextlib.contextmanager +def save_path(): + saved = sys.path[:] + try: + yield saved + finally: + sys.path[:] = saved + + +@contextlib.contextmanager +def override_temp(replacement): + """ + Monkey-patch tempfile.tempdir with replacement, ensuring it exists + """ + if not os.path.isdir(replacement): + os.makedirs(replacement) + saved = tempfile.tempdir + tempfile.tempdir = replacement + try: + yield + finally: + tempfile.tempdir = saved +@contextlib.contextmanager +def pushd(target): + saved = os.getcwd() + os.chdir(target) + try: + yield saved + finally: + os.chdir(saved) +class UnpickleableException(Exception): + """ + An exception representing another Exception that could not be pickled. + """ + @staticmethod + def dump(type, exc): + """ + Always return a dumped (pickled) type and exc. If exc can't be pickled, + wrap it in UnpickleableException first. + """ + try: + return pickle.dumps(type), pickle.dumps(exc) + except Exception: + # get UnpickleableException inside the sandbox + from setuptools.sandbox import UnpickleableException as cls + return cls.dump(cls, cls(repr(exc))) +class ExceptionSaver: + """ + A Context Manager that will save an exception, serialized, and restore it + later. + """ + def __enter__(self): + return self + def __exit__(self, type, exc, tb): + if not exc: + return + # dump the exception + self._saved = UnpickleableException.dump(type, exc) + self._tb = tb + # suppress the exception + return True + def resume(self): + "restore and re-raise any exception" + if '_saved' not in vars(self): + return + type, exc = map(pickle.loads, self._saved) + six.reraise(type, exc, self._tb) +@contextlib.contextmanager +def save_modules(): + """ + Context in which imported modules are saved. + Translates exceptions internal to the context into the equivalent exception + outside the context. + """ + saved = sys.modules.copy() + with ExceptionSaver() as saved_exc: + yield saved + sys.modules.update(saved) + # remove any modules imported since + del_modules = ( + mod_name for mod_name in sys.modules + if mod_name not in saved + # exclude any encodings modules. 
See #285 + and not mod_name.startswith('encodings.') + ) + _clear_modules(del_modules) + saved_exc.resume() +def _clear_modules(module_names): + for mod_name in list(module_names): + del sys.modules[mod_name] +@contextlib.contextmanager +def save_pkg_resources_state(): + saved = pkg_resources.__getstate__() + try: + yield saved + finally: + pkg_resources.__setstate__(saved) + + +@contextlib.contextmanager +def setup_context(setup_dir): + temp_dir = os.path.join(setup_dir, 'temp') + with save_pkg_resources_state(): + with save_modules(): + hide_setuptools() + with save_path(): + with save_argv(): + with override_temp(temp_dir): + with pushd(setup_dir): + # ensure setuptools commands are available + __import__('setuptools') + yield + + +def _needs_hiding(mod_name): + """ + >>> _needs_hiding('setuptools') + True + >>> _needs_hiding('pkg_resources') + True + >>> _needs_hiding('setuptools_plugin') + False + >>> _needs_hiding('setuptools.__init__') + True + >>> _needs_hiding('distutils') + True + >>> _needs_hiding('os') + False + >>> _needs_hiding('Cython') + True + """ + pattern = re.compile(r'(setuptools|pkg_resources|distutils|Cython)(\.|$)') + return bool(pattern.match(mod_name)) + + +def hide_setuptools(): + """ + Remove references to setuptools' modules from sys.modules to allow the + invocation to import the most appropriate setuptools. This technique is + necessary to avoid issues such as #315 where setuptools upgrading itself + would fail to find a function declared in the metadata. + """ + modules = filter(_needs_hiding, sys.modules) + _clear_modules(modules) def run_setup(setup_script, args): """Run a distutils setup script, sandboxed in its directory""" - old_dir = os.getcwd() - save_argv = sys.argv[:] - save_path = sys.path[:] setup_dir = os.path.abspath(os.path.dirname(setup_script)) - temp_dir = os.path.join(setup_dir,'temp') - if not os.path.isdir(temp_dir): os.makedirs(temp_dir) - save_tmp = tempfile.tempdir - save_modules = sys.modules.copy() - pr_state = pkg_resources.__getstate__() - try: - tempfile.tempdir = temp_dir; os.chdir(setup_dir) + with setup_context(setup_dir): try: - sys.argv[:] = [setup_script]+list(args) + sys.argv[:] = [setup_script] + list(args) sys.path.insert(0, setup_dir) # reset to include setup dir, w/clean callback list - working_set.__init__() - working_set.callbacks.append(lambda dist:dist.activate()) - DirectorySandbox(setup_dir).run( - lambda: execfile( - "setup.py", - {'__file__':setup_script, '__name__':'__main__'} - ) + working_set.__init__() + working_set.callbacks.append(lambda dist: dist.activate()) + + # __file__ should be a byte string on Python 2 (#712) + dunder_file = ( + setup_script + if isinstance(setup_script, str) else + setup_script.encode(sys.getfilesystemencoding()) ) - except SystemExit, v: + + def runner(): + ns = dict(__file__=dunder_file, __name__='__main__') + _execfile(setup_script, ns) + + DirectorySandbox(setup_dir).run(runner) + except SystemExit as v: if v.args and v.args[0]: raise # Normal exit, just return - finally: - pkg_resources.__setstate__(pr_state) - sys.modules.update(save_modules) - for key in list(sys.modules): - if key not in save_modules: del sys.modules[key] - os.chdir(old_dir) - sys.path[:] = save_path - sys.argv[:] = save_argv - tempfile.tempdir = save_tmp - class AbstractSandbox: @@ -88,100 +267,120 @@ class AbstractSandbox: def __init__(self): self._attrs = [ name for name in dir(_os) - if not name.startswith('_') and hasattr(self,name) + if not name.startswith('_') and hasattr(self, name) ] def 
_copy(self, source): for name in self._attrs: - setattr(os, name, getattr(source,name)) + setattr(os, name, getattr(source, name)) def run(self, func): """Run 'func' under os sandboxing""" try: self._copy(self) - __builtin__.file = self._file - __builtin__.open = self._open + if _file: + builtins.file = self._file + builtins.open = self._open self._active = True return func() finally: self._active = False - __builtin__.open = _file - __builtin__.file = _open + if _file: + builtins.file = _file + builtins.open = _open self._copy(_os) def _mk_dual_path_wrapper(name): - original = getattr(_os,name) - def wrap(self,src,dst,*args,**kw): + original = getattr(_os, name) + + def wrap(self, src, dst, *args, **kw): if self._active: - src,dst = self._remap_pair(name,src,dst,*args,**kw) - return original(src,dst,*args,**kw) + src, dst = self._remap_pair(name, src, dst, *args, **kw) + return original(src, dst, *args, **kw) + return wrap for name in ["rename", "link", "symlink"]: - if hasattr(_os,name): locals()[name] = _mk_dual_path_wrapper(name) - + if hasattr(_os, name): + locals()[name] = _mk_dual_path_wrapper(name) def _mk_single_path_wrapper(name, original=None): - original = original or getattr(_os,name) - def wrap(self,path,*args,**kw): + original = original or getattr(_os, name) + + def wrap(self, path, *args, **kw): if self._active: - path = self._remap_input(name,path,*args,**kw) - return original(path,*args,**kw) + path = self._remap_input(name, path, *args, **kw) + return original(path, *args, **kw) + return wrap + if _file: + _file = _mk_single_path_wrapper('file', _file) _open = _mk_single_path_wrapper('open', _open) - _file = _mk_single_path_wrapper('file', _file) for name in [ "stat", "listdir", "chdir", "open", "chmod", "chown", "mkdir", "remove", "unlink", "rmdir", "utime", "lchown", "chroot", "lstat", "startfile", "mkfifo", "mknod", "pathconf", "access" ]: - if hasattr(_os,name): locals()[name] = _mk_single_path_wrapper(name) + if hasattr(_os, name): + locals()[name] = _mk_single_path_wrapper(name) def _mk_single_with_return(name): - original = getattr(_os,name) - def wrap(self,path,*args,**kw): + original = getattr(_os, name) + + def wrap(self, path, *args, **kw): if self._active: - path = self._remap_input(name,path,*args,**kw) - return self._remap_output(name, original(path,*args,**kw)) - return original(path,*args,**kw) + path = self._remap_input(name, path, *args, **kw) + return self._remap_output(name, original(path, *args, **kw)) + return original(path, *args, **kw) + return wrap for name in ['readlink', 'tempnam']: - if hasattr(_os,name): locals()[name] = _mk_single_with_return(name) + if hasattr(_os, name): + locals()[name] = _mk_single_with_return(name) def _mk_query(name): - original = getattr(_os,name) - def wrap(self,*args,**kw): - retval = original(*args,**kw) + original = getattr(_os, name) + + def wrap(self, *args, **kw): + retval = original(*args, **kw) if self._active: return self._remap_output(name, retval) return retval + return wrap for name in ['getcwd', 'tmpnam']: - if hasattr(_os,name): locals()[name] = _mk_query(name) + if hasattr(_os, name): + locals()[name] = _mk_query(name) - def _validate_path(self,path): + def _validate_path(self, path): """Called to remap or validate any path, whether input or output""" return path - def _remap_input(self,operation,path,*args,**kw): + def _remap_input(self, operation, path, *args, **kw): """Called for path inputs""" return self._validate_path(path) - def _remap_output(self,operation,path): + def _remap_output(self, 
operation, path): """Called for path outputs""" return self._validate_path(path) - def _remap_pair(self,operation,src,dst,*args,**kw): + def _remap_pair(self, operation, src, dst, *args, **kw): """Called for path pairs like rename, link, and symlink operations""" return ( - self._remap_input(operation+'-from',src,*args,**kw), - self._remap_input(operation+'-to',dst,*args,**kw) + self._remap_input(operation + '-from', src, *args, **kw), + self._remap_input(operation + '-to', dst, *args, **kw) ) +if hasattr(os, 'devnull'): + _EXCEPTIONS = [os.devnull,] +else: + _EXCEPTIONS = [] + + class DirectorySandbox(AbstractSandbox): """Restrict operations to a single subdirectory - pseudo-chroot""" @@ -190,60 +389,90 @@ class DirectorySandbox(AbstractSandbox): "utime", "lchown", "chroot", "mkfifo", "mknod", "tempnam", ]) - def __init__(self,sandbox): + _exception_patterns = [ + # Allow lib2to3 to attempt to save a pickled grammar object (#121) + r'.*lib2to3.*\.pickle$', + ] + "exempt writing to paths that match the pattern" + + def __init__(self, sandbox, exceptions=_EXCEPTIONS): self._sandbox = os.path.normcase(os.path.realpath(sandbox)) - self._prefix = os.path.join(self._sandbox,'') + self._prefix = os.path.join(self._sandbox, '') + self._exceptions = [ + os.path.normcase(os.path.realpath(path)) + for path in exceptions + ] AbstractSandbox.__init__(self) def _violation(self, operation, *args, **kw): + from setuptools.sandbox import SandboxViolation raise SandboxViolation(operation, args, kw) + if _file: + + def _file(self, path, mode='r', *args, **kw): + if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path): + self._violation("file", path, mode, *args, **kw) + return _file(path, mode, *args, **kw) + def _open(self, path, mode='r', *args, **kw): if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path): self._violation("open", path, mode, *args, **kw) - return _open(path,mode,*args,**kw) + return _open(path, mode, *args, **kw) def tmpnam(self): self._violation("tmpnam") - def _ok(self,path): + def _ok(self, path): active = self._active try: self._active = False realpath = os.path.normcase(os.path.realpath(path)) - if realpath==self._sandbox or realpath.startswith(self._prefix): - return True + return ( + self._exempted(realpath) + or realpath == self._sandbox + or realpath.startswith(self._prefix) + ) finally: self._active = active - def _remap_input(self,operation,path,*args,**kw): + def _exempted(self, filepath): + start_matches = ( + filepath.startswith(exception) + for exception in self._exceptions + ) + pattern_matches = ( + re.match(pattern, filepath) + for pattern in self._exception_patterns + ) + candidates = itertools.chain(start_matches, pattern_matches) + return any(candidates) + + def _remap_input(self, operation, path, *args, **kw): """Called for path inputs""" if operation in self.write_ops and not self._ok(path): self._violation(operation, os.path.realpath(path), *args, **kw) return path - def _remap_pair(self,operation,src,dst,*args,**kw): + def _remap_pair(self, operation, src, dst, *args, **kw): """Called for path pairs like rename, link, and symlink operations""" if not self._ok(src) or not self._ok(dst): self._violation(operation, src, dst, *args, **kw) - return (src,dst) + return (src, dst) - def _file(self, path, mode='r', *args, **kw): - if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path): - self._violation("file", path, mode, *args, **kw) - return _file(path,mode,*args,**kw) - - def open(self, file, flags, mode=0777): + def open(self, file, 
flags, mode=0o777, *args, **kw): """Called for low-level os.open()""" if flags & WRITE_FLAGS and not self._ok(file): - self._violation("os.open", file, flags, mode) - return _os.open(file,flags,mode) + self._violation("os.open", file, flags, mode, *args, **kw) + return _os.open(file, flags, mode, *args, **kw) -WRITE_FLAGS = reduce( + +WRITE_FLAGS = functools.reduce( operator.or_, [getattr(_os, a, 0) for a in "O_WRONLY O_RDWR O_APPEND O_CREAT O_TRUNC O_TEMPORARY".split()] ) + class SandboxViolation(DistutilsError): """A setup script attempted to modify the filesystem outside the sandbox""" @@ -259,29 +488,4 @@ script by hand. Please inform the package's author and the EasyInstall maintainers to find out if a fix or workaround is available.""" % self.args - - - - - - - - - - - - - - - - - - - - - - - - - # diff --git a/setuptools/script (dev).tmpl b/setuptools/script (dev).tmpl new file mode 100644 index 0000000..d58b1bb --- /dev/null +++ b/setuptools/script (dev).tmpl @@ -0,0 +1,5 @@ +# EASY-INSTALL-DEV-SCRIPT: %(spec)r,%(script_name)r +__requires__ = %(spec)r +__import__('pkg_resources').require(%(spec)r) +__file__ = %(dev_path)r +exec(compile(open(__file__).read(), __file__, 'exec')) diff --git a/setuptools/script.tmpl b/setuptools/script.tmpl new file mode 100644 index 0000000..ff5efbc --- /dev/null +++ b/setuptools/script.tmpl @@ -0,0 +1,3 @@ +# EASY-INSTALL-SCRIPT: %(spec)r,%(script_name)r +__requires__ = %(spec)r +__import__('pkg_resources').run_script(%(spec)r, %(script_name)r) diff --git a/setuptools/site-patch.py b/setuptools/site-patch.py new file mode 100644 index 0000000..0d2d2ff --- /dev/null +++ b/setuptools/site-patch.py @@ -0,0 +1,74 @@ +def __boot(): + import sys + import os + PYTHONPATH = os.environ.get('PYTHONPATH') + if PYTHONPATH is None or (sys.platform == 'win32' and not PYTHONPATH): + PYTHONPATH = [] + else: + PYTHONPATH = PYTHONPATH.split(os.pathsep) + + pic = getattr(sys, 'path_importer_cache', {}) + stdpath = sys.path[len(PYTHONPATH):] + mydir = os.path.dirname(__file__) + + for item in stdpath: + if item == mydir or not item: + continue # skip if current dir. 
on Windows, or my own directory + importer = pic.get(item) + if importer is not None: + loader = importer.find_module('site') + if loader is not None: + # This should actually reload the current module + loader.load_module('site') + break + else: + try: + import imp # Avoid import loop in Python >= 3.3 + stream, path, descr = imp.find_module('site', [item]) + except ImportError: + continue + if stream is None: + continue + try: + # This should actually reload the current module + imp.load_module('site', stream, path, descr) + finally: + stream.close() + break + else: + raise ImportError("Couldn't find the real 'site' module") + + known_paths = dict([(makepath(item)[1], 1) for item in sys.path]) # 2.2 comp + + oldpos = getattr(sys, '__egginsert', 0) # save old insertion position + sys.__egginsert = 0 # and reset the current one + + for item in PYTHONPATH: + addsitedir(item) + + sys.__egginsert += oldpos # restore effective old position + + d, nd = makepath(stdpath[0]) + insert_at = None + new_path = [] + + for item in sys.path: + p, np = makepath(item) + + if np == nd and insert_at is None: + # We've hit the first 'system' path entry, so added entries go here + insert_at = len(new_path) + + if np in known_paths or insert_at is None: + new_path.append(item) + else: + # new path after the insert point, back-insert it + new_path.insert(insert_at, item) + insert_at += 1 + + sys.path[:] = new_path + + +if __name__ == 'site': + __boot() + del __boot diff --git a/setuptools/ssl_support.py b/setuptools/ssl_support.py new file mode 100644 index 0000000..fa5e442 --- /dev/null +++ b/setuptools/ssl_support.py @@ -0,0 +1,255 @@ +import os +import socket +import atexit +import re +import functools + +from six.moves import urllib, http_client, map, filter + +from pkg_resources import ResolutionError, ExtractionError + +try: + import ssl +except ImportError: + ssl = None + +__all__ = [ + 'VerifyingHTTPSHandler', 'find_ca_bundle', 'is_available', 'cert_paths', + 'opener_for' +] + +cert_paths = """ +/etc/pki/tls/certs/ca-bundle.crt +/etc/ssl/certs/ca-certificates.crt +/usr/share/ssl/certs/ca-bundle.crt +/usr/local/share/certs/ca-root.crt +/etc/ssl/cert.pem +/System/Library/OpenSSL/certs/cert.pem +/usr/local/share/certs/ca-root-nss.crt +/etc/ssl/ca-bundle.pem +""".strip().split() + +try: + HTTPSHandler = urllib.request.HTTPSHandler + HTTPSConnection = http_client.HTTPSConnection +except AttributeError: + HTTPSHandler = HTTPSConnection = object + +is_available = ssl is not None and object not in (HTTPSHandler, HTTPSConnection) + + +try: + from ssl import CertificateError, match_hostname +except ImportError: + try: + from backports.ssl_match_hostname import CertificateError + from backports.ssl_match_hostname import match_hostname + except ImportError: + CertificateError = None + match_hostname = None + +if not CertificateError: + + class CertificateError(ValueError): + pass + + +if not match_hostname: + + def _dnsname_match(dn, hostname, max_wildcards=1): + """Matching according to RFC 6125, section 6.4.3 + + http://tools.ietf.org/html/rfc6125#section-6.4.3 + """ + pats = [] + if not dn: + return False + + # Ported from python3-syntax: + # leftmost, *remainder = dn.split(r'.') + parts = dn.split(r'.') + leftmost = parts[0] + remainder = parts[1:] + + wildcards = leftmost.count('*') + if wildcards > max_wildcards: + # Issue #17980: avoid denials of service by refusing more + # than one wildcard per fragment. A survey of established + # policy among SSL implementations showed it to be a + # reasonable choice. 
+ raise CertificateError( + "too many wildcards in certificate DNS name: " + repr(dn)) + + # speed up common case w/o wildcards + if not wildcards: + return dn.lower() == hostname.lower() + + # RFC 6125, section 6.4.3, subitem 1. + # The client SHOULD NOT attempt to match a presented identifier in which + # the wildcard character comprises a label other than the left-most label. + if leftmost == '*': + # When '*' is a fragment by itself, it matches a non-empty dotless + # fragment. + pats.append('[^.]+') + elif leftmost.startswith('xn--') or hostname.startswith('xn--'): + # RFC 6125, section 6.4.3, subitem 3. + # The client SHOULD NOT attempt to match a presented identifier + # where the wildcard character is embedded within an A-label or + # U-label of an internationalized domain name. + pats.append(re.escape(leftmost)) + else: + # Otherwise, '*' matches any dotless string, e.g. www* + pats.append(re.escape(leftmost).replace(r'\*', '[^.]*')) + + # add the remaining fragments, ignore any wildcards + for frag in remainder: + pats.append(re.escape(frag)) + + pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE) + return pat.match(hostname) + + def match_hostname(cert, hostname): + """Verify that *cert* (in decoded format as returned by + SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 + rules are followed, but IP addresses are not accepted for *hostname*. + + CertificateError is raised on failure. On success, the function + returns nothing. + """ + if not cert: + raise ValueError("empty or no certificate") + dnsnames = [] + san = cert.get('subjectAltName', ()) + for key, value in san: + if key == 'DNS': + if _dnsname_match(value, hostname): + return + dnsnames.append(value) + if not dnsnames: + # The subject is only checked when there is no dNSName entry + # in subjectAltName + for sub in cert.get('subject', ()): + for key, value in sub: + # XXX according to RFC 2818, the most specific Common Name + # must be used. + if key == 'commonName': + if _dnsname_match(value, hostname): + return + dnsnames.append(value) + if len(dnsnames) > 1: + raise CertificateError("hostname %r " + "doesn't match either of %s" + % (hostname, ', '.join(map(repr, dnsnames)))) + elif len(dnsnames) == 1: + raise CertificateError("hostname %r " + "doesn't match %r" + % (hostname, dnsnames[0])) + else: + raise CertificateError("no appropriate commonName or " + "subjectAltName fields were found") + + +class VerifyingHTTPSHandler(HTTPSHandler): + """Simple verifying handler: no auth, subclasses, timeouts, etc.""" + + def __init__(self, ca_bundle): + self.ca_bundle = ca_bundle + HTTPSHandler.__init__(self) + + def https_open(self, req): + return self.do_open( + lambda host, **kw: VerifyingHTTPSConn(host, self.ca_bundle, **kw), req + ) + + +class VerifyingHTTPSConn(HTTPSConnection): + """Simple verifying connection: no auth, subclasses, timeouts, etc.""" + + def __init__(self, host, ca_bundle, **kw): + HTTPSConnection.__init__(self, host, **kw) + self.ca_bundle = ca_bundle + + def connect(self): + sock = socket.create_connection( + (self.host, self.port), getattr(self, 'source_address', None) + ) + + # Handle the socket if a (proxy) tunnel is present + if hasattr(self, '_tunnel') and getattr(self, '_tunnel_host', None): + self.sock = sock + self._tunnel() + # http://bugs.python.org/issue7776: Python>=3.4.1 and >=2.7.7 + # change self.host to mean the proxy server host when tunneling is + # being used. 
Adapt, since we are interested in the destination + # host for the match_hostname() comparison. + actual_host = self._tunnel_host + else: + actual_host = self.host + + self.sock = ssl.wrap_socket( + sock, cert_reqs=ssl.CERT_REQUIRED, ca_certs=self.ca_bundle + ) + try: + match_hostname(self.sock.getpeercert(), actual_host) + except CertificateError: + self.sock.shutdown(socket.SHUT_RDWR) + self.sock.close() + raise + + +def opener_for(ca_bundle=None): + """Get a urlopen() replacement that uses ca_bundle for verification""" + return urllib.request.build_opener( + VerifyingHTTPSHandler(ca_bundle or find_ca_bundle()) + ).open + + +# from jaraco.functools +def once(func): + @functools.wraps(func) + def wrapper(*args, **kwargs): + if not hasattr(func, 'always_returns'): + func.always_returns = func(*args, **kwargs) + return func.always_returns + return wrapper + + +@once +def get_win_certfile(): + try: + import wincertstore + except ImportError: + return None + + class CertFile(wincertstore.CertFile): + def __init__(self): + super(CertFile, self).__init__() + atexit.register(self.close) + + def close(self): + try: + super(CertFile, self).close() + except OSError: + pass + + _wincerts = CertFile() + _wincerts.addstore('CA') + _wincerts.addstore('ROOT') + return _wincerts.name + + +def find_ca_bundle(): + """Return an existing CA bundle path, or None""" + extant_cert_paths = filter(os.path.isfile, cert_paths) + return ( + get_win_certfile() + or next(extant_cert_paths, None) + or _certifi_where() + ) + + +def _certifi_where(): + try: + return __import__('certifi').where() + except (ImportError, ResolutionError, ExtractionError): + pass diff --git a/setuptools/tests/__init__.py b/setuptools/tests/__init__.py index 287bc24..f54c478 100644 --- a/setuptools/tests/__init__.py +++ b/setuptools/tests/__init__.py @@ -1,28 +1,25 @@ """Tests for the 'setuptools' package""" -from unittest import TestSuite, TestCase, makeSuite, defaultTestLoader -import distutils.core, distutils.cmd +import sys +import os +import distutils.core +import distutils.cmd from distutils.errors import DistutilsOptionError, DistutilsPlatformError from distutils.errors import DistutilsSetupError -import setuptools, setuptools.dist -from setuptools import Feature from distutils.core import Extension -extract_constant, get_module_constant = None, None -from setuptools.depends import * -from distutils.version import StrictVersion, LooseVersion -from distutils.util import convert_path -import sys, os.path - -def additional_tests(): - import doctest, unittest - suite = unittest.TestSuite(( - doctest.DocFileSuite( - 'api_tests.txt', - optionflags=doctest.ELLIPSIS, package='pkg_resources', - ), - )) - if sys.platform == 'win32': - suite.addTest(doctest.DocFileSuite('win_script_wrapper.txt')) - return suite +from distutils.version import LooseVersion + +import six +import pytest + +import setuptools.dist +import setuptools.depends as dep +from setuptools import Feature +from setuptools.depends import Require + +c_type = os.environ.get("LC_CTYPE", os.environ.get("LC_ALL")) +is_ascii = c_type in ("C", "POSIX") +fail_on_ascii = pytest.mark.xfail(is_ascii, reason="Test fails in this locale") + def makeSetup(**args): """Return distribution from 'setup(**args)', without executing commands""" @@ -30,157 +27,150 @@ def makeSetup(**args): distutils.core._setup_stop_after = "commandline" # Don't let system command line leak into tests! 
- args.setdefault('script_args',['install']) + args.setdefault('script_args', ['install']) try: return setuptools.setup(**args) finally: - distutils.core_setup_stop_after = None + distutils.core._setup_stop_after = None +needs_bytecode = pytest.mark.skipif( + not hasattr(dep, 'get_module_constant'), + reason="bytecode support not available", +) -class DependsTests(TestCase): - +class TestDepends: def testExtractConst(self): - if not extract_constant: return # skip on non-bytecode platforms + if not hasattr(dep, 'extract_constant'): + # skip on non-bytecode platforms + return def f1(): - global x,y,z + global x, y, z x = "test" y = z + fc = six.get_function_code(f1) + # unrecognized name - self.assertEqual(extract_constant(f1.func_code,'q', -1), None) + assert dep.extract_constant(fc, 'q', -1) is None # constant assigned - self.assertEqual(extract_constant(f1.func_code,'x', -1), "test") + dep.extract_constant(fc, 'x', -1) == "test" # expression assigned - self.assertEqual(extract_constant(f1.func_code,'y', -1), -1) + dep.extract_constant(fc, 'y', -1) == -1 # recognized name, not assigned - self.assertEqual(extract_constant(f1.func_code,'z', -1), None) - + dep.extract_constant(fc, 'z', -1) is None def testFindModule(self): - self.assertRaises(ImportError, find_module, 'no-such.-thing') - self.assertRaises(ImportError, find_module, 'setuptools.non-existent') - f,p,i = find_module('setuptools.tests'); f.close() - + with pytest.raises(ImportError): + dep.find_module('no-such.-thing') + with pytest.raises(ImportError): + dep.find_module('setuptools.non-existent') + f, p, i = dep.find_module('setuptools.tests') + f.close() + + @needs_bytecode def testModuleExtract(self): - if not get_module_constant: return # skip on non-bytecode platforms - from distutils import __version__ - self.assertEqual( - get_module_constant('distutils','__version__'), __version__ - ) - self.assertEqual( - get_module_constant('sys','version'), sys.version - ) - self.assertEqual( - get_module_constant('setuptools.tests','__doc__'),__doc__ - ) + from email import __version__ + assert dep.get_module_constant('email', '__version__') == __version__ + assert dep.get_module_constant('sys', 'version') == sys.version + assert dep.get_module_constant('setuptools.tests', '__doc__') == __doc__ + @needs_bytecode def testRequire(self): - if not extract_constant: return # skip on non-bytecode platforms - - req = Require('Distutils','1.0.3','distutils') + req = Require('Email', '1.0.3', 'email') - self.assertEqual(req.name, 'Distutils') - self.assertEqual(req.module, 'distutils') - self.assertEqual(req.requested_version, '1.0.3') - self.assertEqual(req.attribute, '__version__') - self.assertEqual(req.full_name(), 'Distutils-1.0.3') + assert req.name == 'Email' + assert req.module == 'email' + assert req.requested_version == '1.0.3' + assert req.attribute == '__version__' + assert req.full_name() == 'Email-1.0.3' - from distutils import __version__ - self.assertEqual(req.get_version(), __version__) - self.failUnless(req.version_ok('1.0.9')) - self.failIf(req.version_ok('0.9.1')) - self.failIf(req.version_ok('unknown')) + from email import __version__ + assert req.get_version() == __version__ + assert req.version_ok('1.0.9') + assert not req.version_ok('0.9.1') + assert not req.version_ok('unknown') - self.failUnless(req.is_present()) - self.failUnless(req.is_current()) + assert req.is_present() + assert req.is_current() - req = Require('Distutils 3000','03000','distutils',format=LooseVersion) - self.failUnless(req.is_present()) - 
self.failIf(req.is_current()) - self.failIf(req.version_ok('unknown')) + req = Require('Email 3000', '03000', 'email', format=LooseVersion) + assert req.is_present() + assert not req.is_current() + assert not req.version_ok('unknown') - req = Require('Do-what-I-mean','1.0','d-w-i-m') - self.failIf(req.is_present()) - self.failIf(req.is_current()) + req = Require('Do-what-I-mean', '1.0', 'd-w-i-m') + assert not req.is_present() + assert not req.is_current() req = Require('Tests', None, 'tests', homepage="http://example.com") - self.assertEqual(req.format, None) - self.assertEqual(req.attribute, None) - self.assertEqual(req.requested_version, None) - self.assertEqual(req.full_name(), 'Tests') - self.assertEqual(req.homepage, 'http://example.com') + assert req.format is None + assert req.attribute is None + assert req.requested_version is None + assert req.full_name() == 'Tests' + assert req.homepage == 'http://example.com' paths = [os.path.dirname(p) for p in __path__] - self.failUnless(req.is_present(paths)) - self.failUnless(req.is_current(paths)) + assert req.is_present(paths) + assert req.is_current(paths) -class DistroTests(TestCase): - - def setUp(self): - self.e1 = Extension('bar.ext',['bar.c']) +class TestDistro: + def setup_method(self, method): + self.e1 = Extension('bar.ext', ['bar.c']) self.e2 = Extension('c.y', ['y.c']) self.dist = makeSetup( packages=['a', 'a.b', 'a.b.c', 'b', 'c'], - py_modules=['b.d','x'], - ext_modules = (self.e1, self.e2), - package_dir = {}, + py_modules=['b.d', 'x'], + ext_modules=(self.e1, self.e2), + package_dir={}, ) - def testDistroType(self): - self.failUnless(isinstance(self.dist,setuptools.dist.Distribution)) - + assert isinstance(self.dist, setuptools.dist.Distribution) def testExcludePackage(self): self.dist.exclude_package('a') - self.assertEqual(self.dist.packages, ['b','c']) + assert self.dist.packages == ['b', 'c'] self.dist.exclude_package('b') - self.assertEqual(self.dist.packages, ['c']) - self.assertEqual(self.dist.py_modules, ['x']) - self.assertEqual(self.dist.ext_modules, [self.e1, self.e2]) + assert self.dist.packages == ['c'] + assert self.dist.py_modules == ['x'] + assert self.dist.ext_modules == [self.e1, self.e2] self.dist.exclude_package('c') - self.assertEqual(self.dist.packages, []) - self.assertEqual(self.dist.py_modules, ['x']) - self.assertEqual(self.dist.ext_modules, [self.e1]) + assert self.dist.packages == [] + assert self.dist.py_modules == ['x'] + assert self.dist.ext_modules == [self.e1] # test removals from unspecified options makeSetup().exclude_package('x') - - - - - - def testIncludeExclude(self): # remove an extension self.dist.exclude(ext_modules=[self.e1]) - self.assertEqual(self.dist.ext_modules, [self.e2]) + assert self.dist.ext_modules == [self.e2] # add it back in self.dist.include(ext_modules=[self.e1]) - self.assertEqual(self.dist.ext_modules, [self.e2, self.e1]) + assert self.dist.ext_modules == [self.e2, self.e1] # should not add duplicate self.dist.include(ext_modules=[self.e1]) - self.assertEqual(self.dist.ext_modules, [self.e2, self.e1]) + assert self.dist.ext_modules == [self.e2, self.e1] def testExcludePackages(self): - self.dist.exclude(packages=['c','b','a']) - self.assertEqual(self.dist.packages, []) - self.assertEqual(self.dist.py_modules, ['x']) - self.assertEqual(self.dist.ext_modules, [self.e1]) + self.dist.exclude(packages=['c', 'b', 'a']) + assert self.dist.packages == [] + assert self.dist.py_modules == ['x'] + assert self.dist.ext_modules == [self.e1] def testEmpty(self): dist = 
makeSetup() @@ -189,182 +179,148 @@ class DistroTests(TestCase): dist.exclude(packages=['a'], py_modules=['b'], ext_modules=[self.e2]) def testContents(self): - self.failUnless(self.dist.has_contents_for('a')) + assert self.dist.has_contents_for('a') self.dist.exclude_package('a') - self.failIf(self.dist.has_contents_for('a')) + assert not self.dist.has_contents_for('a') - self.failUnless(self.dist.has_contents_for('b')) + assert self.dist.has_contents_for('b') self.dist.exclude_package('b') - self.failIf(self.dist.has_contents_for('b')) + assert not self.dist.has_contents_for('b') - self.failUnless(self.dist.has_contents_for('c')) + assert self.dist.has_contents_for('c') self.dist.exclude_package('c') - self.failIf(self.dist.has_contents_for('c')) - - - + assert not self.dist.has_contents_for('c') def testInvalidIncludeExclude(self): - self.assertRaises(DistutilsSetupError, - self.dist.include, nonexistent_option='x' - ) - self.assertRaises(DistutilsSetupError, - self.dist.exclude, nonexistent_option='x' - ) - self.assertRaises(DistutilsSetupError, - self.dist.include, packages={'x':'y'} - ) - self.assertRaises(DistutilsSetupError, - self.dist.exclude, packages={'x':'y'} - ) - self.assertRaises(DistutilsSetupError, - self.dist.include, ext_modules={'x':'y'} - ) - self.assertRaises(DistutilsSetupError, - self.dist.exclude, ext_modules={'x':'y'} - ) - - self.assertRaises(DistutilsSetupError, - self.dist.include, package_dir=['q'] - ) - self.assertRaises(DistutilsSetupError, - self.dist.exclude, package_dir=['q'] - ) - - - - - - - - - - - - - - - -class FeatureTests(TestCase): - - def setUp(self): - self.req = Require('Distutils','1.0.3','distutils') + with pytest.raises(DistutilsSetupError): + self.dist.include(nonexistent_option='x') + with pytest.raises(DistutilsSetupError): + self.dist.exclude(nonexistent_option='x') + with pytest.raises(DistutilsSetupError): + self.dist.include(packages={'x': 'y'}) + with pytest.raises(DistutilsSetupError): + self.dist.exclude(packages={'x': 'y'}) + with pytest.raises(DistutilsSetupError): + self.dist.include(ext_modules={'x': 'y'}) + with pytest.raises(DistutilsSetupError): + self.dist.exclude(ext_modules={'x': 'y'}) + + with pytest.raises(DistutilsSetupError): + self.dist.include(package_dir=['q']) + with pytest.raises(DistutilsSetupError): + self.dist.exclude(package_dir=['q']) + + +class TestFeatures: + def setup_method(self, method): + self.req = Require('Distutils', '1.0.3', 'distutils') self.dist = makeSetup( features={ - 'foo': Feature("foo",standard=True,require_features=['baz',self.req]), - 'bar': Feature("bar", standard=True, packages=['pkg.bar'], + 'foo': Feature("foo", standard=True, require_features=['baz', self.req]), + 'bar': Feature("bar", standard=True, packages=['pkg.bar'], py_modules=['bar_et'], remove=['bar.ext'], - ), + ), 'baz': Feature( "baz", optional=False, packages=['pkg.baz'], - scripts = ['scripts/baz_it'], - libraries=[('libfoo','foo/foofoo.c')] + scripts=['scripts/baz_it'], + libraries=[('libfoo', 'foo/foofoo.c')] ), 'dwim': Feature("DWIM", available=False, remove='bazish'), }, script_args=['--without-bar', 'install'], - packages = ['pkg.bar', 'pkg.foo'], - py_modules = ['bar_et', 'bazish'], - ext_modules = [Extension('bar.ext',['bar.c'])] + packages=['pkg.bar', 'pkg.foo'], + py_modules=['bar_et', 'bazish'], + ext_modules=[Extension('bar.ext', ['bar.c'])] ) def testDefaults(self): - self.failIf( - Feature( - "test",standard=True,remove='x',available=False - ).include_by_default() - ) - self.failUnless( - 
Feature("test",standard=True,remove='x').include_by_default() - ) + assert not Feature( + "test", standard=True, remove='x', available=False + ).include_by_default() + assert Feature("test", standard=True, remove='x').include_by_default() # Feature must have either kwargs, removes, or require_features - self.assertRaises(DistutilsSetupError, Feature, "test") + with pytest.raises(DistutilsSetupError): + Feature("test") def testAvailability(self): - self.assertRaises( - DistutilsPlatformError, - self.dist.features['dwim'].include_in, self.dist - ) + with pytest.raises(DistutilsPlatformError): + self.dist.features['dwim'].include_in(self.dist) def testFeatureOptions(self): dist = self.dist - self.failUnless( - ('with-dwim',None,'include DWIM') in dist.feature_options + assert ( + ('with-dwim', None, 'include DWIM') in dist.feature_options ) - self.failUnless( - ('without-dwim',None,'exclude DWIM (default)') in dist.feature_options + assert ( + ('without-dwim', None, 'exclude DWIM (default)') in dist.feature_options ) - self.failUnless( - ('with-bar',None,'include bar (default)') in dist.feature_options + assert ( + ('with-bar', None, 'include bar (default)') in dist.feature_options ) - self.failUnless( - ('without-bar',None,'exclude bar') in dist.feature_options + assert ( + ('without-bar', None, 'exclude bar') in dist.feature_options ) - self.assertEqual(dist.feature_negopt['without-foo'],'with-foo') - self.assertEqual(dist.feature_negopt['without-bar'],'with-bar') - self.assertEqual(dist.feature_negopt['without-dwim'],'with-dwim') - self.failIf('without-baz' in dist.feature_negopt) + assert dist.feature_negopt['without-foo'] == 'with-foo' + assert dist.feature_negopt['without-bar'] == 'with-bar' + assert dist.feature_negopt['without-dwim'] == 'with-dwim' + assert ('without-baz' not in dist.feature_negopt) def testUseFeatures(self): dist = self.dist - self.assertEqual(dist.with_foo,1) - self.assertEqual(dist.with_bar,0) - self.assertEqual(dist.with_baz,1) - self.failIf('bar_et' in dist.py_modules) - self.failIf('pkg.bar' in dist.packages) - self.failUnless('pkg.baz' in dist.packages) - self.failUnless('scripts/baz_it' in dist.scripts) - self.failUnless(('libfoo','foo/foofoo.c') in dist.libraries) - self.assertEqual(dist.ext_modules,[]) - self.assertEqual(dist.require_features, [self.req]) + assert dist.with_foo == 1 + assert dist.with_bar == 0 + assert dist.with_baz == 1 + assert ('bar_et' not in dist.py_modules) + assert ('pkg.bar' not in dist.packages) + assert ('pkg.baz' in dist.packages) + assert ('scripts/baz_it' in dist.scripts) + assert (('libfoo', 'foo/foofoo.c') in dist.libraries) + assert dist.ext_modules == [] + assert dist.require_features == [self.req] # If we ask for bar, it should fail because we explicitly disabled # it on the command line - self.assertRaises(DistutilsOptionError, dist.include_feature, 'bar') + with pytest.raises(DistutilsOptionError): + dist.include_feature('bar') def testFeatureWithInvalidRemove(self): - self.assertRaises( - SystemExit, makeSetup, features = {'x':Feature('x', remove='y')} - ) + with pytest.raises(SystemExit): + makeSetup(features={'x': Feature('x', remove='y')}) -class TestCommandTests(TestCase): +class TestCommandTests: def testTestIsCommand(self): test_cmd = makeSetup().get_command_obj('test') - self.failUnless(isinstance(test_cmd, distutils.cmd.Command)) + assert (isinstance(test_cmd, distutils.cmd.Command)) def testLongOptSuiteWNoDefault(self): - ts1 = makeSetup(script_args=['test','--test-suite=foo.tests.suite']) + ts1 = 
makeSetup(script_args=['test', '--test-suite=foo.tests.suite']) ts1 = ts1.get_command_obj('test') ts1.ensure_finalized() - self.assertEqual(ts1.test_suite, 'foo.tests.suite') + assert ts1.test_suite == 'foo.tests.suite' def testDefaultSuite(self): ts2 = makeSetup(test_suite='bar.tests.suite').get_command_obj('test') ts2.ensure_finalized() - self.assertEqual(ts2.test_suite, 'bar.tests.suite') + assert ts2.test_suite == 'bar.tests.suite' def testDefaultWModuleOnCmdLine(self): ts3 = makeSetup( test_suite='bar.tests', - script_args=['test','-m','foo.tests'] + script_args=['test', '-m', 'foo.tests'] ).get_command_obj('test') ts3.ensure_finalized() - self.assertEqual(ts3.test_module, 'foo.tests') - self.assertEqual(ts3.test_suite, 'foo.tests.test_suite') + assert ts3.test_module == 'foo.tests' + assert ts3.test_suite == 'foo.tests.test_suite' def testConflictingOptions(self): ts4 = makeSetup( - script_args=['test','-m','bar.tests', '-s','foo.tests.suite'] + script_args=['test', '-m', 'bar.tests', '-s', 'foo.tests.suite'] ).get_command_obj('test') - self.assertRaises(DistutilsOptionError, ts4.ensure_finalized) + with pytest.raises(DistutilsOptionError): + ts4.ensure_finalized() def testNoSuite(self): ts5 = makeSetup().get_command_obj('test') ts5.ensure_finalized() - self.assertEqual(ts5.test_suite, None) - - - - - + assert ts5.test_suite is None diff --git a/setuptools/tests/contexts.py b/setuptools/tests/contexts.py new file mode 100644 index 0000000..77ebecf --- /dev/null +++ b/setuptools/tests/contexts.py @@ -0,0 +1,98 @@ +import tempfile +import os +import shutil +import sys +import contextlib +import site + +import six +import pkg_resources + + +@contextlib.contextmanager +def tempdir(cd=lambda dir: None, **kwargs): + temp_dir = tempfile.mkdtemp(**kwargs) + orig_dir = os.getcwd() + try: + cd(temp_dir) + yield temp_dir + finally: + cd(orig_dir) + shutil.rmtree(temp_dir) + + +@contextlib.contextmanager +def environment(**replacements): + """ + In a context, patch the environment with replacements. Pass None values + to clear the values. + """ + saved = dict( + (key, os.environ[key]) + for key in replacements + if key in os.environ + ) + + # remove values that are null + remove = (key for (key, value) in replacements.items() if value is None) + for key in list(remove): + os.environ.pop(key, None) + replacements.pop(key) + + os.environ.update(replacements) + + try: + yield saved + finally: + for key in replacements: + os.environ.pop(key, None) + os.environ.update(saved) + + +@contextlib.contextmanager +def quiet(): + """ + Redirect stdout/stderr to StringIO objects to prevent console output from + distutils commands. 
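
    A minimal usage sketch (``cmd`` below stands for any distutils/setuptools
    command object and is purely illustrative):

        with quiet() as (stdout, stderr):
            cmd.run()
        captured = stdout.getvalue()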
+ """ + + old_stdout = sys.stdout + old_stderr = sys.stderr + new_stdout = sys.stdout = six.StringIO() + new_stderr = sys.stderr = six.StringIO() + try: + yield new_stdout, new_stderr + finally: + new_stdout.seek(0) + new_stderr.seek(0) + sys.stdout = old_stdout + sys.stderr = old_stderr + + +@contextlib.contextmanager +def save_user_site_setting(): + saved = site.ENABLE_USER_SITE + try: + yield saved + finally: + site.ENABLE_USER_SITE = saved + + +@contextlib.contextmanager +def save_pkg_resources_state(): + pr_state = pkg_resources.__getstate__() + # also save sys.path + sys_path = sys.path[:] + try: + yield pr_state, sys_path + finally: + sys.path[:] = sys_path + pkg_resources.__setstate__(pr_state) + + +@contextlib.contextmanager +def suppress_exceptions(*excs): + try: + yield + except excs: + pass diff --git a/setuptools/tests/doctest.py b/setuptools/tests/doctest.py deleted file mode 100644 index bffce58..0000000 --- a/setuptools/tests/doctest.py +++ /dev/null @@ -1,2679 +0,0 @@ -# Module doctest. -# Released to the public domain 16-Jan-2001, by Tim Peters (tim@python.org). -# Major enhancements and refactoring by: -# Jim Fulton -# Edward Loper - -# Provided as-is; use at your own risk; no warranty; no promises; enjoy! - -try: - basestring -except NameError: - basestring = str,unicode - -try: - enumerate -except NameError: - def enumerate(seq): - return zip(range(len(seq)),seq) - -r"""Module doctest -- a framework for running examples in docstrings. - -In simplest use, end each module M to be tested with: - -def _test(): - import doctest - doctest.testmod() - -if __name__ == "__main__": - _test() - -Then running the module as a script will cause the examples in the -docstrings to get executed and verified: - -python M.py - -This won't display anything unless an example fails, in which case the -failing example(s) and the cause(s) of the failure(s) are printed to stdout -(why not stderr? because stderr is a lame hack <0.2 wink>), and the final -line of output is "Test failed.". - -Run it with the -v switch instead: - -python M.py -v - -and a detailed report of all examples tried is printed to stdout, along -with assorted summaries at the end. - -You can force verbose mode by passing "verbose=True" to testmod, or prohibit -it by passing "verbose=False". In either of those cases, sys.argv is not -examined by testmod. - -There are a variety of other ways to run doctests, including integration -with the unittest framework, and support for running non-Python text -files containing doctests. There are also many ways to override parts -of doctest's default behaviors. See the Library Reference Manual for -details. -""" - -__docformat__ = 'reStructuredText en' - -__all__ = [ - # 0, Option Flags - 'register_optionflag', - 'DONT_ACCEPT_TRUE_FOR_1', - 'DONT_ACCEPT_BLANKLINE', - 'NORMALIZE_WHITESPACE', - 'ELLIPSIS', - 'IGNORE_EXCEPTION_DETAIL', - 'COMPARISON_FLAGS', - 'REPORT_UDIFF', - 'REPORT_CDIFF', - 'REPORT_NDIFF', - 'REPORT_ONLY_FIRST_FAILURE', - 'REPORTING_FLAGS', - # 1. Utility Functions - 'is_private', - # 2. Example & DocTest - 'Example', - 'DocTest', - # 3. Doctest Parser - 'DocTestParser', - # 4. Doctest Finder - 'DocTestFinder', - # 5. Doctest Runner - 'DocTestRunner', - 'OutputChecker', - 'DocTestFailure', - 'UnexpectedException', - 'DebugRunner', - # 6. Test Functions - 'testmod', - 'testfile', - 'run_docstring_examples', - # 7. Tester - 'Tester', - # 8. Unittest Support - 'DocTestSuite', - 'DocFileSuite', - 'set_unittest_reportflags', - # 9. 
Debugging Support - 'script_from_examples', - 'testsource', - 'debug_src', - 'debug', -] - -import __future__ - -import sys, traceback, inspect, linecache, os, re, types -import unittest, difflib, pdb, tempfile -import warnings -from StringIO import StringIO - -# Don't whine about the deprecated is_private function in this -# module's tests. -warnings.filterwarnings("ignore", "is_private", DeprecationWarning, - __name__, 0) - -# There are 4 basic classes: -# - Example: a pair, plus an intra-docstring line number. -# - DocTest: a collection of examples, parsed from a docstring, plus -# info about where the docstring came from (name, filename, lineno). -# - DocTestFinder: extracts DocTests from a given object's docstring and -# its contained objects' docstrings. -# - DocTestRunner: runs DocTest cases, and accumulates statistics. -# -# So the basic picture is: -# -# list of: -# +------+ +---------+ +-------+ -# |object| --DocTestFinder-> | DocTest | --DocTestRunner-> |results| -# +------+ +---------+ +-------+ -# | Example | -# | ... | -# | Example | -# +---------+ - -# Option constants. - -OPTIONFLAGS_BY_NAME = {} -def register_optionflag(name): - flag = 1 << len(OPTIONFLAGS_BY_NAME) - OPTIONFLAGS_BY_NAME[name] = flag - return flag - -DONT_ACCEPT_TRUE_FOR_1 = register_optionflag('DONT_ACCEPT_TRUE_FOR_1') -DONT_ACCEPT_BLANKLINE = register_optionflag('DONT_ACCEPT_BLANKLINE') -NORMALIZE_WHITESPACE = register_optionflag('NORMALIZE_WHITESPACE') -ELLIPSIS = register_optionflag('ELLIPSIS') -IGNORE_EXCEPTION_DETAIL = register_optionflag('IGNORE_EXCEPTION_DETAIL') - -COMPARISON_FLAGS = (DONT_ACCEPT_TRUE_FOR_1 | - DONT_ACCEPT_BLANKLINE | - NORMALIZE_WHITESPACE | - ELLIPSIS | - IGNORE_EXCEPTION_DETAIL) - -REPORT_UDIFF = register_optionflag('REPORT_UDIFF') -REPORT_CDIFF = register_optionflag('REPORT_CDIFF') -REPORT_NDIFF = register_optionflag('REPORT_NDIFF') -REPORT_ONLY_FIRST_FAILURE = register_optionflag('REPORT_ONLY_FIRST_FAILURE') - -REPORTING_FLAGS = (REPORT_UDIFF | - REPORT_CDIFF | - REPORT_NDIFF | - REPORT_ONLY_FIRST_FAILURE) - -# Special string markers for use in `want` strings: -BLANKLINE_MARKER = '' -ELLIPSIS_MARKER = '...' - -###################################################################### -## Table of Contents -###################################################################### -# 1. Utility Functions -# 2. Example & DocTest -- store test cases -# 3. DocTest Parser -- extracts examples from strings -# 4. DocTest Finder -- extracts test cases from objects -# 5. DocTest Runner -- runs test cases -# 6. Test Functions -- convenient wrappers for testing -# 7. Tester Class -- for backwards compatibility -# 8. Unittest Support -# 9. Debugging Support -# 10. Example Usage - -###################################################################### -## 1. Utility Functions -###################################################################### - -def is_private(prefix, base): - """prefix, base -> true iff name prefix + "." + base is "private". - - Prefix may be an empty string, and base does not contain a period. - Prefix is ignored (although functions you write conforming to this - protocol may make use of it). - Return true iff base begins with an (at least one) underscore, but - does not both begin and end with (at least) two underscores. 
- - >>> is_private("a.b", "my_func") - False - >>> is_private("____", "_my_func") - True - >>> is_private("someclass", "__init__") - False - >>> is_private("sometypo", "__init_") - True - >>> is_private("x.y.z", "_") - True - >>> is_private("_x.y.z", "__") - False - >>> is_private("", "") # senseless but consistent - False - """ - warnings.warn("is_private is deprecated; it wasn't useful; " - "examine DocTestFinder.find() lists instead", - DeprecationWarning, stacklevel=2) - return base[:1] == "_" and not base[:2] == "__" == base[-2:] - -def _extract_future_flags(globs): - """ - Return the compiler-flags associated with the future features that - have been imported into the given namespace (globs). - """ - flags = 0 - for fname in __future__.all_feature_names: - feature = globs.get(fname, None) - if feature is getattr(__future__, fname): - flags |= feature.compiler_flag - return flags - -def _normalize_module(module, depth=2): - """ - Return the module specified by `module`. In particular: - - If `module` is a module, then return module. - - If `module` is a string, then import and return the - module with that name. - - If `module` is None, then return the calling module. - The calling module is assumed to be the module of - the stack frame at the given depth in the call stack. - """ - if inspect.ismodule(module): - return module - elif isinstance(module, (str, unicode)): - return __import__(module, globals(), locals(), ["*"]) - elif module is None: - return sys.modules[sys._getframe(depth).f_globals['__name__']] - else: - raise TypeError("Expected a module, string, or None") - -def _indent(s, indent=4): - """ - Add the given number of space characters to the beginning every - non-blank line in `s`, and return the result. - """ - # This regexp matches the start of non-blank lines: - return re.sub('(?m)^(?!$)', indent*' ', s) - -def _exception_traceback(exc_info): - """ - Return a string containing a traceback message for the given - exc_info tuple (as returned by sys.exc_info()). - """ - # Get a traceback message. - excout = StringIO() - exc_type, exc_val, exc_tb = exc_info - traceback.print_exception(exc_type, exc_val, exc_tb, file=excout) - return excout.getvalue() - -# Override some StringIO methods. -class _SpoofOut(StringIO): - def getvalue(self): - result = StringIO.getvalue(self) - # If anything at all was written, make sure there's a trailing - # newline. There's no way for the expected output to indicate - # that a trailing newline is missing. - if result and not result.endswith("\n"): - result += "\n" - # Prevent softspace from screwing up the next test case, in - # case they used print with a trailing comma in an example. - if hasattr(self, "softspace"): - del self.softspace - return result - - def truncate(self, size=None): - StringIO.truncate(self, size) - if hasattr(self, "softspace"): - del self.softspace - -# Worst-case linear-time ellipsis matching. -def _ellipsis_match(want, got): - """ - Essentially the only subtle case: - >>> _ellipsis_match('aa...aa', 'aaa') - False - """ - if want.find(ELLIPSIS_MARKER)==-1: - return want == got - - # Find "the real" strings. - ws = want.split(ELLIPSIS_MARKER) - assert len(ws) >= 2 - - # Deal with exact matches possibly needed at one or both ends. 
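    # (Hypothetical illustration: matching want 'ab...yz' against got 'abcxyz'
    # pins the literal prefix 'ab' and suffix 'yz' below, then the loop further
    # down matches any remaining middle pieces left to right.)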
- startpos, endpos = 0, len(got) - w = ws[0] - if w: # starts with exact match - if got.startswith(w): - startpos = len(w) - del ws[0] - else: - return False - w = ws[-1] - if w: # ends with exact match - if got.endswith(w): - endpos -= len(w) - del ws[-1] - else: - return False - - if startpos > endpos: - # Exact end matches required more characters than we have, as in - # _ellipsis_match('aa...aa', 'aaa') - return False - - # For the rest, we only need to find the leftmost non-overlapping - # match for each piece. If there's no overall match that way alone, - # there's no overall match period. - for w in ws: - # w may be '' at times, if there are consecutive ellipses, or - # due to an ellipsis at the start or end of `want`. That's OK. - # Search for an empty string succeeds, and doesn't change startpos. - startpos = got.find(w, startpos, endpos) - if startpos < 0: - return False - startpos += len(w) - - return True - -def _comment_line(line): - "Return a commented form of the given line" - line = line.rstrip() - if line: - return '# '+line - else: - return '#' - -class _OutputRedirectingPdb(pdb.Pdb): - """ - A specialized version of the python debugger that redirects stdout - to a given stream when interacting with the user. Stdout is *not* - redirected when traced code is executed. - """ - def __init__(self, out): - self.__out = out - pdb.Pdb.__init__(self) - - def trace_dispatch(self, *args): - # Redirect stdout to the given stream. - save_stdout = sys.stdout - sys.stdout = self.__out - # Call Pdb's trace dispatch method. - try: - return pdb.Pdb.trace_dispatch(self, *args) - finally: - sys.stdout = save_stdout - -# [XX] Normalize with respect to os.path.pardir? -def _module_relative_path(module, path): - if not inspect.ismodule(module): - raise TypeError, 'Expected a module: %r' % module - if path.startswith('/'): - raise ValueError, 'Module-relative files may not have absolute paths' - - # Find the base directory for the path. - if hasattr(module, '__file__'): - # A normal module/package - basedir = os.path.split(module.__file__)[0] - elif module.__name__ == '__main__': - # An interactive session. - if len(sys.argv)>0 and sys.argv[0] != '': - basedir = os.path.split(sys.argv[0])[0] - else: - basedir = os.curdir - else: - # A module w/o __file__ (this includes builtins) - raise ValueError("Can't resolve paths relative to the module " + - module + " (it has no __file__)") - - # Combine the base directory and the path. - return os.path.join(basedir, *(path.split('/'))) - -###################################################################### -## 2. Example & DocTest -###################################################################### -## - An "example" is a pair, where "source" is a -## fragment of source code, and "want" is the expected output for -## "source." The Example class also includes information about -## where the example was extracted from. -## -## - A "doctest" is a collection of examples, typically extracted from -## a string (such as an object's docstring). The DocTest class also -## includes information about where the string was extracted from. - -class Example: - """ - A single doctest example, consisting of source code and expected - output. `Example` defines the following attributes: - - - source: A single Python statement, always ending with a newline. - The constructor adds a newline if needed. - - - want: The expected output from running the source code (either - from stdout, or a traceback in case of exception). 
`want` ends - with a newline unless it's empty, in which case it's an empty - string. The constructor adds a newline if needed. - - - exc_msg: The exception message generated by the example, if - the example is expected to generate an exception; or `None` if - it is not expected to generate an exception. This exception - message is compared against the return value of - `traceback.format_exception_only()`. `exc_msg` ends with a - newline unless it's `None`. The constructor adds a newline - if needed. - - - lineno: The line number within the DocTest string containing - this Example where the Example begins. This line number is - zero-based, with respect to the beginning of the DocTest. - - - indent: The example's indentation in the DocTest string. - I.e., the number of space characters that preceed the - example's first prompt. - - - options: A dictionary mapping from option flags to True or - False, which is used to override default options for this - example. Any option flags not contained in this dictionary - are left at their default value (as specified by the - DocTestRunner's optionflags). By default, no options are set. - """ - def __init__(self, source, want, exc_msg=None, lineno=0, indent=0, - options=None): - # Normalize inputs. - if not source.endswith('\n'): - source += '\n' - if want and not want.endswith('\n'): - want += '\n' - if exc_msg is not None and not exc_msg.endswith('\n'): - exc_msg += '\n' - # Store properties. - self.source = source - self.want = want - self.lineno = lineno - self.indent = indent - if options is None: options = {} - self.options = options - self.exc_msg = exc_msg - -class DocTest: - """ - A collection of doctest examples that should be run in a single - namespace. Each `DocTest` defines the following attributes: - - - examples: the list of examples. - - - globs: The namespace (aka globals) that the examples should - be run in. - - - name: A name identifying the DocTest (typically, the name of - the object whose docstring this DocTest was extracted from). - - - filename: The name of the file that this DocTest was extracted - from, or `None` if the filename is unknown. - - - lineno: The line number within filename where this DocTest - begins, or `None` if the line number is unavailable. This - line number is zero-based, with respect to the beginning of - the file. - - - docstring: The string that the examples were extracted from, - or `None` if the string is unavailable. - """ - def __init__(self, examples, globs, name, filename, lineno, docstring): - """ - Create a new DocTest containing the given examples. The - DocTest's globals are initialized with a copy of `globs`. - """ - assert not isinstance(examples, basestring), \ - "DocTest no longer accepts str; use DocTestParser instead" - self.examples = examples - self.docstring = docstring - self.globs = globs.copy() - self.name = name - self.filename = filename - self.lineno = lineno - - def __repr__(self): - if len(self.examples) == 0: - examples = 'no examples' - elif len(self.examples) == 1: - examples = '1 example' - else: - examples = '%d examples' % len(self.examples) - return ('' % - (self.name, self.filename, self.lineno, examples)) - - - # This lets us sort tests by name: - def __cmp__(self, other): - if not isinstance(other, DocTest): - return -1 - return cmp((self.name, self.filename, self.lineno, id(self)), - (other.name, other.filename, other.lineno, id(other))) - -###################################################################### -## 3. 
DocTestParser -###################################################################### - -class DocTestParser: - """ - A class used to parse strings containing doctest examples. - """ - # This regular expression is used to find doctest examples in a - # string. It defines three groups: `source` is the source code - # (including leading indentation and prompts); `indent` is the - # indentation of the first (PS1) line of the source code; and - # `want` is the expected output (including leading indentation). - _EXAMPLE_RE = re.compile(r''' - # Source consists of a PS1 line followed by zero or more PS2 lines. - (?P - (?:^(?P [ ]*) >>> .*) # PS1 line - (?:\n [ ]* \.\.\. .*)*) # PS2 lines - \n? - # Want consists of any non-blank lines that do not start with PS1. - (?P (?:(?![ ]*$) # Not a blank line - (?![ ]*>>>) # Not a line starting with PS1 - .*$\n? # But any other line - )*) - ''', re.MULTILINE | re.VERBOSE) - - # A regular expression for handling `want` strings that contain - # expected exceptions. It divides `want` into three pieces: - # - the traceback header line (`hdr`) - # - the traceback stack (`stack`) - # - the exception message (`msg`), as generated by - # traceback.format_exception_only() - # `msg` may have multiple lines. We assume/require that the - # exception message is the first non-indented line starting with a word - # character following the traceback header line. - _EXCEPTION_RE = re.compile(r""" - # Grab the traceback header. Different versions of Python have - # said different things on the first traceback line. - ^(?P Traceback\ \( - (?: most\ recent\ call\ last - | innermost\ last - ) \) : - ) - \s* $ # toss trailing whitespace on the header. - (?P .*?) # don't blink: absorb stuff until... - ^ (?P \w+ .*) # a line *starts* with alphanum. - """, re.VERBOSE | re.MULTILINE | re.DOTALL) - - # A callable returning a true value iff its argument is a blank line - # or contains a single comment. - _IS_BLANK_OR_COMMENT = re.compile(r'^[ ]*(#.*)?$').match - - def parse(self, string, name=''): - """ - Divide the given string into examples and intervening text, - and return them as a list of alternating Examples and strings. - Line numbers for the Examples are 0-based. The optional - argument `name` is a name identifying this string, and is only - used for error messages. - """ - string = string.expandtabs() - # If all lines begin with the same indentation, then strip it. - min_indent = self._min_indent(string) - if min_indent > 0: - string = '\n'.join([l[min_indent:] for l in string.split('\n')]) - - output = [] - charno, lineno = 0, 0 - # Find all doctest examples in the string: - for m in self._EXAMPLE_RE.finditer(string): - # Add the pre-example text to `output`. - output.append(string[charno:m.start()]) - # Update lineno (lines before this example) - lineno += string.count('\n', charno, m.start()) - # Extract info from the regexp match. - (source, options, want, exc_msg) = \ - self._parse_example(m, name, lineno) - # Create an Example, and add it to the list. - if not self._IS_BLANK_OR_COMMENT(source): - output.append( Example(source, want, exc_msg, - lineno=lineno, - indent=min_indent+len(m.group('indent')), - options=options) ) - # Update lineno (lines inside this example) - lineno += string.count('\n', m.start(), m.end()) - # Update charno. - charno = m.end() - # Add any remaining post-example text to `output`. 
- output.append(string[charno:]) - return output - - def get_doctest(self, string, globs, name, filename, lineno): - """ - Extract all doctest examples from the given string, and - collect them into a `DocTest` object. - - `globs`, `name`, `filename`, and `lineno` are attributes for - the new `DocTest` object. See the documentation for `DocTest` - for more information. - """ - return DocTest(self.get_examples(string, name), globs, - name, filename, lineno, string) - - def get_examples(self, string, name=''): - """ - Extract all doctest examples from the given string, and return - them as a list of `Example` objects. Line numbers are - 0-based, because it's most common in doctests that nothing - interesting appears on the same line as opening triple-quote, - and so the first interesting line is called \"line 1\" then. - - The optional argument `name` is a name identifying this - string, and is only used for error messages. - """ - return [x for x in self.parse(string, name) - if isinstance(x, Example)] - - def _parse_example(self, m, name, lineno): - """ - Given a regular expression match from `_EXAMPLE_RE` (`m`), - return a pair `(source, want)`, where `source` is the matched - example's source code (with prompts and indentation stripped); - and `want` is the example's expected output (with indentation - stripped). - - `name` is the string's name, and `lineno` is the line number - where the example starts; both are used for error messages. - """ - # Get the example's indentation level. - indent = len(m.group('indent')) - - # Divide source into lines; check that they're properly - # indented; and then strip their indentation & prompts. - source_lines = m.group('source').split('\n') - self._check_prompt_blank(source_lines, indent, name, lineno) - self._check_prefix(source_lines[1:], ' '*indent + '.', name, lineno) - source = '\n'.join([sl[indent+4:] for sl in source_lines]) - - # Divide want into lines; check that it's properly indented; and - # then strip the indentation. Spaces before the last newline should - # be preserved, so plain rstrip() isn't good enough. - want = m.group('want') - want_lines = want.split('\n') - if len(want_lines) > 1 and re.match(r' *$', want_lines[-1]): - del want_lines[-1] # forget final newline & spaces after it - self._check_prefix(want_lines, ' '*indent, name, - lineno + len(source_lines)) - want = '\n'.join([wl[indent:] for wl in want_lines]) - - # If `want` contains a traceback message, then extract it. - m = self._EXCEPTION_RE.match(want) - if m: - exc_msg = m.group('msg') - else: - exc_msg = None - - # Extract options from the source. - options = self._find_options(source, name, lineno) - - return source, options, want, exc_msg - - # This regular expression looks for option directives in the - # source code of an example. Option directives are comments - # starting with "doctest:". Warning: this may give false - # positives for string-literals that contain the string - # "#doctest:". Eliminating these false positives would require - # actually parsing the string; but we limit them by ignoring any - # line containing "#doctest:" that is *followed* by a quote mark. - _OPTION_DIRECTIVE_RE = re.compile(r'#\s*doctest:\s*([^\n\'"]*)$', - re.MULTILINE) - - def _find_options(self, source, name, lineno): - """ - Return a dictionary containing option overrides extracted from - option directives in the given source string. - - `name` is the string's name, and `lineno` is the line number - where the example starts; both are used for error messages. 
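
        For example (illustrative only), a source line ending in
        "#doctest: +ELLIPSIS, -NORMALIZE_WHITESPACE" yields
        {ELLIPSIS: True, NORMALIZE_WHITESPACE: False}.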
- """ - options = {} - # (note: with the current regexp, this will match at most once:) - for m in self._OPTION_DIRECTIVE_RE.finditer(source): - option_strings = m.group(1).replace(',', ' ').split() - for option in option_strings: - if (option[0] not in '+-' or - option[1:] not in OPTIONFLAGS_BY_NAME): - raise ValueError('line %r of the doctest for %s ' - 'has an invalid option: %r' % - (lineno+1, name, option)) - flag = OPTIONFLAGS_BY_NAME[option[1:]] - options[flag] = (option[0] == '+') - if options and self._IS_BLANK_OR_COMMENT(source): - raise ValueError('line %r of the doctest for %s has an option ' - 'directive on a line with no example: %r' % - (lineno, name, source)) - return options - - # This regular expression finds the indentation of every non-blank - # line in a string. - _INDENT_RE = re.compile('^([ ]*)(?=\S)', re.MULTILINE) - - def _min_indent(self, s): - "Return the minimum indentation of any non-blank line in `s`" - indents = [len(indent) for indent in self._INDENT_RE.findall(s)] - if len(indents) > 0: - return min(indents) - else: - return 0 - - def _check_prompt_blank(self, lines, indent, name, lineno): - """ - Given the lines of a source string (including prompts and - leading indentation), check to make sure that every prompt is - followed by a space character. If any line is not followed by - a space character, then raise ValueError. - """ - for i, line in enumerate(lines): - if len(line) >= indent+4 and line[indent+3] != ' ': - raise ValueError('line %r of the docstring for %s ' - 'lacks blank after %s: %r' % - (lineno+i+1, name, - line[indent:indent+3], line)) - - def _check_prefix(self, lines, prefix, name, lineno): - """ - Check that every line in the given list starts with the given - prefix; if any line does not, then raise a ValueError. - """ - for i, line in enumerate(lines): - if line and not line.startswith(prefix): - raise ValueError('line %r of the docstring for %s has ' - 'inconsistent leading whitespace: %r' % - (lineno+i+1, name, line)) - - -###################################################################### -## 4. DocTest Finder -###################################################################### - -class DocTestFinder: - """ - A class used to extract the DocTests that are relevant to a given - object, from its docstring and the docstrings of its contained - objects. Doctests can currently be extracted from the following - object types: modules, functions, classes, methods, staticmethods, - classmethods, and properties. - """ - - def __init__(self, verbose=False, parser=DocTestParser(), - recurse=True, _namefilter=None, exclude_empty=True): - """ - Create a new doctest finder. - - The optional argument `parser` specifies a class or - function that should be used to create new DocTest objects (or - objects that implement the same interface as DocTest). The - signature for this factory function should match the signature - of the DocTest constructor. - - If the optional argument `recurse` is false, then `find` will - only examine the given object, and not any contained objects. - - If the optional argument `exclude_empty` is false, then `find` - will include tests for objects with empty docstrings. - """ - self._parser = parser - self._verbose = verbose - self._recurse = recurse - self._exclude_empty = exclude_empty - # _namefilter is undocumented, and exists only for temporary backward- - # compatibility support of testmod's deprecated isprivate mess. 
- self._namefilter = _namefilter - - def find(self, obj, name=None, module=None, globs=None, - extraglobs=None): - """ - Return a list of the DocTests that are defined by the given - object's docstring, or by any of its contained objects' - docstrings. - - The optional parameter `module` is the module that contains - the given object. If the module is not specified or is None, then - the test finder will attempt to automatically determine the - correct module. The object's module is used: - - - As a default namespace, if `globs` is not specified. - - To prevent the DocTestFinder from extracting DocTests - from objects that are imported from other modules. - - To find the name of the file containing the object. - - To help find the line number of the object within its - file. - - Contained objects whose module does not match `module` are ignored. - - If `module` is False, no attempt to find the module will be made. - This is obscure, of use mostly in tests: if `module` is False, or - is None but cannot be found automatically, then all objects are - considered to belong to the (non-existent) module, so all contained - objects will (recursively) be searched for doctests. - - The globals for each DocTest is formed by combining `globs` - and `extraglobs` (bindings in `extraglobs` override bindings - in `globs`). A new copy of the globals dictionary is created - for each DocTest. If `globs` is not specified, then it - defaults to the module's `__dict__`, if specified, or {} - otherwise. If `extraglobs` is not specified, then it defaults - to {}. - - """ - # If name was not specified, then extract it from the object. - if name is None: - name = getattr(obj, '__name__', None) - if name is None: - raise ValueError("DocTestFinder.find: name must be given " - "when obj.__name__ doesn't exist: %r" % - (type(obj),)) - - # Find the module that contains the given object (if obj is - # a module, then module=obj.). Note: this may fail, in which - # case module will be None. - if module is False: - module = None - elif module is None: - module = inspect.getmodule(obj) - - # Read the module's source code. This is used by - # DocTestFinder._find_lineno to find the line number for a - # given object's docstring. - try: - file = inspect.getsourcefile(obj) or inspect.getfile(obj) - source_lines = linecache.getlines(file) - if not source_lines: - source_lines = None - except TypeError: - source_lines = None - - # Initialize globals, and merge in extraglobs. - if globs is None: - if module is None: - globs = {} - else: - globs = module.__dict__.copy() - else: - globs = globs.copy() - if extraglobs is not None: - globs.update(extraglobs) - - # Recursively expore `obj`, extracting DocTests. - tests = [] - self._find(tests, obj, name, module, source_lines, globs, {}) - return tests - - def _filter(self, obj, prefix, base): - """ - Return true if the given object should not be examined. - """ - return (self._namefilter is not None and - self._namefilter(prefix, base)) - - def _from_module(self, module, object): - """ - Return true if the given object is defined in the given - module. 
- """ - if module is None: - return True - elif inspect.isfunction(object): - return module.__dict__ is object.func_globals - elif inspect.isclass(object): - return module.__name__ == object.__module__ - elif inspect.getmodule(object) is not None: - return module is inspect.getmodule(object) - elif hasattr(object, '__module__'): - return module.__name__ == object.__module__ - elif isinstance(object, property): - return True # [XX] no way not be sure. - else: - raise ValueError("object must be a class or function") - - def _find(self, tests, obj, name, module, source_lines, globs, seen): - """ - Find tests for the given object and any contained objects, and - add them to `tests`. - """ - if self._verbose: - print 'Finding tests in %s' % name - - # If we've already processed this object, then ignore it. - if id(obj) in seen: - return - seen[id(obj)] = 1 - - # Find a test for this object, and add it to the list of tests. - test = self._get_test(obj, name, module, globs, source_lines) - if test is not None: - tests.append(test) - - # Look for tests in a module's contained objects. - if inspect.ismodule(obj) and self._recurse: - for valname, val in obj.__dict__.items(): - # Check if this contained object should be ignored. - if self._filter(val, name, valname): - continue - valname = '%s.%s' % (name, valname) - # Recurse to functions & classes. - if ((inspect.isfunction(val) or inspect.isclass(val)) and - self._from_module(module, val)): - self._find(tests, val, valname, module, source_lines, - globs, seen) - - # Look for tests in a module's __test__ dictionary. - if inspect.ismodule(obj) and self._recurse: - for valname, val in getattr(obj, '__test__', {}).items(): - if not isinstance(valname, basestring): - raise ValueError("DocTestFinder.find: __test__ keys " - "must be strings: %r" % - (type(valname),)) - if not (inspect.isfunction(val) or inspect.isclass(val) or - inspect.ismethod(val) or inspect.ismodule(val) or - isinstance(val, basestring)): - raise ValueError("DocTestFinder.find: __test__ values " - "must be strings, functions, methods, " - "classes, or modules: %r" % - (type(val),)) - valname = '%s.__test__.%s' % (name, valname) - self._find(tests, val, valname, module, source_lines, - globs, seen) - - # Look for tests in a class's contained objects. - if inspect.isclass(obj) and self._recurse: - for valname, val in obj.__dict__.items(): - # Check if this contained object should be ignored. - if self._filter(val, name, valname): - continue - # Special handling for staticmethod/classmethod. - if isinstance(val, staticmethod): - val = getattr(obj, valname) - if isinstance(val, classmethod): - val = getattr(obj, valname).im_func - - # Recurse to methods, properties, and nested classes. - if ((inspect.isfunction(val) or inspect.isclass(val) or - isinstance(val, property)) and - self._from_module(module, val)): - valname = '%s.%s' % (name, valname) - self._find(tests, val, valname, module, source_lines, - globs, seen) - - def _get_test(self, obj, name, module, globs, source_lines): - """ - Return a DocTest for the given object, if it defines a docstring; - otherwise, return None. - """ - # Extract the object's docstring. If it doesn't have one, - # then return None (no test for this object). 
- if isinstance(obj, basestring): - docstring = obj - else: - try: - if obj.__doc__ is None: - docstring = '' - else: - docstring = obj.__doc__ - if not isinstance(docstring, basestring): - docstring = str(docstring) - except (TypeError, AttributeError): - docstring = '' - - # Find the docstring's location in the file. - lineno = self._find_lineno(obj, source_lines) - - # Don't bother if the docstring is empty. - if self._exclude_empty and not docstring: - return None - - # Return a DocTest for this object. - if module is None: - filename = None - else: - filename = getattr(module, '__file__', module.__name__) - if filename[-4:] in (".pyc", ".pyo"): - filename = filename[:-1] - return self._parser.get_doctest(docstring, globs, name, - filename, lineno) - - def _find_lineno(self, obj, source_lines): - """ - Return a line number of the given object's docstring. Note: - this method assumes that the object has a docstring. - """ - lineno = None - - # Find the line number for modules. - if inspect.ismodule(obj): - lineno = 0 - - # Find the line number for classes. - # Note: this could be fooled if a class is defined multiple - # times in a single file. - if inspect.isclass(obj): - if source_lines is None: - return None - pat = re.compile(r'^\s*class\s*%s\b' % - getattr(obj, '__name__', '-')) - for i, line in enumerate(source_lines): - if pat.match(line): - lineno = i - break - - # Find the line number for functions & methods. - if inspect.ismethod(obj): obj = obj.im_func - if inspect.isfunction(obj): obj = obj.func_code - if inspect.istraceback(obj): obj = obj.tb_frame - if inspect.isframe(obj): obj = obj.f_code - if inspect.iscode(obj): - lineno = getattr(obj, 'co_firstlineno', None)-1 - - # Find the line number where the docstring starts. Assume - # that it's the first line that begins with a quote mark. - # Note: this could be fooled by a multiline function - # signature, where a continuation line begins with a quote - # mark. - if lineno is not None: - if source_lines is None: - return lineno+1 - pat = re.compile('(^|.*:)\s*\w*("|\')') - for lineno in range(lineno, len(source_lines)): - if pat.match(source_lines[lineno]): - return lineno - - # We couldn't find the line number. - return None - -###################################################################### -## 5. DocTest Runner -###################################################################### - -class DocTestRunner: - """ - A class used to run DocTest test cases, and accumulate statistics. - The `run` method is used to process a single DocTest case. It - returns a tuple `(f, t)`, where `t` is the number of test cases - tried, and `f` is the number of test cases that failed. - - >>> tests = DocTestFinder().find(_TestClass) - >>> runner = DocTestRunner(verbose=False) - >>> for test in tests: - ... print runner.run(test) - (0, 2) - (0, 1) - (0, 2) - (0, 2) - - The `summarize` method prints a summary of all the test cases that - have been run by the runner, and returns an aggregated `(f, t)` - tuple: - - >>> runner.summarize(verbose=1) - 4 items passed all tests: - 2 tests in _TestClass - 2 tests in _TestClass.__init__ - 2 tests in _TestClass.get - 1 tests in _TestClass.square - 7 tests in 4 items. - 7 passed and 0 failed. - Test passed. - (0, 7) - - The aggregated number of tried examples and failed examples is - also available via the `tries` and `failures` attributes: - - >>> runner.tries - 7 - >>> runner.failures - 0 - - The comparison between expected outputs and actual outputs is done - by an `OutputChecker`. 
This comparison may be customized with a - number of option flags; see the documentation for `testmod` for - more information. If the option flags are insufficient, then the - comparison may also be customized by passing a subclass of - `OutputChecker` to the constructor. - - The test runner's display output can be controlled in two ways. - First, an output function (`out) can be passed to - `TestRunner.run`; this function will be called with strings that - should be displayed. It defaults to `sys.stdout.write`. If - capturing the output is not sufficient, then the display output - can be also customized by subclassing DocTestRunner, and - overriding the methods `report_start`, `report_success`, - `report_unexpected_exception`, and `report_failure`. - """ - # This divider string is used to separate failure messages, and to - # separate sections of the summary. - DIVIDER = "*" * 70 - - def __init__(self, checker=None, verbose=None, optionflags=0): - """ - Create a new test runner. - - Optional keyword arg `checker` is the `OutputChecker` that - should be used to compare the expected outputs and actual - outputs of doctest examples. - - Optional keyword arg 'verbose' prints lots of stuff if true, - only failures if false; by default, it's true iff '-v' is in - sys.argv. - - Optional argument `optionflags` can be used to control how the - test runner compares expected output to actual output, and how - it displays failures. See the documentation for `testmod` for - more information. - """ - self._checker = checker or OutputChecker() - if verbose is None: - verbose = '-v' in sys.argv - self._verbose = verbose - self.optionflags = optionflags - self.original_optionflags = optionflags - - # Keep track of the examples we've run. - self.tries = 0 - self.failures = 0 - self._name2ft = {} - - # Create a fake output target for capturing doctest output. - self._fakeout = _SpoofOut() - - #///////////////////////////////////////////////////////////////// - # Reporting methods - #///////////////////////////////////////////////////////////////// - - def report_start(self, out, test, example): - """ - Report that the test runner is about to process the given - example. (Only displays a message if verbose=True) - """ - if self._verbose: - if example.want: - out('Trying:\n' + _indent(example.source) + - 'Expecting:\n' + _indent(example.want)) - else: - out('Trying:\n' + _indent(example.source) + - 'Expecting nothing\n') - - def report_success(self, out, test, example, got): - """ - Report that the given example ran successfully. (Only - displays a message if verbose=True) - """ - if self._verbose: - out("ok\n") - - def report_failure(self, out, test, example, got): - """ - Report that the given example failed. - """ - out(self._failure_header(test, example) + - self._checker.output_difference(example, got, self.optionflags)) - - def report_unexpected_exception(self, out, test, example, exc_info): - """ - Report that the given example raised an unexpected exception. - """ - out(self._failure_header(test, example) + - 'Exception raised:\n' + _indent(_exception_traceback(exc_info))) - - def _failure_header(self, test, example): - out = [self.DIVIDER] - if test.filename: - if test.lineno is not None and example.lineno is not None: - lineno = test.lineno + example.lineno + 1 - else: - lineno = '?' 
- out.append('File "%s", line %s, in %s' % - (test.filename, lineno, test.name)) - else: - out.append('Line %s, in %s' % (example.lineno+1, test.name)) - out.append('Failed example:') - source = example.source - out.append(_indent(source)) - return '\n'.join(out) - - #///////////////////////////////////////////////////////////////// - # DocTest Running - #///////////////////////////////////////////////////////////////// - - def __run(self, test, compileflags, out): - """ - Run the examples in `test`. Write the outcome of each example - with one of the `DocTestRunner.report_*` methods, using the - writer function `out`. `compileflags` is the set of compiler - flags that should be used to execute examples. Return a tuple - `(f, t)`, where `t` is the number of examples tried, and `f` - is the number of examples that failed. The examples are run - in the namespace `test.globs`. - """ - # Keep track of the number of failures and tries. - failures = tries = 0 - - # Save the option flags (since option directives can be used - # to modify them). - original_optionflags = self.optionflags - - SUCCESS, FAILURE, BOOM = range(3) # `outcome` state - - check = self._checker.check_output - - # Process each example. - for examplenum, example in enumerate(test.examples): - - # If REPORT_ONLY_FIRST_FAILURE is set, then supress - # reporting after the first failure. - quiet = (self.optionflags & REPORT_ONLY_FIRST_FAILURE and - failures > 0) - - # Merge in the example's options. - self.optionflags = original_optionflags - if example.options: - for (optionflag, val) in example.options.items(): - if val: - self.optionflags |= optionflag - else: - self.optionflags &= ~optionflag - - # Record that we started this example. - tries += 1 - if not quiet: - self.report_start(out, test, example) - - # Use a special filename for compile(), so we can retrieve - # the source code during interactive debugging (see - # __patched_linecache_getlines). - filename = '' % (test.name, examplenum) - - # Run the example in the given context (globs), and record - # any exception that gets raised. (But don't intercept - # keyboard interrupts.) - try: - # Don't blink! This is where the user's code gets run. - exec compile(example.source, filename, "single", - compileflags, 1) in test.globs - self.debugger.set_continue() # ==== Example Finished ==== - exception = None - except KeyboardInterrupt: - raise - except: - exception = sys.exc_info() - self.debugger.set_continue() # ==== Example Finished ==== - - got = self._fakeout.getvalue() # the actual output - self._fakeout.truncate(0) - outcome = FAILURE # guilty until proved innocent or insane - - # If the example executed without raising any exceptions, - # verify its output. - if exception is None: - if check(example.want, got, self.optionflags): - outcome = SUCCESS - - # The example raised an exception: check if it was expected. - else: - exc_info = sys.exc_info() - exc_msg = traceback.format_exception_only(*exc_info[:2])[-1] - if not quiet: - got += _exception_traceback(exc_info) - - # If `example.exc_msg` is None, then we weren't expecting - # an exception. - if example.exc_msg is None: - outcome = BOOM - - # We expected an exception: see whether it matches. - elif check(example.exc_msg, exc_msg, self.optionflags): - outcome = SUCCESS - - # Another chance if they didn't care about the detail. 
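                # (IGNORE_EXCEPTION_DETAIL compares only the leading
                # "ExceptionType:" portion of the message, so differing
                # details after the colon still count as a match.)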
- elif self.optionflags & IGNORE_EXCEPTION_DETAIL: - m1 = re.match(r'[^:]*:', example.exc_msg) - m2 = re.match(r'[^:]*:', exc_msg) - if m1 and m2 and check(m1.group(0), m2.group(0), - self.optionflags): - outcome = SUCCESS - - # Report the outcome. - if outcome is SUCCESS: - if not quiet: - self.report_success(out, test, example, got) - elif outcome is FAILURE: - if not quiet: - self.report_failure(out, test, example, got) - failures += 1 - elif outcome is BOOM: - if not quiet: - self.report_unexpected_exception(out, test, example, - exc_info) - failures += 1 - else: - assert False, ("unknown outcome", outcome) - - # Restore the option flags (in case they were modified) - self.optionflags = original_optionflags - - # Record and return the number of failures and tries. - self.__record_outcome(test, failures, tries) - return failures, tries - - def __record_outcome(self, test, f, t): - """ - Record the fact that the given DocTest (`test`) generated `f` - failures out of `t` tried examples. - """ - f2, t2 = self._name2ft.get(test.name, (0,0)) - self._name2ft[test.name] = (f+f2, t+t2) - self.failures += f - self.tries += t - - __LINECACHE_FILENAME_RE = re.compile(r'[\w\.]+)' - r'\[(?P\d+)\]>$') - def __patched_linecache_getlines(self, filename, module_globals=None): - m = self.__LINECACHE_FILENAME_RE.match(filename) - if m and m.group('name') == self.test.name: - example = self.test.examples[int(m.group('examplenum'))] - return example.source.splitlines(True) - elif self.save_linecache_getlines.func_code.co_argcount>1: - return self.save_linecache_getlines(filename, module_globals) - else: - return self.save_linecache_getlines(filename) - - def run(self, test, compileflags=None, out=None, clear_globs=True): - """ - Run the examples in `test`, and display the results using the - writer function `out`. - - The examples are run in the namespace `test.globs`. If - `clear_globs` is true (the default), then this namespace will - be cleared after the test runs, to help with garbage - collection. If you would like to examine the namespace after - the test completes, then use `clear_globs=False`. - - `compileflags` gives the set of flags that should be used by - the Python compiler when running the examples. If not - specified, then it will default to the set of future-import - flags that apply to `globs`. - - The output of each example is checked using - `DocTestRunner.check_output`, and the results are formatted by - the `DocTestRunner.report_*` methods. - """ - self.test = test - - if compileflags is None: - compileflags = _extract_future_flags(test.globs) - - save_stdout = sys.stdout - if out is None: - out = save_stdout.write - sys.stdout = self._fakeout - - # Patch pdb.set_trace to restore sys.stdout during interactive - # debugging (so it's not still redirected to self._fakeout). - # Note that the interactive output will go to *our* - # save_stdout, even if that's not the real sys.stdout; this - # allows us to write test cases for the set_trace behavior. - save_set_trace = pdb.set_trace - self.debugger = _OutputRedirectingPdb(save_stdout) - self.debugger.reset() - pdb.set_trace = self.debugger.set_trace - - # Patch linecache.getlines, so we can see the example's source - # when we're inside the debugger. 
- self.save_linecache_getlines = linecache.getlines - linecache.getlines = self.__patched_linecache_getlines - - try: - return self.__run(test, compileflags, out) - finally: - sys.stdout = save_stdout - pdb.set_trace = save_set_trace - linecache.getlines = self.save_linecache_getlines - if clear_globs: - test.globs.clear() - - #///////////////////////////////////////////////////////////////// - # Summarization - #///////////////////////////////////////////////////////////////// - def summarize(self, verbose=None): - """ - Print a summary of all the test cases that have been run by - this DocTestRunner, and return a tuple `(f, t)`, where `f` is - the total number of failed examples, and `t` is the total - number of tried examples. - - The optional `verbose` argument controls how detailed the - summary is. If the verbosity is not specified, then the - DocTestRunner's verbosity is used. - """ - if verbose is None: - verbose = self._verbose - notests = [] - passed = [] - failed = [] - totalt = totalf = 0 - for x in self._name2ft.items(): - name, (f, t) = x - assert f <= t - totalt += t - totalf += f - if t == 0: - notests.append(name) - elif f == 0: - passed.append( (name, t) ) - else: - failed.append(x) - if verbose: - if notests: - print len(notests), "items had no tests:" - notests.sort() - for thing in notests: - print " ", thing - if passed: - print len(passed), "items passed all tests:" - passed.sort() - for thing, count in passed: - print " %3d tests in %s" % (count, thing) - if failed: - print self.DIVIDER - print len(failed), "items had failures:" - failed.sort() - for thing, (f, t) in failed: - print " %3d of %3d in %s" % (f, t, thing) - if verbose: - print totalt, "tests in", len(self._name2ft), "items." - print totalt - totalf, "passed and", totalf, "failed." - if totalf: - print "***Test Failed***", totalf, "failures." - elif verbose: - print "Test passed." - return totalf, totalt - - #///////////////////////////////////////////////////////////////// - # Backward compatibility cruft to maintain doctest.master. - #///////////////////////////////////////////////////////////////// - def merge(self, other): - d = self._name2ft - for name, (f, t) in other._name2ft.items(): - if name in d: - print "*** DocTestRunner.merge: '" + name + "' in both" \ - " testers; summing outcomes." - f2, t2 = d[name] - f = f + f2 - t = t + t2 - d[name] = f, t - -class OutputChecker: - """ - A class used to check the whether the actual output from a doctest - example matches the expected output. `OutputChecker` defines two - methods: `check_output`, which compares a given pair of outputs, - and returns true if they match; and `output_difference`, which - returns a string describing the differences between two outputs. - """ - def check_output(self, want, got, optionflags): - """ - Return True iff the actual output from an example (`got`) - matches the expected output (`want`). These strings are - always considered to match if they are identical; but - depending on what option flags the test runner is using, - several non-exact match types are also possible. See the - documentation for `TestRunner` for more information about - option flags. - """ - # Handle the common case first, for efficiency: - # if they're string-identical, always return true. - if got == want: - return True - - # The values True and False replaced 1 and 0 as the return - # value for boolean comparisons in Python 2.3. 
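A small editorial illustration (not part of the patch) of the True/1 equivalence that the next block implements: by default an expected "1" also matches an actual "True", unless the DONT_ACCEPT_TRUE_FOR_1 flag is set.

    import doctest

    checker = doctest.OutputChecker()
    # Accepted by default: expected "1", actual "True".
    assert checker.check_output("1\n", "True\n", 0)
    # Rejected once the flag is passed.
    assert not checker.check_output(
        "1\n", "True\n", doctest.DONT_ACCEPT_TRUE_FOR_1)
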
- if not (optionflags & DONT_ACCEPT_TRUE_FOR_1): - if (got,want) == ("True\n", "1\n"): - return True - if (got,want) == ("False\n", "0\n"): - return True - - # can be used as a special sequence to signify a - # blank line, unless the DONT_ACCEPT_BLANKLINE flag is used. - if not (optionflags & DONT_ACCEPT_BLANKLINE): - # Replace in want with a blank line. - want = re.sub('(?m)^%s\s*?$' % re.escape(BLANKLINE_MARKER), - '', want) - # If a line in got contains only spaces, then remove the - # spaces. - got = re.sub('(?m)^\s*?$', '', got) - if got == want: - return True - - # This flag causes doctest to ignore any differences in the - # contents of whitespace strings. Note that this can be used - # in conjunction with the ELLIPSIS flag. - if optionflags & NORMALIZE_WHITESPACE: - got = ' '.join(got.split()) - want = ' '.join(want.split()) - if got == want: - return True - - # The ELLIPSIS flag says to let the sequence "..." in `want` - # match any substring in `got`. - if optionflags & ELLIPSIS: - if _ellipsis_match(want, got): - return True - - # We didn't find any match; return false. - return False - - # Should we do a fancy diff? - def _do_a_fancy_diff(self, want, got, optionflags): - # Not unless they asked for a fancy diff. - if not optionflags & (REPORT_UDIFF | - REPORT_CDIFF | - REPORT_NDIFF): - return False - - # If expected output uses ellipsis, a meaningful fancy diff is - # too hard ... or maybe not. In two real-life failures Tim saw, - # a diff was a major help anyway, so this is commented out. - # [todo] _ellipsis_match() knows which pieces do and don't match, - # and could be the basis for a kick-ass diff in this case. - ##if optionflags & ELLIPSIS and ELLIPSIS_MARKER in want: - ## return False - - # ndiff does intraline difference marking, so can be useful even - # for 1-line differences. - if optionflags & REPORT_NDIFF: - return True - - # The other diff types need at least a few lines to be helpful. - return want.count('\n') > 2 and got.count('\n') > 2 - - def output_difference(self, example, got, optionflags): - """ - Return a string describing the differences between the - expected output for a given example (`example`) and the actual - output (`got`). `optionflags` is the set of option flags used - to compare `want` and `got`. - """ - want = example.want - # If s are being used, then replace blank lines - # with in the actual output string. - if not (optionflags & DONT_ACCEPT_BLANKLINE): - got = re.sub('(?m)^[ ]*(?=\n)', BLANKLINE_MARKER, got) - - # Check if we should use diff. - if self._do_a_fancy_diff(want, got, optionflags): - # Split want & got into lines. - want_lines = want.splitlines(True) # True == keep line ends - got_lines = got.splitlines(True) - # Use difflib to find their differences. - if optionflags & REPORT_UDIFF: - diff = difflib.unified_diff(want_lines, got_lines, n=2) - diff = list(diff)[2:] # strip the diff header - kind = 'unified diff with -expected +actual' - elif optionflags & REPORT_CDIFF: - diff = difflib.context_diff(want_lines, got_lines, n=2) - diff = list(diff)[2:] # strip the diff header - kind = 'context diff with expected followed by actual' - elif optionflags & REPORT_NDIFF: - engine = difflib.Differ(charjunk=difflib.IS_CHARACTER_JUNK) - diff = list(engine.compare(want_lines, got_lines)) - kind = 'ndiff with -expected +actual' - else: - assert 0, 'Bad diff option' - # Remove trailing whitespace on diff output. 
- diff = [line.rstrip() + '\n' for line in diff] - return 'Differences (%s):\n' % kind + _indent(''.join(diff)) - - # If we're not using diff, then simply list the expected - # output followed by the actual output. - if want and got: - return 'Expected:\n%sGot:\n%s' % (_indent(want), _indent(got)) - elif want: - return 'Expected:\n%sGot nothing\n' % _indent(want) - elif got: - return 'Expected nothing\nGot:\n%s' % _indent(got) - else: - return 'Expected nothing\nGot nothing\n' - -class DocTestFailure(Exception): - """A DocTest example has failed in debugging mode. - - The exception instance has variables: - - - test: the DocTest object being run - - - excample: the Example object that failed - - - got: the actual output - """ - def __init__(self, test, example, got): - self.test = test - self.example = example - self.got = got - - def __str__(self): - return str(self.test) - -class UnexpectedException(Exception): - """A DocTest example has encountered an unexpected exception - - The exception instance has variables: - - - test: the DocTest object being run - - - excample: the Example object that failed - - - exc_info: the exception info - """ - def __init__(self, test, example, exc_info): - self.test = test - self.example = example - self.exc_info = exc_info - - def __str__(self): - return str(self.test) - -class DebugRunner(DocTestRunner): - r"""Run doc tests but raise an exception as soon as there is a failure. - - If an unexpected exception occurs, an UnexpectedException is raised. - It contains the test, the example, and the original exception: - - >>> runner = DebugRunner(verbose=False) - >>> test = DocTestParser().get_doctest('>>> raise KeyError\n42', - ... {}, 'foo', 'foo.py', 0) - >>> try: - ... runner.run(test) - ... except UnexpectedException, failure: - ... pass - - >>> failure.test is test - True - - >>> failure.example.want - '42\n' - - >>> exc_info = failure.exc_info - >>> raise exc_info[0], exc_info[1], exc_info[2] - Traceback (most recent call last): - ... - KeyError - - We wrap the original exception to give the calling application - access to the test and example information. - - If the output doesn't match, then a DocTestFailure is raised: - - >>> test = DocTestParser().get_doctest(''' - ... >>> x = 1 - ... >>> x - ... 2 - ... ''', {}, 'foo', 'foo.py', 0) - - >>> try: - ... runner.run(test) - ... except DocTestFailure, failure: - ... pass - - DocTestFailure objects provide access to the test: - - >>> failure.test is test - True - - As well as to the example: - - >>> failure.example.want - '2\n' - - and the actual output: - - >>> failure.got - '1\n' - - If a failure or error occurs, the globals are left intact: - - >>> del test.globs['__builtins__'] - >>> test.globs - {'x': 1} - - >>> test = DocTestParser().get_doctest(''' - ... >>> x = 2 - ... >>> raise KeyError - ... ''', {}, 'foo', 'foo.py', 0) - - >>> runner.run(test) - Traceback (most recent call last): - ... - UnexpectedException: - - >>> del test.globs['__builtins__'] - >>> test.globs - {'x': 2} - - But the globals are cleared if there is no error: - - >>> test = DocTestParser().get_doctest(''' - ... >>> x = 2 - ... 
''', {}, 'foo', 'foo.py', 0) - - >>> runner.run(test) - (0, 1) - - >>> test.globs - {} - - """ - - def run(self, test, compileflags=None, out=None, clear_globs=True): - r = DocTestRunner.run(self, test, compileflags, out, False) - if clear_globs: - test.globs.clear() - return r - - def report_unexpected_exception(self, out, test, example, exc_info): - raise UnexpectedException(test, example, exc_info) - - def report_failure(self, out, test, example, got): - raise DocTestFailure(test, example, got) - -###################################################################### -## 6. Test Functions -###################################################################### -# These should be backwards compatible. - -# For backward compatibility, a global instance of a DocTestRunner -# class, updated by testmod. -master = None - -def testmod(m=None, name=None, globs=None, verbose=None, isprivate=None, - report=True, optionflags=0, extraglobs=None, - raise_on_error=False, exclude_empty=False): - """m=None, name=None, globs=None, verbose=None, isprivate=None, - report=True, optionflags=0, extraglobs=None, raise_on_error=False, - exclude_empty=False - - Test examples in docstrings in functions and classes reachable - from module m (or the current module if m is not supplied), starting - with m.__doc__. Unless isprivate is specified, private names - are not skipped. - - Also test examples reachable from dict m.__test__ if it exists and is - not None. m.__test__ maps names to functions, classes and strings; - function and class docstrings are tested even if the name is private; - strings are tested directly, as if they were docstrings. - - Return (#failures, #tests). - - See doctest.__doc__ for an overview. - - Optional keyword arg "name" gives the name of the module; by default - use m.__name__. - - Optional keyword arg "globs" gives a dict to be used as the globals - when executing examples; by default, use m.__dict__. A copy of this - dict is actually used for each docstring, so that each docstring's - examples start with a clean slate. - - Optional keyword arg "extraglobs" gives a dictionary that should be - merged into the globals that are used to execute examples. By - default, no extra globals are used. This is new in 2.4. - - Optional keyword arg "verbose" prints lots of stuff if true, prints - only failures if false; by default, it's true iff "-v" is in sys.argv. - - Optional keyword arg "report" prints a summary at the end when true, - else prints nothing at the end. In verbose mode, the summary is - detailed, else very brief (in fact, empty if all tests passed). - - Optional keyword arg "optionflags" or's together module constants, - and defaults to 0. This is new in 2.3. Possible values (see the - docs for details): - - DONT_ACCEPT_TRUE_FOR_1 - DONT_ACCEPT_BLANKLINE - NORMALIZE_WHITESPACE - ELLIPSIS - IGNORE_EXCEPTION_DETAIL - REPORT_UDIFF - REPORT_CDIFF - REPORT_NDIFF - REPORT_ONLY_FIRST_FAILURE - - Optional keyword arg "raise_on_error" raises an exception on the - first unexpected exception or failure. This allows failures to be - post-mortem debugged. - - Deprecated in Python 2.4: - Optional keyword arg "isprivate" specifies a function used to - determine whether a name is private. The default function is - treat all functions as public. Optionally, "isprivate" can be - set to doctest.is_private to skip over functions marked as private - using the underscore naming convention; see its docs for details. 
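As an aside to the keyword descriptions around this point, a hedged sketch (editorial, not part of the patch) of how this entry point is most commonly invoked from a module's own __main__ block:

    if __name__ == "__main__":
        import doctest
        failures, attempted = doctest.testmod(verbose=False,
                                              optionflags=doctest.ELLIPSIS)
        raise SystemExit(1 if failures else 0)
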
- - Advanced tomfoolery: testmod runs methods of a local instance of - class doctest.Tester, then merges the results into (or creates) - global Tester instance doctest.master. Methods of doctest.master - can be called directly too, if you want to do something unusual. - Passing report=0 to testmod is especially useful then, to delay - displaying a summary. Invoke doctest.master.summarize(verbose) - when you're done fiddling. - """ - global master - - if isprivate is not None: - warnings.warn("the isprivate argument is deprecated; " - "examine DocTestFinder.find() lists instead", - DeprecationWarning) - - # If no module was given, then use __main__. - if m is None: - # DWA - m will still be None if this wasn't invoked from the command - # line, in which case the following TypeError is about as good an error - # as we should expect - m = sys.modules.get('__main__') - - # Check that we were actually given a module. - if not inspect.ismodule(m): - raise TypeError("testmod: module required; %r" % (m,)) - - # If no name was given, then use the module's name. - if name is None: - name = m.__name__ - - # Find, parse, and run all tests in the given module. - finder = DocTestFinder(_namefilter=isprivate, exclude_empty=exclude_empty) - - if raise_on_error: - runner = DebugRunner(verbose=verbose, optionflags=optionflags) - else: - runner = DocTestRunner(verbose=verbose, optionflags=optionflags) - - for test in finder.find(m, name, globs=globs, extraglobs=extraglobs): - runner.run(test) - - if report: - runner.summarize() - - if master is None: - master = runner - else: - master.merge(runner) - - return runner.failures, runner.tries - -def testfile(filename, module_relative=True, name=None, package=None, - globs=None, verbose=None, report=True, optionflags=0, - extraglobs=None, raise_on_error=False, parser=DocTestParser()): - """ - Test examples in the given file. Return (#failures, #tests). - - Optional keyword arg "module_relative" specifies how filenames - should be interpreted: - - - If "module_relative" is True (the default), then "filename" - specifies a module-relative path. By default, this path is - relative to the calling module's directory; but if the - "package" argument is specified, then it is relative to that - package. To ensure os-independence, "filename" should use - "/" characters to separate path segments, and should not - be an absolute path (i.e., it may not begin with "/"). - - - If "module_relative" is False, then "filename" specifies an - os-specific path. The path may be absolute or relative (to - the current working directory). - - Optional keyword arg "name" gives the name of the test; by default - use the file's basename. - - Optional keyword argument "package" is a Python package or the - name of a Python package whose directory should be used as the - base directory for a module relative filename. If no package is - specified, then the calling module's directory is used as the base - directory for module relative filenames. It is an error to - specify "package" if "module_relative" is False. - - Optional keyword arg "globs" gives a dict to be used as the globals - when executing examples; by default, use {}. A copy of this dict - is actually used for each docstring, so that each docstring's - examples start with a clean slate. - - Optional keyword arg "extraglobs" gives a dictionary that should be - merged into the globals that are used to execute examples. By - default, no extra globals are used. 
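As a counterpart to the testmod sketch above, an illustrative call (editorial, not part of the patch) to the file-based entry point documented here; the file name example.txt is hypothetical:

    import doctest

    # Run the examples embedded in a text file next to the calling module.
    failures, attempted = doctest.testfile(
        "example.txt", optionflags=doctest.NORMALIZE_WHITESPACE)
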
- - Optional keyword arg "verbose" prints lots of stuff if true, prints - only failures if false; by default, it's true iff "-v" is in sys.argv. - - Optional keyword arg "report" prints a summary at the end when true, - else prints nothing at the end. In verbose mode, the summary is - detailed, else very brief (in fact, empty if all tests passed). - - Optional keyword arg "optionflags" or's together module constants, - and defaults to 0. Possible values (see the docs for details): - - DONT_ACCEPT_TRUE_FOR_1 - DONT_ACCEPT_BLANKLINE - NORMALIZE_WHITESPACE - ELLIPSIS - IGNORE_EXCEPTION_DETAIL - REPORT_UDIFF - REPORT_CDIFF - REPORT_NDIFF - REPORT_ONLY_FIRST_FAILURE - - Optional keyword arg "raise_on_error" raises an exception on the - first unexpected exception or failure. This allows failures to be - post-mortem debugged. - - Optional keyword arg "parser" specifies a DocTestParser (or - subclass) that should be used to extract tests from the files. - - Advanced tomfoolery: testmod runs methods of a local instance of - class doctest.Tester, then merges the results into (or creates) - global Tester instance doctest.master. Methods of doctest.master - can be called directly too, if you want to do something unusual. - Passing report=0 to testmod is especially useful then, to delay - displaying a summary. Invoke doctest.master.summarize(verbose) - when you're done fiddling. - """ - global master - - if package and not module_relative: - raise ValueError("Package may only be specified for module-" - "relative paths.") - - # Relativize the path - if module_relative: - package = _normalize_module(package) - filename = _module_relative_path(package, filename) - - # If no name was given, then use the file's name. - if name is None: - name = os.path.basename(filename) - - # Assemble the globals. - if globs is None: - globs = {} - else: - globs = globs.copy() - if extraglobs is not None: - globs.update(extraglobs) - - if raise_on_error: - runner = DebugRunner(verbose=verbose, optionflags=optionflags) - else: - runner = DocTestRunner(verbose=verbose, optionflags=optionflags) - - # Read the file, convert it to a test, and run it. - s = open(filename).read() - test = parser.get_doctest(s, globs, name, filename, 0) - runner.run(test) - - if report: - runner.summarize() - - if master is None: - master = runner - else: - master.merge(runner) - - return runner.failures, runner.tries - -def run_docstring_examples(f, globs, verbose=False, name="NoName", - compileflags=None, optionflags=0): - """ - Test examples in the given object's docstring (`f`), using `globs` - as globals. Optional argument `name` is used in failure messages. - If the optional argument `verbose` is true, then generate output - even if there are no failures. - - `compileflags` gives the set of flags that should be used by the - Python compiler when running the examples. If not specified, then - it will default to the set of future-import flags that apply to - `globs`. - - Optional keyword arg `optionflags` specifies options for the - testing and output. See the documentation for `testmod` for more - information. - """ - # Find, parse, and run all tests in the given module. - finder = DocTestFinder(verbose=verbose, recurse=False) - runner = DocTestRunner(verbose=verbose, optionflags=optionflags) - for test in finder.find(f, name, globs=globs): - runner.run(test, compileflags=compileflags) - -###################################################################### -## 7. 
Tester -###################################################################### -# This is provided only for backwards compatibility. It's not -# actually used in any way. - -class Tester: - def __init__(self, mod=None, globs=None, verbose=None, - isprivate=None, optionflags=0): - - warnings.warn("class Tester is deprecated; " - "use class doctest.DocTestRunner instead", - DeprecationWarning, stacklevel=2) - if mod is None and globs is None: - raise TypeError("Tester.__init__: must specify mod or globs") - if mod is not None and not inspect.ismodule(mod): - raise TypeError("Tester.__init__: mod must be a module; %r" % - (mod,)) - if globs is None: - globs = mod.__dict__ - self.globs = globs - - self.verbose = verbose - self.isprivate = isprivate - self.optionflags = optionflags - self.testfinder = DocTestFinder(_namefilter=isprivate) - self.testrunner = DocTestRunner(verbose=verbose, - optionflags=optionflags) - - def runstring(self, s, name): - test = DocTestParser().get_doctest(s, self.globs, name, None, None) - if self.verbose: - print "Running string", name - (f,t) = self.testrunner.run(test) - if self.verbose: - print f, "of", t, "examples failed in string", name - return (f,t) - - def rundoc(self, object, name=None, module=None): - f = t = 0 - tests = self.testfinder.find(object, name, module=module, - globs=self.globs) - for test in tests: - (f2, t2) = self.testrunner.run(test) - (f,t) = (f+f2, t+t2) - return (f,t) - - def rundict(self, d, name, module=None): - import new - m = new.module(name) - m.__dict__.update(d) - if module is None: - module = False - return self.rundoc(m, name, module) - - def run__test__(self, d, name): - import new - m = new.module(name) - m.__test__ = d - return self.rundoc(m, name) - - def summarize(self, verbose=None): - return self.testrunner.summarize(verbose) - - def merge(self, other): - self.testrunner.merge(other.testrunner) - -###################################################################### -## 8. Unittest Support -###################################################################### - -_unittest_reportflags = 0 - -def set_unittest_reportflags(flags): - """Sets the unittest option flags. - - The old flag is returned so that a runner could restore the old - value if it wished to: - - >>> old = _unittest_reportflags - >>> set_unittest_reportflags(REPORT_NDIFF | - ... REPORT_ONLY_FIRST_FAILURE) == old - True - - >>> import doctest - >>> doctest._unittest_reportflags == (REPORT_NDIFF | - ... REPORT_ONLY_FIRST_FAILURE) - True - - Only reporting flags can be set: - - >>> set_unittest_reportflags(ELLIPSIS) - Traceback (most recent call last): - ... - ValueError: ('Only reporting flags allowed', 8) - - >>> set_unittest_reportflags(old) == (REPORT_NDIFF | - ... 
REPORT_ONLY_FIRST_FAILURE) - True - """ - global _unittest_reportflags - - if (flags & REPORTING_FLAGS) != flags: - raise ValueError("Only reporting flags allowed", flags) - old = _unittest_reportflags - _unittest_reportflags = flags - return old - - -class DocTestCase(unittest.TestCase): - - def __init__(self, test, optionflags=0, setUp=None, tearDown=None, - checker=None): - - unittest.TestCase.__init__(self) - self._dt_optionflags = optionflags - self._dt_checker = checker - self._dt_test = test - self._dt_setUp = setUp - self._dt_tearDown = tearDown - - def setUp(self): - test = self._dt_test - - if self._dt_setUp is not None: - self._dt_setUp(test) - - def tearDown(self): - test = self._dt_test - - if self._dt_tearDown is not None: - self._dt_tearDown(test) - - test.globs.clear() - - def runTest(self): - test = self._dt_test - old = sys.stdout - new = StringIO() - optionflags = self._dt_optionflags - - if not (optionflags & REPORTING_FLAGS): - # The option flags don't include any reporting flags, - # so add the default reporting flags - optionflags |= _unittest_reportflags - - runner = DocTestRunner(optionflags=optionflags, - checker=self._dt_checker, verbose=False) - - try: - runner.DIVIDER = "-"*70 - failures, tries = runner.run( - test, out=new.write, clear_globs=False) - finally: - sys.stdout = old - - if failures: - raise self.failureException(self.format_failure(new.getvalue())) - - def format_failure(self, err): - test = self._dt_test - if test.lineno is None: - lineno = 'unknown line number' - else: - lineno = '%s' % test.lineno - lname = '.'.join(test.name.split('.')[-1:]) - return ('Failed doctest test for %s\n' - ' File "%s", line %s, in %s\n\n%s' - % (test.name, test.filename, lineno, lname, err) - ) - - def debug(self): - r"""Run the test case without results and without catching exceptions - - The unit test framework includes a debug method on test cases - and test suites to support post-mortem debugging. The test code - is run in such a way that errors are not caught. This way a - caller can catch the errors and initiate post-mortem debugging. - - The DocTestCase provides a debug method that raises - UnexpectedException errors if there is an unexepcted - exception: - - >>> test = DocTestParser().get_doctest('>>> raise KeyError\n42', - ... {}, 'foo', 'foo.py', 0) - >>> case = DocTestCase(test) - >>> try: - ... case.debug() - ... except UnexpectedException, failure: - ... pass - - The UnexpectedException contains the test, the example, and - the original exception: - - >>> failure.test is test - True - - >>> failure.example.want - '42\n' - - >>> exc_info = failure.exc_info - >>> raise exc_info[0], exc_info[1], exc_info[2] - Traceback (most recent call last): - ... - KeyError - - If the output doesn't match, then a DocTestFailure is raised: - - >>> test = DocTestParser().get_doctest(''' - ... >>> x = 1 - ... >>> x - ... 2 - ... ''', {}, 'foo', 'foo.py', 0) - >>> case = DocTestCase(test) - - >>> try: - ... case.debug() - ... except DocTestFailure, failure: - ... 
pass - - DocTestFailure objects provide access to the test: - - >>> failure.test is test - True - - As well as to the example: - - >>> failure.example.want - '2\n' - - and the actual output: - - >>> failure.got - '1\n' - - """ - - self.setUp() - runner = DebugRunner(optionflags=self._dt_optionflags, - checker=self._dt_checker, verbose=False) - runner.run(self._dt_test) - self.tearDown() - - def id(self): - return self._dt_test.name - - def __repr__(self): - name = self._dt_test.name.split('.') - return "%s (%s)" % (name[-1], '.'.join(name[:-1])) - - __str__ = __repr__ - - def shortDescription(self): - return "Doctest: " + self._dt_test.name - -def DocTestSuite(module=None, globs=None, extraglobs=None, test_finder=None, - **options): - """ - Convert doctest tests for a module to a unittest test suite. - - This converts each documentation string in a module that - contains doctest tests to a unittest test case. If any of the - tests in a doc string fail, then the test case fails. An exception - is raised showing the name of the file containing the test and a - (sometimes approximate) line number. - - The `module` argument provides the module to be tested. The argument - can be either a module or a module name. - - If no argument is given, the calling module is used. - - A number of options may be provided as keyword arguments: - - setUp - A set-up function. This is called before running the - tests in each file. The setUp function will be passed a DocTest - object. The setUp function can access the test globals as the - globs attribute of the test passed. - - tearDown - A tear-down function. This is called after running the - tests in each file. The tearDown function will be passed a DocTest - object. The tearDown function can access the test globals as the - globs attribute of the test passed. - - globs - A dictionary containing initial global variables for the tests. - - optionflags - A set of doctest option flags expressed as an integer. - """ - - if test_finder is None: - test_finder = DocTestFinder() - - module = _normalize_module(module) - tests = test_finder.find(module, globs=globs, extraglobs=extraglobs) - if globs is None: - globs = module.__dict__ - if not tests: - # Why do we want to do this? Because it reveals a bug that might - # otherwise be hidden. - raise ValueError(module, "has no tests") - - tests.sort() - suite = unittest.TestSuite() - for test in tests: - if len(test.examples) == 0: - continue - if not test.filename: - filename = module.__file__ - if filename[-4:] in (".pyc", ".pyo"): - filename = filename[:-1] - test.filename = filename - suite.addTest(DocTestCase(test, **options)) - - return suite - -class DocFileCase(DocTestCase): - - def id(self): - return '_'.join(self._dt_test.name.split('.')) - - def __repr__(self): - return self._dt_test.filename - __str__ = __repr__ - - def format_failure(self, err): - return ('Failed doctest test for %s\n File "%s", line 0\n\n%s' - % (self._dt_test.name, self._dt_test.filename, err) - ) - -def DocFileTest(path, module_relative=True, package=None, - globs=None, parser=DocTestParser(), **options): - if globs is None: - globs = {} - - if package and not module_relative: - raise ValueError("Package may only be specified for module-" - "relative paths.") - - # Relativize the path. - if module_relative: - package = _normalize_module(package) - path = _module_relative_path(package, path) - - # Find the file and read it. - name = os.path.basename(path) - doc = open(path).read() - - # Convert it to a test, and wrap it in a DocFileCase. 
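Stepping out of the patch for a moment: the unittest integration defined in this region is usually consumed through the load_tests protocol, along these lines (the package name mypkg.mymodule and the file usage.txt are hypothetical):

    import doctest

    import mypkg.mymodule  # hypothetical module whose docstrings hold doctests

    def load_tests(loader, tests, ignore):
        # Attach the module's doctests to the regular unittest discovery run.
        tests.addTests(doctest.DocTestSuite(mypkg.mymodule))
        tests.addTests(doctest.DocFileSuite("usage.txt"))
        return tests
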
- test = parser.get_doctest(doc, globs, name, path, 0) - return DocFileCase(test, **options) - -def DocFileSuite(*paths, **kw): - """A unittest suite for one or more doctest files. - - The path to each doctest file is given as a string; the - interpretation of that string depends on the keyword argument - "module_relative". - - A number of options may be provided as keyword arguments: - - module_relative - If "module_relative" is True, then the given file paths are - interpreted as os-independent module-relative paths. By - default, these paths are relative to the calling module's - directory; but if the "package" argument is specified, then - they are relative to that package. To ensure os-independence, - "filename" should use "/" characters to separate path - segments, and may not be an absolute path (i.e., it may not - begin with "/"). - - If "module_relative" is False, then the given file paths are - interpreted as os-specific paths. These paths may be absolute - or relative (to the current working directory). - - package - A Python package or the name of a Python package whose directory - should be used as the base directory for module relative paths. - If "package" is not specified, then the calling module's - directory is used as the base directory for module relative - filenames. It is an error to specify "package" if - "module_relative" is False. - - setUp - A set-up function. This is called before running the - tests in each file. The setUp function will be passed a DocTest - object. The setUp function can access the test globals as the - globs attribute of the test passed. - - tearDown - A tear-down function. This is called after running the - tests in each file. The tearDown function will be passed a DocTest - object. The tearDown function can access the test globals as the - globs attribute of the test passed. - - globs - A dictionary containing initial global variables for the tests. - - optionflags - A set of doctest option flags expressed as an integer. - - parser - A DocTestParser (or subclass) that should be used to extract - tests from the files. - """ - suite = unittest.TestSuite() - - # We do this here so that _normalize_module is called at the right - # level. If it were called in DocFileTest, then this function - # would be the caller and we might guess the package incorrectly. - if kw.get('module_relative', True): - kw['package'] = _normalize_module(kw.get('package')) - - for path in paths: - suite.addTest(DocFileTest(path, **kw)) - - return suite - -###################################################################### -## 9. Debugging Support -###################################################################### - -def script_from_examples(s): - r"""Extract script from text with examples. - - Converts text with examples to a Python script. Example input is - converted to regular code. Example output and all other words - are converted to comments: - - >>> text = ''' - ... Here are examples of simple math. - ... - ... Python has super accurate integer addition - ... - ... >>> 2 + 2 - ... 5 - ... - ... And very friendly error messages: - ... - ... >>> 1/0 - ... To Infinity - ... And - ... Beyond - ... - ... You can use logic if you want: - ... - ... >>> if 0: - ... ... blah - ... ... blah - ... ... - ... - ... Ho hum - ... ''' - - >>> print script_from_examples(text) - # Here are examples of simple math. 
- # - # Python has super accurate integer addition - # - 2 + 2 - # Expected: - ## 5 - # - # And very friendly error messages: - # - 1/0 - # Expected: - ## To Infinity - ## And - ## Beyond - # - # You can use logic if you want: - # - if 0: - blah - blah - # - # Ho hum - """ - output = [] - for piece in DocTestParser().parse(s): - if isinstance(piece, Example): - # Add the example's source code (strip trailing NL) - output.append(piece.source[:-1]) - # Add the expected output: - want = piece.want - if want: - output.append('# Expected:') - output += ['## '+l for l in want.split('\n')[:-1]] - else: - # Add non-example text. - output += [_comment_line(l) - for l in piece.split('\n')[:-1]] - - # Trim junk on both ends. - while output and output[-1] == '#': - output.pop() - while output and output[0] == '#': - output.pop(0) - # Combine the output, and return it. - return '\n'.join(output) - -def testsource(module, name): - """Extract the test sources from a doctest docstring as a script. - - Provide the module (or dotted name of the module) containing the - test to be debugged and the name (within the module) of the object - with the doc string with tests to be debugged. - """ - module = _normalize_module(module) - tests = DocTestFinder().find(module) - test = [t for t in tests if t.name == name] - if not test: - raise ValueError(name, "not found in tests") - test = test[0] - testsrc = script_from_examples(test.docstring) - return testsrc - -def debug_src(src, pm=False, globs=None): - """Debug a single doctest docstring, in argument `src`'""" - testsrc = script_from_examples(src) - debug_script(testsrc, pm, globs) - -def debug_script(src, pm=False, globs=None): - "Debug a test script. `src` is the script, as a string." - import pdb - - # Note that tempfile.NameTemporaryFile() cannot be used. As the - # docs say, a file so created cannot be opened by name a second time - # on modern Windows boxes, and execfile() needs to open it. - srcfilename = tempfile.mktemp(".py", "doctestdebug") - f = open(srcfilename, 'w') - f.write(src) - f.close() - - try: - if globs: - globs = globs.copy() - else: - globs = {} - - if pm: - try: - execfile(srcfilename, globs, globs) - except: - print sys.exc_info()[1] - pdb.post_mortem(sys.exc_info()[2]) - else: - # Note that %r is vital here. '%s' instead can, e.g., cause - # backslashes to get treated as metacharacters on Windows. - pdb.run("execfile(%r)" % srcfilename, globs, globs) - - finally: - os.remove(srcfilename) - -def debug(module, name, pm=False): - """Debug a single doctest docstring. - - Provide the module (or dotted name of the module) containing the - test to be debugged and the name (within the module) of the object - with the docstring with tests to be debugged. - """ - module = _normalize_module(module) - testsrc = testsource(module, name) - debug_script(testsrc, pm, module.__dict__) - -###################################################################### -## 10. Example Usage -###################################################################### -class _TestClass: - """ - A pointless class, for sanity-checking of docstring testing. - - Methods: - square() - get() - - >>> _TestClass(13).get() + _TestClass(-12).get() - 1 - >>> hex(_TestClass(13).square().get()) - '0xa9' - """ - - def __init__(self, val): - """val -> _TestClass object with associated value val. 
- - >>> t = _TestClass(123) - >>> print t.get() - 123 - """ - - self.val = val - - def square(self): - """square() -> square TestClass's associated value - - >>> _TestClass(13).square().get() - 169 - """ - - self.val = self.val ** 2 - return self - - def get(self): - """get() -> return TestClass's associated value. - - >>> x = _TestClass(-42) - >>> print x.get() - -42 - """ - - return self.val - -__test__ = {"_TestClass": _TestClass, - "string": r""" - Example of a string object, searched as-is. - >>> x = 1; y = 2 - >>> x + y, x * y - (3, 2) - """, - - "bool-int equivalence": r""" - In 2.2, boolean expressions displayed - 0 or 1. By default, we still accept - them. This can be disabled by passing - DONT_ACCEPT_TRUE_FOR_1 to the new - optionflags argument. - >>> 4 == 4 - 1 - >>> 4 == 4 - True - >>> 4 > 4 - 0 - >>> 4 > 4 - False - """, - - "blank lines": r""" - Blank lines can be marked with : - >>> print 'foo\n\nbar\n' - foo - - bar - - """, - - "ellipsis": r""" - If the ellipsis flag is used, then '...' can be used to - elide substrings in the desired output: - >>> print range(1000) #doctest: +ELLIPSIS - [0, 1, 2, ..., 999] - """, - - "whitespace normalization": r""" - If the whitespace normalization flag is used, then - differences in whitespace are ignored. - >>> print range(30) #doctest: +NORMALIZE_WHITESPACE - [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, - 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, - 27, 28, 29] - """, - } - -def _test(): - r = unittest.TextTestRunner() - r.run(DocTestSuite()) - -if __name__ == "__main__": - _test() - diff --git a/setuptools/tests/environment.py b/setuptools/tests/environment.py new file mode 100644 index 0000000..c67898c --- /dev/null +++ b/setuptools/tests/environment.py @@ -0,0 +1,60 @@ +import os +import sys +import unicodedata + +from subprocess import Popen as _Popen, PIPE as _PIPE + + +def _which_dirs(cmd): + result = set() + for path in os.environ.get('PATH', '').split(os.pathsep): + filename = os.path.join(path, cmd) + if os.access(filename, os.X_OK): + result.add(path) + return result + + +def run_setup_py(cmd, pypath=None, path=None, + data_stream=0, env=None): + """ + Execution command for tests, separate from those used by the + code directly to prevent accidental behavior issues + """ + if env is None: + env = dict() + for envname in os.environ: + env[envname] = os.environ[envname] + + # override the python path if needed + if pypath is not None: + env["PYTHONPATH"] = pypath + + # overide the execution path if needed + if path is not None: + env["PATH"] = path + if not env.get("PATH", ""): + env["PATH"] = _which_dirs("tar").union(_which_dirs("gzip")) + env["PATH"] = os.pathsep.join(env["PATH"]) + + cmd = [sys.executable, "setup.py"] + list(cmd) + + # http://bugs.python.org/issue8557 + shell = sys.platform == 'win32' + + try: + proc = _Popen( + cmd, stdout=_PIPE, stderr=_PIPE, shell=shell, env=env, + ) + + data = proc.communicate()[data_stream] + except OSError: + return 1, '' + + # decode the console string if needed + if hasattr(data, "decode"): + # use the default encoding + data = data.decode() + data = unicodedata.normalize('NFC', data) + + # communicate calls wait() + return proc.returncode, data diff --git a/setuptools/tests/files.py b/setuptools/tests/files.py new file mode 100644 index 0000000..4364241 --- /dev/null +++ b/setuptools/tests/files.py @@ -0,0 +1,32 @@ +import os + + +def build_files(file_defs, prefix=""): + """ + Build a set of files/directories, as described by the file_defs dictionary. 
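Stepping back briefly from the helper whose docstring continues below: the run_setup_py helper added in setuptools/tests/environment.py above is typically driven like this (the command and path values are illustrative only):

    from setuptools.tests.environment import run_setup_py  # helper defined above

    code, output = run_setup_py(
        ['--version'],            # extra args appended after "setup.py"
        pypath='/tmp/checkout',   # value used to override PYTHONPATH
        data_stream=0,            # 0 selects stdout, 1 selects stderr
    )
    assert code == 0, output
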
+ + Each key/value pair in the dictionary is interpreted as a filename/contents + pair. If the contents value is a dictionary, a directory is created, and the + dictionary interpreted as the files within it, recursively. + + For example: + + {"README.txt": "A README file", + "foo": { + "__init__.py": "", + "bar": { + "__init__.py": "", + }, + "baz.py": "# Some code", + } + } + """ + for name, contents in file_defs.items(): + full_name = os.path.join(prefix, name) + if isinstance(contents, dict): + if not os.path.exists(full_name): + os.makedirs(full_name) + build_files(contents, prefix=full_name) + else: + with open(full_name, 'w') as f: + f.write(contents) diff --git a/setuptools/tests/fixtures.py b/setuptools/tests/fixtures.py new file mode 100644 index 0000000..5204c8d --- /dev/null +++ b/setuptools/tests/fixtures.py @@ -0,0 +1,23 @@ +import pytest + +from . import contexts + + +@pytest.yield_fixture +def user_override(monkeypatch): + """ + Override site.USER_BASE and site.USER_SITE with temporary directories in + a context. + """ + with contexts.tempdir() as user_base: + monkeypatch.setattr('site.USER_BASE', user_base) + with contexts.tempdir() as user_site: + monkeypatch.setattr('site.USER_SITE', user_site) + with contexts.save_user_site_setting(): + yield + + +@pytest.yield_fixture +def tmpdir_cwd(tmpdir): + with tmpdir.as_cwd() as orig: + yield orig diff --git a/setuptools/tests/indexes/test_links_priority/external.html b/setuptools/tests/indexes/test_links_priority/external.html new file mode 100644 index 0000000..92e4702 --- /dev/null +++ b/setuptools/tests/indexes/test_links_priority/external.html @@ -0,0 +1,3 @@ + +bad old link + diff --git a/setuptools/tests/indexes/test_links_priority/simple/foobar/index.html b/setuptools/tests/indexes/test_links_priority/simple/foobar/index.html new file mode 100644 index 0000000..fefb028 --- /dev/null +++ b/setuptools/tests/indexes/test_links_priority/simple/foobar/index.html @@ -0,0 +1,4 @@ + +foobar-0.1.tar.gz
    +external homepage
    + diff --git a/setuptools/tests/mod_with_constant.py b/setuptools/tests/mod_with_constant.py new file mode 100644 index 0000000..ef755dd --- /dev/null +++ b/setuptools/tests/mod_with_constant.py @@ -0,0 +1 @@ +value = 'three, sir!' diff --git a/setuptools/tests/namespaces.py b/setuptools/tests/namespaces.py new file mode 100644 index 0000000..ef5ecda --- /dev/null +++ b/setuptools/tests/namespaces.py @@ -0,0 +1,42 @@ +from __future__ import absolute_import, unicode_literals + +import textwrap + + +def build_namespace_package(tmpdir, name): + src_dir = tmpdir / name + src_dir.mkdir() + setup_py = src_dir / 'setup.py' + namespace, sep, rest = name.partition('.') + script = textwrap.dedent(""" + import setuptools + setuptools.setup( + name={name!r}, + version="1.0", + namespace_packages=[{namespace!r}], + packages=[{namespace!r}], + ) + """).format(**locals()) + setup_py.write_text(script, encoding='utf-8') + ns_pkg_dir = src_dir / namespace + ns_pkg_dir.mkdir() + pkg_init = ns_pkg_dir / '__init__.py' + tmpl = '__import__("pkg_resources").declare_namespace({namespace!r})' + decl = tmpl.format(**locals()) + pkg_init.write_text(decl, encoding='utf-8') + pkg_mod = ns_pkg_dir / (rest + '.py') + some_functionality = 'name = {rest!r}'.format(**locals()) + pkg_mod.write_text(some_functionality, encoding='utf-8') + return src_dir + + +def make_site_dir(target): + """ + Add a sitecustomize.py module in target to cause + target to be added to site dirs such that .pth files + are processed there. + """ + sc = target / 'sitecustomize.py' + target_str = str(target) + tmpl = '__import__("site").addsitedir({target_str!r})' + sc.write_text(tmpl.format(**locals()), encoding='utf-8') diff --git a/setuptools/tests/py26compat.py b/setuptools/tests/py26compat.py new file mode 100644 index 0000000..18cece0 --- /dev/null +++ b/setuptools/tests/py26compat.py @@ -0,0 +1,16 @@ +import sys +import tarfile +import contextlib + + +def _tarfile_open_ex(*args, **kwargs): + """ + Extend result as a context manager. 
+ """ + return contextlib.closing(tarfile.open(*args, **kwargs)) + + +if sys.version_info[:2] < (2, 7) or (3, 0) <= sys.version_info[:2] < (3, 2): + tarfile_open = _tarfile_open_ex +else: + tarfile_open = tarfile.open diff --git a/setuptools/tests/script-with-bom.py b/setuptools/tests/script-with-bom.py new file mode 100644 index 0000000..22dee0d --- /dev/null +++ b/setuptools/tests/script-with-bom.py @@ -0,0 +1,3 @@ +# -*- coding: utf-8 -*- + +result = 'passed' diff --git a/setuptools/tests/server.py b/setuptools/tests/server.py new file mode 100644 index 0000000..5cdde21 --- /dev/null +++ b/setuptools/tests/server.py @@ -0,0 +1,72 @@ +"""Basic http server for tests to simulate PyPI or custom indexes +""" + +import time +import threading + +from six.moves import BaseHTTPServer, SimpleHTTPServer + + +class IndexServer(BaseHTTPServer.HTTPServer): + """Basic single-threaded http server simulating a package index + + You can use this server in unittest like this:: + s = IndexServer() + s.start() + index_url = s.base_url() + 'mytestindex' + # do some test requests to the index + # The index files should be located in setuptools/tests/indexes + s.stop() + """ + + def __init__(self, server_address=('', 0), + RequestHandlerClass=SimpleHTTPServer.SimpleHTTPRequestHandler): + BaseHTTPServer.HTTPServer.__init__(self, server_address, + RequestHandlerClass) + self._run = True + + def start(self): + self.thread = threading.Thread(target=self.serve_forever) + self.thread.start() + + def stop(self): + "Stop the server" + + # Let the server finish the last request and wait for a new one. + time.sleep(0.1) + + self.shutdown() + self.thread.join() + self.socket.close() + + def base_url(self): + port = self.server_port + return 'http://127.0.0.1:%s/setuptools/tests/indexes/' % port + + +class RequestRecorder(BaseHTTPServer.BaseHTTPRequestHandler): + def do_GET(self): + requests = vars(self.server).setdefault('requests', []) + requests.append(self) + self.send_response(200, 'OK') + + +class MockServer(BaseHTTPServer.HTTPServer, threading.Thread): + """ + A simple HTTP Server that records the requests made to it. + """ + + def __init__(self, server_address=('', 0), + RequestHandlerClass=RequestRecorder): + BaseHTTPServer.HTTPServer.__init__(self, server_address, + RequestHandlerClass) + threading.Thread.__init__(self) + self.setDaemon(True) + self.requests = [] + + def run(self): + self.serve_forever() + + @property + def url(self): + return 'http://localhost:%(server_port)s/' % vars(self) diff --git a/setuptools/tests/test_archive_util.py b/setuptools/tests/test_archive_util.py new file mode 100644 index 0000000..5cdf63f --- /dev/null +++ b/setuptools/tests/test_archive_util.py @@ -0,0 +1,42 @@ +# coding: utf-8 + +import tarfile +import io + +import six + +import pytest + +from setuptools import archive_util + + +@pytest.fixture +def tarfile_with_unicode(tmpdir): + """ + Create a tarfile containing only a file whose name is + a zero byte file called testimäge.png. 
+ """ + tarobj = io.BytesIO() + + with tarfile.open(fileobj=tarobj, mode="w:gz") as tgz: + data = b"" + + filename = "testimäge.png" + if six.PY2: + filename = filename.decode('utf-8') + + t = tarfile.TarInfo(filename) + t.size = len(data) + + tgz.addfile(t, io.BytesIO(data)) + + target = tmpdir / 'unicode-pkg-1.0.tar.gz' + with open(str(target), mode='wb') as tf: + tf.write(tarobj.getvalue()) + return str(target) + + +@pytest.mark.xfail(reason="#710 and #712") +def test_unicode_files(tarfile_with_unicode, tmpdir): + target = tmpdir / 'out' + archive_util.unpack_archive(tarfile_with_unicode, six.text_type(target)) diff --git a/setuptools/tests/test_bdist_egg.py b/setuptools/tests/test_bdist_egg.py new file mode 100644 index 0000000..d24aa36 --- /dev/null +++ b/setuptools/tests/test_bdist_egg.py @@ -0,0 +1,44 @@ +"""develop tests +""" +import os +import re + +import pytest + +from setuptools.dist import Distribution + +from . import contexts + +SETUP_PY = """\ +from setuptools import setup + +setup(name='foo', py_modules=['hi']) +""" + + +@pytest.yield_fixture +def setup_context(tmpdir): + with (tmpdir / 'setup.py').open('w') as f: + f.write(SETUP_PY) + with (tmpdir / 'hi.py').open('w') as f: + f.write('1\n') + with tmpdir.as_cwd(): + yield tmpdir + + +class Test: + def test_bdist_egg(self, setup_context, user_override): + dist = Distribution(dict( + script_name='setup.py', + script_args=['bdist_egg'], + name='foo', + py_modules=['hi'] + )) + os.makedirs(os.path.join('build', 'src')) + with contexts.quiet(): + dist.parse_command_line() + dist.run_commands() + + # let's see if we got our egg link at the right place + [content] = os.listdir('dist') + assert re.match(r'foo-0.0.0-py[23].\d.egg$', content) diff --git a/setuptools/tests/test_build_clib.py b/setuptools/tests/test_build_clib.py new file mode 100644 index 0000000..7e3d1de --- /dev/null +++ b/setuptools/tests/test_build_clib.py @@ -0,0 +1,59 @@ +import pytest +import os +import shutil + +from unittest import mock +from distutils.errors import DistutilsSetupError +from setuptools.command.build_clib import build_clib +from setuptools.dist import Distribution + + +class TestBuildCLib: + @mock.patch( + 'setuptools.command.build_clib.newer_pairwise_group' + ) + def test_build_libraries(self, mock_newer): + dist = Distribution() + cmd = build_clib(dist) + + # this will be a long section, just making sure all + # exceptions are properly raised + libs = [('example', {'sources': 'broken.c'})] + with pytest.raises(DistutilsSetupError): + cmd.build_libraries(libs) + + obj_deps = 'some_string' + libs = [('example', {'sources': ['source.c'], 'obj_deps': obj_deps})] + with pytest.raises(DistutilsSetupError): + cmd.build_libraries(libs) + + obj_deps = {'': ''} + libs = [('example', {'sources': ['source.c'], 'obj_deps': obj_deps})] + with pytest.raises(DistutilsSetupError): + cmd.build_libraries(libs) + + obj_deps = {'source.c': ''} + libs = [('example', {'sources': ['source.c'], 'obj_deps': obj_deps})] + with pytest.raises(DistutilsSetupError): + cmd.build_libraries(libs) + + # with that out of the way, let's see if the crude dependency + # system works + cmd.compiler = mock.MagicMock(spec=cmd.compiler) + mock_newer.return_value = ([],[]) + + obj_deps = {'': ('global.h',), 'example.c': ('example.h',)} + libs = [('example', {'sources': ['example.c'] ,'obj_deps': obj_deps})] + + cmd.build_libraries(libs) + assert [['example.c', 'global.h', 'example.h']] in mock_newer.call_args[0] + assert not cmd.compiler.compile.called + assert 
cmd.compiler.create_static_lib.call_count == 1 + + # reset the call numbers so we can test again + cmd.compiler.reset_mock() + + mock_newer.return_value = '' # anything as long as it's not ([],[]) + cmd.build_libraries(libs) + assert cmd.compiler.compile.call_count == 1 + assert cmd.compiler.create_static_lib.call_count == 1 diff --git a/setuptools/tests/test_build_ext.py b/setuptools/tests/test_build_ext.py new file mode 100644 index 0000000..59a896d --- /dev/null +++ b/setuptools/tests/test_build_ext.py @@ -0,0 +1,45 @@ +import sys +import distutils.command.build_ext as orig +from distutils.sysconfig import get_config_var + +import six + +from setuptools.command.build_ext import build_ext, get_abi3_suffix +from setuptools.dist import Distribution +from setuptools.extension import Extension + + +class TestBuildExt: + def test_get_ext_filename(self): + """ + Setuptools needs to give back the same + result as distutils, even if the fullname + is not in ext_map. + """ + dist = Distribution() + cmd = build_ext(dist) + cmd.ext_map['foo/bar'] = '' + res = cmd.get_ext_filename('foo') + wanted = orig.build_ext.get_ext_filename(cmd, 'foo') + assert res == wanted + + def test_abi3_filename(self): + """ + Filename needs to be loadable by several versions + of Python 3 if 'is_abi3' is truthy on Extension() + """ + print(get_abi3_suffix()) + + extension = Extension('spam.eggs', ['eggs.c'], py_limited_api=True) + dist = Distribution(dict(ext_modules=[extension])) + cmd = build_ext(dist) + cmd.finalize_options() + assert 'spam.eggs' in cmd.ext_map + res = cmd.get_ext_filename('spam.eggs') + + if six.PY2 or not get_abi3_suffix(): + assert res.endswith(get_config_var('SO')) + elif sys.platform == 'win32': + assert res.endswith('eggs.pyd') + else: + assert 'abi3' in res diff --git a/setuptools/tests/test_build_py.py b/setuptools/tests/test_build_py.py new file mode 100644 index 0000000..cc701ae --- /dev/null +++ b/setuptools/tests/test_build_py.py @@ -0,0 +1,30 @@ +import os + +import pytest + +from setuptools.dist import Distribution + + +@pytest.yield_fixture +def tmpdir_as_cwd(tmpdir): + with tmpdir.as_cwd(): + yield tmpdir + + +def test_directories_in_package_data_glob(tmpdir_as_cwd): + """ + Directories matching the glob in package_data should + not be included in the package data. + + Regression test for #261. + """ + dist = Distribution(dict( + script_name='setup.py', + script_args=['build_py'], + packages=[''], + name='foo', + package_data={'': ['path/*']}, + )) + os.makedirs('path/subpath') + dist.parse_command_line() + dist.run_commands() diff --git a/setuptools/tests/test_config.py b/setuptools/tests/test_config.py new file mode 100644 index 0000000..799fb16 --- /dev/null +++ b/setuptools/tests/test_config.py @@ -0,0 +1,542 @@ +import contextlib +import pytest +from distutils.errors import DistutilsOptionError, DistutilsFileError +from setuptools.dist import Distribution +from setuptools.config import ConfigHandler, read_configuration + + +class ErrConfigHandler(ConfigHandler): + """Erroneous handler. 
Fails to implement required methods.""" + + +def make_package_dir(name, base_dir): + dir_package = base_dir.mkdir(name) + init_file = dir_package.join('__init__.py') + init_file.write('') + return dir_package, init_file + + +def fake_env(tmpdir, setup_cfg, setup_py=None): + + if setup_py is None: + setup_py = ( + 'from setuptools import setup\n' + 'setup()\n' + ) + + tmpdir.join('setup.py').write(setup_py) + config = tmpdir.join('setup.cfg') + config.write(setup_cfg) + + package_dir, init_file = make_package_dir('fake_package', tmpdir) + + init_file.write( + 'VERSION = (1, 2, 3)\n' + '\n' + 'VERSION_MAJOR = 1' + '\n' + 'def get_version():\n' + ' return [3, 4, 5, "dev"]\n' + '\n' + ) + return package_dir, config + + +@contextlib.contextmanager +def get_dist(tmpdir, kwargs_initial=None, parse=True): + kwargs_initial = kwargs_initial or {} + + with tmpdir.as_cwd(): + dist = Distribution(kwargs_initial) + dist.script_name = 'setup.py' + parse and dist.parse_config_files() + + yield dist + + +def test_parsers_implemented(): + + with pytest.raises(NotImplementedError): + handler = ErrConfigHandler(None, {}) + handler.parsers + + +class TestConfigurationReader: + + def test_basic(self, tmpdir): + _, config = fake_env( + tmpdir, + '[metadata]\n' + 'version = 10.1.1\n' + 'keywords = one, two\n' + '\n' + '[options]\n' + 'scripts = bin/a.py, bin/b.py\n' + ) + config_dict = read_configuration('%s' % config) + assert config_dict['metadata']['version'] == '10.1.1' + assert config_dict['metadata']['keywords'] == ['one', 'two'] + assert config_dict['options']['scripts'] == ['bin/a.py', 'bin/b.py'] + + def test_no_config(self, tmpdir): + with pytest.raises(DistutilsFileError): + read_configuration('%s' % tmpdir.join('setup.cfg')) + + def test_ignore_errors(self, tmpdir): + _, config = fake_env( + tmpdir, + '[metadata]\n' + 'version = attr: none.VERSION\n' + 'keywords = one, two\n' + ) + with pytest.raises(ImportError): + read_configuration('%s' % config) + + config_dict = read_configuration( + '%s' % config, ignore_option_errors=True) + + assert config_dict['metadata']['keywords'] == ['one', 'two'] + assert 'version' not in config_dict['metadata'] + + config.remove() + + +class TestMetadata: + + def test_basic(self, tmpdir): + + fake_env( + tmpdir, + '[metadata]\n' + 'version = 10.1.1\n' + 'description = Some description\n' + 'long_description = file: README\n' + 'name = fake_name\n' + 'keywords = one, two\n' + 'provides = package, package.sub\n' + 'license = otherlic\n' + 'download_url = http://test.test.com/test/\n' + 'maintainer_email = test@test.com\n' + ) + + tmpdir.join('README').write('readme contents\nline2') + + meta_initial = { + # This will be used so `otherlic` won't replace it. 
+ 'license': 'BSD 3-Clause License', + } + + with get_dist(tmpdir, meta_initial) as dist: + metadata = dist.metadata + + assert metadata.version == '10.1.1' + assert metadata.description == 'Some description' + assert metadata.long_description == 'readme contents\nline2' + assert metadata.provides == ['package', 'package.sub'] + assert metadata.license == 'BSD 3-Clause License' + assert metadata.name == 'fake_name' + assert metadata.keywords == ['one', 'two'] + assert metadata.download_url == 'http://test.test.com/test/' + assert metadata.maintainer_email == 'test@test.com' + + def test_file_sandboxed(self, tmpdir): + + fake_env( + tmpdir, + '[metadata]\n' + 'long_description = file: ../../README\n' + ) + + with get_dist(tmpdir, parse=False) as dist: + with pytest.raises(DistutilsOptionError): + dist.parse_config_files() # file: out of sandbox + + def test_aliases(self, tmpdir): + + fake_env( + tmpdir, + '[metadata]\n' + 'author-email = test@test.com\n' + 'home-page = http://test.test.com/test/\n' + 'summary = Short summary\n' + 'platform = a, b\n' + 'classifier =\n' + ' Framework :: Django\n' + ' Programming Language :: Python :: 3.5\n' + ) + + with get_dist(tmpdir) as dist: + metadata = dist.metadata + assert metadata.author_email == 'test@test.com' + assert metadata.url == 'http://test.test.com/test/' + assert metadata.description == 'Short summary' + assert metadata.platforms == ['a', 'b'] + assert metadata.classifiers == [ + 'Framework :: Django', + 'Programming Language :: Python :: 3.5', + ] + + def test_multiline(self, tmpdir): + + fake_env( + tmpdir, + '[metadata]\n' + 'name = fake_name\n' + 'keywords =\n' + ' one\n' + ' two\n' + 'classifiers =\n' + ' Framework :: Django\n' + ' Programming Language :: Python :: 3.5\n' + ) + with get_dist(tmpdir) as dist: + metadata = dist.metadata + assert metadata.keywords == ['one', 'two'] + assert metadata.classifiers == [ + 'Framework :: Django', + 'Programming Language :: Python :: 3.5', + ] + + def test_version(self, tmpdir): + + _, config = fake_env( + tmpdir, + '[metadata]\n' + 'version = attr: fake_package.VERSION\n' + ) + with get_dist(tmpdir) as dist: + assert dist.metadata.version == '1.2.3' + + config.write( + '[metadata]\n' + 'version = attr: fake_package.get_version\n' + ) + with get_dist(tmpdir) as dist: + assert dist.metadata.version == '3.4.5.dev' + + config.write( + '[metadata]\n' + 'version = attr: fake_package.VERSION_MAJOR\n' + ) + with get_dist(tmpdir) as dist: + assert dist.metadata.version == '1' + + subpack = tmpdir.join('fake_package').mkdir('subpackage') + subpack.join('__init__.py').write('') + subpack.join('submodule.py').write('VERSION = (2016, 11, 26)') + + config.write( + '[metadata]\n' + 'version = attr: fake_package.subpackage.submodule.VERSION\n' + ) + with get_dist(tmpdir) as dist: + assert dist.metadata.version == '2016.11.26' + + def test_unknown_meta_item(self, tmpdir): + + fake_env( + tmpdir, + '[metadata]\n' + 'name = fake_name\n' + 'unknown = some\n' + ) + with get_dist(tmpdir, parse=False) as dist: + dist.parse_config_files() # Skip unknown. + + def test_usupported_section(self, tmpdir): + + fake_env( + tmpdir, + '[metadata.some]\n' + 'key = val\n' + ) + with get_dist(tmpdir, parse=False) as dist: + with pytest.raises(DistutilsOptionError): + dist.parse_config_files() + + def test_classifiers(self, tmpdir): + expected = set([ + 'Framework :: Django', + 'Programming Language :: Python :: 3', + 'Programming Language :: Python :: 3.5', + ]) + + # From file. 
+ _, config = fake_env( + tmpdir, + '[metadata]\n' + 'classifiers = file: classifiers\n' + ) + + tmpdir.join('classifiers').write( + 'Framework :: Django\n' + 'Programming Language :: Python :: 3\n' + 'Programming Language :: Python :: 3.5\n' + ) + + with get_dist(tmpdir) as dist: + assert set(dist.metadata.classifiers) == expected + + # From list notation + config.write( + '[metadata]\n' + 'classifiers =\n' + ' Framework :: Django\n' + ' Programming Language :: Python :: 3\n' + ' Programming Language :: Python :: 3.5\n' + ) + with get_dist(tmpdir) as dist: + assert set(dist.metadata.classifiers) == expected + + +class TestOptions: + + def test_basic(self, tmpdir): + + fake_env( + tmpdir, + '[options]\n' + 'zip_safe = True\n' + 'use_2to3 = 1\n' + 'include_package_data = yes\n' + 'package_dir = b=c, =src\n' + 'packages = pack_a, pack_b.subpack\n' + 'namespace_packages = pack1, pack2\n' + 'use_2to3_fixers = your.fixers, or.here\n' + 'use_2to3_exclude_fixers = one.here, two.there\n' + 'convert_2to3_doctests = src/tests/one.txt, src/two.txt\n' + 'scripts = bin/one.py, bin/two.py\n' + 'eager_resources = bin/one.py, bin/two.py\n' + 'install_requires = docutils>=0.3; pack ==1.1, ==1.3; hey\n' + 'tests_require = mock==0.7.2; pytest\n' + 'setup_requires = docutils>=0.3; spack ==1.1, ==1.3; there\n' + 'dependency_links = http://some.com/here/1, ' + 'http://some.com/there/2\n' + ) + with get_dist(tmpdir) as dist: + assert dist.zip_safe + assert dist.use_2to3 + assert dist.include_package_data + assert dist.package_dir == {'': 'src', 'b': 'c'} + assert dist.packages == ['pack_a', 'pack_b.subpack'] + assert dist.namespace_packages == ['pack1', 'pack2'] + assert dist.use_2to3_fixers == ['your.fixers', 'or.here'] + assert dist.use_2to3_exclude_fixers == ['one.here', 'two.there'] + assert dist.convert_2to3_doctests == ([ + 'src/tests/one.txt', 'src/two.txt']) + assert dist.scripts == ['bin/one.py', 'bin/two.py'] + assert dist.dependency_links == ([ + 'http://some.com/here/1', + 'http://some.com/there/2' + ]) + assert dist.install_requires == ([ + 'docutils>=0.3', + 'pack ==1.1, ==1.3', + 'hey' + ]) + assert dist.setup_requires == ([ + 'docutils>=0.3', + 'spack ==1.1, ==1.3', + 'there' + ]) + assert dist.tests_require == ['mock==0.7.2', 'pytest'] + + def test_multiline(self, tmpdir): + fake_env( + tmpdir, + '[options]\n' + 'package_dir = \n' + ' b=c\n' + ' =src\n' + 'packages = \n' + ' pack_a\n' + ' pack_b.subpack\n' + 'namespace_packages = \n' + ' pack1\n' + ' pack2\n' + 'use_2to3_fixers = \n' + ' your.fixers\n' + ' or.here\n' + 'use_2to3_exclude_fixers = \n' + ' one.here\n' + ' two.there\n' + 'convert_2to3_doctests = \n' + ' src/tests/one.txt\n' + ' src/two.txt\n' + 'scripts = \n' + ' bin/one.py\n' + ' bin/two.py\n' + 'eager_resources = \n' + ' bin/one.py\n' + ' bin/two.py\n' + 'install_requires = \n' + ' docutils>=0.3\n' + ' pack ==1.1, ==1.3\n' + ' hey\n' + 'tests_require = \n' + ' mock==0.7.2\n' + ' pytest\n' + 'setup_requires = \n' + ' docutils>=0.3\n' + ' spack ==1.1, ==1.3\n' + ' there\n' + 'dependency_links = \n' + ' http://some.com/here/1\n' + ' http://some.com/there/2\n' + ) + with get_dist(tmpdir) as dist: + assert dist.package_dir == {'': 'src', 'b': 'c'} + assert dist.packages == ['pack_a', 'pack_b.subpack'] + assert dist.namespace_packages == ['pack1', 'pack2'] + assert dist.use_2to3_fixers == ['your.fixers', 'or.here'] + assert dist.use_2to3_exclude_fixers == ['one.here', 'two.there'] + assert dist.convert_2to3_doctests == ( + ['src/tests/one.txt', 'src/two.txt']) + assert dist.scripts == 
['bin/one.py', 'bin/two.py'] + assert dist.dependency_links == ([ + 'http://some.com/here/1', + 'http://some.com/there/2' + ]) + assert dist.install_requires == ([ + 'docutils>=0.3', + 'pack ==1.1, ==1.3', + 'hey' + ]) + assert dist.setup_requires == ([ + 'docutils>=0.3', + 'spack ==1.1, ==1.3', + 'there' + ]) + assert dist.tests_require == ['mock==0.7.2', 'pytest'] + + def test_package_dir_fail(self, tmpdir): + fake_env( + tmpdir, + '[options]\n' + 'package_dir = a b\n' + ) + with get_dist(tmpdir, parse=False) as dist: + with pytest.raises(DistutilsOptionError): + dist.parse_config_files() + + def test_package_data(self, tmpdir): + fake_env( + tmpdir, + '[options.package_data]\n' + '* = *.txt, *.rst\n' + 'hello = *.msg\n' + '\n' + '[options.exclude_package_data]\n' + '* = fake1.txt, fake2.txt\n' + 'hello = *.dat\n' + ) + + with get_dist(tmpdir) as dist: + assert dist.package_data == { + '': ['*.txt', '*.rst'], + 'hello': ['*.msg'], + } + assert dist.exclude_package_data == { + '': ['fake1.txt', 'fake2.txt'], + 'hello': ['*.dat'], + } + + def test_packages(self, tmpdir): + fake_env( + tmpdir, + '[options]\n' + 'packages = find:\n' + ) + + with get_dist(tmpdir) as dist: + assert dist.packages == ['fake_package'] + + def test_find_directive(self, tmpdir): + dir_package, config = fake_env( + tmpdir, + '[options]\n' + 'packages = find:\n' + ) + + dir_sub_one, _ = make_package_dir('sub_one', dir_package) + dir_sub_two, _ = make_package_dir('sub_two', dir_package) + + with get_dist(tmpdir) as dist: + assert set(dist.packages) == set([ + 'fake_package', 'fake_package.sub_two', 'fake_package.sub_one' + ]) + + config.write( + '[options]\n' + 'packages = find:\n' + '\n' + '[options.packages.find]\n' + 'where = .\n' + 'include =\n' + ' fake_package.sub_one\n' + ' two\n' + ) + with get_dist(tmpdir) as dist: + assert dist.packages == ['fake_package.sub_one'] + + config.write( + '[options]\n' + 'packages = find:\n' + '\n' + '[options.packages.find]\n' + 'exclude =\n' + ' fake_package.sub_one\n' + ) + with get_dist(tmpdir) as dist: + assert set(dist.packages) == set( + ['fake_package', 'fake_package.sub_two']) + + def test_extras_require(self, tmpdir): + fake_env( + tmpdir, + '[options.extras_require]\n' + 'pdf = ReportLab>=1.2; RXP\n' + 'rest = \n' + ' docutils>=0.3\n' + ' pack ==1.1, ==1.3\n' + ) + + with get_dist(tmpdir) as dist: + assert dist.extras_require == { + 'pdf': ['ReportLab>=1.2', 'RXP'], + 'rest': ['docutils>=0.3', 'pack ==1.1, ==1.3'] + } + + def test_entry_points(self, tmpdir): + _, config = fake_env( + tmpdir, + '[options.entry_points]\n' + 'group1 = point1 = pack.module:func, ' + '.point2 = pack.module2:func_rest [rest]\n' + 'group2 = point3 = pack.module:func2\n' + ) + + with get_dist(tmpdir) as dist: + assert dist.entry_points == { + 'group1': [ + 'point1 = pack.module:func', + '.point2 = pack.module2:func_rest [rest]', + ], + 'group2': ['point3 = pack.module:func2'] + } + + expected = ( + '[blogtool.parsers]\n' + '.rst = some.nested.module:SomeClass.some_classmethod[reST]\n' + ) + + tmpdir.join('entry_points').write(expected) + + # From file. 
+ config.write( + '[options]\n' + 'entry_points = file: entry_points\n' + ) + + with get_dist(tmpdir) as dist: + assert dist.entry_points == expected diff --git a/setuptools/tests/test_dep_util.py b/setuptools/tests/test_dep_util.py new file mode 100644 index 0000000..e5027c1 --- /dev/null +++ b/setuptools/tests/test_dep_util.py @@ -0,0 +1,30 @@ +from setuptools.dep_util import newer_pairwise_group +import os +import pytest + + +@pytest.fixture +def groups_target(tmpdir): + """Sets up some older sources, a target and newer sources. + Returns a 3-tuple in this order. + """ + creation_order = ['older.c', 'older.h', 'target.o', 'newer.c', 'newer.h'] + mtime = 0 + + for i in range(len(creation_order)): + creation_order[i] = os.path.join(str(tmpdir), creation_order[i]) + with open(creation_order[i], 'w'): + pass + + # make sure modification times are sequential + os.utime(creation_order[i], (mtime, mtime)) + mtime += 1 + + return creation_order[:2], creation_order[2], creation_order[3:] + + +def test_newer_pairwise_group(groups_target): + older = newer_pairwise_group([groups_target[0]], [groups_target[1]]) + newer = newer_pairwise_group([groups_target[2]], [groups_target[1]]) + assert older == ([], []) + assert newer == ([groups_target[2]], [groups_target[1]]) diff --git a/setuptools/tests/test_depends.py b/setuptools/tests/test_depends.py new file mode 100644 index 0000000..e0cfa88 --- /dev/null +++ b/setuptools/tests/test_depends.py @@ -0,0 +1,16 @@ +import sys + +from setuptools import depends + + +class TestGetModuleConstant: + + def test_basic(self): + """ + Invoke get_module_constant on a module in + the test package. + """ + mod_name = 'setuptools.tests.mod_with_constant' + val = depends.get_module_constant(mod_name, 'value') + assert val == 'three, sir!' + assert 'setuptools.tests.mod_with_constant' not in sys.modules diff --git a/setuptools/tests/test_develop.py b/setuptools/tests/test_develop.py new file mode 100644 index 0000000..54e199c --- /dev/null +++ b/setuptools/tests/test_develop.py @@ -0,0 +1,191 @@ +"""develop tests +""" + +from __future__ import absolute_import, unicode_literals + +import os +import site +import sys +import io +import subprocess + +import six +from setuptools.command import test + +import pytest + +from setuptools.command.develop import develop +from setuptools.dist import Distribution +from . import contexts +from . 
import namespaces + +SETUP_PY = """\ +from setuptools import setup + +setup(name='foo', + packages=['foo'], + use_2to3=True, +) +""" + +INIT_PY = """print "foo" +""" + + +@pytest.yield_fixture +def temp_user(monkeypatch): + with contexts.tempdir() as user_base: + with contexts.tempdir() as user_site: + monkeypatch.setattr('site.USER_BASE', user_base) + monkeypatch.setattr('site.USER_SITE', user_site) + yield + + +@pytest.yield_fixture +def test_env(tmpdir, temp_user): + target = tmpdir + foo = target.mkdir('foo') + setup = target / 'setup.py' + if setup.isfile(): + raise ValueError(dir(target)) + with setup.open('w') as f: + f.write(SETUP_PY) + init = foo / '__init__.py' + with init.open('w') as f: + f.write(INIT_PY) + with target.as_cwd(): + yield target + + +class TestDevelop: + in_virtualenv = hasattr(sys, 'real_prefix') + in_venv = hasattr(sys, 'base_prefix') and sys.base_prefix != sys.prefix + + @pytest.mark.skipif(in_virtualenv or in_venv, + reason="Cannot run when invoked in a virtualenv or venv") + def test_2to3_user_mode(self, test_env): + settings = dict( + name='foo', + packages=['foo'], + use_2to3=True, + version='0.0', + ) + dist = Distribution(settings) + dist.script_name = 'setup.py' + cmd = develop(dist) + cmd.user = 1 + cmd.ensure_finalized() + cmd.install_dir = site.USER_SITE + cmd.user = 1 + with contexts.quiet(): + cmd.run() + + # let's see if we got our egg link at the right place + content = os.listdir(site.USER_SITE) + content.sort() + assert content == ['easy-install.pth', 'foo.egg-link'] + + # Check that we are using the right code. + fn = os.path.join(site.USER_SITE, 'foo.egg-link') + with io.open(fn) as egg_link_file: + path = egg_link_file.read().split()[0].strip() + fn = os.path.join(path, 'foo', '__init__.py') + with io.open(fn) as init_file: + init = init_file.read().strip() + + expected = 'print("foo")' if six.PY3 else 'print "foo"' + assert init == expected + + def test_console_scripts(self, tmpdir): + """ + Test that console scripts are installed and that they reference + only the project by name and not the current version. + """ + pytest.skip("TODO: needs a fixture to cause 'develop' " + "to be invoked without mutating environment.") + settings = dict( + name='foo', + packages=['foo'], + version='0.0', + entry_points={ + 'console_scripts': [ + 'foocmd = foo:foo', + ], + }, + ) + dist = Distribution(settings) + dist.script_name = 'setup.py' + cmd = develop(dist) + cmd.ensure_finalized() + cmd.install_dir = tmpdir + cmd.run() + # assert '0.0' not in foocmd_text + + +class TestResolver: + """ + TODO: These tests were written with a minimal understanding + of what _resolve_setup_path is intending to do. Come up with + more meaningful cases that look like real-world scenarios. + """ + def test_resolve_setup_path_cwd(self): + assert develop._resolve_setup_path('.', '.', '.') == '.' 
+ + def test_resolve_setup_path_one_dir(self): + assert develop._resolve_setup_path('pkgs', '.', 'pkgs') == '../' + + def test_resolve_setup_path_one_dir_trailing_slash(self): + assert develop._resolve_setup_path('pkgs/', '.', 'pkgs') == '../' + + +class TestNamespaces: + + @staticmethod + def install_develop(src_dir, target): + + develop_cmd = [ + sys.executable, + 'setup.py', + 'develop', + '--install-dir', str(target), + ] + with src_dir.as_cwd(): + with test.test.paths_on_pythonpath([str(target)]): + subprocess.check_call(develop_cmd) + + @pytest.mark.skipif(bool(os.environ.get("APPVEYOR")), + reason="https://github.com/pypa/setuptools/issues/851") + def test_namespace_package_importable(self, tmpdir): + """ + Installing two packages sharing the same namespace, one installed + naturally using pip or `--single-version-externally-managed` + and the other installed using `develop` should leave the namespace + in tact and both packages reachable by import. + """ + pkg_A = namespaces.build_namespace_package(tmpdir, 'myns.pkgA') + pkg_B = namespaces.build_namespace_package(tmpdir, 'myns.pkgB') + target = tmpdir / 'packages' + # use pip to install to the target directory + install_cmd = [ + 'pip', + 'install', + str(pkg_A), + '-t', str(target), + ] + subprocess.check_call(install_cmd) + self.install_develop(pkg_B, target) + namespaces.make_site_dir(target) + try_import = [ + sys.executable, + '-c', 'import myns.pkgA; import myns.pkgB', + ] + with test.test.paths_on_pythonpath([str(target)]): + subprocess.check_call(try_import) + + # additionally ensure that pkg_resources import works + pkg_resources_imp = [ + sys.executable, + '-c', 'import pkg_resources', + ] + with test.test.paths_on_pythonpath([str(target)]): + subprocess.check_call(pkg_resources_imp) diff --git a/setuptools/tests/test_dist_info.py b/setuptools/tests/test_dist_info.py new file mode 100644 index 0000000..24c5149 --- /dev/null +++ b/setuptools/tests/test_dist_info.py @@ -0,0 +1,78 @@ +"""Test .dist-info style distributions. 
+""" + +from __future__ import unicode_literals + +from six.moves import map + +import pytest + +import pkg_resources +from .textwrap import DALS + + +class TestDistInfo: + + metadata_base = DALS(""" + Metadata-Version: 1.2 + Requires-Dist: splort (==4) + Provides-Extra: baz + Requires-Dist: quux (>=1.1); extra == 'baz' + """) + + @classmethod + def build_metadata(cls, **kwargs): + lines = ( + '{key}: {value}\n'.format(**locals()) + for key, value in kwargs.items() + ) + return cls.metadata_base + ''.join(lines) + + @pytest.fixture + def metadata(self, tmpdir): + dist_info_name = 'VersionedDistribution-2.718.dist-info' + versioned = tmpdir / dist_info_name + versioned.mkdir() + filename = versioned / 'METADATA' + content = self.build_metadata( + Name='VersionedDistribution', + ) + filename.write_text(content, encoding='utf-8') + + dist_info_name = 'UnversionedDistribution.dist-info' + unversioned = tmpdir / dist_info_name + unversioned.mkdir() + filename = unversioned / 'METADATA' + content = self.build_metadata( + Name='UnversionedDistribution', + Version='0.3', + ) + filename.write_text(content, encoding='utf-8') + + return str(tmpdir) + + def test_distinfo(self, metadata): + dists = dict( + (d.project_name, d) + for d in pkg_resources.find_distributions(metadata) + ) + + assert len(dists) == 2, dists + + unversioned = dists['UnversionedDistribution'] + versioned = dists['VersionedDistribution'] + + assert versioned.version == '2.718' # from filename + assert unversioned.version == '0.3' # from METADATA + + def test_conditional_dependencies(self, metadata): + specs = 'splort==4', 'quux>=1.1' + requires = list(map(pkg_resources.Requirement.parse, specs)) + + for d in pkg_resources.find_distributions(metadata): + assert d.requires() == requires[:1] + assert d.requires(extras=('baz',)) == [ + requires[0], + pkg_resources.Requirement.parse('quux>=1.1;extra=="baz"'), + ] + assert d.extras == ['baz'] diff --git a/setuptools/tests/test_easy_install.py b/setuptools/tests/test_easy_install.py new file mode 100644 index 0000000..fd8300a --- /dev/null +++ b/setuptools/tests/test_easy_install.py @@ -0,0 +1,655 @@ +# -*- coding: utf-8 -*- +"""Easy install Tests +""" +from __future__ import absolute_import + +import sys +import os +import tempfile +import site +import contextlib +import tarfile +import logging +import itertools +import distutils.errors +import io +import zipfile +from unittest import mock + +import time +from six.moves import urllib + +import pytest + +from setuptools import sandbox +from setuptools.sandbox import run_setup +import setuptools.command.easy_install as ei +from setuptools.command.easy_install import PthDistributions +from setuptools.command import easy_install as easy_install_pkg +from setuptools.dist import Distribution +from pkg_resources import normalize_path, working_set +from pkg_resources import Distribution as PRDistribution +import setuptools.tests.server +import pkg_resources + +from .py26compat import tarfile_open +from . 
import contexts +from .textwrap import DALS + + +class FakeDist(object): + def get_entry_map(self, group): + if group != 'console_scripts': + return {} + return {'name': 'ep'} + + def as_requirement(self): + return 'spec' + + +SETUP_PY = DALS(""" + from setuptools import setup + + setup(name='foo') + """) + + +class TestEasyInstallTest: + def test_install_site_py(self, tmpdir): + dist = Distribution() + cmd = ei.easy_install(dist) + cmd.sitepy_installed = False + cmd.install_dir = str(tmpdir) + cmd.install_site_py() + assert (tmpdir / 'site.py').exists() + + def test_get_script_args(self): + header = ei.CommandSpec.best().from_environment().as_header() + expected = header + DALS(r""" + # EASY-INSTALL-ENTRY-SCRIPT: 'spec','console_scripts','name' + __requires__ = 'spec' + import re + import sys + from pkg_resources import load_entry_point + + if __name__ == '__main__': + sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0]) + sys.exit( + load_entry_point('spec', 'console_scripts', 'name')() + ) + """) + dist = FakeDist() + + args = next(ei.ScriptWriter.get_args(dist)) + name, script = itertools.islice(args, 2) + + assert script == expected + + def test_no_find_links(self): + # new option '--no-find-links', that blocks find-links added at + # the project level + dist = Distribution() + cmd = ei.easy_install(dist) + cmd.check_pth_processing = lambda: True + cmd.no_find_links = True + cmd.find_links = ['link1', 'link2'] + cmd.install_dir = os.path.join(tempfile.mkdtemp(), 'ok') + cmd.args = ['ok'] + cmd.ensure_finalized() + assert cmd.package_index.scanned_urls == {} + + # let's try without it (default behavior) + cmd = ei.easy_install(dist) + cmd.check_pth_processing = lambda: True + cmd.find_links = ['link1', 'link2'] + cmd.install_dir = os.path.join(tempfile.mkdtemp(), 'ok') + cmd.args = ['ok'] + cmd.ensure_finalized() + keys = sorted(cmd.package_index.scanned_urls.keys()) + assert keys == ['link1', 'link2'] + + def test_write_exception(self): + """ + Test that `cant_write_to_target` is rendered as a DistutilsError. + """ + dist = Distribution() + cmd = ei.easy_install(dist) + cmd.install_dir = os.getcwd() + with pytest.raises(distutils.errors.DistutilsError): + cmd.cant_write_to_target() + + def test_all_site_dirs(self, monkeypatch): + """ + get_site_dirs should always return site dirs reported by + site.getsitepackages. + """ + path = normalize_path('/setuptools/test/site-packages') + mock_gsp = lambda: [path] + monkeypatch.setattr(site, 'getsitepackages', mock_gsp, raising=False) + assert path in ei.get_site_dirs() + + def test_all_site_dirs_works_without_getsitepackages(self, monkeypatch): + monkeypatch.delattr(site, 'getsitepackages', raising=False) + assert ei.get_site_dirs() + + @pytest.fixture + def sdist_unicode(self, tmpdir): + files = [ + ( + 'setup.py', + DALS(""" + import setuptools + setuptools.setup( + name="setuptools-test-unicode", + version="1.0", + packages=["mypkg"], + include_package_data=True, + ) + """), + ), + ( + 'mypkg/__init__.py', + "", + ), + ( + u'mypkg/\u2603.txt', + "", + ), + ] + sdist_name = 'setuptools-test-unicode-1.0.zip' + sdist = tmpdir / sdist_name + # can't use make_sdist, because the issue only occurs + # with zip sdists. 
+ sdist_zip = zipfile.ZipFile(str(sdist), 'w') + for filename, content in files: + sdist_zip.writestr(filename, content) + sdist_zip.close() + return str(sdist) + + def test_unicode_filename_in_sdist(self, sdist_unicode, tmpdir, monkeypatch): + """ + The install command should execute correctly even if + the package has unicode filenames. + """ + dist = Distribution({'script_args': ['easy_install']}) + target = (tmpdir / 'target').ensure_dir() + cmd = ei.easy_install( + dist, + install_dir=str(target), + args=['x'], + ) + monkeypatch.setitem(os.environ, 'PYTHONPATH', str(target)) + cmd.ensure_finalized() + cmd.easy_install(sdist_unicode) + + +class TestPTHFileWriter: + def test_add_from_cwd_site_sets_dirty(self): + '''a pth file manager should set dirty + if a distribution is in site but also the cwd + ''' + pth = PthDistributions('does-not_exist', [os.getcwd()]) + assert not pth.dirty + pth.add(PRDistribution(os.getcwd())) + assert pth.dirty + + def test_add_from_site_is_ignored(self): + location = '/test/location/does-not-have-to-exist' + # PthDistributions expects all locations to be normalized + location = pkg_resources.normalize_path(location) + pth = PthDistributions('does-not_exist', [location, ]) + assert not pth.dirty + pth.add(PRDistribution(location)) + assert not pth.dirty + + +@pytest.yield_fixture +def setup_context(tmpdir): + with (tmpdir / 'setup.py').open('w') as f: + f.write(SETUP_PY) + with tmpdir.as_cwd(): + yield tmpdir + + +@pytest.mark.usefixtures("user_override") +@pytest.mark.usefixtures("setup_context") +class TestUserInstallTest: + + # prevent check that site-packages is writable. easy_install + # shouldn't be writing to system site-packages during finalize + # options, but while it does, bypass the behavior. + prev_sp_write = mock.patch( + 'setuptools.command.easy_install.easy_install.check_site_dir', + mock.Mock(), + ) + + # simulate setuptools installed in user site packages + @mock.patch('setuptools.command.easy_install.__file__', site.USER_SITE) + @mock.patch('site.ENABLE_USER_SITE', True) + @prev_sp_write + def test_user_install_not_implied_user_site_enabled(self): + self.assert_not_user_site() + + @mock.patch('site.ENABLE_USER_SITE', False) + @prev_sp_write + def test_user_install_not_implied_user_site_disabled(self): + self.assert_not_user_site() + + @staticmethod + def assert_not_user_site(): + # create a finalized easy_install command + dist = Distribution() + dist.script_name = 'setup.py' + cmd = ei.easy_install(dist) + cmd.args = ['py'] + cmd.ensure_finalized() + assert not cmd.user, 'user should not be implied' + + def test_multiproc_atexit(self): + pytest.importorskip('multiprocessing') + + log = logging.getLogger('test_easy_install') + logging.basicConfig(level=logging.INFO, stream=sys.stderr) + log.info('this should not break') + + @pytest.fixture() + def foo_package(self, tmpdir): + egg_file = tmpdir / 'foo-1.0.egg-info' + with egg_file.open('w') as f: + f.write('Name: foo\n') + return str(tmpdir) + + @pytest.yield_fixture() + def install_target(self, tmpdir): + target = str(tmpdir) + with mock.patch('sys.path', sys.path + [target]): + python_path = os.path.pathsep.join(sys.path) + with mock.patch.dict(os.environ, PYTHONPATH=python_path): + yield target + + def test_local_index(self, foo_package, install_target): + """ + The local index must be used when easy_install locates installed + packages. 
+ """ + dist = Distribution() + dist.script_name = 'setup.py' + cmd = ei.easy_install(dist) + cmd.install_dir = install_target + cmd.args = ['foo'] + cmd.ensure_finalized() + cmd.local_index.scan([foo_package]) + res = cmd.easy_install('foo') + actual = os.path.normcase(os.path.realpath(res.location)) + expected = os.path.normcase(os.path.realpath(foo_package)) + assert actual == expected + + @contextlib.contextmanager + def user_install_setup_context(self, *args, **kwargs): + """ + Wrap sandbox.setup_context to patch easy_install in that context to + appear as user-installed. + """ + with self.orig_context(*args, **kwargs): + import setuptools.command.easy_install as ei + ei.__file__ = site.USER_SITE + yield + + def patched_setup_context(self): + self.orig_context = sandbox.setup_context + + return mock.patch( + 'setuptools.sandbox.setup_context', + self.user_install_setup_context, + ) + + +@pytest.yield_fixture +def distutils_package(): + distutils_setup_py = SETUP_PY.replace( + 'from setuptools import setup', + 'from distutils.core import setup', + ) + with contexts.tempdir(cd=os.chdir): + with open('setup.py', 'w') as f: + f.write(distutils_setup_py) + yield + + +class TestDistutilsPackage: + def test_bdist_egg_available_on_distutils_pkg(self, distutils_package): + run_setup('setup.py', ['bdist_egg']) + + +class TestSetupRequires: + def test_setup_requires_honors_fetch_params(self): + """ + When easy_install installs a source distribution which specifies + setup_requires, it should honor the fetch parameters (such as + allow-hosts, index-url, and find-links). + """ + # set up a server which will simulate an alternate package index. + p_index = setuptools.tests.server.MockServer() + p_index.start() + netloc = 1 + p_index_loc = urllib.parse.urlparse(p_index.url)[netloc] + if p_index_loc.endswith(':0'): + # Some platforms (Jython) don't find a port to which to bind, + # so skip this test for them. + return + with contexts.quiet(): + # create an sdist that has a build-time dependency. + with TestSetupRequires.create_sdist() as dist_file: + with contexts.tempdir() as temp_install_dir: + with contexts.environment(PYTHONPATH=temp_install_dir): + ei_params = [ + '--index-url', p_index.url, + '--allow-hosts', p_index_loc, + '--exclude-scripts', + '--install-dir', temp_install_dir, + dist_file, + ] + with sandbox.save_argv(['easy_install']): + # attempt to install the dist. It should fail because + # it doesn't exist. + with pytest.raises(SystemExit): + easy_install_pkg.main(ei_params) + # there should have been two or three requests to the server + # (three happens on Python 3.3a) + assert 2 <= len(p_index.requests) <= 3 + assert p_index.requests[0].path == '/does-not-exist/' + + @staticmethod + @contextlib.contextmanager + def create_sdist(): + """ + Return an sdist with a setup_requires dependency (of something that + doesn't exist) + """ + with contexts.tempdir() as dir: + dist_path = os.path.join(dir, 'setuptools-test-fetcher-1.0.tar.gz') + make_sdist(dist_path, [ + ('setup.py', DALS(""" + import setuptools + setuptools.setup( + name="setuptools-test-fetcher", + version="1.0", + setup_requires = ['does-not-exist'], + ) + """))]) + yield dist_path + + def test_setup_requires_overrides_version_conflict(self): + """ + Regression test for distribution issue 323: + https://bitbucket.org/tarek/distribute/issues/323 + + Ensures that a distribution's setup_requires requirements can still be + installed and used locally even if a conflicting version of that + requirement is already on the path. 
+ """ + + fake_dist = PRDistribution('does-not-matter', project_name='foobar', + version='0.0') + working_set.add(fake_dist) + + with contexts.save_pkg_resources_state(): + with contexts.tempdir() as temp_dir: + test_pkg = create_setup_requires_package(temp_dir) + test_setup_py = os.path.join(test_pkg, 'setup.py') + with contexts.quiet() as (stdout, stderr): + # Don't even need to install the package, just + # running the setup.py at all is sufficient + run_setup(test_setup_py, ['--name']) + + lines = stdout.readlines() + assert len(lines) > 0 + assert lines[-1].strip(), 'test_pkg' + + def test_setup_requires_override_nspkg(self): + """ + Like ``test_setup_requires_overrides_version_conflict`` but where the + ``setup_requires`` package is part of a namespace package that has + *already* been imported. + """ + + with contexts.save_pkg_resources_state(): + with contexts.tempdir() as temp_dir: + foobar_1_archive = os.path.join(temp_dir, 'foo.bar-0.1.tar.gz') + make_nspkg_sdist(foobar_1_archive, 'foo.bar', '0.1') + # Now actually go ahead an extract to the temp dir and add the + # extracted path to sys.path so foo.bar v0.1 is importable + foobar_1_dir = os.path.join(temp_dir, 'foo.bar-0.1') + os.mkdir(foobar_1_dir) + with tarfile_open(foobar_1_archive) as tf: + tf.extractall(foobar_1_dir) + sys.path.insert(1, foobar_1_dir) + + dist = PRDistribution(foobar_1_dir, project_name='foo.bar', + version='0.1') + working_set.add(dist) + + template = DALS("""\ + import foo # Even with foo imported first the + # setup_requires package should override + import setuptools + setuptools.setup(**%r) + + if not (hasattr(foo, '__path__') and + len(foo.__path__) == 2): + print('FAIL') + + if 'foo.bar-0.2' not in foo.__path__[0]: + print('FAIL') + """) + + test_pkg = create_setup_requires_package( + temp_dir, 'foo.bar', '0.2', make_nspkg_sdist, template) + + test_setup_py = os.path.join(test_pkg, 'setup.py') + + with contexts.quiet() as (stdout, stderr): + try: + # Don't even need to install the package, just + # running the setup.py at all is sufficient + run_setup(test_setup_py, ['--name']) + except pkg_resources.VersionConflict: + self.fail('Installing setup.py requirements ' + 'caused a VersionConflict') + + assert 'FAIL' not in stdout.getvalue() + lines = stdout.readlines() + assert len(lines) > 0 + assert lines[-1].strip() == 'test_pkg' + + +def make_trivial_sdist(dist_path, distname, version): + """ + Create a simple sdist tarball at dist_path, containing just a simple + setup.py. + """ + + make_sdist(dist_path, [ + ('setup.py', + DALS("""\ + import setuptools + setuptools.setup( + name=%r, + version=%r + ) + """ % (distname, version)))]) + + +def make_nspkg_sdist(dist_path, distname, version): + """ + Make an sdist tarball with distname and version which also contains one + package with the same name as distname. The top-level package is + designated a namespace package). 
+ """ + + parts = distname.split('.') + nspackage = parts[0] + + packages = ['.'.join(parts[:idx]) for idx in range(1, len(parts) + 1)] + + setup_py = DALS("""\ + import setuptools + setuptools.setup( + name=%r, + version=%r, + packages=%r, + namespace_packages=[%r] + ) + """ % (distname, version, packages, nspackage)) + + init = "__import__('pkg_resources').declare_namespace(__name__)" + + files = [('setup.py', setup_py), + (os.path.join(nspackage, '__init__.py'), init)] + for package in packages[1:]: + filename = os.path.join(*(package.split('.') + ['__init__.py'])) + files.append((filename, '')) + + make_sdist(dist_path, files) + + +def make_sdist(dist_path, files): + """ + Create a simple sdist tarball at dist_path, containing the files + listed in ``files`` as ``(filename, content)`` tuples. + """ + + with tarfile_open(dist_path, 'w:gz') as dist: + for filename, content in files: + file_bytes = io.BytesIO(content.encode('utf-8')) + file_info = tarfile.TarInfo(name=filename) + file_info.size = len(file_bytes.getvalue()) + file_info.mtime = int(time.time()) + dist.addfile(file_info, fileobj=file_bytes) + + +def create_setup_requires_package(path, distname='foobar', version='0.1', + make_package=make_trivial_sdist, + setup_py_template=None): + """Creates a source tree under path for a trivial test package that has a + single requirement in setup_requires--a tarball for that requirement is + also created and added to the dependency_links argument. + + ``distname`` and ``version`` refer to the name/version of the package that + the test package requires via ``setup_requires``. The name of the test + package itself is just 'test_pkg'. + """ + + test_setup_attrs = { + 'name': 'test_pkg', 'version': '0.0', + 'setup_requires': ['%s==%s' % (distname, version)], + 'dependency_links': [os.path.abspath(path)] + } + + test_pkg = os.path.join(path, 'test_pkg') + test_setup_py = os.path.join(test_pkg, 'setup.py') + os.mkdir(test_pkg) + + if setup_py_template is None: + setup_py_template = DALS("""\ + import setuptools + setuptools.setup(**%r) + """) + + with open(test_setup_py, 'w') as f: + f.write(setup_py_template % test_setup_attrs) + + foobar_path = os.path.join(path, '%s-%s.tar.gz' % (distname, version)) + make_package(foobar_path, distname, version) + + return test_pkg + + +def make_trivial_sdist(dist_path, setup_py): + """Create a simple sdist tarball at dist_path, containing just a + setup.py, the contents of which are provided by the setup_py string. 
+ """ + + setup_py_file = tarfile.TarInfo(name='setup.py') + setup_py_bytes = io.BytesIO(setup_py.encode('utf-8')) + setup_py_file.size = len(setup_py_bytes.getvalue()) + with tarfile_open(dist_path, 'w:gz') as dist: + dist.addfile(setup_py_file, fileobj=setup_py_bytes) + + +@pytest.mark.skipif( + sys.platform.startswith('java') and ei.is_sh(sys.executable), + reason="Test cannot run under java when executable is sh" +) +class TestScriptHeader: + non_ascii_exe = '/Users/José/bin/python' + exe_with_spaces = r'C:\Program Files\Python33\python.exe' + + def test_get_script_header(self): + expected = '#!%s\n' % ei.nt_quote_arg(os.path.normpath(sys.executable)) + actual = ei.ScriptWriter.get_script_header('#!/usr/local/bin/python') + assert actual == expected + + def test_get_script_header_args(self): + expected = '#!%s -x\n' % ei.nt_quote_arg(os.path.normpath + (sys.executable)) + actual = ei.ScriptWriter.get_script_header('#!/usr/bin/python -x') + assert actual == expected + + def test_get_script_header_non_ascii_exe(self): + actual = ei.ScriptWriter.get_script_header('#!/usr/bin/python', + executable=self.non_ascii_exe) + expected = '#!%s -x\n' % self.non_ascii_exe + assert actual == expected + + def test_get_script_header_exe_with_spaces(self): + actual = ei.ScriptWriter.get_script_header('#!/usr/bin/python', + executable='"' + self.exe_with_spaces + '"') + expected = '#!"%s"\n' % self.exe_with_spaces + assert actual == expected + + +class TestCommandSpec: + def test_custom_launch_command(self): + """ + Show how a custom CommandSpec could be used to specify a #! executable + which takes parameters. + """ + cmd = ei.CommandSpec(['/usr/bin/env', 'python3']) + assert cmd.as_header() == '#!/usr/bin/env python3\n' + + def test_from_param_for_CommandSpec_is_passthrough(self): + """ + from_param should return an instance of a CommandSpec + """ + cmd = ei.CommandSpec(['python']) + cmd_new = ei.CommandSpec.from_param(cmd) + assert cmd is cmd_new + + @mock.patch('sys.executable', TestScriptHeader.exe_with_spaces) + @mock.patch.dict(os.environ) + def test_from_environment_with_spaces_in_executable(self): + os.environ.pop('__PYVENV_LAUNCHER__', None) + cmd = ei.CommandSpec.from_environment() + assert len(cmd) == 1 + assert cmd.as_header().startswith('#!"') + + def test_from_simple_string_uses_shlex(self): + """ + In order to support `executable = /usr/bin/env my-python`, make sure + from_param invokes shlex on that input. + """ + cmd = ei.CommandSpec.from_param('/usr/bin/env my-python') + assert len(cmd) == 2 + assert '"' not in cmd.as_header() + + +class TestWindowsScriptWriter: + def test_header(self): + hdr = ei.WindowsScriptWriter.get_script_header('') + assert hdr.startswith('#!') + assert hdr.endswith('\n') + hdr = hdr.lstrip('#!') + hdr = hdr.rstrip('\n') + # header should not start with an escaped quote + assert not hdr.startswith('\\"') diff --git a/setuptools/tests/test_egg_info.py b/setuptools/tests/test_egg_info.py new file mode 100644 index 0000000..c9a4425 --- /dev/null +++ b/setuptools/tests/test_egg_info.py @@ -0,0 +1,282 @@ +import os +import glob +import re +import stat +import sys + +from setuptools.command.egg_info import egg_info, manifest_maker +from setuptools.dist import Distribution +from six.moves import map + +import pytest + +from . import environment +from .files import build_files +from .textwrap import DALS +from . 
import contexts + + +class Environment(str): + pass + + +class TestEggInfo(object): + + setup_script = DALS(""" + from setuptools import setup + + setup( + name='foo', + py_modules=['hello'], + entry_points={'console_scripts': ['hi = hello.run']}, + zip_safe=False, + ) + """) + + def _create_project(self): + build_files({ + 'setup.py': self.setup_script, + 'hello.py': DALS(""" + def run(): + print('hello') + """) + }) + + @pytest.yield_fixture + def env(self): + with contexts.tempdir(prefix='setuptools-test.') as env_dir: + env = Environment(env_dir) + os.chmod(env_dir, stat.S_IRWXU) + subs = 'home', 'lib', 'scripts', 'data', 'egg-base' + env.paths = dict( + (dirname, os.path.join(env_dir, dirname)) + for dirname in subs + ) + list(map(os.mkdir, env.paths.values())) + build_files({ + env.paths['home']: { + '.pydistutils.cfg': DALS(""" + [egg_info] + egg-base = %(egg-base)s + """ % env.paths) + } + }) + yield env + + dict_order_fails = pytest.mark.skipif( + sys.version_info < (2,7), + reason="Intermittent failures on Python 2.6", + ) + + @dict_order_fails + def test_egg_info_save_version_info_setup_empty(self, tmpdir_cwd, env): + """ + When the egg_info section is empty or not present, running + save_version_info should add the settings to the setup.cfg + in a deterministic order, consistent with the ordering found + on Python 2.7 with PYTHONHASHSEED=0. + """ + setup_cfg = os.path.join(env.paths['home'], 'setup.cfg') + dist = Distribution() + ei = egg_info(dist) + ei.initialize_options() + ei.save_version_info(setup_cfg) + + with open(setup_cfg, 'r') as f: + content = f.read() + + assert '[egg_info]' in content + assert 'tag_build =' in content + assert 'tag_date = 0' in content + + expected_order = 'tag_build', 'tag_date', + + self._validate_content_order(content, expected_order) + + @staticmethod + def _validate_content_order(content, expected): + """ + Assert that the strings in expected appear in content + in order. + """ + pattern = '.*'.join(expected) + flags = re.MULTILINE | re.DOTALL + assert re.search(pattern, content, flags) + + @dict_order_fails + def test_egg_info_save_version_info_setup_defaults(self, tmpdir_cwd, env): + """ + When running save_version_info on an existing setup.cfg + with the 'default' values present from a previous run, + the file should remain unchanged. 
+ """ + setup_cfg = os.path.join(env.paths['home'], 'setup.cfg') + build_files({ + setup_cfg: DALS(""" + [egg_info] + tag_build = + tag_date = 0 + """), + }) + dist = Distribution() + ei = egg_info(dist) + ei.initialize_options() + ei.save_version_info(setup_cfg) + + with open(setup_cfg, 'r') as f: + content = f.read() + + assert '[egg_info]' in content + assert 'tag_build =' in content + assert 'tag_date = 0' in content + + expected_order = 'tag_build', 'tag_date', + + self._validate_content_order(content, expected_order) + + def test_egg_base_installed_egg_info(self, tmpdir_cwd, env): + self._create_project() + + self._run_install_command(tmpdir_cwd, env) + actual = self._find_egg_info_files(env.paths['lib']) + + expected = [ + 'PKG-INFO', + 'SOURCES.txt', + 'dependency_links.txt', + 'entry_points.txt', + 'not-zip-safe', + 'top_level.txt', + ] + assert sorted(actual) == expected + + def test_manifest_template_is_read(self, tmpdir_cwd, env): + self._create_project() + build_files({ + 'MANIFEST.in': DALS(""" + recursive-include docs *.rst + """), + 'docs': { + 'usage.rst': "Run 'hi'", + } + }) + self._run_install_command(tmpdir_cwd, env) + egg_info_dir = self._find_egg_info_files(env.paths['lib']).base + sources_txt = os.path.join(egg_info_dir, 'SOURCES.txt') + assert 'docs/usage.rst' in open(sources_txt).read().split('\n') + + def _setup_script_with_requires(self, requires_line): + setup_script = DALS(""" + from setuptools import setup + + setup( + name='foo', + %s + zip_safe=False, + ) + """ % requires_line) + build_files({ + 'setup.py': setup_script, + }) + + def test_install_requires_with_markers(self, tmpdir_cwd, env): + self._setup_script_with_requires( + """install_requires=["barbazquux;python_version<'2'"],""") + self._run_install_command(tmpdir_cwd, env) + egg_info_dir = self._find_egg_info_files(env.paths['lib']).base + requires_txt = os.path.join(egg_info_dir, 'requires.txt') + assert "barbazquux;python_version<'2'" in open( + requires_txt).read().split('\n') + assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == [] + + def test_setup_requires_with_markers(self, tmpdir_cwd, env): + self._setup_script_with_requires( + """setup_requires=["barbazquux;python_version<'2'"],""") + self._run_install_command(tmpdir_cwd, env) + assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == [] + + def test_tests_require_with_markers(self, tmpdir_cwd, env): + self._setup_script_with_requires( + """tests_require=["barbazquux;python_version<'2'"],""") + self._run_install_command( + tmpdir_cwd, env, cmd=['test'], output="Ran 0 tests in") + assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == [] + + def test_extra_requires_with_markers(self, tmpdir_cwd, env): + self._setup_script_with_requires( + """extra_requires={":python_version<'2'": ["barbazquux"]},""") + self._run_install_command(tmpdir_cwd, env) + assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == [] + + def test_python_requires_egg_info(self, tmpdir_cwd, env): + self._setup_script_with_requires( + """python_requires='>=2.7.12',""") + environ = os.environ.copy().update( + HOME=env.paths['home'], + ) + code, data = environment.run_setup_py( + cmd=['egg_info'], + pypath=os.pathsep.join([env.paths['lib'], str(tmpdir_cwd)]), + data_stream=1, + env=environ, + ) + egg_info_dir = os.path.join('.', 'foo.egg-info') + with open(os.path.join(egg_info_dir, 'PKG-INFO')) as pkginfo_file: + pkg_info_lines = pkginfo_file.read().split('\n') + assert 'Requires-Python: >=2.7.12' in pkg_info_lines + assert 
'Metadata-Version: 1.2' in pkg_info_lines + + def test_python_requires_install(self, tmpdir_cwd, env): + self._setup_script_with_requires( + """python_requires='>=1.2.3',""") + self._run_install_command(tmpdir_cwd, env) + egg_info_dir = self._find_egg_info_files(env.paths['lib']).base + pkginfo = os.path.join(egg_info_dir, 'PKG-INFO') + assert 'Requires-Python: >=1.2.3' in open(pkginfo).read().split('\n') + + def test_manifest_maker_warning_suppression(self): + fixtures = [ + "standard file not found: should have one of foo.py, bar.py", + "standard file 'setup.py' not found" + ] + + for msg in fixtures: + assert manifest_maker._should_suppress_warning(msg) + + def _run_install_command(self, tmpdir_cwd, env, cmd=None, output=None): + environ = os.environ.copy().update( + HOME=env.paths['home'], + ) + if cmd is None: + cmd = [ + 'install', + '--home', env.paths['home'], + '--install-lib', env.paths['lib'], + '--install-scripts', env.paths['scripts'], + '--install-data', env.paths['data'], + ] + code, data = environment.run_setup_py( + cmd=cmd, + pypath=os.pathsep.join([env.paths['lib'], str(tmpdir_cwd)]), + data_stream=1, + env=environ, + ) + if code: + raise AssertionError(data) + if output: + assert output in data + + def _find_egg_info_files(self, root): + class DirList(list): + def __init__(self, files, base): + super(DirList, self).__init__(files) + self.base = base + + results = ( + DirList(filenames, dirpath) + for dirpath, dirnames, filenames in os.walk(root) + if os.path.basename(dirpath) == 'EGG-INFO' + ) + # expect exactly one result + result, = results + return result diff --git a/setuptools/tests/test_find_packages.py b/setuptools/tests/test_find_packages.py new file mode 100644 index 0000000..a6023de --- /dev/null +++ b/setuptools/tests/test_find_packages.py @@ -0,0 +1,182 @@ +"""Tests for setuptools.find_packages().""" +import os +import sys +import shutil +import tempfile +import platform + +import pytest + +import setuptools +from setuptools import find_packages + +find_420_packages = setuptools.PEP420PackageFinder.find + +# modeled after CPython's test.support.can_symlink + + +def can_symlink(): + TESTFN = tempfile.mktemp() + symlink_path = TESTFN + "can_symlink" + try: + os.symlink(TESTFN, symlink_path) + can = True + except (OSError, NotImplementedError, AttributeError): + can = False + else: + os.remove(symlink_path) + globals().update(can_symlink=lambda: can) + return can + + +def has_symlink(): + bad_symlink = ( + # Windows symlink directory detection is broken on Python 3.2 + platform.system() == 'Windows' and sys.version_info[:2] == (3, 2) + ) + return can_symlink() and not bad_symlink + + +class TestFindPackages: + def setup_method(self, method): + self.dist_dir = tempfile.mkdtemp() + self._make_pkg_structure() + + def teardown_method(self, method): + shutil.rmtree(self.dist_dir) + + def _make_pkg_structure(self): + """Make basic package structure. 
+ + dist/ + docs/ + conf.py + pkg/ + __pycache__/ + nspkg/ + mod.py + subpkg/ + assets/ + asset + __init__.py + setup.py + + """ + self.docs_dir = self._mkdir('docs', self.dist_dir) + self._touch('conf.py', self.docs_dir) + self.pkg_dir = self._mkdir('pkg', self.dist_dir) + self._mkdir('__pycache__', self.pkg_dir) + self.ns_pkg_dir = self._mkdir('nspkg', self.pkg_dir) + self._touch('mod.py', self.ns_pkg_dir) + self.sub_pkg_dir = self._mkdir('subpkg', self.pkg_dir) + self.asset_dir = self._mkdir('assets', self.sub_pkg_dir) + self._touch('asset', self.asset_dir) + self._touch('__init__.py', self.sub_pkg_dir) + self._touch('setup.py', self.dist_dir) + + def _mkdir(self, path, parent_dir=None): + if parent_dir: + path = os.path.join(parent_dir, path) + os.mkdir(path) + return path + + def _touch(self, path, dir_=None): + if dir_: + path = os.path.join(dir_, path) + fp = open(path, 'w') + fp.close() + return path + + def test_regular_package(self): + self._touch('__init__.py', self.pkg_dir) + packages = find_packages(self.dist_dir) + assert packages == ['pkg', 'pkg.subpkg'] + + def test_exclude(self): + self._touch('__init__.py', self.pkg_dir) + packages = find_packages(self.dist_dir, exclude=('pkg.*',)) + assert packages == ['pkg'] + + def test_exclude_recursive(self): + """ + Excluding a parent package should not exclude child packages as well. + """ + self._touch('__init__.py', self.pkg_dir) + self._touch('__init__.py', self.sub_pkg_dir) + packages = find_packages(self.dist_dir, exclude=('pkg',)) + assert packages == ['pkg.subpkg'] + + def test_include_excludes_other(self): + """ + If include is specified, other packages should be excluded. + """ + self._touch('__init__.py', self.pkg_dir) + alt_dir = self._mkdir('other_pkg', self.dist_dir) + self._touch('__init__.py', alt_dir) + packages = find_packages(self.dist_dir, include=['other_pkg']) + assert packages == ['other_pkg'] + + def test_dir_with_dot_is_skipped(self): + shutil.rmtree(os.path.join(self.dist_dir, 'pkg/subpkg/assets')) + data_dir = self._mkdir('some.data', self.pkg_dir) + self._touch('__init__.py', data_dir) + self._touch('file.dat', data_dir) + packages = find_packages(self.dist_dir) + assert 'pkg.some.data' not in packages + + def test_dir_with_packages_in_subdir_is_excluded(self): + """ + Ensure that a package in a non-package such as build/pkg/__init__.py + is excluded. + """ + build_dir = self._mkdir('build', self.dist_dir) + build_pkg_dir = self._mkdir('pkg', build_dir) + self._touch('__init__.py', build_pkg_dir) + packages = find_packages(self.dist_dir) + assert 'build.pkg' not in packages + + @pytest.mark.skipif(not has_symlink(), reason='Symlink support required') + def test_symlinked_packages_are_included(self): + """ + A symbolically-linked directory should be treated like any other + directory when matched as a package. + + Create a link from lpkg -> pkg. 
+ """ + self._touch('__init__.py', self.pkg_dir) + linked_pkg = os.path.join(self.dist_dir, 'lpkg') + os.symlink('pkg', linked_pkg) + assert os.path.isdir(linked_pkg) + packages = find_packages(self.dist_dir) + assert 'lpkg' in packages + + def _assert_packages(self, actual, expected): + assert set(actual) == set(expected) + + def test_pep420_ns_package(self): + packages = find_420_packages( + self.dist_dir, include=['pkg*'], exclude=['pkg.subpkg.assets']) + self._assert_packages(packages, ['pkg', 'pkg.nspkg', 'pkg.subpkg']) + + def test_pep420_ns_package_no_includes(self): + packages = find_420_packages( + self.dist_dir, exclude=['pkg.subpkg.assets']) + self._assert_packages(packages, ['docs', 'pkg', 'pkg.nspkg', 'pkg.subpkg']) + + def test_pep420_ns_package_no_includes_or_excludes(self): + packages = find_420_packages(self.dist_dir) + expected = [ + 'docs', 'pkg', 'pkg.nspkg', 'pkg.subpkg', 'pkg.subpkg.assets'] + self._assert_packages(packages, expected) + + def test_regular_package_with_nested_pep420_ns_packages(self): + self._touch('__init__.py', self.pkg_dir) + packages = find_420_packages( + self.dist_dir, exclude=['docs', 'pkg.subpkg.assets']) + self._assert_packages(packages, ['pkg', 'pkg.nspkg', 'pkg.subpkg']) + + def test_pep420_ns_package_no_non_package_dirs(self): + shutil.rmtree(self.docs_dir) + shutil.rmtree(os.path.join(self.dist_dir, 'pkg/subpkg/assets')) + packages = find_420_packages(self.dist_dir) + self._assert_packages(packages, ['pkg', 'pkg.nspkg', 'pkg.subpkg']) diff --git a/setuptools/tests/test_install_scripts.py b/setuptools/tests/test_install_scripts.py new file mode 100644 index 0000000..7393241 --- /dev/null +++ b/setuptools/tests/test_install_scripts.py @@ -0,0 +1,88 @@ +"""install_scripts tests +""" + +import io +import sys + +import pytest + +from setuptools.command.install_scripts import install_scripts +from setuptools.dist import Distribution +from . import contexts + + +class TestInstallScripts: + settings = dict( + name='foo', + entry_points={'console_scripts': ['foo=foo:foo']}, + version='0.0', + ) + unix_exe = '/usr/dummy-test-path/local/bin/python' + unix_spaces_exe = '/usr/bin/env dummy-test-python' + win32_exe = 'C:\\Dummy Test Path\\Program Files\\Python 3.3\\python.exe' + + def _run_install_scripts(self, install_dir, executable=None): + dist = Distribution(self.settings) + dist.script_name = 'setup.py' + cmd = install_scripts(dist) + cmd.install_dir = install_dir + if executable is not None: + bs = cmd.get_finalized_command('build_scripts') + bs.executable = executable + cmd.ensure_finalized() + with contexts.quiet(): + cmd.run() + + @pytest.mark.skipif(sys.platform == 'win32', reason='non-Windows only') + def test_sys_executable_escaping_unix(self, tmpdir, monkeypatch): + """ + Ensure that shebang is not quoted on Unix when getting the Python exe + from sys.executable. + """ + expected = '#!%s\n' % self.unix_exe + monkeypatch.setattr('sys.executable', self.unix_exe) + with tmpdir.as_cwd(): + self._run_install_scripts(str(tmpdir)) + with io.open(str(tmpdir.join('foo')), 'r') as f: + actual = f.readline() + assert actual == expected + + @pytest.mark.skipif(sys.platform != 'win32', reason='Windows only') + def test_sys_executable_escaping_win32(self, tmpdir, monkeypatch): + """ + Ensure that shebang is quoted on Windows when getting the Python exe + from sys.executable and it contains a space. 
+ """ + expected = '#!"%s"\n' % self.win32_exe + monkeypatch.setattr('sys.executable', self.win32_exe) + with tmpdir.as_cwd(): + self._run_install_scripts(str(tmpdir)) + with io.open(str(tmpdir.join('foo-script.py')), 'r') as f: + actual = f.readline() + assert actual == expected + + @pytest.mark.skipif(sys.platform == 'win32', reason='non-Windows only') + def test_executable_with_spaces_escaping_unix(self, tmpdir): + """ + Ensure that shebang on Unix is not quoted, even when a value with spaces + is specified using --executable. + """ + expected = '#!%s\n' % self.unix_spaces_exe + with tmpdir.as_cwd(): + self._run_install_scripts(str(tmpdir), self.unix_spaces_exe) + with io.open(str(tmpdir.join('foo')), 'r') as f: + actual = f.readline() + assert actual == expected + + @pytest.mark.skipif(sys.platform != 'win32', reason='Windows only') + def test_executable_arg_escaping_win32(self, tmpdir): + """ + Ensure that shebang on Windows is quoted when getting a path with spaces + from --executable, that is itself properly quoted. + """ + expected = '#!"%s"\n' % self.win32_exe + with tmpdir.as_cwd(): + self._run_install_scripts(str(tmpdir), '"' + self.win32_exe + '"') + with io.open(str(tmpdir.join('foo-script.py')), 'r') as f: + actual = f.readline() + assert actual == expected diff --git a/setuptools/tests/test_integration.py b/setuptools/tests/test_integration.py new file mode 100644 index 0000000..cb62eb2 --- /dev/null +++ b/setuptools/tests/test_integration.py @@ -0,0 +1,150 @@ +"""Run some integration tests. + +Try to install a few packages. +""" + +import glob +import os +import sys + +from six.moves import urllib +import pytest + +from setuptools.command.easy_install import easy_install +from setuptools.command import easy_install as easy_install_pkg +from setuptools.dist import Distribution + + +def setup_module(module): + packages = 'stevedore', 'virtualenvwrapper', 'pbr', 'novaclient' + for pkg in packages: + try: + __import__(pkg) + tmpl = "Integration tests cannot run when {pkg} is installed" + pytest.skip(tmpl.format(**locals())) + except ImportError: + pass + + try: + urllib.request.urlopen('https://pypi.python.org/pypi') + except Exception as exc: + pytest.skip(str(exc)) + + +@pytest.fixture +def install_context(request, tmpdir, monkeypatch): + """Fixture to set up temporary installation directory. + """ + # Save old values so we can restore them. + new_cwd = tmpdir.mkdir('cwd') + user_base = tmpdir.mkdir('user_base') + user_site = tmpdir.mkdir('user_site') + install_dir = tmpdir.mkdir('install_dir') + + def fin(): + # undo the monkeypatch, particularly needed under + # windows because of kept handle on cwd + monkeypatch.undo() + new_cwd.remove() + user_base.remove() + user_site.remove() + install_dir.remove() + + request.addfinalizer(fin) + + # Change the environment and site settings to control where the + # files are installed and ensure we do not overwrite anything. + monkeypatch.chdir(new_cwd) + monkeypatch.setattr(easy_install_pkg, '__file__', user_site.strpath) + monkeypatch.setattr('site.USER_BASE', user_base.strpath) + monkeypatch.setattr('site.USER_SITE', user_site.strpath) + monkeypatch.setattr('sys.path', sys.path + [install_dir.strpath]) + monkeypatch.setenv('PYTHONPATH', os.path.pathsep.join(sys.path)) + + # Set up the command for performing the installation. 
+ dist = Distribution() + cmd = easy_install(dist) + cmd.install_dir = install_dir.strpath + return cmd + + +def _install_one(requirement, cmd, pkgname, modulename): + cmd.args = [requirement] + cmd.ensure_finalized() + cmd.run() + target = cmd.install_dir + dest_path = glob.glob(os.path.join(target, pkgname + '*.egg')) + assert dest_path + assert os.path.exists(os.path.join(dest_path[0], pkgname, modulename)) + + +def test_stevedore(install_context): + _install_one('stevedore', install_context, + 'stevedore', 'extension.py') + + +@pytest.mark.xfail +def test_virtualenvwrapper(install_context): + _install_one('virtualenvwrapper', install_context, + 'virtualenvwrapper', 'hook_loader.py') + + +def test_pbr(install_context): + _install_one('pbr', install_context, + 'pbr', 'core.py') + + +@pytest.mark.xfail +def test_python_novaclient(install_context): + _install_one('python-novaclient', install_context, + 'novaclient', 'base.py') + +import re +import subprocess +import functools +import tarfile, zipfile + + +build_deps = ['appdirs', 'packaging', 'pyparsing', 'six'] +@pytest.mark.parametrize("build_dep", build_deps) +@pytest.mark.skipif(sys.version_info < (3, 6), reason='run only on late versions') +def test_build_deps_on_distutils(request, tmpdir_factory, build_dep): + """ + All setuptools build dependencies must build without + setuptools. + """ + if 'pyparsing' in build_dep: + pytest.xfail(reason="Project imports setuptools unconditionally") + build_target = tmpdir_factory.mktemp('source') + build_dir = download_and_extract(request, build_dep, build_target) + install_target = tmpdir_factory.mktemp('target') + output = install(build_dir, install_target) + for line in output.splitlines(): + match = re.search('Unknown distribution option: (.*)', line) + allowed_unknowns = [ + 'test_suite', + 'tests_require', + 'install_requires', + ] + assert not match or match.group(1).strip('"\'') in allowed_unknowns + + +def install(pkg_dir, install_dir): + with open(os.path.join(pkg_dir, 'setuptools.py'), 'w') as breaker: + breaker.write('raise ImportError()') + cmd = [sys.executable, 'setup.py', 'install', '--prefix', install_dir] + env = dict(os.environ, PYTHONPATH=pkg_dir) + output = subprocess.check_output(cmd, cwd=pkg_dir, env=env, stderr=subprocess.STDOUT) + return output.decode('utf-8') + + +def download_and_extract(request, req, target): + cmd = [sys.executable, '-m', 'pip', 'download', '--no-deps', + '--no-binary', ':all:', req] + output = subprocess.check_output(cmd, encoding='utf-8') + filename = re.search('Saved (.*)', output).group(1) + request.addfinalizer(functools.partial(os.remove, filename)) + opener = zipfile.ZipFile if filename.endswith('.zip') else tarfile.open + with opener(filename) as archive: + archive.extractall(target) + return os.path.join(target, os.listdir(target)[0]) diff --git a/setuptools/tests/test_manifest.py b/setuptools/tests/test_manifest.py new file mode 100644 index 0000000..3b34c88 --- /dev/null +++ b/setuptools/tests/test_manifest.py @@ -0,0 +1,539 @@ +# -*- coding: utf-8 -*- +"""sdist tests""" + +import contextlib +import os +import shutil +import sys +import tempfile +from distutils import log +from distutils.errors import DistutilsTemplateError + +from setuptools.command.egg_info import FileList, egg_info, translate_pattern +from setuptools.dist import Distribution +import six +from setuptools.tests.textwrap import DALS + +import pytest + +py3_only = pytest.mark.xfail(six.PY2, reason="Test runs on Python 3 only") + + +def make_local_path(s): + """Converts '/' in 
a string to os.sep""" + return s.replace('/', os.sep) + + +SETUP_ATTRS = { + 'name': 'app', + 'version': '0.0', + 'packages': ['app'], +} + +SETUP_PY = """\ +from setuptools import setup + +setup(**%r) +""" % SETUP_ATTRS + + +@contextlib.contextmanager +def quiet(): + old_stdout, old_stderr = sys.stdout, sys.stderr + sys.stdout, sys.stderr = six.StringIO(), six.StringIO() + try: + yield + finally: + sys.stdout, sys.stderr = old_stdout, old_stderr + + +def touch(filename): + open(filename, 'w').close() + + +# The set of files always in the manifest, including all files in the +# .egg-info directory +default_files = frozenset(map(make_local_path, [ + 'README.rst', + 'MANIFEST.in', + 'setup.py', + 'app.egg-info/PKG-INFO', + 'app.egg-info/SOURCES.txt', + 'app.egg-info/dependency_links.txt', + 'app.egg-info/top_level.txt', + 'app/__init__.py', +])) + + +def get_pattern(glob): + return translate_pattern(make_local_path(glob)).pattern + + +def test_translated_pattern_test(): + l = make_local_path + assert get_pattern('foo') == r'foo\Z(?ms)' + assert get_pattern(l('foo/bar')) == l(r'foo\/bar\Z(?ms)') + + # Glob matching + assert get_pattern('*.txt') == l(r'[^\/]*\.txt\Z(?ms)') + assert get_pattern('dir/*.txt') == l(r'dir\/[^\/]*\.txt\Z(?ms)') + assert get_pattern('*/*.py') == l(r'[^\/]*\/[^\/]*\.py\Z(?ms)') + assert get_pattern('docs/page-?.txt') \ + == l(r'docs\/page\-[^\/]\.txt\Z(?ms)') + + # Globstars change what they mean depending upon where they are + assert get_pattern(l('foo/**/bar')) == l(r'foo\/(?:[^\/]+\/)*bar\Z(?ms)') + assert get_pattern(l('foo/**')) == l(r'foo\/.*\Z(?ms)') + assert get_pattern(l('**')) == r'.*\Z(?ms)' + + # Character classes + assert get_pattern('pre[one]post') == r'pre[one]post\Z(?ms)' + assert get_pattern('hello[!one]world') == r'hello[^one]world\Z(?ms)' + assert get_pattern('[]one].txt') == r'[\]one]\.txt\Z(?ms)' + assert get_pattern('foo[!]one]bar') == r'foo[^\]one]bar\Z(?ms)' + + +class TempDirTestCase(object): + def setup_method(self, method): + self.temp_dir = tempfile.mkdtemp() + self.old_cwd = os.getcwd() + os.chdir(self.temp_dir) + + def teardown_method(self, method): + os.chdir(self.old_cwd) + shutil.rmtree(self.temp_dir) + + +class TestManifestTest(TempDirTestCase): + def setup_method(self, method): + super(TestManifestTest, self).setup_method(method) + + f = open(os.path.join(self.temp_dir, 'setup.py'), 'w') + f.write(SETUP_PY) + f.close() + """ + Create a file tree like: + - LICENSE + - README.rst + - testing.rst + - .hidden.rst + - app/ + - __init__.py + - a.txt + - b.txt + - c.rst + - static/ + - app.js + - app.js.map + - app.css + - app.css.map + """ + + for fname in ['README.rst', '.hidden.rst', 'testing.rst', 'LICENSE']: + touch(os.path.join(self.temp_dir, fname)) + + # Set up the rest of the test package + test_pkg = os.path.join(self.temp_dir, 'app') + os.mkdir(test_pkg) + for fname in ['__init__.py', 'a.txt', 'b.txt', 'c.rst']: + touch(os.path.join(test_pkg, fname)) + + # Some compiled front-end assets to include + static = os.path.join(test_pkg, 'static') + os.mkdir(static) + for fname in ['app.js', 'app.js.map', 'app.css', 'app.css.map']: + touch(os.path.join(static, fname)) + + def make_manifest(self, contents): + """Write a MANIFEST.in.""" + with open(os.path.join(self.temp_dir, 'MANIFEST.in'), 'w') as f: + f.write(DALS(contents)) + + def get_files(self): + """Run egg_info and get all the files to include, as a set""" + dist = Distribution(SETUP_ATTRS) + dist.script_name = 'setup.py' + cmd = egg_info(dist) + cmd.ensure_finalized() + + 
cmd.run() + + return set(cmd.filelist.files) + + def test_no_manifest(self): + """Check a missing MANIFEST.in includes only the standard files.""" + assert (default_files - set(['MANIFEST.in'])) == self.get_files() + + def test_empty_files(self): + """Check an empty MANIFEST.in includes only the standard files.""" + self.make_manifest("") + assert default_files == self.get_files() + + def test_include(self): + """Include extra rst files in the project root.""" + self.make_manifest("include *.rst") + files = default_files | set([ + 'testing.rst', '.hidden.rst']) + assert files == self.get_files() + + def test_exclude(self): + """Include everything in app/ except the text files""" + l = make_local_path + self.make_manifest( + """ + include app/* + exclude app/*.txt + """) + files = default_files | set([l('app/c.rst')]) + assert files == self.get_files() + + def test_include_multiple(self): + """Include with multiple patterns.""" + l = make_local_path + self.make_manifest("include app/*.txt app/static/*") + files = default_files | set([ + l('app/a.txt'), l('app/b.txt'), + l('app/static/app.js'), l('app/static/app.js.map'), + l('app/static/app.css'), l('app/static/app.css.map')]) + assert files == self.get_files() + + def test_graft(self): + """Include the whole app/static/ directory.""" + l = make_local_path + self.make_manifest("graft app/static") + files = default_files | set([ + l('app/static/app.js'), l('app/static/app.js.map'), + l('app/static/app.css'), l('app/static/app.css.map')]) + assert files == self.get_files() + + def test_graft_glob_syntax(self): + """Include the whole app/static/ directory.""" + l = make_local_path + self.make_manifest("graft */static") + files = default_files | set([ + l('app/static/app.js'), l('app/static/app.js.map'), + l('app/static/app.css'), l('app/static/app.css.map')]) + assert files == self.get_files() + + def test_graft_global_exclude(self): + """Exclude all *.map files in the project.""" + l = make_local_path + self.make_manifest( + """ + graft app/static + global-exclude *.map + """) + files = default_files | set([ + l('app/static/app.js'), l('app/static/app.css')]) + assert files == self.get_files() + + def test_global_include(self): + """Include all *.rst, *.js, and *.css files in the whole tree.""" + l = make_local_path + self.make_manifest( + """ + global-include *.rst *.js *.css + """) + files = default_files | set([ + '.hidden.rst', 'testing.rst', l('app/c.rst'), + l('app/static/app.js'), l('app/static/app.css')]) + assert files == self.get_files() + + def test_graft_prune(self): + """Include all files in app/, except for the whole app/static/ dir.""" + l = make_local_path + self.make_manifest( + """ + graft app + prune app/static + """) + files = default_files | set([ + l('app/a.txt'), l('app/b.txt'), l('app/c.rst')]) + assert files == self.get_files() + + +class TestFileListTest(TempDirTestCase): + """ + A copy of the relevant bits of distutils/tests/test_filelist.py, + to ensure setuptools' version of FileList keeps parity with distutils. 
+ """ + + def setup_method(self, method): + super(TestFileListTest, self).setup_method(method) + self.threshold = log.set_threshold(log.FATAL) + self._old_log = log.Log._log + log.Log._log = self._log + self.logs = [] + + def teardown_method(self, method): + log.set_threshold(self.threshold) + log.Log._log = self._old_log + super(TestFileListTest, self).teardown_method(method) + + def _log(self, level, msg, args): + if level not in (log.DEBUG, log.INFO, log.WARN, log.ERROR, log.FATAL): + raise ValueError('%s wrong log level' % str(level)) + self.logs.append((level, msg, args)) + + def get_logs(self, *levels): + def _format(msg, args): + if len(args) == 0: + return msg + return msg % args + return [_format(msg, args) for level, msg, args + in self.logs if level in levels] + + def clear_logs(self): + self.logs = [] + + def assertNoWarnings(self): + assert self.get_logs(log.WARN) == [] + self.clear_logs() + + def assertWarnings(self): + assert len(self.get_logs(log.WARN)) > 0 + self.clear_logs() + + def make_files(self, files): + for file in files: + file = os.path.join(self.temp_dir, file) + dirname, basename = os.path.split(file) + if not os.path.exists(dirname): + os.makedirs(dirname) + open(file, 'w').close() + + def test_process_template_line(self): + # testing all MANIFEST.in template patterns + file_list = FileList() + l = make_local_path + + # simulated file list + self.make_files([ + 'foo.tmp', 'ok', 'xo', 'four.txt', + 'buildout.cfg', + # filelist does not filter out VCS directories, + # it's sdist that does + l('.hg/last-message.txt'), + l('global/one.txt'), + l('global/two.txt'), + l('global/files.x'), + l('global/here.tmp'), + l('f/o/f.oo'), + l('dir/graft-one'), + l('dir/dir2/graft2'), + l('dir3/ok'), + l('dir3/sub/ok.txt'), + ]) + + MANIFEST_IN = DALS("""\ + include ok + include xo + exclude xo + include foo.tmp + include buildout.cfg + global-include *.x + global-include *.txt + global-exclude *.tmp + recursive-include f *.oo + recursive-exclude global *.x + graft dir + prune dir3 + """) + + for line in MANIFEST_IN.split('\n'): + if not line: + continue + file_list.process_template_line(line) + + wanted = [ + 'buildout.cfg', + 'four.txt', + 'ok', + l('.hg/last-message.txt'), + l('dir/graft-one'), + l('dir/dir2/graft2'), + l('f/o/f.oo'), + l('global/one.txt'), + l('global/two.txt'), + ] + + file_list.sort() + assert file_list.files == wanted + + def test_exclude_pattern(self): + # return False if no match + file_list = FileList() + assert not file_list.exclude_pattern('*.py') + + # return True if files match + file_list = FileList() + file_list.files = ['a.py', 'b.py'] + assert file_list.exclude_pattern('*.py') + + # test excludes + file_list = FileList() + file_list.files = ['a.py', 'a.txt'] + file_list.exclude_pattern('*.py') + file_list.sort() + assert file_list.files == ['a.txt'] + + def test_include_pattern(self): + # return False if no match + file_list = FileList() + self.make_files([]) + assert not file_list.include_pattern('*.py') + + # return True if files match + file_list = FileList() + self.make_files(['a.py', 'b.txt']) + assert file_list.include_pattern('*.py') + + # test * matches all files + file_list = FileList() + self.make_files(['a.py', 'b.txt']) + file_list.include_pattern('*') + file_list.sort() + assert file_list.files == ['a.py', 'b.txt'] + + def test_process_template_line_invalid(self): + # invalid lines + file_list = FileList() + for action in ('include', 'exclude', 'global-include', + 'global-exclude', 'recursive-include', + 'recursive-exclude', 
'graft', 'prune', 'blarg'): + try: + file_list.process_template_line(action) + except DistutilsTemplateError: + pass + except Exception: + assert False, "Incorrect error thrown" + else: + assert False, "Should have thrown an error" + + def test_include(self): + l = make_local_path + # include + file_list = FileList() + self.make_files(['a.py', 'b.txt', l('d/c.py')]) + + file_list.process_template_line('include *.py') + file_list.sort() + assert file_list.files == ['a.py'] + self.assertNoWarnings() + + file_list.process_template_line('include *.rb') + file_list.sort() + assert file_list.files == ['a.py'] + self.assertWarnings() + + def test_exclude(self): + l = make_local_path + # exclude + file_list = FileList() + file_list.files = ['a.py', 'b.txt', l('d/c.py')] + + file_list.process_template_line('exclude *.py') + file_list.sort() + assert file_list.files == ['b.txt', l('d/c.py')] + self.assertNoWarnings() + + file_list.process_template_line('exclude *.rb') + file_list.sort() + assert file_list.files == ['b.txt', l('d/c.py')] + self.assertWarnings() + + def test_global_include(self): + l = make_local_path + # global-include + file_list = FileList() + self.make_files(['a.py', 'b.txt', l('d/c.py')]) + + file_list.process_template_line('global-include *.py') + file_list.sort() + assert file_list.files == ['a.py', l('d/c.py')] + self.assertNoWarnings() + + file_list.process_template_line('global-include *.rb') + file_list.sort() + assert file_list.files == ['a.py', l('d/c.py')] + self.assertWarnings() + + def test_global_exclude(self): + l = make_local_path + # global-exclude + file_list = FileList() + file_list.files = ['a.py', 'b.txt', l('d/c.py')] + + file_list.process_template_line('global-exclude *.py') + file_list.sort() + assert file_list.files == ['b.txt'] + self.assertNoWarnings() + + file_list.process_template_line('global-exclude *.rb') + file_list.sort() + assert file_list.files == ['b.txt'] + self.assertWarnings() + + def test_recursive_include(self): + l = make_local_path + # recursive-include + file_list = FileList() + self.make_files(['a.py', l('d/b.py'), l('d/c.txt'), l('d/d/e.py')]) + + file_list.process_template_line('recursive-include d *.py') + file_list.sort() + assert file_list.files == [l('d/b.py'), l('d/d/e.py')] + self.assertNoWarnings() + + file_list.process_template_line('recursive-include e *.py') + file_list.sort() + assert file_list.files == [l('d/b.py'), l('d/d/e.py')] + self.assertWarnings() + + def test_recursive_exclude(self): + l = make_local_path + # recursive-exclude + file_list = FileList() + file_list.files = ['a.py', l('d/b.py'), l('d/c.txt'), l('d/d/e.py')] + + file_list.process_template_line('recursive-exclude d *.py') + file_list.sort() + assert file_list.files == ['a.py', l('d/c.txt')] + self.assertNoWarnings() + + file_list.process_template_line('recursive-exclude e *.py') + file_list.sort() + assert file_list.files == ['a.py', l('d/c.txt')] + self.assertWarnings() + + def test_graft(self): + l = make_local_path + # graft + file_list = FileList() + self.make_files(['a.py', l('d/b.py'), l('d/d/e.py'), l('f/f.py')]) + + file_list.process_template_line('graft d') + file_list.sort() + assert file_list.files == [l('d/b.py'), l('d/d/e.py')] + self.assertNoWarnings() + + file_list.process_template_line('graft e') + file_list.sort() + assert file_list.files == [l('d/b.py'), l('d/d/e.py')] + self.assertWarnings() + + def test_prune(self): + l = make_local_path + # prune + file_list = FileList() + file_list.files = ['a.py', l('d/b.py'), l('d/d/e.py'), 
l('f/f.py')] + + file_list.process_template_line('prune d') + file_list.sort() + assert file_list.files == ['a.py', l('f/f.py')] + self.assertNoWarnings() + + file_list.process_template_line('prune e') + file_list.sort() + assert file_list.files == ['a.py', l('f/f.py')] + self.assertWarnings() diff --git a/setuptools/tests/test_msvc.py b/setuptools/tests/test_msvc.py new file mode 100644 index 0000000..fbeed1d --- /dev/null +++ b/setuptools/tests/test_msvc.py @@ -0,0 +1,178 @@ +""" +Tests for msvc support module. +""" + +import os +import contextlib +import distutils.errors +from unittest import mock + +import pytest + +from . import contexts + +# importing only setuptools should apply the patch +__import__('setuptools') + +pytest.importorskip("distutils.msvc9compiler") + + +def mock_reg(hkcu=None, hklm=None): + """ + Return a mock for distutils.msvc9compiler.Reg, patched + to mock out the functions that access the registry. + """ + + _winreg = getattr(distutils.msvc9compiler, '_winreg', None) + winreg = getattr(distutils.msvc9compiler, 'winreg', _winreg) + + hives = { + winreg.HKEY_CURRENT_USER: hkcu or {}, + winreg.HKEY_LOCAL_MACHINE: hklm or {}, + } + + @classmethod + def read_keys(cls, base, key): + """Return list of registry keys.""" + hive = hives.get(base, {}) + return [ + k.rpartition('\\')[2] + for k in hive if k.startswith(key.lower()) + ] + + @classmethod + def read_values(cls, base, key): + """Return dict of registry keys and values.""" + hive = hives.get(base, {}) + return dict( + (k.rpartition('\\')[2], hive[k]) + for k in hive if k.startswith(key.lower()) + ) + + return mock.patch.multiple(distutils.msvc9compiler.Reg, + read_keys=read_keys, read_values=read_values) + + +class TestModulePatch: + """ + Ensure that importing setuptools is sufficient to replace + the standard find_vcvarsall function with a version that + recognizes the "Visual C++ for Python" package. + """ + + key_32 = r'software\microsoft\devdiv\vcforpython\9.0\installdir' + key_64 = r'software\wow6432node\microsoft\devdiv\vcforpython\9.0\installdir' + + def test_patched(self): + "Test the module is actually patched" + mod_name = distutils.msvc9compiler.find_vcvarsall.__module__ + assert mod_name == "setuptools.msvc", "find_vcvarsall unpatched" + + def test_no_registry_entries_means_nothing_found(self): + """ + No registry entries or environment variable should lead to an error + directing the user to download vcpython27. + """ + find_vcvarsall = distutils.msvc9compiler.find_vcvarsall + query_vcvarsall = distutils.msvc9compiler.query_vcvarsall + + with contexts.environment(VS90COMNTOOLS=None): + with mock_reg(): + assert find_vcvarsall(9.0) is None + + try: + query_vcvarsall(9.0) + except Exception as exc: + expected = distutils.errors.DistutilsPlatformError + assert isinstance(exc, expected) + assert 'aka.ms/vcpython27' in str(exc) + + @pytest.yield_fixture + def user_preferred_setting(self): + """ + Set up environment with different install dirs for user vs. system + and yield the user_install_dir for the expected result. + """ + with self.mock_install_dir() as user_install_dir: + with self.mock_install_dir() as system_install_dir: + reg = mock_reg( + hkcu={ + self.key_32: user_install_dir, + }, + hklm={ + self.key_32: system_install_dir, + self.key_64: system_install_dir, + }, + ) + with reg: + yield user_install_dir + + def test_prefer_current_user(self, user_preferred_setting): + """ + Ensure user's settings are preferred. 
+ """ + result = distutils.msvc9compiler.find_vcvarsall(9.0) + expected = os.path.join(user_preferred_setting, 'vcvarsall.bat') + assert expected == result + + @pytest.yield_fixture + def local_machine_setting(self): + """ + Set up environment with only the system environment configured. + """ + with self.mock_install_dir() as system_install_dir: + reg = mock_reg( + hklm={ + self.key_32: system_install_dir, + }, + ) + with reg: + yield system_install_dir + + def test_local_machine_recognized(self, local_machine_setting): + """ + Ensure machine setting is honored if user settings are not present. + """ + result = distutils.msvc9compiler.find_vcvarsall(9.0) + expected = os.path.join(local_machine_setting, 'vcvarsall.bat') + assert expected == result + + @pytest.yield_fixture + def x64_preferred_setting(self): + """ + Set up environment with 64-bit and 32-bit system settings configured + and yield the canonical location. + """ + with self.mock_install_dir() as x32_dir: + with self.mock_install_dir() as x64_dir: + reg = mock_reg( + hklm={ + # This *should* only exist on 32-bit machines + self.key_32: x32_dir, + # This *should* only exist on 64-bit machines + self.key_64: x64_dir, + }, + ) + with reg: + yield x32_dir + + def test_ensure_64_bit_preferred(self, x64_preferred_setting): + """ + Ensure 64-bit system key is preferred. + """ + result = distutils.msvc9compiler.find_vcvarsall(9.0) + expected = os.path.join(x64_preferred_setting, 'vcvarsall.bat') + assert expected == result + + @staticmethod + @contextlib.contextmanager + def mock_install_dir(): + """ + Make a mock install dir in a unique location so that tests can + distinguish which dir was detected in a given scenario. + """ + with contexts.tempdir() as result: + vcvarsall = os.path.join(result, 'vcvarsall.bat') + with open(vcvarsall, 'w'): + pass + yield result diff --git a/setuptools/tests/test_namespaces.py b/setuptools/tests/test_namespaces.py new file mode 100644 index 0000000..721cad1 --- /dev/null +++ b/setuptools/tests/test_namespaces.py @@ -0,0 +1,105 @@ +from __future__ import absolute_import, unicode_literals + +import os +import sys +import subprocess + +import pytest + +from . import namespaces +from setuptools.command import test + + +class TestNamespaces: + + @pytest.mark.xfail(sys.version_info < (3, 5), + reason="Requires importlib.util.module_from_spec") + @pytest.mark.skipif(bool(os.environ.get("APPVEYOR")), + reason="https://github.com/pypa/setuptools/issues/851") + def test_mixed_site_and_non_site(self, tmpdir): + """ + Installing two packages sharing the same namespace, one installed + to a site dir and the other installed just to a path on PYTHONPATH + should leave the namespace in tact and both packages reachable by + import. 
+ """ + pkg_A = namespaces.build_namespace_package(tmpdir, 'myns.pkgA') + pkg_B = namespaces.build_namespace_package(tmpdir, 'myns.pkgB') + site_packages = tmpdir / 'site-packages' + path_packages = tmpdir / 'path-packages' + targets = site_packages, path_packages + # use pip to install to the target directory + install_cmd = [ + 'pip', + 'install', + str(pkg_A), + '-t', str(site_packages), + ] + subprocess.check_call(install_cmd) + namespaces.make_site_dir(site_packages) + install_cmd = [ + 'pip', + 'install', + str(pkg_B), + '-t', str(path_packages), + ] + subprocess.check_call(install_cmd) + try_import = [ + sys.executable, + '-c', 'import myns.pkgA; import myns.pkgB', + ] + with test.test.paths_on_pythonpath(map(str, targets)): + subprocess.check_call(try_import) + + @pytest.mark.skipif(bool(os.environ.get("APPVEYOR")), + reason="https://github.com/pypa/setuptools/issues/851") + def test_pkg_resources_import(self, tmpdir): + """ + Ensure that a namespace package doesn't break on import + of pkg_resources. + """ + pkg = namespaces.build_namespace_package(tmpdir, 'myns.pkgA') + target = tmpdir / 'packages' + target.mkdir() + install_cmd = [ + sys.executable, + '-m', 'easy_install', + '-d', str(target), + str(pkg), + ] + with test.test.paths_on_pythonpath([str(target)]): + subprocess.check_call(install_cmd) + namespaces.make_site_dir(target) + try_import = [ + sys.executable, + '-c', 'import pkg_resources', + ] + with test.test.paths_on_pythonpath([str(target)]): + subprocess.check_call(try_import) + + @pytest.mark.skipif(bool(os.environ.get("APPVEYOR")), + reason="https://github.com/pypa/setuptools/issues/851") + def test_namespace_package_installed_and_cwd(self, tmpdir): + """ + Installing a namespace packages but also having it in the current + working directory, only one version should take precedence. + """ + pkg_A = namespaces.build_namespace_package(tmpdir, 'myns.pkgA') + target = tmpdir / 'packages' + # use pip to install to the target directory + install_cmd = [ + 'pip', + 'install', + str(pkg_A), + '-t', str(target), + ] + subprocess.check_call(install_cmd) + namespaces.make_site_dir(target) + + # ensure that package imports and pkg_resources imports + pkg_resources_imp = [ + sys.executable, + '-c', 'import pkg_resources; import myns.pkgA', + ] + with test.test.paths_on_pythonpath([str(target)]): + subprocess.check_call(pkg_resources_imp, cwd=str(pkg_A)) diff --git a/setuptools/tests/test_packageindex.py b/setuptools/tests/test_packageindex.py index 0231eda..1a66394 100644 --- a/setuptools/tests/test_packageindex.py +++ b/setuptools/tests/test_packageindex.py @@ -1,27 +1,275 @@ -"""Package Index Tests -""" -# More would be better! 
+from __future__ import absolute_import + +import sys +import os +import distutils.errors + +import six +from six.moves import urllib, http_client -import os, shutil, tempfile, unittest, urllib2 import pkg_resources import setuptools.package_index +from setuptools.tests.server import IndexServer +from .textwrap import DALS -class TestPackageIndex(unittest.TestCase): - def test_bad_urls(self): +class TestPackageIndex: + def test_regex(self): + hash_url = 'http://other_url?:action=show_md5&' + hash_url += 'digest=0123456789abcdef0123456789abcdef' + doc = """ + Name + (md5) + """.lstrip().format(**locals()) + assert setuptools.package_index.PYPI_MD5.match(doc) + + def test_bad_url_bad_port(self): index = setuptools.package_index.PackageIndex() - url = 'http://127.0.0.1/nonesuch/test_package_index' + url = 'http://127.0.0.1:0/nonesuch/test_package_index' + try: + v = index.open_url(url) + except Exception as v: + assert url in str(v) + else: + assert isinstance(v, urllib.error.HTTPError) + + def test_bad_url_typo(self): + # issue 16 + # easy_install inquant.contentmirror.plone breaks because of a typo + # in its home URL + index = setuptools.package_index.PackageIndex( + hosts=('www.example.com',) + ) + + url = 'url:%20https://svn.plone.org/svn/collective/inquant.contentmirror.plone/trunk' + try: + v = index.open_url(url) + except Exception as v: + assert url in str(v) + else: + assert isinstance(v, urllib.error.HTTPError) + + def test_bad_url_bad_status_line(self): + index = setuptools.package_index.PackageIndex( + hosts=('www.example.com',) + ) + + def _urlopen(*args): + raise http_client.BadStatusLine('line') + + index.opener = _urlopen + url = 'http://example.com' try: v = index.open_url(url) - except Exception, v: - self.assert_(url in str(v)) + except Exception as v: + assert 'line' in str(v) else: - self.assert_(isinstance(v,urllib2.HTTPError)) + raise AssertionError('Should have raise here!') + + def test_bad_url_double_scheme(self): + """ + A bad URL with a double scheme should raise a DistutilsError. + """ + index = setuptools.package_index.PackageIndex( + hosts=('www.example.com',) + ) + + # issue 20 + url = 'http://http://svn.pythonpaste.org/Paste/wphp/trunk' + try: + index.open_url(url) + except distutils.errors.DistutilsError as error: + msg = six.text_type(error) + assert 'nonnumeric port' in msg or 'getaddrinfo failed' in msg or 'Name or service not known' in msg + return + raise RuntimeError("Did not raise") + + def test_bad_url_screwy_href(self): + index = setuptools.package_index.PackageIndex( + hosts=('www.example.com',) + ) + + # issue #160 + if sys.version_info[0] == 2 and sys.version_info[1] == 7: + # this should not fail + url = 'http://example.com' + page = ('') + index.process_index(url, page) def test_url_ok(self): index = setuptools.package_index.PackageIndex( hosts=('www.example.com',) ) url = 'file:///tmp/test_package_index' - self.assert_(index.url_ok(url, True)) + assert index.url_ok(url, True) + + def test_links_priority(self): + """ + Download links from the pypi simple index should be used before + external download links. + https://bitbucket.org/tarek/distribute/issue/163 + + Usecase : + - someone uploads a package on pypi, a md5 is generated + - someone manually copies this link (with the md5 in the url) onto an + external page accessible from the package page. 
+ - someone reuploads the package (with a different md5) + - while easy_installing, an MD5 error occurs because the external link + is used + -> Setuptools should use the link from pypi, not the external one. + """ + if sys.platform.startswith('java'): + # Skip this test on jython because binding to :0 fails + return + + # start an index server + server = IndexServer() + server.start() + index_url = server.base_url() + 'test_links_priority/simple/' + + # scan a test index + pi = setuptools.package_index.PackageIndex(index_url) + requirement = pkg_resources.Requirement.parse('foobar') + pi.find_packages(requirement) + server.stop() + + # the distribution has been found + assert 'foobar' in pi + # we have only one link, because links are compared without md5 + assert len(pi['foobar']) == 1 + # the link should be from the index + assert 'correct_md5' in pi['foobar'][0].location + + def test_parse_bdist_wininst(self): + parse = setuptools.package_index.parse_bdist_wininst + + actual = parse('reportlab-2.5.win32-py2.4.exe') + expected = 'reportlab-2.5', '2.4', 'win32' + assert actual == expected + + actual = parse('reportlab-2.5.win32.exe') + expected = 'reportlab-2.5', None, 'win32' + assert actual == expected + + actual = parse('reportlab-2.5.win-amd64-py2.7.exe') + expected = 'reportlab-2.5', '2.7', 'win-amd64' + assert actual == expected + + actual = parse('reportlab-2.5.win-amd64.exe') + expected = 'reportlab-2.5', None, 'win-amd64' + assert actual == expected + + def test__vcs_split_rev_from_url(self): + """ + Test the basic usage of _vcs_split_rev_from_url + """ + vsrfu = setuptools.package_index.PackageIndex._vcs_split_rev_from_url + url, rev = vsrfu('https://example.com/bar@2995') + assert url == 'https://example.com/bar' + assert rev == '2995' + + def test_local_index(self, tmpdir): + """ + local_open should be able to read an index from the file system. + """ + index_file = tmpdir / 'index.html' + with index_file.open('w') as f: + f.write('
<div>content</div>
    ') + url = 'file:' + urllib.request.pathname2url(str(tmpdir)) + '/' + res = setuptools.package_index.local_open(url) + assert 'content' in res.read() + + def test_egg_fragment(self): + """ + EGG fragments must comply to PEP 440 + """ + epoch = [ + '', + '1!', + ] + releases = [ + '0', + '0.0', + '0.0.0', + ] + pre = [ + 'a0', + 'b0', + 'rc0', + ] + post = [ + '.post0' + ] + dev = [ + '.dev0', + ] + local = [ + ('', ''), + ('+ubuntu.0', '+ubuntu.0'), + ('+ubuntu-0', '+ubuntu.0'), + ('+ubuntu_0', '+ubuntu.0'), + ] + versions = [ + [''.join([e, r, p, l]) for l in ll] + for e in epoch + for r in releases + for p in sum([pre, post, dev], ['']) + for ll in local] + for v, vc in versions: + dists = list(setuptools.package_index.distros_for_url( + 'http://example.com/example.zip#egg=example-' + v)) + assert dists[0].version == '' + assert dists[1].version == vc + + +class TestContentCheckers: + def test_md5(self): + checker = setuptools.package_index.HashChecker.from_url( + 'http://foo/bar#md5=f12895fdffbd45007040d2e44df98478') + checker.feed('You should probably not be using MD5'.encode('ascii')) + assert checker.hash.hexdigest() == 'f12895fdffbd45007040d2e44df98478' + assert checker.is_valid() + + def test_other_fragment(self): + "Content checks should succeed silently if no hash is present" + checker = setuptools.package_index.HashChecker.from_url( + 'http://foo/bar#something%20completely%20different') + checker.feed('anything'.encode('ascii')) + assert checker.is_valid() + + def test_blank_md5(self): + "Content checks should succeed if a hash is empty" + checker = setuptools.package_index.HashChecker.from_url( + 'http://foo/bar#md5=') + checker.feed('anything'.encode('ascii')) + assert checker.is_valid() + + def test_get_hash_name_md5(self): + checker = setuptools.package_index.HashChecker.from_url( + 'http://foo/bar#md5=f12895fdffbd45007040d2e44df98478') + assert checker.hash_name == 'md5' + + def test_report(self): + checker = setuptools.package_index.HashChecker.from_url( + 'http://foo/bar#md5=f12895fdffbd45007040d2e44df98478') + rep = checker.report(lambda x: x, 'My message about %s') + assert rep == 'My message about md5' + +class TestPyPIConfig: + def test_percent_in_password(self, tmpdir, monkeypatch): + monkeypatch.setitem(os.environ, 'HOME', str(tmpdir)) + pypirc = tmpdir / '.pypirc' + with pypirc.open('w') as strm: + strm.write(DALS(""" + [pypi] + repository=https://pypi.python.org + username=jaraco + password=pity% + """)) + cfg = setuptools.package_index.PyPIConfig() + cred = cfg.creds_by_repository['https://pypi.python.org'] + assert cred.username == 'jaraco' + assert cred.password == 'pity%' diff --git a/setuptools/tests/test_resources.py b/setuptools/tests/test_resources.py deleted file mode 100644 index 03e5d0f..0000000 --- a/setuptools/tests/test_resources.py +++ /dev/null @@ -1,533 +0,0 @@ -#!/usr/bin/python -# -*- coding: utf-8 -*- -# NOTE: the shebang and encoding lines are for ScriptHeaderTests; do not remove -from unittest import TestCase, makeSuite; from pkg_resources import * -from setuptools.command.easy_install import get_script_header, is_sh -import os, pkg_resources, sys, StringIO -try: frozenset -except NameError: - from sets import ImmutableSet as frozenset - -class Metadata(EmptyProvider): - """Mock object to return metadata as if from an on-disk distribution""" - - def __init__(self,*pairs): - self.metadata = dict(pairs) - - def has_metadata(self,name): - return name in self.metadata - - def get_metadata(self,name): - return self.metadata[name] - - def 
get_metadata_lines(self,name): - return yield_lines(self.get_metadata(name)) - -class DistroTests(TestCase): - - def testCollection(self): - # empty path should produce no distributions - ad = Environment([], platform=None, python=None) - self.assertEqual(list(ad), []) - self.assertEqual(ad['FooPkg'],[]) - ad.add(Distribution.from_filename("FooPkg-1.3_1.egg")) - ad.add(Distribution.from_filename("FooPkg-1.4-py2.4-win32.egg")) - ad.add(Distribution.from_filename("FooPkg-1.2-py2.4.egg")) - - # Name is in there now - self.failUnless(ad['FooPkg']) - # But only 1 package - self.assertEqual(list(ad), ['foopkg']) - - # Distributions sort by version - self.assertEqual( - [dist.version for dist in ad['FooPkg']], ['1.4','1.3-1','1.2'] - ) - # Removing a distribution leaves sequence alone - ad.remove(ad['FooPkg'][1]) - self.assertEqual( - [dist.version for dist in ad['FooPkg']], ['1.4','1.2'] - ) - # And inserting adds them in order - ad.add(Distribution.from_filename("FooPkg-1.9.egg")) - self.assertEqual( - [dist.version for dist in ad['FooPkg']], ['1.9','1.4','1.2'] - ) - - ws = WorkingSet([]) - foo12 = Distribution.from_filename("FooPkg-1.2-py2.4.egg") - foo14 = Distribution.from_filename("FooPkg-1.4-py2.4-win32.egg") - req, = parse_requirements("FooPkg>=1.3") - - # Nominal case: no distros on path, should yield all applicable - self.assertEqual(ad.best_match(req,ws).version, '1.9') - # If a matching distro is already installed, should return only that - ws.add(foo14); self.assertEqual(ad.best_match(req,ws).version, '1.4') - - # If the first matching distro is unsuitable, it's a version conflict - ws = WorkingSet([]); ws.add(foo12); ws.add(foo14) - self.assertRaises(VersionConflict, ad.best_match, req, ws) - - # If more than one match on the path, the first one takes precedence - ws = WorkingSet([]); ws.add(foo14); ws.add(foo12); ws.add(foo14); - self.assertEqual(ad.best_match(req,ws).version, '1.4') - - def checkFooPkg(self,d): - self.assertEqual(d.project_name, "FooPkg") - self.assertEqual(d.key, "foopkg") - self.assertEqual(d.version, "1.3-1") - self.assertEqual(d.py_version, "2.4") - self.assertEqual(d.platform, "win32") - self.assertEqual(d.parsed_version, parse_version("1.3-1")) - - def testDistroBasics(self): - d = Distribution( - "/some/path", - project_name="FooPkg",version="1.3-1",py_version="2.4",platform="win32" - ) - self.checkFooPkg(d) - - d = Distribution("/some/path") - self.assertEqual(d.py_version, sys.version[:3]) - self.assertEqual(d.platform, None) - - def testDistroParse(self): - d = Distribution.from_filename("FooPkg-1.3_1-py2.4-win32.egg") - self.checkFooPkg(d) - d = Distribution.from_filename("FooPkg-1.3_1-py2.4-win32.egg-info") - self.checkFooPkg(d) - - def testDistroMetadata(self): - d = Distribution( - "/some/path", project_name="FooPkg", py_version="2.4", platform="win32", - metadata = Metadata( - ('PKG-INFO',"Metadata-Version: 1.0\nVersion: 1.3-1\n") - ) - ) - self.checkFooPkg(d) - - - def distRequires(self, txt): - return Distribution("/foo", metadata=Metadata(('depends.txt', txt))) - - def checkRequires(self, dist, txt, extras=()): - self.assertEqual( - list(dist.requires(extras)), - list(parse_requirements(txt)) - ) - - def testDistroDependsSimple(self): - for v in "Twisted>=1.5", "Twisted>=1.5\nZConfig>=2.0": - self.checkRequires(self.distRequires(v), v) - - - def testResolve(self): - ad = Environment([]); ws = WorkingSet([]) - # Resolving no requirements -> nothing to install - self.assertEqual( list(ws.resolve([],ad)), [] ) - # Request something not in the 
collection -> DistributionNotFound - self.assertRaises( - DistributionNotFound, ws.resolve, parse_requirements("Foo"), ad - ) - Foo = Distribution.from_filename( - "/foo_dir/Foo-1.2.egg", - metadata=Metadata(('depends.txt', "[bar]\nBaz>=2.0")) - ) - ad.add(Foo); ad.add(Distribution.from_filename("Foo-0.9.egg")) - - # Request thing(s) that are available -> list to activate - for i in range(3): - targets = list(ws.resolve(parse_requirements("Foo"), ad)) - self.assertEqual(targets, [Foo]) - map(ws.add,targets) - self.assertRaises(VersionConflict, ws.resolve, - parse_requirements("Foo==0.9"), ad) - ws = WorkingSet([]) # reset - - # Request an extra that causes an unresolved dependency for "Baz" - self.assertRaises( - DistributionNotFound, ws.resolve,parse_requirements("Foo[bar]"), ad - ) - Baz = Distribution.from_filename( - "/foo_dir/Baz-2.1.egg", metadata=Metadata(('depends.txt', "Foo")) - ) - ad.add(Baz) - - # Activation list now includes resolved dependency - self.assertEqual( - list(ws.resolve(parse_requirements("Foo[bar]"), ad)), [Foo,Baz] - ) - # Requests for conflicting versions produce VersionConflict - self.assertRaises( VersionConflict, - ws.resolve, parse_requirements("Foo==1.2\nFoo!=1.2"), ad - ) - - def testDistroDependsOptions(self): - d = self.distRequires(""" - Twisted>=1.5 - [docgen] - ZConfig>=2.0 - docutils>=0.3 - [fastcgi] - fcgiapp>=0.1""") - self.checkRequires(d,"Twisted>=1.5") - self.checkRequires( - d,"Twisted>=1.5 ZConfig>=2.0 docutils>=0.3".split(), ["docgen"] - ) - self.checkRequires( - d,"Twisted>=1.5 fcgiapp>=0.1".split(), ["fastcgi"] - ) - self.checkRequires( - d,"Twisted>=1.5 ZConfig>=2.0 docutils>=0.3 fcgiapp>=0.1".split(), - ["docgen","fastcgi"] - ) - self.checkRequires( - d,"Twisted>=1.5 fcgiapp>=0.1 ZConfig>=2.0 docutils>=0.3".split(), - ["fastcgi", "docgen"] - ) - self.assertRaises(UnknownExtra, d.requires, ["foo"]) - - - - - - - - - - - - - - - - - -class EntryPointTests(TestCase): - - def assertfields(self, ep): - self.assertEqual(ep.name,"foo") - self.assertEqual(ep.module_name,"setuptools.tests.test_resources") - self.assertEqual(ep.attrs, ("EntryPointTests",)) - self.assertEqual(ep.extras, ("x",)) - self.failUnless(ep.load() is EntryPointTests) - self.assertEqual( - str(ep), - "foo = setuptools.tests.test_resources:EntryPointTests [x]" - ) - - def setUp(self): - self.dist = Distribution.from_filename( - "FooPkg-1.2-py2.4.egg", metadata=Metadata(('requires.txt','[x]'))) - - def testBasics(self): - ep = EntryPoint( - "foo", "setuptools.tests.test_resources", ["EntryPointTests"], - ["x"], self.dist - ) - self.assertfields(ep) - - def testParse(self): - s = "foo = setuptools.tests.test_resources:EntryPointTests [x]" - ep = EntryPoint.parse(s, self.dist) - self.assertfields(ep) - - ep = EntryPoint.parse("bar baz= spammity[PING]") - self.assertEqual(ep.name,"bar baz") - self.assertEqual(ep.module_name,"spammity") - self.assertEqual(ep.attrs, ()) - self.assertEqual(ep.extras, ("ping",)) - - ep = EntryPoint.parse(" fizzly = wocka:foo") - self.assertEqual(ep.name,"fizzly") - self.assertEqual(ep.module_name,"wocka") - self.assertEqual(ep.attrs, ("foo",)) - self.assertEqual(ep.extras, ()) - - def testRejects(self): - for ep in [ - "foo", "x=1=2", "x=a:b:c", "q=x/na", "fez=pish:tush-z", "x=f[a]>2", - ]: - try: EntryPoint.parse(ep) - except ValueError: pass - else: raise AssertionError("Should've been bad", ep) - - def checkSubMap(self, m): - self.assertEqual(len(m), len(self.submap_expect)) - for key, ep in self.submap_expect.iteritems(): - 
self.assertEqual(repr(m.get(key)), repr(ep)) - - submap_expect = dict( - feature1=EntryPoint('feature1', 'somemodule', ['somefunction']), - feature2=EntryPoint('feature2', 'another.module', ['SomeClass'], ['extra1','extra2']), - feature3=EntryPoint('feature3', 'this.module', extras=['something']) - ) - submap_str = """ - # define features for blah blah - feature1 = somemodule:somefunction - feature2 = another.module:SomeClass [extra1,extra2] - feature3 = this.module [something] - """ - - def testParseList(self): - self.checkSubMap(EntryPoint.parse_group("xyz", self.submap_str)) - self.assertRaises(ValueError, EntryPoint.parse_group, "x a", "foo=bar") - self.assertRaises(ValueError, EntryPoint.parse_group, "x", - ["foo=baz", "foo=bar"]) - - def testParseMap(self): - m = EntryPoint.parse_map({'xyz':self.submap_str}) - self.checkSubMap(m['xyz']) - self.assertEqual(m.keys(),['xyz']) - m = EntryPoint.parse_map("[xyz]\n"+self.submap_str) - self.checkSubMap(m['xyz']) - self.assertEqual(m.keys(),['xyz']) - self.assertRaises(ValueError, EntryPoint.parse_map, ["[xyz]", "[xyz]"]) - self.assertRaises(ValueError, EntryPoint.parse_map, self.submap_str) - -class RequirementsTests(TestCase): - - def testBasics(self): - r = Requirement.parse("Twisted>=1.2") - self.assertEqual(str(r),"Twisted>=1.2") - self.assertEqual(repr(r),"Requirement.parse('Twisted>=1.2')") - self.assertEqual(r, Requirement("Twisted", [('>=','1.2')], ())) - self.assertEqual(r, Requirement("twisTed", [('>=','1.2')], ())) - self.assertNotEqual(r, Requirement("Twisted", [('>=','2.0')], ())) - self.assertNotEqual(r, Requirement("Zope", [('>=','1.2')], ())) - self.assertNotEqual(r, Requirement("Zope", [('>=','3.0')], ())) - self.assertNotEqual(r, Requirement.parse("Twisted[extras]>=1.2")) - - def testOrdering(self): - r1 = Requirement("Twisted", [('==','1.2c1'),('>=','1.2')], ()) - r2 = Requirement("Twisted", [('>=','1.2'),('==','1.2c1')], ()) - self.assertEqual(r1,r2) - self.assertEqual(str(r1),str(r2)) - self.assertEqual(str(r2),"Twisted==1.2c1,>=1.2") - - def testBasicContains(self): - r = Requirement("Twisted", [('>=','1.2')], ()) - foo_dist = Distribution.from_filename("FooPkg-1.3_1.egg") - twist11 = Distribution.from_filename("Twisted-1.1.egg") - twist12 = Distribution.from_filename("Twisted-1.2.egg") - self.failUnless(parse_version('1.2') in r) - self.failUnless(parse_version('1.1') not in r) - self.failUnless('1.2' in r) - self.failUnless('1.1' not in r) - self.failUnless(foo_dist not in r) - self.failUnless(twist11 not in r) - self.failUnless(twist12 in r) - - def testAdvancedContains(self): - r, = parse_requirements("Foo>=1.2,<=1.3,==1.9,>2.0,!=2.5,<3.0,==4.5") - for v in ('1.2','1.2.2','1.3','1.9','2.0.1','2.3','2.6','3.0c1','4.5'): - self.failUnless(v in r, (v,r)) - for v in ('1.2c1','1.3.1','1.5','1.9.1','2.0','2.5','3.0','4.0'): - self.failUnless(v not in r, (v,r)) - - - def testOptionsAndHashing(self): - r1 = Requirement.parse("Twisted[foo,bar]>=1.2") - r2 = Requirement.parse("Twisted[bar,FOO]>=1.2") - r3 = Requirement.parse("Twisted[BAR,FOO]>=1.2.0") - self.assertEqual(r1,r2) - self.assertEqual(r1,r3) - self.assertEqual(r1.extras, ("foo","bar")) - self.assertEqual(r2.extras, ("bar","foo")) # extras are normalized - self.assertEqual(hash(r1), hash(r2)) - self.assertEqual( - hash(r1), hash(("twisted", ((">=",parse_version("1.2")),), - frozenset(["foo","bar"]))) - ) - - def testVersionEquality(self): - r1 = Requirement.parse("setuptools==0.3a2") - r2 = Requirement.parse("setuptools!=0.3a4") - d = Distribution.from_filename - - 
self.failIf(d("setuptools-0.3a4.egg") in r1) - self.failIf(d("setuptools-0.3a1.egg") in r1) - self.failIf(d("setuptools-0.3a4.egg") in r2) - - self.failUnless(d("setuptools-0.3a2.egg") in r1) - self.failUnless(d("setuptools-0.3a2.egg") in r2) - self.failUnless(d("setuptools-0.3a3.egg") in r2) - self.failUnless(d("setuptools-0.3a5.egg") in r2) - - - - - - - - - - - - - - -class ParseTests(TestCase): - - def testEmptyParse(self): - self.assertEqual(list(parse_requirements('')), []) - - def testYielding(self): - for inp,out in [ - ([], []), ('x',['x']), ([[]],[]), (' x\n y', ['x','y']), - (['x\n\n','y'], ['x','y']), - ]: - self.assertEqual(list(pkg_resources.yield_lines(inp)),out) - - def testSplitting(self): - self.assertEqual( - list( - pkg_resources.split_sections(""" - x - [Y] - z - - a - [b ] - # foo - c - [ d] - [q] - v - """ - ) - ), - [(None,["x"]), ("Y",["z","a"]), ("b",["c"]), ("d",[]), ("q",["v"])] - ) - self.assertRaises(ValueError,list,pkg_resources.split_sections("[foo")) - - def testSafeName(self): - self.assertEqual(safe_name("adns-python"), "adns-python") - self.assertEqual(safe_name("WSGI Utils"), "WSGI-Utils") - self.assertEqual(safe_name("WSGI Utils"), "WSGI-Utils") - self.assertEqual(safe_name("Money$$$Maker"), "Money-Maker") - self.assertNotEqual(safe_name("peak.web"), "peak-web") - - def testSafeVersion(self): - self.assertEqual(safe_version("1.2-1"), "1.2-1") - self.assertEqual(safe_version("1.2 alpha"), "1.2.alpha") - self.assertEqual(safe_version("2.3.4 20050521"), "2.3.4.20050521") - self.assertEqual(safe_version("Money$$$Maker"), "Money-Maker") - self.assertEqual(safe_version("peak.web"), "peak.web") - - def testSimpleRequirements(self): - self.assertEqual( - list(parse_requirements('Twis-Ted>=1.2-1')), - [Requirement('Twis-Ted',[('>=','1.2-1')], ())] - ) - self.assertEqual( - list(parse_requirements('Twisted >=1.2, \ # more\n<2.0')), - [Requirement('Twisted',[('>=','1.2'),('<','2.0')], ())] - ) - self.assertEqual( - Requirement.parse("FooBar==1.99a3"), - Requirement("FooBar", [('==','1.99a3')], ()) - ) - self.assertRaises(ValueError,Requirement.parse,">=2.3") - self.assertRaises(ValueError,Requirement.parse,"x\\") - self.assertRaises(ValueError,Requirement.parse,"x==2 q") - self.assertRaises(ValueError,Requirement.parse,"X==1\nY==2") - self.assertRaises(ValueError,Requirement.parse,"#") - - def testVersionEquality(self): - def c(s1,s2): - p1, p2 = parse_version(s1),parse_version(s2) - self.assertEqual(p1,p2, (s1,s2,p1,p2)) - - c('1.2-rc1', '1.2rc1') - c('0.4', '0.4.0') - c('0.4.0.0', '0.4.0') - c('0.4.0-0', '0.4-0') - c('0pl1', '0.0pl1') - c('0pre1', '0.0c1') - c('0.0.0preview1', '0c1') - c('0.0c1', '0-rc1') - c('1.2a1', '1.2.a.1'); c('1.2...a', '1.2a') - - def testVersionOrdering(self): - def c(s1,s2): - p1, p2 = parse_version(s1),parse_version(s2) - self.failUnless(p1>> import os, sys, tempfile - >>> from setuptools.command.easy_install import nt_quote_arg - >>> sample_directory = tempfile.mkdtemp() - >>> open(os.path.join(sample_directory, 'foo-script.py'), 'w').write( - ... """#!%(python_exe)s - ... import sys - ... input = repr(sys.stdin.read()) - ... print sys.argv[0][-14:] - ... print sys.argv[1:] - ... print input - ... if __debug__: - ... print 'non-optimized' - ... """ % dict(python_exe=nt_quote_arg(sys.executable))) - -Note that the script starts with a Unix-style '#!' line saying which -Python executable to run. The wrapper will use this to find the -correct Python executable. 
- -We'll also copy cli.exe to the sample-directory with the name foo.exe: - - >>> import pkg_resources - >>> open(os.path.join(sample_directory, 'foo.exe'), 'wb').write( - ... pkg_resources.resource_string('setuptools', 'cli.exe') - ... ) - -When the copy of cli.exe, foo.exe in this example, runs, it examines -the path name it was run with and computes a Python script path name -by removing the '.exe' suffic and adding the '-script.py' suffix. (For -GUI programs, the suffix '-script-pyw' is added.) This is why we -named out script the way we did. Now we can run out script by running -the wrapper: - - >>> import os - >>> input, output = os.popen4('"'+nt_quote_arg(os.path.join(sample_directory, 'foo.exe')) - ... + r' arg1 "arg 2" "arg \"2\\\"" "arg 4\\" "arg5 a\\b"') - >>> input.write('hello\nworld\n') - >>> input.close() - >>> print output.read(), - \foo-script.py - ['arg1', 'arg 2', 'arg "2\\"', 'arg 4\\', 'arg5 a\\\\b'] - 'hello\nworld\n' - non-optimized - -This example was a little pathological in that it exercised windows -(MS C runtime) quoting rules: - -- Strings containing spaces are surrounded by double quotes. - -- Double quotes in strings need to be escaped by preceding them with - back slashes. - -- One or more backslashes preceding double quotes quotes need to be - escaped by preceding each of them them with back slashes. - - -Specifying Python Command-line Options --------------------------------------- - -You can specify a single argument on the '#!' line. This can be used -to specify Python options like -O, to run in optimized mode or -i -to start the interactive interpreter. You can combine multiple -options as usual. For example, to run in optimized mode and -enter the interpreter after running the script, you could use -Oi: - - >>> open(os.path.join(sample_directory, 'foo-script.py'), 'w').write( - ... """#!%(python_exe)s -Oi - ... import sys - ... input = repr(sys.stdin.read()) - ... print sys.argv[0][-14:] - ... print sys.argv[1:] - ... print input - ... if __debug__: - ... print 'non-optimized' - ... sys.ps1 = '---' - ... """ % dict(python_exe=nt_quote_arg(sys.executable))) - - >>> input, output = os.popen4(nt_quote_arg(os.path.join(sample_directory, 'foo.exe'))) - >>> input.close() - >>> print output.read(), - \foo-script.py - [] - '' - --- - -Testing the GUI Version ------------------------ - -Now let's test the GUI version with the simple scipt, bar-script.py: - - >>> import os, sys, tempfile - >>> from setuptools.command.easy_install import nt_quote_arg - >>> sample_directory = tempfile.mkdtemp() - >>> open(os.path.join(sample_directory, 'bar-script.pyw'), 'w').write( - ... """#!%(python_exe)s - ... import sys - ... open(sys.argv[1], 'wb').write(repr(sys.argv[2])) - ... """ % dict(python_exe=nt_quote_arg(sys.executable))) - -We'll also copy gui.exe to the sample-directory with the name bar.exe: - - >>> import pkg_resources - >>> open(os.path.join(sample_directory, 'bar.exe'), 'wb').write( - ... pkg_resources.resource_string('setuptools', 'gui.exe') - ... ) - -Finally, we'll run the script and check the result: - - >>> import os - >>> input, output = os.popen4('"'+nt_quote_arg(os.path.join(sample_directory, 'bar.exe')) - ... 
+ r' "%s" "Test Argument"' % os.path.join(sample_directory, 'test_output.txt')) - >>> input.close() - >>> print output.read() - - >>> print open(os.path.join(sample_directory, 'test_output.txt'), 'rb').read() - 'Test Argument' - - -We're done with the sample_directory: - - >>> import shutil - >>> shutil.rmtree(sample_directory) - diff --git a/setuptools/unicode_utils.py b/setuptools/unicode_utils.py new file mode 100644 index 0000000..6a84f9b --- /dev/null +++ b/setuptools/unicode_utils.py @@ -0,0 +1,44 @@ +import unicodedata +import sys + +import six + + +# HFS Plus uses decomposed UTF-8 +def decompose(path): + if isinstance(path, six.text_type): + return unicodedata.normalize('NFD', path) + try: + path = path.decode('utf-8') + path = unicodedata.normalize('NFD', path) + path = path.encode('utf-8') + except UnicodeError: + pass # Not UTF-8 + return path + + +def filesys_decode(path): + """ + Ensure that the given path is decoded, + NONE when no expected encoding works + """ + + if isinstance(path, six.text_type): + return path + + fs_enc = sys.getfilesystemencoding() or 'utf-8' + candidates = fs_enc, 'utf-8' + + for enc in candidates: + try: + return path.decode(enc) + except UnicodeDecodeError: + continue + + +def try_encode(string, enc): + "turn unicode encoding into a functional routine" + try: + return string.encode(enc) + except UnicodeEncodeError: + return None diff --git a/setuptools/version.py b/setuptools/version.py new file mode 100644 index 0000000..95e1869 --- /dev/null +++ b/setuptools/version.py @@ -0,0 +1,6 @@ +import pkg_resources + +try: + __version__ = pkg_resources.get_distribution('setuptools').version +except Exception: + __version__ = 'unknown' diff --git a/setuptools/windows_support.py b/setuptools/windows_support.py new file mode 100644 index 0000000..cb977cf --- /dev/null +++ b/setuptools/windows_support.py @@ -0,0 +1,29 @@ +import platform +import ctypes + + +def windows_only(func): + if platform.system() != 'Windows': + return lambda *args, **kwargs: None + return func + + +@windows_only +def hide_file(path): + """ + Set the hidden attribute on a file or directory. + + From http://stackoverflow.com/questions/19622133/ + + `path` must be text. + """ + __import__('ctypes.wintypes') + SetFileAttributes = ctypes.windll.kernel32.SetFileAttributesW + SetFileAttributes.argtypes = ctypes.wintypes.LPWSTR, ctypes.wintypes.DWORD + SetFileAttributes.restype = ctypes.wintypes.BOOL + + FILE_ATTRIBUTE_HIDDEN = 0x02 + + ret = SetFileAttributes(path, FILE_ATTRIBUTE_HIDDEN) + if not ret: + raise ctypes.WinError() diff --git a/site.py b/site.py deleted file mode 100755 index 80e084b..0000000 --- a/site.py +++ /dev/null @@ -1,82 +0,0 @@ -def __boot(): - import sys, imp, os, os.path - PYTHONPATH = os.environ.get('PYTHONPATH') - if PYTHONPATH is None or (sys.platform=='win32' and not PYTHONPATH): - PYTHONPATH = [] - else: - PYTHONPATH = PYTHONPATH.split(os.pathsep) - - pic = getattr(sys,'path_importer_cache',{}) - stdpath = sys.path[len(PYTHONPATH):] - mydir = os.path.dirname(__file__) - #print "searching",stdpath,sys.path - - for item in stdpath: - if item==mydir or not item: - continue # skip if current dir. 
on Windows, or my own directory - importer = pic.get(item) - if importer is not None: - loader = importer.find_module('site') - if loader is not None: - # This should actually reload the current module - loader.load_module('site') - break - else: - try: - stream, path, descr = imp.find_module('site',[item]) - except ImportError: - continue - if stream is None: - continue - try: - # This should actually reload the current module - imp.load_module('site',stream,path,descr) - finally: - stream.close() - break - else: - raise ImportError("Couldn't find the real 'site' module") - - #print "loaded", __file__ - - known_paths = dict([(makepath(item)[1],1) for item in sys.path]) # 2.2 comp - - oldpos = getattr(sys,'__egginsert',0) # save old insertion position - sys.__egginsert = 0 # and reset the current one - - for item in PYTHONPATH: - addsitedir(item) - - sys.__egginsert += oldpos # restore effective old position - - d,nd = makepath(stdpath[0]) - insert_at = None - new_path = [] - - for item in sys.path: - p,np = makepath(item) - - if np==nd and insert_at is None: - # We've hit the first 'system' path entry, so added entries go here - insert_at = len(new_path) - - if np in known_paths or insert_at is None: - new_path.append(item) - else: - # new path after the insert point, back-insert it - new_path.insert(insert_at, item) - insert_at += 1 - - sys.path[:] = new_path - -if __name__=='site': - __boot() - del __boot - - - - - - - - diff --git a/tests/manual_test.py b/tests/manual_test.py new file mode 100644 index 0000000..e5aaf17 --- /dev/null +++ b/tests/manual_test.py @@ -0,0 +1,98 @@ +#!/usr/bin/env python + +import sys +import os +import shutil +import tempfile +import subprocess +from distutils.command.install import INSTALL_SCHEMES +from string import Template + +from six.moves import urllib + + +def _system_call(*args): + assert subprocess.call(args) == 0 + + +def tempdir(func): + def _tempdir(*args, **kwargs): + test_dir = tempfile.mkdtemp() + old_dir = os.getcwd() + os.chdir(test_dir) + try: + return func(*args, **kwargs) + finally: + os.chdir(old_dir) + shutil.rmtree(test_dir) + + return _tempdir + + +SIMPLE_BUILDOUT = """\ +[buildout] + +parts = eggs + +[eggs] +recipe = zc.recipe.egg + +eggs = + extensions +""" + +BOOTSTRAP = 'http://downloads.buildout.org/1/bootstrap.py' +PYVER = sys.version.split()[0][:3] + +_VARS = {'base': '.', + 'py_version_short': PYVER} + +scheme = 'nt' if sys.platform == 'win32' else 'unix_prefix' +PURELIB = INSTALL_SCHEMES[scheme]['purelib'] + + +@tempdir +def test_virtualenv(): + """virtualenv with setuptools""" + purelib = os.path.abspath(Template(PURELIB).substitute(**_VARS)) + _system_call('virtualenv', '--no-site-packages', '.') + _system_call('bin/easy_install', 'setuptools==dev') + # linux specific + site_pkg = os.listdir(purelib) + site_pkg.sort() + assert 'setuptools' in site_pkg[0] + easy_install = os.path.join(purelib, 'easy-install.pth') + with open(easy_install) as f: + res = f.read() + assert 'setuptools' in res + + +@tempdir +def test_full(): + """virtualenv + pip + buildout""" + _system_call('virtualenv', '--no-site-packages', '.') + _system_call('bin/easy_install', '-q', 'setuptools==dev') + _system_call('bin/easy_install', '-qU', 'setuptools==dev') + _system_call('bin/easy_install', '-q', 'pip') + _system_call('bin/pip', 'install', '-q', 'zc.buildout') + + with open('buildout.cfg', 'w') as f: + f.write(SIMPLE_BUILDOUT) + + with open('bootstrap.py', 'w') as f: + f.write(urllib.request.urlopen(BOOTSTRAP).read()) + + _system_call('bin/python', 
'bootstrap.py') + _system_call('bin/buildout', '-q') + eggs = os.listdir('eggs') + eggs.sort() + assert len(eggs) == 3 + assert eggs[1].startswith('setuptools') + del eggs[1] + assert eggs == ['extensions-0.3-py2.6.egg', + 'zc.recipe.egg-1.2.2-py2.6.egg'] + + +if __name__ == '__main__': + test_virtualenv() + test_full() diff --git a/tests/shlib_test/hello.c b/tests/shlib_test/hello.c deleted file mode 100755 index 9998372..0000000 --- a/tests/shlib_test/hello.c +++ /dev/null @@ -1,168 +0,0 @@ -/* Generated by Pyrex 0.9.3 on Thu Jan 05 17:47:12 2006 */ - -#include "Python.h" -#include "structmember.h" -#ifndef PY_LONG_LONG - #define PY_LONG_LONG LONG_LONG -#endif - - -typedef struct {PyObject **p; char *s;} __Pyx_InternTabEntry; /*proto*/ -typedef struct {PyObject **p; char *s; long n;} __Pyx_StringTabEntry; /*proto*/ -static PyObject *__Pyx_UnpackItem(PyObject *, int); /*proto*/ -static int __Pyx_EndUnpack(PyObject *, int); /*proto*/ -static int __Pyx_PrintItem(PyObject *); /*proto*/ -static int __Pyx_PrintNewline(void); /*proto*/ -static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb); /*proto*/ -static void __Pyx_ReRaise(void); /*proto*/ -static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list); /*proto*/ -static PyObject *__Pyx_GetExcValue(void); /*proto*/ -static int __Pyx_ArgTypeTest(PyObject *obj, PyTypeObject *type, int none_allowed, char *name); /*proto*/ -static int __Pyx_TypeTest(PyObject *obj, PyTypeObject *type); /*proto*/ -static int __Pyx_GetStarArgs(PyObject **args, PyObject **kwds, char *kwd_list[], int nargs, PyObject **args2, PyObject **kwds2); /*proto*/ -static void __Pyx_WriteUnraisable(char *name); /*proto*/ -static void __Pyx_AddTraceback(char *funcname); /*proto*/ -static PyTypeObject *__Pyx_ImportType(char *module_name, char *class_name, long size); /*proto*/ -static int __Pyx_SetVtable(PyObject *dict, void *vtable); /*proto*/ -static int __Pyx_GetVtable(PyObject *dict, void *vtabptr); /*proto*/ -static PyObject *__Pyx_CreateClass(PyObject *bases, PyObject *dict, PyObject *name, char *modname); /*proto*/ -static int __Pyx_InternStrings(__Pyx_InternTabEntry *t); /*proto*/ -static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); /*proto*/ -static PyObject *__Pyx_GetName(PyObject *dict, PyObject *name); /*proto*/ - -static PyObject *__pyx_m; -static PyObject *__pyx_b; -static int __pyx_lineno; -static char *__pyx_filename; -staticforward char **__pyx_f; - -/* Declarations from hello */ - -char (*(get_hello_msg(void))); /*proto*/ - -/* Implementation of hello */ - -static PyObject *__pyx_n_hello; - -static PyObject *__pyx_f_5hello_hello(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ -static PyObject *__pyx_f_5hello_hello(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { - PyObject *__pyx_r; - PyObject *__pyx_1 = 0; - static char *__pyx_argnames[] = {0}; - if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "", __pyx_argnames)) return 0; - - /* "C:\cygwin\home\pje\setuptools\tests\shlib_test\hello.pyx":4 */ - __pyx_1 = PyString_FromString(get_hello_msg()); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 4; goto __pyx_L1;} - __pyx_r = __pyx_1; - __pyx_1 = 0; - goto __pyx_L0; - - __pyx_r = Py_None; Py_INCREF(__pyx_r); - goto __pyx_L0; - __pyx_L1:; - Py_XDECREF(__pyx_1); - __Pyx_AddTraceback("hello.hello"); - __pyx_r = 0; - __pyx_L0:; - return __pyx_r; -} - -static __Pyx_InternTabEntry __pyx_intern_tab[] = { - {&__pyx_n_hello, "hello"}, - {0, 0} -}; - -static struct PyMethodDef 
__pyx_methods[] = { - {"hello", (PyCFunction)__pyx_f_5hello_hello, METH_VARARGS|METH_KEYWORDS, 0}, - {0, 0, 0, 0} -}; - -DL_EXPORT(void) inithello(void); /*proto*/ -DL_EXPORT(void) inithello(void) { - __pyx_m = Py_InitModule4("hello", __pyx_methods, 0, 0, PYTHON_API_VERSION); - if (!__pyx_m) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; - __pyx_b = PyImport_AddModule("__builtin__"); - if (!__pyx_b) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; - if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; - if (__Pyx_InternStrings(__pyx_intern_tab) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; - - /* "C:\cygwin\home\pje\setuptools\tests\shlib_test\hello.pyx":3 */ - return; - __pyx_L1:; - __Pyx_AddTraceback("hello"); -} - -static char *__pyx_filenames[] = { - "hello.pyx", -}; -statichere char **__pyx_f = __pyx_filenames; - -/* Runtime support code */ - -static int __Pyx_InternStrings(__Pyx_InternTabEntry *t) { - while (t->p) { - *t->p = PyString_InternFromString(t->s); - if (!*t->p) - return -1; - ++t; - } - return 0; -} - -#include "compile.h" -#include "frameobject.h" -#include "traceback.h" - -static void __Pyx_AddTraceback(char *funcname) { - PyObject *py_srcfile = 0; - PyObject *py_funcname = 0; - PyObject *py_globals = 0; - PyObject *empty_tuple = 0; - PyObject *empty_string = 0; - PyCodeObject *py_code = 0; - PyFrameObject *py_frame = 0; - - py_srcfile = PyString_FromString(__pyx_filename); - if (!py_srcfile) goto bad; - py_funcname = PyString_FromString(funcname); - if (!py_funcname) goto bad; - py_globals = PyModule_GetDict(__pyx_m); - if (!py_globals) goto bad; - empty_tuple = PyTuple_New(0); - if (!empty_tuple) goto bad; - empty_string = PyString_FromString(""); - if (!empty_string) goto bad; - py_code = PyCode_New( - 0, /*int argcount,*/ - 0, /*int nlocals,*/ - 0, /*int stacksize,*/ - 0, /*int flags,*/ - empty_string, /*PyObject *code,*/ - empty_tuple, /*PyObject *consts,*/ - empty_tuple, /*PyObject *names,*/ - empty_tuple, /*PyObject *varnames,*/ - empty_tuple, /*PyObject *freevars,*/ - empty_tuple, /*PyObject *cellvars,*/ - py_srcfile, /*PyObject *filename,*/ - py_funcname, /*PyObject *name,*/ - __pyx_lineno, /*int firstlineno,*/ - empty_string /*PyObject *lnotab*/ - ); - if (!py_code) goto bad; - py_frame = PyFrame_New( - PyThreadState_Get(), /*PyThreadState *tstate,*/ - py_code, /*PyCodeObject *code,*/ - py_globals, /*PyObject *globals,*/ - 0 /*PyObject *locals*/ - ); - if (!py_frame) goto bad; - py_frame->f_lineno = __pyx_lineno; - PyTraceBack_Here(py_frame); -bad: - Py_XDECREF(py_srcfile); - Py_XDECREF(py_funcname); - Py_XDECREF(empty_tuple); - Py_XDECREF(empty_string); - Py_XDECREF(py_code); - Py_XDECREF(py_frame); -} diff --git a/tests/shlib_test/hello.pyx b/tests/shlib_test/hello.pyx deleted file mode 100755 index 58ce691..0000000 --- a/tests/shlib_test/hello.pyx +++ /dev/null @@ -1,4 +0,0 @@ -cdef extern char *get_hello_msg() - -def hello(): - return get_hello_msg() diff --git a/tests/shlib_test/hellolib.c b/tests/shlib_test/hellolib.c deleted file mode 100755 index 88d65ce..0000000 --- a/tests/shlib_test/hellolib.c +++ /dev/null @@ -1,3 +0,0 @@ -extern char* get_hello_msg() { - return "Hello, world!"; -} diff --git a/tests/shlib_test/setup.py b/tests/shlib_test/setup.py deleted file mode 100755 index b0c9399..0000000 --- a/tests/shlib_test/setup.py +++ /dev/null @@ -1,10 +0,0 @@ -from setuptools import setup, Extension, Library - -setup( 
-    name="shlib_test",
-    ext_modules = [
-        Library("hellolib", ["hellolib.c"]),
-        Extension("hello", ["hello.pyx"], libraries=["hellolib"])
-        ],
-    test_suite="test_hello.HelloWorldTest",
-)
diff --git a/tests/shlib_test/test_hello.py b/tests/shlib_test/test_hello.py
deleted file mode 100755
index 6da02e3..0000000
--- a/tests/shlib_test/test_hello.py
+++ /dev/null
@@ -1,7 +0,0 @@
-from unittest import TestCase
-
-class HelloWorldTest(TestCase):
-    def testHelloMsg(self):
-        from hello import hello
-        self.assertEqual(hello(), "Hello, world!")
-
diff --git a/tests/test_pypi.py b/tests/test_pypi.py
new file mode 100644
index 0000000..173b2c8
--- /dev/null
+++ b/tests/test_pypi.py
@@ -0,0 +1,82 @@
+import os
+import subprocess
+
+import virtualenv
+from six.moves import http_client
+from six.moves import xmlrpc_client
+
+TOP = 200
+PYPI_HOSTNAME = 'pypi.python.org'
+
+
+def rpc_pypi(method, *args):
+    """Call an XML-RPC method on the Pypi server."""
+    conn = http_client.HTTPSConnection(PYPI_HOSTNAME)
+    headers = {'Content-Type': 'text/xml'}
+    payload = xmlrpc_client.dumps(args, method)
+
+    conn.request("POST", "/pypi", payload, headers)
+    response = conn.getresponse()
+    if response.status == 200:
+        result = xmlrpc_client.loads(response.read())[0][0]
+        return result
+    else:
+        raise RuntimeError("Unable to download the list of top "
+                           "packages from Pypi.")
+
+
+def get_top_packages(limit):
+    """Collect the names of the top packages on Pypi."""
+    packages = rpc_pypi('top_packages')
+    return packages[:limit]
+
+
+def _package_install(package_name, tmp_dir=None, local_setuptools=True):
+    """Try to install a package and return the exit status.
+
+    This function creates a virtual environment, installs setuptools
+    using pip and then installs the required package. If local_setuptools
+    is True, it will install the local version of setuptools.
+    """
+    package_dir = os.path.join(tmp_dir, "test_%s" % package_name)
+    if not local_setuptools:
+        package_dir = package_dir + "_baseline"
+
+    virtualenv.create_environment(package_dir)
+
+    pip_path = os.path.join(package_dir, "bin", "pip")
+    if local_setuptools:
+        subprocess.check_call([pip_path, "install", "."])
+    returncode = subprocess.call([pip_path, "install", package_name])
+    return returncode
+
+
+def test_package_install(package_name, tmpdir):
+    """Test to verify the outcome of installing a package.
+
+    This test checks that the return code when installing a package is
+    the same as with the current stable version of setuptools.
+    """
+    new_exit_status = _package_install(package_name, tmp_dir=str(tmpdir))
+    if new_exit_status:
+        print("Installation failed, testing against stable setuptools",
+              package_name)
+        old_exit_status = _package_install(package_name, tmp_dir=str(tmpdir),
+                                           local_setuptools=False)
+        assert new_exit_status == old_exit_status
+
+
+def pytest_generate_tests(metafunc):
+    """Generator function for test_package_install.
+
+    This function will generate calls to test_package_install. If a package
+    list has been specified on the command line, it will be used. Otherwise,
+    Pypi will be queried to get the current list of top packages.
+ """ + if "package_name" in metafunc.fixturenames: + if not metafunc.config.option.package_name: + packages = get_top_packages(TOP) + packages = [name for name, downloads in packages] + else: + packages = metafunc.config.option.package_name + metafunc.parametrize("package_name", packages) diff --git a/tox.ini b/tox.ini new file mode 100644 index 0000000..6b43dcd --- /dev/null +++ b/tox.ini @@ -0,0 +1,8 @@ +[testenv] +deps= + -rtests/requirements.txt + -rrequirements.txt +passenv=APPDATA USERPROFILE HOMEDRIVE HOMEPATH windir APPVEYOR +commands=py.test {posargs:-rsx} +usedevelop=True +extras=ssl diff --git a/version b/version deleted file mode 100755 index d7318d7..0000000 --- a/version +++ /dev/null @@ -1,48 +0,0 @@ -#!/usr/local/bin/invoke /usr/bin/peak version-config - -# This is a PEAK 'version' tool configuration file, that's -# also executable. PJE uses it to bump version numbers in -# the various parts of the project without having to edit them -# by hand. The current version is stored in the version.dat -# file. - -# These are not the droids you're looking for. You can go on -# about your business... - - - DefaultFormat full - part major - part minor - part status choice alpha beta "release candidate" final - part build - part date timestamp - - - trailer remap status "a%(build)s" "b%(build)s" "c%(build)s" "%(dot-maint)s" - dot-maint optional build ".%(build)s" - full "%(major)s.%(minor)s %(status)s %(build)s" - short "%(major)s.%(minor)s%(trailer)s" - - - - - Name setuptools - - - File setup.py - File ez_setup.py - Match 'VERSION = "%(short)s"' - - - - File release.sh - Match 'VERSION="%(short)s"' - - - - File setuptools/__init__.py - Match "__version__ = '%(short)s'" - - - - diff --git a/version.dat b/version.dat deleted file mode 100755 index d312ddd..0000000 --- a/version.dat +++ /dev/null @@ -1,6 +0,0 @@ -[setuptools] -status = 'release candidate' -major = 0 -build = 11 -minor = 6 - diff --git a/virtual-python.py b/virtual-python.py deleted file mode 100755 index 8624f37..0000000 --- a/virtual-python.py +++ /dev/null @@ -1,123 +0,0 @@ -"""Create a "virtual" Python installation - -Based on a script created by Ian Bicking.""" - -import sys, os, optparse, shutil -join = os.path.join -py_version = 'python%s.%s' % (sys.version_info[0], sys.version_info[1]) - -def mkdir(path): - if not os.path.exists(path): - print 'Creating %s' % path - os.makedirs(path) - else: - if verbose: - print 'Directory %s already exists' - -def symlink(src, dest): - if not os.path.exists(dest): - if verbose: - print 'Creating symlink %s' % dest - os.symlink(src, dest) - else: - print 'Symlink %s already exists' % dest - - -def rmtree(dir): - if os.path.exists(dir): - print 'Deleting tree %s' % dir - shutil.rmtree(dir) - else: - if verbose: - print 'Do not need to delete %s; already gone' % dir - -def make_exe(fn): - if os.name == 'posix': - oldmode = os.stat(fn).st_mode & 07777 - newmode = (oldmode | 0555) & 07777 - os.chmod(fn, newmode) - if verbose: - print 'Changed mode of %s to %s' % (fn, oct(newmode)) - -def main(): - if os.name != 'posix': - print "This script only works on Unix-like platforms, sorry." 
- return - - parser = optparse.OptionParser() - - parser.add_option('-v', '--verbose', action='count', dest='verbose', - default=0, help="Increase verbosity") - - parser.add_option('--prefix', dest="prefix", default='~', - help="The base directory to install to (default ~)") - - parser.add_option('--clear', dest='clear', action='store_true', - help="Clear out the non-root install and start from scratch") - - parser.add_option('--no-site-packages', dest='no_site_packages', - action='store_true', - help="Don't copy the contents of the global site-packages dir to the " - "non-root site-packages") - - options, args = parser.parse_args() - global verbose - - home_dir = os.path.expanduser(options.prefix) - lib_dir = join(home_dir, 'lib', py_version) - inc_dir = join(home_dir, 'include', py_version) - bin_dir = join(home_dir, 'bin') - - if sys.executable.startswith(bin_dir): - print 'Please use the *system* python to run this script' - return - - verbose = options.verbose - assert not args, "No arguments allowed" - - if options.clear: - rmtree(lib_dir) - rmtree(inc_dir) - print 'Not deleting', bin_dir - - prefix = sys.prefix - mkdir(lib_dir) - stdlib_dir = join(prefix, 'lib', py_version) - for fn in os.listdir(stdlib_dir): - if fn != 'site-packages': - symlink(join(stdlib_dir, fn), join(lib_dir, fn)) - - mkdir(join(lib_dir, 'site-packages')) - if not options.no_site_packages: - for fn in os.listdir(join(stdlib_dir, 'site-packages')): - symlink(join(stdlib_dir, 'site-packages', fn), - join(lib_dir, 'site-packages', fn)) - - mkdir(inc_dir) - stdinc_dir = join(prefix, 'include', py_version) - for fn in os.listdir(stdinc_dir): - symlink(join(stdinc_dir, fn), join(inc_dir, fn)) - - if sys.exec_prefix != sys.prefix: - exec_dir = join(sys.exec_prefix, 'lib', py_version) - for fn in os.listdir(exec_dir): - symlink(join(exec_dir, fn), join(lib_dir, fn)) - - mkdir(bin_dir) - print 'Copying %s to %s' % (sys.executable, bin_dir) - py_executable = join(bin_dir, 'python') - if sys.executable != py_executable: - shutil.copyfile(sys.executable, py_executable) - make_exe(py_executable) - - pydistutils = os.path.expanduser('~/.pydistutils.cfg') - if os.path.exists(pydistutils): - print 'Please make sure you remove any previous custom paths from' - print "your", pydistutils, "file." - - print "You're now ready to download ez_setup.py, and run" - print py_executable, "ez_setup.py" - -if __name__ == '__main__': - main() - diff --git a/wikiup.cfg b/wikiup.cfg deleted file mode 100755 index 5861915..0000000 --- a/wikiup.cfg +++ /dev/null @@ -1,5 +0,0 @@ -[PEAK] -EasyInstall = EasyInstall.txt -setuptools = setuptools.txt -PkgResources = pkg_resources.txt -EggFormats = doc/formats.txt