+2020-11-18 Jim Meyering <meyering@fb.com>
+
+ version 1.16.3
+
+ * configure.ac (AC_INIT): Bump version number to 1.16.3.
+ * m4/amversion.m4: Likewise (auto-updated by "make bootstrap").
+ * NEWS: Record release version.
+
+2020-11-16 Jim Meyering <meyering@fb.com>
+
+ tests: correct preceding change.
+
+ * t/auxdir-pr19311.sh: Fix error in case stmt and match more
+ upcoming versions.
+
+2020-11-16 Jim Meyering <meyering@fb.com>
+
+ tests: auxdir-pr19311.sh no longer fails with latest autoconf
+
+ * t/list-of-tests.mk (XFAIL_TESTS): Remove from this list.
+ * t/auxdir-pr19311.sh: Instead, run this test only when autoconf
+ is 2.69d or newer. Otherwise, skip it.
+
+2020-11-16 Jim Meyering <meyering@fb.com>
+
+ maint: placate maintainer-check's rm -f check
+
+ * t/vala-recursive-setup.sh: Add an unnecessary -f option
+ to an rm invocation to avoid "make maintainer-check" failure.
+
+2020-11-15 Jim Meyering <meyering@fb.com>
+
+ maint: Update files from upstream with 'make fetch'
+
+ * lib/config.guess: Update.
+ * lib/config.sub: Likewise.
+ * lib/texinfo.tex: Likewise.
+
+2020-11-15 Jim Meyering <meyering@fb.com>
+
+ tests: avoid missing .dvi failure with parallel tests
+
+ * t/txinfo-no-clutter.sh: Tests of texinfo-related rules
+ had overlap that made them fail often when some rules were
+ run in parallel, so inhibit parallelism in that one directory.
+ See discussion starting at
+ https://lists.gnu.org/r/automake-patches/2020-11/msg00011.html
+
+2020-11-15 Jim Meyering <meyering@fb.com>
+
+ tests: protect against parallel false failure
+
+ * t/parallel-tests-console-output.sh: Do not depend on the order
+ of items in test summary. With a parallel test run, they may
+ appear in a different order, e.g., when running tests like this:
+ make check AM_TESTSUITE_MAKE='make -j14'
+ Sort the expected output and the actual output before comparing.
+
+2020-11-15 Jim Meyering <meyering@fb.com>
+
+ doc: fix quoting in suggested parallel test invocation
+
+ * t/README: Fix reversed single/double quotes.
+
+2020-11-14 Jim Meyering <meyering@fb.com>
+
+ tests: accommodate an $ac_aux_dir of "." or "./"
+
+ * t/auxdir-pr15981.sh: This test would fail when run with
+ autoconf-2.69d because $ac_aux_dir would be "./" rather than
+ the expected ".". Accept both.
+
+2020-11-14 Jim Meyering <meyering@fb.com>
+
+ tests: avoid failures due to missing ar-lib
+
+ * t/ar4.sh: Create dummy ar-lib, as done in other tests,
+ to avoid failure like this:
+ configure: error: cannot find required auxiliary files: ar-lib
+ * t/ar5.sh: Likewise.
+
+2020-11-13 Karl Berry <karl@freefriends.org>
+
+ install-sh: trailing whitespace.
+
+ * lib/install-sh: remove trailing whitespace. Sigh.
+ (scriptversion): 2020-11-14.01
+
+2020-11-13 Robert Menteer <reetnem@mac.com>
+
+ dejagnu: quote `pwd` when writing "set objdir" line to site.exp.
+
+ This change fixes https://bugs.gnu.org/44600.
+
+ * lib/am/dejagnu.am (site.exp): quote set objdir line.
+ * NEWS: mention this.
+
+2020-11-12 Karl Berry <karl@freefriends.org>
+
+ install-sh: new option -S SUFFIX for simple file backups.
+
+ * lib/install-sh: implement and document -S.
+ Patch sent by Julien Elie:
+ https://lists.gnu.org/archive/html/automake-patches/2018-03/msg00004.html
+ (scriptversion): 2020-11-13.01
+ * t/install-sh-option-S.sh: new test.
+ * t/list-of-tests.mk (handwritten_tests): add it.
+ * NEWS: mention it.
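+
+ For illustration, a hypothetical invocation (assuming -S takes the
+ suffix as its argument, followed by the usual SRC DST operands):
+
+   # install foo, keeping any existing copy as foo.bak
+   ./install-sh -S .bak foo /usr/local/bin/foo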
+
+2020-11-10 Karl Berry <karl@freefriends.org>
+
+ install-sh: --help tweaks.
+
+ * lib/install-sh (usage): avoid too-long line, mention
+ bug reporting address (bug-automake) and automake home page.
+
+2020-11-10 Karl Berry <karl@freefriends.org>
+
+ install-sh: new option -p to preserve mtime.
+
+ * lib/install-sh: new option -p to call cp -p.
+ Idea from patch sent by Julien Elie:
+ https://lists.gnu.org/archive/html/automake-patches/2018-03/msg00002.html
+ (scriptversion): 2020-11-11.02
+ * NEWS: mention this.
+
+2020-11-10 Karl Berry <karl@freefriends.org>
+
+ vala: forgot to update lists-of-tests.mk.
+
+ * t/list-of-tests.mk (handwritten_tests): include
+ t/vala-libs-distcheck.sh and t/vala-libs-vpath.sh.
+ Should have been committed with 2020-10-29 vala change,
+ but somehow missed.
+
+2020-11-10 Karl Berry <karl@freefriends.org>
+
+ install-sh: do not chown existing directories.
+
+ * lib/install-sh: do not chown existing directories.
+ Original patch sent by Julien Elie:
+ https://lists.gnu.org/archive/html/automake-patches/2018-03/msg00003.html
+ (scriptversion): 2020-11-11.01
+ * NEWS: mention this.
+
+2020-11-10 Karl Berry <karl@freefriends.org>
+
+ install-sh: do not redundantly specify -f to rm.
+
+ * lib/install-sh: do not redundantly specify -f to rm.
+ Mention implication for RMPROG in the --help message.
+ Original patch sent by Julien Elie:
+ https://lists.gnu.org/archive/html/automake-patches/2018-03/msg00005.html
+ * NEWS: mention this.
+
+2020-11-07 Reuben Thomas <rrt@sc3d.org>
+
+ vala: improve support, especially builddir vs. srcdir.
+
+ This change fixes https://bugs.gnu.org/13002.
+
+ * NEWS: mention these changes.
+ * bin/automake.in: generated C files go in builddir, not srcdir.
+ Distribute the header files generated from VAPI files.
+ * t/vala-libs-distcheck.sh: new test for `make distcheck' of a
+ Vala library.
+ * t/vala-libs-vpath.sh: new test for a VPATH build of a Vala library.
+ * t/vala-libs.sh: add local VAPIs used as external --package to test.
+
+ * t/vala-recursive-setup.sh: we need to make
+ maintainer-clean at one point to remove stamp files to avoid
+ confusing a VPATH build performed after a non-VPATH build.
+ * t/vala-non-recursive-setup.sh: likewise.
+
+ * t/vala-parallel.sh: some test paths need changing to take into
+ account that generated C files now go in builddir.
+ * t/vala-per-target-flags.sh: likewise.
+ * t/vala-recursive-setup.sh: likewise.
+ * t/vala-vpath.sh: likewise.
+
+2020-11-07 Karl Berry <karl@freefriends.org>
+
+ tests: recompute dependencies when lists-of-tests.mk changes.
+
+ This change fixes https://bugs.gnu.org/44458
+ and updates https://bugs.gnu.org/11347.
+
+ * t/local.mk ($(srcdir)/%D/testsuite-part.am): restore
+ dependency on '%D/list-of-tests.mk' (i.e., t/list-of-tests.mk),
+ partially reverting the change of 26 Apr 2012 for bug#11347.
+ Otherwise, new tests that have dependencies will not cause an
+ update of testsuite-part.am, leading to strange failures
+ (bug#44458). The original problem being fixed in #11347 was
+ unnecessary rebuilding when modifying tests; that should not be
+ affected here, but when new tests are added, it seems reasonable,
+ as well as necessary, to ensure dependencies are updated.
+
+2020-10-27 Miro Hrončok <miro@hroncok.cz>
+
+ python: determine Python (3.10) version number correctly.
+
+ This change fixes https://bugs.gnu.org/44239
+ (and https://bugzilla.redhat.com/show_bug.cgi?id=1889732).
+
+ * m4/python.m4: use print('%u.%u' % sys.version_info[:2]) for
+ the version number instead of merely sys.version[:3], so the
+ numbers are treated as numbers.
+ * t/python-vars.sh (PYTHON_VERSION): Likewise.
+ * doc/automake.texi: Document it.
+ * NEWS: mention it. (Minor tweaks from Karl Berry.)
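+
+ A quick illustration of the difference, assuming a python3.10
+ interpreter on PATH:
+
+   $ python3.10 -c "import sys; print(sys.version[:3])"
+   3.1
+   $ python3.10 -c "import sys; print('%u.%u' % sys.version_info[:2])"
+   3.10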
+
+2020-10-23 Jim Meyering <meyering@fb.com>
+
+ doc: correct "moved in", to "moved to"
+
+ * NEWS: Correct wording.
+ * contrib/README: Likewise.
+ * doc/automake.texi: Likewise.
+
+2020-10-23 Reuben Thomas <rrt@sc3d.org>
+
+ contrib/README: fix and clarify the English
+
+2020-10-23 Reuben Thomas <rrt@sc3d.org>
+
+ Improve Vala compiler detection: use API version, not compiler version
+
+ * m4/vala.m4: check `valac --api-version', not `valac --version'.
+ * doc/automake.texi: update documentation.
+
+2020-10-07 Zack Weinberg <zackw@panix.com>
+
+ Use complete configure.ac’s in testsuite.
+
+ Autoconf 2.70 will issue warnings if it encounters a configure.ac that doesn’t
+ call both AC_INIT and AC_OUTPUT.
+
+ Automake already issues warnings if it encounters a configure.ac that uses an
+ AM_ macro but doesn’t call AM_INIT_AUTOMAKE or AC_CONFIG_FILES([Makefile]).
+
+ In two places, the testsuite was tripping these warnings, leading to spurious
+ failures with Autoconf 2.70 betas.
+
+ * t/aminit-moreargs-deprecation.sh: Add AC_OUTPUT to test configure.ac.
+ * t/mkdirp-deprecation.sh: Use a complete test configure.ac, not a stub
+ containing only a use of AM_PROG_MKDIR_P.
+
+2020-10-06 Karl Berry <karl@freefriends.org>
+
+ automake: install-exec did not depend on $(BUILT_SOURCES).
+
+ This change fixes https://bugs.gnu.org/43683.
+
+ * lib/am/install.am (install-exec): %maybe_BUILT_SOURCES% dependency,
+ twice. Basic patch from madmurphy (tiny change), message#8.
+ (.MAKE) [maybe_BUILT_SOURCES]: depend on install-exec.
+ * NEWS: mention it.
+ * doc/automake.texi (Sources): mention this (also that make dist
+ depends on $(BUILT_SOURCES)).
+ * t/built-sources-install-exec.sh: new test.
+ * t/list-of-tests.mk (handwritten_TESTS): add it.
+ * t/built-sources-install.sh: typo.
+ * t/built-sources-check.sh: typo.
+
+2020-09-21 Zack Weinberg <zackw@panix.com>
+
+ Update documentation of warnings options and strictness levels.
+
+ The warning categories ‘cross’ and ‘portability-recursive’ were not mentioned
+ in the manual.
+
+ Also clarify the relationship between warnings categories and strictness
+ levels, and streamline the description of strictness levels by merging the
+ “Gnits” section into the “Strictness” section.
+
+ * doc/automake.texi (Gnits, Strictness): Combine these sections.
+ Minor revisions to explanation of strictness levels.
+ (automake Invocation): Add documentation of all the warnings
+ categories that have been added since the last time this section
+ was updated. Minor clarifications.
+
+2020-09-21 Zack Weinberg <zackw@panix.com>
+
+ Use WARNINGS=none to suppress warnings from autom4te runs.
+
+ aclocal uses autom4te in trace mode to scan configure.ac for macros whose
+ definition is not yet available. It has a kludge to prevent this from
+ producing spurious warnings, but a cleaner, fully backward compatible, way to
+ get the same effect is to set WARNINGS=none in the environment and not pass
+ down any -W options. (This is better than passing -Wnone on the command line
+ because it automatically inherits to any subprocesses started by autom4te.)
+
+ Perl’s ‘local’ feature can be used to make the environment variable setting
+ temporary, reverting to the previous value when we exit the function.
+
+ automake also runs autom4te (technically autoconf) in trace mode; warnings
+ from this invocation will not be *spurious*, but in the common case where
+ the person running automake is going to run autoconf next, they will be
+ duplicated. Therefore, make the same change to automake.
+
+ * bin/aclocal.in (trace_used_macros)
+ * bin/automake.in (scan_autoconf_traces):
+ Use “local $ENV{WARNINGS}='none'” to suppress warnings from autom4te.
+
+2020-09-18 Zack Weinberg <zackw@panix.com>
+
+ New utility function Automake::ChannelDefs::merge_WARNINGS.
+
+ This function merges a list of warnings categories into the environment
+ variable WARNINGS, returning a new value to set it to. The intended use
+ is in code of the form
+
+ {
+ local $ENV{WARNINGS} = merge_WARNINGS ("this", "that");
+
+ # run a command here with WARNINGS=this,that,etc
+ }
+
+ This is not actually used in automake, but will be in autoconf.
+
+ * lib/Automake/ChannelDefs.pm (merge_WARNINGS): New function.
+
+2020-09-12 Zack Weinberg <zackw@panix.com>
+
+ t/python-virtualenv.sh: Skip when versions don’t match
+
+ On some operating systems ‘python’ is Python 2.x but ‘virtualenv -ppython’
+ will create a virtualenv that uses Python 3.x. This is a bug, but it’s
+ not *automake’s* bug, and should not cause t/python-virtualenv.sh to fail.
+ Skip the test, instead of failing it, when the inner=outer version check
+ fails.
+
+ (This also has nothing to do with the main goal of this patchset, it just
+ annoyed me while I was testing.)
+
+ * t/python-virtualenv.sh: Skip test, rather than failing it, when
+ $py_version_pre != $py_version_post.
+
+2020-09-12 Zack Weinberg <zackw@panix.com>
+
+ Consistently use ‘our’ instead of ‘use vars’.
+
+ At file scope of a file containing at most one ‘package’ declaration,
+ ‘use vars’ is exactly equivalent to ‘our’, and the latter is preferred
+ starting with Perl 5.6.0, which happens to be the oldest version we
+ support.
+
+ (This change has nothing to do with the previous two, but I want to make the
+ same change in Autoconf and that means doing it here for all the files synced
+ from Automake.)
+
+ (I don’t know why, but this change exposed a latent bug in FileUtils.pm where
+ the last pod block in the file didn’t have a ‘=cut’ delimiter, so the code
+ after it was considered documentation, causing ‘require FileUtils’ to fail.)
+
+ * lib/Automake/ChannelDefs.pm
+ * lib/Automake/Channels.pm
+ * lib/Automake/Condition.pm
+ * lib/Automake/Configure_ac.pm
+ * lib/Automake/DisjConditions.pm
+ * lib/Automake/FileUtils.pm
+ * lib/Automake/General.pm
+ * lib/Automake/Getopt.pm
+ * lib/Automake/Options.pm
+ * lib/Automake/Rule.pm
+ * lib/Automake/RuleDef.pm
+ * lib/Automake/VarDef.pm
+ * lib/Automake/Variable.pm
+ * lib/Automake/Wrap.pm
+ * lib/Automake/XFile.pm:
+ Replace all uses of ‘use vars’ with ‘our’.
+
+ * lib/Automake/FileUtils.pm:
+ Add missing ‘=cut’ to a pod block near the end of the file.
+
+2020-09-12 Zack Weinberg <zackw@panix.com>
+
+ Consistently process -W(no-)error after all other warning options.
+
+ automake and aclocal were processing ‘-W(no-)error’ whenever it
+ appeared on the command line, which means that
+ ‘-Werror,something-strange’ would issue a hard error, but
+ ‘-Wsomething-strange,error’ would only issue a warning.
+
+ It is not desirable for warnings about unknown warning categories ever to be
+ treated as a hard error; that leads to problems for driver scripts like
+ autoreconf, which would like to pass whatever -W options it got on its own
+ command line down to all the tools and not worry about which tools understand
+ which warning categories. Also, this sort of order dependence is confusing
+ for humans.
+
+ Change parse_warnings to take just one option, the _complete_ list of warning
+ categories seen on the command line, and to process -Werror / -Wno-error after
+ processing all other warnings options. Thus, unknown warnings categories will
+ always just be a plain warning. This does mean aclocal has to stop using
+ parse_warnings as a Getopt::Long callback, but that’s not a big deal.
+
+ Similarly, change parse_WARNINGS to record whether ‘error’ appeared in the
+ environment variable, but not activate warnings-are-errors mode itself.
+ parse_warnings picks up the record and honors it, unless it’s overridden by
+ the command line.
+
+ * lib/Automake/ChannelDefs.pm ($werror): New package global (not exported).
+ (parse_WARNINGS): Do not call switch_warning for ‘error’ / ‘no-error’;
+ just toggle the value of $werror.
+ (parse_warnings): Do not call switch_warning immediately for
+ ‘error’ / ‘no-error’; toggle $werror instead. Call switch_warning ‘error’
+ at the very end if $werror is true. Remove unused $OPTION argument.
+ * bin/automake.in: parse_warnings now takes only one argument.
+ * bin/aclocal.in: Call parse_warnings after parse_options instead of
+ using it as a parse_options callback.
+
+2020-09-11 Zack Weinberg <zackw@panix.com>
+
+ Sync ChannelDefs.pm from autoconf.
+
+ ChannelDefs.pm *ought* to be kept in sync between automake and autoconf,
+ because it defines the set of valid -W options, and autoreconf assumes
+ that it can pass arbitrary -W options to all of the tools it invokes.
+ However, it isn’t covered by either project’s ‘make fetch’ and it hasn’t
+ actually *been* in sync for more than 17 years.
+
+ This patch manually brings over all of the changes made on the autoconf side.
+ Most importantly, there is a new warnings channel ‘cross’, for warnings
+ related to cross-compilation. Also, the ‘usage’ function now *returns*
+ the text to be put into a usage message, instead of printing it itself.
+ (This is necessary on autoconf’s side.)
+
+ * lib/Automake/ChannelDefs.pm: Sync from autoconf.
+ (cross): New warnings channel.
+ (portability-recursive): Document.
+ (usage): Now returns the text to be printed, instead of printing it.
+ (parse_warnings): Second argument may now be a list.
+
+2020-09-05 Zack Weinberg <zackw@panix.com>
+
+ automake: be robust against directories containing ().
+
+ This change fixes https://bugs.gnu.org/14196.
+
+ * m4/missing.m4 (AM_MISSING_HAS_RUN): always quote the
+ invocation (not just if $am_aux_dir contains space or tab), in
+ case $am_aux_dir contains () or other metachars not rejected by
+ AM_SANITY_CHECK; quoting with '...' suggested by Jim Meyering.
+ * t/man6.sh (HELP2MAN): adjust grep since missing value
+ is quoted now.
+ * t/am-missing-prog.sh: likewise.
+
+2020-09-04 Issam E. Maghni <issam.e.maghni@mailbox.org>
+
+ maint: Update files from upstream with 'make fetch'
+
+ * lib/config.guess: Update.
+ * lib/config.sub: Likewise.
+ * lib/gendocs_template: Likewise.
+ * lib/gitlog-to-changelog: Likewise.
+ * lib/texinfo.tex: Likewise.
+ * lib/update-copyright: Likewise.
+
+2020-08-31 Zack Weinberg <zackw@panix.com>
+
+ perl: use warnings instead of -w; consistent ordering of use, etc.
+
+ Per thread at:
+ https://lists.gnu.org/archive/html/automake-patches/2020-08/msg00009.html
+
+ * bin/aclocal.in: use warnings instead of #!...-w;
+ consistent ordering of basic "use" directives,
+ then BEGIN block,
+ then standard modules in ASCII order,
+ then Automake:: modules (not sort),
+ finally use vars.
+ Also sort @ISA lists and use qw(...) in ASCII order.
+ * bin/automake.in: likewise.
+ * lib/Automake/ChannelDefs.pm: likewise.
+ * lib/Automake/Channels.pm: likewise.
+ * lib/Automake/Condition.pm: likewise.
+ * lib/Automake/Config.in: likewise.
+ * lib/Automake/Configure_ac.pm: likewise.
+ * lib/Automake/DisjConditions.pm: likewise.
+ * lib/Automake/FileUtils.pm: likewise.
+ * lib/Automake/General.pm: likewise.
+ * lib/Automake/Getopt.pm: likewise.
+ * lib/Automake/Item.pm: likewise.
+ * lib/Automake/ItemDef.pm: likewise.
+ * lib/Automake/Language.pm: likewise.
+ * lib/Automake/Location.pm: likewise.
+ * lib/Automake/Options.pm: likewise.
+ * lib/Automake/Rule.pm: likewise.
+ * lib/Automake/RuleDef.pm: likewise.
+ * lib/Automake/VarDef.pm: likewise.
+ * lib/Automake/Variable.pm: likewise.
+ * lib/Automake/Version.pm: likewise.
+ * lib/Automake/Wrap.pm: likewise.
+ * lib/Automake/XFile.pm: remove unnecessary imports of
+ Carp, DynaLoader, and File::Basename.
+
+2020-08-28 Robert Wanamaker <rlw@nycap.rr.com>
+
+ docs: automake-history.texi @dircategory Software development.
+
+ Per thread at:
+ https://lists.gnu.org/archive/html/automake-patches/2020-08/msg00006.html
+
+ * doc/automake-history.texi (@dircategory): Define.
+
+2020-08-28 Karl Berry <karl@freefriends.org>
+
+ automake: if TEST_EXTENSIONS is set to empty, don't look inside it.
+
+ This change fixes https://bugs.gnu.org/42635.
+
+ * bin/automake.in (handle_tests): do not use $test_suffixes[0]
+ if it does not exist.
+ * t/test-extensions-empty.sh: new test.
+ * t/list-of-tests.mk (handwritten_TESTS): add it.
+
+2020-08-13 Felix Yan <felixonmars@archlinux.org>
+
+ docs: typo in tap-driver.sh.
+
+ Per thread at:
+ https://lists.gnu.org/archive/html/automake-patches/2020-08/msg00000.html
+
+ * lib/tap-driver.sh (setup_result_obj): "assing" typo, etc.
+
+2020-08-01 Paul Eggert <eggert@cs.ucla.edu>
+
+ port XFile locking to OpenIndiana
+
+ I observed this problem on an NFS filesystem on an OpenIndiana
+ host (5.11 illumos-dde7ba523f i386). fcntl (fd, F_SETLK, ...)
+ failed with errno == EINVAL, which POSIX allows for files that
+ do not support locking.
+ * lib/Automake/XFile.pm (lock): Treat EINVAL like ENOLCK.
+
+2020-07-26 Paul Eggert <eggert@cs.ucla.edu>
+
+ * Update scriptversions for install-sh, mkinstalldirs.
+
+2020-07-26 Paul Eggert <eggert@cs.ucla.edu>
+
+ Install directories mode 755 instead of using umask
+
+ Problem reported by Antoine Amarilli in:
+ https://lists.gnu.org/archive/html/automake/2019-01/msg00000.html
+ and followed up by Akim Demaille in:
+ https://lists.gnu.org/archive/html/bug-bison/2020-07/msg00040.html
+ * bin/automake.in: Add a comment about this.
+ * lib/install-sh: Ignore umask; just create directories mode 755
+ unless overridden via -m (for non-intermediate directories only).
+ Also, fix 'umask=$mkdir_umask' typo.
+ * lib/mkinstalldirs: Likewise.
+
+2020-06-29 Paul Eggert <eggert@cs.ucla.edu>
+
+ automake: remove stray up_to_date_p
+
+ * lib/Automake/FileUtils.pm (up_to_date_p):
+ Don’t export up_to_date_p, which was removed in
+ 2020-05-11T00:40:14Z!karl@freefriends.org.
+
+2020-06-06 Karl Berry <karl@freefriends.org>
+
+ tests: support -fno-common in vala-mix2 test.
+
+ This change fixes https://bugs.gnu.org/41726.
+
+ * t/vala-mix2.sh: extern in .h, initialization in .c.
+ GCC 10 defaults to -fno-common.
+
+2020-06-06 Karl Berry <karl@freefriends.org>
+
+ automake: support AM_TESTSUITE_SUMMARY_HEADER override.
+
+ This change handles https://bugs.gnu.org/11745.
+
+ * lib/am/check.am (AM_TESTSUITE_SUMMARY_HEADER): new variable.
+ Default value is " for $(PACKAGE_STRING)", including quotes,
+ to keep the default output the same.
+ ($(TEST_SUITE_LOG)): use it, unquoted.
+ * doc/automake.texi (Scripts-based Testsuites): document it.
+ * NEWS: mention it.
+ * t/testsuite-summary-header.sh: new test.
+ * t/list-of-tests.mk (handwritten_tests): add it.
+ * t/ax/testsuite-summary-checks.sh: fix typo.
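+
+ A sketch of an override in a package's Makefile.am (the value carries
+ its own quotes, since check.am expands it unquoted; "Foo" is just an
+ example package name):
+
+   AM_TESTSUITE_SUMMARY_HEADER = ' for the Foo test suite'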
+
+2020-05-28 Akim Demaille <akim@gnu.org>
+
+ docs: promote Makefile snippets that work properly with make -n.
+
+ This change handles https://bugs.gnu.org/10852.
+
+ * doc/automake.texi (Multiple Outputs): Split commands that
+ reinvoke $(MAKE) to avoid file removals during dry runs.
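+
+ A minimal before/after sketch of the idea (an illustrative rule, not
+ the manual's exact snippet): make executes any recipe line containing
+ $(MAKE) even under "make -n", so a removal chained onto such a line
+ runs during a dry run, while a removal on its own line does not.
+
+   # risky: the rm runs even under "make -n"
+       rm -f out.tmp && $(MAKE) out
+   # safer: the removal is on its own line, skipped by a dry run
+       rm -f out.tmp
+       $(MAKE) out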
+
+2020-05-25 Karl Berry <karl@freefriends.org>
+
+ docs: forgot TAR in NEWS; fix " -- " in manual.
+
+ * NEWS: it seems the TAR envvar was never mentioned in NEWS;
+ add it, back for 1.11.3 when it was apparently implemented.
+ * doc/automake.texi: consistently use "---" instead of " --".
+
+2020-05-25 Karl Berry <karl@freefriends.org>
+
+ docs: TAR envvar overrides "tar" for make dist.
+
+ This change finishes https://bugs.gnu.org/9822.
+
+ * doc/automake.texi (Basics of Distribution): mention that
+ environment variable TAR overrides "tar".
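+
+ For example, to roll the tarball with GNU tar installed under the
+ name "gtar" (a sketch, assuming the TAR environment variable is
+ honored when the dist rules run):
+
+   TAR=gtar make dist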
+
+2020-05-17 Karl Berry <karl@freefriends.org>
+
+ automake: new variable AM_DISTCHECK_DVI_TARGET to override "dvi".
+
+ This change fixes https://bugs.gnu.org/8289.
+
+ * lib/am/distdir.am (AM_DISTCHECK_DVI_TARGET): define as dvi.
+ (distcheck): use it, instead of hardcoding dvi.
+ * lib/Automake/Variable.pm (%_silent_variable_override): add
+ AM_DISTCHECK_DVI_TARGET.
+ * t/distcheck-override-dvi.sh: new test.
+ * t/list-of-tests.mk (handwritten_TESTS): add it.
+ * doc/automake.texi (Checking the Distribution): document this.
+ (Third-Party Makefiles): explicitly mention that
+ EMPTY_AUTOMAKE_TARGETS is not a built-in or special name.
+ Various other index entries and wording tweaks.
+ * NEWS (Distribution): mention this.
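+
+ A sketch of a Makefile.am override for a package that would rather
+ exercise its HTML manual than DVI during distcheck:
+
+   AM_DISTCHECK_DVI_TARGET = html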
+
+2020-05-10 Karl Berry <karl@freefriends.org>
+
+ automake: remove unused Automake::FileUtils::up_to_date_p function.
+
+ Per thread at:
+ https://lists.gnu.org/archive/html/automake-patches/2020-04/msg00000.html
+ especially:
+ https://lists.gnu.org/archive/html/automake-patches/2020-05/msg00003.html
+
+ * lib/Automake/FileUtils.pm (up_to_date_p): remove.
+ Nothing in Automake itself uses this. It is used in Autoconf's
+ autom4te utility, but Autoconf has its own copy, and the duplication
+ was impeding autom4te development, as discussed in the thread above.
+ (While here, insert missing =over/=back to placate pod2text.)
+ * NEWS (Miscellaneous changes): note this.
+
+2020-05-07 Karl Berry <karl@freefriends.org>
+
+ docs: make dist implies make dvi.
+
+ This change handles https://bugs.gnu.org/7994.
+
+ * doc/automake.texi (Preparing Distributions): make distcheck
+ runs make dvi.
+ (Auxiliary Programs) <texinfo.tex>: mention
+ that make dist runs make dvi, and therefore a TeX system is
+ required when Texinfo sources are present. Add @cmindex entries
+ for all auxiliary programs while we're here.
+
+2020-05-07 Karl Berry <karl@freefriends.org>
+
+ tests: TeX system required for two more tests.
+
+ * t/instdir-no-empty.sh (required): makeinfo tex texi2dvi dvips.
+ * t/txinfo-bsd-make-recurs.sh (required): likewise.
+
+2020-04-23 Vincent Lefevre <vincent@vinc17.net>
+
+ bug#40699: "dist Hook" documentation in manual is incorrect or unclear about write permissions
+
+ On 2020-04-20 14:59:00 -0600, Karl Berry wrote:
+ > i.e. it does not change the permissions in order to make the removal
+ > work recursively
+ >
+ > Right, I see it now. Had been testing the wrong thing.
+ >
+ > So, can you propose a specific change for the manual? -k
+
+ I think that it is sufficient to fix the example (the explanation
+ is just below). BTW, the second example is also incorrect.
+
+ commit a639e5b51cadbaff88ca4059b4db4571c811070c
+ Author: Vincent Lefevre <vincent@vinc17.net>
+ Date: 2020-04-23 17:33:54 +0200
+
+ doc: fix dist-hook examples
+
+2020-04-18 Karl Berry <karl@freefriends.org>
+
+ cosmetics: spurious word in README, copyright year.
+
+ * README: delete spurious "that"; update copyright year end to 2020.
+ Original suggestion from Vincent Lefevre,
+ https://lists.gnu.org/archive/html/automake-patches/2020-04/msg00007.html
+
+2020-04-08 Samuel Tardieu <sam@rfc1149.net>
+
+ docs: test-driver options do not accept =, update --help.
+
+ This change fixes https://bugs.gnu.org/22445.
+
+ * lib/test-driver (print_usage): space after --test-name,
+ --log-file, --trs-file, not =. Also mention Automake as source.
+
+2020-04-06 Samy Mahmoudi <samy.mahmoudi@gmail.com>
+
+ cosmetics: typo in comment.
+
+ This change fixes https://bugs.gnu.org/32100.
+
+ * bin/aclocal.in (install_file): remove duplicate "the" in
+ "Using the real the destination file ...".
+
+2020-04-05 Colomban Wendling <lists.ban@herbesfolles.org>
+
+ vala: more precise argument matching.
+
+ This change fixes https://bugs.gnu.org/18734.
+
+ * bin/automake.in (lang_vala_finish_target): anchor option regexp
+ so that, e.g., an argument "vapi" does not match the option --vapi.
+ * NEWS: mention this (and preceding checklinkx change, sorry).
+
+2020-04-03 Karl Berry <karl@freefriends.org>
+
+ doc: update urls in manual and include checklinkx script.
+
+ * doc/automake.texi: update many urls; http -> https,
+ search.cpan.org -> metacpan.org/pod/distribution, node names, etc.
+ Remove sourceware.org/cgi-bin/gnatsweb.pl and
+ miller.emu.id.au/pmiller/books/rmch/ as these are 404
+ and no good replacement is evident.
+ s/perl/Perl/ a couple times in text for good measure.
+ * contrib/checklinkx: new script, a small modification of
+ W3C checklink <https://validator.w3.org/checklink/docs/checklink.html>
+ (W3C license, which is free software), starting from version 4.81
+ installed from CPAN:
+ https://metacpan.org/pod/distribution/W3C-LinkChecker/bin/checklink.pod
+ * doc/local.mk (checklinkx): new target to invoke it, with variables.
+ * Makefile.am (EXTRA_DIST): distribute it.
+ * NEWS: mention all this.
+
+2020-03-24 Karl Berry <karl@freefriends.org>
+
+ tests: require etags for tags-lisp-space test.
+
+ * t/tags-lisp-space.sh (required): set to etags.
+
+2020-03-24 Karl Berry <karl@freefriends.org>
+
+ Merge branch 'master' of git.savannah.gnu.org:/srv/git/automake
+
+2020-03-24 Karl Berry <karl@freefriends.org>
+
+ doc: forgot Python 3 NEWS entries.
+
+ * NEWS: item for Python 3 support in 1.16.2.
+
+2020-03-23 Jim Meyering <meyering@fb.com>
+
+ maint: Post-release administrivia
+
+ * NEWS: Add header line for next release.
+ * configure.ac (AC_INIT): Bump version number to 1.16b.
+ * m4/amversion.m4: Likewise (auto-updated by "make bootstrap").
+
2020-03-16 Jim Meyering <meyering@fb.com>
version 1.16.2
EXTRA_DIST += \
contrib/tap-driver.pl \
contrib/check-html.am \
+ contrib/checklinkx \
contrib/multilib/README \
contrib/multilib/config-ml.in \
contrib/multilib/symlink-tree \
-# Makefile.in generated by automake 1.16.2 from Makefile.am.
+# Makefile.in generated by automake 1.16.3 from Makefile.am.
# @configure_input@
# Copyright (C) 1994-2020 Free Software Foundation, Inc.
host_triplet = @host@
subdir = .
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
-am__aclocal_m4_deps = $(top_srcdir)/configure.ac
+am__aclocal_m4_deps = $(top_srcdir)/m4/amversion.m4 \
+ $(top_srcdir)/m4/auxdir.m4 $(top_srcdir)/m4/cond.m4 \
+ $(top_srcdir)/m4/init.m4 $(top_srcdir)/m4/install-sh.m4 \
+ $(top_srcdir)/m4/lead-dot.m4 $(top_srcdir)/m4/missing.m4 \
+ $(top_srcdir)/m4/options.m4 $(top_srcdir)/m4/prog-cc-c-o.m4 \
+ $(top_srcdir)/m4/runlog.m4 $(top_srcdir)/m4/sanity.m4 \
+ $(top_srcdir)/m4/silent.m4 $(top_srcdir)/m4/strip.m4 \
+ $(top_srcdir)/m4/substnot.m4 $(top_srcdir)/m4/tar.m4 \
+ $(top_srcdir)/configure.ac
am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
$(ACLOCAL_M4)
DIST_COMMON = $(srcdir)/Makefile.am $(top_srcdir)/configure \
bases='$(TEST_LOGS)'; \
bases=`for i in $$bases; do echo $$i; done | sed 's/\.log$$//'`; \
bases=`echo $$bases`
+AM_TESTSUITE_SUMMARY_HEADER = ' for $(PACKAGE_STRING)'
RECHECK_LOGS = $(TEST_LOGS)
TEST_SUITE_LOG = test-suite.log
am__test_logs1 = $(TESTS:=.log)
DIST_ARCHIVES = $(distdir).tar.gz $(distdir).tar.xz
GZIP_ENV = --best
DIST_TARGETS = dist-xz dist-gzip
+# Exists only to be overridden by the user if desired.
+AM_DISTCHECK_DVI_TARGET = dvi
distuninstallcheck_listfiles = find . -type f -print
am__distuninstallcheck_listfiles = $(distuninstallcheck_listfiles) \
| sed 's|^\./|$(prefix)/|' | grep -v '$(infodir)/dir$$'
# Maintainer-specific files and scripts.
# XXX: This script should be updated with 'fetch' target.
EXTRA_DIST = bootstrap GNUmakefile HACKING PLANS contrib/tap-driver.pl \
- contrib/check-html.am contrib/multilib/README \
- contrib/multilib/config-ml.in contrib/multilib/symlink-tree \
- contrib/multilib/multilib.am contrib/multilib/multi.m4 \
- contrib/README old/ChangeLog-tests old/ChangeLog.96 \
- old/ChangeLog.98 old/ChangeLog.00 old/ChangeLog.01 \
- old/ChangeLog.02 old/ChangeLog.03 old/ChangeLog.04 \
- old/ChangeLog.09 old/ChangeLog.11 old/TODO maintainer/am-ft \
- maintainer/am-xft maintainer/rename-tests maintainer/maint.mk \
- maintainer/syntax-checks.mk $(AUTOMAKESOURCES) doc/help2man \
- lib/Automake/Config.in m4/amversion.in t/README t/ax/is \
- t/ax/is_newest t/ax/deltree.pl $(handwritten_TESTS) \
- t/ax/tap-summary-aux.sh t/ax/testsuite-summary-checks.sh \
- t/ax/depcomp.sh t/ax/extract-testsuite-summary.pl \
- t/ax/tap-setup.sh t/ax/trivial-test-driver $(generated_TESTS) \
- gen-testsuite-part $(contrib_TESTS) t/ax/distcheck-hook-m4.am \
- t/ax/test-defs.in t/ax/shell-no-trail-bslash.in \
- t/ax/cc-no-c-o.in t/ax/runtest.in $(perf_TESTS)
+ contrib/check-html.am contrib/checklinkx \
+ contrib/multilib/README contrib/multilib/config-ml.in \
+ contrib/multilib/symlink-tree contrib/multilib/multilib.am \
+ contrib/multilib/multi.m4 contrib/README old/ChangeLog-tests \
+ old/ChangeLog.96 old/ChangeLog.98 old/ChangeLog.00 \
+ old/ChangeLog.01 old/ChangeLog.02 old/ChangeLog.03 \
+ old/ChangeLog.04 old/ChangeLog.09 old/ChangeLog.11 old/TODO \
+ maintainer/am-ft maintainer/am-xft maintainer/rename-tests \
+ maintainer/maint.mk maintainer/syntax-checks.mk \
+ $(AUTOMAKESOURCES) doc/help2man lib/Automake/Config.in \
+ m4/amversion.in t/README t/ax/is t/ax/is_newest \
+ t/ax/deltree.pl $(handwritten_TESTS) t/ax/tap-summary-aux.sh \
+ t/ax/testsuite-summary-checks.sh t/ax/depcomp.sh \
+ t/ax/extract-testsuite-summary.pl t/ax/tap-setup.sh \
+ t/ax/trivial-test-driver $(generated_TESTS) gen-testsuite-part \
+ $(contrib_TESTS) t/ax/distcheck-hook-m4.am t/ax/test-defs.in \
+ t/ax/shell-no-trail-bslash.in t/ax/cc-no-c-o.in \
+ t/ax/runtest.in $(perf_TESTS)
TAGS_FILES = $(AUTOMAKESOURCES)
# Static dependencies valid for each test case (also further
&& $(MKDIR_P) doc \
&& ./pre-inst-env $(PERL) $(srcdir)/doc/help2man --output=$@
+checklinkx = $(top_srcdir)/contrib/checklinkx
+# the sleep between requests seems to be what gnu.org likes.
+chlx_args = -v --sleep 8 #--exclude-url-file=/tmp/xf
+# Explanation of excludes:
+# - w3.org dtds, they are fine (and slow).
+# - mailto urls, they are always forbidden.
+# - vala, redirects to a Gnome subpage and returns 403 to us.
+# - cfortran, forbidden by site's robots.txt.
+# - search.cpan.org, gets
+# - debbugs.gnu.org/automake, forbidden by robots.txt.
+# - autoconf.html, forbidden by robots.txt (since served from savannah).
+# - https://fsf.org redirects to https://www.fsf.org and nothing to do
+# (it's in the FDL). --suppress-redirect options do not suppress the msg.
+#
+chlx_excludes = \
+ -X 'http.*w3\.org/.*dtd' \
+ -X 'mailto:.*' \
+ -X 'https://www\.vala-project\.org/' \
+ -X 'https://www-zeus\.desy\.de/~burow/cfortran/' \
+ -X 'http://search\.cpan\.org/~mschwern/Test-Simple/lib/Test/More\.pm' \
+ -X 'https://debbugs\.gnu\.org/automake' \
+ -X 'https://www\.gnu\.org/software/autoconf/manual/autoconf\.html' \
+ -X 'https://fsf\.org/'
+
+chlx_file = $(top_srcdir)/doc/automake.html
amhello_sources = \
doc/amhello/configure.ac \
doc/amhello/Makefile.am \
XFAIL_TESTS = \
t/all.sh \
-t/auxdir-pr19311.sh \
t/cond17.sh \
t/gcj6.sh \
t/override-conditional-2.sh \
t/built-sources-cond.sh \
t/built-sources-fork-bomb.sh \
t/built-sources-install.sh \
+t/built-sources-install-exec.sh \
t/built-sources-subdir.sh \
t/built-sources.sh \
t/candist.sh \
t/distcheck-missing-m4.sh \
t/distcheck-outdated-m4.sh \
t/distcheck-no-prefix-or-srcdir-override.sh \
+t/distcheck-override-dvi.sh \
t/distcheck-override-infodir.sh \
t/distcheck-pr9579.sh \
t/distcheck-pr10470.sh \
t/add-missing-install-sh.sh \
t/install-sh-unittests.sh \
t/install-sh-option-C.sh \
+t/install-sh-option-S.sh \
t/instdat.sh \
t/instdat2.sh \
t/instdir.sh \
t/testsuite-summary-color.sh \
t/testsuite-summary-count.sh \
t/testsuite-summary-count-many.sh \
+t/testsuite-summary-header.sh \
t/testsuite-summary-reference-log.sh \
t/test-driver-acsubst.sh \
t/test-driver-cond.sh \
t/test-driver-trs-suffix-registered.sh \
t/test-driver-fail.sh \
t/test-driver-is-distributed.sh \
+t/test-extensions-empty.sh \
t/test-harness-vpath-rewrite.sh \
t/test-log.sh \
t/test-logs-repeated.sh \
t/vala-grepping.sh \
t/vala-headers.sh \
t/vala-libs.sh \
+t/vala-libs-distcheck.sh \
+t/vala-libs-vpath.sh \
t/vala-mix.sh \
t/vala-mix2.sh \
t/vala-non-recursive-setup.sh \
t/color-tests2-w.sh t/compile-w.sh t/compile2-w.sh \
t/compile3-w.sh t/compile4-w.sh t/compile5-w.sh \
t/compile6-w.sh t/compile7-w.sh t/exeext4-w.sh \
- t/install-sh-option-C-w.sh t/install-sh-unittests-w.sh \
- t/maken3-w.sh t/mdate5-w.sh t/mdate6-w.sh \
- t/missing-version-mismatch-w.sh t/missing3-w.sh t/mkinst3-w.sh \
- t/posixsubst-tests-w.sh t/depcomp-lt-auto.tap \
+ t/install-sh-option-C-w.sh t/install-sh-option-S-w.sh \
+ t/install-sh-unittests-w.sh t/maken3-w.sh t/mdate5-w.sh \
+ t/mdate6-w.sh t/missing-version-mismatch-w.sh t/missing3-w.sh \
+ t/mkinst3-w.sh t/posixsubst-tests-w.sh t/depcomp-lt-auto.tap \
t/depcomp-lt-cpp.tap t/depcomp-lt-dashmstdout.tap \
t/depcomp-lt-disabled.tap t/depcomp-lt-gcc.tap \
t/depcomp-lt-makedepend.tap t/depcomp-lt-msvcmsys.tap \
check_testsuite_summary_TESTS = \
t/testsuite-summary-color.sh \
- t/testsuite-summary-count.sh
+ t/testsuite-summary-count.sh \
+ t/testsuite-summary-header.sh
depcomp_TESTS = \
t/depcomp-lt-auto.tap \
t/depcomp-msvisualcpp.tap
extract_testsuite_summary_TESTS = \
- t/testsuite-summary-count-many.sh
+ t/testsuite-summary-count-many.sh \
+ t/testsuite-summary-header.sh
gettext_macros_TESTS = \
t/gettext-basics.sh \
t/suffix8.tap \
t/suffix10.tap \
t/vala-libs.sh \
+ t/vala-libs-distcheck.sh \
+ t/vala-libs-vpath.sh \
t/vartypo2.sh \
t/depcomp-lt-auto.tap \
t/depcomp-lt-cpp.tap \
pkgconfig_macros_TESTS = \
t/vala-headers.sh \
t/vala-libs.sh \
+ t/vala-libs-distcheck.sh \
+ t/vala-libs-vpath.sh \
t/vala-mix.sh \
t/vala-mix2.sh \
t/vala-non-recursive-setup.sh \
test x"$$VERBOSE" = x || cat $(TEST_SUITE_LOG); \
fi; \
echo "$${col}$$br$${std}"; \
- echo "$${col}Testsuite summary for $(PACKAGE_STRING)$${std}"; \
+ echo "$${col}Testsuite summary"$(AM_TESTSUITE_SUMMARY_HEADER)"$${std}"; \
echo "$${col}$$br$${std}"; \
create_testsuite_report --maybe-color; \
echo "$$col$$br$$std"; \
$(DISTCHECK_CONFIGURE_FLAGS) \
--srcdir=../.. --prefix="$$dc_install_base" \
&& $(MAKE) $(AM_MAKEFLAGS) \
- && $(MAKE) $(AM_MAKEFLAGS) dvi \
+ && $(MAKE) $(AM_MAKEFLAGS) $(AM_DISTCHECK_DVI_TARGET) \
&& $(MAKE) $(AM_MAKEFLAGS) check \
&& $(MAKE) $(AM_MAKEFLAGS) install \
&& $(MAKE) $(AM_MAKEFLAGS) installcheck \
$(update_mans) aclocal-$(APIVERSION)
doc/automake-$(APIVERSION).1: $(automake_script) lib/Automake/Config.pm
$(update_mans) automake-$(APIVERSION)
+.PHONY: checklinkx
+checklinkx:
+ $(checklinkx) $(chlx_args) $(chlx_excludes) $(chlx_file)
# We depend on configure.ac so that we regenerate the tarball
# whenever the Automake version changes.
t/exeext4-w.log: t/exeext4.log
t/install-sh-option-C-w.log: t/install-sh-option-C.sh
t/install-sh-option-C-w.log: t/install-sh-option-C.log
+t/install-sh-option-S-w.log: t/install-sh-option-S.sh
+t/install-sh-option-S-w.log: t/install-sh-option-S.log
t/install-sh-unittests-w.log: t/install-sh-unittests.sh
t/install-sh-unittests-w.log: t/install-sh-unittests.log
t/maken3-w.log: t/maken3.sh
t/tap-summary-color.log: t/ax/tap-summary-aux.sh
t/testsuite-summary-color.log: t/ax/testsuite-summary-checks.sh
t/testsuite-summary-count.log: t/ax/testsuite-summary-checks.sh
+t/testsuite-summary-header.log: t/ax/testsuite-summary-checks.sh
t/depcomp-lt-auto.log: t/ax/depcomp.sh
t/depcomp-lt-cpp.log: t/ax/depcomp.sh
t/depcomp-lt-dashmstdout.log: t/ax/depcomp.sh
t/depcomp-msvcmsys.log: t/ax/depcomp.sh
t/depcomp-msvisualcpp.log: t/ax/depcomp.sh
t/testsuite-summary-count-many.log: t/ax/extract-testsuite-summary.pl
+t/testsuite-summary-header.log: t/ax/extract-testsuite-summary.pl
t/gettext-basics.log: t/gettext-macros.log
t/gettext-config-rpath.log: t/gettext-macros.log
t/gettext-external-pr338.log: t/gettext-macros.log
t/suffix8.log: t/libtool-macros.log
t/suffix10.log: t/libtool-macros.log
t/vala-libs.log: t/libtool-macros.log
+t/vala-libs-distcheck.log: t/libtool-macros.log
+t/vala-libs-vpath.log: t/libtool-macros.log
t/vartypo2.log: t/libtool-macros.log
t/depcomp-lt-auto.log: t/libtool-macros.log
t/depcomp-lt-cpp.log: t/libtool-macros.log
t/tap-xfail-tests.log: t/ax/tap-setup.sh t/tap-common-setup.log
t/vala-headers.log: t/pkg-config-macros.log
t/vala-libs.log: t/pkg-config-macros.log
+t/vala-libs-distcheck.log: t/pkg-config-macros.log
+t/vala-libs-vpath.log: t/pkg-config-macros.log
t/vala-mix.log: t/pkg-config-macros.log
t/vala-mix2.log: t/pkg-config-macros.log
t/vala-non-recursive-setup.log: t/pkg-config-macros.log
$(AM_V_at)chmod a-w t/testsuite-part.tmp
$(AM_V_at)mv -f t/testsuite-part.tmp $@
-# The dependecies declared here are not truly complete, but such
+# The dependencies declared here are not truly complete, but such
# completeness would cause more issues than it would solve. See
-# automake bug#11347.
+# automake bug#11347 and #44458.
$(generated_TESTS): $(srcdir)/gen-testsuite-part
$(srcdir)/t/testsuite-part.am: $(srcdir)/gen-testsuite-part
$(srcdir)/t/testsuite-part.am: Makefile.am
+$(srcdir)/t/testsuite-part.am: t/list-of-tests.mk
# Few more static dependencies.
t/distcheck-missing-m4.log: t/ax/distcheck-hook-m4.am
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+New in ?.?.?:
+
+
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+New in 1.16.3:
+
+* New features added
+
+ - In the testsuite summary, the "for $(PACKAGE_STRING)" suffix
+ can be overridden with the AM_TESTSUITE_SUMMARY_HEADER variable.
+
+* Bugs fixed
+
+ - Python 3.10 version number no longer considered to be 3.1.
+
+ - Broken links in manual fixed or removed, and new script
+ contrib/checklinkx (a small modification of W3C checklink) added,
+ with accompanying target checklinkx to recheck urls.
+
+ - install-exec target depends on $(BUILT_SOURCES).
+
+ - valac argument matching more precise, to avoid garbage in DIST_COMMON.
+
+ - Support for Vala in VPATH builds fixed so that both freshly-generated and
+ distributed C files work, and operation is more reliable with or without
+ an installed valac.
+
+ - Dejagnu doesn't break on directories containing spaces.
+
+* Distribution
+
+ - new variable AM_DISTCHECK_DVI_TARGET, to allow overriding the
+ "make dvi" that is done as part of distcheck.
+
+* Miscellaneous changes
+
+ - install-sh tweaks:
+ . new option -p to preserve mtime, i.e., invoke cp -p.
+ . new option -S SUFFIX to attempt simple file backups using SUFFIX.
+ . no longer unconditionally uses -f when rm is overridden by RMPROG.
+ . does not chown existing directories.
+
+ - Removed function up_to_date_p in lib/Automake/FileUtils.pm.
+ We believe this function is completely unused.
+
+ - Support for in-tree Vala libraries improved.
+
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
New in 1.16.2:
* New features added
- add zstd support and the automake option, dist-zstd.
+ - support for Python 3: py-compile now supports both Python 3
+ and Python 2; tests do not require .pyo files, and uninstall
+ deletes __pycache__ correctly (automake bug #32088).
+
* Miscellaneous changes
- automake no longer requires a @setfilename in each .texi file
script. Similarly, the obsolescent variable '$(AMTAR)' (which you
shouldn't be using BTW ;-) no longer invokes the 'missing' script
to wrap tar, but simply invokes the 'tar' program itself.
+ The TAR environment variable overrides.
- "make dist" can now create lzip-compressed tarballs.
using a `dirlist' file within the aclocal directory.
* automake --output-dir is deprecated.
* The part of the distcheck target that checks whether uninstall actually
- removes all installed files has been moved in a separate target,
+ removes all installed files has been moved to a separate target,
distuninstallcheck, so it can be overridden easily.
* Many bug fixes.
* EXTRA_DIST can contain generated directories.
* Support for dot-less extensions in suffix rules.
* The part of the distcheck target that checks whether distclean actually
- cleans all built files has been moved in a separate target, distcleancheck,
+ cleans all built files has been moved to a separate target, distcleancheck,
so it can be overridden easily.
* `make distcheck' will pass additional options defined in
$(DISTCHECK_CONFIGURE_FLAGS) to configure.
<https://lists.gnu.org/mailman/listinfo/autotools-announce>.
For any copyright year range specified as YYYY-ZZZZ in this package,
-that the range specifies every single year in that closed interval.
+the range specifies every single year in that closed interval.
-----
-Copyright (C) 1994-2012 Free Software Foundation, Inc.
+Copyright (C) 1994-2020 Free Software Foundation, Inc.
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
-# generated automatically by aclocal 1.16.2 -*- Autoconf -*-
+# generated automatically by aclocal 1.16.3 -*- Autoconf -*-
# Copyright (C) 1996-2020 Free Software Foundation, Inc.
m4_ifndef([AC_CONFIG_MACRO_DIRS], [m4_defun([_AM_CONFIG_MACRO_DIRS], [])m4_defun([AC_CONFIG_MACRO_DIRS], [_AM_CONFIG_MACRO_DIRS($@)])])
m4_ifndef([AC_AUTOCONF_VERSION],
[m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl
-m4_if(m4_defn([AC_AUTOCONF_VERSION]), [2.69.204-98d6],,
-[m4_warning([this file was generated for autoconf 2.69.204-98d6.
+m4_if(m4_defn([AC_AUTOCONF_VERSION]), [2.69d.4-8e54],,
+[m4_warning([this file was generated for autoconf 2.69d.4-8e54.
You have another version of autoconf. It may work, but is not guaranteed to.
If you have problems, you may need to regenerate the build system entirely.
To do so, use the procedure documented by the package, typically 'autoreconf'.])])
-# Copyright (C) 2002-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_AUTOMAKE_VERSION(VERSION)
-# ----------------------------
-# Automake X.Y traces this macro to ensure aclocal.m4 has been
-# generated from the m4 files accompanying Automake X.Y.
-# (This private macro should not be called outside this file.)
-AC_DEFUN([AM_AUTOMAKE_VERSION],
-[am__api_version='1.16'
-dnl Some users find AM_AUTOMAKE_VERSION and mistake it for a way to
-dnl require some minimum version. Point them to the right macro.
-m4_if([$1], [1.16.2], [],
- [AC_FATAL([Do not call $0, use AM_INIT_AUTOMAKE([$1]).])])dnl
-])
-
-# _AM_AUTOCONF_VERSION(VERSION)
-# -----------------------------
-# aclocal traces this macro to find the Autoconf version.
-# This is a private macro too. Using m4_define simplifies
-# the logic in aclocal, which can simply ignore this definition.
-m4_define([_AM_AUTOCONF_VERSION], [])
-
-# AM_SET_CURRENT_AUTOMAKE_VERSION
-# -------------------------------
-# Call AM_AUTOMAKE_VERSION and AM_AUTOMAKE_VERSION so they can be traced.
-# This function is AC_REQUIREd by AM_INIT_AUTOMAKE.
-AC_DEFUN([AM_SET_CURRENT_AUTOMAKE_VERSION],
-[AM_AUTOMAKE_VERSION([1.16.2])dnl
-m4_ifndef([AC_AUTOCONF_VERSION],
- [m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl
-_AM_AUTOCONF_VERSION(m4_defn([AC_AUTOCONF_VERSION]))])
-
-# AM_AUX_DIR_EXPAND -*- Autoconf -*-
-
-# Copyright (C) 2001-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# For projects using AC_CONFIG_AUX_DIR([foo]), Autoconf sets
-# $ac_aux_dir to '$srcdir/foo'. In other projects, it is set to
-# '$srcdir', '$srcdir/..', or '$srcdir/../..'.
-#
-# Of course, Automake must honor this variable whenever it calls a
-# tool from the auxiliary directory. The problem is that $srcdir (and
-# therefore $ac_aux_dir as well) can be either absolute or relative,
-# depending on how configure is run. This is pretty annoying, since
-# it makes $ac_aux_dir quite unusable in subdirectories: in the top
-# source directory, any form will work fine, but in subdirectories a
-# relative path needs to be adjusted first.
-#
-# $ac_aux_dir/missing
-# fails when called from a subdirectory if $ac_aux_dir is relative
-# $top_srcdir/$ac_aux_dir/missing
-# fails if $ac_aux_dir is absolute,
-# fails when called from a subdirectory in a VPATH build with
-# a relative $ac_aux_dir
-#
-# The reason of the latter failure is that $top_srcdir and $ac_aux_dir
-# are both prefixed by $srcdir. In an in-source build this is usually
-# harmless because $srcdir is '.', but things will broke when you
-# start a VPATH build or use an absolute $srcdir.
-#
-# So we could use something similar to $top_srcdir/$ac_aux_dir/missing,
-# iff we strip the leading $srcdir from $ac_aux_dir. That would be:
-# am_aux_dir='\$(top_srcdir)/'`expr "$ac_aux_dir" : "$srcdir//*\(.*\)"`
-# and then we would define $MISSING as
-# MISSING="\${SHELL} $am_aux_dir/missing"
-# This will work as long as MISSING is not called from configure, because
-# unfortunately $(top_srcdir) has no meaning in configure.
-# However there are other variables, like CC, which are often used in
-# configure, and could therefore not use this "fixed" $ac_aux_dir.
-#
-# Another solution, used here, is to always expand $ac_aux_dir to an
-# absolute PATH. The drawback is that using absolute paths prevent a
-# configured tree to be moved without reconfiguration.
-
-AC_DEFUN([AM_AUX_DIR_EXPAND],
-[AC_REQUIRE([AC_CONFIG_AUX_DIR_DEFAULT])dnl
-# Expand $ac_aux_dir to an absolute path.
-am_aux_dir=`cd "$ac_aux_dir" && pwd`
-])
-
-# AM_CONDITIONAL -*- Autoconf -*-
-
-# Copyright (C) 1997-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_CONDITIONAL(NAME, SHELL-CONDITION)
-# -------------------------------------
-# Define a conditional.
-AC_DEFUN([AM_CONDITIONAL],
-[AC_PREREQ([2.52])dnl
- m4_if([$1], [TRUE], [AC_FATAL([$0: invalid condition: $1])],
- [$1], [FALSE], [AC_FATAL([$0: invalid condition: $1])])dnl
-AC_SUBST([$1_TRUE])dnl
-AC_SUBST([$1_FALSE])dnl
-_AM_SUBST_NOTMAKE([$1_TRUE])dnl
-_AM_SUBST_NOTMAKE([$1_FALSE])dnl
-m4_define([_AM_COND_VALUE_$1], [$2])dnl
-if $2; then
- $1_TRUE=
- $1_FALSE='#'
-else
- $1_TRUE='#'
- $1_FALSE=
-fi
-AC_CONFIG_COMMANDS_PRE(
-[if test -z "${$1_TRUE}" && test -z "${$1_FALSE}"; then
- AC_MSG_ERROR([[conditional "$1" was never defined.
-Usually this means the macro was only invoked conditionally.]])
-fi])])
-
-# Do all the work for Automake. -*- Autoconf -*-
-
-# Copyright (C) 1996-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# This macro actually does too much. Some checks are only needed if
-# your package does certain things. But this isn't really a big deal.
-
-dnl Redefine AC_PROG_CC to automatically invoke _AM_PROG_CC_C_O.
-m4_define([AC_PROG_CC],
-m4_defn([AC_PROG_CC])
-[_AM_PROG_CC_C_O
-])
-
-# AM_INIT_AUTOMAKE(PACKAGE, VERSION, [NO-DEFINE])
-# AM_INIT_AUTOMAKE([OPTIONS])
-# -----------------------------------------------
-# The call with PACKAGE and VERSION arguments is the old style
-# call (pre autoconf-2.50), which is being phased out. PACKAGE
-# and VERSION should now be passed to AC_INIT and removed from
-# the call to AM_INIT_AUTOMAKE.
-# We support both call styles for the transition. After
-# the next Automake release, Autoconf can make the AC_INIT
-# arguments mandatory, and then we can depend on a new Autoconf
-# release and drop the old call support.
-AC_DEFUN([AM_INIT_AUTOMAKE],
-[AC_PREREQ([2.65])dnl
-dnl Autoconf wants to disallow AM_ names. We explicitly allow
-dnl the ones we care about.
-m4_pattern_allow([^AM_[A-Z]+FLAGS$])dnl
-AC_REQUIRE([AM_SET_CURRENT_AUTOMAKE_VERSION])dnl
-AC_REQUIRE([AC_PROG_INSTALL])dnl
-if test "`cd $srcdir && pwd`" != "`pwd`"; then
- # Use -I$(srcdir) only when $(srcdir) != ., so that make's output
- # is not polluted with repeated "-I."
- AC_SUBST([am__isrc], [' -I$(srcdir)'])_AM_SUBST_NOTMAKE([am__isrc])dnl
- # test to see if srcdir already configured
- if test -f $srcdir/config.status; then
- AC_MSG_ERROR([source directory already configured; run "make distclean" there first])
- fi
-fi
-
-# test whether we have cygpath
-if test -z "$CYGPATH_W"; then
- if (cygpath --version) >/dev/null 2>/dev/null; then
- CYGPATH_W='cygpath -w'
- else
- CYGPATH_W=echo
- fi
-fi
-AC_SUBST([CYGPATH_W])
-
-# Define the identity of the package.
-dnl Distinguish between old-style and new-style calls.
-m4_ifval([$2],
-[AC_DIAGNOSE([obsolete],
- [$0: two- and three-arguments forms are deprecated.])
-m4_ifval([$3], [_AM_SET_OPTION([no-define])])dnl
- AC_SUBST([PACKAGE], [$1])dnl
- AC_SUBST([VERSION], [$2])],
-[_AM_SET_OPTIONS([$1])dnl
-dnl Diagnose old-style AC_INIT with new-style AM_AUTOMAKE_INIT.
-m4_if(
- m4_ifdef([AC_PACKAGE_NAME], [ok]):m4_ifdef([AC_PACKAGE_VERSION], [ok]),
- [ok:ok],,
- [m4_fatal([AC_INIT should be called with package and version arguments])])dnl
- AC_SUBST([PACKAGE], ['AC_PACKAGE_TARNAME'])dnl
- AC_SUBST([VERSION], ['AC_PACKAGE_VERSION'])])dnl
-
-_AM_IF_OPTION([no-define],,
-[AC_DEFINE_UNQUOTED([PACKAGE], ["$PACKAGE"], [Name of package])
- AC_DEFINE_UNQUOTED([VERSION], ["$VERSION"], [Version number of package])])dnl
-
-# Some tools Automake needs.
-AC_REQUIRE([AM_SANITY_CHECK])dnl
-AC_REQUIRE([AC_ARG_PROGRAM])dnl
-AM_MISSING_PROG([ACLOCAL], [aclocal-${am__api_version}])
-AM_MISSING_PROG([AUTOCONF], [autoconf])
-AM_MISSING_PROG([AUTOMAKE], [automake-${am__api_version}])
-AM_MISSING_PROG([AUTOHEADER], [autoheader])
-AM_MISSING_PROG([MAKEINFO], [makeinfo])
-AC_REQUIRE([AM_PROG_INSTALL_SH])dnl
-AC_REQUIRE([AM_PROG_INSTALL_STRIP])dnl
-AC_REQUIRE([AC_PROG_MKDIR_P])dnl
-# For better backward compatibility. To be removed once Automake 1.9.x
-# dies out for good. For more background, see:
-# <https://lists.gnu.org/archive/html/automake/2012-07/msg00001.html>
-# <https://lists.gnu.org/archive/html/automake/2012-07/msg00014.html>
-AC_SUBST([mkdir_p], ['$(MKDIR_P)'])
-# We need awk for the "check" target (and possibly the TAP driver). The
-# system "awk" is bad on some platforms.
-AC_REQUIRE([AC_PROG_AWK])dnl
-AC_REQUIRE([AC_PROG_MAKE_SET])dnl
-AC_REQUIRE([AM_SET_LEADING_DOT])dnl
-_AM_IF_OPTION([tar-ustar], [_AM_PROG_TAR([ustar])],
- [_AM_IF_OPTION([tar-pax], [_AM_PROG_TAR([pax])],
- [_AM_PROG_TAR([v7])])])
-_AM_IF_OPTION([no-dependencies],,
-[AC_PROVIDE_IFELSE([AC_PROG_CC],
- [_AM_DEPENDENCIES([CC])],
- [m4_define([AC_PROG_CC],
- m4_defn([AC_PROG_CC])[_AM_DEPENDENCIES([CC])])])dnl
-AC_PROVIDE_IFELSE([AC_PROG_CXX],
- [_AM_DEPENDENCIES([CXX])],
- [m4_define([AC_PROG_CXX],
- m4_defn([AC_PROG_CXX])[_AM_DEPENDENCIES([CXX])])])dnl
-AC_PROVIDE_IFELSE([AC_PROG_OBJC],
- [_AM_DEPENDENCIES([OBJC])],
- [m4_define([AC_PROG_OBJC],
- m4_defn([AC_PROG_OBJC])[_AM_DEPENDENCIES([OBJC])])])dnl
-AC_PROVIDE_IFELSE([AC_PROG_OBJCXX],
- [_AM_DEPENDENCIES([OBJCXX])],
- [m4_define([AC_PROG_OBJCXX],
- m4_defn([AC_PROG_OBJCXX])[_AM_DEPENDENCIES([OBJCXX])])])dnl
-])
-AC_REQUIRE([AM_SILENT_RULES])dnl
-dnl The testsuite driver may need to know about EXEEXT, so add the
-dnl 'am__EXEEXT' conditional if _AM_COMPILER_EXEEXT was seen. This
-dnl macro is hooked onto _AC_COMPILER_EXEEXT early, see below.
-AC_CONFIG_COMMANDS_PRE(dnl
-[m4_provide_if([_AM_COMPILER_EXEEXT],
- [AM_CONDITIONAL([am__EXEEXT], [test -n "$EXEEXT"])])])dnl
-
-# POSIX will say in a future version that running "rm -f" with no argument
-# is OK; and we want to be able to make that assumption in our Makefile
-# recipes. So use an aggressive probe to check that the usage we want is
-# actually supported "in the wild" to an acceptable degree.
-# See automake bug#10828.
-# To make any issue more visible, cause the running configure to be aborted
-# by default if the 'rm' program in use doesn't match our expectations; the
-# user can still override this though.
-if rm -f && rm -fr && rm -rf; then : OK; else
- cat >&2 <<'END'
-Oops!
-
-Your 'rm' program seems unable to run without file operands specified
-on the command line, even when the '-f' option is present. This is contrary
-to the behaviour of most rm programs out there, and not conforming with
-the upcoming POSIX standard: <http://austingroupbugs.net/view.php?id=542>
-
-Please tell bug-automake@gnu.org about your system, including the value
-of your $PATH and any error possibly output before this message. This
-can help us improve future automake versions.
-
-END
- if test x"$ACCEPT_INFERIOR_RM_PROGRAM" = x"yes"; then
- echo 'Configuration will proceed anyway, since you have set the' >&2
- echo 'ACCEPT_INFERIOR_RM_PROGRAM variable to "yes"' >&2
- echo >&2
- else
- cat >&2 <<'END'
-Aborting the configuration process, to ensure you take notice of the issue.
-
-You can download and install GNU coreutils to get an 'rm' implementation
-that behaves properly: <https://www.gnu.org/software/coreutils/>.
-
-If you want to complete the configuration process using your problematic
-'rm' anyway, export the environment variable ACCEPT_INFERIOR_RM_PROGRAM
-to "yes", and re-run configure.
-
-END
- AC_MSG_ERROR([Your 'rm' program is bad, sorry.])
- fi
-fi
-dnl The trailing newline in this macro's definition is deliberate, for
-dnl backward compatibility and to allow trailing 'dnl'-style comments
-dnl after the AM_INIT_AUTOMAKE invocation. See automake bug#16841.
-])
-
-dnl Hook into '_AC_COMPILER_EXEEXT' early to learn its expansion. Do not
-dnl add the conditional right here, as _AC_COMPILER_EXEEXT may be further
-dnl mangled by Autoconf and run in a shell conditional statement.
-m4_define([_AC_COMPILER_EXEEXT],
-m4_defn([_AC_COMPILER_EXEEXT])[m4_provide([_AM_COMPILER_EXEEXT])])
-
-# When config.status generates a header, we must update the stamp-h file.
-# This file resides in the same directory as the config header
-# that is generated. The stamp files are numbered to have different names.
-
-# Autoconf calls _AC_AM_CONFIG_HEADER_HOOK (when defined) in the
-# loop where config.status creates the headers, so we can generate
-# our stamp files there.
-AC_DEFUN([_AC_AM_CONFIG_HEADER_HOOK],
-[# Compute $1's index in $config_headers.
-_am_arg=$1
-_am_stamp_count=1
-for _am_header in $config_headers :; do
- case $_am_header in
- $_am_arg | $_am_arg:* )
- break ;;
- * )
- _am_stamp_count=`expr $_am_stamp_count + 1` ;;
- esac
-done
-echo "timestamp for $_am_arg" >`AS_DIRNAME(["$_am_arg"])`/stamp-h[]$_am_stamp_count])
-
-# Copyright (C) 2001-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_PROG_INSTALL_SH
-# ------------------
-# Define $install_sh.
-AC_DEFUN([AM_PROG_INSTALL_SH],
-[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl
-if test x"${install_sh+set}" != xset; then
- case $am_aux_dir in
- *\ * | *\ *)
- install_sh="\${SHELL} '$am_aux_dir/install-sh'" ;;
- *)
- install_sh="\${SHELL} $am_aux_dir/install-sh"
- esac
-fi
-AC_SUBST([install_sh])])
-
-# Copyright (C) 2003-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# Check whether the underlying file-system supports filenames
-# with a leading dot. For instance MS-DOS doesn't.
-AC_DEFUN([AM_SET_LEADING_DOT],
-[rm -rf .tst 2>/dev/null
-mkdir .tst 2>/dev/null
-if test -d .tst; then
- am__leading_dot=.
-else
- am__leading_dot=_
-fi
-rmdir .tst 2>/dev/null
-AC_SUBST([am__leading_dot])])
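The substituted value is meant to be prepended to otherwise dot-hidden names;
a sketch of the intended effect (the DEPDIR line is an assumption about how
another macro consumes the result, not part of this one):

  am__leading_dot=.                  # or '_' where a leading dot is rejected
  DEPDIR="${am__leading_dot}deps"    # ".deps" normally, "_deps" on MS-DOS-like systems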
-
-# Fake the existence of programs that GNU maintainers use. -*- Autoconf -*-
-
-# Copyright (C) 1997-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_MISSING_PROG(NAME, PROGRAM)
-# ------------------------------
-AC_DEFUN([AM_MISSING_PROG],
-[AC_REQUIRE([AM_MISSING_HAS_RUN])
-$1=${$1-"${am_missing_run}$2"}
-AC_SUBST($1)])
-
-# AM_MISSING_HAS_RUN
-# ------------------
-# Define MISSING if not defined so far and test if it is modern enough.
-# If it is, set am_missing_run to use it, otherwise, to nothing.
-AC_DEFUN([AM_MISSING_HAS_RUN],
-[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl
-AC_REQUIRE_AUX_FILE([missing])dnl
-if test x"${MISSING+set}" != xset; then
- case $am_aux_dir in
- *\ * | *\ *)
- MISSING="\${SHELL} \"$am_aux_dir/missing\"" ;;
- *)
- MISSING="\${SHELL} $am_aux_dir/missing" ;;
- esac
-fi
-# Use eval to expand $SHELL
-if eval "$MISSING --is-lightweight"; then
- am_missing_run="$MISSING "
-else
- am_missing_run=
- AC_MSG_WARN(['missing' script is too old or missing])
-fi
-])
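The net effect for a tool wrapped by AM_MISSING_PROG is a configure-time
assignment that falls back to the 'missing' script; roughly, and with an
illustrative path:

  # AM_MISSING_PROG([ACLOCAL], [aclocal]) leaves something like:
  ACLOCAL=${ACLOCAL-"${am_missing_run}aclocal"}
  # i.e. "${SHELL} /path/to/missing aclocal" when 'missing' is modern enough,
  # or just "aclocal" when am_missing_run is empty.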
-
-# Helper functions for option handling. -*- Autoconf -*-
-
-# Copyright (C) 2001-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# _AM_MANGLE_OPTION(NAME)
-# -----------------------
-AC_DEFUN([_AM_MANGLE_OPTION],
-[[_AM_OPTION_]m4_bpatsubst($1, [[^a-zA-Z0-9_]], [_])])
-
-# _AM_SET_OPTION(NAME)
-# --------------------
-# Set option NAME. Presently that only means defining a flag for this option.
-AC_DEFUN([_AM_SET_OPTION],
-[m4_define(_AM_MANGLE_OPTION([$1]), [1])])
-
-# _AM_SET_OPTIONS(OPTIONS)
-# ------------------------
-# OPTIONS is a space-separated list of Automake options.
-AC_DEFUN([_AM_SET_OPTIONS],
-[m4_foreach_w([_AM_Option], [$1], [_AM_SET_OPTION(_AM_Option)])])
-
-# _AM_IF_OPTION(OPTION, IF-SET, [IF-NOT-SET])
-# -------------------------------------------
-# Execute IF-SET if OPTION is set, IF-NOT-SET otherwise.
-AC_DEFUN([_AM_IF_OPTION],
-[m4_ifset(_AM_MANGLE_OPTION([$1]), [$2], [$3])])
-
-# Copyright (C) 1999-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# _AM_PROG_CC_C_O
-# ---------------
-# Like AC_PROG_CC_C_O, but changed for automake. We rewrite AC_PROG_CC
-# to automatically call this.
-AC_DEFUN([_AM_PROG_CC_C_O],
-[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl
-AC_REQUIRE_AUX_FILE([compile])dnl
-AC_LANG_PUSH([C])dnl
-AC_CACHE_CHECK(
- [whether $CC understands -c and -o together],
- [am_cv_prog_cc_c_o],
- [AC_LANG_CONFTEST([AC_LANG_PROGRAM([])])
- # Make sure it works both with $CC and with simple cc.
- # Following AC_PROG_CC_C_O, we do the test twice because some
- # compilers refuse to overwrite an existing .o file with -o,
- # though they will create one.
- am_cv_prog_cc_c_o=yes
- for am_i in 1 2; do
- if AM_RUN_LOG([$CC -c conftest.$ac_ext -o conftest2.$ac_objext]) \
- && test -f conftest2.$ac_objext; then
- : OK
- else
- am_cv_prog_cc_c_o=no
- break
- fi
- done
- rm -f core conftest*
- unset am_i])
-if test "$am_cv_prog_cc_c_o" != yes; then
- # Losing compiler, so override with the script.
- # FIXME: It is wrong to rewrite CC.
- # But if we don't then we get into trouble of one sort or another.
- # A longer-term fix would be to have automake use am__CC in this case,
- # and then we could set am__CC="\$(top_srcdir)/compile \$(CC)"
- CC="$am_aux_dir/compile $CC"
-fi
-AC_LANG_POP([C])])
-
-# For backward compatibility.
-AC_DEFUN_ONCE([AM_PROG_CC_C_O], [AC_REQUIRE([AC_PROG_CC])])
-
-# Copyright (C) 2001-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_RUN_LOG(COMMAND)
-# -------------------
-# Run COMMAND, save the exit status in ac_status, and log it.
-# (This has been adapted from Autoconf's _AC_RUN_LOG macro.)
-AC_DEFUN([AM_RUN_LOG],
-[{ echo "$as_me:$LINENO: $1" >&AS_MESSAGE_LOG_FD
- ($1) >&AS_MESSAGE_LOG_FD 2>&AS_MESSAGE_LOG_FD
- ac_status=$?
- echo "$as_me:$LINENO: \$? = $ac_status" >&AS_MESSAGE_LOG_FD
- (exit $ac_status); }])
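So a call such as AM_RUN_LOG([$CC --version]) would append an entry of this
shape to config.log (line number and compiler purely illustrative):

  configure:1234: gcc --version
  ...output of the command...
  configure:1234: $? = 0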
-
-# Check to make sure that the build environment is sane. -*- Autoconf -*-
-
-# Copyright (C) 1996-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_SANITY_CHECK
-# ---------------
-AC_DEFUN([AM_SANITY_CHECK],
-[AC_MSG_CHECKING([whether build environment is sane])
-# Reject unsafe characters in $srcdir or the absolute working directory
-# name. Accept space and tab only in the latter.
-am_lf='
-'
-case `pwd` in
- *[[\\\"\#\$\&\'\`$am_lf]]*)
- AC_MSG_ERROR([unsafe absolute working directory name]);;
-esac
-case $srcdir in
- *[[\\\"\#\$\&\'\`$am_lf\ \ ]]*)
- AC_MSG_ERROR([unsafe srcdir value: '$srcdir']);;
-esac
-
-# Do 'set' in a subshell so we don't clobber the current shell's
-# arguments. Must try -L first in case configure is actually a
-# symlink; some systems play weird games with the mod time of symlinks
-# (eg FreeBSD returns the mod time of the symlink's containing
-# directory).
-if (
- am_has_slept=no
- for am_try in 1 2; do
- echo "timestamp, slept: $am_has_slept" > conftest.file
- set X `ls -Lt "$srcdir/configure" conftest.file 2> /dev/null`
- if test "$[*]" = "X"; then
- # -L didn't work.
- set X `ls -t "$srcdir/configure" conftest.file`
- fi
- if test "$[*]" != "X $srcdir/configure conftest.file" \
- && test "$[*]" != "X conftest.file $srcdir/configure"; then
-
- # If neither matched, then we have a broken ls. This can happen
- # if, for instance, CONFIG_SHELL is bash and it inherits a
- # broken ls alias from the environment. This has actually
- # happened. Such a system could not be considered "sane".
- AC_MSG_ERROR([ls -t appears to fail. Make sure there is not a broken
- alias in your environment])
- fi
- if test "$[2]" = conftest.file || test $am_try -eq 2; then
- break
- fi
- # Just in case.
- sleep 1
- am_has_slept=yes
- done
- test "$[2]" = conftest.file
- )
-then
- # Ok.
- :
-else
- AC_MSG_ERROR([newly created file is older than distributed files!
-Check your system clock])
-fi
-AC_MSG_RESULT([yes])
-# If we didn't sleep, we still need to ensure time stamps of config.status and
-# generated files are strictly newer.
-am_sleep_pid=
-if grep 'slept: no' conftest.file >/dev/null 2>&1; then
- ( sleep 1 ) &
- am_sleep_pid=$!
-fi
-AC_CONFIG_COMMANDS_PRE(
- [AC_MSG_CHECKING([that generated files are newer than configure])
- if test -n "$am_sleep_pid"; then
- # Hide warnings about reused PIDs.
- wait $am_sleep_pid 2>/dev/null
- fi
- AC_MSG_RESULT([done])])
-rm -f conftest.file
-])
-
-# Copyright (C) 2009-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_SILENT_RULES([DEFAULT])
-# --------------------------
-# Enable less verbose build rules, with the default set to DEFAULT
-# ("yes" being less verbose, "no" or empty being verbose).
-AC_DEFUN([AM_SILENT_RULES],
-[AC_ARG_ENABLE([silent-rules], [dnl
-AS_HELP_STRING(
- [--enable-silent-rules],
- [less verbose build output (undo: "make V=1")])
-AS_HELP_STRING(
- [--disable-silent-rules],
- [verbose build output (undo: "make V=0")])dnl
-])
-case $enable_silent_rules in @%:@ (((
- yes) AM_DEFAULT_VERBOSITY=0;;
- no) AM_DEFAULT_VERBOSITY=1;;
- *) AM_DEFAULT_VERBOSITY=m4_if([$1], [yes], [0], [1]);;
-esac
-dnl
-dnl A few 'make' implementations (e.g., NonStop OS and NextStep)
-dnl do not support nested variable expansions.
-dnl See automake bug#9928 and bug#10237.
-am_make=${MAKE-make}
-AC_CACHE_CHECK([whether $am_make supports nested variables],
- [am_cv_make_support_nested_variables],
- [if AS_ECHO([['TRUE=$(BAR$(V))
-BAR0=false
-BAR1=true
-V=1
-am__doit:
- @$(TRUE)
-.PHONY: am__doit']]) | $am_make -f - >/dev/null 2>&1; then
- am_cv_make_support_nested_variables=yes
-else
- am_cv_make_support_nested_variables=no
-fi])
-if test $am_cv_make_support_nested_variables = yes; then
- dnl Using '$V' instead of '$(V)' breaks IRIX make.
- AM_V='$(V)'
- AM_DEFAULT_V='$(AM_DEFAULT_VERBOSITY)'
-else
- AM_V=$AM_DEFAULT_VERBOSITY
- AM_DEFAULT_V=$AM_DEFAULT_VERBOSITY
-fi
-AC_SUBST([AM_V])dnl
-AM_SUBST_NOTMAKE([AM_V])dnl
-AC_SUBST([AM_DEFAULT_V])dnl
-AM_SUBST_NOTMAKE([AM_DEFAULT_V])dnl
-AC_SUBST([AM_DEFAULT_VERBOSITY])dnl
-AM_BACKSLASH='\'
-AC_SUBST([AM_BACKSLASH])dnl
-_AM_SUBST_NOTMAKE([AM_BACKSLASH])dnl
-])
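The user-visible knobs are the ones named in the help strings above; for
example:

  ./configure --enable-silent-rules   # terse output by default
  make V=1                            # show full commands for this run
  make V=0                            # force terse output for this run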
-
-# Copyright (C) 2001-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# AM_PROG_INSTALL_STRIP
-# ---------------------
-# One issue with vendor 'install' (even GNU) is that you can't
-# specify the program used to strip binaries. This is especially
-# annoying in cross-compiling environments, where the build's strip
-# is unlikely to handle the host's binaries.
-# Fortunately install-sh will honor a STRIPPROG variable, so we
-# always use install-sh in "make install-strip", and initialize
-# STRIPPROG with the value of the STRIP variable (set by the user).
-AC_DEFUN([AM_PROG_INSTALL_STRIP],
-[AC_REQUIRE([AM_PROG_INSTALL_SH])dnl
-# Installed binaries are usually stripped using 'strip' when the user
-# run "make install-strip". However 'strip' might not be the right
-# tool to use in cross-compilation environments, therefore Automake
-# will honor the 'STRIP' environment variable to overrule this program.
-dnl Don't test for $cross_compiling = yes, because it might be 'maybe'.
-if test "$cross_compiling" != no; then
- AC_CHECK_TOOL([STRIP], [strip], :)
-fi
-INSTALL_STRIP_PROGRAM="\$(install_sh) -c -s"
-AC_SUBST([INSTALL_STRIP_PROGRAM])])
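In a cross build the strip step can thus be pointed at the host toolchain,
e.g. (illustrative tool name, only needed if configure did not already detect
a host-prefixed strip):

  make install-strip STRIP=aarch64-linux-gnu-strip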
-
-# Copyright (C) 2006-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# _AM_SUBST_NOTMAKE(VARIABLE)
-# ---------------------------
-# Prevent Automake from outputting VARIABLE = @VARIABLE@ in Makefile.in.
-# This macro is traced by Automake.
-AC_DEFUN([_AM_SUBST_NOTMAKE])
-
-# AM_SUBST_NOTMAKE(VARIABLE)
-# --------------------------
-# Public sister of _AM_SUBST_NOTMAKE.
-AC_DEFUN([AM_SUBST_NOTMAKE], [_AM_SUBST_NOTMAKE($@)])
-
-# Check how to create a tarball. -*- Autoconf -*-
-
-# Copyright (C) 2004-2020 Free Software Foundation, Inc.
-#
-# This file is free software; the Free Software Foundation
-# gives unlimited permission to copy and/or distribute it,
-# with or without modifications, as long as this notice is preserved.
-
-# _AM_PROG_TAR(FORMAT)
-# --------------------
-# Check how to create a tarball in format FORMAT.
-# FORMAT should be one of 'v7', 'ustar', or 'pax'.
-#
-# Substitute a variable $(am__tar) that is a command
-# writing to stdout a FORMAT-tarball containing the directory
-# $tardir.
-# tardir=directory && $(am__tar) > result.tar
-#
-# Substitute a variable $(am__untar) that extracts such
-# a tarball read from stdin.
-# $(am__untar) < result.tar
-#
-AC_DEFUN([_AM_PROG_TAR],
-[# Always define AMTAR for backward compatibility. Yes, it's still used
-# in the wild :-( We should find a proper way to deprecate it ...
-AC_SUBST([AMTAR], ['$${TAR-tar}'])
-
-# We'll loop over all known methods to create a tar archive until one works.
-_am_tools='gnutar m4_if([$1], [ustar], [plaintar]) pax cpio none'
-
-m4_if([$1], [v7],
- [am__tar='$${TAR-tar} chof - "$$tardir"' am__untar='$${TAR-tar} xf -'],
-
- [m4_case([$1],
- [ustar],
- [# The POSIX 1988 'ustar' format is defined with fixed-size fields.
- # There is notably a 21-bit limit for the UID and the GID. In fact,
- # the 'pax' utility can hang on bigger UID/GID (see automake bug#8343
- # and bug#13588).
- am_max_uid=2097151 # 2^21 - 1
- am_max_gid=$am_max_uid
- # The $UID and $GID variables are not portable, so we need to resort
- # to the POSIX-mandated id(1) utility. Errors in the 'id' calls
- # below are definitely unexpected, so allow the users to see them
- # (that is, avoid stderr redirection).
- am_uid=`id -u || echo unknown`
- am_gid=`id -g || echo unknown`
- AC_MSG_CHECKING([whether UID '$am_uid' is supported by ustar format])
- if test $am_uid -le $am_max_uid; then
- AC_MSG_RESULT([yes])
- else
- AC_MSG_RESULT([no])
- _am_tools=none
- fi
- AC_MSG_CHECKING([whether GID '$am_gid' is supported by ustar format])
- if test $am_gid -le $am_max_gid; then
- AC_MSG_RESULT([yes])
- else
- AC_MSG_RESULT([no])
- _am_tools=none
- fi],
-
- [pax],
- [],
-
- [m4_fatal([Unknown tar format])])
-
- AC_MSG_CHECKING([how to create a $1 tar archive])
-
- # Go ahead even if we have the value already cached. We do so because we
- # need to set the values for the 'am__tar' and 'am__untar' variables.
- _am_tools=${am_cv_prog_tar_$1-$_am_tools}
-
- for _am_tool in $_am_tools; do
- case $_am_tool in
- gnutar)
- for _am_tar in tar gnutar gtar; do
- AM_RUN_LOG([$_am_tar --version]) && break
- done
- am__tar="$_am_tar --format=m4_if([$1], [pax], [posix], [$1]) -chf - "'"$$tardir"'
- am__tar_="$_am_tar --format=m4_if([$1], [pax], [posix], [$1]) -chf - "'"$tardir"'
- am__untar="$_am_tar -xf -"
- ;;
- plaintar)
- # Must skip GNU tar: if it does not support --format= it doesn't create
- # a ustar tarball either.
- (tar --version) >/dev/null 2>&1 && continue
- am__tar='tar chf - "$$tardir"'
- am__tar_='tar chf - "$tardir"'
- am__untar='tar xf -'
- ;;
- pax)
- am__tar='pax -L -x $1 -w "$$tardir"'
- am__tar_='pax -L -x $1 -w "$tardir"'
- am__untar='pax -r'
- ;;
- cpio)
- am__tar='find "$$tardir" -print | cpio -o -H $1 -L'
- am__tar_='find "$tardir" -print | cpio -o -H $1 -L'
- am__untar='cpio -i -H $1 -d'
- ;;
- none)
- am__tar=false
- am__tar_=false
- am__untar=false
- ;;
- esac
-
- # If the value was cached, stop now. We just wanted to have am__tar
- # and am__untar set.
- test -n "${am_cv_prog_tar_$1}" && break
-
- # tar/untar a dummy directory, and stop if the command works.
- rm -rf conftest.dir
- mkdir conftest.dir
- echo GrepMe > conftest.dir/file
- AM_RUN_LOG([tardir=conftest.dir && eval $am__tar_ >conftest.tar])
- rm -rf conftest.dir
- if test -s conftest.tar; then
- AM_RUN_LOG([$am__untar <conftest.tar])
- AM_RUN_LOG([cat conftest.dir/file])
- grep GrepMe conftest.dir/file >/dev/null 2>&1 && break
- fi
- done
- rm -rf conftest.dir
-
- AC_CACHE_VAL([am_cv_prog_tar_$1], [am_cv_prog_tar_$1=$_am_tool])
- AC_MSG_RESULT([$am_cv_prog_tar_$1])])
-
-AC_SUBST([am__tar])
-AC_SUBST([am__untar])
-]) # _AM_PROG_TAR
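Consumers use the two substitutions exactly as the comment at the top of the
macro shows; a hedged sketch of a Makefile recipe (the gzip pipeline is an
illustration, not Automake's actual dist rule):

  tardir=mypkg-1.0 && $(am__tar) | gzip -c > mypkg-1.0.tar.gz
  gzip -dc mypkg-1.0.tar.gz | $(am__untar)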
-
+m4_include([m4/amversion.m4])
+m4_include([m4/auxdir.m4])
+m4_include([m4/cond.m4])
+m4_include([m4/init.m4])
+m4_include([m4/install-sh.m4])
+m4_include([m4/lead-dot.m4])
+m4_include([m4/missing.m4])
+m4_include([m4/options.m4])
+m4_include([m4/prog-cc-c-o.m4])
+m4_include([m4/runlog.m4])
+m4_include([m4/sanity.m4])
+m4_include([m4/silent.m4])
+m4_include([m4/strip.m4])
+m4_include([m4/substnot.m4])
+m4_include([m4/tar.m4])
-#!@PERL@ -w
+#!@PERL@
# aclocal - create aclocal.m4 by scanning configure.ac -*- perl -*-
# @configure_input@
# Copyright (C) 1996-2020 Free Software Foundation, Inc.
# Written by Tom Tromey <tromey@redhat.com>, and
# Alexandre Duret-Lutz <adl@gnu.org>.
+use 5.006;
+use strict;
+use warnings FATAL => 'all';
+
BEGIN
{
unshift (@INC, '@datadir@/@PACKAGE@-@APIVERSION@')
unless $ENV{AUTOMAKE_UNINSTALLED};
}
-use strict;
+use File::Basename;
+use File::Path ();
use Automake::Config;
use Automake::General;
use Automake::ChannelDefs;
use Automake::XFile;
use Automake::FileUtils;
-use File::Basename;
-use File::Path ();
# Some globals.
{
# $dest does not exist. We create an empty one just to
# run diff, and we erase it afterward. Using the real
- # the destination file (rather than a temporary file) is
+ # destination file (rather than a temporary file) is
# good when diff is run with options that display the
# file name.
#
my %files = map { $map{$_} => 1 } keys %macro_seen;
%files = strip_redundant_includes %files;
- # When AC_CONFIG_MACRO_DIRS is used, avoid possible spurious warnings
- # from autom4te about macros being "m4_require'd but not m4_defun'd";
- # for more background, see:
- # https://lists.gnu.org/archive/html/autoconf-patches/2012-11/msg00004.html
- # as well as autoconf commit 'v2.69-44-g1ed0548', "warn: allow aclocal
- # to silence m4_require warnings".
- my $early_m4_code .= "m4_define([m4_require_silent_probe], [-])";
+ # Suppress all warnings from this invocation of autom4te.
+ # In particular we want to avoid spurious warnings about
+ # macros being "m4_require'd but not m4_defun'd" because
+ # aclocal.m4 is not yet available.
+ local $ENV{WARNINGS} = 'none';
my $traces = ($ENV{AUTOM4TE} || '@am_AUTOM4TE@');
$traces .= " --language Autoconf-without-aclocal-m4 ";
- $traces = "echo '$early_m4_code' | $traces - ";
# Support AC_CONFIG_MACRO_DIRS also with older autoconf.
# Note that we can't use '$ac_config_macro_dirs_fallback' here, because
- # a bug in option parsing code of autom4te 2.68 and earlier will cause
- # it to read standard input last, even if the "-" argument is specified
+ # a bug in option parsing code of autom4te 2.68 and earlier would cause
+ # it to read standard input last, even if the "-" argument was specified
# early.
# FIXME: To be removed in Automake 2.0, once we can assume autoconf
# 2.70 or later.
# characters (like newlines).
(map { "--trace='$_:\$f::\$n'" } (keys %macro_seen)));
- verb "running $traces $configure_ac";
+ verb "running WARNINGS=$ENV{WARNINGS} $traces $configure_ac";
my $tracefh = new Automake::XFile ("$traces $configure_ac |");
--verbose don't be silent
--version print version number, then exit
-W, --warnings=CATEGORY report the warnings falling in CATEGORY
+EOF
-Warning categories include:
- syntax dubious syntactic constructs (default)
- unsupported unknown macros (default)
- all all the warnings (default)
- no-CATEGORY turn off warnings in CATEGORY
- none turn off all the warnings
- error treat warnings as errors
+ print Automake::ChannelDefs::usage (), "\n";
+ print <<'EOF';
Report bugs to <@PACKAGE_BUGREPORT@>.
GNU Automake home page: <@PACKAGE_URL@>.
General help using GNU software: <https://www.gnu.org/gethelp/>.
{
my $print_and_exit = 0;
my $diff_command;
+ my @warnings = ();
my %cli_options =
(
'output=s' => \$output_file,
'print-ac-dir' => \$print_and_exit,
'verbose' => sub { setup_channel 'verb', silent => 0; },
- 'W|warnings=s' => \&parse_warnings,
+ 'W|warnings=s' => \@warnings,
);
use Automake::Getopt ();
Automake::Getopt::parse_options %cli_options;
+ parse_warnings @warnings;
if (@ARGV > 0)
{
-#!@PERL@ -w
+#!@PERL@
# automake - create Makefile.in from Makefile.am -*- perl -*-
# @configure_input@
# Copyright (C) 1994-2020 Free Software Foundation, Inc.
package Automake;
+use 5.006;
use strict;
+use warnings FATAL => 'all';
BEGIN
{
$ENV{'SHELL'} = '@SHELL@' if exists $ENV{'DJDIR'};
}
+use Carp;
+use File::Basename;
+use File::Spec;
+
use Automake::Config;
BEGIN
{
use Automake::RuleDef;
use Automake::Wrap 'makefile_wrap';
use Automake::Language;
-use File::Basename;
-use File::Spec;
-use Carp;
## ----------------------- ##
## Subroutine prototypes. ##
# Use $(install_sh), not $(MKDIR_P) because the latter requires
# at least one argument, and $(mkinstalldirs) used to work
# even without arguments (e.g. $(mkinstalldirs) $(conditional_dir)).
+ # Also, $(MKDIR_P) uses the umask for any intermediate directories
+ # created, whereas we want them to be created with umask 022
+ # so that they are mode 755.
define_variable ('mkinstalldirs', '$(install_sh) -d', INTERNAL);
}
if ($handle_exeext)
{
unshift (@test_suffixes, $at_exeext)
- unless $test_suffixes[0] eq $at_exeext;
+ unless @test_suffixes && $test_suffixes[0] eq $at_exeext;
}
unshift (@test_suffixes, '');
sinclude => 1,
);
+ # Suppress all warnings from this invocation of autoconf.
+ # The user is presumably about to run autoconf themselves
+ # and will see its warnings then.
+ local $ENV{WARNINGS} = 'none';
+
my $traces = ($ENV{AUTOCONF} || '@am_AUTOCONF@') . " ";
# Use a separator unlikely to be used, not ':', the default, which
map { "--trace=$_" . ':\$f:\$l::\$d::\$n::\${::}%' }
(keys %traced));
+ verb "running WARNINGS=$ENV{WARNINGS} $traces";
my $tracefh = new Automake::XFile ("$traces $filename |");
- verb "reading $traces";
@cond_stack = ();
my $where;
my $c_file = $vala_file;
if ($c_file =~ s/(.*)\.vala$/$1.c/)
{
- $c_file = "\$(srcdir)/$c_file";
- $output_rules .= "$c_file: \$(srcdir)/${derived}_vala.stamp\n"
- . "\t\@if test -f \$@; then :; else rm -f \$(srcdir)/${derived}_vala.stamp; fi\n"
+ my $built_c_file = "\$(builddir)/$c_file";
+ my $built_dir = dirname $built_c_file;
+ my $base_c_file = basename $c_file;
+ $output_rules .= "$built_c_file: \$(builddir)/${derived}_vala.stamp\n"
+ . "\t\@if test ! -f \$@ && test \$(srcdir) != \$(builddir) && test -n \"\$\$(find -L \$(srcdir)/$c_file -prune -newer \$(srcdir)/$vala_file)\"; then cp -p \$(srcdir)/$c_file $built_c_file; fi\n"
+ . "\t\@if test -f \$@; then :; else rm -f \$(builddir)/${derived}_vala.stamp; fi\n"
. "\t\@if test -f \$@; then :; else \\\n"
- . "\t \$(MAKE) \$(AM_MAKEFLAGS) \$(srcdir)/${derived}_vala.stamp; \\\n"
+ . "\t \$(MAKE) \$(AM_MAKEFLAGS) \$(builddir)/${derived}_vala.stamp; \\\n"
+ . "\t if test $built_dir != .; then mv $base_c_file $built_dir/; fi \\\n"
. "\tfi\n";
- $clean_files{$c_file} = MAINTAINER_CLEAN;
+ $clean_files{$built_c_file} = DIST_CLEAN;
+ $clean_files{"\$(srcdir)/$c_file"} = MAINTAINER_CLEAN;
}
}
my $lastflag = '';
foreach my $flag ($flags->value_as_list_recursive)
{
- if (grep (/$lastflag/, ('-H', '-h', '--header', '--internal-header',
- '--vapi', '--internal-vapi', '--gir')))
+ if (grep (/^$lastflag$/, ('-H', '-h', '--header', '--internal-header',
+ '--vapi', '--internal-vapi', '--gir')))
{
- my $headerfile = "\$(srcdir)/$flag";
- $output_rules .= "$headerfile: \$(srcdir)/${derived}_vala.stamp\n"
- . "\t\@if test -f \$@; then :; else rm -f \$(srcdir)/${derived}_vala.stamp; fi\n"
+ my $headerfile = "\$(builddir)/$flag";
+ $output_rules .= "$headerfile: \$(builddir)/${derived}_vala.stamp\n"
+ . "\t\@if test -f \$@; then :; else rm -f \$(builddir)/${derived}_vala.stamp; fi\n"
. "\t\@if test -f \$@; then :; else \\\n"
- . "\t \$(MAKE) \$(AM_MAKEFLAGS) \$(srcdir)/${derived}_vala.stamp; \\\n"
+ . "\t \$(MAKE) \$(AM_MAKEFLAGS) \$(builddir)/${derived}_vala.stamp; \\\n"
. "\tfi\n";
# valac is not used when building from dist tarballs
push_dist_common ($headerfile);
$clean_files{$headerfile} = MAINTAINER_CLEAN;
}
+ if (grep (/^$lastflag$/, ('--library')))
+ {
+ my $headerfile = "\$(builddir)/$flag";
+ $output_rules .= "$headerfile.vapi: \$(builddir)/${derived}_vala.stamp\n"
+ . "\t\@if test -f \$@; then :; else rm -f \$(builddir)/${derived}_vala.stamp; fi\n"
+ . "\t\@if test -f \$@; then :; else \\\n"
+ . "\t \$(MAKE) \$(AM_MAKEFLAGS) \$(builddir)/${derived}_vala.stamp; \\\n"
+ . "\tfi\n";
+
+ # valac is not used when building from dist tarballs,
+ # so distribute the generated file.
+ my $vapi = "$headerfile.vapi";
+ push_dist_common ($vapi);
+ $clean_files{$vapi} = MAINTAINER_CLEAN;
+ }
$lastflag = $flag;
}
}
my $verbose = verbose_flag ('VALAC');
my $silent = silent_flag ();
- my $stampfile = "\$(srcdir)/${derived}_vala.stamp";
+ my $stampfile = "\$(builddir)/${derived}_vala.stamp";
$output_rules .=
- "\$(srcdir)/${derived}_vala.stamp: @vala_sources\n".
+ "\$(builddir)/${derived}_vala.stamp: @vala_sources\n".
# Since the C files generated from the vala sources depend on the
# ${derived}_vala.stamp file, we must ensure its timestamp is older than
# those of the C files generated by the valac invocation below (this is
# Thus we need to create the stamp file *before* invoking valac, and to
# move it to its final location only after valac has been invoked.
"\t${silent}rm -f \$\@ && echo stamp > \$\@-t\n".
- "\t${verbose}\$(am__cd) \$(srcdir) && $compile @vala_sources\n".
+ "\t${verbose}$compile \$^\n".
"\t${silent}mv -f \$\@-t \$\@\n";
push_dist_common ($stampfile);
-f, --force-missing force update of standard files
";
- Automake::ChannelDefs::usage;
+ print Automake::ChannelDefs::usage (), "\n";
print "\nFiles automatically distributed if found " .
"(always):\n";
set_strictness ($strict);
my $cli_where = new Automake::Location;
set_global_option ('no-dependencies', $cli_where) if $ignore_deps;
- for my $warning (@warnings)
- {
- parse_warnings ('-W', $warning);
- }
+ parse_warnings @warnings;
return unless @ARGV;
#! /bin/sh
# Guess values for system-dependent variables and create Makefiles.
-# Generated by GNU Autoconf 2.69.204-98d6 for GNU Automake 1.16.2.
+# Generated by GNU Autoconf 2.69d.4-8e54 for GNU Automake 1.16.3.
#
# Report bugs to <bug-automake@gnu.org>.
#
fi
+
+# Reset variables that may have inherited troublesome values from
+# the environment.
+
+# IFS needs to be set, to space, tab, and newline, in precisely that order.
+# (If _AS_PATH_WALK were called with IFS unset, it would have the
+# side effect of setting IFS to empty, thus disabling word splitting.)
+# Quoting is to prevent editors from complaining about space-tab.
+as_nl='
+'
+export as_nl
+IFS=" "" $as_nl"
+
+PS1='$ '
+PS2='> '
+PS4='+ '
+
+# Ensure predictable behavior from utilities with locale-dependent output.
+LC_ALL=C
+export LC_ALL
+LANGUAGE=C
+export LANGUAGE
+
+# We cannot yet rely on "unset" to work, but we need these variables
+# to be unset--not just set to an empty or harmless value--now, to
+# avoid bugs in old shells (e.g. pre-3.0 UWIN ksh). This construct
+# also avoids known problems related to "unset" and subshell syntax
+# in other old shells (e.g. bash 2.01 and pdksh 5.2.14).
+for as_var in BASH_ENV ENV MAIL MAILPATH CDPATH
+do eval test \${$as_var+y} \
+ && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || :
+done
+
+# Ensure that fds 0, 1, and 2 are open.
+if (exec 3>&0) 2>/dev/null; then :; else exec 0</dev/null; fi
+if (exec 3>&1) 2>/dev/null; then :; else exec 1>/dev/null; fi
+if (exec 3>&2) ; then :; else exec 2>/dev/null; fi
+
# The user is always right.
if ${PATH_SEPARATOR+false} :; then
PATH_SEPARATOR=:
fi
-# IFS
-# We need space, tab and new line, in precisely that order. Quoting is
-# there to prevent editors from complaining about space-tab.
-# (If _AS_PATH_WALK were called with IFS unset, it would disable word
-# splitting by setting IFS to empty value.)
-as_nl='
-'
-export as_nl
-IFS=" "" $as_nl"
-
# Find who we are. Look in the path if we contain no directory separator.
as_myself=
case $0 in #((
exit 1
fi
-# Unset variables that we do not need and which cause bugs (e.g. in
-# pre-3.0 UWIN ksh). But do not cause bugs in bash 2.01; the "|| exit 1"
-# suppresses any "Segmentation fault" message there. '((' could
-# trigger a bug in pdksh 5.2.14.
-for as_var in BASH_ENV ENV MAIL MAILPATH
-do eval test \${$as_var+y} \
- && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || :
-done
-PS1='$ '
-PS2='> '
-PS4='+ '
-
-# NLS nuisances.
-LC_ALL=C
-export LC_ALL
-LANGUAGE=C
-export LANGUAGE
-
-# CDPATH.
-(unset CDPATH) >/dev/null 2>&1 && unset CDPATH
# Use a proper internal environment variable to ensure we don't fall
# into an infinite loop, continuously re-executing ourselves.
# Admittedly, this is quite paranoid, since all the known shells bail
# out after a failed `exec'.
printf "%s\n" "$0: could not re-execute with $CONFIG_SHELL" >&2
-as_fn_exit 255
+exit 255
fi
# We don't want this to propagate to other subprocesses.
{ _as_can_reexec=; unset _as_can_reexec;}
# Try only shells that exist, to save several forks.
as_shell=$as_dir$as_base
if { test -f "$as_shell" || test -f "$as_shell.exe"; } &&
- { $as_echo "$as_bourne_compatible""$as_required" | as_run=a "$as_shell"; } 2>/dev/null
+ as_run=a "$as_shell" -c "$as_bourne_compatible""$as_required" 2>/dev/null
then :
CONFIG_SHELL=$as_shell as_have_required=yes
- if { $as_echo "$as_bourne_compatible""$as_suggested" | as_run=a "$as_shell"; } 2>/dev/null
+ if as_run=a "$as_shell" -c "$as_bourne_compatible""$as_suggested" 2>/dev/null
then :
break 2
fi
as_found=false
done
IFS=$as_save_IFS
-$as_found || { if { test -f "$SHELL" || test -f "$SHELL.exe"; } &&
- { $as_echo "$as_bourne_compatible""$as_required" | as_run=a "$SHELL"; } 2>/dev/null
+if $as_found
+then :
+
+else
+ if { test -f "$SHELL" || test -f "$SHELL.exe"; } &&
+ as_run=a "$SHELL" -c "$as_bourne_compatible""$as_required" 2>/dev/null
then :
CONFIG_SHELL=$SHELL as_have_required=yes
-fi; }
+fi
+fi
if test "x$CONFIG_SHELL" != x
exit
}
+
+# Determine whether it's possible to make 'echo' print without a newline.
+# These variables are no longer used directly by Autoconf, but are AC_SUBSTed
+# for compatibility with existing Makefiles.
ECHO_C= ECHO_N= ECHO_T=
case `echo -n x` in #(((((
-n*)
ECHO_N='-n';;
esac
+# For backward compatibility with old third-party macros, we provide
+# the shell variables $as_echo and $as_echo_n. New code should use
+# AS_ECHO(["message"]) and AS_ECHO_N(["message"]), respectively.
+as_echo='printf %s\n'
+as_echo_n='printf %s'
+
rm -f conf$$ conf$$.exe conf$$.file
if test -d conf$$.dir; then
rm -f conf$$.dir/conf$$.file
# Identity of this package.
PACKAGE_NAME='GNU Automake'
PACKAGE_TARNAME='automake'
-PACKAGE_VERSION='1.16.2'
-PACKAGE_STRING='GNU Automake 1.16.2'
+PACKAGE_VERSION='1.16.3'
+PACKAGE_STRING='GNU Automake 1.16.3'
PACKAGE_BUGREPORT='bug-automake@gnu.org'
PACKAGE_URL='https://www.gnu.org/software/automake/'
*) ac_optarg=yes ;;
esac
- # Accept the important Cygnus configure options, so we can diagnose typos.
-
case $ac_dashdash$ac_option in
--)
ac_dashdash=yes ;;
# Omit some internal or obsolete options to make the list less imposing.
# This message is too long to be a string in the A/UX 3.1 sh.
cat <<_ACEOF
-\`configure' configures GNU Automake 1.16.2 to adapt to many kinds of systems.
+\`configure' configures GNU Automake 1.16.3 to adapt to many kinds of systems.
Usage: $0 [OPTION]... [VAR=VALUE]...
if test -n "$ac_init_help"; then
case $ac_init_help in
- short | recursive ) echo "Configuration of GNU Automake 1.16.2:";;
+ short | recursive ) echo "Configuration of GNU Automake 1.16.3:";;
esac
cat <<\_ACEOF
ac_abs_srcdir=$ac_abs_top_srcdir$ac_dir_suffix
cd "$ac_dir" || { ac_status=$?; continue; }
- # Check for guested configure.
+ # Check for configure.gnu first; this name is used for a wrapper for
+ # Metaconfig's "Configure" on case-insensitive file systems.
if test -f "$ac_srcdir/configure.gnu"; then
echo &&
$SHELL "$ac_srcdir/configure.gnu" --help=recursive
test -n "$ac_init_help" && exit $ac_status
if $ac_init_version; then
cat <<\_ACEOF
-GNU Automake configure 1.16.2
-generated by GNU Autoconf 2.69.204-98d6
+GNU Automake configure 1.16.3
+generated by GNU Autoconf 2.69d.4-8e54
Copyright (C) 2020 Free Software Foundation, Inc.
This configure script is free software; the Free Software Foundation
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
-It was created by GNU Automake $as_me 1.16.2, which was
-generated by GNU Autoconf 2.69.204-98d6. Invocation command line was
+It was created by GNU Automake $as_me 1.16.3, which was
+generated by GNU Autoconf 2.69d.4-8e54. Invocation command line was
$ $0$ac_configure_args_raw
>$cache_file
fi
+
+# Auxiliary files required by this configure script.
+ac_aux_files="compile missing install-sh config.guess config.sub"
+
+# Locations in which to look for auxiliary files.
+ac_aux_dir_candidates="${srcdir}/lib"
+
+# Search for a directory containing all of the required auxiliary files,
+# $ac_aux_files, from the $PATH-style list $ac_aux_dir_candidates.
+# If we don't find one directory that contains all the files we need,
+# we report the set of missing files from the *first* directory in
+# $ac_aux_dir_candidates and give up.
+ac_missing_aux_files=""
+ac_first_candidate=:
+printf "%s\n" "$as_me:${as_lineno-$LINENO}: looking for aux files: $ac_aux_files" >&5
+as_save_IFS=$IFS; IFS=$PATH_SEPARATOR
+as_found=false
+for as_dir in $ac_aux_dir_candidates
+do
+ IFS=$as_save_IFS
+ case $as_dir in #(((
+ '') as_dir=./ ;;
+ */) ;;
+ *) as_dir=$as_dir/ ;;
+ esac
+ as_found=:
+
+ printf "%s\n" "$as_me:${as_lineno-$LINENO}: trying $as_dir" >&5
+ ac_aux_dir_found=yes
+ ac_install_sh=
+ for ac_aux in $ac_aux_files
+ do
+ # As a special case, if "install-sh" is required, that requirement
+ # can be satisfied by any of "install-sh", "install.sh", or "shtool",
+ # and $ac_install_sh is set appropriately for whichever one is found.
+ if test x"$ac_aux" = x"install-sh"
+ then
+ if test -f "${as_dir}install-sh"; then
+ printf "%s\n" "$as_me:${as_lineno-$LINENO}: ${as_dir}install-sh found" >&5
+ ac_install_sh="${as_dir}install-sh -c"
+ elif test -f "${as_dir}install.sh"; then
+ printf "%s\n" "$as_me:${as_lineno-$LINENO}: ${as_dir}install.sh found" >&5
+ ac_install_sh="${as_dir}install.sh -c"
+ elif test -f "${as_dir}shtool"; then
+ printf "%s\n" "$as_me:${as_lineno-$LINENO}: ${as_dir}shtool found" >&5
+ ac_install_sh="${as_dir}shtool install -c"
+ else
+ ac_aux_dir_found=no
+ if $ac_first_candidate; then
+ ac_missing_aux_files="${ac_missing_aux_files} install-sh"
+ else
+ break
+ fi
+ fi
+ else
+ if test -f "${as_dir}${ac_aux}"; then
+ printf "%s\n" "$as_me:${as_lineno-$LINENO}: ${as_dir}${ac_aux} found" >&5
+ else
+ ac_aux_dir_found=no
+ if $ac_first_candidate; then
+ ac_missing_aux_files="${ac_missing_aux_files} ${ac_aux}"
+ else
+ break
+ fi
+ fi
+ fi
+ done
+ if test "$ac_aux_dir_found" = yes; then
+ ac_aux_dir="$as_dir"
+ break
+ fi
+ ac_first_candidate=false
+
+ as_found=false
+done
+IFS=$as_save_IFS
+if $as_found
+then :
+
+else
+ as_fn_error $? "cannot find required auxiliary files:$ac_missing_aux_files" "$LINENO" 5
+fi
+
+
+# These three variables are undocumented and unsupported,
+# and are intended to be withdrawn in a future Autoconf release.
+# They can cause serious problems if a builder's source tree is in a directory
+# whose full name contains unusual characters.
+if test -f "${ac_aux_dir}config.guess"; then
+ ac_config_guess="$SHELL ${ac_aux_dir}config.guess"
+fi
+if test -f "${ac_aux_dir}config.sub"; then
+ ac_config_sub="$SHELL ${ac_aux_dir}config.sub"
+fi
+if test -f "$ac_aux_dir/configure"; then
+ ac_configure="$SHELL ${ac_aux_dir}configure"
+fi
+
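When no candidate directory supplies every required file, the loop above fails
with a message of this shape (hypothetical run with lib/compile removed):

  configure: error: cannot find required auxiliary files: compile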
# Check that the precious variables saved in the cache have kept the same
# value.
ac_cache_corrupted=false
printf "%s\n" "$as_me: error: in \`$ac_pwd':" >&2;}
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: error: changes in the environment can compromise the build" >&5
printf "%s\n" "$as_me: error: changes in the environment can compromise the build" >&2;}
- as_fn_error $? "run \`make distclean' and/or \`rm $cache_file' and start over" "$LINENO" 5
+ as_fn_error $? "run \`${MAKE-make} distclean' and/or \`rm $cache_file'
+ and start over" "$LINENO" 5
fi
## -------------------- ##
## Main body of script. ##
-ac_aux_dir=
-for ac_dir in lib "$srcdir"/lib
-do
- if test -f "$ac_dir/install-sh"; then
- ac_aux_dir=$ac_dir
- ac_install_sh="$ac_aux_dir/install-sh -c"
- break
- elif test -f "$ac_dir/install.sh"; then
- ac_aux_dir=$ac_dir
- ac_install_sh="$ac_aux_dir/install.sh -c"
- break
- elif test -f "$ac_dir/shtool"; then
- ac_aux_dir=$ac_dir
- ac_install_sh="$ac_aux_dir/shtool install -c"
- break
- fi
-done
-if test -z "$ac_aux_dir"; then
- as_fn_error $? "cannot find install-sh, install.sh, or shtool in lib \"$srcdir\"/lib" "$LINENO" 5
-fi
-
-# These three variables are undocumented and unsupported,
-# and are intended to be withdrawn in a future Autoconf release.
-# They can cause serious problems if a builder's source tree is in a directory
-# whose full name contains unusual characters.
-ac_config_guess="$SHELL $ac_aux_dir/config.guess" # Please don't use this var.
-ac_config_sub="$SHELL $ac_aux_dir/config.sub" # Please don't use this var.
-ac_configure="$SHELL $ac_aux_dir/configure" # Please don't use this var.
-
# Check whether --enable-silent-rules was given.
AM_BACKSLASH='\'
-# Make sure we can run config.sub.
-$SHELL "$ac_aux_dir/config.sub" sun4 >/dev/null 2>&1 ||
- as_fn_error $? "cannot run $SHELL $ac_aux_dir/config.sub" "$LINENO" 5
+
+
+
+ # Make sure we can run config.sub.
+$SHELL "${ac_aux_dir}config.sub" sun4 >/dev/null 2>&1 ||
+ as_fn_error $? "cannot run $SHELL ${ac_aux_dir}config.sub" "$LINENO" 5
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking build system type" >&5
printf %s "checking build system type... " >&6; }
else
ac_build_alias=$build_alias
test "x$ac_build_alias" = x &&
- ac_build_alias=`$SHELL "$ac_aux_dir/config.guess"`
+ ac_build_alias=`$SHELL "${ac_aux_dir}config.guess"`
test "x$ac_build_alias" = x &&
as_fn_error $? "cannot guess build type; you must specify one" "$LINENO" 5
-ac_cv_build=`$SHELL "$ac_aux_dir/config.sub" $ac_build_alias` ||
- as_fn_error $? "$SHELL $ac_aux_dir/config.sub $ac_build_alias failed" "$LINENO" 5
+ac_cv_build=`$SHELL "${ac_aux_dir}config.sub" $ac_build_alias` ||
+ as_fn_error $? "$SHELL ${ac_aux_dir}config.sub $ac_build_alias failed" "$LINENO" 5
fi
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_cv_build" >&5
if test "x$host_alias" = x; then
ac_cv_host=$ac_cv_build
else
- ac_cv_host=`$SHELL "$ac_aux_dir/config.sub" $host_alias` ||
- as_fn_error $? "$SHELL $ac_aux_dir/config.sub $host_alias failed" "$LINENO" 5
+ ac_cv_host=`$SHELL "${ac_aux_dir}config.sub" $host_alias` ||
+ as_fn_error $? "$SHELL ${ac_aux_dir}config.sub $host_alias failed" "$LINENO" 5
fi
fi
am__api_version='1.16'
-# Find a good install program. We prefer a C program (faster),
+
+ # Find a good install program. We prefer a C program (faster),
# so one script is as good as another. But avoid the broken or
# incompatible versions:
# SysV /etc/install, /usr/sbin/install
ac_script='s/[\\$]/&&/g;s/;s,x,x,$//'
program_transform_name=`printf "%s\n" "$program_transform_name" | sed "$ac_script"`
+
# Expand $ac_aux_dir to an absolute path.
am_aux_dir=`cd "$ac_aux_dir" && pwd`
-if test x"${MISSING+set}" != xset; then
- case $am_aux_dir in
- *\ * | *\ *)
- MISSING="\${SHELL} \"$am_aux_dir/missing\"" ;;
- *)
- MISSING="\${SHELL} $am_aux_dir/missing" ;;
- esac
+
+ if test x"${MISSING+set}" != xset; then
+ MISSING="\${SHELL} '$am_aux_dir/missing'"
fi
# Use eval to expand $SHELL
if eval "$MISSING --is-lightweight"; then
fi
INSTALL_STRIP_PROGRAM="\$(install_sh) -c -s"
-{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking for a race-free mkdir -p" >&5
+
+ { printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking for a race-free mkdir -p" >&5
printf %s "checking for a race-free mkdir -p... " >&6; }
if test -z "$MKDIR_P"; then
if test ${ac_cv_path_mkdir+y}
# Define the identity of the package.
PACKAGE='automake'
- VERSION='1.16.2'
+ VERSION='1.16.3'
# Some tools Automake needs.
# the compiler is broken, or we cross compile.
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether we are cross compiling" >&5
printf %s "checking whether we are cross compiling... " >&6; }
-if test "$cross_compiling" = maybe && test "x$build" != "x$host"; then
- cross_compiling=yes
-elif test "$cross_compiling" != yes; then
+if test "$cross_compiling" != yes; then
{ { ac_try="$ac_link"
case "(($ac_try" in
*\"* | *\`* | *\\*) ac_try_echo=\$ac_try;;
fi
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_compiler_gnu" >&5
printf "%s\n" "$ac_cv_c_compiler_gnu" >&6; }
+ac_compiler_gnu=$ac_cv_c_compiler_gnu
+
if test $ac_compiler_gnu = yes; then
GCC=yes
else
am__failed=no
while :; do
-ac_ext=c
+
+ ac_ext=c
ac_cpp='$CPP $CPPFLAGS'
ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5'
ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'
# the compiler is broken, or we cross compile.
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether we are cross compiling" >&5
printf %s "checking whether we are cross compiling... " >&6; }
-if test "$cross_compiling" = maybe && test "x$build" != "x$host"; then
- cross_compiling=yes
-elif test "$cross_compiling" != yes; then
+if test "$cross_compiling" != yes; then
{ { ac_try="$ac_link"
case "(($ac_try" in
*\"* | *\`* | *\\*) ac_try_echo=\$ac_try;;
fi
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_cv_cxx_compiler_gnu" >&5
printf "%s\n" "$ac_cv_cxx_compiler_gnu" >&6; }
+ac_compiler_gnu=$ac_cv_cxx_compiler_gnu
+
if test $ac_compiler_gnu = yes; then
GXX=yes
else
cat confdefs.h - <<_ACEOF >conftest.$ac_ext
/* end confdefs.h. */
-#include <deque>
-#include <functional>
-#include <memory>
-#include <tuple>
-#include <array>
-#include <regex>
-#include <iostream>
+// Does the compiler advertise C++ 2011 conformance?
+#if !defined __cplusplus || __cplusplus < 201103L
+# error "Compiler does not advertise C++11 conformance"
+#endif
namespace cxx11test
{
- typedef std::shared_ptr<std::string> sptr;
- typedef std::weak_ptr<std::string> wptr;
-
- typedef std::tuple<std::string,int,double> tp;
- typedef std::array<int, 20> int_array;
-
constexpr int get_val() { return 20; }
struct testinit
double d;
};
- class delegate {
+ class delegate
+ {
public:
delegate(int n) : n(n) {}
delegate(): delegate(2354) {}
int n;
};
- class overridden : public delegate {
+ class overridden : public delegate
+ {
public:
overridden(int n): delegate(n) {}
virtual int getval() override final { return this->n * 2; }
};
- class nocopy {
+ class nocopy
+ {
public:
nocopy(int i): i(i) {}
nocopy() = default;
private:
int i;
};
+
+ // for testing lambda expressions
+ template <typename Ret, typename Fn> Ret eval(Fn f, Ret v)
+ {
+ return f(v);
+ }
+
+ // for testing variadic templates and trailing return types
+ template <typename V> auto sum(V first) -> V
+ {
+ return first;
+ }
+ template <typename V, typename... Args> auto sum(V first, Args... rest) -> V
+ {
+ return first + sum(rest...);
+ }
}
+// Does the compiler advertise C++98 conformance?
+#if !defined __cplusplus || __cplusplus < 199711L
+# error "Compiler does not advertise C++98 conformance"
+#endif
-#include <algorithm>
-#include <cstdlib>
-#include <fstream>
-#include <iomanip>
+// These inclusions are cheap compared to including any STL header, but will
+// reliably reject old compilers that lack the unsuffixed header files.
+#undef NDEBUG
+#include <cassert>
+#include <cstring>
#include <iostream>
-#include <list>
-#include <map>
-#include <set>
-#include <sstream>
-#include <stdexcept>
-#include <string>
-#include <utility>
-#include <vector>
-
-namespace test {
- typedef std::vector<std::string> string_vec;
- typedef std::pair<int,bool> map_value;
- typedef std::map<std::string,map_value> map_type;
- typedef std::set<int> set_type;
-
- template<typename T>
- class printer {
- public:
- printer(std::ostringstream& os): os(os) {}
- void operator() (T elem) { os << elem << std::endl; }
- private:
- std::ostringstream& os;
- };
+
+// Namespaces, exceptions, and templates were all added after "C++ 2.0".
+using std::cout;
+using std::strcmp;
+
+namespace {
+
+void test_exception_syntax()
+{
+ try {
+ throw "test";
+ } catch (const char *s) {
+ // Extra parentheses suppress a warning when building autoconf itself,
+ // due to lint rules shared with more typical C programs.
+ assert (!(strcmp) (s, "test"));
+ }
}
+template <typename T> struct test_template
+{
+ T const val;
+ explicit test_template(T t) : val(t) {}
+ template <typename U> T add(U u) { return static_cast<T>(u) + val; }
+};
+
+} // anonymous namespace
+
int
main (void)
{
{
// Test auto and decltype
- std::deque<int> d;
- d.push_front(43);
- d.push_front(484);
- d.push_front(3);
- d.push_front(844);
- int total = 0;
- for (auto i = d.begin(); i != d.end(); ++i) { total += *i; }
-
auto a1 = 6538;
auto a2 = 48573953.4;
auto a3 = "String literal";
+ int total = 0;
+ for (auto i = a3; *i; ++i) { total += *i; }
+
decltype(a2) a4 = 34895.034;
}
{
cxx11test::testinit il = { 4323, 435234.23544 };
}
{
- // Test range-based for and lambda
- cxx11test::int_array array = {9, 7, 13, 15, 4, 18, 12, 10, 5, 3, 14, 19, 17, 8, 6, 20, 16, 2, 11, 1};
- for (int &x : array) { x += 23; }
- std::for_each(array.begin(), array.end(), [](int v1){ std::cout << v1; });
-}
-{
- using cxx11test::sptr;
- using cxx11test::wptr;
-
- sptr sp(new std::string("ASCII string"));
- wptr wp(sp);
- sptr sp2(wp);
+ // Test range-based for
+ int array[] = {9, 7, 13, 15, 4, 18, 12, 10, 5, 3,
+ 14, 19, 17, 8, 6, 20, 16, 2, 11, 1};
+ for (auto &x : array) { x += 23; }
}
{
- cxx11test::tp tuple("test", 54, 45.53434);
- double d = std::get<2>(tuple);
- std::string s;
- int i;
- std::tie(s,i,d) = tuple;
+ // Test lambda expressions
+ using cxx11test::eval;
+ assert (eval ([](int x) { return x*2; }, 21) == 42);
+ double d = 2.0;
+ assert (eval ([&](double x) { return d += x; }, 3.0) == 5.0);
+ assert (d == 5.0);
+ assert (eval ([=](double x) mutable { return d += x; }, 4.0) == 9.0);
+ assert (d == 5.0);
}
{
- static std::regex filename_regex("^_?([a-z0-9_.]+-)+[a-z0-9]+$");
- std::string testmatch("Test if this string matches");
- bool match = std::regex_search(testmatch, filename_regex);
-}
-{
- cxx11test::int_array array = {9, 7, 13, 15, 4, 18, 12, 10, 5, 3, 14, 19, 17, 8, 6, 20, 16, 2, 11, 1};
- cxx11test::int_array::size_type size = array.size();
+ // Test use of variadic templates
+ using cxx11test::sum;
+ auto a = sum(1);
+ auto b = sum(1, 2);
+ auto c = sum(1.0, 2.0, 3.0);
}
{
// Test constructor delegation
}
{
// Test template brackets
- std::vector<std::pair<int,char*>> v1;
+ test_template<::test_template<int>> v(test_template<int>(12));
}
{
// Unicode literals
}
-
-try {
- // Basic string.
- std::string teststr("ASCII text");
- teststr += " string";
-
- // Simple vector.
- test::string_vec testvec;
- testvec.push_back(teststr);
- testvec.push_back("foo");
- testvec.push_back("bar");
- if (testvec.size() != 3) {
- throw std::runtime_error("vector size is not 1");
- }
-
- // Dump vector into stringstream and obtain string.
- std::ostringstream os;
- for (test::string_vec::const_iterator i = testvec.begin();
- i != testvec.end(); ++i) {
- if (i + 1 != testvec.end()) {
- os << teststr << '\n';
- }
- }
- // Check algorithms work.
- std::for_each(testvec.begin(), testvec.end(), test::printer<std::string>(os));
- std::string os_out = os.str();
-
- // Test pair and map.
- test::map_type testmap;
- testmap.insert(std::make_pair(std::string("key"),
- std::make_pair(53,false)));
-
- // Test set.
- int values[] = {9, 7, 13, 15, 4, 18, 12, 10, 5, 3, 14, 19, 17, 8, 6, 20, 16, 2, 11, 1};
- test::set_type testset(values, values + sizeof(values)/sizeof(values[0]));
- std::list<int> testlist(testset.begin(), testset.end());
- std::copy(testset.begin(), testset.end(), std::back_inserter(testlist));
-} catch (const std::exception& e) {
- std::cerr << "Caught exception: " << e.what() << std::endl;
-
- // Test fstream
- std::ofstream of("test.txt");
- of << "Test ASCII text\n" << std::flush;
- of << "N= " << std::hex << std::setw(8) << std::left << 534 << std::endl;
- of.close();
+{
+ test_exception_syntax ();
+ test_template<double> tt (2.0);
+ assert (tt.add (4) == 6.0);
+ assert (true && !false);
+ cout << "ok\n";
}
-std::exit(0);
;
return 0;
ac_save_CXX=$CXX
cat confdefs.h - <<_ACEOF >conftest.$ac_ext
/* end confdefs.h. */
+// Does the compiler advertise C++98 conformance?
+#if !defined __cplusplus || __cplusplus < 199711L
+# error "Compiler does not advertise C++98 conformance"
+#endif
-#include <algorithm>
-#include <cstdlib>
-#include <fstream>
-#include <iomanip>
+// These inclusions are cheap compared to including any STL header, but will
+// reliably reject old compilers that lack the unsuffixed header files.
+#undef NDEBUG
+#include <cassert>
+#include <cstring>
#include <iostream>
-#include <list>
-#include <map>
-#include <set>
-#include <sstream>
-#include <stdexcept>
-#include <string>
-#include <utility>
-#include <vector>
-
-namespace test {
- typedef std::vector<std::string> string_vec;
- typedef std::pair<int,bool> map_value;
- typedef std::map<std::string,map_value> map_type;
- typedef std::set<int> set_type;
-
- template<typename T>
- class printer {
- public:
- printer(std::ostringstream& os): os(os) {}
- void operator() (T elem) { os << elem << std::endl; }
- private:
- std::ostringstream& os;
- };
-}
-int
-main (void)
+// Namespaces, exceptions, and templates were all added after "C++ 2.0".
+using std::cout;
+using std::strcmp;
+
+namespace {
+
+void test_exception_syntax()
{
+ try {
+ throw "test";
+ } catch (const char *s) {
+ // Extra parentheses suppress a warning when building autoconf itself,
+ // due to lint rules shared with more typical C programs.
+ assert (!(strcmp) (s, "test"));
+ }
+}
+template <typename T> struct test_template
+{
+ T const val;
+ explicit test_template(T t) : val(t) {}
+ template <typename U> T add(U u) { return static_cast<T>(u) + val; }
+};
-try {
- // Basic string.
- std::string teststr("ASCII text");
- teststr += " string";
+} // anonymous namespace
- // Simple vector.
- test::string_vec testvec;
- testvec.push_back(teststr);
- testvec.push_back("foo");
- testvec.push_back("bar");
- if (testvec.size() != 3) {
- throw std::runtime_error("vector size is not 1");
- }
+int
+main (void)
+{
- // Dump vector into stringstream and obtain string.
- std::ostringstream os;
- for (test::string_vec::const_iterator i = testvec.begin();
- i != testvec.end(); ++i) {
- if (i + 1 != testvec.end()) {
- os << teststr << '\n';
- }
- }
- // Check algorithms work.
- std::for_each(testvec.begin(), testvec.end(), test::printer<std::string>(os));
- std::string os_out = os.str();
-
- // Test pair and map.
- test::map_type testmap;
- testmap.insert(std::make_pair(std::string("key"),
- std::make_pair(53,false)));
-
- // Test set.
- int values[] = {9, 7, 13, 15, 4, 18, 12, 10, 5, 3, 14, 19, 17, 8, 6, 20, 16, 2, 11, 1};
- test::set_type testset(values, values + sizeof(values)/sizeof(values[0]));
- std::list<int> testlist(testset.begin(), testset.end());
- std::copy(testset.begin(), testset.end(), std::back_inserter(testlist));
-} catch (const std::exception& e) {
- std::cerr << "Caught exception: " << e.what() << std::endl;
-
- // Test fstream
- std::ofstream of("test.txt");
- of << "Test ASCII text\n" << std::flush;
- of << "N= " << std::hex << std::setw(8) << std::left << 534 << std::endl;
- of.close();
+{
+ test_exception_syntax ();
+ test_template<double> tt (2.0);
+ assert (tt.add (4) == 6.0);
+ assert (true && !false);
+ cout << "ok\n";
}
-std::exit(0);
;
return 0;
# the compiler is broken, or we cross compile.
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether we are cross compiling" >&5
printf %s "checking whether we are cross compiling... " >&6; }
-if test "$cross_compiling" = maybe && test "x$build" != "x$host"; then
- cross_compiling=yes
-elif test "$cross_compiling" != yes; then
+if test "$cross_compiling" != yes; then
{ { ac_try="$ac_link"
case "(($ac_try" in
*\"* | *\`* | *\\*) ac_try_echo=\$ac_try;;
fi
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_cv_fc_compiler_gnu" >&5
printf "%s\n" "$ac_cv_fc_compiler_gnu" >&6; }
+ac_compiler_gnu=$ac_cv_fc_compiler_gnu
+
ac_ext=$ac_save_ext
ac_test_FCFLAGS=${FCFLAGS+y}
ac_save_FCFLAGS=$FCFLAGS
# the compiler is broken, or we cross compile.
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether we are cross compiling" >&5
printf %s "checking whether we are cross compiling... " >&6; }
-if test "$cross_compiling" = maybe && test "x$build" != "x$host"; then
- cross_compiling=yes
-elif test "$cross_compiling" != yes; then
+if test "$cross_compiling" != yes; then
{ { ac_try="$ac_link"
case "(($ac_try" in
*\"* | *\`* | *\\*) ac_try_echo=\$ac_try;;
fi
{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_cv_f77_compiler_gnu" >&5
printf "%s\n" "$ac_cv_f77_compiler_gnu" >&6; }
+ac_compiler_gnu=$ac_cv_f77_compiler_gnu
+
ac_ext=$ac_save_ext
ac_test_FFLAGS=${FFLAGS+y}
ac_save_FFLAGS=$FFLAGS
fi
+
+# Reset variables that may have inherited troublesome values from
+# the environment.
+
+# IFS needs to be set, to space, tab, and newline, in precisely that order.
+# (If _AS_PATH_WALK were called with IFS unset, it would have the
+# side effect of setting IFS to empty, thus disabling word splitting.)
+# Quoting is to prevent editors from complaining about space-tab.
+as_nl='
+'
+export as_nl
+IFS=" "" $as_nl"
+
+PS1='$ '
+PS2='> '
+PS4='+ '
+
+# Ensure predictable behavior from utilities with locale-dependent output.
+LC_ALL=C
+export LC_ALL
+LANGUAGE=C
+export LANGUAGE
+
+# We cannot yet rely on "unset" to work, but we need these variables
+# to be unset--not just set to an empty or harmless value--now, to
+# avoid bugs in old shells (e.g. pre-3.0 UWIN ksh). This construct
+# also avoids known problems related to "unset" and subshell syntax
+# in other old shells (e.g. bash 2.01 and pdksh 5.2.14).
+for as_var in BASH_ENV ENV MAIL MAILPATH CDPATH
+do eval test \${$as_var+y} \
+ && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || :
+done
+
+# Ensure that fds 0, 1, and 2 are open.
+if (exec 3>&0) 2>/dev/null; then :; else exec 0</dev/null; fi
+if (exec 3>&1) 2>/dev/null; then :; else exec 1>/dev/null; fi
+if (exec 3>&2) ; then :; else exec 2>/dev/null; fi
+
# The user is always right.
if ${PATH_SEPARATOR+false} :; then
PATH_SEPARATOR=:
fi
-# IFS
-# We need space, tab and new line, in precisely that order. Quoting is
-# there to prevent editors from complaining about space-tab.
-# (If _AS_PATH_WALK were called with IFS unset, it would disable word
-# splitting by setting IFS to empty value.)
-as_nl='
-'
-export as_nl
-IFS=" "" $as_nl"
-
# Find who we are. Look in the path if we contain no directory separator.
as_myself=
case $0 in #((
exit 1
fi
-# Unset variables that we do not need and which cause bugs (e.g. in
-# pre-3.0 UWIN ksh). But do not cause bugs in bash 2.01; the "|| exit 1"
-# suppresses any "Segmentation fault" message there. '((' could
-# trigger a bug in pdksh 5.2.14.
-for as_var in BASH_ENV ENV MAIL MAILPATH
-do eval test \${$as_var+y} \
- && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || :
-done
-PS1='$ '
-PS2='> '
-PS4='+ '
-
-# NLS nuisances.
-LC_ALL=C
-export LC_ALL
-LANGUAGE=C
-export LANGUAGE
-
-# CDPATH.
-(unset CDPATH) >/dev/null 2>&1 && unset CDPATH
# as_fn_error STATUS ERROR [LINENO LOG_FD]
as_cr_digits='0123456789'
as_cr_alnum=$as_cr_Letters$as_cr_digits
+
+# Determine whether it's possible to make 'echo' print without a newline.
+# These variables are no longer used directly by Autoconf, but are AC_SUBSTed
+# for compatibility with existing Makefiles.
ECHO_C= ECHO_N= ECHO_T=
case `echo -n x` in #(((((
-n*)
ECHO_N='-n';;
esac
+# For backward compatibility with old third-party macros, we provide
+# the shell variables $as_echo and $as_echo_n. New code should use
+# AS_ECHO(["message"]) and AS_ECHO_N(["message"]), respectively.
+as_echo='printf %s\n'
+as_echo_n='printf %s'
+
rm -f conf$$ conf$$.exe conf$$.file
if test -d conf$$.dir; then
rm -f conf$$.dir/conf$$.file
# report actual input values of CONFIG_FILES etc. instead of their
# values after options handling.
ac_log="
-This file was extended by GNU Automake $as_me 1.16.2, which was
-generated by GNU Autoconf 2.69.204-98d6. Invocation command line was
+This file was extended by GNU Automake $as_me 1.16.3, which was
+generated by GNU Autoconf 2.69d.4-8e54. Invocation command line was
CONFIG_FILES = $CONFIG_FILES
CONFIG_HEADERS = $CONFIG_HEADERS
cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1
ac_cs_config='$ac_cs_config_escaped'
ac_cs_version="\\
-GNU Automake config.status 1.16.2
-configured by $0, generated by GNU Autoconf 2.69.204-98d6,
+GNU Automake config.status 1.16.3
+configured by $0, generated by GNU Autoconf 2.69d.4-8e54,
with options \\"\$ac_cs_config\\"
Copyright (C) 2020 Free Software Foundation, Inc.
EOF
as_fn_exit 0
+
# along with this program. If not, see <https://www.gnu.org/licenses/>.
AC_PREREQ([2.69])
-AC_INIT([GNU Automake], [1.16.2], [bug-automake@gnu.org])
+AC_INIT([GNU Automake], [1.16.3], [bug-automake@gnu.org])
AC_CONFIG_SRCDIR([bin/automake.in])
AC_CONFIG_AUX_DIR([lib])
This is the 'contrib' directory of the GNU Automake distribution.
-Here you'll find additions to the Automake base distribution, in form of
-makefile fragments, m4 macros, scripts, documentation, et cetera. Such
-addition that might be useful for a significant percentage of its general
-audience, but (for one reason or another) are not deemed appropriate for
-inclusion into the Automake core.
+Here you'll find additions to the Automake base distribution, in the form of
+makefile fragments, m4 macros, scripts, documentation, et cetera: additions
+that might be handy to many users, but (for one reason or another) are not
+deemed appropriate for inclusion into the Automake core.
-There are several reasons for which a feature can be kept in contrib:
+There are several reasons that a feature might be kept in contrib:
1. The long-term usefulness of the feature is debatable and uncertain;
- on-field and real-word testing are necessary to prove or disprove
- its usefulness, before the feature can be committed into the Automake
- core (as doing so too early would later force us to continue the
- support for backward-compatibility, even if the features proves
- flawed or fails to attract widespread use).
+ real-world testing is necessary to prove or disprove its usefulness,
+ before the feature can be committed into the Automake core (as doing so
+ too early would later force us to continue support for
+ backward-compatibility, even if the feature proves flawed or fails to
+ attract widespread use).
- 2. The APIs or overall design of the feature are still unstable, and
- need on-field testing to iron warts and usability bugs, or uncover
- potential flaws.
+ 2. The APIs or overall design of the feature are still unstable, and need
+ testing to iron out warts and usability bugs, or uncover potential flaws.
- 3. The feature was an historical one, mostly obsoleted but still used
- "here and there" in the wild; so we want to to deprecate it and
- remove it from the Automake core, but cannot remove it altogether,
- for the sake of those still-existing usage. So it gets moved in
- contrib.
+ 3. The feature was an historical one, mostly obsolete but still used in the
+ wild. We want to deprecate it and remove it from the Automake core, but
+ cannot remove it altogether, for the sake of the existing usage, so it
+ gets moved to contrib.
--- /dev/null
+#!/usr/local/bin/perl -wT
+#
+# W3C Link Checker
+# by Hugo Haas <hugo@w3.org>
+# (c) 1999-2011 World Wide Web Consortium
+# based on Renaud Bruyeron's checklink.pl
+#
+# This program is licensed under the W3C(r) Software License:
+# http://www.w3.org/Consortium/Legal/copyright-software
+#
+# The documentation is at:
+# http://validator.w3.org/docs/checklink.html
+#
+# See the Mercurial interface at:
+# http://dvcs.w3.org/hg/link-checker/
+#
+# An online version is available at:
+# http://validator.w3.org/checklink
+#
+# Comments and suggestions should be sent to the www-validator mailing list:
+# www-validator@w3.org (with 'checklink' in the subject)
+# http://lists.w3.org/Archives/Public/www-validator/ (archives)
+#
+# Small modifications in March 2020 by Karl Berry <karl@freefriends.org>
+# (contributed under the same license, or public domain if you prefer).
+# I started from https://metacpan.org/release/W3C-LinkChecker, version 4.81.
+# - (&simple_request) ignore "Argument isn't numeric" warnings.
+# - (%Opts, &check_uri) new option --exclude-url-file; see --help message.
+# - (&parse_arguments) allow multiple -X options.
+# - (&check_uri) missing argument to hprintf.
+# - (&hprintf) avoid useless warnings when undef is returned.
+# The ideas are (1) to avoid rechecking every url during development,
+# and (2) to make the exclude list easier to maintain,
+# and (3) to eliminate useless warnings from the code.
+#
+# For GNU Automake, this program is used by the checklinkx target
+# in doc/local.mk to check the (HTML output of the) Automake manual.
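+#
+# Illustrative usage (an assumption about how the checklinkx target might
+# invoke this script; the authoritative recipe lives in doc/local.mk):
+#   perl contrib/checklinkx --quiet --summary \
+#        --exclude-url-file=/tmp/automake-checklinkx.exclude \
+#        doc/automake.html
+# On the first run the exclude file does not exist, so every url that
+# answers with a 200 is written to it; subsequent runs skip those urls
+# until the file is deleted.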
+
+use strict;
+use 5.008;
+
+# Get rid of potentially unsafe and unneeded environment variables.
+delete(@ENV{qw(IFS CDPATH ENV BASH_ENV)});
+$ENV{PATH} = undef;
+
+# ...but we want PERL5?LIB honored even in taint mode, see perlsec, perl5lib,
+# http://www.mail-archive.com/cpan-testers-discuss%40perl.org/msg01064.html
+use Config qw(%Config);
+use lib map { /(.*)/ }
+ defined($ENV{PERL5LIB}) ? split(/$Config{path_sep}/, $ENV{PERL5LIB}) :
+ defined($ENV{PERLLIB}) ? split(/$Config{path_sep}/, $ENV{PERLLIB}) :
+ ();
+
+# -----------------------------------------------------------------------------
+
+package W3C::UserAgent;
+
+use LWP::RobotUA 1.19 qw();
+use LWP::UserAgent qw();
+use Net::HTTP::Methods 5.833 qw(); # >= 5.833 for 4kB cookies (#6678)
+
+# if 0, ignore robots exclusion (useful for testing)
+use constant USE_ROBOT_UA => 1;
+
+if (USE_ROBOT_UA) {
+ @W3C::UserAgent::ISA = qw(LWP::RobotUA);
+}
+else {
+ @W3C::UserAgent::ISA = qw(LWP::UserAgent);
+}
+
+sub new
+{
+ my $proto = shift;
+ my $class = ref($proto) || $proto;
+ my ($name, $from, $rules) = @_;
+
+ # For security/privacy reasons, if $from was not given, do not send it.
+ # Cheat by defining something for the constructor, and resetting it later.
+ my $from_ok = $from;
+ $from ||= 'www-validator@w3.org';
+
+ my $self;
+ if (USE_ROBOT_UA) {
+ $self = $class->SUPER::new($name, $from, $rules);
+ }
+ else {
+ my %cnf;
+ @cnf{qw(agent from)} = ($name, $from);
+ $self = LWP::UserAgent->new(%cnf);
+ $self = bless $self, $class;
+ }
+
+ $self->from(undef) unless $from_ok;
+
+ $self->env_proxy();
+
+ $self->allow_private_ips(1);
+
+ $self->protocols_forbidden([qw(mailto javascript)]);
+
+ return $self;
+}
+
+sub allow_private_ips
+{
+ my $self = shift;
+ if (@_) {
+ $self->{Checklink_allow_private_ips} = shift;
+ if (!$self->{Checklink_allow_private_ips}) {
+
+ # Pull in dependencies
+ require Net::IP;
+ require Socket;
+ require Net::hostent;
+ }
+ }
+ return $self->{Checklink_allow_private_ips};
+}
+
+sub redirect_progress_callback
+{
+ my $self = shift;
+ $self->{Checklink_redirect_callback} = shift if @_;
+ return $self->{Checklink_redirect_callback};
+}
+
+sub simple_request
+{
+ my $self = shift;
+
+ my $response = $self->ip_disallowed($_[0]->uri());
+
+ # RFC 2616, section 15.1.3
+ $_[0]->remove_header("Referer")
+ if ($_[0]->referer() &&
+ (!$_[0]->uri()->secure() && URI->new($_[0]->referer())->secure()));
+
+ $response ||= do {
+ local $SIG{__WARN__} =
+ sub { # Suppress RobotRules warnings, rt.cpan.org #18902
+ # Suppress "Argument isn't numeric" warnings, see below.
+ warn($_[0])
+ if ($_[0]
+ && $_[0] !~ /^RobotRules/
+ && $_[0] !~ /^Argument .* isn't numeric.*Response\.pm/
+ );
+ };
+
+ # @@@ Why not just $self->SUPER::simple_request? [--unknown]
+ # --- Indeed. Further, why use simple_request in the first place?
+ # It is not part of the UserAgent UI. I believe this can result
+ # in warnings like:
+ # Argument "0, 0, 0, 0" isn't numeric in numeric gt (>) at
+ # /usr/local/lib/perl5/site_perl/5.30.2/HTTP/Response.pm line 261.
+ # when checking, e.g.,
+ # https://metacpan.org/pod/distribution/Test-Harness/bin/prove
+ # For testing, here is a three-line html file to check that url:
+ # <html><head><title>X</title></head><body>
+ # <p><a href="https://metacpan.org/pod/release/MSCHWERN/Test-Simple-0.98_05/lib/Test/More.pm">prove</a></p>
+ # </body></html>
+ # I have been unable to reproduce the warning with a test program
+ # checking that url using $ua->request(), or other UserAgent
+ # functions, even after carefully reproducing all the headers
+ # that checklink sends in the request. --karl@freefriends.org.
+
+ $self->W3C::UserAgent::SUPER::simple_request(@_);
+ };
+
+ if (!defined($self->{FirstResponse})) {
+ $self->{FirstResponse} = $response->code();
+ $self->{FirstMessage} = $response->message() || '(no message)';
+ }
+
+ return $response;
+}
+
+sub redirect_ok
+{
+ my ($self, $request, $response) = @_;
+
+ if (my $callback = $self->redirect_progress_callback()) {
+
+ # @@@ TODO: when an LWP-internal robots.txt request gets redirected,
+ # this will, a bit confusingly, fire for it too. We would need a
+ # robust way to determine whether the request is such an LWP
+ # "internal robots.txt" one.
+ &$callback($request->method(), $request->uri());
+ }
+
+ return 0 unless $self->SUPER::redirect_ok($request, $response);
+
+ if (my $res = $self->ip_disallowed($request->uri())) {
+ $response->previous($response->clone());
+ $response->request($request);
+ $response->code($res->code());
+ $response->message($res->message());
+ return 0;
+ }
+
+ return 1;
+}
+
+#
+# Checks whether we're allowed to retrieve the document based on its IP
+# address. Takes a URI object and returns an HTTP::Response containing the
+# appropriate status and error message if the IP was disallowed, or 0
+# otherwise. URIs without a hostname or IP address are always allowed,
+# including schemes where those make no sense (e.g. data:, often javascript:).
+#
+sub ip_disallowed
+{
+ my ($self, $uri) = @_;
+ return 0 if $self->allow_private_ips(); # Short-circuit
+
+ my $hostname = undef;
+ eval { $hostname = $uri->host() }; # Not all URIs implement host()...
+ return 0 unless $hostname;
+
+ my $addr = my $iptype = my $resp = undef;
+ if (my $host = Net::hostent::gethostbyname($hostname)) {
+ $addr = Socket::inet_ntoa($host->addr()) if $host->addr();
+ if ($addr && (my $ip = Net::IP->new($addr))) {
+ $iptype = $ip->iptype();
+ }
+ }
+ if ($iptype && $iptype ne 'PUBLIC') {
+ $resp = HTTP::Response->new(403,
+ 'Checking non-public IP address disallowed by link checker configuration'
+ );
+ $resp->header('Client-Warning', 'Internal response');
+ }
+ return $resp;
+}
+
+# -----------------------------------------------------------------------------
+
+package W3C::LinkChecker;
+
+use vars qw($AGENT $PACKAGE $PROGRAM $VERSION $REVISION
+ $DocType $Head $Accept $ContentTypes %Cfg $CssUrl);
+
+use CSS::DOM 0.09 qw(); # >= 0.09 for many bugfixes
+use CSS::DOM::Constants qw(:rule);
+use CSS::DOM::Style qw();
+use CSS::DOM::Util qw();
+use Encode qw();
+use HTML::Entities qw();
+use HTML::Parser 3.40 qw(); # >= 3.40 for utf8_mode()
+use HTTP::Headers::Util qw();
+use HTTP::Message 5.827 qw(); # >= 5.827 for content_charset()
+use HTTP::Request 5.814 qw(); # >= 5.814 for accept_decodable()
+use HTTP::Response 1.50 qw(); # >= 1.50 for decoded_content()
+use Time::HiRes qw();
+use URI 1.53 qw(); # >= 1.53 for secure()
+use URI::Escape qw();
+use URI::Heuristic qw();
+
+# @@@ Needs also W3C::UserAgent but can't use() it here.
+
+use constant RC_ROBOTS_TXT => -1;
+use constant RC_DNS_ERROR => -2;
+use constant RC_IP_DISALLOWED => -3;
+use constant RC_PROTOCOL_DISALLOWED => -4;
+
+use constant LINE_UNKNOWN => -1;
+
+use constant MP2 =>
+ (exists($ENV{MOD_PERL_API_VERSION}) && $ENV{MOD_PERL_API_VERSION} >= 2);
+
+# Tag=>attribute mapping of things we treat as links.
+# Note: meta/@http-equiv gets special treatment, see start() for details.
+use constant LINK_ATTRS => {
+ a => ['href'],
+
+ # base/@href intentionally not checked
+ # http://www.w3.org/mid/200802091439.27764.ville.skytta%40iki.fi
+ area => ['href'],
+ audio => ['src'],
+ blockquote => ['cite'],
+ body => ['background'],
+ command => ['icon'],
+
+ # button/@formaction not checked (side effects)
+ del => ['cite'],
+
+ # @pluginspage, @pluginurl, @href: pre-HTML5 proprietary
+ embed => ['href', 'pluginspage', 'pluginurl', 'src'],
+
+ # form/@action not checked (side effects)
+ frame => ['longdesc', 'src'],
+ html => ['manifest'],
+ iframe => ['longdesc', 'src'],
+ img => ['longdesc', 'src'],
+
+ # input/@action, input/@formaction not checked (side effects)
+ input => ['src'],
+ ins => ['cite'],
+ link => ['href'],
+ object => ['data'],
+ q => ['cite'],
+ script => ['src'],
+ source => ['src'],
+ track => ['src'],
+ video => ['src', 'poster'],
+};
+
+# Tag=>[separator, attributes] mapping of things we treat as lists of links.
+use constant LINK_LIST_ATTRS => {
+ a => [qr/\s+/, ['ping']],
+ applet => [qr/[\s,]+/, ['archive']],
+ area => [qr/\s+/, ['ping']],
+ head => [qr/\s+/, ['profile']],
+ object => [qr/\s+/, ['archive']],
+};
+
+# TBD/TODO:
+# - applet/@code?
+# - bgsound/@src?
+# - object/@classid?
+# - isindex/@action?
+# - layer/@background,@src?
+# - ilayer/@background?
+# - table,tr,td,th/@background?
+# - xmp/@href?
+
+@W3C::LinkChecker::ISA = qw(HTML::Parser);
+
+BEGIN {
+
+ # Version info
+ $PACKAGE = 'W3C Link Checker';
+ $PROGRAM = 'W3C-checklink';
+ $VERSION = '4.81';
+ $REVISION = sprintf('version %s (c) 1999-2011 W3C', $VERSION);
+ $AGENT = sprintf(
+ '%s/%s %s',
+ $PROGRAM, $VERSION,
+ ( W3C::UserAgent::USE_ROBOT_UA ? LWP::RobotUA->_agent() :
+ LWP::UserAgent->_agent()
+ )
+ );
+
+ # Pull in mod_perl modules if applicable.
+ eval {
+ local $SIG{__DIE__} = undef;
+ require Apache2::RequestUtil;
+ } if MP2();
+
+ my @content_types = qw(
+ text/html
+ application/xhtml+xml;q=0.9
+ application/vnd.wap.xhtml+xml;q=0.6
+ );
+ $Accept = join(', ', @content_types, '*/*;q=0.5');
+ push(@content_types, 'text/css', 'text/html-sandboxed');
+ my $re = join('|', map { s/;.*//; quotemeta } @content_types);
+ $ContentTypes = qr{\b(?:$re)\b}io;
+
+ # Regexp for matching URL values in CSS.
+ $CssUrl = qr/(?:\s|^)url\(\s*(['"]?)(.*?)\1\s*\)(?=\s|$)/;
+
+ #
+ # Read configuration. If the W3C_CHECKLINK_CFG environment variable has
+ # been set or the default contains a non-empty file, read it. Otherwise,
+ # skip silently.
+ #
+ my $defaultconfig = '/etc/w3c/checklink.conf';
+ if ($ENV{W3C_CHECKLINK_CFG} || -s $defaultconfig) {
+
+ require Config::General;
+ Config::General->require_version(2.06); # Need 2.06 for -SplitPolicy
+
+ my $conffile = $ENV{W3C_CHECKLINK_CFG} || $defaultconfig;
+ eval {
+ my %config_opts = (
+ -ConfigFile => $conffile,
+ -SplitPolicy => 'equalsign',
+ -AllowMultiOptions => 'no',
+ );
+ %Cfg = Config::General->new(%config_opts)->getall();
+ };
+ if ($@) {
+ die <<"EOF";
+Failed to read configuration from '$conffile':
+$@
+EOF
+ }
+ }
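+
+ # A hypothetical configuration file accepted by the block above
+ # (Config::General with -SplitPolicy 'equalsign', i.e. "key = value"
+ # lines; these option names are the ones consulted via %Cfg below):
+ #   Markup_Validator_URI = http://validator.w3.org/check?uri=%s
+ #   Doc_URI = http://validator.w3.org/docs/checklink.html
+ #   Allow_Private_IPs = 1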
+ $Cfg{Markup_Validator_URI} ||= 'http://validator.w3.org/check?uri=%s';
+ $Cfg{CSS_Validator_URI} ||=
+ 'http://jigsaw.w3.org/css-validator/validator?uri=%s';
+ $Cfg{Doc_URI} ||= 'http://validator.w3.org/docs/checklink.html';
+
+ # Untaint config params that are used as the format argument to (s)printf(),
+ # Perl 5.10 does not want to see that in taint mode.
+ ($Cfg{Markup_Validator_URI}) = ($Cfg{Markup_Validator_URI} =~ /^(.*)$/);
+ ($Cfg{CSS_Validator_URI}) = ($Cfg{CSS_Validator_URI} =~ /^(.*)$/);
+
+ $DocType =
+ '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">';
+ my $css_url = URI->new_abs('linkchecker.css', $Cfg{Doc_URI});
+ my $js_url = URI->new_abs('linkchecker.js', $Cfg{Doc_URI});
+ $Head =
+ sprintf(<<'EOF', HTML::Entities::encode($AGENT), $css_url, $js_url);
+<meta http-equiv="Content-Script-Type" content="text/javascript" />
+<meta name="generator" content="%s" />
+<link rel="stylesheet" type="text/css" href="%s" />
+<script type="text/javascript" src="%s"></script>
+EOF
+
+ # Trusted environment variables that need laundering in taint mode.
+ for (qw(NNTPSERVER NEWSHOST)) {
+ ($ENV{$_}) = ($ENV{$_} =~ /^(.*)$/) if $ENV{$_};
+ }
+
+ # Use passive FTP by default, see Net::FTP(3).
+ $ENV{FTP_PASSIVE} = 1 unless exists($ENV{FTP_PASSIVE});
+}
+
+# Autoflush
+$| = 1;
+
+# Different options specified by the user
+my $cmdline = !($ENV{GATEWAY_INTERFACE} && $ENV{GATEWAY_INTERFACE} =~ /^CGI/);
+my %Opts = (
+ Command_Line => $cmdline,
+ Quiet => 0,
+ Summary_Only => 0,
+ Verbose => 0,
+ Progress => 0,
+ HTML => 0,
+ Timeout => 30,
+ Redirects => 1,
+ Dir_Redirects => 1,
+ Accept_Language => $cmdline ? undef : $ENV{HTTP_ACCEPT_LANGUAGE},
+ Cookies => undef,
+ No_Referer => 0,
+ Hide_Same_Realm => 0,
+ Depth => 0, # < 0 means unlimited recursion.
+ Sleep_Time => 1,
+ Connection_Cache_Size => 2,
+ Max_Documents => 150, # For the online version.
+ User => undef,
+ Password => undef,
+ Base_Locations => [],
+ Exclude => undef,
+ Exclude_Docs => undef,
+ Exclude_Url_File => undef,
+ Suppress_Redirect => [],
+ Suppress_Redirect_Prefix => [],
+ Suppress_Redirect_Regexp => [],
+ Suppress_Temp_Redirects => 1,
+ Suppress_Broken => [],
+ Suppress_Fragment => [],
+ Masquerade => 0,
+ Masquerade_From => '',
+ Masquerade_To => '',
+ Trusted => $Cfg{Trusted},
+ Allow_Private_IPs => defined($Cfg{Allow_Private_IPs}) ?
+ $Cfg{Allow_Private_IPs} :
+ $cmdline,
+);
+undef $cmdline;
+
+# Global variables
+# What URI's did we process? (used for recursive mode)
+my %processed;
+
+# Result of the HTTP query
+my %results;
+
+# List of redirects
+my %redirects;
+
+# Count of the number of documents checked
+my $doc_count = 0;
+
+# Time stamp
+my $timestamp = &get_timestamp();
+
+# Per-document header; undefined if already printed. See print_doc_header().
+my $doc_header;
+
+&parse_arguments() if $Opts{Command_Line};
+
+my $ua = W3C::UserAgent->new($AGENT); # @@@ TODO: admin address
+
+$ua->conn_cache({total_capacity => $Opts{Connection_Cache_Size}});
+if ($ua->can('delay')) {
+ $ua->delay($Opts{Sleep_Time} / 60);
+}
+$ua->timeout($Opts{Timeout});
+
+# Set up cookie stash if requested
+if (defined($Opts{Cookies})) {
+ require HTTP::Cookies;
+ my $cookie_file = $Opts{Cookies};
+ if ($cookie_file eq 'tmp') {
+ $cookie_file = undef;
+ }
+ elsif ($cookie_file =~ /^(.*)$/) {
+ $cookie_file = $1; # untaint
+ }
+ $ua->cookie_jar(HTTP::Cookies->new(file => $cookie_file, autosave => 1));
+}
+eval { $ua->allow_private_ips($Opts{Allow_Private_IPs}); };
+if ($@) {
+ die <<"EOF";
+Allow_Private_IPs is false; this feature requires the Net::IP, Socket, and
+Net::hostent modules:
+$@
+EOF
+}
+
+# Add configured forbidden protocols
+if ($Cfg{Forbidden_Protocols}) {
+ my $forbidden = $ua->protocols_forbidden();
+ push(@$forbidden, split(/[,\s]+/, lc($Cfg{Forbidden_Protocols})));
+ $ua->protocols_forbidden($forbidden);
+}
+
+if ($Opts{Command_Line}) {
+
+ require Text::Wrap;
+ Text::Wrap->import('wrap');
+
+ require URI::file;
+
+ &usage(1) unless scalar(@ARGV);
+
+ $Opts{_Self_URI} = 'http://validator.w3.org/checklink'; # For HTML output
+
+ &ask_password() if ($Opts{User} && !$Opts{Password});
+
+ if (!$Opts{Summary_Only}) {
+ printf("%s %s\n", $PACKAGE, $REVISION) unless $Opts{HTML};
+ }
+ else {
+ $Opts{Verbose} = 0;
+ $Opts{Progress} = 0;
+ }
+
+ # Populate data for print_form()
+ my %params = (
+ summary => $Opts{Summary_Only},
+ hide_redirects => !$Opts{Redirects},
+ hide_type => $Opts{Dir_Redirects} ? 'dir' : 'all',
+ no_accept_language => !(
+ defined($Opts{Accept_Language}) && $Opts{Accept_Language} eq 'auto'
+ ),
+ no_referer => $Opts{No_Referer},
+ recursive => ($Opts{Depth} != 0),
+ depth => $Opts{Depth},
+ );
+
+ my $check_num = 1;
+ my @bases = @{$Opts{Base_Locations}};
+ for my $uri (@ARGV) {
+
+ # Reset base locations so that previous URI's given on the command line
+ # won't affect the recursion scope for this URI (see check_uri())
+ @{$Opts{Base_Locations}} = @bases;
+
+ # Transform the parameter into a URI
+ $uri = &urize($uri);
+ $params{uri} = $uri;
+ &check_uri(\%params, $uri, $check_num, $Opts{Depth}, undef, undef, 1);
+ $check_num++;
+ }
+ undef $check_num;
+
+ if ($Opts{HTML}) {
+ &html_footer();
+ }
+ elsif ($doc_count > 0 && !$Opts{Summary_Only}) {
+ printf("\n%s\n", &global_stats());
+ }
+
+}
+else {
+
+ require CGI;
+ require CGI::Carp;
+ CGI::Carp->import(qw(fatalsToBrowser));
+ require CGI::Cookie;
+
+ # file: URIs are not allowed in CGI mode
+ my $forbidden = $ua->protocols_forbidden();
+ push(@$forbidden, 'file');
+ $ua->protocols_forbidden($forbidden);
+
+ my $query = CGI->new();
+
+ for my $param ($query->param()) {
+ my @values = map { Encode::decode_utf8($_) } $query->param($param);
+ $query->param($param, @values);
+ }
+
+ # Set a few parameters in CGI mode
+ $Opts{Verbose} = 0;
+ $Opts{Progress} = 0;
+ $Opts{HTML} = 1;
+ $Opts{_Self_URI} = $query->url(-relative => 1);
+
+ # Backwards compatibility
+ my $uri = undef;
+ if ($uri = $query->param('url')) {
+ $query->param('uri', $uri) unless $query->param('uri');
+ $query->delete('url');
+ }
+ $uri = $query->param('uri');
+
+ if (!$uri) {
+ &html_header('', undef); # Set cookie only from results page.
+ my %cookies = CGI::Cookie->fetch();
+ &print_form(scalar($query->Vars()), $cookies{$PROGRAM}, 1);
+ &html_footer();
+ exit;
+ }
+
+ # Backwards compatibility
+ if ($query->param('hide_dir_redirects')) {
+ $query->param('hide_redirects', 'on');
+ $query->param('hide_type', 'dir');
+ $query->delete('hide_dir_redirects');
+ }
+
+ $Opts{Summary_Only} = 1 if $query->param('summary');
+
+ if ($query->param('hide_redirects')) {
+ $Opts{Dir_Redirects} = 0;
+ if (my $type = $query->param('hide_type')) {
+ $Opts{Redirects} = 0 if ($type ne 'dir');
+ }
+ else {
+ $Opts{Redirects} = 0;
+ }
+ }
+
+ $Opts{Accept_Language} = undef if $query->param('no_accept_language');
+ $Opts{No_Referer} = $query->param('no_referer');
+
+ $Opts{Depth} = -1 if ($query->param('recursive') && $Opts{Depth} == 0);
+ if (my $depth = $query->param('depth')) {
+
+ # @@@ Ignore invalid depth silently for now.
+ $Opts{Depth} = $1 if ($depth =~ /(-?\d+)/);
+ }
+
+ # Save, clear or leave cookie as is.
+ my $cookie = undef;
+ if (my $action = $query->param('cookie')) {
+ if ($action eq 'clear') {
+
+ # Clear the cookie.
+ $cookie = CGI::Cookie->new(-name => $PROGRAM);
+ $cookie->value({clear => 1});
+ $cookie->expires('-1M');
+ }
+ elsif ($action eq 'set') {
+
+ # Set the options.
+ $cookie = CGI::Cookie->new(-name => $PROGRAM);
+ my %options = $query->Vars();
+ delete($options{$_})
+ for qw(url uri check cookie); # Non-persistent.
+ $cookie->value(\%options);
+ }
+ }
+ if (!$cookie) {
+ my %cookies = CGI::Cookie->fetch();
+ $cookie = $cookies{$PROGRAM};
+ }
+
+ # Always refresh cookie expiration time.
+ $cookie->expires('+1M') if ($cookie && !$cookie->expires());
+
+ # Not all Apache configurations set HTTP_AUTHORIZATION for CGI scripts.
+ # If we're under mod_perl, there is a way around it...
+ eval {
+ local $SIG{__DIE__} = undef;
+ my $auth =
+ Apache2::RequestUtil->request()->headers_in()->{Authorization};
+ $ENV{HTTP_AUTHORIZATION} = $auth if $auth;
+ } if (MP2() && !$ENV{HTTP_AUTHORIZATION});
+
+ $uri =~ s/^\s+//g;
+ if ($uri =~ /:/) {
+ $uri = URI->new($uri);
+ }
+ else {
+ if ($uri =~ m|^//|) {
+ $uri = URI->new("http:$uri");
+ }
+ else {
+ local $ENV{URL_GUESS_PATTERN} = '';
+ my $guess = URI::Heuristic::uf_uri($uri);
+ if ($guess->scheme() && $ua->is_protocol_supported($guess)) {
+ $uri = $guess;
+ }
+ else {
+ $uri = URI->new("http://$uri");
+ }
+ }
+ }
+ $uri = $uri->canonical();
+ $query->param("uri", $uri);
+
+ &check_uri(scalar($query->Vars()), $uri, 1, $Opts{Depth}, $cookie);
+ undef $query; # Not needed any more.
+ &html_footer();
+}
+
+###############################################################################
+
+################################
+# Command line and usage stuff #
+################################
+
+sub parse_arguments ()
+{
+ require Encode::Locale;
+ Encode::Locale::decode_argv();
+
+ require Getopt::Long;
+ Getopt::Long->require_version(2.17);
+ Getopt::Long->import('GetOptions');
+ Getopt::Long::Configure('bundling', 'no_ignore_case');
+ my $masq = '';
+ my @locs = ();
+
+ GetOptions(
+ 'help|h|?' => sub { usage(0) },
+ 'q|quiet' => sub {
+ $Opts{Quiet} = 1;
+ $Opts{Summary_Only} = 1;
+ },
+ 's|summary' => \$Opts{Summary_Only},
+ 'b|broken' => sub {
+ $Opts{Redirects} = 0;
+ $Opts{Dir_Redirects} = 0;
+ },
+ 'e|dir-redirects' => sub { $Opts{Dir_Redirects} = 0; },
+ 'v|verbose' => \$Opts{Verbose},
+ 'i|indicator' => \$Opts{Progress},
+ 'H|html' => \$Opts{HTML},
+ 'r|recursive' => sub {
+ $Opts{Depth} = -1
+ if $Opts{Depth} == 0;
+ },
+ 'l|location=s' => \@locs,
+ 'X|exclude=s@' => \@{$Opts{Exclude}},
+ 'exclude-docs=s@' => \@{$Opts{Exclude_Docs}},
+ 'exclude-url-file=s' => \$Opts{Exclude_Url_File},
+ 'suppress-redirect=s@' => \@{$Opts{Suppress_Redirect}},
+ 'suppress-redirect-prefix=s@' => \@{$Opts{Suppress_Redirect_Prefix}},
+ 'suppress-temp-redirects' => \$Opts{Suppress_Temp_Redirects},
+ 'suppress-broken=s@' => \@{$Opts{Suppress_Broken}},
+ 'suppress-fragment=s@' => \@{$Opts{Suppress_Fragment}},
+ 'u|user=s' => \$Opts{User},
+ 'p|password=s' => \$Opts{Password},
+ 't|timeout=i' => \$Opts{Timeout},
+ 'C|connection-cache=i' => \$Opts{Connection_Cache_Size},
+ 'S|sleep=i' => \$Opts{Sleep_Time},
+ 'L|languages=s' => \$Opts{Accept_Language},
+ 'c|cookies=s' => \$Opts{Cookies},
+ 'R|no-referer' => \$Opts{No_Referer},
+ 'D|depth=i' => sub {
+ $Opts{Depth} = $_[1]
+ unless $_[1] == 0;
+ },
+ 'd|domain=s' => \$Opts{Trusted},
+ 'masquerade=s' => \$masq,
+ 'hide-same-realm' => \$Opts{Hide_Same_Realm},
+ 'V|version' => \&version,
+ ) ||
+ usage(1);
+
+ if ($masq) {
+ $Opts{Masquerade} = 1;
+ my @masq = split(/\s+/, $masq);
+ if (scalar(@masq) != 2 ||
+ !defined($masq[0]) ||
+ $masq[0] !~ /\S/ ||
+ !defined($masq[1]) ||
+ $masq[1] !~ /\S/)
+ {
+ usage(1,
+ "Error: --masquerade takes two whitespace separated URIs.");
+ }
+ else {
+ require URI::file;
+ $Opts{Masquerade_From} = $masq[0];
+ my $u = URI->new($masq[1]);
+ $Opts{Masquerade_To} =
+ $u->scheme() ? $u : URI::file->new_abs($masq[1]);
+ }
+ }
+
+ if ($Opts{Accept_Language} && $Opts{Accept_Language} eq 'auto') {
+ $Opts{Accept_Language} = &guess_language();
+ }
+
+ if (($Opts{Sleep_Time} || 0) < 1) {
+ warn(
+ "*** Warning: minimum allowed sleep time is 1 second, resetting.\n"
+ );
+ $Opts{Sleep_Time} = 1;
+ }
+
+ push(@{$Opts{Base_Locations}}, map { URI->new($_)->canonical() } @locs);
+
+ $Opts{Depth} = -1 if ($Opts{Depth} == 0 && @locs);
+
+ for my $i (0 .. $#{$Opts{Exclude_Docs}}) {
+ eval { $Opts{Exclude_Docs}->[$i] = qr/$Opts{Exclude_Docs}->[$i]/; };
+ &usage(1, "Error in exclude-docs regexp: $@") if $@;
+ }
+ if (defined($Opts{Trusted})) {
+ eval { $Opts{Trusted} = qr/$Opts{Trusted}/io; };
+ &usage(1, "Error in trusted domains regexp: $@") if $@;
+ }
+
+ # Sanity-check error-suppression arguments
+ for my $i (0 .. $#{$Opts{Suppress_Redirect}}) {
+ ${$Opts{Suppress_Redirect}}[$i] =~ s/ /->/;
+ my $sr_arg = ${$Opts{Suppress_Redirect}}[$i];
+ if ($sr_arg !~ /.->./) {
+ &usage(1,
+ "Bad suppress-redirect argument, should contain \"->\": $sr_arg"
+ );
+ }
+ }
+ for my $i (0 .. $#{$Opts{Suppress_Redirect_Prefix}}) {
+ my $srp_arg = ${$Opts{Suppress_Redirect_Prefix}}[$i];
+ $srp_arg =~ s/ /->/;
+ if ($srp_arg !~ /^(.*)->(.*)$/) {
+ &usage(1,
+ "Bad suppress-redirect-prefix argument, should contain \"->\": $srp_arg"
+ );
+ }
+
+ # Turn prefixes into a regexp.
+ ${$Opts{Suppress_Redirect_Prefix}}[$i] = qr/^\Q$1\E(.*)->\Q$2\E\1$/ism;
+ }
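+
+ # Illustrative only (an assumption about how the compiled pattern is
+ # applied later): with
+ #   --suppress-redirect-prefix "http://old.example/ https://new.example/"
+ # the space becomes "->", and the resulting pattern matches strings like
+ #   http://old.example/foo->https://new.example/foo
+ # i.e. a redirect from a child of the first URI to the same child of the
+ # second, which will then not be reported.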
+ for my $i (0 .. $#{$Opts{Suppress_Broken}}) {
+ ${$Opts{Suppress_Broken}}[$i] =~ s/ /:/;
+ my $sb_arg = ${$Opts{Suppress_Broken}}[$i];
+ if ($sb_arg !~ /^(-1|[0-9]+):./) {
+ &usage(1,
+ "Bad suppress-broken argument, should be prefixed by a numeric response code: $sb_arg"
+ );
+ }
+ }
+ for my $sf_arg (@{$Opts{Suppress_Fragment}}) {
+ if ($sf_arg !~ /.#./) {
+ &usage(1,
+ "Bad suppress-fragment argument, should contain \"#\": $sf_arg"
+ );
+ }
+ }
+
+ if ($#{$Opts{Exclude}} > 0) {
+ # convert $Opts{Exclude} array into regexp by parenthesizing
+ # each and inserting alternations between.
+ my $exclude_rx = join("|", map { "($_)" } @{$Opts{Exclude}});
+ #
+ # For the sake of the rest of the program, pretend the option
+ # was that string all along.
+ $Opts{Exclude} = $exclude_rx;
+ }
+
+ if ($Opts{Exclude_Url_File}) {
+ # The idea is that if the specified file exists, we read it and
+ # treat it as a list of excludes. If the file doesn't exist, we
+ # write it with all the urls that were successful. That way, we
+ # can avoid re-checking them on every run, and the file can be
+ # removed externally (e.g. from cron) so the list gets rebuilt.
+ #
+ # We distinguish the cases here, and either add to
+ # $Opts{Exclude} if reading, or set Exclude_File_Write in
+ # %Opts if writing (even though it is not really an option,
+ # it's the most convenient place).
+ if (-s $Opts{Exclude_Url_File}) {
+ open (my $xf, "$Opts{Exclude_Url_File}")
+ || &usage(1, "Could not open $Opts{Exclude_Url_File}"
+ . " for reading: $!");
+ my @xf = ();
+ while (<$xf>) {
+ chomp;
+ # the file is urls, not regexps, so quotemeta.
+ push (@xf, "(" . quotemeta($_) . ")");
+ }
+ my $xf_rx = join ("|", @xf);
+ if ($Opts{Exclude}) {
+ $Opts{Exclude} .= "|$xf_rx";
+ } else {
+ $Opts{Exclude} = $xf_rx;
+ }
+ } else {
+ open ($Opts{Exclude_File_Write}, ">$Opts{Exclude_Url_File}")
+ || &usage(1,
+ "Could not open $Opts{Exclude_Url_File} for writing: $!");
+ # we write on a successful retrieve, and don't bother closing.
+ }
+ }
+
+ # Precompile/error-check final list of regular expressions
+ if (defined($Opts{Exclude})) {
+ eval { $Opts{Exclude} = qr/$Opts{Exclude}/o; };
+ &usage(1, "Error in exclude regexp $Opts{Exclude}: $@") if $@;
+ }
+
+ return;
+}
+
+sub version ()
+{
+ print "$PACKAGE $REVISION\n";
+ exit 0;
+}
+
+sub usage ()
+{
+ my ($exitval, $msg) = @_;
+ $exitval = 0 unless defined($exitval);
+ $msg ||= '';
+ $msg =~ s/[\r\n]*$/\n\n/ if $msg;
+
+ die($msg) unless $Opts{Command_Line};
+
+ my $trust = defined($Cfg{Trusted}) ? $Cfg{Trusted} : 'same host only';
+
+ select(STDERR) if $exitval;
+ print "$msg$PACKAGE $REVISION
+
+Usage: checklink <options> <uris>
+Options:
+ -s, --summary Result summary only.
+ -b, --broken Show only the broken links, not the redirects.
+ -e, --dir-redirects Hide directory redirects, for example
+ http://www.w3.org/TR -> http://www.w3.org/TR/
+ -r, --recursive Check the documents linked from the first one.
+ -D, --depth N Check the documents linked from the first one to
+ depth N (implies --recursive).
+ -l, --location URI Scope of the documents checked in recursive mode
+ (implies --recursive). Can be specified multiple
+ times. If not specified, the default eg. for
+ http://www.w3.org/TR/html4/Overview.html
+ would be http://www.w3.org/TR/html4/
+ -X, --exclude REGEXP Do not check links whose full, canonical URIs
+ match REGEXP; also limits recursion the same way
+ as --exclude-docs with the same regexp would.
+ This option may be specified multiple times.
+ --exclude-docs REGEXP In recursive mode, do not check links in documents
+ whose full, canonical URIs match REGEXP. This
+ option may be specified multiple times.
+ --exclude-url-file FILE If FILE exists, treat each line as a string
+ specifying another exclude; quotemeta is called
+ to make them regexps. If FILE does not exist,
+ open it for writing and write each checked url
+ which gets a 200 response to it.
+ --suppress-redirect URI->URI Do not report a redirect from the first to the
+ second URI. This option may be specified multiple
+ times.
+ --suppress-redirect-prefix URI->URI Do not report a redirect from a child of
+ the first URI to the same child of the second URI.
+ This option may be specified multiple times.
+ --suppress-temp-redirects Suppress warnings about temporary redirects.
+ --suppress-broken CODE:URI Do not report a broken link with the given CODE.
+ CODE is HTTP response, or -1 for robots exclusion.
+ This option may be specified multiple times.
+ --suppress-fragment URI Do not report the given broken fragment URI.
+ A fragment URI contains \"#\". This option may be
+ specified multiple times.
+ -L, --languages LANGS Accept-Language header to send. The special value
+ 'auto' causes autodetection from the environment.
+ -c, --cookies FILE Use cookies, load/save them in FILE. The special
+ value 'tmp' causes non-persistent use of cookies.
+ -R, --no-referer Do not send the Referer HTTP header.
+ -q, --quiet No output if no errors are found (implies -s).
+ -v, --verbose Verbose mode.
+ -i, --indicator Show percentage of lines processed while parsing.
+ -u, --user USERNAME Specify a username for authentication.
+ -p, --password PASSWORD Specify a password.
+ --hide-same-realm Hide 401's that are in the same realm as the
+ document checked.
+ -S, --sleep SECS Sleep SECS seconds between requests to each server
+ (default and minimum: 1 second).
+ -t, --timeout SECS Timeout for requests in seconds (default: 30).
+ -d, --domain DOMAIN Regular expression describing the domain to which
+ authentication information will be sent
+ (default: $trust).
+ --masquerade \"BASE1 BASE2\" Masquerade base URI BASE1 as BASE2. See the
+ manual page for more information.
+ -H, --html HTML output.
+ -?, -h, --help Show this message and exit.
+ -V, --version Output version information and exit.
+
+See \"perldoc LWP\" for information about proxy server support,
+\"perldoc Net::FTP\" for information about various environment variables
+affecting FTP connections and \"perldoc Net::NNTP\" for setting a default
+NNTP server for news: URIs.
+
+The W3C_CHECKLINK_CFG environment variable can be used to set the
+configuration file to use. See details in the full manual page, it can
+be displayed with: perldoc checklink
+
+More documentation at: $Cfg{Doc_URI}
+Please send bug reports and comments to the www-validator mailing list:
+ www-validator\@w3.org (with 'checklink' in the subject)
+ Archives are at: http://lists.w3.org/Archives/Public/www-validator/
+";
+ exit $exitval;
+}
+
+sub ask_password ()
+{
+ eval {
+ local $SIG{__DIE__} = undef;
+ require Term::ReadKey;
+ Term::ReadKey->require_version(2.00);
+ Term::ReadKey->import(qw(ReadMode));
+ };
+ if ($@) {
+ warn('Warning: Term::ReadKey 2.00 or newer not available, ' .
+ "password input disabled.\n");
+ return;
+ }
+ printf(STDERR 'Enter the password for user %s: ', $Opts{User});
+ ReadMode('noecho', *STDIN);
+ chomp($Opts{Password} = <STDIN>);
+ ReadMode('restore', *STDIN);
+ print(STDERR "ok.\n");
+ return;
+}
+
+###############################################################################
+
+###########################################################################
+# Guess an Accept-Language header based on the $LANG environment variable #
+###########################################################################
+
+sub guess_language ()
+{
+ my $lang = $ENV{LANG} or return;
+
+ $lang =~ s/[\.@].*$//; # en_US.UTF-8, fi_FI@euro...
+
+ return 'en' if ($lang eq 'C' || $lang eq 'POSIX');
+
+ my $res = undef;
+ eval {
+ require Locale::Language;
+ if (my $tmp = Locale::Language::language2code($lang)) {
+ $lang = $tmp;
+ }
+ if (my ($l, $c) = (lc($lang) =~ /^([a-z]+)(?:[-_]([a-z]+))?/)) {
+ if (Locale::Language::code2language($l)) {
+ $res = $l;
+ if ($c) {
+ require Locale::Country;
+ $res .= "-$c" if Locale::Country::code2country($c);
+ }
+ }
+ }
+ };
+ return $res;
+}
+
+############################
+# Transform foo into a URI #
+############################
+
+sub urize ($)
+{
+ my $arg = shift;
+ my $uarg = URI::Escape::uri_unescape($arg);
+ my $uri;
+ if (-d $uarg) {
+
+ # look for an "index" file in dir, return it if found
+ require File::Spec;
+ for my $index (map { File::Spec->catfile($uarg, $_) }
+ qw(index.html index.xhtml index.htm index.xhtm))
+ {
+ if (-e $index) {
+ $uri = URI::file->new_abs($index);
+ last;
+ }
+ }
+
+ # return dir itself if an index file was not found
+ $uri ||= URI::file->new_abs($uarg);
+ }
+ elsif ($uarg =~ /^[.\/\\]/ || -e $uarg) {
+ $uri = URI::file->new_abs($uarg);
+ }
+ else {
+ my $newuri = URI->new($arg);
+ if ($newuri->scheme()) {
+ $uri = $newuri;
+ }
+ else {
+ local $ENV{URL_GUESS_PATTERN} = '';
+ $uri = URI::Heuristic::uf_uri($arg);
+ $uri = URI::file->new_abs($uri) unless $uri->scheme();
+ }
+ }
+ return $uri->canonical();
+}
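+
+# Rough sketch of the mapping above (examples, not exhaustive): an existing
+# local path such as "doc/automake.html" becomes a file: URI via
+# URI::file->new_abs(); a directory is replaced by its index.html (etc.)
+# when one exists; and a bare name such as "www.gnu.org" is handed to
+# URI::Heuristic and typically comes back with an http scheme prepended.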
+
+########################################
+# Check for broken links in a resource #
+########################################
+
+sub check_uri (\%\$$$$;\$$)
+{
+ my ($params, $uri, $check_num, $depth, $cookie, $referer, $is_start) = @_;
+ $is_start ||= ($check_num == 1);
+
+ my $start = $Opts{Summary_Only} ? 0 : &get_timestamp();
+
+ # Get and parse the document
+ my $response = &get_document(
+ 'GET', $uri, $doc_count, \%redirects, $referer,
+ $cookie, $params, $check_num, $is_start
+ );
+
+ # Can we check the resource? If not, we exit here...
+ return if defined($response->{Stop});
+
+ if ($Opts{HTML}) {
+ &html_header($uri, $cookie) if ($check_num == 1);
+ &print_form($params, $cookie, $check_num) if $is_start;
+ }
+
+ if ($is_start) { # Starting point of a new check, eg. from the command line
+ # Use the first URI as the recursion base unless specified otherwise.
+ push(@{$Opts{Base_Locations}}, $response->{absolute_uri}->canonical())
+ unless @{$Opts{Base_Locations}};
+ }
+ else {
+
+ # Before fetching the document, we don't know if we'll be within the
+ # recursion scope or not (think redirects).
+ if (!&in_recursion_scope($response->{absolute_uri})) {
+ hprintf("Not in recursion scope: %s\n", $response->{absolute_uri})
+ if ($Opts{Verbose});
+ $response->content("");
+ return;
+ }
+ }
+
+ # Define the document header, and perhaps print it.
+ # (It might still be defined if the previous document had no errors;
+ # just redefine it in that case.)
+
+ if ($check_num != 1) {
+ if ($Opts{HTML}) {
+ $doc_header = "\n<hr />\n";
+ }
+ else {
+ $doc_header = "\n" . ('-' x 40) . "\n";
+ }
+ }
+
+ if ($Opts{HTML}) {
+ $doc_header .=
+ ("<h2>\nProcessing\t" . &show_url($response->{absolute_uri}) .
+ "\n</h2>\n\n");
+ }
+ else {
+ $doc_header .= "\nProcessing\t$response->{absolute_uri}\n\n";
+ }
+
+ if (!$Opts{Quiet}) {
+ print_doc_header();
+ }
+
+ # We are checking a new document
+ $doc_count++;
+
+ my $result_anchor = 'results' . $doc_count;
+
+ if ($check_num == 1 && !$Opts{HTML} && !$Opts{Summary_Only}) {
+ my $s = $Opts{Sleep_Time} == 1 ? '' : 's';
+ my $acclang = $Opts{Accept_Language} || '(not sent)';
+ my $send_referer = $Opts{No_Referer} ? 'not sent' : 'sending';
+ my $cookies = 'not used';
+ if (defined($Opts{Cookies})) {
+ $cookies = 'used, ';
+ if ($Opts{Cookies} eq 'tmp') {
+ $cookies .= 'non-persistent';
+ }
+ else {
+ $cookies .= "file $Opts{Cookies}";
+ }
+ }
+ printf(
+ <<'EOF', $Accept, $acclang, $send_referer, $cookies, $Opts{Sleep_Time}, $s);
+
+Settings used:
+- Accept: %s
+- Accept-Language: %s
+- Referer: %s
+- Cookies: %s
+- Sleeping %d second%s between requests to each server
+EOF
+ printf("- Excluding links matching %s\n", $Opts{Exclude})
+ if defined($Opts{Exclude});
+ printf("- Excluding links in documents whose URIs match %s\n",
+ join(', ', @{$Opts{Exclude_Docs}}))
+ if @{$Opts{Exclude_Docs}};
+ }
+
+ if ($Opts{HTML}) {
+ if (!$Opts{Summary_Only}) {
+ my $accept = &encode($Accept);
+ my $acclang = &encode($Opts{Accept_Language} || '(not sent)');
+ my $send_referer = $Opts{No_Referer} ? 'not sent' : 'sending';
+ my $s = $Opts{Sleep_Time} == 1 ? '' : 's';
+ printf(
+ <<'EOF', $accept, $acclang, $send_referer, $Opts{Sleep_Time}, $s);
+<div class="settings">
+Settings used:
+ <ul>
+ <li><tt><a href="http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.1">Accept</a></tt>: %s</li>
+ <li><tt><a href="http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.4">Accept-Language</a></tt>: %s</li>
+ <li><tt><a href="http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.36">Referer</a></tt>: %s</li>
+ <li>Sleeping %d second%s between requests to each server</li>
+ </ul>
+</div>
+EOF
+ printf("<p>Go to <a href=\"#%s\">the results</a>.</p>\n",
+ $result_anchor);
+ my $esc_uri = URI::Escape::uri_escape($response->{absolute_uri},
+ "^A-Za-z0-9.");
+ print "<p>For reliable link checking results, check ";
+
+ if (!$response->{IsCss}) {
+ printf("<a href=\"%s\">HTML validity</a> and ",
+ &encode(sprintf($Cfg{Markup_Validator_URI}, $esc_uri)));
+ }
+ printf(
+ "<a href=\"%s\">CSS validity</a> first.</p>
+<p>Back to the <a accesskey=\"1\" href=\"%s\">link checker</a>.</p>\n",
+ &encode(sprintf($Cfg{CSS_Validator_URI}, $esc_uri)),
+ &encode($Opts{_Self_URI})
+ );
+
+ printf(<<'EOF', $result_anchor);
+<div class="progress" id="progress%s">
+<h3>Status: <span></span></h3>
+<div class="progressbar"><div></div></div>
+<pre>
+EOF
+ }
+ }
+
+ if ($Opts{Summary_Only} && !$Opts{Quiet}) {
+ print '<p>' if $Opts{HTML};
+ print 'This may take some time';
+ print "... (<a href=\"$Cfg{Doc_URI}#wait\">why?</a>)</p>"
+ if $Opts{HTML};
+ print " if the document has many links to check.\n" unless $Opts{HTML};
+ }
+
+ # Record that we have processed this resource
+ $processed{$response->{absolute_uri}} = 1;
+
+ # Parse the document
+ my $p =
+ &parse_document($uri, $response->base(), $response, 1, ($depth != 0));
+ my $base = URI->new($p->{base});
+
+ # Check anchors
+ ###############
+
+ print "Checking anchors...\n" unless $Opts{Summary_Only};
+
+ my %errors;
+ while (my ($anchor, $lines) = each(%{$p->{Anchors}})) {
+ if (!length($anchor)) {
+
+ # Empty IDREF's are not allowed
+ $errors{$anchor} = 1;
+ }
+ else {
+ my $times = 0;
+ $times += $_ for values(%$lines);
+
+ # They should appear only once
+ $errors{$anchor} = 1 if ($times > 1);
+ }
+ }
+ print " done.\n" unless $Opts{Summary_Only};
+
+ # Check links
+ #############
+
+ &hprintf("Recording all the links found: %d\n",
+ scalar(keys %{$p->{Links}}))
+ if ($Opts{Verbose});
+ my %links;
+ my %hostlinks;
+
+ # Record all the links found
+ while (my ($link, $lines) = each(%{$p->{Links}})) {
+ my $link_uri = URI->new($link);
+ my $abs_link_uri = URI->new_abs($link_uri, $base);
+
+ if ($Opts{Masquerade}) {
+ if ($abs_link_uri =~ m|^\Q$Opts{Masquerade_From}\E|) {
+ print_doc_header();
+ printf("processing %s in base %s\n",
+ $abs_link_uri, $Opts{Masquerade_To});
+ my $nlink = $abs_link_uri;
+ $nlink =~ s|^\Q$Opts{Masquerade_From}\E|$Opts{Masquerade_To}|;
+ $abs_link_uri = URI->new($nlink);
+ }
+ }
+
+ my $canon_uri = URI->new($abs_link_uri->canonical());
+ my $fragment = $canon_uri->fragment(undef);
+ if (!defined($Opts{Exclude}) || $canon_uri !~ $Opts{Exclude}) {
+ if (!exists($links{$canon_uri})) {
+ my $hostport;
+ $hostport = $canon_uri->host_port()
+ if $canon_uri->can('host_port');
+ $hostport = '' unless defined $hostport;
+ push(@{$hostlinks{$hostport}}, $canon_uri);
+ }
+ for my $line_num (keys(%$lines)) {
+ if (!defined($fragment) || !length($fragment)) {
+
+ # Document without fragment
+ $links{$canon_uri}{location}{$line_num} = 1;
+ }
+ else {
+
+ # Resource with a fragment
+ $links{$canon_uri}{fragments}{$fragment}{$line_num} = 1;
+ }
+ }
+ } else {
+ hprintf("excluded via options: %s\n", $canon_uri)
+ if ($Opts{Verbose});
+ }
+ }
+
+ my @order = &distribute_links(\%hostlinks);
+ undef %hostlinks;
+
+ # Build the list of broken URI's
+
+ my $nlinks = scalar(@order);
+
+ &hprintf("Checking %d links to build list of broken URI's\n", $nlinks)
+ if ($Opts{Verbose});
+
+ my %broken;
+ my $link_num = 0;
+ for my $u (@order) {
+ my $ulinks = $links{$u};
+
+ if ($Opts{Summary_Only}) {
+
+ # Hack: avoid browser/server timeouts in summary only CGI mode, bug 896
+ print ' ' if ($Opts{HTML} && !$Opts{Command_Line});
+ }
+ else {
+ &hprintf("\nChecking link %s\n", $u);
+ my $progress = ($link_num / $nlinks) * 100;
+ printf(
+ '<script type="text/javascript">show_progress("%s", "Checking link %s", "%.1f%%");</script>',
+ $result_anchor, &encode($u), $progress)
+ if (!$Opts{Command_Line} &&
+ $Opts{HTML} &&
+ !$Opts{Summary_Only});
+ }
+ $link_num++;
+
+ # Check that a link is valid
+ &check_validity($uri, $u, ($depth != 0 && &in_recursion_scope($u)),
+ \%links, \%redirects);
+ &hprintf("\tReturn code: %s\n", $results{$u}{location}{code})
+ if ($Opts{Verbose});
+ if ($Opts{Exclude_File_Write} && $results{$u}{location}{code} == 200) {
+ my $fh = $Opts{Exclude_File_Write};
+ print $fh ("$u\n");
+ }
+ if ($results{$u}{location}{success}) {
+
+ # Even though it was not broken, we might want to display it
+ # on the results page (e.g. because it required authentication)
+ $broken{$u}{location} = 1
+ if ($results{$u}{location}{display} >= 400);
+
+ # List the broken fragments
+ while (my ($fragment, $lines) = each(%{$ulinks->{fragments}})) {
+
+ my $fragment_ok = $results{$u}{fragments}{$fragment};
+
+ if ($Opts{Verbose}) {
+ my @line_nums = sort { $a <=> $b } keys(%$lines);
+ &hprintf(
+ "\t\t%s %s - Line%s: %s\n",
+ $fragment,
+ $fragment_ok ? 'OK' : 'Not found',
+ (scalar(@line_nums) > 1) ? 's' : '',
+ join(', ', @line_nums)
+ );
+ }
+
+ # A broken fragment?
+ $broken{$u}{fragments}{$fragment} += 2 unless $fragment_ok;
+ }
+ }
+ elsif (!($Opts{Quiet} && &informational($results{$u}{location}{code})))
+ {
+
+ # Couldn't find the document
+ $broken{$u}{location} = 1;
+
+ # All the fragments associated are hence broken
+ for my $fragment (keys %{$ulinks->{fragments}}) {
+ $broken{$u}{fragments}{$fragment}++;
+ }
+ }
+ }
+ &hprintf(
+ "\nProcessed in %s seconds.\n",
+ &time_diff($start, &get_timestamp())
+ ) unless $Opts{Summary_Only};
+ printf(
+ '<script type="text/javascript">show_progress("%s", "Done. Document processed in %s seconds.", "100%%");</script>',
+ $result_anchor, &time_diff($start, &get_timestamp()))
+ if ($Opts{HTML} && !$Opts{Summary_Only});
+
+ # Display results
+ if ($Opts{HTML} && !$Opts{Summary_Only}) {
+ print("</pre>\n</div>\n");
+ printf("<h2><a name=\"%s\">Results</a></h2>\n", $result_anchor);
+ }
+ print "\n" unless $Opts{Quiet};
+
+ &links_summary(\%links, \%results, \%broken, \%redirects);
+ &anchors_summary($p->{Anchors}, \%errors);
+
+ # Do we want to process other documents?
+ if ($depth != 0) {
+
+ for my $u (map { URI->new($_) } keys %links) {
+
+ next unless $results{$u}{location}{success}; # Broken link?
+
+ next unless &in_recursion_scope($u);
+
+ # Do we understand its content type?
+ next unless ($results{$u}{location}{type} =~ $ContentTypes);
+
+ # Have we already processed this URI?
+ next if &already_processed($u, $uri);
+
+ # Do the job
+ print "\n" unless $Opts{Quiet};
+ if ($Opts{HTML}) {
+ if (!$Opts{Command_Line}) {
+ if ($doc_count == $Opts{Max_Documents}) {
+ print(
+ "<hr />\n<p><strong>Maximum number of documents ($Opts{Max_Documents}) reached!</strong></p>\n"
+ );
+ }
+ if ($doc_count >= $Opts{Max_Documents}) {
+ $doc_count++;
+ print("<p>Not checking <strong>$u</strong></p>\n");
+ $processed{$u} = 1;
+ next;
+ }
+ }
+ }
+
+ # This is an inherently recursive algorithm, so Perl's warning is not
+ # helpful. You may wish to comment this out when debugging, though.
+ no warnings 'recursion';
+
+ if ($depth < 0) {
+ &check_uri($params, $u, 0, -1, $cookie, $uri);
+ }
+ else {
+ &check_uri($params, $u, 0, $depth - 1, $cookie, $uri);
+ }
+ }
+ }
+ return;
+}
+
+###############################################################
+# Distribute links based on host:port to avoid RobotUA delays #
+###############################################################
+
+sub distribute_links(\%)
+{
+ my $hostlinks = shift;
+
+ # Hosts ordered by weight (number of links), descending
+ my @order =
+ sort { scalar(@{$hostlinks->{$b}}) <=> scalar(@{$hostlinks->{$a}}) }
+ keys %$hostlinks;
+
+ # All link list flattened into one, in host weight order
+ my @all;
+ push(@all, @{$hostlinks->{$_}}) for @order;
+
+ return @all if (scalar(@order) < 2);
+
+ # Indexes and chunk size for "zipping" the end result list
+ my $num = scalar(@{$hostlinks->{$order[0]}});
+ my @indexes = map { $_ * $num } (0 .. $num - 1);
+
+ # Distribute them
+ my @result;
+ while (my @chunk = splice(@all, 0, $num)) {
+ @result[@indexes] = @chunk;
+ @indexes = map { $_ + 1 } @indexes;
+ }
+
+ # Weed out undefs
+ @result = grep(defined, @result);
+
+ return @result;
+}
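+
+# Illustrative sketch of the effect (not part of the upstream file): with
+# host A contributing links a1 a2 a3, host B contributing b1 b2, and host C
+# contributing c1, the flattened order (a1 a2 a3 b1 b2 c1) is rearranged to
+# roughly (a1 b1 a2 b2 a3 c1), so consecutive requests hit different hosts
+# and the per-host RobotUA sleep overlaps with useful work.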
+
+##########################################
+# Decode Content-Encodings in a response #
+##########################################
+
+sub decode_content ($)
+{
+ my $response = shift;
+ my $error = undef;
+
+ my $docref = $response->decoded_content(ref => 1);
+ if (defined($docref)) {
+ utf8::encode($$docref);
+ $response->content_ref($docref);
+
+ # Remove Content-Encoding so it won't be decoded again later.
+ $response->remove_header('Content-Encoding');
+ }
+ else {
+ my $ce = $response->header('Content-Encoding');
+ $ce = defined($ce) ? "'$ce'" : 'undefined';
+ my $ct = $response->header('Content-Type');
+ $ct = defined($ct) ? "'$ct'" : 'undefined';
+ my $request_uri = $response->request->url;
+
+ my $cs = $response->content_charset();
+ $cs = defined($cs) ? "'$cs'" : 'unknown';
+ $error =
+ "Error decoding document at <$request_uri>, Content-Type $ct, " .
+ "Content-Encoding $ce, content charset $cs: '$@'";
+ }
+ return $error;
+}
+
+#######################################
+# Get and parse a resource to process #
+#######################################
+
+sub get_document ($\$$;\%\$$$$$)
+{
+ my ($method, $uri, $in_recursion, $redirects, $referer,
+ $cookie, $params, $check_num, $is_start
+ ) = @_;
+
+ # $method contains the HTTP method to use (GET or HEAD)
+ # $uri object contains the identifier of the resource
+ # $in_recursion is > 0 if we are in recursion mode (i.e. it is at least
+ # the second resource checked)
+ # $redirects is a pointer to the hash containing the map of the redirects
+ # $referer is the URI object of the referring document
+ # $cookie, $params, $check_num, and $is_start are for printing HTTP headers
+ # and the form if $in_recursion == 0 and not authenticating
+
+ # Get the resource
+ my $response;
+ if (defined($results{$uri}{response}) &&
+ !($method eq 'GET' && $results{$uri}{method} eq 'HEAD'))
+ {
+ $response = $results{$uri}{response};
+ }
+ else {
+ $response = &get_uri($method, $uri, $referer);
+ &record_results($uri, $method, $response, $referer);
+ &record_redirects($redirects, $response);
+ }
+ if (!$response->is_success()) {
+ if (!$in_recursion) {
+
+ # Is it too late to request authentication?
+ if ($response->code() == 401) {
+ &authentication($response, $cookie, $params, $check_num,
+ $is_start);
+ }
+ else {
+ if ($Opts{HTML}) {
+ &html_header($uri, $cookie) if ($check_num == 1);
+ &print_form($params, $cookie, $check_num) if $is_start;
+ print "<p>", &status_icon($response->code());
+ }
+ &hprintf("\nError: %d %s\n",
+ $response->code(), $response->message() || '(no message)');
+ print "</p>\n" if $Opts{HTML};
+ }
+ }
+ $response->{Stop} = 1;
+ $response->content("");
+ return ($response);
+ }
+
+ # What is the URI of the resource that we are processing by the way?
+ my $base_uri = $response->base();
+ my $request_uri = URI->new($response->request->url);
+ $response->{absolute_uri} = $request_uri->abs($base_uri);
+
+ # Can we parse the document?
+ my $failed_reason;
+ my $ct = $response->header('Content-Type');
+ if (!$ct || $ct !~ $ContentTypes) {
+ $failed_reason = "Content-Type for <$request_uri> is " .
+ (defined($ct) ? "'$ct'" : 'undefined');
+ }
+ else {
+ $failed_reason = decode_content($response);
+ }
+ if ($failed_reason) {
+
+ # No, there is a problem...
+ if (!$in_recursion) {
+ if ($Opts{HTML}) {
+ &html_header($uri, $cookie) if ($check_num == 1);
+ &print_form($params, $cookie, $check_num) if $is_start;
+ print "<p>", &status_icon(406);
+
+ }
+ &hprintf("Can't check links: %s.\n", $failed_reason);
+ print "</p>\n" if $Opts{HTML};
+ }
+ $response->{Stop} = 1;
+ $response->content("");
+ }
+
+ # Ok, return the information
+ return ($response);
+}
+
+#########################################################
+# Check whether a URI is within the scope of recursion. #
+#########################################################
+
+sub in_recursion_scope (\$)
+{
+ my ($uri) = @_;
+ return 0 unless $uri;
+
+ my $candidate = $uri->canonical();
+
+ return 0 if (defined($Opts{Exclude}) && $candidate =~ $Opts{Exclude});
+
+ for my $excluded_doc (@{$Opts{Exclude_Docs}}) {
+ return 0 if ($candidate =~ $excluded_doc);
+ }
+
+ for my $base (@{$Opts{Base_Locations}}) {
+ my $rel = $candidate->rel($base);
+ next if ($candidate eq $rel); # Relative path not possible?
+ next if ($rel =~ m|^(\.\.)?/|); # Relative path upwards?
+ return 1;
+ }
+
+ return 0; # We always have at least one base location, but none matched.
+}
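+
+# For example (a sketch of the rel() test above): with the base location
+# http://www.w3.org/TR/html4/, the candidate
+# http://www.w3.org/TR/html4/Overview.html is in scope (its relative form
+# is "Overview.html"), while http://www.w3.org/TR/xhtml1/ is not (its
+# relative form starts with "../").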
+
+#################################
+# Check for content type match. #
+#################################
+
+sub is_content_type ($$)
+{
+ my ($candidate, $type) = @_;
+ return 0 unless ($candidate && $type);
+ my @v = HTTP::Headers::Util::split_header_words($candidate);
+ return scalar(@v) ? $type eq lc($v[0]->[0]) : 0;
+}
+
+##################################################
+# Check whether a URI has already been processed #
+##################################################
+
+sub already_processed (\$\$)
+{
+ my ($uri, $referer) = @_;
+
+ # Don't be verbose for that part...
+ my $summary_value = $Opts{Summary_Only};
+ $Opts{Summary_Only} = 1;
+
+ # Do a GET: if it fails, we stop, if not, the results are cached
+ my $response = &get_document('GET', $uri, 1, undef, $referer);
+
+ # ... but just for that part
+ $Opts{Summary_Only} = $summary_value;
+
+ # Can we process the resource?
+ return -1 if defined($response->{Stop});
+
+ # Have we already processed it?
+ return 1 if defined($processed{$response->{absolute_uri}->as_string()});
+
+ # It's not processed yet and it is processable: return 0
+ return 0;
+}
+
+############################
+# Get the content of a URI #
+############################
+
+sub get_uri ($\$;\$$\%$$$$)
+{
+
+ # Here we have a lot of extra parameters in order not to lose information
+ # if the function is called several times (401's)
+ my ($method, $uri, $referer, $start, $redirects,
+ $code, $realm, $message, $auth
+ ) = @_;
+
+ # $method contains the method used
+ # $uri object contains the target of the request
+ # $referer is the URI object of the referring document
+ # $start is a timestamp (not defined the first time the function is called)
+ # $redirects is a map of redirects
+ # $code is the first HTTP return code
+ # $realm is the realm of the request
+ # $message is the HTTP message received
+ # $auth equals 1 if we want to send out authentication information
+
+ # For timing purposes
+ $start = &get_timestamp() unless defined($start);
+
+ # Prepare the query
+
+ # Do we want printouts of progress?
+ my $verbose_progress =
+ !($Opts{Summary_Only} || (!$doc_count && $Opts{HTML}));
+
+ &hprintf("%s %s ", $method, $uri) if $verbose_progress;
+
+ my $request = HTTP::Request->new($method, $uri);
+
+ $request->header('Accept-Language' => $Opts{Accept_Language})
+ if $Opts{Accept_Language};
+ $request->header('Accept', $Accept);
+ $request->accept_decodable();
+
+ # Are we providing authentication info?
+ if ($auth && $request->url()->host() =~ $Opts{Trusted}) {
+ if (defined($ENV{HTTP_AUTHORIZATION})) {
+ $request->header(Authorization => $ENV{HTTP_AUTHORIZATION});
+ }
+ elsif (defined($Opts{User}) && defined($Opts{Password})) {
+ $request->authorization_basic($Opts{User}, $Opts{Password});
+ }
+ }
+
+ # Tell the user agent if we want progress reports for redirects or not.
+ $ua->redirect_progress_callback(sub { &hprintf("\n-> %s %s ", @_); })
+ if $verbose_progress;
+
+ # Set referer
+ $request->referer($referer) if (!$Opts{No_Referer} && $referer);
+
+ # Telling caches in the middle we want a fresh copy (Bug 4998)
+ $request->header(Cache_Control => "max-age=0");
+
+ # Do the query
+ my $response = $ua->request($request);
+
+ # Get the results
+ # Record the very first response
+ if (!defined($code)) {
+ ($code, $message) = delete(@$ua{qw(FirstResponse FirstMessage)});
+ }
+
+ # Authentication requested?
+ if ($response->code() == 401 &&
+ !defined($auth) &&
+ (defined($ENV{HTTP_AUTHORIZATION}) ||
+ (defined($Opts{User}) && defined($Opts{Password})))
+ )
+ {
+
+ # Set host as trusted domain unless we already have one.
+ if (!$Opts{Trusted}) {
+ my $re = sprintf('^%s$', quotemeta($response->base()->host()));
+ $Opts{Trusted} = qr/$re/io;
+ }
+
+ # Deal with authentication and avoid loops
+ if (!defined($realm) &&
+ $response->www_authenticate() =~ /Basic realm=\"([^\"]+)\"/)
+ {
+ $realm = $1;
+ }
+
+ print "\n" if $verbose_progress;
+ return &get_uri($method, $response->request()->url(),
+ $referer, $start, $redirects, $code, $realm, $message, 1);
+ }
+
+ # @@@ subtract robot delay from the "fetched in" time?
+ &hprintf(" fetched in %s seconds\n", &time_diff($start, &get_timestamp()))
+ if $verbose_progress;
+
+ $response->{IsCss} =
+ is_content_type($response->content_type(), "text/css");
+ $response->{Realm} = $realm if defined($realm);
+
+ return $response;
+}
+
+#########################################
+# Record the results of an HTTP request #
+#########################################
+
+sub record_results (\$$$$)
+{
+ my ($uri, $method, $response, $referer) = @_;
+ $results{$uri}{referer} = $referer;
+ $results{$uri}{response} = $response;
+ $results{$uri}{method} = $method;
+ $results{$uri}{location}{code} = $response->code();
+ $results{$uri}{location}{code} = RC_ROBOTS_TXT()
+ if ($results{$uri}{location}{code} == 403 &&
+ $response->message() =~ /Forbidden by robots\.txt/);
+ $results{$uri}{location}{code} = RC_IP_DISALLOWED()
+ if ($results{$uri}{location}{code} == 403 &&
+ $response->message() =~ /non-public IP/);
+ $results{$uri}{location}{code} = RC_DNS_ERROR()
+ if ($results{$uri}{location}{code} == 500 &&
+ $response->message() =~ /Bad hostname '[^\']*'/);
+ $results{$uri}{location}{code} = RC_PROTOCOL_DISALLOWED()
+ if ($results{$uri}{location}{code} == 500 &&
+ $response->message() =~ /Access to '[^\']*' URIs has been disabled/);
+ $results{$uri}{location}{type} = $response->header('Content-type');
+ $results{$uri}{location}{display} = $results{$uri}{location}{code};
+
+ # Rewind, check for the original code and message.
+ for (my $tmp = $response->previous(); $tmp; $tmp = $tmp->previous()) {
+ $results{$uri}{location}{orig} = $tmp->code();
+ $results{$uri}{location}{orig_message} = $tmp->message() ||
+ '(no message)';
+ }
+ $results{$uri}{location}{success} = $response->is_success();
+
+ # If a suppressed broken link, fill the data structure like a typical success.
+ # print STDERR "success? " . $results{$uri}{location}{success} . ": $uri\n";
+ if (!$results{$uri}{location}{success}) {
+ my $code = $results{$uri}{location}{code};
+ my $match = grep { $_ eq "$code:$uri" } @{$Opts{Suppress_Broken}};
+ if ($match) {
+ $results{$uri}{location}{success} = 1;
+ $results{$uri}{location}{code} = 100;
+ $results{$uri}{location}{display} = 100;
+ }
+ }
+
+ # Stores the authentication information
+ if (defined($response->{Realm})) {
+ $results{$uri}{location}{realm} = $response->{Realm};
+ $results{$uri}{location}{display} = 401 unless $Opts{Hide_Same_Realm};
+ }
+
+ # What type of broken link is it? (stored in {record} - the {display}
+ # information is for visual use only)
+ if ($results{$uri}{location}{display} == 401 &&
+ $results{$uri}{location}{code} == 404)
+ {
+ $results{$uri}{location}{record} = 404;
+ }
+ else {
+ $results{$uri}{location}{record} = $results{$uri}{location}{display};
+ }
+
+ # Did it fail?
+ $results{$uri}{location}{message} = $response->message() || '(no message)';
+ if (!$results{$uri}{location}{success}) {
+ &hprintf(
+ "Error: %d %s\n",
+ $results{$uri}{location}{code},
+ $results{$uri}{location}{message}
+ ) if ($Opts{Verbose});
+ }
+ return;
+}
+
+####################
+# Parse a document #
+####################
+
+sub parse_document (\$\$$$$)
+{
+ my ($uri, $base_uri, $response, $links, $rec_needs_links) = @_;
+
+ print("parse_document($uri, $base_uri, ..., $links, $rec_needs_links)\n")
+ if $Opts{Verbose};
+
+ my $p;
+
+ if (defined($results{$uri}{parsing})) {
+
+ # We have already done the job. Woohoo!
+ $p->{base} = $results{$uri}{parsing}{base};
+ $p->{Anchors} = $results{$uri}{parsing}{Anchors};
+ $p->{Links} = $results{$uri}{parsing}{Links};
+ return $p;
+ }
+
+ $p = W3C::LinkChecker->new();
+ $p->{base} = $base_uri;
+
+ my $stype = $response->header("Content-Style-Type");
+ $p->{style_is_css} = !$stype || is_content_type($stype, "text/css");
+
+ my $start;
+ if (!$Opts{Summary_Only}) {
+ $start = &get_timestamp();
+ print("Parsing...\n");
+ }
+
+ # Content-Encoding etc already decoded in get_document().
+ my $docref = $response->content_ref();
+
+ # Count lines beforehand if needed (for progress indicator, or CSS while
+ # we don't get any line context out of the parser). In case of HTML, the
+ # actual final number of lines processed shown is populated by our
+ # end_document handler.
+ $p->{Total} = ($$docref =~ tr/\n//)
+ if ($response->{IsCss} || $Opts{Progress});
+
+ # We look only for anchors unless we are interested in the links, or
+ # unless we are running a recursive check, because in that case we
+ # might need the link information later
+ $p->{only_anchors} = !($links || $rec_needs_links);
+
+ if ($response->{IsCss}) {
+
+ # Parse as CSS
+
+ $p->parse_css($$docref, LINE_UNKNOWN());
+ }
+ else {
+
+ # Parse as HTML
+
+ # Transform <?xml:stylesheet ...?> into <xml:stylesheet ...> for parsing.
+ # Processing instructions are not parsed by the parser, but in this case
+ # this one should be. It's expensive, it's horrible, but it's the easiest
+ # way for right now.
+ $$docref =~ s/\<\?(xml:stylesheet.*?)\?\>/\<$1\>/
+ unless $p->{only_anchors};
+
+ $p->xml_mode(1) if ($response->content_type() =~ /\+xml$/);
+
+ $p->parse($$docref)->eof();
+ }
+
+ $response->content("");
+
+ if (!$Opts{Summary_Only}) {
+ my $stop = &get_timestamp();
+ print "\r" if $Opts{Progress};
+ &hprintf(" done (%d lines in %s seconds).\n",
+ $p->{Total}, &time_diff($start, $stop));
+ }
+
+ # Save the results before exiting
+ $results{$uri}{parsing}{base} = $p->{base};
+ $results{$uri}{parsing}{Anchors} = $p->{Anchors};
+ $results{$uri}{parsing}{Links} = $p->{Links};
+
+ return $p;
+}
+
+####################################
+# Constructor for W3C::LinkChecker #
+####################################
+
+sub new
+{
+ my $p = HTML::Parser::new(@_, api_version => 3);
+ $p->utf8_mode(1);
+
+ # Set up handlers
+
+ $p->handler(start => 'start', 'self, tagname, attr, line');
+ $p->handler(end => 'end', 'self, tagname, line');
+ $p->handler(text => 'text', 'self, dtext, line');
+ $p->handler(
+ declaration => sub {
+ my $self = shift;
+ $self->declaration(substr($_[0], 2, -1));
+ },
+ 'self, text, line'
+ );
+ $p->handler(end_document => 'end_document', 'self, line');
+ if ($Opts{Progress}) {
+ $p->handler(default => 'parse_progress', 'self, line');
+ $p->{last_percentage} = 0;
+ }
+
+ # Check <a [..] name="...">?
+ $p->{check_name} = 1;
+
+ # Check <[..] id="..">?
+ $p->{check_id} = 1;
+
+ # Don't interpret comments loosely
+ $p->strict_comment(1);
+
+ return $p;
+}
+
+#################################################
+# Record or return the doctype of the document #
+#################################################
+
+sub doctype
+{
+ my ($self, $dc) = @_;
+ return $self->{doctype} unless $dc;
+ $_ = $self->{doctype} = $dc;
+
+ # What to look for depending on the doctype
+
+ # Check for <a name="...">?
+ $self->{check_name} = 0
+ if m%^-//(W3C|WAPFORUM)//DTD XHTML (Basic|Mobile) %;
+
+ # Check for <* id="...">?
+ $self->{check_id} = 0
+ if (m%^-//IETF//DTD HTML [23]\.0//% || m%^-//W3C//DTD HTML 3\.2//%);
+
+ # Enable XML mode (XHTML, XHTML Mobile, XHTML-Print, XHTML+RDFa, ...)
+ $self->xml_mode(1) if (m%^-//(W3C|WAPFORUM)//DTD XHTML[ \-\+]%);
+
+ return;
+}
+
+###################################
+# Print parse progress indication #
+###################################
+
+sub parse_progress
+{
+ my ($self, $line) = @_;
+ return unless defined($line) && $line > 0 && $self->{Total} > 0;
+
+ my $percentage = int($line / $self->{Total} * 100);
+ if ($percentage != $self->{last_percentage}) {
+ printf("\r%4d%%", $percentage);
+ $self->{last_percentage} = $percentage;
+ }
+
+ return;
+}
+
+#############################
+# Extraction of the anchors #
+#############################
+
+sub get_anchor
+{
+ my ($self, $tag, $attr) = @_;
+
+ my $anchor = $self->{check_id} ? $attr->{id} : undef;
+ if ($self->{check_name} && ($tag eq 'a')) {
+
+ # @@@@ In XHTML, <a name="foo" id="foo"> is mandatory
+ # Force an error if that's not the case (or if the id and name values
+ # are different)
+ # If id is defined, name (if also defined) must have the same value
+ $anchor ||= $attr->{name};
+ }
+
+ return $anchor;
+}
+
+#############################
+# W3C::LinkChecker handlers #
+#############################
+
+sub add_link
+{
+ my ($self, $uri, $base, $line) = @_;
+ if (defined($uri)) {
+
+ # Remove repeated slashes after the . or .. in relative links, to avoid
+ # duplicated checking or infinite recursion.
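+ # (illustrative example: ".//foo.html" becomes "./foo.html" and
+ # "..///bar" becomes "../bar")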
+ $uri =~ s|^(\.\.?/)/+|$1|o;
+ $uri = Encode::decode_utf8($uri);
+ $uri = URI->new_abs($uri, $base) if defined($base);
+ $self->{Links}{$uri}{defined($line) ? $line : LINE_UNKNOWN()}++;
+ }
+ return;
+}
+
+sub start
+{
+ my ($self, $tag, $attr, $line) = @_;
+ $line = LINE_UNKNOWN() unless defined($line);
+
+ # Anchors
+ my $anchor = $self->get_anchor($tag, $attr);
+ $self->{Anchors}{$anchor}{$line}++ if defined($anchor);
+
+ # Links
+ if (!$self->{only_anchors}) {
+
+ my $tag_local_base = undef;
+
+ # Special case: base/@href
+ # @@@TODO: The reason for handling <base href> ourselves is that LWP's
+ # head parsing magic fails at least for responses that have
+ # Content-Encodings: https://rt.cpan.org/Ticket/Display.html?id=54361
+ if ($tag eq 'base') {
+
+ # Ignore <base> with missing/empty href.
+ $self->{base} = $attr->{href}
+ if (defined($attr->{href}) && length($attr->{href}));
+ }
+
+ # Special case: meta[@http-equiv=Refresh]/@content
+ elsif ($tag eq 'meta') {
+ if ($attr->{'http-equiv'} &&
+ lc($attr->{'http-equiv'}) eq 'refresh')
+ {
+ my $content = $attr->{content};
+ if ($content && $content =~ /.*?;\s*(?:url=)?(.+)/i) {
+ $self->add_link($1, undef, $line);
+ }
+ }
+ }
+
+ # Special case: tags that have "local base"
+ elsif ($tag eq 'applet' || $tag eq 'object') {
+ if (my $codebase = $attr->{codebase}) {
+
+ # Applet codebases are directories, append trailing slash
+ # if it's not there so that new_abs does the right thing.
+ $codebase .= "/" if ($tag eq 'applet' && $codebase !~ m|/$|);
+
+ # TODO: HTML 4 spec says applet/@codebase may only point to
+ # subdirs of the directory containing the current document.
+ # Should we do something about that?
+ $tag_local_base = URI->new_abs($codebase, $self->{base});
+ }
+ }
+
+ # Link attributes:
+ if (my $link_attrs = LINK_ATTRS()->{$tag}) {
+ for my $la (@$link_attrs) {
+ $self->add_link($attr->{$la}, $tag_local_base, $line);
+ }
+ }
+
+ # List of links attributes:
+ if (my $link_attrs = LINK_LIST_ATTRS()->{$tag}) {
+ my ($sep, $attrs) = @$link_attrs;
+ for my $la (@$attrs) {
+ if (defined(my $value = $attr->{$la})) {
+ for my $link (split($sep, $value)) {
+ $self->add_link($link, $tag_local_base, $line);
+ }
+ }
+ }
+ }
+
+ # Inline CSS:
+ delete $self->{csstext};
+ if ($tag eq 'style') {
+ $self->{csstext} = ''
+ if ((!$attr->{type} && $self->{style_is_css}) ||
+ is_content_type($attr->{type}, "text/css"));
+ }
+ elsif ($self->{style_is_css} && (my $style = $attr->{style})) {
+ $style = CSS::DOM::Style::parse($style);
+ $self->parse_style($style, $line);
+ }
+ }
+
+ $self->parse_progress($line) if $Opts{Progress};
+ return;
+}
+
+sub end
+{
+ my ($self, $tagname, $line) = @_;
+
+ $self->parse_css($self->{csstext}, $line) if ($tagname eq 'style');
+ delete $self->{csstext};
+
+ $self->parse_progress($line) if $Opts{Progress};
+ return;
+}
+
+sub parse_css
+{
+ my ($self, $css, $line) = @_;
+ return unless $css;
+
+ my $sheet = CSS::DOM::parse($css);
+ for my $rule (@{$sheet->cssRules()}) {
+ if ($rule->type() == IMPORT_RULE()) {
+ $self->add_link($rule->href(), $self->{base}, $line);
+ }
+ elsif ($rule->type == STYLE_RULE()) {
+ $self->parse_style($rule->style(), $line);
+ }
+ }
+ return;
+}
+
+sub parse_style
+{
+ my ($self, $style, $line) = @_;
+ return unless $style;
+
+ for (my $i = 0, my $len = $style->length(); $i < $len; $i++) {
+ my $prop = $style->item($i);
+ my $val = $style->getPropertyValue($prop);
+
+ while ($val =~ /$CssUrl/go) {
+ my $url = CSS::DOM::Util::unescape($2);
+ $self->add_link($url, $self->{base}, $line);
+ }
+ }
+
+ return;
+}
+
+sub declaration
+{
+ my ($self, $text, $line) = @_;
+
+ # Extract the doctype
+ my @declaration = split(/\s+/, $text, 4);
+ if ($#declaration >= 3 &&
+ $declaration[0] eq 'DOCTYPE' &&
+ lc($declaration[1]) eq 'html')
+ {
+
+ # Parse the doctype declaration
+ if ($text =~
+ m/^DOCTYPE\s+html\s+(?:PUBLIC\s+"([^"]+)"|SYSTEM)(\s+"([^"]+)")?\s*$/i
+ )
+ {
+
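+ # For example, given the declaration
+ # DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
+ # "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"
+ # $1 is the public identifier and $3 is the DTD URI.
+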
+ # Store the doctype
+ $self->doctype($1) if $1;
+
+ # If there is a link to the DTD, record it
+ $self->add_link($3, undef, $line)
+ if (!$self->{only_anchors} && $3);
+ }
+ }
+
+ $self->text($text) unless $self->{only_anchors};
+
+ return;
+}
+
+sub text
+{
+ my ($self, $text, $line) = @_;
+ $self->{csstext} .= $text if defined($self->{csstext});
+ $self->parse_progress($line) if $Opts{Progress};
+ return;
+}
+
+sub end_document
+{
+ my ($self, $line) = @_;
+ $self->{Total} = $line;
+ delete $self->{csstext};
+ return;
+}
+
+################################
+# Check the validity of a link #
+################################
+
+sub check_validity (\$\$$\%\%)
+{
+ my ($referer, $uri, $want_links, $links, $redirects) = @_;
+
+ # $referer is the URI object of the document checked
+ # $uri is the URI object of the target that we are verifying
+ # $want_links is true if we're interested in links in the target doc
+ # $links is a hash of the links in the documents checked
+ # $redirects is a map of the redirects encountered
+
+ # Get the document with the appropriate method: GET if there are
+ # fragments to check or links are wanted, HEAD is enough otherwise.
+ my $fragments = $links->{$uri}{fragments} || {};
+ my $method = ($want_links || %$fragments) ? 'GET' : 'HEAD';
+
+ my $response;
+ my $being_processed = 0;
+ if (!defined($results{$uri}) ||
+ ($method eq 'GET' && $results{$uri}{method} eq 'HEAD'))
+ {
+ $being_processed = 1;
+ $response = &get_uri($method, $uri, $referer);
+
+ # Get the information back from get_uri()
+ &record_results($uri, $method, $response, $referer);
+
+ # Record the redirects
+ &record_redirects($redirects, $response);
+ }
+ elsif (!($Opts{Summary_Only} || (!$doc_count && $Opts{HTML}))) {
+ my $ref = $results{$uri}{referer};
+ &hprintf("Already checked%s\n", $ref ? ", referrer $ref" : ".");
+ }
+
+ # We got the response of the HTTP request. Stop here if it was a HEAD.
+ return if ($method eq 'HEAD');
+
+ # There are fragments. Parse the document.
+ my $p;
+ if ($being_processed) {
+
+ # Can we really parse the document?
+ if (!defined($results{$uri}{location}{type}) ||
+ $results{$uri}{location}{type} !~ $ContentTypes)
+ {
+ &hprintf("Can't check content: Content-Type for '%s' is '%s'.\n",
+ $uri, $results{$uri}{location}{type})
+ if ($Opts{Verbose});
+ $response->content("");
+ return;
+ }
+
+ # Do it then
+ if (my $error = decode_content($response)) {
+ &hprintf("%s.\n", $error);
+ }
+
+ # @@@TODO: this isn't the best thing to do if a decode error occurred
+ $p =
+ &parse_document($uri, $response->base(), $response, 0,
+ $want_links);
+ }
+ else {
+
+ # We already had the information
+ $p->{Anchors} = $results{$uri}{parsing}{Anchors};
+ }
+
+ # Check that the fragments exist
+ for my $fragment (keys %$fragments) {
+ if (defined($p->{Anchors}{$fragment}) ||
+ &escape_match($fragment, $p->{Anchors}) ||
+ grep { $_ eq "$uri#$fragment" } @{$Opts{Suppress_Fragment}})
+ {
+ $results{$uri}{fragments}{$fragment} = 1;
+ }
+ else {
+ $results{$uri}{fragments}{$fragment} = 0;
+ }
+ }
+ return;
+}
+
+sub escape_match ($\%)
+{
+ my ($a, $hash) = (URI::Escape::uri_unescape($_[0]), $_[1]);
+ for my $b (keys %$hash) {
+ return 1 if ($a eq URI::Escape::uri_unescape($b));
+ }
+ return 0;
+}
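+
+# Illustrative example: escape_match("foo%20bar", $anchors) returns 1 if
+# the hash referenced by $anchors has a key that unescapes to "foo bar",
+# e.g. "foo bar" itself or "foo%20bar".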
+
+##########################
+# Ask for authentication #
+##########################
+
+sub authentication ($;$$$$)
+{
+ my ($response, $cookie, $params, $check_num, $is_start) = @_;
+
+ my $realm = '';
+ if ($response->www_authenticate() =~ /Basic realm=\"([^\"]+)\"/) {
+ $realm = $1;
+ }
+
+ if ($Opts{Command_Line}) {
+ printf STDERR <<'EOF', $response->request()->url(), $realm;
+
+Authentication is required for %s.
+The realm is "%s".
+Use the -u and -p options to specify a username and password and the -d option
+to specify trusted domains.
+EOF
+ }
+ else {
+
+ printf(
+ "Status: 401 Authorization Required\nWWW-Authenticate: %s\n%sConnection: close\nContent-Language: en\nContent-Type: text/html; charset=utf-8\n\n",
+ $response->www_authenticate(),
+ $cookie ? "Set-Cookie: $cookie\n" : "",
+ );
+
+ printf(
+ "%s
+<html lang=\"en\" xmlns=\"http://www.w3.org/1999/xhtml\" xml:lang=\"en\">
+<head>
+<title>W3C Link Checker: 401 Authorization Required</title>
+%s</head>
+<body>", $DocType, $Head
+ );
+ &banner(': 401 Authorization Required');
+ &print_form($params, $cookie, $check_num) if $is_start;
+ printf(
+ '<p>
+ %s
+ You need "%s" access to <a href="%s">%s</a> to perform link checking.<br />
+',
+ &status_icon(401),
+ &encode($realm), (&encode($response->request()->url())) x 2
+ );
+
+ my $host = $response->request()->url()->host();
+ if ($Opts{Trusted} && $host !~ $Opts{Trusted}) {
+ printf <<'EOF', &encode($Opts{Trusted}), &encode($host);
+ This service has been configured to send authentication only to hostnames
+ matching the regular expression <code>%s</code>, but the hostname
+ <code>%s</code> does not match it.
+EOF
+ }
+
+ print "</p>\n";
+ }
+ return;
+}
+
+##################
+# Get statistics #
+##################
+
+sub get_timestamp ()
+{
+ return pack('LL', Time::HiRes::gettimeofday());
+}
+
+sub time_diff ($$)
+{
+ my @start = unpack('LL', $_[0]);
+ my @stop = unpack('LL', $_[1]);
+ for ($start[1], $stop[1]) {
+ $_ /= 1_000_000;
+ }
+ return (sprintf("%.2f", ($stop[0] + $stop[1]) - ($start[0] + $start[1])));
+}
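+
+# Illustrative usage: timestamps are opaque pack('LL', seconds, microseconds)
+# strings from get_timestamp(), so elapsed time is reported as, e.g.,
+#   my $start = &get_timestamp();
+#   ... do some work ...
+#   &hprintf("done in %s seconds\n", &time_diff($start, &get_timestamp()));
+# which prints the elapsed wall-clock time with two decimals.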
+
+########################
+# Handle the redirects #
+########################
+
+# Record the redirects in a hash
+sub record_redirects (\%$)
+{
+ my ($redirects, $response) = @_;
+ for (my $prev = $response->previous(); $prev; $prev = $prev->previous()) {
+
+ # Check for redirect match.
+ my $from = $prev->request()->url();
+ my $to = $response->request()->url(); # same on every loop iteration
+ my $from_to = $from . '->' . $to;
+ my $match = grep { $_ eq $from_to } @{$Opts{Suppress_Redirect}};
+
+ # print STDERR "Result $match of redirect checking $from_to\n";
+ if ($match) { next; }
+
+ $match = grep { $from_to =~ /$_/ } @{$Opts{Suppress_Redirect_Prefix}};
+
+ # print STDERR "Result $match of regexp checking $from_to\n";
+ if ($match) { next; }
+
+ my $c = $prev->code();
+ if ($Opts{Suppress_Temp_Redirects} && ($c == 307 || $c == 302)) {
+ next;
+ }
+
+ $redirects->{$prev->request()->url()} = $response->request()->url();
+ }
+ return;
+}
+
+# Determine if a request is redirected
+sub is_redirected ($%)
+{
+ my ($uri, %redirects) = @_;
+ return (defined($redirects{$uri}));
+}
+
+# Get a list of redirects for a URI
+sub get_redirects ($%)
+{
+ my ($uri, %redirects) = @_;
+ my @history = ($uri);
+ my %seen = ($uri => 1); # for tracking redirect loops
+ my $loop = 0;
+ while ($redirects{$uri}) {
+ $uri = $redirects{$uri};
+ push(@history, $uri);
+ if ($seen{$uri}) {
+ $loop = 1;
+ last;
+ }
+ else {
+ $seen{$uri}++;
+ }
+ }
+ return ($loop, @history);
+}
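+
+# Illustrative example: with %redirects = ('http://a/' => 'http://b/',
+# 'http://b/' => 'http://a/'), get_redirects('http://a/', %redirects)
+# returns (1, 'http://a/', 'http://b/', 'http://a/'): a detected loop
+# plus the history of URIs visited.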
+
+####################################################
+# Tool for sorting the unique elements of an array #
+####################################################
+
+sub sort_unique (@)
+{
+ my %saw;
+ @saw{@_} = ();
+ return (sort { $a <=> $b } keys %saw);
+}
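+
+# Illustrative example: the hash slice assignment above deduplicates the
+# arguments and the numeric sort orders them, so sort_unique(12, 3, 12, 7)
+# returns (3, 7, 12).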
+
+#####################
+# Print the results #
+#####################
+
+sub line_number ($)
+{
+ my $line = shift;
+ return $line if ($line >= 0);
+ return "(N/A)";
+}
+
+sub http_rc ($)
+{
+ my $rc = shift;
+ return $rc if ($rc >= 0);
+ return "(N/A)";
+}
+
+# returns true if the given code is informational
+sub informational ($)
+{
+ my $rc = shift;
+ return $rc == RC_ROBOTS_TXT() ||
+ $rc == RC_IP_DISALLOWED() ||
+ $rc == RC_PROTOCOL_DISALLOWED();
+}
+
+sub anchors_summary (\%\%)
+{
+ my ($anchors, $errors) = @_;
+
+ # Number of anchors found.
+ my $n = scalar(keys(%$anchors));
+ if (!$Opts{Quiet}) {
+ if ($Opts{HTML}) {
+ print("<h3>Anchors</h3>\n<p>");
+ }
+ else {
+ print("Anchors\n\n");
+ }
+ &hprintf("Found %d anchor%s.\n", $n, ($n == 1) ? '' : 's');
+ print("</p>\n") if $Opts{HTML};
+ }
+
+ # List of the duplicates, if any.
+ my @errors = keys %{$errors};
+ if (!scalar(@errors)) {
+ print("<p>Valid anchors!</p>\n")
+ if (!$Opts{Quiet} && $Opts{HTML} && $n);
+ return;
+ }
+ undef $n;
+
+ print_doc_header();
+ print('<p>') if $Opts{HTML};
+ print('List of duplicate and empty anchors');
+ print <<'EOF' if $Opts{HTML};
+</p>
+<table class="report" border="1" summary="List of duplicate and empty anchors.">
+<thead>
+<tr>
+<th>Anchor</th>
+<th>Lines</th>
+</tr>
+</thead>
+<tbody>
+EOF
+ print("\n");
+
+ for my $anchor (@errors) {
+ my $format;
+ my @unique = &sort_unique(
+ map { line_number($_) }
+ keys %{$anchors->{$anchor}}
+ );
+ if ($Opts{HTML}) {
+ $format = "<tr><td class=\"broken\">%s</td><td>%s</td></tr>\n";
+ }
+ else {
+ my $s = (scalar(@unique) > 1) ? 's' : '';
+ $format = "\t%s\tLine$s: %s\n";
+ }
+ printf($format,
+ &encode(length($anchor) ? $anchor : 'Empty anchor'),
+ join(', ', @unique));
+ }
+
+ print("</tbody>\n</table>\n") if $Opts{HTML};
+
+ return;
+}
+
+sub show_link_report (\%\%\%\%\@;$\%)
+{
+ my ($links, $results, $broken, $redirects, $urls, $codes, $todo) = @_;
+
+ print("\n<dl class=\"report\">") if $Opts{HTML};
+ print("\n") if (!$Opts{Quiet});
+
+ # Process each URL
+ my ($c, $previous_c);
+ for my $u (@$urls) {
+ my @fragments = keys %{$broken->{$u}{fragments}};
+
+ # Did we get a redirect?
+ my $redirected = &is_redirected($u, %$redirects);
+
+ # List of lines
+ my @total_lines;
+ push(@total_lines, keys(%{$links->{$u}{location}}));
+ for my $f (@fragments) {
+ push(@total_lines, keys(%{$links->{$u}{fragments}{$f}}))
+ unless ($f eq $u && defined($links->{$u}{$u}{LINE_UNKNOWN()}));
+ }
+
+ my ($redirect_loop, @redirects_urls) = get_redirects($u, %$redirects);
+ my $currloc = $results->{$u}{location};
+
+ # Error type
+ $c = &code_shown($u, $results);
+
+ # What to do
+ my $whattodo;
+ my $redirect_too;
+ if ($todo) {
+ if ($u =~ m/^javascript:/) {
+ if ($Opts{HTML}) {
+ $whattodo =
+ 'You must change this link: people using a browser without JavaScript support
+will <em>not</em> be able to follow this link. See the
+<a href="http://www.w3.org/TR/WAI-WEBCONTENT/#tech-scripts">Web Content
+Accessibility Guidelines on the use of scripting on the Web</a> and the
+<a href="http://www.w3.org/TR/WCAG10-HTML-TECHS/#directly-accessible-scripts">techniques
+on how to solve this</a>.';
+ }
+ else {
+ $whattodo =
+ 'Change this link: people using a browser without JavaScript support will not be able to follow this link.';
+ }
+ }
+ elsif ($c == RC_ROBOTS_TXT()) {
+ $whattodo =
+ 'The link was not checked due to robots exclusion ' .
+ 'rules. Check the link manually.';
+ }
+ elsif ($redirect_loop) {
+ $whattodo =
+ 'Retrieving the URI results in a redirect loop that should be '
+ . 'fixed. Examine the redirect sequence to see where the loop '
+ . 'occurs.';
+ }
+ else {
+ $whattodo = $todo->{$c};
+ }
+ }
+ elsif (defined($redirects{$u})) {
+
+ # Redirects
+ if (($u . '/') eq $redirects{$u}) {
+ $whattodo =
+ 'The link is missing a trailing slash, and caused a redirect. Adding the trailing slash would speed up browsing.';
+ }
+ elsif ($c == 307 || $c == 302) {
+ $whattodo =
+ 'This is a temporary redirect. Update the link if you believe it makes sense, or leave it as is.';
+ }
+ elsif ($c == 301) {
+ $whattodo =
+ 'This is a permanent redirect. The link should be updated.';
+ }
+ }
+
+ my @unique = &sort_unique(map { line_number($_) } @total_lines);
+ my $lines_list = join(', ', @unique);
+ my $s = (scalar(@unique) > 1) ? 's' : '';
+ undef @unique;
+
+ my @http_codes = ($currloc->{code});
+ unshift(@http_codes, $currloc->{orig}) if $currloc->{orig};
+ @http_codes = map { http_rc($_) } @http_codes;
+
+ if ($Opts{HTML}) {
+
+ # Style stuff
+ my $idref = '';
+ if ($codes && (!defined($previous_c) || ($c != $previous_c))) {
+ $idref = ' id="d' . $doc_count . 'code_' . $c . '"';
+ $previous_c = $c;
+ }
+
+ # Main info
+ for (@redirects_urls) {
+ $_ = &show_url($_);
+ }
+
+ # HTTP message
+ my $http_message;
+ if ($currloc->{message}) {
+ $http_message = &encode($currloc->{message});
+ if ($c == 404 || $c == 500) {
+ $http_message =
+ '<span class="broken">' . $http_message . '</span>';
+ }
+ }
+ my $redirmsg =
+ $redirect_loop ? ' <em>redirect loop detected</em>' : '';
+ printf("
+<dt%s>%s <span class='msg_loc'>Line%s: %s</span> %s</dt>
+<dd class='responsecode'><strong>Status</strong>: %s %s %s</dd>
+<dd class='message_explanation'><p>%s %s</p></dd>\n",
+
+ # Anchor for return codes
+ $idref,
+
+ # Color
+ &status_icon($c),
+ $s,
+
+ # List of lines
+ $lines_list,
+
+ # List of redirects
+ $redirected ?
+ join(' redirected to ', @redirects_urls) . $redirmsg :
+ &show_url($u),
+
+ # Realm
+ defined($currloc->{realm}) ?
+ sprintf('Realm: %s<br />', &encode($currloc->{realm})) :
+ '',
+
+ # HTTP original message
+ # defined($currloc->{orig_message})
+ # ? &encode($currloc->{orig_message}).
+ # ' <span title="redirected to">-></span> '
+ # : '',
+
+ # Response code chain
+ join(
+ ' <span class="redirected_to" title="redirected to">-></span> ',
+ map { &encode($_) } @http_codes),
+
+ # HTTP final message
+ $http_message,
+
+ # What to do
+ $whattodo,
+
+ # Redirect too?
+ $redirect_too ?
+ sprintf(' <span %s>%s</span>',
+ &bgcolor(301), $redirect_too) :
+ '',
+ );
+ if ($#fragments >= 0) {
+ printf("<dd>Broken fragments: <ul>\n");
+ }
+ }
+ else {
+ my $redirmsg = $redirect_loop ? ' redirect loop detected' : '';
+ printf(
+ "\n%s\t%s\n Code: %s %s\n%s\n",
+
+ # List of redirects
+ $redirected ? join("\n-> ", @redirects_urls) . $redirmsg : $u,
+
+ # List of lines
+ $lines_list ? sprintf("\n%6s: %s", "Line$s", $lines_list) : '',
+
+ # Response code chain
+ join(' -> ', @http_codes),
+
+ # HTTP message
+ $currloc->{message} || '',
+
+ # What to do
+ wrap(' To do: ', ' ', $whattodo)
+ );
+ if ($#fragments >= 0) {
+ if ($currloc->{code} == 200) {
+ print("The following fragments need to be fixed:\n");
+ }
+ else {
+ print("Fragments:\n");
+ }
+ }
+ }
+
+ # Fragments
+ for my $f (@fragments) {
+ my @unique_lines =
+ &sort_unique(keys %{$links->{$u}{fragments}{$f}});
+ my $plural = (scalar(@unique_lines) > 1) ? 's' : '';
+ my $unique_lines = join(', ', @unique_lines);
+ if ($Opts{HTML}) {
+ printf("<li>%s<em>#%s</em> (line%s %s)</li>\n",
+ &encode($u), &encode($f), $plural, $unique_lines);
+ }
+ else {
+ printf("\t%-30s\tLine%s: %s\n", $f, $plural, $unique_lines);
+ }
+ }
+
+ print("</ul></dd>\n") if ($Opts{HTML} && scalar(@fragments));
+ }
+
+ # End of the table
+ print("</dl>\n") if $Opts{HTML};
+
+ return;
+}
+
+sub code_shown ($$)
+{
+ my ($u, $results) = @_;
+
+ if ($results->{$u}{location}{record} == 200) {
+ return $results->{$u}{location}{orig} ||
+ $results->{$u}{location}{record};
+ }
+ else {
+ return $results->{$u}{location}{record};
+ }
+}
+
+sub links_summary (\%\%\%\%)
+{
+
+ # Advice on how to fix the problems
+
+ my %todo = (
+ 200 =>
+ 'Some of the links to this resource point to broken URI fragments (such as index.html#fragment).',
+ 300 =>
+ 'This often happens when a typo in the link gets corrected automatically by the server. For the sake of performance, the link should be fixed.',
+ 301 =>
+ 'This is a permanent redirect. The link should be updated to point to the more recent URI.',
+ 302 =>
+ 'This is a temporary redirect. Update the link if you believe it makes sense, or leave it as is.',
+ 303 =>
+ 'This rare status code points to a "See Other" resource. There is generally nothing to be done.',
+ 307 =>
+ 'This is a temporary redirect. Update the link if you believe it makes sense, or leave it as is.',
+ 400 =>
+ 'This is usually the sign of a malformed URL that cannot be parsed by the server. Check the syntax of the link.',
+ 401 =>
+ "The link is not public and the actual resource is only available behind authentication. If not already done, you could provide the required credentials.",
+ 403 =>
+ 'The link is forbidden! This needs fixing. Usual suspects: a missing index.html or Overview.html, or a missing ACL.',
+ 404 =>
+ 'The link is broken. Double-check that you have not made any typo, or mistake in copy-pasting. If the link points to a resource that no longer exists, you may want to remove or fix the link.',
+ 405 =>
+ 'The server does not allow HTTP HEAD requests, which prevents the Link Checker from checking the link automatically. Check the link manually.',
+ 406 =>
+ "The server isn't capable of responding according to the Accept* headers sent. This is likely to be a server-side issue with negotiation.",
+ 407 => 'The link is a proxy, but requires Authentication.',
+ 408 => 'The request timed out.',
+ 410 => 'The resource is gone. You should remove this link.',
+ 415 => 'The media type is not supported.',
+ 500 => 'This is a server-side problem. Check the URI.',
+ 501 =>
+ 'Could not check this link: method not implemented or scheme not supported.',
+ 503 =>
+ 'The server cannot service the request, for some unknown reason.',
+
+ # Non-HTTP codes:
+ RC_ROBOTS_TXT() => sprintf(
+ 'The link was not checked due to %srobots exclusion rules%s. Check the link manually, and see also the link checker %sdocumentation on robots exclusion%s.',
+ $Opts{HTML} ? (
+ '<a href="http://www.robotstxt.org/robotstxt.html">', '</a>',
+ "<a href=\"$Cfg{Doc_URI}#bot\">", '</a>'
+ ) : ('') x 4
+ ),
+ RC_DNS_ERROR() =>
+ 'The hostname could not be resolved. Check the link for typos.',
+ RC_IP_DISALLOWED() =>
+ sprintf(
+ 'The link resolved to a %snon-public IP address%s, and this link checker instance has been configured to not access such addresses. This may be a real error or just a quirk of the name resolver configuration on the server where the link checker runs. Check the link manually, in particular its hostname/IP address.',
+ $Opts{HTML} ?
+ ('<a href="http://www.ietf.org/rfc/rfc1918.txt">', '</a>') :
+ ('') x 2),
+ RC_PROTOCOL_DISALLOWED() =>
+ 'Accessing links with this URI scheme has been disabled in this link checker.',
+ );
+ my %priority = (
+ 410 => 1,
+ 404 => 2,
+ 403 => 5,
+ 200 => 10,
+ 300 => 15,
+ 401 => 20
+ );
+
+ my ($links, $results, $broken, $redirects) = @_;
+
+ # List of the broken links
+ my @urls = keys %{$broken};
+ my @dir_redirect_urls = ();
+ if ($Opts{Redirects}) {
+
+ # Add the redirected URI's to the report
+ for my $l (keys %$redirects) {
+ next
+ unless (defined($results->{$l}) &&
+ defined($links->{$l}) &&
+ !defined($broken->{$l}));
+
+ # Check whether we have a "directory redirect"
+ # e.g. http://www.w3.org/TR -> http://www.w3.org/TR/
+ my ($redirect_loop, @redirects) = get_redirects($l, %$redirects);
+ if ($#redirects == 1) {
+ push(@dir_redirect_urls, $l);
+ next;
+ }
+ push(@urls, $l);
+ }
+ }
+
+ # Broken links and redirects
+ if ($#urls < 0) {
+ if (!$Opts{Quiet}) {
+ print_doc_header();
+ if ($Opts{HTML}) {
+ print "<h3>Links</h3>\n<p>Valid links!</p>\n";
+ }
+ else {
+ print "\nValid links.\n";
+ }
+ }
+ }
+ else {
+ print_doc_header();
+ print('<h3>') if $Opts{HTML};
+ print("\nList of broken links and other issues");
+
+ #print(' and redirects') if $Opts{Redirects};
+
+ # Sort the URI's by HTTP Code
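+ # (codes listed in %priority above come first, in priority order;
+ # the remaining codes are ordered numerically)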
+ my %code_summary;
+ my @idx;
+ for my $u (@urls) {
+ if (defined($results->{$u}{location}{record})) {
+ my $c = &code_shown($u, $results);
+ $code_summary{$c}++;
+ push(@idx, $c);
+ }
+ }
+ my @sorted = @urls[
+ sort {
+ defined($priority{$idx[$a]}) ?
+ defined($priority{$idx[$b]}) ?
+ $priority{$idx[$a]} <=> $priority{$idx[$b]} :
+ -1 :
+ defined($priority{$idx[$b]}) ? 1 :
+ $idx[$a] <=> $idx[$b]
+ } 0 .. $#idx
+ ];
+ @urls = @sorted;
+ undef(@sorted);
+ undef(@idx);
+
+ if ($Opts{HTML}) {
+
+ # Print a summary
+ print <<'EOF';
+</h3>
+<p><em>There are issues with the URLs listed below. The table summarizes the
+issues and suggested actions by HTTP response status code.</em></p>
+<table class="report" border="1" summary="List of issues and suggested actions.">
+<thead>
+<tr>
+<th>Code</th>
+<th>Occurrences</th>
+<th>What to do</th>
+</tr>
+</thead>
+<tbody>
+EOF
+ for my $code (sort(keys(%code_summary))) {
+ printf('<tr%s>', &bgcolor($code));
+ printf('<td><a href="#d%scode_%s">%s</a></td>',
+ $doc_count, $code, http_rc($code));
+ printf('<td>%s</td>', $code_summary{$code});
+ printf('<td>%s</td>', $todo{$code});
+ print "</tr>\n";
+ }
+ print "</tbody>\n</table>\n";
+ }
+ else {
+ print(':');
+ }
+ &show_link_report($links, $results, $broken, $redirects, \@urls, 1,
+ \%todo);
+ }
+
+ # Show directory redirects
+ if ($Opts{Dir_Redirects} && ($#dir_redirect_urls > -1)) {
+ print_doc_header();
+ print('<h3>') if $Opts{HTML};
+ print("\nList of redirects");
+ print(
+ "</h3>\n<p>The links below are not broken, but the document does not use the exact URL, and the links were redirected. It may be a good idea to link to the final location, for the sake of speed.</p>"
+ ) if $Opts{HTML};
+ &show_link_report($links, $results, $broken, $redirects,
+ \@dir_redirect_urls);
+ }
+
+ return;
+}
+
+###############################################################################
+
+################
+# Global stats #
+################
+
+sub global_stats ()
+{
+ my $stop = &get_timestamp();
+ my $n_docs =
+ ($doc_count <= $Opts{Max_Documents}) ? $doc_count :
+ $Opts{Max_Documents};
+ return sprintf(
+ 'Checked %d document%s in %s seconds.',
+ $n_docs,
+ ($n_docs == 1) ? '' : 's',
+ &time_diff($timestamp, $stop)
+ );
+}
+
+##################
+# HTML interface #
+##################
+
+sub html_header ($$)
+{
+ my ($uri, $cookie) = @_;
+
+ my $title = defined($uri) ? $uri : '';
+ $title = ': ' . $title if ($title =~ /\S/);
+
+ my $headers = '';
+ if (!$Opts{Command_Line}) {
+ $headers .= "Cache-Control: no-cache\nPragma: no-cache\n" if $uri;
+ $headers .= "Content-Type: text/html; charset=utf-8\n";
+ $headers .= "Set-Cookie: $cookie\n" if $cookie;
+
+ # mod_perl 1.99_05 doesn't seem to like it if the "\n\n" isn't in the same
+ # print() statement as the last header
+ $headers .= "Content-Language: en\n\n";
+ }
+
+ my $onload = $uri ? '' :
+ ' onload="if(document.getElementById){document.getElementById(\'uri_1\').focus()}"';
+
+ print $headers, $DocType, "
+<html lang=\"en\" xmlns=\"http://www.w3.org/1999/xhtml\" xml:lang=\"en\">
+<head>
+<title>W3C Link Checker", &encode($title), "</title>
+", $Head, "</head>
+<body", $onload, '>';
+ &banner($title);
+ return;
+}
+
+sub banner ($)
+{
+ my $tagline = "Check links and anchors in Web pages or full Web sites";
+
+ printf(
+ <<'EOF', URI->new_abs("../images/no_w3c.png", $Cfg{Doc_URI}), $tagline);
+<div id="banner"><h1 id="title"><a href="http://www.w3.org/" title="W3C"><img alt="W3C" id="logo" src="%s" width="110" height="61" /></a>
+<a href="checklink"><span>Link Checker</span></a></h1>
+<p id="tagline">%s</p></div>
+<div id="main">
+EOF
+ return;
+}
+
+sub status_icon($)
+{
+ my ($code) = @_;
+ my $icon_type;
+ my $r = HTTP::Response->new($code);
+ if ($r->is_success()) {
+ # a success that is still reported here means broken fragments => error
+ $icon_type = 'error';
+ }
+ elsif (&informational($code)) {
+ $icon_type = 'info';
+ }
+ elsif ($code == 300) {
+ $icon_type = 'info';
+ }
+ elsif ($code == 401) {
+ $icon_type = 'error';
+ }
+ elsif ($r->is_redirect()) {
+ $icon_type = 'warning';
+ }
+ elsif ($r->is_error()) {
+ $icon_type = 'error';
+ }
+ else {
+ $icon_type = 'error';
+ }
+ return sprintf('<span class="err_type"><img src="%s" alt="%s" /></span>',
+ URI->new_abs("../images/info_icons/$icon_type.png", $Cfg{Doc_URI}),
+ $icon_type);
+}
+
+sub bgcolor ($)
+{
+ my ($code) = @_;
+ my $class;
+ my $r = HTTP::Response->new($code);
+ if ($r->is_success()) {
+ return '';
+ }
+ elsif ($code == RC_ROBOTS_TXT() || $code == RC_IP_DISALLOWED()) {
+ $class = 'dubious';
+ }
+ elsif ($code == 300) {
+ $class = 'multiple';
+ }
+ elsif ($code == 401) {
+ $class = 'unauthorized';
+ }
+ elsif ($r->is_redirect()) {
+ $class = 'redirect';
+ }
+ elsif ($r->is_error()) {
+ $class = 'broken';
+ }
+ else {
+ $class = 'broken';
+ }
+ return (' class="' . $class . '"');
+}
+
+sub show_url ($)
+{
+ my ($url) = @_;
+ return sprintf('<a href="%s">%s</a>', (&encode($url)) x 2);
+}
+
+sub html_footer ()
+{
+ printf("<p>%s</p>\n", &global_stats())
+ if ($doc_count > 0 && !$Opts{Quiet});
+ if (!$doc_count) {
+ print <<'EOF';
+<div class="intro">
+ <p>
+ This Link Checker looks for issues in links, anchors and referenced objects
+ in a Web page, CSS style sheet, or recursively on a whole Web site. For
+ best results, it is recommended to first ensure that the documents checked
+ use Valid <a href="http://validator.w3.org/">(X)HTML Markup</a> and
+ <a href="http://jigsaw.w3.org/css-validator/">CSS</a>. The Link Checker is
+ part of the W3C's <a href="http://www.w3.org/QA/Tools/">validators and
+ Quality Web tools</a>.
+ </p>
+</div>
+EOF
+ }
+ printf(<<'EOF', $Cfg{Doc_URI}, $Cfg{Doc_URI}, $PACKAGE, $REVISION);
+</div><!-- main -->
+<ul class="navbar" id="menu">
+ <li><a href="%s" accesskey="3" title="Documentation for this Link Checker Service">Docs</a></li>
+ <li><a href="http://search.cpan.org/dist/W3C-LinkChecker/" accesskey="2" title="Download the source / Install this service">Download</a></li>
+ <li><a href="%s#csb" title="feedback: comments, suggestions and bugs" accesskey="4">Feedback</a></li>
+ <li><a href="http://validator.w3.org/" title="Validate your markup with the W3C Markup Validation Service">Validator</a></li>
+</ul>
+<div>
+<address>
+%s<br /> %s
+</address>
+</div>
+</body>
+</html>
+EOF
+ return;
+}
+
+sub print_form (\%$$)
+{
+ my ($params, $cookie, $check_num) = @_;
+
+ # Split params on \0, see CGI's docs on Vars()
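+ # (CGI's Vars() joins multiple values for the same parameter with "\0";
+ # we keep only the first value, e.g. "a\0b" becomes "a")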
+ while (my ($key, $value) = each(%$params)) {
+ if ($value) {
+ my @vals = split(/\0/, $value, 2);
+ $params->{$key} = $vals[0];
+ }
+ }
+
+ # Override undefined values from the cookie, if we got one.
+ my $valid_cookie = 0;
+ if ($cookie) {
+ my %cookie_values = $cookie->value();
+ if (!$cookie_values{clear})
+ { # XXX no easy way to check if cookie expired?
+ $valid_cookie = 1;
+ while (my ($key, $value) = each(%cookie_values)) {
+ $params->{$key} = $value unless defined($params->{$key});
+ }
+ }
+ }
+
+ my $chk = ' checked="checked"';
+ $params->{hide_type} = 'all' unless $params->{hide_type};
+
+ my $requested_uri = &encode($params->{uri} || '');
+ my $sum = $params->{summary} ? $chk : '';
+ my $red = $params->{hide_redirects} ? $chk : '';
+ my $all = ($params->{hide_type} ne 'dir') ? $chk : '';
+ my $dir = $all ? '' : $chk;
+ my $acc = $params->{no_accept_language} ? $chk : '';
+ my $ref = $params->{no_referer} ? $chk : '';
+ my $rec = $params->{recursive} ? $chk : '';
+ my $dep = &encode($params->{depth} || '');
+
+ my $cookie_options = '';
+ if ($valid_cookie) {
+ $cookie_options = "
+ <label for=\"cookie1_$check_num\"><input type=\"radio\" id=\"cookie1_$check_num\" name=\"cookie\" value=\"nochanges\" checked=\"checked\" /> Don't modify saved options</label>
+ <label for=\"cookie2_$check_num\"><input type=\"radio\" id=\"cookie2_$check_num\" name=\"cookie\" value=\"set\" /> Save these options</label>
+ <label for=\"cookie3_$check_num\"><input type=\"radio\" id=\"cookie3_$check_num\" name=\"cookie\" value=\"clear\" /> Clear saved options</label>";
+ }
+ else {
+ $cookie_options = "
+ <label for=\"cookie_$check_num\"><input type=\"checkbox\" id=\"cookie_$check_num\" name=\"cookie\" value=\"set\" /> Save options in a <a href=\"http://www.w3.org/Protocols/rfc2109/rfc2109\">cookie</a></label>";
+ }
+
+ print "<form action=\"", $Opts{_Self_URI},
+ "\" method=\"get\" onsubmit=\"return uriOk($check_num)\" accept-charset=\"UTF-8\">
+<p><label for=\"uri_$check_num\">Enter the address (<a href=\"http://www.w3.org/Addressing/\">URL</a>)
+of a document that you would like to check:</label></p>
+<p><input type=\"text\" size=\"50\" id=\"uri_$check_num\" name=\"uri\" value=\"",
+ $requested_uri, "\" /></p>
+<fieldset id=\"extra_opt_uri_$check_num\" class=\"moreoptions\">
+ <legend class=\"toggletext\">More Options</legend>
+ <div class=\"options\">
+ <p>
+ <label for=\"summary_$check_num\"><input type=\"checkbox\" id=\"summary_$check_num\" name=\"summary\" value=\"on\"",
+ $sum, " /> Summary only</label>
+ <br />
+ <label for=\"hide_redirects_$check_num\"><input type=\"checkbox\" id=\"hide_redirects_$check_num\" name=\"hide_redirects\" value=\"on\"",
+ $red,
+ " /> Hide <a href=\"http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.3\">redirects</a>:</label>
+ <label for=\"hide_type_all_$check_num\"><input type=\"radio\" id=\"hide_type_all_$check_num\" name=\"hide_type\" value=\"all\"",
+ $all, " /> all</label>
+ <label for=\"hide_type_dir_$check_num\"><input type=\"radio\" id=\"hide_type_dir_$check_num\" name=\"hide_type\" value=\"dir\"",
+ $dir, " /> for directories only</label>
+ <br />
+ <label for=\"no_accept_language_$check_num\"><input type=\"checkbox\" id=\"no_accept_language_$check_num\" name=\"no_accept_language\" value=\"on\"",
+ $acc,
+ " /> Don't send the <tt><a href=\"http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.4\">Accept-Language</a></tt> header</label>
+ <br />
+ <label for=\"no_referer_$check_num\"><input type=\"checkbox\" id=\"no_referer_$check_num\" name=\"no_referer\" value=\"on\"",
+ $ref,
+ " /> Don't send the <tt><a href=\"http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.36\">Referer</a></tt> header</label>
+ <br />
+ <label title=\"Check linked documents recursively (maximum: ",
+ $Opts{Max_Documents},
+ " documents)\" for=\"recursive_$check_num\"><input type=\"checkbox\" id=\"recursive_$check_num\" name=\"recursive\" value=\"on\"",
+ $rec, " /> Check linked documents recursively</label>,
+ <label title=\"Depth of the recursion (-1 is the default and means unlimited)\" for=\"depth_$check_num\">recursion depth: <input type=\"text\" size=\"3\" maxlength=\"3\" id=\"depth_$check_num\" name=\"depth\" value=\"",
+ $dep, "\" /></label>
+ <br /><br />", $cookie_options, "
+ </p>
+ </div>
+</fieldset>
+<p class=\"submit_button\"><input type=\"submit\" name=\"check\" value=\"Check\" /></p>
+</form>
+<div class=\"intro\" id=\"don_program\"></div>
+<script type=\"text/javascript\" src=\"http://www.w3.org/QA/Tools/don_prog.js\"></script>
+";
+ return;
+}
+
+sub encode (@)
+{
+ return $Opts{HTML} ? HTML::Entities::encode(@_) : @_;
+}
+
+sub hprintf (@)
+{
+ print_doc_header();
+ if (!$Opts{HTML}) {
+ # can have undef values here; avoid useless warning. E.g.,
+ # Error: -1 Forbidden by robots.txt
+ # Use of uninitialized value $_[2] in printf at /usr/local/bin/checklink line 3245.
+ # and
+ # Error: 404 File `/u/karl/gnu/src/akarl/doc/dejagnu.html' does not exist
+ # Use of uninitialized value $_[2] in printf at /usr/local/bin/checklink line 3245.
+ my @args = ();
+ for my $a (@_) {
+ push(@args, defined $a ? $a : "");
+ }
+ printf(@args);
+ }
+ else {
+ print HTML::Entities::encode(sprintf($_[0], @_[1 .. @_ - 1]));
+ }
+ return;
+}
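+
+# Illustrative example: hprintf("Error: %d %s\n", 404, "Not Found") prints
+# the formatted text as-is in command-line mode, and HTML-escaped via
+# HTML::Entities::encode when $Opts{HTML} is set.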
+
+# Print the document header, if it hasn't been printed already.
+# This is invoked before most other output operations, in order
+# to enable quiet processing that doesn't clutter the output with
+# "Processing..." messages when nothing else will be reported.
+sub print_doc_header ()
+{
+ if (defined($doc_header)) {
+ print $doc_header;
+ undef($doc_header);
+ }
+}
+
+# Local Variables:
+# mode: perl
+# indent-tabs-mode: nil
+# cperl-indent-level: 4
+# cperl-continued-statement-offset: 4
+# cperl-brace-offset: -4
+# perl-indent-level: 4
+# End:
+# ex: ts=4 sw=4 et
-This is automake-history.info, produced by makeinfo version 6.5 from
+This is automake-history.info, produced by makeinfo version 6.7 from
automake-history.texi.
This manual describes (part of) the history of GNU Automake, a program
and with no Back-Cover Texts. A copy of the license is included in
the section entitled "GNU Free Documentation License."
+INFO-DIR-SECTION Software development
+START-INFO-DIR-ENTRY
+* Automake-history: (automake-history). History of Automake development.
+END-INFO-DIR-ENTRY
+
\1f
File: automake-history.info, Node: Top, Next: Timeline, Up: (dir)
to the Gnits folks.
Gnits was (and still is) totally informal, just a few GNU friends
- who Franc,ois Pinard knew, who were all interested in making a
+ who François Pinard knew, who were all interested in making a
common infrastructure for GNU projects, and shared a similar
outlook on how to do it. So they were able to make some progress.
It came along with Autoconf and extensions thereof, and then
'automake'; it has lasted pretty well.
AutoMake is renamed to Automake (Tom seems to recall it was
- Franc,ois Pinard's doing).
+ François Pinard's doing).
0.25 fixes a Perl 4 portability bug.
1995-12-18 Jim Meyering starts using Automake in GNU Textutils.
-1995-12-31 Franc,ois Pinard starts using Automake in GNU tar.
+1995-12-31 François Pinard starts using Automake in GNU tar.
1996-01-03 Automake 0.26
1996-01-03 Automake 0.27
- Of the many changes and suggestions sent by Franc,ois Pinard and
+ Of the many changes and suggestions sent by François Pinard and
included in 0.26, perhaps the most important is the advice that to
ease customization a user rule or variable definition should always
override an Automake rule or definition.
'EXTRA_PROGRAMS' finally replaces 'AM_PROGRAMS'.
- All the third-party Autoconf macros, written mostly by Franc,ois
+ All the third-party Autoconf macros, written mostly by François
Pinard (and later Jim Meyering), are distributed in Automake's
hand-written 'aclocal.m4' file. Package maintainers are expected
to extract the necessary macros from this file. (In previous
Libtool is fully supported using '*_LTLIBRARIES'.
- The missing script is introduced by Franc,ois Pinard; it is meant
- to be a better solution than 'AM_MAINTAINER_MODE' (*note
+ The missing script is introduced by François Pinard; it is meant to
+ be a better solution than 'AM_MAINTAINER_MODE' (*note
maintainer-mode: (automake)maintainer-mode.).
Conditionals support was implemented by Ian Lance Taylor. At the
\1f
Tag Table:
-Node: Top\7f702
-Node: Timeline\7f2236
-Node: Dependency Tracking Evolution\7f33697
-Node: First Take on Dependencies\7f34539
-Node: Dependencies As Side Effects\7f37197
-Node: Dependencies for the User\7f39256
-Node: Techniques for Dependencies\7f44286
-Node: Recommendations for Tool Writers\7f45984
-Node: Future Directions for Dependencies\7f46703
-Node: Releases\7f47173
-Node: Copying This Manual\7f52560
-Node: GNU Free Documentation License\7f52787
+Node: Top\7f854
+Node: Timeline\7f2388
+Node: Dependency Tracking Evolution\7f33849
+Node: First Take on Dependencies\7f34691
+Node: Dependencies As Side Effects\7f37349
+Node: Dependencies for the User\7f39408
+Node: Techniques for Dependencies\7f44438
+Node: Recommendations for Tool Writers\7f46136
+Node: Future Directions for Dependencies\7f46855
+Node: Releases\7f47325
+Node: Copying This Manual\7f52712
+Node: GNU Free Documentation License\7f52939
\1f
End Tag Table
+
+\1f
+Local Variables:
+coding: utf-8
+End:
@end quotation
@end copying
+@dircategory Software development
+@direntry
+* Automake-history: (automake-history). History of Automake development.
+@end direntry
+
@titlepage
@title Brief History of Automake
@author David MacKenzie
-This is automake.info, produced by makeinfo version 6.5 from
+This is automake.info, produced by makeinfo version 6.7 from
automake.texi.
-This manual is for GNU Automake (version 1.16.2, 1 February 2020), a
+This manual is for GNU Automake (version 1.16.3, 19 November 2020), a
program that creates GNU standards-compliant Makefiles from template
files.
\1f
Indirect:
-automake.info-1: 1084
-automake.info-2: 301844
+automake.info-1: 1085
+automake.info-2: 301619
\1f
Tag Table:
(Indirect)
-Node: Top\7f1084
-Node: Introduction\7f14414
-Ref: Introduction-Footnote-1\7f15823
-Node: Autotools Introduction\7f15982
-Node: GNU Build System\7f17363
-Node: Use Cases\7f20111
-Node: Basic Installation\7f22239
-Node: Standard Targets\7f25825
-Node: Standard Directory Variables\7f27428
-Node: Standard Configuration Variables\7f29285
-Node: config.site\7f30644
-Node: VPATH Builds\7f32070
-Node: Two-Part Install\7f36128
-Node: Cross-Compilation\7f38572
-Node: Renaming\7f41547
-Node: DESTDIR\7f42705
-Node: Preparing Distributions\7f44887
-Node: Dependency Tracking\7f47258
-Node: Nested Packages\7f49366
-Node: Why Autotools\7f50882
-Node: Hello World\7f52524
-Ref: amhello Explained\7f52946
-Node: Creating amhello\7f53118
-Node: amhello's configure.ac Setup Explained\7f58473
-Node: amhello's Makefile.am Setup Explained\7f63393
-Node: Generalities\7f67093
-Node: General Operation\7f67787
-Node: Strictness\7f71211
-Node: Uniform\7f72823
-Node: Length Limitations\7f77762
-Node: Canonicalization\7f80072
-Node: User Variables\7f81144
-Node: Auxiliary Programs\7f82634
-Node: Examples\7f86373
-Node: Complete\7f87243
-Node: true\7f89348
-Node: automake Invocation\7f91839
-Ref: Invoking automake\7f91994
-Node: configure\7f99771
-Node: Requirements\7f100700
-Node: Optional\7f105938
-Node: aclocal Invocation\7f115340
-Ref: Invoking aclocal\7f115501
-Node: aclocal Options\7f118555
-Node: Macro Search Path\7f122281
-Ref: ACLOCAL_PATH\7f126655
-Node: Extending aclocal\7f128225
-Node: Local Macros\7f131949
-Node: Serials\7f135933
-Node: Future of aclocal\7f141153
-Node: Macros\7f143542
-Node: Public Macros\7f144083
-Ref: Modernize AM_INIT_AUTOMAKE invocation\7f145706
-Node: Obsolete Macros\7f150187
-Node: Private Macros\7f151529
-Node: Directories\7f152979
-Node: Subdirectories\7f154574
-Node: Conditional Subdirectories\7f157966
-Node: SUBDIRS vs DIST_SUBDIRS\7f159647
-Node: Subdirectories with AM_CONDITIONAL\7f161285
-Node: Subdirectories with AC_SUBST\7f162479
-Node: Unconfigured Subdirectories\7f163306
-Node: Alternative\7f166761
-Ref: Alternative-Footnote-1\7f168953
-Node: Subpackages\7f169078
-Node: Programs\7f172424
-Node: A Program\7f173966
-Node: Program Sources\7f174689
-Node: Linking\7f176594
-Node: Conditional Sources\7f180239
-Node: Conditional Programs\7f183167
-Node: A Library\7f185057
-Node: A Shared Library\7f187734
-Node: Libtool Concept\7f188738
-Node: Libtool Libraries\7f190844
-Node: Conditional Libtool Libraries\7f192578
-Node: Conditional Libtool Sources\7f195035
-Node: Libtool Convenience Libraries\7f196422
-Node: Libtool Modules\7f199850
-Node: Libtool Flags\7f201166
-Node: LTLIBOBJS\7f203077
-Node: Libtool Issues\7f203712
-Node: Error required file ltmain.sh not found\7f204053
-Node: Objects created both with libtool and without\7f205274
-Node: Program and Library Variables\7f207197
-Ref: Program and Library Variables-Footnote-1\7f218575
-Node: Default _SOURCES\7f218650
-Node: LIBOBJS\7f221127
-Node: Program Variables\7f226354
-Node: Yacc and Lex\7f229917
-Ref: Yacc and Lex-Footnote-1\7f235545
-Node: C++ Support\7f235808
-Node: Objective C Support\7f236688
-Node: Objective C++ Support\7f237661
-Node: Unified Parallel C Support\7f238683
-Node: Assembly Support\7f239679
-Node: Fortran 77 Support\7f240851
-Ref: Fortran 77 Support-Footnote-1\7f242536
-Node: Preprocessing Fortran 77\7f242739
-Node: Compiling Fortran 77 Files\7f243343
-Node: Mixing Fortran 77 With C and C++\7f243955
-Ref: Mixing Fortran 77 With C and C++-Footnote-1\7f246278
-Node: How the Linker is Chosen\7f246585
-Node: Fortran 9x Support\7f248124
-Node: Compiling Fortran 9x Files\7f249170
-Node: Java Support with gcj\7f249806
-Node: Vala Support\7f251287
-Node: Support for Other Languages\7f253372
-Node: Dependencies\7f254080
-Node: EXEEXT\7f255967
-Node: Other Objects\7f258207
-Node: Scripts\7f258799
-Node: Headers\7f261658
-Node: Data\7f263461
-Node: Sources\7f264146
-Node: Built Sources Example\7f267091
-Node: Other GNU Tools\7f274279
-Node: Emacs Lisp\7f274808
-Node: gettext\7f276906
-Node: Libtool\7f277594
-Node: Java\7f277853
-Node: Python\7f280512
-Node: Documentation\7f285594
-Node: Texinfo\7f285898
-Node: Man Pages\7f293097
-Node: Install\7f296222
-Node: Basics of Installation\7f296926
-Node: The Two Parts of Install\7f298456
-Node: Extending Installation\7f299996
-Node: Staged Installs\7f301844
-Node: Install Rules for the User\7f303257
-Node: Clean\7f303815
-Node: Dist\7f305987
-Node: Basics of Distribution\7f306491
-Node: Fine-grained Distribution Control\7f309722
-Node: The dist Hook\7f310649
-Node: Checking the Distribution\7f313143
-Node: The Types of Distributions\7f319496
-Node: Tests\7f322311
-Node: Generalities about Testing\7f323507
-Node: Simple Tests\7f326444
-Node: Scripts-based Testsuites\7f326825
-Ref: Testsuite progress on console\7f329208
-Ref: Simple tests and color-tests\7f330310
-Node: Serial Test Harness\7f334333
-Node: Parallel Test Harness\7f336438
-Ref: Basics of test metadata\7f336944
-Node: Custom Test Drivers\7f345676
-Node: Overview of Custom Test Drivers Support\7f345967
-Node: Declaring Custom Test Drivers\7f349019
-Node: API for Custom Test Drivers\7f350441
-Node: Command-line arguments for test drivers\7f351217
-Node: Log files generation and test results recording\7f353931
-Node: Testsuite progress output\7f358146
-Node: Using the TAP test protocol\7f359568
-Node: Introduction to TAP\7f359930
-Node: Use TAP with the Automake test harness\7f361742
-Node: Incompatibilities with other TAP parsers and drivers\7f367165
-Node: Links and external resources on TAP\7f368566
-Node: DejaGnu Tests\7f370190
-Node: Install Tests\7f372317
-Node: Rebuilding\7f372627
-Node: Options\7f376305
-Node: Options generalities\7f376606
-Node: List of Automake options\7f378387
-Ref: tar-formats\7f385096
-Node: Miscellaneous\7f388609
-Node: Tags\7f388954
-Node: Suffixes\7f392071
-Node: Include\7f393703
-Node: Conditionals\7f395438
-Node: Usage of Conditionals\7f396296
-Node: Limits of Conditionals\7f399654
-Node: Silencing Make\7f400839
-Node: Make verbosity\7f401190
-Ref: Make verbosity-Footnote-1\7f402512
-Node: Tricks For Silencing Make\7f402586
-Node: Automake Silent Rules\7f405093
-Node: Gnits\7f412079
-Node: Not Enough\7f414559
-Node: Extending\7f415006
-Node: Third-Party Makefiles\7f420040
-Node: Distributing\7f426981
-Node: API Versioning\7f427630
-Node: Upgrading\7f430335
-Node: FAQ\7f432380
-Node: CVS\7f433504
-Node: maintainer-mode\7f441907
-Node: Wildcards\7f446079
-Node: Limitations on File Names\7f449518
-Node: Errors with distclean\7f452148
-Node: Flag Variables Ordering\7f457097
-Node: Renamed Objects\7f464933
-Node: Per-Object Flags\7f466524
-Node: Multiple Outputs\7f469534
-Node: Hard-Coded Install Paths\7f481489
-Node: Debugging Make Rules\7f486654
-Ref: Debugging Make Rules-Footnote-1\7f488815
-Node: Reporting Bugs\7f488993
-Node: Copying This Manual\7f490942
-Node: GNU Free Documentation License\7f491172
-Node: Indices\7f516493
-Node: Macro Index\7f516782
-Node: Variable Index\7f522421
-Node: General Index\7f553727
+Node: Top\7f1085
+Node: Introduction\7f14340
+Ref: Introduction-Footnote-1\7f15749
+Node: Autotools Introduction\7f15908
+Node: GNU Build System\7f17290
+Node: Use Cases\7f20038
+Node: Basic Installation\7f22166
+Node: Standard Targets\7f25752
+Node: Standard Directory Variables\7f27355
+Node: Standard Configuration Variables\7f29212
+Node: config.site\7f30571
+Node: VPATH Builds\7f31997
+Node: Two-Part Install\7f36055
+Node: Cross-Compilation\7f38499
+Node: Renaming\7f41474
+Node: DESTDIR\7f42632
+Node: Preparing Distributions\7f44814
+Node: Dependency Tracking\7f47184
+Node: Nested Packages\7f49292
+Node: Why Autotools\7f50808
+Node: Hello World\7f52450
+Ref: amhello Explained\7f52872
+Node: Creating amhello\7f53044
+Node: amhello's configure.ac Setup Explained\7f58399
+Node: amhello's Makefile.am Setup Explained\7f63319
+Node: Generalities\7f67019
+Node: General Operation\7f67713
+Node: Strictness\7f71137
+Ref: Gnits\7f71271
+Node: Uniform\7f75176
+Node: Length Limitations\7f80115
+Node: Canonicalization\7f82426
+Node: User Variables\7f83498
+Node: Auxiliary Programs\7f84988
+Node: Examples\7f88925
+Node: Complete\7f89795
+Node: true\7f91900
+Node: automake Invocation\7f94391
+Ref: Invoking automake\7f94546
+Node: configure\7f103161
+Node: Requirements\7f104090
+Node: Optional\7f109328
+Node: aclocal Invocation\7f118730
+Ref: Invoking aclocal\7f118891
+Node: aclocal Options\7f121945
+Node: Macro Search Path\7f125671
+Ref: ACLOCAL_PATH\7f130045
+Node: Extending aclocal\7f131615
+Node: Local Macros\7f135339
+Node: Serials\7f139323
+Node: Future of aclocal\7f144543
+Node: Macros\7f146932
+Node: Public Macros\7f147473
+Ref: Modernize AM_INIT_AUTOMAKE invocation\7f149096
+Node: Obsolete Macros\7f153578
+Node: Private Macros\7f154920
+Node: Directories\7f156370
+Node: Subdirectories\7f157965
+Node: Conditional Subdirectories\7f161357
+Node: SUBDIRS vs DIST_SUBDIRS\7f163038
+Node: Subdirectories with AM_CONDITIONAL\7f164676
+Node: Subdirectories with AC_SUBST\7f165870
+Node: Unconfigured Subdirectories\7f166697
+Node: Alternative\7f170152
+Ref: Alternative-Footnote-1\7f172304
+Node: Subpackages\7f172429
+Node: Programs\7f175775
+Node: A Program\7f177317
+Node: Program Sources\7f178040
+Node: Linking\7f179945
+Node: Conditional Sources\7f183590
+Node: Conditional Programs\7f186518
+Node: A Library\7f188408
+Node: A Shared Library\7f191085
+Node: Libtool Concept\7f192089
+Node: Libtool Libraries\7f194179
+Node: Conditional Libtool Libraries\7f195913
+Node: Conditional Libtool Sources\7f198370
+Node: Libtool Convenience Libraries\7f199757
+Node: Libtool Modules\7f203185
+Node: Libtool Flags\7f204501
+Node: LTLIBOBJS\7f206412
+Node: Libtool Issues\7f207047
+Node: Error required file ltmain.sh not found\7f207388
+Node: Objects created both with libtool and without\7f208609
+Node: Program and Library Variables\7f210532
+Ref: Program and Library Variables-Footnote-1\7f221910
+Node: Default _SOURCES\7f221985
+Node: LIBOBJS\7f224462
+Node: Program Variables\7f229689
+Node: Yacc and Lex\7f233252
+Ref: Yacc and Lex-Footnote-1\7f238880
+Node: C++ Support\7f239143
+Node: Objective C Support\7f240023
+Node: Objective C++ Support\7f240996
+Node: Unified Parallel C Support\7f242018
+Node: Assembly Support\7f243014
+Node: Fortran 77 Support\7f244186
+Ref: Fortran 77 Support-Footnote-1\7f245871
+Node: Preprocessing Fortran 77\7f246074
+Node: Compiling Fortran 77 Files\7f246678
+Node: Mixing Fortran 77 With C and C++\7f247290
+Ref: Mixing Fortran 77 With C and C++-Footnote-1\7f249613
+Node: How the Linker is Chosen\7f249921
+Node: Fortran 9x Support\7f251460
+Node: Compiling Fortran 9x Files\7f252506
+Node: Java Support with gcj\7f253142
+Node: Vala Support\7f254623
+Node: Support for Other Languages\7f256896
+Node: Dependencies\7f257604
+Node: EXEEXT\7f259491
+Node: Other Objects\7f261731
+Node: Scripts\7f262323
+Node: Headers\7f265182
+Node: Data\7f266985
+Node: Sources\7f267670
+Node: Built Sources Example\7f270685
+Node: Other GNU Tools\7f277873
+Node: Emacs Lisp\7f278402
+Node: gettext\7f280500
+Node: Libtool\7f281188
+Node: Java\7f281447
+Node: Python\7f284106
+Node: Documentation\7f289203
+Node: Texinfo\7f289507
+Node: Man Pages\7f296705
+Node: Install\7f299830
+Node: Basics of Installation\7f301619
+Node: The Two Parts of Install\7f303149
+Node: Extending Installation\7f304689
+Node: Staged Installs\7f305453
+Node: Install Rules for the User\7f306866
+Node: Clean\7f307424
+Node: Dist\7f309596
+Node: Basics of Distribution\7f310100
+Node: Fine-grained Distribution Control\7f313443
+Node: The dist Hook\7f314370
+Node: Checking the Distribution\7f316916
+Node: The Types of Distributions\7f324223
+Node: Tests\7f327038
+Node: Generalities about Testing\7f328234
+Node: Simple Tests\7f331171
+Node: Scripts-based Testsuites\7f331552
+Ref: Testsuite progress on console\7f333925
+Ref: Simple tests and color-tests\7f336018
+Node: Serial Test Harness\7f340041
+Node: Parallel Test Harness\7f342146
+Ref: Basics of test metadata\7f342652
+Node: Custom Test Drivers\7f351384
+Node: Overview of Custom Test Drivers Support\7f351675
+Node: Declaring Custom Test Drivers\7f354727
+Node: API for Custom Test Drivers\7f356149
+Node: Command-line arguments for test drivers\7f356925
+Node: Log files generation and test results recording\7f359639
+Node: Testsuite progress output\7f363854
+Node: Using the TAP test protocol\7f365276
+Node: Introduction to TAP\7f365638
+Node: Use TAP with the Automake test harness\7f367422
+Node: Incompatibilities with other TAP parsers and drivers\7f372845
+Node: Links and external resources on TAP\7f374246
+Node: DejaGnu Tests\7f375838
+Node: Install Tests\7f377966
+Node: Rebuilding\7f378276
+Node: Options\7f381954
+Node: Options generalities\7f382255
+Node: List of Automake options\7f384036
+Ref: tar-formats\7f390771
+Node: Miscellaneous\7f394284
+Node: Tags\7f394629
+Node: Suffixes\7f397746
+Node: Include\7f399378
+Node: Conditionals\7f401113
+Node: Usage of Conditionals\7f401971
+Node: Limits of Conditionals\7f405329
+Node: Silencing Make\7f406514
+Node: Make verbosity\7f406870
+Ref: Make verbosity-Footnote-1\7f408192
+Node: Tricks For Silencing Make\7f408266
+Node: Automake Silent Rules\7f410773
+Node: Not Enough\7f417759
+Node: Extending\7f418215
+Node: Third-Party Makefiles\7f423249
+Node: Distributing\7f430314
+Node: API Versioning\7f430963
+Node: Upgrading\7f433668
+Node: FAQ\7f435713
+Node: CVS\7f436837
+Node: maintainer-mode\7f445282
+Node: Wildcards\7f449454
+Node: Limitations on File Names\7f452893
+Node: Errors with distclean\7f455523
+Node: Flag Variables Ordering\7f460472
+Node: Renamed Objects\7f468308
+Node: Per-Object Flags\7f469899
+Node: Multiple Outputs\7f472909
+Node: Hard-Coded Install Paths\7f485268
+Node: Debugging Make Rules\7f490433
+Ref: Debugging Make Rules-Footnote-1\7f492594
+Node: Reporting Bugs\7f492772
+Node: Copying This Manual\7f494583
+Node: GNU Free Documentation License\7f494813
+Node: Indices\7f520134
+Node: Macro Index\7f520423
+Node: Variable Index\7f526062
+Node: General Index\7f557783
\1f
End Tag Table
-This is automake.info, produced by makeinfo version 6.5 from
+This is automake.info, produced by makeinfo version 6.7 from
automake.texi.
-This manual is for GNU Automake (version 1.16.2, 1 February 2020), a
+This manual is for GNU Automake (version 1.16.3, 19 November 2020), a
program that creates GNU standards-compliant Makefiles from template
files.
GNU Automake
************
-This manual is for GNU Automake (version 1.16.2, 1 February 2020), a
+This manual is for GNU Automake (version 1.16.3, 19 November 2020), a
program that creates GNU standards-compliant Makefiles from template
files.
* Include:: Including extra files in an Automake template
* Conditionals:: Conditionals
* Silencing Make:: Obtain less verbose output from ‘make’
-* Gnits:: The effect of ‘--gnu’ and ‘--gnits’
* Not Enough:: When Automake is not Enough
* Distributing:: Distributing the Makefile.in
* API Versioning:: About compatibility between Automake versions
If you need some teaching material, more illustrations, or a less
‘automake’-centered continuation, some slides for this introduction are
available in Alexandre Duret-Lutz’s Autotools Tutorial
-(http://www.lrde.epita.fr/~adl/autotools.html). This chapter is the
+(https://www.lrde.epita.fr/~adl/autotools.html). This chapter is the
written version of the first part of his tutorial.
* Menu:
it additionally ensures most of the use cases presented so far work:
• It attempts a full compilation of the package (*note Basic
- Installation::), unpacking the newly constructed tarball, running
- ‘make’, ‘make check’, ‘make install’, as well as ‘make
+ Installation::): unpacking the newly constructed tarball, running
+ ‘make’, ‘make dvi’, ‘make check’, ‘make install’, as well as ‘make
installcheck’, and even ‘make dist’,
• it tests VPATH builds with read-only source tree (*note VPATH
Builds::),
• and it checks that ‘DESTDIR’ installations work (*note DESTDIR::).
All of these actions are performed in a temporary directory, so that
-no root privileges are required. Please note that the exact location
-and the exact structure of such a subdirectory (where the extracted
-sources are placed, how the temporary build and install directories are
-named and how deeply they are nested, etc.) is to be considered an
-implementation detail, which can change at any time; so do not rely on
-it.
+no root privileges are required. The exact location and the exact
+structure of such a subdirectory (where the extracted sources are
+placed, how the temporary build and install directories are named and
+how deeply they are nested, etc.) is to be considered an implementation
+detail, which can change at any time; so do not rely on it.
Releasing a package that fails ‘make distcheck’ means that one of the
scenarios we presented will not work and some users will be
does make some effort to accommodate those who wish to use it, but do
not want to use all the GNU conventions.
- To this end, Automake supports three levels of “strictness”—the
-strictness indicating how stringently Automake should check standards
-conformance.
+ To this end, Automake supports three levels of “strictness”—how
+stringently Automake should enforce conformance with GNU conventions.
+Each strictness level can be selected using an option of the same name;
+see *note Options::.
- The valid strictness levels are:
+ The strictness levels are:
+
+‘gnu’
+ This is the default level of strictness. Automake will check for
+ basic compliance with the GNU standards for software packaging.
+ *Note (standards)Top:: for full details of these standards.
+ Currently the following checks are made:
+
+ • The files ‘INSTALL’, ‘NEWS’, ‘README’, ‘AUTHORS’, and
+ ‘ChangeLog’, plus one of ‘COPYING.LIB’, ‘COPYING.LESSER’ or
+ ‘COPYING’, are required at the topmost directory of the
+ package.
+
+ If the ‘--add-missing’ option is given, ‘automake’ will add a
+ generic version of the ‘INSTALL’ file as well as the ‘COPYING’
+ file containing the text of the current version of the GNU
+ General Public License existing at the time of this Automake
+ release (version 3 as this is written,
+ <https://www.gnu.org/copyleft/gpl.html>). However, an
+ existing ‘COPYING’ file will never be overwritten by
+ ‘automake’.
+
+ • The options ‘no-installman’ and ‘no-installinfo’ are
+ prohibited.
+
+ Future versions of Automake will add more checks at this level of
+ strictness; it is advisable to be familiar with the precise
+ requirements of the GNU standards.
+
+ Future versions of Automake may, at this level of strictness,
+ require certain non-standard GNU tools to be available to
+ maintainer-only Makefile rules. For instance, in the future
+ ‘pathchk’ (*note (coreutils)pathchk invocation::) may be required
+ to run ‘make dist’.
‘foreign’
Automake will check for only those things that are absolutely
required for proper operation. For instance, whereas GNU standards
dictate the existence of a ‘NEWS’ file, it will not be required in
this mode. This strictness will also turn off some warnings by
- default (among them, portability warnings). The name comes from
- the fact that Automake is intended to be used for GNU programs;
- these relaxed rules are not the standard mode of operation.
-
-‘gnu’
- Automake will check—as much as possible—for compliance to the GNU
- standards for packages. This is the default.
+ default (among them, portability warnings).
‘gnits’
Automake will check for compliance to the as-yet-unwritten “Gnits
recommended that you avoid this option until such time as the Gnits
standard is actually published (which may never happen).
- *Note Gnits::, for more information on the precise implications of
-the strictness level.
+ Currently, ‘--gnits’ does all the checks that ‘--gnu’ does, and
+ checks the following as well:
+
+      • ‘make installcheck’ will check that the ‘--help’ and
+        ‘--version’ options really print a usage message and a version
+ string, respectively. This is the ‘std-options’ option (*note
+ Options::).
+
+ • ‘make dist’ will check to make sure the ‘NEWS’ file has been
+ updated to the current version.
+
+ • ‘VERSION’ is checked to make sure its format complies with
+ Gnits standards.
+
+ • If ‘VERSION’ indicates that this is an alpha release, and the
+ file ‘README-alpha’ appears in the topmost directory of a
+ package, then it is included in the distribution. This is
+ done in ‘--gnits’ mode, and no other, because this mode is the
+ only one where version number formats are constrained, and
+ hence the only mode where Automake can automatically determine
+ whether ‘README-alpha’ should be included.
+
+ • The file ‘THANKS’ is required.
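+
+   For example, a package that chooses not to follow all the GNU
+conventions can select the ‘foreign’ strictness in ‘configure.ac’ (a
+minimal sketch; putting ‘foreign’ in ‘AUTOMAKE_OPTIONS’ in the
+top-level ‘Makefile.am’ has the same effect):
+
+     AM_INIT_AUTOMAKE([foreign])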
\1f
File: automake.info, Node: Uniform, Next: Length Limitations, Prev: Strictness, Up: Generalities
Traditionally, most unix-like systems have a length limitation for the
command line arguments and environment contents when creating new
processes (see for example
-<http://www.in-ulm.de/~mascheck/various/argmax/> for an overview on this
-issue), which of course also applies to commands spawned by ‘make’.
+<https://www.in-ulm.de/~mascheck/various/argmax/> for an overview on
+this issue), which of course also applies to commands spawned by ‘make’.
POSIX requires this limit to be at least 4096 bytes, and most modern
systems have quite high limits (or are unlimited).
testsuite harness.
‘texinfo.tex’
- Not a program, this file is required for ‘make dvi’, ‘make ps’ and
- ‘make pdf’ to work when Texinfo sources are in the package. The
- latest version can be downloaded from
- <https://www.gnu.org/software/texinfo/>.
+ When Texinfo sources are in the package, this file is required for
+ ‘make dvi’, ‘make ps’ and ‘make pdf’. The latest version can be
+ downloaded from <https://www.gnu.org/software/texinfo/>. A working
+ TeX distribution, or at least a ‘tex’ program, is also required.
+ Furthermore, ‘make dist’ invokes ‘make dvi’, so these become
+ requirements for making a distribution with Texinfo sources.
‘ylwrap’
This program wraps ‘lex’ and ‘yacc’ to rename their output files.
‘--gnits’
Set the global strictness to ‘gnits’. For more information, see
- *note Gnits::.
+ *note Strictness::.
‘--gnu’
Set the global strictness to ‘gnu’. For more information, see
- *note Gnits::. This is the default strictness.
+ *note Strictness::. This is the default strictness.
‘--help’
Print a summary of the command line options and exit.
‘--version’
Print the version number of Automake and exit.
-‘-W CATEGORY’
-‘--warnings=CATEGORY’
- Output warnings falling in CATEGORY. CATEGORY can be one of:
+‘-W CATEGORY[,CATEGORY...]’
+‘--warnings=CATEGORY[,CATEGORY...]’
+ Output warnings about a CATEGORY of potential problems with the
+ package. CATEGORY can be any of:
+
+ ‘cross’
+ Constructs compromising the ability to cross-compile the
+ package.
‘gnu’
- warnings related to the GNU Coding Standards (*note
+ Minor deviations from the GNU Coding Standards (*note
(standards)Top::).
‘obsolete’
- obsolete features or constructions
+ Obsolete features or constructions.
‘override’
- user redefinitions of Automake rules or variables
+ Redefinitions of Automake rules or variables.
‘portability’
- portability issues (e.g., use of ‘make’ features that are
- known to be not portable)
+ Portability issues (e.g., use of ‘make’ features that are
+ known to be not portable).
+ ‘portability-recursive’
+ Recursive, or nested, Make variable expansions (‘$(foo$(x))’).
+ These are not universally supported, but are more portable
+ than the other non-portable constructs diagnosed by
+ ‘-Wportability’. These warnings are turned on by
+ ‘-Wportability’ but can then be turned off specifically by
+ ‘-Wno-portability-recursive’.
‘extra-portability’
- extra portability issues related to obscure tools. One
- example of such a tool is the Microsoft ‘lib’ archiver.
+ Extra portability issues, related to rarely-used tools such as
+ the Microsoft ‘lib’ archiver.
‘syntax’
- weird syntax, unused variables, typos
+ Questionable syntax, unused variables, typos, etc.
‘unsupported’
- unsupported or incomplete features
+ Unsupported or incomplete features.
‘all’
- all the warnings
+ Turn on all the above categories of warnings.
‘none’
- turn off all the warnings
+ Turn off all the above categories of warnings.
‘error’
- treat warnings as errors
+ Treat warnings as errors.
A category can be turned off by prefixing its name with ‘no-’. For
instance, ‘-Wno-syntax’ will hide the warnings about unused
variables.
- The categories output by default are ‘obsolete’, ‘syntax’ and
- ‘unsupported’. Additionally, ‘gnu’ and ‘portability’ are enabled
- in ‘--gnu’ and ‘--gnits’ strictness.
+ Warnings in the ‘gnu’, ‘obsolete’, ‘portability’, ‘syntax’, and
+ ‘unsupported’ categories are turned on by default. The ‘gnu’ and
+ ‘portability’ categories are turned off in ‘--foreign’ strictness.
Turning off ‘portability’ will also turn off ‘extra-portability’,
and similarly turning on ‘extra-portability’ will also turn on
‘portability’. However, turning on ‘portability’ or turning off
‘extra-portability’ will not affect the other category.
+ Unknown warning categories supplied as an argument to ‘-W’ will
+ themselves produce a warning, in the ‘unsupported’ category. This
+ warning is never treated as an error.
+
The environment variable ‘WARNINGS’ can contain a comma separated
- list of categories to enable. It will be taken into account before
- the command-line switches, this way ‘-Wnone’ will also ignore any
- warning category enabled by ‘WARNINGS’. This variable is also used
- by other tools like ‘autoconf’; unknown categories are ignored for
- this reason.
+ list of categories to enable. ‘-W’ settings on the command line
+ take precedence; for instance, ‘-Wnone’ also turns off any warning
+ categories enabled by ‘WARNINGS’.
+
+ Unknown warning categories named in ‘WARNINGS’ are silently
+ ignored.
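+
+     For example, the following illustrative invocation enables every
+     warning category except ‘extra-portability’ and turns the
+     remaining warnings into hard errors:
+
+          automake -Wall -Wno-extra-portability -Werror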
If the environment variable ‘AUTOMAKE_JOBS’ contains a positive
number, it is taken as the maximum number of Perl threads to use in
Silent Rules::).
‘AM_WITH_DMALLOC’
- Add support for the Dmalloc package (http://dmalloc.com/). If the
+ Add support for the Dmalloc package (https://dmalloc.com/). If the
user runs ‘configure’ with ‘--with-dmalloc’, then define
‘WITH_DMALLOC’ and add ‘-ldmalloc’ to ‘LIBS’.
7.3 An Alternative Approach to Subdirectories
=============================================
-If you’ve ever read Peter Miller’s excellent paper, Recursive Make
-Considered Harmful (http://miller.emu.id.au/pmiller/books/rmch/), the
-preceding sections on the use of make recursion will probably come as
-unwelcome advice. For those who haven’t read the paper, Miller’s main
-thesis is that recursive ‘make’ invocations are both slow and
-error-prone.
+If you’ve ever read Peter Miller’s excellent paper, ‘Recursive Make
+Considered Harmful’, the preceding sections on the use of make recursion
+will probably come as unwelcome advice. For those who haven’t read the
+paper, Miller’s main thesis is that recursive ‘make’ invocations are
+both slow and error-prone.
Automake provides sufficient cross-directory support (1) to enable
you to write a single ‘Makefile.am’ for a complex multi-directory
‘./configure’ is run: not all platforms support all kinds of libraries,
and users can explicitly select which libraries should be built.
(However the package’s maintainers can tune the default; *note The
-‘AC_PROG_LIBTOOL’ macro: (libtool)AC_PROG_LIBTOOL.)
+‘LT_INIT’ macro: (libtool)LT_INIT.)
Because object files for shared and static libraries must be compiled
differently, libtool is also used during compilation. Object files
---------- Footnotes ----------
(1) For example, the cfortran package
-(http://www-zeus.desy.de/~burow/cfortran/) addresses all of these
+(https://www-zeus.desy.de/~burow/cfortran/) addresses all of these
inter-language issues, and runs under nearly all Fortran 77, C and C++
compilers on nearly all platforms. However, ‘cfortran’ is not yet Free
Software, but it will be in the next major release.
=================
Automake provides initial support for Vala
-(<http://www.vala-project.org/>). This requires valac version 0.7.0 or
+(<https://www.vala-project.org/>). This requires valac version 0.7.0 or
later, and currently requires the user to use GNU ‘make’.
foo_SOURCES = foo.vala bar.vala zardoc.c
[ACTION-IF-NOT-FOUND]) Search for a Vala compiler in ‘PATH’. If it
is found, the variable ‘VALAC’ is set to point to it (see below for
more details). This macro takes three optional arguments. The
- first argument, if present, is the minimum version of the Vala
- compiler required to compile this package. If a compiler is found
- and satisfies MINIMUM-VERSION, then ACTION-IF-FOUND is run (this
- defaults to do nothing). Otherwise, ACTION-IF-NOT-FOUND is run.
- If ACTION-IF-NOT-FOUND is not specified, the default value is to
- print a warning in case no compiler is found, or if a too-old
- version of the compiler is found.
+ first argument, if present, is the minimum version of the Vala API
+ required to compile this package. For Vala releases, this is the
+ same as the major and minor release number; e.g., when ‘valac
+ --version’ reports ‘0.48.7’, ‘valac --api-version’ reports ‘0.48’.
+ If a compiler is found and satisfies MINIMUM-VERSION, then
+ ACTION-IF-FOUND is run (this defaults to do nothing). Otherwise,
+ ACTION-IF-NOT-FOUND is run. If ACTION-IF-NOT-FOUND is not
+ specified, the default value is to print a warning in case no
+ compiler is found, or if a too-old version of the compiler is
+ found.
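+
+     For example, a ‘configure.ac’ could require at least the 0.48 API
+     and stop with an error otherwise (an illustrative sketch; the
+     version number and the error message are arbitrary):
+
+          AM_PROG_VALAC([0.48], [],
+            [AC_MSG_ERROR([valac with API version 0.48 or newer is required])])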
There are a few variables that are used when compiling Vala sources:
dependency information.
The ‘BUILT_SOURCES’ variable is a workaround for this problem. A
-source file listed in ‘BUILT_SOURCES’ is made on ‘make all’ or ‘make
-check’ (or even ‘make install’) before other targets are processed.
-However, such a source file is not _compiled_ unless explicitly
-requested by mentioning it in some other ‘_SOURCES’ variable.
+source file listed in ‘BUILT_SOURCES’ is made when ‘make all’, ‘make
+check’, ‘make install’, ‘make install-exec’ (or ‘make dist’) is run,
+before other targets are processed. However, such a source file is not
+_compiled_ unless explicitly requested by mentioning it in some other
+‘_SOURCES’ variable.
So, to conclude our introductory example, we could use ‘BUILT_SOURCES
= foo.h’ to ensure ‘foo.h’ gets built before any other target (including
need to appear in ‘BUILT_SOURCES’ (unless it is included by another
source), because it’s a known dependency of the associated object.
- It might be important to emphasize that ‘BUILT_SOURCES’ is honored
-only by ‘make all’, ‘make check’ and ‘make install’. This means you
-cannot build a specific target (e.g., ‘make foo’) in a clean tree if it
-depends on a built source. However it will succeed if you have run
-‘make all’ earlier, because accurate dependencies are already available.
+ To emphasize, ‘BUILT_SOURCES’ is honored only by ‘make all’, ‘make
+check’, ‘make install’, and ‘make install-exec’ (and ‘make dist’). This
+means you cannot build an arbitrary target (e.g., ‘make foo’) in a clean
+tree if it depends on a built source. However it will succeed if you
+have run ‘make all’ earlier, because accurate dependencies are already
+available.
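+
+   As a minimal sketch (the generated header ‘foo.h’ and the generator
+script ‘gen-foo.sh’ are purely illustrative), a ‘Makefile.am’ using
+this mechanism could contain:
+
+     # gen-foo.sh is a hypothetical script shipped with the package.
+     BUILT_SOURCES = foo.h
+     CLEANFILES = foo.h
+     foo.h: gen-foo.sh
+             $(SHELL) $(srcdir)/gen-foo.sh > $@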
The next section illustrates and discusses the handling of built
sources on a toy example.
‘PYTHON_VERSION’
The Python version number, in the form MAJOR.MINOR (e.g., ‘2.5’).
- This is currently the value of ‘sys.version[:3]’.
+ This is currently the value of ‘'%u.%u' % sys.version_info[:2]’.
‘PYTHON_PREFIX’
The string ‘${prefix}’. This term may be used in future work that
file. This defaults to ‘dvips’.
‘TEXINFO_TEX’
-
If your package has Texinfo files in many directories, you can use
the variable ‘TEXINFO_TEX’ to tell Automake where to find the
canonical ‘texinfo.tex’ for your package. The value of this
* Staged Installs:: Installation in a temporary location
* Install Rules for the User:: Useful additional rules
-\1f
-File: automake.info, Node: Basics of Installation, Next: The Two Parts of Install, Up: Install
-
-12.1 Basics of Installation
-===========================
-
-A file named in a primary is installed by copying the built file into
-the appropriate directory. The base name of the file is used when
-installing.
-
- bin_PROGRAMS = hello subdir/goodbye
-
- In this example, both ‘hello’ and ‘goodbye’ will be installed in
-‘$(bindir)’.
-
- Sometimes it is useful to avoid the basename step at install time.
-For instance, you might have a number of header files in subdirectories
-of the source tree that are laid out precisely how you want to install
-them. In this situation you can use the ‘nobase_’ prefix to suppress
-the base name step. For example:
-
- nobase_include_HEADERS = stdio.h sys/types.h
-
-will install ‘stdio.h’ in ‘$(includedir)’ and ‘types.h’ in
-‘$(includedir)/sys’.
-
- For most file types, Automake will install multiple files at once,
-while avoiding command line length issues (*note Length Limitations::).
-Since some ‘install’ programs will not install the same file twice in
-one invocation, you may need to ensure that file lists are unique within
-one variable such as ‘nobase_include_HEADERS’ above.
-
- You should not rely on the order in which files listed in one
-variable are installed. Likewise, to cater for parallel make, you
-should not rely on any particular file installation order even among
-different file types (library dependencies are an exception here).
-
-\1f
-File: automake.info, Node: The Two Parts of Install, Next: Extending Installation, Prev: Basics of Installation, Up: Install
-
-12.2 The Two Parts of Install
-=============================
-
-Automake generates separate ‘install-data’ and ‘install-exec’ rules, in
-case the installer is installing on multiple machines that share
-directory structure—these targets allow the machine-independent parts to
-be installed only once. ‘install-exec’ installs platform-dependent
-files, and ‘install-data’ installs platform-independent files. The
-‘install’ target depends on both of these targets. While Automake tries
-to automatically segregate objects into the correct category, the
-‘Makefile.am’ author is, in the end, responsible for making sure this is
-done correctly.
-
- Variables using the standard directory prefixes ‘data’, ‘info’,
-‘man’, ‘include’, ‘oldinclude’, ‘pkgdata’, or ‘pkginclude’ are installed
-by ‘install-data’.
-
- Variables using the standard directory prefixes ‘bin’, ‘sbin’,
-‘libexec’, ‘sysconf’, ‘localstate’, ‘lib’, or ‘pkglib’ are installed by
-‘install-exec’.
-
- For instance, ‘data_DATA’ files are installed by ‘install-data’,
-while ‘bin_PROGRAMS’ files are installed by ‘install-exec’.
-
- Any variable using a user-defined directory prefix with ‘exec’ in the
-name (e.g., ‘myexecbin_PROGRAMS’) is installed by ‘install-exec’. All
-other user-defined prefixes are installed by ‘install-data’.
-
-\1f
-File: automake.info, Node: Extending Installation, Next: Staged Installs, Prev: The Two Parts of Install, Up: Install
-
-12.3 Extending Installation
-===========================
-
-It is possible to extend this mechanism by defining an
-‘install-exec-local’ or ‘install-data-local’ rule. If these rules
-exist, they will be run at ‘make install’ time. These rules can do
-almost anything; care is required.
-
- Automake also supports two install hooks, ‘install-exec-hook’ and
-‘install-data-hook’. These hooks are run after all other install rules
-of the appropriate type, exec or data, have completed. So, for
-instance, it is possible to perform post-installation modifications
-using an install hook. *Note Extending::, for some examples.
-
-This is automake.info, produced by makeinfo version 6.5 from
+This is automake.info, produced by makeinfo version 6.7 from
automake.texi.
-This manual is for GNU Automake (version 1.16.2, 1 February 2020), a
+This manual is for GNU Automake (version 1.16.3, 19 November 2020), a
program that creates GNU standards-compliant Makefiles from template
files.
* automake-invocation: (automake)automake Invocation. Generating Makefile.in.
END-INFO-DIR-ENTRY
+\1f
+File: automake.info, Node: Basics of Installation, Next: The Two Parts of Install, Up: Install
+
+12.1 Basics of Installation
+===========================
+
+A file named in a primary is installed by copying the built file into
+the appropriate directory. The base name of the file is used when
+installing.
+
+ bin_PROGRAMS = hello subdir/goodbye
+
+ In this example, both ‘hello’ and ‘goodbye’ will be installed in
+‘$(bindir)’.
+
+ Sometimes it is useful to avoid the basename step at install time.
+For instance, you might have a number of header files in subdirectories
+of the source tree that are laid out precisely how you want to install
+them. In this situation you can use the ‘nobase_’ prefix to suppress
+the base name step. For example:
+
+ nobase_include_HEADERS = stdio.h sys/types.h
+
+will install ‘stdio.h’ in ‘$(includedir)’ and ‘types.h’ in
+‘$(includedir)/sys’.
+
+ For most file types, Automake will install multiple files at once,
+while avoiding command line length issues (*note Length Limitations::).
+Since some ‘install’ programs will not install the same file twice in
+one invocation, you may need to ensure that file lists are unique within
+one variable such as ‘nobase_include_HEADERS’ above.
+
+ You should not rely on the order in which files listed in one
+variable are installed. Likewise, to cater for parallel make, you
+should not rely on any particular file installation order even among
+different file types (library dependencies are an exception here).
+
+\1f
+File: automake.info, Node: The Two Parts of Install, Next: Extending Installation, Prev: Basics of Installation, Up: Install
+
+12.2 The Two Parts of Install
+=============================
+
+Automake generates separate ‘install-data’ and ‘install-exec’ rules, in
+case the installer is installing on multiple machines that share
+directory structure—these targets allow the machine-independent parts to
+be installed only once. ‘install-exec’ installs platform-dependent
+files, and ‘install-data’ installs platform-independent files. The
+‘install’ target depends on both of these targets. While Automake tries
+to automatically segregate objects into the correct category, the
+‘Makefile.am’ author is, in the end, responsible for making sure this is
+done correctly.
+
+ Variables using the standard directory prefixes ‘data’, ‘info’,
+‘man’, ‘include’, ‘oldinclude’, ‘pkgdata’, or ‘pkginclude’ are installed
+by ‘install-data’.
+
+ Variables using the standard directory prefixes ‘bin’, ‘sbin’,
+‘libexec’, ‘sysconf’, ‘localstate’, ‘lib’, or ‘pkglib’ are installed by
+‘install-exec’.
+
+ For instance, ‘data_DATA’ files are installed by ‘install-data’,
+while ‘bin_PROGRAMS’ files are installed by ‘install-exec’.
+
+ Any variable using a user-defined directory prefix with ‘exec’ in the
+name (e.g., ‘myexecbin_PROGRAMS’) is installed by ‘install-exec’. All
+other user-defined prefixes are installed by ‘install-data’.
+
+\1f
+File: automake.info, Node: Extending Installation, Next: Staged Installs, Prev: The Two Parts of Install, Up: Install
+
+12.3 Extending Installation
+===========================
+
+It is possible to extend this mechanism by defining an
+‘install-exec-local’ or ‘install-data-local’ rule. If these rules
+exist, they will be run at ‘make install’ time. These rules can do
+almost anything; care is required.
+
+ Automake also supports two install hooks, ‘install-exec-hook’ and
+‘install-data-hook’. These hooks are run after all other install rules
+of the appropriate type, exec or data, have completed. So, for
+instance, it is possible to perform post-installation modifications
+using an install hook. *Note Extending::, for some examples.
+
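+   As a brief sketch (the program and link names are hypothetical), a
+package that installs a program ‘hello’ could create a convenience
+link to it right after installation:
+
+     install-exec-hook:
+             cd $(DESTDIR)$(bindir) && ln -sf hello$(EXEEXT) hi$(EXEEXT)
+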
\1f
File: automake.info, Node: Staged Installs, Next: Install Rules for the User, Prev: Extending Installation, Up: Install
automatically defined by either the ‘AC_INIT’ invocation or by a
_deprecated_ two-arguments invocation of the ‘AM_INIT_AUTOMAKE’ macro
(see *note Public Macros:: for how these variables get their values,
-from either defaults or explicit values – it’s slightly trickier than
-one would expect). More precisely the gzipped ‘tar’ file is named
-‘${PACKAGE}-${VERSION}.tar.gz’. You can use the ‘make’ variable
-‘GZIP_ENV’ to control how gzip is run. The default setting is ‘--best’.
+from either defaults or explicit values—it’s slightly trickier than one
+would expect).  More precisely, the gzipped ‘tar’ file is named
+‘${PACKAGE}-${VERSION}.tar.gz’.
+
+ You can use the ‘make’ variable ‘GZIP_ENV’ to control how gzip is
+run. The default setting is ‘--best’.
+
+ You can set the environment variable ‘TAR’ to override the tar
+program used; it defaults to ‘tar’.
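+
+   For instance, an illustrative invocation that picks GNU tar
+explicitly (assuming it is installed as ‘gtar’) and passes different
+flags to gzip (‘-9’ for maximum compression, ‘-n’ to omit the
+timestamp) could be:
+
+     TAR=gtar make dist GZIP_ENV='-n -9'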
For the most part, the files to distribute are automatically found by
Automake: all source files are automatically included in a distribution,
EXTRA_DIST = doc
dist-hook:
+ chmod -R u+w $(distdir)/doc
rm -rf `find $(distdir)/doc -type d -name .svn`
Note that the ‘dist-hook’ recipe shouldn’t assume that the regular files
EXTRA_DIST = README doc
dist-hook:
chmod u+w $(distdir)/README $(distdir)/doc
- echo "Distribution date: `date`" >> README
+ echo "Distribution date: `date`" >> $(distdir)/README
rm -f $(distdir)/doc/HACKING
 Two variables that come in handy when writing ‘dist-hook’ rules are
Automake also generates a ‘distcheck’ rule that can be of help to ensure
that a given distribution will actually work. Simplifying a bit, we can
say this rule first makes a distribution, and then, _operating from it_,
-takes the following steps:
+takes the following steps (in this order):
• tries to do a ‘VPATH’ build (*note VPATH Builds::), with the
‘srcdir’ and all its content made _read-only_;
+ • tries to make the printable documentation, if any (with ‘make
+ dvi’),
• runs the test suite (with ‘make check’) on this fresh build;
• installs the package in a temporary directory (with ‘make
install’), and runs the test suite on the resulting installation
• finally, makes another tarball to ensure the distribution is
self-contained.
- All of these actions are performed in a temporary directory. Please
-note that the exact location and the exact structure of such a directory
-(where the read-only sources are placed, how the temporary build and
-install directories are named and how deeply they are nested, etc.) is
-to be considered an implementation detail, which can change at any time;
-so do not rely on it.
+ All of these actions are performed in a temporary directory. The
+exact location and the exact structure of such a directory (where the
+read-only sources are placed, how the temporary build and install
+directories are named and how deeply they are nested, etc.) is to be
+considered an implementation detail, which can change at any time; so do
+not rely on it.
DISTCHECK_CONFIGURE_FLAGS
-------------------------
where ‘make installcheck’ was wrongly assuming it could blindly test
"‘m4’", rather than the just-installed "‘gm4’".
+dvi and distcheck
+-----------------
+
+Ordinarily, ‘make distcheck’ runs ‘make dvi’.  That run does nothing if
+the distribution contains no Texinfo sources.  If the distribution does
+contain a Texinfo manual, by default the ‘dvi’ target will run TeX to
+make sure it can be successfully processed (*note Texinfo::).
+
+ However, you may wish to test the manual by producing ‘pdf’ (e.g., if
+your manual uses images in formats other than ‘eps’), ‘html’ (if you
+don’t have TeX at all), some other format, or just skip the test
+entirely (not recommended). You can change the target that is run by
+setting the variable ‘AM_DISTCHECK_DVI_TARGET’ in your ‘Makefile.am’;
+for example,
+
+ AM_DISTCHECK_DVI_TARGET = pdf
+
+ To make ‘dvi’ into a do-nothing target, see the example for
+‘EMPTY_AUTOMAKE_TARGETS’ in *note Third-Party Makefiles::.
+
distcheck-hook
--------------
simply make no sense on a given system (for example, a test checking a
Windows-specific feature makes no sense on a GNU/Linux system). In this
case, according to the definition above, the tests can neither be
-considered passed nor failed; instead, they are _skipped_ – i.e., they
+considered passed nor failed; instead, they are _skipped_, i.e., they
are not run, or their result is anyway ignored for what concerns the
count of failures and successes. Skips are usually explicitly reported
though, so that the user will be aware that not all of the testsuite has
By default, only the exit statuses of the test scripts are considered
when determining the testsuite outcome. But Automake allows also the
use of more complex test protocols, either standard (*note Using the TAP
-test protocol::) or custom (*note Custom Test Drivers::). Note that you
-can’t enable such protocols when the serial harness is used, though. In
-the rest of this section we are going to concentrate mostly on
-protocol-less tests, since we cover test protocols in a later section
-(again, *note Custom Test Drivers::).
+test protocol::) or custom (*note Custom Test Drivers::). You can’t
+enable such protocols when the serial harness is used, though. In the
+rest of this section we are going to concentrate mostly on protocol-less
+tests, since we cover test protocols in a later section (again, *note
+Custom Test Drivers::).
When no test protocol is in use, an exit status of 0 from a test
script will denote a success, an exit status of 77 a skipped test, an
A testsuite summary (expected to report at least the number of run,
skipped and failed tests) will be printed at the end of the testsuite
-run.
+run. By default, the first line of the summary has the form:
+
+ Testsuite summary for PACKAGE-STRING
+
+where PACKAGE-STRING is the name and version of the package. If you
+have several independent test suites for different parts of the package,
+though, it can be misleading for each suite to imply it is for the whole
+package. Or, in complex projects, you may wish to add the current
+directory or other information to the testsuite header line. So you can
+override the ‘ for PACKAGE-STRING’ suffix on that line by setting the
+‘AM_TESTSUITE_SUMMARY_HEADER’ variable. The value of this variable is
+used unquoted in a shell echo command, so you must include any necessary
+quotes. For example, the default value is
+
+ AM_TESTSUITE_SUMMARY_HEADER = ' for $(PACKAGE_STRING)'
+
+including the quotes (interpreted by the shell) and the leading
+space (since the value is output directly after the ‘Testsuite
+summary’). The ‘$(PACKAGE_STRING)’ is substituted by ‘make’.
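+
+   As a hypothetical override that also names the directory whose
+testsuite is being summarized (Automake defines ‘$(subdir)’ in every
+generated ‘Makefile’):
+
+     AM_TESTSUITE_SUMMARY_HEADER = ' for $(PACKAGE_STRING) in $(subdir)'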
If the standard output is connected to a capable terminal, then the
test results and the summary are colored appropriately. The developer
files (*note Basics of test metadata::) to store the test results and
related metadata. Apart from that, it will try to remain as compatible
as possible with pre-existing and widespread utilities, such as the
-‘prove’ utility (http://search.cpan.org/~andya/Test-Harness/bin/prove),
-at least for the simpler usages.
+‘prove’ utility
+(https://metacpan.org/pod/distribution/Test-Harness/bin/prove), at least
+for the simpler usages.
TAP started its life as part of the test harness for Perl, but today
it has been (mostly) standardized, and has various independent
implementations in different languages; among them, C, C++, Perl,
Python, PHP, and Java. For a semi-official specification of the TAP
-protocol, please refer to the documentation of ‘Test::Harness::TAP’
-(http://search.cpan.org/~petdance/Test-Harness/lib/Test/Harness/TAP.pod).
+protocol, please refer to the documentation of ‘Test::Harness’
+(https://metacpan.org/pod/Test::Harness).
The most relevant real-world usages of TAP are obviously in the
-testsuites of ‘perl’ and of many perl modules. Still, other important
-third-party packages, such as ‘git’ (http://git-scm.com/), also use TAP
+testsuites of ‘perl’ and of many Perl modules. Still, other important
+third-party packages, such as ‘git’ (https://git-scm.com/), also use TAP
in their testsuite.
\1f
Here are some links to more extensive official or third-party
documentation and resources about the TAP protocol and related tools and
libraries.
- • ‘Test::Harness::TAP’
- (http://search.cpan.org/~petdance/Test-Harness/lib/Test/Harness/TAP.pod),
- the (mostly) official documentation about the TAP format and
- protocol.
- • ‘prove’ (http://search.cpan.org/~andya/Test-Harness/bin/prove),
- the most famous command-line TAP test driver, included in the
+ • ‘Test::Harness’ (https://metacpan.org/pod/Test::Harness), the
+ (mostly) official documentation about the TAP format and protocol.
+ • ‘prove’
+ (https://metacpan.org/pod/distribution/Test-Harness/bin/prove), the
+ most famous command-line TAP test driver, included in the
distribution of ‘perl’ and ‘Test::Harness’
- (http://search.cpan.org/~andya/Test-Harness/lib/Test/Harness.pm).
- • The TAP wiki (http://testanything.org/wiki/index.php/Main_Page).
- • A “gentle introduction” to testing for perl coders:
+ (https://metacpan.org/pod/distribution/Test-Harness/lib/Test/Harness.pm).
+ • The TAP wiki (https://testanything.org/).
+ • A “gentle introduction” to testing for Perl coders:
‘Test::Tutorial’
- (http://search.cpan.org/dist/Test-Simple/lib/Test/Tutorial.pod).
+ (https://metacpan.org/pod/distribution/Test-Simple/lib/Test/Tutorial.pod).
• ‘Test::Simple’
- (http://search.cpan.org/~mschwern/Test-Simple/lib/Test/Simple.pm)
+ (https://metacpan.org/pod/distribution/Test-Simple/lib/Test/Simple.pm)
and ‘Test::More’
- (http://search.cpan.org/~mschwern/Test-Simple/lib/Test/More.pm),
- the standard perl testing libraries, which are based on TAP.
+ (https://metacpan.org/pod/distribution/Test-Simple/lib/Test/More.pm),
+ the standard Perl testing libraries, which are based on TAP.
• C TAP Harness
- (http://www.eyrie.org/~eagle/software/c-tap-harness/), a C-based
+ (https://www.eyrie.org/~eagle/software/c-tap-harness/), a C-based
project implementing both a TAP producer and a TAP consumer.
- • tap4j (http://www.tap4j.org/), a Java-based project implementing
- both a TAP producer and a TAP consumer.
+ • tap4j (https://tap4j.org/), a Java-based project implementing both
+ a TAP producer and a TAP consumer.
\1f
File: automake.info, Node: DejaGnu Tests, Next: Install Tests, Prev: Using the TAP test protocol, Up: Tests
15.5 DejaGnu Tests
==================
-If ‘dejagnu’ (https://ftp.gnu.org/gnu/dejagnu/) appears in
+If ‘dejagnu’ (*note Introduction: (dejagnu)Top.) appears in
‘AUTOMAKE_OPTIONS’, then a ‘dejagnu’-based test suite is assumed. The
variable ‘DEJATOOL’ is a list of names that are passed, one at a time,
as the ‘--tool’ argument to ‘runtest’ invocations; it defaults to the
‘gnu’
‘foreign’
- Set the strictness as appropriate. The ‘gnits’ option also implies
- options ‘readme-alpha’ and ‘check-news’.
+ Set the strictness as appropriate. *Note Strictness::. The
+ ‘gnits’ option also implies options ‘readme-alpha’ and
+ ‘check-news’.
‘check-news’
Cause ‘make dist’ to fail unless the current version number appears
Libtool Sources::).
\1f
-File: automake.info, Node: Silencing Make, Next: Gnits, Prev: Conditionals, Up: Top
+File: automake.info, Node: Silencing Make, Next: Not Enough, Prev: Conditionals, Up: Top
21 Silencing ‘make’
*******************
the “Entering/Leaving directory ...” messages are to be disabled.
\1f
-File: automake.info, Node: Gnits, Next: Not Enough, Prev: Silencing Make, Up: Top
-
-22 The effect of ‘--gnu’ and ‘--gnits’
-**************************************
-
-The ‘--gnu’ option (or ‘gnu’ in the ‘AUTOMAKE_OPTIONS’ variable) causes
-‘automake’ to check the following:
-
- • The files ‘INSTALL’, ‘NEWS’, ‘README’, ‘AUTHORS’, and ‘ChangeLog’,
- plus one of ‘COPYING.LIB’, ‘COPYING.LESSER’ or ‘COPYING’, are
- required at the topmost directory of the package.
-
- If the ‘--add-missing’ option is given, ‘automake’ will add a
- generic version of the ‘INSTALL’ file as well as the ‘COPYING’ file
- containing the text of the current version of the GNU General
- Public License existing at the time of this Automake release
- (version 3 as this is written,
- <https://www.gnu.org/copyleft/gpl.html>). However, an existing
- ‘COPYING’ file will never be overwritten by ‘automake’.
-
- • The options ‘no-installman’ and ‘no-installinfo’ are prohibited.
-
- Note that this option will be extended in the future to do even more
-checking; it is advisable to be familiar with the precise requirements
-of the GNU standards. Also, ‘--gnu’ can require certain non-standard
-GNU programs to exist for use by various maintainer-only rules; for
-instance, in the future ‘pathchk’ might be required for ‘make dist’.
-
- The ‘--gnits’ option does everything that ‘--gnu’ does, and checks
-the following as well:
-
- • ‘make installcheck’ will check to make sure that the ‘--help’ and
- ‘--version’ really print a usage message and a version string,
- respectively. This is the ‘std-options’ option (*note Options::).
-
- • ‘make dist’ will check to make sure the ‘NEWS’ file has been
- updated to the current version.
-
- • ‘VERSION’ is checked to make sure its format complies with Gnits
- standards.
-
- • If ‘VERSION’ indicates that this is an alpha release, and the file
- ‘README-alpha’ appears in the topmost directory of a package, then
- it is included in the distribution. This is done in ‘--gnits’
- mode, and no other, because this mode is the only one where version
- number formats are constrained, and hence the only mode where
- Automake can automatically determine whether ‘README-alpha’ should
- be included.
-
- • The file ‘THANKS’ is required.
-
-\1f
-File: automake.info, Node: Not Enough, Next: Distributing, Prev: Gnits, Up: Top
+File: automake.info, Node: Not Enough, Next: Distributing, Prev: Silencing Make, Up: Top
-23 When Automake Isn’t Enough
+22 When Automake Isn’t Enough
*****************************
In some situations, where Automake is not up to one task, one has to
\1f
File: automake.info, Node: Extending, Next: Third-Party Makefiles, Up: Not Enough
-23.1 Extending Automake Rules
+22.1 Extending Automake Rules
=============================
With some minor exceptions (for example ‘_PROGRAMS’ variables, ‘TESTS’,
\1f
File: automake.info, Node: Third-Party Makefiles, Prev: Extending, Up: Not Enough
-23.2 Third-Party ‘Makefile’s
+22.2 Third-Party ‘Makefile’s
============================
In most projects all ‘Makefile’s are generated by Automake. In some
.PHONY: $(EMPTY_AUTOMAKE_TARGETS)
$(EMPTY_AUTOMAKE_TARGETS):
+ To be clear, there is nothing special about the variable name
+‘EMPTY_AUTOMAKE_TARGETS’; the name could be anything.
+
Another aspect of integrating third-party build systems is whether
they support VPATH builds (*note VPATH Builds::). Obviously if the
subpackage does not support VPATH builds the whole package will not
\1f
File: automake.info, Node: Distributing, Next: API Versioning, Prev: Not Enough, Up: Top
-24 Distributing ‘Makefile.in’s
+23 Distributing ‘Makefile.in’s
******************************
Automake places no restrictions on the distribution of the resulting
\1f
File: automake.info, Node: API Versioning, Next: Upgrading, Prev: Distributing, Up: Top
-25 Automake API Versioning
+24 Automake API Versioning
**************************
New Automake releases usually include bug fixes and new features.
\1f
File: automake.info, Node: Upgrading, Next: FAQ, Prev: API Versioning, Up: Top
-26 Upgrading a Package to a Newer Automake Version
+25 Upgrading a Package to a Newer Automake Version
**************************************************
Automake maintains three kinds of files in a package.
\1f
File: automake.info, Node: FAQ, Next: Copying This Manual, Prev: Upgrading, Up: Top
-27 Frequently Asked Questions about Automake
+26 Frequently Asked Questions about Automake
********************************************
This chapter covers some questions that often come up on the mailing
\1f
File: automake.info, Node: CVS, Next: maintainer-mode, Up: FAQ
-27.1 CVS and generated files
+26.1 CVS and generated files
============================
Background: distributed generated Files
These files, whether they are kept under CVS or not, raise similar
concerns about version mismatch between developers’ tools. The Gettext
-manual has a section about this; see *note CVS Issues: (gettext)CVS
-Issues.
+manual has a section about this; see *note Integrating with Version
+Control Systems: (gettext)Version Control Issues.
\1f
File: automake.info, Node: maintainer-mode, Next: Wildcards, Prev: CVS, Up: FAQ
-27.2 ‘missing’ and ‘AM_MAINTAINER_MODE’
+26.2 ‘missing’ and ‘AM_MAINTAINER_MODE’
=======================================
‘missing’
\1f
File: automake.info, Node: Wildcards, Next: Limitations on File Names, Prev: maintainer-mode, Up: FAQ
-27.3 Why doesn’t Automake support wildcards?
+26.3 Why doesn’t Automake support wildcards?
============================================
Developers are lazy. They would often like to use wildcards in
\1f
File: automake.info, Node: Limitations on File Names, Next: Errors with distclean, Prev: Wildcards, Up: FAQ
-27.4 Limitations on File Names
+26.4 Limitations on File Names
==============================
Automake attempts to support all kinds of file names, even those that
\1f
File: automake.info, Node: Errors with distclean, Next: Flag Variables Ordering, Prev: Limitations on File Names, Up: FAQ
-27.5 Errors with distclean
+26.5 Errors with distclean
==========================
This is a diagnostic you might encounter while running ‘make distcheck’.
\1f
File: automake.info, Node: Flag Variables Ordering, Next: Renamed Objects, Prev: Errors with distclean, Up: FAQ
-27.6 Flag Variables Ordering
+26.6 Flag Variables Ordering
============================
What is the difference between ‘AM_CFLAGS’, ‘CFLAGS’, and
\1f
File: automake.info, Node: Renamed Objects, Next: Per-Object Flags, Prev: Flag Variables Ordering, Up: FAQ
-27.7 Why are object files sometimes renamed?
+26.7 Why are object files sometimes renamed?
============================================
This happens when per-target compilation flags are used. Object files
\1f
File: automake.info, Node: Per-Object Flags, Next: Multiple Outputs, Prev: Renamed Objects, Up: FAQ
-27.8 Per-Object Flags Emulation
+26.8 Per-Object Flags Emulation
===============================
One of my source files needs to be compiled with different flags. How
\1f
File: automake.info, Node: Multiple Outputs, Next: Hard-Coded Install Paths, Prev: Per-Object Flags, Up: FAQ
-27.9 Handling Tools that Produce Many Outputs
+26.9 Handling Tools that Produce Many Outputs
=============================================
This section describes a ‘make’ idiom that can be used when a tool
data.c: data.foo
foo data.foo
data.h: data.c
+ ## Recover from the removal of $@
+ @test -f $@ || rm -f data.c
+ @test -f $@ || $(MAKE) $(AM_MAKEFLAGS) data.c
+
+ It is tempting to use a single test as follows:
+
+ data.h: data.c
## Recover from the removal of $@
@if test -f $@; then :; else \
rm -f data.c; \
$(MAKE) $(AM_MAKEFLAGS) data.c; \
fi
+but that would break ‘make -n’: at least GNU ‘make’ and Solaris ‘make’
+execute recipes containing the ‘$(MAKE)’ string even when they are
+running in dry mode.  So if we didn’t split the recipe above into two
+invocations, the file ‘data.c’ would be removed even upon ‘make -n’.
+Not nice.
+
The above scheme can be extended to handle more outputs and more
inputs. One of the outputs is selected to serve as a witness to the
successful completion of the command, it depends upon all inputs, and
foo data.foo data.bar
data.h data.w data.x: data.c
## Recover from the removal of $@
- @if test -f $@; then :; else \
- rm -f data.c; \
- $(MAKE) $(AM_MAKEFLAGS) data.c; \
- fi
+ @test -f $@ || rm -f data.c
+ @test -f $@ || $(MAKE) $(AM_MAKEFLAGS) data.c
However there are now three minor problems in this setup. One is
related to the timestamp ordering of ‘data.h’, ‘data.w’, ‘data.x’, and
data.c: data.foo data.bar
foo data.foo data.bar
data.h data.w data.x: data.c
- @if test -f $@; then \
- touch $@; \
- else \
+ @test ! -f $@ || touch $@
## Recover from the removal of $@
- rm -f data.c; \
- $(MAKE) $(AM_MAKEFLAGS) data.c; \
- fi
+ @test -f $@ || rm -f data.c
+ @test -f $@ || $(MAKE) $(AM_MAKEFLAGS) data.c
Another solution is to use a different and dedicated file as witness,
rather than using any of ‘foo’’s outputs.
@mv -f data.tmp $@
data.c data.h data.w data.x: data.stamp
## Recover from the removal of $@
- @if test -f $@; then :; else \
- rm -f data.stamp; \
- $(MAKE) $(AM_MAKEFLAGS) data.stamp; \
- fi
+ @test -f $@ || rm -f data.stamp
+ @test -f $@ || $(MAKE) $(AM_MAKEFLAGS) data.stamp
‘data.tmp’ is created before ‘foo’ is run, so it has a timestamp
older than the output files produced by ‘foo’.  It is then renamed to
\1f
File: automake.info, Node: Hard-Coded Install Paths, Next: Debugging Make Rules, Prev: Multiple Outputs, Up: FAQ
-27.10 Installing to Hard-Coded Locations
+26.10 Installing to Hard-Coded Locations
========================================
My package needs to install some configuration file. I tried to use
\1f
File: automake.info, Node: Debugging Make Rules, Next: Reporting Bugs, Prev: Hard-Coded Install Paths, Up: FAQ
-27.11 Debugging Make Rules
+26.11 Debugging Make Rules
==========================
The rules and dependency trees generated by ‘automake’ can get rather
\1f
File: automake.info, Node: Reporting Bugs, Prev: Debugging Make Rules, Up: FAQ
-27.12 Reporting Bugs
+26.12 Reporting Bugs
====================
Most nontrivial software has bugs. Automake is no exception. Although
known. You can look at the GNU Bug Tracker (https://debbugs.gnu.org/)
and the bug-automake mailing list archives
(https://lists.gnu.org/archive/html/bug-automake/) for previous bug
-reports. We previously used a Gnats database
-(http://sourceware.org/cgi-bin/gnatsweb.pl?database=automake) for bug
-tracking, so some bugs might have been reported there already. Please
-do not use it for new bug reports, however.
+reports. We previously used a Gnats database for bug tracking, but it
+is no longer online.
If the bug is not already known, it should be reported. It is very
important to report bugs in a way that is useful and efficient. For
this, please familiarize yourself with How to Report Bugs Effectively
-(http://www.chiark.greenend.org.uk/~sgtatham/bugs.html) and How to Ask
+(https://www.chiark.greenend.org.uk/~sgtatham/bugs.html) and How to Ask
Questions the Smart Way
(http://catb.org/~esr/faqs/smart-questions.html). This helps you and
developers to save time, which can then be spent on fixing more bugs and
* AM_SUBST_NOTMAKE(VAR): Optional. (line 180)
* AM_WITH_DMALLOC: Public Macros. (line 122)
* m4_include: Basics of Distribution.
- (line 17)
+ (line 22)
* m4_include <1>: Optional. (line 190)
\1f
* AM_CCASFLAGS: Assembly Support. (line 10)
* AM_CFLAGS: Program Variables. (line 50)
* AM_COLOR_TESTS: Scripts-based Testsuites.
- (line 67)
+ (line 85)
* AM_CPPFLAGS: Program Variables. (line 16)
* AM_CPPFLAGS <1>: Assembly Support. (line 10)
* AM_CXXFLAGS: C++ Support. (line 22)
* AM_DEFAULT_VERBOSITY: Automake Silent Rules.
(line 120)
* AM_DISTCHECK_CONFIGURE_FLAGS: Checking the Distribution.
- (line 28)
+ (line 30)
* AM_ETAGSFLAGS: Tags. (line 25)
* AM_EXT_LOG_DRIVER_FLAGS: Declaring Custom Test Drivers.
(line 6)
* AM_GCJFLAGS: Java Support with gcj.
(line 26)
* AM_INSTALLCHECK_STD_OPTIONS_EXEMPT: List of Automake options.
- (line 138)
+ (line 139)
* AM_JAVACFLAGS: Java. (line 44)
* AM_LDFLAGS: Linking. (line 10)
* AM_LDFLAGS <1>: Program Variables. (line 59)
(line 22)
* AM_RFLAGS: Fortran 77 Support. (line 28)
* AM_RUNTESTFLAGS: DejaGnu Tests. (line 24)
+* AM_TESTSUITE_SUMMARY_HEADER: Scripts-based Testsuites.
+ (line 69)
* AM_TESTS_ENVIRONMENT: Scripts-based Testsuites.
- (line 86)
+ (line 104)
* AM_TESTS_FD_REDIRECT: Scripts-based Testsuites.
- (line 94)
+ (line 112)
* AM_UPCFLAGS: Unified Parallel C Support.
(line 21)
* AM_UPDATE_INFO_DIR: Texinfo. (line 92)
* AM_V: Automake Silent Rules.
(line 120)
-* AM_VALAFLAGS: Vala Support. (line 41)
+* AM_VALAFLAGS: Vala Support. (line 44)
* AM_V_at: Automake Silent Rules.
(line 120)
* AM_V_GEN: Automake Silent Rules.
* AR: Public Macros. (line 75)
* AUTOCONF: automake Invocation. (line 28)
* AUTOM4TE: aclocal Invocation. (line 44)
-* AUTOMAKE_JOBS: automake Invocation. (line 178)
+* AUTOMAKE_JOBS: automake Invocation. (line 195)
* AUTOMAKE_LIBDIR: automake Invocation. (line 64)
* AUTOMAKE_OPTIONS: Public Macros. (line 10)
* AUTOMAKE_OPTIONS <1>: Dependencies. (line 34)
* DISABLE_HARD_ERRORS: Scripts-based Testsuites.
(line 32)
* DISTCHECK_CONFIGURE_FLAGS: Checking the Distribution.
- (line 28)
+ (line 30)
* distcleancheck_listfiles: Checking the Distribution.
- (line 70)
+ (line 92)
* distcleancheck_listfiles <1>: Errors with distclean.
(line 112)
* DISTCLEANFILES: Clean. (line 13)
* DISTCLEANFILES <1>: Checking the Distribution.
- (line 70)
-* distdir: The dist Hook. (line 33)
+ (line 92)
+* distdir: The dist Hook. (line 34)
* distdir <1>: Third-Party Makefiles.
(line 25)
* distuninstallcheck_listfiles: Checking the Distribution.
- (line 106)
-* dist_: Alternative. (line 29)
+ (line 128)
+* dist_: Alternative. (line 28)
* dist_ <1>: Fine-grained Distribution Control.
(line 6)
* dist_lisp_LISP: Emacs Lisp. (line 11)
* DIST_SUBDIRS: Subdirectories with AM_CONDITIONAL.
(line 25)
* DIST_SUBDIRS <1>: Basics of Distribution.
- (line 47)
+ (line 52)
* DVIPS: Texinfo. (line 141)
* EMACS: Public Macros. (line 60)
+* EMPTY_AUTOMAKE_TARGETS: Third-Party Makefiles.
+ (line 88)
* ETAGSFLAGS: Tags. (line 25)
* ETAGS_ARGS: Tags. (line 25)
* EXPECT: DejaGnu Tests. (line 19)
* EXTRA_DIST: Basics of Distribution.
- (line 34)
+ (line 39)
* EXTRA_maude_DEPENDENCIES: Linking. (line 41)
* EXTRA_maude_DEPENDENCIES <1>: Program and Library Variables.
(line 119)
(line 10)
* GTAGS_ARGS: Tags. (line 60)
* GZIP_ENV: Basics of Distribution.
- (line 14)
+ (line 16)
* HEADERS: Uniform. (line 101)
* host_triplet: Optional. (line 14)
* INCLUDES: Program Variables. (line 44)
* MKDIR_P: Obsolete Macros. (line 14)
* mkdir_p: Obsolete Macros. (line 14)
* MOSTLYCLEANFILES: Clean. (line 13)
-* nobase_: Alternative. (line 23)
-* nodist_: Alternative. (line 29)
+* nobase_: Alternative. (line 22)
+* nodist_: Alternative. (line 28)
* nodist_ <1>: Fine-grained Distribution Control.
(line 6)
* noinst_: Uniform. (line 90)
* SOURCES <1>: Default _SOURCES. (line 6)
* SUBDIRS: Subdirectories. (line 8)
* SUBDIRS <1>: Basics of Distribution.
- (line 47)
+ (line 52)
* SUFFIXES: Suffixes. (line 6)
* sysconf_DATA: Data. (line 9)
* TAGS_DEPENDENCIES: Tags. (line 35)
+* TAR: Basics of Distribution.
+ (line 19)
* target_triplet: Optional. (line 14)
* TESTS: Scripts-based Testsuites.
- (line 86)
+ (line 104)
* TESTS <1>: Parallel Test Harness.
(line 12)
* TESTS_ENVIRONMENT: Scripts-based Testsuites.
- (line 86)
+ (line 104)
* TEST_EXTENSIONS: Parallel Test Harness.
(line 34)
* TEST_LOGS: Parallel Test Harness.
* TEXINFOS: Uniform. (line 101)
* TEXINFOS <1>: Texinfo. (line 65)
* TEXINFO_TEX: Texinfo. (line 145)
-* top_distdir: The dist Hook. (line 33)
+* top_distdir: The dist Hook. (line 34)
* top_distdir <1>: Third-Party Makefiles.
(line 25)
* UPC: Public Macros. (line 104)
(line 16)
* V: Automake Silent Rules.
(line 88)
-* VALAC: Vala Support. (line 34)
-* VALAFLAGS: Vala Support. (line 38)
+* VALAC: Vala Support. (line 37)
+* VALAFLAGS: Vala Support. (line 41)
* VERBOSE: Parallel Test Harness.
(line 26)
* VERSION: Basics of Distribution.
(line 6)
-* WARNINGS: automake Invocation. (line 171)
+* WARNINGS: automake Invocation. (line 187)
* WARNINGS <1>: aclocal Options. (line 95)
* WITH_DMALLOC: Public Macros. (line 122)
* XFAIL_TESTS: Scripts-based Testsuites.
(line 85)
* --force: aclocal Options. (line 49)
* --force-missing: automake Invocation. (line 80)
-* --foreign: automake Invocation. (line 86)
-* --gnits: automake Invocation. (line 90)
-* --gnits, complete description: Gnits. (line 29)
-* --gnu: automake Invocation. (line 94)
-* --gnu, complete description: Gnits. (line 6)
-* --gnu, required files: Gnits. (line 6)
+* --foreign: Strictness. (line 51)
+* --foreign <1>: automake Invocation. (line 86)
+* --gnits: Strictness. (line 58)
+* --gnits <1>: automake Invocation. (line 90)
+* --gnu: Strictness. (line 18)
+* --gnu <1>: automake Invocation. (line 94)
* --help: automake Invocation. (line 98)
* --help <1>: aclocal Options. (line 31)
* --help check: List of Automake options.
- (line 132)
+ (line 133)
* --help=recursive: Nested Packages. (line 30)
* --host=HOST: Cross-Compilation. (line 16)
* --include-deps: automake Invocation. (line 106)
* --version: automake Invocation. (line 129)
* --version <1>: aclocal Options. (line 76)
* --version check: List of Automake options.
- (line 132)
+ (line 133)
* --warnings: automake Invocation. (line 133)
* --warnings <1>: aclocal Options. (line 80)
* --with-dmalloc: Public Macros. (line 122)
* AM_YFLAGS and YFLAGS: Flag Variables Ordering.
(line 20)
* Append operator: General Operation. (line 24)
+* ar-lib: Auxiliary Programs. (line 16)
* ARG_MAX: Length Limitations. (line 6)
* autogen.sh and autoreconf: Error required file ltmain.sh not found.
(line 6)
* Automake parser, limitations of: General Operation. (line 33)
* Automake requirements: Introduction. (line 26)
* Automake requirements <1>: Requirements. (line 6)
+* Automake targets, no-op: Third-Party Makefiles.
+ (line 88)
* automake, invocation: automake Invocation. (line 6)
* automake, invoking: automake Invocation. (line 6)
* Automake, recursive operation: General Operation. (line 58)
* autoupdate: Obsolete Macros. (line 6)
* Auxiliary programs: Auxiliary Programs. (line 6)
* Avoiding man page renaming: Man Pages. (line 54)
-* Avoiding path stripping: Alternative. (line 23)
+* Avoiding path stripping: Alternative. (line 22)
* Binary package: DESTDIR. (line 22)
* bootstrap and autoreconf: Error required file ltmain.sh not found.
(line 6)
* check <2>: Extending. (line 41)
* check-local: Extending. (line 41)
* check-news: List of Automake options.
- (line 14)
+ (line 15)
* check_ primary prefix, definition: Uniform. (line 95)
* check_PROGRAMS example: Default _SOURCES. (line 28)
* clean: Standard Targets. (line 27)
* clean-local: Clean. (line 15)
* clean-local <1>: Extending. (line 41)
* Colorized testsuite output: Scripts-based Testsuites.
- (line 67)
+ (line 85)
* command line length limit: Length Limitations. (line 6)
* Comment, special to Automake: General Operation. (line 68)
* Compilation of Java to bytecode: Java. (line 6)
* Compilation of Java to native code: Java Support with gcj.
(line 6)
+* compile: Auxiliary Programs. (line 20)
* Compile Flag Variables: Flag Variables Ordering.
(line 20)
* Complete example: Complete. (line 6)
* Conditional SUBDIRS: Conditional Subdirectories.
(line 6)
* Conditionals: Conditionals. (line 6)
-* config.guess: automake Invocation. (line 39)
+* config.guess: Auxiliary Programs. (line 30)
+* config.guess <1>: automake Invocation. (line 39)
* config.site example: config.site. (line 6)
+* config.sub: Auxiliary Programs. (line 30)
* configuration variables, overriding: Standard Configuration Variables.
(line 6)
* Configuration, basics: Basic Installation. (line 6)
* definitions, conflicts: Extending. (line 14)
* dejagnu: DejaGnu Tests. (line 19)
* dejagnu <1>: List of Automake options.
- (line 18)
-* depcomp: Dependencies. (line 22)
+ (line 19)
+* depcomp: Auxiliary Programs. (line 40)
+* depcomp <1>: Dependencies. (line 22)
* dependencies and distributed files: Errors with distclean.
(line 6)
* Dependency tracking: Dependency Tracking. (line 6)
* dist-bzip2: The Types of Distributions.
(line 18)
* dist-bzip2 <1>: List of Automake options.
- (line 22)
+ (line 23)
* dist-bzip2 <2>: List of Automake options.
- (line 22)
+ (line 23)
* dist-gzip: The Types of Distributions.
(line 11)
* dist-hook: The dist Hook. (line 6)
* dist-lzip: The Types of Distributions.
(line 22)
* dist-lzip <1>: List of Automake options.
- (line 25)
+ (line 26)
* dist-lzip <2>: List of Automake options.
- (line 25)
+ (line 26)
* dist-shar: The Types of Distributions.
(line 45)
* dist-shar <1>: List of Automake options.
- (line 39)
+ (line 40)
* dist-shar <2>: List of Automake options.
- (line 37)
+ (line 38)
* dist-tarZ: The Types of Distributions.
(line 39)
* dist-tarZ <1>: List of Automake options.
- (line 44)
+ (line 45)
* dist-tarZ <2>: List of Automake options.
- (line 42)
+ (line 43)
* dist-xz: The Types of Distributions.
(line 30)
* dist-xz <1>: List of Automake options.
- (line 28)
+ (line 29)
* dist-xz <2>: List of Automake options.
- (line 28)
+ (line 29)
* dist-zip: The Types of Distributions.
(line 33)
* dist-zip <1>: List of Automake options.
- (line 31)
+ (line 32)
* dist-zip <2>: List of Automake options.
- (line 31)
+ (line 32)
* dist-zstd: The Types of Distributions.
(line 55)
* dist-zstd <1>: List of Automake options.
- (line 34)
+ (line 35)
* dist-zstd <2>: List of Automake options.
- (line 34)
+ (line 35)
* distcheck: Creating amhello. (line 100)
* distcheck <1>: Checking the Distribution.
(line 6)
(line 10)
* distcheck example: Creating amhello. (line 100)
* distcheck-hook: Checking the Distribution.
- (line 55)
+ (line 77)
* distclean: Standard Targets. (line 29)
* distclean <1>: Extending. (line 41)
* distclean <2>: Errors with distclean.
* distclean-local: Clean. (line 15)
* distclean-local <1>: Extending. (line 41)
* distcleancheck: Checking the Distribution.
- (line 70)
+ (line 92)
* distdir: Third-Party Makefiles.
(line 25)
* Distinction between errors and failures in testsuites: Generalities about Testing.
* Distributions, preparation: Preparing Distributions.
(line 6)
* distuninstallcheck: Checking the Distribution.
- (line 106)
-* dist_ and nobase_: Alternative. (line 29)
+ (line 128)
+* dist_ and nobase_: Alternative. (line 28)
* dist_ and notrans_: Man Pages. (line 63)
* DIST_SUBDIRS, explained: SUBDIRS vs DIST_SUBDIRS.
(line 6)
* dmalloc, support for: Public Macros. (line 122)
+* do-nothing Automake targets: Third-Party Makefiles.
+ (line 88)
* dvi: Texinfo. (line 25)
-* dvi <1>: Extending. (line 41)
+* dvi <1>: Checking the Distribution.
+ (line 57)
+* dvi <2>: Extending. (line 41)
* DVI output using Texinfo: Texinfo. (line 6)
* dvi-local: Extending. (line 41)
* E-mail, bug reports: Introduction. (line 30)
* EDITION Texinfo flag: Texinfo. (line 35)
* else: Usage of Conditionals.
(line 36)
+* empty Automake targets: Third-Party Makefiles.
+ (line 88)
* Empty libraries: A Library. (line 48)
* Empty libraries and $(LIBOBJS): LIBOBJS. (line 72)
* empty _SOURCES: Default _SOURCES. (line 44)
* endif: Usage of Conditionals.
(line 36)
+* eps images: Checking the Distribution.
+ (line 60)
* Example conditional --enable-debug: Usage of Conditionals.
(line 21)
* Example conditional AC_CONFIG_FILES: Usage of Conditionals.
* file names, limitations on: Limitations on File Names.
(line 6)
* filename-length-max=99: List of Automake options.
- (line 47)
+ (line 48)
* Files distributed with Automake: automake Invocation. (line 39)
* First line of Makefile.am: General Operation. (line 74)
* Flag variables, ordering: Flag Variables Ordering.
(line 38)
* foreign <1>: List of Automake options.
(line 9)
-* foreign strictness: Strictness. (line 10)
+* foreign strictness: Strictness. (line 51)
* Fortran 77 support: Fortran 77 Support. (line 6)
* Fortran 77, mixing with C and C++: Mixing Fortran 77 With C and C++.
(line 6)
* git-dist, non-standard example: General Operation. (line 12)
* gnits: List of Automake options.
(line 9)
-* gnits strictness: Strictness. (line 10)
+* gnits strictness: Strictness. (line 58)
* gnu: List of Automake options.
(line 9)
* GNU Build System, basics: Basic Installation. (line 6)
* GNU Gettext support: gettext. (line 6)
* GNU make extensions: General Operation. (line 20)
* GNU Makefile standards: Introduction. (line 12)
-* gnu strictness: Strictness. (line 10)
+* gnu strictness: Strictness. (line 18)
* GNUmakefile including Makefile: Third-Party Makefiles.
- (line 111)
+ (line 114)
* hard error: Generalities about Testing.
(line 48)
* Header files in _SOURCES: Program Sources. (line 39)
* if: Usage of Conditionals.
(line 36)
* include: Basics of Distribution.
- (line 17)
+ (line 22)
* include <1>: Include. (line 6)
* include, distribution: Basics of Distribution.
- (line 17)
+ (line 22)
* Including Makefile fragment: Include. (line 6)
* indentation in Makefile.am: General Operation. (line 33)
* info: List of Automake options.
- (line 96)
+ (line 97)
* info <1>: Extending. (line 41)
* info-in-builddir: List of Automake options.
- (line 56)
+ (line 57)
* info-local: Extending. (line 41)
* install: Standard Targets. (line 18)
* install <1>: The Two Parts of Install.
* install-html-local: Extending. (line 41)
* install-info: Texinfo. (line 85)
* install-info <1>: List of Automake options.
- (line 96)
+ (line 97)
* install-info <2>: Extending. (line 41)
* install-info target: Texinfo. (line 85)
* install-info-local: Extending. (line 41)
* install-man: Man Pages. (line 32)
* install-man <1>: List of Automake options.
- (line 102)
+ (line 103)
* install-man target: Man Pages. (line 32)
* install-pdf: Texinfo. (line 25)
* install-pdf <1>: Extending. (line 41)
* install-ps: Texinfo. (line 25)
* install-ps <1>: Extending. (line 41)
* install-ps-local: Extending. (line 41)
+* install-sh: Auxiliary Programs. (line 46)
* install-strip: Standard Targets. (line 21)
* install-strip <1>: Install Rules for the User.
(line 7)
* ltmain.sh not found: Error required file ltmain.sh not found.
(line 6)
* m4_include, distribution: Basics of Distribution.
- (line 17)
+ (line 22)
* Macro search path: Macro Search Path. (line 6)
* macro serial numbers: Serials. (line 6)
* Macros Automake recognizes: Optional. (line 6)
* make distclean, diagnostic: Errors with distclean.
(line 6)
* make distcleancheck: Checking the Distribution.
- (line 70)
+ (line 92)
* make distuninstallcheck: Checking the Distribution.
- (line 106)
+ (line 128)
* make install support: Install. (line 6)
* make installcheck, testing --help and --version: List of Automake options.
- (line 132)
+ (line 133)
* Make rules, overriding: General Operation. (line 46)
* Make targets, overriding: General Operation. (line 46)
* Makefile fragment, including: Include. (line 6)
* Man page renaming, avoiding: Man Pages. (line 54)
* MANS primary, defined: Man Pages. (line 6)
* many outputs, rules with: Multiple Outputs. (line 6)
-* mdate-sh: Texinfo. (line 35)
+* mdate-sh: Auxiliary Programs. (line 50)
+* mdate-sh <1>: Texinfo. (line 35)
* MinGW cross-compilation example: Cross-Compilation. (line 25)
+* missing program: Auxiliary Programs. (line 54)
* missing, purpose: maintainer-mode. (line 9)
* Mixed language example: Mixing Fortran 77 With C and C++.
(line 34)
* Mixing Fortran 77 with C and/or C++: Mixing Fortran 77 With C and C++.
(line 6)
* mkdir -p, macro check: Obsolete Macros. (line 14)
+* mkinstalldirs: Auxiliary Programs. (line 60)
* modules, libtool: Libtool Modules. (line 6)
* mostlyclean: Extending. (line 41)
* mostlyclean-local: Clean. (line 15)
* Nesting packages: Subpackages. (line 6)
* no-define: Public Macros. (line 54)
* no-define <1>: List of Automake options.
- (line 61)
+ (line 62)
* no-dependencies: Dependencies. (line 34)
* no-dependencies <1>: List of Automake options.
- (line 69)
+ (line 70)
* no-dist: List of Automake options.
- (line 76)
+ (line 77)
* no-dist-gzip: List of Automake options.
- (line 80)
+ (line 81)
* no-dist-gzip <1>: List of Automake options.
- (line 80)
+ (line 81)
* no-exeext: List of Automake options.
- (line 83)
+ (line 84)
* no-installinfo: Texinfo. (line 85)
* no-installinfo <1>: List of Automake options.
- (line 93)
+ (line 94)
* no-installinfo option: Texinfo. (line 85)
* no-installman: Man Pages. (line 32)
* no-installman <1>: List of Automake options.
- (line 99)
+ (line 100)
* no-installman option: Man Pages. (line 32)
+* no-op Automake targets: Third-Party Makefiles.
+ (line 88)
* no-texinfo.tex: List of Automake options.
- (line 109)
-* nobase_ and dist_ or nodist_: Alternative. (line 29)
-* nobase_ prefix: Alternative. (line 23)
-* nodist_ and nobase_: Alternative. (line 29)
+ (line 110)
+* nobase_ and dist_ or nodist_: Alternative. (line 28)
+* nobase_ prefix: Alternative. (line 22)
+* nodist_ and nobase_: Alternative. (line 28)
* nodist_ and notrans_: Man Pages. (line 63)
* noinst_ primary prefix, definition: Uniform. (line 90)
* Non-GNU packages: Strictness. (line 6)
* Non-standard targets: General Operation. (line 12)
* nostdinc: List of Automake options.
- (line 105)
+ (line 106)
* notrans_ and dist_ or nodist_: Man Pages. (line 63)
* notrans_ prefix: Man Pages. (line 54)
* OBJCFLAGS and AM_OBJCFLAGS: Flag Variables Ordering.
* obsolete macros: Obsolete Macros. (line 6)
* optimized build, example: VPATH Builds. (line 48)
* Option, --warnings=CATEGORY: List of Automake options.
- (line 216)
+ (line 217)
* Option, -WCATEGORY: List of Automake options.
- (line 216)
+ (line 217)
* Option, check-news: List of Automake options.
- (line 14)
+ (line 15)
* Option, dejagnu: List of Automake options.
- (line 18)
+ (line 19)
* Option, dist-bzip2: List of Automake options.
- (line 22)
+ (line 23)
* Option, dist-lzip: List of Automake options.
- (line 25)
+ (line 26)
* Option, dist-shar: List of Automake options.
- (line 37)
+ (line 38)
* Option, dist-tarZ: List of Automake options.
- (line 42)
+ (line 43)
* Option, dist-xz: List of Automake options.
- (line 28)
+ (line 29)
* Option, dist-zip: List of Automake options.
- (line 31)
+ (line 32)
* Option, dist-zstd: List of Automake options.
- (line 34)
+ (line 35)
* Option, filename-length-max=99: List of Automake options.
- (line 47)
+ (line 48)
* Option, foreign: List of Automake options.
(line 9)
* Option, gnits: List of Automake options.
* Option, gnu: List of Automake options.
(line 9)
* Option, info-in-builddir: List of Automake options.
- (line 56)
+ (line 57)
* Option, no-define: List of Automake options.
- (line 61)
+ (line 62)
* Option, no-dependencies: List of Automake options.
- (line 69)
+ (line 70)
* Option, no-dist: List of Automake options.
- (line 76)
+ (line 77)
* Option, no-dist-gzip: List of Automake options.
- (line 80)
+ (line 81)
* Option, no-exeext: List of Automake options.
- (line 83)
+ (line 84)
* Option, no-installinfo: Texinfo. (line 85)
* Option, no-installinfo <1>: List of Automake options.
- (line 93)
+ (line 94)
* Option, no-installman: Man Pages. (line 32)
* Option, no-installman <1>: List of Automake options.
- (line 99)
+ (line 100)
* Option, no-texinfo.tex: List of Automake options.
- (line 109)
+ (line 110)
* Option, nostdinc: List of Automake options.
- (line 105)
+ (line 106)
* Option, parallel-tests: List of Automake options.
- (line 117)
+ (line 118)
* Option, readme-alpha: List of Automake options.
- (line 123)
+ (line 124)
* Option, serial-tests: List of Automake options.
- (line 113)
+ (line 114)
* Option, tar-pax: List of Automake options.
- (line 162)
+ (line 163)
* Option, tar-ustar: List of Automake options.
- (line 162)
+ (line 163)
* Option, tar-v7: List of Automake options.
- (line 162)
+ (line 163)
* Option, VERSION: List of Automake options.
- (line 211)
+ (line 212)
* Option, warnings: List of Automake options.
- (line 216)
+ (line 217)
* Options, aclocal: aclocal Options. (line 6)
* Options, automake: automake Invocation. (line 37)
* Options, std-options: List of Automake options.
- (line 132)
+ (line 133)
* Options, subdir-objects: List of Automake options.
- (line 153)
+ (line 154)
* Ordering flag variables: Flag Variables Ordering.
(line 6)
* Overriding make rules: General Operation. (line 46)
(line 6)
* Parallel build trees: VPATH Builds. (line 6)
* parallel-tests: List of Automake options.
- (line 117)
-* Path stripping, avoiding: Alternative. (line 23)
+ (line 118)
+* Path stripping, avoiding: Alternative. (line 22)
* pax format: List of Automake options.
- (line 162)
+ (line 163)
* pdf: Texinfo. (line 25)
* pdf <1>: Extending. (line 41)
* PDF output using Texinfo: Texinfo. (line 6)
* Programs, renaming during installation: Renaming. (line 6)
* prog_LDADD, defined: Linking. (line 12)
* Proxy Makefile for third-party packages: Third-Party Makefiles.
- (line 128)
+ (line 131)
* ps: Texinfo. (line 25)
* ps <1>: Extending. (line 41)
* PS output using Texinfo: Texinfo. (line 6)
* ps-local: Extending. (line 41)
+* py-compile: Auxiliary Programs. (line 70)
* PYTHON primary, defined: Python. (line 6)
* Ratfor programs: Preprocessing Fortran 77.
(line 6)
* read-only source tree: VPATH Builds. (line 91)
* readme-alpha: List of Automake options.
- (line 123)
-* README-alpha: Gnits. (line 42)
+ (line 124)
+* README-alpha: Strictness. (line 78)
* rebuild rules: Rebuilding. (line 6)
* rebuild rules <1>: CVS. (line 9)
* recheck: Parallel Test Harness.
* serial number and --install: aclocal Options. (line 42)
* serial numbers in macros: Serials. (line 6)
* serial-tests: List of Automake options.
- (line 113)
+ (line 114)
* serial-tests, Using: Serial Test Harness. (line 6)
* Shared libraries, support for: A Shared Library. (line 6)
* Silencing make: Silencing Make. (line 6)
* Special Automake comment: General Operation. (line 68)
* Staged installation: DESTDIR. (line 14)
* std-options: List of Automake options.
- (line 132)
+ (line 133)
* Strictness, command line: automake Invocation. (line 37)
* Strictness, defined: Strictness. (line 10)
-* Strictness, foreign: Strictness. (line 10)
-* Strictness, gnits: Strictness. (line 10)
-* Strictness, gnu: Strictness. (line 10)
+* Strictness, foreign: Strictness. (line 51)
+* Strictness, gnits: Strictness. (line 58)
+* Strictness, gnu: Strictness. (line 18)
* su, before make install: Basic Installation. (line 49)
* subdir-objects: List of Automake options.
- (line 153)
+ (line 154)
* Subdirectories, building conditionally: Conditional Subdirectories.
(line 6)
* Subdirectories, configured conditionally: Unconfigured Subdirectories.
* tags: Tags. (line 9)
* TAGS support: Tags. (line 6)
* tar formats: List of Automake options.
- (line 162)
+ (line 163)
* tar-pax: List of Automake options.
- (line 162)
+ (line 163)
* tar-ustar: List of Automake options.
- (line 162)
+ (line 163)
* tar-v7: List of Automake options.
- (line 162)
+ (line 163)
* Target, install-info: Texinfo. (line 85)
* Target, install-man: Man Pages. (line 32)
+* targets, making into no-op: Third-Party Makefiles.
+ (line 88)
* test case: Generalities about Testing.
(line 11)
* Test case result, registering: Log files generation and test results recording.
* test skip: Generalities about Testing.
(line 29)
* Test suites: Tests. (line 6)
+* test-driver: Auxiliary Programs. (line 73)
* Tests, expected failure: Scripts-based Testsuites.
(line 32)
* testsuite harness: Generalities about Testing.
* Texinfo flag, UPDATED: Texinfo. (line 35)
* Texinfo flag, UPDATED-MONTH: Texinfo. (line 35)
* Texinfo flag, VERSION: Texinfo. (line 35)
-* texinfo.tex: Texinfo. (line 70)
+* texinfo.tex: Auxiliary Programs. (line 77)
+* texinfo.tex <1>: Texinfo. (line 70)
* TEXINFOS primary, defined: Texinfo. (line 6)
* third-party files and CVS: CVS. (line 168)
* Third-party packages, interfacing with: Third-Party Makefiles.
* user variables: User Variables. (line 6)
* Using aclocal: configure. (line 6)
* ustar format: List of Automake options.
- (line 162)
+ (line 163)
* v7 tar format: List of Automake options.
- (line 162)
+ (line 163)
* Vala Support: Vala Support. (line 6)
* variables, conflicting: Extending. (line 14)
* Variables, overriding: General Operation. (line 51)
* yacc, multiple parsers: Yacc and Lex. (line 68)
* YFLAGS and AM_YFLAGS: Flag Variables Ordering.
(line 20)
-* ylwrap: Yacc and Lex. (line 68)
+* ylwrap: Auxiliary Programs. (line 85)
+* ylwrap <1>: Yacc and Lex. (line 68)
* zardoz example: Complete. (line 35)
* Include:: Including extra files in an Automake template
* Conditionals:: Conditionals
* Silencing Make:: Obtain less verbose output from @command{make}
-* Gnits:: The effect of @option{--gnu} and @option{--gnits}
* Not Enough:: When Automake is not Enough
* Distributing:: Distributing the Makefile.in
* API Versioning:: About compatibility between Automake versions
If you need some teaching material, more illustrations, or a less
@command{automake}-centered continuation, some slides for this
introduction are available in Alexandre Duret-Lutz's
-@uref{http://www.lrde.epita.fr/@/~adl/@/autotools.html,
+@uref{https://www.lrde.epita.fr/@/~adl/@/autotools.html,
Autotools Tutorial}.
This chapter is the written version of the first part of his tutorial.
@itemize @bullet
@item
It attempts a full compilation of the package (@pxref{Basic
-Installation}), unpacking the newly constructed tarball, running
-@code{make}, @code{make check}, @code{make install}, as well as
-@code{make installcheck}, and even @code{make dist},
+Installation}): unpacking the newly constructed tarball, running
+@code{make}, @code{make dvi}, @code{make check}, @code{make install},
+as well as @code{make installcheck}, and even @code{make dist},
@item
it tests VPATH builds with read-only source tree (@pxref{VPATH Builds}),
@item
and it checks that @code{DESTDIR} installations work (@pxref{DESTDIR}).
@end itemize
-All of these actions are performed in a temporary directory, so that no
-root privileges are required. Please note that the exact location and the
-exact structure of such a subdirectory (where the extracted sources are
-placed, how the temporary build and install directories are named and how
-deeply they are nested, etc.) is to be considered an implementation detail,
-which can change at any time; so do not rely on it.
+All of these actions are performed in a temporary directory, so that
+no root privileges are required. The exact location and the exact
+structure of such a subdirectory (where the extracted sources are
+placed, how the temporary build and install directories are named and
+how deeply they are nested, etc.) is to be considered an
+implementation detail, which can change at any time; so do not rely on
+it.
Releasing a package that fails @code{make distcheck} means that one of
the scenarios we presented will not work and some users will be
@node Strictness
@section Strictness
-
+@c "Gnits" used to be a separate section.
+@c This @anchor allows old links to still work.
+@anchor{Gnits}
@cindex Non-GNU packages
While Automake is intended to be used by maintainers of GNU packages, it
not want to use all the GNU conventions.
@cindex Strictness, defined
-@cindex Strictness, @option{foreign}
-@cindex @option{foreign} strictness
+To this end, Automake supports three levels of @dfn{strictness}---how
+stringently Automake should enforce conformance with GNU conventions.
+Each strictness level can be selected using an option of the same name;
+see @ref{Options}.
+
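+For instance, to select the relaxed @option{foreign} level (described
+below) for the whole package, @file{configure.ac} could contain (a
+minimal sketch, not a complete @file{configure.ac}):
+
+@example
+AM_INIT_AUTOMAKE([foreign])
+@end example
+
+@noindent
+or, equivalently for a single directory, @file{Makefile.am} could set
+@samp{AUTOMAKE_OPTIONS = foreign}.
+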
+The strictness levels are:
+
+@table @option
+@item gnu
@cindex Strictness, @option{gnu}
@cindex @option{gnu} strictness
-@cindex Strictness, @option{gnits}
-@cindex @option{gnits} strictness
+@opindex --gnu
+This is the default level of strictness. Automake will check for
+basic compliance with the GNU standards for software packaging.
+@xref{Top,,, standards, The GNU Coding Standards}, for full details
+of these standards. Currently the following checks are made:
-To this end, Automake supports three levels of @dfn{strictness}---the
-strictness indicating how stringently Automake should check standards
-conformance.
+@itemize @bullet
+@item
+The files @file{INSTALL}, @file{NEWS}, @file{README}, @file{AUTHORS},
+and @file{ChangeLog}, plus one of @file{COPYING.LIB}, @file{COPYING.LESSER}
+or @file{COPYING}, are required at the topmost directory of the package.
-The valid strictness levels are:
+If the @option{--add-missing} option is given, @command{automake} will
+add a generic version of the @file{INSTALL} file as well as the
+@file{COPYING} file containing the text of the current version of the
+GNU General Public License existing at the time of this Automake release
+(version 3 as this is written,
+@uref{https://www.gnu.org/@/copyleft/@/gpl.html}).
+However, an existing @file{COPYING} file will never be overwritten by
+@command{automake}.
+
+@item
+The options @option{no-installman} and @option{no-installinfo} are
+prohibited.
+@end itemize
+
+Future versions of Automake will add more checks at this level of
+strictness; it is advisable to be familiar with the precise requirements
+of the GNU standards.
+
+Future versions of Automake may, at this level of strictness, require
+certain non-standard GNU tools to be available to maintainer-only
+Makefile rules. For instance, in the future @command{pathchk}
+(@pxref{pathchk invocation,,, coreutils, GNU Coreutils})
+may be required to run @samp{make dist}.
-@table @option
@item foreign
+@cindex Strictness, @option{foreign}
+@cindex @option{foreign} strictness
+@opindex --foreign
Automake will check for only those things that are absolutely
required for proper operation. For instance, whereas GNU standards
dictate the existence of a @file{NEWS} file, it will not be required in
this mode. This strictness will also turn off some warnings by default
(among them, portability warnings).
-The name comes from the fact that Automake is intended to be
-used for GNU programs; these relaxed rules are not the standard mode of
-operation.
-
-@item gnu
-Automake will check---as much as possible---for compliance to the GNU
-standards for packages. This is the default.
@item gnits
+@cindex Strictness, @option{gnits}
+@cindex @option{gnits} strictness
+@opindex --gnits
Automake will check for compliance to the as-yet-unwritten @dfn{Gnits
standards}. These are based on the GNU standards, but are even more
detailed. Unless you are a Gnits standards contributor, it is
recommended that you avoid this option until such time as the Gnits
standard is actually published (which may never happen).
+
+Currently, @option{--gnits} does all the checks that
+@option{--gnu} does, and checks the following as well:
+
+@itemize @bullet
+@item
+@samp{make installcheck} will check to make sure that the @option{--help}
+and @option{--version} really print a usage message and a version string,
+respectively. This is the @option{std-options} option (@pxref{Options}).
+
+@item
+@samp{make dist} will check to make sure the @file{NEWS} file has been
+updated to the current version.
+
+@item
+@code{VERSION} is checked to make sure its format complies with Gnits
+standards.
+@c FIXME xref when standards are finished
+
+@item
+@cindex @file{README-alpha}
+If @code{VERSION} indicates that this is an alpha release, and the file
+@file{README-alpha} appears in the topmost directory of a package, then
+it is included in the distribution. This is done in @option{--gnits}
+mode, and no other, because this mode is the only one where version
+number formats are constrained, and hence the only mode where Automake
+can automatically determine whether @file{README-alpha} should be
+included.
+
+@item
+The file @file{THANKS} is required.
+@end itemize
+
@end table
-@xref{Gnits}, for more information on the precise implications of the
-strictness level.
@node Uniform
Traditionally, most unix-like systems have a length limitation for the
command line arguments and environment contents when creating new
processes (see for example
-@uref{http://www.in-ulm.de/@/~mascheck/@/various/@/argmax/} for an
+@uref{https://www.in-ulm.de/@/~mascheck/@/various/@/argmax/} for an
overview on this issue),
which of course also applies to commands spawned by @command{make}.
POSIX requires this limit to be at least 4096 bytes, and most modern
@table @code
@item ar-lib
+@cmindex ar-lib
This is a wrapper primarily for the Microsoft lib archiver, to make
it more POSIX-like.
@item compile
+@cmindex compile
This is a wrapper for compilers that do not accept options @option{-c}
and @option{-o} at the same time. It is only used when absolutely
required. Such compilers are rare, with the Microsoft C/C++ Compiler
@item config.guess
@itemx config.sub
+@cmindex config.guess
+@cmindex config.sub
These two programs compute the canonical triplets for the given build,
host, or target architecture. These programs are updated regularly to
support new architectures and fix probes broken by changes in new
release.
@item depcomp
+@cmindex depcomp
This program understands how to run a compiler so that it will
generate not only the desired output but also dependency information
that is then used by the automatic dependency tracking feature
(@pxref{Dependencies}).
@item install-sh
+@cmindex install-sh
This is a replacement for the @command{install} program that works on
platforms where @command{install} is unavailable or unusable.
@item mdate-sh
+@cmindex mdate-sh
This script is used to generate a @file{version.texi} file. It examines
a file and prints some date information about it.
@item missing
+@cmindex missing @r{program}
This wraps a number of programs that are typically only required by
maintainers. If the program in question doesn't exist, or seems too old,
@command{missing} will print an informative warning before failing out,
to provide the user with more context and information.
@item mkinstalldirs
+@cmindex mkinstalldirs
This script used to be a wrapper around @samp{mkdir -p}, which is not
portable. Now we prefer to use @samp{install-sh -d} when @command{configure}
finds that @samp{mkdir -p} does not work; this makes one less script to
longer installed automatically, and it should be safe to remove it.
@item py-compile
+@cmindex py-compile
This is used to byte-compile Python scripts.
@item test-driver
+@cmindex test-driver
This implements the default test driver offered by the parallel
testsuite harness.
@item texinfo.tex
-Not a program, this file is required for @samp{make dvi}, @samp{make
-ps} and @samp{make pdf} to work when Texinfo sources are in the
-package. The latest version can be downloaded from
-@url{https://www.gnu.org/software/texinfo/}.
+@cmindex texinfo.tex
+When Texinfo sources are in the package, this file is required for
+@samp{make dvi}, @samp{make ps} and @samp{make pdf}. The latest
+version can be downloaded from
+@url{https://www.gnu.org/software/texinfo/}. A working @TeX{}
+distribution, or at least a @file{tex} program, is also required.
+Furthermore, @samp{make dist} invokes @samp{make dvi}, so these become
+requirements for making a distribution with Texinfo sources.
@item ylwrap
+@cmindex ylwrap
This program wraps @command{lex} and @command{yacc} to rename their
output files. It also ensures that, for instance, multiple
@command{yacc} instances can be invoked in a single directory in
@item --gnits
@opindex --gnits
Set the global strictness to @option{gnits}. For more information, see
-@ref{Gnits}.
+@ref{Strictness}.
@item --gnu
@opindex --gnu
Set the global strictness to @option{gnu}. For more information, see
-@ref{Gnits}. This is the default strictness.
+@ref{Strictness}. This is the default strictness.
@item --help
@opindex --help
@opindex --version
Print the version number of Automake and exit.
-@item -W CATEGORY
-@itemx --warnings=@var{category}
+@item -W @var{category}[,@var{category}...]
+@itemx --warnings=@var{category}[,@var{category}...]
@opindex -W
@opindex --warnings
-Output warnings falling in @var{category}. @var{category} can be
-one of:
+Output warnings about a @var{category} of potential problems with the
+package. @var{category} can be any of:
+
@table @code
+@item cross
+Constructs compromising the ability to cross-compile the package.
@item gnu
-warnings related to the GNU Coding Standards
+Minor deviations from the GNU Coding Standards
(@pxref{Top, , , standards, The GNU Coding Standards}).
@item obsolete
-obsolete features or constructions
+Obsolete features or constructions.
@item override
-user redefinitions of Automake rules or variables
+Redefinitions of Automake rules or variables.
@item portability
-portability issues (e.g., use of @command{make} features that are
-known to be not portable)
+Portability issues (e.g., use of @command{make} features that are
+known to be not portable).
+@item portability-recursive
+Recursive, or nested, Make variable expansions (@code{$(foo$(x))}).
+These are not universally supported, but are more portable than the
+other non-portable constructs diagnosed by @option{-Wportability}.
+These warnings are turned on by @option{-Wportability} but can then be
+turned off specifically by @option{-Wno-portability-recursive}.
@item extra-portability
-extra portability issues related to obscure tools. One example of such
-a tool is the Microsoft @command{lib} archiver.
+Extra portability issues, related to rarely-used tools such as
+the Microsoft @command{lib} archiver.
@item syntax
-weird syntax, unused variables, typos
+Questionable syntax, unused variables, typos, etc.
@item unsupported
-unsupported or incomplete features
+Unsupported or incomplete features.
@item all
-all the warnings
+Turn on all the above categories of warnings.
@item none
-turn off all the warnings
+Turn off all the above categories of warnings.
@item error
-treat warnings as errors
+Treat warnings as errors.
@end table
A category can be turned off by prefixing its name with @samp{no-}. For
instance, @option{-Wno-syntax} will hide the warnings about unused
variables.
-The categories output by default are @samp{obsolete}, @samp{syntax} and
-@samp{unsupported}. Additionally, @samp{gnu} and @samp{portability}
-are enabled in @option{--gnu} and @option{--gnits} strictness.
+Warnings in the @samp{gnu}, @samp{obsolete}, @samp{portability},
+@samp{syntax}, and @samp{unsupported} categories are turned on by
+default. The @samp{gnu} and @samp{portability} categories are turned
+off in @option{--foreign} strictness.
@c Checked by extra-portability.sh
Turning off @samp{portability} will also turn off @samp{extra-portability},
@samp{portability}. However, turning on @samp{portability} or turning
off @samp{extra-portability} will not affect the other category.
+Unknown warning categories supplied as an argument to @option{-W} will
+themselves produce a warning, in the @samp{unsupported} category. This
+warning is never treated as an error.
+
@vindex WARNINGS
The environment variable @env{WARNINGS} can contain a comma separated
-list of categories to enable. It will be taken into account before the
-command-line switches, this way @option{-Wnone} will also ignore any
-warning category enabled by @env{WARNINGS}. This variable is also used
-by other tools like @command{autoconf}; unknown categories are ignored
-for this reason.
+list of categories to enable. @option{-W} settings on the command line
+take precedence; for instance, @option{-Wnone} also turns off any
+warning categories enabled by @env{WARNINGS}.
+
+Unknown warning categories named in @env{WARNINGS} are silently ignored.
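+
+For instance (an arbitrary combination, shown purely as an
+illustration), the following enables every category through the
+environment and then silences only the portability warnings on the
+command line:
+
+@example
+WARNINGS=all automake -Wno-portability
+@end example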
@end table
@cindex @command{dmalloc}, support for
@vindex WITH_DMALLOC
@opindex --with-dmalloc
-Add support for the @uref{http://dmalloc.com/, Dmalloc package}. If
+Add support for the @uref{https://dmalloc.com/, Dmalloc package}. If
the user runs @command{configure} with @option{--with-dmalloc}, then
define @code{WITH_DMALLOC} and add @option{-ldmalloc} to @code{LIBS}.
@node Alternative
@section An Alternative Approach to Subdirectories
-If you've ever read Peter Miller's excellent paper,
-@uref{http://miller.emu.id.au/pmiller/books/rmch/,
-Recursive Make Considered Harmful}, the preceding sections on the use of
-make recursion will probably come as unwelcome advice. For those who
+If you've ever read Peter Miller's excellent paper, @cite{Recursive
+Make Considered Harmful}, the preceding sections on the use of make
+recursion will probably come as unwelcome advice. For those who
haven't read the paper, Miller's main thesis is that recursive
@command{make} invocations are both slow and error-prone.
determined until @file{./configure} is run: not all platforms support
all kinds of libraries, and users can explicitly select which
libraries should be built. (However the package's maintainers can
-tune the default; @pxref{AC_PROG_LIBTOOL, , The @code{AC_PROG_LIBTOOL}
+tune the default; @pxref{LT_INIT, , The @code{LT_INIT}
macro, libtool, The Libtool Manual}.)
@cindex suffix @file{.lo}, defined
However, there are many other issues related to mixing Fortran 77 with
other languages that are @emph{not} (currently) handled by Automake, but
that are handled by other packages@footnote{For example,
-@uref{http://www-zeus.desy.de/~burow/cfortran/, the cfortran package}
+@uref{https://www-zeus.desy.de/~burow/cfortran/, the cfortran package}
addresses all of these inter-language issues, and runs under nearly all
Fortran 77, C and C++ compilers on nearly all platforms. However,
@command{cfortran} is not yet Free Software, but it will be in the next
@cindex Support for Vala
Automake provides initial support for Vala
-(@uref{http://www.vala-project.org/}).
+(@uref{https://www.vala-project.org/}).
This requires valac version 0.7.0 or later, and currently requires
the user to use GNU @command{make}.
Search for a Vala compiler in @env{PATH}. If it is found, the variable
@code{VALAC} is set to point to it (see below for more details). This
macro takes three optional arguments. The first argument, if present,
-is the minimum version of the Vala compiler required to compile this
-package. If a compiler is found and satisfies @var{minimum-version},
-then @var{action-if-found} is run (this defaults to do nothing).
-Otherwise, @var{action-if-not-found} is run. If @var{action-if-not-found}
-is not specified, the default value is to print a warning in case no
-compiler is found, or if a too-old version of the compiler is found.
+is the minimum version of the Vala API required to compile this package.
+For Vala releases, this is the same as the major and minor release
+number; e.g., when @code{valac --version} reports @code{0.48.7},
+@code{valac --api-version} reports @code{0.48}. If a compiler is found
+and satisfies @var{minimum-version}, then @var{action-if-found} is run
+(this defaults to do nothing). Otherwise, @var{action-if-not-found} is
+run. If @var{action-if-not-found} is not specified, the default value
+is to print a warning in case no compiler is found, or if a too-old
+version of the compiler is found.
@end defmac
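+
+For illustration (the version number and error message here are
+arbitrary), a @file{configure.ac} fragment could insist on a given
+Vala API level and abort otherwise:
+
+@example
+AM_PROG_VALAC([0.48], [],
+              [AC_MSG_ERROR([a Vala compiler with API >= 0.48 is required])])
+@end example
+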
There are a few variables that are used when compiling Vala sources:
@cindex @code{BUILT_SOURCES}, defined
The @code{BUILT_SOURCES} variable is a workaround for this problem. A
-source file listed in @code{BUILT_SOURCES} is made on @samp{make all}
-or @samp{make check} (or even @samp{make install}) before other
-targets are processed. However, such a source file is not
-@emph{compiled} unless explicitly requested by mentioning it in some
-other @code{_SOURCES} variable.
+source file listed in @code{BUILT_SOURCES} is made when @samp{make
+all}, @samp{make check}, @samp{make install}, @samp{make install-exec}
+(or @code{make dist}) is run, before other targets are processed.
+However, such a source file is not @emph{compiled} unless explicitly
+requested by mentioning it in some other @code{_SOURCES} variable.
So, to conclude our introductory example, we could use
@samp{BUILT_SOURCES = foo.h} to ensure @file{foo.h} gets built before
another source), because it's a known dependency of the associated
object.
-It might be important to emphasize that @code{BUILT_SOURCES} is
-honored only by @samp{make all}, @samp{make check} and @samp{make
-install}. This means you cannot build a specific target (e.g.,
-@samp{make foo}) in a clean tree if it depends on a built source.
-However it will succeed if you have run @samp{make all} earlier,
-because accurate dependencies are already available.
+To emphasize, @code{BUILT_SOURCES} is honored only by @samp{make all},
+@samp{make check}, @samp{make install}, and @code{make install-exec}
+(and @samp{make dist}). This means you cannot build an arbitrary
+target (e.g., @samp{make foo}) in a clean tree if it depends on a
+built source. However it will succeed if you have run @samp{make all}
+earlier, because accurate dependencies are already available.
The next section illustrates and discusses the handling of built sources
on a toy example.
@item PYTHON_VERSION
The Python version number, in the form @var{major}.@var{minor}
(e.g., @samp{2.5}). This is currently the value of
-@samp{sys.version[:3]}.
+@samp{'%u.%u' % sys.version_info[:2]}.
@item PYTHON_PREFIX
The string @samp{$@{prefix@}}. This term may be used in future work
@file{.dvi} file. This defaults to @samp{dvips}.
@item TEXINFO_TEX
-
If your package has Texinfo files in many directories, you can use the
variable @code{TEXINFO_TEX} to tell Automake where to find the canonical
@file{texinfo.tex} for your package. The value of this variable should
@code{AC_INIT} invocation or by a @emph{deprecated} two-arguments
invocation of the @code{AM_INIT_AUTOMAKE} macro (see @ref{Public Macros}
for how these variables get their values, from either defaults or explicit
-values -- it's slightly trickier than one would expect).
+values---it's slightly trickier than one would expect).
More precisely the gzipped @code{tar} file is named
@samp{$@{PACKAGE@}-$@{VERSION@}.tar.gz}.
+
@vindex GZIP_ENV
You can use the @command{make} variable @code{GZIP_ENV} to control how gzip
is run. The default setting is @option{--best}.
+@c See automake #9822.
+@vindex TAR
+You can set the environment variable @code{TAR} to override the tar
+program used; it defaults to @code{tar}.
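+
+For example (the values are chosen purely for illustration), the
+following builds a distribution with GNU tar installed as
+@command{gtar} and a faster gzip setting:
+
+@example
+TAR=gtar make dist GZIP_ENV=-1
+@end example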
+
@cindex @code{m4_include}, distribution
@cindex @code{include}, distribution
@acindex m4_include
@example
EXTRA_DIST = doc
dist-hook:
+ chmod -R u+w $(distdir)/doc
rm -rf `find $(distdir)/doc -type d -name .svn`
@end example
EXTRA_DIST = README doc
dist-hook:
chmod u+w $(distdir)/README $(distdir)/doc
- echo "Distribution date: `date`" >> README
+ echo "Distribution date: `date`" >> $(distdir)/README
rm -f $(distdir)/doc/HACKING
@end example
Automake also generates a @code{distcheck} rule that can be of help
to ensure that a given distribution will actually work. Simplifying
a bit, we can say this rule first makes a distribution, and then,
-@emph{operating from it}, takes the following steps:
+@emph{operating from it}, takes the following steps (in this order):
@itemize
@item
tries to do a @code{VPATH} build (@pxref{VPATH Builds}), with the
@code{srcdir} and all its content made @emph{read-only};
@item
+tries to make the printable documentation, if any (with @command{make dvi}),
+@item
runs the test suite (with @command{make check}) on this fresh build;
@item
installs the package in a temporary directory (with @command{make
self-contained.
@end itemize
-All of these actions are performed in a temporary directory. Please
-note that the exact location and the exact structure of such a directory
-(where the read-only sources are placed, how the temporary build and
-install directories are named and how deeply they are nested, etc.) is
-to be considered an implementation detail, which can change at any time;
-so do not rely on it.
+All of these actions are performed in a temporary directory. The
+exact location and the exact structure of such a directory (where the
+read-only sources are placed, how the temporary build and install
+directories are named and how deeply they are nested, etc.) is to be
+considered an implementation detail, which can change at any time; so
+do not rely on it.
@vindex AM_DISTCHECK_CONFIGURE_FLAGS
@vindex DISTCHECK_CONFIGURE_FLAGS
installcheck} was wrongly assuming it could blindly test "@command{m4}",
rather than the just-installed "@command{gm4}".
+@trindex dvi
+@subheading dvi and distcheck
+@cindex @code{eps} images
+Ordinarily, @command{make distcheck} runs @command{make dvi}. It does
+nothing if the distribution contains no Texinfo sources. If the
+distribution does contain a Texinfo manual, by default the @code{dvi}
+target will run @TeX{} to make sure it can be successfully processed
+(@pxref{Texinfo}).
+
+However, you may wish to test the manual by producing @code{pdf}
+(e.g., if your manual uses images in formats other than @code{eps}),
+@code{html} (if you don't have @TeX{} at all), some other format, or
+just skip the test entirely (not recommended). You can change the
+target that is run by setting the variable
+@code{AM_DISTCHECK_DVI_TARGET} in your @code{Makefile.am}; for
+example,
+
+@example
+AM_DISTCHECK_DVI_TARGET = pdf
+@end example
+
+To make @code{dvi} into a do-nothing target, see the example for
+@code{EMPTY_AUTOMAKE_TARGETS} in @ref{Third-Party Makefiles}.
+
@trindex distcheck-hook
@subheading distcheck-hook
If the @code{distcheck-hook} rule is defined in your top-level
simply make no sense on a given system (for example, a test checking a
Windows-specific feature makes no sense on a GNU/Linux system). In this
case, according to the definition above, the tests can neither be
-considered passed nor failed; instead, they are @emph{skipped} -- i.e.,
+considered passed nor failed; instead, they are @emph{skipped}---i.e.,
they are not run, or their result is anyway ignored for what concerns
the count of failures and successes. Skips are usually explicitly
reported though, so that the user will be aware that not all of the
By default, only the exit statuses of the test scripts are considered when
determining the testsuite outcome. But Automake allows also the use of
more complex test protocols, either standard (@pxref{Using the TAP test
-protocol}) or custom (@pxref{Custom Test Drivers}). Note that you can't
+protocol}) or custom (@pxref{Custom Test Drivers}). You can't
enable such protocols when the serial harness is used, though.
In the rest of this section we are going to concentrate mostly on
protocol-less tests, since we cover test protocols in a later section
@noindent
A testsuite summary (expected to report at least the number of run,
skipped and failed tests) will be printed at the end of the testsuite
-run.
+run. By default, the first line of the summary has the form:
+
+@example
+Testsuite summary for @var{package-string}
+@end example
+
+@c See automake bug#11745.
+@vindex AM_TESTSUITE_SUMMARY_HEADER
+@noindent
+where @var{package-string} is the name and version of the package. If
+you have several independent test suites for different parts of the
+package, though, it can be misleading for each suite to imply it is
+for the whole package. Or, in complex projects, you may wish to add
+the current directory or other information to the testsuite header
+line. So you can override the @samp{ for @var{package-string}} suffix
+on that line by setting the @code{AM_TESTSUITE_SUMMARY_HEADER}
+variable. The value of this variable is used unquoted in a shell echo
+command, so you must include any necessary quotes. For example, the
+default value is
+
+@example
+AM_TESTSUITE_SUMMARY_HEADER = ' for $(PACKAGE_STRING)'
+@end example
+
+@noindent
+including the single quotes (interpreted by the shell) and the leading
+space (since the value is output directly after the @samp{Testsuite
+summary}). The @code{$(PACKAGE_STRING)} is substituted by @code{make}.
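+
+For instance, a hypothetical sub-testsuite covering only a package's
+library could identify itself as follows:
+
+@example
+AM_TESTSUITE_SUMMARY_HEADER = ' for $(PACKAGE_STRING) (library tests)'
+@end example
+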
@anchor{Simple tests and color-tests}
@vindex AM_COLOR_TESTS
@file{.trs} files (@pxref{Basics of test metadata}) to store the test
results and related metadata. Apart from that, it will try to remain
as compatible as possible with pre-existing and widespread utilities,
-such as the @uref{http://search.cpan.org/~andya/Test-Harness/bin/prove,
+such as the
+@uref{https://metacpan.org/pod/distribution/Test-Harness/bin/prove,
@command{prove} utility}, at least for the simpler usages.
TAP started its life as part of the test harness for Perl, but today
implementations in different languages; among them, C, C++, Perl,
Python, PHP, and Java. For a semi-official specification of the
TAP protocol, please refer to the documentation of
-@uref{http://search.cpan.org/~petdance/Test-Harness/lib/Test/Harness/TAP.pod,
- @samp{Test::Harness::TAP}}.
+@uref{https://metacpan.org/pod/Test::Harness, @samp{Test::Harness}}.
The most relevant real-world usages of TAP are obviously in the testsuites
-of @command{perl} and of many perl modules. Still, other important
-third-party packages, such as @uref{http://git-scm.com/, @command{git}},
+of @command{perl} and of many Perl modules. Still, other important
+third-party packages, such as @uref{https://git-scm.com/, @command{git}},
also use TAP in their testsuite.
@node Use TAP with the Automake test harness
tools and libraries.
@itemize @bullet
@item
-@uref{http://search.cpan.org/~petdance/Test-Harness/lib/Test/Harness/TAP.pod,
- @samp{Test::Harness::TAP}},
+@uref{https://metacpan.org/pod/Test::Harness, @samp{Test::Harness}},
the (mostly) official documentation about the TAP format and protocol.
@item
-@uref{http://search.cpan.org/~andya/Test-Harness/bin/prove,
+@uref{https://metacpan.org/pod/distribution/Test-Harness/bin/prove,
@command{prove}},
the most famous command-line TAP test driver, included in the distribution
of @command{perl} and
-@uref{http://search.cpan.org/~andya/Test-Harness/lib/Test/Harness.pm,
+@uref{https://metacpan.org/pod/distribution/Test-Harness/lib/Test/Harness.pm,
@samp{Test::Harness}}.
@item
-The @uref{http://testanything.org/wiki/index.php/Main_Page,TAP wiki}.
+The @uref{https://testanything.org/,TAP wiki}.
@item
-A ``gentle introduction'' to testing for perl coders:
-@uref{http://search.cpan.org/dist/Test-Simple/lib/Test/Tutorial.pod,
+A ``gentle introduction'' to testing for Perl coders:
+@uref{https://metacpan.org/pod/distribution/Test-Simple/lib/Test/Tutorial.pod,
@samp{Test::Tutorial}}.
@item
-@uref{http://search.cpan.org/~mschwern/Test-Simple/lib/Test/Simple.pm,
+@uref{https://metacpan.org/pod/distribution/Test-Simple/lib/Test/Simple.pm,
@samp{Test::Simple}}
and
-@uref{http://search.cpan.org/~mschwern/Test-Simple/lib/Test/More.pm,
+@uref{https://metacpan.org/pod/distribution/Test-Simple/lib/Test/More.pm,
@samp{Test::More}},
-the standard perl testing libraries, which are based on TAP.
+the standard Perl testing libraries, which are based on TAP.
@item
-@uref{http://www.eyrie.org/~eagle/software/c-tap-harness/,C TAP Harness},
+@uref{https://www.eyrie.org/~eagle/software/c-tap-harness/,C TAP Harness},
a C-based project implementing both a TAP producer and a TAP consumer.
@item
-@uref{http://www.tap4j.org/,tap4j},
+@uref{https://tap4j.org/,tap4j},
a Java-based project implementing both a TAP producer and a TAP consumer.
@end itemize
@node DejaGnu Tests
@section DejaGnu Tests
-If @uref{https://ftp.gnu.org/gnu/dejagnu/, @command{dejagnu}} appears in
-@code{AUTOMAKE_OPTIONS}, then a @command{dejagnu}-based test suite is
-assumed. The variable @code{DEJATOOL} is a list of names that are
-passed, one at a time, as the @option{--tool} argument to
-@command{runtest} invocations; it defaults to the name of the package.
+If @command{dejagnu} (@pxref{Top, , Introduction, dejagnu, DejaGnu})
+appears in @code{AUTOMAKE_OPTIONS}, then a @command{dejagnu}-based
+test suite is assumed. The variable @code{DEJATOOL} is a list of
+names that are passed, one at a time, as the @option{--tool} argument
+to @command{runtest} invocations; it defaults to the name of the
+package.
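+
+A minimal @file{Makefile.am} setup (the tool name @samp{mytool} is
+hypothetical) could look like this:
+
+@example
+AUTOMAKE_OPTIONS = dejagnu
+DEJATOOL = mytool
+@end example
+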
The variable @code{RUNTESTDEFAULTFLAGS} holds the @option{--tool} and
@option{--srcdir} flags that are passed to dejagnu by default; this can be
@opindex gnu
@opindex foreign
-Set the strictness as appropriate. The @option{gnits} option also
-implies options @option{readme-alpha} and @option{check-news}.
+Set the strictness as appropriate. @xref{Strictness}.
+The @option{gnits} option also implies options @option{readme-alpha} and
+@option{check-news}.
@item @option{check-news}
@cindex Option, @option{check-news}
@command{make} if the ``@i{Entering/Leaving directory ...}'' messages
are to be disabled.
-@node Gnits
-@chapter The effect of @option{--gnu} and @option{--gnits}
-
-@cindex @option{--gnu}, required files
-@cindex @option{--gnu}, complete description
-
-The @option{--gnu} option (or @option{gnu} in the
-@code{AUTOMAKE_OPTIONS} variable) causes @command{automake} to check
-the following:
-
-@itemize @bullet
-@item
-The files @file{INSTALL}, @file{NEWS}, @file{README}, @file{AUTHORS},
-and @file{ChangeLog}, plus one of @file{COPYING.LIB}, @file{COPYING.LESSER}
-or @file{COPYING}, are required at the topmost directory of the package.
-
-If the @option{--add-missing} option is given, @command{automake} will
-add a generic version of the @file{INSTALL} file as well as the
-@file{COPYING} file containing the text of the current version of the
-GNU General Public License existing at the time of this Automake release
-(version 3 as this is written, @uref{https://www.gnu.org/@/copyleft/@/gpl.html}).
-However, an existing @file{COPYING} file will never be overwritten by
-@command{automake}.
-
-@item
-The options @option{no-installman} and @option{no-installinfo} are
-prohibited.
-@end itemize
-
-Note that this option will be extended in the future to do even more
-checking; it is advisable to be familiar with the precise requirements
-of the GNU standards. Also, @option{--gnu} can require certain
-non-standard GNU programs to exist for use by various maintainer-only
-rules; for instance, in the future @command{pathchk} might be required for
-@samp{make dist}.
-
-@cindex @option{--gnits}, complete description
-
-The @option{--gnits} option does everything that @option{--gnu} does, and
-checks the following as well:
-
-@itemize @bullet
-@item
-@samp{make installcheck} will check to make sure that the @option{--help}
-and @option{--version} really print a usage message and a version string,
-respectively. This is the @option{std-options} option (@pxref{Options}).
-
-@item
-@samp{make dist} will check to make sure the @file{NEWS} file has been
-updated to the current version.
-
-@item
-@code{VERSION} is checked to make sure its format complies with Gnits
-standards.
-@c FIXME xref when standards are finished
-
-@item
-@cindex @file{README-alpha}
-If @code{VERSION} indicates that this is an alpha release, and the file
-@file{README-alpha} appears in the topmost directory of a package, then
-it is included in the distribution. This is done in @option{--gnits}
-mode, and no other, because this mode is the only one where version
-number formats are constrained, and hence the only mode where Automake
-can automatically determine whether @file{README-alpha} should be
-included.
-
-@item
-The file @file{THANKS} is required.
-@end itemize
-
-
@node Not Enough
@chapter When Automake Isn't Enough
third-party project with no documentation or tag support, you could
simply augment its @file{Makefile} as follows:
+@vindex EMPTY_AUTOMAKE_TARGETS
+@cindex Automake targets, no-op
+@cindex do-nothing Automake targets
+@cindex empty Automake targets
+@cindex no-op Automake targets
+@cindex targets, making into no-op
@example
EMPTY_AUTOMAKE_TARGETS = dvi pdf ps info html tags ctags
.PHONY: $(EMPTY_AUTOMAKE_TARGETS)
$(EMPTY_AUTOMAKE_TARGETS):
@end example
+To be clear, there is nothing special about the variable name
+@code{EMPTY_AUTOMAKE_TARGETS}; the name could be anything.
+
Another aspect of integrating third-party build systems is whether
they support VPATH builds (@pxref{VPATH Builds}). Obviously if the
subpackage does not support VPATH builds the whole package will not
These files, whether they are kept under CVS or not, raise similar
concerns about version mismatch between developers' tools. The
-Gettext manual has a section about this; see @ref{CVS Issues, CVS
-Issues, Integrating with CVS, gettext, GNU gettext tools}.
+Gettext manual has a section about this; see @ref{Version Control Issues,,
+Integrating with Version Control Systems, gettext, GNU gettext tools}.
@node maintainer-mode
@section @command{missing} and @code{AM_MAINTAINER_MODE}
Nowadays it is no longer worth worrying about the 8.3 limits of
DOS file systems.
-@c FIXME This should probably be moved in the "Checking the Distribution"
+@c FIXME This should probably be moved to the "Checking the Distribution"
@c FIXME section...
@node Errors with distclean
@section Errors with distclean
data.c: data.foo
foo data.foo
data.h: data.c
+## Recover from the removal of $@@
+ @@test -f $@@ || rm -f data.c
+ @@test -f $@@ || $(MAKE) $(AM_MAKEFLAGS) data.c
+@end example
+
+It is tempting to use a single test as follows:
+
+@example
+data.h: data.c
## Recover from the removal of $@@
@@if test -f $@@; then :; else \
rm -f data.c; \
fi
@end example
+@noindent
+but that would break @samp{make -n}: at least GNU @command{make} and
+Solaris @command{make} execute any recipe line containing the
+@samp{$(MAKE)} string, even when running in dry-run mode. So if we
+had not split the recipe above into two commands, the file
+@file{data.c} would be removed even by @samp{make -n}. Not nice.
+
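To see why the split matters, here is a rough sketch (assuming GNU
@command{make}) of what @samp{make -n data.h} does with the
two-command recipe when @file{data.h} has been removed:

@example
test -f data.h || rm -f data.c    # only printed, never executed
test -f data.h || make data.c     # executed, but the sub-make inherits -n
@end example

@noindent
The @command{rm} sits on a line without @samp{$(MAKE)}, so the dry run
merely echoes it, and the recursive @command{make} that does get
executed is itself a dry run; @file{data.c} is neither removed nor
rebuilt.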
+
The above scheme can be extended to handle more outputs and more
inputs. One of the outputs is selected to serve as a witness to the
successful completion of the command: it depends upon all inputs, and
foo data.foo data.bar
data.h data.w data.x: data.c
## Recover from the removal of $@@
- @@if test -f $@@; then :; else \
- rm -f data.c; \
- $(MAKE) $(AM_MAKEFLAGS) data.c; \
- fi
+ @@test -f $@@ || rm -f data.c
+ @@test -f $@@ || $(MAKE) $(AM_MAKEFLAGS) data.c
@end example
However, there are now three minor problems in this setup. One is related
data.c: data.foo data.bar
foo data.foo data.bar
data.h data.w data.x: data.c
- @@if test -f $@@; then \
- touch $@@; \
- else \
+ @@test ! -f $@@ || touch $@@
## Recover from the removal of $@@
- rm -f data.c; \
- $(MAKE) $(AM_MAKEFLAGS) data.c; \
- fi
+ @@test -f $@@ || rm -f data.c
+ @@test -f $@@ || $(MAKE) $(AM_MAKEFLAGS) data.c
@end example
Another solution is to use a different and dedicated file as witness,
@@mv -f data.tmp $@@
data.c data.h data.w data.x: data.stamp
## Recover from the removal of $@@
- @@if test -f $@@; then :; else \
- rm -f data.stamp; \
- $(MAKE) $(AM_MAKEFLAGS) data.stamp; \
- fi
+ @@test -f $@@ || rm -f data.stamp
+ @@test -f $@@ || $(MAKE) $(AM_MAKEFLAGS) data.stamp
@end example
@file{data.tmp} is created before @command{foo} is run, so it has a
You can look at the @uref{https://debbugs.gnu.org/, GNU Bug Tracker}
and the @uref{https://lists.gnu.org/@/archive/@/html/@/bug-automake/,
bug-automake mailing list archives} for previous bug reports. We
-previously used a
-@uref{http://sourceware.org/@/cgi-bin/@/gnatsweb.pl?database=automake,
-Gnats database} for bug tracking, so some bugs might have been reported
-there already. Please do not use it for new bug reports, however.
+previously used a Gnats database for bug tracking, but it is no longer
+online.
If the bug is not already known, it should be reported. It is very
important to report bugs in a way that is useful and efficient. For
this, please familiarize yourself with
-@uref{http://www.chiark.greenend.org.uk/@/~sgtatham/@/bugs.html, How to
+@uref{https://www.chiark.greenend.org.uk/@/~sgtatham/@/bugs.html, How to
Report Bugs Effectively} and
@uref{http://catb.org/@/~esr/@/faqs/@/smart-questions.html, How to Ask
Questions the Smart Way}. This helps you and developers to save time,
%D%/automake-$(APIVERSION).1: $(automake_script) lib/Automake/Config.pm
$(update_mans) automake-$(APIVERSION)
+## This target is not invoked as a dependency of anything. It exists
+## merely to make checking the links in automake.texi (that is,
+## automake.html) more convenient. We use a slightly enhanced version of
+## W3C checklink to do this. We intentionally do not have automake.html
+## as a dependency, as it seems more convenient to have its regeneration
+## under manual control. See https://debbugs.gnu.org/10371.
+##
+checklinkx = $(top_srcdir)/contrib/checklinkx
+# that 8-second sleep seems to be what gnu.org likes.
+chlx_args = -v --sleep 8 #--exclude-url-file=/tmp/xf
+# Explanation of excludes:
+# - w3.org dtds, they are fine (and slow).
+# - mailto urls, they are always forbidden.
+# - vala, redirects to a Gnome subpage and returns 403 to us.
+# - cfortran, forbidden by site's robots.txt.
+# - search.cpan.org, gets
+# - debbugs.gnu.org/automake, forbidden by robots.txt.
+# - autoconf.html, forbidden by robots.txt (since served from savannah).
+# - https://fsf.org redirects to https://www.fsf.org; there is nothing to
+#   do about it (the URL appears in the FDL), and the --suppress-redirect
+#   options do not suppress the message.
+#
+chlx_excludes = \
+ -X 'http.*w3\.org/.*dtd' \
+ -X 'mailto:.*' \
+ -X 'https://www\.vala-project\.org/' \
+ -X 'https://www-zeus\.desy\.de/~burow/cfortran/' \
+ -X 'http://xsearch\.cpan\.org/~mschwern/Test-Simple/lib/Test/More\.pm' \
+ -X 'https://debbugs\.gnu\.org/automake' \
+ -X 'https://www\.gnu\.org/software/autoconf/manual/autoconf\.html' \
+ -X 'https://fsf\.org/'
+chlx_file = $(top_srcdir)/doc/automake.html
+.PHONY: checklinkx
+checklinkx:
+ $(checklinkx) $(chlx_args) $(chlx_excludes) $(chlx_file)
+
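A brief usage sketch, assuming doc/automake.html has already been
regenerated (for instance via the standard 'make html' target):

  make html          # rebuild automake.html first
  make checklinkx    # then check the links in it
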
## ---------------------------- ##
## Example package "amhello". ##
## ---------------------------- ##
-@set UPDATED 1 February 2020
-@set UPDATED-MONTH February 2020
-@set EDITION 1.16.2
-@set VERSION 1.16.2
+@set UPDATED 19 November 2020
+@set UPDATED-MONTH November 2020
+@set EDITION 1.16.3
+@set VERSION 1.16.3
-@set UPDATED 1 February 2020
-@set UPDATED-MONTH February 2020
-@set EDITION 1.16.2
-@set VERSION 1.16.2
+@set UPDATED 19 November 2020
+@set UPDATED-MONTH November 2020
+@set EDITION 1.16.3
+@set VERSION 1.16.3
package Automake::ChannelDefs;
-use Automake::Config;
-BEGIN
-{
- if ($perl_threads)
- {
- require threads;
- import threads;
- }
-}
-use Automake::Channels;
-
=head1 NAME
Automake::ChannelDefs - channel definitions for Automake and helper functions
use Automake::ChannelDefs;
- Automake::ChannelDefs::usage ();
+ print Automake::ChannelDefs::usage (), "\n";
prog_error ($MESSAGE, [%OPTIONS]);
error ($WHERE, $MESSAGE, [%OPTIONS]);
error ($MESSAGE);
verb ($MESSAGE, [%OPTIONS]);
switch_warning ($CATEGORY);
parse_WARNINGS ();
- parse_warnings ($OPTION, $ARGUMENT);
+ parse_warnings (@CATEGORIES);
Automake::ChannelDefs::set_strictness ($STRICTNESS_NAME);
=head1 DESCRIPTION
-This packages defines channels that can be used in Automake to
+This package defines channels that can be used in Automake to
output diagnostics and other messages (via C<msg()>). It also defines
some helper functions to enable or disable these channels, and some
shorthand functions to output on specific channels.
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Exporter;
-use vars qw (@ISA @EXPORT);
+use Automake::Channels;
+use Automake::Config;
+BEGIN
+{
+ if ($perl_threads)
+ {
+ require threads;
+ import threads;
+ }
+}
-@ISA = qw (Exporter);
-@EXPORT = qw (&prog_error &error &fatal &verb
- &switch_warning &parse_WARNINGS &parse_warnings);
+our @ISA = qw (Exporter);
+our @EXPORT = qw (&prog_error &error &fatal &verb
+ &switch_warning &parse_WARNINGS &parse_warnings
+ &merge_WARNINGS);
=head2 CHANNELS
Internal errors. Use C<&prog_error> to send messages over this channel.
+=item C<cross>
+
+Constructs compromising the cross-compilation of the package.
+
=item C<gnu>
Warnings related to GNU Coding Standards.
=item C<obsolete>
-Warnings about obsolete features (silent by default).
+Warnings about obsolete features.
=item C<override>
Warnings about non-portable constructs.
+=item C<portability-recursive>
+
+Warnings about recursive variable expansions (C<$(foo$(x))>).
+These are not universally supported, but are more portable than
+the other non-portable constructs diagnosed by C<-Wportability>.
+These warnings are turned on by C<-Wportability> but can then be
+turned off separately by C<-Wno-portability-recursive>.
+
=item C<extra-portability>
Extra warnings about non-portable constructs covering obscure tools.
footer => "\nPlease contact <$PACKAGE_BUGREPORT>.",
uniq_part => UP_NONE, ordered => 0;
-register_channel 'extra-portability', type => 'warning', silent => 1;
+register_channel 'cross', type => 'warning', silent => 1;
register_channel 'gnu', type => 'warning';
register_channel 'obsolete', type => 'warning';
register_channel 'override', type => 'warning', silent => 1;
register_channel 'portability', type => 'warning', silent => 1;
+register_channel 'extra-portability', type => 'warning', silent => 1;
register_channel 'portability-recursive', type => 'warning', silent => 1;
register_channel 'syntax', type => 'warning';
register_channel 'unsupported', type => 'warning';
=item C<usage ()>
-Display warning categories.
+Return the warning category descriptions.
=cut
sub usage ()
{
- print <<EOF;
-Warning categories include:
- gnu GNU coding standards (default in gnu and gnits modes)
- obsolete obsolete features or constructions
- override user redefinitions of Automake rules or variables
- portability portability issues (default in gnu and gnits modes)
- extra-portability extra portability issues related to obscure tools
- syntax dubious syntactic constructs (default)
- unsupported unsupported or incomplete features (default)
- all all the warnings
- no-CATEGORY turn off warnings in CATEGORY
- none turn off all the warnings
- error treat warnings as errors
-EOF
+ return "Warning categories include:
+ cross cross compilation issues
+ gnu GNU coding standards (default in gnu and gnits modes)
+ obsolete obsolete features or constructions (default)
+ override user redefinitions of Automake rules or variables
+ portability portability issues (default in gnu and gnits modes)
+ portability-recursive nested Make variables (default with -Wportability)
+ extra-portability extra portability issues related to obscure tools
+ syntax dubious syntactic constructs (default)
+ unsupported unsupported or incomplete features (default)
+ all all the warnings
+ no-CATEGORY turn off warnings in CATEGORY
+ none turn off all the warnings
+ error treat warnings as errors";
}
=item C<prog_error ($MESSAGE, [%OPTIONS])>
=item C<switch_warning ($CATEGORY)>
If C<$CATEGORY> is C<mumble>, turn on channel C<mumble>.
-If it's C<no-mumble>, turn C<mumble> off.
+If it is C<no-mumble>, turn C<mumble> off.
Else handle C<all> and C<none> for completeness.
=cut
=cut
+# Used to communicate from parse_WARNINGS to parse_warnings.
+our $_werror = 0;
+
sub parse_WARNINGS ()
{
if (exists $ENV{'WARNINGS'})
{
# Ignore unknown categories. This is required because WARNINGS
# should be honored by many tools.
- switch_warning $_ foreach (split (',', $ENV{'WARNINGS'}));
+ # For the same reason, do not turn on -Werror at this point, just
+ # record that we saw it; parse_warnings will turn on -Werror after
+ # the command line has been processed.
+ foreach (split (',', $ENV{'WARNINGS'}))
+ {
+ if (/^(no-)?error$/)
+ {
+ $_werror = !defined $1;
+ }
+ else
+ {
+ switch_warning $_;
+ }
+ }
}
}
-=item C<parse_warnings ($OPTION, $ARGUMENT)>
+=item C<parse_warnings (@CATEGORIES)>
Parse the argument of C<--warning=CATEGORY> or C<-WCATEGORY>.
+C<@CATEGORIES> is the accumulated set of warnings categories.
+Use like this:
-C<$OPTIONS> is C<"--warning"> or C<"-W">, C<$ARGUMENT> is C<CATEGORY>.
+ Automake::GetOpt::parse_options (
+ # ...
+ 'W|warnings=s' => \@warnings,
+ )
+ # possibly call set_strictness here
+ parse_warnings @warnings;
-This is meant to be used as an argument to C<Getopt>.
+=cut
+
+sub parse_warnings (@)
+{
+ foreach my $cat (map { split ',' } @_)
+ {
+ if ($cat =~ /^(no-)?error$/)
+ {
+ $_werror = !defined $1;
+ }
+ elsif (switch_warning $cat)
+ {
+ msg 'unsupported', "unknown warning category '$cat'";
+ }
+ }
+
+ switch_warning ($_werror ? 'error' : 'no-error');
+}
+
+=item C<merge_WARNINGS (@CATEGORIES)>
+
+Merge the warnings categories in the environment variable C<WARNINGS>
+with the warnings categories in C<@CATEGORIES>, and return a new
+value for C<WARNINGS>. Values in C<@CATEGORIES> take precedence.
+Use like this:
+
+ local $ENV{WARNINGS} = merge_WARNINGS @additional_warnings;
=cut
-sub parse_warnings ($$)
+sub merge_WARNINGS (@)
{
- my ($opt, $categories) = @_;
+ my $werror = '';
+ my $all_or_none = '';
+ my %warnings;
+
+ my @categories = split /,/, $ENV{WARNINGS} || '';
+ push @categories, @_;
+
+ foreach (@categories)
+ {
+ if (/^(?:no-)?error$/)
+ {
+ $werror = $_;
+ }
+ elsif (/^(?:all|none)$/)
+ {
+ $all_or_none = $_;
+ }
+ else
+ {
+ # The character class in the second match group is ASCII \S minus
+ # comma. We are generous with this because category values may come
+ # from WARNINGS and we don't want to assume what other programs'
+ # syntaxes for warnings categories are.
+ /^(no-|)([\w\[\]\/\\!"#$%&'()*+-.:;<=>?@^`{|}~]+)$/
+ or die "Invalid warnings category: $_";
+ $warnings{$2} = $1;
+ }
+ }
- foreach my $cat (split (',', $categories))
+ my @final_warnings;
+ if ($all_or_none)
{
- msg 'unsupported', "unknown warning category '$cat'"
- if switch_warning $cat;
+ push @final_warnings, $all_or_none;
}
+ else
+ {
+ foreach (sort keys %warnings)
+ {
+ push @final_warnings, $warnings{$_} . $_;
+ }
+ }
+ if ($werror)
+ {
+ push @final_warnings, $werror;
+ }
+
+ return join (',', @final_warnings);
}
=item C<set_strictness ($STRICTNESS_NAME)>
=cut
1;
+
+### Setup "GNU" style for perl-mode and cperl-mode.
+## Local Variables:
+## perl-indent-level: 2
+## perl-continued-statement-offset: 2
+## perl-continued-brace-offset: 0
+## perl-brace-offset: 0
+## perl-brace-imaginary-offset: 0
+## perl-label-offset: -2
+## cperl-indent-level: 2
+## cperl-brace-offset: 0
+## cperl-continued-brace-offset: 0
+## cperl-label-offset: -2
+## cperl-extra-newline-before-brace: t
+## cperl-merge-trailing-else: nil
+## cperl-continued-statement-offset: 2
+## End:
use 5.006;
use strict;
-use Exporter;
+use warnings FATAL => 'all';
+
use Carp;
+use Exporter;
use File::Basename;
-use vars qw (@ISA @EXPORT %channels $me);
+our @ISA = qw (Exporter);
+our @EXPORT = qw ($exit_code $warnings_are_errors
+ &reset_local_duplicates &reset_global_duplicates
+ &register_channel &msg &exists_channel &channel_type
+ &setup_channel &setup_channel_type
+ &dup_channel_setup &drop_channel_setup
+ &buffer_messages &flush_messages
+ &setup_channel_queue &pop_channel_queue
+ US_GLOBAL US_LOCAL
+ UP_NONE UP_TEXT UP_LOC_TEXT);
-@ISA = qw (Exporter);
-@EXPORT = qw ($exit_code $warnings_are_errors
- &reset_local_duplicates &reset_global_duplicates
- &register_channel &msg &exists_channel &channel_type
- &setup_channel &setup_channel_type
- &dup_channel_setup &drop_channel_setup
- &buffer_messages &flush_messages
- &setup_channel_queue &pop_channel_queue
- US_GLOBAL US_LOCAL
- UP_NONE UP_TEXT UP_LOC_TEXT);
-
-$me = basename $0;
+our %channels;
+our $me = basename $0;
=head2 Global Variables
=cut
-use vars qw ($exit_code);
-$exit_code = 0;
+our $exit_code = 0;
=item C<$warnings_are_errors>
=cut
-use vars qw ($warnings_are_errors);
-$warnings_are_errors = 0;
+our $warnings_are_errors = 0;
=back
=cut
-use vars qw (%_default_options %_global_duplicate_messages
- %_local_duplicate_messages);
-
# Default options for a channel.
-%_default_options =
+our %_default_options =
(
type => 'warning',
exit_code => 1,
# Filled with output messages as keys, to detect duplicates.
# The value associated with each key is the number of occurrences
# filtered out.
-%_local_duplicate_messages = ();
-%_global_duplicate_messages = ();
+our %_local_duplicate_messages = ();
+our %_global_duplicate_messages = ();
sub _reset_duplicates (\%)
{
}
# Store partial messages here. (See the 'partial' option.)
-use vars qw ($partial);
-$partial = '';
+our $partial = '';
# _format_message ($LOCATION, $MESSAGE, %OPTIONS)
# -----------------------------------------------
=cut
-use vars qw (@backlog %buffering);
-
# See buffer_messages() and flush_messages() below.
-%buffering = (); # The map of channel types to buffer.
-@backlog = (); # The buffer of messages.
+our %buffering = (); # The map of channel types to buffer.
+our @backlog = (); # The buffer of messages.
sub msg ($$;$%)
{
=cut
-use vars qw (@_saved_channels @_saved_werrors);
-@_saved_channels = ();
-@_saved_werrors = ();
+our @_saved_channels = ();
+our @_saved_werrors = ();
sub dup_channel_setup ()
{
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
+use Exporter;
-require Exporter;
-use vars '@ISA', '@EXPORT_OK';
-@ISA = qw/Exporter/;
-@EXPORT_OK = qw/TRUE FALSE reduce_and reduce_or/;
+our @ISA = qw (Exporter);
+our @EXPORT_OK = qw (TRUE FALSE reduce_and reduce_or);
=head1 NAME
# Keys in this hash are conditional strings. Values are the
# associated object conditions. This is used by 'new' to reuse
# Condition objects with identical conditionals.
-use vars '%_condition_singletons';
+our %_condition_singletons;
# Do NOT reset this hash here. It's already empty by default,
# and any setting would otherwise occur AFTER the 'TRUE' and 'FALSE'
# constants definitions.
# along with this program. If not, see <https://www.gnu.org/licenses/>.
package Automake::Config;
-use strict;
use 5.006;
-require Exporter;
+use strict;
+use warnings FATAL => 'all';
+
+use Exporter;
our @ISA = qw (Exporter);
our @EXPORT = qw ($APIVERSION $PACKAGE $PACKAGE_BUGREPORT $VERSION
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Exporter;
-use Automake::Channels;
-use Automake::ChannelDefs;
-use vars qw (@ISA @EXPORT);
+use Automake::ChannelDefs;
+use Automake::Channels;
-@ISA = qw (Exporter);
-@EXPORT = qw (&find_configure_ac &require_configure_ac);
+our @ISA = qw (Exporter);
+our @EXPORT = qw (&find_configure_ac &require_configure_ac);
=head1 NAME
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
-use Automake::Condition qw/TRUE FALSE/;
+use Automake::Condition qw (TRUE FALSE);
=head1 NAME
# Keys in this hash are DisjConditions strings. Values are the
# associated object DisjConditions. This is used by 'new' to reuse
# DisjConditions objects with identical conditions.
-use vars '%_disjcondition_singletons';
+our %_disjcondition_singletons;
sub new ($;@)
{
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Exporter;
use File::stat;
use IO::File;
+
use Automake::Channels;
use Automake::ChannelDefs;
-use vars qw (@ISA @EXPORT);
+our @ISA = qw (Exporter);
+our @EXPORT = qw (&contents
+ &find_file &mtime
+ &update_file
+ &xsystem &xsystem_hint &xqx
+ &dir_has_case_matching_file &reset_dir_cache
+ &set_dir_cache_file);
-@ISA = qw (Exporter);
-@EXPORT = qw (&contents
- &find_file &mtime
- &update_file &up_to_date_p
- &xsystem &xsystem_hint &xqx
- &dir_has_case_matching_file &reset_dir_cache
- &set_dir_cache_file);
+=over 4
=item C<find_file ($file_name, @include)>
}
-=item C<up_to_date_p ($file, @dep)>
-
-Is C<$file> more recent than C<@dep>?
-
-=cut
-
-# $BOOLEAN
-# &up_to_date_p ($FILE, @DEP)
-# ---------------------------
-sub up_to_date_p ($@)
-{
- my ($file, @dep) = @_;
- my $mtime = mtime ($file);
-
- foreach my $dep (@dep)
- {
- if ($mtime < mtime ($dep))
- {
- verb "up_to_date ($file): outdated: $dep";
- return 0;
- }
- }
-
- verb "up_to_date ($file): up to date";
- return 1;
-}
-
-
=item C<handle_exec_errors ($command, [$expected_exit_code = 0], [$hint])>
Display an error message for C<$command>, based on the content of
=cut
-use vars '%_directory_cache';
+our %_directory_cache;
sub dir_has_case_matching_file ($$)
{
# Note that print File::Spec->case_tolerant returns 0 even on MacOS
if exists $_directory_cache{$dirname};
}
+=back
+
+=cut
+
1; # for require
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Exporter;
use File::Basename;
-use vars qw (@ISA @EXPORT);
-
-@ISA = qw (Exporter);
-@EXPORT = qw (&uniq &none $me);
+our @ISA = qw (Exporter);
+our @EXPORT = qw (&uniq &none $me);
# Variables we share with the main package. Be sure to have a single
# copy of them: using 'my' together with multiple inclusion of this
# package would introduce several copies.
-use vars qw ($me);
-$me = basename ($0);
+our $me = basename ($0);
# END
# ---
use 5.006;
use strict;
use warnings FATAL => 'all';
+
+use Carp qw (confess croak);
use Exporter ();
use Getopt::Long ();
-use Automake::ChannelDefs qw/fatal/;
-use Carp qw/croak confess/;
-use vars qw (@ISA @EXPORT);
-@ISA = qw (Exporter);
-@EXPORT= qw/getopt/;
+use Automake::ChannelDefs qw (fatal);
+
+our @ISA = qw (Exporter);
+our @EXPORT = qw (getopt);
=item C<parse_options (%option)>
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
+
use Automake::ChannelDefs;
use Automake::DisjConditions;
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
=head1 NAME
use 5.006;
use strict;
+use warnings FATAL => 'all';
use Class::Struct ();
+
Class::Struct::struct (
# Short name of the language (c, f77...).
'name' => "\$",
package Automake::Location;
use 5.006;
+use strict;
+use warnings FATAL => 'all';
=head1 NAME
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Exporter;
+
use Automake::Config;
use Automake::ChannelDefs;
use Automake::Channels;
use Automake::Version;
-use vars qw (@ISA @EXPORT);
-
-@ISA = qw (Exporter);
-@EXPORT = qw (option global_option
- set_option set_global_option
- unset_option unset_global_option
- process_option_list process_global_option_list
- set_strictness $strictness $strictness_name
- &FOREIGN &GNU &GNITS);
+our @ISA = qw (Exporter);
+our @EXPORT = qw (option global_option
+ set_option set_global_option
+ unset_option unset_global_option
+ process_option_list process_global_option_list
+ set_strictness $strictness $strictness_name
+ &FOREIGN &GNU &GNITS);
=head1 NAME
=cut
# Values are the Automake::Location of the definition.
-use vars '%_options'; # From AUTOMAKE_OPTIONS
-use vars '%_global_options'; # From AM_INIT_AUTOMAKE or the command line.
+our %_options; # From AUTOMAKE_OPTIONS
+our %_global_options; # From AM_INIT_AUTOMAKE or the command line.
# Whether process_option_list has already been called for the current
# Makefile.am.
-use vars '$_options_processed';
+our $_options_processed;
# Whether process_global_option_list has already been called.
-use vars '$_global_options_processed';
+our $_global_options_processed;
=head2 Constants
=cut
# Strictness levels.
-use vars qw ($strictness $strictness_name);
+our ($strictness, $strictness_name);
# Strictness level as set on command line.
-use vars qw ($_default_strictness $_default_strictness_name);
+our ($_default_strictness, $_default_strictness_name);
=head2 Functions
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
+use Exporter;
use Automake::Item;
use Automake::RuleDef;
use Automake::Options;
use Automake::Condition qw (TRUE FALSE);
use Automake::DisjConditions;
-require Exporter;
-use vars '@ISA', '@EXPORT', '@EXPORT_OK';
-@ISA = qw/Automake::Item Exporter/;
-@EXPORT = qw (reset register_suffix_rule next_in_suffix_chain
- suffixes rules $KNOWN_EXTENSIONS_PATTERN
- depend %dependencies %actions register_action
- accept_extensions
- reject_rule msg_rule msg_cond_rule err_rule err_cond_rule
- rule rrule ruledef rruledef);
+
+our @ISA = qw (Automake::Item Exporter);
+our @EXPORT = qw (reset register_suffix_rule next_in_suffix_chain
+ suffixes rules $KNOWN_EXTENSIONS_PATTERN
+ depend %dependencies %actions register_action
+ accept_extensions
+ reject_rule msg_rule msg_cond_rule err_rule err_cond_rule
+ rule rrule ruledef rruledef);
=head1 NAME
=cut
-use vars '%dependencies';
+our %dependencies;
=item C<%actions>
=cut
-use vars '%actions';
+our %actions;
=item C<$KNOWN_EXTENSIONS_PATTERN>
=cut
-use vars qw ($KNOWN_EXTENSIONS_PATTERN);
-$KNOWN_EXTENSIONS_PATTERN = "";
+our $KNOWN_EXTENSIONS_PATTERN = "";
=back
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
+use Exporter;
+
use Automake::ChannelDefs;
use Automake::ItemDef;
-require Exporter;
-use vars '@ISA', '@EXPORT';
-@ISA = qw/Automake::ItemDef Exporter/;
-@EXPORT = qw (&RULE_AUTOMAKE &RULE_USER);
+our @ISA = qw (Automake::ItemDef Exporter);
+our @EXPORT = qw (&RULE_AUTOMAKE &RULE_USER);
=head1 NAME
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
+use Exporter;
+
use Automake::ChannelDefs;
use Automake::ItemDef;
-require Exporter;
-use vars '@ISA', '@EXPORT';
-@ISA = qw/Automake::ItemDef Exporter/;
-@EXPORT = qw (&VAR_AUTOMAKE &VAR_CONFIGURE &VAR_MAKEFILE
- &VAR_ASIS &VAR_PRETTY &VAR_SILENT &VAR_SORTED);
+our @ISA = qw (Automake::ItemDef Exporter);
+our @EXPORT = qw (&VAR_AUTOMAKE &VAR_CONFIGURE &VAR_MAKEFILE
+ &VAR_ASIS &VAR_PRETTY &VAR_SILENT &VAR_SORTED);
=head1 NAME
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Carp;
+use Exporter;
use Automake::Channels;
use Automake::ChannelDefs;
use Automake::General 'uniq';
use Automake::Wrap 'makefile_wrap';
-require Exporter;
-use vars '@ISA', '@EXPORT', '@EXPORT_OK';
-@ISA = qw/Automake::Item Exporter/;
-@EXPORT = qw (err_var msg_var msg_cond_var reject_var
- var rvar vardef rvardef
- variables
- scan_variable_expansions check_variable_expansions
- variable_delete
- variables_dump
- set_seen
- require_variables
- variable_value
- output_variables
- transform_variable_recursively);
+our @ISA = qw (Automake::Item Exporter);
+our @EXPORT = qw (err_var msg_var msg_cond_var reject_var
+ var rvar vardef rvardef
+ variables
+ scan_variable_expansions check_variable_expansions
+ variable_delete
+ variables_dump
+ set_seen
+ require_variables
+ variable_value
+ output_variables
+ transform_variable_recursively);
=head1 NAME
# Variables that can be overridden without complaint from -Woverride
my %_silent_variable_override =
- (AM_MAKEINFOHTMLFLAGS => 1,
+ (AM_DISTCHECK_DVI_TARGET => 1,
+ AM_MAKEINFOHTMLFLAGS => 1,
AR => 1,
ARFLAGS => 1,
DEJATOOL => 1,
=cut
-use vars '%_hooks';
+our %_hooks;
sub hook ($$)
{
my ($var, $fun) = @_;
=cut
-use vars '%_variable_dict', '%_primary_dict';
+our (%_variable_dict, %_primary_dict);
sub variables (;$)
{
my ($suffix) = @_;
use 5.006;
use strict;
+use warnings FATAL => 'all';
+
use Automake::ChannelDefs;
=head1 NAME
use 5.006;
use strict;
+use warnings FATAL => 'all';
-require Exporter;
-use vars '@ISA', '@EXPORT_OK';
-@ISA = qw/Exporter/;
-@EXPORT_OK = qw/wrap makefile_wrap/;
+use Exporter;
+
+our @ISA = qw (Exporter);
+our @EXPORT_OK = qw (wrap makefile_wrap);
=head1 NAME
use 5.006;
use strict;
-use vars qw($VERSION @EXPORT @EXPORT_OK $AUTOLOAD @ISA);
-use Carp;
+use warnings FATAL => 'all';
+
use Errno;
+use Exporter;
use IO::File;
-use File::Basename;
+
use Automake::ChannelDefs;
-use Automake::Channels qw(msg);
+use Automake::Channels qw (msg);
use Automake::FileUtils;
-require Exporter;
-require DynaLoader;
-
-@ISA = qw(IO::File Exporter DynaLoader);
-
-$VERSION = "1.2";
-
-@EXPORT = @IO::File::EXPORT;
+our @ISA = qw(Exporter IO::File);
+our @EXPORT = @IO::File::EXPORT;
+our $VERSION = "1.2";
eval {
# Make all Fcntl O_XXX and LOCK_XXX constants available for importing
# Unless explicitly configured otherwise, Perl implements its 'flock' with the
# first of flock(2), fcntl(2), or lockf(3) that works. These can fail on
- # NFS-backed files, with ENOLCK (GNU/Linux) or EOPNOTSUPP (FreeBSD); we
+ # NFS-backed files, with ENOLCK (GNU/Linux) or EOPNOTSUPP (FreeBSD) or
+ # EINVAL (OpenIndiana, as per POSIX 1003.1-2017 fcntl spec); we
# usually ignore these errors. If $ENV{MAKEFLAGS} suggests that a parallel
# invocation of 'make' has invoked the tool we serve, report all locking
# failures and abort.
msg ($make_j ? 'fatal' : 'unsupported',
"cannot lock $file with mode $mode: $!" . ($make_j ? $note : ""))
- if $make_j || !($!{ENOLCK} || $!{EOPNOTSUPP});
+ if $make_j || !($!{EINVAL} || $!{ENOLCK} || $!{EOPNOTSUPP});
}
}
am--force-recheck:
@:
+## Exists only to be overridden. See bug#11745.
+AM_TESTSUITE_SUMMARY_HEADER = ' for $(PACKAGE_STRING)'
+
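A minimal sketch of a user override in a package's own Makefile.am (the
PACKAGE_NAME variable comes from AC_INIT; the value must carry its own
shell quoting, as the default above does, because it is spliced into an
echo command):

  AM_TESTSUITE_SUMMARY_HEADER = ' for the $(PACKAGE_NAME) test suite'
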
$(TEST_SUITE_LOG): $(TEST_LOGS)
@$(am__set_TESTS_bases); \
## Helper shell function, tells whether a path refers to an existing,
## Multi line coloring is problematic with "less -R", so we really need
## to color each line individually.
echo "$${col}$$br$${std}"; \
- echo "$${col}Testsuite summary for $(PACKAGE_STRING)$${std}"; \
+ echo "$${col}Testsuite summary"$(AM_TESTSUITE_SUMMARY_HEADER)"$${std}"; \
echo "$${col}$$br$${std}"; \
## This is expected to go to the console, so it might have to be colorized.
create_testsuite_report --maybe-color; \
@echo '# Do not edit here. If you wish to override these values' >>site.tmp
@echo '# edit the last section' >>site.tmp
@echo 'set srcdir "$(srcdir)"' >>site.tmp
- @echo "set objdir `pwd`" >>site.tmp
+ @echo "set objdir \"`pwd`\"" >>site.tmp
## Quote the *_alias variables because they might be empty.
?BUILD? @echo 'set build_alias "$(build_alias)"' >>site.tmp
?BUILD? @echo 'set build_triplet $(build_triplet)' >>site.tmp
AM_RECURSIVE_TARGETS += distcheck
endif %?SUBDIRS%
+# Exists only to be overridden by the user if desired.
+AM_DISTCHECK_DVI_TARGET = dvi
+
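As a sketch, a package that cannot build DVI during distcheck (say, no
TeX on the build machine) might redirect this to another standard target
in its Makefile.am; 'html' here is only an example:

  AM_DISTCHECK_DVI_TARGET = html
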
# This target untars the dist file and tries a VPATH configuration. Then
# it guarantees that the distribution is self-contained by making another
# tarfile.
## (in corner-case usages); see automake bug#14991.
--srcdir=../.. --prefix="$$dc_install_base" \
&& $(MAKE) $(AM_MAKEFLAGS) \
- && $(MAKE) $(AM_MAKEFLAGS) dvi \
+ && $(MAKE) $(AM_MAKEFLAGS) $(AM_DISTCHECK_DVI_TARGET) \
&& $(MAKE) $(AM_MAKEFLAGS) check \
&& $(MAKE) $(AM_MAKEFLAGS) install \
&& $(MAKE) $(AM_MAKEFLAGS) installcheck \
RECURSIVE_TARGETS += install-data-recursive install-exec-recursive \
install-recursive uninstall-recursive
install:%maybe_BUILT_SOURCES% install-recursive
-install-exec: install-exec-recursive
+install-exec:%maybe_BUILT_SOURCES% install-exec-recursive
install-data: install-data-recursive
uninstall: uninstall-recursive
else !%?SUBDIRS%
install:%maybe_BUILT_SOURCES% install-am
-install-exec: install-exec-am
+install-exec:%maybe_BUILT_SOURCES% install-exec-am
install-data: install-data-am
uninstall: uninstall-am
endif !%?SUBDIRS%
if %?maybe_BUILT_SOURCES%
.MAKE: install
+.MAKE: install-exec
endif %?maybe_BUILT_SOURCES%
.MAKE .PHONY: install-am
# Attempt to guess a canonical system name.
# Copyright 1992-2020 Free Software Foundation, Inc.
-timestamp='2020-01-01'
+timestamp='2020-11-07'
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
# Please send patches to <config-patches@gnu.org>.
-me=`echo "$0" | sed -e 's,.*/,,'`
+me=$(echo "$0" | sed -e 's,.*/,,')
usage="\
Usage: $0 [OPTION]
test "$tmp" && return 0
: "${TMPDIR=/tmp}"
# shellcheck disable=SC2039
- { tmp=`(umask 077 && mktemp -d "$TMPDIR/cgXXXXXX") 2>/dev/null` && test -n "$tmp" && test -d "$tmp" ; } ||
+ { tmp=$( (umask 077 && mktemp -d "$TMPDIR/cgXXXXXX") 2>/dev/null) && test -n "$tmp" && test -d "$tmp" ; } ||
{ test -n "$RANDOM" && tmp=$TMPDIR/cg$$-$RANDOM && (umask 077 && mkdir "$tmp" 2>/dev/null) ; } ||
{ tmp=$TMPDIR/cg-$$ && (umask 077 && mkdir "$tmp" 2>/dev/null) && echo "Warning: creating insecure temp directory" >&2 ; } ||
{ echo "$me: cannot create a temporary directory in $TMPDIR" >&2 ; exit 1 ; }
PATH=$PATH:/.attbin ; export PATH
fi
-UNAME_MACHINE=`(uname -m) 2>/dev/null` || UNAME_MACHINE=unknown
-UNAME_RELEASE=`(uname -r) 2>/dev/null` || UNAME_RELEASE=unknown
-UNAME_SYSTEM=`(uname -s) 2>/dev/null` || UNAME_SYSTEM=unknown
-UNAME_VERSION=`(uname -v) 2>/dev/null` || UNAME_VERSION=unknown
+UNAME_MACHINE=$( (uname -m) 2>/dev/null) || UNAME_MACHINE=unknown
+UNAME_RELEASE=$( (uname -r) 2>/dev/null) || UNAME_RELEASE=unknown
+UNAME_SYSTEM=$( (uname -s) 2>/dev/null) || UNAME_SYSTEM=unknown
+UNAME_VERSION=$( (uname -v) 2>/dev/null) || UNAME_VERSION=unknown
case "$UNAME_SYSTEM" in
Linux|GNU|GNU/*)
#elif defined(__dietlibc__)
LIBC=dietlibc
#else
+ #include <stdarg.h>
+ #ifdef __DEFINED_va_list
+ LIBC=musl
+ #else
LIBC=gnu
#endif
+ #endif
EOF
- eval "`$CC_FOR_BUILD -E "$dummy.c" 2>/dev/null | grep '^LIBC' | sed 's, ,,g'`"
-
- # If ldd exists, use it to detect musl libc.
- if command -v ldd >/dev/null && \
- ldd --version 2>&1 | grep -q ^musl
- then
- LIBC=musl
- fi
+ eval "$($CC_FOR_BUILD -E "$dummy.c" 2>/dev/null | grep '^LIBC' | sed 's, ,,g')"
;;
esac
# Note: NetBSD doesn't particularly care about the vendor
# portion of the name. We always set it to "unknown".
sysctl="sysctl -n hw.machine_arch"
- UNAME_MACHINE_ARCH=`(uname -p 2>/dev/null || \
+ UNAME_MACHINE_ARCH=$( (uname -p 2>/dev/null || \
"/sbin/$sysctl" 2>/dev/null || \
"/usr/sbin/$sysctl" 2>/dev/null || \
- echo unknown)`
+ echo unknown))
case "$UNAME_MACHINE_ARCH" in
+ aarch64eb) machine=aarch64_be-unknown ;;
armeb) machine=armeb-unknown ;;
arm*) machine=arm-unknown ;;
sh3el) machine=shl-unknown ;;
sh3eb) machine=sh-unknown ;;
sh5el) machine=sh5le-unknown ;;
earmv*)
- arch=`echo "$UNAME_MACHINE_ARCH" | sed -e 's,^e\(armv[0-9]\).*$,\1,'`
- endian=`echo "$UNAME_MACHINE_ARCH" | sed -ne 's,^.*\(eb\)$,\1,p'`
+ arch=$(echo "$UNAME_MACHINE_ARCH" | sed -e 's,^e\(armv[0-9]\).*$,\1,')
+ endian=$(echo "$UNAME_MACHINE_ARCH" | sed -ne 's,^.*\(eb\)$,\1,p')
machine="${arch}${endian}"-unknown
;;
*) machine="$UNAME_MACHINE_ARCH"-unknown ;;
case "$UNAME_MACHINE_ARCH" in
earm*)
expr='s/^earmv[0-9]/-eabi/;s/eb$//'
- abi=`echo "$UNAME_MACHINE_ARCH" | sed -e "$expr"`
+ abi=$(echo "$UNAME_MACHINE_ARCH" | sed -e "$expr")
;;
esac
# The OS release
release='-gnu'
;;
*)
- release=`echo "$UNAME_RELEASE" | sed -e 's/[-_].*//' | cut -d. -f1,2`
+ release=$(echo "$UNAME_RELEASE" | sed -e 's/[-_].*//' | cut -d. -f1,2)
;;
esac
# Since CPU_TYPE-MANUFACTURER-KERNEL-OPERATING_SYSTEM:
echo "$machine-${os}${release}${abi-}"
exit ;;
*:Bitrig:*:*)
- UNAME_MACHINE_ARCH=`arch | sed 's/Bitrig.//'`
+ UNAME_MACHINE_ARCH=$(arch | sed 's/Bitrig.//')
echo "$UNAME_MACHINE_ARCH"-unknown-bitrig"$UNAME_RELEASE"
exit ;;
*:OpenBSD:*:*)
- UNAME_MACHINE_ARCH=`arch | sed 's/OpenBSD.//'`
+ UNAME_MACHINE_ARCH=$(arch | sed 's/OpenBSD.//')
echo "$UNAME_MACHINE_ARCH"-unknown-openbsd"$UNAME_RELEASE"
exit ;;
*:LibertyBSD:*:*)
- UNAME_MACHINE_ARCH=`arch | sed 's/^.*BSD\.//'`
+ UNAME_MACHINE_ARCH=$(arch | sed 's/^.*BSD\.//')
echo "$UNAME_MACHINE_ARCH"-unknown-libertybsd"$UNAME_RELEASE"
exit ;;
*:MidnightBSD:*:*)
alpha:OSF1:*:*)
case $UNAME_RELEASE in
*4.0)
- UNAME_RELEASE=`/usr/sbin/sizer -v | awk '{print $3}'`
+ UNAME_RELEASE=$(/usr/sbin/sizer -v | awk '{print $3}')
;;
*5.*)
- UNAME_RELEASE=`/usr/sbin/sizer -v | awk '{print $4}'`
+ UNAME_RELEASE=$(/usr/sbin/sizer -v | awk '{print $4}')
;;
esac
# According to Compaq, /usr/sbin/psrinfo has been available on
# OSF/1 and Tru64 systems produced since 1995. I hope that
# covers most systems running today. This code pipes the CPU
# types through head -n 1, so we only detect the type of CPU 0.
- ALPHA_CPU_TYPE=`/usr/sbin/psrinfo -v | sed -n -e 's/^ The alpha \(.*\) processor.*$/\1/p' | head -n 1`
+ ALPHA_CPU_TYPE=$(/usr/sbin/psrinfo -v | sed -n -e 's/^ The alpha \(.*\) processor.*$/\1/p' | head -n 1)
case "$ALPHA_CPU_TYPE" in
"EV4 (21064)")
UNAME_MACHINE=alpha ;;
# A Tn.n version is a released field test version.
# A Xn.n version is an unreleased experimental baselevel.
# 1.2 uses "1.2" for uname -r.
- echo "$UNAME_MACHINE"-dec-osf"`echo "$UNAME_RELEASE" | sed -e 's/^[PVTX]//' | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz`"
+ echo "$UNAME_MACHINE"-dec-osf"$(echo "$UNAME_RELEASE" | sed -e 's/^[PVTX]//' | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz)"
# Reset EXIT trap before exiting to avoid spurious non-zero exit code.
exitcode=$?
trap '' 0
exit ;;
Pyramid*:OSx*:*:* | MIS*:OSx*:*:* | MIS*:SMP_DC-OSx*:*:*)
# akee@wpdis03.wpafb.af.mil (Earle F. Ake) contributed MIS and NILE.
- if test "`(/bin/universe) 2>/dev/null`" = att ; then
+ if test "$( (/bin/universe) 2>/dev/null)" = att ; then
echo pyramid-pyramid-sysv3
else
echo pyramid-pyramid-bsd
echo sparc-icl-nx6
exit ;;
DRS?6000:UNIX_SV:4.2*:7* | DRS?6000:isis:4.2*:7*)
- case `/usr/bin/uname -p` in
+ case $(/usr/bin/uname -p) in
sparc) echo sparc-icl-nx7; exit ;;
esac ;;
s390x:SunOS:*:*)
- echo "$UNAME_MACHINE"-ibm-solaris2"`echo "$UNAME_RELEASE" | sed -e 's/[^.]*//'`"
+ echo "$UNAME_MACHINE"-ibm-solaris2"$(echo "$UNAME_RELEASE" | sed -e 's/[^.]*//')"
exit ;;
sun4H:SunOS:5.*:*)
- echo sparc-hal-solaris2"`echo "$UNAME_RELEASE"|sed -e 's/[^.]*//'`"
+ echo sparc-hal-solaris2"$(echo "$UNAME_RELEASE"|sed -e 's/[^.]*//')"
exit ;;
sun4*:SunOS:5.*:* | tadpole*:SunOS:5.*:*)
- echo sparc-sun-solaris2"`echo "$UNAME_RELEASE" | sed -e 's/[^.]*//'`"
+ echo sparc-sun-solaris2"$(echo "$UNAME_RELEASE" | sed -e 's/[^.]*//')"
exit ;;
i86pc:AuroraUX:5.*:* | i86xen:AuroraUX:5.*:*)
echo i386-pc-auroraux"$UNAME_RELEASE"
# If there is a compiler, see if it is configured for 64-bit objects.
# Note that the Sun cc does not turn __LP64__ into 1 like gcc does.
# This test works for both compilers.
- if [ "$CC_FOR_BUILD" != no_compiler_found ]; then
+ if test "$CC_FOR_BUILD" != no_compiler_found; then
if (echo '#ifdef __amd64'; echo IS_64BIT_ARCH; echo '#endif') | \
(CCOPTS="" $CC_FOR_BUILD -E - 2>/dev/null) | \
grep IS_64BIT_ARCH >/dev/null
SUN_ARCH=x86_64
fi
fi
- echo "$SUN_ARCH"-pc-solaris2"`echo "$UNAME_RELEASE"|sed -e 's/[^.]*//'`"
+ echo "$SUN_ARCH"-pc-solaris2"$(echo "$UNAME_RELEASE"|sed -e 's/[^.]*//')"
exit ;;
sun4*:SunOS:6*:*)
# According to config.sub, this is the proper way to canonicalize
# SunOS6. Hard to guess exactly what SunOS6 will be like, but
# it's likely to be more like Solaris than SunOS4.
- echo sparc-sun-solaris3"`echo "$UNAME_RELEASE"|sed -e 's/[^.]*//'`"
+ echo sparc-sun-solaris3"$(echo "$UNAME_RELEASE"|sed -e 's/[^.]*//')"
exit ;;
sun4*:SunOS:*:*)
- case "`/usr/bin/arch -k`" in
+ case "$(/usr/bin/arch -k)" in
Series*|S4*)
- UNAME_RELEASE=`uname -v`
+ UNAME_RELEASE=$(uname -v)
;;
esac
# Japanese Language versions have a version number like `4.1.3-JL'.
- echo sparc-sun-sunos"`echo "$UNAME_RELEASE"|sed -e 's/-/_/'`"
+ echo sparc-sun-sunos"$(echo "$UNAME_RELEASE"|sed -e 's/-/_/')"
exit ;;
sun3*:SunOS:*:*)
echo m68k-sun-sunos"$UNAME_RELEASE"
exit ;;
sun*:*:4.2BSD:*)
- UNAME_RELEASE=`(sed 1q /etc/motd | awk '{print substr($5,1,3)}') 2>/dev/null`
+ UNAME_RELEASE=$( (sed 1q /etc/motd | awk '{print substr($5,1,3)}') 2>/dev/null)
test "x$UNAME_RELEASE" = x && UNAME_RELEASE=3
- case "`/bin/arch`" in
+ case "$(/bin/arch)" in
sun3)
echo m68k-sun-sunos"$UNAME_RELEASE"
;;
}
EOF
$CC_FOR_BUILD -o "$dummy" "$dummy.c" &&
- dummyarg=`echo "$UNAME_RELEASE" | sed -n 's/\([0-9]*\).*/\1/p'` &&
- SYSTEM_NAME=`"$dummy" "$dummyarg"` &&
+ dummyarg=$(echo "$UNAME_RELEASE" | sed -n 's/\([0-9]*\).*/\1/p') &&
+ SYSTEM_NAME=$("$dummy" "$dummyarg") &&
{ echo "$SYSTEM_NAME"; exit; }
echo mips-mips-riscos"$UNAME_RELEASE"
exit ;;
exit ;;
AViiON:dgux:*:*)
# DG/UX returns AViiON for all architectures
- UNAME_PROCESSOR=`/usr/bin/uname -p`
- if [ "$UNAME_PROCESSOR" = mc88100 ] || [ "$UNAME_PROCESSOR" = mc88110 ]
+ UNAME_PROCESSOR=$(/usr/bin/uname -p)
+ if test "$UNAME_PROCESSOR" = mc88100 || test "$UNAME_PROCESSOR" = mc88110
then
- if [ "$TARGET_BINARY_INTERFACE"x = m88kdguxelfx ] || \
- [ "$TARGET_BINARY_INTERFACE"x = x ]
+ if test "$TARGET_BINARY_INTERFACE"x = m88kdguxelfx || \
+ test "$TARGET_BINARY_INTERFACE"x = x
then
echo m88k-dg-dgux"$UNAME_RELEASE"
else
echo m68k-tektronix-bsd
exit ;;
*:IRIX*:*:*)
- echo mips-sgi-irix"`echo "$UNAME_RELEASE"|sed -e 's/-/_/g'`"
+ echo mips-sgi-irix"$(echo "$UNAME_RELEASE"|sed -e 's/-/_/g')"
exit ;;
????????:AIX?:[12].1:2) # AIX 2.2.1 or AIX 2.1.1 is RT/PC AIX.
echo romp-ibm-aix # uname -m gives an 8 hex-code CPU id
- exit ;; # Note that: echo "'`uname -s`'" gives 'AIX '
+ exit ;; # Note that: echo "'$(uname -s)'" gives 'AIX '
i*86:AIX:*:*)
echo i386-ibm-aix
exit ;;
ia64:AIX:*:*)
- if [ -x /usr/bin/oslevel ] ; then
- IBM_REV=`/usr/bin/oslevel`
+ if test -x /usr/bin/oslevel ; then
+ IBM_REV=$(/usr/bin/oslevel)
else
IBM_REV="$UNAME_VERSION.$UNAME_RELEASE"
fi
exit(0);
}
EOF
- if $CC_FOR_BUILD -o "$dummy" "$dummy.c" && SYSTEM_NAME=`"$dummy"`
+ if $CC_FOR_BUILD -o "$dummy" "$dummy.c" && SYSTEM_NAME=$("$dummy")
then
echo "$SYSTEM_NAME"
else
fi
exit ;;
*:AIX:*:[4567])
- IBM_CPU_ID=`/usr/sbin/lsdev -C -c processor -S available | sed 1q | awk '{ print $1 }'`
+ IBM_CPU_ID=$(/usr/sbin/lsdev -C -c processor -S available | sed 1q | awk '{ print $1 }')
if /usr/sbin/lsattr -El "$IBM_CPU_ID" | grep ' POWER' >/dev/null 2>&1; then
IBM_ARCH=rs6000
else
IBM_ARCH=powerpc
fi
- if [ -x /usr/bin/lslpp ] ; then
- IBM_REV=`/usr/bin/lslpp -Lqc bos.rte.libc |
- awk -F: '{ print $3 }' | sed s/[0-9]*$/0/`
+ if test -x /usr/bin/lslpp ; then
+ IBM_REV=$(/usr/bin/lslpp -Lqc bos.rte.libc |
+ awk -F: '{ print $3 }' | sed s/[0-9]*$/0/)
else
IBM_REV="$UNAME_VERSION.$UNAME_RELEASE"
fi
echo m68k-hp-bsd4.4
exit ;;
9000/[34678]??:HP-UX:*:*)
- HPUX_REV=`echo "$UNAME_RELEASE"|sed -e 's/[^.]*.[0B]*//'`
+ HPUX_REV=$(echo "$UNAME_RELEASE"|sed -e 's/[^.]*.[0B]*//')
case "$UNAME_MACHINE" in
9000/31?) HP_ARCH=m68000 ;;
9000/[34]??) HP_ARCH=m68k ;;
9000/[678][0-9][0-9])
- if [ -x /usr/bin/getconf ]; then
- sc_cpu_version=`/usr/bin/getconf SC_CPU_VERSION 2>/dev/null`
- sc_kernel_bits=`/usr/bin/getconf SC_KERNEL_BITS 2>/dev/null`
+ if test -x /usr/bin/getconf; then
+ sc_cpu_version=$(/usr/bin/getconf SC_CPU_VERSION 2>/dev/null)
+ sc_kernel_bits=$(/usr/bin/getconf SC_KERNEL_BITS 2>/dev/null)
case "$sc_cpu_version" in
523) HP_ARCH=hppa1.0 ;; # CPU_PA_RISC1_0
528) HP_ARCH=hppa1.1 ;; # CPU_PA_RISC1_1
esac ;;
esac
fi
- if [ "$HP_ARCH" = "" ]; then
+ if test "$HP_ARCH" = ""; then
set_cc_for_build
sed 's/^ //' << EOF > "$dummy.c"
exit (0);
}
EOF
- (CCOPTS="" $CC_FOR_BUILD -o "$dummy" "$dummy.c" 2>/dev/null) && HP_ARCH=`"$dummy"`
+ (CCOPTS="" $CC_FOR_BUILD -o "$dummy" "$dummy.c" 2>/dev/null) && HP_ARCH=$("$dummy")
test -z "$HP_ARCH" && HP_ARCH=hppa
fi ;;
esac
- if [ "$HP_ARCH" = hppa2.0w ]
+ if test "$HP_ARCH" = hppa2.0w
then
set_cc_for_build
echo "$HP_ARCH"-hp-hpux"$HPUX_REV"
exit ;;
ia64:HP-UX:*:*)
- HPUX_REV=`echo "$UNAME_RELEASE"|sed -e 's/[^.]*.[0B]*//'`
+ HPUX_REV=$(echo "$UNAME_RELEASE"|sed -e 's/[^.]*.[0B]*//')
echo ia64-hp-hpux"$HPUX_REV"
exit ;;
3050*:HI-UX:*:*)
exit (0);
}
EOF
- $CC_FOR_BUILD -o "$dummy" "$dummy.c" && SYSTEM_NAME=`"$dummy"` &&
+ $CC_FOR_BUILD -o "$dummy" "$dummy.c" && SYSTEM_NAME=$("$dummy") &&
{ echo "$SYSTEM_NAME"; exit; }
echo unknown-hitachi-hiuxwe2
exit ;;
echo hppa1.0-hp-osf
exit ;;
i*86:OSF1:*:*)
- if [ -x /usr/sbin/sysversion ] ; then
+ if test -x /usr/sbin/sysversion ; then
echo "$UNAME_MACHINE"-unknown-osf1mk
else
echo "$UNAME_MACHINE"-unknown-osf1
echo craynv-cray-unicosmp"$UNAME_RELEASE" | sed -e 's/\.[^.]*$/.X/'
exit ;;
F30[01]:UNIX_System_V:*:* | F700:UNIX_System_V:*:*)
- FUJITSU_PROC=`uname -m | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz`
- FUJITSU_SYS=`uname -p | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz | sed -e 's/\///'`
- FUJITSU_REL=`echo "$UNAME_RELEASE" | sed -e 's/ /_/'`
+ FUJITSU_PROC=$(uname -m | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz)
+ FUJITSU_SYS=$(uname -p | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz | sed -e 's/\///')
+ FUJITSU_REL=$(echo "$UNAME_RELEASE" | sed -e 's/ /_/')
echo "${FUJITSU_PROC}-fujitsu-${FUJITSU_SYS}${FUJITSU_REL}"
exit ;;
5000:UNIX_System_V:4.*:*)
- FUJITSU_SYS=`uname -p | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz | sed -e 's/\///'`
- FUJITSU_REL=`echo "$UNAME_RELEASE" | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz | sed -e 's/ /_/'`
+ FUJITSU_SYS=$(uname -p | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz | sed -e 's/\///')
+ FUJITSU_REL=$(echo "$UNAME_RELEASE" | tr ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz | sed -e 's/ /_/')
echo "sparc-fujitsu-${FUJITSU_SYS}${FUJITSU_REL}"
exit ;;
i*86:BSD/386:*:* | i*86:BSD/OS:*:* | *:Ascend\ Embedded/OS:*:*)
echo "$UNAME_MACHINE"-unknown-bsdi"$UNAME_RELEASE"
exit ;;
arm:FreeBSD:*:*)
- UNAME_PROCESSOR=`uname -p`
+ UNAME_PROCESSOR=$(uname -p)
set_cc_for_build
if echo __ARM_PCS_VFP | $CC_FOR_BUILD -E - 2>/dev/null \
| grep -q __ARM_PCS_VFP
then
- echo "${UNAME_PROCESSOR}"-unknown-freebsd"`echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`"-gnueabi
+ echo "${UNAME_PROCESSOR}"-unknown-freebsd"$(echo ${UNAME_RELEASE}|sed -e 's/[-(].*//')"-gnueabi
else
- echo "${UNAME_PROCESSOR}"-unknown-freebsd"`echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`"-gnueabihf
+ echo "${UNAME_PROCESSOR}"-unknown-freebsd"$(echo ${UNAME_RELEASE}|sed -e 's/[-(].*//')"-gnueabihf
fi
exit ;;
*:FreeBSD:*:*)
- UNAME_PROCESSOR=`/usr/bin/uname -p`
+ UNAME_PROCESSOR=$(/usr/bin/uname -p)
case "$UNAME_PROCESSOR" in
amd64)
UNAME_PROCESSOR=x86_64 ;;
i386)
UNAME_PROCESSOR=i586 ;;
esac
- echo "$UNAME_PROCESSOR"-unknown-freebsd"`echo "$UNAME_RELEASE"|sed -e 's/[-(].*//'`"
+ echo "$UNAME_PROCESSOR"-unknown-freebsd"$(echo "$UNAME_RELEASE"|sed -e 's/[-(].*//')"
exit ;;
i*:CYGWIN*:*)
echo "$UNAME_MACHINE"-pc-cygwin
echo x86_64-pc-cygwin
exit ;;
prep*:SunOS:5.*:*)
- echo powerpcle-unknown-solaris2"`echo "$UNAME_RELEASE"|sed -e 's/[^.]*//'`"
+ echo powerpcle-unknown-solaris2"$(echo "$UNAME_RELEASE"|sed -e 's/[^.]*//')"
exit ;;
*:GNU:*:*)
# the GNU system
- echo "`echo "$UNAME_MACHINE"|sed -e 's,[-/].*$,,'`-unknown-$LIBC`echo "$UNAME_RELEASE"|sed -e 's,/.*$,,'`"
+ echo "$(echo "$UNAME_MACHINE"|sed -e 's,[-/].*$,,')-unknown-$LIBC$(echo "$UNAME_RELEASE"|sed -e 's,/.*$,,')"
exit ;;
*:GNU/*:*:*)
# other systems with GNU libc and userland
- echo "$UNAME_MACHINE-unknown-`echo "$UNAME_SYSTEM" | sed 's,^[^/]*/,,' | tr "[:upper:]" "[:lower:]"``echo "$UNAME_RELEASE"|sed -e 's/[-(].*//'`-$LIBC"
+ echo "$UNAME_MACHINE-unknown-$(echo "$UNAME_SYSTEM" | sed 's,^[^/]*/,,' | tr "[:upper:]" "[:lower:]")$(echo "$UNAME_RELEASE"|sed -e 's/[-(].*//')-$LIBC"
exit ;;
*:Minix:*:*)
echo "$UNAME_MACHINE"-unknown-minix
echo "$UNAME_MACHINE"-unknown-linux-"$LIBC"
exit ;;
alpha:Linux:*:*)
- case `sed -n '/^cpu model/s/^.*: \(.*\)/\1/p' /proc/cpuinfo 2>/dev/null` in
+ case $(sed -n '/^cpu model/s/^.*: \(.*\)/\1/p' /proc/cpuinfo 2>/dev/null) in
EV5) UNAME_MACHINE=alphaev5 ;;
EV56) UNAME_MACHINE=alphaev56 ;;
PCA56) UNAME_MACHINE=alphapca56 ;;
#endif
#endif
EOF
- eval "`$CC_FOR_BUILD -E "$dummy.c" 2>/dev/null | grep '^CPU\|^MIPS_ENDIAN\|^LIBCABI'`"
+ eval "$($CC_FOR_BUILD -E "$dummy.c" 2>/dev/null | grep '^CPU\|^MIPS_ENDIAN\|^LIBCABI')"
test "x$CPU" != x && { echo "$CPU${MIPS_ENDIAN}-unknown-linux-$LIBCABI"; exit; }
;;
mips64el:Linux:*:*)
exit ;;
parisc:Linux:*:* | hppa:Linux:*:*)
# Look for CPU level
- case `grep '^cpu[^a-z]*:' /proc/cpuinfo 2>/dev/null | cut -d' ' -f2` in
+ case $(grep '^cpu[^a-z]*:' /proc/cpuinfo 2>/dev/null | cut -d' ' -f2) in
PA7*) echo hppa1.1-unknown-linux-"$LIBC" ;;
PA8*) echo hppa2.0-unknown-linux-"$LIBC" ;;
*) echo hppa-unknown-linux-"$LIBC" ;;
echo "$UNAME_MACHINE"-dec-linux-"$LIBC"
exit ;;
x86_64:Linux:*:*)
- echo "$UNAME_MACHINE"-pc-linux-"$LIBC"
+ set_cc_for_build
+ LIBCABI=$LIBC
+ if test "$CC_FOR_BUILD" != no_compiler_found; then
+ if (echo '#ifdef __ILP32__'; echo IS_X32; echo '#endif') | \
+ (CCOPTS="" $CC_FOR_BUILD -E - 2>/dev/null) | \
+ grep IS_X32 >/dev/null
+ then
+ LIBCABI="$LIBC"x32
+ fi
+ fi
+ echo "$UNAME_MACHINE"-pc-linux-"$LIBCABI"
exit ;;
xtensa*:Linux:*:*)
echo "$UNAME_MACHINE"-unknown-linux-"$LIBC"
echo "$UNAME_MACHINE"-pc-msdosdjgpp
exit ;;
i*86:*:4.*:*)
- UNAME_REL=`echo "$UNAME_RELEASE" | sed 's/\/MP$//'`
+ UNAME_REL=$(echo "$UNAME_RELEASE" | sed 's/\/MP$//')
if grep Novell /usr/include/link.h >/dev/null 2>/dev/null; then
echo "$UNAME_MACHINE"-univel-sysv"$UNAME_REL"
else
exit ;;
i*86:*:5:[678]*)
# UnixWare 7.x, OpenUNIX and OpenServer 6.
- case `/bin/uname -X | grep "^Machine"` in
+ case $(/bin/uname -X | grep "^Machine") in
*486*) UNAME_MACHINE=i486 ;;
*Pentium) UNAME_MACHINE=i586 ;;
*Pent*|*Celeron) UNAME_MACHINE=i686 ;;
exit ;;
i*86:*:3.2:*)
if test -f /usr/options/cb.name; then
- UNAME_REL=`sed -n 's/.*Version //p' </usr/options/cb.name`
+ UNAME_REL=$(sed -n 's/.*Version //p' </usr/options/cb.name)
echo "$UNAME_MACHINE"-pc-isc"$UNAME_REL"
elif /bin/uname -X 2>/dev/null >/dev/null ; then
- UNAME_REL=`(/bin/uname -X|grep Release|sed -e 's/.*= //')`
+ UNAME_REL=$( (/bin/uname -X|grep Release|sed -e 's/.*= //'))
(/bin/uname -X|grep i80486 >/dev/null) && UNAME_MACHINE=i486
(/bin/uname -X|grep '^Machine.*Pentium' >/dev/null) \
&& UNAME_MACHINE=i586
3[345]??:*:4.0:3.0 | 3[34]??A:*:4.0:3.0 | 3[34]??,*:*:4.0:3.0 | 3[34]??/*:*:4.0:3.0 | 4400:*:4.0:3.0 | 4850:*:4.0:3.0 | SKA40:*:4.0:3.0 | SDS2:*:4.0:3.0 | SHG2:*:4.0:3.0 | S7501*:*:4.0:3.0)
OS_REL=''
test -r /etc/.relid \
- && OS_REL=.`sed -n 's/[^ ]* [^ ]* \([0-9][0-9]\).*/\1/p' < /etc/.relid`
+ && OS_REL=.$(sed -n 's/[^ ]* [^ ]* \([0-9][0-9]\).*/\1/p' < /etc/.relid)
/bin/uname -p 2>/dev/null | grep 86 >/dev/null \
&& { echo i486-ncr-sysv4.3"$OS_REL"; exit; }
/bin/uname -p 2>/dev/null | /bin/grep entium >/dev/null \
NCR*:*:4.2:* | MPRAS*:*:4.2:*)
OS_REL='.3'
test -r /etc/.relid \
- && OS_REL=.`sed -n 's/[^ ]* [^ ]* \([0-9][0-9]\).*/\1/p' < /etc/.relid`
+ && OS_REL=.$(sed -n 's/[^ ]* [^ ]* \([0-9][0-9]\).*/\1/p' < /etc/.relid)
/bin/uname -p 2>/dev/null | grep 86 >/dev/null \
&& { echo i486-ncr-sysv4.3"$OS_REL"; exit; }
/bin/uname -p 2>/dev/null | /bin/grep entium >/dev/null \
exit ;;
*:SINIX-*:*:*)
if uname -p 2>/dev/null >/dev/null ; then
- UNAME_MACHINE=`(uname -p) 2>/dev/null`
+ UNAME_MACHINE=$( (uname -p) 2>/dev/null)
echo "$UNAME_MACHINE"-sni-sysv4
else
echo ns32k-sni-sysv
echo mips-sony-newsos6
exit ;;
R[34]000:*System_V*:*:* | R4000:UNIX_SYSV:*:* | R*000:UNIX_SV:*:*)
- if [ -d /usr/nec ]; then
+ if test -d /usr/nec; then
echo mips-nec-sysv"$UNAME_RELEASE"
else
echo mips-unknown-sysv"$UNAME_RELEASE"
*:Rhapsody:*:*)
echo "$UNAME_MACHINE"-apple-rhapsody"$UNAME_RELEASE"
exit ;;
+ arm64:Darwin:*:*)
+ echo aarch64-apple-darwin"$UNAME_RELEASE"
+ exit ;;
*:Darwin:*:*)
- UNAME_PROCESSOR=`uname -p`
+ UNAME_PROCESSOR=$(uname -p)
case $UNAME_PROCESSOR in
unknown) UNAME_PROCESSOR=powerpc ;;
esac
else
set_cc_for_build
fi
- if [ "$CC_FOR_BUILD" != no_compiler_found ]; then
+ if test "$CC_FOR_BUILD" != no_compiler_found; then
if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \
(CCOPTS="" $CC_FOR_BUILD -E - 2>/dev/null) | \
grep IS_64BIT_ARCH >/dev/null
echo "$UNAME_PROCESSOR"-apple-darwin"$UNAME_RELEASE"
exit ;;
*:procnto*:*:* | *:QNX:[0123456789]*:*)
- UNAME_PROCESSOR=`uname -p`
+ UNAME_PROCESSOR=$(uname -p)
if test "$UNAME_PROCESSOR" = x86; then
UNAME_PROCESSOR=i386
UNAME_MACHINE=pc
echo mips-sei-seiux"$UNAME_RELEASE"
exit ;;
*:DragonFly:*:*)
- echo "$UNAME_MACHINE"-unknown-dragonfly"`echo "$UNAME_RELEASE"|sed -e 's/[-(].*//'`"
+ echo "$UNAME_MACHINE"-unknown-dragonfly"$(echo "$UNAME_RELEASE"|sed -e 's/[-(].*//')"
exit ;;
*:*VMS:*:*)
- UNAME_MACHINE=`(uname -p) 2>/dev/null`
+ UNAME_MACHINE=$( (uname -p) 2>/dev/null)
case "$UNAME_MACHINE" in
A*) echo alpha-dec-vms ; exit ;;
I*) echo ia64-dec-vms ; exit ;;
echo i386-pc-xenix
exit ;;
i*86:skyos:*:*)
- echo "$UNAME_MACHINE"-pc-skyos"`echo "$UNAME_RELEASE" | sed -e 's/ .*$//'`"
+ echo "$UNAME_MACHINE"-pc-skyos"$(echo "$UNAME_RELEASE" | sed -e 's/ .*$//')"
exit ;;
i*86:rdos:*:*)
echo "$UNAME_MACHINE"-pc-rdos
#define __ARCHITECTURE__ "m68k"
#endif
int version;
- version=`(hostinfo | sed -n 's/.*NeXT Mach \([0-9]*\).*/\1/p') 2>/dev/null`;
+ version=$( (hostinfo | sed -n 's/.*NeXT Mach \([0-9]*\).*/\1/p') 2>/dev/null);
if (version < 4)
printf ("%s-next-nextstep%d\n", __ARCHITECTURE__, version);
else
}
EOF
-$CC_FOR_BUILD -o "$dummy" "$dummy.c" 2>/dev/null && SYSTEM_NAME=`$dummy` &&
+$CC_FOR_BUILD -o "$dummy" "$dummy.c" 2>/dev/null && SYSTEM_NAME=$($dummy) &&
{ echo "$SYSTEM_NAME"; exit; }
# Apollos put the system type in the environment.
https://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess
and
https://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.sub
+EOF
+
+year=$(echo $timestamp | sed 's,-.*,,')
+# shellcheck disable=SC2003
+if test "$(expr "$(date +%Y)" - "$year")" -lt 3 ; then
+ cat >&2 <<EOF
If $0 has already been updated, send the following data and any
information you think might be pertinent to config-patches@gnu.org to
config.guess timestamp = $timestamp
-uname -m = `(uname -m) 2>/dev/null || echo unknown`
-uname -r = `(uname -r) 2>/dev/null || echo unknown`
-uname -s = `(uname -s) 2>/dev/null || echo unknown`
-uname -v = `(uname -v) 2>/dev/null || echo unknown`
+uname -m = $( (uname -m) 2>/dev/null || echo unknown)
+uname -r = $( (uname -r) 2>/dev/null || echo unknown)
+uname -s = $( (uname -s) 2>/dev/null || echo unknown)
+uname -v = $( (uname -v) 2>/dev/null || echo unknown)
-/usr/bin/uname -p = `(/usr/bin/uname -p) 2>/dev/null`
-/bin/uname -X = `(/bin/uname -X) 2>/dev/null`
+/usr/bin/uname -p = $( (/usr/bin/uname -p) 2>/dev/null)
+/bin/uname -X = $( (/bin/uname -X) 2>/dev/null)
-hostinfo = `(hostinfo) 2>/dev/null`
-/bin/universe = `(/bin/universe) 2>/dev/null`
-/usr/bin/arch -k = `(/usr/bin/arch -k) 2>/dev/null`
-/bin/arch = `(/bin/arch) 2>/dev/null`
-/usr/bin/oslevel = `(/usr/bin/oslevel) 2>/dev/null`
-/usr/convex/getsysinfo = `(/usr/convex/getsysinfo) 2>/dev/null`
+hostinfo = $( (hostinfo) 2>/dev/null)
+/bin/universe = $( (/bin/universe) 2>/dev/null)
+/usr/bin/arch -k = $( (/usr/bin/arch -k) 2>/dev/null)
+/bin/arch = $( (/bin/arch) 2>/dev/null)
+/usr/bin/oslevel = $( (/usr/bin/oslevel) 2>/dev/null)
+/usr/convex/getsysinfo = $( (/usr/convex/getsysinfo) 2>/dev/null)
UNAME_MACHINE = "$UNAME_MACHINE"
UNAME_RELEASE = "$UNAME_RELEASE"
UNAME_SYSTEM = "$UNAME_SYSTEM"
UNAME_VERSION = "$UNAME_VERSION"
EOF
+fi
exit 1
# Configuration validation subroutine script.
# Copyright 1992-2020 Free Software Foundation, Inc.
-timestamp='2020-01-01'
+timestamp='2020-11-07'
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
# CPU_TYPE-MANUFACTURER-KERNEL-OPERATING_SYSTEM
# It is wrong to echo any other type of specification.
-me=`echo "$0" | sed -e 's,.*/,,'`
+me=$(echo "$0" | sed -e 's,.*/,,')
usage="\
Usage: $0 [OPTION] CPU-MFR-OPSYS or ALIAS
;;
*-*-*-*)
basic_machine=$field1-$field2
- os=$field3-$field4
+ basic_os=$field3-$field4
;;
*-*-*)
# Ambiguous whether COMPANY is present, or skipped and KERNEL-OS is two
# parts
maybe_os=$field2-$field3
case $maybe_os in
- nto-qnx* | linux-gnu* | linux-android* | linux-dietlibc \
- | linux-newlib* | linux-musl* | linux-uclibc* | uclinux-uclibc* \
+ nto-qnx* | linux-* | uclinux-uclibc* \
| uclinux-gnu* | kfreebsd*-gnu* | knetbsd*-gnu* | netbsd*-gnu* \
| netbsd*-eabi* | kopensolaris*-gnu* | cloudabi*-eabi* \
| storm-chaos* | os2-emx* | rtmk-nova*)
basic_machine=$field1
- os=$maybe_os
+ basic_os=$maybe_os
;;
android-linux)
basic_machine=$field1-unknown
- os=linux-android
+ basic_os=linux-android
;;
*)
basic_machine=$field1-$field2
- os=$field3
+ basic_os=$field3
;;
esac
;;
case $field1-$field2 in
decstation-3100)
basic_machine=mips-dec
- os=
+ basic_os=
;;
*-*)
# Second component is usually, but not always the OS
# Prevent following clause from handling this valid os
sun*os*)
basic_machine=$field1
- os=$field2
+ basic_os=$field2
;;
# Manufacturers
dec* | mips* | sequent* | encore* | pc533* | sgi* | sony* \
| microblaze* | sim | cisco \
| oki | wec | wrs | winbond)
basic_machine=$field1-$field2
- os=
+ basic_os=
;;
*)
basic_machine=$field1
- os=$field2
+ basic_os=$field2
;;
esac
;;
case $field1 in
386bsd)
basic_machine=i386-pc
- os=bsd
+ basic_os=bsd
;;
a29khif)
basic_machine=a29k-amd
- os=udi
+ basic_os=udi
;;
adobe68k)
basic_machine=m68010-adobe
- os=scout
+ basic_os=scout
;;
alliant)
basic_machine=fx80-alliant
- os=
+ basic_os=
;;
altos | altos3068)
basic_machine=m68k-altos
- os=
+ basic_os=
;;
am29k)
basic_machine=a29k-none
- os=bsd
+ basic_os=bsd
;;
amdahl)
basic_machine=580-amdahl
- os=sysv
+ basic_os=sysv
;;
amiga)
basic_machine=m68k-unknown
- os=
+ basic_os=
;;
amigaos | amigados)
basic_machine=m68k-unknown
- os=amigaos
+ basic_os=amigaos
;;
amigaunix | amix)
basic_machine=m68k-unknown
- os=sysv4
+ basic_os=sysv4
;;
apollo68)
basic_machine=m68k-apollo
- os=sysv
+ basic_os=sysv
;;
apollo68bsd)
basic_machine=m68k-apollo
- os=bsd
+ basic_os=bsd
;;
aros)
basic_machine=i386-pc
- os=aros
+ basic_os=aros
;;
aux)
basic_machine=m68k-apple
- os=aux
+ basic_os=aux
;;
balance)
basic_machine=ns32k-sequent
- os=dynix
+ basic_os=dynix
;;
blackfin)
basic_machine=bfin-unknown
- os=linux
+ basic_os=linux
;;
cegcc)
basic_machine=arm-unknown
- os=cegcc
+ basic_os=cegcc
;;
convex-c1)
basic_machine=c1-convex
- os=bsd
+ basic_os=bsd
;;
convex-c2)
basic_machine=c2-convex
- os=bsd
+ basic_os=bsd
;;
convex-c32)
basic_machine=c32-convex
- os=bsd
+ basic_os=bsd
;;
convex-c34)
basic_machine=c34-convex
- os=bsd
+ basic_os=bsd
;;
convex-c38)
basic_machine=c38-convex
- os=bsd
+ basic_os=bsd
;;
cray)
basic_machine=j90-cray
- os=unicos
+ basic_os=unicos
;;
crds | unos)
basic_machine=m68k-crds
- os=
+ basic_os=
;;
da30)
basic_machine=m68k-da30
- os=
+ basic_os=
;;
decstation | pmax | pmin | dec3100 | decstatn)
basic_machine=mips-dec
- os=
+ basic_os=
;;
delta88)
basic_machine=m88k-motorola
- os=sysv3
+ basic_os=sysv3
;;
dicos)
basic_machine=i686-pc
- os=dicos
+ basic_os=dicos
;;
djgpp)
basic_machine=i586-pc
- os=msdosdjgpp
+ basic_os=msdosdjgpp
;;
ebmon29k)
basic_machine=a29k-amd
- os=ebmon
+ basic_os=ebmon
;;
es1800 | OSE68k | ose68k | ose | OSE)
basic_machine=m68k-ericsson
- os=ose
+ basic_os=ose
;;
gmicro)
basic_machine=tron-gmicro
- os=sysv
+ basic_os=sysv
;;
go32)
basic_machine=i386-pc
- os=go32
+ basic_os=go32
;;
h8300hms)
basic_machine=h8300-hitachi
- os=hms
+ basic_os=hms
;;
h8300xray)
basic_machine=h8300-hitachi
- os=xray
+ basic_os=xray
;;
h8500hms)
basic_machine=h8500-hitachi
- os=hms
+ basic_os=hms
;;
harris)
basic_machine=m88k-harris
- os=sysv3
+ basic_os=sysv3
;;
hp300 | hp300hpux)
basic_machine=m68k-hp
- os=hpux
+ basic_os=hpux
;;
hp300bsd)
basic_machine=m68k-hp
- os=bsd
+ basic_os=bsd
;;
hppaosf)
basic_machine=hppa1.1-hp
- os=osf
+ basic_os=osf
;;
hppro)
basic_machine=hppa1.1-hp
- os=proelf
+ basic_os=proelf
;;
i386mach)
basic_machine=i386-mach
- os=mach
+ basic_os=mach
;;
isi68 | isi)
basic_machine=m68k-isi
- os=sysv
+ basic_os=sysv
;;
m68knommu)
basic_machine=m68k-unknown
- os=linux
+ basic_os=linux
;;
magnum | m3230)
basic_machine=mips-mips
- os=sysv
+ basic_os=sysv
;;
merlin)
basic_machine=ns32k-utek
- os=sysv
+ basic_os=sysv
;;
mingw64)
basic_machine=x86_64-pc
- os=mingw64
+ basic_os=mingw64
;;
mingw32)
basic_machine=i686-pc
- os=mingw32
+ basic_os=mingw32
;;
mingw32ce)
basic_machine=arm-unknown
- os=mingw32ce
+ basic_os=mingw32ce
;;
monitor)
basic_machine=m68k-rom68k
- os=coff
+ basic_os=coff
;;
morphos)
basic_machine=powerpc-unknown
- os=morphos
+ basic_os=morphos
;;
moxiebox)
basic_machine=moxie-unknown
- os=moxiebox
+ basic_os=moxiebox
;;
msdos)
basic_machine=i386-pc
- os=msdos
+ basic_os=msdos
;;
msys)
basic_machine=i686-pc
- os=msys
+ basic_os=msys
;;
mvs)
basic_machine=i370-ibm
- os=mvs
+ basic_os=mvs
;;
nacl)
basic_machine=le32-unknown
- os=nacl
+ basic_os=nacl
;;
ncr3000)
basic_machine=i486-ncr
- os=sysv4
+ basic_os=sysv4
;;
netbsd386)
basic_machine=i386-pc
- os=netbsd
+ basic_os=netbsd
;;
netwinder)
basic_machine=armv4l-rebel
- os=linux
+ basic_os=linux
;;
news | news700 | news800 | news900)
basic_machine=m68k-sony
- os=newsos
+ basic_os=newsos
;;
news1000)
basic_machine=m68030-sony
- os=newsos
+ basic_os=newsos
;;
necv70)
basic_machine=v70-nec
- os=sysv
+ basic_os=sysv
;;
nh3000)
basic_machine=m68k-harris
- os=cxux
+ basic_os=cxux
;;
nh[45]000)
basic_machine=m88k-harris
- os=cxux
+ basic_os=cxux
;;
nindy960)
basic_machine=i960-intel
- os=nindy
+ basic_os=nindy
;;
mon960)
basic_machine=i960-intel
- os=mon960
+ basic_os=mon960
;;
nonstopux)
basic_machine=mips-compaq
- os=nonstopux
+ basic_os=nonstopux
;;
os400)
basic_machine=powerpc-ibm
- os=os400
+ basic_os=os400
;;
OSE68000 | ose68000)
basic_machine=m68000-ericsson
- os=ose
+ basic_os=ose
;;
os68k)
basic_machine=m68k-none
- os=os68k
+ basic_os=os68k
;;
paragon)
basic_machine=i860-intel
- os=osf
+ basic_os=osf
;;
parisc)
basic_machine=hppa-unknown
- os=linux
+ basic_os=linux
+ ;;
+ psp)
+ basic_machine=mipsallegrexel-sony
+ basic_os=psp
;;
pw32)
basic_machine=i586-unknown
- os=pw32
+ basic_os=pw32
;;
rdos | rdos64)
basic_machine=x86_64-pc
- os=rdos
+ basic_os=rdos
;;
rdos32)
basic_machine=i386-pc
- os=rdos
+ basic_os=rdos
;;
rom68k)
basic_machine=m68k-rom68k
- os=coff
+ basic_os=coff
;;
sa29200)
basic_machine=a29k-amd
- os=udi
+ basic_os=udi
;;
sei)
basic_machine=mips-sei
- os=seiux
+ basic_os=seiux
;;
sequent)
basic_machine=i386-sequent
- os=
+ basic_os=
;;
sps7)
basic_machine=m68k-bull
- os=sysv2
+ basic_os=sysv2
;;
st2000)
basic_machine=m68k-tandem
- os=
+ basic_os=
;;
stratus)
basic_machine=i860-stratus
- os=sysv4
+ basic_os=sysv4
;;
sun2)
basic_machine=m68000-sun
- os=
+ basic_os=
;;
sun2os3)
basic_machine=m68000-sun
- os=sunos3
+ basic_os=sunos3
;;
sun2os4)
basic_machine=m68000-sun
- os=sunos4
+ basic_os=sunos4
;;
sun3)
basic_machine=m68k-sun
- os=
+ basic_os=
;;
sun3os3)
basic_machine=m68k-sun
- os=sunos3
+ basic_os=sunos3
;;
sun3os4)
basic_machine=m68k-sun
- os=sunos4
+ basic_os=sunos4
;;
sun4)
basic_machine=sparc-sun
- os=
+ basic_os=
;;
sun4os3)
basic_machine=sparc-sun
- os=sunos3
+ basic_os=sunos3
;;
sun4os4)
basic_machine=sparc-sun
- os=sunos4
+ basic_os=sunos4
;;
sun4sol2)
basic_machine=sparc-sun
- os=solaris2
+ basic_os=solaris2
;;
sun386 | sun386i | roadrunner)
basic_machine=i386-sun
- os=
+ basic_os=
;;
sv1)
basic_machine=sv1-cray
- os=unicos
+ basic_os=unicos
;;
symmetry)
basic_machine=i386-sequent
- os=dynix
+ basic_os=dynix
;;
t3e)
basic_machine=alphaev5-cray
- os=unicos
+ basic_os=unicos
;;
t90)
basic_machine=t90-cray
- os=unicos
+ basic_os=unicos
;;
toad1)
basic_machine=pdp10-xkl
- os=tops20
+ basic_os=tops20
;;
tpf)
basic_machine=s390x-ibm
- os=tpf
+ basic_os=tpf
;;
udi29k)
basic_machine=a29k-amd
- os=udi
+ basic_os=udi
;;
ultra3)
basic_machine=a29k-nyu
- os=sym1
+ basic_os=sym1
;;
v810 | necv810)
basic_machine=v810-nec
- os=none
+ basic_os=none
;;
vaxv)
basic_machine=vax-dec
- os=sysv
+ basic_os=sysv
;;
vms)
basic_machine=vax-dec
- os=vms
+ basic_os=vms
;;
vsta)
basic_machine=i386-pc
- os=vsta
+ basic_os=vsta
;;
vxworks960)
basic_machine=i960-wrs
- os=vxworks
+ basic_os=vxworks
;;
vxworks68)
basic_machine=m68k-wrs
- os=vxworks
+ basic_os=vxworks
;;
vxworks29k)
basic_machine=a29k-wrs
- os=vxworks
+ basic_os=vxworks
;;
xbox)
basic_machine=i686-pc
- os=mingw32
+ basic_os=mingw32
;;
ymp)
basic_machine=ymp-cray
- os=unicos
+ basic_os=unicos
;;
*)
basic_machine=$1
- os=
+ basic_os=
;;
esac
;;
bluegene*)
cpu=powerpc
vendor=ibm
- os=cnk
+ basic_os=cnk
;;
decsystem10* | dec10*)
cpu=pdp10
vendor=dec
- os=tops10
+ basic_os=tops10
;;
decsystem20* | dec20*)
cpu=pdp10
vendor=dec
- os=tops20
+ basic_os=tops20
;;
delta | 3300 | motorola-3300 | motorola-delta \
| 3300-motorola | delta-motorola)
dpx2*)
cpu=m68k
vendor=bull
- os=sysv3
+ basic_os=sysv3
;;
encore | umax | mmax)
cpu=ns32k
elxsi)
cpu=elxsi
vendor=elxsi
- os=${os:-bsd}
+ basic_os=${basic_os:-bsd}
;;
fx2800)
cpu=i860
h3050r* | hiux*)
cpu=hppa1.1
vendor=hitachi
- os=hiuxwe2
+ basic_os=hiuxwe2
;;
hp3k9[0-9][0-9] | hp9[0-9][0-9])
cpu=hppa1.0
vendor=hp
;;
i*86v32)
- cpu=`echo "$1" | sed -e 's/86.*/86/'`
+ cpu=$(echo "$1" | sed -e 's/86.*/86/')
vendor=pc
- os=sysv32
+ basic_os=sysv32
;;
i*86v4*)
- cpu=`echo "$1" | sed -e 's/86.*/86/'`
+ cpu=$(echo "$1" | sed -e 's/86.*/86/')
vendor=pc
- os=sysv4
+ basic_os=sysv4
;;
i*86v)
- cpu=`echo "$1" | sed -e 's/86.*/86/'`
+ cpu=$(echo "$1" | sed -e 's/86.*/86/')
vendor=pc
- os=sysv
+ basic_os=sysv
;;
i*86sol2)
- cpu=`echo "$1" | sed -e 's/86.*/86/'`
+ cpu=$(echo "$1" | sed -e 's/86.*/86/')
vendor=pc
- os=solaris2
+ basic_os=solaris2
;;
j90 | j90-cray)
cpu=j90
vendor=cray
- os=${os:-unicos}
+ basic_os=${basic_os:-unicos}
;;
iris | iris4d)
cpu=mips
vendor=sgi
- case $os in
+ case $basic_os in
irix*)
;;
*)
- os=irix4
+ basic_os=irix4
;;
esac
;;
*mint | mint[0-9]* | *MiNT | *MiNT[0-9]*)
cpu=m68k
vendor=atari
- os=mint
+ basic_os=mint
;;
news-3600 | risc-news)
cpu=mips
vendor=sony
- os=newsos
+ basic_os=newsos
;;
next | m*-next)
cpu=m68k
vendor=next
- case $os in
+ case $basic_os in
openstep*)
;;
nextstep*)
;;
ns2*)
- os=nextstep2
+ basic_os=nextstep2
;;
*)
- os=nextstep3
+ basic_os=nextstep3
;;
esac
;;
op50n-* | op60c-*)
cpu=hppa1.1
vendor=oki
- os=proelf
+ basic_os=proelf
;;
pa-hitachi)
cpu=hppa1.1
vendor=hitachi
- os=hiuxwe2
+ basic_os=hiuxwe2
;;
pbd)
cpu=sparc
sde)
cpu=mipsisa32
vendor=sde
- os=${os:-elf}
+ basic_os=${basic_os:-elf}
;;
simso-wrs)
cpu=sparclite
vendor=wrs
- os=vxworks
+ basic_os=vxworks
;;
tower | tower-32)
cpu=m68k
w89k-*)
cpu=hppa1.1
vendor=winbond
- os=proelf
+ basic_os=proelf
;;
none)
cpu=none
;;
leon-*|leon[3-9]-*)
cpu=sparc
- vendor=`echo "$basic_machine" | sed 's/-.*//'`
+ vendor=$(echo "$basic_machine" | sed 's/-.*//')
;;
*-*)
# some cases the only manufacturer, in others, it is the most popular.
craynv-unknown)
vendor=cray
- os=${os:-unicosmp}
+ basic_os=${basic_os:-unicosmp}
;;
c90-unknown | c90-cray)
vendor=cray
- os=${os:-unicos}
+ basic_os=${basic_os:-unicos}
;;
fx80-unknown)
vendor=alliant
dpx20-unknown | dpx20-bull)
cpu=rs6000
vendor=bull
- os=${os:-bosx}
+ basic_os=${basic_os:-bosx}
;;
# Here we normalize CPU types irrespective of the vendor
;;
blackfin-*)
cpu=bfin
- os=linux
+ basic_os=linux
;;
c54x-*)
cpu=tic54x
;;
e500v[12]-*)
cpu=powerpc
- os=$os"spe"
+ basic_os=${basic_os}"spe"
;;
mips3*-*)
cpu=mips64
;;
m68knommu-*)
cpu=m68k
- os=linux
+ basic_os=linux
;;
m9s12z-* | m68hcs12z-* | hcs12z-* | s12z-*)
cpu=s12z
;;
parisc-*)
cpu=hppa
- os=linux
+ basic_os=linux
;;
pentium-* | p5-* | k5-* | k6-* | nexgen-* | viac3-*)
cpu=i586
cpu=mipsisa64sb1el
;;
sh5e[lb]-*)
- cpu=`echo "$cpu" | sed 's/^\(sh.\)e\(.\)$/\1\2e/'`
+ cpu=$(echo "$cpu" | sed 's/^\(sh.\)e\(.\)$/\1\2e/')
;;
spur-*)
cpu=spur
cpu=x86_64
;;
xscale-* | xscalee[bl]-*)
- cpu=`echo "$cpu" | sed 's/^xscale/arm/'`
+ cpu=$(echo "$cpu" | sed 's/^xscale/arm/')
+ ;;
+ arm64-*)
+ cpu=aarch64
;;
# Recognize the canonical CPU Types that limit and/or modify the
# company names they are paired with.
cr16-*)
- os=${os:-elf}
+ basic_os=${basic_os:-elf}
;;
crisv32-* | etraxfs*-*)
cpu=crisv32
vendor=axis
;;
crx-*)
- os=${os:-elf}
+ basic_os=${basic_os:-elf}
;;
neo-tandem)
cpu=neo
cpu=nsx
vendor=tandem
;;
- s390-*)
- cpu=s390
- vendor=ibm
- ;;
- s390x-*)
- cpu=s390x
- vendor=ibm
+ mipsallegrexel-sony)
+ cpu=mipsallegrexel
+ vendor=sony
;;
tile*-*)
- os=${os:-linux-gnu}
+ basic_os=${basic_os:-linux-gnu}
;;
*)
| am33_2.0 \
| amdgcn \
| arc | arceb \
- | arm | arm[lb]e | arme[lb] | armv* \
+ | arm | arm[lb]e | arme[lb] | armv* \
| avr | avr32 \
| asmjs \
| ba \
| pyramid \
| riscv | riscv32 | riscv64 \
| rl78 | romp | rs6000 | rx \
+ | s390 | s390x \
| score \
| sh | shl \
| sh[1234] | sh[24]a | sh[24]ae[lb] | sh[23]e | she[lb] | sh[lb]e \
# Decode manufacturer-specific aliases for certain operating systems.
-if [ x$os != x ]
+if test x$basic_os != x
then
+
+# First recognize some ad-hoc cases, or perhaps split kernel-os, or else just
+# set os.
+case $basic_os in
+ gnu/linux*)
+ kernel=linux
+ os=$(echo $basic_os | sed -e 's|gnu/linux|gnu|')
+ ;;
+ os2-emx)
+ kernel=os2
+ os=$(echo $basic_os | sed -e 's|os2-emx|emx|')
+ ;;
+ nto-qnx*)
+ kernel=nto
+ os=$(echo $basic_os | sed -e 's|nto-qnx|qnx|')
+ ;;
+ *-*)
+ # shellcheck disable=SC2162
+ IFS="-" read kernel os <<EOF
+$basic_os
+EOF
+ ;;
+ # Default OS when just kernel was specified
+ nto*)
+ kernel=nto
+ os=$(echo $basic_os | sed -e 's|nto|qnx|')
+ ;;
+ linux*)
+ kernel=linux
+ os=$(echo $basic_os | sed -e 's|linux|gnu|')
+ ;;
+ *)
+ kernel=
+ os=$basic_os
+ ;;
+esac
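
An editorial aside, not part of config.sub: a minimal shell sketch of what the here-doc read above does, using "linux-gnu" purely as an example input string.

IFS="-" read kernel os <<EOF
linux-gnu
EOF
echo "kernel=$kernel os=$os"   # prints: kernel=linux os=gnu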
+
+# Now, normalize the OS (knowing we just have one component, it's not a kernel,
+# etc.)
case $os in
# First match some system type aliases that might get confused
# with valid system types.
os=cnk
;;
solaris1 | solaris1.*)
- os=`echo $os | sed -e 's|solaris1|sunos4|'`
+ os=$(echo $os | sed -e 's|solaris1|sunos4|')
;;
solaris)
os=solaris2
unixware*)
os=sysv4.2uw
;;
- gnu/linux*)
- os=`echo $os | sed -e 's|gnu/linux|linux-gnu|'`
- ;;
# es1800 is here to avoid being matched by es* (a different OS)
es1800*)
os=ose
os=sco3.2v4
;;
sco3.2.[4-9]*)
- os=`echo $os | sed -e 's/sco3.2./sco3.2v/'`
- ;;
- sco3.2v[4-9]* | sco5v6*)
- # Don't forget version if it is 3.2v4 or newer.
+ os=$(echo $os | sed -e 's/sco3.2./sco3.2v/')
;;
- scout)
+ sco*v* | scout)
# Don't match below
;;
sco*)
psos*)
os=psos
;;
- # Now accept the basic system types.
- # The portable systems comes first.
- # Each alternative MUST end in a * to match a version number.
- # sysv* is not here because it comes later, after sysvr4.
- gnu* | bsd* | mach* | minix* | genix* | ultrix* | irix* \
- | *vms* | esix* | aix* | cnk* | sunos | sunos[34]*\
- | hpux* | unos* | osf* | luna* | dgux* | auroraux* | solaris* \
- | sym* | kopensolaris* | plan9* \
- | amigaos* | amigados* | msdos* | newsos* | unicos* | aof* \
- | aos* | aros* | cloudabi* | sortix* | twizzler* \
- | nindy* | vxsim* | vxworks* | ebmon* | hms* | mvs* \
- | clix* | riscos* | uniplus* | iris* | isc* | rtu* | xenix* \
- | knetbsd* | mirbsd* | netbsd* \
- | bitrig* | openbsd* | solidbsd* | libertybsd* | os108* \
- | ekkobsd* | kfreebsd* | freebsd* | riscix* | lynxos* \
- | bosx* | nextstep* | cxux* | aout* | elf* | oabi* \
- | ptx* | coff* | ecoff* | winnt* | domain* | vsta* \
- | udi* | eabi* | lites* | ieee* | go32* | aux* | hcos* \
- | chorusrdb* | cegcc* | glidix* \
- | cygwin* | msys* | pe* | moss* | proelf* | rtems* \
- | midipix* | mingw32* | mingw64* | linux-gnu* | linux-android* \
- | linux-newlib* | linux-musl* | linux-uclibc* \
- | uxpv* | beos* | mpeix* | udk* | moxiebox* \
- | interix* | uwin* | mks* | rhapsody* | darwin* \
- | openstep* | oskit* | conix* | pw32* | nonstopux* \
- | storm-chaos* | tops10* | tenex* | tops20* | its* \
- | os2* | vos* | palmos* | uclinux* | nucleus* \
- | morphos* | superux* | rtmk* | windiss* \
- | powermax* | dnix* | nx6 | nx7 | sei* | dragonfly* \
- | skyos* | haiku* | rdos* | toppers* | drops* | es* \
- | onefs* | tirtos* | phoenix* | fuchsia* | redox* | bme* \
- | midnightbsd* | amdhsa* | unleashed* | emscripten* | wasi* \
- | nsk* | powerunix)
- # Remember, each alternative MUST END IN *, to match a version number.
- ;;
qnx*)
- case $cpu in
- x86 | i*86)
- ;;
- *)
- os=nto-$os
- ;;
- esac
+ os=qnx
;;
hiux*)
os=hiuxwe2
;;
- nto-qnx*)
- ;;
- nto*)
- os=`echo $os | sed -e 's|nto|nto-qnx|'`
- ;;
- sim | xray | os68k* | v88r* \
- | windows* | osx | abug | netware* | os9* \
- | macos* | mpw* | magic* | mmixware* | mon960* | lnews*)
- ;;
- linux-dietlibc)
- os=linux-dietlibc
- ;;
- linux*)
- os=`echo $os | sed -e 's|linux|linux-gnu|'`
- ;;
lynx*178)
os=lynxos178
;;
lynx*5)
os=lynxos5
;;
+ lynxos*)
+ # don't get caught up in next wildcard
+ ;;
lynx*)
os=lynxos
;;
- mac*)
- os=`echo "$os" | sed -e 's|mac|macos|'`
+ mac[0-9]*)
+ os=$(echo "$os" | sed -e 's|mac|macos|')
;;
opened*)
os=openedition
os=os400
;;
sunos5*)
- os=`echo "$os" | sed -e 's|sunos5|solaris2|'`
+ os=$(echo "$os" | sed -e 's|sunos5|solaris2|')
;;
sunos6*)
- os=`echo "$os" | sed -e 's|sunos6|solaris3|'`
+ os=$(echo "$os" | sed -e 's|sunos6|solaris3|')
;;
wince*)
os=wince
;;
# Preserve the version number of sinix5.
sinix5.*)
- os=`echo $os | sed -e 's|sinix|sysv|'`
+ os=$(echo $os | sed -e 's|sinix|sysv|')
;;
sinix*)
os=sysv4
sysvr4)
os=sysv4
;;
- # This must come after sysvr4.
- sysv*)
- ;;
ose*)
os=ose
;;
*mint | mint[0-9]* | *MiNT | MiNT[0-9]*)
os=mint
;;
- zvmoe)
- os=zvmoe
- ;;
dicos*)
os=dicos
;;
;;
esac
;;
- nacl*)
- ;;
- ios)
- ;;
- none)
- ;;
- *-eabi)
- ;;
*)
- echo Invalid configuration \`"$1"\': system \`"$os"\' not recognized 1>&2
- exit 1
+ # No normalization, but not necessarily accepted, that comes below.
;;
esac
+
else
# Here we handle the default operating systems that come with various machines.
# will signal an error saying that MANUFACTURER isn't an operating
# system, and we'll never get to this point.
+kernel=
case $cpu-$vendor in
score-*)
os=elf
os=riscix1.2
;;
arm*-rebel)
- os=linux
+ kernel=linux
+ os=gnu
;;
arm*-semi)
os=aout
os=none
;;
esac
+
fi
+# Now, validate our (potentially fixed-up) OS.
+case $os in
+ # Sometimes we do "kernel-abi", so those need to count as OSes.
+ musl* | newlib* | uclibc*)
+ ;;
+ # Likewise for "kernel-libc"
+ eabi | eabihf | gnueabi | gnueabihf)
+ ;;
+ # Now accept the basic system types.
+ # The portable systems come first.
+ # Each alternative MUST end in a * to match a version number.
+ gnu* | android* | bsd* | mach* | minix* | genix* | ultrix* | irix* \
+ | *vms* | esix* | aix* | cnk* | sunos | sunos[34]* \
+ | hpux* | unos* | osf* | luna* | dgux* | auroraux* | solaris* \
+ | sym* | plan9* | psp* | sim* | xray* | os68k* | v88r* \
+ | hiux* | abug | nacl* | netware* | windows* \
+ | os9* | macos* | osx* | ios* \
+ | mpw* | magic* | mmixware* | mon960* | lnews* \
+ | amigaos* | amigados* | msdos* | newsos* | unicos* | aof* \
+ | aos* | aros* | cloudabi* | sortix* | twizzler* \
+ | nindy* | vxsim* | vxworks* | ebmon* | hms* | mvs* \
+ | clix* | riscos* | uniplus* | iris* | isc* | rtu* | xenix* \
+ | mirbsd* | netbsd* | dicos* | openedition* | ose* \
+ | bitrig* | openbsd* | solidbsd* | libertybsd* | os108* \
+ | ekkobsd* | freebsd* | riscix* | lynxos* | os400* \
+ | bosx* | nextstep* | cxux* | aout* | elf* | oabi* \
+ | ptx* | coff* | ecoff* | winnt* | domain* | vsta* \
+ | udi* | lites* | ieee* | go32* | aux* | hcos* \
+ | chorusrdb* | cegcc* | glidix* \
+ | cygwin* | msys* | pe* | moss* | proelf* | rtems* \
+ | midipix* | mingw32* | mingw64* | mint* \
+ | uxpv* | beos* | mpeix* | udk* | moxiebox* \
+ | interix* | uwin* | mks* | rhapsody* | darwin* \
+ | openstep* | oskit* | conix* | pw32* | nonstopux* \
+ | storm-chaos* | tops10* | tenex* | tops20* | its* \
+ | os2* | vos* | palmos* | uclinux* | nucleus* | morphos* \
+ | scout* | superux* | sysv* | rtmk* | tpf* | windiss* \
+ | powermax* | dnix* | nx6 | nx7 | sei* | dragonfly* \
+ | skyos* | haiku* | rdos* | toppers* | drops* | es* \
+ | onefs* | tirtos* | phoenix* | fuchsia* | redox* | bme* \
+ | midnightbsd* | amdhsa* | unleashed* | emscripten* | wasi* \
+ | nsk* | powerunix* | genode* | zvmoe* | qnx* | emx*)
+ ;;
+ # This one is extra strict with allowed versions
+ sco3.2v2 | sco3.2v[4-9]* | sco5v6*)
+ # Don't forget version if it is 3.2v4 or newer.
+ ;;
+ none)
+ ;;
+ *)
+ echo Invalid configuration \`"$1"\': OS \`"$os"\' not recognized 1>&2
+ exit 1
+ ;;
+esac
+
+# As a final step for OS-related things, validate the OS-kernel combination
+# (given a valid OS), if there is a kernel.
+case $kernel-$os in
+ linux-gnu* | linux-dietlibc* | linux-android* | linux-newlib* | linux-musl* | linux-uclibc* )
+ ;;
+ uclinux-uclibc* )
+ ;;
+ -dietlibc* | -newlib* | -musl* | -uclibc* )
+ # These are just libc implementations, not actual OSes, and thus
+ # require a kernel.
+ echo "Invalid configuration \`$1': libc \`$os' needs explicit kernel." 1>&2
+ exit 1
+ ;;
+ kfreebsd*-gnu* | kopensolaris*-gnu*)
+ ;;
+ nto-qnx*)
+ ;;
+ os2-emx)
+ ;;
+ *-eabi* | *-gnueabi*)
+ ;;
+ -*)
+ # Blank kernel with real OS is always fine.
+ ;;
+ *-*)
+ echo "Invalid configuration \`$1': Kernel \`$kernel' not known to work with OS \`$os'." 1>&2
+ exit 1
+ ;;
+esac
+
# Here we handle the case where we know the os, and the CPU type, but not the
# manufacturer. We pick the logical manufacturer.
case $vendor in
unknown)
- case $os in
- riscix*)
+ case $cpu-$os in
+ *-riscix*)
vendor=acorn
;;
- sunos*)
+ *-sunos*)
vendor=sun
;;
- cnk*|-aix*)
+ *-cnk* | *-aix*)
vendor=ibm
;;
- beos*)
+ *-beos*)
vendor=be
;;
- hpux*)
+ *-hpux*)
vendor=hp
;;
- mpeix*)
+ *-mpeix*)
vendor=hp
;;
- hiux*)
+ *-hiux*)
vendor=hitachi
;;
- unos*)
+ *-unos*)
vendor=crds
;;
- dgux*)
+ *-dgux*)
vendor=dg
;;
- luna*)
+ *-luna*)
vendor=omron
;;
- genix*)
+ *-genix*)
vendor=ns
;;
- clix*)
+ *-clix*)
vendor=intergraph
;;
- mvs* | opened*)
+ *-mvs* | *-opened*)
+ vendor=ibm
+ ;;
+ *-os400*)
vendor=ibm
;;
- os400*)
+ s390-* | s390x-*)
vendor=ibm
;;
- ptx*)
+ *-ptx*)
vendor=sequent
;;
- tpf*)
+ *-tpf*)
vendor=ibm
;;
- vxsim* | vxworks* | windiss*)
+ *-vxsim* | *-vxworks* | *-windiss*)
vendor=wrs
;;
- aux*)
+ *-aux*)
vendor=apple
;;
- hms*)
+ *-hms*)
vendor=hitachi
;;
- mpw* | macos*)
+ *-mpw* | *-macos*)
vendor=apple
;;
- *mint | mint[0-9]* | *MiNT | MiNT[0-9]*)
+ *-*mint | *-mint[0-9]* | *-*MiNT | *-MiNT[0-9]*)
vendor=atari
;;
- vos*)
+ *-vos*)
vendor=stratus
;;
esac
;;
esac
-echo "$cpu-$vendor-$os"
+echo "$cpu-$vendor-${kernel:+$kernel-}$os"
exit
# Local variables:
<!--#include virtual="/server/header.html" -->
-<!-- Parent-Version: 1.77 -->
+<!-- Parent-Version: 1.78 -->
+
+<!--
+Copyright (C) 2006-2020 Free Software Foundation, Inc.
+
+Copying and distribution of this file, with or without modification,
+are permitted in any medium without royalty provided the copyright
+notice and this notice are preserved. This file is offered as-is,
+without any warranty.
+-->
+
<title>%%TITLE%% - GNU Project - Free Software Foundation</title>
<!--#include virtual="/server/banner.html" -->
<h2>%%TITLE%%</h2>
# are valid code in both sh and perl. When executed by sh, they re-execute
# the script through the perl program found in $PATH. The '-x' option
# is essential as well; without it, perl would re-execute the script
-# through /bin/sh. When executed by perl, the next two lines are a no-op.
+# through /bin/sh. When executed by perl, the next two lines are a no-op.
eval 'exec perl -wSx "$0" "$@"'
if 0;
-my $VERSION = '2018-03-07 03:47'; # UTC
+my $VERSION = '2020-04-04 15:07'; # UTC
# The definition above must lie within the first 8 lines in order
# for the Emacs time-stamp write hook (at end) to update it.
# If you change this file with Emacs, please let the write hook
#!/bin/sh
# install - install a program, script, or datafile
-scriptversion=2018-03-11.20; # UTC
+scriptversion=2020-11-14.01; # UTC
# This originates from X11R5 (mit/util/scripts/install.sh), which was
# later released in X11R6 (xc/config/util/install.sh) with the
# Desired mode of installed file.
mode=0755
+# Create dirs (including intermediate dirs) using mode 755.
+# This is like GNU 'install' as of coreutils 8.32 (2020).
+mkdir_umask=22
+
+backupsuffix=
chgrpcmd=
chmodcmd=$chmodprog
chowncmd=
--version display version info and exit.
-c (ignored)
- -C install only if different (preserve the last data modification time)
+ -C install only if different (preserve data modification time)
-d create directories instead of installing files.
-g GROUP $chgrpprog installed files to GROUP.
-m MODE $chmodprog installed files to MODE.
-o USER $chownprog installed files to USER.
+ -p pass -p to $cpprog.
-s $stripprog installed files.
+ -S SUFFIX attempt to back up existing files, with suffix SUFFIX.
-t DIRECTORY install into DIRECTORY.
-T report an error if DSTFILE is a directory.
Environment variables override the default commands:
CHGRPPROG CHMODPROG CHOWNPROG CMPPROG CPPROG MKDIRPROG MVPROG
RMPROG STRIPPROG
+
+By default, rm is invoked with -f; when overridden with RMPROG,
+it's up to you to specify -f if you want it.
+
+If -S is not specified, no backups are attempted.
+
+Email bug reports to bug-automake@gnu.org.
+Automake home page: https://www.gnu.org/software/automake/
"
while test $# -ne 0; do
-o) chowncmd="$chownprog $2"
shift;;
+ -p) cpprog="$cpprog -p";;
+
-s) stripcmd=$stripprog;;
+ -S) backupsuffix="$2"
+ shift;;
+
-t)
is_target_a_directory=always
dst_arg=$2
dstdir=$dst
test -d "$dstdir"
dstdir_status=$?
+ # Don't chown directories that already exist.
+ if test $dstdir_status = 0; then
+ chowncmd=""
+ fi
else
# Waiting for this to be detected by the "$cpprog $src $dsttmp" command
if test $dstdir_status != 0; then
case $posix_mkdir in
'')
- # Create intermediate dirs using mode 755 as modified by the umask.
- # This is like FreeBSD 'install' as of 1997-10-28.
- umask=`umask`
- case $stripcmd.$umask in
- # Optimize common cases.
- *[2367][2367]) mkdir_umask=$umask;;
- .*0[02][02] | .[02][02] | .[02]) mkdir_umask=22;;
-
- *[0-7])
- mkdir_umask=`expr $umask + 22 \
- - $umask % 100 % 40 + $umask % 20 \
- - $umask % 10 % 4 + $umask % 2
- `;;
- *) mkdir_umask=$umask,go-w;;
- esac
-
# With -d, create the new directory with the user-specified mode.
# Otherwise, rely on $mkdir_umask.
if test -n "$dir_arg"; then
fi
posix_mkdir=false
- case $umask in
- *[123567][0-7][0-7])
- # POSIX mkdir -p sets u+wx bits regardless of umask, which
- # is incompatible with FreeBSD 'install' when (umask & 300) != 0.
- ;;
- *)
- # Note that $RANDOM variable is not portable (e.g. dash); Use it
- # here however when possible just to lower collision chance.
- tmpdir=${TMPDIR-/tmp}/ins$RANDOM-$$
-
- trap 'ret=$?; rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir" 2>/dev/null; exit $ret' 0
-
- # Because "mkdir -p" follows existing symlinks and we likely work
- # directly in world-writeable /tmp, make sure that the '$tmpdir'
- # directory is successfully created first before we actually test
- # 'mkdir -p' feature.
- if (umask $mkdir_umask &&
- $mkdirprog $mkdir_mode "$tmpdir" &&
- exec $mkdirprog $mkdir_mode -p -- "$tmpdir/a/b") >/dev/null 2>&1
- then
- if test -z "$dir_arg" || {
- # Check for POSIX incompatibilities with -m.
- # HP-UX 11.23 and IRIX 6.5 mkdir -m -p sets group- or
- # other-writable bit of parent directory when it shouldn't.
- # FreeBSD 6.1 mkdir -m -p sets mode of existing directory.
- test_tmpdir="$tmpdir/a"
- ls_ld_tmpdir=`ls -ld "$test_tmpdir"`
- case $ls_ld_tmpdir in
- d????-?r-*) different_mode=700;;
- d????-?--*) different_mode=755;;
- *) false;;
- esac &&
- $mkdirprog -m$different_mode -p -- "$test_tmpdir" && {
- ls_ld_tmpdir_1=`ls -ld "$test_tmpdir"`
- test "$ls_ld_tmpdir" = "$ls_ld_tmpdir_1"
- }
- }
- then posix_mkdir=:
- fi
- rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir"
- else
- # Remove any dirs left behind by ancient mkdir implementations.
- rmdir ./$mkdir_mode ./-p ./-- "$tmpdir" 2>/dev/null
- fi
- trap '' 0;;
- esac;;
+ # The $RANDOM variable is not portable (e.g., dash). Use it
+ # here, however, when possible, just to lower the collision chance.
+ tmpdir=${TMPDIR-/tmp}/ins$RANDOM-$$
+
+ trap '
+ ret=$?
+ rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir" 2>/dev/null
+ exit $ret
+ ' 0
+
+ # Because "mkdir -p" follows existing symlinks and we likely work
+ # directly in world-writeable /tmp, make sure that the '$tmpdir'
+ # directory is successfully created first before we actually test
+ # 'mkdir -p'.
+ if (umask $mkdir_umask &&
+ $mkdirprog $mkdir_mode "$tmpdir" &&
+ exec $mkdirprog $mkdir_mode -p -- "$tmpdir/a/b") >/dev/null 2>&1
+ then
+ if test -z "$dir_arg" || {
+ # Check for POSIX incompatibilities with -m.
+ # HP-UX 11.23 and IRIX 6.5 mkdir -m -p sets group- or
+ # other-writable bit of parent directory when it shouldn't.
+ # FreeBSD 6.1 mkdir -m -p sets mode of existing directory.
+ test_tmpdir="$tmpdir/a"
+ ls_ld_tmpdir=`ls -ld "$test_tmpdir"`
+ case $ls_ld_tmpdir in
+ d????-?r-*) different_mode=700;;
+ d????-?--*) different_mode=755;;
+ *) false;;
+ esac &&
+ $mkdirprog -m$different_mode -p -- "$test_tmpdir" && {
+ ls_ld_tmpdir_1=`ls -ld "$test_tmpdir"`
+ test "$ls_ld_tmpdir" = "$ls_ld_tmpdir_1"
+ }
+ }
+ then posix_mkdir=:
+ fi
+ rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir"
+ else
+ # Remove any dirs left behind by ancient mkdir implementations.
+ rmdir ./$mkdir_mode ./-p ./-- "$tmpdir" 2>/dev/null
+ fi
+ trap '' 0;;
esac
if
then :
else
- # The umask is ridiculous, or mkdir does not conform to POSIX,
+ # mkdir does not conform to POSIX,
# or it failed possibly due to a race condition. Create the
# directory the slow way, step by step, checking for races as we go.
prefixes=
else
if $posix_mkdir; then
- (umask=$mkdir_umask &&
+ (umask $mkdir_umask &&
$doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir") && break
# Don't fail if two instances are running concurrently.
test -d "$prefix" || exit 1
then
rm -f "$dsttmp"
else
+ # If $backupsuffix is set, and the file being installed
+ # already exists, attempt a backup. Don't worry if it fails,
+ # e.g., if mv doesn't support -f.
+ if test -n "$backupsuffix" && test -f "$dst"; then
+ $doit $mvcmd -f "$dst" "$dst$backupsuffix" 2>/dev/null
+ fi
+
# Rename the file to the real destination.
$doit $mvcmd -f "$dsttmp" "$dst" 2>/dev/null ||
# file should still install successfully.
{
test ! -f "$dst" ||
- $doit $rmcmd -f "$dst" 2>/dev/null ||
+ $doit $rmcmd "$dst" 2>/dev/null ||
{ $doit $mvcmd -f "$dst" "$rmtmp" 2>/dev/null &&
- { $doit $rmcmd -f "$rmtmp" 2>/dev/null; :; }
+ { $doit $rmcmd "$rmtmp" 2>/dev/null; :; }
} ||
{ echo "$0: cannot unlink or rename $dst" >&2
(exit 1); exit 1
#! /bin/sh
# mkinstalldirs --- make directory hierarchy
-scriptversion=2018-03-07.03; # UTC
+scriptversion=2020-07-26.22; # UTC
# Original author: Noah Friedman <friedman@prep.ai.mit.edu>
# Created: 1993-05-16
*)
if mkdir -m "$dirmode" -p --version . >/dev/null 2>&1 &&
test ! -d ./--version; then
+ echo "umask 22"
+ umask 22
echo "mkdir -m $dirmode -p -- $*"
exec mkdir -m "$dirmode" -p -- "$@"
else
;;
esac
+echo "umask 22"
+umask 22
+
for file
do
case $file in
if test ! -d "$pathcomp"; then
errstatus=$lasterr
- else
- if test ! -z "$dirmode"; then
- echo "chmod $dirmode $pathcomp"
- lasterr=
- chmod "$dirmode" "$pathcomp" || lasterr=$?
-
- if test ! -z "$lasterr"; then
- errstatus=$lasterr
- fi
- fi
fi
fi
pathcomp=$pathcomp/
done
+
+ if test ! -z "$dirmode"; then
+ echo "chmod $dirmode $file"
+ chmod "$dirmode" "$file" || errstatus=$?
+ fi
done
exit $errstatus
sub("^(not )?ok[ \t]*", "", line)
# If the result has an explicit number, get it and strip it; otherwise,
- # automatically assing the next progresive number to it.
+ # automatically assign the next test number to it.
if (line ~ /^[0-9]+$/ || line ~ /^[0-9]+[^a-zA-Z0-9_]/)
{
match(line, "^[0-9]+")
{
cat <<END
Usage:
- test-driver --test-name=NAME --log-file=PATH --trs-file=PATH
- [--expect-failure={yes|no}] [--color-tests={yes|no}]
- [--enable-hard-errors={yes|no}] [--]
+ test-driver --test-name NAME --log-file PATH --trs-file PATH
+ [--expect-failure {yes|no}] [--color-tests {yes|no}]
+ [--enable-hard-errors {yes|no}] [--]
TEST-SCRIPT [TEST-SCRIPT-ARGUMENTS]
+
The '--test-name', '--log-file' and '--trs-file' options are mandatory.
+See the GNU Automake documentation for information.
END
}
% Load plain if necessary, i.e., if running under initex.
\expandafter\ifx\csname fmtname\endcsname\relax\input plain\fi
%
-\def\texinfoversion{2020-02-11.09}
+\def\texinfoversion{2020-10-24.12}
%
-% Copyright 1985, 1986, 1988, 1990-2019 Free Software Foundation, Inc.
+% Copyright 1985, 1986, 1988, 1990-2020 Free Software Foundation, Inc.
%
% This texinfo.tex file is free software: you can redistribute it and/or
% modify it under the terms of the GNU General Public License as
% The texinfo.tex in any given distribution could well be out
% of date, so if that's what you're using, please check.
%
-% Send bug reports to bug-texinfo@gnu.org. Please include including a
+% Send bug reports to bug-texinfo@gnu.org. Please include a
% complete document in each bug report with which we can reproduce the
% problem. Patches are, of course, greatly appreciated.
%
\ifodd\pageno \advance\hoffset by \bindingoffset
\else \advance\hoffset by -\bindingoffset\fi
%
+ \checkchapterpage
+ %
% Retrieve the information for the headings from the marks in the page,
% and call Plain TeX's \makeheadline and \makefootline, which use the
% values in \headline and \footline.
%
- % This is used to check if we are on the first page of a chapter.
- \ifcase1\the\savedtopmark\fi
- \let\prevchaptername\thischaptername
- \ifcase0\firstmark\fi
- \let\curchaptername\thischaptername
- %
- \ifodd\pageno \getoddheadingmarks \else \getevenheadingmarks \fi
- %
- \ifx\curchaptername\prevchaptername
- \let\thischapterheading\thischapter
- \else
- % \thischapterheading is the same as \thischapter except it is blank
- % for the first page of a chapter. This is to prevent the chapter name
- % being shown twice.
- \def\thischapterheading{}%
- \fi
- %
% Common context changes for both heading and footing.
% Do this outside of the \shipout so @code etc. will be expanded in
% the headline as they should be, not taken literally (outputting ''code).
\def\commonheadfootline{\let\hsize=\txipagewidth \texinfochars}
%
+ \ifodd\pageno \getoddheadingmarks \else \getevenheadingmarks \fi
\global\setbox\headlinebox = \vbox{\commonheadfootline \makeheadline}%
- %
\ifodd\pageno \getoddfootingmarks \else \getevenfootingmarks \fi
\global\setbox\footlinebox = \vbox{\commonheadfootline \makefootline}%
%
\ifr@ggedbottom \kern-\dimen@ \vfil \fi}
}
+% Check if we are on the first page of a chapter. Used for printing headings.
+\newif\ifchapterpage
+\def\checkchapterpage{%
+ % Get the chapter that was current at the end of the last page
+ \ifcase1\the\savedtopmark\fi
+ \let\prevchaptername\thischaptername
+ %
+ \ifodd\pageno \getoddheadingmarks \else \getevenheadingmarks \fi
+ \let\curchaptername\thischaptername
+ %
+ \ifx\curchaptername\prevchaptername
+ \chapterpagefalse
+ \else
+ \chapterpagetrue
+ \fi
+}
% Argument parsing
\let\setfilename=\comment
% @bye.
-\outer\def\bye{\pagealignmacro\tracingstats=1\ptexend}
+\outer\def\bye{\chappager\pagelabels\tracingstats=1\ptexend}
\message{pdf,}
\fi
+% Output page labels information.
+% See PDF reference v.1.7 p.594, section 8.3.1.
+\ifpdf
+\def\pagelabels{%
+ \def\title{0 << /P (T-) /S /D >>}%
+ \edef\roman{\the\romancount << /S /r >>}%
+ \edef\arabic{\the\arabiccount << /S /D >>}%
+ %
+ % Page label ranges must be increasing. Remove any duplicates.
+ % (There is a slight chance of this being wrong if e.g. there is
+ % a @contents but no @titlepage, etc.)
+ %
+ \ifnum\romancount=0 \def\roman{}\fi
+ \ifnum\arabiccount=0 \def\title{}%
+ \else
+ \ifnum\romancount=\arabiccount \def\roman{}\fi
+ \fi
+ %
+ \ifnum\romancount<\arabiccount
+ \pdfcatalog{/PageLabels << /Nums [\title \roman \arabic ] >> }\relax
+ \else
+ \pdfcatalog{/PageLabels << /Nums [\title \arabic \roman ] >> }\relax
+ \fi
+}
+\else
+ \let\pagelabels\relax
+\fi
+
+\newcount\pagecount \pagecount=0
+\newcount\romancount \romancount=0
+\newcount\arabiccount \arabiccount=0
+\ifpdf
+ \let\ptxadvancepageno\advancepageno
+ \def\advancepageno{%
+ \ptxadvancepageno\global\advance\pagecount by 1
+ }
+\fi
+
+
% PDF uses PostScript string constants for the names of xref targets,
% for display in the outlines, and in other places. Thus, we have to
% double any backslashes. Otherwise, a name like "\node" will be
% subentries, which we calculated on our first read of the .toc above.
%
% We use the node names as the destinations.
+ %
+ % Currently we prefix the section name with the section number
+ % for chapter and appendix headings only in order to avoid too much
+ % horizontal space being required in the PDF viewer.
\def\numchapentry##1##2##3##4{%
+ \dopdfoutline{##2 ##1}{count-\expnumber{chap##2}}{##3}{##4}}%
+ \def\unnchapentry##1##2##3##4{%
\dopdfoutline{##1}{count-\expnumber{chap##2}}{##3}{##4}}%
\def\numsecentry##1##2##3##4{%
\dopdfoutline{##1}{count-\expnumber{sec##2}}{##3}{##4}}%
% Therefore, we read toc only once.
%
% We use node names as destinations.
+ %
+ % Currently we prefix the section name with the section number
+ % for chapter and appendix headings only in order to avoid too much
+ % horizontal space being required in the PDF viewer.
\def\partentry##1##2##3##4{}% ignore parts in the outlines
\def\numchapentry##1##2##3##4{%
- \dopdfoutline{##1}{1}{##3}{##4}}%
+ \dopdfoutline{##2 ##1}{1}{##3}{##4}}%
\def\numsecentry##1##2##3##4{%
\dopdfoutline{##1}{2}{##3}{##4}}%
\def\numsubsecentry##1##2##3##4{%
\let\appsecentry\numsecentry%
\let\appsubsecentry\numsubsecentry%
\let\appsubsubsecentry\numsubsubsecentry%
- \let\unnchapentry\numchapentry%
+ \def\unnchapentry##1##2##3##4{%
+ \dopdfoutline{##1}{1}{##3}{##4}}%
\let\unnsecentry\numsecentry%
\let\unnsubsecentry\numsubsecentry%
\let\unnsubsubsecentry\numsubsubsecentry%
\def\it{\fam=\itfam \setfontstyle{it}}
\def\sl{\fam=\slfam \setfontstyle{sl}}
\def\bf{\fam=\bffam \setfontstyle{bf}}\def\bfstylename{bf}
-\def\tt{\fam=\ttfam \setfontstyle{tt}}
+\def\tt{\fam=\ttfam \setfontstyle{tt}}\def\ttstylename{tt}
% Texinfo sort of supports the sans serif font style, which plain TeX does not.
% So we set up a \sf.
% arg (if given), and not the url (which is then just the link target).
\newif\ifurefurlonlylink
+% The default \pretolerance setting stops the penalty inserted in
+% \urefallowbreak being a discouragement to line breaking. Set it to
+% a negative value for this paragraph only. Hopefully this does not
+% conflict with redefinitions of \par done elsewhere.
+\def\nopretolerance{%
+\pretolerance=-1
+\def\par{\endgraf\pretolerance=100 \let\par\endgraf}%
+}
+
% The main macro is \urefbreak, which allows breaking at expected
-% places within the url. (There used to be another version, which
-% didn't support automatic breaking.)
-\def\urefbreak{\begingroup \urefcatcodes \dourefbreak}
+% places within the url.
+\def\urefbreak{\nopretolerance \begingroup \urefcatcodes \dourefbreak}
\let\uref=\urefbreak
%
\def\dourefbreak#1{\urefbreakfinish #1,,,\finish}
% Allow a ragged right output to aid breaking long URL's. There can
% be a break at the \allowbreak with no extra glue (if the existing stretch in
-% the line is sufficent), a break at the \penalty100 with extra glue added
+% the line is sufficient), a break at the \penalty with extra glue added
% at the end of the line, or no break at all here.
% Changing the value of the penalty and/or the amount of stretch affects how
-% preferrable one choice is over the other.
+% preferable one choice is over the other.
\def\urefallowbreak{%
- \allowbreak
+ \penalty0\relax
\hskip 0pt plus 2 em\relax
- \penalty300
+ \penalty1000\relax
\hskip 0pt plus -2 em\relax
}
\def\sup{\ifmmode \expandafter\ptexsp \else \expandafter\finishsup\fi}
\def\finishsup#1{$\ptexsp{\hbox{\switchtolllsize #1}}$}%
+% provide this command from LaTeX as it is very common
+\def\frac#1#2{{{#1}\over{#2}}}
+
+% @displaymath.
+% \globaldefs is needed to recognize the end lines in \tex and
+% \end tex. Set \thisenv as @end displaymath is seen before @end tex.
+{\obeylines
+\globaldefs=1
+\envdef\displaymath{%
+\tex
+\def\thisenv{\displaymath}%
+$$%
+}
+
+\def\Edisplaymath{$$
+\def\thisenv{\tex}%
+\end tex
+}}
+
% @inlinefmt{FMTNAME,PROCESSED-TEXT} and @inlineraw{FMTNAME,RAW-TEXT}.
% Ignore unless FMTNAME == tex; then it is like @iftex and @tex,
% except specified as a normal braced arg, so no newlines to worry about.
% @pounds{} is a sterling sign, which Knuth put in the CM italic font.
%
-\def\pounds{{\it\$}}
+\def\pounds{\ifmonospace{\ecfont\char"BF}\else{\it\$}\fi}
% @euro{} comes from a separate font, depending on the current style.
% We use the free feym* fonts from the eurosym package by Henrik
\fi
% Quotes.
-\chardef\quotedblleft="5C
-\chardef\quotedblright=`\"
\chardef\quoteleft=`\`
\chardef\quoteright=`\'
+% only change font for tt for correct kerning and to avoid using
+% \ecfont unless necessary.
+\def\quotedblleft{%
+ \ifmonospace{\ecfont\char"10}\else{\char"5C}\fi
+}
+
+\def\quotedblright{%
+ \ifmonospace{\ecfont\char"11}\else{\char`\"}\fi
+}
+
\message{page headings,}
\newtoks\evenheadline % headline on even pages
\newtoks\oddheadline % headline on odd pages
+\newtoks\evenchapheadline% headline on even pages with a new chapter
+\newtoks\oddchapheadline % headline on odd pages with a new chapter
\newtoks\evenfootline % footline on even pages
\newtoks\oddfootline % footline on odd pages
% Now make \makeheadline and \makefootline in Plain TeX use those variables
-\headline={{\textfonts\rm \ifodd\pageno \the\oddheadline
- \else \the\evenheadline \fi}}
+\headline={{\textfonts\rm
+ \ifchapterpage
+ \ifodd\pageno\the\oddchapheadline\else\the\evenchapheadline\fi
+ \else
+ \ifodd\pageno\the\oddheadline\else\the\evenheadline\fi
+ \fi}}
+
\footline={{\textfonts\rm \ifodd\pageno \the\oddfootline
\else \the\evenfootline \fi}\HEADINGShook}
\let\HEADINGShook=\relax
\def\evenheading{\parsearg\evenheadingxxx}
\def\evenheadingxxx #1{\evenheadingyyy #1\|\|\|\|\finish}
\def\evenheadingyyy #1\|#2\|#3\|#4\finish{%
-\global\evenheadline={\rlap{\centerline{#2}}\line{#1\hfil#3}}}
+ \global\evenheadline={\rlap{\centerline{#2}}\line{#1\hfil#3}}
+ \global\evenchapheadline=\evenheadline}
\def\oddheading{\parsearg\oddheadingxxx}
\def\oddheadingxxx #1{\oddheadingyyy #1\|\|\|\|\finish}
\def\oddheadingyyy #1\|#2\|#3\|#4\finish{%
-\global\oddheadline={\rlap{\centerline{#2}}\line{#1\hfil#3}}}
+ \global\oddheadline={\rlap{\centerline{#2}}\line{#1\hfil#3}}%
+ \global\oddchapheadline=\oddheadline}
\parseargdef\everyheading{\oddheadingxxx{#1}\evenheadingxxx{#1}}%
\parseargdef\headings{\csname HEADINGS#1\endcsname}
\def\headingsoff{% non-global headings elimination
- \evenheadline={\hfil}\evenfootline={\hfil}%
- \oddheadline={\hfil}\oddfootline={\hfil}%
+ \evenheadline={\hfil}\evenfootline={\hfil}\evenchapheadline={\hfil}%
+ \oddheadline={\hfil}\oddfootline={\hfil}\oddchapheadline={\hfil}%
}
\def\HEADINGSoff{{\globaldefs=1 \headingsoff}} % global setting
\HEADINGSoff % it's the default
% When we turn headings on, set the page number to 1.
+\def\pageone{
+ \global\pageno=1
+ \global\arabiccount = \pagecount
+}
+
% For double-sided printing, put current file name in lower left corner,
% chapter name on inside top of right hand pages, document
% title on inside top of left hand pages, and page numbers on outside top
% edge of all pages.
\def\HEADINGSdouble{%
-\global\pageno=1
-\global\evenfootline={\hfil}
-\global\oddfootline={\hfil}
-\global\evenheadline={\line{\folio\hfil\thistitle}}
-\global\oddheadline={\line{\thischapterheading\hfil\folio}}
-\global\let\contentsalignmacro = \chapoddpage
+\pageone
+\HEADINGSdoublex
}
\let\contentsalignmacro = \chappager
% For single-sided printing, chapter title goes across top left of page,
% page number on top right.
\def\HEADINGSsingle{%
-\global\pageno=1
-\global\evenfootline={\hfil}
-\global\oddfootline={\hfil}
-\global\evenheadline={\line{\thischapterheading\hfil\folio}}
-\global\oddheadline={\line{\thischapterheading\hfil\folio}}
-\global\let\contentsalignmacro = \chappager
+\pageone
+\HEADINGSsinglex
}
\def\HEADINGSon{\HEADINGSdouble}
\global\evenfootline={\hfil}
\global\oddfootline={\hfil}
\global\evenheadline={\line{\folio\hfil\thistitle}}
-\global\oddheadline={\line{\thischapterheading\hfil\folio}}
+\global\oddheadline={\line{\thischapter\hfil\folio}}
+\global\evenchapheadline={\line{\folio\hfil}}
+\global\oddchapheadline={\line{\hfil\folio}}
\global\let\contentsalignmacro = \chapoddpage
}
\def\HEADINGSsinglex{%
\global\evenfootline={\hfil}
\global\oddfootline={\hfil}
-\global\evenheadline={\line{\thischapterheading\hfil\folio}}
-\global\oddheadline={\line{\thischapterheading\hfil\folio}}
+\global\evenheadline={\line{\thischapter\hfil\folio}}
+\global\oddheadline={\line{\thischapter\hfil\folio}}
+\global\evenchapheadline={\line{\hfil\folio}}
+\global\oddchapheadline={\line{\hfil\folio}}
+\global\let\contentsalignmacro = \chappager
+}
+
+% for @setchapternewpage off
+\def\HEADINGSsinglechapoff{%
+\pageone
+\global\evenfootline={\hfil}
+\global\oddfootline={\hfil}
+\global\evenheadline={\line{\thischapter\hfil\folio}}
+\global\oddheadline={\line{\thischapter\hfil\folio}}
+\global\evenchapheadline=\evenheadline
+\global\oddchapheadline=\oddheadline
\global\let\contentsalignmacro = \chappager
}
% like the previous two, but they put @code around the argument.
\def\docodeindex#1{\edef\indexname{#1}\parsearg\docodeindexxxx}
-\def\docodeindexxxx #1{\doind{\indexname}{\code{#1}}}
+\def\docodeindexxxx #1{\docind{\indexname}{#1}}
% Used for the aux, toc and index files to prevent expansion of Texinfo
\let\lbracechar\{%
\let\rbracechar\}%
%
+ %
+ \let\do\indexnofontsdef
+ %
% Non-English letters.
- \def\AA{AA}%
- \def\AE{AE}%
- \def\DH{DZZ}%
- \def\L{L}%
- \def\OE{OE}%
- \def\O{O}%
- \def\TH{TH}%
- \def\aa{aa}%
- \def\ae{ae}%
- \def\dh{dzz}%
- \def\exclamdown{!}%
- \def\l{l}%
- \def\oe{oe}%
- \def\ordf{a}%
- \def\ordm{o}%
- \def\o{o}%
- \def\questiondown{?}%
- \def\ss{ss}%
- \def\th{th}%
- %
- \def\LaTeX{LaTeX}%
- \def\TeX{TeX}%
- %
- % Assorted special characters. \defglyph gives the control sequence a
- % definition that removes the {} that follows its use.
- \defglyph\atchar{@}%
- \defglyph\arrow{->}%
- \defglyph\bullet{bullet}%
- \defglyph\comma{,}%
- \defglyph\copyright{copyright}%
- \defglyph\dots{...}%
- \defglyph\enddots{...}%
- \defglyph\equiv{==}%
- \defglyph\error{error}%
- \defglyph\euro{euro}%
- \defglyph\expansion{==>}%
- \defglyph\geq{>=}%
- \defglyph\guillemetleft{<<}%
- \defglyph\guillemetright{>>}%
- \defglyph\guilsinglleft{<}%
- \defglyph\guilsinglright{>}%
- \defglyph\leq{<=}%
- \defglyph\lbracechar{\{}%
- \defglyph\minus{-}%
- \defglyph\point{.}%
- \defglyph\pounds{pounds}%
- \defglyph\print{-|}%
- \defglyph\quotedblbase{"}%
- \defglyph\quotedblleft{"}%
- \defglyph\quotedblright{"}%
- \defglyph\quoteleft{`}%
- \defglyph\quoteright{'}%
- \defglyph\quotesinglbase{,}%
- \defglyph\rbracechar{\}}%
- \defglyph\registeredsymbol{R}%
- \defglyph\result{=>}%
- \defglyph\textdegree{o}%
+ \do\AA{AA}%
+ \do\AE{AE}%
+ \do\DH{DZZ}%
+ \do\L{L}%
+ \do\OE{OE}%
+ \do\O{O}%
+ \do\TH{TH}%
+ \do\aa{aa}%
+ \do\ae{ae}%
+ \do\dh{dzz}%
+ \do\exclamdown{!}%
+ \do\l{l}%
+ \do\oe{oe}%
+ \do\ordf{a}%
+ \do\ordm{o}%
+ \do\o{o}%
+ \do\questiondown{?}%
+ \do\ss{ss}%
+ \do\th{th}%
+ %
+ \do\LaTeX{LaTeX}%
+ \do\TeX{TeX}%
+ %
+ % Assorted special characters.
+ \do\atchar{@}%
+ \do\arrow{->}%
+ \do\bullet{bullet}%
+ \do\comma{,}%
+ \do\copyright{copyright}%
+ \do\dots{...}%
+ \do\enddots{...}%
+ \do\equiv{==}%
+ \do\error{error}%
+ \do\euro{euro}%
+ \do\expansion{==>}%
+ \do\geq{>=}%
+ \do\guillemetleft{<<}%
+ \do\guillemetright{>>}%
+ \do\guilsinglleft{<}%
+ \do\guilsinglright{>}%
+ \do\leq{<=}%
+ \do\lbracechar{\{}%
+ \do\minus{-}%
+ \do\point{.}%
+ \do\pounds{pounds}%
+ \do\print{-|}%
+ \do\quotedblbase{"}%
+ \do\quotedblleft{"}%
+ \do\quotedblright{"}%
+ \do\quoteleft{`}%
+ \do\quoteright{'}%
+ \do\quotesinglbase{,}%
+ \do\rbracechar{\}}%
+ \do\registeredsymbol{R}%
+ \do\result{=>}%
+ \do\textdegree{o}%
%
% We need to get rid of all macros, leaving only the arguments (if present).
% Of course this is not nearly correct, but it is the best we can do for now.
\macrolist
\let\value\indexnofontsvalue
}
-\def\defglyph#1#2{\def#1##1{#2}} % see above
+
+% Give the control sequence a definition that removes the {} that follows
+% its use, e.g. @AA{} -> AA
+\def\indexnofontsdef#1#2{\def#1##1{#2}}%
\fi
}
+% Same as \doind, but for code indices
+\def\docind#1#2{%
+ \iflinks
+ {%
+ %
+ \requireopenindexfile{#1}%
+ \edef\writeto{\csname#1indfile\endcsname}%
+ %
+ \def\indextext{#2}%
+ \safewhatsit\docindwrite
+ }%
+ \fi
+}
+
% Check if an index file has been opened, and if not, open it.
\def\requireopenindexfile#1{%
\ifnum\csname #1indfile\endcsname=0
% trim spaces.
\edef\trimmed{\segment}%
\edef\trimmed{\expandafter\eatspaces\expandafter{\trimmed}}%
+ \ifincodeindex
+ \edef\trimmed{\noexpand\code{\trimmed}}%
+ \fi
%
\xdef\bracedtext{\bracedtext{\trimmed}}%
%
% Write the entry in \indextext to the index file.
%
-\def\doindwrite{%
+
+\newif\ifincodeindex
+\def\doindwrite{\incodeindexfalse\doindwritex}
+\def\docindwrite{\incodeindextrue\doindwritex}
+
+\def\doindwritex{%
\maybemarginindex
%
\atdummies
\else
\begindoublecolumns
\catcode`\\=0\relax
- \catcode`\@=12\relax
+ %
+ % Make @ an escape character to give macros a chance to work. This
+ % should work because we (hopefully) don't otherwise use @ in index files.
+ %\catcode`\@=12\relax
+ \catcode`\@=0\relax
\input \jobname.\indexname s
\enddoublecolumns
\fi
\def\CHAPPAGoff{%
\global\let\contentsalignmacro = \chappager
\global\let\pchapsepmacro=\chapbreak
-\global\let\pagealignmacro=\chappager}
+\global\def\HEADINGSon{\HEADINGSsinglechapoff}}
\def\CHAPPAGon{%
\global\let\contentsalignmacro = \chappager
\global\let\pchapsepmacro=\chappager
-\global\let\pagealignmacro=\chappager
\global\def\HEADINGSon{\HEADINGSsingle}}
\def\CHAPPAGodd{%
\global\let\contentsalignmacro = \chapoddpage
\global\let\pchapsepmacro=\chapoddpage
-\global\let\pagealignmacro=\chapoddpage
\global\def\HEADINGSon{\HEADINGSdouble}}
\CHAPPAGon
%
\def\startcontents#1{%
% If @setchapternewpage on, and @headings double, the contents should
- % start on an odd page, unlike chapters. Thus, we maintain
- % \contentsalignmacro in parallel with \pagealignmacro.
- % From: Torbjorn Granlund <tege@matematik.su.se>
+ % start on an odd page, unlike chapters.
\contentsalignmacro
\immediate\closeout\tocfile
%
%
% Roman numerals for page numbers.
\ifnum \pageno>0 \global\pageno = \lastnegativepageno \fi
+ \def\thistitle{}% no title in double-sided headings
+ % Record where the Roman numerals started.
+ \ifnum\romancount=0 \global\romancount=\pagecount \fi
}
% redefined for the two-volume lispref. We always output on
\fi
\closein 1
\endgroup
- \lastnegativepageno = \pageno
- \global\pageno = \savepageno
+ \contentsendroman
}
% And just the chapters.
\vfill \eject
\contentsalignmacro % in case @setchapternewpage odd is in effect
\endgroup
+ \contentsendroman
+}
+\let\shortcontents = \summarycontents
+
+% Get ready to use Arabic numerals again
+\def\contentsendroman{%
\lastnegativepageno = \pageno
\global\pageno = \savepageno
+ %
+ % If \romancount > \arabiccount, the contents are at the end of the
+ % document. Otherwise, advance where the Arabic numerals start for
+ % the page numbers.
+ \ifnum\romancount>\arabiccount\else\global\arabiccount=\pagecount\fi
}
-\let\shortcontents = \summarycontents
% Typeset the label for a chapter or appendix for the short contents.
% The arg is, e.g., `A' for an appendix, or `3' for a chapter.
\newdimen\tabw \setbox0=\hbox{\tt\space} \tabw=8\wd0 % tab amount
%
% We typeset each line of the verbatim in an \hbox, so we can handle
-% tabs. The \global is in case the verbatim line starts with an accent,
-% or some other command that starts with a begin-group. Otherwise, the
-% entire \verbbox would disappear at the corresponding end-group, before
-% it is typeset. Meanwhile, we can't have nested verbatim commands
-% (can we?), so the \global won't be overwriting itself.
+% tabs.
\newbox\verbbox
-\def\starttabbox{\global\setbox\verbbox=\hbox\bgroup}
+\def\starttabbox{\setbox\verbbox=\hbox\bgroup}
%
\begingroup
\catcode`\^^I=\active
\divide\dimen\verbbox by\tabw
\multiply\dimen\verbbox by\tabw % compute previous multiple of \tabw
\advance\dimen\verbbox by\tabw % advance to next multiple of \tabw
- \wd\verbbox=\dimen\verbbox \box\verbbox \starttabbox
+ \wd\verbbox=\dimen\verbbox
+ \leavevmode\box\verbbox \starttabbox
}%
}
\endgroup
\let\nonarrowing = t%
\nonfillstart
\tt % easiest (and conventionally used) font for verbatim
- % The \leavevmode here is for blank lines. Otherwise, we would
- % never \starttabbox and the \egroup would end verbatim mode.
- \def\par{\leavevmode\egroup\box\verbbox\endgraf}%
+ \def\par{\egroup\leavevmode\box\verbbox\endgraf\starttabbox}%
\tabexpand
\setupmarkupstyle{verbatim}%
% Respect line breaks,
% make each space count.
% Must do in this order:
\obeylines \uncatcodespecials \sepspaces
- \everypar{\starttabbox}%
}
% Do the @verb magic: verbatim text is quoted by unique
% ignore everything up to the first ^^M, that's the newline at the end
% of the @verbatim input line itself. Otherwise we get an extra blank
% line in the output.
- \xdef\doverbatim#1^^M#2@end verbatim{#2\noexpand\end\gobble verbatim}%
+ \xdef\doverbatim#1^^M#2@end verbatim{%
+ \starttabbox#2\egroup\noexpand\end\gobble verbatim}%
% We really want {...\end verbatim} in the body of the macro, but
% without the active space; thus we have to use \xdef and \gobble.
+ % The \egroup ends the \verbbox started at the end of the last line in
+ % the block.
\endgroup
%
\envdef\verbatim{%
- \setupverbatim\doverbatim
+ \setnormaldispenv\setupverbatim\doverbatim
}
\let\Everbatim = \afterenvbreak
\wlog{texinfo.tex: doing @verbatiminclude of #1^^J}%
\edef\tmp{\noexpand\input #1 }
\expandafter
- }\tmp
+ }\expandafter\starttabbox\tmp\egroup
\afterenvbreak
}%
}
\DeclareUnicodeCharacter{0233}{\=y}%
\DeclareUnicodeCharacter{0237}{\dotless{j}}%
%
+ \DeclareUnicodeCharacter{02BC}{'}%
+ %
\DeclareUnicodeCharacter{02DB}{\ogonek{ }}%
%
% Greek letters upper case
\globaldefs = 0
}}
+\def\bsixpaper{{\globaldefs = 1
+ \afourpaper
+ \internalpagesizes{140mm}{100mm}%
+ {-6.35mm}{-12.7mm}%
+ {\bindingoffset}{14pt}%
+ {176mm}{125mm}%
+ \let\SETdispenvsize=\smallword
+ \lispnarrowing = 0.2in
+ \globaldefs = 0
+}}
+
+
% @pagesizes TEXTHEIGHT[,TEXTWIDTH]
% Perhaps we should allow setting the margins, \topskip, \parskip,
% and/or leading, also. Or perhaps we should compute them somehow.
\setleading{\textleading}%
%
\dimen0 = #1\relax
- \advance\dimen0 by \voffset
- \advance\dimen0 by 1in % reference point for DVI is 1 inch from top of page
+ \advance\dimen0 by 2.5in % default 1in margin above heading line
+ % and 1.5in to include heading, footing and
+ % bottom margin
%
\dimen2 = \hsize
- \advance\dimen2 by \normaloffset
- \advance\dimen2 by 1in % reference point is 1 inch from left edge of page
+ \advance\dimen2 by 2in % default to 1 inch margin on each side
%
\internalpagesizes{#1}{\hsize}%
{\voffset}{\normaloffset}%
# are valid code in both sh and perl. When executed by sh, they re-execute
# the script through the perl program found in $PATH. The '-x' option
# is essential as well; without it, perl would re-execute the script
-# through /bin/sh. When executed by perl, the next two lines are a no-op.
+# through /bin/sh. When executed by perl, the next two lines are a no-op.
eval 'exec perl -wSx -0777 -pi "$0" "$@"'
if 0;
-my $VERSION = '2018-03-07.03:47'; # UTC
+my $VERSION = '2020-04-04.15:07'; # UTC
# The definition above must lie within the first 8 lines in order
# for the Emacs time-stamp write hook (at end) to update it.
# If you change this file with Emacs, please let the write hook
[am__api_version='1.16'
dnl Some users find AM_AUTOMAKE_VERSION and mistake it for a way to
dnl require some minimum version. Point them to the right macro.
-m4_if([$1], [1.16.2], [],
+m4_if([$1], [1.16.3], [],
[AC_FATAL([Do not call $0, use AM_INIT_AUTOMAKE([$1]).])])dnl
])
# Call AM_AUTOMAKE_VERSION and AM_AUTOMAKE_VERSION so they can be traced.
# This function is AC_REQUIREd by AM_INIT_AUTOMAKE.
AC_DEFUN([AM_SET_CURRENT_AUTOMAKE_VERSION],
-[AM_AUTOMAKE_VERSION([1.16.2])dnl
+[AM_AUTOMAKE_VERSION([1.16.3])dnl
m4_ifndef([AC_AUTOCONF_VERSION],
[m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl
_AM_AUTOCONF_VERSION(m4_defn([AC_AUTOCONF_VERSION]))])
[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl
AC_REQUIRE_AUX_FILE([missing])dnl
if test x"${MISSING+set}" != xset; then
- case $am_aux_dir in
- *\ * | *\ *)
- MISSING="\${SHELL} \"$am_aux_dir/missing\"" ;;
- *)
- MISSING="\${SHELL} $am_aux_dir/missing" ;;
- esac
+ MISSING="\${SHELL} '$am_aux_dir/missing'"
fi
# Use eval to expand $SHELL
if eval "$MISSING --is-lightweight"; then
## ------------------------ -*- Autoconf -*-
## Python file handling
## From Andrew Dalke
-## Updated by James Henstridge
+## Updated by James Henstridge and other contributors.
## ------------------------
# Copyright (C) 1999-2020 Free Software Foundation, Inc.
#
m4_default([$3], [AC_MSG_ERROR([no suitable Python interpreter found])])
else
- dnl Query Python for its version number. Getting [:3] seems to be
- dnl the best way to do this; it's what "site.py" does in the standard
- dnl library.
+ dnl Query Python for its version number. Although site.py simply uses
+ dnl sys.version[:3], printing that failed with Python 3.10, since the
+ dnl trailing zero was eliminated. So now we output just the major
+ dnl and minor version numbers, as numbers. Apparently the tertiary
+ dnl version is not of interest.
AC_CACHE_CHECK([for $am_display_PYTHON version], [am_cv_python_version],
- [am_cv_python_version=`$PYTHON -c "import sys; sys.stdout.write(sys.version[[:3]])"`])
+ [am_cv_python_version=`$PYTHON -c "import sys; print('%u.%u' % sys.version_info[[:2]])"`])
AC_SUBST([PYTHON_VERSION], [$am_cv_python_version])
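
For illustration only, not part of the macro, and assuming a python3 on PATH: the same query run by hand, with the m4 quote doubling of the brackets removed. With Python 3.10.x, sys.version[:3] would yield "3.1", whereas this prints "3.10".

python3 -c "import sys; print('%u.%u' % sys.version_info[:2])"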
dnl Use the values of $prefix and $exec_prefix for the corresponding
# gives unlimited permission to copy and/or distribute it,
# with or without modifications, as long as this notice is preserved.
-# Check whether the Vala compiler exists in $PATH. If it is found, the
-# variable VALAC is set pointing to its absolute path. Otherwise, it is
-# simply set to 'valac'.
-# Optionally a minimum release number of the compiler can be requested.
-# If the ACTION-IF-FOUND parameter is given, it will be run if a proper
-# Vala compiler is found.
-# Similarly, if the ACTION-IF-FOUND is given, it will be run if no proper
-# Vala compiler is found. It defaults to simply print a warning about the
-# situation, but otherwise proceeding with the configuration.
+# Search for a Vala compiler in PATH. If it is found, the variable VALAC is
+# set to point to it. Otherwise, it is simply set to 'valac'. This macro
+# takes three optional arguments. The first argument, if present, is the
+# minimum version of the Vala API required to compile this package. For Vala
+# releases, this is the same as the major and minor release number; e.g., when
+# `valac --version' reports 0.48.7, `valac --api-version' reports 0.48. If a
+# compiler is found and satisfies MINIMUM-VERSION, then ACTION-IF-FOUND is run
+# (this defaults to doing nothing). Otherwise, ACTION-IF-NOT-FOUND is run. If
+# ACTION-IF-NOT-FOUND is not specified, the default value is to print a
+# warning in case no compiler is found, or if a too-old version of the
+# compiler is found.
#
# AM_PROG_VALAC([MINIMUM-VERSION], [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
# --------------------------------------------------------------------------
AC_DEFUN([AM_PROG_VALAC],
[AC_PATH_PROG([VALAC], [valac], [valac])
AS_IF([test "$VALAC" != valac && test -n "$1"],
- [AC_MSG_CHECKING([whether $VALAC is at least version $1])
- am__vala_version=`$VALAC --version | sed 's/Vala *//'`
+ [AC_MSG_CHECKING([whether $VALAC supports at least API version $1])
+ am__vala_version=`$VALAC --api-version`
AS_VERSION_COMPARE([$1], ["$am__vala_version"],
[AC_MSG_RESULT([yes])],
[AC_MSG_RESULT([yes])],
VALAC=valac])])
if test "$VALAC" = valac; then
m4_default([$3],
- [AC_MSG_WARN([no proper vala compiler found])
- AC_MSG_WARN([you will not be able to compile vala source files])])
+ [AC_MSG_WARN([Vala compiler not found or too old])
+ AC_MSG_WARN([you will not be able to compile Vala source files])])
else
m4_default([$2], [:])
fi])
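
(Illustrative only, not part of the patch: a configure.ac wanting the new API-version
semantics might now write something like the following, where 0.48 is the API version,
i.e. what `valac --api-version' reports for a 0.48.7 release.)

  AM_PROG_VALAC([0.48],
    [AC_MSG_NOTICE([valac with API version >= 0.48 found])],
    [AC_MSG_ERROR([valac with API version >= 0.48 is required])])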
'run_make' function rather than calling $MAKE directly. Not only is
this more idiomatic, but it also avoids possible spurious racy failures
when the make invocations in the testsuite are run in parallel mode
- (as with "make check AM_TESTSUITE_MAKE='make -j4"').
+ (as with "make check AM_TESTSUITE_MAKE='make -j4'").
* Do not override Makefile variables using make arguments, as in e.g.:
$FGREP COMMAND Makefile.in Makefile # For debugging.
-grep "^NO_SUCH_COMMAND = \${SHELL} .*/missing .*am-none-none" Makefile
-grep "^MISMATCHED_COMMAND = \${SHELL} .*/missing .*am-exit-63" Makefile
-grep "^COMMAND_FOUND = \${SHELL} .*/missing .*my-command" Makefile
+grep "^NO_SUCH_COMMAND = \${SHELL} .*/missing.* .*am-none-none" Makefile
+grep "^MISMATCHED_COMMAND = \${SHELL} .*/missing.* .*am-exit-63" Makefile
+grep "^COMMAND_FOUND = \${SHELL} .*/missing.* .*my-command" Makefile
grep '^OVERRIDDEN_COMMAND = am-overridden *$' Makefile
$MAKE test1 test2 test3 test4
AC_INIT([Makefile.am])
AM_INIT_AUTOMAKE([twoargs], [1.0])
AC_CONFIG_FILES([Makefile])
+AC_OUTPUT
END
$ACLOCAL
. test-init.sh
+: > ar-lib
cat >> configure.ac << 'END'
AM_PROG_AR
END
. test-init.sh
+: > ar-lib
cat >> configure.ac << 'END'
AM_PROG_AR([
echo spy > bad-archiver-interface-detected
AM_AUX_DIR_EXPAND
printf '%s\n' "ac_aux_dir: '$ac_aux_dir'"
printf '%s\n' "am_aux_dir: '$am_aux_dir'"
-test "$ac_aux_dir" = . || AS_EXIT([1])
+test "${ac_aux_dir%/}" = . || AS_EXIT([1])
test "$am_aux_dir" = "`pwd`" || AS_EXIT([1])
AS_EXIT([0])
END
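
(The ${ac_aux_dir%/} expansion is plain POSIX suffix stripping, so the check now
accepts both spellings of the auxiliary directory; a quick sketch with illustrative
values:)

  ac_aux_dir=./; test "${ac_aux_dir%/}" = . && echo ok   # autoconf 2.69d and newer
  ac_aux_dir=. ; test "${ac_aux_dir%/}" = . && echo ok   # older autoconf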
required=cc
. test-init.sh
+ver=$($AUTOCONF --version | sed -n '1s/.* //p')
+case $ver in
+ 2.69[d-z]*) ;;
+ 2.[7-9][0-9]*) ;;
+ [3-9].*) ;;
+ *) skip_ 'this test passes with autoconf-2.69d and newer'
+esac
+
cat > configure.ac <<END
AC_INIT([$me], [1.0])
AC_PROG_CC
# along with this program. If not, see <https://www.gnu.org/licenses/>.
# Check the testsuite summary with the parallel test harness. This
-# script is meant to be sourced by other test script, so that it can
+# script is meant to be sourced by other test scripts, so that it can
# be used to check different scenarios (colorized and non-colorized
# testsuite output, packages with and without bug-report addresses,
# testsuites in subdirectories, ...)
check_SCRIPTS = echo.sh
echo.sh:
## The next line ensures that command1.inc has been built before
-## recurring into the subdir.
+## recursing into the subdir.
test -f ../command1.inc
(echo '#! /bin/sh'; cat command2.inc) > $@
chmod +x $@
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2002-2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+# Test that 'install-exec:' honors $(BUILT_SOURCES);
+# https://bugs.gnu.org/43683.
+
+. test-init.sh
+
+cat >> configure.ac << 'END'
+AC_OUTPUT
+END
+
+cat > Makefile.am << 'END'
+BUILT_SOURCES = built1
+built1:
+ echo ok > $@
+END
+
+$ACLOCAL
+$AUTOCONF
+$AUTOMAKE
+./configure --prefix "$(pwd)/inst"
+
+# Make sure this file is rebuilt by make install-exec.
+$MAKE install-exec
+test -f built1
+
+:
BUILT_SOURCES = built2
built2:
## The next line ensures that command1.inc has been built before
-## recurring into the subdir.
+## recursing into the subdir.
cp ../built1 $@
CLEANFILES = built2
END
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2011-2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+# Check that we can override the 'dvi' target run as part of distcheck,
+# specifically to be 'html', so that TeX is not required.
+# Related to automake bug#8289.
+
+# TeX and texi2dvi should not be needed or invoked.
+TEX=false TEXI2DVI=false
+export TEX TEXI2DVI
+
+required='makeinfo'
+. test-init.sh
+
+cat >> configure.ac << 'END'
+AC_OUTPUT
+END
+
+cat > Makefile.am << 'END'
+AM_DISTCHECK_DVI_TARGET = html
+info_TEXINFOS = main.texi
+END
+
+# Protect with leading " # " to avoid spurious maintainer-check failures.
+sed 's/^ *# *//' > main.texi << 'END'
+ # \input texinfo
+ # @setfilename main.info
+ # @settitle main
+ #
+ # @node Top
+ # Hello.
+ # @bye
+END
+
+$ACLOCAL
+$AUTOMAKE -a
+$AUTOCONF
+
+./configure
+$MAKE
+$MAKE distcheck
+
+:
--- /dev/null
+#! /bin/sh
+# This file has been automatically generated. DO NOT EDIT BY HAND!
+. test-lib.sh
+
+am_test_prefer_config_shell=yes
+# In the spirit of VPATH, we prefer a test in the build tree
+# over one in the source tree.
+for dir in . "$am_top_srcdir"; do
+ if test -f "$dir/t/install-sh-option-S.sh"; then
+ echo "$0: will source $dir/t/install-sh-option-S.sh"
+ . "$dir/t/install-sh-option-S.sh"; exit $?
+ fi
+done
+echo "$0: cannot find wrapped test 't/install-sh-option-S.sh'" >&2
+exit 99
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2006-2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+# More install-sh checks: option -S SUFFIX should create backups.
+
+required=non-root
+. test-init.sh
+
+get_shell_script install-sh
+
+# File gets backed up if -S is specified.
+echo foo >file
+echo bar >newfile
+./install-sh -S .BAK newfile file
+test -r file.BAK
+
+:
# An empty "foo_PRIMARY" declaration should *not* cause "make install"
# to create directory $(foodir). See automake bug#10997 and bug#11030.
+required='makeinfo tex texi2dvi dvips'
. test-init.sh
cat >> configure.ac <<'END'
XFAIL_TESTS = \
t/all.sh \
-t/auxdir-pr19311.sh \
t/cond17.sh \
t/gcj6.sh \
t/override-conditional-2.sh \
t/built-sources-cond.sh \
t/built-sources-fork-bomb.sh \
t/built-sources-install.sh \
+t/built-sources-install-exec.sh \
t/built-sources-subdir.sh \
t/built-sources.sh \
t/candist.sh \
t/distcheck-missing-m4.sh \
t/distcheck-outdated-m4.sh \
t/distcheck-no-prefix-or-srcdir-override.sh \
+t/distcheck-override-dvi.sh \
t/distcheck-override-infodir.sh \
t/distcheck-pr9579.sh \
t/distcheck-pr10470.sh \
t/add-missing-install-sh.sh \
t/install-sh-unittests.sh \
t/install-sh-option-C.sh \
+t/install-sh-option-S.sh \
t/instdat.sh \
t/instdat2.sh \
t/instdir.sh \
t/testsuite-summary-color.sh \
t/testsuite-summary-count.sh \
t/testsuite-summary-count-many.sh \
+t/testsuite-summary-header.sh \
t/testsuite-summary-reference-log.sh \
t/test-driver-acsubst.sh \
t/test-driver-cond.sh \
t/test-driver-trs-suffix-registered.sh \
t/test-driver-fail.sh \
t/test-driver-is-distributed.sh \
+t/test-extensions-empty.sh \
t/test-harness-vpath-rewrite.sh \
t/test-log.sh \
t/test-logs-repeated.sh \
t/vala-grepping.sh \
t/vala-headers.sh \
t/vala-libs.sh \
+t/vala-libs-distcheck.sh \
+t/vala-libs-vpath.sh \
t/vala-mix.sh \
t/vala-mix2.sh \
t/vala-non-recursive-setup.sh \
$(AM_V_at)mv -f %D%/testsuite-part.tmp $@
EXTRA_DIST += gen-testsuite-part
-# The dependecies declared here are not truly complete, but such
+# The dependencies declared here are not truly complete, but such
# completeness would cause more issues than it would solve. See
-# automake bug#11347.
+# automake bug#11347 and #44458.
$(generated_TESTS): $(srcdir)/gen-testsuite-part
$(srcdir)/%D%/testsuite-part.am: $(srcdir)/gen-testsuite-part
$(srcdir)/%D%/testsuite-part.am: Makefile.am
+$(srcdir)/%D%/testsuite-part.am: %D%/list-of-tests.mk
# Hand-written tests for stuff in 'contrib/'.
include $(srcdir)/contrib/%D%/local.mk
cd build
../configure
-# Sanity check.
-grep '^HELP2MAN *=.*/missing help2man' Makefile
+# Sanity check. The line we're matching looks like this:
+# HELP2MAN = ${SHELL} '/am/checkout/t/man6.dir/missing' help2man
+# so let's not try to match the exact intervening quote.
+grep '^HELP2MAN *=.*/missing.* help2man' Makefile
$MAKE
$FGREP foobar ../foobar.1
rm -f *.1 # Remove leftover generated manpages.
./configure
-# Sanity check.
-grep '^HELP2MAN *=.*/missing help2man' Makefile
+# Sanity check again, same as above.
+grep '^HELP2MAN *=.*/missing.* help2man' Makefile
$MAKE
$FGREP foobar foobar.1
. test-init.sh
-echo AM_PROG_MKDIR_P >> configure.ac
+cat > configure.ac <<'END'
+AC_INIT([test], [1.0])
+AM_INIT_AUTOMAKE
+AM_PROG_MKDIR_P
+AC_CONFIG_FILES([Makefile])
+AC_OUTPUT
+END
+
: > Makefile.am
grep_err ()
{
- loc='^configure.ac:4:'
+ loc='^configure.ac:3:'
grep "$loc.*AM_PROG_MKDIR_P.*deprecated" stderr
grep "$loc.* use .*AC_PROG_MKDIR_P" stderr
grep "$loc.* use '\$(MKDIR_P)' instead of '\$(mkdir_p)'.*Makefile" stderr
END
cat > exp <<'END'
-FAIL: fail.test
-PASS: pass.test
ERROR: error.test
-XPASS: sub/xpass.test
-XFAIL: xfail.test
ERROR: sub/error2.test
+FAIL: fail.test
+PASS: pass.test
SKIP: a/b/skip.test
+XFAIL: xfail.test
+XPASS: sub/xpass.test
END
mkdir sub a a/b
fi
$srcdir/configure
run_make -O -e FAIL check
- LC_ALL=C grep '^[A-Z][A-Z]*:' stdout > got
+ LC_ALL=C grep '^[A-Z][A-Z]*:' stdout | sort > got
cat got
diff $srcdir/exp got
cd $srcdir
# vary among different python installations, so we need more relaxed
# and ad-hoc checks for them. Also, more proper "functional" checks
# on them should be done in the 'python-virtualenv.sh' test.
-PYTHON_VERSION=$($PYTHON -c 'import sys; print(sys.version[:3])') || exit 1
+#
+# This version identification is duplicated in python.m4 (and the manual).
+PYTHON_VERSION=$($PYTHON -c 'import sys; print("%u.%u" % sys.version_info[:2])') || exit 1
PYTHON_PLATFORM=$($PYTHON -c 'import sys; print(sys.platform)') || exit 1
PYTHON_EXEC_PREFIX='${exec_prefix}'
PYTHON_PREFIX='${prefix}'
py_version_post=$(python -V)
# Sanity check.
-test "$py_version_pre" = "$py_version_post"
+test "$py_version_pre" = "$py_version_post" \
+ || skip_ "virtualenv $py_version_post != $py_version_pre"
cwd=$(pwd) || fatal_ "getting current working directory"
py_version=$(python -c 'import sys; print("%u.%u" % tuple(sys.version_info[:2]))')
# if there are CONFIG_HEADERS.
# See automake bug#38139.
-required=''
+required=etags
. test-init.sh
# some AC_CONFIG_FILES header is needed to trigger the bug.
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+# Empty assignment to TEST_EXTENSIONS should not provoke Perl warning.
+# https://bugs.gnu.org/42635
+
+. test-init.sh
+
+cat > configure.ac << 'END'
+AC_INIT([foo],[1.0])
+AM_INIT_AUTOMAKE([foreign])
+AC_PROG_CC dnl comment this line to make the warning disappear
+AC_CONFIG_FILES([Makefile])
+AC_OUTPUT
+END
+
+cat > Makefile.am << 'END'
+TEST_EXTENSIONS =
+LOG_COMPILER = echo
+TESTS = foo.test
+END
+
+touch foo.test
+
+autoreconf -fi >reconf.out 2>&1
+grep 'uninitialized value' reconf.out && exit 1
+
+# What we're trying to avoid:
+# ...
+# Use of uninitialized value in string eq at /usr/bin/automake line 4953.
+# ...
+# nl -ba `command -v automake` | sed -n '4951,4955p'
+# 4951 if ($handle_exeext)
+# 4952 {
+# 4953 unshift (@test_suffixes, $at_exeext)
+# 4954 unless $test_suffixes[0] eq $at_exeext;
+# 4955 }
+
+:
t/exeext4-w.log: t/exeext4.log
t/install-sh-option-C-w.log: t/install-sh-option-C.sh
t/install-sh-option-C-w.log: t/install-sh-option-C.log
+t/install-sh-option-S-w.log: t/install-sh-option-S.sh
+t/install-sh-option-S-w.log: t/install-sh-option-S.log
t/install-sh-unittests-w.log: t/install-sh-unittests.sh
t/install-sh-unittests-w.log: t/install-sh-unittests.log
t/maken3-w.log: t/maken3.sh
generated_TESTS += t/compile7-w.sh
generated_TESTS += t/exeext4-w.sh
generated_TESTS += t/install-sh-option-C-w.sh
+generated_TESTS += t/install-sh-option-S-w.sh
generated_TESTS += t/install-sh-unittests-w.sh
generated_TESTS += t/maken3-w.sh
generated_TESTS += t/mdate5-w.sh
## Added by deps-extracting key 'check_testsuite_summary'.
check_testsuite_summary_TESTS = \
t/testsuite-summary-color.sh \
- t/testsuite-summary-count.sh
+ t/testsuite-summary-count.sh \
+ t/testsuite-summary-header.sh
EXTRA_DIST += t/ax/testsuite-summary-checks.sh
t/testsuite-summary-color.log: t/ax/testsuite-summary-checks.sh
t/testsuite-summary-count.log: t/ax/testsuite-summary-checks.sh
+t/testsuite-summary-header.log: t/ax/testsuite-summary-checks.sh
## Added by deps-extracting key 'depcomp'.
depcomp_TESTS = \
## Added by deps-extracting key 'extract_testsuite_summary'.
extract_testsuite_summary_TESTS = \
- t/testsuite-summary-count-many.sh
+ t/testsuite-summary-count-many.sh \
+ t/testsuite-summary-header.sh
EXTRA_DIST += t/ax/extract-testsuite-summary.pl
t/testsuite-summary-count-many.log: t/ax/extract-testsuite-summary.pl
+t/testsuite-summary-header.log: t/ax/extract-testsuite-summary.pl
## Added by deps-extracting key 'gettext_macros'.
gettext_macros_TESTS = \
t/suffix8.tap \
t/suffix10.tap \
t/vala-libs.sh \
+ t/vala-libs-distcheck.sh \
+ t/vala-libs-vpath.sh \
t/vartypo2.sh \
t/depcomp-lt-auto.tap \
t/depcomp-lt-cpp.tap \
t/suffix8.log: t/libtool-macros.log
t/suffix10.log: t/libtool-macros.log
t/vala-libs.log: t/libtool-macros.log
+t/vala-libs-distcheck.log: t/libtool-macros.log
+t/vala-libs-vpath.log: t/libtool-macros.log
t/vartypo2.log: t/libtool-macros.log
t/depcomp-lt-auto.log: t/libtool-macros.log
t/depcomp-lt-cpp.log: t/libtool-macros.log
pkgconfig_macros_TESTS = \
t/vala-headers.sh \
t/vala-libs.sh \
+ t/vala-libs-distcheck.sh \
+ t/vala-libs-vpath.sh \
t/vala-mix.sh \
t/vala-mix2.sh \
t/vala-non-recursive-setup.sh \
EXTRA_DIST +=
t/vala-headers.log: t/pkg-config-macros.log
t/vala-libs.log: t/pkg-config-macros.log
+t/vala-libs-distcheck.log: t/pkg-config-macros.log
+t/vala-libs-vpath.log: t/pkg-config-macros.log
t/vala-mix.log: t/pkg-config-macros.log
t/vala-mix2.log: t/pkg-config-macros.log
t/vala-non-recursive-setup.log: t/pkg-config-macros.log
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2011-2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+# Check that we can override the "Testsuite summary" header line,
+# per bug#11745.
+
+. test-lib.sh
+
+use_colors=no; use_vpath=no
+. testsuite-summary-checks.sh
+
+./configure
+
+# Cut down from do_check in ax/testsuite-summary-checks.sh
+# so that we can pass a make variable setting in $1.
+#
+do_header_check ()
+{
+ cat > summary.exp
+ run_make -O -e IGNORE check "$1"
+ test $am_make_rc -eq 0 || exit 1
+ $PERL "$am_testaux_srcdir"/extract-testsuite-summary.pl stdout >summary.got \
+ || fatal_ "cannot extract testsuite summary"
+ cat summary.exp
+ cat summary.got
+ compare=diff
+ $compare summary.exp summary.got || exit 1
+}
+
+# We don't actually run any tests; we're only interested in the header line.
+results="\
+# TOTAL: 0
+# PASS: 0
+# SKIP: 0
+# XFAIL: 0
+# FAIL: 0
+# XPASS: 0
+# ERROR: 0"
+#
+success_footer=${br}
+
+# Check the default.
+header="\
+${br}
+Testsuite summary for GNU AutoFoo 7.1
+${br}"
+#
+do_header_check 'junkvar=junkval' <<END
+$header
+$results
+$success_footer
+END
+
+# Elide the "for $(PACKAGE_STRING)".
+header_min="\
+${br}
+Testsuite summary
+${br}"
+#
+do_header_check 'AM_TESTSUITE_SUMMARY_HEADER=""' <<END
+$header_min
+$results
+$success_footer
+END
+
+# Add a suffix.
+header_more="\
+${br}
+Testsuite summary for GNU AutoFoo 7.1 (hi)
+${br}"
+#
+do_header_check 'AM_TESTSUITE_SUMMARY_HEADER=" for $(PACKAGE_STRING) (hi)"' <<END
+$header_more
+$results
+$success_footer
+END
+
+:
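
(For reference, the override exercised above on the make command line could also be
set in a package's Makefile.am; this is a sketch, with the "(nightly)" suffix invented
for the example and the double quotes kept exactly as the test above passes them.)

  AM_TESTSUITE_SUMMARY_HEADER = " for $(PACKAGE_STRING) (nightly)"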
# Make sure the documentation targets work as required with BSD make,
# even in the presence of subdirs (requires presence of default *-am rules).
+required='makeinfo tex texi2dvi dvips'
. test-init.sh
mkdir sub
all-local: ps pdf dvi html # For "make distcheck".
info_TEXINFOS = foo.texi doc/bar.texi baz.texi
SUBDIRS = sub
+
+# Tell GNU make not to parallelize these, because they
+# have overlap between explicit and intermediate .dvi files.
+.NOTPARALLEL:
END
mkdir sub doc
if test "x$1" = x--version; then
echo "${vala_version-1.2.3}"
fi
+if test "x$1" = x--api-version; then
+ echo "${vala_version-1.2.3}"
+fi
exit 0
END
chmod +x bin/valac
if test "x$1" = x--version; then
echo 0.1
fi
+if test "x$1" = x--api-version; then
+ echo 0.1
+fi
exit 0
END
chmod +x bin/valac.old
st=0; vala_version=0.1.2 ./configure 2>stderr || st=$?
cat stderr >&2
test $st -eq 0
-grep '^configure: WARNING: no proper vala compiler found' stderr
+grep '^configure: WARNING: Vala compiler not found or too old' stderr
$MAKE no-valac
st=0; ./configure VALAC="$(pwd)/bin/valac.old" 2>stderr || st=$?
cat stderr >&2
test $st -eq 0 || exit 1
-grep '^configure: WARNING: no proper vala compiler found' stderr
+grep '^configure: WARNING: Vala compiler not found or too old' stderr
$MAKE no-valac
sed 's/^\(AM_PROG_VALAC\).*/\1([1], [: > ok], [: > ko])/' <configure.ac >t
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2012-2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+# Building libraries (libtool and static) from Vala sources.
+# And use of vapi files to call C code from Vala.
+
+required="valac cc pkg-config libtoolize GNUmake"
+am_create_testdir=empty
+. test-init.sh
+
+cat >> configure.ac << 'END'
+AC_INIT([atest],[0.1])
+AC_CONFIG_SRCDIR([data/atest.pc.in])
+AC_SUBST([API_VERSION],[0])
+
+AM_INIT_AUTOMAKE
+AM_MAINTAINER_MODE([enable])
+AM_PROG_AR
+LT_INIT
+
+AC_PROG_CC
+AC_PROG_INSTALL
+PKG_PROG_PKG_CONFIG([0.22])
+AM_PROG_VALAC([0.32])
+
+PKG_CHECK_MODULES(ATEST, [gio-2.0])
+
+AC_CONFIG_FILES([
+ Makefile
+
+ src/Makefile
+
+ src/atest-$API_VERSION.deps:src/atest.deps.in
+
+ data/Makefile
+ data/atest-$API_VERSION.pc:data/atest.pc.in
+
+],[],
+[API_VERSION='$API_VERSION'])
+AC_OUTPUT
+END
+
+
+cat > Makefile.am << 'END'
+SUBDIRS=data src
+END
+
+mkdir data
+
+cat > data/atest.pc.in << 'END'
+prefix=@prefix@
+exec_prefix=@exec_prefix@
+libdir=@libdir@
+datarootdir=@datarootdir@
+datadir=@datadir@
+includedir=@includedir@
+
+Name: atest-@API_VERSION@
+Description: atest library
+Version: @VERSION@
+Requires: glib-2.0 gobject-2.0
+Libs: -L${libdir} -latest-@API_VERSION@
+Cflags: -I${includedir}/atest-@API_VERSION@
+END
+
+
+cat > data/Makefile.am << 'END'
+# pkg-config data
+# Note that the template file is called atest.pc.in, but generates a
+# versioned .pc file using some magic in AC_CONFIG_FILES.
+pkgconfigdir = $(libdir)/pkgconfig
+pkgconfig_DATA = atest-$(API_VERSION).pc
+
+DISTCLEANFILES = $(pkgconfig_DATA)
+EXTRA_DIST = atest.pc.in
+END
+
+mkdir src
+
+cat > src/atest.deps.in << 'END'
+glib-2.0
+END
+
+
+cat > src/atest.vala << 'END'
+using GLib;
+
+namespace Atest {
+ public class A {
+ public bool foo() { return false; }
+ }
+}
+END
+
+cat > src/Makefile.am << 'END'
+lib_LTLIBRARIES = libatest-@API_VERSION@.la
+
+libatest_@API_VERSION@_la_SOURCES = \
+ atest.vala \
+ cservice.c \
+ cservice.h \
+ $(NULL)
+
+
+libatest_@API_VERSION@_la_CPPFLAGS = \
+ -DOKOKIMDEFINED=1 \
+ $(NULL)
+
+libatest_@API_VERSION@_la_CFLAGS = \
+ $(ATEST_CFLAGS) \
+ $(WARN_CFLAGS) \
+ $(NULL)
+
+libatest_@API_VERSION@_la_LIBADD = \
+ $(ATEST_LIBS) \
+ $(NULL)
+
+libatest_@API_VERSION@_la_LDFLAGS = \
+ $(WARN_LDFLAGS) \
+ $(NULL)
+
+libatest_@API_VERSION@_la_VALAFLAGS = \
+ --vapidir=$(VAPIDIR) \
+ --vapidir=$(srcdir) \
+ --pkg cservice \
+ --thread \
+ --target-glib=2.44 \
+ --pkg glib-2.0 \
+ -H atest.h \
+ --library atest-@API_VERSION@ \
+ $(NULL)
+
+header_DATA=atest.h
+headerdir=$(includedir)/atest-@API_VERSION@/atest
+
+atest-@API_VERSION@.deps:
+ cp atest.deps atest-@API_VERSION@.deps
+
+vapi_DATA=atest-@API_VERSION@.vapi atest-@API_VERSION@.deps
+vapidir=$(VAPIDIR)
+
+CLEANFILES = atest-@API_VERSION@.deps
+END
+
+
+cat > src/cservice.c << 'END'
+#include "cservice.h"
+int c_service_mu_call (void)
+{
+ return OKOKIMDEFINED;
+}
+END
+
+cat > src/cservice.h << 'END'
+int c_service_mu (void);
+END
+
+cat > src/cservice.vapi <<'END'
+namespace CService {
+ public class Mu {
+ [CCode (cheader_filename = "cservice.h", cname = "c_service_mu_call")]
+ public int call ();
+ }
+}
+END
+
+libtoolize
+$ACLOCAL
+$AUTOCONF
+$AUTOMAKE -a
+
+./configure
+
+$MAKE
+test -f src/libatest_0_la_vala.stamp
+test -f src/libatest-0.la
+
+$MAKE distcheck
+
+:
--- /dev/null
+#! /bin/sh
+# Copyright (C) 2012-2020 Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+# Building libraries (libtool and static) from Vala sources.
+# And use of vapi files to call C code from Vala.
+
+required="valac cc pkg-config libtoolize GNUmake"
+am_create_testdir=empty
+. test-init.sh
+
+cat >> configure.ac << 'END'
+AC_INIT([valalibs],[0.1])
+
+AC_CONFIG_MACRO_DIR([m4])
+
+AM_INIT_AUTOMAKE
+AM_PROG_AR
+LT_INIT
+
+AC_PROG_CC
+
+AM_PROG_VALAC([0.7.3])
+PKG_CHECK_MODULES([GOBJECT], [gobject-2.0 >= 2.4])
+
+AC_CONFIG_FILES([Makefile src/Makefile])
+AC_OUTPUT
+END
+
+
+cat > Makefile.am << 'END'
+SUBDIRS=src
+END
+
+mkdir src
+
+cat > src/Makefile.am << 'END'
+AUTOMAKE_OPTIONS = subdir-objects
+lib_LTLIBRARIES = libservice.la
+libservice_la_SOURCES = service.vala cservice.c cservice.h
+libservice_la_CPPFLAGS = -DOKOKIMDEFINED=1
+libservice_la_VALAFLAGS = --vapidir=$(srcdir) --pkg cservice --library service
+AM_CFLAGS = $(GOBJECT_CFLAGS)
+END
+
+libtoolize
+$ACLOCAL
+$AUTOCONF
+$AUTOMAKE -a
+
+cat > src/cservice.c << 'END'
+#include "cservice.h"
+int c_service_mu_call (void)
+{
+ return OKOKIMDEFINED;
+}
+END
+
+cat > src/cservice.h << 'END'
+int c_service_mu (void);
+END
+
+cat > src/cservice.vapi <<'END'
+namespace CService {
+ public class Mu {
+ [CCode (cheader_filename = "cservice.h", cname = "c_service_mu_call")]
+ public int call ();
+ }
+}
+END
+
+cat > src/service.vala << 'END'
+namespace CService {
+public class Generator : Object {
+ public Generator () {
+ stdout.printf ("construct generator");
+ }
+ public void init () {
+ stdout.printf ("init generator");
+ }
+}
+}
+END
+
+mkdir build
+cd build
+../configure
+
+$MAKE
+pwd
+test -f src/libservice_la_vala.stamp
+test -f src/libservice.la
+
+:
cat > Makefile.am << 'END'
AUTOMAKE_OPTIONS = subdir-objects
-lib_LIBRARIES = libmu.a
+lib_LIBRARIES = libservice.a
lib_LTLIBRARIES = src/libzardoz.la
-libmu_a_SOURCES = mu.vala mu2.c mu.vapi mu2.h
-libmu_a_CPPFLAGS = -DOKOKIMDEFINED=1
-libmu_a_VALAFLAGS = --vapidir=$(srcdir)
+libservice_a_SOURCES = service.vala cservice.c cservice.h
+libservice_a_CPPFLAGS = -DOKOKIMDEFINED=1
+libservice_a_VALAFLAGS = --vapidir=$(srcdir) --pkg cservice --library service
AM_CFLAGS = $(GOBJECT_CFLAGS)
src_libzardoz_la_LIBADD = $(GOBJECT_LIBS)
src_libzardoz_la_SOURCES = src/zardoz-foo.vala src/zardoz-bar.vala
./configure
-cat > mu2.c << 'END'
-#include "mu2.h"
-int mu2 (void)
+cat > cservice.c << 'END'
+#include "cservice.h"
+int c_service_mu_call (void)
{
return OKOKIMDEFINED;
}
END
-cat > mu2.h << 'END'
-int mu2 (void);
+cat > cservice.h << 'END'
+int c_service_mu (void);
END
-cat > mu.vapi <<'END'
-[CCode (cheader_filename = "mu2.h", cname = "mu2")]
-public int c_mu2 ();
+cat > cservice.vapi <<'END'
+namespace CService {
+ public class Mu {
+ [CCode (cheader_filename = "cservice.h", cname = "c_service_mu_call")]
+ public int call ();
+ }
+}
END
-cat > mu.vala << 'END'
-int main ()
-{
- stdout.printf ("mumumu\n");
- return c_mu2 ();
+cat > service.vala << 'END'
+namespace CService {
+public class Generator : Object {
+ public Generator () {
+ stdout.printf ("construct generator");
+ }
+ public void init () {
+ stdout.printf ("init generator");
+ }
+}
}
END
END
$MAKE
-test -f libmu.a
+test -f libservice.a
test -f src/libzardoz.la
-$FGREP "mumumu" mu.c
+$FGREP "construct generator" service.c
$FGREP "FooFooFoo" src/zardoz-foo.c
$FGREP "BarBarBar" src/zardoz-bar.c
-test -f libmu_a_vala.stamp
+test -f libservice_a_vala.stamp
test -f src_libzardoz_la_vala.stamp
$MAKE distcheck
END
cat > foo.h <<'END'
-int foo;
+extern int foo;
int bar (void);
int baz (void);
END
cat > baz.c <<'END'
#include "foo.h"
-extern int foo = 0;
+int foo = 0;
int baz (void) { return 0; }
END
./configure || skip_ "configure failure"
$MAKE
$MAKE distcheck
-$MAKE distclean
+$MAKE maintainer-clean
+
mkdir build
cd build
../configure
../configure
$MAKE -j6
ls -l . .. # For debugging.
-for x in main 1 2 3 4 5 6; do test -f ../$x.c; done
-test -f ../zardoz_vala.stamp
+for x in main 1 2 3 4 5 6; do test -f $x.c; done
+test -f ./zardoz_vala.stamp
$MAKE distcheck -j4
test -f src/xbar.c
$MAKE distclean
-test -f src/xfoo.c
-test -f src/xbar.c
+test ! -e src/xfoo.c
+test ! -e src/xbar.c
# Re-create Makefile.
mv config.sav config.status
# Check the distribution.
$MAKE distcheck
-$MAKE distclean
+$MAKE maintainer-clean
-# Tru a VPATH setup.
+# Try a VPATH setup.
mkdir build
cd build
# Test rebuild rules from builddir.
-rm -f ../src/zardoz.h
-$MAKE -C src ../../src/zardoz.h
-test -f ../src/zardoz.h
+rm -f src/zardoz.h
+$MAKE -C src zardoz.h
+test -f src/zardoz.h
-rm -f ../src/zardoz.c
+rm -f src/zardoz.c
$MAKE
-grep 'Zardoz!' ../src/zardoz.c
+grep 'Zardoz!' src/zardoz.c
$sleep
sed 's/Zardoz!/FooBar!/' ../src/zardoz.vala > t
mv -f t ../src/zardoz.vala
$MAKE
-grep 'FooBar!' ../src/zardoz.c
-grep 'Zardoz!' ../src/zardoz.c && exit 1
+grep 'FooBar!' src/zardoz.c
+grep 'Zardoz!' src/zardoz.c && exit 1
:
$ACLOCAL
$AUTOCONF
-$AUTOMAKE
+$AUTOMAKE -a
mkdir build
cd build
../configure
$MAKE
-test -f ../foo_vala.stamp
-test -f ../bar_vala.stamp
-grep foofoofoo ../hello.c
-test -f ../zardoz.h
+test -f ./foo_vala.stamp
+test -f ./bar_vala.stamp
+grep foofoofoo ./hello.c
+test -f ./zardoz.h
$MAKE distcheck
# Rebuild rules work also in VPATH builds.
END
$MAKE
-test -f ../foo_vala.stamp
-test -f ../bar_vala.stamp
-grep barbarbar ../hello.c
+test -f ./foo_vala.stamp
+test -f ./bar_vala.stamp
+grep barbarbar ./hello.c
+$MAKE distcheck
# Rebuild rules are not uselessly triggered.
$MAKE -q
# Cleanup rules work also in VPATH builds.
$MAKE clean
-test -f ../foo_vala.stamp
-test -f ../bar_vala.stamp
-test -f ../zardoz.h
-test -f ../hello.c
+test -f ./foo_vala.stamp
+test -f ./bar_vala.stamp
+test -f ./zardoz.h
+test -f ./hello.c
$MAKE maintainer-clean
-test ! -e ../zardoz.h
-test ! -e ../hello.c
-test ! -e ../foo_vala.stamp
-test ! -e ../bar_vala.stamp
+test ! -e ./zardoz.h
+test ! -e ./hello.c
+test ! -e ./foo_vala.stamp
+test ! -e ./bar_vala.stamp
: