ctest_test
----------

Perform the :ref:`CTest Test Step` as a :ref:`Dashboard Client`.

.. code-block:: cmake

  ctest_test([BUILD <build-dir>] [APPEND]
             [START <start-number>]
             [END <end-number>]
             [STRIDE <stride-number>]
             [EXCLUDE <exclude-regex>]
             [INCLUDE <include-regex>]
             [EXCLUDE_LABEL <label-exclude-regex>]
             [INCLUDE_LABEL <label-include-regex>]
             [EXCLUDE_FIXTURE <regex>]
             [EXCLUDE_FIXTURE_SETUP <regex>]
             [EXCLUDE_FIXTURE_CLEANUP <regex>]
             [PARALLEL_LEVEL <level>]
             [RESOURCE_SPEC_FILE <file>]
             [TEST_LOAD <threshold>]
             [SCHEDULE_RANDOM <ON|OFF>]
             [STOP_ON_FAILURE]
             [STOP_TIME <time-of-day>]
             [RETURN_VALUE <result-var>]
             [CAPTURE_CMAKE_ERROR <result-var>]
             [REPEAT <mode>:<n>]
             [OUTPUT_JUNIT <file>]
             [QUIET]
             )
.. _note: If updating the argument list here, please also update the argument
   list documentation for :command:`ctest_memcheck`.
Run tests in the project build tree and store results in
``Test.xml`` for submission with the :command:`ctest_submit` command.
The options are:

``BUILD <build-dir>``
  Specify the top-level build directory. If not given, the
  :variable:`CTEST_BINARY_DIRECTORY` variable is used.

``APPEND``
  Mark ``Test.xml`` for append to results previously submitted to a
  dashboard server since the last :command:`ctest_start` call.
  Append semantics are defined by the dashboard server in use.
  This does *not* cause results to be appended to a ``.xml`` file
  produced by a previous call to this command.
``START <start-number>``
  Specify the beginning of a range of test numbers.

``END <end-number>``
  Specify the end of a range of test numbers.

``STRIDE <stride-number>``
  Specify the stride by which to step across a range of test numbers.
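  For example, the three range options combine to select a subset of tests
  by number. A sketch with illustrative values only:

  .. code-block:: cmake

    # Run tests 1, 3, 5, ..., 49: every second test in the range 1-50.
    ctest_test(START 1 END 50 STRIDE 2)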
``EXCLUDE <exclude-regex>``
  Specify a regular expression matching test names to exclude.

``INCLUDE <include-regex>``
  Specify a regular expression matching test names to include.
  Tests not matching this expression are excluded.

``EXCLUDE_LABEL <label-exclude-regex>``
  Specify a regular expression matching test labels to exclude.

``INCLUDE_LABEL <label-include-regex>``
  Specify a regular expression matching test labels to include.
  Tests not matching this expression are excluded.
``EXCLUDE_FIXTURE <regex>``
  .. versionadded:: 3.7

  If a test in the set of tests to be executed requires a particular fixture,
  that fixture's setup and cleanup tests would normally be added to the test
  set automatically. This option prevents adding setup or cleanup tests for
  fixtures matching the ``<regex>``. Note that all other fixture behavior is
  retained, including test dependencies and skipping tests that have fixture
  setup tests that fail.

``EXCLUDE_FIXTURE_SETUP <regex>``
  .. versionadded:: 3.7

  Same as ``EXCLUDE_FIXTURE`` except only matching setup tests are excluded.

``EXCLUDE_FIXTURE_CLEANUP <regex>``
  .. versionadded:: 3.7

  Same as ``EXCLUDE_FIXTURE`` except only matching cleanup tests are excluded.
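  For example, to run tests that depend on a fixture while skipping that
  fixture's setup and cleanup tests (the fixture name ``DB`` is illustrative,
  e.g. for when the service those tests would start is already running):

  .. code-block:: cmake

    ctest_test(EXCLUDE_FIXTURE "^DB$")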
``PARALLEL_LEVEL <level>``
  Specify a positive number representing the number of tests to
  be run in parallel.

``RESOURCE_SPEC_FILE <file>``
  .. versionadded:: 3.16

  Specify a
  :ref:`resource specification file <ctest-resource-specification-file>`. See
  :ref:`ctest-resource-allocation` for more information.
``TEST_LOAD <threshold>``
  .. versionadded:: 3.4

  While running tests in parallel, try not to start tests when they
  may cause the CPU load to pass above a given threshold. If not
  specified, the :variable:`CTEST_TEST_LOAD` variable will be checked,
  and then the ``--test-load`` command-line argument to :manual:`ctest(1)`.
  See also the ``TestLoad`` setting in the :ref:`CTest Test Step`.
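  For example, a sketch combining parallelism with a load threshold
  (the values are illustrative):

  .. code-block:: cmake

    # Run up to 8 tests at once, but avoid starting new tests while
    # the measured CPU load is above 10.
    ctest_test(PARALLEL_LEVEL 8 TEST_LOAD 10)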
``REPEAT <mode>:<n>``
  .. versionadded:: 3.17

  Run tests repeatedly based on the given ``<mode>`` up to ``<n>`` times.
  The modes are:

  ``UNTIL_FAIL``
    Require each test to run ``<n>`` times without failing in order to pass.
    This is useful in finding sporadic failures in test cases.

  ``UNTIL_PASS``
    Allow each test to run up to ``<n>`` times in order to pass.
    Repeats tests if they fail for any reason.
    This is useful in tolerating sporadic failures in test cases.

  ``AFTER_TIMEOUT``
    Allow each test to run up to ``<n>`` times in order to pass.
    Repeats tests only if they time out.
    This is useful in tolerating sporadic timeouts in test cases
    on busy machines.
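  For example, to tolerate flaky tests by giving each failing test up to
  three attempts to pass:

  .. code-block:: cmake

    ctest_test(REPEAT UNTIL_PASS:3)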
``SCHEDULE_RANDOM <ON|OFF>``
  Launch tests in a random order. This may be useful for detecting
  implicit test dependencies.

``STOP_ON_FAILURE``
  .. versionadded:: 3.18

  Stop the execution of the tests once one has failed.
``STOP_TIME <time-of-day>``
  Specify a time of day at which the tests should all stop running.

``RETURN_VALUE <result-var>``
  Store in the ``<result-var>`` variable ``0`` if all tests passed.
  Store non-zero if anything went wrong.

``CAPTURE_CMAKE_ERROR <result-var>``
  .. versionadded:: 3.7

  Store in the ``<result-var>`` variable ``-1`` if there are any errors running
  the command and prevent ctest from returning non-zero if an error occurs.
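  For example, a sketch that distinguishes failing tests from errors in
  running the command itself (the variable names are illustrative):

  .. code-block:: cmake

    ctest_test(RETURN_VALUE test_result CAPTURE_CMAKE_ERROR ctest_error)
    if(ctest_error EQUAL -1)
      message(WARNING "ctest_test() itself failed to run correctly")
    elseif(NOT test_result EQUAL 0)
      message(STATUS "some tests failed")
    endif()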
``OUTPUT_JUNIT <file>``
  .. versionadded:: 3.21

  Write test results to ``<file>`` in JUnit XML format. If ``<file>`` is a
  relative path, it will be placed in the build directory. If ``<file>``
  already exists, it will be overwritten. Note that the resulting JUnit XML
  file is **not** uploaded to CDash because it would be redundant with
  CTest's ``Test.xml`` file.
``QUIET``
  .. versionadded:: 3.3

  Suppress any CTest-specific non-error messages that would have otherwise
  been printed to the console. Output from the underlying test command is not
  affected. Summary info detailing the percentage of passing tests is also
  unaffected by the ``QUIET`` option.
See also the :variable:`CTEST_CUSTOM_MAXIMUM_PASSED_TEST_OUTPUT_SIZE`,
:variable:`CTEST_CUSTOM_MAXIMUM_FAILED_TEST_OUTPUT_SIZE` and
:variable:`CTEST_CUSTOM_TEST_OUTPUT_TRUNCATION` variables, along with their
corresponding :manual:`ctest(1)` command-line options
``--test-output-size-passed``, ``--test-output-size-failed``, and
``--test-output-truncation``.
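Putting the pieces together, a minimal dashboard-style driver script might
look like the following sketch (the paths, label, and JUnit file name are
placeholders; such scripts are run with ``ctest -S``):

.. code-block:: cmake

  set(CTEST_SOURCE_DIRECTORY "/path/to/source")
  set(CTEST_BINARY_DIRECTORY "/path/to/build")
  ctest_start(Experimental)
  ctest_test(
    INCLUDE_LABEL "smoke"
    PARALLEL_LEVEL 4
    OUTPUT_JUNIT "test-results.xml"  # requires CMake 3.21 or later
    RETURN_VALUE result
  )
  if(NOT result EQUAL 0)
    message(WARNING "One or more tests failed.")
  endif()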
.. _`Additional Test Measurements`:

Additional Test Measurements
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

CTest can parse the output of your tests for extra measurements to report
to CDash.

When run as a :ref:`Dashboard Client`, CTest will include these custom
measurements in the ``Test.xml`` file that gets uploaded to CDash.

Check the `CDash test measurement documentation
<https://github.com/Kitware/CDash/blob/master/docs/test_measurements.md>`_
for more information on the types of test measurements that CDash recognizes.
.. versionadded:: 3.22
  CTest can parse custom measurements from tags named
  ``<CTestMeasurement>`` or ``<CTestMeasurementFile>``. The older names
  ``<DartMeasurement>`` and ``<DartMeasurementFile>`` are still supported.
The following example demonstrates how to output a variety of custom test
measurements.

.. code-block:: c++

   std::cout <<
     "<CTestMeasurement type=\"numeric/double\" name=\"score\">28.3</CTestMeasurement>"
     << std::endl;

   std::cout <<
     "<CTestMeasurement type=\"text/string\" name=\"color\">red</CTestMeasurement>"
     << std::endl;

   std::cout <<
     "<CTestMeasurement type=\"text/link\" name=\"CMake URL\">https://cmake.org</CTestMeasurement>"
     << std::endl;

   std::cout <<
     "<CTestMeasurement type=\"text/preformatted\" name=\"Console Output\">" <<
     "line 1.\n" <<
     " \033[31;1m line 2. Bold red, and indented!\033[0;0m\n" <<
     "line 3. Not bold or indented...\n" <<
     "</CTestMeasurement>" << std::endl;
Image Measurements
^^^^^^^^^^^^^^^^^^

The following example demonstrates how to upload test images to CDash.

.. code-block:: c++

   std::cout <<
     "<CTestMeasurementFile type=\"image/jpg\" name=\"TestImage\">" <<
     "/dir/to/test_img.jpg</CTestMeasurementFile>" << std::endl;

   std::cout <<
     "<CTestMeasurementFile type=\"image/gif\" name=\"ValidImage\">" <<
     "/dir/to/valid_img.gif</CTestMeasurementFile>" << std::endl;

   std::cout <<
     "<CTestMeasurementFile type=\"image/png\" name=\"AlgoResult\">" <<
     "/dir/to/img.png</CTestMeasurementFile>" << std::endl;
Images will be displayed together in an interactive comparison mode on CDash
if they are provided with two or more of the following names.

* ``TestImage``
* ``ValidImage``
* ``BaselineImage``
* ``DifferenceImage2``

By convention, ``TestImage`` is the image generated by your test, and
``ValidImage`` (or ``BaselineImage``) is the basis of comparison used to
determine if the test passed or failed.

If another image name is used it will be displayed by CDash as a static image
separate from the interactive comparison UI.
Attached Files
^^^^^^^^^^^^^^

.. versionadded:: 3.21

The following example demonstrates how to upload non-image files to CDash.

.. code-block:: c++

   std::cout <<
     "<CTestMeasurementFile type=\"file\" name=\"TestInputData1\">" <<
     "/dir/to/data1.csv</CTestMeasurementFile>\n" <<
     "<CTestMeasurementFile type=\"file\" name=\"TestInputData2\">" <<
     "/dir/to/data2.csv</CTestMeasurementFile>" << std::endl;

If the name of the file to upload is known at configure time, you can use the
:prop_test:`ATTACHED_FILES` or :prop_test:`ATTACHED_FILES_ON_FAIL` test
properties instead.
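For example, a sketch of the configure-time route using the
:prop_test:`ATTACHED_FILES` test property (the test and file names are
illustrative):

.. code-block:: cmake

  set_tests_properties(MyTest PROPERTIES
    ATTACHED_FILES "${CMAKE_CURRENT_BINARY_DIR}/data1.csv"
  )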
Custom Details
^^^^^^^^^^^^^^

.. versionadded:: 3.21

The following example demonstrates how to specify a custom value for the
``Test Details`` field displayed on CDash.

.. code-block:: c++

   std::cout <<
     "<CTestDetails>My Custom Details Value</CTestDetails>" << std::endl;
.. _`Additional Labels`:

Additional Labels
^^^^^^^^^^^^^^^^^

.. versionadded:: 3.22

The following example demonstrates how to add additional labels to a test
as part of the test's output.

.. code-block:: c++

   std::cout <<
     "<CTestLabel>Custom Label 1</CTestLabel>\n" <<
     "<CTestLabel>Custom Label 2</CTestLabel>" << std::endl;

Use the :prop_test:`LABELS` test property instead for labels that can be
determined at configure time.
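For example, configure-time labels can be attached directly to the test
with the :prop_test:`LABELS` property (the test and label names are
illustrative):

.. code-block:: cmake

  set_tests_properties(MyTest PROPERTIES
    LABELS "Custom Label 1;Custom Label 2"
  )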