<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=US-ASCII">
<title>Inverse Chi-Squared Distribution Bayes Example</title>
<link rel="stylesheet" href="../../../math.css" type="text/css">
<meta name="generator" content="DocBook XSL Stylesheets V1.79.1">
<link rel="home" href="../../../index.html" title="Math Toolkit 2.11.0">
<link rel="up" href="../weg.html" title="Worked Examples">
<link rel="prev" href="normal_example/normal_misc.html" title="Some Miscellaneous Examples of the Normal (Gaussian) Distribution">
<link rel="next" href="nccs_eg.html" title="Non Central Chi Squared Example">
</head>
<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
<table cellpadding="2" width="100%"><tr>
<td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../../../boost.png"></td>
<td align="center"><a href="../../../../../../../index.html">Home</a></td>
<td align="center"><a href="../../../../../../../libs/libraries.htm">Libraries</a></td>
<td align="center"><a href="http://www.boost.org/users/people.html">People</a></td>
<td align="center"><a href="http://www.boost.org/users/faq.html">FAQ</a></td>
<td align="center"><a href="../../../../../../../more/index.htm">More</a></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="normal_example/normal_misc.html"><img src="../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../weg.html"><img src="../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../index.html"><img src="../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="nccs_eg.html"><img src="../../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
<div class="section">
<div class="titlepage"><div><div><h4 class="title">
<a name="math_toolkit.stat_tut.weg.inverse_chi_squared_eg"></a><a class="link" href="inverse_chi_squared_eg.html" title="Inverse Chi-Squared Distribution Bayes Example">Inverse
        Chi-Squared Distribution Bayes Example</a>
</h4></div></div></div>
<p>
          The scaled-inverse-chi-squared distribution is the conjugate prior distribution
          for the variance (&#963;<sup>2</sup>) parameter of a normal distribution with known expectation
          (&#956;). As such it has widespread application in Bayesian statistics:
        </p>
<p>
          In <a href="http://en.wikipedia.org/wiki/Bayesian_inference" target="_top">Bayesian
          inference</a>, the strength of belief in certain parameter values is
          itself described through a distribution, so the parameters themselves are
          modelled and interpreted as random variables.
        </p>
<p>
          In this worked example, we perform such a Bayesian analysis, using the
          scaled-inverse-chi-squared distribution as the prior and posterior distribution
          for the variance parameter of a normal distribution.
        </p>
<p>
          For more general information on Bayesian analysis, see:
        </p>
<div class="itemizedlist"><ul class="itemizedlist" style="list-style-type: disc; ">
<li class="listitem">
              Andrew Gelman, John B. Carlin, Hal E. Stern, Donald B. Rubin, Bayesian
              Data Analysis, 2003, ISBN 978-1439840955.
            </li>
<li class="listitem">
              Jim Albert, Bayesian Computation with R, Springer, 2009, ISBN 978-0387922973.
            </li>
</ul></div>
<p>
          (As the scaled-inverse-chi-squared distribution is another parameterization
          of the inverse-gamma distribution, this example could equally have used
          the inverse-gamma distribution.)
        </p>
<p>
          Consider precision machines which produce balls for a high-quality ball
          bearing. Ideally each ball should have a diameter of precisely 3000 &#956;m (3
          mm). Assume that machines generally produce balls of that size on average
          (mean), but individual balls can vary slightly in either direction following
          (approximately) a normal distribution. Depending on various production
          conditions (e.g. raw material used for balls, workplace temperature and
          humidity, maintenance frequency and quality) some machines produce balls
          more tightly distributed around the target of 3000 &#956;m, while others produce
          balls with a wider distribution. The variance parameter of the normal distribution
          of the ball sizes therefore varies from machine to machine. An extensive
          survey by the precision machinery manufacturer, however, has shown that
          most machines operate with a variance between 15 and 50, and near 25 &#956;m<sup>2</sup> on
          average.
        </p>
<p>
          Using this information, we want to model the variance of the machines.
          The variance is strictly positive, so we look for a statistical distribution
          with support on the positive real numbers. Given that the expectation of
          the normal distribution of the balls is known (3000 &#956;m), for reasons of
          conjugacy it is customary practice in Bayesian statistics to model the
          variance as scaled-inverse-chi-squared distributed.
        </p>
<p>
          In a first step, we use the survey information to model the general knowledge
          about the variance parameter of machines measured by the manufacturer.
          This provides us with a generic prior distribution that is applicable if
          nothing more specific is known about a particular machine.
        </p>
<p>
          In a second step, we combine the prior-distribution information in a Bayesian
          analysis with data on a specific single machine to derive a posterior
          distribution for that machine.
        </p>
<h6>
<a name="math_toolkit.stat_tut.weg.inverse_chi_squared_eg.h0"></a>
          <span class="phrase"><a name="math_toolkit.stat_tut.weg.inverse_chi_squared_eg.step_one_using_the_survey_inform"></a></span><a class="link" href="inverse_chi_squared_eg.html#math_toolkit.stat_tut.weg.inverse_chi_squared_eg.step_one_using_the_survey_inform">Step
          one: Using the survey information.</a>
        </h6>
<p>
          Using the survey results, we try to find the parameter set of a scaled-inverse-chi-squared
          distribution whose properties match those results. Using the mathematical
          properties of the scaled-inverse-chi-squared distribution as a guideline,
          we see that both the mean and mode of the distribution are approximately
          given by its scale parameter (s). As the survey machines operated at a
          variance of 25 &#956;m<sup>2</sup> on average, we hence set the scale parameter
          (s<sub>prior</sub>) of our prior distribution equal to this value. Using
          some trial-and-error and calls to the global quantile function, we also
          find that a value of 20 for the degrees-of-freedom (&#957;<sub>prior</sub>)
          parameter is adequate, so that most of the prior distribution mass is located
          between 15 and 50 (see figure below).
        </p>
<p>
          We first construct our prior distribution using these values, and then
          list out a few quantiles:
        </p>
<pre class="programlisting"><span class="keyword">double</span> <span class="identifier">priorDF</span> <span class="special">=</span> <span class="number">20.0</span><span class="special">;</span>
<span class="keyword">double</span> <span class="identifier">priorScale</span> <span class="special">=</span> <span class="number">25.0</span><span class="special">;</span>

<span class="identifier">inverse_chi_squared</span> <span class="identifier">prior</span><span class="special">(</span><span class="identifier">priorDF</span><span class="special">,</span> <span class="identifier">priorScale</span><span class="special">);</span>
<span class="comment">// Using an inverse_gamma distribution instead, we could equivalently write</span>
<span class="comment">// inverse_gamma prior(priorDF / 2.0, priorScale * priorDF / 2.0);</span>

<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"Prior distribution:"</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  2.5% quantile: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">quantile</span><span class="special">(</span><span class="identifier">prior</span><span class="special">,</span> <span class="number">0.025</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  50% quantile: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">quantile</span><span class="special">(</span><span class="identifier">prior</span><span class="special">,</span> <span class="number">0.5</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  97.5% quantile: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">quantile</span><span class="special">(</span><span class="identifier">prior</span><span class="special">,</span> <span class="number">0.975</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
</pre>
<p>
          This produces the following output:
        </p>
<pre class="programlisting"><span class="identifier">Prior</span> <span class="identifier">distribution</span><span class="special">:</span>

<span class="number">2.5</span><span class="special">%</span> <span class="identifier">quantile</span><span class="special">:</span> <span class="number">14.6</span>
<span class="number">50</span><span class="special">%</span> <span class="identifier">quantile</span><span class="special">:</span> <span class="number">25.9</span>
<span class="number">97.5</span><span class="special">%</span> <span class="identifier">quantile</span><span class="special">:</span> <span class="number">52.1</span>
</pre>
<p>
          Based on this distribution, we can now calculate the probability that a
          machine works at an unusual precision (variance), at &lt;= 15 or &gt; 50.
          For this task, we use calls to the <code class="computeroutput"><span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span></code> functions <code class="computeroutput"><span class="identifier">cdf</span></code>
          and <code class="computeroutput"><span class="identifier">complement</span></code>, respectively,
          and find a probability of about 0.031 (3.1%) for each case.
        </p>
<pre class="programlisting"><span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  probability variance &lt;= 15: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">cdf</span><span class="special">(</span><span class="identifier">prior</span><span class="special">,</span> <span class="number">15.0</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  probability variance &lt;= 25: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">cdf</span><span class="special">(</span><span class="identifier">prior</span><span class="special">,</span> <span class="number">25.0</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  probability variance &gt; 50: "</span>
  <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">cdf</span><span class="special">(</span><span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">complement</span><span class="special">(</span><span class="identifier">prior</span><span class="special">,</span> <span class="number">50.0</span><span class="special">))</span>
<span class="special">&lt;&lt;</span> <span class="identifier">endl</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
</pre>
<p>
          This produces the following output:
        </p>
<pre class="programlisting"><span class="identifier">probability</span> <span class="identifier">variance</span> <span class="special">&lt;=</span> <span class="number">15</span><span class="special">:</span> <span class="number">0.031</span>
<span class="identifier">probability</span> <span class="identifier">variance</span> <span class="special">&lt;=</span> <span class="number">25</span><span class="special">:</span> <span class="number">0.458</span>
<span class="identifier">probability</span> <span class="identifier">variance</span> <span class="special">&gt;</span> <span class="number">50</span><span class="special">:</span> <span class="number">0.0318</span>
</pre>
<p>
          Therefore, only 3.1% of all precision machines produce balls with a variance
          of 15 or less (particularly precise machines), but also only 3.2% of all
          machines produce balls with a variance as high as 50 or more (particularly
          imprecise machines). Moreover, slightly more than one-half (1 - 0.458 =
          0.542, i.e. 54.2%) of the machines work at a variance greater than 25.
        </p>
<p>
          Notice here the distinction between a <a href="http://en.wikipedia.org/wiki/Bayesian_inference" target="_top">Bayesian</a>
          analysis and a <a href="http://en.wikipedia.org/wiki/Frequentist_inference" target="_top">frequentist</a>
          analysis: because we model the variance as a random variable itself, we
          can calculate, and straightforwardly interpret, probabilities for given
          parameter values directly, while such an approach is not possible (and,
          as a matter of interpretation, strictly forbidden) in the frequentist world.
        </p>
<h6>
<a name="math_toolkit.stat_tut.weg.inverse_chi_squared_eg.h1"></a>
          <span class="phrase"><a name="math_toolkit.stat_tut.weg.inverse_chi_squared_eg.step_2_investigate_a_single_mach"></a></span><a class="link" href="inverse_chi_squared_eg.html#math_toolkit.stat_tut.weg.inverse_chi_squared_eg.step_2_investigate_a_single_mach">Step
          2: Investigate a single machine</a>
        </h6>
<p>
          In the second step, we investigate a single machine which is suspected
          to suffer from a major fault, as the balls it produces show fairly high
          size variability. Based on the prior distribution of generic machinery
          performance (derived above) and data on balls produced by the suspect
          machine, we calculate the posterior distribution for that machine and use
          its properties for guidance regarding continued machine operation or suspension.
        </p>
<p>
          It can be shown that if the prior distribution was chosen to be scaled-inverse-chi-squared
          distributed, then the posterior distribution is also scaled-inverse-chi-squared
          distributed (the prior and posterior distributions are hence conjugate).
          For more details regarding conjugacy, and the formulae used to derive the
          parameter set of the posterior distribution, see <a href="http://en.wikipedia.org/wiki/Conjugate_prior" target="_top">Conjugate
          prior</a>.
        </p>
<p>
          Given the prior distribution parameters and sample data (of size n), the
          posterior distribution parameters are given by the two expressions:
        </p>
<p>
          &#8192;&#8192; &#957;<sub>posterior</sub> = &#957;<sub>prior</sub> + n
        </p>
<p>
          which gives the posteriorDF below, and
        </p>
<p>
          &#8192;&#8192; s<sub>posterior</sub> = (&#957;<sub>prior</sub>s<sub>prior</sub> + &#931;<sup>n</sup><sub>i=1</sub>(x<sub>i</sub> - &#956;)<sup>2</sup>) / (&#957;<sub>prior</sub> + n)
        </p>
<p>
          which after some rearrangement gives the formula for the posteriorScale
          below.
        </p>
<p>
          Machine-specific data consist of 100 balls which were accurately measured
          and show the expected mean of 3000 &#956;m and a sample variance of 55 (calculated
          for a sample mean defined to be 3000 exactly). From these data, the prior
          parameterization, and noting that the term &#931;<sup>n</sup><sub>i=1</sub>(x<sub>i</sub> - &#956;)<sup>2</sup> equals the sample
          variance multiplied by n - 1, it follows that the posterior distribution
          of the variance parameter is a scaled-inverse-chi-squared distribution with
          degrees-of-freedom (&#957;<sub>posterior</sub>) = 120 and scale (s<sub>posterior</sub>) = 49.54.
        </p>
<pre class="programlisting"><span class="keyword">int</span> <span class="identifier">ballsSampleSize</span> <span class="special">=</span> <span class="number">100</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span><span class="string">"balls sample size: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">ballsSampleSize</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="keyword">double</span> <span class="identifier">ballsSampleVariance</span> <span class="special">=</span> <span class="number">55.0</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span><span class="string">"balls sample variance: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">ballsSampleVariance</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>

<span class="keyword">double</span> <span class="identifier">posteriorDF</span> <span class="special">=</span> <span class="identifier">priorDF</span> <span class="special">+</span> <span class="identifier">ballsSampleSize</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"prior degrees-of-freedom: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">priorDF</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"posterior degrees-of-freedom: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">posteriorDF</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>

<span class="keyword">double</span> <span class="identifier">posteriorScale</span> <span class="special">=</span>
  <span class="special">(</span><span class="identifier">priorDF</span> <span class="special">*</span> <span class="identifier">priorScale</span> <span class="special">+</span> <span class="special">(</span><span class="identifier">ballsSampleVariance</span> <span class="special">*</span> <span class="special">(</span><span class="identifier">ballsSampleSize</span> <span class="special">-</span> <span class="number">1</span><span class="special">)))</span> <span class="special">/</span> <span class="identifier">posteriorDF</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"prior scale: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">priorScale</span>  <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"posterior scale: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">posteriorScale</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
</pre>
<p>
          An interesting feature here is that only summary statistics of the sample
          are needed to parameterize the posterior distribution: the 100 individual
          ball measurements are irrelevant; knowledge of the sample variance and
          the number of measurements is sufficient.
        </p>
<p>
          This produces the following output:
        </p>
<pre class="programlisting"><span class="identifier">balls</span> <span class="identifier">sample</span> <span class="identifier">size</span><span class="special">:</span> <span class="number">100</span>
<span class="identifier">balls</span> <span class="identifier">sample</span> <span class="identifier">variance</span><span class="special">:</span> <span class="number">55</span>
<span class="identifier">prior</span> <span class="identifier">degrees</span><span class="special">-</span><span class="identifier">of</span><span class="special">-</span><span class="identifier">freedom</span><span class="special">:</span> <span class="number">20</span>
<span class="identifier">posterior</span> <span class="identifier">degrees</span><span class="special">-</span><span class="identifier">of</span><span class="special">-</span><span class="identifier">freedom</span><span class="special">:</span> <span class="number">120</span>
<span class="identifier">prior</span> <span class="identifier">scale</span><span class="special">:</span> <span class="number">25</span>
<span class="identifier">posterior</span> <span class="identifier">scale</span><span class="special">:</span> <span class="number">49.5</span>
</pre>
<p>
          To compare the generic machinery performance with our suspect machine,
          we again calculate the same quantiles and probabilities as above, and find
          a distribution clearly shifted to greater values (see figure).
        </p>
<div class="blockquote"><blockquote class="blockquote"><p>
            <span class="inlinemediaobject"><img src="../../../../graphs/prior_posterior_plot.svg" align="middle"></span>
          </p></blockquote></div>
<pre class="programlisting"><span class="identifier">inverse_chi_squared</span> <span class="identifier">posterior</span><span class="special">(</span><span class="identifier">posteriorDF</span><span class="special">,</span> <span class="identifier">posteriorScale</span><span class="special">);</span>

<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"Posterior distribution:"</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  2.5% quantile: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">quantile</span><span class="special">(</span><span class="identifier">posterior</span><span class="special">,</span> <span class="number">0.025</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  50% quantile: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">quantile</span><span class="special">(</span><span class="identifier">posterior</span><span class="special">,</span> <span class="number">0.5</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  97.5% quantile: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">quantile</span><span class="special">(</span><span class="identifier">posterior</span><span class="special">,</span> <span class="number">0.975</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>

<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  probability variance &lt;= 15: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">cdf</span><span class="special">(</span><span class="identifier">posterior</span><span class="special">,</span> <span class="number">15.0</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  probability variance &lt;= 25: "</span> <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">cdf</span><span class="special">(</span><span class="identifier">posterior</span><span class="special">,</span> <span class="number">25.0</span><span class="special">)</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
<span class="identifier">cout</span> <span class="special">&lt;&lt;</span> <span class="string">"  probability variance &gt; 50: "</span>
  <span class="special">&lt;&lt;</span> <span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">cdf</span><span class="special">(</span><span class="identifier">boost</span><span class="special">::</span><span class="identifier">math</span><span class="special">::</span><span class="identifier">complement</span><span class="special">(</span><span class="identifier">posterior</span><span class="special">,</span> <span class="number">50.0</span><span class="special">))</span> <span class="special">&lt;&lt;</span> <span class="identifier">endl</span><span class="special">;</span>
</pre>
<p>
          This produces the following output:
        </p>
<pre class="programlisting"><span class="identifier">Posterior</span> <span class="identifier">distribution</span><span class="special">:</span>

   <span class="number">2.5</span><span class="special">%</span> <span class="identifier">quantile</span><span class="special">:</span> <span class="number">39.1</span>
   <span class="number">50</span><span class="special">%</span> <span class="identifier">quantile</span><span class="special">:</span> <span class="number">49.8</span>
   <span class="number">97.5</span><span class="special">%</span> <span class="identifier">quantile</span><span class="special">:</span> <span class="number">64.9</span>

   <span class="identifier">probability</span> <span class="identifier">variance</span> <span class="special">&lt;=</span> <span class="number">15</span><span class="special">:</span> <span class="number">2.97e-031</span>
   <span class="identifier">probability</span> <span class="identifier">variance</span> <span class="special">&lt;=</span> <span class="number">25</span><span class="special">:</span> <span class="number">8.85e-010</span>
   <span class="identifier">probability</span> <span class="identifier">variance</span> <span class="special">&gt;</span> <span class="number">50</span><span class="special">:</span> <span class="number">0.489</span>
</pre>
<p>
          Indeed, the probability that the machine works at a low variance (&lt;=
          15) is almost zero, and even the probability of working at average or better
          performance is negligibly small (less than one-millionth of a permille).
          On the other hand, with a probability of almost one-half (49%), the machine
          operates in the extremely high variance range of &gt; 50 that is characteristic
          of poorly performing machines.
        </p>
<p>
          Based on this information, the machine is taken out of operation and serviced.
        </p>
<p>
          In summary, the Bayesian analysis allowed us to make exact probabilistic
          statements about a parameter of interest, and hence provided us with results
          that have a straightforward interpretation.
        </p>
<p>
          A full sample output is:
        </p>
<pre class="programlisting"> Inverse_chi_squared_distribution Bayes example:

   Prior distribution:

    2.5% quantile: 14.6
    50% quantile: 25.9
    97.5% quantile: 52.1

    probability variance &lt;= 15: 0.031
    probability variance &lt;= 25: 0.458
    probability variance &gt; 50: 0.0318

  balls sample size: 100
  balls sample variance: 55
  prior degrees-of-freedom: 20
  posterior degrees-of-freedom: 120
  prior scale: 25
  posterior scale: 49.5
  Posterior distribution:

    2.5% quantile: 39.1
    50% quantile: 49.8
    97.5% quantile: 64.9

    probability variance &lt;= 15: 2.97e-031
    probability variance &lt;= 25: 8.85e-010
    probability variance &gt; 50: 0.489
</pre>
<p>
          (See also the reference documentation for the <a class="link" href="../../dist_ref/dists/inverse_chi_squared_dist.html" title="Inverse Chi Squared Distribution">Inverse
          chi squared Distribution</a>.)
        </p>
<p>
          See the full C++ source of this example at <a href="../../../../../example/inverse_chi_squared_bayes_eg.cpp" target="_top">../../example/inverse_chi_squared_bayes_eg.cpp</a>
        </p>
</div>
<table xmlns:rev="http://www.cs.rpi.edu/~gregod/boost/tools/doc/revision" width="100%"><tr>
<td align="left"></td>
<td align="right"><div class="copyright-footer">Copyright &#169; 2006-2019 Nikhar
      Agrawal, Anton Bikineev, Paul A. Bristow, Marco Guazzone, Christopher Kormanyos,
      Hubert Holin, Bruno Lalande, John Maddock, Jeremy Murphy, Matthew Pulver, Johan
      R&#229;de, Gautam Sewani, Benjamin Sobotta, Nicholas Thompson, Thijs van den Berg,
      Daryle Walker and Xiaogang Zhang<p>
        Distributed under the Boost Software License, Version 1.0. (See accompanying
        file LICENSE_1_0.txt or copy at <a href="http://www.boost.org/LICENSE_1_0.txt" target="_top">http://www.boost.org/LICENSE_1_0.txt</a>)
      </p>
</div></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="normal_example/normal_misc.html"><img src="../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../weg.html"><img src="../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../index.html"><img src="../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="nccs_eg.html"><img src="../../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
</body>
</html>