Sum of normally distributed random variables
{{Unreferenced|date=December 2009}}
In [[probability theory]], calculation of the '''sum of normally distributed random variables''' is an instance of the arithmetic of [[random variable]]s, which can be quite complex based on the [[probability distribution]]s of the random variables involved and their relationships.
 
==Independent random variables==
If ''X'' and ''Y'' are [[statistical independence|independent]] [[random variable]]s that are [[normal distribution|normally distributed]] (and therefore also jointly so), then their sum is also normally distributed. That is, if
 
:<math>X \sim N(\mu_X, \sigma_X^2)</math>
:<math>Y \sim N(\mu_Y, \sigma_Y^2)</math>
:<math>Z=X+Y</math>,
 
then
 
:<math>Z \sim N(\mu_X + \mu_Y, \sigma_X^2 + \sigma_Y^2).</math>
 
This means that the sum of two independent normally distributed random variables is normal, with its mean being the sum of the two means, and its variance being the sum of the two variances (i.e., the square of the standard deviation is the sum of the squares of the standard deviations).
 
Note that the result that the sum is normally distributed requires the assumption of independence, not just [[correlation and dependence|uncorrelatedness]]; two separately (not jointly) normally distributed random variables can be uncorrelated without being independent, in which case their sum can be non-normally distributed (see [[Normally distributed and uncorrelated does not imply independent#A symmetric example]]). The result about the mean holds in all cases, while the result for the variance requires uncorrelatedness, but not independence.
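This additivity can be illustrated numerically by drawing large samples and comparing the empirical moments of the sum with the predicted ones. A minimal sketch in Python using NumPy (the parameter values are purely illustrative):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
mu_x, sigma_x = 1.0, 2.0    # illustrative parameters for X
mu_y, sigma_y = -3.0, 0.5   # illustrative parameters for Y

x = rng.normal(mu_x, sigma_x, size=1_000_000)
y = rng.normal(mu_y, sigma_y, size=1_000_000)
z = x + y

print(z.mean())  # close to mu_x + mu_y = -2.0
print(z.var())   # close to sigma_x**2 + sigma_y**2 = 4.25
</syntaxhighlight>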
 
===Proofs===
====Proof using characteristic functions====
{{Citation needed|date=May 2011}}
 
The [[characteristic function (probability theory)|characteristic function]]
 
:<math>\varphi_{X+Y}(t) = \operatorname{E}\left(e^{it(X+Y)}\right)</math>
 
of the sum of two independent random variables ''X'' and ''Y'' is just the product of the two separate characteristic functions:
 
:<math>\varphi_X (t) = \operatorname{E}\left(e^{itX}\right), \qquad \varphi_Y(t) = \operatorname{E}\left(e^{itY}\right)</math>
 
of ''X'' and ''Y''.
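This factorization is exactly where independence enters: for independent ''X'' and ''Y'' the expectation of the product factors, so that

:<math>\varphi_{X+Y}(t) = \operatorname{E}\left(e^{itX} e^{itY}\right) = \operatorname{E}\left(e^{itX}\right) \operatorname{E}\left(e^{itY}\right) = \varphi_X(t)\,\varphi_Y(t).</math>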
 
The characteristic function of the normal distribution with expected value μ and variance σ<sup>2</sup> is
 
:<math>\varphi_X(t) = \exp\left(it\mu - {\sigma^2 t^2 \over 2}\right).</math>
 
So
 
:<math>\varphi_{X+Y}(t)=\varphi_X(t) \varphi_Y(t) =\exp\left(it\mu_X - {\sigma_X^2 t^2 \over 2}\right) \exp\left(it\mu_Y - {\sigma_Y^2 t^2 \over 2}\right) = \exp \left( it (\mu_X +\mu_Y) - {(\sigma_X^2 + \sigma_Y^2) t^2 \over 2}\right).</math>
 
This is the characteristic function of the normal distribution with expected value <math>\mu_X + \mu_Y</math> and variance <math>\sigma_X^2+\sigma_Y^2</math>.
 
Finally, recall that a characteristic function uniquely determines a distribution, so the distribution of ''X''+''Y'' must be exactly this normal distribution.
 
====Proof using convolutions====
{{Citation needed|date=May 2011}}
 
For independent random variables ''X'' and ''Y'', the density ''f''<sub>''Z''</sub> of ''Z'' = ''X''+''Y'' is the convolution of the densities ''f''<sub>''X''</sub> and ''f''<sub>''Y''</sub>:
 
:<math>f_Z(z) = \int_{-\infty}^\infty f_Y(z-x) f_X(x) dx</math>
 
Given that ''f''<sub>''X''</sub> and ''f''<sub>''Y''</sub> are normal densities,
 
:<math>f_X(x) = \frac{1}{\sqrt{2\pi}\sigma_X} e^{-{(x-\mu_X)^2 \over 2\sigma_X^2}}</math>
:<math>f_Y(y) = \frac{1}{\sqrt{2\pi}\sigma_Y} e^{-{(y-\mu_Y)^2 \over 2\sigma_Y^2}}</math>
 
Substituting into the convolution:
 
:<math>\begin{align}
f_Z(z) &= \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}\sigma_Y} e^{-{(z-x-\mu_Y)^2 \over 2\sigma_Y^2}} \frac{1}{\sqrt{2\pi}\sigma_X} e^{-{(x-\mu_X)^2 \over 2\sigma_X^2}} dx \\
&= \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}\sqrt{\sigma_X^2+\sigma_Y^2}} \exp \left[ - { (z-(\mu_X+\mu_Y))^2 \over 2(\sigma_X^2+\sigma_Y^2) } \right]  \frac{1}{\sqrt{2\pi}\frac{\sigma_X\sigma_Y}{\sqrt{\sigma_X^2+\sigma_Y^2}}} \exp \left[ - \frac{\left(x-\frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_X^2+\sigma_Y^2}\right)^2}{2\left(\frac{\sigma_X\sigma_Y}{\sqrt{\sigma_X^2+\sigma_Y^2}}\right)^2} \right] dx \\
&= \frac{1}{\sqrt{2\pi(\sigma_X^2+\sigma_Y^2)}} \exp \left[ - { (z-(\mu_X+\mu_Y))^2 \over 2(\sigma_X^2+\sigma_Y^2) } \right] \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}\frac{\sigma_X\sigma_Y}{\sqrt{\sigma_X^2+\sigma_Y^2}}} \exp \left[ - \frac{\left(x-\frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_X^2+\sigma_Y^2}\right)^2}{2\left(\frac{\sigma_X\sigma_Y}{\sqrt{\sigma_X^2+\sigma_Y^2}}\right)^2} \right] dx
\end{align}</math>
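The second equality follows by completing the square in ''x'' in the combined exponent:

:<math>\frac{(z-x-\mu_Y)^2}{\sigma_Y^2} + \frac{(x-\mu_X)^2}{\sigma_X^2} = \frac{(z-(\mu_X+\mu_Y))^2}{\sigma_X^2+\sigma_Y^2} + \frac{\left(x-\frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_X^2+\sigma_Y^2}\right)^2}{\left(\frac{\sigma_X\sigma_Y}{\sqrt{\sigma_X^2+\sigma_Y^2}}\right)^2}.</math>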
 
The expression in the integral is a normal density in ''x'', so the integral evaluates to 1. The desired result follows:
 
:<math>f_Z(z) = \frac{1}{\sqrt{2\pi(\sigma_X^2+\sigma_Y^2)}} \exp \left[ - { (z-(\mu_X+\mu_Y))^2 \over 2(\sigma_X^2+\sigma_Y^2) } \right]</math>
 
====Geometric proof====
{{Citation needed|date=May 2011}}
First consider the normalized case when ''X'', ''Y'' ~ ''N''(0, 1), so that their [[probability density function|PDF]]s are
:<math>f(x) = \sqrt{1/2\pi \,} e^{-x^2/2}</math>
and
:<math>g(y) = \sqrt{1/2\pi\,} e^{-y^2/2}.</math>
Let ''Z'' = ''X''+''Y''. Then the [[cumulative distribution function|CDF]] for ''Z''  will be
:<math>z \mapsto \int_{x+y \leq z} f(x)g(y) \, dx \, dy. </math>
This integral is over the half-plane which lies under the line ''x''+''y'' = ''z''.
 
The key observation is that the function
 
:<math> f(x)g(y) = (1/2\pi)e^{-(x^2 + y^2)/2}\,</math>
 
is radially symmetric. So we rotate the coordinate plane about the origin, choosing new coordinates <math>x',y'</math> such that the line ''x''+''y'' = ''z'' is described by the equation <math> x' = c </math> where <math> c = c(z) </math> is determined geometrically. Because of the radial symmetry, we have <math> f(x)g(y) = f(x')g(y') </math>, and the CDF for ''Z'' is
 
:<math>\int_{x'\leq c, y' \in \reals} f(x')g(y') \, dx' \, dy'.</math>
 
This is easy to integrate; we find that the CDF for ''Z'' is
 
:<math>\int_{-\infty}^{c(z)} f(x') \, dx' = \Phi(c(z)).</math>
 
To determine the value <math>c(z)</math>, note that we rotated the plane so that the line ''x''+''y'' = ''z'' now runs vertically with ''x''-intercept equal to ''c''. So ''c'' is just the distance from the origin to the line ''x''+''y'' = ''z'' along the perpendicular from the origin, which meets the line at its nearest point, in this case <math>(z/2,z/2)\,</math>. So the distance is <math>c = \sqrt{ (z/2)^2 + (z/2)^2 } = z/\sqrt{2}\,</math>, and the CDF for ''Z'' is <math> \Phi(z/\sqrt{2})</math>, i.e., <math>Z = X+Y \sim N(0, 2).</math>
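This conclusion can be spot-checked numerically. A minimal sketch in Python using NumPy and SciPy (the evaluation point is arbitrary):

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
z = 0.7  # arbitrary evaluation point

# empirical CDF of X + Y at z, for independent X, Y ~ N(0, 1)
s = rng.standard_normal(1_000_000) + rng.standard_normal(1_000_000)
print((s <= z).mean())

# Phi(z / sqrt(2)), the CDF of N(0, 2) at z; the two should agree
print(norm.cdf(z / np.sqrt(2)))
</syntaxhighlight>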
 
Now, if ''a'', ''b'' are any real constants (not both zero!) then the probability that <math>aX+bY \leq z</math> is found by the same integral as above, but with the bounding line <math>ax+by =z</math>. The same rotation method works, and in this more general case we find that the closest point on the line to the origin is located a (signed) distance
: <math>\frac{z}{\sqrt{a^2 + b^2}}</math>
away, so that
:<math>aX + bY \sim N(0, a^2 + b^2).</math>
The same argument in higher dimensions shows that if
:<math>X_i \sim N(0,\sigma_i^2), \qquad i=1, \dots, n,</math>
then
:<math>X_1+ \cdots + X_n \sim N(0, \sigma_1^2 + \cdots + \sigma_n^2).</math>
 
Now we are essentially done, because
:<math>X \sim N(\mu,\sigma^2) \Leftrightarrow \frac{1}{\sigma} (X - \mu) \sim N(0,1).</math>
So in general, if
:<math>X_i \sim N(\mu_i, \sigma_i^2), \qquad i=1, \dots, n,</math>
then
: <math>\sum_{i=1}^n a_i X_i \sim N\left(\sum_{i=1}^n a_i \mu_i, \sum_{i=1}^n (a_i \sigma_i)^2 \right).</math>
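Explicitly, writing <math>X_i = \mu_i + \sigma_i Z_i</math> with independent <math>Z_i \sim N(0,1)</math>, we have

:<math>\sum_{i=1}^n a_i X_i = \sum_{i=1}^n a_i \mu_i + \sum_{i=1}^n (a_i \sigma_i) Z_i,</math>

and the second sum is distributed as <math>N\left(0, \sum_{i=1}^n (a_i\sigma_i)^2\right)</math> by the zero-mean case above.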
 
==Correlated random variables==
If the variables ''X'' and ''Y'' are jointly normally distributed random variables, then ''X''&nbsp;+&nbsp;''Y'' is still normally distributed (see [[Multivariate normal distribution]]) and the mean is the sum of the means. However, the variances are not additive when the variables are correlated. Indeed,
 
:<math>\sigma_{X+Y} = \sqrt{\sigma_X^2+\sigma_Y^2+2\rho\sigma_X \sigma_Y},</math>
 
where ρ is the [[correlation]]. In particular, whenever ρ&nbsp;<&nbsp;0, the variance of the sum is less than the sum of the variances of ''X'' and ''Y''. This is perhaps the simplest demonstration of the [[Correlation trading|principle of diversification]].
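The formula for <math>\sigma_{X+Y}</math> follows from the bilinearity of covariance:

:<math>\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y) = \sigma_X^2 + \sigma_Y^2 + 2\rho\sigma_X\sigma_Y.</math>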
 
[[Variance#Sum_of_correlated_variables|Extensions of this result]] can be made for more than two random variables, using the [[covariance matrix]].
 
===Proof===
In this case, one needs to consider
 
:<math>\frac{1}{2 \pi \sigma_x \sigma_y \sqrt{1-\rho^2}} \iint_{x\,y} \exp \left[ -\frac{1}{2(1-\rho^2)} \left(\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2} - \frac{2 \rho x y}{\sigma_x\sigma_y}\right)\right] \delta(z - (x+y))\, \operatorname{d}x\,\operatorname{d}y. </math>
 
As above, one makes the substitution <math>y\rightarrow z-x</math>.
 
This integral is more complicated to simplify analytically, but can be done easily using a symbolic mathematics program. The probability distribution ''f''<sub>''Z''</sub>(''z'') is given in this case by
 
:<math>f_Z(z)=\frac{1}{\sqrt{2 \pi}\sigma_+ }\exp\left(-\frac{z^2}{2\sigma_+^2}\right)</math>
where
:<math>\sigma_+ = \sqrt{\sigma_x^2+\sigma_y^2+2\rho\sigma_x \sigma_y}.</math>
 
If one considers instead ''Z'' = ''X''&nbsp;−&nbsp;''Y'', then one obtains
:<math>f_Z(z)=\frac{1}{\sqrt{2\pi(\sigma_x^2+\sigma_y^2-2\rho\sigma_x \sigma_y)}}\exp\left(-\frac{z^2}{2(\sigma_x^2+\sigma_y^2-2\rho\sigma_x \sigma_y)}\right)</math>
which can likewise be written in terms of
:<math>\sigma_-=\sqrt{\sigma_x^2+\sigma_y^2-2\rho\sigma_x \sigma_y}.</math>
 
The standard deviation of each distribution can be read off by comparison with the standard normal distribution.
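Both expressions can be checked with a quick Monte Carlo simulation rather than a symbolic computation. A minimal sketch in Python using NumPy (the parameter values are purely illustrative):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
sigma_x, sigma_y, rho = 1.5, 0.8, -0.4  # illustrative parameters

cov = [[sigma_x**2,              rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T

print((x + y).std())  # close to sigma_plus
print(np.sqrt(sigma_x**2 + sigma_y**2 + 2 * rho * sigma_x * sigma_y))
print((x - y).std())  # close to sigma_minus
print(np.sqrt(sigma_x**2 + sigma_y**2 - 2 * rho * sigma_x * sigma_y))
</syntaxhighlight>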
 
==See also==
* [[Algebra of random variables]]
* [[Stable distribution]]
* [[Standard error (statistics)]]
* [[Ratio distribution]]
* [[Product distribution]]
* [[Slash distribution]]
* [[List of convolutions of probability distributions]]
* Not to be confused with: [[Mixture distribution]]
 
{{DEFAULTSORT:Sum Of Normally Distributed Random Variables}}
[[Category:Continuous distributions|Normal]]
[[Category:Normal distribution]]
