In [[probability theory]], a '''pairwise independent''' collection of [[random variable]]s is a set of random variables any two of which are [[statistical independence|independent]].<ref>Gut, A. (2005) ''Probability: a Graduate Course'', Springer-Verlag. ISBN 0-387-27332-8. pp. 71–72.</ref> Any collection of [[Mutual independence|mutually independent]] random variables is pairwise independent, but some pairwise independent collections are not mutually independent. Pairwise independent random variables with finite [[variance]] are [[uncorrelated]].
Two random variables ''X'' and ''Y'' are '''independent''' if and only if the random vector (''X'', ''Y'') with [[joint distribution|joint]] cumulative distribution function (CDF) <math>F_{X,Y}(x,y)</math> satisfies
:<math>F_{X,Y}(x,y) = F_X(x) F_Y(y),</math>
or equivalently, if their joint density <math>f_{X,Y}(x,y)</math> exists, it satisfies
:<math>f_{X,Y}(x,y) = f_X(x) f_Y(y).</math>
That is, the joint distribution is equal to the product of the marginal distributions.<ref>{{cite book|title=Introduction to Mathematical Statistics|author = Hogg, R. V., McKean, J. W., Craig, A. T.| edition=6| year=2005| publisher=Pearson Prentice Hall|location=Upper Saddle River, NJ|isbn=0-13-008507-3}} Definition 2.5.1, page 109.</ref>
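For distributions with finite support, this factorization can be checked directly. The following Python sketch is only an illustration (the function name and the dictionary representation of the distribution are choices made here, not taken from the cited sources): it compares each joint probability with the product of the corresponding marginal probabilities.

<syntaxhighlight lang="python">
from itertools import product

def is_independent(joint, tol=1e-12):
    """joint: dict mapping (x, y) -> P(X = x, Y = y)."""
    xs = {x for x, _ in joint}   # support of X
    ys = {y for _, y in joint}   # support of Y
    # Marginal pmfs of X and Y obtained by summing out the other coordinate.
    px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in xs}
    py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in ys}
    # Independent exactly when every joint probability factors into the marginals.
    return all(abs(joint.get((x, y), 0.0) - px[x] * py[y]) <= tol
               for x, y in product(xs, ys))

# Two independent fair coin tosses: every joint probability is 1/4 = (1/2)*(1/2).
fair_pair = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(is_independent(fair_pair))   # True
</syntaxhighlight>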
In practice, the modifier "mutual" is usually dropped, so that '''independence''' means '''mutual independence''' unless the context indicates otherwise. A statement such as "''X'', ''Y'', ''Z'' are independent random variables" means that ''X'', ''Y'', ''Z'' are mutually independent.
==Example==
Pairwise independence does not imply mutual independence, as shown by the following example attributed to S. Bernstein.<ref>{{cite book|title=Introduction to Mathematical Statistics|author = Hogg, R. V., McKean, J. W., Craig, A. T.| edition=6| year=2005| publisher=Pearson Prentice Hall|location=Upper Saddle River, NJ|isbn=0-13-008507-3}} Remark 2.6.1, p. 120.</ref>
Suppose ''X'' and ''Y'' are two independent tosses of a fair coin, where we designate 1 for heads and 0 for tails. Let the third random variable ''Z'' be equal to 1 if exactly one of those coin tosses resulted in "heads", and 0 otherwise. Then the triple (''X'', ''Y'', ''Z'') has the following [[joint probability distribution|joint probability distribution]]:
:<math>(X,Y,Z)=\left\{\begin{matrix}
(0,0,0) & \text{with probability}\ 1/4, \\
(0,1,1) & \text{with probability}\ 1/4, \\
(1,0,1) & \text{with probability}\ 1/4, \\
(1,1,0) & \text{with probability}\ 1/4.
\end{matrix}\right.</math>
Here the marginal probability distributions are identical: <math>f_X(0)=f_Y(0)=f_Z(0)=1/2,</math> and
<math>f_X(1)=f_Y(1)=f_Z(1)=1/2.</math> The bivariate distributions also agree: <math> f_{X,Y}=f_{X,Z}=f_{Y,Z}, </math> where <math>f_{X,Y}(0,0)=f_{X,Y}(0,1)=f_{X,Y}(1,0)=f_{X,Y}(1,1)=1/4.</math>
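For instance, one of these pairwise factorizations can be read directly off the table above:
:<math>f_{X,Z}(0,0)=\Pr\big((X,Y,Z)=(0,0,0)\big)=1/4=f_X(0)\,f_Z(0).</math>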
Since each of the pairwise joint distributions equals the product of their respective marginal distributions, the variables are pairwise independent:
* ''X'' and ''Y'' are independent, and
* ''X'' and ''Z'' are independent, and
* ''Y'' and ''Z'' are independent.
However, ''X'', ''Y'', and ''Z'' are '''not''' [[Mutually_independent#More_than_two_random_variables|mutually independent]], since <math>f_{X,Y,Z}(x,y,z) \neq f_X(x)f_Y(y)f_Z(z)</math>; for example, <math>f_{X,Y,Z}(0,0,0)=1/4</math>, whereas <math>f_X(0)f_Y(0)f_Z(0)=1/8</math>. Note that any of <math>\{X,Y,Z\}</math> is completely determined by the other two (any of ''X'', ''Y'', ''Z'' is the [[modular arithmetic|sum (modulo 2)]] of the others). That is as far from independence as random variables can get.
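This behavior is easy to verify numerically. The following Python sketch is an illustration only (the dictionary encoding of the distribution and the helper name <code>marginal</code> are choices made here, not part of the cited sources): it confirms that every bivariate marginal of the table above factors, while the full trivariate distribution does not.

<syntaxhighlight lang="python">
from itertools import combinations

# Joint distribution of (X, Y, Z) from the table above.
joint = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

def marginal(indices):
    """Marginal pmf of the coordinates listed in `indices`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in indices)
        out[key] = out.get(key, 0.0) + p
    return out

singles = [marginal([i]) for i in range(3)]   # marginals of X, Y, Z

# Pairwise independence: every bivariate pmf factors into its two marginals.
for i, j in combinations(range(3), 2):
    pair = marginal([i, j])
    assert all(abs(p - singles[i][(a,)] * singles[j][(b,)]) < 1e-12
               for (a, b), p in pair.items())

# No mutual independence: the trivariate pmf does not factor.
# P(X=0, Y=0, Z=0) = 1/4, but the product of the three marginals is 1/8.
print(joint[(0, 0, 0)], singles[0][(0,)] * singles[1][(0,)] * singles[2][(0,)])
</syntaxhighlight>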
==Generalization==
More generally, we can talk about ''k''-wise independence, for any ''k'' ≥ 2. The idea is similar: a set of [[random variable]]s is ''k''-wise independent if every subset of size ''k'' of those variables is independent. ''k''-wise independence has been used in theoretical computer science, for example to prove a theorem about the problem [[MAXEkSAT]].
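A standard way to generate ''k''-wise independent values in computer science (a well-known construction, not described in the sources cited here) is to evaluate a uniformly random polynomial of degree at most ''k'' − 1 over a prime field. The Python sketch below assumes that construction; the prime modulus and function names are illustrative choices.

<syntaxhighlight lang="python">
import random

def random_kwise_hash(k, p=2_147_483_647):
    """Return a function drawn from a k-wise independent family.

    p must be prime (2**31 - 1 is prime). The coefficients of a random
    polynomial of degree at most k - 1 over the field Z/pZ are drawn uniformly,
    so the values at any k distinct inputs are uniform and mutually independent.
    """
    coeffs = [random.randrange(p) for _ in range(k)]
    def h(x):
        # Horner evaluation of the polynomial modulo p.
        value = 0
        for c in reversed(coeffs):
            value = (value * x + c) % p
        return value
    return h

h = random_kwise_hash(k=3)
# Any 3 distinct inputs receive jointly independent, uniform values;
# 4 or more values need not be mutually independent.
print(h(0), h(1), h(2), h(3))
</syntaxhighlight>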
== See also ==
* [[Pairwise]]

== References ==
{{reflist}}

[[Category:Probability theory]]
[[Category:Theory of probability distributions]]
[[Category:Statistical dependence]]