[[Image:Entropy-mutual-information-relative-entropy-relation-diagram.svg|thumb|256px|right|Individual (H(X),H(Y)), joint (H(X,Y)), and conditional entropies for a pair of correlated subsystems X,Y with mutual information I(X; Y).]]
In [[information theory]], the '''conditional entropy''' (or '''equivocation''') quantifies the amount of information needed to describe the outcome of a [[random variable]] <math>Y</math> given that the value of another random variable <math>X</math> is known.
Here, information is measured in [[bit]]s, [[nat (information)|nat]]s, or [[ban (information)|ban]]s.
The ''entropy of <math>Y</math> conditioned on <math>X</math>'' is written as <math>H(Y|X)</math>.
== Definition ==
If <math>H(Y|X=x)</math> is the entropy of the variable <math>Y</math> conditioned on the variable <math>X</math> taking a certain value <math>x</math>, then <math>H(Y|X)</math> is the result of averaging <math>H(Y|X=x)</math> over all possible values <math>x</math> that <math>X</math> may take.

Given discrete random variables <math>X</math> with [[support (mathematics)|support]] <math>\mathcal X</math> and <math>Y</math> with support <math>\mathcal Y</math>, the conditional entropy of <math>Y</math> given <math>X</math> is defined as:<ref>{{cite book|last1=Cover|first1=Thomas M.|last2=Thomas|first2=Joy A.|title=Elements of Information Theory|year=1991|publisher=Wiley|location=New York|isbn=0-471-06259-6}}</ref>
::<math>\begin{align}
H(Y|X)\ &\equiv \sum_{x\in\mathcal X}\,p(x)\,H(Y|X=x)\\
&=\sum_{x\in\mathcal X} \left(p(x)\sum_{y\in\mathcal Y}\,p(y|x)\,\log\, \frac{1}{p(y|x)}\right)\\
&=-\sum_{x\in\mathcal X}\sum_{y\in\mathcal Y}\,p(x,y)\,\log\,p(y|x)\\
&=-\sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log\,p(y|x)\\
&=\sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log \frac {p(x)} {p(x,y)}.
\end{align}</math>
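
The definition translates directly into a short computation. The following is a minimal sketch in Python (the function name <code>conditional_entropy</code> and the dictionary representation of the joint distribution are illustrative choices, not a standard API); it computes <math>H(Y|X)</math> in bits from a finite joint probability mass function:

<syntaxhighlight lang="python">
from math import log2

def conditional_entropy(joint):
    """H(Y|X) in bits, for a joint pmf given as a dict {(x, y): p(x, y)}."""
    # Marginal of X: p(x) = sum over y of p(x, y)
    p_x = {}
    for (x, _y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    # H(Y|X) = -sum_{x,y} p(x, y) * log2(p(y|x)), with p(y|x) = p(x, y) / p(x);
    # zero-probability terms are skipped, matching the 0 log 0 = 0 convention.
    return -sum(p * log2(p / p_x[x])
                for (x, _y), p in joint.items() if p > 0)

# Example: X is a fair bit and Y copies X with probability 3/4.
joint = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/8, (1, 1): 3/8}
print(conditional_entropy(joint))  # about 0.811 bits
</syntaxhighlight>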

''Note:'' The supports of ''X'' and ''Y'' can be replaced by their [[domain of a function|domains]] if it is understood that <math>0 \log 0</math> should be treated as being equal to zero.

<math>H(Y|X)=0</math> if and only if the value of <math>Y</math> is completely determined by the value of <math>X</math>. At the other extreme, <math>H(Y|X) = H(Y)</math> if and only if <math>Y</math> and <math>X</math> are [[independent random variables]].
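
For example, if <math>Y = X</math> for a fair bit <math>X</math>, then every conditional probability <math>p(y|x)</math> is 0 or 1 and <math>H(Y|X) = 0</math>; if instead <math>Y</math> is a second fair bit independent of <math>X</math>, then <math>H(Y|X) = H(Y) = 1</math> bit.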

==Chain rule==
Assume that the combined system determined by two random variables <math>X</math> and <math>Y</math> has [[joint entropy]] <math>H(X,Y)</math>, that is, we need <math>H(X,Y)</math> bits of information on average to describe its exact state.
Now if we first learn the value of <math>X</math>, we have gained <math>H(X)</math> bits of information.
Once <math>X</math> is known, we only need <math>H(X,Y)-H(X)</math> bits to describe the state of the whole system.
This quantity is exactly <math>H(Y|X)</math>, which gives the ''chain rule'' of conditional entropy:

:<math>H(Y|X)\,=\,H(X,Y)-H(X) \, .</math>

Formally, the chain rule indeed follows from the above definition of conditional entropy:
:<math>\begin{align}
H(Y|X) &= \sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log \frac {p(x)} {p(x,y)}\\
&= -\sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log\,p(x,y) + \sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log\,p(x) \\
&= H(X,Y) + \sum_{x \in \mathcal X} p(x)\log\,p(x) \\
&= H(X,Y) - H(X).
\end{align}</math>
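
As a sanity check, the identity can be verified numerically. The sketch below (again with illustrative names, assuming the same dictionary representation of a joint pmf as above) computes <math>H(X,Y)</math>, <math>H(X)</math>, and <math>H(Y|X)</math> for a small example and confirms the chain rule:

<syntaxhighlight lang="python">
from math import log2

# Joint pmf of (X, Y) as a dict {(x, y): p(x, y)}; any valid pmf works here.
joint = {(0, 0): 1/2, (0, 1): 1/4, (1, 1): 1/4}

def entropy(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distribution of X
p_x = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p

h_xy = entropy(joint)                       # H(X,Y)
h_x = entropy(p_x)                          # H(X)
h_y_given_x = -sum(p * log2(p / p_x[x])     # H(Y|X) from the definition
                   for (x, y), p in joint.items() if p > 0)

# Chain rule: H(Y|X) = H(X,Y) - H(X)
assert abs(h_y_given_x - (h_xy - h_x)) < 1e-12
</syntaxhighlight>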

==Generalization to quantum theory==
In [[quantum information theory]], the conditional entropy is generalized to the [[conditional quantum entropy]]. The latter can take negative values, unlike its classical counterpart.

==Other properties==
For any <math>X</math> and <math>Y</math>:

: <math>H(X|Y) \le H(X) \, </math>

: <math>H(X,Y) = H(X|Y) + H(Y|X) + I(X;Y),\,</math>

: <math>I(X;Y) \le H(X),\,</math>

where <math>I(X;Y)</math> is the [[mutual information]] between <math>X</math> and <math>Y</math>.

For independent <math>X</math> and <math>Y</math>:

: <math>H(Y|X) = H(Y)\text{ and }H(X|Y) = H(X) \, </math>

Although the specific conditional entropy <math>H(X|Y=y)</math> can be either less or greater than <math>H(X|Y)</math>, <math>H(X|Y=y)</math> can never exceed <math>H(X)</math> when <math>X</math> is uniformly distributed.
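
As an illustration of the last point, suppose <math>X</math> and <math>Y</math> have the joint distribution <math>p(0,0)=0.98</math>, <math>p(0,1)=p(1,1)=0.01</math>. Then <math>H(X) \approx 0.08</math> bits, since <math>X = 1</math> is rare; yet observing <math>Y=1</math> leaves <math>X</math> uniform over <math>\{0,1\}</math>, so <math>H(X|Y=1) = 1</math> bit exceeds <math>H(X)</math>. On average, however, <math>H(X|Y) = 0.02</math> bits, which is at most <math>H(X)</math>, consistent with the first property above.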

==References==
{{Reflist}}

== See also ==
* [[Entropy (information theory)]]
* [[Mutual information]]
* [[Conditional quantum entropy]]
* [[Variation of information]]
* [[Entropy power inequality]]
* [[Likelihood function]]

[[Category:Entropy and information]]
[[Category:Information theory]]