In [[probability theory]] and [[information theory]], the '''variation of information''' or '''shared information distance''' is a measure of the distance between two clusterings ([[Partition of a set|partitions of elements]]). It is closely related to [[mutual information]]; indeed, it is a simple linear expression involving the mutual information. Unlike the mutual information, however, the variation of information is a true [[metric (mathematics)|metric]], in that it obeys the [[triangle inequality]]. Moreover, it is a [[universal metric]], in that if any other distance measure judges two items as close, then the variation of information will also judge them close.<ref>Alexander Kraskov, Harald Stögbauer, Ralph G. Andrzejak, and [[Peter Grassberger]], "Hierarchical Clustering Based on Mutual Information", (2003) ''[http://arxiv.org/abs/q-bio/0311039 ArXiv q-bio/0311039]''</ref>
 
==Background==
{{Empty section|date=July 2010}}
 
==Definition==
Suppose we have two clusterings (divisions of a [[Set (mathematics)|set]] into several [[subset]]s) <math>X</math> and <math>Y</math>, where <math>X = \{X_{1}, X_{2}, \ldots, X_{k}\}</math>, <math>p_{i} = |X_{i}| / n</math>, and <math>n = \sum_{i} |X_{i}|</math>. Then the variation of information between the two clusterings is:
 
:<math>VI(X; Y) = H(X) + H(Y) - 2I(X, Y)</math>
 
where <math>H(X)</math> is the [[entropy (information theory)|entropy]] of <math>X</math> and <math>I(X, Y)</math> is the [[mutual information]] between <math>X</math> and <math>Y</math>.
 
This is completely equivalent to the [[Mutual information#Metric|shared information distance]].
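As an illustration (not part of the original article), the definition above can be computed directly from per-element cluster labels. The function name and the label-list representation are choices made for this sketch; it uses the equivalent form <math>VI = H(X|Y) + H(Y|X)</math>, which follows from <math>H(X) + H(Y) - 2I(X, Y)</math>, with natural logarithms (units of nats):

```python
from collections import Counter
from math import log

def variation_of_information(labels_x, labels_y):
    """VI(X; Y) = H(X) + H(Y) - 2 I(X, Y) for two clusterings of the
    same n elements, each given as a list of per-element cluster labels."""
    n = len(labels_x)
    assert n == len(labels_y)
    # Joint and marginal cluster sizes.
    joint = Counter(zip(labels_x, labels_y))
    sizes_x = Counter(labels_x)
    sizes_y = Counter(labels_y)
    vi = 0.0
    for (i, j), count in joint.items():
        r = count / n          # joint probability of cell (i, j)
        p = sizes_x[i] / n     # p_i, marginal for clustering X
        q = sizes_y[j] / n     # q_j, marginal for clustering Y
        # VI = -sum_{ij} r_{ij} [log(r_{ij}/p_i) + log(r_{ij}/q_j)]
        vi -= r * (log(r / p) + log(r / q))
    return vi

# Identical clusterings (up to relabeling) are at distance zero.
print(variation_of_information([0, 0, 1, 1], [1, 1, 0, 0]))  # prints 0.0
```

Because only the partition structure matters, relabeling the clusters leaves the distance unchanged, and two statistically independent clusterings of the same set reach the maximum <math>H(X) + H(Y)</math>.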
 
==References==
<references/>
 
== Further reading ==
* {{cite journal|last=Arabie|first=P.|coauthors=Boorman, S. A.|title=Multidimensional scaling of measures of distance between partitions|journal=Journal of Mathematical Psychology|year=1973|volume=10|pages=148–203}}
* {{Cite journal|last=Meila|first=Marina | title=Comparing Clusterings by the Variation of Information |journal=Learning Theory and Kernel Machines|year=2003|pages=173–187|doi=10.1007/978-3-540-45167-9_14}}
* {{cite doi|10.1016/j.jmva.2006.11.013}}
* {{Cite web
  | last = Kingsford
  | first = Carl
  | title = Information Theory Notes
  | year = 2009
  | url = http://www.cs.umd.edu/class/spring2009/cmsc858l/InfoTheoryHints.pdf
  | format = PDF
  | accessdate = 22 September 2009}}
{{Use dmy dates|date=September 2010}}
 
== External links ==
* [https://github.com/bjoern-andres/partition-comparison C++ implementation with MATLAB mex files]
 
{{DEFAULTSORT:Variation Of Information}}
[[Category:Entropy and information]]
