In physics, the '''Tsallis entropy''' is a generalization of the standard [[Entropy (statistical thermodynamics)|Boltzmann–Gibbs entropy]]. It was introduced in 1988 by [[Constantino Tsallis]]<ref name=tsallis1988>{{Cite doi|10.1007/BF01016429}}</ref> as a basis for generalizing standard statistical mechanics. The physical relevance of the Tsallis entropy has occasionally been debated in the scientific literature. From 2000 onwards, however, an increasingly wide spectrum of natural, artificial and social complex systems has been identified that confirms the predictions and consequences derived from this nonadditive entropy, such as nonextensive statistical mechanics,<ref name=book2009>{{cite book|last=Tsallis|first=Constantino|title=Introduction to nonextensive statistical mechanics : approaching a complex world|year=2009|publisher=Springer|location=New York|isbn=978-0-387-85358-1|edition=Online-Ausg.}}</ref> which generalizes the Boltzmann–Gibbs theory.
 
Among the various experimental verifications and applications presently available in the literature, the following deserve special mention:
# The distribution characterizing the motion of cold atoms in dissipative optical lattices, predicted in 2003<ref>{{Cite doi|10.1103/PhysRevA.67.051402}}</ref> and observed in 2006.<ref>{{Cite doi|10.1103/PhysRevLett.96.110601}}</ref>
# The fluctuations of the magnetic field in the solar wind enabled the calculation of the q-triplet (or Tsallis triplet).<ref>{{Cite doi|10.1016/j.physa.2005.06.065}}</ref>
# The velocity distributions in driven dissipative dusty plasma.<ref>{{Cite doi|10.1103/PhysRevLett.100.055003}}</ref>
# [[Spin glass]] relaxation.<ref>{{Cite doi|10.1103/PhysRevLett.102.097202}}</ref>
# Trapped ion interacting with a classical buffer gas.<ref>{{Cite doi|10.1103/PhysRevLett.102.063001}}</ref>
# High energy collisional experiments at LHC/CERN (CMS, ATLAS and ALICE detectors)<ref>{{Cite doi|10.1103/PhysRevLett.105.022002}}</ref><ref>{{Cite doi|10.1007/JHEP08(2011)086}}</ref> and RHIC/Brookhaven (STAR and PHENIX detectors).<ref>{{Cite doi|10.1103/PhysRevD.83.052004}}</ref>
 
Among the various theoretical results available which clarify the physical conditions under which the Tsallis entropy and its associated statistics apply, the following can be singled out:
# Anomalous diffusion.<ref>{{Cite doi|10.1016/0378-4371(95)00211-1}}</ref><ref>{{Cite doi|10.1103/PhysRevE.54.R2197}}</ref>
# Uniqueness theorem.<ref>{{Cite doi|10.1016/S0375-9601(00)00337-6}}</ref>
# Sensitivity to initial conditions and entropy production at the edge of chaos.<ref>{{Cite doi|10.1103/PhysRevLett.80.53}}</ref><ref>{{Cite doi|10.1103/PhysRevE.69.045202}}</ref>
# Probability sets which make the nonadditive Tsallis entropy extensive in the thermodynamical sense.<ref>{{Cite doi|10.1073/pnas.0503807102}}</ref>
# Strongly quantum entangled systems and thermodynamics.<ref>{{Cite doi|10.1103/PhysRevE.78.021102}}</ref>
# Thermostatistics of overdamped motion of interacting particles.<ref>{{Cite doi|10.1103/PhysRevLett.105.260601}}</ref><ref>{{Cite doi|10.1103/PhysRevE.85.021146}}</ref>
# Nonlinear generalizations of the Schrödinger, Klein–Gordon and Dirac equations.<ref>{{Cite doi|10.1103/PhysRevLett.106.140601}}</ref>
 
For further details, a bibliography is available at http://tsallis.cat.cbpf.br/biblio.htm
 
Given a discrete set of probabilities <math>\{p_i\}</math> with the condition <math>\sum_i p_i=1</math>, and <math>q</math> any real number, the '''Tsallis entropy''' is defined as
 
:<math>S_q(\{p_i\}) = {1 \over q - 1} \left( 1 - \sum_i p_i^q \right),</math>
 
where <math>q</math> is a real parameter sometimes called the ''entropic index'' (the Boltzmann constant is set to 1 throughout).
In the limit as <math>q \to 1</math>, the usual Boltzmann–Gibbs entropy is recovered, namely

:<math>S_{BG} = S_1(p) = -\sum_i p_i \ln p_i .</math>
 
For continuous probability distributions, we define the entropy as
 
:<math>S_q[p] = {1 \over q - 1} \left( 1 - \int (p(x))^q\, dx \right),</math>
 
where <math>p(x)</math> is a [[probability density function]].
 
The Tsallis entropy has been used together with the [[principle of maximum entropy]] to derive the [[Tsallis distribution]].
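
As a numerical illustration of the discrete definition and of the <math>q \to 1</math> limit, the following minimal Python sketch (with the Boltzmann constant set to 1 and an arbitrary illustrative distribution) computes <math>S_q</math> and shows that for <math>q</math> close to 1 it approaches the Boltzmann–Gibbs value:

<syntaxhighlight lang="python">
import numpy as np

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q of a probability vector p (Boltzmann constant = 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                               # zero probabilities contribute nothing
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))          # Boltzmann-Gibbs limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.3, 0.2]                            # illustrative example distribution
print(tsallis_entropy(p, 2.0))                 # S_2 = 1 - sum_i p_i^2 = 0.62
print(tsallis_entropy(p, 1.001))               # close to ...
print(tsallis_entropy(p, 1.0))                 # ... the Boltzmann-Gibbs entropy
</syntaxhighlight>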
 
== Various relationships ==
 
The discrete Tsallis entropy satisfies
 
:<math>S_q = -\lim_{x\rightarrow 1}D_q \sum_i p_i^x </math>
 
where ''D''<sub>''q''</sub> is the [[q-derivative]] with respect to ''x''. This may be compared to the standard entropy formula:
 
:<math>S = -\lim_{x\rightarrow 1}\frac{d}{dx} \sum_i p_i^x </math>
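
A quick numerical check of this identity is sketched below, assuming the q-derivative is the Jackson derivative <math>D_q f(x) = \frac{f(qx)-f(x)}{(q-1)x}</math> and using an illustrative distribution (Boltzmann constant set to 1):

<syntaxhighlight lang="python">
import numpy as np

def q_derivative(f, x, q):
    """Jackson q-derivative: D_q f(x) = (f(q*x) - f(x)) / ((q - 1)*x)."""
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

p = np.array([0.5, 0.3, 0.2])                  # illustrative distribution
q = 2.0

f = lambda x: np.sum(p ** x)                   # f(x) = sum_i p_i^x

x = 1.0 - 1e-9                                 # approach the limit x -> 1
lhs = -q_derivative(f, x, q)                   # -D_q f(x) near x = 1
rhs = (1.0 - np.sum(p ** q)) / (q - 1.0)       # S_q from the definition
print(lhs, rhs)                                # the two values agree
</syntaxhighlight>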
 
== Non-additivity ==
 
Given two independent systems ''A'' and ''B'', for which the joint [[probability density function|probability density]] satisfies
 
:<math>p(A, B) = p(A) p(B),\,</math>
 
the Tsallis entropy of this system satisfies
 
:<math>S_q(A,B) = S_q(A) + S_q(B) + (1-q)S_q(A) S_q(B).\,</math>
 
From this result, it is evident that the parameter <math>|1-q|</math> is a measure of the departure from additivity. In the limit when ''q'' = 1,
 
:<math>S(A,B) = S(A) + S(B),\,</math>
 
which is what is expected for an additive system. This property is sometimes referred to as "pseudo-additivity".
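
This pseudo-additivity is easy to verify numerically; the following is a minimal sketch for two independent discrete systems (Boltzmann constant set to 1, illustrative probabilities):

<syntaxhighlight lang="python">
import numpy as np

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy with the Boltzmann constant set to 1 (q != 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

q = 0.7
p_A = np.array([0.6, 0.4])                     # marginal distribution of system A
p_B = np.array([0.2, 0.3, 0.5])                # marginal distribution of system B

p_AB = np.outer(p_A, p_B).ravel()              # joint distribution: p(A, B) = p(A) p(B)

lhs = tsallis_entropy(p_AB, q)
rhs = (tsallis_entropy(p_A, q) + tsallis_entropy(p_B, q)
       + (1 - q) * tsallis_entropy(p_A, q) * tsallis_entropy(p_B, q))
print(lhs, rhs)                                # equal up to floating-point rounding
</syntaxhighlight>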
 
== Exponential families ==
Many common distributions, such as the normal distribution, belong to the statistical exponential families.
The Tsallis entropy of an exponential family can be written (Nielsen & Nock, 2011) as

:<math>H^T_q(p_F(x;\theta)) =  \frac{1}{1-q} \left((e^{F(q\theta)-q F(\theta)}) E_p[e^{(q-1)k(x)}]-1  \right)</math>
where ''F'' is the log-normalizer and ''k'' the term indicating the carrier measure.
For the multivariate normal distribution, the term ''k'' is zero, and therefore the Tsallis entropy is available in closed form.
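
As a concrete illustration, the continuous Tsallis entropy of a univariate normal distribution with variance <math>\sigma^2</math> reduces (for <math>q>0</math>, Boltzmann constant set to 1) to <math>S_q = \frac{1}{q-1}\left(1 - (2\pi\sigma^2)^{(1-q)/2}\, q^{-1/2}\right)</math>, since <math>\int p(x)^q\, dx = (2\pi\sigma^2)^{(1-q)/2}\, q^{-1/2}</math>. The minimal sketch below checks this expression against direct numerical integration of the continuous definition; it does not implement the general exponential-family formula above:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

def tsallis_gaussian_closed_form(sigma, q):
    """Closed-form continuous Tsallis entropy of N(mu, sigma^2), Boltzmann constant = 1."""
    return (1.0 - (2 * np.pi * sigma**2) ** ((1 - q) / 2) / np.sqrt(q)) / (q - 1.0)

def tsallis_gaussian_numeric(sigma, q, mu=0.0):
    """Numerical evaluation of S_q[p] = (1 - int p(x)^q dx) / (q - 1)."""
    pdf = lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    integral, _ = quad(lambda x: pdf(x) ** q, -np.inf, np.inf)
    return (1.0 - integral) / (q - 1.0)

sigma, q = 1.5, 2.0                            # illustrative parameters
print(tsallis_gaussian_closed_form(sigma, q))
print(tsallis_gaussian_numeric(sigma, q))      # agrees with the closed form
</syntaxhighlight>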
 
== Generalised entropies ==
A number of interesting physical systems<ref name=pnas>{{Cite doi|10.1073/pnas.1109844108}}</ref> obey entropic functionals that are more general than the standard Tsallis entropy, and several physically meaningful generalisations have therefore been introduced. The two most general of these are Superstatistics, introduced by C. Beck and E. G. D. Cohen in 2003,<ref name=superstatistics>{{Cite doi|10.1016/S0378-4371(03)00019-0}}</ref> and Spectral Statistics, introduced by G. A. Tsekouras and [[Constantino Tsallis]] in 2005.<ref name=spectral>{{Cite doi|10.1103/PhysRevE.71.046144}}</ref> Both of these entropic forms have Tsallis and Boltzmann–Gibbs statistics as special cases; Spectral Statistics has been proven to contain at least Superstatistics, and it has been conjectured to also cover some additional cases.
 
== See also ==
*[[Rényi entropy]]
*[[Tsallis distribution]]
 
==References==
 
<references />
 
== External links ==
*[http://www.cscs.umich.edu/~crshalizi/notabene/tsallis.html Tsallis Statistics, Statistical Mechanics for Non-extensive Systems and Long-Range Interactions]
 
{{Tsallis}}
 
[[Category:Probability theory]]
[[Category:Entropy and information]]
[[Category:Thermodynamic entropy]]
[[Category:Information theory]]
