In [[mathematics]], the '''Khintchine inequality''', named after [[Aleksandr Khinchin]] (whose name is transliterated into the Roman alphabet in several ways), is a theorem from [[probability]] that is also frequently used in [[mathematical analysis|analysis]]. Heuristically, it says that if we pick <math> N </math> [[complex numbers]] <math> x_1,\dots,x_N \in\mathbb{C}</math> and add them together, each multiplied by a random sign <math>\pm 1 </math>, then the [[expected value]] of the [[absolute value|modulus]] of the sum will not be too far from <math> \sqrt{|x_1|^{2}+\cdots + |x_N|^{2}}</math>.
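For instance, if <math>N=2</math> and <math>x_1=x_2=1</math>, the signed sum <math>\epsilon_1+\epsilon_2</math> takes the values <math>2,0,0,-2</math>, each with probability <math>\tfrac14</math>, so its expected modulus is <math>1</math>, which is indeed comparable to <math>\sqrt{|x_1|^{2}+|x_2|^{2}}=\sqrt{2}</math>.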
 
==Statement of theorem==
 
Let <math> \{\epsilon_{n}\}_{n=1}^{N} </math> be [[i.i.d.]] [[random variables]]
with <math>P(\epsilon_n=\pm1)=\frac12</math> for every <math>n=1\ldots N</math>,
i.e., a sequence with [[Rademacher distribution]].
Let <math> 0<p<\infty</math> and let <math> x_1,\dots,x_N\in \mathbb{C}</math>.
Then
 
:<math> A_p \left( \sum_{n=1}^{N}|x_{n}|^{2} \right)^{\frac{1}{2}} \leq \left(\mathbb{E}\Big|\sum_{n=1}^{N}\epsilon_{n}x_{n}\Big|^{p} \right)^{1/p}  \leq B_p \left(\sum_{n=1}^{N}|x_{n}|^{2}\right)^{\frac{1}{2}} </math>
 
for some constants <math> A_p,B_p>0 </math> depending only on <math>p</math> (see [[Expected value]] for notation). The sharp values of the constants <math>A_p,B_p</math> were found by Haagerup (Ref. 2; see Ref. 3 for a simpler proof).
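
The two-sided bound can also be checked numerically by Monte Carlo simulation. The following sketch (assuming [[NumPy]] is available; the coefficient vector <code>x</code>, the exponent <code>p</code>, and the number of trials are arbitrary illustrative choices, not taken from the references) estimates the middle quantity by sampling Rademacher signs and compares it with the square root of <math>\sum_{n}|x_n|^2</math>; it does not attempt to compute the sharp constants <math>A_p,B_p</math>.

<syntaxhighlight lang="python">
import numpy as np

# Monte Carlo sanity check of the Khintchine inequality (illustrative sketch only).
# For fixed complex numbers x_1, ..., x_N and an exponent p, estimate
#     (E |sum_n eps_n x_n|^p)^(1/p)
# with i.i.d. Rademacher signs eps_n and compare it with (sum_n |x_n|^2)^(1/2).

rng = np.random.default_rng(0)
x = np.array([1.0 + 2.0j, -0.5j, 3.0, 0.25 + 0.25j])   # arbitrary example coefficients
p = 1.0                                                 # any 0 < p < infinity
trials = 100_000

signs = rng.choice([-1.0, 1.0], size=(trials, x.size))  # Rademacher variables eps_n
sums = signs @ x                                         # each row: sum_n eps_n x_n
lhs = np.mean(np.abs(sums) ** p) ** (1.0 / p)            # empirical (E|...|^p)^(1/p)
rhs = np.sqrt(np.sum(np.abs(x) ** 2))                    # (sum_n |x_n|^2)^(1/2)

print(f"empirical p-th moment: {lhs:.4f}, l2 norm of x: {rhs:.4f}, ratio: {lhs / rhs:.4f}")
</syntaxhighlight>

For any such choice, the printed ratio should lie between <math>A_p</math> and <math>B_p</math>, up to sampling error.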
 
==Uses in analysis==
 
The uses of this inequality are not limited to applications in [[probability theory]]. One example of its use in [[Mathematical Analysis|analysis]] is the following: if <math>T</math> is a [[linear operator]] between two [[Lp space|L<sup>''p''</sup> spaces]] <math> L^p(X,\mu)</math> and <math> L^p(Y,\nu) </math>, <math>1\leq p<\infty</math>, with bounded [[operator norm|norm]] <math> \|T\|<\infty </math>, then Khintchine's inequality can be used to show that
 
:<math> \left\|\left(\sum_{n=1}^{N}|Tf_n|^{2} \right)^{\frac{1}{2}}\right\|_{L^p(Y,\nu)}\leq C_p\left\|\left(\sum_{n=1}^{N}|f_{n}|^{2}\right)^{\frac{1}{2}}\right\|_{L^p(X,\mu)} </math>
 
for some constant <math>C_p>0</math> depending only on <math>p</math> and <math>\|T\|</math>.
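
Indeed, one can apply the left-hand Khintchine inequality pointwise at each <math>y\in Y</math>, exchange the expectation and the integral using [[Fubini's theorem]], use the linearity and boundedness of <math>T</math> for each fixed choice of signs, and then apply the right-hand Khintchine inequality pointwise on <math>X</math>:

:<math> \left\|\Big(\sum_{n=1}^{N}|Tf_n|^{2}\Big)^{\frac{1}{2}}\right\|_{L^p(Y,\nu)}^{p} \leq A_p^{-p}\,\mathbb{E}\Big\|\sum_{n=1}^{N}\epsilon_{n}Tf_{n}\Big\|_{L^p(Y,\nu)}^{p} \leq A_p^{-p}\|T\|^{p}\,\mathbb{E}\Big\|\sum_{n=1}^{N}\epsilon_{n}f_{n}\Big\|_{L^p(X,\mu)}^{p} \leq \Big(\tfrac{B_p}{A_p}\Big)^{p}\|T\|^{p}\left\|\Big(\sum_{n=1}^{N}|f_{n}|^{2}\Big)^{\frac{1}{2}}\right\|_{L^p(X,\mu)}^{p}, </math>

which, after taking <math>p</math>-th roots, gives the claim with <math>C_p=\|T\|\,B_p/A_p</math>.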
 
== See also ==
* [[Marcinkiewicz–Zygmund inequality]]
 
==References==
#[[Thomas Wolff|Thomas H. Wolff]], "Lectures on Harmonic Analysis". American Mathematical Society, University Lecture Series vol. 29, 2003. ISBN 0-8218-3449-5
#Uffe Haagerup, "The best constants in the Khintchine inequality", Studia Math. 70 (1981), no. 3, 231&ndash;283 (1982).
#[[Fedor Nazarov]] and Anatoliy Podkorytov, "Ball, Haagerup, and distribution functions", Complex analysis, operators, and related topics, 247&ndash;267, Oper. Theory Adv. Appl., 113, Birkhäuser, Basel, 2000. 
 
[[Category:Mathematical analysis]]
[[Category:Probabilistic inequalities]]
