{{For|the martingale betting strategy|martingale (betting system)}}
[[Image:HittingTimes1.png|thumb|340px|[[Stopped process#Brownian_motion|Stopped Brownian motion]] is an example of a martingale. It can model an even coin-toss betting game with the possibility of bankruptcy.]]

In [[probability theory]], a '''martingale''' is a model of a fair game in which knowledge of past events never helps predict the mean of the future winnings. In particular, a martingale is a [[sequence]] of [[random variable]]s (i.e., a [[stochastic process]]) for which, at a particular time in the [[realization (probability)|realized]] sequence, the [[Expected value|expectation]] of the next value in the sequence is equal to the present observed value, even given knowledge of all prior [[realization (probability)|observed value]]s.

By contrast, in a process that is not a martingale, the expected value of the process at one time may still equal the expected value of the process at the next time. However, knowledge of the prior outcomes (e.g., all prior cards drawn from a card deck) may reduce the uncertainty of future outcomes, so the expected value of the next outcome given the present and all prior outcomes may exceed the current outcome if a winning strategy is used. Martingales exclude the possibility of winning strategies based on game history, and thus they are a model of fair games.
==History==
Originally, ''[[martingale (betting system)|martingale]]'' referred to a class of [[betting strategy|betting strategies]] that was popular in 18th-century [[France]].<ref>{{cite book| first=N. J. |last=Balsara|title=Money Management Strategies for Futures Traders|publisher= Wiley Finance|year= 1992| isbn =0-471-52215-5 |page=122}}</ref><ref>{{cite journal|url=http://www.jehps.net/juin2009/Mansuy.pdf|title=The Origins of the Word "Martingale"|last1=Mansuy|first1=Roger|date=June 2009|volume=5|number=1|journal=Electronic Journal for History of Probability and Statistics|accessdate=2011-10-22}}</ref> The simplest of these strategies was designed for a game in which the [[gambler]] wins his stake if a coin comes up heads and loses it if the coin comes up tails. The strategy had the gambler double his bet after every loss, so that the first win would recover all previous losses plus a profit equal to the original stake. As the gambler's wealth and available time jointly approach infinity, his probability of eventually flipping heads approaches 1, which makes the martingale betting strategy seem like a [[almost surely|sure thing]]. In practice, however, the [[exponential growth]] of the bets eventually bankrupts its users, since real bankrolls are finite (one of the reasons [[casino]]s, though they enjoy a mathematical edge in the games offered to their patrons, impose betting limits). [[Stopped process#Brownian_motion|Stopped Brownian motion]], which is a martingale process, can be used to model the trajectory of such games.

The concept of a martingale in probability theory was introduced by [[Paul Lévy (mathematician)|Paul Lévy]] in 1934, though he did not name it: the term "martingale" was introduced later by {{harvtxt|Ville|1939}}, who also extended the definition to continuous martingales. Much of the original development of the theory was done by [[Joseph Leo Doob]], among others. Part of the motivation for that work was to show the impossibility of successful betting strategies.
==Definitions==
A basic definition of a [[Discrete-time stochastic process|discrete-time]] '''martingale''' is a discrete-time [[stochastic process]] (i.e., a [[sequence]] of [[random variable]]s) ''X''<sub>1</sub>, ''X''<sub>2</sub>, ''X''<sub>3</sub>, ... that satisfies, for every time ''n'',

:<math>\mathbf{E} ( \vert X_n \vert )< \infty </math>

:<math>\mathbf{E} (X_{n+1}\mid X_1,\ldots,X_n)=X_n.</math>

That is, the [[conditional expected value]] of the next observation, given all the past observations, is equal to the most recent observation. Because of the linearity of conditional expectation (and because ''X''<sub>''n''</sub> is determined by ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub>), this second requirement is equivalent to

:<math>\mathbf{E} (X_{n+1} - X_n \mid X_1,\ldots,X_n)=0</math> or <math> \mathbf{E} (X_{n+1} \mid X_1,\ldots,X_n)- X_n=0 ,</math>

which states that the average "winnings" from observation <math>n</math> to observation <math>n+1</math> are 0.
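
The defining property can also be illustrated computationally. The following Python sketch (an informal illustration only, not part of the definition; the walk length, random seed, and sample size are arbitrary choices) simulates a ±1 fair-coin walk, fixes one realized history, and resamples only the next step: the sampled mean of ''X''<sub>''n''+1</sub> agrees with the current value ''X''<sub>''n''</sub> up to Monte Carlo error.

<syntaxhighlight lang="python">
import random

def fair_walk(n_steps, rng):
    """Simulate a +/-1 fair-coin walk started at 0 (a basic discrete-time martingale)."""
    x = [0]
    for _ in range(n_steps):
        x.append(x[-1] + (1 if rng.random() < 0.5 else -1))
    return x

rng = random.Random(1)
history = fair_walk(20, rng)                     # one realized past X_1, ..., X_n
next_values = [history[-1] + (1 if rng.random() < 0.5 else -1)
               for _ in range(100_000)]          # resample only the next step
print("current value X_n       :", history[-1])
print("sampled mean of X_(n+1) :", sum(next_values) / len(next_values))
# The two numbers agree up to Monte Carlo error, as E(X_(n+1) | X_1, ..., X_n) = X_n requires.
</syntaxhighlight>
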
===Martingale sequences with respect to another sequence===

More generally, a sequence ''Y''<sub>1</sub>, ''Y''<sub>2</sub>, ''Y''<sub>3</sub>, ... is said to be a '''martingale with respect to''' another sequence ''X''<sub>1</sub>, ''X''<sub>2</sub>, ''X''<sub>3</sub>, ... if for all ''n''

:<math>\mathbf{E} ( \vert Y_n \vert )< \infty </math>

:<math>\mathbf{E} (Y_{n+1}\mid X_1,\ldots,X_n)=Y_n.</math>

Similarly, a '''[[continuous time|continuous-time]] martingale with respect to''' the [[stochastic process]] ''X<sub>t</sub>'' is a [[stochastic process]] ''Y<sub>t</sub>'' such that for all ''t''

:<math>\mathbf{E} ( \vert Y_t \vert )<\infty </math>

:<math>\mathbf{E} ( Y_{t} \mid \{ X_{\tau}, \tau \leq s \} ) = Y_s, \ \forall\ s \leq t.</math>

This expresses the property that the conditional expectation of an observation at time ''t'', given all the observations up to time ''s'', is equal to the observation at time ''s'' (provided, of course, that ''s'' ≤ ''t'').
===General definition===

In full generality, a [[stochastic process]] <math>Y:T\times\Omega\to S</math> is a '''martingale with respect to a filtration''' <math>\Sigma_*</math> '''and [[probability measure]] P''' if
* Σ<sub>∗</sub> is a [[Filtration (mathematics)#Measure theory|filtration]] of the underlying [[probability space]] (Ω, Σ, '''P''');
* ''Y'' is [[adapted process|adapted]] to the filtration Σ<sub>∗</sub>, i.e., for each ''t'' in the [[index set]] ''T'', the random variable ''Y<sub>t</sub>'' is a Σ<sub>''t''</sub>-[[measurable function]];
* for each ''t'', ''Y<sub>t</sub>'' lies in the [[Lp space|''L<sup>p</sup>'' space]] ''L''<sup>1</sup>(Ω, Σ<sub>''t''</sub>, '''P'''; ''S''), i.e.
::<math>\mathbf{E}_{\mathbf{P}} ( | Y_{t} | ) < + \infty;</math>
* for all ''s'' and ''t'' with ''s'' < ''t'' and all ''F'' ∈ Σ<sub>''s''</sub>,
::<math>\mathbf{E}_{\mathbf{P}} \left([Y_t-Y_s]\chi_F\right)=0,</math>
:where ''χ<sub>F</sub>'' denotes the [[indicator function]] of the event ''F''. In Grimmett and Stirzaker's ''Probability and Random Processes'', this last condition is denoted as
::<math>Y_s = \mathbf{E}_{\mathbf{P}} ( Y_t \mid \Sigma_s ),</math>
:which is a general form of [[conditional expectation]].<ref>{{cite book|first1=G. |last1=Grimmett |first2= D.|last2= Stirzaker|title=Probability and Random Processes|edition= 3rd|publisher= Oxford University Press|year= 2001| isbn =0-19-857223-9}}</ref>

Note that the property of being a martingale involves both the filtration ''and'' the probability measure (with respect to which the expectations are taken). It is possible that ''Y'' is a martingale with respect to one measure but not another; the [[Girsanov theorem]] offers a way to find a measure with respect to which an [[Itō process]] is a martingale.
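
The last condition above can be illustrated informally by simulation. The following Python sketch (an illustrative example only, not drawn from the cited references; the times ''s'' = 5 and ''t'' = 10, the event ''F'' = {''Y''<sub>5</sub> > 0}, and the sample size are arbitrary choices) estimates '''E'''<sub>'''P'''</sub>[(''Y<sub>t</sub>'' − ''Y<sub>s</sub>'')''χ<sub>F</sub>''] for a fair ±1 coin-toss walk and an event ''F'' determined by the path up to time ''s''; the estimate is close to zero, as the definition requires.

<syntaxhighlight lang="python">
import random

def walk(n_steps, rng):
    """Path of a +/-1 fair-coin walk: returns [Y_1, ..., Y_n]."""
    y, path = 0, []
    for _ in range(n_steps):
        y += 1 if rng.random() < 0.5 else -1
        path.append(y)
    return path

s, t = 5, 10                              # two fixed times with s < t
rng = random.Random(0)
n_paths = 200_000
total = 0.0
for _ in range(n_paths):
    p = walk(t, rng)
    chi_F = 1 if p[s - 1] > 0 else 0      # F = {Y_s > 0}, an event determined by time s
    total += (p[t - 1] - p[s - 1]) * chi_F
print("estimate of E[(Y_t - Y_s) * chi_F]:", total / n_paths)   # close to 0 for a martingale
</syntaxhighlight>
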
==Examples of martingales==

* An unbiased [[random walk]] (in any number of dimensions) is an example of a martingale.

* A gambler's fortune (capital) is a martingale if all the betting games which the gambler plays are fair.

* [[Polya's urn]] contains a number of differently coloured marbles; at each [[iterative method|iteration]] a marble is selected from the urn uniformly at random and put back together with several more marbles of the same colour. For any given colour, the ''fraction'' of marbles in the urn with that colour is a martingale. For example, if currently 95% of the marbles are red then, though the next iteration is much more likely to add red marbles than non-red ones, this bias is exactly balanced out by the fact that adding more red marbles alters the fraction much less significantly than adding the same number of non-red marbles would. (A simulation sketch of this example appears after this list.)

* Suppose ''X<sub>n</sub>'' is a gambler's fortune after ''n'' tosses of a [[fair coin]], where the gambler wins $1 if the coin comes up heads and loses $1 if the coin comes up tails. The gambler's conditional expected fortune after the next trial, given the history, is equal to his present fortune, so this sequence is a martingale.

* Let ''Y<sub>n</sub>'' = ''X<sub>n</sub>''<sup>2</sup> − ''n'' where ''X<sub>n</sub>'' is the gambler's fortune from the preceding example. Then the sequence { ''Y<sub>n</sub>'' : ''n'' = 1, 2, 3, ... } is a martingale. This can be used to show that the gambler's total gain or loss varies roughly between plus or minus the [[square root]] of the number of steps.

* ([[Abraham de Moivre|de Moivre]]'s martingale) Now suppose the coin is "unfair", or "biased", with probability ''p'' of coming up "heads" and probability ''q'' = 1 − ''p'' of "tails". Let

::<math>X_{n+1}=X_n\pm 1</math>
:with "+" in case of "heads" and "−" in case of "tails". Let

::<math>Y_n=(q/p)^{X_n}.</math>

:Then { ''Y<sub>n</sub>'' : ''n'' = 1, 2, 3, ... } is a martingale with respect to { ''X<sub>n</sub>'' : ''n'' = 1, 2, 3, ... }. To show this,
:: <math>
\begin{align}
E[Y_{n+1} \mid X_1,\dots,X_n] & = p (q/p)^{X_n+1} + q (q/p)^{X_n-1} \\[6pt]
& = p (q/p) (q/p)^{X_n} + q (p/q) (q/p)^{X_n} \\[6pt]
& = q (q/p)^{X_n} + p (q/p)^{X_n} = (q/p)^{X_n}=Y_n.
\end{align}
</math>

* ([[Likelihood-ratio test]]ing in [[statistics]]) A population is thought to be distributed according to either a probability density ''f'' or another probability density ''g''. A [[random sample]] is taken, the data being ''X''<sub>1</sub>, ..., ''X<sub>n</sub>''. Let ''Y<sub>n</sub>'' be the "likelihood ratio"

::<math>Y_n=\prod_{i=1}^n\frac{g(X_i)}{f(X_i)}</math>

:(which, in applications, would be used as a test statistic). If the population is actually distributed according to the density ''f'' rather than according to ''g'', then { ''Y<sub>n</sub>'' : ''n'' = 1, 2, 3, ... } is a martingale with respect to { ''X<sub>n</sub>'' : ''n'' = 1, 2, 3, ... }.

* Suppose each [[amoeba]] either splits into two amoebas, with probability ''p'', or eventually dies, with probability 1 − ''p''. Let ''X<sub>n</sub>'' be the number of amoebas surviving in the ''n''th generation (in particular ''X<sub>n</sub>'' = 0 if the population has become extinct by that time). Let ''r'' be the [[Galton–Watson process|probability of ''eventual'' extinction]]. (Finding ''r'' as a function of ''p'' is an instructive exercise. Hint: given that the original amoeba has split, its descendants eventually die out if and only if the descendants of ''both'' of its immediate offspring eventually die out.) Then

::<math>\{\,r^{X_n}:n=1,2,3,\dots\,\}</math>

:is a martingale with respect to { ''X<sub>n</sub>'': ''n'' = 1, 2, 3, ... }.

[[Image:Martingale1.svg|thumb|250px|Software-created martingale series.]]
* In an ecological community (a group of species at a particular trophic level, competing for similar resources in a local area), the number of individuals of any particular species of fixed size is a function of (discrete) time, and may be viewed as a sequence of random variables. This sequence is a martingale under the [[unified neutral theory of biodiversity and biogeography]].

* If { ''N<sub>t</sub>'' : ''t'' ≥ 0 } is a [[Poisson process]] with intensity λ, then the compensated Poisson process { ''N<sub>t</sub>'' − λ''t'' : ''t'' ≥ 0 } is a continuous-time martingale with [[Classification of discontinuities|right-continuous/left-limit]] sample paths.

* [[Wald's martingale]]
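
The Polya's urn example can be checked numerically. The following Python sketch (an informal illustration, not drawn from the cited references; the initial urn of 95 red and 5 blue marbles, the single marble added per draw, and the sample size are arbitrary choices) estimates the expected red fraction after one further draw and compares it with the current fraction.

<syntaxhighlight lang="python">
import random

def polya_step(red, blue, add, rng):
    """One Polya-urn iteration: draw a marble uniformly at random and return it
    to the urn together with `add` extra marbles of the same colour."""
    if rng.random() < red / (red + blue):
        return red + add, blue
    return red, blue + add

red, blue = 95, 5                       # current urn: 95% red
rng = random.Random(0)
n_samples = 200_000
total = 0.0
for _ in range(n_samples):
    r, b = polya_step(red, blue, 1, rng)
    total += r / (r + b)
print("current red fraction       :", red / (red + blue))   # 0.95
print("expected next red fraction :", total / n_samples)    # ~0.95, up to Monte Carlo error
# The red fraction is a martingale: the likely small increase from adding a red marble
# exactly balances the unlikely larger decrease from adding a blue one.
</syntaxhighlight>
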
==Submartingales, supermartingales, and relationship to harmonic functions{{anchor|Submartingales and supermartingales}}==

There are two popular generalizations of a martingale that also include cases when the current observation ''X<sub>n</sub>'' is not necessarily equal to the future conditional expectation ''E''[''X<sub>n+1</sub>''|''X''<sub>1</sub>,...,''X<sub>n</sub>''] but is instead an upper or lower bound on it. These definitions reflect a relationship between martingale theory and [[potential theory]], which is the study of [[harmonic function]]s. Just as a continuous-time martingale satisfies ''E''[''X<sub>t</sub>''|{''X''<sub>τ</sub> : τ ≤ ''s''}] − ''X<sub>s</sub>'' = 0 ∀''s'' ≤ ''t'', a harmonic function ''f'' satisfies the [[partial differential equation]] Δ''f'' = 0, where Δ is the [[Laplace operator|Laplacian operator]]. Given a [[Brownian motion]] process ''W<sub>t</sub>'' and a harmonic function ''f'', the resulting process ''f''(''W<sub>t</sub>'') is also a martingale.

* A discrete-time '''submartingale''' is a sequence <math>X_1,X_2,X_3,\ldots</math> of [[Integrable function|integrable]] random variables satisfying
::<math>{}E[X_{n+1}|X_1,\ldots,X_n] \ge X_n.</math>
: Likewise, a continuous-time submartingale satisfies
::<math>{}E[X_t|\{X_{\tau} : \tau \le s\}] \ge X_s \quad \forall s \le t.</math>
:In potential theory, a [[subharmonic function]] ''f'' satisfies Δ''f'' ≥ 0. Any subharmonic function that is bounded above by a harmonic function at all points on the boundary of a ball is bounded above by the harmonic function at all points inside the ball. Similarly, if a submartingale and a martingale have equal expectations at a given time, the history of the submartingale tends to be bounded above by the history of the martingale. Roughly speaking, the [[prefix]] "sub-" is consistent because the current observation ''X<sub>n</sub>'' is ''less than'' (or equal to) the conditional expectation ''E''[''X<sub>n</sub>''<sub>+1</sub>|''X''<sub>1</sub>,...,''X<sub>n</sub>'']. Consequently, the current observation bounds the future conditional expectation ''from below'', and the process tends to increase in future time.

* Analogously, a discrete-time '''supermartingale''' satisfies
::<math>{}E[X_{n+1}|X_1,\ldots,X_n] \le X_n.</math>
: Likewise, a continuous-time supermartingale satisfies
::<math>{}E[X_t|\{X_{\tau} : \tau \le s\}] \le X_s \quad \forall s \le t.</math>
:In potential theory, a [[superharmonic function]] ''f'' satisfies Δ''f'' ≤ 0. Any superharmonic function that is bounded below by a harmonic function at all points on the boundary of a ball is bounded below by the harmonic function at all points inside the ball. Similarly, if a supermartingale and a martingale have equal expectations at a given time, the history of the supermartingale tends to be bounded below by the history of the martingale. Roughly speaking, the prefix "super-" is consistent because the current observation ''X<sub>n</sub>'' is ''greater than'' (or equal to) the conditional expectation ''E''[''X<sub>n</sub>''<sub>+1</sub>|''X''<sub>1</sub>,...,''X<sub>n</sub>'']. Consequently, the current observation bounds the future conditional expectation ''from above'', and the process tends to decrease in future time.

===Examples of submartingales and supermartingales===

* Every martingale is also a submartingale and a supermartingale. Conversely, any stochastic process that is ''both'' a submartingale and a supermartingale is a martingale.
* Consider again the gambler who wins $1 when a coin comes up heads and loses $1 when the coin comes up tails. Suppose now that the coin may be biased, so that it comes up heads with probability ''p'' (a simulation sketch of the three cases appears after this list).
** If ''p'' is equal to 1/2, the gambler on average neither wins nor loses money, and the gambler's fortune over time is a martingale.
** If ''p'' is less than 1/2, the gambler loses money on average, and the gambler's fortune over time is a supermartingale.
** If ''p'' is greater than 1/2, the gambler wins money on average, and the gambler's fortune over time is a submartingale.
* A [[convex function]] of a martingale is a submartingale, by [[Jensen's inequality]]. For example, the square of the gambler's fortune in the fair coin game is a submartingale (which also follows from the fact that ''X<sub>n</sub>''<sup>2</sup> − ''n'' is a martingale). Similarly, a [[concave function]] of a martingale is a supermartingale.
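
The biased-coin cases can be illustrated by a short Python sketch (an informal illustration only; the number of tosses, the path count, and the particular values of ''p'' are arbitrary choices). The estimated mean fortunes after 100 tosses are approximately 0, negative, and positive, matching the martingale, supermartingale, and submartingale cases respectively.

<syntaxhighlight lang="python">
import random

def mean_fortune(p, n_tosses=100, n_paths=20_000, seed=0):
    """Mean fortune after n_tosses of a coin with heads-probability p (+$1 on heads, -$1 on tails)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_paths):
        fortune = 0
        for _ in range(n_tosses):
            fortune += 1 if rng.random() < p else -1
        total += fortune
    return total / n_paths

for p in (0.5, 0.45, 0.55):
    print(f"p = {p:.2f}: mean fortune after 100 tosses = {mean_fortune(p):+.2f}")
# p = 0.50 -> about  0   (martingale)
# p = 0.45 -> about -10  (supermartingale: 100*(2p - 1) = -10)
# p = 0.55 -> about +10  (submartingale:   100*(2p - 1) = +10)
</syntaxhighlight>
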
==Martingales and stopping times==
{{Main|Stopping time}}

A [[stopping time]] with respect to a sequence of random variables ''X''<sub>1</sub>, ''X''<sub>2</sub>, ''X''<sub>3</sub>, ... is a random variable τ with the property that for each ''t'', the occurrence or non-occurrence of the event τ = ''t'' depends only on the values of ''X''<sub>1</sub>, ''X''<sub>2</sub>, ''X''<sub>3</sub>, ..., ''X''<sub>''t''</sub>. The intuition behind the definition is that at any particular time ''t'', one can look at the sequence so far and tell whether it is time to stop. An example in real life might be the time at which a gambler leaves the gambling table, which might be a function of his previous winnings (for example, he might leave only when he goes broke), but he cannot choose to go or stay based on the outcome of games that have not been played yet.

In some contexts the concept of ''stopping time'' is defined by requiring only that the occurrence or non-occurrence of the event τ = ''t'' be [[statistical independence|probabilistically independent]] of ''X''<sub>''t'' + 1</sub>, ''X''<sub>''t'' + 2</sub>, ..., but not that it be completely determined by the history of the process up to time ''t''. That is a weaker condition than the one appearing in the paragraph above, but is strong enough to serve in some of the proofs in which stopping times are used.

One of the basic properties of martingales is that, if <math>(X_t)_{t>0}</math> is a (sub-/super-) martingale and <math>\tau</math> is a stopping time, then the corresponding stopped process <math>(X_t^\tau)_{t>0}</math> defined by <math>X_t^\tau:=X_{\min\{\tau,t\}}</math> is also a (sub-/super-) martingale.

The concept of a stopped martingale leads to a series of important theorems, including, for example, the [[optional stopping theorem]], which states that, under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value.
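
As an informal illustration of the optional stopping theorem, the following Python sketch (not drawn from the cited references; the barriers ±10, the step cap, and the sample size are arbitrary choices, and the stopping rule is chosen so that the theorem's conditions hold because the stopped walk is bounded) stops a fair ±1 walk when it first hits +10 or −10 and averages the stopped values; the average comes out close to the walk's initial value of 0.

<syntaxhighlight lang="python">
import random

def stopped_value(a, b, rng, max_steps=100_000):
    """Run a fair +/-1 walk from 0 and return its value at the stopping time
    tau = first time the walk hits +a or -b (capped at max_steps as a safeguard)."""
    x = 0
    for _ in range(max_steps):
        if x >= a or x <= -b:
            break
        x += 1 if rng.random() < 0.5 else -1
    return x

rng = random.Random(0)
n_paths = 20_000
values = [stopped_value(10, 10, rng) for _ in range(n_paths)]
print("mean value at the stopping time:", sum(values) / len(values))
# Close to 0, the initial value, as the optional stopping theorem predicts for this
# stopping rule (the stopped walk always stays between -10 and +10).
</syntaxhighlight>
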
==See also==

*[[Azuma's inequality]]
*[[Brownian motion]]
*[[Martingale central limit theorem]]
*[[Martingale representation theorem]]
*[[Doob martingale]]
*[[Doob's martingale convergence theorems]]
*[[Local martingale]]
*[[Semimartingale]]
*[[Martingale difference sequence]]
*[[Markov chain]]
*[[Martingale (betting system)]]
== Notes ==
{{Reflist}}
== References ==
* {{springer|title=Martingale|id=p/m062570}}
* {{cite journal|title=The Splendors and Miseries of Martingales|journal=Electronic Journal for History of Probability and Statistics|volume=5|issue=1|date=June 2009|url=http://www.jehps.net/juin2009.html}} Entire issue dedicated to martingale probability theory.
* {{cite book|author-link=David Williams (mathematician)|first=David|last=Williams|title=Probability with Martingales|publisher=Cambridge University Press|year=1991|isbn=0-521-40605-6}}
* {{cite book|first=Hagen|last=Kleinert|author-link=Hagen Kleinert|title=Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets|edition=4th|publisher=World Scientific|location=Singapore|year=2004|isbn=981-238-107-4|url=http://www.physik.fu-berlin.de/~kleinert/b5}}
* {{cite web|title=Martingales and Stopping Times: Use of martingales in obtaining bounds and analyzing algorithms|url=http://www.corelab.ece.ntua.gr/courses/rand-alg/slides/Martingales-Stopping_Times.pdf|format=PDF|publisher=University of Athens|first=Paris|last=Siminelakis|year=2010}}
* {{citation|zbl=0021.14601|last=Ville|first=Jean|title=Étude critique de la notion de collectif|language=French|series=Monographies des Probabilités|volume=3|place=Paris|publisher=Gauthier-Villars|year=1939|id=[http://dx.doi.org/10.1090/S0002-9904-1939-07089-4 Review by Doob]|url=http://books.google.com/books?id=ETY7AQAAIAAJ}}
{{Stochastic processes}}

[[Category:Stochastic processes]]
[[Category:Martingale theory]]
[[Category:Game theory]]