# Poisson process


In probability theory, a Poisson process is a stochastic process that counts the number of events[note 1] and the time points at which these events occur in a given time interval. The time between each pair of consecutive events has an exponential distribution with parameter λ and each of these inter-arrival times is assumed to be independent of other inter-arrival times. The process is named after the French mathematician Siméon Denis Poisson and is a good model of radioactive decay,[1] telephone calls[2] and requests for a particular document on a web server,[3] among many other phenomena.

The Poisson process is a continuous-time process; the sum of a Bernoulli process can be thought of as its discrete-time counterpart. A Poisson process is a pure-birth process, the simplest example of a birth-death process. It is also a point process on the real half-line.

## Definition

The basic form of Poisson process, often referred to simply as "the Poisson process", is a continuous-time counting process {N(t), t ≥ 0} that possesses the following properties:

• N(0) = 0;
• Independent increments (the numbers of occurrences counted in disjoint intervals are independent of each other);
• Stationary increments (the probability distribution of the number of occurrences counted in any time interval only depends on the length of the interval);
• No counted occurrences are simultaneous.

Consequences of this definition include:

• The probability distribution of the waiting time until the next occurrence is an exponential distribution.
• Conditional on the total number of occurrences in an interval, the occurrence times are distributed uniformly over that interval. (Note that N(t), the total number of occurrences, has a Poisson distribution over the non-negative integers, whereas the location of an individual occurrence on t ∈ (a, b] is uniform.)
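The conditional-uniformity consequence can be checked empirically. The sketch below (plain Python, standard library only; the rate of 1.5 and horizon of 4 are arbitrary choices) simulates many runs of a homogeneous Poisson process from exponential inter-arrival gaps and pools the arrival times: conditional on the counts, each arrival time is uniform on (0, horizon], so the pooled mean should sit near horizon/2.

```python
import random

def arrivals(rate, horizon, rng):
    """Arrival times of a homogeneous Poisson process on (0, horizon],
    built from independent exponential inter-arrival gaps."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # gap with mean 1/rate
        if t > horizon:
            return times
        times.append(t)

# Pool arrival times across many independent runs.  Conditional on the count,
# each arrival is uniform on (0, 4], so the pooled mean should be near 2.0.
pooled = [t for seed in range(2000) for t in arrivals(1.5, 4.0, random.Random(seed))]
mean_time = sum(pooled) / len(pooled)
print(mean_time)  # close to 2.0
```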

Other types of Poisson process are described below.

## Types

### Homogeneous

*Sample path of a counting Poisson process N(t).*

The homogeneous Poisson process counts events that occur at a constant rate; it is one of the most well-known Lévy processes. This process is characterized by a rate parameter λ, also known as intensity, such that the number of events in time interval (t, t + τ] follows a Poisson distribution with associated parameter λτ. This relation is given as

${\displaystyle P[N(t+\tau )-N(t)=k]={\frac {e^{-\lambda \tau }(\lambda \tau )^{k}}{k!}}\qquad k=0,1,\ldots ,}$

where N(t + τ) − N(t) = k is the number of events in time interval (t, t + τ].

Just as a Poisson random variable is characterized by its scalar parameter λ, a homogeneous Poisson process is characterized by its rate parameter λ, which is the expected number of "events" or "arrivals" that occur per unit time.

A realization N(t) is a sample path of the homogeneous Poisson process: a counting function, not to be confused with a density or distribution function.
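A minimal simulation of the homogeneous process follows directly from the two characterizations above: generate exponential inter-arrival times with rate λ and count the arrivals landing in an interval. This sketch (plain Python; λ = 2 and the horizon of 10 are arbitrary choices) checks that the mean count over many runs is near λτ = 20.

```python
import random

def simulate_poisson(rate, horizon, rng):
    """Arrival times of a homogeneous Poisson process with the given rate,
    simulated by summing exponential inter-arrival times."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # exponential gap with mean 1/rate
        if t > horizon:
            return times
        times.append(t)

lam, horizon = 2.0, 10.0
counts = [len(simulate_poisson(lam, horizon, random.Random(s))) for s in range(2000)]
mean_count = sum(counts) / len(counts)
print(mean_count)  # close to lam * horizon = 20
```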

### Inhomogeneous


An inhomogeneous Poisson process counts events that occur at a variable rate. The rate parameter is now a function of time, λ(t); such a process is called a non-homogeneous or inhomogeneous Poisson process. The expected number of events between time a and time b is

${\displaystyle N_{a,b}=\int _{a}^{b}\lambda (t)\,dt.}$

Thus, the number of arrivals in the time interval (a, b], given as N(b) − N(a), follows a Poisson distribution with associated parameter Na,b:

${\displaystyle P[N(b)-N(a)=k]={\frac {e^{-N_{a,b}}(N_{a,b})^{k}}{k!}}\qquad k=0,1,\ldots .}$

A rate function λ(t) in a non-homogeneous Poisson process can be either a deterministic function of time or an independent stochastic process, giving rise to a Cox process. A homogeneous Poisson process may be viewed as a special case when λ(t) = λ, a constant rate.
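When the deterministic rate function λ(t) is bounded above, an inhomogeneous process can be sampled by thinning (the Lewis–Shedler method): generate candidates from a homogeneous process at the bounding rate and keep each with probability λ(t)/λ_max. The rate function 1 + sin²(t) below is a hypothetical choice, bounded by 2.

```python
import math
import random

def thinning(rate_fn, rate_max, horizon, rng):
    """Sample an inhomogeneous Poisson process by thinning (Lewis-Shedler):
    generate candidates from a homogeneous process at rate_max, then keep
    each candidate at time t with probability rate_fn(t) / rate_max."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)
        if t > horizon:
            return times
        if rng.random() < rate_fn(t) / rate_max:
            times.append(t)

rate = lambda t: 1.0 + math.sin(t) ** 2  # hypothetical rate function, bounded by 2

# Expected count on (0, 10]: integral of 1 + sin(t)^2 dt = 15 - sin(20)/4, about 14.8
counts = [len(thinning(rate, 2.0, 10.0, random.Random(s))) for s in range(2000)]
mean_count = sum(counts) / len(counts)
print(mean_count)  # close to 14.8
```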

### Spatial

An important variation on the (notionally time-based) Poisson process is the spatial Poisson process. In the case of a one-dimensional space (a line), the theory differs from that of a time-based Poisson process only in the interpretation of the index variable. For higher-dimensional spaces, where the index variable (now x) lies in some vector space V (e.g. R2 or R3), a spatial Poisson process is defined by the requirement that the random variables defined as the counts of the number of "events" inside each of a number of non-overlapping finite sub-regions of V should each have a Poisson distribution and should be independent of each other.
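This definition suggests a direct way to sample a homogeneous spatial Poisson process on a bounded region: draw a Poisson count with mean intensity × area, then scatter that many points uniformly over the region. A sketch (plain Python; the intensity 5 and the 2 × 3 rectangle are arbitrary choices):

```python
import random

def poisson_sample(mean, rng):
    """Draw a Poisson(mean) variate by counting exponential jumps before time 1."""
    n, t = 0, 0.0
    while True:
        t += rng.expovariate(mean)
        if t > 1.0:
            return n
        n += 1

def spatial_poisson(intensity, width, height, rng):
    """Homogeneous spatial Poisson process on a width x height rectangle:
    the count is Poisson(intensity * area) and the points are i.i.d. uniform."""
    n = poisson_sample(intensity * width * height, rng)
    return [(rng.uniform(0.0, width), rng.uniform(0.0, height)) for _ in range(n)]

# Expected number of points: 5 * (2 * 3) = 30.
counts = [len(spatial_poisson(5.0, 2.0, 3.0, random.Random(s))) for s in range(1000)]
mean_count = sum(counts) / len(counts)
print(mean_count)  # close to 30
```

The independence of counts in disjoint sub-regions follows because the uniform points split a Poisson total multinomially across sub-regions, which factorizes into independent Poisson counts.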

### Space-time

A further variation on the Poisson process, the space-time Poisson process, allows for separately distinguished space and time variables. Even though this can theoretically be treated as a pure spatial process by treating "time" as just another component of a vector space, it is convenient in most applications to treat space and time separately, both for modeling purposes in practical applications and because of the types of properties of such processes that it is interesting to study.

In comparison to a time-based inhomogeneous Poisson process, the extension to a space-time Poisson process can introduce a spatial dependence into the rate function, such that it is defined as ${\displaystyle \lambda (x,t)}$, where ${\displaystyle x\in V}$ for some vector space V (e.g. R2 or R3). However, a space-time Poisson process may have a rate function that is constant with respect to either or both of x and t. For any set ${\displaystyle S\subset V}$ (e.g. a spatial region) with finite measure ${\displaystyle \mu (S)}$, the number of events occurring inside this region can be modeled as a Poisson process with associated rate function λS(t) such that

${\displaystyle \lambda _{S}(t)=\int _{S}\lambda (x,t)\,d\mu (x).}$
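For a concrete λ(x, t), the region rate λS(t) is just a spatial integral, which can be approximated numerically. In the sketch below the Gaussian-bump rate function and the square region are hypothetical choices; at t = 0 the integral over a large square approaches 10π.

```python
import math

# Hypothetical space-time rate on the plane: a Gaussian bump in space whose
# amplitude decays exponentially in time.
def rate(x, y, t):
    return 10.0 * math.exp(-(x * x + y * y)) * math.exp(-0.5 * t)

def rate_in_region(t, x0, x1, y0, y1, n=200):
    """Midpoint-rule approximation of the integral of rate(x, y, t) over the
    rectangle [x0, x1] x [y0, y1], i.e. the region rate lambda_S(t)."""
    dx, dy = (x1 - x0) / n, (y1 - y0) / n
    total = 0.0
    for i in range(n):
        x = x0 + (i + 0.5) * dx
        for j in range(n):
            total += rate(x, y0 + (j + 0.5) * dy, t)
    return total * dx * dy

# At t = 0 the spatial integral over [-3, 3]^2 approaches 10 * pi, about 31.4.
print(rate_in_region(0.0, -3.0, 3.0, -3.0, 3.0))
```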

#### Separable space-time processes

In the special case that this generalized rate function is a separable function of time and space, we have:

${\displaystyle \lambda (x,t)=f(x)\lambda (t)\,}$

for some function ${\displaystyle f(x)}$. Without loss of generality, let

${\displaystyle \int _{V}f(x)\,d\mu (x)=1.}$

(If this is not the case, λ(t) can be scaled appropriately.) Now, ${\displaystyle f(x)}$ represents the spatial probability density function of these random events in the following sense. The act of sampling this spatial Poisson process is equivalent to sampling a Poisson process with rate function λ(t), and associating with each event a random vector ${\displaystyle X}$ sampled from the probability density function ${\displaystyle f(x)}$. A similar result can be shown for the general (non-separable) case.
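That equivalence gives a two-stage sampler for the separable case: draw event times from the purely temporal process with rate λ(t), then attach to each event an independent location drawn from f(x). A sketch with a constant temporal rate and a uniform spatial density on the unit square (both hypothetical choices):

```python
import random

def separable_space_time(time_rate, horizon, sample_location, rng):
    """Separable space-time Poisson process: event times come from the
    temporal process alone; each event then gets an independent location
    drawn from the spatial density f(x)."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(time_rate)
        if t > horizon:
            return events
        events.append((t, sample_location(rng)))

# Hypothetical f(x): uniform on the unit square (integrates to 1 over V).
uniform_square = lambda rng: (rng.random(), rng.random())

counts = [len(separable_space_time(3.0, 5.0, uniform_square, random.Random(s)))
          for s in range(1000)]
mean_count = sum(counts) / len(counts)
print(mean_count)  # close to 3 * 5 = 15
```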

In its most general form, the only two conditions for a counting process to be a Poisson process are:

• Orderliness: roughly, ${\displaystyle \lim _{\Delta t\to 0}P(N(t+\Delta t)-N(t)>1\mid N(t+\Delta t)-N(t)\geq 1)=0,}$ which implies that arrivals do not occur simultaneously (although this is actually a mathematically stronger statement).
• Memorylessness (also called evolution without after-effects): the number of arrivals occurring in any bounded interval of time after time t is independent of the number of arrivals occurring before time t.

These seemingly unrestrictive conditions actually impose a great deal of structure on the Poisson process. In particular, they imply that the times between consecutive events (called inter-arrival times) are independent random variables. For the homogeneous Poisson process, these inter-arrival times are exponentially distributed with parameter λ (mean 1/λ). The memorylessness property also entails that the number of events in any time interval is independent of the number of events in any other interval disjoint from it. This latter property is known as the independent increments property of the Poisson process.

## Properties

As defined above, the stochastic process {N(t)} is a Markov process, or more specifically, a continuous-time Markov process.

To illustrate the exponentially distributed inter-arrival times property, consider a homogeneous Poisson process N(t) with rate parameter λ, and let Tk be the time of the kth arrival, for k = 1, 2, 3, ... . Clearly the number of arrivals before some fixed time t is less than k if and only if the waiting time until the kth arrival is more than t. In symbols, the event [N(t) < k] occurs if and only if the event [Tk > t] occurs. Consequently the probabilities of these events are the same:

${\displaystyle P(T_{k}>t)=P(N(t)<k).}$

In particular, consider the waiting time until the first arrival. Clearly that time is more than t if and only if the number of arrivals before time t is 0. Combining this latter property with the above probability distribution for the number of homogeneous Poisson process events in a fixed interval gives:

${\displaystyle P(T_{1}>t)=P(N(t)=0)=P[N(t)-N(0)=0]={\frac {e^{-\lambda t}(\lambda t)^{0}}{0!}}=e^{-\lambda t}.}$

And therefore:

${\displaystyle P(T_{1}\leq t)=1-e^{-\lambda t}}$ (which is the CDF of the exponential distribution).

Consequently, the waiting time until the first arrival T1 has an exponential distribution, and is thus memoryless. One can similarly show that the other interarrival times Tk − Tk−1 share the same distribution. Hence, they are independent, identically distributed (i.i.d.) random variables with parameter λ > 0; and expected value 1/λ. For example, if the average rate of arrivals is 5 per minute, then the average waiting time between arrivals is 1/5 minute.
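The memorylessness of the exponential inter-arrival time can be verified numerically: P(T > s + t | T > s) should equal P(T > t) = e^{−λt}. A sketch (plain Python; λ = 5 and the cutoffs s, t are arbitrary choices):

```python
import math
import random

lam = 5.0
rng = random.Random(0)
samples = [rng.expovariate(lam) for _ in range(200_000)]  # inter-arrival times

s, t = 0.1, 0.2
survived = [x for x in samples if x > s]
p_cond = sum(1 for x in survived if x > s + t) / len(survived)  # P(T > s+t | T > s)
p_plain = sum(1 for x in samples if x > t) / len(samples)       # P(T > t)
print(p_cond, p_plain, math.exp(-lam * t))  # all three close to 0.368
```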

## Applications

The classic example of phenomena well modelled by a Poisson process is deaths due to horse kicks in the Prussian army, as shown in 1898 by Ladislaus Bortkiewicz, a Polish economist and statistician who also examined data on child suicides.[4][5] The following examples are also well-modeled by the Poisson process:

• Number of road crashes (or injuries/fatalities) at a site or in an area
• Goals scored in a soccer match.[6]
• Requests for individual documents on a web server.[3]
• Particle emissions due to radioactive decay of an unstable substance. In this case the Poisson process is non-homogeneous in a predictable manner: the emission rate declines as particles are emitted.
• Action potentials emitted by a neuron.[7]
• L. F. Richardson showed that the outbreak of war followed a Poisson process from 1820 to 1950.[8]
• Photons landing on a photodiode, in particular in low-light environments. This phenomenon is related to shot noise.
• Opportunities for firms to adjust nominal prices.[9]
• Arrival of innovations from research and development.[10]
• Requests for telephone calls at a switchboard.

• In queueing theory, the times of customer/job arrivals at queues are often assumed to be a Poisson process.
• The evolution (changes of pages) of the Internet in general (although not in the particular case of Wikipedia).[11]

## Occurrence

The Palm–Khintchine theorem shows that the superposition of many low-intensity non-Poisson point processes is close to a Poisson process.
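A quick numerical illustration of this (the number of streams, gap distribution, and horizon below are arbitrary choices): superpose many sparse renewal streams whose gaps are uniform, hence not memoryless, and check that the pooled count behaves like a Poisson count, with variance approximately equal to the mean.

```python
import random

def sparse_renewal(mean_gap, horizon, rng):
    """One low-intensity renewal stream with uniform (non-exponential) gaps."""
    times, t = [], 0.0
    while True:
        t += rng.uniform(0.0, 2.0 * mean_gap)  # gap with mean mean_gap
        if t > horizon:
            return times
        times.append(t)

def superposed_count(seed, streams=100, mean_gap=50.0, horizon=10.0):
    """Total number of events from `streams` independent sparse streams."""
    rng = random.Random(seed)
    return sum(len(sparse_renewal(mean_gap, horizon, rng)) for _ in range(streams))

counts = [superposed_count(s) for s in range(3000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)  # for a Poisson-like count, the variance is close to the mean
```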

## Notes

1. The word "event" is used here in the sense of an occurrence or arrival; it is not an instance of the concept of event as used in measure-theoretic probability theory.

## References

1. Template:Cite doi
2. Template:Cite doi
3. Template:Cite doi
4. Ladislaus von Bortkiewicz, Das Gesetz der kleinen Zahlen [The law of small numbers] (Leipzig, Germany: B.G. Teubner, 1898). On page 1, Bortkiewicz presents the Poisson distribution. On pages 23-25, Bortkiewicz presents his famous analysis of "4. Beispiel: Die durch Schlag eines Pferdes im preussischen Heere Getöteten." (4. Example: Those killed in the Prussian army by a horse's kick.).
5. {{#invoke:citation/CS1|citation |CitationClass=book }}
6. Template:Cite doi
7. Template:Cite doi
8. Template:Cite doi
9. Template:Cite doi
10. Template:Cite jstor
11. Almeida, R. B.; Mozafari, B., y Cho, J. (2007). On the evolution of Wikipedia. ICWSM (Boulder, Colorado) (Retrieved May 31, 2014)
