
Poisson Distribution

The Poisson distribution is a discrete probability distribution for the probability of a fixed number of events occurring in a fixed interval of time, when these events occur with a known constant mean rate $\lambda$ and independently of the time since the last event. Mathematically, the underlying model is called a Poisson process, which makes the following assumptions about $g(x, w)$, the probability of $x$ events occurring in an interval of length $w$:

(1) $g(1, h) = \lambda h + o(h)$
(2) $\sum_{x=2}^\infty g(x, h) = o(h)$
(3) The numbers of events in non-overlapping intervals are independent.

From these assumptions, we can derive an explicit formula for $g(x, w)$: the PMF of the Poisson distribution.

PMF

\[\begin{align*} p(x) = \frac{m^x e^{-m}}{x!} \; (x = 0, 1, 2, \cdots) \end{align*}\]

where $m = \lambda w$ is the mean number of events in an interval of length $w$.
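To make the formula concrete, here is a minimal Python sketch (assuming SciPy is available; the function name `poisson_pmf` is just for illustration) that evaluates the PMF directly and checks it against `scipy.stats.poisson`:

```python
import math

from scipy.stats import poisson

def poisson_pmf(x: int, m: float) -> float:
    """PMF of Poisson(m): m^x * e^(-m) / x!."""
    return m**x * math.exp(-m) / math.factorial(x)

m = 3.0
for x in range(6):
    # Agrees with SciPy's implementation up to floating-point error.
    assert abs(poisson_pmf(x, m) - poisson.pmf(x, m)) < 1e-12
    print(x, poisson_pmf(x, m))
```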


$\mathbf{Derivation.}$

For $x = 0$,

Note that $g(0, 0) = 1$ is reasonable. Since $(0, t + h] = (0, t] \cup (t, t + h]$, assumption (3) gives $g(0, t + h) = g(0, t) \cdot g(0, h)$, and by (1) and (2), $g(0, h) = 1 - \lambda h + o(h)$. Then,

\[\begin{align*} &\displaystyle \lim_{h \to 0} \frac{g(0, t+h) - g(0,t)}{h} = \frac{dg(0,t)}{dt} = - \lambda g(0, t) \\ &\Rightarrow g(0, t) = e^{-\lambda t} \end{align*}\]


For $x \geq 1$,

\[\begin{align*} g(x, w + h) = g(x, w) g(0, h) + g(x - 1, w) g(1, h) + \cdots + g(0, w) g(x, h) \end{align*}\]

By (2), every term involving $g(j, h)$ with $j \geq 2$ is $o(h)$, so dividing by $h$ and letting $h \to 0$ gives

\[\begin{align*} \frac{dg(x, w)}{dw} = - \lambda g(x, w) + \lambda g(x - 1, w) \; \Rightarrow \; g(x, w) = \frac{(\lambda w)^x e^{-\lambda w}}{x!}, \end{align*}\]

where the last step follows by induction on $x$, using the integrating factor $e^{\lambda w}$.
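As a sanity check on this derivation, one can simulate a Poisson process with i.i.d. $\text{Exp}(\lambda)$ interarrival times and compare the empirical distribution of counts in a window of length $w$ against the derived formula. A rough sketch, assuming NumPy and SciPy (parameter values are arbitrary):

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, w, trials = 2.0, 1.5, 50_000

# Count the events falling in (0, w] by accumulating
# exponential interarrival times with rate lam.
counts = np.empty(trials, dtype=int)
for i in range(trials):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)  # next interarrival time
        if t > w:
            break
        n += 1
    counts[i] = n

# Empirical frequencies vs. (lam * w)^x e^{-lam w} / x!
for x in range(6):
    print(x, (counts == x).mean(), poisson.pmf(x, lam * w))
```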



MGF

\[\begin{align*} M(t) &= \mathbb{E}(e^{tX}) \\ &= \sum_{x=0}^\infty e^{tx} \frac{m^x e^{-m}}{x!} \\ &= \sum_{x=0}^\infty \frac{(e^{t}m)^x e^{-e^{t}m}}{x!} e^{m(e^{t}-1)} \\ &= e^{m(e^{t}-1)} \end{align*}\]

In the last step, the summation equals $1$ since it is the total probability mass of a Poisson distribution with parameter $m e^{t}$.
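A quick numerical check of this closed form, assuming NumPy and SciPy: approximate $\mathbb{E}[e^{tX}]$ by a truncated sum over the PMF and compare it to $e^{m(e^t - 1)}$.

```python
import numpy as np
from scipy.stats import poisson

m, t = 3.0, 0.5
x = np.arange(200)          # truncation point; the tail mass beyond is negligible
mgf_sum = np.sum(np.exp(t * x) * poisson.pmf(x, m))
mgf_closed = np.exp(m * (np.exp(t) - 1.0))
print(mgf_sum, mgf_closed)  # the two values agree to machine precision
```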

Mean, Variance

\[\begin{align*} \mathbb{E}[X] &= m \\ \text{Var}[X] &= m \end{align*}\]

Both follow from the MGF: $\mathbb{E}[X] = M'(0) = m$ and $\mathbb{E}[X^2] = M''(0) = m^2 + m$, so $\text{Var}[X] = m^2 + m - m^2 = m$.
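The same values can be read off numerically, e.g. via `scipy.stats.poisson.stats` (a sketch assuming SciPy):

```python
from scipy.stats import poisson

m = 3.0
mean, var = poisson.stats(mu=m, moments="mv")
print(mean, var)  # both equal m = 3.0
```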

Mode (Peak)

Note that if $m$ is an integer, the distribution has two peaks (modes), at $m$ and $m - 1$; otherwise the unique mode is $\lfloor m \rfloor$. This follows from the ratio $\frac{p(x+1)}{p(x)} = \frac{m}{x+1}$:

\[\begin{align*} p(x + 1) \geq p(x) \text{ if } x \leq m - 1 \\ p(x + 1) \leq p(x) \text{ if } x \geq m - 1 \\ \end{align*}\]
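A small check of this mode behavior, assuming NumPy and SciPy; for integer $m$ the two maximal probabilities are equal mathematically, so the comparison uses `np.isclose` rather than exact floating-point equality:

```python
import numpy as np
from scipy.stats import poisson

for m in (4.0, 4.7):  # integer vs. non-integer mean
    x = np.arange(20)
    p = poisson.pmf(x, m)
    modes = x[np.isclose(p, p.max())]
    print(m, modes)   # expect [3 4] for m = 4.0 and [4] for m = 4.7
```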

$\mathbf{Thm\ 1.1.}$ Suppose $X_1, \cdots, X_n$ are independent and each $X_i$ has a Poisson distribution with parameter $\lambda_i$.
Then, $X = \sum_{i=1}^n X_i$ has a Poisson distribution with parameter $\sum_{i=1}^n \lambda_i$.

$\mathbf{Proof.}$

Let $M_i (t)$ be the MGF of $X_i$, and $M(t)$ be the MGF of $X$. Then, by independence,

\[\begin{align*} M_i (t) &= e^{\lambda_i (e^{t} - 1)} \\ M (t) &= \mathbb{E}[e^{tX}] = \prod_{i=1}^n \mathbb{E}[e^{tX_i}] = e^{\sum_{i=1}^n \lambda_i (e^{t} - 1)}, \end{align*}\]

which is the MGF of a Poisson distribution with parameter $\sum_{i=1}^n \lambda_i$. $_\blacksquare$
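The theorem is easy to verify by simulation. A minimal sketch assuming NumPy and SciPy (the rates are chosen arbitrarily):

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lams = [1.0, 2.5, 0.5]

# Sum of independent Poisson samples, one stream per rate.
total = sum(rng.poisson(lam, size=100_000) for lam in lams)

# Empirical PMF of the sum vs. Poisson(sum of the rates).
for x in range(7):
    print(x, (total == x).mean(), poisson.pmf(x, sum(lams)))
```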



Law of rare events

Consider a binomial distribution $B(n, p)$. If the number of trials $n$ is very large and the probability $p$ of the event is very small, the binomial random variable $X$ can be approximated by a Poisson distribution.

$\mathbf{Thm\ 1.2.}$ (Law of rare events). Consider $X \sim B(n, p)$, where $p = p_n$ may depend on $n$.

\[\begin{align*} \displaystyle \lim_{n \to \infty} np = \lambda \Rightarrow X \approx \text{Poisson}(\lambda) \end{align*}\]
$\mathbf{Proof.}$

Consider the PMF of the binomial distribution. As $n \to \infty$ with $np \to \lambda$, it converges to the PMF of the Poisson distribution:

\[\begin{align*} p(x = k) &= \frac{n!}{k! (n-k)!} p^k (1 - p)^{n - k} \\ &= \frac{1}{k!} \cdot \frac{n(n-1)(n-2) \cdots (n-k + 1)}{n^k} \cdot (np)^k \cdot (1 - \frac{np}{n})^{n-k} \\ &\approx \frac{1}{k!} \cdot 1 \cdot \lambda^k \cdot e^{-\lambda} \end{align*}\]
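The convergence can also be seen numerically: fix $\lambda = np$ and let $n$ grow. A sketch assuming NumPy and SciPy:

```python
import numpy as np
from scipy.stats import binom, poisson

lam = 3.0
k = np.arange(30)
for n in (10, 100, 10_000):
    p = lam / n  # keep np = lam fixed while n grows
    err = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)))
    print(n, err)  # the maximum PMF gap shrinks as n increases
```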





