[Statistics] Expectation Inequalities
In this post, we will look at inequalities that apply to expectations, including the Cauchy-Schwarz inequality.
Before we start, let’s prove a useful lemma on which most of the inequalities ahead are based:
$\mathbf{Lemma\ 1.}$ Let $a$ and $b$ be any positive numbers. Let $p$ and $q$ be any positive numbers greater than 1, satisfying $\frac{1}{p} + \frac{1}{q} = 1$. Then
\[\begin{align*} \frac{1}{p} a^p + \frac{1}{q} b^q \geq ab \end{align*}\]with equality if and only if $a^p = b^q$.
$\mathbf{Proof.}$
Let’s consider the function of $a$ (and fix $b$)
\[\begin{align*} g(a)=\frac{1}{p} a^p+\frac{1}{q} b^q-a b . \end{align*}\]We can show the inequality by finding the minimum of this function. Setting the first derivative to zero,
\[\begin{align*} \frac{d}{d a} g(a)=0 \Rightarrow a^{p-1}-b=0 \Rightarrow b=a^{p-1}, \end{align*}\]and since $\frac{d^2}{da^2} g(a) = (p-1) a^{p-2} > 0$, this critical point is the unique minimum. Substituting $b = a^{p-1}$ gives
\[\begin{aligned} \frac{1}{p} a^p+\frac{1}{q}\left(a^{p-1}\right)^q-a \cdot a^{p-1} & =\frac{1}{p} a^p+\frac{1}{q} a^p-a^p \\ & =0 . \end{aligned}\]Thus the minimum of $g$ is $0$, so $g(a) \geq 0$ for all $a > 0$, with equality if and only if $b = a^{p-1}$, i.e., $a^p = b^q$ (since $(p-1)q = p$).
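A quick numerical sanity check (not part of the proof; the helper name and sampling ranges below are arbitrary choices, and NumPy is assumed) can make the lemma concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

def young_gap(a, b, p):
    """(1/p) a^p + (1/q) b^q - a b, where q is the conjugate exponent of p."""
    q = p / (p - 1.0)                      # so that 1/p + 1/q = 1
    return a**p / p + b**q / q - a * b

# Random positive a, b and exponents p > 1: the gap should never be negative.
a = rng.uniform(0.1, 2.0, size=100_000)
b = rng.uniform(0.1, 2.0, size=100_000)
p = rng.uniform(1.1, 4.0, size=100_000)
assert (young_gap(a, b, p) >= -1e-10).all()

# Equality case b = a^(p-1) (equivalently a^p = b^q): the gap is zero up to rounding.
print(np.abs(young_gap(a, a**(p - 1.0), p)).max())
```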
Hölder’s Inequality
Let $X$ and $Y$ be any two random variables, and let $p$ and $q$ be any positive numbers greater than 1, satisfying $\frac{1}{p} + \frac{1}{q} = 1$. Then
\[\begin{aligned} \left| \mathbb{E}\left[XY \right] \right| \leq \mathbb{E} [|XY|] \leq (\mathbb{E}[|X|^p])^{1/p} (\mathbb{E}[|Y|^q])^{1/q} \end{aligned}\]$\mathbf{Proof.}$
The first inequality follows from the triangle inequality (the absolute value of an expectation is at most the expectation of the absolute value). To prove the second one, we can use $\mathbf{Lemma\ 1}$ by setting
\[\begin{aligned} a=\frac{|X|}{\left(\mathbb{E}[|X|^p]\right)^{1 / p}} \quad \text { and } \quad b=\frac{|Y|}{\left(\mathbb{E}[|Y|^q]\right)^{1 / q}}. \end{aligned}\]Then we get
\[\begin{aligned} \frac{1}{p} \frac{|X|^p}{\mathbb{E}[|X|^p]}+\frac{1}{q} \frac{|Y|^q}{\mathbb{E}[|Y|^q]} \geq \frac{|X Y|}{\left(\mathbb{E}[|X|^p]\right)^{1 / p}\left(\mathbb{E}[|Y|^q]\right)^{1 / q}} . \end{aligned}\]Taking expectations of both sides completes the proof, since the expectation of the LHS is $\frac{1}{p} + \frac{1}{q} = 1$.
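To see the inequality in action, here is a small empirical check (purely illustrative; the joint distribution and the exponents are arbitrary choices, and NumPy is assumed). Note that the sample-moment version of Hölder’s inequality holds exactly, not just approximately:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
p, q = 3.0, 1.5                        # conjugate exponents: 1/3 + 2/3 = 1

# Two (dependent) random variables; any joint distribution works.
X = rng.normal(size=n)
Y = 0.5 * X + rng.exponential(size=n)

lhs = np.mean(np.abs(X * Y))                                        # sample E|XY|
rhs = np.mean(np.abs(X)**p)**(1/p) * np.mean(np.abs(Y)**q)**(1/q)   # Hölder bound
print(lhs, rhs, lhs <= rhs)            # the bound is never violated
```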
Cauchy-Schwarz Inequality
As the special case of Hölder’s inequality with $p = q = 2$, for any two random variables $X$ and $Y$, the following is satisfied: \[\begin{aligned} | \mathbb{E}[XY] | \leq \mathbb{E} [|XY|] \leq \sqrt{\mathbb{E}[|X|^2]} \sqrt{\mathbb{E}[|Y|^2]} \end{aligned}\]
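One immediate consequence worth noting: applying the inequality to the centered variables $X - \mathbb{E}[X]$ and $Y - \mathbb{E}[Y]$ gives the covariance bound
\[\begin{aligned} | \operatorname{Cov}(X, Y) | \leq \sqrt{\operatorname{Var}(X)} \sqrt{\operatorname{Var}(Y)}, \end{aligned}\]
which is equivalent to the correlation coefficient satisfying $-1 \leq \rho_{XY} \leq 1$.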
Liapounov’s Inequality
Some other special cases of Hölder’s inequality are also useful. Setting $Y = 1$, we can get Liapounov’s Inequality:
\[\begin{aligned} (\mathbb{E}[| X |^r])^{1/r} \leq (\mathbb{E}[| X |^s])^{1/s} \; (1 \leq r < s < \infty) \end{aligned}\]$\mathbf{Proof.}$
Setting $Y = 1$ in Hölder’s inequality gives $\mathbb{E}[|X|] \leq (\mathbb{E}[|X|^p])^{1/p}$ for $1 < p < \infty$. Replacing $|X|$ by $|X|^r$, we obtain
\[\begin{aligned} \mathbb{E}[| X |^r] \leq (\mathbb{E}[| X |^{pr}])^{1/p}. \end{aligned}\]By setting $s = pr$ and raising both sides to the power $1/r$, the result follows.
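As a numerical illustration (the lognormal distribution and the grid of exponents below are arbitrary choices, assuming NumPy), the sample version of $(\mathbb{E}[|X|^r])^{1/r}$ should be nondecreasing in $r$:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.lognormal(mean=0.0, sigma=0.5, size=1_000_000)

rs = [1.0, 1.5, 2.0, 3.0, 4.0]
norms = [np.mean(np.abs(X)**r)**(1/r) for r in rs]   # sample (E|X|^r)^(1/r)
print([round(v, 4) for v in norms])                  # nondecreasing in r
assert all(x <= y for x, y in zip(norms, norms[1:]))
```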
Minkowski’s Inequality
Let $X$ and $Y$ be any two random variables. For $1 \leq p < \infty$,
\[\begin{aligned} (\mathbb{E}[ | X + Y |^p ])^{1/p} \leq (\mathbb{E}[| X |^p])^{1/p} + (\mathbb{E}[| Y |^p])^{1/p}. \end{aligned}\]$\mathbf{Proof.}$
By the triangle inequality, $| X + Y | \leq | X | + | Y |$, so $\mathbb{E}[|X+Y|^p] \leq \mathbb{E}[|X| \, |X+Y|^{p-1}] + \mathbb{E}[|Y| \, |X+Y|^{p-1}]$.
By Hölder’s inequality,
\[\begin{aligned} \mathbb{E}[|X+Y|^p] \leq & \; (\mathbb{E}[|X|^p])^{1 / p} (\mathbb{E}[|X+Y|^{q(p-1)}])^{1 / q} \\ & + (\mathbb{E}[|Y|^p])^{1 / p} (\mathbb{E}[|X+Y|^{q(p-1)}])^{1 / q}, \end{aligned}\]where $1/p + 1/q = 1$. Since $q(p-1) = p$, dividing both sides by $(\mathbb{E}[|X+Y|^{p}])^{1 / q}$ and noting that $1 - 1/q = 1/p$ completes the proof.
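Finally, a similar empirical check for Minkowski’s inequality (again with arbitrary illustrative distributions and exponent, assuming NumPy); the sample $L^p$ “norms” satisfy the triangle inequality exactly:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 1_000_000, 2.5

X = rng.standard_t(df=5, size=n)
Y = rng.gamma(shape=2.0, scale=1.0, size=n) - 2.0    # centered gamma

def lp_norm(Z):
    """Sample version of (E|Z|^p)^(1/p)."""
    return np.mean(np.abs(Z)**p)**(1/p)

print(lp_norm(X + Y), lp_norm(X) + lp_norm(Y))
print(lp_norm(X + Y) <= lp_norm(X) + lp_norm(Y))     # True
```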