Chebyshev's Inequality and Weak Law of Large Numbers
Proposition (Markov's Inequality)
If X is a random variable that takes only nonnegative values, then for any value $a>0$,

$$P\{X\geq a\}\leq\frac{E[X]}{a}.$$

Proof: Define the indicator variable $I$ by $I=1$ if $X\geq a$ and $I=0$ otherwise. Because $X\geq 0$, we have $I\leq X/a$. Taking expectations of the above yields that

$$E[I]\leq\frac{E[X]}{a},$$

and the result follows since $E[I]=P\{X\geq a\}$. $\blacksquare$
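As a quick numerical sanity check of Markov's inequality (a minimal sketch assuming NumPy; the exponential distribution and the threshold $a=5$ are arbitrary illustrative choices, not part of the result):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)  # nonnegative samples with E[X] = 2

a = 5.0
empirical = np.mean(x >= a)      # Monte Carlo estimate of P{X >= a}
markov_bound = x.mean() / a      # E[X]/a, estimated from the same sample

print(empirical, markov_bound)   # the bound should hold: empirical <= markov_bound
```

Here the true tail probability is $e^{-5/2}\approx 0.082$, well below the bound $E[X]/a = 0.4$; Markov's inequality is crude but holds for any nonnegative distribution.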
Proposition (Chebyshev's Inequality)
If X is a random variable with finite mean $\mu$ and variance $\sigma^2$, then for any value $k>0$,

$$P\{|X-\mu|\geq k\}\leq\frac{\sigma^2}{k^2}.$$

Proof: Since $(X-\mu)^2$ is a nonnegative random variable, applying Markov's inequality with $a=k^2$ gives

$$P\{(X-\mu)^2\geq k^2\}\leq\frac{E[(X-\mu)^2]}{k^2}.\qquad (2.1)$$

Since $(X-\mu)^2\geq k^2$ if and only if $|X-\mu|\geq k$, (2.1) is equivalent to

$$P\{|X-\mu|\geq k\}\leq\frac{E[(X-\mu)^2]}{k^2}=\frac{\sigma^2}{k^2}.\qquad\blacksquare$$
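The same kind of check works for Chebyshev's inequality (a sketch assuming NumPy; the normal distribution with $\mu=10$, $\sigma=3$ and the choice $k=2\sigma$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 10.0, 3.0
x = rng.normal(loc=mu, scale=sigma, size=100_000)

k = 6.0                                   # two standard deviations
empirical = np.mean(np.abs(x - mu) >= k)  # estimate of P{|X - mu| >= k}
bound = sigma**2 / k**2                   # Chebyshev's bound: 9/36 = 0.25

print(empirical, bound)
```

For a normal distribution the true two-sigma tail probability is about 0.046, far under the bound 0.25; Chebyshev trades tightness for the fact that it needs only the mean and variance.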
Proposition
If $\mathrm{Var}(X)=0$, then $P\{X=\mu\}=1$, where $\mu=E[X]$.
In other words, the only random variables having variances equal to 0 are those
that are constant with probability 1.
Proof:
By Chebyshev's inequality we have, for any $n\geq 1$,

$$P\left\{|X-\mu|>\frac{1}{n}\right\}\leq n^2\,\mathrm{Var}(X)=0.$$

Letting $n\rightarrow\infty$ and using the continuity of probability gives

$$0=\lim_{n\rightarrow\infty}P\left\{|X-\mu|>\frac{1}{n}\right\}=P\left\{\lim_{n\rightarrow\infty}\left\{|X-\mu|>\frac{1}{n}\right\}\right\}=P\{X\neq\mu\}.\qquad\blacksquare$$
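The proposition is easy to see empirically (a trivial sketch assuming NumPy; the constant value 7 is arbitrary):

```python
import numpy as np

x = np.full(1000, 7.0)              # samples from a random variable constant at 7
print(x.var())                      # 0.0: the sample variance is zero
print(bool(np.all(x == x.mean())))  # True: every sample equals the mean
```

Conversely, any sample containing two distinct values has strictly positive variance, matching the statement that zero variance forces $X=\mu$ with probability 1.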
Theorem (The Weak Law of Large Numbers)
Let $X_1, X_2, \ldots$ be a sequence of independent and identically distributed
random variables, each having finite mean $E[X_i]=\mu$.
Then, for any $\varepsilon>0$,

$$P\left\{\left|\frac{X_1+\cdots+X_n}{n}-\mu\right|\geq\varepsilon\right\}\rightarrow 0\qquad\text{as } n\rightarrow\infty.$$

Proof: We prove the theorem under the additional assumption that the random variables have finite variance $\sigma^2$. Since

$$E\left[\frac{X_1+\cdots+X_n}{n}\right]=\mu\qquad\text{and}\qquad\mathrm{Var}\left(\frac{X_1+\cdots+X_n}{n}\right)=\frac{\sigma^2}{n},$$

it follows from Chebyshev's inequality that

$$P\left\{\left|\frac{X_1+\cdots+X_n}{n}-\mu\right|\geq\varepsilon\right\}\leq\frac{\sigma^2}{n\varepsilon^2},$$

and the result follows by letting $n\rightarrow\infty$. $\blacksquare$
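The weak law can be watched in action (a sketch assuming NumPy; Uniform(0,1) samples, $\varepsilon=0.05$, and the sample sizes are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, var, eps = 0.5, 1.0 / 12.0, 0.05  # Uniform(0,1): mean 1/2, variance 1/12

results = []
for n in (10, 100, 2500):
    samples = rng.uniform(0.0, 1.0, size=(2000, n))      # 2000 sample means of size n
    means = samples.mean(axis=1)
    deviation_prob = np.mean(np.abs(means - mu) >= eps)  # estimate of P{|mean - mu| >= eps}
    bound = var / (n * eps**2)                           # Chebyshev's bound sigma^2/(n eps^2)
    results.append(deviation_prob)
    print(n, deviation_prob, min(bound, 1.0))
```

As $n$ grows, both the empirical deviation probability and the Chebyshev bound $\sigma^2/(n\varepsilon^2)$ shrink toward 0, exactly as the theorem asserts.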