The Central Limit Theorem

Theorem (The Central Limit Theorem)
Let $X_1,X_2,\ldots$ be a sequence of independent and identically distributed random variables each having mean $\mu$ and variance $\sigma^2$. Then the distribution of

$\displaystyle\frac{X_1+\cdots+X_n -n \mu}{\sigma\sqrt{n}}$
tends to the standard normal as $n\rightarrow\infty$. That is,
$P\left\{\displaystyle\frac{X_1+\cdots+X_n -n \mu}{\sigma\sqrt{n}}\leq a\right\}
\rightarrow\frac{1}{\sqrt{2\pi}}\int_{-\infty}^ae^{-x^2/2}dx$
as $n\rightarrow\infty$.

Remark
Although the theorem states only that, for each $a$,

$P\left\{\displaystyle\frac{X_1+\cdots+X_n-n\mu}{\sigma\sqrt{n}}\leq a\right\}
\rightarrow\Phi(a)$
it can, in fact, be shown that the convergence is uniform in $a$. [We say that $f_n(a)\rightarrow f(a)$ uniformly in $a$ if, for each $\varepsilon>0$, there exists an $N$ such that $\vert f_n(a)-f(a)\vert<\varepsilon$ for all $a$ whenever $n\geq N$.]
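The convergence in the theorem can be observed numerically. The sketch below is an illustrative assumption, not from the text: it draws the $X_i$ as Exponential(1) variables (so $\mu=1$ and $\sigma^2=1$), forms the normalized sum, and compares the empirical value of $P\{(X_1+\cdots+X_n-n\mu)/(\sigma\sqrt{n})\leq a\}$ with $\Phi(a)$ computed from the error function.

```python
import math
import random

def phi(a):
    # Standard normal CDF, expressed via the error function:
    # Phi(a) = (1/2)(1 + erf(a / sqrt(2))).
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

def clt_probability(n, a, trials=20000, seed=0):
    # Empirical estimate of P{(X_1 + ... + X_n - n*mu) / (sigma*sqrt(n)) <= a}
    # for i.i.d. Exponential(1) variables (mu = 1, sigma = 1).
    rng = random.Random(seed)
    mu, sigma = 1.0, 1.0
    count = 0
    for _ in range(trials):
        s = sum(rng.expovariate(1.0) for _ in range(n))
        z = (s - n * mu) / (sigma * math.sqrt(n))
        if z <= a:
            count += 1
    return count / trials

# For large n the empirical probability should be close to Phi(a).
print(clt_probability(200, 1.0), phi(1.0))
```

The gap between the two printed values shrinks as $n$ and the number of trials grow, consistent with the theorem (and, per the remark, the agreement is not confined to any single choice of $a$).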

Theorem (Central Limit Theorem for Independent Random Variables)
Let $X_1,X_2,\ldots$ be a sequence of independent random variables having respective means and variances $\mu_i=E[X_i]$, $\sigma_i^2=\mathrm{Var}(X_i)$. If
(a) the $X_i$ are uniformly bounded; that is, for some $M$, $P\{\vert X_i\vert<M\}=1$ for all $i$, and
(b) $\displaystyle\sum_{i=1}^\infty\sigma_i^2=\infty$,
then

$P\left\{\displaystyle\frac{\displaystyle\sum_{i=1}^n(X_i-\mu_i)}
{\sqrt{\displaystyle\sum_{i=1}^n\sigma_i^2}}\leq a\right\}\rightarrow\Phi(a)$
as $n\rightarrow\infty$.
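This version of the theorem does not require the $X_i$ to be identically distributed. A minimal simulation sketch, under assumed distributions chosen only for illustration: take $X_i$ uniform on $[-c_i,c_i]$ with $c_i$ cycling through $\{1,2,3\}$, so the $X_i$ are uniformly bounded by $M=3$, have mean $0$ and variance $c_i^2/3$, and the variances sum to infinity, satisfying both hypotheses.

```python
import math
import random

def normalized_sum(n, rng):
    # X_i is uniform on [-c_i, c_i] with c_i = 1 + (i mod 3).
    # Hypotheses: |X_i| < M = 3 with probability 1 (uniformly bounded),
    # and the variances c_i^2 / 3 sum to infinity.
    total, var_sum = 0.0, 0.0
    for i in range(1, n + 1):
        c = 1.0 + (i % 3)
        total += rng.uniform(-c, c)
        var_sum += c * c / 3.0
    # sum(X_i - mu_i) / sqrt(sum sigma_i^2), with every mu_i = 0 here.
    return total / math.sqrt(var_sum)

def empirical_cdf(n, a, trials=20000, seed=1):
    # Empirical estimate of the probability in the theorem's conclusion.
    rng = random.Random(seed)
    return sum(normalized_sum(n, rng) <= a for _ in range(trials)) / trials

print(empirical_cdf(100, 0.0))
```

Since each $X_i$ here is symmetric about $0$, the exact probability at $a=0$ is $\Phi(0)=1/2$, which the empirical estimate should approach.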

Theorem (The Strong Law of Large Numbers)
Let $X_1,X_2,\ldots$ be a sequence of independent and identically distributed random variables, each having a finite mean $\mu=E[X_i]$. Then, with probability 1,

$\displaystyle\frac{X_1+X_2+\cdots+X_n}{n}\rightarrow\mu$
as $n\rightarrow\infty$.
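The strong law can likewise be checked by simulation. A minimal sketch, assuming Bernoulli variables with success probability $0.3$ purely for illustration: the sample mean of $n$ i.i.d. draws should settle near $\mu=0.3$ as $n$ grows.

```python
import random

def sample_mean(n, seed=2):
    # Average of n i.i.d. Bernoulli(0.3) variables; mu = E[X_i] = 0.3.
    rng = random.Random(seed)
    return sum(rng.random() < 0.3 for _ in range(n)) / n

# The sample mean approaches mu = 0.3 as n increases.
print(sample_mean(1000), sample_mean(100000))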