Expectation of Sums of Random Variables

Example (Expectation of a Binomial Random Variable)
Let X be a binomial random variable with parameters n and p. Recalling that such a random variable represents the number of successes in n independent trials when each trial has probability p of being a success, we have that $X=X_1+X_2+\cdots+X_n$ where
$X_i=\left\{
\begin{array}{ll}
1&\mbox{ if the }i\mbox{th trial is a success} \\ \\
0&\mbox{ if the }i\mbox{th trial is a failure}
\end{array}\right .$
Hence, $X_i$ is a Bernoulli random variable having expectation $E[X_i]=1(p)+0(1-p)=p$. Thus
$E[X]=E[X_1]+E[X_2]+\cdots+E[X_n]=np\qquad\rule[0.02em]{1.0mm}{1.5mm}$
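The identity $E[X]=np$ can be checked empirically. The sketch below (illustrative only; the parameters $n=10$, $p=0.3$ are chosen arbitrarily) builds $X$ exactly as in the decomposition above, as a sum of $n$ Bernoulli indicators, and averages over many replications:

```python
import random

def simulate_binomial_mean(n, p, trials=100_000, seed=0):
    """Estimate E[X] for X ~ Binomial(n, p) by averaging sums
    of n independent Bernoulli(p) indicators."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        # X = X_1 + ... + X_n, each X_i an indicator of success
        total += sum(1 for _ in range(n) if rng.random() < p)
    return total / trials

# With n = 10 and p = 0.3, the estimate should be close to np = 3.
print(simulate_binomial_mean(10, 0.3))
```

Note that the simulation never computes the binomial probability mass function; linearity of expectation is what makes the indicator decomposition enough.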

Example (Mean of a Negative Binomial Random Variable)
If independent trials, having a constant probability p of being successes, are performed, determine the expected number of trials required to amass a total of r successes.
Solution:
If X denotes the number of trials needed to amass a total of r successes, then X is a negative binomial random variable. It can be represented by $X=X_1+X_2+\cdots+X_r$ where
X1 is the number of trials required to obtain the first success,
X2 the number of additional trials until the second success is obtained,
X3 the number of additional trials until the third success is obtained, and so on.

That is, Xi represents the number of additional trials required, after the (i-1)st success, until a total of i successes are amassed. A little thought reveals that each of the random variables Xi is a geometric random variable with parameter p. Hence, $E[X_i]=1/p,i=1,2,\ldots,r$; and thus

$E[X]=E[X_1]+\cdots+E[X_r]=\displaystyle\frac{r}{p}\qquad\rule[0.02em]{1.0mm}{1.5mm}$
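The conclusion $E[X]=r/p$ can likewise be verified by simulation. The sketch below (the parameters $r=3$, $p=0.5$ are arbitrary choices for illustration) runs trials until $r$ successes have been amassed and averages the trial counts:

```python
import random

def simulate_negative_binomial_mean(r, p, trials=100_000, seed=0):
    """Estimate the expected number of independent trials, each
    succeeding with probability p, needed to amass r successes."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        count, successes = 0, 0
        while successes < r:
            count += 1
            if rng.random() < p:
                successes += 1
        total += count
    return total / trials

# With r = 3 and p = 0.5, the estimate should be close to r/p = 6.
print(simulate_negative_binomial_mean(3, 0.5))
```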

Example (Mean of a Hypergeometric Random Variable)
If n balls are randomly selected from an urn containing N balls of which m are white, find the expected number of white balls selected.
Solution:
Let X denote the number of white balls selected, and represent X as $X=X_1+\cdots+X_m$ where
$X_i=\left\{
\begin{array}{ll}
1&\mbox{ if the }i\mbox{th white ball is selected} \\ \\
0&\mbox{ otherwise }
\end{array}\right .$
Now,
$\begin{array}{rcl}
E[X_i]&=&P\{X_i=1\} \\ \\
&=&P\{i\mbox{th white ball is selected}\} \\ \\
&=&\displaystyle\frac{\displaystyle{N-1\choose n-1}}{\displaystyle{N\choose n}} \\ \\
&=&\displaystyle\frac{n}{N}
\end{array}$
Hence, $E[X]=E[X_1]+\cdots+E[X_m]=\displaystyle\frac{mn}{N}$.

We could also have obtained the above result by using the alternative representation $X=Y_1+\cdots+Y_n$ where

$Y_i=\left\{
\begin{array}{ll}
1&\mbox{ if the }i\mbox{th ball selected is white} \\ \\
0&\mbox{ otherwise }
\end{array}\right .$
Since the ith ball selected is equally likely to be any of the N balls, it follows that $E[Y_i]=\displaystyle\frac{m}{N}$, so $E[X]=E[Y_1]+\cdots+E[Y_n]=\displaystyle\frac{nm}{N}\qquad\rule[0.02em]{1.0mm}{1.5mm}$
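The result $E[X]=mn/N$ can be checked by simulation as well. The sketch below (the parameters $N=20$, $m=8$, $n=5$ are arbitrary illustrative choices) draws n balls without replacement and counts the white ones:

```python
import random

def simulate_hypergeometric_mean(N, m, n, trials=100_000, seed=0):
    """Estimate the expected number of white balls when n balls are
    drawn without replacement from N balls, m of which are white."""
    rng = random.Random(seed)
    # Label balls 0..N-1; indices below m are the white balls.
    balls = list(range(N))
    total = 0
    for _ in range(trials):
        drawn = rng.sample(balls, n)  # sample without replacement
        total += sum(1 for b in drawn if b < m)
    return total / trials

# With N = 20, m = 8, n = 5, the estimate should be near mn/N = 2.
print(simulate_hypergeometric_mean(20, 8, 5))
```

Notice that although the draws here are dependent (sampling is without replacement), linearity of expectation requires no independence, which is why both representations of X give the same mean.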