Expected Value of a Binomial Distribution (The Long Way)

Recalling that, for the binomial distribution, the probability of seeing $k$ successes in $n$ trials, where the probability of success in each trial is $p$ (and $q = 1-p$), is given by $$P(X=k) = ({}_n C_k) p^k q^{n-k}$$ we can find the expected value in the usual way, by computing the sum $$E(X) = \sum x \cdot P(X=x)$$ or, written more carefully, $$E(X) = \sum_{k=0}^n k \cdot ({}_n C_k) p^k q^{n-k}$$
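As a quick concrete check of these formulas (the case $n = 2$ is chosen purely for illustration), the sum has only three terms and can be computed directly: $$E(X) = 0 \cdot ({}_2 C_0)\, p^0 q^2 + 1 \cdot ({}_2 C_1)\, p^1 q^1 + 2 \cdot ({}_2 C_2)\, p^2 q^0 = 2pq + 2p^2 = 2p(q+p) = 2p$$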

Let's play around with this a bit algebraically...

$$\begin{array}{rcl} E(X) & = & \displaystyle{\sum_{k=0}^n k \cdot ({}_n C_k) p^k q^{n-k}}\\\\ & = & \displaystyle{\sum_{k=1}^n k \cdot ({}_n C_k) p^k q^{n-k}} \quad \textrm{ as the first term is zero}\\\\ & = & \displaystyle{\sum_{k=1}^n k \cdot \frac{n!}{k! (n-k)!} p^k q^{n-k}}\\\\ & = & \displaystyle{\sum_{k=1}^n \frac{n!}{(k-1)!(n-k)!} p^k q^{n-k}}\\\\ & = & \displaystyle{n \sum_{k=1}^n \frac{(n-1)!}{(k-1)!(n-k)!} p^k q^{n-k}} \quad \textrm{ note that $(n-1) - (k-1) = n-k$, so}\\\\ & = & \displaystyle{n \sum_{k=1}^n ({}_{n-1} C_{k-1}) p^k q^{n-k}}\\\\ & = & \displaystyle{n \sum_{k=1}^n ({}_{n-1} C_{k-1}) p^k q^{(n-1)-(k-1)}} \quad \textrm{ making the same substitution in a different place}\\\\ & = & \displaystyle{np \sum_{k=1}^n ({}_{n-1} C_{k-1}) p^{k-1} q^{(n-1)-(k-1)}}\\\\ & = & \displaystyle{np \sum_{j=0}^m ({}_m C_j) p^j q^{m-j}} \quad \textrm{ making the substitutions $j=k-1$ and $m = n-1$}\\\\ & = & \displaystyle{np} \quad \textrm{ as by the binomial theorem, } \displaystyle{\sum_{j=0}^m ({}_m C_j) p^j q^{m-j} = (p+q)^m = 1^m = 1} \end{array}$$
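As a sanity check on the algebra, one can also compare the original sum against $np$ numerically. Below is a minimal Python sketch (the function name and the sample values of $n$ and $p$ are just illustrative choices, not anything taken from the derivation above):

```python
from math import comb

def binomial_expected_value(n, p):
    """Compute E(X) = sum over k of k * C(n, k) * p^k * q^(n-k) directly."""
    q = 1 - p
    return sum(k * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# Compare the direct sum with the closed form n*p for a few illustrative cases.
for n, p in [(10, 0.3), (25, 0.5), (40, 0.9)]:
    print(f"n={n}, p={p}: direct sum = {binomial_expected_value(n, p):.10f}, n*p = {n * p:.10f}")
```

Up to floating-point rounding, the two values agree for any $n$ and $p$ you try, just as the derivation says they must.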

We have just shown that the expected value, $E(X)$, of a binomial distribution associated with $n$ trials, where the probability of success in each trial is $p$, is given by $$E(X) = np$$
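As a purely illustrative example: if a fair die is rolled $60$ times and we count a roll of six as a success (so $p = \frac{1}{6}$), the expected number of sixes is $E(X) = np = 60 \cdot \frac{1}{6} = 10$.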

Isn't that just a beautifully simple result? It makes one wonder if there is an easier way, don't you think? ...and what about the variance of a binomial distribution?