The expected value associated with a discrete random variable $X$, denoted by either $E(X)$ or $\mu$ (depending on context), is the theoretical mean of $X$.
This means that the expected value should be identical to the mean of a set of realizations of this random variable, provided the distribution of that set agrees exactly with the associated probability mass function (presuming such a set exists).
As an example, suppose we flip a fair coin three times and let $X$ count the number of heads seen. The probability mass function for $X$ is shown below.
$$\begin{array}{c|c|c|c|c}X & 0 & 1 & 2 & 3\\ \hline P(X) & 1/8 & 3/8 & 3/8 & 1/8\\ \end{array}$$If we were to do this 200 times, we would "expect" to see $$\begin{array}{c|c|c|c|c}X & 0 & 1 & 2 & 3\\ \hline \textrm{Count} & 25 & 75 & 75 & 25\\ \end{array}$$
The mean of this theoretical distribution would then be
$$\mu = \frac{0 \cdot 25 + 1 \cdot 75 + 2 \cdot 75 + 3 \cdot 25}{200}$$But think about where these numbers came from -- we could write instead:
$$\mu = \frac{ 0 \cdot 200 \cdot (1/8) + 1 \cdot 200 \cdot (3/8) + 2 \cdot 200 \cdot (3/8) + 3 \cdot 200 \cdot (1/8)}{200}$$We can then factor out a 200 from every term in the numerator, which would cancel with the 200 in the denominator, yielding
$$\mu = 0 \cdot (1/8) + 1 \cdot (3/8) + 2 \cdot (3/8) + 3 \cdot (1/8) = 1.5$$So we expect to see an average of 1.5 heads per trial. Notice that the 200 has completely vanished from this last calculation! This is no coincidence -- the same thing would have happened had the 200 been 1000, 10 million, or 13,798,235,114.
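As a quick numerical sanity check (this snippet is our own illustration -- the variable and function names below are not part of the example above), the following Python code computes the mean both ways: once from the hypothetical counts out of $n$ trials, and once directly from the probabilities. The choice of $n$ makes no difference.

```python
# Outcomes (number of heads in three flips) and their probabilities
outcomes = [0, 1, 2, 3]
probs = [1/8, 3/8, 3/8, 1/8]

def mean_from_counts(n):
    """Mean computed from the 'expected' counts out of n trials."""
    counts = [n * p for p in probs]               # e.g. 25, 75, 75, 25 when n = 200
    return sum(x * c for x, c in zip(outcomes, counts)) / n

# Mean computed directly from the probabilities -- no trial count involved
mu = sum(x * p for x, p in zip(outcomes, probs))

print(mean_from_counts(200))         # 1.5
print(mean_from_counts(10_000_000))  # 1.5
print(mu)                            # 1.5
```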
As you can see, the "expected value" depends only on the outcome values and the probabilities with which those outcomes occur. We just multiply each outcome by its corresponding probability and add the results!
With this in mind, and assuming the discrete random variable $X$ has outcome/sample space $S$ and probability mass function $P$, the expected value is given by
$$E(X) = \sum_{x \in S} \left[x \cdot P(x)\right]$$
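In code, this formula is a one-line sum over the probability mass function. The sketch below (the function name and the dictionary representation of the pmf are our own choices, not anything prescribed above) applies it to the coin-flipping example.

```python
# A direct translation of E(X) = sum over x in S of x * P(x),
# with the pmf represented as a dict mapping each outcome to its probability.
def expected_value(pmf):
    return sum(x * p for x, p in pmf.items())

# The coin-flipping example from above
coin_pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
print(expected_value(coin_pmf))  # 1.5
```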
We will often need to find the expected value of the sum or difference of two or more random variables. To this end, suppose both $X$ and $Y$ are discrete random variables with outcome spaces $S_x = \{x_1, x_2, \ldots\}$ and $S_y = \{y_1, y_2, \ldots\}$, respectively.
One can show without too much trouble that the expected value of a sum of two random variables is the sum of their individual expected values. That is to say, $$E(X+Y) = E(X) + E(Y)$$
As a first step in seeing why this is true, note that when talking about two random variables, one needs to worry about their joint distribution -- that is to say, rather than dealing with $P(X=x)$ alone, one needs to deal with the joint probabilities $P(X=x \textrm{ and } Y=y)$.
If it helps, one can think of $X$ and $Y$ as being the separate outcomes of two experiments (or games) -- which may or may not be related. For example, $X$ could be how much one wins in one hand of poker, while $Y$ might be how much one wins in another hand. Then $X+Y$ would be how much one won playing both hands.
Thus, $$\begin{array}{rcl} E(X+Y) &=& \displaystyle{\sum_{x \in S_x,\,y \in S_y} (x+y) \cdot P(X=x \textrm{ and } Y=y)}\\\\ &=& \displaystyle{\sum_{x \in S_x,\,y \in S_y} x \cdot P(X=x \textrm{ and } Y=y) + \sum_{x \in S_x,\,y \in S_y} y \cdot P(X=x \textrm{ and } Y=y)} \end{array}$$ Consider the first of these sums. Note that $$\begin{array}{rcl} \displaystyle{\sum_{x \in S_x,\,y \in S_y} x \cdot P(X=x \textrm{ and } Y=y)} &=& \displaystyle{\sum_{x \in S_x} \left[ \sum_{y \in S_y} x \cdot P(X=x \textrm{ and } Y=y) \right]}\\\\ &=& \displaystyle{\sum_{x \in S_x} \left[ x \sum_{y \in S_y} P(X=x \textrm{ and } Y=y) \right]}\\\\ &=& \displaystyle{\sum_{x \in S_x} x \cdot P(X=x)}\\\\ &=& E(X) \end{array}$$ (The third equality holds because summing the joint probabilities over all possible values of $y$ recovers the marginal probability $P(X=x)$.) Similarly, $$\displaystyle{\sum_{x \in S_x,\,y \in S_y} y \cdot P(X=x \textrm{ and } Y=y) = E(Y)}$$ Combining these, we have $$E(X+Y) = E(X) + E(Y)$$
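As a numerical check, the snippet below uses a small joint distribution of our own invention, deliberately chosen so that $X$ and $Y$ are *not* independent -- notice that the argument above never used independence, only the marginal sums.

```python
# Joint pmf P(X = x and Y = y), keyed by (x, y).
# X and Y are deliberately dependent here; the identity still holds.
joint = {
    (0, 0): 0.25,  (0, 2): 0.125,
    (1, 0): 0.125, (1, 2): 0.5,
}

# Marginal expected values, obtained by summing the joint pmf over the other variable
E_X = sum(x * p for (x, y), p in joint.items())   # 0.625
E_Y = sum(y * p for (x, y), p in joint.items())   # 1.25

# Expected value of the sum, computed directly from the joint pmf
E_sum = sum((x + y) * p for (x, y), p in joint.items())

print(E_X + E_Y)  # 1.875
print(E_sum)      # 1.875 -- matches E(X) + E(Y)
```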
One can also show (even more quickly) that the expected value of some multiple of a random variable is that same multiple of the expected value of that random variable. That is to say, for any constant $c$, $$E(cX) = c \cdot E(X)$$
To see this, note $$\begin{array}{rcl} E(cX) &=& \displaystyle{\sum_{x \in S_x} \left[ cx \cdot P(x) \right]}\\\\ &=& c \displaystyle{\sum_{x \in S_x} \left[ x \cdot P(x) \right]}\\\\ &=& c \cdot E(X) \end{array}$$
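A quick numerical illustration, again using the coin-flipping pmf from earlier (the choice $c = 10$ -- say, winning 10 dollars per head -- is ours, purely for concreteness):

```python
# Scaling every outcome by c scales the expected value by c.
coin_pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
c = 10  # e.g., winning $10 per head

E_X  = sum(x * p for x, p in coin_pmf.items())       # 1.5
E_cX = sum(c * x * p for x, p in coin_pmf.items())

print(c * E_X)  # 15.0
print(E_cX)     # 15.0 -- matches c * E(X)
```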
Combining these two properties (i.e., $E(X + Y) = E(X) + E(Y)$ and $E(cX) = cE(X)$) with $c = -1$, we have $$E(X - Y) = E(X + (-1)Y) = E(X) + (-1) \cdot E(Y) = E(X) - E(Y)$$ and thus arrive at the more general result $$E(X \pm Y) = E(X) \pm E(Y)$$
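Checking the difference case numerically with the same (dependent) joint distribution used earlier -- again, our own illustrative numbers:

```python
# Same made-up joint pmf as before; X and Y are dependent, yet the identity holds.
joint = {
    (0, 0): 0.25,  (0, 2): 0.125,
    (1, 0): 0.125, (1, 2): 0.5,
}

E_X = sum(x * p for (x, y), p in joint.items())           # 0.625
E_Y = sum(y * p for (x, y), p in joint.items())           # 1.25
E_diff = sum((x - y) * p for (x, y), p in joint.items())

print(E_X - E_Y)  # -0.625
print(E_diff)     # -0.625 -- matches E(X) - E(Y)
```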