The expected value, or mean, of a random variable is a measure of the central location for the random variable. The formula for the expected value of a discrete random variable X follows.

The expected value of a discrete random variable X is defined as

\(E\left( X \right)\; =\; \mu \; =\; \sum_{x}^{}{x\,f\left( x \right)}\)
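As a minimal sketch of this definition, the snippet below computes \(E(X) = \sum_x x f(x)\) for a hypothetical discrete distribution (a fair six-sided die, so \(f(x) = 1/6\) for \(x = 1, \ldots, 6\)); exact fractions are used to avoid floating-point rounding:

```python
from fractions import Fraction

def expected_value(pmf):
    """E(X) = sum over x of x * f(x), with pmf given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

# Hypothetical example: a fair six-sided die, f(x) = 1/6 for x = 1, ..., 6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expected_value(die))  # 7/2, i.e. 3.5
```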

**Properties of Mathematical Expectations [Expected Values]**

1. The expected value of a constant ‘C’ is equal to the constant, i.e.,

\(E\left( C \right)\; =\; C\)

2. The expected value of the product of a constant ‘C’ and a random variable X is equal to the constant times the expected value of the random variable, i.e.,

\(E\left( C X\right)\; =\; C \;E(X)\)

3. The expected value of the sum of a random variable X and a constant ‘C’ is the sum of the expected value of the random variable and the constant, i.e.,

\(E\left( X +C \right)\; =\; E(X ) +C\)
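Properties 1–3 can be checked numerically. The sketch below (continuing the hypothetical fair-die example, with an illustrative constant C = 4) scales and shifts the outcomes of the pmf and confirms that the mean scales and shifts accordingly:

```python
from fractions import Fraction

def E(pmf):
    """E(X) = sum over x of x * f(x)."""
    return sum(x * p for x, p in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}  # hypothetical fair die
C = Fraction(4)                                 # illustrative constant

# Property 1: a constant 'random variable' takes value C with probability 1.
assert E({C: Fraction(1)}) == C

# Property 2: E(CX) = C * E(X) -- scaling every outcome scales the mean.
scaled = {C * x: p for x, p in die.items()}
assert E(scaled) == C * E(die)

# Property 3: E(X + C) = E(X) + C -- shifting every outcome shifts the mean.
shifted = {x + C: p for x, p in die.items()}
assert E(shifted) == E(die) + C
```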

4. The expected value of the product of two independent random variables is equal to the product of their individual expected values, i.e.,

\(E\left( XY \right)\; =\; E(X ) \;E(Y)\)

5. The expected value of the sum of two independent random variables is equal to the sum of their individual expected values, i.e.,

\(E\left( X+Y \right)\; =\; E(X ) +E(Y)\)
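Properties 4 and 5 rely on independence, under which the joint pmf factors as \(f(x, y) = f(x)\,f(y)\). The sketch below verifies both properties exactly for two hypothetical independent fair dice X and Y by summing over the joint distribution:

```python
from fractions import Fraction
from itertools import product

def E(pmf):
    """E(X) = sum over x of x * f(x)."""
    return sum(x * p for x, p in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}  # hypothetical fair die

# Under independence the joint pmf is the product f(x) * f(y).
pairs = list(product(die.items(), die.items()))
E_product = sum(x * y * px * py for (x, px), (y, py) in pairs)
E_sum = sum((x + y) * px * py for (x, px), (y, py) in pairs)

assert E_product == E(die) * E(die)  # Property 4: E(XY) = E(X) E(Y)
assert E_sum == E(die) + E(die)      # Property 5: E(X+Y) = E(X) + E(Y)
```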

6. The variance of the product of a constant and a random variable X is equal to the constant squared times the variance of the random variable X. i.e.,

\(Var\left( CX \right)\; =\; C^2\; Var(X)\)

7. The variance of the sum of two independent random variables is equal to the sum of their individual variances. Also, the variance of the difference of two independent random variables is equal to the sum of their individual variances. i.e.,

\(Var(X+Y) = Var(X-Y) = Var(X) + Var(Y)\)
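The two variance properties can likewise be checked exactly. The sketch below (again using hypothetical independent fair dice and an illustrative constant C = 3) builds the pmfs of CX, X + Y, and X − Y, and confirms both identities:

```python
from fractions import Fraction
from itertools import product

def E(pmf):
    return sum(x * p for x, p in pmf.items())

def Var(pmf):
    """Var(X) = E[(X - mu)^2], computed exactly from the pmf."""
    mu = E(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}  # hypothetical fair die
C = Fraction(3)                                 # illustrative constant

# Property 6: Var(CX) = C^2 Var(X).
scaled = {C * x: p for x, p in die.items()}
assert Var(scaled) == C ** 2 * Var(die)

# Property 7: for independent X, Y, the sum and difference have the
# same variance, equal to Var(X) + Var(Y).
pmf_sum, pmf_diff = {}, {}
for (x, px), (y, py) in product(die.items(), die.items()):
    pmf_sum[x + y] = pmf_sum.get(x + y, 0) + px * py
    pmf_diff[x - y] = pmf_diff.get(x - y, 0) + px * py

assert Var(pmf_sum) == Var(die) + Var(die)
assert Var(pmf_diff) == Var(die) + Var(die)
```

Note that the difference X − Y has the *sum* of the variances, not the difference: subtracting an independent variable adds uncertainty, which is what property 7 captures.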
