
4.4 Chapter Summary

This chapter introduced expectations as a tool for summarizing the center and breadth of a distribution.

Interactive Tools:

  1. Law of Large Numbers Interactive - Use this interactive to watch sample averages converge to the expected value as the number of samples grows.
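The convergence the interactive illustrates is easy to reproduce in a few lines. This is a minimal sketch (not the interactive itself) that simulates fair die rolls, whose expected value is 3.5, and prints the running sample average for increasing sample sizes:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def running_average(n):
    """Average of n rolls of a fair six-sided die."""
    total = 0
    for _ in range(n):
        total += random.randint(1, 6)
    return total / n

# As n grows, the sample average settles near E[X] = 3.5.
for n in [10, 1000, 100000]:
    print(n, running_average(n))
```

With only 10 rolls the average can land far from 3.5; by 100,000 rolls it is typically within a few hundredths, which is the Law of Large Numbers at work.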

Expectations

Definitions and examples are all available in Section 4.1.

  1. The expected value of a random variable, $\mathbb{E}[X]$, is the weighted average of the possible values $x$ against the PMF/PDF:

    $$\mathbb{E}[X] = \begin{cases} \sum_{\text{all } x} x\,\text{PMF}(x) & \text{if discrete} \\ \int_{\text{all } x} x\,\text{PDF}(x)\,dx & \text{if continuous} \end{cases}$$
    • The expected value is equivalent to the center of mass of the distribution

    • Long run sample averages converge to the expected value

  2. The expected value of a function of a random variable, $\mathbb{E}[g(X)]$, is the weighted average of $g(x)$ over each $x$, weighted by the PMF/PDF:

    $$\mathbb{E}[g(X)] = \begin{cases} \sum_{\text{all } x} g(x)\,\text{PMF}(x) & \text{if discrete} \\ \int_{\text{all } x} g(x)\,\text{PDF}(x)\,dx & \text{if continuous} \end{cases}$$
  3. The expected value is distinct from the:

    • Mode: the most likely outcome, or collection of outcomes that maximize the PMF/PDF.

    • Median: the “midpoint” value $x_*$ such that $\Pr(X < x_*) = \Pr(X > x_*)$.
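The definitions above can be made concrete with a small discrete example. The PMF here is hypothetical, chosen so that the mean, the expectation of a function, and the mode are all easy to check by hand:

```python
# A small hypothetical discrete distribution.
pmf = {0: 0.5, 1: 0.3, 2: 0.2}

# Expected value: weighted average of x against the PMF.
ev = sum(x * p for x, p in pmf.items())       # 0*0.5 + 1*0.3 + 2*0.2 = 0.7

# Expected value of a function g(X) = X^2: weight g(x) by the PMF.
g = lambda x: x ** 2
ev_g = sum(g(x) * p for x, p in pmf.items())  # 0*0.5 + 1*0.3 + 4*0.2 = 1.1

# Mode: the outcome that maximizes the PMF.
mode = max(pmf, key=pmf.get)

print(ev, ev_g, mode)
```

Note that the mode here is 0 while the expected value is 0.7, a small illustration of how the two summaries can disagree.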

Rules of Expectations

Definitions and examples are all available in Section 4.2.

  1. Expectations of key distributions:

    • Constants: $\mathbb{E}[c] = c$.

    • Indicators: if $X \sim \text{Bernoulli}(p)$, then $\mathbb{E}[X] = p$.

    • Symmetric: if $X$ is drawn symmetrically about $x_*$, then $\mathbb{E}[X] = x_*$.

    • Binomial: if $X \sim \text{Binomial}(n,p)$, then $\mathbb{E}[X] = np$.

  2. Linearity: $\mathbb{E}[aX + b] = a\,\mathbb{E}[X] + b$.

  3. Additivity: for any pair of random variables $X$ and $Y$, $\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]$.
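Linearity and additivity can be verified exactly, without simulation, by enumerating a small joint distribution. This sketch uses two independent fair dice (a hypothetical example; any joint PMF would do):

```python
import itertools

# All 36 equally likely outcomes of two independent fair dice.
outcomes = list(itertools.product(range(1, 7), repeat=2))
p = 1 / len(outcomes)  # uniform probability 1/36 per outcome

E_X = sum(x * p for x, y in outcomes)  # 3.5
E_Y = sum(y * p for x, y in outcomes)  # 3.5

# Linearity: E[aX + b] = a E[X] + b
a, b = 2, 5
E_lin = sum((a * x + b) * p for x, y in outcomes)
assert abs(E_lin - (a * E_X + b)) < 1e-12

# Additivity: E[X + Y] = E[X] + E[Y]
E_sum = sum((x + y) * p for x, y in outcomes)
assert abs(E_sum - (E_X + E_Y)) < 1e-12

print(E_X, E_lin, E_sum)  # 3.5, 12.0, 7.0
```

Additivity holds even when $X$ and $Y$ are dependent, which is what makes it such a powerful bookkeeping rule; independence is only assumed here to keep the enumeration simple.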

We’ll add more properties to this list in future chapters.

Variance

Definitions and examples are all available in Section 4.3.

  1. Given $\mathbb{E}[X] = \bar{x}$, the variance and standard deviation of a random variable are:

    $$\text{Var}[X] = \mathbb{E}[(X - \bar{x})^2], \quad \text{SD}[X] = \sqrt{\text{Var}[X]}$$
    • The standard deviation measures the breadth, spread, or width of the distribution

  2. Properties of Variance:

    • $\text{Var}[X] \geq 0$

    • $\text{Var}[c] = 0$

    • $\text{Var}[X + b] = \text{Var}[X]$

    • $\text{Var}[aX] = a^2\,\text{Var}[X]$

  3. To compute variances, we often use:

    $$\text{Var}[X] = \mathbb{E}[X^2] - \mathbb{E}[X]^2$$
    • The variance of a random variable is its expected square minus its squared expectation.
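The definition and the shortcut formula can be checked against each other on a small hypothetical PMF:

```python
# Hypothetical discrete distribution with mean 1.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

# Generic expectation of f(X) under this PMF.
E = lambda f: sum(f(x) * p for x, p in pmf.items())

mean = E(lambda x: x)                        # E[X] = 1.0
var_def = E(lambda x: (x - mean) ** 2)       # definition: E[(X - mean)^2]
var_short = E(lambda x: x ** 2) - mean ** 2  # shortcut: E[X^2] - E[X]^2
sd = var_def ** 0.5

print(mean, var_def, var_short, sd)
```

Both routes give the same variance (0.5 here); the shortcut is usually less work because $\mathbb{E}[X^2]$ can be computed in the same pass as $\mathbb{E}[X]$.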