12 Estimation

Chapter Overview

Where We’re Going

In this chapter, we will answer the question, “how can I estimate an unknown value from related data?”

We will:

  1. Introduce the idea of an estimator, a function that accepts data and returns an estimate (i.e., a best guess) of an unknown parameter (see Section 12.1).

    • Show how to derive a Maximum Likelihood Estimator (MLE) for an unknown parameter.

    • Use maximum likelihood estimation to estimate unknown parameters for binomial, geometric, and normal random variables.

    • Show that many, but not all, MLEs are empirical averages.

  2. Define an empirical distribution and distinguish empirical/sample averages from expectations/population averages.

    • Discuss the use of empirical averages as estimators of unknown expectations, variances, and covariances.

  3. Discuss properties of estimators (see Section 12.2). In particular, we will discuss the:

    • consistency,

    • bias,

    • variance,

    • accuracy of an estimator.
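
As a small preview of these ideas, the sketch below (a NumPy illustration of our own, not code from this chapter) estimates the mean and variance of normally distributed data. For the normal mean, the MLE is exactly the empirical average; for the variance, the MLE divides by n and is slightly biased, previewing the bias discussion in Section 12.2.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Simulate data from a normal distribution whose parameters are
# "unknown" to the estimator.
true_mean, true_sd = 5.0, 2.0
data = rng.normal(true_mean, true_sd, size=10_000)

# The MLE of the normal mean is the empirical (sample) average.
mle_mean = data.mean()

# The MLE of the normal variance divides by n (ddof=0) and is biased;
# dividing by n - 1 (ddof=1) gives the unbiased sample variance.
mle_var = data.var(ddof=0)
unbiased_var = data.var(ddof=1)

print(mle_mean, mle_var, unbiased_var)
```

With 10,000 samples, both estimates land close to the true values, and the gap between the biased and unbiased variance estimates is tiny; with small samples the bias of the n-divisor estimator becomes more noticeable.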