6  Summaries: Distributions and Samples

6.1 Expectation

6.1.1 The expected value is meaningful for outcomes for which addition and subtraction are meaningful, and for binary variables

  • Interval variables
  • Ratio variables
  • Binary variables (we will get to these in a bit)

6.2 Expectation

6.2.1 The expected value of multi-category ordinal variables can be meaningful if one assigns scores to each candidate outcome.

Outcome   Score
Low       0
Medium    2
High      5

The selection of scores is a researcher decision, which can be controversial.
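
As a quick numeric illustration (the outcome probabilities below are hypothetical, chosen only for this example), the expected score is the probability-weighted sum of the assigned scores:

```python
# Hypothetical probabilities for the Low/Medium/High variable above
scores = {"Low": 0, "Medium": 2, "High": 5}
probs = {"Low": 0.5, "Medium": 0.3, "High": 0.2}

# E[score] = sum over outcomes of score * probability
expected_score = sum(scores[k] * probs[k] for k in scores)
print(expected_score)  # 0*0.5 + 2*0.3 + 5*0.2 = 1.6
```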

6.3 Expectation

For a random variable \(X\),

6.3.1 Conceptually & via Simulation

\[E[X] = \lim_{n\to\infty} \frac{X_1+X_2+\cdots+X_n}{n}\]

6.3.2 Mathematically

\[E[X] = \left\{\begin{array}{ll}\sum_x x\,P(X = x) & \text{if discrete} \\ \int x f_X(x)\,dx & \text{if continuous} \end{array}\right.\]
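
A minimal simulation sketch tying the two definitions together, using a fair six-sided die as an assumed example: the long-run average of simulated draws approaches the probability-weighted sum.

```python
import numpy as np

# Fair six-sided die: P(X = x) = 1/6 for x = 1, ..., 6
x = np.arange(1, 7)
p = np.full(6, 1 / 6)

exact = np.sum(x * p)  # discrete formula: sum of x * P(X = x)

# Limit definition, approximated with a large (but finite) n
rolls = np.random.default_rng(0).integers(1, 7, size=1_000_000)
print(exact, rolls.mean())  # both near 3.5
```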

6.4 Expectation

If \(X\) is a Bernoulli random variable,

\[E[X] = 0\cdot P(X = 0) + 1\cdot P(X = 1) = P(X=1) \]

(This is why expectation is meaningful for binary outcomes.)

6.5 Expectation

If \(X\) is a binomial random variable,

\[E[X] = \sum_{x=0}^N x{N \choose x}p^x(1-p)^{(N-x)} \]

What does this simplify to?
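
One way to carry out the simplification (a sketch: pull one factor of \(x\) into the binomial coefficient, reindex with \(y = x - 1\), and apply the binomial theorem):

\[ \begin{align*} E[X] &= \sum_{x=1}^N x\frac{N!}{x!\,(N-x)!}p^x(1-p)^{N-x} \\ &= Np\sum_{x=1}^N {N-1 \choose x-1}p^{x-1}(1-p)^{N-x} \\ &= Np\sum_{y=0}^{N-1} {N-1 \choose y}p^{y}(1-p)^{(N-1)-y} = Np \end{align*} \]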

6.6 Expectation

If \(X\) is a Poisson random variable,

\[E[X] = \sum_{x=0}^{\infty} x\frac{e^{-\lambda}\lambda^x}{x!} \]

What does this simplify to? (Use the fact that \(e^t = \sum_{i=0}^{\infty}\frac{t^i}{i!}\).)
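
A sketch of the simplification, using the hint with \(t = \lambda\):

\[ \begin{align*} E[X] &= \sum_{x=1}^{\infty} x\frac{e^{-\lambda}\lambda^x}{x!} = \lambda e^{-\lambda}\sum_{x=1}^{\infty}\frac{\lambda^{x-1}}{(x-1)!} = \lambda e^{-\lambda}e^{\lambda} = \lambda \end{align*} \]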

6.7 Expectation

If \(X\) is a uniform random variable,

\[E[X] = \int_{a}^b x(b-a)^{-1}\,dx \]

What does this simplify to?
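
The integral evaluates directly:

\[ E[X] = \frac{1}{b-a}\int_a^b x\,dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2} \]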

6.8 Expectation

If \(X\) is a normal random variable,

\[E[X] = \int_{-\infty}^{\infty} x\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx \]

What does this simplify to?
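
A sketch of one route: substitute \(u = x - \mu\). The \(u\) term vanishes because the integrand is odd, and the \(\mu\) term is \(\mu\) times the total probability, which is 1:

\[ E[X] = \int_{-\infty}^{\infty} (u+\mu)\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{u^2}{2\sigma^2}}\,du = 0 + \mu \cdot 1 = \mu \]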

6.9 Expectation

  • Does not always exist (for example, the Cauchy distribution has no expected value).

6.10 Expectation

  • If we create a new random variable with addition, \[ \begin{align*} D &= U + V \\ &\text{then} \\ E[D] &= E[U] + E[V] \end{align*} \]

6.11 Example

\[ \begin{align*} U &= \text{Number of dropped calls in Nashville}\\ V &= \text{Number of dropped calls in Atlanta}\\ D &= U + V \\ &\text{then} \\ E[D] &= E[U] + E[V] \end{align*} \]

6.12 Example

Note: The random variables need not be independent.

\[ \begin{align*} U &= \text{Number of dropped calls in Nashville}\\ V &= \text{Number of dropped calls at Vanderbilt}\\ D &= U + V \\ &\text{then} \\ E[D] &= E[U] + E[V] \end{align*} \]
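
A minimal simulation sketch of this point (the distributions and rates are assumptions for illustration): \(V\) is constructed directly from \(U\), so the two are clearly dependent, yet the expectations still add.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical model: U = dropped calls in Nashville; V = dropped calls
# at Vanderbilt, generated from U so that V depends on U
U = rng.poisson(50, size=n)
V = rng.binomial(U, 0.1)  # each Nashville dropped call counted at Vanderbilt w.p. 0.1

D = U + V
print(D.mean(), U.mean() + V.mean())  # agree up to simulation noise
```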

6.13 Expectation

  • If we create a new random variable with multiplication, \[ \begin{align*} &\text{ If } U \text{ and } V \text{ are independent, and} \\ &D = U \cdot V \\ &\text{then} \\ &E[D] = E[U] \cdot E[V] \\ \end{align*} \]
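
A simulation sketch with assumed distributions: the product rule holds for an independent pair, while the dependent case \(V = U\) shows why independence matters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

U = rng.exponential(2.0, size=n)  # E[U] = 2
V = rng.normal(3.0, 1.0, size=n)  # E[V] = 3, independent of U

print((U * V).mean(), U.mean() * V.mean())  # both near 6

# Dependent case: E[U * U] = E[U^2], which is not E[U]^2 in general
print((U * U).mean(), U.mean() ** 2)  # near 8 vs. near 4
```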

6.14 Special case

  • If we create a new random variable by multiplying by a constant, say \(u\), \[ \begin{align*} &D = u \cdot V \\ &\text{then} \\ &E[D] = E[u] \cdot E[V] = u \cdot E[V] \\ \end{align*} \]

6.15 Example

\[ \begin{align*} &V = \text{Wait time in minutes}\\ &D = \frac{1}{60} \cdot V = \text{Wait time in hours} \\ &\text{then} \\ &E[D] = \frac{1}{60} \cdot E[V] \\ \end{align*} \]

6.16 Expectation

  • If we create a new random variable with function transformation, \[ \begin{align*} &D = g(V) \\ &\text{then} \\ &E[D] = E[g(V)] = \left\{\begin{array}{ll} \sum g(v)P(V = v) & \text{if discrete} \\ \int g(v)f_V(v)\, dv & \text{if continuous} \end{array}\right. \\ \end{align*} \]
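
A minimal sketch, reusing the fair die from earlier with \(g(v) = (v - 3.5)^2\), a choice of \(g\) that will look familiar shortly:

```python
import numpy as np

v = np.arange(1, 7)    # fair die outcomes
p = np.full(6, 1 / 6)
g = (v - 3.5) ** 2     # g(v) = (v - E[V])^2, since E[V] = 3.5

exact = np.sum(g * p)  # E[g(V)] = sum of g(v) * P(V = v)

rolls = np.random.default_rng(0).integers(1, 7, size=1_000_000)
print(exact, ((rolls - 3.5) ** 2).mean())  # both near 35/12 ≈ 2.92
```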

6.17 Example

  • If we create a new random variable with log transformation,

\[ \begin{align*} &V = \text{Number of Twitter followers}\\ &D = \log(V) \\ &\text{then} \\ &E[D] = E[\log(V)] = \sum_{v=1}^{\infty} \log(v)P(V = v) \\ \end{align*} \]

(The sum starts at \(v = 1\) because \(\log(0)\) is undefined; this assumes \(P(V = 0) = 0\).)

6.18 Expectation

Suppose

\[g(V) = (V - E[V])^2\]

6.19 Expectation

The expectation

\[E[g(V)] = E[(V - E[V])^2]\]

is a special quantity called the variance.

6.20 Variance

\[\begin{align*} V[X] &= E[(X - E[X])^2]\\ &= E[X^2] - E^2[X]\\ \end{align*}\]

\[\text{Notation: } E^2[X] = (E[X])^2\]
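
The second form follows by expanding the square and applying linearity of expectation (recall \(E[X]\) is a constant):

\[\begin{align*} E[(X - E[X])^2] &= E\left[X^2 - 2XE[X] + E^2[X]\right]\\ &= E[X^2] - 2E[X]\,E[X] + E^2[X]\\ &= E[X^2] - E^2[X] \end{align*}\]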

6.21 Variance

If \(c\) is a constant and \(U\) is a random variable,

  • What is \(V[cU]\)?
  • What is \(V[c + U]\)?

If \(U\) and \(W\) are independent random variables,

  • What is \(V[U + W]\)? (See the simulation sketch below.)
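
Simulation offers one way to check your answers; a minimal sketch with assumed distributions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
c = 4.0

U = rng.normal(1.0, 2.0, size=n)  # V[U] = 4
W = rng.exponential(3.0, size=n)  # V[W] = 9, independent of U

print((c * U).var(), U.var())            # scaling: V[cU] vs. V[U]
print((c + U).var(), U.var())            # shifting: V[c + U] vs. V[U]
print((U + W).var(), U.var() + W.var())  # sum of independent variables
```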

6.22 Another special expectation

6.22.1 Covariance

\[ COV[U,W] = E\left[\,(U - E[U])\,(W - E[W])\,\right]\]
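
A sketch computing the covariance straight from this definition (assumed distributions), compared against numpy's built-in estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

U = rng.normal(0.0, 1.0, size=n)
W = U + rng.normal(0.0, 1.0, size=n)  # W is correlated with U by construction

# E[(U - E[U])(W - E[W])], with expectations replaced by sample means
cov_def = ((U - U.mean()) * (W - W.mean())).mean()
print(cov_def, np.cov(U, W)[0, 1])  # both near 1
```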
