Jan 5 Pt 1 - Understanding Joint and Marginal Probability

Feb 22, 2025

Lecture on Joint and Marginal Probability Mass Functions (PMFs)

Key Concepts

  • Joint PMF: p(x, y) = P(X = x, Y = y) contains all the information about the pair (X, Y); any probability involving the two variables can be computed from it.
  • Marginal PMF (a small sketch follows this list):
    • For X: sum the joint PMF over all possible Y values: p_X(x) = Σ_y p(x, y).
    • For Y: sum over all possible X values: p_Y(y) = Σ_x p(x, y).
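
A minimal sketch of marginalization on a small discrete table, assuming NumPy is available; the joint values below are invented purely for illustration:

```python
import numpy as np

# Hypothetical joint PMF: X takes values {0, 1, 2}, Y takes values {0, 1}.
# Entry [i, j] is P(X = x_i, Y = y_j); the numbers are made up and sum to 1.
joint = np.array([
    [0.10, 0.15],
    [0.20, 0.25],
    [0.15, 0.15],
])

# Marginal of X: sum each row over all Y values.
p_x = joint.sum(axis=1)   # [0.25, 0.45, 0.30]
# Marginal of Y: sum each column over all X values.
p_y = joint.sum(axis=0)   # [0.45, 0.55]

print(p_x, p_y)
```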

Continuous Random Variables

  • Joint Density:
    • For continuous variables, the PMF is replaced by a joint density f(x, y).
    • The probability of (X, Y) taking any exact value is zero, so probabilities come from integrating the density rather than evaluating it at points.
    • For a set A: P((X, Y) ∈ A) = ∬_A f(x, y) dx dy.
    • Marginal densities come from integrating out the other variable: f_X(x) = ∫ f(x, y) dy (see the sketch below).
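
A numerical sketch with SciPy (an assumption, not from the lecture), using the made-up density f(x, y) = x + y on the unit square, which integrates to 1 and so is a valid joint density:

```python
from scipy import integrate

# Illustrative joint density f(x, y) = x + y on [0, 1] x [0, 1].
# dblquad expects the inner integration variable first, hence f(y, x).
f = lambda y, x: x + y

# P((X, Y) in A) for A = {(x, y) : x + y <= 1}: integrate f over the set.
p_A, _ = integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 1 - x)
print(p_A)   # 1/3 (matches the hand calculation)

# Marginal density of X at a point, say x = 0.4: integrate out y.
x0 = 0.4
f_x0, _ = integrate.quad(lambda y: x0 + y, 0, 1)
print(f_x0)  # x0 + 0.5 = 0.9
```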

Expected Values and Variance

  • Expected Value (E):
    • For a function g of a random variable, weight g by the distribution: E[g(X)] = Σ_x g(x) p(x) for a PMF, or E[g(X)] = ∫ g(x) f(x) dx for a density.
    • Example: E[X²] = ∫ x² f(x) dx in the continuous case.
  • Variance:
    • For one variable: Var(X) = E[X²] - (E[X])²
    • Covariance (Cov):
      • Measures how two variables change together.
      • Cov(X, Y) = E[XY] - E[X]E[Y]
      • If X and Y are independent, Cov(X, Y) = 0 (a worked sketch follows this list).
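
A sketch that continues the invented discrete table from above, computing E[X], Var(X), and Cov(X, Y) directly from the joint PMF:

```python
import numpy as np

# Same illustrative joint PMF as before: X in {0, 1, 2}, Y in {0, 1}.
joint = np.array([[0.10, 0.15],
                  [0.20, 0.25],
                  [0.15, 0.15]])
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])

p_x = joint.sum(axis=1)                    # marginal of X
p_y = joint.sum(axis=0)                    # marginal of Y

E_x  = (x_vals * p_x).sum()                # E[X]
E_y  = (y_vals * p_y).sum()                # E[Y]
E_x2 = (x_vals**2 * p_x).sum()             # E[X^2]
var_x = E_x2 - E_x**2                      # Var(X) = E[X^2] - (E[X])^2

# E[XY]: weight every (x, y) pair by its joint probability.
E_xy = sum(x * y * joint[i, j]
           for i, x in enumerate(x_vals)
           for j, y in enumerate(y_vals))
cov_xy = E_xy - E_x * E_y                  # Cov(X, Y) = E[XY] - E[X]E[Y]

print(var_x, cov_xy)
```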

Independence

  • Independent Variables:
    • The joint PMF or PDF factors into the product of the marginals: p(x, y) = p_X(x) p_Y(y) for all x, y.
    • Independence implies zero covariance, but zero covariance does not imply independence (counterexample below).
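
A numerical version of the classic counterexample: X uniform on {-1, 0, 1} and Y = X² have zero covariance but are plainly dependent (values chosen for illustration):

```python
import numpy as np

# X is uniform on {-1, 0, 1}; Y = X^2 is a deterministic function of X.
x_vals = np.array([-1, 0, 1])
p      = np.array([1/3, 1/3, 1/3])
y_vals = x_vals**2

E_x  = (x_vals * p).sum()            # 0
E_y  = (y_vals * p).sum()            # 2/3
E_xy = (x_vals * y_vals * p).sum()   # E[X^3] = 0

print(E_xy - E_x * E_y)              # Cov(X, Y) = 0

# Yet the joint PMF does not factor, so X and Y are dependent:
# P(X = 0, Y = 1) = 0, while P(X = 0) * P(Y = 1) = (1/3) * (2/3) = 2/9.
```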

Calculation Examples

  • Expectation Properties:
    • Linearity: E[aX + bY] = aE[X] + bE[Y].
    • This holds regardless of whether X and Y are independent.
  • Example: Matching Problem
    • Problem: n people exchange gifts at random; calculate the expected number of people who receive their own gift.
    • Let I_k indicate that person k gets their own gift; each person has a 1/n chance, so E[I_k] = 1/n.
    • By linearity, E[I_1 + … + I_n] = n · (1/n) = 1, regardless of n.
    • This uses linearity of expectation without requiring independence; the indicators I_k are in fact dependent (simulation sketch below).
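
A quick simulation sketch of the matching problem, assuming NumPy; the trial count is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_matches(n, trials=20_000):
    """Average number of people who draw their own gift over many trials."""
    total = 0
    for _ in range(trials):
        perm = rng.permutation(n)                   # random gift assignment
        total += int((perm == np.arange(n)).sum())  # count fixed points
    return total / trials

# Linearity of expectation predicts exactly 1 for every n, even though
# the events "person k gets their own gift" are not independent.
for n in (3, 10, 100):
    print(n, avg_matches(n))   # each average is close to 1.0
```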

Practical Applications

  • Sums of Random Variables:
    • E[X + Y] = E[X] + E[Y] applies whether or not X and Y are independent; this is the same linearity used in the matching problem above.

Summary

  • Understanding joint and marginal PMFs and densities is crucial in calculating probabilities for both discrete and continuous variables.
  • Linearity of expectation can simplify complicated random quantities, even when the variables involved are not independent.
  • Covariance provides insight into how variables co-vary, critical for understanding relationships between variables, such as in financial asset analysis.