Understanding Projections in Linear Algebra

Aug 28, 2024

Lecture Notes: Projection

Introduction to Projection

  • Focus on projecting vector b onto vector a.
  • Goal: Find the point on the line through vector a that is closest to vector b.

Geometry of Projection in 2D

  • 1-D Subspace: the line through a is a one-dimensional subspace.
  • The closest point to b on this line is the projection p.
  • Relationship with Orthogonality: the error e = b - p is perpendicular to a.

Key Equations

  • The equation for projection is derived from the condition of orthogonality:

    • a is perpendicular to the error vector e
    • e = b - x̂a
    • Perpendicularity gives the equation:

    a^T e = a^T (b - x̂a) = 0

Calculation of Projection

  • Solving a^T (b - x̂a) = 0 for x̂ gives:

    • x̂ = (a^T b) / (a^T a)
    • p = x̂a
  • Impact of doubling the vectors (checked numerically in the sketch below):

    • If b is doubled, p also doubles.
    • If a is doubled, p is unchanged: the projection depends only on the line through a, not on the length of a.
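
A minimal numerical sketch of these formulas, assuming NumPy; the specific vectors a and b are arbitrary illustrations, not from the lecture:

```python
import numpy as np

# Arbitrary example vectors (not from the lecture).
a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])

# Coefficient x_hat = (a^T b) / (a^T a), projection p = x_hat * a, error e = b - p.
x_hat = (a @ b) / (a @ a)
p = x_hat * a
e = b - p

print(p)        # closest point to b on the line through a
print(a @ e)    # ~0.0: the error is perpendicular to a

# Doubling b doubles p; doubling a leaves p unchanged.
print(((2 * b) @ a) / (a @ a) * a)   # equals 2 * p
a2 = 2 * a
print((a2 @ b) / (a2 @ a2) * a2)     # equals p
```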

Projection Matrix

  • Writing p = a x̂ = a (a^T b) / (a^T a) shows that p = Pb, with the projection matrix P:
    • P = (a a^T) / (a^T a)
  • Properties of the projection matrix:
    • Rank: 1 (its column space is the line through a)
    • Symmetric: P^T = P
    • Idempotent: P^2 = P (projecting a second time changes nothing)
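
A short NumPy check of these properties; the vector a is again an arbitrary example:

```python
import numpy as np

a = np.array([[1.0], [2.0]])     # column vector, shape (2, 1)

# Projection matrix onto the line through a: P = (a a^T) / (a^T a).
P = (a @ a.T) / (a.T @ a)

print(np.linalg.matrix_rank(P))  # 1
print(np.allclose(P.T, P))       # True: symmetric
print(np.allclose(P @ P, P))     # True: idempotent
```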

Projection in Higher Dimensions

  • Transitioning to projection in higher dimensions (onto planes and, in general, onto column spaces).
  • Why Project?: Ax = b may have no solution when b is not in the column space of A; instead, find the closest vector in that column space, the projection p.

Projection onto a Plane

  • Define a plane using two independent vectors, a1 and a2, taken as the columns of a matrix A.
  • The projection problem becomes:
    • Find x̂ such that the error b - Ax̂ is perpendicular to the plane (i.e., to both columns of A).
  • Resulting equations in matrix form:
    • A^T (b - Ax̂) = 0
    • Which rearranges to the normal equations (see the sketch below):
      A^T A x̂ = A^T b
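
A hedged NumPy sketch of the normal equations for a plane; the matrix A (columns a1 and a2) and the vector b below are made-up example data:

```python
import numpy as np

# Columns a1 and a2 span a plane in R^3 (made-up example data).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

# Solve the normal equations A^T A x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat            # projection of b onto the plane

print(A.T @ (b - p))     # ~[0, 0]: the error is perpendicular to both columns
```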

Solutions for Projection in n-Dimensions

  • Coefficient vector:
    • x̂ = (A^T A)^{-1} A^T b
  • Projection:
    • p = Ax̂ = A (A^T A)^{-1} A^T b
  • Projection matrix:
    • P = A (A^T A)^{-1} A^T, so that p = Pb

Properties of the Projection Matrix in n-Dimensions

  • The projection matrix is again symmetric and idempotent (checked numerically below):
    • P^T = P
    • P^2 = P
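
Continuing the same hedged example, a sketch that builds P = A (A^T A)^{-1} A^T and checks both properties (np.linalg.inv is used here for clarity; in practice one would solve the normal equations rather than invert):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])     # same example columns as above
b = np.array([1.0, 2.0, 4.0])

# Projection matrix onto the column space of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P.T, P))     # True: symmetric
print(np.allclose(P @ P, P))   # True: idempotent
print(P @ b)                   # the projection p = Pb
```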

Application: Least Squares Fitting

  • Example with data points (1, 1), (2, 2), (3, 2).
  • Objective: fit a line that minimizes the sum of squared errors at these points.
  • Set up Ax = b with one row per data point; with three equations and two unknowns, the system has no exact solution.
  • The least-squares solution comes from projecting b onto the column space of A, i.e., from solving A^T A x̂ = A^T b (a sketch of this example follows).
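
A sketch of this example, assuming (as is standard for this setup) that the fitted line is y = C + D t, so A has a column of ones and a column of t-values:

```python
import numpy as np

# Data points (t, y): (1, 1), (2, 2), (3, 2).
t = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0])

# Model y ≈ C + D t: first column of A is all ones, second is t.
A = np.column_stack([np.ones_like(t), t])

# Normal equations A^T A x_hat = A^T y give the least-squares coefficients.
C, D = np.linalg.solve(A.T @ A, A.T @ y)
print(C, D)   # 2/3 and 1/2, i.e. best-fit line y = 2/3 + t/2

# np.linalg.lstsq computes the same least-squares solution directly.
print(np.linalg.lstsq(A, y, rcond=None)[0])
```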

Conclusion

  • The lecture focused on understanding projections in linear algebra and their applications in solving systems of equations, particularly in least squares fitting.
  • Next class will include numerical examples and application of derived formulas.