📐 Understanding Orthogonality in Linear Algebra
Jan 22, 2025
Lecture 20: Advanced Linear Algebra
Topic: Orthogonality
Introduction to Orthogonality
Orthogonality revisited from introductory linear algebra.
Definition: Two vectors are orthogonal if they are perpendicular, i.e., the angle between them is π/2 (90°).
Dot Product Method: Vectors are orthogonal if their dot product is zero.
Generalization to Inner Products:
In any vector space with an inner product, vectors u and v are orthogonal if ⟨u, v⟩ = 0.
Orthogonality is a strong form of independence: the vectors point maximally away from each other.
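As a quick illustration (a minimal numpy sketch, not from the lecture; the vectors u and v are made-up examples):

```python
import numpy as np

# Example vectors (made up for illustration).
u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 3.0])

# Orthogonal exactly when the dot product is zero:
# 1*(-2) + 2*1 + 0*3 = 0, so u is orthogonal to v.
print(np.dot(u, v))

# With floating point, compare against a tolerance rather than exact zero.
def is_orthogonal(a, b, tol=1e-12):
    return abs(np.dot(a, b)) < tol

print(is_orthogonal(u, v))  # True
```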
Geometric Interpretation
Linear Dependence: Vectors are linear combinations of one another.
Linear Independence: No vector is a linear combination of the others; for two vectors, a non-zero angle between them.
Orthogonality: A stronger form of independence; the vectors point maximally away from each other.
Orthogonal and Orthonormal Bases
Orthogonal Bases: A basis whose vectors are mutually orthogonal.
Orthonormal Bases: Same as orthogonal bases, but each vector is normalized to have length 1.
Orthogonality condition: the inner product of any two distinct basis vectors is zero.
Normalization condition: each basis vector has unit length.
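Both conditions can be checked at once via the Gram matrix: a set of column vectors is orthonormal exactly when all pairwise inner products form the identity matrix. A minimal numpy sketch with made-up vectors in ℝ³:

```python
import numpy as np

# Columns of V are the candidate basis vectors (made-up example in R^3).
V = np.column_stack([
    [1/np.sqrt(2),  1/np.sqrt(2), 0.0],
    [1/np.sqrt(2), -1/np.sqrt(2), 0.0],
    [0.0,           0.0,          1.0],
])

# Gram matrix: entry (i, j) is the inner product of columns i and j.
# Zeros off the diagonal = orthogonality; ones on the diagonal = unit length.
gram = V.T @ V
print(np.allclose(gram, np.eye(3)))  # True: the columns are orthonormal
```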
Examples
Standard Basis in ℝⁿ and ℂⁿ: These are orthonormal bases; the dot-product and length conditions are straightforward to check.
Matrix Vector Spaces (m × n matrices): The matrices Eᵢⱼ (a 1 in entry (i, j), 0 elsewhere) form an orthonormal basis under the Frobenius inner product (the entrywise dot product of matrices); see the sketch below.
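A small sketch of this check (the helper names frobenius and E, and the sizes m, n, are my own choices):

```python
import numpy as np

def frobenius(A, B):
    """Frobenius inner product of real matrices: sum of entrywise products."""
    return np.sum(A * B)

def E(i, j, m=2, n=3):
    """Matrix with a 1 in entry (i, j) and 0 everywhere else."""
    M = np.zeros((m, n))
    M[i, j] = 1.0
    return M

print(frobenius(E(0, 1), E(1, 2)))  # 0.0 -> distinct E_ij are orthogonal
print(frobenius(E(0, 1), E(0, 1)))  # 1.0 -> each E_ij has unit norm
```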
Function Spaces (Polynomials): Not every standard basis is orthonormal; for example, the monomials 1, x, x², … are not orthogonal under the usual integral inner product.
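As an illustration, assuming the common inner product ⟨f, g⟩ = ∫₋₁¹ f(x)g(x) dx (the notes don't specify which one the lecture used), the monomials 1 and x² are not orthogonal:

```python
from numpy.polynomial import polynomial as P

def inner(f, g):
    """<f, g> = integral of f(x) g(x) over [-1, 1]; f and g are
    coefficient arrays, lowest degree first."""
    antideriv = P.polyint(P.polymul(f, g))
    return P.polyval(1.0, antideriv) - P.polyval(-1.0, antideriv)

one = [1.0]            # f(x) = 1
x2 = [0.0, 0.0, 1.0]   # g(x) = x^2

print(inner(one, x2))  # 2/3, not 0: the monomial basis is not orthogonal
```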
Orthogonality vs. Linear Independence
Theorem: Mutually orthogonal non-zero vectors are linearly independent.
Proof Outline:
Suppose a linear combination c₁v₁ + … + cₖvₖ equals zero; the goal is to show every coefficient is zero.
Taking the inner product of both sides with vⱼ kills every term except cⱼ⟨vⱼ, vⱼ⟩ by orthogonality, and since vⱼ ≠ 0 means ⟨vⱼ, vⱼ⟩ ≠ 0, each cⱼ = 0.
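In symbols, a standard write-up of this step:

```latex
\[
0 = \Bigl\langle \sum_{i=1}^{k} c_i v_i ,\; v_j \Bigr\rangle
  = \sum_{i=1}^{k} c_i \,\langle v_i , v_j \rangle
  = c_j \,\langle v_j , v_j \rangle ,
\qquad\text{so } c_j = 0 \text{ since } \langle v_j , v_j \rangle \neq 0 .
\]
```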
Corollary:
If a set of vectors is mutually orthogonal, non-zero, and its size matches the dimension of the space, it forms an orthogonal basis.
Linear independence follows from the theorem; spanning follows from the dimension count.
Example: Pauli Matrices
Pauli Matrices: Together with the identity, shown to form an orthogonal basis of the (4-dimensional) space of 2 × 2 complex matrices.
Verified by computing the pairwise Frobenius inner products, which are all zero.
Conversion to Orthonormal Basis:
Normalize each matrix by its Frobenius norm.
Each matrix in the set has Frobenius norm √2, so dividing by √2 yields an orthonormal basis; see the sketch below.
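A sketch of the full check (the variable names and the inclusion of the identity are my assumptions about the lecture's setup):

```python
import numpy as np

I  = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def frob(A, B):
    """Complex Frobenius inner product: sum of A_ij * conj(B_ij)."""
    return np.sum(A * B.conj())

basis = [I, sx, sy, sz]
for a, A in enumerate(basis):
    for b, B in enumerate(basis):
        # Distinct pairs give 0; each matrix with itself gives 2,
        # so every matrix has Frobenius norm sqrt(2).
        assert np.isclose(frob(A, B), 2.0 if a == b else 0.0)

# Dividing by sqrt(2) turns the orthogonal basis into an orthonormal one.
orthonormal = [M / np.sqrt(2) for M in basis]
print(all(np.isclose(frob(M, M), 1.0) for M in orthonormal))  # True
```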
Conclusion
Orthogonality is a powerful tool in vector spaces, providing a stronger form of independence.
Next class: Practical applications of orthogonality.