Overview
This lecture covers methods for solving systems of linear equations, focusing on Gaussian elimination, Gauss-Jordan, matrix inversion, LU factorization, and the Thomas algorithm for tridiagonal systems, including practical considerations and pitfalls.
Elimination and Gaussian Elimination
- The elimination method (Gaussian elimination) reduces a system to upper triangular (echelon) form.
- The process involves eliminating variables row by row, creating zeros below pivots (diagonal elements).
- The pivot is a key element in each column used for elimination; if zero, rows/columns may need swapping (pivoting).
- Backward substitution is used after elimination to solve for unknowns starting from the last variable.
- If a pivot is zero or near-zero, use partial pivoting (row swaps) or full pivoting (row and column swaps) to avoid division by zero and limit roundoff error.
- In MATLAB, Gaussian elimination can be carried out with or without pivoting; a sketch with partial pivoting follows this list.
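As a concrete illustration of the steps above, here is a minimal MATLAB sketch of Gaussian elimination with partial pivoting followed by backward substitution. The function name gauss_solve and the augmented-matrix layout are illustrative choices, not the lecture's code, and the sketch assumes A is square and nonsingular.

```matlab
function x = gauss_solve(A, b)
% Gaussian elimination with partial pivoting, then backward substitution.
% Illustrative sketch: assumes A is square and nonsingular.
n = length(b);
Ab = [A b(:)];                          % augmented matrix [A|b]
for k = 1:n-1
    % Partial pivoting: move the largest |entry| in column k up to the pivot row
    [~, p] = max(abs(Ab(k:n, k)));
    p = p + k - 1;
    Ab([k p], :) = Ab([p k], :);
    % Create zeros below the pivot
    for i = k+1:n
        m = Ab(i, k) / Ab(k, k);        % elimination multiplier
        Ab(i, k:end) = Ab(i, k:end) - m * Ab(k, k:end);
    end
end
% Backward substitution on the upper triangular system
x = zeros(n, 1);
for i = n:-1:1
    x(i) = (Ab(i, end) - Ab(i, i+1:n) * x(i+1:n)) / Ab(i, i);
end
end
```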
Gauss-Jordan and Matrix Inversion
- Gauss-Jordan extends Gaussian elimination to transform the matrix into the identity, solving directly without backward substitution.
- The augmented matrix [A|I] is used to find A's inverse by row-reducing A to I, generating A⁻¹ on the other side.
- A singular matrix (determinant zero) has no inverse and the system may have no unique solution.
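A quick way to see the [A|I] idea in MATLAB is to row-reduce the augmented matrix with the built-in rref; the example matrix below is an arbitrary nonsingular choice, not one from the lecture.

```matlab
% Gauss-Jordan inversion via the augmented matrix [A|I] (for illustration;
% in practice, solving with backslash is preferred over forming an inverse).
A = [4 -2 1; -2 4 -2; 1 -2 4];      % example nonsingular matrix
n = size(A, 1);
R = rref([A eye(n)]);               % row-reduce [A|I] to [I|A^-1]
Ainv = R(:, n+1:end);               % right half is the inverse
disp(norm(Ainv - inv(A)))           % ~0 up to roundoff
```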
Matrix Notation and MATLAB Implementation
- Systems are rewritten as matrix equations: AX = B, with A as the coefficient matrix and B as the right-hand side.
- The solution can be found in MATLAB with the backslash operator (X = A\B) or, if A is nonsingular, by forming the inverse explicitly (X = inv(A)*B); backslash is the faster and more accurate choice.
- Determinant and invertibility are checked before solving.
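A short example of this workflow: check that A is usable, then solve with backslash. The example system and the thresholds in the check are illustrative values, not ones given in the lecture.

```matlab
% Solve AX = B with the backslash operator after a rough invertibility check.
A = [2 1 -1; -3 -1 2; -2 1 2];           % example coefficient matrix
B = [8; -11; -3];                        % example right-hand side
if abs(det(A)) < eps || cond(A) > 1e12   % crude singularity/conditioning check
    warning('A is singular or badly conditioned; results may be unreliable.');
end
X  = A \ B;         % preferred: no explicit inverse is formed
X2 = inv(A) * B;    % also works, but slower and less accurate than backslash
```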
LU Factorization
- LU factorization decomposes A into a lower (L) and upper (U) triangular matrix so that A = LU.
- The elimination multipliers m_ij computed during Gaussian elimination become the below-diagonal entries of L.
- To solve AX = B: first solve LY = B (forward substitution), then UX = Y (backward substitution).
- Doolittle method: L has ones on the diagonal; U is obtained from elimination.
- MATLAB provides the lu command to compute L and U (and a permutation matrix P when pivoting is used), as in the sketch below.
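A minimal sketch of the LY = B, UX = Y workflow using MATLAB's lu with pivoting; the example system is the same illustrative one as above.

```matlab
% LU factorization with pivoting: P*A = L*U, followed by two triangular solves.
A = [2 1 -1; -3 -1 2; -2 1 2];
B = [8; -11; -3];
[L, U, P] = lu(A);   % L unit lower triangular, U upper triangular, P permutation
Y = L \ (P * B);     % forward substitution:  L*Y = P*B
X = U \ Y;           % backward substitution: U*X = Y
```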
Tridiagonal Systems and Thomas Algorithm
- Tridiagonal matrices have nonzero entries only on the main diagonal and the two adjacent diagonals.
- Thomas algorithm efficiently solves such systems by exploiting the sparsity pattern.
- During elimination only the three diagonals and the right-hand side are touched, so storage and work grow linearly with the system size.
- Forward elimination and back substitution are performed with simplified updates due to the tridiagonal structure.
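A sketch of the Thomas algorithm is given below. The function name and the vector convention (sub-diagonal a, main diagonal d, super-diagonal c, right-hand side b) are choices made here for illustration, and the code assumes no pivoting is needed (e.g., the matrix is diagonally dominant).

```matlab
function x = thomas(a, d, c, b)
% Thomas algorithm for a tridiagonal system (illustrative sketch, no pivoting).
% a: sub-diagonal (a(1) unused), d: main diagonal, c: super-diagonal
% (c(n) unused), b: right-hand side; all vectors of length n.
n = length(d);
% Forward elimination: only the diagonal and the right-hand side change
for i = 2:n
    m = a(i) / d(i-1);              % elimination multiplier
    d(i) = d(i) - m * c(i-1);
    b(i) = b(i) - m * b(i-1);
end
% Back substitution
x = zeros(n, 1);
x(n) = b(n) / d(n);
for i = n-1:-1:1
    x(i) = (b(i) - c(i) * x(i+1)) / d(i);
end
end
```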
Pitfalls in Elimination Methods
- Division by zero occurs if a pivot element is zero, requiring pivoting strategies.
- Roundoff errors accumulate in floating-point arithmetic, affecting solution accuracy.
- Ill-conditioned systems: small changes in A or B lead to large changes in solution X.
- Condition number measures sensitivity; high condition numbers indicate ill-conditioning.
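One common way to see ill-conditioning concretely (not necessarily the lecture's example) is the Hilbert matrix, whose condition number grows very quickly with its size:

```matlab
% Ill-conditioning demo: a large condition number means the computed solution
% can lose many digits of accuracy even though the algorithm is exact in theory.
n = 10;
A = hilb(n);                        % Hilbert matrix, notoriously ill-conditioned
fprintf('cond(A) = %.2e\n', cond(A));
x_true = ones(n, 1);
b = A * x_true;                     % right-hand side with known exact solution
x = A \ b;
fprintf('relative error = %.2e\n', norm(x - x_true) / norm(x_true));
```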
Key Terms & Definitions
- Pivot – Diagonal element used for eliminating entries below it; crucial in elimination methods.
- Gaussian Elimination – Stepwise elimination to form an upper triangular matrix.
- Gauss-Jordan Elimination – Extended elimination to the identity matrix for direct solution.
- LU Factorization – Decomposition of a matrix into lower (L) and upper (U) triangular matrices.
- Tridiagonal Matrix – Matrix with nonzero entries only on the main, sub-, and super-diagonals.
- Thomas Algorithm – Efficient solver for tridiagonal systems.
- Ill-conditioned System – System where small input changes cause large output changes.
- Augmented Matrix – [A|B] form combining coefficients and constants for elimination.
Action Items / Next Steps
- Complete the assignment: solve a given system using classical elimination, Gauss-Jordan, and LU methods.
- Practice using MATLAB for each method and verify the solutions.
- Prepare for next class by reviewing condition numbers and iterative improvement methods (Jacobi, Gauss-Seidel, SOR).
- Submit assignment by next Wednesday, 12 p.m., as instructed (online or in person).