Transcript for:
Key Topics in Linear Algebra Course

Good morning, and welcome to this NPTEL lecture series on linear algebra. This course is usually offered in the first semester of a postgraduate programme in mathematics, be it pure mathematics or applied mathematics. It is an absolutely fundamental area, one required both in pure mathematics and in applied mathematics, for example in problems of engineering, starting from the basics of linear transformations, vector spaces, and inner product spaces. One of the main objectives of this course is to prove theorems. There will also be lots of examples that we will discuss along the way.

The video material has been divided into several modules; there are 14 in all. I will write down the titles of each of these modules and also briefly tell you what each module contains.

Before that, perhaps I should mention two books, both classics in a sense. The first one is Halmos's book, Finite-Dimensional Vector Spaces. So let me write down the title and the author: Paul Halmos is the name of the author.

The title of the book is Finite-Dimensional Vector Spaces. Let me give the reference to the latest edition, which appeared with Springer in 2011 in the Undergraduate Texts in Mathematics, the so-called UTM series. There is another classic book which is followed in many universities, by two authors, Hoffman and Kunze.

The title is Linear Algebra; I am referring to the Prentice Hall edition which appeared in 2004. Alright, now let me give you the modules for this course. As I mentioned, there are 14 modules in this course. This introductory lecture will be a brief one.

The actual lectures will begin from the second lecture. So these are the modules that we have.

Module 1 is on systems of linear equations; this is covered approximately in lectures 2 to 7. From the next lecture till the seventh lecture, we will be discussing mainly systems of linear equations. The notions that we will discuss in this module are basically the elementary row operations, and then when do we say that two systems of linear equations are equivalent. Then we will look at the elimination process, basically the Gaussian elimination process which we learnt in high school; we will formalize Gaussian elimination by means of the elementary row operations. In particular, we will be looking at what are called row reduced echelon matrices, and also the notion of elementary matrices. We will be studying both homogeneous as well as non-homogeneous equations.
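To make the elimination process concrete, here is a minimal Python sketch (an illustration, not the lecturer's notation) that reduces a matrix to row reduced echelon form using the three elementary row operations; the small system x + 2y = 5, 3x + 4y = 11 is a made-up example.

```python
def rref(m):
    """Reduce a matrix (a list of rows of floats) to row reduced echelon
    form using the three elementary row operations."""
    m = [row[:] for row in m]
    rows, cols = len(m), len(m[0])
    lead = 0
    for r in range(rows):
        if lead >= cols:
            break
        # find a pivot row (row interchange, operation 1)
        i = r
        while abs(m[i][lead]) < 1e-12:
            i += 1
            if i == rows:
                i, lead = r, lead + 1
                if lead == cols:
                    return m
        m[r], m[i] = m[i], m[r]
        # scale the pivot row so the leading entry is 1 (operation 2)
        p = m[r][lead]
        m[r] = [x / p for x in m[r]]
        # subtract multiples of the pivot row from all other rows (operation 3)
        for i in range(rows):
            if i != r:
                f = m[i][lead]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        lead += 1
    return m

# augmented matrix of x + 2y = 5, 3x + 4y = 11;
# the last column of the result reads off the solution x = 1, y = 2
print(rref([[1.0, 2.0, 5.0], [3.0, 4.0, 11.0]]))
```

From the row reduced echelon form of the augmented matrix one reads off existence and uniqueness of solutions, which is exactly the characterization discussed in this module.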

How do we characterize the existence of solutions of homogeneous or non-homogeneous equations in terms of row reduced echelon matrices, in terms of invertibility of the coefficient matrix, etc.? So these will be the topics that we will discuss in this first module. In the next 3 lectures, that is from lectures 8 to 10, we will be discussing the second module. So let me write down the title.

The title is vector spaces: lectures 8 to 10, approximately 3 lectures. We will give the axiomatic definition of a vector space, then lots of examples of vector spaces. We will also discuss the notion of subspaces of vector spaces, again with lots of examples of subspaces, and then spanning sets. We will conclude this second module with the notion of linear independence of vectors, linearly independent subsets of vector spaces. Module 3 will be basis and dimension.

We will be discussing the notions of basis and dimension. This will be covered in approximately 4 lectures, lectures 10 to 13; in part of lecture 10 we will discuss the notion of linear dependence and linear independence of vectors. So in this section on basis and dimension we will discuss linear dependence and linear independence, look at lots of examples, some properties of linearly independent subsets, etcetera. Then comes the notion of spanning subsets, and then the notion of a basis, which in turn leads to the notion of dimension.

Towards the end of this third module, we will also discuss the problem of determining the dimension of the sum of two subspaces in a finite dimensional vector space. So that will be module 3. In module 4, perhaps the most important module in this course, we will discuss what are called linear transformations.

The notion of a linear transformation is absolutely fundamental in perhaps the whole of mathematics. We will discuss linear transformations: the definition of a linear transformation, examples, and then two important subspaces that are associated with a linear transformation, the null space and the range space. We will look at lots of examples, and we will also prove an absolutely fundamental result for linear transformations called the Rank-Nullity Dimension Theorem.
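The Rank-Nullity Theorem says that for a linear map T on a finite dimensional space, the dimension of the range plus the dimension of the null space equals the dimension of the domain. A quick numerical check in Python with NumPy, on a made-up 3x5 matrix viewed as a map from R^5 to R^3:

```python
import numpy as np

# a hypothetical 3x5 matrix; the third row is the sum of the first two,
# so the rank (dimension of the range space) is 2
A = np.array([[1., 2., 0., 1., 3.],
              [0., 1., 1., 0., 1.],
              [1., 3., 1., 1., 4.]])

rank = np.linalg.matrix_rank(A)      # dimension of the range space
nullity = A.shape[1] - rank          # dimension of the null space
print(rank, nullity)                 # 2 3 -- and 2 + 3 = 5, the domain dimension
```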

We will also discuss the notions of the row rank of a matrix and the column rank of a matrix, and the equality of the row rank and the column rank. These things will be discussed in this fourth module, and we will cover these topics in lectures 14 to 18. In the fifth module we will discuss the notion of the matrix of a linear transformation.

This will be done in approximately 3 lectures, lectures 18 to 20. In lecture 18, when we discuss linear transformations towards the end of the fourth module, we will introduce the notion of the matrix of a vector in a vector space, and then from lecture 19 onwards, for a couple of lectures, we will discuss the notion of the matrix of a linear transformation. There we will also discuss what the matrix of the composition of two linear transformations is, and what the matrix of the inverse transformation is. We will also answer the question as to how the matrices of a linear transformation corresponding to two different bases behave, how they are related: the notion of a similarity transformation.
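Two of the facts mentioned here, that the matrix of a composition is the product of the individual matrices, and that matrices of the same operator relative to different bases are similar, can be checked numerically. The operators T, S and the change-of-basis matrix P below are arbitrary illustrative choices:

```python
import numpy as np

# matrices of two hypothetical operators T and S on R^2, standard basis
T = np.array([[1., 2.], [0., 1.]])
S = np.array([[0., 1.], [1., 0.]])

# the matrix of the composition S o T is the product S @ T
x = np.array([3., -1.])
assert np.allclose(S @ (T @ x), (S @ T) @ x)

# change of basis: if the columns of P are the new basis vectors,
# the matrix of T relative to the new basis is the similar matrix P^-1 T P
P = np.array([[1., 1.], [0., 1.]])        # an invertible change-of-basis matrix
T_new = np.linalg.inv(P) @ T @ P

# similar matrices represent the same operator, so e.g. traces agree
print(np.trace(T), np.trace(T_new))
```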

This will also be discussed in the fifth module. In module 6, we will discuss the notion of linear functionals, especially what is called the dual space. These topics will be discussed in lectures 21 to 25. So, what is a linear functional? Then we will prove a representation theorem for a linear functional on a finite dimensional vector space, then the notion of the dual space, and, more importantly, the notion of a dual basis.

We will work out some numerical examples of constructing dual bases, and we will also discuss what is called the annihilator of a subspace; an annihilator is a subspace of the dual space, for instance. We will also discuss the notion of the double dual space and then, of course, the double annihilator.
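One concrete way to compute a dual basis in the finite dimensional case: if the basis vectors form the columns of an invertible matrix B, then the rows of B^-1 are the coordinate functionals of the dual basis, since row i of B^-1 applied to column j of B gives the Kronecker delta. A small sketch (the basis below is an arbitrary example):

```python
import numpy as np

# a hypothetical basis of R^2, written as the columns of B
B = np.array([[1., 1.],
              [0., 1.]])

# the dual basis functionals are the rows of B^-1:
# f_i(b_j) = delta_ij, i.e. D @ B is the identity matrix
D = np.linalg.inv(B)
print(D @ B)
```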

We will also consider the problem of proving that a subspace is equal to its double annihilator under a certain identification. So these topics will be discussed in module 6 on linear functionals. In module 7, approximately lectures 26 to 29, about 4 lectures, we will discuss the notion of eigenvalues and eigenvectors of linear transformations. So we will look at examples of linear transformations and ask whether these linear transformations have eigenvalues, whether they have enough eigenvectors, etc.

What is the matrix formulation of such a problem? Then diagonalizability: when is a linear transformation diagonalizable, and what is the definition?

We will look at some examples of matrices which are diagonalizable, and some other matrices which are not diagonalizable, etc. We will also look at one important characterization of diagonalizability in terms of the characteristic polynomial and the dimensions of the eigenspaces.

The notion of the characteristic polynomial leads to the notion of an eigenvalue. So we will look at a characterization of diagonalizability in terms of the characteristic polynomial and the dimensions of the eigenspaces: whether the dimensions of the eigenspaces add up to the dimension of the vector space that we start with.

We will ask this question. We will also discuss the relationship between the minimal polynomial and the characteristic polynomial. There are at least two polynomials that one would like to consider for a linear transformation, the minimal polynomial and the characteristic polynomial. What are their relationships? What can one say about the minimal polynomial, for instance, when the operator is diagonalizable? We will answer these questions.

Towards the end of this 7th module, we will also discuss a proof of the Cayley-Hamilton theorem for matrices. The Cayley-Hamilton theorem informally states that the characteristic polynomial of an operator is an annihilating polynomial of that operator. The eighth module takes about three lectures.
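The Cayley-Hamilton statement can be verified numerically for a small matrix: compute the coefficients of the characteristic polynomial and evaluate that polynomial at the matrix itself. The matrix below is a made-up example:

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])

# coefficients of the characteristic polynomial det(tI - A);
# for this matrix it is t^2 - 5t + 6, so coeffs is [1., -5., 6.]
coeffs = np.poly(A)

# evaluate the polynomial at the matrix A itself (Horner's scheme)
result = np.zeros_like(A)
for c in coeffs:
    result = result @ A + c * np.eye(2)

print(result)   # the zero matrix, as Cayley-Hamilton predicts
```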

In this 8th module we will discuss the notions of invariant subspaces and triangulability. For instance, what is an invariant subspace of a linear transformation, and what is a T-conductor into a subspace? We will study the notion of triangulability, which is more general than diagonalizability; of course, we will also discuss diagonalizability in terms of the minimal polynomial and independent subspaces, for instance.

In this module we will also discuss the notion of projection operators, and towards the end of this module we will also prove that projection matrices, for instance, are diagonalizable. So those will be the topics covered in module 8. In the next module, we will look at direct sum decompositions. This will be discussed in about 2 lectures.

What is a direct sum decomposition of a vector space? Then, what are the relationships between direct sum decompositions and projections? In fact, we will show that there is a one-to-one correspondence.

We will recall the notion of invariant subspaces that was introduced in the previous module, and then we will study a characterization of diagonalizability in terms of invariant subspaces, etc. One important result that we will prove here is a characterization of diagonalizability involving projection operators and direct sum decompositions.
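The correspondence between projections and direct sum decompositions can be seen in a tiny example: an idempotent matrix P splits every vector uniquely into a range part plus a null-space part. The matrix below, which projects onto the x-axis along the line y = x, is an illustrative choice:

```python
import numpy as np

# a hypothetical projection on R^2: range is the x-axis,
# null space is the line y = x
P = np.array([[1., -1.],
              [0., 0.]])
assert np.allclose(P @ P, P)     # idempotent: P is a projection

# every vector splits as (range part) + (null-space part)
v = np.array([3., 2.])
r, n = P @ v, v - P @ v
print(r, n)                      # r in range(P), n in null(P), r + n = v
assert np.allclose(P @ n, 0) and np.allclose(r + n, v)
```

Note also that this P has eigenvalues 0 and 1, so it is diagonalizable, in line with the result about projection matrices mentioned for module 8.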

Of course, we will discuss lots of numerical examples to illustrate the main results. The tenth module contains about four lectures. We will discuss the primary decomposition theorem and the cyclic decomposition theorem; this is covered in lectures 35 to 38. The primary decomposition theorem essentially rests on a general factorization of the characteristic polynomial into powers of its prime factors,

which is similar to the factorization of a number into the prime powers of its factors. We will also discuss the Jordan decomposition theorem, which is a consequence of the primary decomposition theorem, and also another result called the cyclic decomposition theorem. These results will be proved in these four lectures.

In the next module, module 11, we will discuss the notion of inner product spaces. This will be covered in about four lectures. So, what is the notion of an inner product on a vector space?

We will look at several examples of inner product spaces, then look at the notion of a norm on a vector space coming from an inner product, for instance, which allows us to generalize the notion of perpendicularity of vectors in the plane or in three-dimensional space. So orthogonality and orthonormality will be discussed, and consequently we will discuss the Gram-Schmidt process of obtaining an orthonormal set from a linearly independent set.

As a consequence of the Gram-Schmidt process, we will derive what is called the QR decomposition of a matrix whose columns are linearly independent. We will also show that a finite dimensional inner product space always has an orthonormal basis. These topics will be covered in the four lectures on inner product spaces.
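A minimal sketch of the classical Gram-Schmidt process (the two starting vectors are an arbitrary independent pair), followed by the QR factors that the same computation produces:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors: subtract from
    each vector its projections onto the earlier ones, then normalize."""
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

vs = [np.array([1., 1., 0.]), np.array([1., 0., 1.])]
q1, q2 = gram_schmidt(vs)
print(np.dot(q1, q2))        # ~0: the vectors are orthogonal
print(np.linalg.norm(q1))    # 1: unit length

# the QR decomposition of the matrix M whose columns are vs:
# Q has the orthonormal vectors as columns, and R = Q^T M is upper triangular
M = np.column_stack(vs)
Q = np.column_stack([q1, q2])
R = Q.T @ M
assert np.allclose(Q @ R, M)     # M = Q R
```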

In the next module, module 12, we will study the notion of best approximate solutions, or more generally best approximation; that is about three lectures, lectures 43 to 45. So what is this notion of best approximation in an inner product space? More importantly, how does this translate into the problem of finding least squares solutions for linear equations, possibly inconsistent systems of linear equations?

We will use the QR decomposition that was studied in the previous module, and using the QR decomposition we will obtain a least squares solution. We will also discuss the orthogonal complementary subspace: given a subspace, what are the properties of its orthogonal complement? What comes along is the notion of an orthogonal projection.
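The least squares recipe via QR can be sketched as follows, for a hypothetical inconsistent system: solve R x = Q^T b, and then check that the residual is orthogonal to the column space of the coefficient matrix (which is what makes A x the best approximation to b):

```python
import numpy as np

# an inconsistent, made-up system: 3 equations in 2 unknowns
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
b = np.array([1., 1., 0.])

# least squares via the QR decomposition: solve R x = Q^T b
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)
print(x)   # both components equal 1/3 for this example

# the residual b - A x is orthogonal to the column space of A
assert np.allclose(A.T @ (b - A @ x), 0)
```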

So this will also be discussed in this module, along with how orthogonal projections are related to the notion of best approximation, just to complete the circle. So these topics will be discussed in this module on best approximation. In the next module, module 13, we will discuss the notion of the adjoint of an operator.

This will be discussed in the next two lectures, lectures 46 and 47: the notion of the adjoint operator, some of its properties, some examples, and then, given an operator on a finite dimensional inner product space, the relationship between the matrix of this operator relative to an orthonormal basis and the matrix of its adjoint relative to the same orthonormal basis. We will discuss this relationship.

Towards the end of this module, we will also discuss the notion of inner product space isomorphisms and give a characterization of when an operator on a finite dimensional inner product space is an inner product space isomorphism. In the last module, the 14th module, we will discuss three important classes of operators on inner product spaces: self-adjoint, normal, and unitary operators.
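The matrix relationship just mentioned is that, relative to an orthonormal basis, the matrix of the adjoint T* is the conjugate transpose of the matrix of T; the defining identity <Tx, y> = <x, T*y> can be checked numerically (the matrix and vectors below are arbitrary):

```python
import numpy as np

# relative to an orthonormal basis, the matrix of the adjoint T*
# is the conjugate transpose of the matrix of T
T = np.array([[1. + 2.j, 3.],
              [0., 1. - 1.j]])
T_adj = T.conj().T

x = np.array([1. + 0.j, 2.])
y = np.array([0., 1. + 1.j])
inner = lambda u, v: np.vdot(v, u)   # <u, v>, conjugate-linear in the second slot

# defining property of the adjoint: <T x, y> = <x, T* y>
print(inner(T @ x, y), inner(x, T_adj @ y))
assert np.isclose(inner(T @ x, y), inner(x, T_adj @ y))
```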

This will be done in the remaining lectures; this is the last part of the course. The topics that we will discuss in this module are, first, unitary operators, then self-adjoint operators, and then normal operators.

Unitary operators: examples, the properties of the matrix of a unitary operator relative to a basis, etc. We will discuss the notion of unitary equivalence of operators, which generalizes diagonalizability in some sense. Then we will switch to self-adjoint operators: we will look at some examples, both finite dimensional and infinite dimensional, and, importantly, prove what is called the spectral theorem for a self-adjoint operator. We will require some properties of the eigenvalues and eigenvectors of self-adjoint operators; we will discuss those and then prove the spectral theorem for a self-adjoint operator.

The third topic in this module is that of normal operators. We will look at examples, study some properties of their eigenvalues and eigenvectors, and then prove what is called the spectral theorem for a normal operator. For both the spectral theorem for a self-adjoint operator and for a normal operator, we will look at the matrix version.
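In matrix form, the spectral theorem for a real symmetric (hence self-adjoint) matrix says A = Q diag(eigenvalues) Q^T with Q orthogonal; in other words, there is an orthonormal basis of eigenvectors. A quick numerical check on a made-up matrix:

```python
import numpy as np

# a real symmetric (self-adjoint) matrix
A = np.array([[2., 1.],
              [1., 2.]])

# spectral theorem, matrix version: A = Q diag(eigvals) Q^T, Q orthogonal
eigvals, Q = np.linalg.eigh(A)
print(eigvals)                                  # [1., 3.]
assert np.allclose(Q @ Q.T, np.eye(2))          # columns are orthonormal
assert np.allclose(Q @ np.diag(eigvals) @ Q.T, A)
```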

So the matrix versions of these results, the spectral theorems, will also be presented. These are the 14 modules that cover the topics one would normally discuss in a first course on linear algebra. Let us move on to the actual lectures from the next lecture onwards.