MAT 22A - Linear Algebra
Course website for an introductory course in linear algebra in Summer Session II 2024
Meetings: MWF 10:00am-11:40am in OLSON 118
Office Hours: Tuesday 10am-11am, 2pm-3pm and Thursday 2pm-4pm in MSB 2202
Homework
Homework 1 due Thursday August 8th, 2024 Optional Post Lecture Problems for Hw 1
Homework 2 due Friday August 16th, 2024 Optional Post Lecture Problems for Hw 2
Homework 3 due Friday August 23rd, 2024 Study Guide 1, Study Guide 2, Study Guide 3 by Professor Orsola Capovilla-Searle
Homework 4 due Sunday September 1st, 2024 Practice Midterm
Homework 5 due Friday September 6th, 2024 Textbook Problems
Homework 6 due Thursday September 12th, 2024
Practice Final Exam Questions Day I
Lecture Notes
WEEK 1: Intro + Motivations to a New Approach to Matrices
[8/5, 10:00am-10:45am] Lecture 1: Syllabus Details and Review of Vector Algebra and Geometry
Important syllabus details discussed.
[8/5, 10:50am-11:40am] Lecture 2: Review of Vector Algebra and Geometry
We start with a review of vectors and geometry covered in prerequisite courses. Although rapid, the review will encompass the fundamental binary operations on vectors (addition and scalar multiplication), how to think geometrically about R^n in terms of linear combinations, and the "biometrics of vectors" -- i.e. how to dissect geometric properties of vectors in R^n using the dot product and projections.
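(An illustrative aside, not from the lecture notes: a minimal NumPy sketch of the dot product and projection formulas, with made-up vectors.)

```python
# Sketch: dot product and projection of v onto w, assuming NumPy.
import numpy as np

v = np.array([3.0, 4.0])
w = np.array([1.0, 0.0])

print(np.dot(v, w))                          # 3.0
proj = (np.dot(v, w) / np.dot(w, w)) * w     # proj_w(v) = (v.w / w.w) w
print(proj)                                  # [3. 0.]
```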
[8/7, 10:00am-10:45am] Lecture 3: Intro to Linear Algebra and Matrices
An introduction to the origins of linear algebra and motivations for the kinds of problems we'd like to solve. In particular: solving systems of linear equations, creating search engines, optimizing algorithms, understanding vector spaces, finding invariant subspaces, combinatorics, warping space and time, and manipulating the fabric of the universe... Before any of that, we go back to the 1637 view of René Descartes and linear systems. When is such a system solvable? No solutions, unique solutions, or infinitely many solutions... why? Solving linear systems with matrices. For now, matrices = better bookkeeping!
[8/7, 10:50am-11:40am] Lecture 4: Row Reduction + Row Echelon Form
We give the rules of the game for reducing the general matrix equation associated to a linear system. These encode the "allowed" algebraic manipulations we can do to linear systems of equations. The point is to turn the matrix associated with a given linear system into a modified matrix of a "nice form", i.e. triangular. This will let us solve the matrix equation extremely easily.
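(An illustrative aside, not from the lecture notes: a minimal Python sketch of forward elimination on a made-up augmented matrix, assuming every pivot encountered is nonzero so no row swaps are needed.)

```python
# Sketch: forward elimination on the augmented matrix [A | b],
# assuming no zero pivots are encountered.
import numpy as np

# Augmented matrix for: x + 2y = 5, 3x + 4y = 6
M = np.array([[1.0, 2.0, 5.0],
              [3.0, 4.0, 6.0]])

rows = M.shape[0]
for i in range(rows):
    M[i] = M[i] / M[i, i]             # scale the pivot row so the pivot is 1
    for j in range(i + 1, rows):
        M[j] = M[j] - M[j, i] * M[i]  # eliminate entries below the pivot

print(M)  # upper triangular; back-substitution now reads off the solution
```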
[8/9, 10:00am-10:45am] Lecture 5: Classification of Solution Spaces + Echelon Forms
Row reduction lets us completely classify the solution sets of systems of linear equations. You can easily see that the only possible solution sets for 2x2 linear systems are nothing, a point, or a line by looking at each canonical form of the row reduction. Generalizing this to nxn systems, we find a complete classification of the intersections of (n-1)-dimensional subspaces of n-dimensional Euclidean space. We then clarify the notion of Row Echelon and Reduced Row Echelon forms.
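(Illustrative aside: the three possible 2x2 solution sets, read off from the reduced row echelon form. The systems are made up, and SymPy is used here for exact arithmetic.)

```python
# Sketch: pivot patterns in rref distinguish a point, no solution, or a line.
from sympy import Matrix

unique   = Matrix([[1, 1, 2], [1, -1, 0]])  # x + y = 2, x - y = 0
none     = Matrix([[1, 1, 2], [1,  1, 3]])  # parallel lines
infinite = Matrix([[1, 1, 2], [2,  2, 4]])  # the same line twice

for M in (unique, none, infinite):
    print(M.rref()[0])  # a pivot in the last column means "inconsistent"
```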
[8/9, 10:50am-11:40am] Lecture 6: A New Perspective + Linear Maps
At first our interpretation of matrices is: they are better bookkeeping for carrying out elimination calculations when solving linear systems of equations. However, this is terribly naive. The richness of linear algebra comes with the interpretation of linear maps and vector spaces. Today we understand the notion of linear maps and how they connect to the theory we've developed so far. Then we motivate the need for a theory of bases and vector spaces!
WEEK 2: Basic Matrix Operations and Bases
[8/12, 10:00am-10:45am] Lecture 7: Linear Transformations of R^2 and Matrix Multiplication
We review some examples of linear transformations and focus on those in R^2. In particular, we see some examples of reflections, shears, and rotations in order to emphasize how linear maps are encoded by matrices. We then develop the computational skill of matrix multiplication.
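(Illustrative aside, not from the lecture notes: made-up 2x2 examples of the transformations mentioned above, written as matrices in NumPy.)

```python
# Sketch: rotation, reflection, and shear as 2x2 matrices acting on vectors.
import numpy as np

theta = np.pi / 2
rotation   = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[1, 0], [0, -1]])  # across the x-axis
shear      = np.array([[1, 1], [0, 1]])   # horizontal shear

e1 = np.array([1, 0])
print(rotation @ e1)             # ~[0, 1]: e1 rotated 90 degrees
print(shear @ np.array([0, 1]))  # [1, 1]: the shear tilts e2
```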
[8/12, 10:50am-11:40am] Lecture 8: Compositions of Matrices
There are two extremely elementary questions when it comes to maps in mathematics: 1) When can we invert a map? and 2) When can we compose maps? In particular, when can a linear transformation be "undone" in a well-defined way, and how do we encode doing one linear transformation after another?
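(Illustrative aside: a made-up NumPy check that composing linear maps corresponds to multiplying their matrices, with the map applied first written on the right.)

```python
# Sketch: composition of linear maps = matrix product.
import numpy as np

A = np.array([[0, -1], [1, 0]])  # rotate by 90 degrees
B = np.array([[2,  0], [0, 2]])  # scale by 2

v = np.array([1, 0])
print(A @ (B @ v))   # first scale, then rotate: [0, 2]
print((A @ B) @ v)   # same result: (A B) v = A (B v)
```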
[8/14, 10:00am-10:45am] Lecture 9: Inverses of Matrices + Row Reduction as Matrix Multiplication
Here we find various criteria for determining invertibility of a square matrix. In particular, if a matrix is row equivalent to a matrix with n pivots, it is invertible. This geometrically tells us that the linear transformation associated to such an invertible matrix cannot be too "violent". We then discuss how to encode row operations as multiplication by particular "elementary matrices" on the left to justify how to compute inverse matrices via row reduction.
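(Illustrative aside: the inverse-via-row-reduction recipe on a made-up 2x2 matrix, using SymPy's rref for exact arithmetic. If A reduces to I, the right block of [A | I] becomes A^(-1).)

```python
# Sketch: row reduce [A | I] to [I | A^(-1)].
from sympy import Matrix, eye

A = Matrix([[2, 1], [1, 1]])
augmented = A.row_join(eye(2))
reduced = augmented.rref()[0]
A_inv = reduced[:, 2:]
print(A_inv)      # Matrix([[1, -1], [-1, 2]])
print(A * A_inv)  # the identity, as a sanity check
```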
[8/14, 10:50am-11:40am] Lecture 10: Introduction to Real Vector Spaces
In this lecture I introduce the basic definition of a real vector space and provide some examples which connect to various areas of math we've been exposed to before. In particular, with spaces which satisfy the definition of vector spaces we can use our normal intuition of real Euclidean space to understand various properties.
[8/16, 10:00am-10:45am] Lecture 11: Linear Independence and Bases
We define bases and their interpretation as alternative coordinate systems on Euclidean space. In particular, the key property of a basis, that every vector in the vector space can be written uniquely as a linear combination of the basis vectors, tells us that bases capture the essence of what a meaningful coordinate system is.
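(Illustrative aside: a quick made-up NumPy check of linear independence via the rank of the matrix whose columns are the candidate vectors.)

```python
# Sketch: vectors are independent iff the rank equals the number of vectors.
import numpy as np

vectors = np.array([[1, 0, 1],
                    [0, 1, 1],
                    [1, 1, 2]]).T  # columns: (1,0,1), (0,1,1), (1,1,2)
print(np.linalg.matrix_rank(vectors))  # 2 < 3: dependent (third = first + second)
```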
[8/16, 10:50am-11:40am] Lecture 12: Orthonormal Bases and Gram-Schmidt Process
We recall various projection formulas for vectors in R^n and discuss why having an orthogonal basis or orthonormal basis is nice. In particular, coordinates in an orthonormal basis are very easy to compute in terms of projections. For this reason orthonormal bases are desirable. How common/practical is it to expect to be able to work in an orthonormal basis? It turns out you can always turn a basis into an orthonormal basis through the Gram-Schmidt process.
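(Illustrative aside, not from the lecture notes: a minimal Python sketch of Gram-Schmidt on a made-up basis of R^3, assuming the input vectors are linearly independent.)

```python
# Sketch: Gram-Schmidt -- subtract projections onto earlier vectors, then normalize.
import numpy as np

def gram_schmidt(vectors):
    orthonormal = []
    for v in vectors:
        for q in orthonormal:
            v = v - np.dot(v, q) * q       # remove the component along q
        orthonormal.append(v / np.linalg.norm(v))
    return orthonormal

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
Q = gram_schmidt(basis)
# all pairwise dot products: 1 on the diagonal, 0 off it
print(np.round([np.dot(Q[i], Q[j]) for i in range(3) for j in range(3)], 10))
```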
WEEK 3: Change of Basis and Associated Subspaces to Matrices
[8/19, 10:00am-10:45am] Lecture 13: Basis Reduction+Extension, Dimension, and Vector Representation in Basis
We revisit general properties of bases with two new theorems: Basis Reduction and Basis Extension. In particular, we find that in a finite dimensional vector space any spanning set can be reduced to a basis and any linearly independent set can be extended to a basis. With this we can define a good notion of the dimension of a vector space and show that any two bases of a vector space must have the same number of vectors. This in certain cases cuts in half the work it takes to prove a set is a basis. Finally, because of the uniqueness of the representation of a vector as a linear combination of basis vectors, we can rethink the idea of coordinate geometry: coordinates keep track of the weights attached to each basis vector!
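(Illustrative aside: coordinates as weights, on a made-up example. The coordinates of v in a basis B are the weights c solving Bc = v, where the columns of B are the basis vectors.)

```python
# Sketch: find the coordinates of v in the basis {(1,0), (1,1)}.
import numpy as np

B = np.array([[1, 1],
              [0, 1]])   # basis vectors as columns
v = np.array([3, 2])
c = np.linalg.solve(B, v)
print(c)                 # [1. 2.]: v = 1*(1,0) + 2*(1,1)
```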
[8/19, 10:50am-11:40am] Lecture 14: Change of Basis and Gauge Transformations of Linear Maps
Now that we are familiar with representing vectors in different bases, the next natural question is: how can we translate between different bases? We formalize the idea of change of basis matrices via a small example first, then describe the easier calculation via row reduction. Finally we discuss the idea that the linear maps we've seen so far have only been written in the standard basis! We can instead describe the same map by tracking where the basis vectors get sent.
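(Illustrative aside: the row-reduction recipe for a change of basis matrix on made-up bases, using SymPy. Row reducing [C | B] to [I | P] produces the matrix P converting B-coordinates to C-coordinates.)

```python
# Sketch: change of basis matrix from basis B to basis C via rref of [C | B].
from sympy import Matrix

B = Matrix([[1, 1], [0, 1]])  # columns: old basis vectors
C = Matrix([[1, 0], [1, 1]])  # columns: new basis vectors

P = C.row_join(B).rref()[0][:, 2:]
print(P)                      # P = C^(-1) B

# sanity check: B*c and C*(P*c) name the same vector in R^2
c = Matrix([2, 3])
print(B * c == C * (P * c))   # True
```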
[8/21, 10:00am-11:40am] Lecture 15+16: Similarity Transformations of Linear Maps
A linear map has many different descriptions, each depending on a choice of basis. If we are given a linear map written in a certain basis, how do we rewrite this map in another basis? For this we develop the theory of Gauge transformations for linear maps and show that conjugation of the matrix representation of a linear map in one basis by the change of basis matrices rewrites the linear transformation as a matrix in a different basis. This then helps us refine the idea that linear maps = matrices WITH a choice of basis and we end the basic structure theory of linear maps here with the idea that a linear map modulo Gauge transformations is unique.
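(Illustrative aside: conjugation on a made-up example in NumPy. Rewriting a map in a new basis is A_new = P^(-1) A P, where the columns of P are the new basis vectors.)

```python
# Sketch: the same linear map, rewritten in a new basis by conjugation.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # the map in the standard basis
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # columns: the new basis vectors

A_new = np.linalg.inv(P) @ A @ P
print(A_new)                 # diagonal here: the new basis is an eigenbasis
```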
[8/23, 10:00am-10:45am] Lecture 17: Vector Subspaces, Injectivity and Surjectivity, Kernels and Images
In this lecture I define the notion of a vector subspace with some examples. These are realized as subsets of a vector space which are themselves vector spaces. A decomposition of a vector space into its simpler subspace parts is often important to study. One of the ways we choose to study vector spaces in mathematics is via maps between them. For this reason we learn some abstract properties of maps, such as injectivity and surjectivity, and relate them to the kernel and image, subspaces associated to a map which are of particular interest in understanding it.
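(Illustrative aside: for a linear map given by a made-up m x n matrix, injectivity and surjectivity can be read off from the rank in NumPy: trivial kernel and full image respectively.)

```python
# Sketch: injective iff rank = number of columns; surjective iff rank = number of rows.
import numpy as np

A = np.array([[1, 0],
              [0, 1],
              [1, 1]])  # a map from R^2 to R^3
rank = np.linalg.matrix_rank(A)
print("injective:", rank == A.shape[1])   # True: the kernel is {0}
print("surjective:", rank == A.shape[0])  # False: the image is a plane in R^3
```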
[8/23, 10:50am-11:40am] Lecture 18: Rank, Column Space, and Null Space
We discuss the interplay between rank and linear independence. Then we define the four subspaces which are cut out by a single matrix and use the dimension formula for arbitrary kernels and images of maps to justify the two main dimension formulas which pair these four associated subspaces nicely. In particular, we have a result which decomposes the dimensions of the domain and codomain via understanding the dimensions of the column space and null space, as well as the row space and left null space respectively.
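(Illustrative aside: a made-up SymPy check of the first dimension formula, rank(A) + dim(null(A)) = number of columns of A.)

```python
# Sketch: rank-nullity on a rank-1 matrix.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])
rank = A.rank()
nullity = len(A.nullspace())
print(rank, nullity, rank + nullity == A.cols)  # 1 2 True
```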
WEEK 4: Eigenstuff
[8/26, 10:00am-10:40am] Lecture 19: The Four Associated Subspaces, Row Space, Left Null Space
We continue the conversation of the four associated subspaces and rank. In particular, we construct the second complementary pair of associated subspaces associated to a matrix. We give techniques to find bases of each of these spaces. Finally we talk about rank and invertibility.
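(Illustrative aside: SymPy can produce bases for all four associated subspaces of a made-up matrix; the row space and left null space are obtained from the transpose.)

```python
# Sketch: bases of the four associated subspaces of A.
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4],
            [3, 6]])

print(A.columnspace())    # basis of the column space (in R^3)
print(A.nullspace())      # basis of the null space (in R^2)
print(A.T.columnspace())  # basis of the row space (in R^2)
print(A.T.nullspace())    # basis of the left null space (in R^3)
```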
[8/26, 10:50am-11:40am] Lecture 20: Introduction to Determinant + Geometry
We give the definition of a 2x2 determinant and understand how it realizes the area of the parallelogram induced by the image of the basis vectors under the linear transformation associated with a matrix. We understand the determinant as measuring how a matrix affects area and use this intuition to give a connection between determinants and invertibility. We also talk about cofactor expansion and give the general formula for an nxn determinant.
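(Illustrative aside: the 2x2 determinant as the signed area of the parallelogram spanned by the images of e1 and e2, on a made-up matrix in NumPy.)

```python
# Sketch: det(A) = ad - bc measures how A scales area.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
print(np.linalg.det(A))   # ~6.0: the unit square is sent to a parallelogram of area 6
a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
print(a * d - b * c)      # same value from the ad - bc formula
```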
[8/28, 10:00am-10:40am] Lecture 21: The Determinant Properties and Row Reduction
We continue to talk about determinants and discuss exactly how row operations affect the determinant of a matrix. We do so by analyzing properties of the elementary matrices; in particular, the way they affect the determinant is built axiomatically into the operation of the determinant as the unique group homomorphism which is also realizable as an antisymmetric k-tensor in its rows.
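(Illustrative aside: a made-up SymPy check of how each row operation changes the determinant: a swap flips the sign, scaling a row scales the determinant, and row replacement leaves it unchanged.)

```python
# Sketch: the effect of the three row operations on det.
from sympy import Matrix

A = Matrix([[1, 2], [3, 4]])
print(A.det())                        # -2

swapped = Matrix([[3, 4], [1, 2]])
print(swapped.det())                  # 2: a row swap flips the sign

scaled = Matrix([[5, 10], [3, 4]])    # first row scaled by 5
print(scaled.det())                   # -10: det scales by the same factor

replaced = Matrix([[1, 2], [0, -2]])  # R2 -> R2 - 3*R1
print(replaced.det())                 # -2: unchanged
```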
[8/28, 10:50am-11:40am] Lecture 22: Eigenvectors and Eigenvalues Intro
I motivate the idea of calculating invariant subspaces corresponding to a square matrix A. In particular, we can study a linear map not by what it changes, but by the subspaces it leaves invariant. We start with a small 2x2 example, then discuss the fundamental theorem of algebra to break the characteristic equation into cases: distinct eigenvalues, eigenvalues with multiplicity, and complex eigenvalues.
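(Illustrative aside: eigenvalues and eigenvectors of a made-up 2x2 matrix in NumPy; each eigenvector spans a line the map merely scales.)

```python
# Sketch: verify A v = lambda v for each eigenpair.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                      # 3 and 1 (in some order)
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True: A only scales the line through v
```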
Lecture 23/24: Midterm
WEEK 5: Special Topics
Labor Day
WEEK 6: Oral Exams + Finals
Lecture 29-32: Final Review
Lecture 33/34: Final
Double Pendulum + Hodgkin-Huxley Sims!