Please note that all future weeks are subject to change based on student and instructor feedback. Internal sheet for assignments (only accessible to class members).
Course notes:
Latest version: April 15, 2026. This is the freshest version; it may not yet be caught up with the course.
Older version: April 6, 2026
Fall (static, "complete"): P231 F25, use this to read ahead
Calibrating our prior background in vector spaces, unlearning a few ideas that we will generalize in this course. Course notes will be posted as we go along, for now you can refer to the first few chapters of Prof. Tanedo's P231 notes.
Meeting 1 (Mon, Mar 30): Course logistics. What do we already know about vectors, matrices, and tensors? What does the linear mean in linear algebra?
Practice: vector fundamentals.
Discussion (Wed, Apr 1):
Plan: Q&A, the big picture of the course, conversations about how to do the homework.
Actual: Small corrections to homework. Indices and summation convention. When you have an expression with one upper and one lower index, you may contract these indices. This means you sum over copies of that expression where the contracted indices take values 1, 2, 3, ..., N, where N is the number of values the index can take (in 3 dimensions, N = 3). We write this using summation convention: the upper and lower index are given the same variable and the summation (Σ) symbol is left implicit (don't write it). These indices are now called dummy indices; they are no longer free indices because the "user" cannot specify them.
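As a concrete sketch of contraction (a hypothetical numerical example, not from the course materials), numpy's einsum lets you write the summation convention almost exactly as it appears on paper:

```python
import numpy as np

# Contract the lower index of M^i_j with the upper index of v^j in N = 3
# dimensions. The values here are arbitrary, chosen just for illustration.
N = 3
M = np.arange(1.0, 10.0).reshape(N, N)   # components M^i_j
v = np.array([1.0, 2.0, 3.0])            # components v^j

# Explicit sum over the dummy index j, as the summation convention implies:
w_explicit = np.array([sum(M[i, j] * v[j] for j in range(N)) for i in range(N)])

# einsum writes the same contraction the way we write it on paper: "ij,j->i"
w_einsum = np.einsum("ij,j->i", M, v)

print(np.allclose(w_explicit, w_einsum))  # the two computations agree
```

The string `"ij,j->i"` says: j appears once up and once down, so sum over it; i is free and labels the output.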
Meeting 2 (Wed, Apr 1): Index notation, summation convention, contraction of indices as a linear transformation. A function f takes in some class of input object x (which may be a vector). The function is linear if f(αx + βy) = αf(x) + βf(y). It does not matter what kind of "output object" f produces. Here's what you should think about:
Is it obvious that any contraction by a tensor is a linear transformation? For example, when a matrix acts on a vector to produce a vector. The components of the output vector are linear functions of each component of the input vector.
A bit more nuanced: is it obvious that any linear function of N inputs and N outputs may be written as a matrix acting on a vector? More generally, any linear function of N inputs may be written as a contraction with an object with at least one lower index? (In that case, all the other indices encode the output object.)
Why this all matters: the laws of physics are local. The differential equations that you have come to know and love (or hate) are linear descriptions of what happens 'near' a test particle to determine what that test particle does. This is effectively a Taylor expansion to linear order. We 'integrate' these equations by asking what happens if we repeatedly ask what happens after "small time steps" where we take the linear approximation for each time step.
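The "small time steps" picture can be sketched numerically. Below is a minimal Euler-step integration of the simple harmonic oscillator (an illustrative example I am adding, not from the course notes):

```python
import numpy as np

# The harmonic oscillator x'' = -x, written as a first-order linear system
# d/dt (x, p) = A (x, p), stepped forward with the linear approximation
# state(t + dt) ≈ (1 + A dt) state(t).
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
state = np.array([1.0, 0.0])   # x(0) = 1, p(0) = 0
dt, steps = 1e-4, 10_000       # integrate to t = 1

for _ in range(steps):
    state = state + dt * (A @ state)   # one linear (Euler) time step

# The exact solution is x(t) = cos(t); the repeated linear steps come close.
print(abs(state[0] - np.cos(1.0)) < 1e-3)
```

Each step uses only the linear (first-order Taylor) approximation; "integrating" is just applying it many times.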
Meeting 3 (Fri, Apr 3): Linear transformations as tensors. The transformation of tensors. The meaning of the infamous aphorism in physics: "A tensor is any object that transforms like a tensor."
Due dates listed first, use the hyperlinks to submit homework. Please submit a pdf version of the assignment using the submission links below. Homework due this week:
Fri Apr 3. Please review the syllabus. You don't have to submit anything, just take some time to look over the document.
Fri Apr. 3: Pre-Class Survey.
Fri Apr. 3: Short homework 1. (Updated Apr 2, no significant changes, but thanks to Victoria for catching typos in the comments!) Submission link.
The short homework should be quick and something that you can easily complete after class. It is useful feedback for Prof. Tanedo.
Homework due next week:
Mon Apr 6: Please record a 2-5 minute video introducing yourself to the teaching team and the other students. You should start by stating your name. You may also include: where you grew up, your goals for your physics degree, hobbies, anything you are excited about. Submission link.
Monday Apr. 13: Long homework 1 (updated April 2: correction of due date, thanks Declan! Updated April 3: corrections where the row vector w did not appear.). Submission link
I am currently revising our notes and plan on regularly posting the most updated notes at the top of this agenda page. In the meanwhile, you can refer to the Physics 231 notes at the top of the page as a reference.
In class on Monday we talked about the "magnitude and direction" definition of a vector. We also brought up Banesh Hoffmann's anecdote about loose definitions, "Plato defined a man as a two-legged animal without feathers." (From the beginning of About Vectors.) I was amused to find another version of this today, an April Fool's paper on the arXiv: "No hair but plenty of feathers: are birds black holes?" (arXiv:2603.29064)
Meeting 4 (Mon, Apr 6): Transformations of vectors and tensors. Invitation: given a vector (for example, one with components v1 = 2, v2 = 1), what are the components of the vector after rotating it by 90 degrees counter clockwise? [Answer: v'1 = -1, v'2 = 2]
Rotations are linear transformations of vectors; we wrote out the 2x2 rotation matrix R for two-component vectors. We then argued that row vectors transform by contracting with the inverse of R.
A general tensor transforms under a transformation R as follows: For each upper index, contract that index with R. For each lower index, contract that index with R-inverse.
This is true for any transformation, not just rotations ("isometries").
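The in-class example and the tensor transformation rule can both be checked numerically. A minimal sketch with numpy (the matrix M below is a hypothetical example; the rotation convention is the standard counterclockwise one):

```python
import numpy as np

# Rotate v = (2, 1) by 90 degrees counterclockwise.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([2.0, 1.0])            # upper-index (column) vector
print(np.round(R @ v))              # → [-1.  2.], matching v'1 = -1, v'2 = 2

# A tensor with one upper and one lower index (a matrix) transforms with one
# factor of R for the upper index and one of R-inverse for the lower index:
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
M_prime = np.einsum("ij,jk,kl->il", R, M, np.linalg.inv(R))
print(np.allclose(M_prime, R @ M @ np.linalg.inv(R)))  # same contraction
```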
Discussion (Wed, Apr 8):
From last time: we said that a matrix transforms under rotations as R M R^-1. Explain why this is a linear transformation on matrices.
Meeting 5 (Wed, Apr 8): A basis contains all of the vector-ness of a vector. It is also a cipher: once we agree on a basis, then we can communicate everything there is to know about a vector by simply listing its components. Surprising examples of vector spaces.
Meeting 6 (Fri, Apr 10): Matrices are defined by their action on basis vectors. A basis for all tensors.
Mon Apr 6: Please record a 2-5 minute video introducing yourself to the teaching team and the other students. You should start by stating your name. You may also include: where you grew up, your goals for your physics degree, hobbies, anything you are excited about. Submission link.
Monday Apr. 13: Long homework 1 (updated April 2; correction of due date, thanks Declan!). Submission link
Friday Apr 10: optional: there is an early term evaluation for our course available on the CANVAS (ugh) page. You are not required to, but you are encouraged to use this form to provide feedback about the course. More about what these evaluations are and how they are used.
Next week we will start doing our explainer videos. If you're looking for inspiration and a fun evening, you might consider attending the Grad Slam talks this Friday, 4-6pm.
Meeting 7 (Mon, Apr 13): Inner products. A vector space with an inner product is called a metric space. Rotations are special linear transformations (isometries) in a metric space that preserve the inner product. Introduction to Minkowski spacetime.
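A quick numerical sketch of the isometry property (assuming the Euclidean metric; the angle and vectors below are arbitrary): a rotation satisfies R^T R = 1, so it preserves the inner product.

```python
import numpy as np

theta = 0.7                      # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(R.T @ R, np.eye(2)))   # R is an isometry: R^T R = 1

# The inner product of any two vectors is unchanged by the rotation:
rng = np.random.default_rng(0)
v, w = rng.standard_normal(2), rng.standard_normal(2)
print(np.isclose((R @ v) @ (R @ w), v @ w))
```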
Discussion (Wed, Apr 15): Explainer videos live.
Victoria: LHW1, Problem 3
Andrew: LHW1, Problem 5
Meeting 8 (Wed, Apr 15): Lorentz transformations as the isometries of spacetime. Special relativity by diagram: length contraction, time dilation.
Meeting 9 (Fri, Apr 17): Lorentz transformations, spacetime diagrams. Time dilation and length contraction.
Fri Apr 17: Explainer Videos, assignments are on our internal sheet (class members only). Submission link.
Fri Apr. 17: Short homework 2. Submission link.
Monday Apr. 27: Long homework 2 Submission link
Meeting 10 (Mon, Apr 20): The pole-in-barn paradox as a spacetime diagram. The physics of special relativity: proper time, relativistic electrodynamics and the existence of magnetism.
Discussion (Wed, Apr 22): Working with special relativity, meeting some of its paradoxes.
Chloe: LHW2, Problem 2
Declan: LHW2, Problem 3
Meeting 11 (Wed, Apr 22): Projection and the Gram-Schmidt Process.
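A minimal Gram-Schmidt sketch (hypothetical input vectors, Euclidean inner product): subtract from each vector its projection onto the basis vectors built so far, then normalize what remains.

```python
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = v - sum(np.dot(e, v) * e for e in basis)  # remove projections
        basis.append(w / np.linalg.norm(w))           # normalize the rest
    return np.array(basis)

E = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
print(np.allclose(E @ E.T, np.eye(3)))  # rows form an orthonormal basis
```

This sketch assumes the input vectors are linearly independent; a dependent vector would leave w = 0 and the normalization would fail.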
No Meeting (Fri, Apr 17): No class today, my apologies. Prof. Tanedo will be facilitating the UC PPFP spring retreat; please feel free to reach out to him if you're thinking of an academic career and would like to know more about the UC President's Postdoctoral Fellowship Program.
Monday Apr. 27: Long homework 2 Submission link
Friday Apr. 24: Peer Review, assignments on internal page. Submission link
Meeting 12 (Mon): Walking through a 2x2 eigenvalue problem.
We started with a symmetric 2x2 matrix and identified this as special.
We introduced some jargon to get used to: symmetric, self-adjoint, Hermitian. For now, these all mean the same thing.
We then identified (without showing how we got them) two special vectors so that when the matrix acts on these vectors, they are simply rescaled. The rescaling factors are called the eigenvalues and the special vectors are called eigenvectors.
We argued that there are in fact an infinite number of eigenvectors for each eigenvalue: the rescaling of any eigenvector is also an eigenvector.
We observed that the eigenvectors are orthogonal. This is a result of the matrix being symmetric (we did not prove this yet). That meant that we could normalize the eigenvectors to form a nice basis, which we call an eigenbasis. (This is not standard nomenclature, but any physicist will know what you mean.)
By writing vectors with respect to the eigenbasis, matrix operations are easy: we simply rescale each vector component by the eigenvalue. This allows us to quickly calculate powers of the matrix, including negative and fractional powers. A power of -1 means you can calculate the inverse of the matrix easily in the eigenbasis.
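The eigenbasis trick for matrix powers can be sketched with numpy (a hypothetical symmetric matrix, chosen so its eigenvalues are positive):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
evals, evecs = np.linalg.eigh(M)     # eigh: for symmetric/Hermitian matrices

# Reassemble M from its eigendecomposition: M = R diag(λ) R^T
print(np.allclose(evecs @ np.diag(evals) @ evecs.T, M))

# A fractional power: the "square root" of M, rescaling each eigendirection
sqrt_M = evecs @ np.diag(evals ** 0.5) @ evecs.T
print(np.allclose(sqrt_M @ sqrt_M, M))

# A power of -1 gives the inverse
inv_M = evecs @ np.diag(evals ** -1.0) @ evecs.T
print(np.allclose(inv_M, np.linalg.inv(M)))
```

In the eigenbasis every power of M is just the same power of each eigenvalue on the diagonal.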
Discussion (Wed):
Fede: LHW2, Problem 4
Andrew: LHW2, Problem 5
Meeting 13 (Wed): The determinant of a 2x2 matrix is an area. Specifically, it is the area of the parallelogram whose sides are |M e1> and |M e2>. We introduced the d-dimensional Levi-Civita symbol (or alternating symbol) as the tensor with d lower indices satisfying (1) ε_123...d = 1 and (2) exchanging any two indices flips the sign, so any component with a repeated index vanishes. We wrote the determinant of a 2x2 matrix with respect to the 2-dimensional Levi-Civita: det(M) = ε_ij M^i_1 M^j_2. We showed that this matches the definition in terms of the components of M, and argued the natural generalization to d dimensions:
det(M) = ε_ij...k M^i_1 M^j_2 ... M^k_d.
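The 2-dimensional formula is easy to check numerically (the matrix below is an arbitrary example I am adding for illustration):

```python
import numpy as np

# The 2d Levi-Civita symbol: ε_12 = 1, antisymmetric under index exchange.
eps = np.array([[0.0, 1.0],
                [-1.0, 0.0]])

M = np.array([[3.0, 1.0],
              [2.0, 5.0]])

# Contract ε_ij with the first column M^i_1 and the second column M^j_2:
det_eps = np.einsum("ij,i,j->", eps, M[:, 0], M[:, 1])
print(det_eps)                               # 3*5 - 1*2 = 13
print(np.isclose(det_eps, np.linalg.det(M)))  # matches the usual determinant
```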
The extra credit for Long HW3 challenges you to argue why det(M) for a 3x3 matrix gives the volume of a parallelepiped (a 3D parallelogram, the way a cube is a 3D square) whose sides are |M e1>, ... , |M e3>. Then you can extend this to higher dimensions.
Meeting 14 (Fri): Review of the eigenvalue program from the top.
Start with a symmetric matrix. Your goal is to diagonalize it, writing M = (rotation)(diagonal matrix)(inverse rotation).
Find the eigenvalues. The components of the diagonal matrix are the eigenvalues. These come from the characteristic equation, det(M-λ1) = 0.
Find the eigenvectors, these are defined by M|ξ> = λ|ξ>.
Normalize the eigenvectors to form an eigenbasis. The eigenvectors should already be orthogonal.
Find the components of the rotation. This comes from "inserting 1" in a clever way. Note that the rows (or columns depending on how you define the rotation) of the rotation matrix are simply the components of eigenvectors in the standard basis.
Note: the "rotation matrix" might not take the standard form. That is usually because depending on the order of the eigenvectors (and corresponding order of eigenvalues), there may be a parity (mirror) transformation.
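The recipe above can be run end to end with numpy (a hypothetical symmetric matrix; numpy's eigh does steps 2-3 for us, returning a normalized eigenbasis):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# Eigenvalues from det(M - λ1) = 0; for this M they are λ = -1, 3.
# eigh also returns the normalized eigenvectors (the eigenbasis).
evals, evecs = np.linalg.eigh(M)

# The columns of the "rotation" are the eigenvector components,
# so that M = R diag(λ) R^-1.
R = evecs
D = np.diag(evals)
print(np.allclose(R @ D @ np.linalg.inv(R), M))

# As noted above, det(R) may be -1 rather than +1: depending on the order
# and signs numpy chooses, R can include a parity (mirror) transformation.
print(np.isclose(abs(np.linalg.det(R)), 1.0))
```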
Fri May 1: Explainer Video 2, assignments are on our internal sheet (class members only). Submission link.
Fri May 1: Short homework 3. Submission link.
Mon May. 11: Long HW3 (corrected 5/6) Submission link
Meeting 15 (Mon): Review of the eigenbasis problem. We focused on motivating the characteristic equation from the diagonalization M = (rotation) (diagonal) (inverse rotation). Using det(rotation) = 1 because the rotation goes from one orthonormal basis to another, one may then see that the characteristic equation is simply the statement that (λ-λ1)...(λ-λn) = 0, whose solutions are λ = λi for some eigenvalue λi.
We started to talk about how to think about eigenvalues as stretching/shrinking along the eigenvector directions.
Discussion (Wed):
Owen: LHW3, Problem 1
Alejandro: LHW3, Problem 2
Meeting 16 (Wed):
Pictorial understanding of the action of a symmetric matrix on a vector.
Motivation 1: the 'big idea' of quantum mechanics. Introduction to complex vector spaces as "everywhere there is a number, it is now allowed to be complex." This means that linear combinations may be taken with complex coefficients. It also means that components of tensors may be complex. However, basis states are already completely abstract—they're not numbers.
Motivation 2: what is the transpose in index notation? We proposed a way to make sense of this that involved the metric. We then defined the adjoint of a matrix M as <M† v, w > = < v, M w> for any vectors v and w. We generalized "symmetric" to self-adjoint, M† = M. This is also known as Hermitian. So what? We now see that the "special condition" (symmetric matrix) that gave us orthogonal eigenvectors can be framed with respect to the inner product. In components, the adjoint is "transpose and complex conjugate everything."
The complex inner product: <v, w> = <w, v>* = v1* w1 + v2* w2 + ... . We motivated this by saying that it enforces that the length of a vector is a positive number (assuming the Euclidean metric): |v|^2 = |v1|^2 + |v2|^2 + ...
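A short numerical check of these two properties (hypothetical complex vectors; numpy's vdot conjugates its first argument, matching the convention above):

```python
import numpy as np

v = np.array([1.0 + 2.0j, 3.0j])
w = np.array([2.0 - 1.0j, 1.0 + 1.0j])

inner = np.vdot(v, w)        # Σ v_i* w_i: conjugate the first slot
print(np.isclose(inner, np.conj(np.vdot(w, v))))   # <v, w> = <w, v>*

# The length-squared of v is real and positive:
print(np.isclose(np.vdot(v, v).imag, 0.0), np.vdot(v, v).real > 0)
```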
Tantalizing claims: a self-adjoint/Hermitian matrix has (1) real eigenvalues and (2) orthogonal eigenvectors. This is significant for quantum mechanics because the real eigenvalues are usually associated with observables (numbers we would measure in an experiment) and the eigenbasis states always produce that observed value. In general, a quantum state is a linear combination of these eigenstates.
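Before the proof, both claims can be checked numerically on a hypothetical Hermitian matrix:

```python
import numpy as np

H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
print(np.allclose(H, H.conj().T))        # H is self-adjoint: H† = H

evals, evecs = np.linalg.eigh(H)

# (1) real eigenvalues: eigh returns them as real numbers
print(np.isrealobj(evals))

# (2) orthonormal eigenvectors under the complex inner product
print(np.allclose(evecs.conj().T @ evecs, np.eye(2)))
```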
Meeting 17 (Fri): Proof of the tantalizing claims.
Friday May 8: Peer Review 2, assignments on internal page. (Now updated! 5/6, sorry about the delay.) Submission link
Monday May 11: Long HW3 (corrected 5/6) Submission link
Friday May 15: Explainer Video 3, assignments are on our internal sheet (class members only). Submission link.
Meeting (Mon):
Discussion (Wed):
Steven: LHW3, Problem 4
Fede: LHW3, Problem 5
Meeting (Wed):
Meeting (Fri):
Meeting (Mon):
Discussion (Wed):
Andrew: LHW4, Problem TBD
Alejandro: LHW4, Problem TBD
Meeting (Wed):
Meeting (Fri):
Meeting (Mon):
Discussion (Wed):
Victoria: LHW4, Problem TBD
Chloe: LHW4, Problem TBD
Meeting (Wed):
Meeting (Fri):
Review of key ideas, highlighting some subtle points, and (if there are no other more pressing discussions), some pictures of how this fits together into more advanced mathematical physics.
Meeting (Mon):
Discussion (Wed):
Declan: LHW5, Problem TBD
Owen: LHW5, Problem TBD
Meeting (Wed):
Meeting (Fri):