## Colorado State University, Pueblo; Spring 2012 Math 307 — Introduction to Linear Algebra Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

In the following, all sections and page numbers refer to the required course textbook, Linear Algebra, A Modern Introduction (2nd edition), by David Poole.

Also in the following, the camera image means that class was videoed that day and can be seen through the Blackboard page for this class. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day.

This schedule will change frequently, so please check it at least every class day, and before starting work on any assignment (in case the content of the assignment has changed).

• M:
• Content:
1. bureaucracy and introductions
2. what is Linear Algebra (the study of vector spaces and linear transformations...)
3. why do we do so much abstraction and formality at this point of the mathematics curriculum: what abstraction is good for
• Miniquiz 0
• T: • Read: To the Student, p. xxiii and §§ 1.1–1.3
• Miniquiz 1
• Journals are not due today; it's a bit too early in the term. (First Journal entry will be due next Tuesday.)
• Content:
1. some basic terminology and notation:
• logical and basic set theoretic terminology/notation
• some basic sets of numbers
1. natural numbers $\NN$
2. integers $\ZZ$
3. rationals $\QQ$
4. real numbers $\RR$
• starting good definitional style, including
• all variables must be "bound"
• clearly identify the symbol and/or terminology being defined
• clearly identify the type of object being defined
• vectors in $\RR^n$
• vector addition
• scalar multiplication
• the dot product
• norms
2. some basic properties
• of vector arithmetic
• of dot products and norms
• the triangle inequality
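The operations and properties above can be checked numerically; here is a minimal NumPy sketch (the vectors $\vec{u}$ and $\vec{v}$ are arbitrary illustrative choices, not from the textbook):

```python
import numpy as np

# Two illustrative vectors in R^3 (arbitrary choices for this sketch).
u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, -4.0, 0.0])

# Vector addition and scalar multiplication are componentwise.
w = u + v                    # array([ 4., -2.,  2.])
s = 2 * u                    # array([2., 4., 4.])

# Dot product and (Euclidean) norm: ||u|| = sqrt(u . u).
dot = np.dot(u, v)           # 1*3 + 2*(-4) + 2*0 = -5
norm_u = np.linalg.norm(u)   # sqrt(1 + 4 + 4) = 3.0

# Triangle inequality: ||u + v|| <= ||u|| + ||v||.
assert np.linalg.norm(u + v) <= norm_u + np.linalg.norm(v)
```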
• Do HW0: Send me e-mail (to jonathan.poritz@gmail.com) telling me:
1. Your name.
2. Your e-mail address. (Please give me one that you actually check fairly frequently, since I may use it to contact you during the term.)
3. Your year/program/major at CSUP.
4. The reason you are taking this course.
5. What you intend to do after CSUP, in so far as you have an idea.
6. Past math classes you've had.
7. Other math and science classes you are taking this term, and others you intend to take in coming terms.
8. Your favorite mathematical subject.
9. Your favorite mathematical result/theorem/technique/example/problem.
10. Anything else you think I should know (disabilities, employment or other things that take a lot of time, etc.)
11. [Optional:] If you were going to be trapped on a desert island alone for ten years, what music would you like to have?
• W: • Read: §§ 2.1 & 2.2
• Miniquiz 2
• Content:
1. more basic terminology and notation...
• complex numbers
• angles between vectors
• quantifiers $\forall$ and $\exists$
• more on what makes a good definition
• [systems of] linear equations
• solutions of linear systems, the solution set and its structure
• a[n in]consistent linear system
• the coefficient and augmented matrices of a linear system
• elementary row operations (EROs)
• row-equivalent matrices
• [in]homogeneous linear systems
2. more basic properties
• solution sets of linear systems are either empty, have exactly one point, or have an infinite number of points
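The trichotomy of solution sets can be seen concretely with SymPy's `linsolve`, which takes an augmented matrix; the three small systems below are hypothetical examples chosen to exhibit each case:

```python
from sympy import Matrix, symbols, linsolve

x, y = symbols('x y')

# A consistent system with a unique solution: x + y = 3, x - y = 1.
unique = linsolve(Matrix([[1, 1, 3], [1, -1, 1]]), x, y)
# -> {(2, 1)}

# A consistent system with infinitely many solutions: x + y = 3 alone
# (one equation, two unknowns), so y is a free variable.
infinite = linsolve(Matrix([[1, 1, 3]]), x, y)
# -> {(3 - y, y)}, a whole line of solutions

# An inconsistent system: x + y = 3 and x + y = 5 cannot both hold.
empty = linsolve(Matrix([[1, 1, 3], [1, 1, 5]]), x, y)
# -> EmptySet
```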
• F:
• [Re]Read: §2.2
• Content:
1. yet more basic terminology and notation...
• angles between vectors
• orthogonal vectors
• a [non]trivial solution of a linear system
• relation between [non]trivial solutions, [non]homogeneous linear systems, and [non]unique solutions
• more on what makes a good definition
• what makes a good statement of a result (theorem, proposition, etc.)
• [starting on] what makes a good proof
• proof structures:
1. unpacking ("follow your nose")
2. contradiction ("it can't not be true")
3. many more to come...
• the [reduced] row-echelon form of a matrix
• row-reduction
• free variables
• the rank of a matrix
2. more basic properties
• The Rank-Nullity Theorem
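Row-reduction, rank, and the count of free variables can be sketched with SymPy's `rref`; the matrix below is an arbitrary illustrative choice, not an example from the book:

```python
from sympy import Matrix

# Coefficient matrix of a linear system with 4 unknowns (arbitrary example).
A = Matrix([[1, 2, 0,  1],
            [2, 4, 1,  1],
            [0, 0, 1, -1]])

R, pivot_cols = A.rref()     # reduced row-echelon form and pivot columns
rank = len(pivot_cols)       # here the pivots land in columns 0 and 2

# rank + (number of free variables) = number of unknowns.
free_vars = A.cols - rank
assert rank + free_vars == A.cols
```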
• Maxiquiz 1 handed out today, due on Monday
• Today [Friday] is the last day to add classes.

• M: • [Re]Read: §4.2 and Read: §4.1
• Content:
1. going over Maxiquiz 5
2. the identity transformation of $\RR^n$
3. the inverse of a linear/matrix transformation
4. definition of the determinant
• for $1\times 1$ matrices
• for $2\times 2$ matrices
• recursively for $n\times n$ matrices, where $n\ge 3$ — which is essentially Laplace's Expansion Theorem
• [there is actually a direct (=non-recursive) formula]
5. properties of determinants:
• $\det(A)\ne 0\ \Leftrightarrow\ A$ is invertible.
• $\forall A,\ \det(A)=\det(A^T)$
• $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$
• determinants for triangular matrices
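These determinant properties can be verified numerically on random matrices; a minimal NumPy sketch (the matrices are random, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# det(A^T) = det(A)
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))

# det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))

# For a triangular matrix, the determinant is the product of the
# diagonal entries.
T = np.triu(A)               # upper-triangular part of A
assert np.isclose(np.linalg.det(T), np.prod(np.diag(T)))
```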
• Miniquiz 12
• T: • [Re]Read: §4.1 and Read: §4.3
• Content:
1. eigenvectors, eigenvalues, and eigenspaces
2. an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$.
3. the characteristic polynomial/equation of an $n\times n$ matrix
4. the algebraic multiplicity of an eigenvalue
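Eigenvalues and eigenvectors can be computed numerically with NumPy; the $2\times 2$ matrix below is a hypothetical example whose characteristic polynomial $\lambda^2-7\lambda+10=(\lambda-5)(\lambda-2)$ is easy to check by hand:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues and eigenvectors: A v = lambda v for each column v of V.
eigvals, V = np.linalg.eig(A)

for lam, v in zip(eigvals, V.T):
    assert np.allclose(A @ v, lam * v)

# The eigenvalues are the roots of the characteristic polynomial
# det(A - lambda I) = lambda^2 - 7 lambda + 10 = (lambda - 5)(lambda - 2).
assert np.allclose(sorted(eigvals), [2.0, 5.0])
```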
• Journal 4 on §§3.6 & 4.2 is due today
• Hand in HW8: 4.2.53, 4.2.54, 4.2.69
• Miniquiz 13
• W:
• [Re]Read: §4.3 and Read: §4.4
• Content:
1. the geometric multiplicity of an eigenvalue
2. definition of matrix similarity
3. examples of eigenspaces and multiplicities of eigenvalues
• Miniquiz 14
• F: • [Re]Read: §4.4
• Content:
1. eigenvalues of triangular matrices
2. eigenvalues of invertible matrices
3. linear independence of eigenvectors corresponding to distinct eigenvalues
4. properties of matrix similarity:
• reflexive
• symmetric
• transitive
• ...so it's an equivalence relation
5. properties that similar matrices have in common
6. diagonalizable matrices
7. building a basis for the ambient $\RR^n$ out of bases of all eigenspaces of an $n\times n$ matrix
• a $2\times 2$ example
8. The Diagonalization Theorem
• Maxiquiz 6 today
• Hand in HW9: 4.1.35, 4.1.37, 4.3.20-22

• M:
• T:
• Hand in Journal 5, if you are doing it on paper
• Hand in HW10: 4.4.40-42, 4.4.47-48
• Midterm I in class today
• W:
• Read: §5.1
• Content:
1. going over Midterm I
• F:
• [Re]Read: §5.1
• Content:
1. things to notice about an invertible matrix, say called $P$:
• the columns of $P$ are a basis of $\RR^n$, call them $\vec{p}_1,\dots,\vec{p}_n$
• conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is an invertible $n\times n$ matrix.
• multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j,$ again for $1\le j\le n$.
• multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j,$ for $1\le j\le n$.
2. what this has to do with diagonalization:
• if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then building the matrix $P$ with columns from this basis, it will turn out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? because $$\left(P^{-1}AP\right)\vec{e_j}=P^{-1}\left(A\left(P\vec{e_j}\right)\right)=P^{-1}\left(A\vec{p_j}\right)=P^{-1}\left(\lambda_j\vec{p_j}\right)=\lambda_j\left(P^{-1}\vec{p_j}\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]
3. so we get the Diagonalization Theorem, two versions:
• if separate bases of all the eigenspaces of an $n\times n$ matrix $A$ when put together yield a basis of all of $\RR^n$, then $A$ is diagonalizable.
• if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
4. a corollary of the Diagonalization Theorem is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
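The whole computation above can be sketched numerically: the matrix below is a hypothetical $2\times 2$ example with two distinct eigenvalues, so by the corollary it must be diagonalizable, and $P^{-1}AP$ comes out diagonal with the eigenvalues down the diagonal:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])     # distinct eigenvalues 5 and 2

# The columns of P are eigenvectors of A (np.linalg.eig returns them
# paired with the eigenvalues in eigvals).
eigvals, P = np.linalg.eig(A)

# P^{-1} A P is diagonal, with the eigenvalues down the diagonal.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigvals))
```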
• Hand in revised solutions to Midterm I, if you like

• M: • [Re]Read: §5.5 pp.411-426 only
• Content:
1. defining quadratic form
2. examples of quadratic forms: upward- and downward-pointing paraboloids and saddles
3. diagonalization of quadratic forms
4. our throwing bricks demo from last week is related to the inertia tensor, and applying the Spectral Theorem yields something called The Principal Axes Theorem in physics
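Diagonalizing a quadratic form amounts to orthogonally diagonalizing its symmetric matrix, as the Spectral Theorem guarantees; here is a minimal NumPy sketch for the hypothetical form $q(x,y)=2x^2+2xy+2y^2$ (not an example from the book):

```python
import numpy as np

# The quadratic form q(x, y) = 2x^2 + 2xy + 2y^2 has symmetric matrix:
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Spectral Theorem: a symmetric matrix is orthogonally diagonalizable,
# A = Q D Q^T with Q orthogonal (np.linalg.eigh is for symmetric matrices).
eigvals, Q = np.linalg.eigh(A)

assert np.allclose(Q.T @ Q, np.eye(2))             # Q is orthogonal
assert np.allclose(Q.T @ A @ Q, np.diag(eigvals))  # diagonalized form

# In the rotated (principal-axes) coordinates the form has no cross
# term: q becomes 1*u^2 + 3*v^2.
assert np.allclose(eigvals, [1.0, 3.0])
```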
• T: • Read: §6.1
• Content:
1. defining vector space
2. starting examples (and non-examples) of vector spaces:
• the trivial vector space $\{\vec{0}\}$
• $\RR^n$ with the usual vector addition and scalar multiplication
• $\RR^2$ with modified vector addition(s) is often not a vector space
• Start HW15, which is due tomorrow
• Miniquiz 22
• W: • [Re]Read: §6.1
• Content:
1. some algebraic (arithmetic?) properties in vector spaces which are consequences of their definition (e.g., in any vector space $V$, $0\vec{u}=\vec{0}\ \forall \vec{u}\in V$).
2. more examples of vector spaces:
• spaces of functions, such as:
• $\Ff(\RR)$ — the space of all functions on the real line $\RR$
• $C(\RR)$ — the space of continuous functions on the real line $\RR$
• $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
• $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
• $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
• $\Pp$ — the space of all polynomials in one variable
all with the pointwise addition of functions and scalar multiplication of functions
• $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices.
• Hand in HW15: 5.5.32, 5.5.38, 5.5.42
• Miniquiz 23
• F: • [Re]Read: §6.1
• Content:
1. more discussion of the vector spaces of functions we defined last class -- these form a chain of subspaces:
$\qquad\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$
2. defining [vector] subspace
3. examples of subspaces
4. how to check if something is a subspace
• Maxiquiz 9 today

• Spring Break! No classes, of course.
• But please catch up on any old homeworks or re-dos of old assignments, to hand in on Monday after the break.
• Also, do not forget that HW16 is due the first day after break, and Journal 7 the day after that

• M: • [Re]Read: §6.1 and Read: §6.2
• Content:
1. going over Maxiquiz 9 and recent HWs
2. still more about vector subspaces
3. another vector space example: $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices.
4. the subspaces of symmetric or skew-symmetric matrices in $M_{n\times n}$
• Hand in HW16: 6.1.{2, 6, 12, 48}
• Miniquiz 24
• T: • [Re]Read: §6.2
• Content:
1. defining Span in an abstract vector space
2. properties of Span in an abstract vector space
3. defining linearly [in]dependent in an abstract vector space, and examples
4. defining basis in an abstract vector space, and examples
5. defining dimension in an abstract vector space, and examples
• Journal 7 on §§5.4, 5.5 (only pp. 411-418), & 6.1 is due today
• Miniquiz 25
• W:
• [Re]Read: §6.2
• Content:
1. a little more care with finite and infinite sets of vectors which might be linearly [in]dependent
2. defining [in]finite dimensional for an abstract vector space, and examples
3. the Basis Theorem in an abstract vector space
• Miniquiz 26
• F: • [Re]Read: §6.2
• Content:
1. yet more care with finite and infinite sets of vectors which might be linearly [in]dependent
2. be careful with the definition of dimension in the book; it doesn't quite work!
3. coordinates of a vector $\vec{v}$ with respect to a basis $\Bb$ in an abstract vector space: $[\vec{v}]_\Bb\in\RR^k$, if $\Bb$ consists of $k$ basis vectors
• Hand in HW17: 6.1.46, 6.2.{6, 34, 44}
• Maxiquiz 10 today

• M: • Read: §6.3 and [Re]Read: §6.4
• Content:
1. going over Maxiquiz 10 and recent HWs
2. still more about coordinates:
• a change-of-basis matrix $P_{\Cc\leftarrow\Bb}$ for converting from basis $\Bb$ to basis $\Cc$, by $[\vec{v}]_\Cc=P_{\Cc\leftarrow\Bb}\,[\vec{v}]_\Bb$
• this $P_{\Cc\leftarrow\Bb}$ will be $k\times k$ if $\Bb$ (and thus also $\Cc$) consists of $k$ vectors
• $P_{\Cc\leftarrow\Bb}$ is invertible: in fact, $P_{\Cc\leftarrow\Bb}^{-1}=P_{\Bb\leftarrow\Cc}$
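For bases of $\RR^n$ written as the columns of matrices $B$ and $C$, we have $[\vec{v}]_\Bb = B^{-1}\vec{v}$, so the change-of-basis matrix is $P_{\Cc\leftarrow\Bb} = C^{-1}B$; here is a minimal NumPy sketch with two arbitrarily chosen bases of $\RR^2$:

```python
import numpy as np

# Two bases of R^2, written as the columns of matrices (arbitrary choices).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = np.array([[2.0, 0.0],
              [1.0, 1.0]])

# Since [v]_B = B^{-1} v and [v]_C = C^{-1} v, the change-of-basis
# matrix converting B-coordinates to C-coordinates is C^{-1} B.
P_CB = np.linalg.inv(C) @ B

v = np.array([3.0, 2.0])
v_B = np.linalg.solve(B, v)      # coordinates of v in basis B
v_C = np.linalg.solve(C, v)      # coordinates of v in basis C
assert np.allclose(P_CB @ v_B, v_C)

# P_{C<-B} is invertible, with inverse P_{B<-C} = B^{-1} C.
P_BC = np.linalg.inv(B) @ C
assert np.allclose(np.linalg.inv(P_CB), P_BC)
```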
• Miniquiz 27
• T:
• Read: §6.4
• Content:
1. more examples of change-of-basis matrices
2. definition of a linear transformation between abstract vector spaces
• Miniquiz 28
• W:
• [Re]Read: §6.4
• Content:
1. examples of linear transformations
• the zero transformation
• the identity transformation
• matrix multiplication
• differentiation in $\Pp$
2. review for Midterm II; see this review sheet
• Hand in HW18: 6.3.16, 6.3.21 [Hint: use Thm 6.12b.], 6.4.{20, 22, 24}
• Miniquiz 29 (handed out as a take-home quiz, due Friday)
• F:
• Midterm II in class today
• Journal 8 on §§6.2-6.4 is due today

• M: • Read: §7.1
• Content:
1. going over Maxiquiz 11
2. a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic
3. an inner product and inner product space
4. examples of inner product spaces:
• $\RR^n$ with the usual dot product
• the $L^2$ inner product on $C[0,1]$
• Journal 9 on §§6.5 & 6.6 is due today
• Miniquiz 31
• T: • [Re]Read: §7.1
• Content:
1. elementary properties of inner products
2. length or norm of vectors in an inner product space
3. the Pythagorean Theorem
4. orthogonal vectors in an inner product space
5. projections and the Gram-Schmidt Process in an inner product space
6. an orthonormal set in the inner product space $C[-\pi,\pi]$ with the $L^2$ inner product: trigonometric functions, and the connection with Fourier Analysis and radios
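The Gram-Schmidt Process can be sketched numerically; the function below (a hypothetical helper, using the dot product on $\RR^3$ and three arbitrarily chosen independent vectors) orthonormalizes by repeatedly subtracting projections, and the last two lines check the $L^2$ orthogonality of $\sin$ and $\cos$ on $[-\pi,\pi]$ by a Riemann sum:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors w.r.t. the dot product."""
    ortho = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in ortho:
            w = w - np.dot(w, q) * q     # subtract the projection onto q
        ortho.append(w / np.linalg.norm(w))
    return np.array(ortho)

Q = gram_schmidt([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])

# The rows of Q are orthonormal: Q Q^T = I.
assert np.allclose(Q @ Q.T, np.eye(3))

# L^2 orthogonality of sin and cos on [-pi, pi], checked by a Riemann
# sum (the integrand sin(t)cos(t) is odd, so the integral is 0).
t = np.linspace(-np.pi, np.pi, 10001)
integral = np.sum(np.sin(t) * np.cos(t)) * (t[1] - t[0])
assert abs(integral) < 1e-6
```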
• Miniquiz 32
• W:
• Read: §7.2 pp.561-564 only
• Content:
1. the distance between vectors in an inner product space
2. a norm and a normed linear space
3. the distance function $d(\vec{u},\vec{v})$ in a normed linear space
4. a metric and metric space
5. examples of norms/metrics:
• the norm coming from an inner product
• the sum norm (also called the $L^1$ norm)
• the max norm (also called the sup or $L^\infty$ norm)
• the taxicab metric
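These norms (and the metrics they induce) are all available through NumPy's `norm` via its `ord` parameter; a minimal sketch with arbitrarily chosen vectors:

```python
import numpy as np

u = np.array([3.0, -4.0])

# The norm coming from the dot product (the usual Euclidean / L^2 norm):
assert np.linalg.norm(u, 2) == 5.0        # sqrt(9 + 16)

# The sum (L^1) norm and the max (sup / L^infinity) norm:
assert np.linalg.norm(u, 1) == 7.0        # |3| + |-4|
assert np.linalg.norm(u, np.inf) == 4.0   # max(|3|, |-4|)

# Every norm gives a metric d(u, v) = ||u - v||; the L^1 version is
# the taxicab metric.
v = np.array([1.0, 1.0])
taxicab = np.linalg.norm(u - v, 1)        # |3-1| + |-4-1| = 7
assert taxicab == 7.0
```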
• Hand in HW20: 7.1.{36, 40}, 7.2.{8, 14}
• Journal 10 on §§7.1 & 7.2 [only the parts we covered in this class] is due today
• Miniquiz 33
• Hand in all late work and re-dos by 4pm today if you want them corrected and returned to you on Friday
• F: • Content:
1. going over recent HWs
2. review for the Final Exam next week; see this review sheet
• Last day to hand in [by noon!] all late work and re-dos for class credit [although materials handed in today will not be returned]