## Colorado State University, Pueblo; Fall 2013 Math 307 — Introduction to Linear Algebra Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

In the following, all sections and page numbers refer to the required course textbook, Linear Algebra, A Modern Introduction (3rd edition), by David Poole.

Also in the following, the video-camera icon means that class was videoed that day and the recording can be seen through the Blackboard page for this class. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day.

This schedule will change frequently, so please check it at least every class day, and before starting work on any assignment (in case the content of the assignment has changed).

• M: • Yes, we do have class today, even though it is the federal Labor Day holiday.
• [Re]Read: §2.1 and Read §2.3
• Content:
1. going over Maxiquiz 1
2. notation $A\Rightarrow B$ meaning "if $A$ then $B$" and $A\Leftrightarrow B$ meaning "$A$ if and only if $B$", also written "$A$ iff $B$"
3. [in]homogeneous linear system
4. relationship between a linear system being homogeneous and being consistent
5. linear combination [again]
6. Span, a few basic examples and properties (e.g., the span of a single vector is the line along that vector)
7. (size of) solution sets of a linear system: empty, single vector, or an infinite number of vectors.
8. result that a linear system is consistent iff the vector consisting of the right hand side constant values is in the span of the columns
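This last consistency criterion can be checked by machine: a system $A\vec{x}=\vec{b}$ is consistent exactly when appending $\vec{b}$ to $A$ does not increase the number of nonzero rows after row reduction, i.e., when $\vec{b}$ is in the span of the columns of $A$. Here is a small sketch of my own (the matrices and function names are illustrative, not from class), using exact fractions to avoid rounding:

```python
from fractions import Fraction

def rref(M):
    """Row-reduce a matrix (a list of rows) to reduced row echelon form."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

def rank(M):
    """Number of nonzero rows in the RREF of M."""
    return sum(1 for row in rref(M) if any(x != 0 for x in row))

A = [[1, 2], [2, 4]]   # the columns are linearly dependent
b_in  = [3, 6]         # a multiple of the first column: in the span
b_out = [3, 7]         # not in the span of the columns

aug = lambda A, b: [row + [bi] for row, bi in zip(A, b)]
print(rank(A) == rank(aug(A, b_in)))   # consistent: True
print(rank(A) == rank(aug(A, b_out)))  # inconsistent: False
```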
• Hand in HW1: 1.2.62, 2.2.44, 2.2.47
• Miniquiz 3
• T: • [Re]Read: §2.3
• Content:
1. [non]trivial for both linear combinations and solutions of a linear system
2. some elementary facts about $\Span$, such as that each of the vectors $\vec{v}_1,\dots,\vec{v}_k$, as well as the zero vector $\vec{0}$, is in the set $\Span(\vec{v}_1,\dots,\vec{v}_k)$.
3. linearly [in]dependent vectors — note it is important that the scalars in the definition are not all zero
4. examples of linearly [in]dependent vectors, including:
• the set $\{\vec{0}\}$ is linearly dependent
• if $\{\vec{u},\vec{v}\}$ is a linearly dependent set, then $\vec{v} = \alpha\,\vec{u}$ for some $\alpha\in\RR$.
• if $\{\vec{v_1},\dots,\vec{v_k}\}$ is a linearly dependent set, then one of the $\vec{v_j}$ is a linear combination of the remaining vectors.
• proof (by contradiction) that if $\{\vec{v_1},\dots,\vec{v_k}\}$ is a linearly independent set, then any subset is also linearly independent.
• Miniquiz 4
• hand in Journal 1 on material of Chapter 1 and the first part of Chapter 2
• W: • [Re]Read: §2.3
• Content:
1. more on proofs by contradiction
• general remarks
• a very classical example: Euclid's proof of the infinitude of the primes
2. revisit Span a bit
• it is a set
• ... work at the level of sets, "anything you can do, I can do meta", as Daniel Dennett said to Douglas Hofstadter.
3. the relationship of linear [in]dependence and the linear system whose coefficient matrix has columns which are the vectors under consideration: there will be a non-trivial solution of the homogeneous linear system with that coefficient matrix if and only if the vectors are linearly dependent
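For two vectors in $\RR^2$, this equivalence can be sketched very concretely (my own illustration, using the fact from Math 207 that a $2\times 2$ matrix is invertible exactly when its determinant is nonzero): the homogeneous system has a nontrivial solution, and hence the columns are dependent, exactly when the coefficient matrix fails to be invertible.

```python
def det2(u, v):
    """Determinant of the 2x2 matrix whose columns are u and v."""
    return u[0] * v[1] - u[1] * v[0]

def dependent(u, v):
    # The homogeneous system with coefficient matrix [u v] has a
    # nontrivial solution exactly when that matrix is not invertible,
    # i.e. when its determinant is zero.
    return det2(u, v) == 0

print(dependent((1, 2), (2, 4)))  # True: (2,4) = 2*(1,2)
print(dependent((1, 0), (0, 1)))  # False: the standard basis of R^2
```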
• Miniquiz 5
• F:
• Read: §2.3
• Content:
1. rank, free variables, row reduction, etc.
• Hand in HW2: 2.3.44 and Chapter 2 Review problem 18 (on p.141)
• Maxiquiz 2 today

• M: • [Re]Read: §2.3
• Content:
1. going over Maxiquiz 2 — the moral was write down the definitions, it's often enough!
2. rank
3. a term is well-defined if it does not depend on any arbitrary choices made in its definition
4. rank is well defined because matrices always do have an RREF form, and that form is unique; they also always do have an REF form, but that is not unique.
5. necessary linear dependence of $k$ vectors in $\RR^n$ if $k>n$
• Miniquiz 6
• Today [Monday] is the last day to drop classes without a grade being recorded
• T: • Read: §3.1 and Read §3.2
• Content:
1. context and type for definitions ... see the handout on definitions for more on this theme
2. going over recent miniquizzes and homework, noticed:
• It often pays to write a scratch version of a proof before putting down the official version on a quiz or HW.
• Students are sometimes (often?) forgetting to do both directions of the proof of an "if and only if" statement.
• When you are trying to prove that two sets are equal, $S=T$, often the best way is to take a general element $s\in S$, write it down specifically (yes, a specific form of a general element!), and show that this particular element must also be in $T$, thereby showing all of $S$ is in $T$, i.e., $S\subseteq T$. Then you turn around and do the same thing for an element of $T$, getting $T\subseteq S$. It then follows that $S=T$.
3. the converse of $P\Rightarrow Q$, which is $Q\Rightarrow P$ (or, equivalently, $\neg P\Rightarrow\neg Q$)
4. the contrapositive of $P\Rightarrow Q$, which is $\neg Q\Rightarrow\neg P$
5. if $P\Rightarrow Q$ is true, the converse may or may not be true
6. an "if-then" statement is true if and only if its contrapositive is true
7. definition of the inverse of a matrix, and what it means for a matrix to be invertible
8. We talked about the method of proof by induction, or proofs using the Principle of Mathematical Induction, which goes like this:
• It only applies to theorems of the specific form "$\forall n\in\NN\ S(n)$ is true," where $S(n)$ is a mathematical statement which depends upon a natural number parameter $n$.
• First one proves that $S(1)$ is true; this is called the base case.
• Then one proves "If $S(n)$, then $S(n+1)$"; this is called the inductive step and, during the proof of this step, when one invokes the hypothesis $S(n)$, one calls it the inductive hypothesis.
• One declares the theorem proven by induction (and goes home happy).
9. an example of an inductive proof, to show that $\sum_{j=1}^n j = \frac{n(n+1)}{2}$.
10. another example: proving that $\forall n\in\NN$, if $A_1,\dots,A_n$ are invertible matrices, then $\left(A_1\cdot\dots\cdot A_n\right)^{-1}=A_n^{-1}\cdot\dots\cdot A_1^{-1}$.
11. ended class with the inductive proof that all pigs are yellow. [see if you can find the flaw in that proof.]
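The summation formula from the first induction example can also be spot-checked numerically. A short sketch of my own (the loop is only evidence for finitely many $n$; it is the induction proof that covers all of $\NN$):

```python
# Spot-check the formula sum_{j=1}^n j = n(n+1)/2 for n = 1..100.
def gauss_sum(n):
    return n * (n + 1) // 2

for n in range(1, 101):
    assert sum(range(1, n + 1)) == gauss_sum(n)
print("formula holds for n = 1..100")
```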
• Hand in Journal 2
• Miniquiz 7 (handed out)
• W:
• [Re]Read: §§3.1-3.3
• Content:
1. recall from Math 207 and your reading of the book the basic terminology:
• matrices
• matrix addition
• scalar multiplication with matrices
• matrix multiplication
• the identity matrix
• matrix algebra
2. in class we recalled the definition of transpose
3. [skew-]symmetric matrices, properties:
• symmetric and skew-symmetric matrices must be square
• a skew-symmetric matrix always has zeros on the diagonal
4. some more logic:
• the details of an if–then statement, i.e., one in the form $P\Rightarrow Q$
• the converse of $P\Rightarrow Q$ (which is $Q\Rightarrow P$ or, equivalently, $\neg P\Rightarrow\neg Q$).
• the contrapositive of $P\Rightarrow Q$ (which is $\neg Q\Rightarrow\neg P$).
• the negation of $P\Rightarrow Q$ (which is $P\land\neg Q$)
• if $P\Rightarrow Q$ is true, the converse may or may not be true
• an "if-then" statement is true if and only if its contrapositive is true
• the negation of a statement of the form $\forall x\ P(x)$ is $\exists x\ \neg P(x)$
• the negation of a statement of the form $\exists x\ P(x)$ is $\forall x\ \neg P(x)$
5. gave the book's definition of a subspace of $\RR^n$: it is a subset $S\subseteq\RR^n$ satisfying the properties:
1. $\vec{0}\in S$
2. $\forall \vec{u},\vec{v}\in S\ \vec{u}+\vec{v}\in S$
3. $\forall \vec{u}\in S,\forall\alpha\in\RR\ \alpha\vec{u}\in S$
6. examples of subspaces of $\RR^2$ and $\RR^3$:
1. the trivial subspace $\left\{\vec{0}\right\}$
2. any line through the origin
3. any plane through the origin
4. the whole thing ($\RR^2$ or $\RR^3$ as a subspace of itself)
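The logic facts in item 4 above can be verified mechanically with a truth table, since each statement involves only two propositional variables. A sketch of my own (the helper name `implies` is illustrative):

```python
from itertools import product

# Material implication: "P implies Q" is false only when P is true and Q false.
implies = lambda p, q: (not p) or q

for P, Q in product([False, True], repeat=2):
    stmt           = implies(P, Q)
    contrapositive = implies(not Q, not P)
    negation       = P and not Q
    assert stmt == contrapositive    # an if-then always agrees with its contrapositive
    assert negation == (not stmt)    # the negation flips the truth value

# The converse is NOT equivalent: it disagrees when P is false and Q is true.
assert implies(False, True) != implies(True, False)
print("truth tables check out")
```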
• Miniquiz 8
• F:
• Read: §3.5
• Hand in HW3: 3.1.37, 3.2.33 and 3.3.46
• Content:
1. going over some recent HW and miniquizzes
2. discussed the flaw with the induction proof of the (false) theorem that all pigs are yellow: the inductive step did not prove "$\forall n\in\NN S(n)\Rightarrow S(n+1)$", because it did not work unless $n\ge 3$; in effect it proved that "$\forall n\in\NN,\ n\ge 3\land S(n)\Rightarrow S(n+1)$". That different inductive step would have proven the theorem "$\forall n\in\NN,\ n\ge 3, S(n)$" if we had used the base case $S(3)$. Instead, using the base case $S(1)$, as we did, the induction doesn't even get started and no theorem at all is proven.
3. thinking more about subspaces of $\RR^n$... Notice that the first part of the book's definition is actually not necessary as long as the subset $S$ is non-empty: if it has any vector $\vec{u}$ at all, then it has $0\cdot\vec{u}=\vec{0}$ as well. And the second and third parts of the book's definition are together saying that subspaces are closed under linear combinations.
So here is another, equivalent definition of what it means for a subset $S\subseteq\RR^n$ to be a subspace:
1. $S\neq\emptyset$ (remember the notation for the empty set)
2. $\forall \vec{u},\vec{v}\in S,\forall\alpha,\beta\in\RR\ \ \alpha\vec{u}+\beta\vec{v}\in S$
4. building the list of examples of subspaces of $\RR^n$:
1. the trivial subspace $\left\{\vec{0}\right\}$
2. any line through the origin
3. any plane through the origin
4. a hyperplane through the origin in higher dimensions: e.g., the set of vectors $\begin{pmatrix}x_1\\x_2\\x_3\\x_4\end{pmatrix}\in\RR^4$ with components satisfying $a\,x_1+b\,x_2+c\,x_3+d\,x_4=0$, where $a,b,c,d\in\RR$ are not all zero, is a "three-dimensional hyperplane" in $\RR^4$ (think of it as a linear system with only one equation: it has three free variables, so three parameters are needed to specify a point on this hyperplane).
5. $\RR^n$ is itself a subspace of $\RR^n$. (Note that any other subspace than this one, so any subspace of $\RR^n$ which is not all of $\RR^n$, is called a proper subspace.)
6. Spans are subspaces: $\forall n,k\in\NN$ and $\forall\vec{v}_1,\dots,\vec{v}_k\in\RR^n$, $\Span(\vec{v}_1,\dots,\vec{v_k})$ is a subspace of $\RR^n$
5. so the span of a bunch of vectors is a subspace, but it need not be an efficient way to describe that subspace. Looking for a way to characterize an efficient set of vectors to build a subspace, we defined a basis of a subspace of $\RR^n$
6. examples of bases:
1. the trivial subspace does not have a basis
2. the standard basis of $\RR^n$ (which we have met before; it is the $n$ vectors $$\vec{e}_1=\begin{pmatrix}1\\0\\0\\\vdots\\0\\0\end{pmatrix},\vec{e}_2=\begin{pmatrix}0\\1\\0\\\vdots\\0\\0\end{pmatrix},\dots,\vec{e}_n=\begin{pmatrix}0\\0\\0\\\vdots\\0\\1\end{pmatrix},$$ where $\vec{e}_j$ is the vector in $\RR^n$ which has a $1$ in the $j^\text{th}$ component and $0$'s everywhere else; the dimension $n$ is not part of the notation $\vec{e}_j$, it must be understood from context) is, as the name suggests, a basis.
• Maxiquiz 3 was handed out today, and you can hand it in on Monday, if you like, but it is sufficiently similar to recent homework problems that skipping that would not be a problem.

• M:
• Read: §3.6
• Content:
1. going over Maxiquiz 4 and HW4 in some detail
2. defining a linear transformation
• Miniquiz 12
• T: • [Re]Read: §3.6
• Content:
1. examples of linear transformations:
• the trivial transformation which sends every input to $\vec{0}$.
• the identity transformation $T:\RR^n\to\RR^n$ in any dimension $n\in\NN$
• linear transformations $T:\RR^1\to\RR^1$:
• not the same thing as functions whose graphs are a line
• any such $T$ given by $T((x))=(m\cdot x)$, where $m\in\RR$, (thinking of $(x)$ as a vector of length one) is indeed a linear transformation — so the problem with the general function whose graph is a line is its $y$-intercept
• in fact, all linear transformations $T:\RR^1\to\RR^1$ are given by such $T((x))=(m\cdot x)$, where $m\in\RR$. $m=1$ gives the identity transformation, $m=0$ gives the trivial transformation
• linear transformations $T:\RR^2\to\RR^2$:
• thinking about the geometry made us think that rotations, reflections, and dilations should be linear transformations; unclear about translations
• by looking at what happened to unit vectors on the axes and assuming the rotation was linear and so could be extended to all other vectors using linear combinations, we got the matrix $R_\theta=\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix}$ to represent a counterclockwise rotation of $\RR^2$ by the angle $\theta$
• similarly, a reflection of $\RR^2$ across the $y$-axis is represented by the matrix $F_y=\begin{pmatrix}-1&0\\ 0&1\end{pmatrix}$ and across the $x$-axis by $F_x=\begin{pmatrix}1&0\\ 0&-1\end{pmatrix}$ .
• dilation by scale factor $k$ is represented by the matrix $D_k=\begin{pmatrix}k&0\\ 0&k\end{pmatrix}$ .
• in each case, when we say a linear transformation is "represented by a matrix", this is in the sense of the next theorem
• Theorem If $A$ is an $m\times n$ matrix for some $m,n\in\NN$, then the map $L_A:\RR^n\to\RR^m$ given by $L_A(\vec{x})=A\cdot\vec{x}$ is a linear transformation.
• It is left to see if the converse of that theorem is true, and how composition of linear transformations can be described in terms of their matrices.
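As a quick numerical sanity check on the rotation matrix (my own sketch; the function names are illustrative): a counterclockwise quarter turn should send $\vec{e}_1=(1,0)$ to $\vec{e}_2=(0,1)$.

```python
import math

def rot(theta):
    """Matrix of the counterclockwise rotation of R^2 by theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(A, v):
    """Matrix-vector product A*v for a matrix stored as a list of rows."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# A quarter turn sends e1 = (1,0) to e2 = (0,1).
image = matvec(rot(math.pi / 2), [1, 0])
print([round(x, 12) for x in image])  # [0.0, 1.0]
```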
• Journal 4 (on everything up to and including §3.5) is due today
• Miniquiz 13
• W:
• [Re]Read: §3.6
• Content:
1. a Proposition: If $f:\RR^n\to\RR^m$ is a linear transformation, then $f(\vec{0})=\vec{0}$.
2. as a consequence, a translation — the map $T_\vec{a}:\RR^2\to\RR^2$ given by $T_\vec{a}(\vec{x})=\vec{x}+\vec{a}$ for some fixed $\vec{a}\in\RR^2$ — is not a linear transformation
3. more examples of linear transformations: projections
4. a linear transformation coming from matrix multiplication — what the book (and almost no one else) calls a "matrix transformation"
5. a linear transformation is determined by what it does to the standard basis
6. the matrix $[T]$ of a linear transformation $T$, with examples:
• projection onto the $x$-axis in $\RR^2$ has matrix $\begin{pmatrix}1&0\\ 0&0\end{pmatrix}$ .
7. the composition of linear transformations has matrix which is the product of the matrices of the constituent transformations
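Item 7 can be checked on a small example of my own, combining the projection above with the reflection from last class: applying the transformations one after the other gives the same vector as multiplying by the product of their matrices.

```python
def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

Fy = [[-1, 0], [0, 1]]   # reflection across the y-axis
Px = [[1, 0], [0, 0]]    # projection onto the x-axis

v = [3, 5]
# Applying Px after Fy, one step at a time ...
step_by_step = matvec(Px, matvec(Fy, v))
# ... equals multiplying by the single product matrix [Px][Fy].
all_at_once = matvec(matmul(Px, Fy), v)
print(step_by_step == all_at_once)  # True
```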
• Miniquiz 14
• F: • [Re]Read: §3.6
• Content:
1. repeating the Theorem that the composition of linear transformations is also a linear transformation and its associated matrix is the product (in the appropriate order) of the matrices associated to the constituent transformations
2. proved the Proposition that given two linear transformations $S,T:\RR^n\to\RR^m$, the transformation $S+T$, defined as $(S+T)(\vec{v})=S(\vec{v})+T(\vec{v})$, is also a linear transformation
3. domain, codomain, and range of a linear transformation
4. recall the definition of onto (that the range equals the codomain); examples of linear transformations we know that are either onto or not
5. recall the definition of 1-1 (that $T(\vec{u})=T(\vec{v})$ happens only when $\vec{u}=\vec{v}$)
6. Theorem: A linear transformation is 1-1 if and only if its associated matrix has nullity zero. In symbols: a linear transformation $T:\RR^n\to\RR^m$ is 1-1 iff $\operatorname{nullity}([T])=0$.
7. Proposition: A linear transformation $T:\RR^n\to\RR^m$ cannot be 1-1 if $n>m$.
8. Corollary: If a linear transformation $T:\RR^n\to\RR^m$ is invertible, then $n=m$.
• Maxiquiz 5 today
• Hand in HW5: 3.6.4, 3.6.8, 3.6.44, 3.6.54

• M: • Read: §4.2
• Content:
1. going over Maxiquiz 5 and HW5
2. review of some of the parts of the Fundamental Theorem of Invertible Matrices, particularly the fact that a matrix $A$ is invertible iff its RREF form is the $n\times n$ identity.
3. the inverse of a linear/matrix transformation
4. definition of the determinant
• for $1\times 1$ matrices
• for $2\times 2$ matrices
• recursively for $n\times n$ matrices, where $n\ge 2$ — which is essentially Laplace's Expansion Theorem
5. facts about determinants:
• the determinant determines if a matrix is invertible: $\det(A)\neq 0\ \Leftrightarrow\ A$ is invertible.
• $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$ — which is amazing, because matrix multiplication is highly non-commutative, while multiplication of real numbers is commutative, yet $\det(\cdot)$ turns one into the other
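Both the recursive (Laplace-expansion) definition and the product rule can be checked in a few lines; the matrices below are my own examples, not from class.

```python
def det(A):
    """Determinant by Laplace expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
B = [[1, 0, 2], [0, 1, 1], [3, 0, 1]]
# det(AB) = det(A) * det(B), even though AB != BA in general.
print(det(matmul(A, B)) == det(A) * det(B))  # True
```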
• Miniquiz 15
• T: • Read: §§4.1 and 4.3
• Content:
1. properties of determinants:
• determinants for triangular matrices (induction)
• $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$ (use elementary matrices)
• $\det(A)\neq 0\ \Leftrightarrow\ A$ is invertible. (elementary matrices, again)
2. defining eigenvectors and eigenvalues
3. examples of $2\times 2$ matrices with 0, 1, and 2 distinct eigenvalues
4. Theorem: If $\lambda$ is an eigenvalue of the matrix $A$, then $\det(A-\lambda I_{n\times n})=0$.
• Journal 5 on §§3.6 & 4.2 is due today
• Hand in HW6: 4.2.53, 4.2.54, 4.2.69 (or it may be handed in tomorrow)
• Miniquiz 16
• W:
• Hand in HW6 if you did not do so yesterday
• [Re]Read: §4.3 and Read: §4.4
• Content:
1. defining eigenspaces
2. an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$.
3. the characteristic polynomial/equation of an $n\times n$ matrix
4. the algebraic multiplicity of an eigenvalue
5. the geometric multiplicity of an eigenvalue
6. examples of eigenspaces and multiplicities of eigenvalues
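For a $2\times 2$ matrix $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$ the characteristic polynomial works out to $t^2-(a+d)t+(ad-bc)$, so the real eigenvalues come from the quadratic formula. A sketch of my own (the matrices are illustrative examples):

```python
import math

def eigenvalues_2x2(A):
    """Real roots of the characteristic polynomial of a 2x2 matrix A."""
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    disc = trace ** 2 - 4 * det      # discriminant of t^2 - trace*t + det
    if disc < 0:
        return []                    # no real eigenvalues
    r = math.sqrt(disc)
    return sorted({(trace - r) / 2, (trace + r) / 2})

print(eigenvalues_2x2([[2, 1], [1, 2]]))   # [1.0, 3.0]  (two distinct eigenvalues)
print(eigenvalues_2x2([[0, -1], [1, 0]]))  # []  (a rotation: no real eigenvalues)
print(eigenvalues_2x2([[1, 1], [0, 1]]))   # [1.0]  (one repeated eigenvalue)
```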
• Miniquiz 17
• F:
• [Re]Read: §4.4
• Content:
1. definition of matrix similarity
2. properties of matrix similarity:
• reflexive
• symmetric
• transitive
• ...so it's an equivalence relation
3. properties in common to similar matrices
4. diagonalizable matrices
• Maxiquiz 6 was handed out today. Please work on it entirely on your own (although you may consult your book and notes) and hand it in on Tuesday as part of Midterm I. If you missed class, contact me and I will get you a copy of this maxiquiz for you to work on at home and then to hand in on Tuesday.
• Hand in HW7: 4.1.35, 4.1.37, 4.3.20, and 4.3.21

• M: • [Re]Read: §4.4
• Content:
1. going over HW7
2. review for Midterm I; see this review sheet
3. eigenvalues of triangular (and hence, trivially, diagonal) matrices
4. if $A\sim B$ then the eigenvalues of $A$ and $B$ are the same
5. a $2\times 2$ matrix $A=\begin{pmatrix}a&b\\ 0&d\end{pmatrix}$ with $b\neq 0$ can be diagonalized — is similar to a matrix with the same diagonal entries but where the upper-right entry has been made $0$ — if $a\neq d$.
6. $\forall A,\ \det(A)=\det(A^T)$ (just compute)
• Miniquiz 18
• T:
• Midterm I in class today
• Hand in Maxiquiz 6, it will count as part of the Midterm.
• No Journal is due today, although it might be a good idea to do one on the material in §§4.1, 4.3, and (at least part of) 4.4 as part of your studying for the midterm; the next journal will be due next Tuesday.
• W:
• Content:
1. going over Midterm I
• F: • [Re]Read: §4.4
• Content:
1. linear independence of eigenvectors corresponding to distinct eigenvalues
2. building a basis for the ambient $\RR^n$ out of bases of all eigenspaces of an $n\times n$ matrix
• a $2\times 2$ example
3. The Diagonalization Theorem
4. things to notice about an invertible matrix, say called $P$:
• the columns of $P$ are a basis of $\RR^n$, call them $\vec{p}_1,\dots,\vec{p}_n$
• conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is invertible $n\times n$.
• multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j,$ again for $1\le j\le n$.
• multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j,$ for $1\le j\le n$.
5. what this has to do with diagonalization:
• if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then building the matrix $P$ with columns from this basis, it will turn out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? because $$\left(P^{-1}AP\right)\vec{e_j}=P^{-1}\left(A\left(P\vec{e_j}\right)\right)=P^{-1}\left(A\vec{p_j}\right)=P^{-1}\left(\lambda_j\vec{p_j}\right)=\lambda_j\left(P^{-1}\vec{p_j}\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]
6. so we get the Diagonalization Theorem, two versions:
• if separate bases of all the eigenspaces of an $n\times n$ matrix $A$ when put together yield a basis of all of $\RR^n$, then $A$ is diagonalizable.
• if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
7. a corollary of the Diagonalization Theorem is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
8. did examples of the practical process of diagonalizing a matrix we are given, following the above procedure
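The practical procedure in item 8 can be sketched on a small example of my own: $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$ has eigenvalues $1$ and $3$ with eigenvectors $(1,-1)$ and $(1,1)$; building $P$ from those eigenvectors, $P^{-1}AP$ should come out diagonal.

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(P):
    """Inverse of a 2x2 matrix, in exact arithmetic."""
    (a, b), (c, d) = P
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 2]]               # eigenvalues 1 and 3
P = [[1, 1], [-1, 1]]              # columns: eigenvectors (1,-1) and (1,1)
D = matmul(inv2(P), matmul(A, P))  # P^{-1} A P
print(D)                           # diagonal, with the eigenvalues on the diagonal
```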
• Hand in revised solutions to Midterm I, if you like; due to some students' scheduling conflict, you may also hand in revisions on Monday, if you prefer

• M: • Read: §5.1
• Content:
1. eigenvalues of invertible matrices
2. orthogonal and linearly independent sets of vectors in $\RR^n$
3. an orthonormal basis [ONB]
4. coordinates with respect to an ONB — computing the coefficients by dot products
5. defining orthogonal matrix and the set $O(n)$ of $n\times n$ orthogonal matrices
• Miniquiz 19
• last day to hand in revised solutions to Midterm I
• Hand in HW8: one problem (your choice) in the set 4.4.{40, 41, 42} and 4.4.47
• T:
• [Re]Read: §5.1
• Content:
1. orthogonal matrices:
• definition
• alternative characterizations
• their effect on the dot product or the norm
• the set of such is closed under products and inverses [Hence we call the set $O(n)$ of orthogonal matrices the orthogonal group.]
• determinants of orthogonal matrices
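Two of these characterizations can be checked numerically on a rotation matrix, which is orthogonal: $Q^TQ=I$ and $\|Q\vec{v}\|=\|\vec{v}\|$. A sketch of my own (angle and vector chosen arbitrarily):

```python
import math

def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

theta = 0.7
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]   # a rotation is orthogonal

# Q^T Q = I (up to floating-point rounding) ...
QtQ = matmul(transpose(Q), Q)
print(all(abs(QtQ[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(2) for j in range(2)))   # True

# ... and Q preserves the norm of every vector.
v = [3.0, 4.0]
print(abs(math.hypot(*matvec(Q, v)) - math.hypot(*v)) < 1e-12)  # True
```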
• Journal 6 on §§4.1, 4.3, & 4.4 is due today
• Miniquiz 20
• W:
• Read: §5.2
• Hand in HW9: 5.1.26, 5.1.28(a)&(b) [read but do not do 5.1.28(c)&(d)]
• Content:
1. proved that given vectors $\vec{v}_1,\dots,\vec{v}_k,\vec{w}_1,\dots,\vec{w}_\ell\in\RR^n$ such that $\vec{v}_i\perp\vec{w}_j$ $\forall i=1,\dots,k$, $\forall j=1,\dots,\ell$ (i.e., all the $\vec{v}$'s are orthogonal to all the $\vec{w}$'s), then $\Span(\vec{v}_1,\dots,\vec{v}_k)\perp\Span(\vec{w}_1,\dots,\vec{w}_\ell)$
2. defining orthogonal complement $W^\perp$ (pronounced "W-perp") of a subspace $W\subseteq\RR^n$
3. proved that for any subspace $W$ of $\RR^n$, $W^\perp$ is also a subspace of $\RR^n$
4. for a matrix $A$: perps of column- and row-spaces, nullspaces and nullspaces of $A^T$ (stated without proofs)
• Miniquiz 21
• F:
• [Re]Read: §5.2
• Content:
1. orthogonal projections
2. the Orthogonal Decomposition Theorem
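In its simplest case, projection onto a line, the Orthogonal Decomposition Theorem says $\vec{v}=\operatorname{proj}_{\vec{w}}(\vec{v})+\vec{v}^\perp$ with the two pieces orthogonal. A sketch of my own (vectors chosen for easy arithmetic):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(v, w):
    """Orthogonal projection of v onto the line spanned by w."""
    c = dot(v, w) / dot(w, w)
    return [c * wi for wi in w]

v, w = [3.0, 4.0], [1.0, 0.0]
p = proj(v, w)                        # the component of v along w
perp = [a - b for a, b in zip(v, p)]  # the leftover component
print(p, perp)                        # [3.0, 0.0] [0.0, 4.0]
print(dot(perp, w) == 0.0)            # the two pieces are orthogonal: True
```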
• Maxiquiz 7 today
• Note: HW10, which was previously to be due today, has been delayed and changed: it will be due next Monday, and consists of a (slightly) different set of problems

• M:
• T: • Read: §6.1
• Content:
1. going over Maxiquiz 8 and HW11
2. defining [abstract] vector space
3. starting examples (and non-examples) of vector spaces:
• $\RR^n$ with the usual vector addition and scalar multiplication
• $\RR^2$ with modified vector addition(s) is often not a vector space
• Miniquiz 25
• Hand in HW12: 5.4.16, 5.5.38, 5.5.54
• Journal 8 on §§5.4 & 5.5 (only pp. 425-432 "Quadratic Forms" in §5.5) is due today
• W: • [Re]Read: §6.1
• Content:
1. some algebraic (arithmetic?) properties in vector spaces which are consequences of their definition, e.g.:
• in any vector space $V$, $0\vec{u}=\vec{0}\ \forall \vec{u}\in V$
• in any vector space $V$, $(-1)\vec{u}=-\vec{u}\ \forall \vec{u}\in V$
• in any vector space $V$, $\alpha\vec{0}=\vec{0}\ \forall \alpha\in\RR$
2. more examples of vector spaces:
• the trivial vector space $\{\vec{0}\}$
• spaces of functions, such as:
• $\Ff(\RR)$ — the space of all functions on the real line $\RR$
• $C(\RR)$ — the space of continuous functions on the real line $\RR$
• $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
• $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
• $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
• $\Pp$ — the space of all polynomials in one variable
all with the pointwise addition of functions and scalar multiplications on functions
• $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices.
• Miniquiz 26
• F: • [Re]Read: §6.1
• Content:
1. more discussion of the vector spaces of functions we defined last class; these form a chain of subspaces:
$\qquad\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$
2. defining [vector] subspace
3. examples of subspaces
4. how to check if something is a subspace (it is non-empty and closed under vector addition and scalar multiplication)
5. defining Span in an abstract vector space
• Maxiquiz 9 today

• M: • Content:
1. going over Midterm II
• T:
• Read: §6.5
• Content:
1. the kernel of a linear transformation
2. the range of a linear transformation
3. one-to-one (or 1-1 or injective)
4. onto (or surjective)
5. inverses of linear transformations
• Hand in revised solutions to Midterm II, if you like
• NOTE: no Journal is due today.
• W:
• [Re]Read: §6.5
• Content:
1. kernels and ranges are always vector subspaces
2. rank and nullity (again)
3. The Rank-Nullity Theorem (again)
4. isomorphisms and isomorphic
5. a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic
• Miniquiz 32
• F:
• Read: §6.6
• Content:
1. defining the matrix of a linear transformation with respect to bases of its domain and codomain
2. the matrix of a composition of linear transformations
3. the matrix of the inverse of a linear transformation
4. matrices of endomorphisms of vector spaces and similarity using the change of basis matrix
• Hand in HW16: 6.4.32, 6.5.27, & 6.5.34
• Maxiquiz 11 today

• Thanksgiving Break! No classes or office hours. You can contact me by e-mail if you want, though.
• If you are unhappy with your recent quiz or test scores, it would be good to do additional problems from the book sections we've covered.
• But please catch up on any old homeworks or re-dos of old assignments, to hand in on Monday after the break.
• Also, do not forget that Journal 11 is due the first day after break.