Math 307 — Introduction to Linear Algebra

Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

In the following, all sections and page numbers refer to the required course
textbook, *Linear Algebra: A Modern Introduction* (3rd edition), by David Poole.

Also in the following, the video icon means that class was videoed that day and the recording can be seen through the Blackboard page for this class. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day.

This schedule will change **very frequently**; please check it at
least every class day, and before starting work on any assignment (in case the
content of the assignment has changed).

*M:* *Content:*
- bureaucracy and introductions
- what is Linear Algebra (the study of vector spaces and linear transformations...)
- why do we do so much abstraction and formality at this point of the mathematics curriculum: what abstraction is good for
- an example of an application of linear algebra: the \$282 billion eigenvalue problem: **Google**'s **PageRank** algorithm
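For the curious, here is a tiny, purely illustrative Python sketch of the PageRank idea, using power iteration on a made-up three-page web; the link matrix and the damping factor 0.85 are our own toy choices, not Google's actual data:

```python
# Toy PageRank via power iteration (a hypothetical 3-page web).
# Assumptions: damping factor 0.85 and a column-stochastic link matrix L,
# where L[i][j] is the probability of following a link from page j to page i.

d = 0.85
L = [
    [0.0, 0.5, 1.0],   # page 0 is linked from pages 1 and 2
    [0.5, 0.0, 0.0],   # page 1 is linked from page 0
    [0.5, 0.5, 0.0],   # page 2 is linked from pages 0 and 1
]
n = len(L)
r = [1.0 / n] * n      # start with a uniform rank vector

for _ in range(100):   # iterate r <- d*L*r + (1-d)/n until it stabilizes
    r = [d * sum(L[i][j] * r[j] for j in range(n)) + (1 - d) / n
         for i in range(n)]

# The limit r is (up to scaling) an eigenvector, with eigenvalue 1, of the
# "Google matrix" built from L; its entries rank the pages.
print([round(x, 4) for x in r])
```

The iteration converges because the relevant eigenvalue 1 dominates all the others; that is exactly the "eigenvalue problem" referred to above.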

**Miniquiz 0**

*T:* **Read:** *To the Student, pp. xxv & xxvi*, and §§ 1.1 & 1.2. **Miniquiz 1**. *Journals* are **not due** today; it's a bit too early in the term. (The first *Journal* entry will be due next Tuesday.) *Content:*
- logical and basic set-theoretic terminology/notation
- some basic sets of numbers
  - **natural numbers** $\NN$, **integers** $\ZZ$, **rationals** $\QQ$, **real numbers** $\RR$

- starting good definitional style, including
  - **all** variables must be "bound"
  - clearly identify the symbol and/or terminology being defined
  - clearly identify the type of object being defined

- **vectors in $\RR^n$**, **vector addition**, **scalar multiplication**
- the **dot product**, **norms**, **linear combinations of vectors**
- some basic properties:
- of vector arithmetic
- of dot products and norms
- the triangle inequality
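These vector operations are easy to experiment with on a computer; here is a small, unofficial Python sketch (vectors as plain lists of floats; the helper names are ours, not the book's):

```python
import math

# Minimal sketches of the vector operations above.

def dot(u, v):
    """Dot product u . v = sum of componentwise products."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    """Euclidean norm ||u|| = sqrt(u . u)."""
    return math.sqrt(dot(u, u))

def add(u, v):
    """Componentwise vector addition."""
    return [ui + vi for ui, vi in zip(u, v)]

u, v = [1.0, 2.0, 2.0], [3.0, 0.0, 4.0]
# the triangle inequality: ||u + v|| <= ||u|| + ||v||
assert norm(add(u, v)) <= norm(u) + norm(v)
print(dot(u, v), norm(u), norm(v))
```

Such a check is of course evidence for a property, never a proof of it; the proofs are what the class is about.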

- Do **HW0:** *Send me e-mail* (to jonathan.poritz@gmail.com) *telling me*:
- Your name.
- Your e-mail address. (Please give me one that you actually check fairly frequently, since I may use it to contact you during the term.)
- Your year/program/major at CSUP.
- The reason you are taking this course.
- What you intend to do after CSUP, in so far as you have an idea.
- Past math classes you've had.
- Other math and science classes you are taking this term, and others you intend to take in coming terms.
- Your favorite mathematical subject.
- Your favorite mathematical result/theorem/technique/example/problem.
- Anything else you think I should know (disabilities, employment or other things that take a lot of time, *etc.*)
- [Optional:] If you were going to be trapped on a desert island alone for ten years, what music would you like to have?

*W:* **[Re]Read:** §§ 1.1 & 1.2. **Miniquiz 2**. *Content:*
- the **Cauchy-Schwarz Inequality**
- **angles between vectors**, **orthogonal** vectors, **unit vectors**
- **distances between vectors**
- the **projection of one vector onto another**
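Here is a small, unofficial Python sketch of the projection and the Cauchy-Schwarz inequality (the helper names and example vectors are ours, not the book's):

```python
import math

# Projection of u onto v, and a Cauchy-Schwarz check.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(u, v):
    """Projection of u onto v: (u.v / v.v) v."""
    c = dot(u, v) / dot(v, v)
    return [c * x for x in v]

u, v = [2.0, 3.0], [4.0, 0.0]
p = proj(u, v)                      # projection of u onto the x-axis
# Cauchy-Schwarz: |u.v| <= ||u|| ||v||
assert abs(dot(u, v)) <= math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))
# the difference u - p is orthogonal to v
r = [a - b for a, b in zip(u, p)]
assert abs(dot(r, v)) < 1e-12
print(p)
```

The last assertion is a numerical instance of the orthogonality fact proved in class on Friday.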


*F:* **[Re]Read:** §§ 1.1-1.3 & 2.1. *Content:*
- proof that the difference between a vector and its projection onto another vector is orthogonal to that second vector
- **quantifiers** $\forall$ and $\exists$
- **[systems of] linear equations**
- **solutions of linear systems**, the **solution set** and its structure
- **[in]consistent linear system**
- the **coefficient** and **augmented matrices** of a linear system

- **Maxiquiz 1**
- **Today [Friday] is the last day to add classes.**

*M:*
- Yes, **we do have class today**, even though it is the federal Labor Day holiday.
- **[Re]Read:** §2.1 and **Read** §2.3
- *Content:*
- going over *Maxiquiz 1*
- notation $A\Rightarrow B$ meaning "if $A$ then $B$" and $A\Leftrightarrow B$ meaning "$A$ if and only if $B$", also written "$A$ iff $B$"
- **[in]homogeneous linear system**
- relationship between a linear system being *homogeneous* and being *consistent*
- **linear combination** [again]
- **Span**, a few basic examples and properties (*e.g.,* the span of a single vector is the line along that vector)
- (size of) solution sets of a linear system: empty, a single vector, or an infinite number of vectors
- result that a linear system is consistent iff the vector consisting of the right-hand-side constant values is in the span of the columns of the coefficient matrix

- Hand in **HW1:** 1.2.62, 2.2.44, 2.2.47
- **Miniquiz 3**

*T:* **[Re]Read:** §2.3. *Content:*
- **[non]trivial**, for both *linear combinations* and *solutions of a linear system*
- some elementary facts about $\Span$, such as that each of the vectors $\vec{v}_1,\dots,\vec{v}_k$, and also the zero vector $\vec{0}$, is in the set $\Span(\vec{v}_1,\dots,\vec{v}_k)$
- **linearly [in]dependent** vectors — note it is important that the scalars in the definition are *not all zero*
- examples of linearly [in]dependent vectors, including:
- the set $\{\vec{0}\}$ is linearly dependent
- if $\{\vec{u},\vec{v}\}$ is a linearly dependent set, then $\vec{v} = \alpha\,\vec{u}$ for some $\alpha\in\RR$.
- if $\{\vec{v_1},\dots,\vec{v_k}\}$ is a linearly dependent set, then one of the $\vec{v_j}$ is a linear combination of the remaining vectors.
- proof (*by contradiction*) that if $\{\vec{v_1},\dots,\vec{v_k}\}$ is a linearly independent set, then any subset is also linearly independent.

- **Miniquiz 4**
- hand in **Journal 1** on material of Chapter 1 and the first part of Chapter 2

*W:* **[Re]Read:** §2.3. *Content:*
- more on proofs **by contradiction**
- general remarks
- a very classical example: Euclid's proof of the infinitude of the primes

- revisit *Span* a bit
  - it is a *set*
  - ... so work at the level of sets: "anything you can do, I can do *meta*," as Daniel Dennett said to Douglas Hofstadter.

- the relationship of **linear [in]dependence** to the linear system whose coefficient matrix has as columns the vectors under consideration: there will be a non-trivial solution of the homogeneous linear system with that coefficient matrix if and only if the vectors are linearly dependent

**Miniquiz 5**

*F:* **Read:** §2.3. *Content:* **rank**, **free variables**, **row reduction**, *etc.*

- Hand in **HW2:** 2.3.44 and Chapter 2 Review problem 18 (on *p. 141*)
- **Maxiquiz 2 today**
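Row reduction is also easy to automate; here is a small, unofficial Python sketch of RREF and rank, using exact arithmetic with fractions (the example matrix is our own, not from the book):

```python
# Row reduction to reduced row echelon form (RREF), and the rank.
from fractions import Fraction

def rref(M):
    """Return the RREF of M (a list of rows), using exact fractions."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for c in range(cols):
        # find a row at/below pivot_row with a nonzero entry in column c
        r = next((i for i in range(pivot_row, rows) if A[i][c] != 0), None)
        if r is None:
            continue                     # no pivot in this column
        A[pivot_row], A[r] = A[r], A[pivot_row]
        A[pivot_row] = [x / A[pivot_row][c] for x in A[pivot_row]]
        for i in range(rows):            # clear column c in all other rows
            if i != pivot_row and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[pivot_row])]
        pivot_row += 1
    return A

def rank(M):
    """Rank = number of nonzero rows of the RREF."""
    return sum(1 for row in rref(M) if any(x != 0 for x in row))

A = [[1, 2, 3], [2, 4, 6], [1, 0, 1]]    # second row is twice the first
print(rref(A), rank(A))
```

Notice that the second row, being a multiple of the first, gets zeroed out, so the rank is 2, not 3.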

*M:* **[Re]Read:** §2.3. *Content:*
- going over *Maxiquiz 2* — the moral was: *write down the definitions, it's often enough!*
- **rank**
- a term is **well-defined** if any choices made in its definition do not affect the resulting object
- *rank* is well defined because matrices always do have an RREF form, and that form is unique; they also always do have an REF form, but that one is not unique
- necessary linear dependence of $k$ vectors in $\RR^n$ if $k>n$

- **Miniquiz 6**
- **Today [Monday] is the last day to drop classes without a grade being recorded**

*T:* **Read:** §§3.1 & 3.2. *Content:*
- **context** and **type** for definitions ... see the handout on definitions for more on this theme
- going over recent miniquizzes and homework, noticed:
- It often pays to write a scratch version of a proof before putting down the official version on a quiz or HW.
- Students are sometimes (often?) forgetting to do both directions of the proof of an "if and only if" statement.
- When you are trying to prove that two sets are equal, $S=T$,
often the best way is to take a general element $s\in S$, write
it down specifically (yes, a specific form of a general
element!), and show that this particular element must also be in
$T$, thereby showing all of $S$ is in $T$,
*i.e.,*$S\subseteq T$. Then you turn around and do the same thing for an element of $T$, getting $T\subseteq S$. It then follows that $S=T$.

- the **converse** of $P\Rightarrow Q$, which is $Q\Rightarrow P$ (or, equivalently, $\neg P\Rightarrow\neg Q$)
- the **contrapositive** of $P\Rightarrow Q$, which is $\neg Q\Rightarrow\neg P$
- if $P\Rightarrow Q$ is true, the converse may or may not be true
- an "if-then" statement is true if and only if its contrapositive is true
- definition of the **inverse** of a matrix, and what it means for a matrix to be **invertible**
- We talked about the method of **proof by induction**, or proofs using the **Principle of Mathematical Induction**, which goes like this:
  - It only applies to theorems of the specific form "$\forall n\in\NN\ S(n)$ is true," where $S(n)$ is a mathematical statement which depends upon a natural number parameter $n$.
  - First one proves that $S(1)$ is true; this is called the **base case**.
  - Then one proves "If $S(n)$, then $S(n+1)$"; this is called the **inductive step** and, during the proof of this step, when one invokes the hypothesis $S(n)$, one calls it the **inductive hypothesis**.
  - One declares the theorem proven **by induction** (and goes home happy).

- an example of an inductive proof: showing that $\sum_{j=1}^n j = \frac{n(n+1)}{2}$.
- another example: proving that $\forall n\in\NN$, if $A_1,\dots,A_n$ are invertible matrices, then $\left(A_1\cdot\dots\cdot A_n\right)^{-1}=A_n^{-1}\cdot\dots\cdot A_1^{-1}$.
- ended class with the inductive proof that all pigs are yellow. *[See if you can find the flaw in that proof.]*
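A computer check of the first induction example — which is evidence, not a proof! — takes only a few lines of Python:

```python
# Check the summation formula 1 + 2 + ... + n = n(n+1)/2 for many n.
# (This verifies finitely many cases; only induction proves it for all n.)

def sum_formula_ok(n):
    return sum(range(1, n + 1)) == n * (n + 1) // 2

assert all(sum_formula_ok(n) for n in range(1, 101))
print("formula holds for n = 1, ..., 100")
```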

- Hand in **Journal 2**
- **Miniquiz 7** (handed out)

*W:* **[Re]Read:** §§3.1-3.3. *Content:*
- recall from Math 207 and your reading of the book the basic terminology:
- **matrices**
- matrix addition
- scalar multiplication with matrices
- matrix multiplication
- the **identity matrix**
- matrix algebra

- in class we recalled the definition of **transpose**
- **[skew-]symmetric** matrices, properties:
  - symmetric and skew-symmetric matrices must be square
  - a skew-symmetric matrix always has zeros on the diagonal

- some more *logic:*
  - the details of an **if-then** statement, *i.e.,* one in the form $P\Rightarrow Q$
  - the **converse** of $P\Rightarrow Q$ (which is $Q\Rightarrow P$)
  - the **contrapositive** of $P\Rightarrow Q$ (which is $\neg Q\Rightarrow\neg P$)
  - the **negation** of $P\Rightarrow Q$ (which is $P\land\neg Q$)
  - if $P\Rightarrow Q$ is true, the converse may or may not be true
  - an "if-then" statement is true if and only if its contrapositive is true
  - the negation of a statement of the form $\forall x\ P(x)$ is $\exists x\ \neg P(x)$
  - the negation of a statement of the form $\exists x\ P(x)$ is $\forall x\ \neg P(x)$

- gave the book's definition of a **subspace** of $\RR^n$: it is a subset $S\subseteq\RR^n$ satisfying the properties:
  - $\vec{0}\in S$
  - $\forall \vec{u},\vec{v}\in S\ \ \vec{u}+\vec{v}\in S$
  - $\forall \vec{u}\in S,\forall\alpha\in\RR\ \ \alpha\vec{u}\in S$

- examples of subspaces of $\RR^2$ and $\RR^3$:
  - the **trivial subspace** $\left\{\vec{0}\right\}$
  - any line through the origin
  - any plane through the origin
  - the whole thing ($\RR^2$ or $\RR^3$ as a subspace of itself)


**Miniquiz 8**

*F:* **Read:** §3.5
- Hand in **HW3:** 3.1.37, 3.2.33, and 3.3.46
- *Content:*
- going over some recent HW and miniquizzes
- discussed the flaw with the induction proof of the (false) theorem that *all pigs are yellow*: the inductive step did **not** prove "$\forall n\in\NN\ S(n)\Rightarrow S(n+1)$", because it did not work unless $n\ge 3$; in effect it proved "$\forall n\in\NN,\ n\ge 3\land S(n)\Rightarrow S(n+1)$". That different inductive step would have proven the theorem "$\forall n\in\NN,\ n\ge 3,\ S(n)$" if we had used the base case $S(3)$. Instead, using the base case $S(1)$, as we did, the induction doesn't even get started and no theorem at all is proven.
- thinking more about *subspaces of $\RR^n$*... Notice that the first part of the book's definition is actually not necessary as long as the subset $S$ is non-empty: if it contains any vector $\vec{u}$ at all, then it contains $0\cdot\vec{u}=\vec{0}$ as well. And the second and third parts of the book's definition together say that *subspaces are closed under linear combinations*.

  So here is another, equivalent definition of what it means for a subset $S\subseteq\RR^n$ to be a **subspace**:
  - $S\neq\emptyset$ (remember, $\emptyset$ is the notation for the *empty set*)
  - $\forall \vec{u},\vec{v}\in S,\forall\alpha,\beta\in\RR\ \ \alpha\vec{u}+\beta\vec{v}\in S$

- building the list of examples of subspaces of $\RR^n$:
  - the **trivial subspace** $\left\{\vec{0}\right\}$
  - any line through the origin
  - any plane through the origin
  - a *hyperplane* through the origin in higher dimensions (*e.g.,* the set of vectors $\begin{pmatrix}x_1\\x_2\\x_3\\x_4\end{pmatrix}\in\RR^4$ with components satisfying $a\,x_1+b\,x_2+c\,x_3+d\,x_4=0$, where $a,b,c,d\in\RR$ are not all zero, is a "three-dimensional hyperplane" in $\RR^4$; think of it as a linear system with only one equation: it has three free variables, so three parameters are needed to specify a point on this hyperplane)
  - $\RR^n$ is itself a subspace of $\RR^n$. (Note that any subspace other than this one, so any subspace of $\RR^n$ which is not all of $\RR^n$, is called a **proper** subspace.)
  - Spans are subspaces: $\forall n,k\in\NN$ and $\forall\vec{v}_1,\dots,\vec{v}_k\in\RR^n$, $\Span(\vec{v}_1,\dots,\vec{v}_k)$ is a subspace of $\RR^n$

- so the span of a bunch of vectors is a subspace, but it need not be an *efficient* way to describe that subspace. Looking for a way to characterize an efficient set of vectors to build a subspace, we defined a **basis** of a subspace of $\RR^n$
- examples of bases:
  - the trivial subspace *does not have a basis*
  - the **standard basis of $\RR^n$** (which we have met before; it is the $n$ vectors $$\vec{e}_1=\begin{pmatrix}1\\0\\0\\\vdots\\0\\0\end{pmatrix},\vec{e}_2=\begin{pmatrix}0\\1\\0\\\vdots\\0\\0\end{pmatrix},\dots,\vec{e}_n=\begin{pmatrix}0\\0\\0\\\vdots\\0\\1\end{pmatrix},$$ where $\vec{e}_j$ is the vector in $\RR^n$ which has a $1$ in the $j^\text{th}$ component and $0$'s everywhere else; the dimension $n$ is not part of the notation $\vec{e}_j$, it must be understood from context) is, as the name suggests, a basis.


**Maxiquiz 3** was handed out today; you can hand it in on Monday, if you like, but it is sufficiently similar to recent homework problems that skipping it would not be a problem.

*M:* **[Re]Read:** §3.5. *Content:*
- proved the **Theorem:** Given invertible matrices $A$ and $B$, the product $A\,B$ is invertible, and $(A\,B)^{-1}=B^{-1}\,A^{-1}$.
- discussion of how thinking up and writing down proofs is a serious new skill we are working on in this class, so we should definitely not expect it to be terrifically easy or to come quickly; also, it is a very good idea to *read the proofs in the book*: they will be a source of inspiration for your own proofs in the future.
- going over *HW3* — some of the morals were:
  - "repeat until terms are gone..." really means *induction*, which would be better to write out explicitly
  - when you do induction, you must:
- clearly write down the statement $S(n)$ over which you are inducting -- the hint will be if the thing to be proven has the structure $\forall n\in\NN\ S(n)$.
- make sure you do a reasonable base case
- clearly enunciate the logic of each step, with lots of
*transition words/phrases*

  - *be careful* with induction; if you are not, you can make mistakes — *e.g.,* the famous proof that all pigs are yellow [which is unfortunately incorrect!] — so your best bet is to stick carefully to the standard structure of induction proofs and to check over each step very carefully

- recalling the idea of a *basis* of a subspace of $\RR^n$, we noticed that a subspace which is a line (through the origin, of course) has many bases — any single non-zero vector on the line will be a basis.
- likewise, a subspace consisting of a plane (through the origin) in $\RR^n$ has many bases: any two non-zero vectors in the plane which are not multiples of each other
- therefore bases are very much *not uniquely determined by their subspaces* ... but the number of vectors in a basis does seem to be uniquely determined by the subspace; hence we define the **dimension** of a subspace of $\RR^n$
- *"Dimension" is well-defined,* says **The Basis Theorem:** Given a subspace $S$ of $\RR^n$, any two bases of $S$ have the same number of vectors.
- Started the proof of *The Basis Theorem*.

**Miniquiz 9**

*T:* **[Re]Read:** §3.5. *Content:*
- mentioned again that good definitions have all the components described in this guide
- finished the proof of *The Basis Theorem*.
- notice all subspaces of $\RR^n$ have either exactly one vector in them (in which case we're talking about the trivial subspace) or an infinite number of vectors
- to measure the *size* of a subspace, we defined the **dimension** of a subspace of $\RR^n$
- defined the **row space** and **column space** of a matrix

- Hand in **Journal 3**
- **Miniquiz 10**

*W:* **[Re]Read:** §3.5. *Content:*
- noticed that the previously defined *row* and *column spaces* are subspaces (of $\RR^n$ and $\RR^m$, respectively, for an $m\times n$ matrix)
- defined the **null space** of an $m\times n$ matrix, and proved it is a subspace of $\RR^n$
- from the *row space* comes the **rank**
- from the *null space* comes the **nullity**
- **The Rank-Nullity Theorem**
- noticed that the row space of a matrix doesn't change as we do EROs to the matrix — the column space *does* change
- a basis of the row space of a matrix will consist of the non-zero rows of its RREF form.

**Miniquiz 11**

*F:* **[Re]Read:** §3.5. *Content:*
- more on the proof of *The Rank[-Nullity] Theorem*
- finding bases of the *null space*, *column space*, and *row space*
- a part of **The Fundamental Theorem of Invertible Matrices**
- notation: **TFAE** = "the following are equivalent"; *i.e.,* the listed statements are all joined by "iff"

- Hand in **HW4:** 3.5.4, 3.5.34, 3.5.62
- **Maxiquiz 4 today**

*M:* **Read:** §3.6. *Content:*
- going over *Maxiquiz 4* and *HW4* in some detail
- defining a **linear transformation**

**Miniquiz 12**

*T:* **[Re]Read:** §3.6. *Content:*
- examples of linear transformations:
  - the **trivial transformation**, which sends every input to $\vec{0}$
  - the **identity transformation** $T:\RR^n\to\RR^n$, in any dimension $n\in\NN$
  - linear transformations $T:\RR^1\to\RR^1$: *not* the same thing as functions whose graphs are a line
    - any such $T$ given by $T((x))=(m\cdot x)$, where $m\in\RR$ (thinking of $(x)$ as a vector of length one), is indeed a linear transformation — so the problem with the general function whose graph is a line is its $y$-intercept
    - in fact, *all* linear transformations $T:\RR^1\to\RR^1$ are given by such $T((x))=(m\cdot x)$, where $m\in\RR$; $m=1$ gives the identity transformation, $m=0$ gives the trivial transformation

- linear transformations $T:\RR^1\to\RR^1$:
- thinking about the geometry made us think that *rotations*, *reflections*, and *dilations* should be linear transformations; unclear about *translations*
- by looking at what happens to the unit vectors on the axes, and assuming the rotation is linear and so can be extended to all other vectors using linear combinations, we got the matrix $R_\theta=\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix}$ to represent a counterclockwise rotation of $\RR^2$ by the angle $\theta$
- similarly, a reflection of $\RR^2$ across the $y$-axis is represented by the matrix $F_y=\begin{pmatrix}-1&0\\ 0&1\end{pmatrix}$, and across the $x$-axis by $F_x=\begin{pmatrix}1&0\\ 0&-1\end{pmatrix}$.
- dilation by scale factor $k$ is represented by the matrix $D_k=\begin{pmatrix}k&0\\ 0&k\end{pmatrix}$.
- in each case, when we say a linear transformation is "represented by a matrix", this is in the sense of the next theorem
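An unofficial Python check of these geometric matrices (the angle and the test vectors are our own choices):

```python
import math

# Apply the rotation and reflection matrices to concrete vectors.

def matvec(M, x):
    """Matrix-vector product M x."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

theta = math.pi / 2                      # 90 degrees counterclockwise
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
x = matvec(R, [1.0, 0.0])                # rotate e1 counterclockwise
# e1 should land (up to rounding) on e2:
assert abs(x[0]) < 1e-12 and abs(x[1] - 1.0) < 1e-12

F_y = [[-1, 0], [0, 1]]                  # reflection across the y-axis
assert matvec(F_y, [3.0, 5.0]) == [-3.0, 5.0]
```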

- **Theorem:** If $A$ is an $m\times n$ matrix for some $m,n\in\NN$, then the map $L_A:\RR^n\to\RR^m$ given by $L_A(\vec{x})=A\cdot\vec{x}$ is a linear transformation.
- It remains to see whether the converse of that theorem is true, and how composition of linear transformations can be described in terms of their matrices.


- **Journal 4** (on everything up to and including §3.5) **is due today**
- **Miniquiz 13**

*W:* **[Re]Read:** §3.6. *Content:*
- a **Proposition:** If $f:\RR^n\to\RR^m$ is a linear transformation, then $f(\vec{0})=\vec{0}$.
- as a consequence, a *translation* — the map $T_\vec{a}:\RR^2\to\RR^2$ given by $T_\vec{a}(\vec{x})=\vec{x}+\vec{a}$ for some fixed non-zero $\vec{a}\in\RR^2$ — is *not* a linear transformation
- more examples of linear transformations: **projections**
- a linear transformation coming from matrix multiplication — what the book (and almost no one else) calls a "matrix transformation"
- a linear transformation is determined by what it does to the standard basis
- the **matrix $[T]$ of a linear transformation $T$**, with examples:
  - projection onto the $x$-axis in $\RR^2$ has matrix $\begin{pmatrix}1&0\\ 0&0\end{pmatrix}$.

- the composition of linear transformations has matrix equal to the product of the matrices of the constituent transformations

**Miniquiz 14**

*F:* **[Re]Read:** §3.6. *Content:*
- repeating the **Theorem** that the composition of linear transformations is also a linear transformation, and its associated matrix is the product (in the appropriate order) of the matrices associated to the constituent transformations
- proved the **Proposition** that given two linear transformations $S,T:\RR^n\to\RR^m$, the transformation $S+T$, defined by $(S+T)(\vec{v})=S(\vec{v})+T(\vec{v})$, is also a linear transformation
- **domain**, **codomain**, and **range** of a linear transformation
- recall the definition of **onto** (that the range equals the codomain); examples of linear transformations we know that are either onto or not
- recall the definition of **1-1** (that $T(\vec{u})=T(\vec{v})$ happens only when $\vec{u}=\vec{v}$)
- **Theorem:** A linear transformation is 1-1 if and only if its associated matrix has nullity zero. In symbols: a linear transformation $T:\RR^n\to\RR^m$ is 1-1 iff $\operatorname{nullity}([T])=0$.
- **Proposition:** A linear transformation $T:\RR^n\to\RR^m$ cannot be 1-1 if $n>m$.
- **Corollary:** If a linear transformation $T:\RR^n\to\RR^m$ is invertible, then $n=m$.

- **Maxiquiz 5 today**
- Hand in **HW5:** 3.6.4, 3.6.8, 3.6.44, 3.6.54

*M:* **Read:** §4.2. *Content:*
- going over *Maxiquiz 5* and *HW5*
- review of some of the parts of *The Fundamental Theorem of Invertible Matrices*, particularly the fact that a matrix $A$ is invertible iff its RREF form is the $n\times n$ identity.
- the **inverse** of a linear/matrix transformation
- definition of the **determinant**
  - for $1\times 1$ matrices
  - for $2\times 2$ matrices
  - recursively for $n\times n$ matrices, where $n\ge 2$ — which is essentially **Laplace's Expansion Theorem**
- facts about determinants:
  - the determinant **determines** if a matrix is invertible: $\det(A)\neq 0\ \Leftrightarrow\ A$ is invertible.
  - $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$ — which is *amazing*, because matrix multiplication is highly non-commutative, while multiplication of real numbers is commutative, yet $\det(\cdot)$ turns one into the other


**Miniquiz 15**

*T:* **Read:** §§4.1 and 4.3. *Content:*
- properties of determinants:
  - determinants of triangular matrices (induction)
  - $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$ (use elementary matrices)
  - $\det(A)\neq 0\ \Leftrightarrow\ A$ is invertible (elementary matrices, again)
- defining **eigenvectors** and **eigenvalues**
- examples of $2\times 2$ matrices with 0, 1, and 2 distinct eigenvalues
- **Theorem:** If $\lambda$ is an eigenvalue of the $n\times n$ matrix $A$, then $\det(A-\lambda I_{n\times n})=0$.
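For a $2\times 2$ matrix this theorem lets us find the eigenvalues by solving a quadratic, since $\det(A-\lambda I)=\lambda^2-\operatorname{tr}(A)\,\lambda+\det(A)$; here is an unofficial Python sketch (the example matrix is our own):

```python
import math

# Eigenvalues of a 2x2 matrix A = [[a, b], [c, d]] from det(A - λI) = 0,
# i.e., the quadratic λ² - tr(A)λ + det(A) = 0.

a, b, c, d = 2.0, 1.0, 1.0, 2.0          # A = [[2, 1], [1, 2]]
tr, det = a + d, a * d - b * c
disc = tr * tr - 4 * det                 # discriminant (here: two real roots)
lam1 = (tr + math.sqrt(disc)) / 2
lam2 = (tr - math.sqrt(disc)) / 2
print(lam1, lam2)

# check det(A - λI) = 0 for each eigenvalue:
for lam in (lam1, lam2):
    assert abs((a - lam) * (d - lam) - b * c) < 1e-12
```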

- **Journal 5 on §§3.6 & 4.2 is due today**
- Hand in **HW6:** 4.2.53, 4.2.54, 4.2.69 (or it may be handed in tomorrow)
- **Miniquiz 16**

*W:*
- Hand in **HW6** if you did not do so yesterday
- **[Re]Read:** §4.3 and **Read:** §4.4
- *Content:*
- defining **eigenspaces**
- an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$
- the **characteristic polynomial/equation** of an $n\times n$ matrix
- the **algebraic multiplicity** of an eigenvalue
- the **geometric multiplicity** of an eigenvalue
- examples of eigenspaces and multiplicities of eigenvalues

**Miniquiz 17**

*F:* **[Re]Read:** §4.4. *Content:*
- definition of matrix **similarity**
- properties of matrix similarity:
  - **reflexive**
  - **symmetric**
  - **transitive**
  - ...so it's an **equivalence relation**
- properties in common to similar matrices
- **diagonalizable** matrices

- **Maxiquiz 6** was handed out today. Please work on it entirely on your own (although you may consult your book and notes) and hand it in on Tuesday as part of Midterm I. If you missed class, contact me and I will get you a copy of this maxiquiz to work on at home and then hand in on Tuesday.
- Hand in **HW7:** 4.1.35, 4.1.37, 4.3.20, and 4.3.21

*M:* **[Re]Read:** §4.4. *Content:*
- going over *HW7*
- review for *Midterm I*; see this review sheet
- eigenvalues of triangular (and hence, trivially, diagonal) matrices
- if $A\sim B$ then the eigenvalues of $A$ and $B$ are the same
- a $2\times 2$ matrix $A=\begin{pmatrix}a&b\\ 0&d\end{pmatrix}$ with $b\neq 0$ can be diagonalized — is similar to a matrix with the same diagonal entries but with the upper-right entry made $0$ — if $a\neq d$
- $\forall A,\ \det(A)=\det(A^T)$ (just compute)

**Miniquiz 18**

*T:*
- *Midterm I* in class today
- Hand in *Maxiquiz 6*; it will count as part of the Midterm.
- No **Journal** is due today, although it might be a good idea to do one on the material in §§4.1, 4.3, and (at least part of) 4.4 as part of your studying for the midterm; the next journal will be due next Tuesday.

*W:* *Content:*
- going over *Midterm I*


*F:* **[Re]Read:** §4.4. *Content:*
- linear independence of eigenvectors corresponding to distinct eigenvalues
- building a basis for the ambient $\RR^n$ out of bases of all
eigenspaces of an $n\times n$ matrix
- a $2\times 2$ example

- **The Diagonalization Theorem**
- things to notice about an invertible matrix, say called $P$:
  - the columns of $P$ are a basis of $\RR^n$; call them $\vec{p}_1,\dots,\vec{p}_n$
  - conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is an invertible $n\times n$ matrix.
  - multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j$, again for $1\le j\le n$.
  - multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j$, for $1\le j\le n$.

- what this has to do with *diagonalization*:
  - if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then, building the matrix $P$ with columns from this basis, it turns out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? Because $$\left(P^{-1}AP\right)\vec{e}_j=P^{-1}\left(A\left(P\vec{e}_j\right)\right)=P^{-1}\left(A\vec{p}_j\right)=P^{-1}\left(\lambda_j\vec{p}_j\right)=\lambda_j\left(P^{-1}\vec{p}_j\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]

- so we get the **Diagonalization Theorem**, two versions:
  - if separate bases of all the eigenspaces of an $n\times n$ matrix $A$, when put together, yield a basis of all of $\RR^n$, then $A$ is diagonalizable
  - if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
- a corollary of the *Diagonalization Theorem* is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
- did examples of the practical process of diagonalizing a given matrix, following the above procedure

- Hand in revised solutions to *Midterm I*, if you like; due to some students' scheduling conflicts, you may also hand in revisions on Monday, if you prefer

*M:* **Read:** §5.1. *Content:*
- eigenvalues of invertible matrices
- **orthogonal** and linearly independent sets of vectors in $\RR^n$
- an **orthonormal basis [ONB]**
- **coordinates with respect to an ONB** — computing the coefficients by dot products
- defining **orthogonal matrix** and the set $O(n)$ of $n\times n$ orthogonal matrices

- **Miniquiz 19**
- last day to hand in revised solutions to *Midterm I*
- Hand in **HW8:** one problem (your choice) from the set 4.4.{40, 41, 42} **and** 4.4.47

*T:* **[Re]Read:** §5.1. *Content:*
- **orthogonal matrices**:
  - definition
  - alternative characterizations
  - their effect on the dot product and the norm
  - the set of such matrices is closed under products and inverses *[hence we call the set $O(n)$ of orthogonal matrices the* **orthogonal group***]*
  - determinants of orthogonal matrices
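An unofficial Python check of the defining property $Q^TQ=I$, and of norm preservation, using a rotation matrix as the example (the angle and test vector are our own choices):

```python
import math

# A rotation matrix is orthogonal: its columns are an ONB of R^2.

theta = 0.7
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# compute Q^T Q entrywise and compare with the identity matrix:
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
for i in range(2):
    for j in range(2):
        assert abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12

# orthogonal matrices preserve norms:
x = [3.0, 4.0]                           # ||x|| = 5
Qx = [sum(Q[i][j] * x[j] for j in range(2)) for i in range(2)]
assert abs(math.hypot(*Qx) - 5.0) < 1e-12
```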

- **Journal 6 on §§4.1, 4.3, & 4.4 is due today**
- **Miniquiz 20**

*W:* **Read:** §5.2
- Hand in **HW9:** 5.1.26, 5.1.28(a)&(b) [read but do not do 5.1.28(c)&(d)]
- *Content:*
- proved that, given vectors $\vec{v}_1,\dots,\vec{v}_k,\vec{w}_1,\dots,\vec{w}_\ell\in\RR^n$ such that $\vec{v}_i\perp\vec{w}_j$ $\forall i=1,\dots,k$, $\forall j=1,\dots,\ell$ (*i.e.,* all the $\vec{v}$'s are orthogonal to all the $\vec{w}$'s), then $\Span(\vec{v}_1,\dots,\vec{v}_k)\perp\Span(\vec{w}_1,\dots,\vec{w}_\ell)$
- defining the **orthogonal complement** $W^\perp$ (pronounced "W-perp") of a subspace $W\subseteq\RR^n$
- proved that for any subspace $W$ of $\RR^n$, $W^\perp$ is also a subspace of $\RR^n$
- for a matrix $A$: perps of column- and row-spaces, nullspaces, and the nullspace of $A^T$ (stated without proofs)

**Miniquiz 21**

*F:* **[Re]Read:** §5.2. *Content:*
- orthogonal projections
- the **Orthogonal Decomposition Theorem**

- **Maxiquiz 7 today**
- *Note:* **HW10**, which was previously to be due today, has been delayed and changed: it will be due next Monday and consists of a (slightly) different set of problems

*M:* *Content:*
- some discussion of *Maxiquiz 7* and *HW9*
- Hand in **HW10:** 5.2.25, 5.2.27
- no miniquiz today, due to armadillos in the campus transformers

*T:* **Read:** §5.3, *pp. 399-403 only*. *Content:*
- going over *Maxiquiz 7* and *HW9*
- brief revisit of *orthogonal decomposition* and *projection*
- the **Gram-Schmidt Process**
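A minimal, unofficial Python sketch of the process (the function names are ours, not the book's):

```python
import math

# Gram-Schmidt: turn a linearly independent list of vectors into an
# orthonormal list with the same span.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same subspace."""
    basis = []
    for v in vectors:
        # subtract from v its projections onto the vectors found so far
        w = list(v)
        for q in basis:
            c = dot(v, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        n = math.sqrt(dot(w, w))         # normalize what is left
        basis.append([wi / n for wi in w])
    return basis

q1, q2 = gram_schmidt([[3.0, 1.0], [2.0, 2.0]])
assert abs(dot(q1, q2)) < 1e-12          # orthogonal
assert abs(dot(q1, q1) - 1.0) < 1e-12    # unit length
print(q1, q2)
```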

- **Journal 7 on §§5.1, 5.2, & 5.3 (only *pp. 399-403* in §5.3) is due today**
- OK to hand in **HW10** today, due to yesterday's armadillos
- **Miniquiz 22**

*W:* **Read:** §5.4. *Content:*
- defining **orthogonally diagonalizable**
- useful notation: "$\exists!x$ ..." means "**there exists a unique $x$ ...**"
- starting the **Spectral Theorem**: orthogonally diagonalizable implies symmetric
- example of orthogonally diagonalizing a symmetric matrix: the key seems to be to find an ONB of $\RR^n$ consisting of eigenvectors
- mentioned (proof next time) that symmetric implies distinct eigenspaces are orthogonal — this is the key step in the *Spectral Theorem*
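Unofficial numerical evidence for that key step, in Python (the symmetric example matrix is our own, not from the book):

```python
# For a symmetric matrix, eigenvectors belonging to distinct eigenvalues
# are orthogonal; check this on a small symmetric example.

A = [[1.0, 2.0], [2.0, 1.0]]             # symmetric matrix
v1, v2 = [1.0, 1.0], [1.0, -1.0]         # eigenvectors, eigenvalues 3 and -1

def matvec(M, x):
    """Matrix-vector product M x."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

assert matvec(A, v1) == [3.0, 3.0]       # A v1 = 3 v1
assert matvec(A, v2) == [-1.0, 1.0]      # A v2 = -1 v2
assert sum(a * b for a, b in zip(v1, v2)) == 0.0   # v1 and v2 orthogonal
```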

- OK to hand in **Journal 7** today, due to Monday's armadillos
- **Miniquiz 23**

*F:* **[Re]Read:** §5.4. *Content:*
- even more on the **Spectral Theorem**
- consequences of *symmetry* ... distinct eigenspaces are orthogonal
- example application of the *Spectral Theorem*: the sum of two orthogonally diagonalizable matrices is also orthogonally diagonalizable

- **Maxiquiz 8 today**
- Hand in **HW11:** 5.3.3, 5.4.12, 5.4.14
- **Today [Friday] is the last day to withdraw (with a *W*) from classes**

*M:***Read:**§5.5*pp.425-432 (the section "Quadratic Forms") only**Content:*- yet more on the
*Spectral Theorem* - defining
**quadratic form** - examples of quadratic forms: upward- and downward-pointing paraboloids and saddles
- diagonalization of quadratic forms
- brick-throwing demonstration, which is related to the
*inertia tensor*; applying the Spectral Theorem yields something called*The Principal Axes Theorem*in physics

**Miniquiz 24**
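A quadratic form is easy to evaluate directly from its symmetric matrix. The following is my own small example (the matrix is not from the text): q(x,y) = x^T A x for A = [[2,1],[1,2]], which expands to 2x&sup2; + 2xy + 2y&sup2;:

```python
A = [[2, 1], [1, 2]]   # symmetric matrix of the quadratic form

def q(x, y):
    """Evaluate the quadratic form v^T A v at v = (x, y)."""
    v = (x, y)
    return sum(v[i] * A[i][j] * v[j] for i in range(2) for j in range(2))

# A has eigenvalues 3 and 1, so after orthogonally diagonalizing,
# the same form is 3u^2 + w^2 in the eigenbasis coordinates (u, w)
val = q(1, 1)   # 2 + 1 + 1 + 2 = 6
```

Diagonalizing the form amounts to rewriting it with no cross term, which is where the Spectral Theorem comes in.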

*T:***Read:**§6.1*Content:*- going over
*Maxiquiz 8*and*HW11* - defining
**[abstract] vector space** - starting examples (and non-examples) of
*vector spaces:*- $\RR^n$ with the usual vector addition and scalar multiplication
- $\RR^2$ with modified vector addition(s) is often
**not**a vector space

**Miniquiz 25**- Hand in
**HW12:**5.4.16, 5.5.38, 5.5.54 **Journal 8 on §§5.4 & 5.5 (only***pp. 425-432 "Quadratic Forms"*in §5.5) is due today
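A non-example of the kind mentioned above can be made concrete in code. This is a hypothetical illustration of my own (not necessarily the modified addition used in class): R^2 with the "addition" u &oplus; v = (u1+v1, 0) and the usual scalar multiplication fails to be a vector space, because no vector can serve as an additive identity:

```python
def oplus(u, v):
    """Modified 'addition' on R^2: the second coordinate is always killed."""
    return (u[0] + v[0], 0)

u = (1, 5)
z = (0, 0)            # candidate zero vector
result = oplus(u, z)  # (1, 0), which is not u, so the identity axiom fails
```

Any candidate identity z gives u &oplus; z a second coordinate of 0, so whenever u has nonzero second coordinate the axiom u &oplus; z = u cannot hold.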

*W:***[Re]Read:**§6.1*Content:*- some algebraic (arithmetic?) properties in vector spaces which
are consequences of their definition,
*e.g.:*- in any vector space $V$, $0\vec{u}=\vec{0}\ \forall u\in V$
- in any vector space $V$, $(-1)\vec{u}=-\vec{u}\ \forall u\in V$
- in any vector space $V$, $\alpha\vec{0}=\vec{0}\ \forall \alpha\in\RR$

- more examples of
*vector spaces:*- the trivial vector space $\{\vec{0}\}$
- spaces of functions, such as:
- $\Ff(\RR)$ — the space of all functions on the real line $\RR$
- $C(\RR)$ — the space of continuous functions on the real line $\RR$
- $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
- $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
- $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
- $\Pp$ — the space of all polynomials in one variable

- $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices.

**Miniquiz 26**
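These arithmetic consequences all follow by short axiom chases; for instance, the first one can be derived along these standard lines (a sketch, using only the vector space axioms):

```latex
% Axiom chase for 0\vec{u} = \vec{0} in any vector space V:
\begin{align*}
0\vec{u} + 0\vec{u} &= (0+0)\vec{u} = 0\vec{u}
  && \text{(distributivity over scalar addition)}\\
(0\vec{u} + 0\vec{u}) + (-0\vec{u}) &= 0\vec{u} + (-0\vec{u}) = \vec{0}
  && \text{(add the additive inverse of $0\vec{u}$)}\\
0\vec{u} + \bigl(0\vec{u} + (-0\vec{u})\bigr) &= \vec{0}
  && \text{(associativity of vector addition)}\\
0\vec{u} &= \vec{0}
  && \text{(inverse and identity axioms).}
\end{align*}
```

The other two properties yield to very similar arguments.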

*F:***[Re]Read:**§6.1*Content:*- more discussion of the vector spaces of functions we
defined last class -- these form a chain of subspaces:

$\qquad\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$ - defining
**[vector] subspace** - examples of
*subspaces* - how to check if something is a
*subspace*(it's closed under vector addition and scalar multiplication) - defining
**Span**in an abstract vector space

**Maxiquiz 9 today**

*M:***[Re]Read:**§6.1 and**Read:**§6.2*Content:*- going over
*Maxiquiz 9*and recent*HW*s - still more about vector subspaces,
*e.g.,*the $\vec{0}$ in a subspace is the same vector as the $\vec{0}$ in the ambient space - more about
*Span*s,*e.g.,*as an intersection - the subspaces of symmetric or skew-symmetric matrices in $M_{n\times n}$
- Cartesian product of vector spaces
- defining
**linearly [in]dependent**in an abstract vector space, and examples

- Hand in
**HW13:**6.1.{2, 6, 48} **Miniquiz 27**
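Linear dependence in an abstract vector space can be exhibited very concretely. Here is an illustrative example of my own in $\Pp_2$: the polynomials p1 = 1 + x, p2 = 1 - x, p3 = 2 satisfy p1 + p2 - p3 = 0, so they are linearly dependent. Polynomials are stored as coefficient tuples (constant, x, x&sup2;):

```python
def combo(coeffs, polys):
    """Linear combination sum(c * p) of polynomials as coefficient tuples."""
    return tuple(sum(c * p[i] for c, p in zip(coeffs, polys)) for i in range(3))

p1, p2, p3 = (1, 1, 0), (1, -1, 0), (2, 0, 0)
zero = combo([1, 1, -1], [p1, p2, p3])   # the nontrivial dependence relation
```

Since a nontrivial combination (coefficients 1, 1, -1) gives the zero polynomial, the set {p1, p2, p3} is dependent.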

*T:***[Re]Read:**§6.2*Content:*- defining
**basis**in an abstract vector space, and examples - defining
**dimension**in an abstract vector space, and examples - defining
**[in]finite dimensional**for an abstract vector space, and examples - the
**Basis Theorem**in an abstract vector space

**Journal 9 on §§6.1 & 6.2***(only the beginning of §6.2, if you like)***is due today****Miniquiz 28**

*W:***[Re]Read:**§6.2 and**Read:**§6.3*Content:*- a little more care with finite and infinite sets of vectors which
might be
*linearly [in]dependent* **coordinates of a vector $\vec{v}$ with respect to a basis $\Bb$**in an abstract vector space: $[\vec{v}]_\Bb\in\RR^k$, if $\Bb$ consists of $k$ basis vectors- a
**change-of-basis matrix $P_{\Cc\leftarrow\Bb}$**for converting from basis $\Bb$ to basis $\Cc$, by $[\vec{v}]_\Cc=P_{\Cc\leftarrow\Bb}\,[\vec{v}]_\Bb$ - this $P_{\Cc\leftarrow\Bb}$ will be $k\times k$ if $\Bb$ (and thus also $\Cc$) consists of $k$ vectors

**Miniquiz 29**
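Here is a small change-of-basis computation with bases of my own choosing: take $\Bb = \{(1,0), (1,1)\}$ in $\RR^2$ and let $\Cc$ be the standard basis, so the columns of $P_{\Cc\leftarrow\Bb}$ are just the $\Bb$-vectors written in standard coordinates:

```python
P = [[1, 1],
     [0, 1]]                      # P_{C<-B}: columns are the B-vectors

def apply(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

v_B = [2, 3]                      # v = 2*(1,0) + 3*(1,1)
v_C = apply(P, v_B)               # v in standard coordinates: [5, 3]

Pinv = [[1, -1],
        [0, 1]]                   # P_{B<-C}, the inverse matrix
back = apply(Pinv, v_C)           # recovers the B-coordinates [2, 3]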

*F:***[Re]Read:**§6.3 and**Read:**§6.4*Content:*- more about coordinates,
*e.g.,*$P_{\Cc\leftarrow\Bb}$ is invertible: in fact $P_{\Cc\leftarrow\Bb}^{-1}=P_{\Bb\leftarrow\Cc}$ - definition of a
**linear transformation**between abstract vector spaces

- Hand in
**HW14:**6.1.46, 6.2.34, 6.2.44 **Maxiquiz 10 today**

*M:***[Re]Read:**§6.4*Content:*- going over
*Maxiquiz 10*and recent*HW*s - examples of
*linear transformations:*- the
**zero transformation** - the
**identity transformation** - matrix multiplication
- differentiation in $\Pp$

- properties of
*linear transformations:*- they map $\vec{0}_V$ to $\vec{0}_W$
- they behave nicely with respect to the "additive inverse" operation $\vec{v}\mapsto-\vec{v}$.
- compositions of LTs are LTs


*T:***[Re]Read:**§6.4*Content:*- linear transformations are determined by what they do to a basis, which is, however, completely free: that is, if we want to make a linear transformation $T:V\to W$, and if $\{\vec{v}_1,\dots,\vec{v}_n\}$ is a basis of $V$, we can choose any vectors $\vec{w}_1,\dots,\vec{w}_n$ we like in $W$, and there will be a unique linear $T$ which satisfies $$T(\vec{v}_1)=\vec{w}_1,\quad\dots,\quad T(\vec{v}_n)=\vec{w}_n\ \ .$$
- more examples of
*linear transformations* - inverses of linear transformations — are they linear?

**Journal 10 on §§6.2 — 6.4***(the end of §6.2, if you did not finish it in Journal 9)***is due today****Miniquiz 30**
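The "determined by a basis, but completely freely" principle can be sketched in a few lines. This is my own illustration, using the standard basis of $\RR^2$ and freely chosen images: if T(e1) = w1 and T(e2) = w2, then linearity forces T(a, b) = a&middot;w1 + b&middot;w2:

```python
def make_linear_map(w1, w2):
    """Build the unique linear T: R^2 -> R^2 with T(e1) = w1, T(e2) = w2."""
    def T(v):
        a, b = v
        return tuple(a * w1i + b * w2i for w1i, w2i in zip(w1, w2))
    return T

T = make_linear_map((1, 2), (3, 4))   # images of e1, e2 chosen freely
image = T((1, 1))                     # linearity forces (1,2) + (3,4) = (4, 6)
```

Any choice of the two image vectors produces a (unique) linear map, which is exactly the freedom described above.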

*W:**Content:*- review for
*Midterm II*; see this review sheet

- Hand in
**HW15:**6.3.16, 6.4.{20, 22, 24} **Miniquiz 31**

*F:**Midterm II*in class today

*M:**Content:*- going over
*Midterm II*


*T:***Read:**§6.5*Content:*- the
**kernel**of a linear transformation - the
**range**of a linear transformation **one-to-one**(or**1-1**or**injective**)**onto**(or**surjective**)**inverses of linear transformations**

- Hand in revised solutions to
*Midterm II*, if you like - NOTE: no
**Journal**is due today.

*W:***[Re]Read:**§6.5*Content:*- kernels and ranges are always vector subspaces
**rank**and**nullity**(again)**The Rank-Nullity Theorem**(again)**isomorphisms**and**isomorphic**- a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic

**Miniquiz 32**
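The Rank-Nullity Theorem can be spot-checked on a small example of my own: for T: R^3 &rarr; R^2 given by T(x, y, z) = (x + y, y + z), the range is all of R^2 (rank 2) and the kernel is the line spanned by (1, -1, 1) (nullity 1), so rank + nullity = 3 = dim of the domain:

```python
def T(v):
    """A linear map R^3 -> R^2 with rank 2 and nullity 1."""
    x, y, z = v
    return (x + y, y + z)

ker_vec = (1, -1, 1)
out = T(ker_vec)                     # (0, 0): ker_vec lies in the kernel
rank, nullity, dim_domain = 2, 1, 3  # worked out by hand for this T
```
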

*F:***Read:**§6.6*Content:*- defining the
**matrix of a linear transformation with respect to bases of its domain and codomain** - the matrix of a composition of linear transformations
- the matrix of the inverse of a linear transformation
- matrices of endomorphisms of vector spaces and similarity using the change of basis matrix

- Hand in
**HW16:**6.4.32, 6.5.27, & 6.5.34 **Maxiquiz 11 today**
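A concrete instance of the matrix of a linear transformation (my own worked example): differentiation d/dx on $\Pp_2$ with respect to the basis {1, x, x&sup2;}. Column j holds the coordinates of the derivative of the j-th basis vector: (1)' = 0, (x)' = 1, (x&sup2;)' = 2x:

```python
D = [[0, 1, 0],
     [0, 0, 2],
     [0, 0, 0]]        # matrix of d/dx on P_2 in the basis {1, x, x^2}

def apply(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

p = [1, 2, 3]          # p(x) = 1 + 2x + 3x^2 in coordinates
dp = apply(D, p)       # p'(x) = 2 + 6x, i.e. coordinates [2, 6, 0]
```

Multiplying coordinate vectors by `D` is the same operation as differentiating the corresponding polynomials, which is the whole point of the matrix of a linear transformation.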

**Thanksgiving Break!**No classes or office hours. You can contact me by e-mail if you want, though.- If you are unhappy with your recent quiz or test scores, it would be good to do additional problems from the book sections we've covered.
- But please catch up on any old homeworks or re-dos of old assignments, to hand in on Monday after the break.
- Also, do not forget that
**Journal 11**is due the first day after break.

*M:***Read:**§7.1*Content:*- going over
*Maxiquiz 11* - an
**inner product**and**inner product space** - examples of inner product spaces:
- $\RR^n$ with the usual dot product
- the $L^2$ inner product on $C[0,1]$
- $L^2$ also works on $\Pp$ and $\Pp_n$ ($n\in\NN$)
- another inner product on $\Pp_n$: $\langle p,q\rangle=p(0)q(0)+\dots+p(n)q(n)$ (just using the first term $p(0)q(0)$ works in all parts of the definition of an inner product except for non-degeneracy; with all $n+1$ terms it becomes a full inner product ... which we know because of the Fundamental Theorem of Algebra!)

**Journal 11 on §§6.5 & 6.6 is due today**
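The discrete inner product on $\Pp_n$ mentioned above is easy to compute. Here is a sketch with n = 2 and polynomials of my own choosing, stored as coefficient tuples (constant, x, x&sup2;):

```python
def ev(p, t):
    """Evaluate a polynomial given by its coefficient tuple at t."""
    return sum(c * t**k for k, c in enumerate(p))

def ip(p, q, n=2):
    """<p, q> = p(0)q(0) + ... + p(n)q(n) on P_n."""
    return sum(ev(p, i) * ev(q, i) for i in range(n + 1))

p, q = (1, 1, 0), (0, 0, 1)   # p(x) = 1 + x,  q(x) = x^2
val = ip(p, q)                # 1*0 + 2*1 + 3*4 = 14
# Non-degeneracy: <p, p> = 0 would force p(0) = p(1) = p(2) = 0, and a
# polynomial of degree <= 2 with 3 roots must be the zero polynomial.
```
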

*T:***[Re]Read:**§7.1*Content:*- elementary properties of inner products
**length**or**norm**of vectors in an inner product space- the
**Pythagorean Theorem** **orthogonal**vectors in an inner product space**projections**and the**Gram-Schmidt Process**in an inner product space- an orthonormal set in the inner product space $C[-\pi,\pi]$ with the $L^2$ inner product: trigonometric functions, and the connection with Fourier Analysis and radios

**Miniquiz 33**

*W:***Read:**§7.2*pp.575-578 only**Content:*- the
**distance**between vectors in an inner product space - a
**norm**and a**normed linear space** - the
**distance function $d(\vec{u},\vec{v})$**in a normed linear space - a
**metric**and**metric space** - examples of norms/metrics:
- the norm coming from an inner product
- the
**sum norm**(also called the**$L^1$ norm**) - the
**max norm**(also called the**sup**or**$L^\infty$ norm**) - the
**taxicab metric**

- Hand in
**HW17:**7.1.{36, 40}, 7.2.{8, 14} **Journal 12 on §§7.1 & 7.2***[only the parts we covered in this class]*is due today**Maxiquiz 12 today****Hand in all late work and re-dos by 4pm today**if you want them corrected and returned to you on Friday
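The sum and max norms, and the taxicab metric the sum norm induces, can be sketched in a few lines of my own code (vectors as Python lists):

```python
def sum_norm(v):
    """The L^1 (sum) norm: sum of absolute values of the coordinates."""
    return sum(abs(x) for x in v)

def max_norm(v):
    """The L^infinity (max/sup) norm: largest absolute coordinate."""
    return max(abs(x) for x in v)

def taxicab(u, v):
    """The taxicab metric d(u, v) = ||u - v||_1 induced by the sum norm."""
    return sum_norm([ui - vi for ui, vi in zip(u, v)])

u, v = [1, -2, 3], [0, 0, 0]
# sum_norm(u) = 6, max_norm(u) = 3, and taxicab(u, v) = 6
```

Each norm gives a distance function d(u, v) = ||u - v||, which is the general norm-to-metric construction from this section.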

*F:**Content:***Last day to hand in [by noon!] all late work and re-dos for class credit***[although materials handed in today will not be returned]*

**Exam week**, no classes.- Our
*[comprehensive]***FINAL EXAM**is scheduled for**Tuesday, December 10th, 10:30am-12:50pm in our usual classroom**and**Wednesday, December 11th, 10:30am-12:50pm in our usual classroom**

You must come to**both**time slots. The format of the test is described on the final exam review sheet, but note that the parts in which you will have to state definitions and theorems will be on Tuesday (along with some other questions) — on Wednesday's part of the test, you will be able to refer to these definitions in your new work. So, in short: Wednesday is entirely problem-oriented, with little need for memorization, while for Tuesday, you will be required to have detailed, precise definitions and statements in your head.