Math 307 — Introduction to Linear Algebra

Homework Assignments & Course Schedule

Here is a link back to the course syllabus/policy page.

In the following, all sections and page numbers refer to the required course
textbook, *Linear Algebra: A Modern Introduction, Second Edition*,
by David Poole.

This schedule is subject to change, but should be accurate at any moment for at least a week into the future.

For each day, please read the section(s) named in *the plan*, **before
that day**. When homework is due, it must be turned in either in class or
at my office **by 3pm that day**.

**The plan for this week:**

**M:** Mostly bureaucracy and introductions. Read "To the Student" (*pp. xxiii–xxiv*).

**T:** (Re)read §§1.1–1.3 (this material *should be review!*). Do **HW0**: send me e-mail (at jonathan.poritz@gmail.com) telling me:

- Your name.
- Your e-mail address.
- Your year/program/major at CSUP.
- The reason you are taking this course.
- What you intend to do after CSUP, in so far as you have an idea.
- Past math classes you've had.
- Other math and science classes you are taking this term, and others you intend to take in coming terms.
- Your favorite mathematical subject.
- Your favorite mathematical result/theorem/technique/example/problem.
- Your computer experience: classes (which?), self-taught (how?), which operating system(s), language(s), or software package(s) (relating to science or mathematics or pure IT — I don't want to know which games you play!) do you know, and how well?
- Anything else you think I should know (disabilities, employment or other things that take a lot of time, how you feel about mathematics, *etc.*).
- [Optional:] The best book you have read recently.

**I will not put you in my gradebook until I have this e-mail from you.**

[By the way, just to be fair, in case you are interested, here is a version of such a self-introductory e-mail with information as I would fill it out for myself.]

**W:** (Re)read §§2.1–2.3 (this material *should also be review!*).

**F:** Read §§3.1–3.3 and 3.5 (*still review!*).

*Content:*
- finishing thoughts about *span*, with examples
- defining *linearly [in]dependent*
- proving that *a subset of a (finite) set of linearly independent vectors is also linearly independent*, by **contradiction**
- defining *subspace* and, for a matrix *A*, its *column space* col(*A*), *row space* row(*A*), and *null space* null(*A*)
- started proving that the null space of a matrix is indeed a subspace

**NOTE:** Friday is the last day to add classes without explicit instructor approval.

**The plan for this week:**

**M:** *Content:*
- finishing the proof that the null space of a matrix is indeed a subspace
- definition of a *basis* and of *dimension*
- definition of the *nullity* of a matrix

**T:** *Content:*
- finishing the proof that the null space of a matrix is indeed a subspace
- definition of the *nullity* of a matrix
- the row space of a matrix is not changed by elementary row operations
- group reading of the proof of the *invariance of dimension* theorem

Also:
- Hand in **MI1**, which should consist of a useful, detailed, precise summary of the content (definitions and main results) of §§2.3 and 3.5, including any definitions or results from other (prior) sections of the book which you think are necessary to understand the material from §§2.3 and 3.5.
- Actually, if your *Main Ideas* are in a notebook and you don't want to tear them out, find some time (today!) to show me the notebook for around 5-10 minutes, and I can give you feedback and credit for this assignment. Or simply hand it in....
- Also hand in **HW1**:
  - §1.2: 56
  - Chapter 1 Review Questions, *p. 56*: 8
  - §2.3: 42, 44
  - §3.5: 10, 56

**W:** **Quiz 1 today.** *Content:*
- how to build a basis of row(*A*)
- reading of the result on bases also of col(*A*) and null(*A*)

**F:** *Content:*
- going over the quiz: it is vital to know definitions, in their full precision!
- one of the quiz proofs is an example of the proof strategy "to show *a=b*, show that *a≤b* and *b≤a*" (on the quiz, this was in the context of sets being subsets of each other; the book's proof of the *Basis Theorem* is an example of the straight numerical-inequality version of this strategy)
- going over **MI1**: make sure to include definitions, with some idea of what concept is behind them, perhaps an example
- group reading of the *Fundamental Theorem of Invertible Matrices*

**The plan for this week:**

**M:** *Content:*
- proof that for a set of linearly independent vectors **v**_{1},...,**v**_{m} in **R**^{n}, every element of *S*=span(**v**_{1},...,**v**_{m}) can be expressed **in a unique way** as a linear combination of this basis **v**_{1},...,**v**_{m} of *S*
- discussion of problem 63 from §3.5 — noted that a step in this proof would require the fact that if *S*_{1} and *S*_{2} are subspaces of **R**^{n} such that *S*_{1}⊆*S*_{2}, then dim(*S*_{1})≤dim(*S*_{2})
- definition (and fundamental *idea*) of *linear transformations*
- examples of linear transformations:
  - the constant map with value **0**
  - flipping the two-dimensional plane across the line *y=x*, or across the *x*- or *y*-axis
  - rotations in the two-dimensional plane
- **Theorem:** multiplication on the left by an *m×n* matrix is a linear transformation from **R**^{n} to **R**^{m}
- know the matrices of the example linear transformations above

**T:** [Only three students seemed to know we had class at all.] **CLASS PERIOD REDUCED TO 9:30-9:50AM** DUE TO WINTER WEATHER. *Content:*
- some discussion of a [**HW2**] problem: §3.5: 60

**W:** **CLASS CANCELLED** DUE TO WINTER WEATHER.

**F:** *Content:*
- more discussion of a [**HW2**] problem: §3.5: 60. The key ideas seemed to be that:
  - the outer product **uv**^{T} of two vectors **u** and **v** will be a matrix each of whose rows is a multiple of the single (row) vector **v**^{T}
  - if a matrix *A* has rank 1, that means its row space is 1-dimensional, so all elements of the row space are multiples of the single basis vector, call it **b**. Since the rows themselves are elements of the row space, that means that the rows are each multiples of **b**.
- some discussion of a [**HW2**] problem: §3.5: 44. Various strategies proposed by students, such as:
  - row reduce the matrix. The last row will have only one (potentially) non-zero entry (the last), which is quadratic in *a*: when *a* has the two values which make this quadratic equal 0, the rank will be 2; otherwise the rank will be 3.
  - compute the determinant of the matrix, again getting a quadratic in *a*. If *a* does not have one of the two values which make this quadratic equal 0, then the matrix will be invertible and hence have maximal rank (equalling 3). The problem with this approach is that if the determinant is zero, it is hard to tell from that alone whether the rank is 0, 1, or 2....
- it was pointed out that the above sorts of discussions are typical of the kinds of doodling or brainstorming one does to solve a problem... they correspond to a *rough draft* of a solution; the final draft you hand in for homework should have careful definitions, full explanations of all steps, and formal references to the theorems and techniques you use
- more discussion of *linear transformations* (§3.6): results and examples

Also:
- Hand in (or show to me) **MI2**, based on §3.6.
- Also hand in **HW2**:
  - §2.3: 48
  - §3.3: 17
  - §3.5: 20, 44, 60, 64 (optional)
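The two class discussions above can be tied together in a small computational sketch: a rank function via row reduction, applied to an outer product **uv**^{T}. (This is only an illustration with made-up vectors, not a homework solution; the vectors `u` and `v` are arbitrary choices.)

```python
# A sketch in pure Python: compute rank by row reduction, then check that
# an outer product u v^T of non-zero vectors has rank 1, as discussed.
def rank(M, eps=1e-9):
    M = [row[:] for row in M]                  # work on a copy
    r = 0
    for c in range(len(M[0])):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, len(M)) if abs(M[i][c]) > eps), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):                # clear the rest of column c
            if i != r and abs(M[i][c]) > eps:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

u, v = [1, 2, 3], [4, 5, 6]
outer = [[ui * vj for vj in v] for ui in u]    # the matrix u v^T
print(rank(outer))                             # 1: every row is a multiple of v
```

Row reducing `outer` leaves a single non-zero row, exactly because each row is a multiple of the single row vector **v**^{T}.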

**NOTE:** Monday is the last day to drop classes without a grade being recorded.

**The plan for this week:**

**M:** Read §3.6 and §3.7, *pp. 228–239*. *Content:*
- ideas for §3.5 #64, the optional problem from **HW2**
- some more discussion of proof-writing, particularly of *quantifiers* and *first-order logic*; handouts given on this topic — which were printouts of the pages
- the word **kernel** — essentially just a synonym for *null space*. Some comments on the role of the kernel in the Fundamental Theorem on Invertible Matrices.

**T:** Read §§4.1–4.3; continue working through §3.7, *pp. 235–239*, "Graphs and Digraphs". **Quiz 2 today**, still on linear independence, bases, dimension, and §3.6. *Content:*
- some last comments around the content of §3.6; in particular, thoughts about the kernel, domain, range, codomain, and invertibility of a linear transformation (also exercise 20 on *p. 251*)
- introduction to **Markov chains** (from §3.7, *pp. 228–235*)
- introduction to eigenvalues, eigenvectors, eigenspaces

**W:** Continue the above reading assignment from *Tuesday*. *Content:*
- discussion of yesterday's quiz
- more on eigenvalues, *etc.*

**F:**
- Hand in (or show to me) **MI3**, based on §3.7, *pp. 228–239*, and (the most basic/important definitions and results in) §§4.1–4.3.
- Also hand in **HW3**:
  - §3.6: 4, 12, 44
  - §3.7: 10, 30, 34, 50
  - §4.1: 4, 8, 37

*[Late-breaking news: this HW may be handed in on* **Monday**.*]*

*Content:*
- discussion of some HW issues
- relationship of eigenvalues and eigenvectors to the singularity and null space of a matrix

**The plan for this week:**

**M:** Continue reading §§4.1–4.3. *Content:*
- discussion of a multi-billion-dollar eigenvalue problem: the Google *PageRank* approach to ranking web pages
- formal definition of det *A* for an *n×n* matrix *A*, in terms of expansion along any single row or column
- the "homomorphism" property of the determinant: det(*A·B*) = (det *A*)·(det *B*) for all pairs of *n×n* matrices *A* and *B*
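The PageRank idea discussed in class can be sketched in a few lines: the ranking vector is an eigenvector (for eigenvalue 1) of a "Google matrix" built from the link structure, found here by power iteration. The 3-page link graph and the damping factor below are illustrative choices, not anything from the book.

```python
# A minimal PageRank sketch on a made-up 3-page web: page 0 links to 1
# and 2, page 1 links to 2, page 2 links back to 0.
links = {0: [1, 2], 1: [2], 2: [0]}   # page -> pages it links to
n, d = 3, 0.85                        # d is the usual damping factor

rank = [1.0 / n] * n
for _ in range(100):                  # power iteration toward the
    new = [(1 - d) / n] * n           # eigenvalue-1 eigenvector
    for page, outs in links.items():
        for q in outs:
            new[q] += d * rank[page] / len(outs)
    rank = new

print([round(r, 3) for r in rank])    # ranks sum to 1; page 2 wins
```

Page 2, which receives links from both other pages, ends up ranked highest; this is exactly the eigenvector computation that makes PageRank an eigenvalue problem.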

**T:** *Content:*
- a proof strategy based on the **Principle of Mathematical Induction**:
  - for theorems that look like (∀*n*∈**N**) *P(n)*
  - first prove *P(1)*
  - then prove the implication *P(n)* ⇒ *P(n+1)*; this is called the *inductive step*, in which you use the assumption of *P(n)* (called the *inductive hypothesis*) and other logic/calculations to prove *P(n+1)*
  - then the **Induction Demon** goes away and proves the theorem for *n=1*, uses the inductive step to prove the theorem for *n=2*, then for *n=3*, *etc., etc., etc.* (and the **Demon** never gets tired or bored!)
  - an example: proof that the sum of the first *n* whole numbers is *n(n+1)/2*
- proof (**by induction!**) that the determinant of a triangular matrix equals the product of its diagonal entries
- definition of the *characteristic polynomial* of a matrix; its zeros are the eigenvalues of the matrix
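The induction example above has a quick computational sanity check. Checking finitely many cases is of course no substitute for the induction proof, but it is a good habit to test a formula before trying to prove it:

```python
# Check the class example n(n+1)/2 = 1 + 2 + ... + n for many values of n.
# (A spot check, not a proof: only induction handles all n at once.)
for n in range(1, 101):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
print("formula checks out for n = 1, ..., 100")
```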

**W:** Reading §4.4. **Quiz 3 today** on material up to and including §4.3. *Content:*
- more discussion/examples on the definition of det
- the *characteristic polynomial* of a matrix
- definition of *algebraic* and *geometric multiplicity* of an eigenvalue — examples, showing in particular that these multiplicities *can be different*
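A standard small instance of the last point (the particular matrix below is an illustrative choice, not necessarily the one used in class): for the 2×2 matrix [[1, 1], [0, 1]], the characteristic polynomial is (1-*t*)^{2}, so *t*=1 has algebraic multiplicity 2, but the eigenspace is only 1-dimensional.

```python
# Characteristic polynomial det(A - t I) of A = [[1, 1], [0, 1]],
# expanded directly for the 2x2 case.
a, b, c, d = 1, 1, 0, 1

def char_poly(t):
    return (a - t) * (d - t) - b * c      # = (1 - t)^2 here

assert char_poly(1) == 0                   # t = 1 is the only eigenvalue
# Solving (A - I)v = 0 forces v = (x, 0): the eigenspace is spanned by
# (1, 0) alone, so geometric multiplicity 1 < algebraic multiplicity 2.
```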

**F:** *Content:*
- discussion of Quiz 3, including:
  - variables in definitions have to have **quantifiers**, such as "∀*c*_{1},...,*c*_{n}∈**R**", "∃**x**≠**0**", ....
  - often a definition has a **context**, so to define the word *eigenvalue* you would begin "Let *A* be an *n×n* matrix...."
  - it is really important to have a formal definition in your head for each technical term we use... and to have a few theorems in which such a term appears
  - how to do (in several ways) the quiz problem on eigenvalues
- starting the proof that "eigenvectors corresponding to different eigenvalues are linearly independent"

**The plan for this week:**

**M:** Continue reading §4.4.
- Hand in **HW4**:
  - §4.2: 56, 69
  - §4.3: 18, 20, 24, 30
  - §4.4: 40, 42, 47, 48

*Content:*
- doing the Quiz 3 problem on stochastic matrices
- finishing the proof that "eigenvectors corresponding to different eigenvalues are linearly independent"
- definition/examples of matrices being *similar* and *diagonalizable*
- some discussion of HW4 problems
**T:** **HW4** may be handed in today. *Content:*
- more HW4 discussion
- review for Test I; see this review sheet
- examination of similarity and its consequences — *e.g.,* that similar matrices have the same characteristic polynomials and eigenvalues, that their powers are also similar, *etc.*

**W:** **Test I today**

**F:**
- going over the test

**The plan for this week:**

**M:**
- Revised, take-home solutions to **Test I** problems should be handed in today.

*Content:*
- discussed an important point of view on invertible matrices: the columns (also the rows) of an invertible *n×n* matrix form an ordered basis of **R**^{n} (that is, a basis in which the first vector is specified, and the second, and so on); likewise, an ordered basis of **R**^{n} determines an *n×n* matrix by using the *i*th basis vector as the *i*th column of the matrix.

  This matrix has the property that it takes the *i*th standard basis vector to the *i*th vector of the given ordered basis.

  This matrix, when built out of a basis of **R**^{n} consisting entirely of eigenvectors of a matrix *A*, is exactly the matrix *P* which diagonalizes *A*.
- recalled the definition of the *dot product*, mentioned its synonym *inner product*
- recalled the word *norm* and its notation
- defined the words *orthogonal vectors*, *orthogonal set*, and *orthogonal basis*
- conjectured that an orthogonal set of *n* non-zero vectors in **R**^{n} will necessarily be a basis
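The point about eigenvector columns can be checked numerically. The 2×2 matrix below is an illustrative example (not from the test): its eigenvalues are 3 and 1, with eigenvectors (1,1) and (1,-1), and the matrix *P* with those eigenvectors as columns satisfies *P*^{-1}*AP* = diag(3, 1).

```python
# Check that P, built from eigenvectors of A as columns, diagonalizes A.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [1, 2]]              # eigenvalues 3 and 1
P = [[1, 1], [1, -1]]             # columns: eigenvectors (1,1), (1,-1)
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[P[1][1] / det, -P[0][1] / det],
        [-P[1][0] / det, P[0][0] / det]]

D = matmul(Pinv, matmul(A, P))
print(D)                          # diagonal, with 3 and 1 on the diagonal
```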
**T:** Reading §§5.1 & 5.2. *Content:*
- discussion of linear independence of orthogonal sets
- defined *orthonormal basis*
- defined *orthogonal matrix*
- characterization and properties of orthogonal matrices

**W:** Continue reading §§5.1 & 5.2.
- Start these problems for **HW5** (due next Monday):
  - §5.1: 8, 16, 28, 37
- Start **MI4**, by writing down careful definitions of the terms from recent classes and from §5.1.

*Content:*
- defined *orthogonal complement*
- properties of orthogonal complements
- the **four fundamental subspaces** corresponding to a linear transformation from **R**^{n} to **R**^{m}, and their relationship
**F:** Reading §5.3.
- Start these additional problems for **HW5** (due next Monday):
  - §5.2: 4, 16, 25, 26
- Put more work into **MI4**, writing down definitions and content from recent classes and from §5.2.

**Quiz 4 today** on material up to and including §5.2. *Content:*
- the *Gram-Schmidt Process* and examples
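The Gram-Schmidt Process can be sketched in a few lines of pure Python: subtract from each new vector its projections onto the previously-built orthogonal vectors. (The two input vectors below are arbitrary illustrative choices.)

```python
# A minimal Gram-Schmidt sketch: orthogonalize a list of vectors in R^n.
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:                   # subtract the projection of v
            c = dot(v, b) / dot(b, b)     # onto each earlier vector b
            w = [wi - c * bi for wi, bi in zip(w, b)]
        basis.append(w)
    return basis

q = gram_schmidt([[1, 1, 0], [1, 0, 1]])
print(q)                                  # the two outputs are orthogonal
```

(This is the "classical" version; it produces an orthogonal set, which one can then normalize to get an orthonormal one.)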

**The plan for this week:**

**M:**
- Hand in **HW5**, which consists, in all, of:
  - §5.1: 8, 16, 28, 37
  - §5.2: 4, 16, 25, 26
  - §5.3: 2, 4, 12
- Hand in **MI4**, covering definitions, theorems, and techniques from classes since Test I and in §§5.1–5.3.

*Content:*
- going over last week's quiz (answer sheet handed out)
- some discussion of the recent midterm (answer sheet handed out)
**T:** Read §§5.2 & 5.3. **Study Group** meets outside PM 248. *Content:*
- discussion/definition of *orthogonal projections* onto lines and even subspaces
- some discussion of *The Orthogonal Decomposition Theorem*, and how it would help in proving that our definition of the projection onto a subspace is well-defined

**W:** Read §5.4. **HW5** may come in (late) today. *Content:*
- a day on symmetric matrices, oh boy!
- symmetric matrices can move around in dot (inner) products: (*A***v**)·**w** = **v**·(*A***w**) if *A* is symmetric
- for a symmetric matrix, eigenvectors corresponding to distinct eigenvalues are orthogonal
- definition of *orthogonally diagonalizable*
- statement and (most of a) proof of *The Spectral Theorem*: a matrix is symmetric if and only if it is orthogonally diagonalizable
- a (small — 2×2) example of all of the above definitions and theorems
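Here is one such small 2×2 instance (a made-up example, possibly different from the one done in class): for the symmetric matrix *A* = [[5, 2], [2, 2]] the eigenvalues are 6 and 1, and the corresponding eigenvectors really are orthogonal, as the theorem above promises.

```python
# Verify eigenvector equations and orthogonality for a symmetric 2x2 A.
A = [[5, 2], [2, 2]]
v6, v1 = [2, 1], [1, -2]          # eigenvectors for eigenvalues 6 and 1

def apply(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

assert apply(A, v6) == [6 * x for x in v6]   # A v6 = 6 v6
assert apply(A, v1) == [1 * x for x in v1]   # A v1 = 1 v1
assert v6[0] * v1[0] + v6[1] * v1[1] == 0    # v6 . v1 = 0: orthogonal!
```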

**F:** Read §5.5, *pp. 411–418*. *Content:*
- definition of a *quadratic form*
- definition of *positive/negative [semi][in]definite* quadratic forms
- *The Principal Axes Theorem*
- an application: the moment of inertia tensor — throwing bricks...

**The plan for this week:**

**M:** Read §6.1. *Content:*
- some issues which came up on the last HW:
  - please **narrate** your solutions — you are *explaining* something; even in a purely (or mostly) computational problem, you need
    - to tell a story;
    - to identify the characters — *context* and *quantifiers*!!;
    - to explain *what* your equations are doing; and
    - to explain *why* your equations are doing that.
  - [It's also a nice thing to make it easy on your reader by saying a word or two about your overall strategy — *e.g.,* say at the beginning that you are going to do a proof by contradiction — even though books often don't bother to do this.]
  - some words from mathematical logic: when we have a mathematical statement of the form

    *If* **P***, then* **Q** [or **P** ⇒ **Q**],

    then we also have the statements
    - the **converse**:

      *If* **Q***, then* **P** [or **Q** ⇒ **P**],

      which **is not** equivalent to the original statement
    - the **contrapositive**:

      *If* **¬Q***, then* **¬P** [or **¬Q** ⇒ **¬P**],

      which **is** equivalent to the original statement
    - we also sometimes have the claim that a statement and its converse are both true; this is written

      **P** ⇔ **Q** [or **P** *if and only if* **Q**, or **P** *iff* **Q**]
    - note that the English usage of "if...then" constructions is not as exact as the mathematical one, so it is often unclear whether one means the "if...then" or the "if and only if" construction in English... it should never be unclear [in well-written mathematics]
- starting discussion of the ideas in the definition of a *vector space*

**T:** Keep reading §6.1. *Content:*
- more discussion on the definition of a vector space
- examples of sets with operations that do not make a vector space — often this is because the set is not closed under one of the operations, but we also saw an example where there was not an additive inverse **-v** corresponding to every vector **v**
- we gave an example of an unusual vector space: *l*_{∞}, the set of bounded infinite sequences of real numbers

**W:** Really, seriously: read §6.1. *Content:*
- definition of a *subspace* of an abstract vector space: a subset of the vectors which, along with the vector addition and scalar multiplication of the ambient vector space, is still a vector space in its own right
- examples of subspaces:
  - a line through the origin in **R**^{2}
  - a line or plane through the origin in **R**^{3}
  - for *k*∈**N**, let *C*^{k}*[0,1]* be the set of real-valued functions defined on the interval *[0,1]*⊂**R** whose first *k* derivatives are continuous; when *k=0*, we write either *C*^{0}*[0,1]* or simply *C[0,1]* for the set of continuous, real-valued functions defined on *[0,1]*. Then we have an infinite chain of subspaces

    *C[0,1] ⊃ C*^{1}*[0,1] ⊃ C*^{2}*[0,1] ⊃ ... ⊃ C*^{k}*[0,1] ⊃ ...*

    [In fact, one also defines the intersection of all of these *C*^{k}*[0,1]* to be *C*^{∞}*[0,1]* — think of it as at the very end of that infinitely long chain of subsets — and one refers to *C*^{∞}*[0,1]* as the set of *smooth functions* on *[0,1]*.]
- some discussion of problems from HW6

**F:**
- Hand in **HW6**:
  - §5.4: 2, 4, 16
  - §5.5: 24, 34
  - §6.1: 4, 10, 38, 48, 49
- **Quiz 5 today**
- Hand in **MI5**

*Content:*
- theorem that a subset of vectors in a vector space is a subspace iff it is closed under vector addition and under scalar multiplication by all scalars
- fact: every vector space *V* has at least two subspaces: {**0**} and *V*; these are called the *trivial subspaces*
- definition: a *proper subspace* of a vector space is any subspace which is not one of the trivial subspaces — **R**^{1} has no proper subspaces; **R**^{n} for *n>1* all have many proper subspaces
- a hint for problem 16 in §5.4 (from HW6): seeing the hypothesis that the matrix *A* is symmetric should ring a loud bell — it means that the Spectral Theorem applies. This is a frequent approach in situations where one has a nice "simplification theorem" (like the Spectral Theorem, which simplifies matrices by making them diagonal, or at least diagonalizable) — often the theorem puts some general object in a simpler, canonical form: prove the desired result for the simplified object (here, prove that diagonal matrices have square roots iff they have only non-negative eigenvalues), then use the simplification theorem to apply this to the general case. This last part usually involves moving the desired result back and forth to the general case via some mechanism described in the simplification theorem (*e.g.,* for the Spectral Theorem, the transportation back and forth is by *similarity* of matrices).

**NOTE:** Friday is the last day to withdraw (with a **W**) from classes.

**Spring Break!** No classes, of course.
- It would be nice for all students to use this break to catch up on old work which they have not completed or handed in. A general amnesty applies to all such old work — in fact, if you want to hand in new solutions to any past homework on which you lost an uncomfortably large number of points (*e.g.,* many students skipped many of the proofs in past homework sets), I would be happy to accept such revisions and additions after Spring Break.

**The plan for this week:**

**M:** Read §6.2. *Content:*
- going over Quiz 5
- refreshing the ideas of the definition of an abstract vector space
- recalling the definitions of a *linear combination* of (**a finite number of**) vectors, and of the *span*

**T:** Keep reading §6.2. *Content:*
- definitions of *linearly [in]dependent* and *basis*
- definition of *dimension* — be careful of the (trivial) vector space {**0**}, which is given the dimension 0, and of infinite-dimensional vector spaces, which are ones that do not have any finite spanning set
- some examples of vector spaces and their bases:
  - **R**^{n}, with its *standard basis* {**e**_{1},...,**e**_{n}}, where **e**_{i} is the vector with a 1 in the *i*^{th} place and 0's elsewhere; its dimension is *n*
  - the set **M**_{m×n} of *m×n* real matrices, with its *standard basis* {**E**_{1,1},...,**E**_{1,n},**E**_{2,1},...,**E**_{2,n},...,**E**_{m,n}}, where **E**_{i,j} is the matrix with a 1 in the *(i,j)* place and 0's elsewhere; its dimension is *mn*
  - the set **P** of polynomials in the variable *x*, made into a vector space with the usual addition of polynomials and the multiplication of constants (scalars) and polynomials. **P** has subspaces **P**_{n} for any *n*∈**N**∪{0}, consisting of those polynomials of degree at most *n*. The set {1,*x*,*x*^{2},*x*^{3},...} is linearly independent in **P**, and {1,*x*,*x*^{2},...,*x*^{n}} is the standard basis of **P**_{n}; **P**_{n} has dimension *n+1*, and **P** is infinite-dimensional

**W:** Keep reading §6.2. *Content:*
- taking *coordinates of a vector* **v** in an abstract vector space **V** with respect to a basis *B*, resulting in a column vector which is written [**v**]_{B}
- thinking of the above as a map **V** → **R**^{n}, and noticing nice properties of this map; in particular: ∀**v**,**w**∈**V**, [**v+w**]_{B}=[**v**]_{B}+[**w**]_{B}; ∀**v**∈**V** and ∀*c*∈**R**, [*c***v**]_{B}=*c*[**v**]_{B}; and ∀*k*∈**N** and ∀**v**_{1},...,**v**_{k}∈**V**, {**v**_{1},...,**v**_{k}} is linearly independent in **V** iff {[**v**_{1}]_{B},...,[**v**_{k}]_{B}} is linearly independent in **R**^{n}
- the moral: an abstract vector space of finite dimension *n* *behaves exactly like* **R**^{n} (which is not a very precise way of saying something mathematical...), by the process of taking coordinates
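A concrete instance of the coordinate map: in the vector space **P**_{2} with its standard basis *B* = {1, *x*, *x*^{2}}, the polynomial *a + bx + cx*^{2} has coordinate vector (*a, b, c*) in **R**^{3}, and the map respects addition, as claimed. (The representation of polynomials as exponent-to-coefficient dictionaries below is just an implementation choice.)

```python
# Coordinates [p]_B of polynomials in P_2 with respect to B = {1, x, x^2}.
def coords(poly):
    """poly is a dict exponent -> coefficient; return [a, b, c] in R^3."""
    return [poly.get(k, 0) for k in range(3)]

p = {0: 3, 1: 2}        # the polynomial 3 + 2x
q = {1: -1, 2: 5}       # the polynomial -x + 5x^2

# [p + q]_B equals [p]_B + [q]_B, computed componentwise in R^3:
p_plus_q = {k: p.get(k, 0) + q.get(k, 0) for k in range(3)}
assert coords(p_plus_q) == [a + b for a, b in zip(coords(p), coords(q))]
```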

**F:** Read §6.4. *Content:*
- definition of *linear transformations* between abstract vector spaces
- some examples of linear transformations:
  - any of the old linear transformations we had between **R**^{n} and **R**^{m} (remember, each such came exactly from multiplication on the left by some particular *m×n* matrix)
  - the map from **M**_{m×n} to **M**_{n×m} given by transposing a matrix
  - **not** the map from **M**_{n×n} to itself given by inverting a matrix
  - the map from **M**_{m×n} to **R**^{m} sending a matrix to its first column
- *Theorem:* fix an abstract vector space *V* and a(n ordered) basis *B* of *V*. Say *B* has *n* elements. Then the map *V* → **R**^{n} which sends a vector **v** to [**v**]_{B} is a linear transformation.
- a fair bit of discussion about how a linear transformation *T:V→W* is completely determined by what it does to the vectors in a basis {**v**_{1},...,**v**_{n}} of *V*; also how these values *T*(**v**_{1}), ..., *T*(**v**_{n}) can be any *n* vectors in *W*, since the basis vectors are linearly independent — if instead you try to define *T*(**u**_{1}),...,*T*(**u**_{k}) to be some desired vectors in *W*, where **u**_{1},...,**u**_{k} are vectors in *V* which perhaps satisfy some dependency relations, then the desired vectors in *W* must satisfy the same dependencies ... so are not arbitrary!

**The plan for this week:**

**M:** Continue reading §6.4.
- Hand in **HW7**:
  - §6.2: 4, 6, 14, 15, 17, 26
- Hand in **MI6**
- Starting discussion of the **term projects**, for which see the information sheet. Please start thinking about your project topic; have one chosen by the end of the week!

*Content:*
- mostly discussion of HW7 issues
**T:** Reading §6.5. **Quiz 6 today** *Content:*
- some basic properties of linear transformations, such as the famous fact that *T*(**0**)=**0** for a linear transformation *T:V→W* between abstract vector spaces, which is often used to prove a function is not a linear transformation
- discussion of *L(V,W)*, the set of linear transformations between two fixed vector spaces *V* and *W*; in particular, a definition of a (vector) addition operation and a scalar multiplication which make *L(V,W)* itself a vector space
- statement that the composition of two linear transformations, one from *L(V,W)* and one from *L(W,X)*, yields a linear transformation in *L(V,X)*, where *V*, *W*, and *X* are vector spaces
- definition of the **kernel** and **range** of a linear transformation

**W:** Still reading §6.5. *Content:*
- discussion of Quiz 6
- definition of **1-1** (pronounced "one-to-one"), and its synonym **injective**
- definition of **onto**, and its synonym **surjective**
- definition of **bijective**; if *T∈L(V,W)* is bijective, for vector spaces *V* and *W*, then *T* is called an **isomorphism**, and *V* and *W* are said to be **isomorphic**, written *V≅W*
- examples of injections, surjections, and bijections

**F:** Yes, still reading §6.5.
- Please have chosen — and discussed (in person or by e-mail) with me — your term project topic by today.

*Content:*
- today *V* and *W* are vector spaces and *T∈L(V,W)*
- proof that ker(*T*) is a subspace of *V*
- connecting *injective* with the *kernel*: proof that *T* is injective iff ker(*T*)={**0**}
- proof that range(*T*) is a subspace of *W*
- connecting *surjective* with the *range*: proof that *T* is surjective iff range(*T*)=*W*
- examples of the kernel and range of linear transformations like projection operators in **R**^{2}
- [Students should be able to state all recent definitions fully and formally (of course, always!) and recreate the above proofs.]
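One such projection example, made concrete (the specific test vectors are arbitrary choices): *T*(*x, y*) = (*x*, 0) projects **R**^{2} onto the *x*-axis; its kernel is the *y*-axis and its range is the *x*-axis, and the non-trivial kernel is exactly why *T* fails to be injective.

```python
# The projection of R^2 onto the x-axis, with its kernel and range.
def T(v):
    x, y = v
    return (x, 0.0)

assert T((0.0, 7.0)) == (0.0, 0.0)      # (0, 7) is in ker(T): the y-axis
assert T((3.0, 5.0)) == (3.0, 0.0)      # every output lies on the x-axis
# ker(T) != {0}, so T is not injective: distinct inputs share an image.
assert T((2.0, 1.0)) == T((2.0, -4.0))
```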

**The plan for this week:**

**M:** Reading §§6.4 & 6.3. *Content:*
- definitions of **rank** and **nullity** for a linear transformation
- the *Rank-Nullity Theorem* (again): what it says, the big idea in the proof, an example

**T:** Reading §6.3. **Study Group**, for those interested in such, **4-5pm, outside my office**. *Content:*
- discussion of the idea that "there is only one vector space of a given (finite) dimension" — which is obviously not literally true, but a version of this idea can be formalized and proved
- the freedom in the formalized idea just stated is in the *choice of basis*; this is the entire theme of §6.3 [and (mostly) of §6.6]
- the book's notation *P*_{C←B}

**W:**
- another **Study Group**, **3-4pm, outside my office**
- review for, and discussion of, Friday's test; see this review sheet
- hand in **HW8**:
  - §6.4: 32, 33
  - §6.5: 4, 8, 10, 22, 26, 34

*Content:*
- discussion of some problems from HW8
- preparation for Test II

**Θ:** (Yes, Thursday.)
- another **Study Group**, **11am-12pm, outside my office**
**F:** **Test II today**
- in-class part, and...
- the take-home part was distributed — it's due on Monday — or you can get it right here
- hand in **MI7**

**The plan for this week:**

**M:** Reading §6.6.
- hand in the take-home part of Test II

*Content:*
- setting up the "commutative diagram" which defines the *matrix of a linear transformation with respect to given bases*

**T:** [Re]reading §§6.1–6.6.
- going over the take-home part of Test II

**W:** §6.7.
- going over the in-class part of Test II

*Content:*
- more on matrices of a linear transformation with respect to bases of the domain and codomain — in particular, the case of the identity transformation of a particular vector space *V* with respect to two different bases of *V*, including examples

**F:** Start reading §7.1. *Content:*
- definition of an *inner product space*
- examples of inner product spaces:
  - **R**^{n}
  - the *L*^{2} inner product on *C[0,1]*
- going from an inner product to a *norm*
- properties:
  - fundamental algebraic properties of inner products and norms
  - *Pythagoras' Theorem*
  - the *Cauchy-Schwarz-Buniakovski Inequality*
  - the *Triangle Inequality*
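Two of the properties above can be spot-checked numerically. The sketch below uses sampled functions and a Riemann sum as a crude stand-in for the *L*^{2} inner product on *C[0,1]* (the particular functions sin *x* and *x*^{2} are arbitrary choices); it checks Cauchy-Schwarz, |⟨*u,v*⟩| ≤ ‖*u*‖·‖*v*‖, and the Triangle Inequality, ‖*u+v*‖ ≤ ‖*u*‖+‖*v*‖.

```python
# Numerical spot-check of Cauchy-Schwarz and the Triangle Inequality for
# an L^2-style inner product, approximated by a Riemann sum on [0,1].
import math

n = 1000
xs = [i / n for i in range(n)]
u = [math.sin(x) for x in xs]          # samples of sin(x) on [0,1]
v = [x * x for x in xs]                # samples of x^2 on [0,1]

def inner(f, g):
    return sum(a * b for a, b in zip(f, g)) / n

def norm(f):
    return math.sqrt(inner(f, f))

assert abs(inner(u, v)) <= norm(u) * norm(v)       # Cauchy-Schwarz
w = [a + b for a, b in zip(u, v)]
assert norm(w) <= norm(u) + norm(v)                # Triangle Inequality
```

A check like this can never replace the proofs, but it is a good way to catch a misstated inequality before relying on it.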

**The plan for this week:**

**M:** Read §§7.1 & 7.2.
- by today you should have e-mailed me, telling me the major source(s) for your term project

*Content:*
- *Gram-Schmidt* in an inner product space, with the example of the Legendre polynomials
- *orthogonal transformations* and *projections* in an inner product space

**T:** Read §7.2 and part of §7.3, up to p. 580. *Content:*
- norms, distance functions
- the Best Approximation Theorem

**W:** Read §7.5, up to p. 626. *Content:*
- approximation of functions

Also:
- hand in **HW9**:
  - Chapter 6 Review Questions, pp. 536-537: 8, 16
  - §7.1: 6, 10, 40
- hand in **MI8**
- e-mail me (or hand in on paper) a draft of the introductory paragraph and outline of your term project

**F:**
- discussion of the final...

**Sunday:** **Special Extra Review Session** at 3pm in our usual classroom — come if you dare!

**Exam week**, no classes.
- Our *FINAL EXAM* is scheduled for two exam slots:
  - **Thursday, May 5, 2011, 8-10:20am in our usual classroom:** this is our main final exam. See this review sheet for lots of information.
  - **Friday, May 6, 2011, 8-10:20am in our usual classroom:** end-of-finals party ... and short (*5 min*) presentations, by those who wish to present, describing their term projects. Both those who present and the audience will earn some extra credit on their term project score.