Title: Linear Algebra - Linear Transformations
Description: These notes cover all of Linear Transformations

Document Preview

Extracts from the notes are below.


Linear Transformations
Finally, we get to our last, absolute last, topic
...
We’ll look at them in order
...
It’s only a transformation if each vector x ∈ V gets linked to a SINGLE
vector in W , called the image of x
...

• T (ax) = aT (x) for all x ∈ V and a ∈ R
...

Note: ‘Linear Transformation’ may SOUND like it means ‘movement in a straight line’
...
It means that definition, nothing more or less
...
A
transformation that rotates vectors around the origin (circular motion) WILL be a linear
transformation
...
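A quick numerical sanity check (not from the notes): treating a rotation about the origin as multiplication by the usual 2×2 rotation matrix, both linearity conditions hold for randomly chosen vectors. The 30° angle and the random vectors are arbitrary choices for this sketch.

```python
import numpy as np

# A rotation about the origin, written as a matrix (30 degrees is an arbitrary choice).
theta = np.radians(30)
A = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)
a = 2.5

# The two linearity conditions: scaling and additivity.
print(np.allclose(A @ (a * x), a * (A @ x)))      # T(ax) = a T(x)
print(np.allclose(A @ (x + y), A @ x + A @ y))    # T(x + y) = T(x) + T(y)
```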

(Lame) Examples:
The transformation T : Rⁿ → Rᵐ where T(x) = 0 for all x is the zero transformation
...
Note that the image (output, whatever) is unique in the sense that each vector
can have only one image, but multiple vectors can have the same image
...
It’s linear
...

Example: The projection onto a subspace of Rⁿ, ProjV(x), from Rⁿ to Rⁿ, is a linear
transformation
...
The projection of (1, 0) is simply (1/2, −1/2), that for (0, 1) is (−1/2, 1/2), so the whole thing is
[ 1/2 −1/2 ; −1/2 1/2 ]

...


So, it has eigenvalues 0 and 1
...
Think about it
...
So, if we diagonalize, then A = PDP⁻¹ and A² = PD²P⁻¹; since projecting twice is the same as projecting once, A² = A, so D² = D and the eigenvalues are either 0 or 1
...
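As a check of that eigenvalue claim, here is a small numpy sketch using the 2×2 projection matrix reconstructed above: projecting twice equals projecting once, and the eigenvalues come out as 0 and 1.

```python
import numpy as np

# The 2x2 projection matrix built above (projection onto the line spanned by (1, -1)).
P = np.array([[0.5, -0.5],
              [-0.5, 0.5]])

print(np.allclose(P @ P, P))               # True: projecting twice = projecting once
print(np.round(np.linalg.eigvals(P), 10))  # eigenvalues 0 and 1
```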

Other examples include rotations
...
We’d also want to change (0, 1) (directly up) to (1, 0) (directly to the right)
...
Let’s look at those eigenvalues for the matrix [ 0 1 ; −1 0 ]:

det [ −λ 1 ; −1 −λ ] = λ² + 1,

so

λ = ±i
...
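A numpy check of that computation, assuming the rotation matrix reconstructed above really is [ 0 1 ; −1 0 ]:

```python
import numpy as np

# 90-degree clockwise rotation: sends (0, 1) (up) to (1, 0) (right).
R = np.array([[0, 1],
              [-1, 0]])

print(R @ np.array([0, 1]))   # [1 0]
print(np.linalg.eigvals(R))   # [0.+1.j 0.-1.j], i.e. lambda = +/- i
```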
If you want to create an arbitrary rotation in R² (and all rotations are 2 dimensional, really) then you get
[ cos(θ) sin(θ) ; −sin(θ) cos(θ) ],   θ = degrees rotation clockwise
...
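A small helper along those lines; note the preview only shows the first row of the matrix, so the second row [ −sin(θ) cos(θ) ] here is the standard completion rather than something visible in the extract.

```python
import numpy as np

def clockwise_rotation(theta_degrees):
    """Rotation of R^2 by theta degrees clockwise (second row assumed, see note above)."""
    t = np.radians(theta_degrees)
    return np.array([[np.cos(t), np.sin(t)],
                     [-np.sin(t), np.cos(t)]])

# 90 degrees clockwise should reproduce the matrix [ 0 1 ; -1 0 ] used earlier.
print(np.round(clockwise_rotation(90), 12))
```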



[ 5/6 1/6 1/3 ; 1/6 5/6 −1/3 ; 1/3 −1/3 1/3 ] = (1/6) [ 5 1 2 ; 1 5 −2 ; 2 −2 2 ]
...

So, how do we find the matrix to calculate the projection onto V⊥? This is easy, actually
...

The operator I − A will work as the projection matrix onto V⊥ (when A is the matrix for
the projection onto V)
...
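A numpy sketch of the I − A claim, using the 3×3 projection matrix as reconstructed above (that reconstruction is an assumption of this check):

```python
import numpy as np

# Reconstructed projection matrix from the example above: A = (1/6) [ 5 1 2 ; 1 5 -2 ; 2 -2 2 ].
A = np.array([[5, 1, 2],
              [1, 5, -2],
              [2, -2, 2]]) / 6

B = np.eye(3) - A                 # candidate projection onto V-perp

print(np.allclose(A @ A, A))      # A is a projection
print(np.allclose(B @ B, B))      # I - A is also a projection
print(np.allclose(A @ B, 0))      # and the two project onto orthogonal pieces
```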

0
2

Property: If T : Rⁿ → Rᵐ and S : Rᵐ → Rˡ are linear transformations, then S · T is a
linear transformation from Rⁿ to Rˡ
...
So S · T has BA as its matrix (B the matrix of S, A the matrix of T),
and so it has to be a linear transformation
...
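To illustrate the composition property, a short numpy example with made-up matrices A (for T) and B (for S); applying T and then S gives the same vector as multiplying by BA.

```python
import numpy as np

A = np.array([[1, 0, 2],
              [0, 1, -1]])   # T : R^3 -> R^2
B = np.array([[2, 1],
              [0, 3]])       # S : R^2 -> R^2

x = np.array([1, 2, 3])

print(B @ (A @ x))    # S(T(x))
print((B @ A) @ x)    # (BA) x  -- the same vector
```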

The Kernel of T is the set {v | T(v) = 0 ∈ W}
...
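For a matrix transformation, the kernel is just the null space of the matrix. A small sympy illustration with a made-up 2×3 matrix (not one from the notes):

```python
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [0, 1, -1]])

kernel = A.nullspace()   # basis vectors v with A v = 0
print(kernel)            # [Matrix([[-2], [1], [1]])]
print(A * kernel[0])     # the zero vector of W, as the definition requires
```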

Section 4
...
b), 10,


Pre-Exam Section:
Examples, Again


 

1
2
1
...
Show that U = {f | f(1 + x) = −f(1 − x)} is a subspace of F(0, 2)
...
Solve the linear system

3x1 + x2 + x3 = 5
−x1 + x2 − 3x3 = −3
2x1 + x2 = 3
...
Solve the linear system




5
...





 and U ⊥
...

6
...
Find the best fit for the points (x, y) : (0, 2), (1, −3), (2, −2), for a line a0 + a1 x = y
...
Find the eigenvalues and eigenvectors of A =  2 2 −2 
...
Calculate T (0, 1, 0) for the linear operator T : R³ → R², where
T (1, 1, 0) = (0, ...),   T (1, 0, 1) = (1, ...),   T (2, 0, 3) = (1, ...)

...





det [ u1 1 2 ; u2 2 −3 ; u3 −1 1 ] = u1 det [ 2 −3 ; −1 1 ] − u2 det [ 1 2 ; −1 1 ] + u3 det [ 1 2 ; 2 −3 ]

= (1, 0, 0)(−1) − (0, 1, 0)(3) + (0, 0, 1)(−7) = (−1, −3, −7)
...
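That determinant expansion is the usual cross-product trick; assuming the two column vectors really are (1, 2, −1) and (2, −3, 1), numpy reproduces the same answer:

```python
import numpy as np

n = np.cross([1, 2, -1], [2, -3, 1])
print(n)                                                     # [-1 -3 -7]

# The result is orthogonal to both of the original vectors.
print(n @ np.array([1, 2, -1]), n @ np.array([2, -3, 1]))    # 0 0
```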
As usual, three things to check
...

The zero function z is in U since
z(1 + x) = 0 = −z(1 − x)
...
U is closed under scalar multiplication
...
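The three checks can also be run symbolically; a small sympy sketch, where the sample functions f and g are arbitrary choices that happen to lie in U:

```python
import sympy as sp

x = sp.symbols('x')

def in_U(f):
    # Membership test for U = { f : f(1 + x) = -f(1 - x) }.
    return sp.simplify(f(1 + x) + f(1 - x)) == 0

z = lambda t: 0 * t          # the zero function
f = lambda t: t - 1          # sample member of U
g = lambda t: (t - 1)**3     # another sample member

print(in_U(z), in_U(f), in_U(g))                              # True True True
print(in_U(lambda t: f(t) + g(t)), in_U(lambda t: 7 * f(t)))  # closed under + and scalars
```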

3
...

Row reducing leaves a bottom row of all zeros, [ 0 0 0 | 0 ].
No contradiction, one free variable, x3 = t
...

(x1, x2, x3) = (2 − x3, −1 + 2x3, x3) = (2, −1, 0) + x3 (−1, 2, 1)
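A numpy check that the parametric solution written above (my reconstruction) satisfies the system for any choice of the free variable:

```python
import numpy as np

A = np.array([[3, 1, 1],
              [-1, 1, -3],
              [2, 1, 0]])
b = np.array([5, -3, 3])

# (x1, x2, x3) = (2, -1, 0) + t * (-1, 2, 1), as reconstructed above.
for t in (0.0, 1.0, -2.5):
    x = np.array([2, -1, 0]) + t * np.array([-1, 2, 1])
    print(np.allclose(A @ x, b))   # True for every t
```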
4
...
This has 0 = 1 as
the last row, so a contradiction and no solution
...
Arranging the vectors as rows of a matrix like so:




A = [ 1 1 1 −1 ; 1 0 2 −3 ]
...





The null space of A is equal to U ⊥
...

using t = x3 and s = x4
...
The inverse is  −2 −1 −3 
...
The data matrix is D = [ 1 0 ; 1 1 ; 1 2 ]; the first column relates to a0, so it is 1 in
every row
...
Using Gram-Schmidt on the columns of D, we get an orthogonal basis (1, 1, 1), (−1, 0, 1)
...
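A one-step Gram-Schmidt check, assuming the second column of D holds the x-values (0, 1, 2):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 1.0])          # first column of D
v2 = np.array([0.0, 1.0, 2.0])          # second column of D (the x-values)

u2 = v2 - (v2 @ u1) / (u1 @ u1) * u1    # subtract the projection onto u1
print(u2)                               # [-1.  0.  1.]
print(u1 @ u2)                          # 0.0, so the basis is orthogonal
```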
Solving for a0 and a1 uses the
ORIGINAL D matrix, so

[ 1 0 | 1 ; 1 1 | −1 ; 1 2 | −3 ] −→ [ 1 0 | 1 ; 0 1 | −2 ; 0 0 | 0 ]
...
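The same best-fit coefficients drop out of numpy's least-squares routine, which is a handy way to check the row reduction above:

```python
import numpy as np

D = np.array([[1, 0],
              [1, 1],
              [1, 2]], dtype=float)     # columns: constant term a0, then x
y = np.array([2, -3, -2], dtype=float)  # the y-values of (0,2), (1,-3), (2,-2)

coeffs, *_ = np.linalg.lstsq(D, y, rcond=None)
print(coeffs)                           # [ 1. -2.], i.e. a0 = 1, a1 = -2
```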





0
8
...
The eigenvectors for 2 are  1 
0
 


1
1
 0 
...

and
1
−1
 
First, we need to write (0, 1, 0) as a combination of the three vectors we know about



 
   

a1 (1, 1, 0) + a2 (1, 0, 1) + a3 (2, 0, 3) = (0, 1, 0)  −→  [ 1 1 2 | 0 ; 1 0 0 | 1 ; 0 1 3 | 0 ]


That reduces to [ 1 0 0 | 1 ; 0 1 0 | −3 ; 0 0 1 | 1 ], so
 
 
   
T (0, 1, 0) = T( (1, 1, 0) − 3 (1, 0, 1) + (2, 0, 3) )
= T (1, 1, 0) − 3 T (1, 0, 1) + T (2, 0, 3)
= (0, ...) − 3 (1, ...) + (1, ...) = (−2, ...)

...

0
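Finally, the coefficients a1 = 1, a2 = −3, a3 = 1 used in that last computation can be double-checked by solving the 3×3 system whose columns are the three vectors with known images:

```python
import numpy as np

M = np.array([[1, 1, 2],
              [1, 0, 0],
              [0, 1, 3]], dtype=float)   # columns: (1,1,0), (1,0,1), (2,0,3)
target = np.array([0, 1, 0], dtype=float)

print(np.linalg.solve(M, target))        # [ 1. -3.  1.]
```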




