Title: Algebra - Vector Inner product spaces
Description: Notes directly for the University of Bath 2nd year IPS and vector space course. As it is maths, will be similar to any course on IPS or Vector spaces. Includes marks of common exam questions here at bath. There will be similar to most other notes. Notes cover: Vector subspaces, sums and intersections, complementary subspaces, quotient spaces. Dual spaces, transpose of a linear map, annihilators. Inner product spaces over R and C. Cauchy-Schwarz inequality. Gram-Schmidt orthonormalization. Orthogonal subspaces and complements. Linear operators on inner product spaces. Orthogonal and unitary groups. Properties of eigenvalues and eigenspaces. Finite dimensional spectral theorem. Bilinear forms, relation with dual spaces, nondegeneracy. Tensor products and applications. Multilinear forms, alternating forms. Alternating bilinear forms, classification. Quadratic forms, relation to symmetric bilinear forms. Sylvester's law

Document Preview

Extracts from the notes are shown below.


MA20216 ALGEBRA 2A
Taught by: David Calderbank
Notes by: Robert Howie

Written December 2015

Contents

Preface
Useful definitions and Core concepts
Homomorphisms and Isomorphisms and linear maps
Subspace
Freely generated (free over)
Dual space and linear forms
Transpose of a linear map
Sums and Direct sums
Inner product spaces
Gram-Schmidt Process
Eigenvectors
The rest
Tensor product
Quadratic forms
Signature of quadratic forms
Exam cheat sheet

...
Most people I know still have no idea what an indexed set 𝔗 actually IS, let alone what being 'free over 𝔗' is supposed to mean, or what the difference between annihilators and solution spaces is
...
To his credit he does occasionally draw pictures, but only really in problems classes, where people have already been confused for a week on a topic
...
It is then followed by explanations of key concepts in the course using pictures and examples, and finally ends with a couple of pages of 'exam cheat sheet' for cramming key things for the exam
...
∃ : shorthand for 'there exists' (used across maths)
∃! : shorthand for 'there exists a unique' (used across maths)
∀ : shorthand for 'for all' (used across maths)
≔ : shorthand for 'is defined as' or 'is defined equal to' (used across maths)
w.r.t. : shorthand for 'with respect to' (used across maths)
{…} : a set with … Think of it like a mathematics box into which you put mathematical things, like numbers or equations (used in sets and group theory)
∈ : 'is an element of' a set (used in sets and group theory)
ℕ : the set of natural numbers (used across maths)
ℤ : the set with every whole number in, including zero and below: {…, -2, -1, 0, 1, 2, …} (used across maths)
ℚ : the set with all rational numbers in: {a/b | a ∈ ℤ, b ∈ ℕ} (used across maths)
ℝ : the set of every number you can write as a decimal, which is all numbers which you might actually want to write in real life (used across maths)
𝔽 : a general field. Used when proving that something is true, not just for a specific field (e.g. ℝ) but for all fields (used across maths)
M_{n×m}(𝔽) : the set of n×m matrices with entries in 𝔽 (used for matrices)
f∘g : the composition of the maps f and g (used across maths)
iff : shorthand for 'if and only if'. If you are asked to prove X iff Y you must show X⇒Y and Y⇒X (used across maths)
L_𝔽(V, W) : the set of linear maps from V to W over 𝔽 (used for vector spaces)
V* : the dual space of a vector space V (used for vector spaces)
⊕ : direct sum (see below) (used for vector spaces)
X̄ : complex conjugate; the conjugate transpose is also written X̄ᵀ (used for complex vectors)
⟨a|b⟩ : inner product of a with b (used in inner product spaces)
U⊥ : subspace orthogonal to U (used in inner product spaces)
X⊗Y : tensor product of X and Y (used in multilinear algebra)
⊆ : shorthand for 'is subset of' or 'is contained within' (used in set theory)
≤ : shorthand for 'is subspace of' (used for vector spaces)
Useful definitions and Core concepts
Field
...
(see below)
Field Axioms
...

Notation!!! If a set called 𝔽 obeys the field axioms for addition then it is written (𝔽, +), which reads '𝔽, a field over addition'
...
This is found more in textbooks, and occasionally pops up in his notes
...

A morphism is 'a structure preserving map'
...

You will just be told that a map is a homomorphism or 'linear' in exams or in the notes, but knowing what it means allows you to prove things
...


Linear map:

This is a homomorphism, but for vector spaces
...
So for a map 𝑓 to be linear, it needs to satisfy the homomorphism criterion
...

A vector space is simply a set, over a field, which obeys the vector space axioms: multiplication is valid between a scalar from the field and a vector, and addition is valid between two vectors
...

If your vector space is ℝ² (2D space - a sheet of paper) you can't move to ℝ³ by addition or multiplication
Vectors are elements of a vector space
...
E.g. writing a column vector in terms of a basis:
(𝑎1, 𝑎2, …, 𝑎𝑛) = 𝑒1𝑎1 + 𝑒2𝑎2 + ⋯ + 𝑒𝑛𝑎𝑛, where 𝑒𝑥 represents the 𝑥th basis vector
...
This can be confusing for higher dimensions, but trying to work something out by thinking about a 2d or 3d example can be very helpful
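If you want to sanity-check this on a computer, the coordinates are exactly the weights in that linear combination; a minimal numpy sketch (my own example, not from the notes):

```python
import numpy as np

# Hypothetical basis of R^2 (as columns): e1 = (1, 0), e2 = (1, 1)
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([3.0, 2.0])   # a vector in standard coordinates
a = np.linalg.solve(B, v)  # its coordinates with respect to e1, e2

# v is recovered as the linear combination a1*e1 + a2*e2
assert np.allclose(B @ a, v)
print(a)  # [1. 2.]
```

Any invertible matrix B works as a basis here; changing basis is just solving this linear system.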
In proofs he talks about the 'vector spaces of functions' and a couple of other quite abstract vector spaces, which can be very confusing, because they don't seem to contain vectors
...
If it does (for example if he rambles on about vector spaces of functions) don't worry! Just treat them exactly like you would a normal vector
...

A subspace is a vector space which is contained within another vector space
...
E.g. 'prove that the differentiable functions are a subspace of the space of real functions'
...

For X, Y differentiable: d/dx (aX + bY) = a d/dx (X) + b d/dx (Y), by the rules of differentiation
...

Category theory
...
We have already seen how all functions are maps and all maps
can be represented as matrices which in turn link to vector spaces
...
It’s all very boring and very unexaminable
...


Index Set
...
It is a simple concept which is poorly explained within the notes
...
Or in layman's terms, how many 'things' you need to know to define something
...
If we think about this in terms of a basis of 3d space: 𝑒1, 𝑒2, 𝑒3, we see these are 'labelled' by {1,2,3}, therefore the index set of 3d space (ℝ³) is {1,2,3}
...
How do you define a sequence? You need to know what every term is: {term1, term2, term3, …, termn}. Since in a sequence there are an infinite number of terms, to 'label' every term we need ℕ 'labels'
...

You can use the same logic to think about how maps from an interval [𝑎, 𝑏] are indexed by the interval itself, since you need to know where every point goes to define a map
Freely generated (free over)
...
If we have the same index set indexing two lists of vectors in two different vector spaces
...
E.g. {𝑢1, 𝑢2, 𝑢3} ⊆ 𝑈 and {𝑣1, 𝑣2, 𝑣3} ⊆ 𝑉 are two lists of vectors, one in each vector space, indexed by 𝔗 ≔ {1,2,3}
...

Saying something is 'free over 𝔗' is essentially equivalent to saying it has a basis
...
It is difficult to understand and there isn't much he can ask other than regurgitating the definition from his notes
...

Linear independence
...
For example in ℝ²:

[Diagram: two sketches in the X-Y plane. In the first, 𝑣1 and 𝑣2 lie along the same line through the origin; in the second, 𝑣1 and 𝑣2 point in different directions.]

As shown in the diagram, the first example is not linearly independent, since 𝑣1 = 𝑎𝑣2 where 𝑎 is a scalar
...

To show that a set of 𝑛 vectors is linearly independent you just need to show that
𝜆1𝑣1 + 𝜆2𝑣2 + ⋯ + 𝜆𝑛𝑣𝑛 = 0 ⟹ 𝜆𝑖 = 0 ∀𝑖

Exam tip: this may look tricky but if you are given, for example, 3 vectors in ℝ³: 𝑣1, 𝑣2, 𝑣3, multiply each by an unknown and equate to zero: 𝑎𝑣1 + 𝑏𝑣2 + 𝑐𝑣3 = 0, then substitute the vectors in and solve each row of the resulting vector equation like a simultaneous equation
...
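That exam tip is exactly a rank computation; a quick numpy sketch (my own made-up vectors, not from the notes):

```python
import numpy as np

# Three hypothetical vectors in R^3
v1, v2, v3 = np.array([1., 0., 1.]), np.array([0., 1., 1.]), np.array([1., 1., 0.])

# a*v1 + b*v2 + c*v3 = 0 becomes M @ (a, b, c) = 0 with the vectors as columns
M = np.column_stack([v1, v2, v3])

# Linearly independent iff the only solution is a = b = c = 0,
# i.e. M has full column rank
independent = np.linalg.matrix_rank(M) == 3
print(independent)  # True
```

By hand you would row-reduce the same matrix and check no unknown is free.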

Dual space and linear forms
...

Example: a linear map from ℝ³ to ℝ, such as 𝑓(𝑥, 𝑦, 𝑧) = 𝑥 + 2𝑦 + 3𝑧, is a linear form
...
(Careful: the magnitude ‖𝑣‖, e.g. ‖(1, 2, 3)‖ = √14, is NOT a linear form, since ‖−𝑣‖ = ‖𝑣‖ rather than −‖𝑣‖.)
A dual space of a vector space is the space of all linear forms
...
The important thing to note
about dual spaces is that any equation within a dual space can be written as an 'action on a basis'
...

Important note: If there is a finite basis, then: dim𝑉 = dim𝑉*
...

If you have a vector space called 𝑉, and a set of homogeneous linear forms from the dual space called 𝐸, the solution space is the subspace of 𝑉 whose elements solve the set of linear equations 𝐸
...
Take the x-y Cartesian coordinates (e.g. ℝ²) as an example
...
This is our set 𝐸
...
The diagram above is again helpful in demonstrating this
...

This again is a simple thing made complicated
...
Its transpose is the transpose of its matrix
...
They are available on the solutions of 1b 2014 and 1b 2012 (like clockwork)

Annihilator
...
Unfortunately this means that it becomes one of those things where, if you 'get' it, you're like 'ohhh that's obvious' and you kick yourself for not seeing the difference
...
Hence why he spends so much time in problems classes halfway through the year trying to explain the concept
...
For a solution space we pick a set of equations, and
find what they all send to zero
...


𝑓 = π‘Œ + 𝑋 and 𝑔 = π‘Œ βˆ’ 𝑋
𝑓=0

Here is the equations
in their homogenous
β€˜=0’ form

𝑋= π‘Œ

Here is the subspace I
am referring to

𝑔=0
Find the SPACE in ℝ2 , in which the equations
intersect
...

In this case all points on that line go to zero when
the equation looks like 𝑓 = π‘Œ βˆ’ 𝑋

Exam tip: learn a method to differentiate between these two so you don't confuse them in the exam
...

If you have two spaces called 𝑉 and 𝑊
...
Example: ℝ³
...
Take a unit vector along each line
...

[Here the notes attempt a stereoscopic 3D drawing of 𝑉 ⊕ 𝑊.]

If the intersection (shared elements) of 𝑉 and 𝑊 is the zero element (𝑉 ∩ 𝑊 = {0}) we call their sum the 'direct sum', denoted by ⊕
...
So we are 'directly summing' the bases
...
Hence it's not a 'direct' sum of bases, it's a more complicated process
...
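The condition 𝑉 ∩ 𝑊 = {0} can be checked numerically by stacking bases and looking at rank; a sketch (my own made-up subspaces, not from the notes):

```python
import numpy as np

# Hypothetical example: V = span{(1,0,0)}, W = span{(0,1,0), (0,0,1)} in R^3
V = np.array([[1., 0., 0.]]).T                 # basis of V as columns
W = np.array([[0., 1., 0.], [0., 0., 1.]]).T   # basis of W as columns

# V ∩ W = {0} exactly when the combined basis vectors stay linearly
# independent, and then dim(V ⊕ W) = dim V + dim W
combined = np.hstack([V, W])
is_direct = np.linalg.matrix_rank(combined) == V.shape[1] + W.shape[1]
print(is_direct)  # True
```

If the rank dropped below dim V + dim W, the two subspaces would share a non-zero vector and the sum would not be direct.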

The affine spaces define a quotient space, so bear with this
...

The blue plane is parallel to the green space
...

π‘Ž 𝑏

Quotient spaces are harder to get your head round, so make sure you have a good understanding of affine
subspaces before you venture into this! Also I find the notes very hard to visualize, so I will be adding a
picture
...
Confusing huh? In
𝑉

the notes the definition 𝑃 ≔ {𝑣 + 𝑃|𝑣 ∈ 𝑉} is possibly even worse
...
It might also be useful to look
at alg 1A equivalence classes
...
There are two ways to think about this, one algebraic
and one topological
...
First of all we take our subspace 𝑃, work out all of the affine
subspaces
...
Generally dim(𝑉/𝑃) = dim𝑉 − dim𝑃
...
This creates a set of points in the line directed along the lines with infinity signs at the end on the diagram
...

I have stuck to referencing the space ℝ³ in all of this, but the same applies to any space, although it's extremely hard to visualise this for, say, a dual space or other exotic space; the same principles apply
...

An inner product space (IPS) is simply a vector space, with a defined inner product
...
An inner product is anything that follows these 3 rules:

⟨𝑢|𝑣⟩ = conj⟨𝑣|𝑢⟩   (conjugate symmetric)
⟨𝑢|𝜆𝑣 + 𝜇𝑤⟩ = 𝜆⟨𝑢|𝑣⟩ + 𝜇⟨𝑢|𝑤⟩   (linear in second variable)
⟨𝑣|𝑣⟩ ≥ 0, equality only if 𝑣 = 0   (positive definite)

Algebra of IPS, RULES:

|⟨𝑢|𝑣⟩| ≤ ‖𝑢‖‖𝑣‖   (Cauchy-Schwarz inequality)
If ⟨𝑢|𝑣⟩ = 0 (the angle is a right angle) then ‖𝑢 + 𝑣‖² = ‖𝑢‖² + ‖𝑣‖², i.e. you can apply Pythagoras   (Pythagoras theorem)
‖𝑢 + 𝑣‖ ≤ ‖𝑢‖ + ‖𝑣‖   (⊿ inequality)
‖𝑢 + 𝑣‖² + ‖𝑢 − 𝑣‖² = 2(‖𝑢‖² + ‖𝑣‖²)   (parallelogram identity)
⟨𝑣|𝑤⟩ = 0 ∀𝑤 ∈ 𝑊 ⟹ 𝑣 = 0   (non-degeneracy lemma)
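All of these rules can be sanity-checked numerically; a sketch of mine (not from the notes) using numpy's vdot, which conjugates its first argument just like ⟨·|·⟩:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)

ip = lambda a, b: np.vdot(a, b)        # np.vdot conjugates its first argument
norm = lambda a: np.sqrt(ip(a, a).real)

# Cauchy-Schwarz: |<u|v>| <= ||u|| ||v||
assert abs(ip(u, v)) <= norm(u) * norm(v)
# Triangle inequality
assert norm(u + v) <= norm(u) + norm(v)
# Parallelogram identity
assert np.isclose(norm(u + v)**2 + norm(u - v)**2, 2 * (norm(u)**2 + norm(v)**2))
print("all three identities hold")
```

No random pair of vectors can break these assertions, which is exactly what the proofs guarantee.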

Riesz representation of an IPS:
The dual space is a set of maps mapping from the vector space to a scalar
...

An inner product maps two vectors from the vector space to a scalar
...

These are already very similar, and in fact if we write an inner product like ⟨𝑣| ∙⟩, with the dot standing for 'put any vector from 𝑉 into here', it becomes a map taking one vector to a scalar
...
This is therefore in the dual space
...
Try to learn the proofs of the IPS rules since they come up periodically, as well as the Riesz representation of an IPS, since this appears most years
...
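A small numpy sketch of the idea (my own, assuming the standard inner product on ℂ³): every linear form is ⟨𝑣|∙⟩ for exactly one vector 𝑣, and you can recover that vector from the form's values on a basis:

```python
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)

f = lambda x: np.vdot(v, x)   # the linear form <v|.>

# f is pinned down by its values on a basis; the representing vector
# comes back as v_i = conj(f(e_i))
e = np.eye(3)
v_recovered = np.conj(np.array([f(e[i]) for i in range(3)]))
assert np.allclose(v_recovered, v)
print("recovered the Riesz vector")
```

This is the finite-dimensional content of the theorem: forms and vectors carry the same information.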

Two (non-zero) elements of a vector space 𝑢 and 𝑣 are orthogonal if ⟨𝑢|𝑣⟩ = 0
...

An element is normalized if it has length one
...


Two elements are orthonormal if they are both orthogonal and normalized
...

Two subspaces 𝑈 and 𝑉 are orthogonal if ⟨𝑢|𝑣⟩ = 0 for all elements 𝑢 ∈ 𝑈, 𝑣 ∈ 𝑉
...


Gram-Schmidt Process
You just need to know how to apply it; the theory behind it is rather useless for the exam, although Khan Academy has a really good video explaining it
...

The definition is in the notes or as an answer in the exam past papers
...
The best worked examples are past exams 2012 and 2011
...
It has always been 3 marks 'state' then 4 marks 'do'
...
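If you want to check your hand-worked answers, here is a minimal implementation (my own sketch, not the notes' definition):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal list."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(q, v) * q          # subtract the projection onto q
        basis.append(w / np.linalg.norm(w))   # normalise to length 1
    return basis

q1, q2 = gram_schmidt([np.array([1., 1., 0.]), np.array([1., 0., 1.])])
print(np.dot(q1, q2))      # ~0: the outputs are orthogonal
print(np.linalg.norm(q2))  # ~1: and normalised
```

The exam version is the same loop done by hand: project off what you already have, then divide by the norm.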

Adjoints
...
I have tabled them to make it easier to read and learn
...
The rules are less examinable although a proof could come up with them in it
...

Eigenvectors are what they always have been: 𝜙(𝑣) = 𝜆𝑣 for eigenvector 𝑣, eigenvalue 𝜆
...
The only new thing is invariance:
𝜙 invariant means that applying 𝜙 to any element of a subspace keeps the result within the subspace
...

Another table of adjoint 'facts' (there are millions of these (sorry!)):

If 𝑈 is 𝜙 invariant, then 𝑈⊥ is 𝜙* invariant   (adjoint invariance)
𝜙*(𝑣) = 𝜇𝑣 and 𝜙(𝑢) = 𝜆𝑢 ⟹ (𝜆 − 𝜇̄)⟨𝑢|𝑣⟩ = 0   (adjoint eigenvalues)
𝜙*(𝑣) = 𝜇𝑣 and 𝜙(𝑣) = 𝜆𝑣 ⟹ 𝜆 = 𝜇̄   (adjoint conjugate eigenvalues)
𝜙 ∘ 𝜙* = 𝜙* ∘ 𝜙 ⟹ (𝜙*(𝑣) = 𝜆𝑣 ⇔ 𝜙(𝑣) = 𝜆̄𝑣)   (normal adjoint rules)
𝜙 self adjoint ⟹ eigenvalues are purely real
𝜙 skew adjoint ⟹ eigenvalues have no real part (are just made of 𝑖's)

Exam tip: don't stress about memorizing all million 'adjoint facts' that he put in
...
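The last two facts are easy to check numerically; a sketch of mine with random matrices (not from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

H = A + A.conj().T    # self adjoint (Hermitian): H* = H
S = A - A.conj().T    # skew adjoint: S* = -S

# self adjoint => eigenvalues purely real
print(np.allclose(np.linalg.eigvals(H).imag, 0, atol=1e-10))  # True
# skew adjoint => eigenvalues purely imaginary
print(np.allclose(np.linalg.eigvals(S).real, 0, atol=1e-10))  # True
```

Any square matrix splits as H/2 + S/2, which is why these two cases cover so much ground.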

Spectral theorem
...
Also, self adjoint and symmetric are just real cases of normal and Hermitian matrices, so you don't need to learn them specifically
...
It is by far the most confusing and non-routine part IMO
...

Another note is that his lecture notes on this part are actually decent compared to the rest
...

Bilinearity and degeneracy
...
For example you could have a map called ℳ which 'multiplies' matrices
...
Now ℳ(𝐴, 𝐵) = 𝐶 ∈ 𝑀𝑛×𝑙(𝔽)
...
E.g. 𝛼(𝜆𝑥 + 𝜇𝑦, 𝑧) = 𝜆𝛼(𝑥, 𝑧) + 𝜇𝛼(𝑦, 𝑧) is 'linear in the first variable'
...
Written out in full it looks complicated, but in reality it is just an extension of linearity:
Let 𝑉, 𝑈, 𝑊, 𝑌 be vector spaces over 𝔽
...
Let 𝛼, 𝛽, 𝛾, 𝛿 ∈ 𝔽
...
E.g. some 𝐴 ≠ 0 which sends ℳ(𝐴, 𝐵) = 0 ∀𝐵 ∈ 𝑈 is 'degenerate in the first variable', and similarly if there is a 𝐵 ≠ 0 that nullifies the bilinear map then it is called 'degenerate in the second variable'
...

In the same way that we fix 𝑣 to create a linear map from an inner product by writing it as ⟨∙|𝑣⟩, we can do the same to any bilinear map ℱ by writing it as ℱ(∙, 𝐴) or ℱ(𝐴, ∙); this fixes the variable in the place of 𝐴 and allows it to vary over 𝐵 where the ∙ is
...
The bilinear map is non-degenerate in that variable if ker ℱ = {0}
...

Imagine a multilinear map with 10 or 15 variables
...
We don't want this, we want to work in linear algebra, and therefore it's nice to have a way to turn multilinear algebra INTO linear algebra
...

Say we have vector spaces 𝑈 and 𝑉 over a field 𝔽
...

A really good analogy is ℝ²
...
The space ℝ² gives all possible pairings of two real numbers
...

Example linear map: 𝒜 adds together two numbers, so 𝒜(𝑎, 𝑏) = 𝑎 + 𝑏
...


Exam tip: learn to regurgitate the proof in the notes/answer to 'define a notion of a bilinear map/tensor product/quadratic form' from the notes; this always comes up and is easy marks
...
So if 𝑢1 … 𝑢𝑘 is a basis for 𝑈, and 𝑣1 … 𝑣𝑙 is the same for 𝑉, then a basis for 𝑈⊗𝑉 is {𝑢𝑖 ⊗ 𝑣𝑗} over all 𝑖 ≤ 𝑘 and 𝑗 ≤ 𝑙
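Concretely, a sketch of mine using the Kronecker product as the coordinate realisation of ⊗ (an assumption of this example, not the notes' construction):

```python
import numpy as np

# Standard bases of U = R^2 and V = R^3
u_basis = [np.eye(2)[i] for i in range(2)]
v_basis = [np.eye(3)[j] for j in range(3)]

# u_i ⊗ v_j realised via the Kronecker product; the 2*3 = 6 products
# form a basis of U ⊗ V, so dim(U ⊗ V) = dim U * dim V = 6
products = [np.kron(u, v) for u in u_basis for v in v_basis]
rank = np.linalg.matrix_rank(np.column_stack(products))
print(rank)  # 6
```

Note the contrast with the direct sum, where dimensions add rather than multiply.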

Bilinear Forms and parallels to linear maps
Almost all terminology about linear maps is translatable into bi and multi-linear maps
...

π‘‘π‘–π‘šπ‘‰ = π‘Ÿπ‘Žπ‘›π‘˜β„± + ′𝑛𝑒𝑙𝑙′ℱ
π‘‘π‘–π‘šπ‘‰ = π‘Ÿπ‘Žπ‘›π‘˜β„± + π‘‘π‘–π‘š π‘Ÿπ‘Žπ‘‘β„±

Quadratic forms
...
Remember a bilinear form maps from a vector space to the scalar field the vector space is over
...

A quadratic form fixes it so the variables are no longer independent
...
E.g. since we know both variables are the same we just write it 𝑄(𝑣) ≔ ℱ(𝑣, 𝑣)
...
So 𝑄 is 'negative definite' on 𝑉 if −𝑄 is positive definite
...
For example, is 4 − 5𝑖 positive or negative??
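In coordinates, fixing both variables equal looks like this; a sketch of mine (the matrix 𝐴 is made up):

```python
import numpy as np

# A hypothetical symmetric matrix representing a bilinear form F(x, y) = x^T A y
A = np.array([[2., 1.],
              [1., 3.]])

F = lambda x, y: x @ A @ y
Q = lambda v: F(v, v)        # quadratic form: fix both variables equal

v = np.array([1., 2.])
print(Q(v))  # 2*1 + 2*(1*2) + 3*4 = 18.0
```

The cross term appears twice (once from each off-diagonal entry), which is why symmetric matrices and quadratic forms pair up so neatly.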
Signature of quadratic forms:
Often 𝑄 on 𝑉 can vary and be neither positive nor negative definite, such that im𝑄 = ℝ, or similar
...
E.g. 𝑄(𝑢) is positive definite s
...

Hence we can find the smallest and biggest subspaces that satisfy '𝑄(𝑢) is positive or negative definite'
...

Sylvester's Law of inertia for Quadratic forms
...

rank ℱ = 𝑝 + 𝑞, and the diagonal matrix representing ℱ has 𝑝 positive entries and 𝑞 negative entries along the diagonal
...
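Computing the signature (𝑝, 𝑞) numerically; a sketch of mine with a made-up matrix (diagonalising via eigenvalues, which is one valid way to reach Sylvester's diagonal form):

```python
import numpy as np

# Hypothetical symmetric matrix of a quadratic form
A = np.array([[1.,  0., 0.],
              [0., -2., 0.],
              [0.,  0., 5.]])

# Diagonalise and count the signs of the eigenvalues
eigs = np.linalg.eigvalsh(A)
p = int(np.sum(eigs > 0))    # positive entries on the diagonal
q = int(np.sum(eigs < 0))    # negative entries on the diagonal
print((p, q))                # the signature: (2, 1); rank = p + q = 3
```

Sylvester's law says (𝑝, 𝑞) is the same whichever change of basis you use to diagonalise, so this count is well defined.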
Theory over
...

Structure
...
(Do all Q's if you have time!)
~50% of marks are 'repeat definitions and solutions from notes'; ~50% are solving things and doing 'unseen proofs'. Learn the stuff below
...

Question 1
...

Regular Q's:
• Define annihilator and Solution space
• Define transpose of linear map
• Write down proof of ker 𝜙 = sol(im 𝜙ᵀ)
• Prove (set of vectors) linearly independent

Question 2 – Adjoints and Eigens
Linear operators, adjoints, spectral (𝑃⁻¹𝐴𝑃) theorem, eigenvectors and eigenspaces, more IPSs
Regular Q's:
• Define adjoint 𝜙* of 𝜙 (or other subclass of adjoint, e.g. self adjoint)
• Define orthogonal complement 𝑈⊥ of 𝑈
• State spectral theorem
• Show 𝑃⁻¹𝐴𝑃 is true and/or find 𝑃 for some matrix

Question 3 – Multilinear things
Bilinear maps, quadratic forms, signatures, tensor product and quotient spaces
Regular Q's:
• Define Bilinear map or quadratic form and/or prove *** is one