Title: Method of least squares
Description: Linear algebra course

7.3 Method of Least Squares

Suppose that an experimenter measures some quantity y at times t₁, t₂, ..., tₙ and obtains the values y₁, y₂, ..., yₙ. Suppose that the experimenter believes that the data fit (more or less) a curve of the form y = a + bt + ct². If the data fit the curve y = a + bt + ct² perfectly, then for each i, yᵢ = a + btᵢ + ctᵢ².


Figure 7.3.3  Some data points and a curve y = a + bt + ct². Vertical line segments measure the error in the fit at each tᵢ.


One might try to choose a, b, and c so that the total error

(y₁ − a − bt₁ − ct₁²) + ··· + (yₙ − a − btₙ − ctₙ²)

is as small as possible. This would be unsatisfactory, however, because we might get a small total error by having large positive errors cancelled by large negative errors. Instead, we choose a, b, and c to minimize the sum of the squares of the errors,

(y₁ − a − bt₁ − ct₁²)² + ··· + (yₙ − a − btₙ − ctₙ²)².

To find the parameters a, b, and c that minimize this expression for given values t₁, ..., tₙ, y₁, ..., yₙ, one could use calculus, but we will proceed by using a projection.
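
As a small computational aside (not part of the original notes), the quantity to be minimized can be written as a function of a, b, and c. The arrays below are made-up data used purely for illustration.

import numpy as np

# Made-up measurement times and values, for illustration only.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 4.2, 7.9, 13.1])

def sum_of_squared_errors(a, b, c):
    # Sum of squared vertical errors for the model y = a + b*t + c*t**2.
    residuals = y - (a + b * t + c * t**2)
    return float(np.sum(residuals**2))

print(sum_of_squared_errors(1.0, 0.5, 0.7))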


Let y = (y₁, y₂, ..., yₙ), 1 = (1, 1, ..., 1), t = (t₁, t₂, ..., tₙ), and t² = (t₁², t₂², ..., tₙ²) be vectors in ℝⁿ, and consider the distance from y to the vector a1 + bt + ct². Observe that the square of this distance is exactly the sum of the squares of the errors:

‖y − (a1 + bt + ct²)‖² = (y₁ − a − bt₁ − ct₁²)² + ··· + (yₙ − a − btₙ − ctₙ²)².
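
A quick numerical check of this identity, assuming NumPy and the same made-up data as above: the squared length of the error vector equals the sum of the squared errors.

import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 4.2, 7.9, 13.1])
a, b, c = 1.0, 0.5, 0.7                                  # arbitrary trial parameters

e = y - (a * np.ones_like(t) + b * t + c * t**2)         # error vector y - (a1 + bt + ct^2)
print(np.isclose(np.linalg.norm(e)**2, np.sum(e**2)))    # True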


For any a, b, and c, the vector a1 + bt + ct² is a vector in the subspace S of ℝⁿ spanned by B = {1, t, t²}. If at least four of the tᵢ are distinct, then B is linearly independent (see Problem D2), so it is a basis for S. Thus, we want to find a, b, and c such that a1 + bt + ct² is the vector in S that is closest to y. By the Approximation Theorem, this vector is proj_S y, and the required a, b, and c are the B-coordinates of proj_S y. However, we can use the theory of orthogonality and projections to simplify the problem. If a, b, and c have been chosen correctly, the error vector e = y − a1 − bt − ct² is equal to perp_S y. Therefore, e must be orthogonal to S, and hence orthogonal to each of 1, t, and t²; this gives the three equations

1 · (y − a1 − bt − ct²) = 0
t · (y − a1 − bt − ct²) = 0
t² · (y − a1 − bt − ct²) = 0

It is helpful to rewrite these equations by introducing the matrix X = [1 t t²] and the vector of parameters a = [a b c]ᵀ.


Since the three equations are obtained by taking dot products of e with the columns of X, the system of equations can be written in the form

Xᵀ(y − Xa) = 0

The equations in this form are called the normal equations for the least squares fit.
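
A minimal sketch of the normal equations in code, assuming NumPy and made-up data; X has columns 1, t, and t², and the system XᵀXa = Xᵀy is solved directly.

import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 4.2, 7.9, 13.1])

X = np.column_stack([np.ones_like(t), t, t**2])    # columns 1, t, t^2
a_vec = np.linalg.solve(X.T @ X, X.T @ y)          # normal equations X^T X a = X^T y
print("a, b, c =", a_vec)

e = y - X @ a_vec
print("X^T e =", X.T @ e)    # approximately zero: e is orthogonal to the columns of X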

For a more general situation, we use a similar construction. This will be demonstrated in Example 2 below.

EXAMPLE 1

Find a, b, and c to obtain the best-fitting equation of the form y = a + bt + ct² for a given table of data values (t₁, y₁), ..., (tₙ, yₙ).

Solution: We let

Xᵀ = [ 1    1    ···  1   ]
     [ t₁   t₂   ···  tₙ  ]
     [ t₁²  t₂²  ···  tₙ² ]

and let y be the vector of measured values. Using a computer, we can find the solution of the system a = (XᵀX)⁻¹Xᵀy, which gives the coefficients of the best-fitting quadratic curve y = a + bt + ct². The results are shown in Figure 7.3.4.
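
In practice one would rarely form (XᵀX)⁻¹ explicitly; a library least-squares routine does the same job more stably. The sketch below uses NumPy's polyfit on made-up data (not the data from Example 1); polyfit returns the coefficients with the highest power of t first.

import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.3, 2.5, 5.1, 9.8, 16.4, 25.0])

c, b, a = np.polyfit(t, y, 2)    # degree-2 fit; coefficients returned highest power first
print(f"best-fitting quadratic: y = {a:.4f} + {b:.4f} t + {c:.4f} t^2")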


EXAMPLE 2

Find a and b to obtain the best-fitting equation of the form y = at² + bt for the following data:

t   -1    0    1
y    4    1    ⋯

Solution: Using the method above, we observe that we want the error vector e = y − at² − bt to be orthogonal to t² and t. In particular, e must be orthogonal to t², and to t, which gives the two equations

t² · (y − at² − bt) = 0
t · (y − at² − bt) = 0

In this case, we want to pick X to be of the form X = [t² t]. Taking y to be the vector of data values and solving the resulting normal equations then gives the values of a and b, and y = at² + bt with these values is the equation of best fit for the given data.
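
A sketch of the same calculation for a model with no constant term, with made-up values rather than the data from Example 2; the design matrix contains only the columns t² and t.

import numpy as np

t = np.array([-1.0, 0.0, 1.0, 2.0])
y = np.array([4.0, 1.0, 0.5, 3.0])

X = np.column_stack([t**2, t])                 # model y = a*t^2 + b*t, no column of ones
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares solution of X a = y
a, b = coef
print(f"y = {a:.4f} t^2 + {b:.4f} t")
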
Suppose that Ax = b is a system of p equations in q variables, where p is greater than q. Since there are more equations than variables, we expect the system to be inconsistent unless b has some special properties.

Note that the problem in Example 1 of finding the best-fitting quadratic curve was of this form: we needed to solve Xa = y for the three variables a, b, and c, where there were n equations, one for each data point. Since n is greater than 3, this is an overdetermined system. However, Ax = x₁a₁ + ··· + x_q a_q, which is a vector in the columnspace of A. By the Approximation Theorem, the vector in the columnspace of A that is closest to b is proj_Col(A) b. Thus, to find a vector x that minimizes the "error" ‖Ax − b‖, we want to solve the consistent system Ax = proj_Col(A) b. The solution x must satisfy

AᵀAx = Aᵀb.
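
As an illustration of this statement for a generic overdetermined system (values made up), the least-squares solution returned by lstsq satisfies AᵀAx = Aᵀb up to rounding.

import numpy as np

A = np.array([[1.0, -1.0],
              [3.0,  2.0],
              [1.0, -6.0],
              [2.0,  1.0]])    # 4 equations, 2 variables
b = np.array([2.0, 5.0, 10.0, 3.0])

x, *_ = np.linalg.lstsq(A, b, rcond=None)   # minimizes ||Ax - b||
print("x       =", x)
print("A^T A x =", A.T @ A @ x)
print("A^T b   =", A.T @ b)                 # agrees with A^T A x up to rounding
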


PROBLEMS 7
...
Make a graph showing
the data and the best-fitting line
...

2

(a) y = at + bt for the data

t

-1

y

4

0

1
(b)

llAx - bll
...
Make a graph showing the data and

(a) y = at + bt for the data

the best-fitting line
...
Make
a graph showing the data and the best-fitting curve
...

Xj - Xz

(a)

2

B3 Find the best-fitting equation of the given form for

(b)

=

4

3x1 + 2x2

5

X1 - 6x2

10

X1 + Xz

=

7

X1 - Xz

=

4
14

Xj + 3x2

each set of data
...


⋯ for the following data: 0, 1.9, 3.1, 5.0, 3.1, 5.9, 11.⋯

⋯ where t = [t₁, ..., tₙ]ᵀ and a = [a, b, c]ᵀ ⋯

D2 Let X = [1 t t² ··· tᵐ], where t = [t₁, ..., tₙ]ᵀ and tᵏ is the vector whose i-th entry is tᵢᵏ for 1 ≤ i ≤ n. Then show that

XᵀX = [ n      Σtᵢ     Σtᵢ²    ··· ]
      [ Σtᵢ    Σtᵢ²    Σtᵢ³    ··· ]
      [ Σtᵢ²   Σtᵢ³    Σtᵢ⁴    ··· ]
      [  ⋮      ⋮       ⋮       ⋱  ]

where each sum runs from i = 1 to n.

(a) Prove that the columns of X are linearly independent by showing that the only solution to c₀1 + c₁t + ··· + cₘtᵐ = 0 is c₀ = ··· = cₘ = 0.
(b) Use the result from part (a) to prove that XᵀX is invertible.
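
For a numerical check of the XᵀX pattern in D2 (quadratic case, made-up t values), each entry is a sum of powers of the tᵢ:

import numpy as np

t = np.array([0.5, 1.0, 2.0, 3.5])
X = np.vander(t, 3, increasing=True)   # columns 1, t, t^2
n = len(t)

expected = np.array([[n,             t.sum(),       (t**2).sum()],
                     [t.sum(),       (t**2).sum(),  (t**3).sum()],
                     [(t**2).sum(),  (t**3).sum(),  (t**4).sum()]])
print(np.allclose(X.T @ X, expected))   # True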

