Lecture 16 - Approximation Methods CVEN 302 July 15, 2002.
Lecture 16 - Approximation Methods
CVEN 302
July 15, 2002
Lecture's Goals
• Discrete Least Square Approximation
  – Linear
  – Quadratic
  – Higher Order Polynomials
  – Nonlinear
• Continuous Least Square
  – Orthogonal Polynomials
  – Gram-Schmidt - Legendre Polynomial
Approximation Methods
• Interpolation matches the data points exactly. For experimental data this requirement is often inappropriate, because the measurements contain noise.
• Approximation - we want to find the curve that fits the data with the smallest "error".
What is the difference between approximation and interpolation?
Least Square Fit Approximations
[Figure: scatter plot of the data, X values 0-100, Y values 700-1100]

Suppose we want to fit the data set:

  x      y
  20.5   765
  32.7   826
  51     873
  73.2   942
  95.7   1032

We would like to find the best straight line to fit the data.
Least Square Fit Approximations
The problem is how to minimize the error. We could use the error defined as:

  S = Σ e_k,  where  e_k = Y_k − y_k

However, positive and negative errors can cancel one another, so the sum can be small even when the fit is wrong.
Least Square Fit Approximations
We could instead minimize the maximum error (minimax), defined as:

  e = max | Y_k − y_k |

This error measure leads to a difficult minimization problem.
Least Square Fit Approximations
The solution is the minimization of the sum of squares. This will give a least square solution:

  S = Σ e_k²

This is known as the maximum likelihood principle.
Least Square Approximations
Assume:

  Data point:        Y_i
  Approximate point: y_i = a x_i + b

The error is defined as:

  e_i = Y_i − y_i
Least Square Error
The sum of the errors:

  S = Σ_{i=1}^{N} e_i² = e_1² + e_2² + e_3² + …

Substituting for the error:

  S = Σ_{i=1}^{N} (Y_i − y_i)² = Σ_{i=1}^{N} (Y_i − a x_i − b)²
Least Square Error
How do you minimize the error? Take the derivative with respect to the coefficients and set it equal to zero:

  dS/da = 0
  dS/db = 0
Least Square Error
The first component, a:

  0 = dS/da = −2 Σ_{i=1}^{N} (Y_i − a x_i − b) x_i

which gives

  a Σ_{i=1}^{N} x_i² + b Σ_{i=1}^{N} x_i = Σ_{i=1}^{N} x_i Y_i     (1)
Least Square Error
The second component, b:

  0 = dS/db = −2 Σ_{i=1}^{N} (Y_i − a x_i − b)

which gives

  a Σ_{i=1}^{N} x_i + b N = Σ_{i=1}^{N} Y_i     (2)
Least Square Coefficients
The equations can be rewritten:

  a Σ_{i=1}^{N} x_i² + b Σ_{i=1}^{N} x_i = Σ_{i=1}^{N} x_i Y_i     (1)

  a Σ_{i=1}^{N} x_i + b N = Σ_{i=1}^{N} Y_i     (2)
Least Square Coefficients
The equations can be rewritten in matrix form:

  | Σ x_i²  Σ x_i | | a |   | Σ x_i Y_i |
  | Σ x_i   N     | | b | = | Σ Y_i     |

Let  S_x = Σ_{i=1}^{N} x_i,  S_xx = Σ_{i=1}^{N} x_i²,  S_y = Σ_{i=1}^{N} Y_i,  and  S_xy = Σ_{i=1}^{N} x_i Y_i.
Least Square Coefficients
The coefficients are defined as:

  a = ( N S_xy − S_x S_y ) / ( N S_xx − S_x² )

  b = ( S_xx S_y − S_x S_xy ) / ( N S_xx − S_x² )
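The coefficient formulas above can be sketched directly in code. This is a minimal illustration, not course material; the function name is made up:

```python
def linear_least_squares(x, y):
    """Fit y = a*x + b by the closed-form least-squares formulas."""
    N = len(x)
    Sx = sum(x)                                    # S_x  = sum of x_i
    Sxx = sum(xi * xi for xi in x)                 # S_xx = sum of x_i^2
    Sy = sum(y)                                    # S_y  = sum of Y_i
    Sxy = sum(xi * yi for xi, yi in zip(x, y))     # S_xy = sum of x_i*Y_i
    denom = N * Sxx - Sx * Sx                      # N*S_xx - S_x^2
    a = (N * Sxy - Sx * Sy) / denom                # slope
    b = (Sxx * Sy - Sx * Sxy) / denom              # intercept
    return a, b
```

Running it on the lecture's data set, `linear_least_squares([20.5, 32.7, 51, 73.2, 95.7], [765, 826, 873, 942, 1032])`, reproduces the example worked later in the lecture.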
Least Square Example
Given the data:

  x      y
  20.5   765
  32.7   826
  51     873
  73.2   942
  95.7   1032

Tabulating the values:

  x       x²         y      xy         N
  20.5    420.25     765    15682.5    1
  32.7    1069.29    826    27010.2    1
  51      2601       873    44523      1
  73.2    5358.24    942    68954.4    1
  95.7    9158.49    1032   98762.4    1
  -------------------------------------
  273.1   18607.27   4438   254932.5   5
Least Square Example
[Figure: scatter plot of the data with the fitted line, X values 0-100, Y values 700-1100]

  a = ( 5 × 254932.5 − 273.1 × 4438 ) / ( 5 × 18607.27 − 273.1² ) = 3.395

  b = ( 18607.27 × 4438 − 273.1 × 254932.5 ) / ( 5 × 18607.27 − 273.1² ) = 702.2

The equation is:

  y = 3.395 x + 702.2
Least Square Error
How do you minimize the error for a quadratic fit,

  y = a x² + b x + c ?

Take the derivative with respect to the coefficients and set it equal to zero:

  dS/da = 0
  dS/db = 0
  dS/dc = 0
Least Square Coefficients for Quadratic Fit
The equations can be written as:

  | Σ x_i⁴  Σ x_i³  Σ x_i² | | a |   | Σ x_i² Y_i |
  | Σ x_i³  Σ x_i²  Σ x_i  | | b | = | Σ x_i Y_i  |
  | Σ x_i²  Σ x_i   N      | | c |   | Σ Y_i      |
Least Square of Quadratic Fit
The matrix can be solved using Gaussian elimination, and the coefficients can be found.
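The recipe (build the 3×3 normal-equation matrix, then solve it by Gaussian elimination) can be sketched as follows; `gauss_solve` and `quadratic_fit` are illustrative names, not from the course:

```python
def gauss_solve(A, rhs):
    """Solve A x = rhs by Gaussian elimination with partial pivoting."""
    n = len(rhs)
    M = [row[:] + [r] for row, r in zip(A, rhs)]   # augmented matrix
    for col in range(n):
        # swap in the row with the largest pivot for numerical safety
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):                # eliminate below pivot
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                 # back substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def quadratic_fit(xs, ys):
    """Fit y = a*x^2 + b*x + c via the normal equations on the slide."""
    N = len(xs)
    s = lambda p: sum(x ** p for x in xs)                       # sum of x^p
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))    # sum of x^p * y
    A = [[s(4), s(3), s(2)],
         [s(3), s(2), s(1)],
         [s(2), s(1), N]]
    return gauss_solve(A, [sy(2), sy(1), sy(0)])   # [a, b, c]
```

Applied to the Example 2 data set below, this should reproduce the slide's coefficients a = 0.225, b = −1.018, c = 0.998 up to rounding.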
Quadratic Least Square Example
Given a set of data:

  x      y
  0.05   0.956
  0.11   0.89
  0.15   0.832
  0.31   0.717
  0.46   0.571
  0.52   0.539
  0.7    0.378
  0.74   0.37
  0.82   0.306
  0.98   0.242
  1.17   0.104

The linear results:

[Figure: "Example 2 First Order" - scatter plot of the data with the linear fit, X values 0-1.2, Y values 0-1.2]
Quadratic Least Square Example

  x      y      x²      x³        x⁴          xy       x²y       N
  0.05   0.956  0.0025  0.000125  0.00000625  0.0478   0.00239   1
  0.11   0.89   0.0121  0.001331  0.00014641  0.0979   0.010769  1
  0.15   0.832  0.0225  0.003375  0.00050625  0.1248   0.01872   1
  0.31   0.717  0.0961  0.029791  0.00923521  0.22227  0.068904  1
  0.46   0.571  0.2116  0.097336  0.04477456  0.26266  0.120824  1
  0.52   0.539  0.2704  0.140608  0.07311616  0.28028  0.145746  1
  0.7    0.378  0.49    0.343     0.2401      0.2646   0.18522   1
  0.74   0.37   0.5476  0.405224  0.29986576  0.2738   0.202612  1
  0.82   0.306  0.6724  0.551368  0.45212176  0.25092  0.205754  1
  0.98   0.242  0.9604  0.941192  0.92236816  0.23716  0.232417  1
  1.17   0.104  1.3689  1.601613  1.87388721  0.12168  0.142366  1
  ----------------------------------------------------------------
  6.01   5.905  4.6545  4.114963  3.91612773  2.18387  1.335721  11
Quadratic Least Square Example
The results are:

  a = 0.225,  b = −1.018,  c = 0.998

  y = 0.225 x² − 1.018 x + 0.998
[Figure: "Example 2 Second Order" - scatter plot of the data with the quadratic fit, X values 0-1.2, Y values 0-1.2]
Polynomial Least Square
The technique can be applied to all polynomials of the form:

  y = a_0 + a_1 x + a_2 x² + … + a_n x^n

  | N        Σ x_i    …  Σ x_i^n  | | a_0 |   | Σ Y_i       |
  | Σ x_i    Σ x_i²   …  …        | | a_1 |   | Σ x_i Y_i   |
  | …        …        …  …        | | …   | = | …           |
  | Σ x_i^n  …        …  Σ x_i^2n | | a_n |   | Σ x_i^n Y_i |
Polynomial Least Square
Solving large sets of linear equations is not a simple task. These systems can have the undesirable property known as ill-conditioning: round-off errors in solving for the coefficients cause unusually large errors in the curve fits.
Polynomial Least Square
How do you measure the error of higher order polynomials?

  S = Σ_{k=1}^{N} e_k²
Polynomial Least Square
Or use a measure of the variance of the problem:

  σ² = ( 1 / (N − n − 1) ) Σ_{k=1}^{N} ( Y_k − y_k )²

where n is the degree of the polynomial, N is the number of data points, Y_k are the data points, and

  y_k = Σ_{j=0}^{n} a_j x_k^j
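The variance measure above can be sketched in a few lines. This is an illustrative helper, not from the lecture; `fit_variance` is an assumed name:

```python
def fit_variance(xs, Ys, coeffs):
    """Variance of a polynomial fit: sum of squared residuals / (N - n - 1).

    coeffs = [a0, a1, ..., an], so y_k = sum_j a_j * x_k**j (degree n).
    """
    n = len(coeffs) - 1           # degree of the polynomial
    N = len(xs)                   # number of data points
    resid_sq = sum(
        (Y - sum(a * x ** j for j, a in enumerate(coeffs))) ** 2
        for x, Y in zip(xs, Ys)
    )
    return resid_sq / (N - n - 1)
```

Evaluating this for fits of increasing degree is one way to pick the best polynomial order, as the lecture does for Example 2 below.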
Polynomial Least Square Example
Example 2 can be fitted with a cubic equation,

  y = a_0 + a_1 x + a_2 x² + a_3 x³

and the coefficients are:

  a_0 = 1.004,  a_1 = −1.079,  a_2 = 0.351,  a_3 = −0.069

[Figure: "Example 2 Third Order" - scatter plot of the data with the cubic fit, X values 0-1.2, Y values 0-1.2]
Polynomial Least Square Example
However, if we look at higher order polynomials, such as the sixth and seventh order, the results are not all that promising.

[Figure: "Example 2 Sixth and Seventh Order" - scatter plot of the data with the sixth- and seventh-order fits, X values 0-1.2, Y values 0-1.2]
Polynomial Least Square Example
The standard deviation of the polynomial fit,

  σ = sqrt( ( 1 / (N − n − 1) ) Σ_{k=1}^{N} ( Y_k − y_k )² )

shows that the best fit for the data is the second order polynomial.

[Figure: "Sum of the Standard Deviation" - standard deviation (0-0.035) versus degree of the polynomial fit (0-8)]
Summary
• The linear least squares method is straightforward for determining the coefficients of the line y = a x + b:

  a = ( N S_xy − S_x S_y ) / ( N S_xx − S_x² )

  b = ( S_xx S_y − S_x S_xy ) / ( N S_xx − S_x² )
Summary
• The quadratic and higher order polynomial curve fits use a similar technique and involve solving an (n+1) × (n+1) matrix for the coefficients of

  y = a_0 + a_1 x + … + a_n x^n

  | N        Σ x_i    …  Σ x_i^n  | | a_0 |   | Σ Y_i       |
  | Σ x_i    Σ x_i²   …  …        | | a_1 |   | Σ x_i Y_i   |
  | …        …        …  …        | | …   | = | …           |
  | Σ x_i^n  …        …  Σ x_i^2n | | a_n |   | Σ x_i^n Y_i |
Summary
• Higher order polynomial fits require selecting the best fit for the data; one means of measuring the fit is the standard deviation of the results as a function of the degree of the polynomial:

  σ = sqrt( ( 1 / (N − n − 1) ) Σ_{k=1}^{N} ( Y_k − y_k )² )
Homework
• Check the Homework webpage