Lesson31 Higher Dimensional First Order Difference Equations Slides
Uploaded by matthew-leingang

Transcript of Lesson31 Higher Dimensional First Order Difference Equations Slides
Lesson 31: First Order, Higher Dimensional Difference Equations
Math 20
April 30, 2007

Announcements
- PS 12 due Wednesday, May 2
- MT III Friday, May 4 in SC Hall A
- Final Exam: Friday, May 25 at 9:15am, Boylston 110 (Fong Auditorium)
Outline
- Recap
- Higher dimensional linear systems
  - Examples: Markov Chains, Population Dynamics
  - Solution
  - Qualitative Analysis: Diagonal systems, Examples
- Higher dimensional nonlinear
One-dimensional linear difference equations

Fact
The solution to the inhomogeneous difference equation

  y_{k+1} = a·y_k + b

(with a ≠ 1) is

  y_k = a^k (y_0 − b/(1−a)) + b/(1−a)

Please try not to memorize this. When a and b have actual values, it's easier to follow this process:
1. Start with a^k times an undetermined parameter c (this satisfies the homogenized equation).
2. Find the equilibrium value y* = b/(1−a).
3. Add the two and pick c to match y_0 when k = 0.
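The three-step process above can be checked numerically. This is a sketch (not from the slides), with hypothetical values a = 0.5, b = 3, y_0 = 10:

```python
# Sketch: check the closed-form solution of y_{k+1} = a*y_k + b
# against direct iteration, for hypothetical values of a, b, y0.
a, b, y0 = 0.5, 3.0, 10.0   # assumed sample values with a != 1

y_star = b / (1 - a)        # step 2: equilibrium value y*
c = y0 - y_star             # step 3: pick c to match y0 at k = 0

y = y0
for k in range(1, 11):
    y = a * y + b                       # iterate the difference equation
    closed_form = a**k * c + y_star     # step 1 + step 2 combined
    assert abs(y - closed_form) < 1e-12

print(y_star)  # 6.0 -- the iterates approach this equilibrium
```

Since |a| < 1 here, the a^k term dies off and the iterates approach y*.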
Nonlinear equations
Fact
The equilibrium point y* of the nonlinear difference equation y_{k+1} = g(y_k) is stable if |g′(y*)| < 1.

[Cobweb diagram: iterates y_0, y_1, y_2 converging toward y*, with reference lines of slope 1, slope g′(y*), and slope −1]
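A sketch (not from the slides) of the stability criterion in action, using the logistic map g(y) = r·y·(1−y) with a hypothetical r = 2.5:

```python
# Sketch: test |g'(y*)| < 1 on the logistic map g(y) = r*y*(1-y)
# with an assumed parameter r = 2.5.
r = 2.5
g = lambda y: r * y * (1 - y)

y_star = 1 - 1 / r            # nonzero equilibrium: g(y*) = y*
slope = r * (1 - 2 * y_star)  # g'(y) = r*(1 - 2y), evaluated at y*

assert abs(g(y_star) - y_star) < 1e-12
assert abs(slope) < 1         # criterion predicts stability

y = 0.1                       # start away from the equilibrium
for _ in range(100):
    y = g(y)
print(abs(y - y_star) < 1e-6)  # True: the iterates converge to y*
```

Here y* = 0.6 and g′(y*) = −0.5, so the cobweb spirals inward, matching the diagram.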
Let's kick it up a notch and look at the multivariable, linear, homogeneous difference equation

  y(k+1) = A y(k)

(we move the index into parentheses to allow y(k) to have coordinates and to avoid writing y_{k,i}).
Skipping class
Example
This example was a Markov chain with transition matrix

  A = [ 0.7  0.8 ]
      [ 0.3  0.2 ]

Then the probability of going or skipping on day k satisfies the equation

  p(k+1) = A p(k)
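A sketch (not from the slides) of iterating this chain, with a hypothetical initial distribution p(0) = (1, 0):

```python
# Sketch: iterate p(k+1) = A p(k) for the go/skip Markov chain,
# starting from an assumed p(0) = (1, 0).
A = [[0.7, 0.8],
     [0.3, 0.2]]

def step(A, p):
    # one matrix-vector product: p(k+1) = A p(k)
    return [sum(A[i][j] * p[j] for j in range(len(p))) for i in range(len(A))]

p = [1.0, 0.0]
for _ in range(50):
    p = step(A, p)

print(p)  # converges to the steady state (8/11, 3/11)
```

The columns of A each sum to 1, so p stays a probability vector at every step.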
Example
Female lobsters have more eggs each season the longer they live. For this reason, it is illegal to keep a lobster that has laid eggs. Let y_i be the number of lobsters in a fishery which are i years old. Then the difference equation might have the simplified form

  y(k+1) = [ 0    100  400  700 ]
           [ 0.1  0    0    0   ]
           [ 0    0.3  0    0   ]
           [ 0    0    0.9  0   ] y(k)
Mmmm. . . Lobster
Formal solution
y(1) = A y(0)
y(2) = A y(1) = A² y(0)
y(3) = A y(2) = A³ y(0)

So

Fact
The solution to the homogeneous system of linear difference equations y(k+1) = A y(k) is

  y(k) = A^k y(0)
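A quick sketch (not from the slides) confirming that step-by-step iteration agrees with the closed form A^k y(0), reusing the go/skip matrix and an assumed initial vector:

```python
# Sketch: iterating y(k+1) = A y(k) agrees with y(k) = A^k y(0).
import numpy as np

A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
y0 = np.array([1.0, 0.0])   # assumed initial vector

k = 7
y = y0
for _ in range(k):
    y = A @ y               # step-by-step iteration

closed = np.linalg.matrix_power(A, k) @ y0   # A^k y(0)
assert np.allclose(y, closed)
```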
Flop count

- To multiply two n×n matrices takes n²(2n−1) additions and multiplications (flop = floating point operation).
- So finding A^k by repeated multiplication takes about 2n³k flops!
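The per-multiplication count can be verified by instrumenting a naive matrix multiply. This is a sketch (not from the slides), with randomly generated test matrices:

```python
# Sketch: multiply two n x n matrices naively while counting flops,
# and compare the count with n^2(2n-1).
import random

def matmul_count(A, B):
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    ops = 0
    for i in range(n):
        for j in range(n):
            s = A[i][0] * B[0][j]          # 1 multiplication
            ops += 1
            for t in range(1, n):
                s += A[i][t] * B[t][j]     # 1 addition + 1 multiplication
                ops += 2
            C[i][j] = s
    return C, ops

n = 5
A = [[random.random() for _ in range(n)] for _ in range(n)]
B = [[random.random() for _ in range(n)] for _ in range(n)]
_, ops = matmul_count(A, B)
print(ops, n * n * (2 * n - 1))  # 225 225
```

Each of the n² output entries costs n multiplications and n−1 additions, hence n²(2n−1) per product and roughly 2n³k for A^k.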
Now what?

Suppose v is an eigenvector of A with eigenvalue λ. Then the solution to the problem

  y(k+1) = A y(k),  y(0) = v

is

  y(k) = λ^k v

Suppose

  y(0) = c_1 v_1 + c_2 v_2 + ··· + c_m v_m

Then

  A y(0) = c_1 λ_1 v_1 + c_2 λ_2 v_2 + ··· + c_m λ_m v_m
  A² y(0) = c_1 λ_1² v_1 + c_2 λ_2² v_2 + ··· + c_m λ_m² v_m

If A is diagonalizable, we can take m = n and write any initial vector as a linear combination of eigenvectors.
The big picture

Fact
Let A have a complete system of eigenvalues and eigenvectors λ_1, λ_2, ..., λ_n and v_1, v_2, ..., v_n. Then the solution to the difference equation y(k+1) = A y(k) is

  y(k) = A^k y(0) = c_1 λ_1^k v_1 + c_2 λ_2^k v_2 + ··· + c_n λ_n^k v_n

where c_1, c_2, ..., c_n are chosen to make

  y(0) = c_1 v_1 + c_2 v_2 + ··· + c_n v_n
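This recipe can be carried out numerically. A sketch (not from the slides), using the go/skip matrix and an assumed initial vector:

```python
# Sketch: solve y(k+1) = A y(k) through the eigendecomposition and
# check the result against direct matrix powers.
import numpy as np

A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
y0 = np.array([0.4, 0.6])            # assumed initial vector

lam, V = np.linalg.eig(A)            # columns of V are the eigenvectors v_i
c = np.linalg.solve(V, y0)           # coefficients so y(0) = sum_i c_i v_i

k = 6
via_eigen = V @ (c * lam**k)         # sum_i c_i * lam_i^k * v_i
via_power = np.linalg.matrix_power(A, k) @ y0
assert np.allclose(via_eigen, via_power)
```

Solving V c = y(0) is exactly the "choose c_1, ..., c_n" step of the Fact.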
Iterating diagonal systems

Consider a 2×2 matrix of the form

  D = [ λ_1  0   ]
      [ 0    λ_2 ]

Then the λ's determine the behavior of the system.
Picture in terms of eigenvalues

- λ_1 > λ_2 > 1: repulsion away from the origin
- 1 > λ_1 > λ_2 > 0: attraction to the origin
- λ_1 > 1 > λ_2: saddle point

For negative eigenvalues, just square them and use the above results.
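The saddle case is easy to see by direct iteration. A sketch (not from the slides), with hypothetical eigenvalues λ_1 = 2 and λ_2 = 0.5:

```python
# Sketch: iterate a diagonal system with an assumed saddle,
# lam1 = 2 > 1 > lam2 = 0.5.
lam1, lam2 = 2.0, 0.5
x, y = 1.0, 1.0        # initial point

for _ in range(10):
    x, y = lam1 * x, lam2 * y   # y(k+1) = D y(k), coordinate-wise

print(x, y)  # 1024.0 0.0009765625
```

The x-coordinate is repelled (grows like 2^k) while the y-coordinate is attracted (shrinks like 0.5^k), which is exactly the saddle picture.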
Back to skipping class
Example
If

  A = [ 0.7  0.8 ]
      [ 0.3  0.2 ]

the eigenvectors (in decreasing order of their eigenvalues' absolute values) are [8/11, 3/11]ᵀ with eigenvalue 1 and [−1/2, 1/2]ᵀ with eigenvalue −1/10. So the system converges to a multiple of [8/11, 3/11]ᵀ.
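The two eigenpairs above can be verified directly. A sketch (not from the slides):

```python
# Sketch: verify the eigenpairs claimed for the go/skip matrix.
import numpy as np

A = np.array([[0.7, 0.8],
              [0.3, 0.2]])

v1 = np.array([8/11, 3/11])
v2 = np.array([-0.5, 0.5])

assert np.allclose(A @ v1, 1.0 * v1)    # eigenvalue 1
assert np.allclose(A @ v2, -0.1 * v2)   # eigenvalue -1/10
```

Since |−1/10| < 1, the v2 component of any initial vector dies off, leaving the v1 direction.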
Back to the lobsters
We had

  A = [ 0    100  400  700 ]
      [ 0.1  0    0    0   ]
      [ 0    0.3  0    0   ]
      [ 0    0    0.9  0   ]

The eigenvalues are 3.80293, −2.84895, −0.476993 + 1.23164i, and −0.476993 − 1.23164i, and the first eigenvector is

  [ 0.999716  0.0233099  0.00489153 ]ᵀ

The population will grow despite the increased harvesting!
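The dominant eigenvalue, which controls the long-run growth rate, can be checked numerically. A sketch (not from the slides):

```python
# Sketch: compute the eigenvalues of the Leslie matrix and confirm the
# dominant one exceeds 1, so the population grows.
import numpy as np

A = np.array([[0.0, 100.0, 400.0, 700.0],
              [0.1,   0.0,   0.0,   0.0],
              [0.0,   0.3,   0.0,   0.0],
              [0.0,   0.0,   0.9,   0.0]])

lam = np.linalg.eigvals(A)
dominant = max(abs(lam))            # largest eigenvalue modulus
assert abs(dominant - 3.80293) < 1e-4
print(dominant > 1)  # True: the population grows
```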
The nonlinear case
Consider now the nonlinear system

  y(k+1) = g(y(k)).

The process is as it was in the one-dimensional nonlinear case:
1. Look for equilibria y* with g(y*) = y*.
2. Linearize about the equilibrium using the matrix

     A = Dg(y*) = ( ∂g_i / ∂y_j )

3. The eigenvalues of A determine the stability of y*: the equilibrium is stable if every eigenvalue satisfies |λ| < 1.