Numerical Methods and Optimization: introduction
Mauro Passacantando
Department of Computer Science, University of Pisa, [email protected]
Numerical Methods and Optimization
Master in Computer Science – University of Pisa
M. Passacantando Numerical Methods and Optimization 1 / 12 –
Course Organization
I 1 course (12 cfu/ects)
I 1 program
I 1 exam
I 2 lecturers
Federico Poloni (Numerical methods)
Dipartimento di Informatica
Largo B. Pontecorvo 3, Pisa
Building C, 2nd floor, room 343
Tel. 050 2213143, e-mail: [email protected]
Office hours: still to decide
Mauro Passacantando (Optimization)
Dipartimento di Informatica
Largo B. Pontecorvo 3, Pisa
Building C, 2nd floor, room 323
Tel. 050 2212756, e-mail: [email protected]
Office hours: Wednesday 9:00-11:00
Information
Course Schedule
I Tuesday 9:00-11:00 room L1
I Tuesday 16:00-18:00 room H-lab
I Wednesday 11:00-13:00 room C (except September 21: room C1)
I Thursday 11:00-13:00 room C
Web page
https://elearning.di.unipi.it/course/view.php?id=77
Exam
Written and oral exam
Bibliography
I S. Boyd, L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004 (available on http://web.stanford.edu/~boyd/cvxbook/)
I M.S. Bazaraa, H.D. Sherali, C.M. Shetty, Nonlinear Programming: Theory and Algorithms, Wiley & Sons, 2006 (Chapters 1-6, 8-9)
I J. Nocedal, S. Wright, Numerical Optimization, Springer Series in Operations Research and Financial Engineering, 2006 (Chapters 1-3, 5, 12, 16, 17)
I A.R. Conn, K. Scheinberg, L.N. Vicente, Introduction to Derivative-Free Optimization, SIAM series on Optimization, 2009 (Chapters 1, 7)
I J. Demmel, Applied Numerical Linear Algebra, SIAM press, 1996
I L. N. Trefethen, D. Bau, Numerical Linear Algebra, SIAM press, 1997
Syllabus
I Linear algebra and calculus background
I Unconstrained optimization and systems of equations
I Direct and iterative methods for linear systems
I Iterative methods for nonlinear systems
I Numerical methods for unconstrained optimization
I The linear least-squares problem
I Iterative methods for computing eigenvalues
I Constrained optimization and systems of equations
I Lagrangian duality
I Numerical methods for constrained optimization
I The fast Fourier transform
I Applications: regression, parameter estimation, approximation and data fitting, support vector machines, image and signal reconstruction
I Software tools for numerical and optimization problems (Matlab).
Support Vector Machines for Data Classification
Given training data in different classes (labels known)
Predict test data (labels unknown)
Examples:
I handwritten digits recognition
I spam filtering
I credit card fraud detection
I marketing
I medical diagnosis
I image processing
I . . .
Support Vector Machines for Data Classification
Given two sets A, B ⊂ R^n (training set).
Assume that A and B are linearly separable, i.e., there is a hyperplane H = {x ∈ R^n : w^T x + b = 0} such that
w^T x_i + b > 0 ∀ x_i ∈ A
w^T x_j + b < 0 ∀ x_j ∈ B
[Figure: two classes of training points in the plane, separated by the line w^T x + b = 0]
Test data x: decision function f(x) = sign(w^T x + b)
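A small sketch of the decision function above (the course's software tool is MATLAB, but the same idea is shown here in Python/NumPy; the points and the hyperplane (w, b) are made up for illustration):

```python
import numpy as np

# Two hypothetical, linearly separable point sets in R^2.
A = np.array([[2.0, 7.0], [3.0, 8.0], [1.5, 6.0]])   # class +1
B = np.array([[6.0, 2.0], [7.0, 3.0], [8.0, 1.0]])   # class -1

# A hand-picked separating hyperplane w^T x + b = 0.
w = np.array([-1.0, 1.0])
b = 0.0

def decision(x, w, b):
    """f(x) = sign(w^T x + b): +1 predicts class A, -1 predicts class B."""
    return np.sign(w @ x + b)

# Every training point is classified consistently with its label.
print([decision(x, w, b) for x in A])  # all +1.0
print([decision(x, w, b) for x in B])  # all -1.0
```

A test point is then labeled simply by the sign of w^T x + b, with no further reference to the training data.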
Support Vector Machines for Data Classification
Many possible choices of w and b. Which hyperplane do we choose?
[Figures: two different separating lines for the same training data]
Support Vector Machines for Data Classification
[Figures: two separating lines with different margins of separation]
We look for the separating hyperplane with the maximum margin of separation. This problem can be modeled as
min ‖w‖²
s.t. w^T x_i + b ≥ 1 ∀ x_i ∈ A
     w^T x_j + b ≤ −1 ∀ x_j ∈ B
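A sketch of solving this model numerically (the data are made up; the course uses MATLAB, but the same model can be handed to any general-purpose constrained solver, here SciPy's SLSQP):

```python
import numpy as np
from scipy.optimize import minimize

# Two hypothetical, symmetric point sets, separable by the line y = x.
A = np.array([[1.0, 3.0], [2.0, 4.0]])
B = np.array([[3.0, 1.0], [4.0, 2.0]])

def objective(z):
    """Minimize ||w||^2, with variables z = (w_1, w_2, b)."""
    w = z[:2]
    return w @ w

# Margin constraints, written as g(z) >= 0 for SLSQP:
#   w^T x_i + b - 1 >= 0  for x_i in A
# -(w^T x_j + b) - 1 >= 0  for x_j in B
constraints = (
    [{'type': 'ineq', 'fun': lambda z, x=x:  z[:2] @ x + z[2] - 1} for x in A] +
    [{'type': 'ineq', 'fun': lambda z, x=x: -(z[:2] @ x + z[2]) - 1} for x in B]
)

res = minimize(objective, x0=np.zeros(3), constraints=constraints, method='SLSQP')
w, b = res.x[:2], res.x[2]
print(w, b, res.fun)  # optimal hyperplane and squared norm of w
```

For this symmetric toy data the optimal hyperplane is the line y = x, with both margin constraints active; later lectures treat structure-exploiting methods for such quadratic programs rather than a generic solver.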
Low-rank approximation with the Singular Value Decomposition
Problem
Given a matrix M ∈ R^{n×m}, approximate it with the product of a ‘tall thin’ A ∈ R^{n×k} and a ‘fat large’ B ∈ R^{k×m} (k ≪ n, m):
[Figure: M ≈ A · B]
min_{A,B} ‖M − AB‖
Why?
I Reduce data dimension
I Reveal features
Applications:
I Text and data mining
I Neural networks
I Web search
I Graph / community analysis
I . . .
An example
An example easy to visualize: image compression by approximation.
Black/white image: matrix with color intensities ∈ [0, 1].
[Figure: original 512 × 512 image and its rank-k approximations for k = 1, 10, 25, 50, 100]
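The rank-k approximation can be sketched via the singular value decomposition (a synthetic matrix stands in for the image; the course uses MATLAB, but NumPy's SVD is equivalent):

```python
import numpy as np

# Synthetic matrix with an approximately low-rank structure
# (a rank-1 signal plus noise), standing in for an image.
rng = np.random.default_rng(0)
M = 10 * np.outer(rng.standard_normal(60), rng.standard_normal(40)) \
    + rng.standard_normal((60, 40))

U, s, Vt = np.linalg.svd(M, full_matrices=False)

def rank_k(k):
    """Best rank-k approximation A @ B with A = U_k diag(s_k), B = V_k^T."""
    A = U[:, :k] * s[:k]      # n x k ('tall thin')
    B = Vt[:k, :]             # k x m ('fat large')
    return A @ B

# The Frobenius-norm error decreases as k grows (Eckart-Young theorem).
errs = [np.linalg.norm(M - rank_k(k)) for k in (1, 5, 10)]
print(errs)
```

Storing A and B costs k(n + m) numbers instead of nm, which is where the compression comes from.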
Eigenvalue problems
Problem
Given a matrix M ∈ R^{n×m}, approximate it with the product of a ‘tall thin’ A ∈ R^{n×k} and a ‘fat large’ B ∈ R^{k×m} (k ≪ n, m): min_{A,B} ‖M − AB‖.
Optimal A, B can be obtained from eigenvectors of M^T M and M M^T.
I How can we solve this problem?
I What if M is very large and sparse?
I What if M has some special structure?
I Can we avoid forming MTM and MMT ?
I Is the solution numerically stable, or should we expect large errors?
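The eigenvector connection above can be checked numerically on a small made-up matrix (again in Python/NumPy rather than the course's MATLAB): the eigenvalues of M^T M are the squared singular values of M.

```python
import numpy as np

# Hypothetical small dense matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((8, 5))

U, s, Vt = np.linalg.svd(M, full_matrices=False)   # s in descending order
eigvals, eigvecs = np.linalg.eigh(M.T @ M)         # eigenvalues in ascending order

# Eigenvalues of M^T M match the squared singular values of M.
print(np.allclose(np.sort(s ** 2), eigvals))  # True
```

Note that forming M^T M explicitly squares the condition number, which is exactly why the question of avoiding it arises.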