Regularization
Chapter 3: Ill-Posed Problems in Image and Signal Processing, WS 2014/2015
Michael Moeller, Optimization and Data Analysis, Department of Mathematics, TU München
updated 18.11.2014

Outline:
• Observations from previous chapter
• Regularization: general idea, definition and properties, error-free parameter choice, behavior on Y\D(A†), construction based on SVD, examples
• Convergence: without source conditions, with source conditions
• Generalizations

A little summary

What did we learn so far?
• No solution exists → least squares solution.
• Solution not unique → minimal norm solution.
• Linear operator for the above → Moore-Penrose inverse.
• Third criterion for well-posedness? → A† continuous.
• A† continuous ⇔ R(A) closed.
• A compact, R(A) infinite dimensional ⇒ A† not continuous.
• Integral equation with Hilbert-Schmidt kernel ⇒ A compact.


A little summary

What did we learn so far?
• Compact, self-adjoint linear operators have an eigendecomposition.
• If A is compact, A∗A and AA∗ are compact and self-adjoint.
• We can develop the singular value decomposition
$$Ax = \sum_{n\in I} \sigma_n \langle x, u_n\rangle\, v_n$$
for a countable index set I.
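The singular value decomposition can be computed explicitly for a discretized integral operator. The following minimal Python sketch (not part of the slides) builds such an operator with a Gaussian, hence Hilbert-Schmidt, kernel and prints its singular values; their rapid decay is the source of the ill-posedness discussed below. The grid size m and the kernel width 0.05 are arbitrary illustrative choices.

```python
import numpy as np

# Discretize the integral operator (Af)(s) = \int_0^1 k(s,t) f(t) dt with a
# Gaussian kernel (a Hilbert-Schmidt kernel) by the midpoint rule on m points.
m = 200
t = (np.arange(m) + 0.5) / m                                     # quadrature nodes in (0,1)
K = np.exp(-(t[:, None] - t[None, :])**2 / (2 * 0.05**2)) / m    # kernel values times weight 1/m

# The singular values of the discretized operator decay extremely fast,
# reflecting the compactness of A.
sigma = np.linalg.svd(K, compute_uv=False)
for n in [0, 4, 9, 19, 39]:
    print(f"sigma_{n + 1} = {sigma[n]:.3e}")
```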


A little summary

What did we learn so far?
• The Moore-Penrose inverse can be expressed as
$$A^\dagger y = \sum_{n\in I} \frac{1}{\sigma_n}\langle y, v_n\rangle\, u_n.$$
• The speed of σn → 0 classifies the ill-posedness:
  → mildly (at most $O(1/n)$),
  → moderately (at most polynomial, $O(n^{-\gamma})$),
  → severely (faster than polynomial).
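Continuing the sketch above (again not from the slides), one can apply the partial sums of the pseudo-inverse expansion to noisy data: each noise coefficient is amplified by 1/σn, so the reconstruction error eventually blows up as more terms are kept.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 200
t = (np.arange(m) + 0.5) / m
K = np.exp(-(t[:, None] - t[None, :])**2 / (2 * 0.05**2)) / m

# numpy convention K = U @ diag(sigma) @ Vt: the columns of U play the role of
# the v_n above and the rows of Vt play the role of the u_n.
U, sigma, Vt = np.linalg.svd(K)
x_true = np.sin(2 * np.pi * t)                          # smooth "true" signal
y_delta = K @ x_true + 1e-3 * rng.standard_normal(m)    # noisy data

# Partial sums of A^dagger y_delta = sum_n (1/sigma_n) <y_delta, v_n> u_n
coeffs = (U.T @ y_delta) / sigma
for N in [5, 10, 20, 40, 80]:
    x_N = Vt[:N].T @ coeffs[:N]
    print(f"N = {N:3d}: error = {np.linalg.norm(x_N - x_true):.3e}")
```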


Fighting the ill-posedness

What can we do?


Making a plan...

In finite dimensions:
• Symmetric positive definite matrix A.
• Approximate A⁻¹ = A† by Rα.
• Rα inverts a modified version of A.
$$\|A^{-1}y - R_\alpha y^\delta\| \;\le\; \underbrace{\|A^{-1}y - R_\alpha y\|}_{\text{approximation error}} \;+\; \underbrace{\|R_\alpha y - R_\alpha y^\delta\|}_{\text{data error}}$$
• The regularization parameter α is chosen depending on the noise.
• The smaller the noise, the smaller the α.
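A rough numerical illustration of this error split (a sketch under the stated assumptions, not from the slides): for a symmetric positive definite but ill-conditioned matrix A, take the Tikhonov-type operator Rα = (A² + αI)⁻¹A, which indeed inverts a modified version of A, and observe how the approximation error grows while the data error shrinks as α increases.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
i = np.arange(n)
A = 1.0 / (i[:, None] + i[None, :] + 1.0)    # Hilbert matrix: SPD and very ill-conditioned
x_true = np.ones(n)
y = A @ x_true
y_delta = y + 1e-6 * rng.standard_normal(n)  # noisy data with delta ~ 1e-6

x_exact = np.linalg.solve(A, y)              # reference A^{-1} y

for alpha in [1e-2, 1e-4, 1e-6, 1e-8, 1e-10]:
    R = np.linalg.solve(A @ A + alpha * np.eye(n), A)   # R_alpha = (A^2 + alpha I)^{-1} A
    approx_err = np.linalg.norm(x_exact - R @ y)        # approximation error
    data_err   = np.linalg.norm(R @ y - R @ y_delta)    # data error
    total_err  = np.linalg.norm(x_exact - R @ y_delta)
    print(f"alpha = {alpha:.0e}: approx = {approx_err:.2e}, "
          f"data = {data_err:.2e}, total = {total_err:.2e}")
```

For small α the data error dominates, for large α the approximation error dominates; balancing the two is exactly the role of the parameter choice rules discussed below.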


Making a plan...

In infinite dimensions:
• y = Ax for A ∈ L(X,Y) compact, dim(R(A)) = ∞.
• We are given yδ ∈ Y and know ‖y − yδ‖ ≤ δ.
• Goal: find x.
• Problems:
  • A†yδ might not even be defined! yδ ∈ Y, and Y ≠ D(A†).
  • A† is discontinuous → ‖A†yδ − x‖ can be arbitrarily large!


Making a plan...

Idea: Try something similar to the finite dimensional case!
• Define a family {Rα} of continuous operators Rα : Y → X.
• Index α ∈ I ⊂ ]0, α0[.
• We need: Rα y → A†y for α → 0.
• Careful: there are y ∈ Y\D(A†) with ‖Rα y‖ → ∞!
• Desired: α = α(δ, yδ) with Rα(δ,yδ) yδ → A†y as δ → 0.
• This should work for all y ∈ D(A†), yδ ∈ Y.


Regularization

Definition: Regularization
Let A ∈ L(X,Y), and for every α ∈ ]0, α0[ let Rα : Y → X be a continuous operator. The family {Rα}α∈I is called a regularization (or regularization operator) for A†, if for all y ∈ D(A†) there exists a parameter choice rule α = α(δ, yδ), α : ℝ⁺ × Y → I, such that
$$\lim_{\delta\to 0}\;\sup\bigl\{\|R_{\alpha(\delta,y^\delta)}\,y^\delta - A^\dagger y\| \;\big|\; y^\delta\in Y,\ \|y-y^\delta\|\le\delta\bigr\} = 0, \tag{1}$$
and
$$\lim_{\delta\to 0}\;\sup\bigl\{\alpha(\delta,y^\delta) \;\big|\; y^\delta\in Y,\ \|y-y^\delta\|\le\delta\bigr\} = 0. \tag{2}$$
For a specific y ∈ D(A†), the pair (Rα, α) is called a (convergent) regularization method for Ax = y if (1) and (2) hold.

Some considerations on the board.


Regularization

Proposition: Pointwise convergence 1
Let A ∈ L(X,Y) and let Rα : Y → X, α ∈ ℝ⁺, be a family of continuous operators. If Rα y → A†y for all y ∈ D(A†) as α → 0, then there exists an a-priori parameter choice rule α = α(δ) such that (Rα, α) is a convergent regularization method.

Proposition: Pointwise convergence 2
If (Rα, α) is a regularization method with a continuous parameter choice rule α(δ, yδ), then the Rα converge pointwise to A† on D(A†), i.e. Rα y → A†y for α → 0.


Regularization

Assume we have a family of Rα ∈ L(Y,X) that converges pointwise to A† on D(A†). What kind of a-priori parameter choice rules work?

Proposition: A-priori parameter choice rules
Let Rβ ∈ L(Y,X) be a family of operators such that Rβ → A† pointwise for β → 0. Then an a-priori parameter choice rule α = α(δ) makes (Rα, α) a regularization method if and only if
$$\lim_{\delta\to 0}\alpha(\delta) = 0, \qquad \lim_{\delta\to 0}\;\delta\,\|R_{\alpha(\delta)}\| = 0.$$

Proof: Board.
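A worked example (not on the slide, assuming Rα is the truncated SVD, i.e. gα(σ) = 1/σ for σ ≥ α and 0 otherwise): then ‖Rα‖ ≤ 1/α, and the simple a-priori rule α(δ) = √δ satisfies both conditions,
$$\|R_\alpha\| \;=\; \sup_{\sigma_n\ge\alpha}\frac{1}{\sigma_n} \;\le\; \frac{1}{\alpha}, \qquad \alpha(\delta)=\sqrt{\delta} \;\Rightarrow\; \lim_{\delta\to 0}\alpha(\delta)=0 \ \text{ and }\ \delta\,\|R_{\alpha(\delta)}\| \;\le\; \frac{\delta}{\sqrt{\delta}} \;=\; \sqrt{\delta} \;\to\; 0.$$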


Parameter choice rules

What kind of parameter choice rules work?
• Seen: if Rα y → A†y, then a-priori choices α = α(δ) work.
• Possible improvements with α = α(δ, yδ)?
• Tempting in practice: α = α(yδ). Can this work?

Theorem: Data-dependent parameter choice impossible
Let A ∈ L(X,Y) and let {Rα} be a regularization for A† with a parameter choice rule α which depends on yδ only (and not on δ), such that the regularization method converges for every y ∈ D(A†). Then A† is continuous.

Proof: Board

Remark: This makes life difficult for applications!


Behavior of Rα

What happens on Y\D(A†)?

Theorem: Banach-Steinhaus
Let (Rn)n∈ℕ, Rn ∈ L(Y,X), be a sequence. The Rn converge pointwise to an operator R ∈ L(Y,X) if and only if the following two conditions are met:
• the sequence (‖Rn‖)n∈ℕ is bounded in ℝ,
• there exists a dense subset Y0 ⊂ Y such that (Rn y0)n∈ℕ converges in X for every y0 ∈ Y0.

Conclusion
If A ∈ L(X,Y), R(A) is not closed, and Rα converges pointwise to A† on D(A†), then ‖Rα‖ → ∞ for α → 0.


Behavior of Rα

What happens on Y\D(A†)?

Theorem: Uniform boundedness principle
If
$$\sup\{\|R_\alpha y\| \mid \alpha \in\, ]0,\alpha_0[\,\} < \infty \quad \text{for all } y \in Y,$$
then
$$\sup\{\|R_\alpha\| \mid \alpha \in\, ]0,\alpha_0[\,\} < \infty.$$

Conclusion
If A ∈ L(X,Y), R(A) is not closed, and Rα converges pointwise to A† on D(A†), then there exists a y ∈ Y\D(A†) for which ‖Rα y‖ → ∞ as α → 0.


Behavior of Rα

What happens on Y\D(A†)?

Theorem: Divergence on Y\D(A†)
Let A ∈ L(X,Y) be compact, dim(R(A)) = ∞, and let Rα ∈ L(Y,X) be a family of regularization operators. If
$$\sup_{\alpha>0}\|A R_\alpha\| < \infty,$$
then ‖Rα y‖ → ∞ for y ∉ D(A†).

Proof: Requires some functional analysis and is given in Engl, Hanke, Neubauer, Regularization of Inverse Problems, 1996.


Construction of regularizations

How can we actually construct regularizations?
• A ∈ L(X,Y) compact, dim(range(A)) = ∞,
$$A^\dagger y = \sum_{n=1}^{\infty} \frac{1}{\sigma_n}\langle v_n, y\rangle\, u_n \quad \text{for } y \in D(A^\dagger).$$
• The unboundedness is due to σn → 0.
• We need operators Rα that are
  • defined on all of Y,
  • continuous,
  • converging pointwise to A†.
• Idea:
$$R_\alpha y = \sum_{n=1}^{\infty} g_\alpha(\sigma_n)\langle v_n, y\rangle\, u_n$$


Construction of regularizations

SVD-based regularizations
For Rα : Y → X defined by
$$R_\alpha y = \sum_{n=1}^{\infty} g_\alpha(\sigma_n)\langle v_n, y\rangle\, u_n$$
and a function gα : ℝ⁺ → ℝ⁺ with
1. gα(σ) → 1/σ for every σ > 0 as α → 0,
2. gα(σ) ≤ Cα < ∞ for all σ > 0,
3. σ gα(σ) ≤ C < ∞ for all α and all σ > 0,
Rα is a regularization for A†.

Proof + considerations on the board.
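Two classical filters that fit this scheme are truncated SVD, gα(σ) = 1/σ for σ ≥ α and 0 otherwise, and the Tikhonov filter gα(σ) = σ/(σ² + α). The Python sketch below (not from the slides, reusing the Gaussian-kernel operator from the earlier sketches) checks the boundedness conditions 2 and 3 numerically and applies the resulting Rα to noisy data; the value of α is an arbitrary illustrative choice.

```python
import numpy as np

def g_tikhonov(sigma, alpha):
    # Tikhonov filter: g_alpha(sigma) = sigma / (sigma^2 + alpha)
    return sigma / (sigma**2 + alpha)

def g_tsvd(sigma, alpha):
    # Truncated SVD filter: 1/sigma where sigma >= alpha, zero otherwise
    # (the maximum only avoids a divide-by-zero warning; those entries are masked out)
    return np.where(sigma >= alpha, 1.0 / np.maximum(sigma, 1e-300), 0.0)

alpha = 1e-4
s = np.logspace(-8, 0, 1000)
print("Tikhonov: sup g_alpha       =", g_tikhonov(s, alpha).max(),
      "(C_alpha = 1/(2*sqrt(alpha)) =", 1 / (2 * np.sqrt(alpha)), ")")
print("Tikhonov: sup sigma*g_alpha =", (s * g_tikhonov(s, alpha)).max(), "<= 1")
print("TSVD:     sup sigma*g_alpha =", (s * g_tsvd(s, alpha)).max(), "<= 1")

# Apply R_alpha y = sum_n g_alpha(sigma_n) <v_n, y> u_n to a noisy problem.
rng = np.random.default_rng(2)
m = 200
t = (np.arange(m) + 0.5) / m
K = np.exp(-(t[:, None] - t[None, :])**2 / (2 * 0.05**2)) / m
U, sigma, Vt = np.linalg.svd(K)
x_true = np.sin(2 * np.pi * t)
y_delta = K @ x_true + 1e-3 * rng.standard_normal(m)

for name, g in [("Tikhonov", g_tikhonov), ("TSVD", g_tsvd)]:
    x_alpha = Vt.T @ (g(sigma, alpha) * (U.T @ y_delta))
    print(f"{name}: ||x_alpha - x_true|| = {np.linalg.norm(x_alpha - x_true):.3e}")
```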


Recalling Regularization

How can we get a stable estimate for x, if we are given yδ = Ax + n, ‖n‖ ≤ δ, and A† is discontinuous?
• Adapt ideas from the finite dimensional case.
• Look for a family of continuous operators Rα : Y → X with
$$\lim_{\delta\to 0}\;\sup\bigl\{\|R_{\alpha(\delta,y^\delta)}\,y^\delta - A^\dagger y\| \;\big|\; \|y-y^\delta\|\le\delta\bigr\} = 0.$$
• This means that Rα y → A†y on D(A†).
• We need a parameter choice rule α:
  • if Rα y → A†y on D(A†), then α = α(δ) works,
  • α = α(δ, yδ) can possibly improve the results,
  • α = α(yδ) cannot work in general for ill-posed problems.
• If {Rα}, Rα ∈ L(Y,X), is a regularization, then a particular choice α = α(δ) works if and only if
$$\lim_{\delta\to 0}\alpha(\delta) = 0, \qquad \lim_{\delta\to 0}\;\delta\,\|R_{\alpha(\delta)}\| = 0.$$


Recalling Regularization

How do we construct a regularization method?

• For compact linear operators use the SVD

  A†y = ∑_{n=1}^∞ (1/σ_n) ⟨y, v_n⟩ u_n

  and construct

  Rα y = ∑_{n=1}^∞ gα(σ_n) ⟨y, v_n⟩ u_n

• We need
  1. gα(σ) → 1/σ for σ > 0 as α → 0,
  2. gα(σ) ≤ Cα < ∞ for all σ > 0,
  3. σ gα(σ) ≤ C < ∞ for all α and σ > 0,
  4. for the choice rule: δ C_{α(δ)} → 0.


Truncated SVD

Example: Truncated SVD

  gα(σ)                        Cα      Convergence
  1/σ if σ ≥ α, 0 else         1/α     δ/α → 0

[Plot: g_{0.2}(σ) compared with 1/σ for σ ∈ [0, 1].]

Pros and cons?
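
A quick check (a sketch, not from the slides) that this filter fits the four conditions above: for fixed σ > 0 we have gα(σ) = 1/σ as soon as α ≤ σ, so condition 1 holds; gα(σ) ≤ 1/α = Cα gives condition 2; σ gα(σ) ≤ 1 =: C gives condition 3; and the choice-rule requirement δ C_{α(δ)} → 0 becomes δ/α(δ) → 0, which is exactly the convergence column above.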


Bringmann-Drizsga regularization

Example: Bringmann-Drizsga regularization

  gα(σ)                          Cα      Convergence
  1/σ if σ ≥ α, 1/α else         1/α     δ/α → 0

[Plot: g_{0.2}(σ) compared with 1/σ for σ ∈ [0, 1].]

Pros and cons?


Lavrentiev regularization

Example: Lavrentiev regularization

  gα(σ)           Cα      Convergence
  1/(σ + α)       1/α     δ/α → 0

[Plot: g_{0.15}(σ) compared with 1/σ for σ ∈ [0, 1].]

Pros and cons?


Tikhonov regularization

Example: Tikhonov regularization

  gα(σ)            Cα           Convergence
  σ/(σ² + α)       1/(2√α)      δ/√α → 0

[Plot: g_{0.005}(σ) compared with 1/σ for σ ∈ [0, 1].]

Pros and cons?


Iterative regularization

Example: Landweber regularization

  g_k(σ)                       C_k                Convergence
  (1 − (1 − τσ²)^k)/σ          More difficult     More difficult

[Plot: g_{20}(σ) compared with 1/σ for σ ∈ [0, 1].]

Pros and cons?
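
To see how the filters above approximate 1/σ (and to reproduce plots like the ones referenced in the examples), here is a small sketch; the filter formulas are the ones from the tables, while the plotting grid and parameter values are illustrative assumptions.

import numpy as np
import matplotlib.pyplot as plt

def g_tsvd(s, a):       return np.where(s >= a, 1.0 / s, 0.0)          # truncated SVD
def g_bd(s, a):         return np.where(s >= a, 1.0 / s, 1.0 / a)      # Bringmann-Drizsga
def g_lavrentiev(s, a): return 1.0 / (s + a)                           # Lavrentiev
def g_tikhonov(s, a):   return s / (s**2 + a)                          # Tikhonov
def g_landweber(s, k, tau=1.0):                                        # Landweber after k steps
    return (1.0 - (1.0 - tau * s**2)**k) / s

sigma = np.linspace(1e-3, 1.0, 500)   # stay away from sigma = 0 to avoid division by zero
plt.plot(sigma, 1.0 / sigma, "k--", label="1/sigma")
plt.plot(sigma, g_tsvd(sigma, 0.2), label="truncated SVD, alpha=0.2")
plt.plot(sigma, g_bd(sigma, 0.2), label="Bringmann-Drizsga, alpha=0.2")
plt.plot(sigma, g_lavrentiev(sigma, 0.15), label="Lavrentiev, alpha=0.15")
plt.plot(sigma, g_tikhonov(sigma, 0.005), label="Tikhonov, alpha=0.005")
plt.plot(sigma, g_landweber(sigma, 20), label="Landweber, k=20")
plt.ylim(0, 8)
plt.legend()
plt.show()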


The discrepancy principle for iterative regularization

Monotonic Improvement

Let γ = 2/(2 − τ‖A‖²) and let x_k^δ be given by the Landweber iteration with noisy data yδ. Let y = Ax†. Then x_{k+1}^δ is a better approximation of x† than x_k^δ as long as ‖A x_k^δ − yδ‖ > γδ.

Discrepancy Principle

The discrepancy principle is the parameter choice rule for the Landweber iteration based on choosing the regularized solution to be x_{k∗}^δ with

  k∗(δ, yδ) := inf{ k | ‖yδ − A x_k^δ‖ < γδ }.
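
A minimal sketch of the Landweber iteration stopped by the discrepancy principle (the step size tau, the default gamma and the iteration cap are illustrative assumptions; the slides only fix the stopping rule itself):

import numpy as np

def landweber_discrepancy(A, y_delta, delta, gamma=1.1, tau=None, max_iter=100000):
    """Landweber iteration x_{k+1} = x_k + tau * A^T (y_delta - A x_k),
    stopped at the first k with ||y_delta - A x_k|| < gamma * delta."""
    if tau is None:
        tau = 1.0 / np.linalg.norm(A, 2)**2   # ensures tau * ||A||^2 <= 1
    x = np.zeros(A.shape[1])
    for k in range(max_iter):
        residual = y_delta - A @ x
        if np.linalg.norm(residual) < gamma * delta:
            return x, k
        x = x + tau * (A.T @ residual)
    return x, max_iter

Consistent with the finite-determination statement on the next slide, the loop terminates after finitely many steps whenever δ > 0.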


The discrepancy principle for iterative regularization

Finite Determination

Let y = Ax†. If τ ≤ 1/‖A‖² is fixed, then the discrepancy principle

  k∗(δ, yδ) := inf{ k | ‖yδ − A x_k^δ‖ < γδ }

determines a finite stopping index k∗(δ, yδ) for the Landweber iteration for any γ > 1, with k∗(δ, yδ) ∈ O(δ⁻²).

Proof: Board.

One can show that the discrepancy principle makes the Landweber iteration a convergent regularization method.


Further Questions

We have seen that regularization can yield continuous dependence of our reconstruction on the data in the sense that

  lim_{δ→0} sup{ ‖R_{α(δ,yδ)} yδ − A†y‖ : ‖y − yδ‖ ≤ δ } = 0

But can we say something about the rate of convergence? In particular, can we show

  ‖R_{α(δ)}(yδ) − A†y‖ ≤ C δ^ν

for some ν?


How fast do regularization methods converge?


Convergence of SVD based methods

Let us stay in the setting of compact operators and regularizations defined via

  Rα yδ = ∑_{n∈I} gα(σ_n) ⟨yδ, v_n⟩ u_n.

We do our usual estimate

  ‖x† − Rα yδ‖ ≤ ‖x† − Rα y‖ + Cα δ.

The convergence speed depends on two factors, one being

  ‖x† − Rα y‖ = √( ∑_{n∈I} (gα(σ_n) − 1/σ_n)² |⟨y, v_n⟩|² )
              = √( ∑_{n∈I} (σ_n gα(σ_n) − 1)² |⟨x†, u_n⟩|² ),

where the second equality uses ⟨y, v_n⟩ = ⟨A x†, v_n⟩ = σ_n ⟨x†, u_n⟩.


Convergence of SVD based methods

We have

  ‖x† − Rα y‖ = √( ∑_{n∈I} (σ_n gα(σ_n) − 1)² |⟨x†, u_n⟩|² ).

In particular, if x† = u_n then

  ‖x† − Rα y‖ = |σ_n gα(σ_n) − 1|.


Convergence of SVD based methods

Consider the convergence of

  |σ_n gα(σ_n) − 1|

• Good: For fixed n it holds that for all ε > 0 there exists an α0 small enough such that

  |σ_n gα(σ_n) − 1| ≤ ε   for all α ≤ α0.

  This is the reason for Rα being a regularization.

• Bad: For fixed α it holds that for all ε > 0 there exists an N large enough such that

  |σ_n gα(σ_n) − 1| ≥ 1 − ε   for all n ≥ N.

  Reason: gα(σ) is bounded and σ_n → 0.
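
As a concrete illustration (not on the slides): for the Tikhonov filter one computes σ gα(σ) − 1 = −α/(σ² + α), so for fixed α this factor tends to −1 as σ → 0, i.e. |σ_n gα(σ_n) − 1| → 1 along the singular values, exactly the "bad" behaviour described above.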


Convergence of SVD based methods

Conclusion

Without further assumptions, the convergence of a regularization method can be arbitrarily slow!

What can we assume?

The convergence depended on

  ∑_{n∈I} (σ_n gα(σ_n) − 1)² |⟨x†, u_n⟩|²

We need additional information about the decay of |⟨x†, u_n⟩|²!


Convergence of SVD based methods

Definition: Source condition

We say that the minimal norm solution x† satisfies a source condition of order µ if there exists a w ∈ X such that x† = (A∗A)^µ w, where the latter is defined as

  (A∗A)^µ w := ∑_{n=1}^∞ σ_n^{2µ} ⟨w, u_n⟩ u_n.

Assume that x† satisfies a source condition of order µ, and for our SVD based regularization let there exist a function ωµ(α) such that

  t^{2µ} |t gα(t) − 1| ≤ ωµ(α).

Then it holds that

  ‖Rα y − A†y‖ ≤ ωµ(α) ‖w‖.
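
A one-line sketch of why the bound holds (the full proof is done on the board): inserting |⟨x†, u_n⟩| = σ_n^{2µ} |⟨w, u_n⟩| into the error formula from above gives ‖Rα y − A†y‖² = ∑_n (σ_n gα(σ_n) − 1)² σ_n^{4µ} |⟨w, u_n⟩|² ≤ ωµ(α)² ∑_n |⟨w, u_n⟩|² ≤ ωµ(α)² ‖w‖².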


Convergence of SVD based methods

Proof, considerations and the example of the truncated SVD on the board.

• We get ‖Rα yδ − A†y‖ ≤ ωµ(α)‖w‖ + δ Cα.
• Typically we can get some rate of decay like ωµ(α) = α^{2µ} for µ ≤ µ0.
• This allows us to derive a convergence rate!
• The optimal order of convergence is δ^{2µ/(2µ+1)}.
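
As a sketch of where this order comes from (using the truncated SVD, for which one can check ωµ(α) = α^{2µ} and Cα = 1/α): balancing the two terms α^{2µ}‖w‖ and δ/α leads to α ∼ δ^{1/(2µ+1)}, and plugging this back in gives a total error of order δ^{2µ/(2µ+1)}.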


Convergence of SVD based methods

What does it mean to meet a source condition?

• Easiest interpretation for µ = 1.
• The source condition means x† ∈ range(A∗A).
• Example: Differentiation

  A w(t) = ∫_0^t w(s) ds,
  A∗A w(t) = ∫_t^1 ∫_0^r w(s) ds dr.

• Twice differentiable! Smoothness!
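
Spelling this out (a sketch, not on the slides): if x†(t) = ∫_t^1 ∫_0^r w(s) ds dr, then (x†)′(t) = −∫_0^t w(s) ds and (x†)″(t) = −w(t), so the source condition with µ = 1 asks that x† is twice differentiable with w = −(x†)″ ∈ X and, in addition, satisfies the boundary conditions x†(1) = 0 and (x†)′(0) = 0.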


Convergence of SVD based methods

Punchlines:

• Without smoothness assumptions, regularization methods can converge arbitrarily slowly.
• Under the assumption of a source condition, one can derive a convergence rate.
• Under our assumption the optimal order a regularization can achieve is O(δ^{2µ/(2µ+1)}).
• The convergence rate is worse than the error in the data.
• Most regularization methods can achieve this rate for µ ≤ µ0.
• Some can achieve it unconditionally (e.g. truncated SVD).
• The output error can be of the same order as the data error.


Tikhonov Regularization

We have seen that Tikhonov regularization, given via

  gα(σ) = σ/(σ² + α),

can be obtained by solving

  (A∗A + αI) x_α^δ = A∗yδ.

Next: Tikhonov regularization is solving an optimization problem

The x_α^δ obtained from Tikhonov regularization are the unique minimizers of

  E(x) = ‖Ax − yδ‖² + α‖x‖².

• This allows us to generalize our concept of regularization!
• Create regularizations by convex functionals without the SVD!
• Generalize the theory from linear to nonlinear problems!
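
A small sketch (illustrative, not from the slides) checking numerically that the SVD filter gα(σ) = σ/(σ² + α), the normal equations, and the minimizer of E(x) all give the same x_α^δ for a matrix A; the random test matrix, the data and the value of alpha are arbitrary.

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 20))
y_delta = rng.standard_normal(30)
alpha = 0.1

# (1) SVD filter: sum_n sigma_n/(sigma_n^2 + alpha) <v_n, y> u_n
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_filter = Vt.T @ ((s / (s**2 + alpha)) * (U.T @ y_delta))

# (2) Normal equations: (A^T A + alpha I) x = A^T y_delta
x_normal = np.linalg.solve(A.T @ A + alpha * np.eye(20), A.T @ y_delta)

# (3) Minimizer of E(x) = ||Ax - y||^2 + alpha ||x||^2, written as a stacked least-squares problem
A_aug = np.vstack([A, np.sqrt(alpha) * np.eye(20)])
y_aug = np.concatenate([y_delta, np.zeros(20)])
x_lsq = np.linalg.lstsq(A_aug, y_aug, rcond=None)[0]

print(np.allclose(x_filter, x_normal), np.allclose(x_normal, x_lsq))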
