Extremal Problems of Information Combining
Alexei Ashikhmin
Information Combining: formulation of the problem
Mutual Information Function for the Single Parity Check Codes
More Extremal Problems of Information Combining
Solutions (with the help of Tchebysheff Systems) for the Single Parity Check Codes
Joint work with Yibo Jiang, Ralf Koetter, Andrew Singer
Information Transmission
[Figure: encoder, parallel channels, APP decoder.]
The density function of the channel is not known. We only know the mutual information between the channel input and output.
Optimization Problem
We assume that the mutual information of the channel is fixed, and that the channel is symmetric.
Problem 1
Among all probability distributions with this fixed mutual information, determine the probability distribution that maximizes (minimizes) the mutual information at the output of the optimal decoder.
[Figure: iterative decoder structure. Variable nodes processing and check nodes processing are connected through an interleaver; input comes from the channel; messages flow from the variable nodes to the decoder of the single parity check code and back to the variable nodes.]
Problem is Solved Already
1. I. Land, P. Hoeher, S. Huettinger, J. Huber, 2003
2. I. Sutskover, S. Shamai, J. Ziv, 2003
Repetition code:
The Binary Erasure Channel (BEC) is the best
The Binary Symmetric Channel (BSC) is the worst
Single Parity Check Code (the dual of the repetition code):
The Binary Erasure Channel (BEC) is the worst
The Binary Symmetric Channel (BSC) is the best
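A numerical sanity check of the repetition-code claim (a sketch, not from the slides; the combined mutual information is computed from first principles for two independent observations of the same bit):

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_p_for_info(info):
    """Bisection for the crossover p in [0, 0.5] with 1 - h(p) = info."""
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if 1 - h(mid) > info else (lo, mid)
    return (lo + hi) / 2

def bec_repetition_info(i1):
    """Two BEC observations: an erasure survives only if both uses erase."""
    e = 1 - i1
    return 1 - e * e

def bsc_repetition_info(i1):
    """I(X; Y1, Y2) for two independent BSC observations at per-use info i1."""
    p = bsc_p_for_info(i1)
    agree = ((1 - p) ** 2 + p ** 2) / 2   # P(y1 = y2 = +1) = P(y1 = y2 = -1)
    differ = p * (1 - p)                  # P(y1 = +1, y2 = -1) = P(y1 = -1, y2 = +1)
    joint_entropy = -2 * agree * math.log2(agree) - 2 * differ * math.log2(differ)
    return joint_entropy - 2 * h(p)       # H(Y1, Y2) - H(Y1, Y2 | X)

# BEC is the best channel for the repetition code: more combined information.
assert bec_repetition_info(0.5) > bsc_repetition_info(0.5)
```

At per-use information 0.5 the two-observation BEC delivers 0.75 bits while the matched BSC delivers about 0.7135 bits, consistent with the BEC being best for the repetition code.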
Our Goals
We would like to solve the optimization problem for the Single Parity Check Codes directly (without using duality), and to obtain some improvements.
Soft Bits
We call T = E[X | Y] the soft bit; it has support on [-1, 1].
[Figure: soft-bit distribution of the erasure channel.]
[Figure: soft-bit distributions of the binary symmetric channel and the Gaussian channel.]
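For the two extremal channels the soft-bit distribution can be written down explicitly (reconstructed from the standard definitions, not taken from the slides); conditioned on X = +1:

```latex
% BEC with erasure probability e:
\Pr[T = 1] = 1 - e, \qquad \Pr[T = 0] = e \quad (\text{erasure}),
% BSC with crossover probability p:
\Pr[T = 1 - 2p] = 1 - p, \qquad \Pr[T = -(1 - 2p)] = p .
% Hence  E[T^2] = 1 - e  for the BEC and  E[T^2] = (1 - 2p)^2  for the BSC.
```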
[Figure: encoder of the single parity check code, parallel channels, decoder of the single parity check code.]
Results of E. Sharon, A. Ashikhmin, S. Litsyn:
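At a check node of a single parity check code the extrinsic soft-bit magnitude is the product of the incoming soft-bit magnitudes (the tanh rule). A minimal sketch, assuming two BSC observations, checking this algebraically:

```python
# Tanh rule at a check node: the extrinsic soft-bit magnitude for the
# parity of two bits observed through BSC(p1) and BSC(p2) equals the
# product of the individual soft-bit magnitudes.
def soft_bit_magnitude(p):
    """|T| = 1 - 2p for a BSC with crossover probability p."""
    return 1 - 2 * p

def parity_crossover(p1, p2):
    """The parity is flipped iff exactly one of the two bits is flipped."""
    return p1 * (1 - p2) + p2 * (1 - p1)

p1, p2 = 0.1, 0.2
lhs = soft_bit_magnitude(parity_crossover(p1, p2))
rhs = soft_bit_magnitude(p1) * soft_bit_magnitude(p2)
# Identity: 1 - 2(p1 + p2 - 2 p1 p2) = (1 - 2 p1)(1 - 2 p2)
assert abs(lhs - rhs) < 1e-12
```

For p1 = 0.1, p2 = 0.2 both sides equal 0.48, the soft-bit magnitude of the serial concatenation.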
Properties of the moments
Lemma
1. The moment sequence is nonnegative and nonincreasing.
2. The ratio sequence of consecutive moments is nonincreasing.
Lemma
In the Binary Erasure Channel all moments are the same.
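Both lemmas can be checked directly on the two extremal channels, assuming the even-moment convention $m_k = E[T^{2k}]$ (an assumption here, since the slide's formulas did not survive extraction):

```latex
% BEC with erasure probability e:  T \in \{0, \pm 1\},  so  T^{2k} = \mathbf{1}[T \neq 0]:
m_k = E\!\left[T^{2k}\right] = 1 - e \quad \text{for all } k \ge 1 .
% BSC with crossover p:  |T| = 1 - 2p  with probability one:
m_k = (1 - 2p)^{2k}, \qquad \frac{m_{k+1}}{m_k} = (1 - 2p)^2 \le 1 \ \text{(constant, hence nonincreasing)} .
```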
Problem 2
Among all T-consistent probability distributions on [0,1] such that the mutual information is fixed, determine the probability distribution that maximizes (minimizes) the second moment.
Solution of Problem 2
Theorem
Among all binary-input symmetric-output channel distributions with a fixed mutual information, the Binary Symmetric Channel maximizes, and the Binary Erasure Channel minimizes, the second moment.
Proof: We use the theory of Tchebysheff Systems.
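A numeric check of the theorem (a sketch, not part of the talk): at a fixed mutual information I, the BEC has E[T^2] = 1 - e = I, while the matched BSC has E[T^2] = (1 - 2p)^2 with 1 - h(p) = I, and the BSC value is always at least the BEC value.

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_p_for_info(info):
    """Bisection for the crossover p in [0, 0.5] with 1 - h(p) = info."""
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if 1 - h(mid) > info else (lo, mid)
    return (lo + hi) / 2

def second_moments(info):
    """Return (E[T^2] of the BSC, E[T^2] of the BEC) at mutual information info."""
    p = bsc_p_for_info(info)
    return (1 - 2 * p) ** 2, info  # BSC: |T| = 1 - 2p surely; BEC: E[T^2] = 1 - e

for info in (0.1, 0.5, 0.9):
    sm_bsc, sm_bec = second_moments(info)
    assert sm_bec <= sm_bsc  # BEC minimizes, BSC maximizes the second moment
```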
Lemma
The Binary Symmetric Channel, the Binary Erasure Channel and an arbitrary channel with the same mutual information have the following layout of distributions:
[Figure: crossing layout of the three distributions.]
Lemma
Let and
1)
2) if for and for
then
This is exactly our case: the functions involved satisfy the conditions of the previous lemma.
Problem 1 on the extremum of mutual information and Problem 2 on the extremum of the second moment are equivalent.
Extrema of MMSE
It is known that the channel soft bit is the MMSE estimator of the channel input.
Theorem
Among all binary-input symmetric-output channels with fixed mutual information, the Binary Symmetric Channel has the minimum MMSE, and the Binary Erasure Channel has the maximum MMSE.
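The connection to Problem 2 is a one-line computation (standard, not on the slides): with $X \in \{\pm 1\}$ and soft bit $T = E[X \mid Y]$, the tower rule gives $E[XT] = E[T^2]$, so

```latex
\mathrm{MMSE} \;=\; E\!\left[(X - T)^2\right]
\;=\; \underbrace{E[X^2]}_{=\,1} \;-\; 2\,E[XT] \;+\; E[T^2]
\;=\; 1 - E[T^2].
```

Hence maximizing the second moment (the BSC) minimizes the MMSE, and minimizing it (the BEC) maximizes the MMSE, which is exactly the ordering in the theorem.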
How good the bounds are
Problem 3
1)
2)
Among all T-consistent channels satisfying 1) and 2), find the one that maximizes (minimizes) the mutual information at the decoder output.
[Figure: encoder of the single parity check code, parallel channels, decoder of the single parity check code.]
Problem 4
Among all T-consistent probability distributions on [0,1] such that
1)
2)
determine the probability distribution that maximizes (minimizes) the fourth moment.
Theorem
The distribution with mass at , mass at , and mass at 0 maximizes the fourth moment.
The distribution with mass at , mass at , and mass at 1 minimizes the fourth moment.
Extremum densities
[Figure: the maximizing and the minimizing extremal densities.]
Lemma
The channel attaining the minimum, the channel attaining the maximum, and an arbitrary channel with the same mutual information have the following layout of distributions:
[Figure: crossing layout of the three distributions.]
Problem 3 on the extremum of mutual information and Problem 4 on the extremum of the fourth moment are equivalent.
Assume that
and is the same as in AWGN channel with this
Tchebysheff Systems
Definition
A set of real continuous functions $u_0, u_1, \dots, u_n$ is called a Tchebysheff system (T-system) if for any real $a_0, \dots, a_n$, not all zero, the linear combination $\sum_{i=0}^{n} a_i u_i(t)$ has at most $n$ distinct roots on the interval.
Definition
A distribution is a nondecreasing, right-continuous function.
The moment space, defined by the moments $\int u_i \, d\sigma$, $\sigma \in V$ ($V$ is the set of valid distributions), is a closed convex cone.
For define
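A canonical example of a T-system (standard, not from the slides): the monomials. A nontrivial combination is a nonzero polynomial of degree at most $n$, hence it has at most $n$ distinct roots.

```latex
\{\,1,\ t,\ t^2,\ \dots,\ t^n\,\} \text{ is a T-system on } [0,1]:
\qquad \sum_{i=0}^{n} a_i t^i \not\equiv 0
\ \Longrightarrow\ \text{at most } n \text{ distinct roots.}
```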
Problem
For a given point of the moment space, find the distribution that maximizes (minimizes) one further moment functional.
Theorem
If and are T-systems, and then the extrema are attained uniquely with distributions and with finitely many mass points.
Lower principal representation
Upper principal representation
Soft Bits
We call T = E[X | Y] the soft bit; it has support on [-1, 1].
Lemma (Sharon, Ashikhmin, Litsyn)
If the channel is symmetric, then E[T^(2k-1)] = E[T^(2k)] for all k >= 1.
Random variables with this property are called T-consistent.
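A quick numeric check of this consistency property, assuming the identity E[T^(2k-1)] = E[T^(2k)] for symmetric channels (my reconstruction of the lemma, since the slide's formula did not survive extraction); conditioning on X = +1 for the two extremal channels:

```python
# T-consistency check: E[T^(2k-1)] == E[T^(2k)].
# Conditioned on X = +1:
#   BSC(p): T = 1-2p with prob. 1-p,  T = -(1-2p) with prob. p
#   BEC(e): T = 1   with prob. 1-e,  T = 0        with prob. e
def bsc_moment(p, m):
    t = 1 - 2 * p
    return (1 - p) * t**m + p * (-t)**m

def bec_moment(e, m):
    return (1 - e) * 1**m   # the erased mass at T = 0 contributes nothing

for k in range(1, 6):
    assert abs(bsc_moment(0.1, 2*k - 1) - bsc_moment(0.1, 2*k)) < 1e-12
    assert abs(bec_moment(0.3, 2*k - 1) - bec_moment(0.3, 2*k)) < 1e-12
```

For the BSC the odd moment is (1-2p) times t^(2k-1), which equals t^(2k); for the BEC every moment equals 1 - e.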
Find extrema of
under the constraints
Theorem
Systems and are T-systems on [0,1].
---------------------------------------------------------------------------------
The distribution that maximizes has only one mass point at :
it has probability mass at and at .
This is exactly the Binary Symmetric Channel.