Post on 22-Dec-2015
Linear Codes for Distributed Source Coding: Reconstruction of a Function of the Sources
D. Krithivasan and S. Sandeep Pradhan
University of Michigan, Ann Arbor
Presentation Overview
• Problem Formulation
• Motivation
• Nested Linear Codes
• Main Result
• Applications and Examples
• Conclusions
Problem Formulation
• Distributed source coding.
• Typical application: sensor networks.
• Example: lossless reconstruction of all sources; the achievable sum rate is the joint entropy (Slepian-Wolf).
Problem Formulation
• We ask: what if the decoder is interested only in a function of the sources?
• In general, a fidelity criterion of the form $d(X, Y; Z)$.
• Example: the average of the sensor measurements.
• Obvious strategy: reconstruct the sources and then compute the function.
• Are rate gains possible if we directly encode the function in a distributed setting?
Motivation: A Binary Example
• Korner and Marton: reconstruction of $Z = X \oplus_2 Y$.
• Centralized encoder:
  – Compute $Z = X \oplus_2 Y$.
  – Compress $Z$ using a good source encoder $f(Z)$.
• Suppose $f(\cdot)$ satisfies $f(X \oplus_2 Y) = f(X) \oplus_2 f(Y)$.
• Then the centralized scheme becomes a distributed scheme.
• Are there good source codes with this property?
  – Linear codes.
The Korner-Marton Coding Scheme
• $A$: a $k \times n$ matrix such that:
  – Decoder $\psi(\cdot)$ satisfies $\psi(A Z^n) = Z^n$ with high probability.
  – Entropy achieving: $\frac{k}{n} \approx H(Z)$.
• Encoders transmit $s_1 = A X^n$, $s_2 = A Y^n$.
• Decoder: $\psi(s_1 \oplus_2 s_2) = Z^n$ with high probability.
• Rate pair $(H(Z), H(Z))$ is achievable.
• Can be lower than the Slepian-Wolf bound $H(X, Y)$.
• The scheme works for addition in any finite field.
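The identity the decoder relies on, that the XOR of the two syndromes equals the syndrome of $Z^n$, follows from linearity over GF(2). A small sketch checks it (the matrix size and correlation level here are illustrative choices, not values from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 12, 6

# Both encoders share the same k x n binary matrix A (random here).
A = rng.integers(0, 2, size=(k, n))

# Correlated binary sources: Y differs from X by a sparse pattern Z.
x = rng.integers(0, 2, size=n)
z = (rng.random(n) < 0.2).astype(int)   # Z = X xor Y
y = (x + z) % 2

# Each encoder transmits only its k-bit syndrome instead of n bits.
s1 = A.dot(x) % 2
s2 = A.dot(y) % 2

# Linearity over GF(2): s1 xor s2 is exactly the syndrome A Z^n,
# which an entropy-achieving A lets the decoder invert to Z^n.
print(np.array_equal((s1 + s2) % 2, A.dot(z) % 2))  # True
```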
Properties of the Linear Code
• Matrix $A$: puts different typical $Z^n$ in different bins.
• Consider $C = \{x^n : A x^n = s_k\}$, a coset code.
• A good channel code for a channel with noise $Z$.
• Both encoders use identical codebooks.
  – Binning is completely "correlated".
  – Independent binning is more prevalent in information theory.
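A toy enumeration (sizes chosen by hand for illustration) shows how the syndrome bins the whole space into cosets of the code $C$:

```python
import itertools
import numpy as np

n, k = 4, 2
# A full-rank k x n binary matrix, chosen by hand for this sketch.
A = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0]])

# Bin every length-n binary word by its syndrome A x^n (mod 2).
bins = {}
for bits in itertools.product([0, 1], repeat=n):
    s = tuple(A.dot(bits) % 2)
    bins.setdefault(s, []).append(bits)

# The 2^k syndromes index cosets of equal size 2^(n-k); the coset
# with syndrome 0 is the linear code C itself.
print(len(bins), sorted({len(v) for v in bins.values()}))  # 4 [4]
```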
Slepian-Wolf Coding
• Function to be reconstructed: $F(X, Y) = (X, Y)$.
• Treat the binary sources as $\mathbb{F}_4$ sources $\tilde{X}, \tilde{Y}$.
• The function is equivalent to addition in $\mathbb{F}_4$: $\tilde{Z} = \tilde{X} \oplus_4 \tilde{Y}$.
• Encode the vector function one digit at a time.

  First digit of $\tilde{Z}$:

        Y=0  Y=1
  X=0    0    0
  X=1    1    1

  Second digit of $\tilde{Z}$:

        Y=0  Y=1
  X=0    0    1
  X=1    0    1
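One concrete way to realize this embedding (an illustrative choice of mine; the talk's $\mathbb{F}_4$ construction is analogous) is to map $X$ to $2X$ and $Y$ to itself, so the mod-4 sum never carries and its two binary digits are exactly the tables above:

```python
# Map X -> 2X and Y -> Y; then Z~ = 2X + Y (mod 4) never overflows,
# so the first (high) digit of Z~ is X and the second (low) digit is Y,
# matching the digit tables: recovering Z~ recovers the pair (X, Y).
for x in (0, 1):
    for y in (0, 1):
        z_tilde = (2 * x + y) % 4
        first, second = z_tilde >> 1, z_tilde & 1
        assert (first, second) == (x, y)
print("each digit plane of Z~ recovers one source")
```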
Slepian-Wolf Coding contd.
• Use the Korner-Marton coding scheme on each digit plane.
• The sequential strategy achieves the Slepian-Wolf bound.
• General lossless strategy:
  – "Embed" the function in a digit plane field (DPF).
  – DPF: a direct sum of Galois fields of prime order.
  – Encode the digits sequentially using the Korner-Marton strategy.
Lossy Coding
• Quantize $X$ to $U$ and $Y$ to $V$.
• $G(U, V)$: the best estimate of $F(X, Y)$ w.r.t. the distortion measure, given $U, V$.
• Use lossless coding to encode $G(U, V)$.
• What we need: nested linear codes.
Nested Linear Codes
• Codes used in Korner-Marton and Slepian-Wolf: good channel codes.
  – Cosets bin the entire space.
  – Suitable for lossless coding.
• Lossy coding: need to quantize first.
  – Decrease coset density: nested linear codes.
  – Fine code: quantizes the source.
  – Coarse code: bins only the fine code.
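A minimal sketch of nesting via parity-check matrices (the matrices are invented for illustration): stacking extra rows onto the fine code's check matrix yields a coarse code contained in it.

```python
import itertools
import numpy as np

n = 4
H1 = np.array([[1, 1, 1, 1]])             # fine code: one parity check
H2 = np.array([[1, 1, 1, 1],
               [1, 0, 1, 0]])             # coarse code: H1 plus one more row

def codewords(H):
    """All length-n binary words whose syndrome under H is zero."""
    return {bits for bits in itertools.product([0, 1], repeat=n)
            if not (H.dot(bits) % 2).any()}

C1, C2 = codewords(H1), codewords(H2)

# Extra checks can only shrink the solution set, so C2 is nested in C1;
# the extra syndrome bit bins the fine code C1 into two cosets of C2.
assert C2 <= C1
print(len(C1), len(C2))  # 8 4
```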
Nested Linear Codes
• Linear code: $C \triangleq \{x^n : H x^n = 0^k\}$.
• $(C_1, C_2)$ nested if $C_2 \subset C_1$.
• We need:
  – $C_1 \subset U^n$: a "good" source code (can find $u^n \in C_1$ jointly typical with $x^n$).
  – $C_2 \subset Z^n$: a "good" channel code (can find a unique typical $z^n$ for a given $H_2 z^n$).
Good Linear Source Codes
• A good linear code $C_1$ for the triple $(X, U, P_{XU})$.
• Assume $U = \mathbb{F}_q$ for some prime $q$.
• Exists for large $n$ if $\frac{1}{n} \log|C_1| \ge \log q - H(U \mid X)$.
• Not a good source code in the Shannon sense.
  – Contains a subset that is a good Shannon source code.
• Linearity: rate loss of $(\log q - H(U))$ bits/sample.
Good Linear Channel Codes
• A good linear code $C_2$ for the triple $(Z, S, P_{ZS})$.
• Assume $Z = \mathbb{F}_q$ for some prime $q$.
• Exists for large $n$ if $\frac{1}{n} \log|C_2| \ge \log q - H(Z \mid S)$.
• Not a good channel code in the Shannon sense.
  – Every coset contains a subset which is a good channel code.
• Linearity: rate loss of $(\log q - H(Z))$ bits/sample.
Main Result
• Fix a test channel $U - X - Y - V$ such that $E\,d(F, G) \le D$.
• Embed $G(U, V)$ in $\mathrm{DPF}(s)$; need to encode $\tilde{Z} = \tilde{U} \oplus_s \tilde{V}$.
• Fix an order of encoding of the digit planes, $\pi_s(\cdot)$.
• Idea: encode $\tilde{Z}$ one digit at a time.
• At the $b$-th stage: use the previously reconstructed $(b-1)$ digits $\tilde{Z}_{\Pi_s(b)}$ as side information.
Coding Strategy for $\tilde{Z}_{\pi_s(b)}$
• Good source codes $C_{11b}, C_{12b}$ and a good channel code $C_{2b}$.
Cardinalities of the Linear Code
• Cardinalities of the nested codes:
  $\frac{1}{n} \log|C_{11b}| \ge \log q - H(\tilde{U}_{\pi_s(b)} \mid X, \tilde{U}_{\Pi_s(b)})$
  $\frac{1}{n} \log|C_{12b}| \ge \log q - H(\tilde{V}_{\pi_s(b)} \mid Y, \tilde{V}_{\Pi_s(b)})$
  $\frac{1}{n} \log|C_{2b}| \le \log q - H(\tilde{Z}_{\pi_s(b)} \mid \tilde{Z}_{\Pi_s(b)})$
• Rate of encoder 1:
  $R^{(1)}_{1b} \ge H(\tilde{Z}_{\pi_s(b)} \mid \tilde{Z}_{\Pi_s(b)}) - H(\tilde{U}_{\pi_s(b)} \mid X, \tilde{U}_{\Pi_s(b)})$
• Conventional coding:
  $R^{(2)}_{1b} \ge H(\tilde{U}_{\pi_s(b)} \mid \tilde{Z}_{\Pi_s(b)}) - H(\tilde{U}_{\pi_s(b)} \mid X, \tilde{U}_{\Pi_s(b)})$
Coding Theorem
• An achievable rate region:
  $\mathcal{RD}_{in} = \bigcup_{U - X - Y - V,\ s \in \mathcal{S},\ \pi_s} \big\{ (R_1, R_2, D) : R_1 \ge \textstyle\sum_{b=1}^{(s)} \min\{R^{(1)}_{1b}, R^{(2)}_{1b}\},\ R_2 \ge \textstyle\sum_{b=1}^{(s)} \min\{R^{(1)}_{2b}, R^{(2)}_{2b}\},\ D \ge E\,d(F(X, Y), G(U, V)) \big\}$
• Corollary:
  $\mathcal{RD}^{0}_{in} = \bigcup_{U - X - Y - V,\ s \in \mathcal{S},\ \pi_s} \{ (R_1, R_2, D) : R_1 \ge H(Z) - H(U \mid X),\ R_2 \ge H(Z) - H(V \mid Y),\ D \ge E\,d(F(X, Y), G(U, V)) \}$
Nested Linear Codes Achieve Rate Distortion Bound
• Choose $Y$ as constant and $G(U, V) = U$.
• It follows that $R = H(U) - H(U \mid X) = I(X; U)$ is achievable for any $P_X P_{U \mid X}$.
• Can also recover:
  – the Berger-Tung inner bound,
  – the Wyner-Ziv rate region,
  – Wyner's source coding with side information,
  – the Slepian-Wolf and Korner-Marton rate regions.
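As a sanity check of this special case, a short computation with a binary symmetric test channel (crossover probability $q$, an illustrative choice) reproduces the classical binary rate $1 - h(q)$:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Point-to-point special case: Y constant, G(U, V) = U, and a binary
# symmetric test channel U = X xor Q with P(Q = 1) = q. For uniform X,
# H(U) = 1 and H(U | X) = h2(q), so the achievable rate is
#   R = H(U) - H(U|X) = I(X; U) = 1 - h2(q),
# the classical rate-distortion value for a uniform binary source.
q = 0.11
R = 1 - h2(q)
print(round(R, 3))  # 0.5
```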
Lossy Coding of $Z = X \oplus_2 Y$
• Fix test channels $U = X \oplus_2 Q_1$, $V = Y \oplus_2 Q_2$.
• $Q_1, Q_2$: independent binary random variables.
• Reconstruct $Z = U \oplus_2 V$.
• Using the corollary to the rate region, can achieve
  $\mathcal{RD}_{in} = \bigcup_{Q_1, Q_2} \{ R_1 \ge H(Z) - H(Q_1),\ R_2 \ge H(Z) - H(Q_2),\ D \ge P(Q_1 \oplus_2 Q_2 \ne 0) \}$
• Can achieve more rate points by:
  – choosing more general test channels,
  – embedding in $\mathrm{DPF}(3)$, $\mathrm{DPF}(4)$.
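This region can be evaluated numerically. The sketch below uses illustrative parameters and reads $H(Z)$ as the entropy of the reconstructed sum $U \oplus_2 V$ (one plausible reading of the slide); both assumptions are mine, not the talk's:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def conv(a, b):
    """Binary convolution: P(A xor B = 1) for independent A, B."""
    return a * (1 - b) + b * (1 - a)

# Illustrative parameters (not from the talk):
p = 0.1              # P(X != Y), so X xor Y is Bernoulli(p)
q1 = q2 = 0.05       # test-channel noises Q1, Q2

# Reading H(Z) as the entropy of U xor V = (X xor Y) xor Q1 xor Q2:
Hz = h2(conv(p, conv(q1, q2)))
R1 = Hz - h2(q1)
R2 = Hz - h2(q2)
D = conv(q1, q2)     # P(Q1 xor Q2 != 0)

# Quantization buys rate: each R_i falls below the lossless
# Korner-Marton rate h2(p), at the price of distortion D.
print(R1 < h2(p) and R2 < h2(p) and D < p)  # True
```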
Conclusions
• Presented a unified approach to distributed source coding.
• Involves the use of nested linear codes.
• Coding: quantization followed by "correlated" binning.
• Recovers the known rate regions for many problems.
• Presents new rate regions for other problems.