
1

Local optimality in Tanner codes

June 2011

Guy Even, Nissim Halabi

2

Error Correcting Codes for Memoryless Binary-Input Output-Symmetric Channels (1)

Memoryless Binary-Input Output-Symmetric (MBIOS) channel: binary input, but the output is real.

Stochastic: characterized by a conditional probability function P( y | c ).

Memoryless: errors are independent from bit to bit.

Symmetric: errors affect '0's and '1's symmetrically.

Example: Binary Symmetric Channel (BSC)

[Figure: channel coding chain. A codeword c ∈ C ⊆ {0,1}^N passes through the noisy channel, producing a noisy codeword y ∈ ℝ^N; channel decoding outputs ĉ ∈ {0,1}^N. BSC transition diagram: 0→0 and 1→1 with probability 1−p, crossovers with probability p.]

3

Error Correcting Codes for Memoryless Binary-Input Output-Symmetric Channels (2)

Log-Likelihood Ratio (LLR) λ_i for a received observation y_i:

λ_i(y_i) = ln( P(y_i | c_i = 0) / P(y_i | c_i = 1) )

λ_i > 0 ⇒ y_i is more likely to be '0'

λ_i < 0 ⇒ y_i is more likely to be '1'

y ↦ λ(y) is an injective mapping ⇒ replace y by λ

[Figure: the same channel coding chain, codeword c ∈ C ⊆ {0,1}^N → noisy channel → noisy codeword y ∈ ℝ^N → channel decoding, now with the decoder operating on the LLR vector λ ∈ ℝ^N instead of y.]
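To make the LLR concrete, here is a minimal Python sketch (not from the slides) computing λ for the BSC example of the previous slide; the function name `bsc_llr` and the crossover probability `p` are illustrative:

```python
import numpy as np

def bsc_llr(y, p):
    """LLR for a BSC with crossover probability p:
    lambda_i = ln( P(y_i | c_i = 0) / P(y_i | c_i = 1) )."""
    y = np.asarray(y)
    return np.where(y == 0, np.log((1 - p) / p), np.log(p / (1 - p)))

print(bsc_llr([0, 1, 0], 0.1))   # positive entries favor '0', negative favor '1'
```

For p < 1/2, a received 0 yields λ_i = ln((1−p)/p) > 0, matching the sign convention above.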

4

Maximum-Likelihood (ML) Decoding

Maximum-likelihood (ML) decoding can be formulated, for any binary-input memoryless channel, by

x̂_ML(λ) = argmin_{x ∈ C} ⟨λ, x⟩

or, equivalently, as a linear program over the convex hull of the code:

x̂_ML(λ) = argmin_{x ∈ conv(C)} ⟨λ, x⟩, where conv(C) ⊆ [0,1]^N.

However, conv(C) has no efficient representation.
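Since x̂_ML minimizes ⟨λ, x⟩ over C, a brute-force decoder is a one-liner. The following sketch (illustrative only, exponential in the code dimension) shows the formulation directly:

```python
import numpy as np

def ml_decode(codewords, llr):
    """Brute-force ML decoding: return the codeword minimizing <llr, x>."""
    return min(codewords, key=lambda x: float(np.dot(llr, x)))

# toy example: the length-3 repetition code {000, 111}
C = [np.zeros(3), np.ones(3)]
print(ml_decode(C, np.array([1.2, -0.4, 0.8])))   # -> [0. 0. 0.]
```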

5

Linear Programming (LP) decoding [Fel03, FWK05] replaces conv(C) by a relaxed polytope P with the following properties:

(1) All codewords x ∈ C are vertices of P.

(2) All new vertices are fractional (therefore, new vertices are not in C).

(3) P has an efficient representation.

Solve the LP:

x̂_LP(λ) = argmin_{x ∈ P} ⟨λ, x⟩

If x̂_LP(λ) is integral, then x̂_LP = x̂_ML; if it is fractional, LP decoding fails.

[Figure: conv(C) ⊆ P ⊆ [0,1]^N; the vertices of P outside conv(C) are fractional.]

6

Linear Programming (LP) Decoding (continued)

Case 1: the LP optimum x̂_LP(λ) = argmin_{x ∈ P} ⟨λ, x⟩ is integral. Then the LP decoder finds the ML codeword: x̂_LP = x̂_ML.

7

Linear Programming (LP) Decoding (continued)

Case 2: the LP optimum x̂_LP(λ) is fractional. A fractional vertex is not a codeword, so the LP decoder fails.
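To illustrate the two outcomes, here is a sketch of LP decoding over Feldman's fundamental polytope for a plain parity-check code (all local codes are parity codes), assuming numpy and scipy are available; the odd-subset inequalities below are the standard efficient representation of P for this case, and the (7,4) Hamming matrix is only an example:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def lp_decode(H, llr):
    """LP decoding over the fundamental polytope: minimize <llr, x> subject
    to, for every check j and odd subset S of its neighborhood N(j),
    sum_{i in S} x_i - sum_{i in N(j)\\S} x_i <= |S| - 1."""
    m, n = H.shape
    A, b = [], []
    for j in range(m):
        nbrs = np.flatnonzero(H[j])
        for size in range(1, len(nbrs) + 1, 2):       # odd-sized subsets S
            for S in itertools.combinations(nbrs, size):
                row = np.zeros(n)
                row[list(S)] = 1.0
                row[[i for i in nbrs if i not in S]] = -1.0
                A.append(row)
                b.append(len(S) - 1)
    res = linprog(llr, A_ub=np.array(A), b_ub=np.array(b), bounds=[(0, 1)] * n)
    return res.x

# (7,4) Hamming code; all-zeros codeword sent over BSC(p), bit 2 flipped
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])
p = 0.05
y = np.zeros(7); y[2] = 1.0
llr = np.where(y == 0, np.log((1-p)/p), np.log(p/(1-p)))
print(np.round(lp_decode(H, llr), 3))   # integral -> ML codeword; fractional -> LP failure
```

An integral `res.x` certifies the ML codeword; a fractional solution signals LP failure. The tighter relaxation for non-trivial local codes appears on slide 9.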

8

Tanner Codes [Tan81]

Factor graph representation of Tanner codes: G = ( I ∪ J , E ) is a bipartite graph with variable nodes I and local-code nodes J.

Every local-code node j ∈ J is associated with a linear code C_j of length deg_G(j).

Tanner code C(G) and codewords x:

C(G) = { x ∈ {0,1}^N : for every j ∈ J, the projection of x onto N_G(j) is a codeword of the local code C_j }

Extended local-code C̄_j ⊆ {0,1}^N: extend each local codeword of C_j by arbitrary bits outside the local code.

Minimum local distance: d* ≜ min_j d(C_j), the minimum distance of the local codes.

Example: expander codes [SS'96]. The Tanner graph is an expander; a simple bit-flipping decoding algorithm.

[Figure: a Tanner graph with variable nodes x_1,…,x_10 and local-code nodes C_1,…,C_5.]
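A minimal sketch (not from the slides) of the membership condition, with hypothetical neighborhoods and local codebooks as inputs:

```python
def in_tanner_code(x, neighborhoods, local_codes):
    """x is in C(G) iff the projection of x onto every local-code node's
    neighborhood is a codeword of that node's local code."""
    return all(tuple(x[i] for i in nbrs) in local_codes[j]
               for j, nbrs in enumerate(neighborhoods))

# toy example: two length-3 parity codes sharing bit 2 (hypothetical graph)
EVEN3 = {(0,0,0), (1,1,0), (1,0,1), (0,1,1)}
neighborhoods = [(0, 1, 2), (2, 3, 4)]
print(in_tanner_code([1, 0, 1, 1, 0], neighborhoods, [EVEN3, EVEN3]))  # True
```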

9

LP Decoding of Tanner Codes

Maximum-likelihood (ML) decoding:

x̂_ML(λ) = argmin_{x ∈ conv(C)} ⟨λ, x⟩, where conv(C) = conv( ∩_j C̄_j ), the convex hull of the intersection of the extended local codes.

Linear Programming (LP) decoding [following Fel03, FWK05]:

x̂_LP(λ) = argmin_{x ∈ P} ⟨λ, x⟩, where P ≜ ∩_j conv(C̄_j).

P satisfies the requirements (1)-(3) above.

[Figure: P is the intersection of conv(extended local-code 1) and conv(extended local-code 2).]
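A sketch of LP decoding over P = ∩_j conv(C̄_j), under the same scipy assumption and with hypothetical input formats: each conv(C̄_j) is represented by a distribution α_j over the local codewords of C_j, with x tied to its marginals.

```python
import numpy as np
from scipy.optimize import linprog

def lp_decode_tanner(neighborhoods, local_codes, llr):
    """LP decoding over the generalized fundamental polytope: for each
    local-code node j, alpha_j is a distribution over the codewords of C_j
    (listed in the order of neighborhoods[j]), and x_i equals the marginal
    probability that bit i is 1 under alpha_j, for every j containing i."""
    n = len(llr)
    offs, total = [], n                       # x_0..x_{n-1}, then alpha blocks
    for j in range(len(neighborhoods)):
        offs.append(total)
        total += len(local_codes[j])
    A_eq, b_eq = [], []
    for j, nbrs in enumerate(neighborhoods):
        row = np.zeros(total)
        row[offs[j]:offs[j] + len(local_codes[j])] = 1.0
        A_eq.append(row); b_eq.append(1.0)    # alpha_j sums to 1
        for k, i in enumerate(nbrs):          # marginal constraints
            row = np.zeros(total); row[i] = -1.0
            for s_idx, s in enumerate(local_codes[j]):
                if s[k] == 1:
                    row[offs[j] + s_idx] = 1.0
            A_eq.append(row); b_eq.append(0.0)
    c = np.concatenate([llr, np.zeros(total - n)])
    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, 1)] * total)
    return res.x[:n]

EVEN3 = [(0,0,0), (1,1,0), (1,0,1), (0,1,1)]  # parity code of length 3
x = lp_decode_tanner([(0,1,2), (2,3,4)], [EVEN3, EVEN3],
                     np.array([2.9, 2.9, -2.9, 2.9, 2.9]))
print(np.round(x, 3))                         # -> all zeros
```

Enumerating local codewords is feasible because each local code has length deg_G(j), which is small for low-density graphs; this is what makes the relaxation tighter than the parity-check polytope when the local codes are non-trivial.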

10

Why Study (Irregular) Tanner Codes?

Tighter LP relaxation, because the local codes are not merely parity codes.

Irregularity of the Tanner graph yields codes with better performance in the case of LDPC codes [Luby et al., 98].

A regular low-density Tanner code with non-trivial local codes already corresponds to an irregular LDPC code (when the local codes are expanded into parity checks).

An irregular Tanner code with non-trivial local codes might yield improved designs.

11

Criteria of Interest

Let λ ∈ ℝ^N denote an LLR vector received from the channel. Let x ∈ C(G) denote a codeword. Consider the following questions:

Is x = x̂_ML(λ)? Is the ML solution unique? (Deciding this is NP-hard.)

Is x = x̂_LP(λ)? Is the LP solution unique?

Desired: an efficient test with one-sided error, answering "Definitely Yes" or "Maybe No", e.g., an efficient test via local computations: the "Local Optimality" criterion.

12

Combinatorial Characterization of Local Optimality (1)

Let x ∈ C(G) ⊆ {0,1}^N, f ∈ [0,1]^N, λ ∈ ℝ^N.

[Fel03] Define the relative point x ⊕ f by (x ⊕ f)_i ≜ |x_i − f_i|.

Consider a finite set B ⊆ [0,1]^N of deviations.

Definition: A codeword x is locally optimal for λ ∈ ℝ^N if for all vectors β ∈ B:

⟨λ, x ⊕ β⟩ > ⟨λ, x⟩

Goal: find a set B of deviations such that:
(1) x ∈ LO(λ) ⇒ x = x̂_ML(λ) and the ML solution is unique.
(2) x ∈ LO(λ) ⇒ x = x̂_LP(λ) and the LP solution is unique.
(3) Pr{ 0 ∈ LO(λ) | c = 0^N } = 1 − o(1).

All-Zeros Assumption: w.l.o.g. c = 0^N. Then LP decoding fails ⇒ 0 is not locally optimal, so Pr{ LP decoding fails | c = 0^N } ≤ Pr{ 0 ∉ LO(λ) | c = 0^N }.

[Figure: nested regions LO(λ) ⊆ ML(λ) ⊆ {λ : LP(λ) is integral}.]
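The definition translates directly into a brute-force certificate check. A sketch (not from the slides) assuming the deviation set B is given explicitly:

```python
import numpy as np

def relative_point(x, f):
    """Feldman's relative point: (x XOR f)_i = |x_i - f_i|."""
    return np.abs(np.asarray(x, float) - np.asarray(f, float))

def is_locally_optimal(x, llr, deviations):
    """x is locally optimal for llr iff every deviation beta in B strictly
    increases the cost: <llr, x (+) beta> > <llr, x>."""
    base = float(np.dot(llr, np.asarray(x, float)))
    return all(float(np.dot(llr, relative_point(x, b))) > base
               for b in deviations)
```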

13

Combinatorial Characterization of Local Optimality (2)

Goal: find a set B such that:
(1) x ∈ LO(λ) ⇒ x = x̂_ML(λ) and the ML solution is unique.
(2) x ∈ LO(λ) ⇒ x = x̂_LP(λ) and the LP solution is unique.
(3) Pr{ 0 ∈ LO(λ) | c = 0^N } = 1 − o(1).

Suppose we have properties (1), (2). Large support(β) ⇒ property (3) (e.g., via Chernoff-like bounds).

If B = C, then x ∈ LO(λ) ⇔ x = x̂_ML(λ) and the ML solution is unique. However, the deviations β are then GLOBAL structures, and the analysis of property (3) becomes unclear.

14

Combinatorial Characterization of Local Optimality (3)

Goal: find a set B such that:
(1) x ∈ LO(λ) ⇒ x = x̂_ML(λ) and the ML solution is unique.
(2) x ∈ LO(λ) ⇒ x = x̂_LP(λ) and the LP solution is unique.
(3) Pr{ 0 ∈ LO(λ) | c = 0^N } = 1 − o(1).

Suppose we have properties (1), (2). Large support(β) ⇒ property (3) (e.g., via Chernoff-like bounds).

For analysis purposes, consider structures with a local nature: B is a set of TREES [following KV'06] (height < ½ girth).

Strengthen the analysis by introducing layer weights [following ADS'09]: better bounds on Pr{ 0 ∉ LO(λ) | c = 0^N }.

Finally, height(subtrees(G)) < ½ girth(G) = O(log N) is too restrictive. Take path-prefix trees, whose height is not bounded by the girth!

15

Path-Prefix Tree

Captures the notion of a computation tree.

Consider a graph G = (V, E) and a node r ∈ V:

V̂ – the set of all backtrackless paths in G emanating from node r with length at most h.

T̂_r^h(G) = (V̂, Ê) – the path-prefix tree of G rooted at node r with height h.

[Figure: a graph G and its path-prefix tree T̂_r^4(G).]
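A sketch of the node set V̂ as backtrackless path enumeration (the adjacency-list input format is a hypothetical choice):

```python
def path_prefix_tree_nodes(adj, r, h):
    """All backtrackless paths from r of length at most h: the node set
    of the path-prefix tree T^h_r(G). Paths are tuples of vertices."""
    paths, frontier = [(r,)], [(r,)]
    for _ in range(h):
        frontier = [p + (u,) for p in frontier for u in adj[p[-1]]
                    if len(p) < 2 or u != p[-2]]      # forbid backtracking
        paths += frontier
    return paths

# triangle graph: paths wrap around the cycle, so the tree's height
# is not limited by the girth
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(len(path_prefix_tree_nodes(adj, 0, 4)))   # 1 + 2 + 2 + 2 + 2 = 9
```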

16

d-Tree

Consider a path-prefix tree T̂_r^h(G) of a Tanner graph G = ( I ∪ J , E ).

A d-tree T[r, h, d] is a subgraph of T̂_r^h(G) with root r such that:

∀ v ∈ T ∩ I : deg_T(v) = deg_G(v).

∀ c ∈ T ∩ J : deg_T(c) = d.

A 2-tree is a skinny tree, i.e., a minimal deviation. Note: a d-tree is not necessarily a valid configuration!

[Figure: a 2-tree, a 3-tree, and a 4-tree, each rooted at a variable node v_0.]
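The two degree conditions can be checked mechanically. A sketch with a hypothetical tree representation, where each tree node is tagged with the kind and degree of the Tanner-graph node it copies:

```python
def is_d_tree(tree_adj, kind, tanner_deg, d):
    """Check the d-tree degree conditions on a subtree given as adjacency
    lists. kind[t] is 'var' or 'code'; tanner_deg[t] is deg_G of the
    Tanner-graph node that tree node t copies."""
    for t, nbrs in tree_adj.items():
        if kind[t] == 'var' and len(nbrs) != tanner_deg[t]:
            return False                  # variable copies keep deg_G
        if kind[t] == 'code' and len(nbrs) != d:
            return False                  # local-code copies have degree d
    return True
```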

17

Cost of a Projected Weighted Subtree

Consider layer weights w : {1,…,h} → ℝ and a subtree τ of a path-prefix tree T̂_r^{2h}(G).

Define a weight function W_τ : V̂ → ℝ for the subtree induced by w: each variable-node copy in τ at height ℓ is weighted by w_ℓ, normalized by node degrees along its path.

π_G(τ) ∈ ℝ^N – the projection to the Tanner graph G: coordinate i of π_G(τ) sums the weights of all copies of variable node i in τ.

The cost of a projected weighted subtree is ⟨λ, π_G(τ)⟩.

[Figure: a weighted subtree τ of T̂_r^4(G) with example node weights, and its projection π_G(τ) onto G.]
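A sketch of the projection and its cost, assuming the weighted subtree is given as a list of (variable index in G, weight) pairs for its variable-node copies:

```python
import numpy as np

def projected_cost(var_copies, llr, n):
    """<llr, pi_G(tau)>: the projection sums, per variable node of G,
    the weights of all of its copies in the subtree tau."""
    pi = np.zeros(n)
    for i, weight in var_copies:
        pi[i] += weight
    return float(np.dot(llr, pi))

# two copies of variable x_1 with toy weights 1/4 and 1/8
print(projected_cost([(1, 0.25), (1, 0.125)], np.array([0.0, 2.0, 0.0]), 3))  # 0.75
```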

18

Combinatorial Characterization of Local Optimality

Setting: C(G) ⊆ {0,1}^N is a Tanner code with minimum local distance d*; 2 ≤ d ≤ d*; w ∈ [0,1]^h \ {0^h}.

B_d^w ⊆ [0,1]^N – the set of all vectors corresponding to projections onto G of w-weighted d-trees of height 2h rooted at variable nodes:

B_d^w = { π_G(τ) : τ is a w-weighted d-tree of height 2h rooted at a variable node }

Definition: A codeword x is (h, w, d)-locally optimal for λ ∈ ℝ^N if for all vectors β ∈ B_d^w:

⟨λ, x ⊕ β⟩ > ⟨λ, x⟩

19

Thm: Local Opt. ⇒ ML Opt. / LP Opt.

Theorem: If x ∈ C(G) is (h, w, d)-locally optimal for λ, then:
(1) x is the unique ML codeword for λ.
(2) x is the unique optimal LP solution given λ.

Proof sketch, in two steps.

STEP 1: Local Opt. ⇒ ML Opt.

Key structural lemma: every codeword z ≠ 0 equals, up to a scaling factor α, a convex combination of projected w-weighted d-trees in G.

For every codeword x' ≠ x, let z = x' ⊕ x. Then

⟨λ, x'⟩ = ⟨λ, x ⊕ z⟩ = (1/α) · E_β ⟨λ, x ⊕ β⟩ + (1 − 1/α) · ⟨λ, x⟩ > ⟨λ, x⟩,

where the middle equality uses the key structural lemma (for a suitable distribution over B_d^w) and the final inequality uses the local optimality of x.

20

Thm: Local Opt. ⇒ ML Opt. / LP Opt.

Theorem: If x ∈ C(G) is (h, w, d)-locally optimal for λ, then:
(1) x is the unique ML codeword for λ.
(2) x is the unique optimal LP solution given λ.

Proof sketch, continued.

STEP 2: Local Opt. ⇒ LP Opt. Local Opt. ⇒ ML Opt. is used as a "black box". Reduction using graph covers and graph-cover decoding [VK'05].

21

Thm: Local Opt. ⇒ ML Opt. / LP Opt.

Theorem: If x ∈ C(G) is (h, w, d)-locally optimal for λ, then:
(1) x is the unique ML codeword for λ.
(2) x is the unique optimal LP solution given λ.

Interesting outcomes: this characterizes the event of LP decoding failure. For example, under the All-Zeros Assumption:

Theorem: Fix h ∈ ℕ. Then

Pr{ LP decoding fails | c = 0^N } ≤ Pr{ ∃ a w-weighted d-tree τ in G with ⟨λ, π_G(τ)⟩ ≤ 0 | c = 0^N }.

Work in progress: design an iterative message-passing decoding algorithm that computes an (h, w, d)-locally-optimal codeword after h iterations.

x is a locally optimal codeword for λ ⇒ a weighted message-passing algorithm finds x after h iterations, plus a guarantee that x is the ML codeword.

22

Thm: Local Opt. ⇒ ML Opt. / LP Opt.

Theorem: If x ∈ C(G) is (h, w, d)-locally optimal for λ, then:
(1) x is the unique ML codeword for λ.
(2) x is the unique optimal LP solution given λ.

[Figure: nested regions LO(λ) ⊆ ML(λ) ⊆ {λ : LP(λ) is integral}.]

Goals achieved:
(1) x ∈ LO(λ) ⇒ x = x̂_ML(λ) and the ML solution is unique.
(2) x ∈ LO(λ) ⇒ x = x̂_LP(λ) and the LP solution is unique.
Left to show: (3) Pr{ x ∈ LO(λ) } = 1 − o(1).

23

Bounds for LP Decoding Based on d-Trees (analysis for regular factor graphs)

(dL, dR)-regular Tanner codes with minimum local distance d*.

Form of the finite-length bounds: there exists c > 1 such that for noise level < t:

Pr(LP decoder success) ≥ 1 − exp(−c^girth)

If girth = Θ(log n), then

Pr(LP decoder success) ≥ 1 − exp(−n^γ), for some 0 < γ < 1.

As n → ∞: t is a lower bound on the threshold of LP decoding.

Results for the BSC using an analysis of uniformly weighted d-trees:

                                         4-trees          3-trees         2-trees (skinny-tree analysis [ADS'09])
(3,6)-regular Tanner code with d* = 4    p_LP > 0.4512    p_LP > 0.076    p_LP > 0.02
(4,8)-regular Tanner code with d* = 4    p_LP > 0.365     p_LP > 0.057    p_LP > 0.017

24

Message Passing Decoding for Irregular LDPC Codes

Dynamic programming on coupled computation trees of the Tanner graph.

Input: channel observations for the variable nodes v ∈ V.

Output: an assignment to the variable nodes v ∈ V such that the assignment to v agrees with the "tree" codeword that minimizes an objective function on the computation tree rooted at v.

Algorithm principle: messages are sent iteratively,
variable nodes → check nodes,
check nodes → variable nodes.

Goal: a locally optimal codeword x exists ⇒ the message-passing algorithm finds x.

25

w-Weighted Min-Sum: Message Rules

Variable node → check node message (iteration l):

μ_{v→c}^(l) = (w_{h−l} / deg_G(v)) · λ_v + (1 / (deg_G(v) − 1)) · Σ_{c' ∈ N(v)\{c}} μ_{c'→v}^(l−1)

Check node → variable node message (iteration l):

μ_{c→v}^(l) = ( ∏_{v' ∈ N(c)\{v}} sign(μ_{v'→c}^(l)) ) · min_{v' ∈ N(c)\{v}} |μ_{v'→c}^(l)|

26

w-Weighted Min-Sum: Decision Rule

Decision at variable node v in iteration h:

μ_v = (w_0 / deg_G(v)) · λ_v + (1 / (deg_G(v) − 1)) · Σ_{c ∈ N(v)} μ_{c→v}^(h−1)

x̂_v = 0 if μ_v ≥ 0, and x̂_v = 1 otherwise.
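Putting the message and decision rules together, a compact Python sketch; the exact placement of the layer weights and degree normalizations follows the reconstruction above (the slide's formulas are partially garbled), so treat those details as assumptions. Here w is indexed 0..h−1 and every check is assumed to have degree at least 2.

```python
import numpy as np

def weighted_min_sum(H, llr, w, h):
    """Sketch of w-weighted min-sum on the Tanner graph of parity-check
    matrix H (m x n). w has h entries; hard decision after h iterations."""
    m, n = H.shape
    check_nbrs = [np.flatnonzero(H[j]) for j in range(m)]
    var_nbrs = [np.flatnonzero(H[:, i]) for i in range(n)]
    mu_cv = np.zeros((m, n))                       # check -> variable messages
    for l in range(1, h):
        mu_vc = np.zeros((m, n))                   # variable -> check messages
        for v in range(n):
            d = len(var_nbrs[v])
            for c in var_nbrs[v]:
                rest = [c2 for c2 in var_nbrs[v] if c2 != c]
                mu_vc[c, v] = (w[h - l] / d) * llr[v] \
                    + sum(mu_cv[c2, v] for c2 in rest) / max(d - 1, 1)
        for c in range(m):
            for v in check_nbrs[c]:
                rest = [v2 for v2 in check_nbrs[c] if v2 != v]
                sgn = np.prod([1.0 if mu_vc[c, v2] >= 0 else -1.0 for v2 in rest])
                mu_cv[c, v] = sgn * min(abs(mu_vc[c, v2]) for v2 in rest)
    x_hat = np.zeros(n, dtype=int)
    for v in range(n):                             # decision rule
        d = len(var_nbrs[v])
        mu = (w[0] / d) * llr[v] \
            + sum(mu_cv[c, v] for c in var_nbrs[v]) / max(d - 1, 1)
        x_hat[v] = 0 if mu >= 0 else 1
    return x_hat

# usage sketch: x_hat = weighted_min_sum(H, llr, w=np.ones(4), h=4)
```

Slide 27 notes that this family of algorithms generalizes the attenuated max-product algorithm.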

27

Thm: Local Opt. ⇒ w-Weighted Min-Sum Opt.

Theorem: If x ∈ C(G) is (h, w, 2)-locally optimal for λ, then:
(1) w-weighted min-sum computes x in h iterations.
(2) x is the unique ML codeword given λ.

The w-weighted min-sum algorithm generalizes the attenuated max-product algorithm [Frey and Koetter '00] [Jian and Pfister '10].

Open question: probabilistic analysis of local optimality in the irregular case.

28

Local Optimality Certificates for LP Decoding

Previous results:

Koetter and Vontobel '06 – characterized LP solutions and provided a criterion for certifying the optimality of a codeword for LP decoding of regular LDPC codes. The characterization is based on the combinatorial structure of skinny trees of height h in the factor graph (h < girth(G)).

Arora, Daskalakis and Steurer '09 – certificate for LP decoding of regular LDPC codes over the BSC; extended to MBIOS channels in [Halabi-Even '10]. The certificate characterization is based on weighted skinny trees of height h (h < girth(G)).

Note: bounds on the word error probability of LP decoding are computed by analyzing the certificates. By introducing layer weights, the ADS certificate occurs with high probability at much larger noise rates.

Vontobel '10 – implies a certificate for LP decoding of Tanner codes based on weighted skinny subtrees of graph covers.

29

Local Optimality Certificates for LP Decoding – Summary of Main Techniques

Girth: current work – h is unbounded, the characterization uses computation trees; [KV06, ADS09, HE10] – h < girth(G).

Regularity: current work – irregular factor graph, with normalization by node degrees; previous – regular factor graph.

Check nodes / constraints: current work – linear codes, with a tighter relaxation via the generalized fundamental polytope; previous – parity code.

[Figure: a Tanner graph with variable nodes x_1,…,x_10 and local-code nodes C_1,…,C_5.]

30

Local Optimality Certificates for LP Decoding – Summary of Main Techniques (continued)

In addition to the rows above:

Deviations: current work – "fat" structures; certificates based on "fat" structures are likely to occur with high probability at larger noise rates (not necessarily a valid configuration!); previous – "skinny" structures, which locally satisfy the parity checks.

[Figure: a "fat" d-tree and a "skinny" tree, each rooted at v_0.]

31

Local Optimality Certificates for LP Decoding – Summary of Main Techniques (continued)

Completing the table:

Girth: current work – h is unbounded, the characterization uses computation trees, with no dependency on the girth of the factor graph; previous – h < girth(G).

Regularity: current work – irregular factor graph, with normalization factors according to node degrees; previous – regular factor graph.

Check nodes / constraints: current work – linear codes, with a tighter relaxation via the generalized fundamental polytope; previous – parity code.

Deviations: current work – "fat" structures (not necessarily a valid configuration!); previous – "skinny" structures, which locally satisfy the inner parity checks.

LP solution analysis / characterization: current work – reduction to ML via a characterization of graph-cover decoding; previous – dual/primal LP analysis, polyhedral analysis.

32

Conclusions

Combinatorial, graph-theoretic, and algorithmic methods for analyzing the decoding of (modern) error-correcting codes over any memoryless channel:

A new local-optimality combinatorial certificate for LP decoding of irregular Tanner codes, i.e., Local Opt. ⇒ LP Opt. / ML Opt.

The certificate is based on weighted "fat" subtrees of computation trees of height h; h is not bounded by the girth.

Proof techniques: combinatorial decompositions and graph-cover arguments.

An efficient dynamic-programming algorithm, running in O(|E|·h) time, for computing the local-optimality certificate of a codeword given an LLR vector.

The degree of local-code nodes in the deviations is not limited to 2.

33

Future Work

Work in progress: design an iterative message-passing decoding algorithm for Tanner codes that computes an (h, w, d)-locally-optimal codeword after h iterations.

x is a locally optimal codeword for λ ⇒ a weighted message-passing algorithm computes x in h iterations, plus a guarantee that x is the ML codeword.

Asymptotic analysis of LP decoding and weighted min-sum decoding for ensembles of irregular Tanner codes: apply density-evolution techniques together with the combinatorial characterization of local optimality.

34

Thank You!