Lecture 7: Multiple Access Channel (homepage.ntu.edu.tw/.../Slides/IT_Lecture_07_v3.pdf)
Basic Bounds and Gaussian MAC · General Discrete Memoryless MAC · Summary
Lecture 7: Multiple Access Channel
I-Hsiang Wang
Department of Electrical Engineering, National Taiwan University
December 10, 2014
1 / 49 I-Hsiang Wang NIT Lecture 7
Noiseless Graphical Network → Noisy Multi-User Network
We have shown that Shannon's paradigm can be readily extended to noiseless graphical networks. In particular, for single multicast:
- Zero-error decoding at finite blocklength is feasible.
- Low-complexity explicit construction of network codes is possible.

Caveat: A couple of distinct features of the single noiseless graphical multicast problem make its solution so elegant:
- Noiseless: links are modeled as noiseless, finite-capacity edges.
- Orthogonal: links can carry independent information without interfering with one another.

In other words, it models the overlay network above the PHY layer well.

Beyond wireline: in many scenarios the communication medium is shared among multiple users, so the two features above may be far from valid. As a first step, we investigate several kinds of simple single-hop multi-user noisy channels.
Comparison

|          | Lecture 6                     | Lectures 7, 8, 9                    |
|----------|-------------------------------|-------------------------------------|
| Model    | Noiseless graphical multicast | Single-hop multi-user noisy channel |
| Topology | General                       | Single-hop                          |
| Linkage  | Noiseless, orthogonal         | General (Gaussian as main example)  |
| Traffic  | Single unicast/multicast; special cases of multiple unicast (MAC, BC) | Multiple access channel, broadcast channel, interference channel |
Key Features Missing in Noiseless Graphical Multicast
1. Superposition of signals at receiving terminals. The simplest one-hop model is the multiple access channel. (Lecture 7)

[Figure: Sources 1, …, K feed Encoders 1, …, K, which send X1, …, XK over the multiple access channel p(y | x1, …, xK); a single decoder observes Y and serves the destination.]
2. Broadcast of signals from transmitting terminals. The simplest one-hop model is the broadcast channel. (Lecture 8)

[Figure: one source feeds a single encoder, which sends X over the broadcast channel p(y1, …, yK | x); Decoders 1, …, K observe Y1, …, YK and serve Destinations 1, …, K.]
3. Interference among independent information flows. The simplest one-hop model is the interference channel. (Lecture 9)

[Figure: Sources 1, …, K feed Encoders 1, …, K, which send X1, …, XK over the interference channel p(y[1:K] | x[1:K]); Decoders 1, …, K observe Y1, …, YK and serve their own destinations.]
We shall start with the multiple access channel (MAC), which partially answers the following two kinds of questions:
1. How do multiple transmitters trade off their rates when accessing a single receiver? (Traffic Pattern)
2. How does a single receiver decode multiple data streams when they are superimposed? (Superposition)
Multiple Access Channel: Problem Formulation
[Figure: ENC 1, …, ENC K map W1, …, WK to X1, …, XK; the channel pY|X1,…,XK outputs Y; a single DEC produces Ŵ1, …, ŴK.]

1. K independent messages {W1, …, WK}, each of which is accessible only by its own encoder: Wk ~ Unif[1 : 2^{NRk}] for all k ∈ [1 : K].
2. Channel: (X1 × ⋯ × XK, pY|X1,…,XK, Y).
3. Rate tuple: (R1, …, RK).
4. A (2^{NR1}, 2^{NR2}, …, 2^{NRK}, N) MAC channel code consists of
   - for each k ∈ [1 : K], an encoding function enc_{k,N} : [1 : 2^{NRk}] → X_k^N that maps message w_k to a length-N codeword x_k^N;
   - a decoding function dec_N : Y^N → ×_{k=1}^K [1 : 2^{NRk}] that maps a channel output y^N to a reconstructed message tuple (ŵ1, …, ŵK).
5. Error probability: P_e^{(N)} := Pr{(Ŵ1, …, ŴK) ≠ (W1, …, WK)}.
6. A rate tuple R := (R1, …, RK) is said to be achievable if there exists a sequence of (2^{NR1}, 2^{NR2}, …, 2^{NRK}, N) MAC channel codes such that P_e^{(N)} → 0 as N → ∞.
7. The capacity region C := cl{R ∈ [0, ∞)^K : R is achievable}.
Lecture Overview
We mainly focus on the two-user case (K = 2) in this lecture. For the MAC, the two-user results extend to the K-user case in a straightforward manner.

1. First, we extend the achievability of point-to-point channels to the two-user MAC and establish a capacity inner bound.
2. Second, we characterize the capacity region of the Gaussian MAC by proving that this inner bound is tight.
3. We then use the Gaussian MAC as an example to introduce various schemes, including successive interference cancellation (SIC) and time-sharing.
4. Finally, we characterize the capacity region of the general MAC by providing an enhanced achievability and a general converse proof.
1 Basic Bounds and Gaussian MAC
2 General Discrete Memoryless MAC
3 Summary
Capacity Inner Bound
Let us begin by extending the achievability of the point-to-point channel. Now the decoder has to decode two independent messages, and the key lies in analyzing the error event in the appropriate way.

Lemma 1 (Achievability). If (R1, R2) ≥ 0 satisfies the following for some (X1, X2) ~ pX1 · pX2, then (R1, R2) is achievable:

  R1 < I(X1; Y | X2)   (1)
  R2 < I(X2; Y | X1)   (2)
  R1 + R2 < I(X1, X2; Y)   (3)

Remark: The input distribution is chosen such that X1 ⊥⊥ X2, which is reasonable since the two encoders cannot cooperate.
Proof of Achievability
pf: As in the point-to-point case, we use a random coding argument to prove the existence of a sequence of (2^{NR1}, 2^{NR2}, N)-codes such that lim_{N→∞} P_e^{(N)} = 0 as long as (1)–(3) hold.

The proof is divided into three parts (similar to the point-to-point case): (1) random codebook generation, (2) encoding and decoding, and (3) error probability analysis.

Random codebook generation: For each k = 1, 2, randomly and independently generate 2^{NRk} sequences x_k^N(w_k), w_k ∈ [1 : 2^{NRk}], each i.i.d. over time according to pXk (that is, X_k^N ~ ∏_{i=1}^N pXk(x_k[i])).

Encoding: For each k = 1, 2, to send message w_k, Encoder k transmits x_k^N(w_k).

Decoding: To facilitate the error probability analysis, use the typicality decoder: (ŵ1, ŵ2) = the unique (w1, w2) ∈ [1 : 2^{NR1}] × [1 : 2^{NR2}] such that (x1^N(w1), x2^N(w2), y^N) ∈ T_ε^{(N)}(X1, X2, Y).
Error Probability Analysis: By the symmetry of codebook generation, we can assume WLOG that the actual message tuple is (W1, W2) = (1, 1) and focus on analyzing the "averaged-over-codebooks" error probability given (W1, W2) = (1, 1). (E denotes the error event (Ŵ1, Ŵ2) ≠ (W1, W2).)

  P_{(1,1)}{E} := Pr{E | (W1, W2) = (1, 1)}.

The key is to split the error event E into the following four cases Ea, Et^{(1)}, Et^{(2)}, and Et^{(1,2)}, such that E = Ea ∪ Et^{(1)} ∪ Et^{(2)} ∪ Et^{(1,2)}, where

  Ea := {(X1^N(1), X2^N(1), Y^N) ∉ T_ε^{(N)}}
  Et^{(1)} := {(X1^N(w1), X2^N(1), Y^N) ∈ T_ε^{(N)} for some w1 ≠ 1}
  Et^{(2)} := {(X1^N(1), X2^N(w2), Y^N) ∈ T_ε^{(N)} for some w2 ≠ 1}
  Et^{(1,2)} := {(X1^N(w1), X2^N(w2), Y^N) ∈ T_ε^{(N)} for some w1 ≠ 1, w2 ≠ 1}
Next, we would like to find a set of sufficient conditions under which the above error events have vanishing probability as N → ∞. Following the point-to-point proof, define the event

  A(w1, w2) := {(X1^N(w1), X2^N(w2), Y^N) ∈ T_ε^{(N)}},

and rewrite

  Ea = A(1,1)^c
  Et^{(1)} = ∪_{w1 ≠ 1} A(w1, 1)
  Et^{(2)} = ∪_{w2 ≠ 1} A(1, w2)
  Et^{(1,2)} = ∪_{w1 ≠ 1, w2 ≠ 1} A(w1, w2).

Hence,

  E = A(1,1)^c ∪ (∪_{w1 ≠ 1} A(w1, 1)) ∪ (∪_{w2 ≠ 1} A(1, w2)) ∪ (∪_{w1 ≠ 1, w2 ≠ 1} A(w1, w2)).

Next, we present a key lemma bounding the probabilities of these events.
Lemma 2. P_{(1,1)}{A(1,1)} ≥ 1 − ε for N large enough, and

  P_{(1,1)}{A(w1, 1)} ≤ 2^{−N(I(X1; Y | X2) − δ1(ε))} for all w1 ≠ 1
  P_{(1,1)}{A(1, w2)} ≤ 2^{−N(I(X2; Y | X1) − δ2(ε))} for all w2 ≠ 1
  P_{(1,1)}{A(w1, w2)} ≤ 2^{−N(I(X1, X2; Y) − δ1,2(ε))} for all w1 ≠ 1, w2 ≠ 1,

where δ1(ε), δ2(ε), δ1,2(ε) → 0 as ε → 0.

With the above lemma and the union bound, we see that for N sufficiently large, P_{(1,1)}{Ea} ≤ ε, and

  P_{(1,1)}{Et^{(1)}} ≤ 2^{NR1} · 2^{−N(I(X1; Y | X2) − δ1(ε))}
  P_{(1,1)}{Et^{(2)}} ≤ 2^{NR2} · 2^{−N(I(X2; Y | X1) − δ2(ε))}
  P_{(1,1)}{Et^{(1,2)}} ≤ 2^{N(R1 + R2)} · 2^{−N(I(X1, X2; Y) − δ1,2(ε))}

∴ As long as (R1, R2) satisfies (1)–(3), lim_{N→∞} P_{(1,1)}{E} = 0.
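The exponential decay in the three union-bound terms can be checked numerically. A small sketch (with made-up values of R, I, and δ, not taken from the lecture): whenever R < I − δ, the bound 2^{NR} · 2^{−N(I − δ)} = 2^{−N(I − δ − R)} vanishes as N grows.

```python
def union_bound(R, I, delta, N):
    """Bound on P{E_t}: 2^{N*R} * 2^{-N*(I - delta)} = 2^{-N*(I - delta - R)}."""
    return 2.0 ** (-N * (I - delta - R))

# Hypothetical values: rate R strictly below I - delta.
R, I, delta = 0.4, 0.6, 0.05
bounds = [union_bound(R, I, delta, N) for N in (10, 100, 1000)]
# The bound shrinks exponentially with blocklength N.
assert bounds[0] > bounds[1] > bounds[2]
assert bounds[2] < 1e-40
```

If instead R > I − δ, the same expression blows up, which is why each of (1)–(3) is needed.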
Proof of Lemma 2
Proof that P_{(1,1)}{A(1,1)} ≥ 1 − ε for N large enough:
Given (W1, W2) = (1, 1), (X1^N(1), X2^N(1), Y^N) is distributed i.i.d. over time according to pX1,X2,Y = pX1 · pX2 · pY|X1,X2. The claim then follows from the LLN and the fact that the typicality decoder is based on pX1,X2,Y.

Proof that P_{(1,1)}{A(w1, 1)} ≤ 2^{−N(I(X1; Y | X2) − δ1(ε))} for all w1 ≠ 1:
Due to the memoryless channel assumption and the fact that codewords are generated i.i.d. at random, X1^N(w1) ⊥⊥ (X2^N(1), Y^N), and

  P_{(1,1)}{A(w1, 1)} = Σ_{(x1^N, x2^N, y^N) ∈ T_ε^{(N)}} p(x1^N) · p(x2^N, y^N)
    ≤ 2^{N(1+ε)H(X1, X2, Y)} · 2^{−N(1−ε)H(X1)} · 2^{−N(1−ε)H(X2, Y)}
    = 2^{−N(I(X1; Y | X2) − δ1(ε))},

where δ1(ε) = ε(H(X1, X2, Y) + H(X1) + H(X2, Y)) → 0 as ε → 0.

The proofs of the other two statements follow similarly.
Gaussian MAC: Model
[Figure: W1 → ENC 1 → X1 (gain g1), W2 → ENC 2 → X2 (gain g2); Y = g1X1 + g2X2 + Z → DEC → Ŵ1, Ŵ2.]

1. Channel law: Y = g1 X1 + g2 X2 + Z, with Z ~ N(0, σ²) ⊥⊥ (X1, X2).
2. White Gaussian: {Z[t]} is an i.i.d. (white) Gaussian random process.
3. Memoryless: Z[t] ⊥⊥ (W1, W2, X1^{t−1}, X2^{t−1}, Z^{t−1}).
4. Average power constraint: (1/N) Σ_{t=1}^N |x_k[t]|² ≤ P_k, k = 1, 2.
5. Signal-to-noise ratio: SNR_k := |g_k|² P_k / σ², k = 1, 2.
Characterization of Gaussian MAC Capacity
Theorem 1 (Capacity of the Gaussian MAC). If (R1, R2) ≥ 0 satisfies the following, then (R1, R2) is achievable:

  R_k < (1/2) log(1 + SNR_k), k = 1, 2   (4)
  R1 + R2 < (1/2) log(1 + SNR1 + SNR2)   (5)

Conversely, if (R1, R2) ≥ 0 is achievable, then it must satisfy (4)–(5) with "<" replaced by "≤".

pf: Achievability is proved by extending the inner bound in Lemma 1 to the continuous setting with an input cost constraint (omitted) and picking X_k ~ N(0, P_k), k = 1, 2. (Evaluation of the mutual information is left as an exercise.)

To prove the converse, we use Fano's inequality and the data processing inequality to obtain, for k = 1, 2 (with ε_{k,N} → 0 as N → ∞ below),
  N R_k = H(W_k) = I(W_k; Ŵ_k) + H(W_k | Ŵ_k) ≤ I(W_k; Y^N) + N ε_{k,N}.

Bound (4) on the individual rate: let k = 1 (k = 2 can be similarly proved), and continue from the above inequality:

  N(R1 − ε_{1,N}) ≤ I(W1; Y^N)
    (a)≤ I(W1; Y^N, W2)
    (b)= I(W1; Y^N | W2)
    = Σ_{t=1}^N I(W1; Y[t] | Y^{t−1}, W2)
    ≤ Σ_{t=1}^N I(W1, Y^{t−1}; Y[t] | W2).

(a) is due to I(W1; W2 | Y^N) ≥ 0; (b) is due to I(W1; W2) = 0.

Next we upper bound I(W1, Y^{t−1}; Y[t] | W2) by I(X1[t]; Y[t] | X2[t]):

  I(W1, Y^{t−1}; Y[t] | W2)
    (c)= I(W1, Y^{t−1}, X1[t]; Y[t] | W2, X2[t])
    ≤ I(W1, W2, Y^{t−1}, X1[t]; Y[t] | X2[t])
    (d)= I(X1[t]; Y[t] | X2[t]).

(c) is due to the fact that X_k[t] is a function of W_k for k = 1, 2. (d) is due to the memorylessness of the channel: (W1, W2, Y^{t−1}) − (X1[t], X2[t]) − Y[t] forms a Markov chain.
For the Gaussian MAC, note that

  I(X1[t]; Y[t] | X2[t]) = h(Y[t] | X2[t]) − h(Y[t] | X1[t], X2[t])
    = h(g1 X1[t] + Z[t] | X2[t]) − h(Z[t] | X1[t], X2[t])
    (e)≤ h(g1 X1[t] + Z[t]) − h(Z[t])
    (f)≤ (1/2) log(1 + |g1|² P_{1,t} / σ²),  where P_{1,t} := E[|X1[t]|²].

(e) is due to "conditioning reduces entropy" and Z[t] ⊥⊥ (X1[t], X2[t]). (f) is due to "Gaussian maximizes differential entropy".

∴ N(R1 − ε_{1,N}) ≤ Σ_{t=1}^N (1/2) log(1 + |g1|² P_{1,t} / σ²)
    ≤ N · (1/2) log(1 + |g1|² ((1/N) Σ_{t=1}^N P_{1,t}) / σ²)   (Jensen's inequality)
    ≤ N · (1/2) log(1 + |g1|² P1 / σ²)
    = N · (1/2) log(1 + SNR1).
Bound (5) on the sum rate: set ε_N := ε_{1,N} + ε_{2,N}. Then

  N(R1 + R2 − ε_N) ≤ I(W1; Y^N | W2) + I(W2; Y^N)
    = I(W1, W2; Y^N) = Σ_{t=1}^N I(W1, W2; Y[t] | Y^{t−1})
    (a)≤ Σ_{t=1}^N I(X1[t], X2[t]; Y[t])
    (b)≤ N · (1/2) log(1 + SNR1 + SNR2).

(a) is similar to the previous proof (exercise). (b) is due to the following and Jensen's inequality (exercise):

  I(X1[t], X2[t]; Y[t]) = h(Y[t]) − h(Y[t] | X1[t], X2[t])
    = h(g1 X1[t] + g2 X2[t] + Z[t]) − h(Z[t])
    (c)≤ (1/2) log(1 + Var[g1 X1[t] + g2 X2[t]] / σ²)
    (d)= (1/2) log(1 + (|g1|² P_{1,t} + |g2|² P_{2,t}) / σ²).

(c) is due to "Gaussian maximizes differential entropy". (d) is due to the fact that X1[t] ⊥⊥ X2[t], since W1 ⊥⊥ W2.
Capacity Region of Gaussian MAC
[Figure: the capacity region in the (R1, R2)-plane is a pentagon with individual-rate intercepts (1/2) log(1 + SNR1) and (1/2) log(1 + SNR2), and corner points at R1 = (1/2) log(1 + SNR1/(1 + SNR2)) and R2 = (1/2) log(1 + SNR2/(1 + SNR1)).]

  C_GMAC = { (R1, R2) ≥ 0 : R_k ≤ (1/2) log(1 + SNR_k), k = 1, 2;  R1 + R2 ≤ (1/2) log(1 + SNR1 + SNR2) }
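As a sanity check on these formulas, a short script (not part of the slides; the SNR values are arbitrary) can evaluate the three bounds and the two corner points of the pentagon:

```python
from math import log2

def c(snr):
    """Gaussian capacity (1/2) log2(1 + SNR), in bits per channel use."""
    return 0.5 * log2(1 + snr)

snr1, snr2 = 10.0, 5.0          # arbitrary example SNRs
r1_max, r2_max = c(snr1), c(snr2)
r_sum = c(snr1 + snr2)

# Corner points of the pentagon (cf. the figure):
A = (c(snr1 / (1 + snr2)), r2_max)   # user 2 at its single-user rate
B = (r1_max, c(snr2 / (1 + snr1)))   # user 1 at its single-user rate

# Both corners lie exactly on the sum-rate boundary, while the individual
# bounds alone would overshoot it, hence the 45-degree cut of the pentagon.
assert abs(sum(A) - r_sum) < 1e-12 and abs(sum(B) - r_sum) < 1e-12
assert r1_max + r2_max > r_sum
```

The final assertion is what makes the region a pentagon rather than a rectangle: the sum-rate constraint (5) is active.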
Successive Interference Cancellation
A natural scheme for the receiver to resolve multiple data streams is successive interference cancellation:
1. First decode one user's data, treating the other user's signal as noise.
2. Remove the decoded data, and then decode the next user's data.

Proposition 1 (Successive Decoding Achievability). SIC with decoding order W1 → W2 achieves all (R1, R2) ≥ 0 satisfying (for some (X1, X2) ~ pX1 · pX2 in the first equalities below)

  R1 < I(X1; Y) = (1/2) log(1 + SNR1 / (1 + SNR2))
  R2 < I(X2; Y | X1) = (1/2) log(1 + SNR2).

The proof is quite straightforward: decoding W1 first succeeds as long as R1 < I(X1; Y), and decoding W2 with x1^N(W1) known succeeds as long as R2 < I(X2; Y | X1).
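The two SIC rates add up exactly to the sum capacity, reflecting the chain rule I(X1; Y) + I(X2; Y | X1) = I(X1, X2; Y). A quick numerical check of this identity for the Gaussian formulas (the SNR pairs below are arbitrary examples):

```python
from math import log2

def c(snr):
    """(1/2) log2(1 + SNR), in bits per channel use."""
    return 0.5 * log2(1 + snr)

def sic_rates(snr1, snr2):
    """SIC with decoding order W1 -> W2: user 1 treats user 2 as noise."""
    return c(snr1 / (1 + snr2)), c(snr2)

for snr1, snr2 in [(10.0, 5.0), (1.0, 100.0), (3.0, 3.0)]:
    r1, r2 = sic_rates(snr1, snr2)
    # SIC loses nothing in sum rate: it lands on the dominant (45-degree) face.
    assert abs((r1 + r2) - c(snr1 + snr2)) < 1e-12
```

Algebraically, (1 + SNR1/(1 + SNR2)) · (1 + SNR2) = 1 + SNR1 + SNR2, so the logs add up.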
Time Sharing: Capacity Region is Convex
Proposition 2 (Time Sharing Achievability). If the rate tuples R^{(1)} ∈ [0, ∞)^K and R^{(2)} ∈ [0, ∞)^K are both achievable, then R^{(λ)} := λ R^{(1)} + (1 − λ) R^{(2)} is also achievable for all λ ∈ [0, 1].

pf: Since R^{(1)} and R^{(2)} are both achievable, there exist a sequence of (2^{N R^{(1)}}, N)-codes and a sequence of (2^{N R^{(2)}}, N)-codes, both of which have vanishing error probability. The main idea for achieving R^{(λ)} is to split the blocklength N into two parts, λN and (1 − λ)N, and split the data W into two parts, W^{(1)} ∈ [1 : 2^{N λ R^{(1)}}] and W^{(2)} ∈ [1 : 2^{N (1 − λ) R^{(2)}}].

The proof is completed by using a (2^{λN R^{(1)}}, λN)-code to send W^{(1)} in the first part (of length λN) and a (2^{(1 − λ)N R^{(2)}}, (1 − λ)N)-code to send W^{(2)} in the second part (of length (1 − λ)N).
[Figure: the pentagon inner bound in the (R1, R2)-plane, with intercepts I(X1; Y | X2) and I(X2; Y | X1) and corner points A = (I(X1; Y), I(X2; Y | X1)) and B = (I(X1; Y | X2), I(X2; Y)).]

SIC with decoding order W1 → W2 achieves A (the green region); SIC with decoding order W2 → W1 achieves B (the blue region).
[Figure: the same pentagon, with the segment between corner points A and B attained by time sharing.]

With time sharing, all other rate pairs inside the inner bound region can be achieved.
1 Basic Bounds and Gaussian MAC
2 General Discrete Memoryless MAC
3 Summary
Enlarged Achievable Region by Time Sharing
By Lemma 1, for (X1, X2) ~ pX1 · pX2, an achievable rate region (an inner bound on the capacity region) Rinner(X1, X2) is defined by

  Rinner(X1, X2) := { (R1, R2) ≥ 0 : R1 ≤ I(X1; Y | X2), R2 ≤ I(X2; Y | X1), R1 + R2 ≤ I(X1, X2; Y) }.

By time sharing we obtain an enlarged region (conv: convex hull operation):

  Rinner := conv( ∪_{(X1, X2) ~ pX1 · pX2} Rinner(X1, X2) ).   (6)

Remark: For the (scalar) Gaussian MAC, a single input distribution (Gaussian) maximizes all three boundaries simultaneously, so time sharing does not enlarge the inner bound; hence the capacity region is characterized without using time sharing.
[Figure: a single pentagon region R1 in the (R1, R2)-plane.]
[Figure: two pentagon regions R1 and R2.]
[Figure: three pentagon regions R1, R2, and R3.]
[Figure: the convex hull conv(R1 ∪ R2 ∪ R3) of the three pentagon regions.]
It turns out that Rinner in (6) is equal to C. To prove this result, we first develop an outer bound region Router, and then show that the outer and inner bounds match: Router = Rinner = C.
Outer Bound Region of General MAC
Lemma 3 (Outer Bound). If (R1, R2) ≥ 0 is achievable, then it must satisfy the following rate constraints for some (Q, X1, X2) ~ pQ · pX1|Q · pX2|Q:

  R1 ≤ I(X1; Y | X2, Q)   (7)
  R2 ≤ I(X2; Y | X1, Q)   (8)
  R1 + R2 ≤ I(X1, X2; Y | Q)   (9)

Router(Q, X1, X2) := { (R1, R2) ≥ 0 satisfying (7)–(9) }.

Hence, an outer bound region on C can be defined as

  Router := ∪_{(Q, X1, X2) ~ pQ · pX1|Q · pX2|Q} Router(Q, X1, X2).   (10)
Single Letterization
pf: Recall that in the converse proof for the Gaussian MAC, without using the Gaussian assumption, we arrived at the following: if (R1, R2) ≥ 0 is achievable (i.e., there exists a sequence of (2^{NR1}, 2^{NR2}, N)-codes with lim_{N→∞} P_e^{(N)} = 0), then (R1, R2) must satisfy the following three inequalities for some (X1^N, X2^N) with X1^N ⊥⊥ X2^N:

  R1 ≤ (1/N) Σ_{t=1}^N I(X1[t]; Y[t] | X2[t]) + ε_{1,N}   (11)
  R2 ≤ (1/N) Σ_{t=1}^N I(X2[t]; Y[t] | X1[t]) + ε_{2,N}   (12)
  R1 + R2 ≤ (1/N) Σ_{t=1}^N I(X1[t], X2[t]; Y[t]) + ε_N   (13)

Observe that the right-hand sides of (11)–(13) are averages of mutual information terms over time. In the point-to-point case, since there is only one rate constraint, it suffices to find a maximizing distribution that upper bounds the mutual information terms at all time slots.
Auxiliary Random Variable Q
However, here we cannot do this, simply because such a simultaneously maximizing distribution may not exist for all rate constraints (except in special cases such as the Gaussian MAC). Instead, we introduce an auxiliary random variable Q ~ Unif[1 : N] and (X1, X2, Y) such that (X1, X2, Y) | {Q = t} =d (X1[t], X2[t], Y[t]).

  ⟹ I(X1; Y | X2, Q) = Σ_{t=1}^N Pr{Q = t} I(X1; Y | X2, Q = t)
    = (1/N) Σ_{t=1}^N I(X1[t]; Y[t] | X2[t])
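The displayed identity, that the conditional mutual information given Q equals the time average, can be verified numerically on a toy example. The example below (ours, not from the lecture) uses a binary adder MAC Y = X1 + X2 over two "time slots" with different independent input distributions:

```python
from collections import defaultdict
from math import log2

def cond_mi(joint, A, B, C):
    """I(A; B | C) in bits from a joint pmf {outcome tuple: prob};
    A, B, C are tuples of coordinate indices into each outcome."""
    def marg(idx):
        m = defaultdict(float)
        for x, p in joint.items():
            m[tuple(x[i] for i in idx)] += p
        return m
    pabc, pac, pbc, pc = marg(A + B + C), marg(A + C), marg(B + C), marg(C)
    mi = 0.0
    for x, p in joint.items():
        if p > 0:
            a, b, c = (tuple(x[i] for i in s) for s in (A, B, C))
            mi += p * log2(pabc[a + b + c] * pc[c] / (pac[a + c] * pbc[b + c]))
    return mi

# Two "time slots" with different independent input distributions.
slots = [(0.5, 0.5), (0.9, 0.2)]        # (P{X1=1}, P{X2=1}) per slot
joint, per_slot = {}, []
for q, (p1, p2) in enumerate(slots):
    slot_joint = {}
    for x1 in (0, 1):
        for x2 in (0, 1):
            px = (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)
            slot_joint[(q, x1, x2, x1 + x2)] = px     # adder MAC: Y = X1 + X2
    per_slot.append(cond_mi(slot_joint, (1,), (3,), (2,)))  # I(X1[t]; Y[t] | X2[t])
    joint.update({k: v / len(slots) for k, v in slot_joint.items()})  # Q ~ Unif

# Outcome coordinates: 0 = Q, 1 = X1, 2 = X2, 3 = Y.
lhs = cond_mi(joint, (1,), (3,), (2, 0))   # I(X1; Y | X2, Q)
rhs = sum(per_slot) / len(per_slot)        # (1/N) sum_t I(X1[t]; Y[t] | X2[t])
assert abs(lhs - rhs) < 1e-9
```

The equality is definitional (conditioning on Q and averaging over its uniform distribution are the same operation), but computing both sides from the full joint pmf makes the single-letterization step concrete.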
Since X1^N ⊥⊥ X2^N, the newly introduced (Q, X1, X2) ~ pQ · pX1|Q · pX2|Q. Finally, rewriting the right-hand sides of (11)–(13) as I(X1; Y | X2, Q) + ε_{1,N}, I(X2; Y | X1, Q) + ε_{2,N}, and I(X1, X2; Y | Q) + ε_N respectively, we obtain

  R1 ≤ I(X1; Y | X2, Q) + ε_{1,N}
  R2 ≤ I(X2; Y | X1, Q) + ε_{2,N}
  R1 + R2 ≤ I(X1, X2; Y | Q) + ε_N

Hence, if (R1, R2) ≥ 0 is achievable, then (7)–(9) must hold for some (Q, X1, X2) ~ pQ · pX1|Q · pX2|Q.

Q here is usually called the "time-sharing" random variable, for reasons that will become clear later.

Remark: In multi-user information theory, introducing auxiliary random variables is often inevitable in order to obtain single-letter capacity bounds, except in special cases such as Gaussian networks.
So far, by Lemma 1, time sharing, and Lemma 3, we have shown
Rinner ⊆ C ⊆ Router.
Next, we shall prove Router ⊆ Rinner to complete the proof for
Router = Rinner = C .
Capacity Region for General MAC
Theorem 2 (Capacity of the General MAC). For a general multiple access channel pY|X1,X2, the capacity region C can be characterized in the following two equivalent forms:

  C = Rinner as in (6) = Router as in (10).

pf: Note that for any (Q, X1, X2) ~ pQ · pX1|Q · pX2|Q, bounds (7)–(9) define a "pentagon" with a 45° side and two corner points:

  A: R1 = I(X1; Y | Q), R2 = I(X2; Y | X1, Q);   B: R1 = I(X1; Y | X2, Q), R2 = I(X2; Y | Q).

As long as points A, B ∈ Rinner, we have Router ⊆ Rinner. This is simple to prove, since point A is a convex combination (over q) of the corner points (I(X1; Y | Q = q), I(X2; Y | X1, Q = q)), each achievable by Lemma 1. Similarly, B ∈ Rinner.
Coded Time Sharing
We have established the capacity region of the general MAC in a rather roundabout way. The main reason is that the inner bound from Lemma 1 and (6) has a different form from the outer bound from Lemma 3 and (10).

Question: Can we directly prove an inner bound with the same form as the outer bound from Lemma 3 and (10)?

The answer is YES, via a coding technique called coded time sharing.

Lemma 4 (Inner Bound by Coded Time Sharing). If (R1, R2) ≥ 0 satisfies the following rate constraints for some (Q, X1, X2) ~ pQ · pX1|Q · pX2|Q, then it is achievable:

  R1 ≤ I(X1; Y | X2, Q)
  R2 ≤ I(X2; Y | X1, Q)
  R1 + R2 ≤ I(X1, X2; Y | Q)
Proof Sketch of Coded Time Sharing Inner Bound
The key idea is to generate a time-sharing sequence q^N that controls the "configuration" of the coding schemes at the two distributed encoders. Hence, q^N should be thought of as part of the codebook, and it is revealed to ALL terminals (all encoders and the decoder).

[Figure: q^N ~ ∏_{t=1}^N pQ(q[t]) is shared with both encoders and the decoder. Encoder k generates codewords x_k^N(w_k) ~ ∏_{t=1}^N pXk|Q(x_k[t] | q[t]); the decoder finds (ŵ1, ŵ2) such that (q^N, x1^N(ŵ1), x2^N(ŵ2), y^N) ∈ T_ε^{(N)}(Q, X1, X2, Y).]
[Figure: two pentagon regions in the (R1, R2)-plane for Q = {1, 2}, |Q| = 2, corresponding to the input distributions ∏_{k=1}^2 pXk|Q(·|Q = 1) and ∏_{k=1}^2 pXk|Q(·|Q = 2).]
[Figure: the same two pentagons, plus the region obtained with pQ(1) = 3/4.]
[Figure: the regions obtained with pQ(1) = 3/4 and pQ(1) = 1/2.]
[Figure: the regions obtained with pQ(1) = 3/4, 1/2, and 1/4.]
[Figure: as pQ sweeps over all distributions on Q = {1, 2}, the union of the resulting pentagons fills in the region directly.]

No need to take the convex hull!
Remarks
1. For the capacity region in (10) to be computable, the time-sharing random variable Q must take values in a finite set Q. For a K-user MAC, it suffices to evaluate (10) with |Q| ≤ K. This is called a cardinality bound on the auxiliary random variable.
2. The capacity region in (10), without any convex hull operation, is already convex and closed.
3. For the MAC, it turns out that coded time sharing does not improve the inner bound. This is in general not true for other multi-user channels, such as the interference channel.
4. The single-letterization technique presented in the converse proof will be useful in other multi-user channel coding problems.
1 Basic Bounds and Gaussian MAC
2 General Discrete Memoryless MAC
3 Summary
Comparison between point-to-point and multi-user channels:
1. Capacity vs. capacity region; supremum vs. closure.
2. Single-letterization in converse proofs: a single maximizing distribution vs. auxiliary random variables.
3. Achievability: both use random coding arguments, which extend straightforwardly.
   - Time sharing
   - Successive decoding (successive interference cancellation, SIC)
   - Coded time sharing