The Capacity of Interference Channels with Partial Transmitter Cooperation

Transcript of The Capacity of Interference Channels with Partial Transmitter Cooperation

Page 1: The Capacity of Interference Channels with Partial Transmitter Cooperation

The Capacity of Interference Channels with Partial Transmitter Cooperation

Ivana Marić (Stanford), Roy D. Yates (WINLAB, Rutgers), Gerhard Kramer (Bell Labs)

[Figure: two senders and two decoders in an interference network]

Page 2: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel

[Figure: interference channel block diagram; encoder 1 maps W1 to X1, encoder 2 maps W2 to X2, channel p(y1,y2|x1,x2), decoder 1 outputs Ŵ1 from Y1, decoder 2 outputs Ŵ2 from Y2]

• Gaussian Channel in Standard Form

[Figure: standard-form Gaussian channel with direct gains 1 and cross gains √h12, √h21]

  Y1 = X1 + √h12 X2 + Z1
  Y2 = √h21 X1 + X2 + Z2

  where Z1 ~ N(0,1), Z2 ~ N(0,1)

• How to cope with interference is not well understood
• The interference channel capacity is a long-standing open problem
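A minimal numerical sketch of the standard-form Gaussian interference channel above; the gains h12, h21 and the powers are illustrative values, not taken from the talk:

    import numpy as np

    def gaussian_ic(x1, x2, h12, h21, rng):
        """Standard-form Gaussian interference channel:
        Y1 = X1 + sqrt(h12)*X2 + Z1,  Y2 = sqrt(h21)*X1 + X2 + Z2."""
        z1 = rng.standard_normal(x1.shape)           # Z1 ~ N(0,1)
        z2 = rng.standard_normal(x2.shape)           # Z2 ~ N(0,1)
        y1 = x1 + np.sqrt(h12) * x2 + z1
        y2 = np.sqrt(h21) * x1 + x2 + z2
        return y1, y2

    rng = np.random.default_rng(0)
    P1, P2 = 1.0, 1.0                                # example power constraints
    x1 = np.sqrt(P1) * rng.standard_normal(100_000)  # Gaussian codebook symbols
    x2 = np.sqrt(P2) * rng.standard_normal(100_000)
    y1, y2 = gaussian_ic(x1, x2, h12=1.5, h21=2.0, rng=rng)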

Page 3: The Capacity of Interference Channels with Partial Transmitter Cooperation

Strong Interference

• Capacity region known if there is strong interference:

  I(X1; Y1 | X2) ≤ I(X1; Y2 | X2)
  I(X2; Y2 | X1) ≤ I(X2; Y1 | X1)

  for all input product probability distributions [Costa & El Gamal, 1987]

• Capacity region = capacity region of a compound MAC in which both messages are decoded at both receivers [Ahlswede, 1974]

• For the Gaussian channel in standard form, this means [Sato, Han & Kobayashi, 1981]:

  h12 ≥ 1 and h21 ≥ 1

• Focus of this work: strong interference
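A small sketch of what this gives for the Gaussian channel: under h12 ≥ 1 and h21 ≥ 1 the capacity region is the compound-MAC region in which both receivers decode both messages; the numbers below are only illustrative:

    from math import log2

    def C(x):                                   # Gaussian capacity function, bits/use
        return 0.5 * log2(1 + x)

    def strong_interference(h12, h21):
        """Standard-form strong-interference test: h12 >= 1 and h21 >= 1."""
        return h12 >= 1 and h21 >= 1

    def strong_ic_region(P1, P2, h12, h21):
        """Bounds of the compound-MAC (both-decode-both) region:
        R1 <= C(P1), R2 <= C(P2),
        R1 + R2 <= min( C(P1 + h12*P2), C(h21*P1 + P2) )."""
        assert strong_interference(h12, h21)
        return C(P1), C(P2), min(C(P1 + h12 * P2), C(h21 * P1 + P2))

    print(strong_ic_region(P1=1.0, P2=1.0, h12=2.0, h21=3.0))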

Page 4: The Capacity of Interference Channels with Partial Transmitter Cooperation

Compound MAC

[Figure: Source 1 encodes W1, Source 2 encodes W2; Decoder 1 and Decoder 2 each decode the pair (W1, W2)]

Page 5: The Capacity of Interference Channels with Partial Transmitter Cooperation

Cooperation in Interference Channel

• Not captured in the interference channel model:
  • Broadcasting: sources “overhear” each other’s transmission
  • Cooperation

• Our goal: consider cooperation and derive capacity results
  • Why work on problems more involved than the already unsolved one?
  • What are insightful and tractable channel models?
  • How do we model cooperation?

[Figure: interference channel block diagram; encoders 1, 2 map W1, W2 to X1, X2, channel p(y1,y2|x1,x2), decoders output Ŵ1, Ŵ2]

Page 6: The Capacity of Interference Channels with Partial Transmitter Cooperation

Transmitter Cooperation for Gaussian Channels

• Full cooperation: MIMO broadcast channel
  • DPC optimal [Weingarten, Steinberg & Shamai, 04], [Caire & Shamai, 01], [Viswanath, Jindal & Goldsmith, 02]

• Several cooperation strategies proposed [Host-Madsen, Jindal, Mitra & Goldsmith, 03, Ng & Goldsmith, 04]

• [Jindal, Mitra & Goldsmith, Ng & Goldsmith]:
  • Gain from DPC when sources close together
  • When apart, relaying outperforms DPC
  • Assumptions: dedicated orthogonal cooperation channels, total power constraint

[Figure: two source-receiver pairs S1-R1 and S2-R2]

Page 7: The Capacity of Interference Channels with Partial Transmitter Cooperation

Transmitter Cooperation in Discrete Memoryless Channel

• MAC with partially cooperating encoders [Willems, 1983]
• Capacity region C_MAC(C12, C21) determined by Willems, 1983
• Communication through conference
• Realistic model

[Figure: encoder 1 (W1) and encoder 2 (W2) connected by conference links of capacities C12 and C21; channel; a single decoder outputs (Ŵ1, Ŵ2) from Y]

Page 8: The Capacity of Interference Channels with Partial Transmitter Cooperation

Cooperation through Conference

• Each transmitter sends K symbols:  V_t1, ..., V_tK,  t = 1, 2
• V_1k depends on W1 and the previously received V_21, ..., V_2,k−1 (and similarly for V_2k):

  V_1k = h_1k(W1, V_21, ..., V_2,k−1),  k = 1, ..., K
  V_2k = h_2k(W2, V_11, ..., V_1,k−1),  k = 1, ..., K

• Alphabet size of V_1^K = (V_11, ..., V_1K) is at most 2^{N C12}:  Σ_{k=1}^K log ||V_1k|| ≤ N C12
• Alphabet size of V_2^K = (V_21, ..., V_2K) is at most 2^{N C21}:  Σ_{k=1}^K log ||V_2k|| ≤ N C21
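A small sketch of the conference-rate bookkeeping implied by these constraints (the round alphabet sizes and numbers are made up for illustration):

    from math import log2

    def conference_budget_ok(sizes_1to2, sizes_2to1, N, C12, C21):
        """Willems-style conference over K rounds: encoder 1 sends a symbol from an
        alphabet of size sizes_1to2[k] in round k (and vice versa).  The total number
        of conference bits in each direction may not exceed N*C12 (resp. N*C21)."""
        bits_12 = sum(log2(m) for m in sizes_1to2)
        bits_21 = sum(log2(m) for m in sizes_2to1)
        return bits_12 <= N * C12 and bits_21 <= N * C21

    # a 2-round conference over a block of N = 1000 channel uses
    print(conference_budget_ok([4, 8], [2, 2], N=1000, C12=0.01, C21=0.002))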

Page 9: The Capacity of Interference Channels with Partial Transmitter Cooperation

Partial Transmitter Cooperation

• In the conference: partial information about the messages W1, W2 is exchanged between the encoders
• After the conference:
  • Common message at the encoders
• Encompasses scenarios in which the encoders have partial knowledge of each other’s messages

[Figure: encoder 1 has (W1, W0), encoder 2 has (W2, W0); channel p(y1,y2|x1,x2); decoder 1 observes Y1, decoder 2 observes Y2]

Page 10: The Capacity of Interference Channels with Partial Transmitter Cooperation

Transmitter Cooperation

• Limited cooperation allows the transmitters to exchange partial information about each other’s messages
• After cooperation:
  • Common message known to both encoders
  • Private message at each encoder
• In strong interference, the capacity region is tied to the capacity region of the Compound MAC with Common Information
  • Two MACs with a common message and private messages

[Figure: encoder 1 has private message W1 and common message W0, encoder 2 has private message W2 and common message W0; channel p(y1,y2|x1,x2); each decoder outputs (Ŵ1, Ŵ2, Ŵ0)]

Page 11: The Capacity of Interference Channels with Partial Transmitter Cooperation

Compound MAC with Common Information

[Figure: Source 1 has (W1, W0), Source 2 has (W2, W0); Decoder 1 and Decoder 2 each decode (W1, W2, W0)]

Page 12: The Capacity of Interference Channels with Partial Transmitter Cooperation

The Compound MAC with Common Information

• Encoding:  x_t = f_t(W_t, W_0),  t = 1, 2
• Decoding:  (Ŵ0(t), Ŵ1(t), Ŵ2(t)) = g_t(Y_t),  t = 1, 2
• The error probability:

  Pe = P[ g1(Y1) ≠ (W0, W1, W2)  or  g2(Y2) ≠ (W0, W1, W2) ]

• (R0, R1, R2) is achievable if, for any ε > 0, there is an (M0, M1, M2, N, Pe) code with M_i ≥ 2^{N R_i}, i = 0, 1, 2, and Pe ≤ ε
• The capacity region is the closure of the set of all achievable (R0, R1, R2)
• Easy to determine given the result by [Slepian & Wolf, 1973]

Page 13: The Capacity of Interference Channels with Partial Transmitter Cooperation

The MAC with Common Information

• Capacity region [Slepian & Wolf, 1973]:

  C_MAC = union of the sets of (R0, R1, R2) with
    R1 ≤ I(X1; Y | X2, U)
    R2 ≤ I(X2; Y | X1, U)
    R1 + R2 ≤ I(X1, X2; Y | U)
    R0 + R1 + R2 ≤ I(X1, X2; Y)

  where the union is over p(u) p(x1|u) p(x2|u) p(y|x1,x2)

[Figure: MAC with common information; encoder 1 (private W1, common W0), encoder 2 (private W2, common W0), channel p(y|x1,x2), a single decoder outputs (Ŵ1, Ŵ2, Ŵ0)]
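A sketch of how these bounds can be evaluated numerically for a discrete channel; the joint distribution below (axes U, X1, X2, Y and the noisy-AND channel) is an arbitrary toy example, not one used in the talk:

    import numpy as np

    def H(p, axes):
        """Entropy (bits) of the marginal of the pmf array p on the given axes."""
        drop = tuple(i for i in range(p.ndim) if i not in axes)
        q = np.asarray(p.sum(axis=drop)).ravel()
        q = q[q > 0]
        return float(-(q * np.log2(q)).sum())

    def cond_MI(p, A, B, C=()):
        """I(A;B|C) in bits; A, B, C are disjoint tuples of axis indices of p."""
        A, B, C = tuple(A), tuple(B), tuple(C)
        return H(p, A + C) + H(p, B + C) - H(p, A + B + C) - H(p, C)

    # Axes: 0 = U, 1 = X1, 2 = X2, 3 = Y, all binary.
    pu = np.array([0.5, 0.5])
    px1_u = np.array([[0.9, 0.1], [0.2, 0.8]])      # p(x1|u)
    px2_u = np.array([[0.8, 0.2], [0.3, 0.7]])      # p(x2|u)
    eps = 0.1
    py_x = np.zeros((2, 2, 2))                      # p(y|x1,x2): noisy AND
    for a in range(2):
        for b in range(2):
            y = a & b
            py_x[a, b, y], py_x[a, b, 1 - y] = 1 - eps, eps

    p = np.einsum('u,ua,ub,aby->uaby', pu, px1_u, px2_u, py_x)

    print('R1       <=', cond_MI(p, (1,), (3,), (2, 0)))    # I(X1;Y|X2,U)
    print('R2       <=', cond_MI(p, (2,), (3,), (1, 0)))    # I(X2;Y|X1,U)
    print('R1+R2    <=', cond_MI(p, (1, 2), (3,), (0,)))    # I(X1,X2;Y|U)
    print('R0+R1+R2 <=', cond_MI(p, (1, 2), (3,)))          # I(X1,X2;Y)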

Page 14: The Capacity of Interference Channels with Partial Transmitter Cooperation

The MAC with Common Information

• Capacity region [Slepian & Wolf, 1973]: the same region as on the previous slide
• For a fixed input distribution p = p(u) p(x1|u) p(x2|u), the corresponding set of rate triples (R0, R1, R2) is denoted R_MAC(p)

Page 15: The Capacity of Interference Channels with Partial Transmitter Cooperation

The Compound MAC with Common Information

• Capacity region [MYK, ISIT 2005]:

  C_CMAC = union of the sets of (R0, R1, R2) with
    R1 ≤ min{ I(X1; Y1 | X2, U), I(X1; Y2 | X2, U) }
    R2 ≤ min{ I(X2; Y1 | X1, U), I(X2; Y2 | X1, U) }
    R1 + R2 ≤ min{ I(X1, X2; Y1 | U), I(X1, X2; Y2 | U) }
    R0 + R1 + R2 ≤ min{ I(X1, X2; Y1), I(X1, X2; Y2) }

  where the union is over p(u) p(x1|u) p(x2|u) p(y1,y2|x1,x2)

[Figure: compound MAC with common information; encoder 1 (W1, W0), encoder 2 (W2, W0), channel p(y1,y2|x1,x2), decoders 1 and 2 each output (Ŵ1, Ŵ2, Ŵ0)]

Page 16: The Capacity of Interference Channels with Partial Transmitter Cooperation

The Compound MAC with Common Information

• Capacity region:

  C_CMAC = ∪_p [ R_MAC1(p) ∩ R_MAC2(p) ],
  union over p = p(u) p(x1|u) p(x2|u) p(y1,y2|x1,x2)

• For each p: the set of (R0, R1, R2) is an intersection of the rate regions R_MACt achieved in two MACs with common information:

  p1:  p(y1 | x1, x2) = Σ_{y2} p(y1, y2 | x1, x2)
  p2:  p(y2 | x1, x2) = Σ_{y1} p(y1, y2 | x1, x2)
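A tiny sketch of this intersection step: for a fixed p, compute the per-receiver bounds separately for Y1 and Y2 (for example with the helper from the earlier sketch) and take the minimum bound by bound; the dictionary keys and the numbers are only illustrative:

    def intersect_regions(bounds_y1, bounds_y2):
        """Compound-MAC region for one fixed p: the intersection of the two
        per-receiver regions, i.e. the componentwise minimum of the bounds."""
        return {k: min(bounds_y1[k], bounds_y2[k]) for k in bounds_y1}

    region = intersect_regions(
        {'R1': 0.42, 'R2': 0.35, 'Rsum': 0.61, 'Rtot': 0.80},   # bounds at Y1
        {'R1': 0.50, 'R2': 0.28, 'Rsum': 0.66, 'Rtot': 0.74})   # bounds at Y2
    print(region)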

Page 17: The Capacity of Interference Channels with Partial Transmitter Cooperation

Converse

• Error probability in MACt:

  Pet = P[ g_t(Y_t) ≠ (W0, W1, W2) ],  t = 1, 2

• Error probability in the CMAC:

  Pe = P[ g1(Y1) ≠ (W0, W1, W2)  or  g2(Y2) ≠ (W0, W1, W2) ]  ≥  max{ Pe1, Pe2 }

→ Necessary condition for Pe → 0:  Pe1 → 0 and Pe2 → 0
→ Rates confined to R_MAC1(p) and R_MAC2(p) for every p

Page 18: The Capacity of Interference Channels with Partial Transmitter Cooperation

Achievability

• The probability of error (union bound):

  Pe = P[ g1(Y1) ≠ (W0, W1, W2)  or  g2(Y2) ≠ (W0, W1, W2) ]
     ≤ P[ g1(Y1) ≠ (W0, W1, W2) ] + P[ g2(Y2) ≠ (W0, W1, W2) ]  =  Pe1 + Pe2

• From the Slepian and Wolf result, choosing the rates (R0, R1, R2) in R_MAC1 ∩ R_MAC2 guarantees that Pe1 and Pe2 can be made arbitrarily small
→ Pe will be arbitrarily small

Page 19: The Capacity of Interference Channels with Partial Transmitter Cooperation

Implications

• We can use this result to determine the capacity region of several channels with partial transmitter cooperation:

1. The Strong Interference Channel with Common Information

2. The Strong Interference Channel with Unidirectional Cooperation

3. The Compound MAC with Conferencing Encoders

Page 20: The Capacity of Interference Channels with Partial Transmitter Cooperation

After the Conference

• Compound MAC with common information
• Relax the decoding constraints:
  • Each receiver decodes only one private message

[Figure: encoder 1 (W1, W0), encoder 2 (W2, W0), channel p(y1,y2|x1,x2); decoder 1 outputs (Ŵ1, Ŵ0), decoder 2 outputs (Ŵ2, Ŵ0)]

Page 21: The Capacity of Interference Channels with Partial Transmitter Cooperation

Theorem

• For the interference channel with common information satisfying

  I(X1; Y1 | X2) ≤ I(X1; Y2 | X2)
  I(X2; Y2 | X1) ≤ I(X2; Y1 | X1)

  for all product input distributions, the capacity region C is the union of the sets of (R0, R1, R2) with

    R1 ≤ I(X1; Y1 | X2, U)
    R2 ≤ I(X2; Y2 | X1, U)
    R1 + R2 ≤ min{ I(X1, X2; Y1 | U), I(X1, X2; Y2 | U) }
    R0 + R1 + R2 ≤ min{ I(X1, X2; Y1), I(X1, X2; Y2) }

  where the union is over p(u) p(x1|u) p(x2|u) p(y1,y2|x1,x2)

• Capacity region = capacity region of the compound MAC with common information

Page 22: The Capacity of Interference Channels with Partial Transmitter Cooperation

Proof

• Achievability

• Follows directly from the achievability in the Compound MAC with common information

– Decoding constraint is relaxed

• Converse

• Using Fano’s inequality

• In the interference channel with no cooperation: outer bounds rely on the independence of X1 and X2

• Due to cooperation: Codewords not independent

• Theorem conditions obtained from the converse

Page 23: The Capacity of Interference Channels with Partial Transmitter Cooperation

Relationship to the Strong Interference Conditions

• Strong interference channel conditions:

  I(X1; Y1 | X2) ≤ I(X1; Y2 | X2)
  I(X2; Y2 | X1) ≤ I(X2; Y1 | X1)

  for all product input distributions p_X1(x1) p_X2(x2)

• Conditions used here:

  I(X1; Y1 | X2, U) ≤ I(X1; Y2 | X2, U)
  I(X2; Y2 | X1, U) ≤ I(X2; Y1 | X1, U)

  for all input distributions p_U(u) p_X1|U(x1|u) p_X2|U(x2|u)

• The same interference channel class satisfies the two sets of conditions

Page 24: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel with Unidirectional Cooperation

[Figure: encoder 1 maps W1 to X1; encoder 2 knows (W1, W2) and maps them to X2; channel; decoder 1 outputs Ŵ1 from Y1, decoder 2 outputs Ŵ2 from Y2]

• Encoding functions:  x1 = f1(W1),  x2 = f2(W1, W2)
• Decoding functions:  Ŵ1 = g1(Y1),  Ŵ2 = g2(Y2)
• The error probability:  Pe,t = P[ g_t(Y_t) ≠ W_t ]  for t = 1, 2;  Pe = max{ Pe,1, Pe,2 }

• The difference from the interference channel: one encoder knows both messages

Page 25: The Capacity of Interference Channels with Partial Transmitter Cooperation

Cognitive Radio Settings

• Cognitive Radio Channel [Devroye, Mitran, Tarokh, 2005]
  • An achievable rate region
• Consider a simple two-transmitter, two-receiver network:
  • Assume that one transmitter is cognitive
  • It can “overhear” the transmission of the primary user
  • It partially obtains the primary user’s message → it can cooperate

[Figure: primary user and secondary user (cognitive radio) transmitting to decoders 1 and 2; an interference channel with unidirectional cooperation]

• The assumption that the full message W1 is available at the cognitive user is an over-idealization

Page 26: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel with Unidirectional Cooperation

• The Interference Channel with Unidirectional Cooperation [Marić, Yates & Kramer, 2005]
  • Capacity in very strong interference
• The Interference Channel with Degraded Message Set [Wu, Vishwanath & Arapostathis, 2006]
  • Capacity for weak interference and for the Gaussian channel in weak interference
• Cognitive Radio Channel [Jovičić & Viswanath, 2006]
  • Capacity for the Gaussian channel in weak interference
• The Interference Channel with a Cognitive Transmitter [Marić, Goldsmith, Kramer & Shamai, 2006]
  • New outer bounds and an achievable region

Page 27: The Capacity of Interference Channels with Partial Transmitter Cooperation

Theorem

• For the interference channel with unidirectional cooperation satisfying

  I(X2; Y2 | X1) ≤ I(X2; Y1 | X1)
  I(X1, X2; Y1) ≤ I(X1, X2; Y2)

  for all joint input distributions p(x1, x2), the capacity region C is the union of the sets of (R1, R2) with

    R2 ≤ I(X2; Y2 | X1)
    R1 + R2 ≤ I(X1, X2; Y1)

  where the union is over p(x1, x2) p(y1, y2 | x1, x2)

• Capacity region = capacity region of the Compound MAC with Common Information

Page 28: The Capacity of Interference Channels with Partial Transmitter Cooperation

Achievability: Compound MAC with Common Information

• Encode as for a Compound MAC with Common Information
• Due to unidirectional cooperation:
  • W1 is a common message
  • Encoder 1 has no private message: R′0 = R1, R′1 = 0; choose U = X1
• The resulting region:

  C_CMAC = set of (R1, R2) with
    R2 ≤ min{ I(X2; Y1 | X1), I(X2; Y2 | X1) }
    R1 + R2 ≤ min{ I(X1, X2; Y1), I(X1, X2; Y2) }

[Figure: encoder 1 sends the common message W1; encoder 2 sends (W1, W2); decoders 1 and 2 each output (Ŵ1, Ŵ2)]
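A sketch of how these two bounds can be evaluated for the Gaussian channel, using jointly Gaussian inputs with correlation coefficient rho (the parametrization and the numerical values are mine, for illustration):

    import numpy as np

    def C(x):                                      # Gaussian capacity function, bits/use
        return 0.5 * np.log2(1 + x)

    def unidirectional_region(P1, P2, h12, h21, rho):
        """Bounds R2 <= min I(X2;Yt|X1), R1+R2 <= min I(X1,X2;Yt), evaluated for
        jointly Gaussian (X1, X2) with E[X1 X2] = rho*sqrt(P1*P2)."""
        I_x2_y1 = C(h12 * (1 - rho**2) * P2)                               # I(X2;Y1|X1)
        I_x2_y2 = C((1 - rho**2) * P2)                                     # I(X2;Y2|X1)
        I_12_y1 = C(P1 + h12 * P2 + 2 * rho * np.sqrt(h12 * P1 * P2))      # I(X1,X2;Y1)
        I_12_y2 = C(h21 * P1 + P2 + 2 * rho * np.sqrt(h21 * P1 * P2))      # I(X1,X2;Y2)
        return min(I_x2_y1, I_x2_y2), min(I_12_y1, I_12_y2)

    for rho in np.linspace(0.0, 1.0, 5):           # sweep the correlation
        R2_max, Rsum_max = unidirectional_region(P1=1, P2=1, h12=2.0, h21=2.0, rho=rho)
        print(f"rho={rho:.2f}  R2 <= {R2_max:.3f}  R1+R2 <= {Rsum_max:.3f}")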

Page 29: The Capacity of Interference Channels with Partial Transmitter Cooperation

Converse

• Using Fano’s inequality

• Interference channel with no cooperation: outer bounds rely on the independence of X1 and X2

• Due to cooperation: Codewords not independent

• Theorem conditions obtained from the converse

Page 30: The Capacity of Interference Channels with Partial Transmitter Cooperation

Converse (Continued)

• Lemma: If the per-letter conditions

  I(X2; Y2 | X1) ≤ I(X2; Y1 | X1)

  are satisfied for all p(x1, x2), then

  I(X2^N; Y2^N | X1^N, W1) ≤ I(X2^N; Y1^N | X1^N, W1)

• Proof: Similar to the proof by [Costa & El Gamal, 1987] with the changes:
  • X1, X2 not independent
  • Conditioning on W1

• Bound:  N(R1 + R2) ≤ Σ_{n=1}^N I(X1n, X2n; Y1n) + N εN
  • Requires the inequality of the Lemma

• We would next like  N(R1 + R2) ≤ Σ_{n=1}^N I(X1n, X2n; Y2n) + N εN,
  but because the situation is asymmetric, this seems difficult

Page 31: The Capacity of Interference Channels with Partial Transmitter Cooperation

Converse (Continued)

• Recall that the achievable rates are

  C_CMAC = set of (R1, R2) with
    R2 ≤ min{ I(X2; Y1 | X1), I(X2; Y2 | X1) }
    R1 + R2 ≤ min{ I(X1, X2; Y1), I(X1, X2; Y2) }

• By assumption, for all p(x1, x2):

  I(X2; Y2 | X1) ≤ I(X2; Y1 | X1)
  I(X1, X2; Y1) ≤ I(X1, X2; Y2)

• The rates (R1, R2) with

  R2 ≤ I(X2; Y2 | X1)
  R1 + R2 ≤ I(X1, X2; Y1),

  union over all p(x1, x2), are thus achievable and are an outer bound

Page 32: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel

• Channel outputs:

  y1 = x1 + √h12 x2 + z1
  y2 = √h21 x1 + x2 + z2

• Power constraints:  (1/N) Σ_{i=1}^N x_ti² ≤ P_t,  t = 1, 2
• Noise:  Z1 ~ N(0,1),  Z2 ~ N(0,1)

[Figure: standard-form Gaussian channel; sources S1, S2, direct gains 1, cross gains √h12, √h21]

Page 33: The Capacity of Interference Channels with Partial Transmitter Cooperation

Capacity Achieving Scheme in Strong Interference

• Encoder 1: codebook x1(w1),  X1 ~ N(0, P1),  w1 ∈ {1, ..., 2^{N R1}}
• Encoder 2:
  • Dedicates a portion of P2 to help transmit W1
  • Uses superposition coding:  X2 = √(β P2 / P1) X1 + X20,  X20 ~ N(0, (1 − β) P2)
• Resulting rates:

  R2 ≤ ½ log(1 + (1 − β) P2)
  R1 + R2 ≤ ½ log(1 + P1 + h12 P2 + 2√(β h12 P1 P2))

• Decoders reduce interference as they can decode each other’s messages

[Figure: encoders 1, 2 and decoders 1, 2 with cross gains √h21, √h12]
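A minimal sketch of this superposition construction (the split β, the powers, and the block length are illustrative assumptions, not the talk's numbers):

    import numpy as np

    rng = np.random.default_rng(1)
    N, P1, P2, beta = 1000, 1.0, 2.0, 0.4

    # Encoder 1: Gaussian codeword for W1 with power P1.
    x1 = np.sqrt(P1) * rng.standard_normal(N)

    # Encoder 2 (knows W1): spends a fraction beta of P2 on a scaled copy of x1
    # and the remaining (1 - beta) P2 on its own codeword x20 for W2.
    x20 = np.sqrt((1 - beta) * P2) * rng.standard_normal(N)
    x2 = np.sqrt(beta * P2 / P1) * x1 + x20

    print(np.mean(x2 ** 2))   # approximately P2: the power constraint is met on average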

Page 34: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel: Strong Interference Conditions

• Interference channel (no cooperation):  h12 ≥ 1,  h21 ≥ 1
• Interference channel with common information:  h12 ≥ 1,  h21 ≥ 1
• Unidirectional cooperation: more demanding conditions, coupling h12 and h21 and depending on the power ratio P1/P2
• For P1 = P2: sufficient conditions on h12 and h21 alone
• For P1 = 0: conditions never satisfied
  • Channel reduces to a degraded broadcast channel from sender 2

[Figure: standard-form Gaussian channel; sources S1, S2, cross gains √h12, √h21]

Page 35: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel with Unidirectional Cooperation

• Let P1/P2 = 1
• Sufficient conditions: strong interference holds in a region of the (h12, h21) plane with h21 ≥ 1

[Figure: (h12, h21) plane divided into weak and strong interference regimes]

• Strong interference scenario:

[Figure: node placement with sources s1, s2 and destinations d1, d2]

• Weak and strong interference regimes solved
• Our current work: the in-between regime

Page 36: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel with Conferencing

• Channel outputs:

  y1 = x1 + √h12 x2 + z1
  y2 = √h21 x1 + x2 + z2

• Power constraints:  (1/N) Σ_{i=1}^N x_ti² ≤ P_t,  t = 1, 2
• Noise:  Z1 ~ N(0,1),  Z2 ~ N(0,1)
• Note: additional resources are needed for the conference

[Figure: standard-form Gaussian channel with conference links of capacities C12 and C21 between senders S1 and S2]

Page 37: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel with Conferencing

• Symmetric channel:  h12 = h21 = h,  P1 = P2 = P,  c = c12 = c21,  Zt ~ N(0,1), t = 1, 2
• Input correlation:  a = E[X1 X2] / √(P1 P2)
  • a = 0: X1, X2 independent
• Max sum rate: the two sum-rate bounds are equal

[Figure: rate regions for the symmetric channel, growing as a increases from a = 0; the c = 0 (no cooperation) region and the max sum-rate point are marked]

Page 38: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel with Conferencing

• Symmetric channel
• Sender t power, t = 1, 2:  (1/N) Σ_{i=1}^N x_ti² ≤ P_t
• With no conference, the conference power can instead be spent on transmission:  (1/N) Σ_{i=1}^N x_ti² ≤ P_t + P_c
• P_c is the conference power:  c = ½ log(1 + P_c)

[Figure: rate regions growing as a increases from a = 0; the c = 0 (no cooperation) region and the max sum-rate point are marked]
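A small sketch of this accounting, assuming (as the expression c = ½ log(1 + P_c) suggests) that the conference runs over a dedicated unit-noise AWGN link fed with power P_c; the numbers are illustrative:

    from math import log2

    def conference_rate(Pc):
        """Conference rate bought with power Pc on a dedicated AWGN link."""
        return 0.5 * log2(1 + Pc)

    P_t, Pc = 1.0, 0.5
    print("c =", conference_rate(Pc))                   # about 0.29 bits/use in each direction
    print("transmit power with conferencing:   ", P_t)
    print("transmit power without conferencing:", P_t + Pc)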

Page 39: The Capacity of Interference Channels with Partial Transmitter Cooperation

Full Cooperation in Gaussian Channel

• Large c:
  • Senders are able to exchange the indexes W1 and W2 → full cooperation condition
  • Less cooperation is needed as the receivers are further away
• Rate region: maximized for a = 1

Page 40: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel with Conferencing

• After the conference:
  • The common message contains information about both original messages: W0 = (W01, W02)
• Relax the constraint → each user decodes only one message:
  • Decoders are interested in only a part of the common message
• Strong interference conditions?

[Figure: encoder 1 (W1, W0), encoder 2 (W2, W0), channel p(y1,y2|x1,x2); decoder 1 outputs (Ŵ1, Ŵ01), decoder 2 outputs (Ŵ2, Ŵ02)]

Page 41: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel with Conferencing

• Encompasses various models:
  • MIMO broadcast
  • Unidirectional cooperation
  • Partial cooperation
• Strong interference conditions?

[Figure: encoder 1 (W1, W01, W02), encoder 2 (W2, W01, W02), channel; decoder 1 outputs (Ŵ1, Ŵ01), decoder 2 outputs (Ŵ2, Ŵ02)]

Page 42: The Capacity of Interference Channels with Partial Transmitter Cooperation

Discussion

• Introduced cooperation into the Interference Channel:

• Capacity results for scenarios of strong interference

• If the interference is strong: decoders can decode the interference

• No partial decoding at the receivers → easier problem

• Ongoing work:

• Channel models that incorporate node cooperation

– Capture characteristics of cognitive radio networks

• Capacity results and cooperation gains for more general settings

Page 43: The Capacity of Interference Channels with Partial Transmitter Cooperation

Ongoing Work

• Determining the fundamental limits of wireless networks requires understanding the optimal ways of cooperation

• Hierarchy of cooperation

• Encoding schemes

• Optimum resource allocation

• How to cope with interference

Page 44: The Capacity of Interference Channels with Partial Transmitter Cooperation

• Not needed

Page 45: The Capacity of Interference Channels with Partial Transmitter Cooperation

The Gaussian Channel Capacity Region

[Figure: capacity region of the symmetric channel, growing as a increases from a = 0; corner points A and B and the c = 0 (no cooperation) region are marked]

• Corner A:
  • S1 acts as a relay: sends at rate c
  • S2 sends independent information at rate C(hP)
• Corner B:
  • In addition, S1 sends its own data
  • The sum rate for a = 0 is achieved

Page 46: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel with Confidential Messages

• Interference Channel

[Figure: the interference channel; encoders 1 and 2 map W1 and W2 to X1 and X2, channel p(y1,y2|x1,x2), decoder 1 outputs Ŵ1, decoder 2 outputs Ŵ2]

Page 47: The Capacity of Interference Channels with Partial Transmitter Cooperation

Interference Channel with Confidential Messages

• Joint work with Ruoheng Liu, Predrag Spasojevic, and Roy D. Yates
• Developed inner and outer bounds

[Figure: transmitters 1 and 2; receiver 1 decodes Ŵ1 with the secrecy of W2 measured by the equivocation H(W2 | Y1, W1); receiver 2 decodes Ŵ2 with the secrecy of W1 measured by H(W1 | Y2, W2)]

Page 48: The Capacity of Interference Channels with Partial Transmitter Cooperation

The Compound MAC with Conferencing Encoders

• Adopt Willems’ cooperation model
• Decoding constraint: both messages decoded by both receivers
• Encoding:  x1 = f1(W1, V2^K),  x2 = f2(W2, V1^K)
• Decoding:  (Ŵ1, Ŵ2) = g_t(Y_t),  t = 1, 2
• The probability of error:

  Pe = P[ g1(Y1) ≠ (W1, W2)  or  g2(Y2) ≠ (W1, W2) ]

[Figure: encoders 1 and 2 connected by conference links of capacities C12 and C21, channel p(y1,y2|x1,x2); decoders 1 and 2 each output (Ŵ1, Ŵ2)]

Page 49: The Capacity of Interference Channels with Partial Transmitter Cooperation

Theorem

• The Compound MAC capacity region C(C12, C21) is

  C(C12, C21) = ∪_p [ R_MAC1(p, C12, C21) ∩ R_MAC2(p, C12, C21) ],
  union over p = p(u) p(x1|u) p(x2|u) p(y1,y2|x1,x2)

• For each p: intersection between the rate regions of two MACs with partially cooperating encoders:

  R_MAC1(p, C12, C21):  (R1, R2) with
    R1 ≤ I(X1; Y1 | X2, U) + C12
    R2 ≤ I(X2; Y1 | X1, U) + C21
    R1 + R2 ≤ I(X1, X2; Y1 | U) + C12 + C21
    R1 + R2 ≤ I(X1, X2; Y1)

  R_MAC2(p, C12, C21):  (R1, R2) with
    R1 ≤ I(X1; Y2 | X2, U) + C12
    R2 ≤ I(X2; Y2 | X1, U) + C21
    R1 + R2 ≤ I(X1, X2; Y2 | U) + C12 + C21
    R1 + R2 ≤ I(X1, X2; Y2)
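A small sketch of evaluating this region for one fixed p, given the four mutual informations per receiver and the conference capacities (the dictionary keys are my naming, and the numbers are illustrative):

    def willems_mac_bounds(I, C12, C21):
        """Per-receiver bounds: I holds 'I1' = I(X1;Y|X2,U), 'I2' = I(X2;Y|X1,U),
        'I12_U' = I(X1,X2;Y|U), 'I12' = I(X1,X2;Y) for that receiver."""
        return {
            'R1':   I['I1'] + C12,
            'R2':   I['I2'] + C21,
            'Rsum': min(I['I12_U'] + C12 + C21, I['I12']),
        }

    def compound_bounds(I_y1, I_y2, C12, C21):
        """Intersection over the two receivers (componentwise minimum)."""
        b1 = willems_mac_bounds(I_y1, C12, C21)
        b2 = willems_mac_bounds(I_y2, C12, C21)
        return {k: min(b1[k], b2[k]) for k in b1}

    I_y1 = {'I1': 0.40, 'I2': 0.35, 'I12_U': 0.60, 'I12': 0.85}
    I_y2 = {'I1': 0.45, 'I2': 0.30, 'I12_U': 0.65, 'I12': 0.80}
    print(compound_bounds(I_y1, I_y2, C12=0.1, C21=0.05))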

Page 50: The Capacity of Interference Channels with Partial Transmitter Cooperation

Gaussian Channel with Conferencing

• The Gaussian C-MAC capacity region (use the maximum entropy theorem):

  C(C12, C21) = set of (R1, R2) with
    0 ≤ R1 ≤ min_{j∈{1,2}} C(ā h1j P1) + C12
    0 ≤ R2 ≤ min_{j∈{1,2}} C(b̄ h2j P2) + C21
    0 ≤ R1 + R2 ≤ min_{j∈{1,2}} C(ā h1j P1 + b̄ h2j P2) + C12 + C21
    0 ≤ R1 + R2 ≤ min_{j∈{1,2}} C(h1j P1 + h2j P2 + 2 √(a b h1j h2j P1 P2))

• Union over all 0 ≤ a ≤ 1, 0 ≤ b ≤ 1, where ā = 1 − a, b̄ = 1 − b, h11 = h22 = 1 (htj is the power gain from sender t to receiver j)
• For C12 = C21 = 0: the optimum choice is a = 0, b = 0
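A sketch that sweeps the power splits (a, b) in this region, under the parametrization reconstructed above (a, b as the fractions of power placed on the common, coherently combined part); the gain and power values are illustrative:

    import numpy as np

    def C(x):
        return 0.5 * np.log2(1 + x)

    def conferencing_bounds(a, b, P1, P2, h12, h21, C12, C21):
        """Evaluate the four bounds for one power split (a, b); h12, h21 are the
        standard-form cross gains, the direct gains are 1."""
        abar, bbar = 1 - a, 1 - b
        g1 = {1: 1.0, 2: h21}        # power gains from sender 1 to receivers 1, 2
        g2 = {1: h12, 2: 1.0}        # power gains from sender 2 to receivers 1, 2
        R1 = min(C(abar * g1[j] * P1) for j in (1, 2)) + C12
        R2 = min(C(bbar * g2[j] * P2) for j in (1, 2)) + C21
        Rsum1 = min(C(abar * g1[j] * P1 + bbar * g2[j] * P2) for j in (1, 2)) + C12 + C21
        Rsum2 = min(C(g1[j] * P1 + g2[j] * P2
                      + 2 * np.sqrt(a * b * g1[j] * g2[j] * P1 * P2)) for j in (1, 2))
        return R1, R2, min(Rsum1, Rsum2)

    def best_sum_rate(a, b, *args):
        R1, R2, Rsum = conferencing_bounds(a, b, *args)
        return min(Rsum, R1 + R2)

    # sweep the splits to find the best sum rate for one illustrative channel
    best = max(best_sum_rate(a, b, 1.0, 1.0, 1.5, 1.5, 0.2, 0.2)
               for a in np.linspace(0, 1, 51) for b in np.linspace(0, 1, 51))
    print(best)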