
Applications of Random Matrix Theory in Wireless Underwater Communication

Why Signal Processing and Wireless Communication Need Random Matrix Theory

Atulya Yellepeddi

May 13, 2013
18.338 - Eigenvalues of Random Matrices, Spring 2013 - Final Project

Atulya Yellepeddi RMT Appl. to Underwater Wireless Comm 18.338 Course Project 1 / 10

Signal Processing and the Law of Large Numbers

- Want to estimate a "population" of size m from n observations
  - Methods and analysis are based on statistics of the population as n → ∞
  - Usually works if n ≫ m
- Modern signal processing: cases where n ∼ m
  - Parameters of a time-varying system (e.g., underwater communication)
  - Population size grows as the observations grow (e.g., social networks)
  - Huge population size (e.g., huge array beamformers)
- Random Matrix Theory: excellent at making predictions in such scenarios


Why Random Matrix Theory?

For Mathematicians, 30 ≈ ∞
With reasonable population sizes (m ∼ 30 or more), we can make highly accurate predictions that work for a small number of observations.

Almost Anything is IID Gaussian
Results for IID Gaussian ensembles carry over, in practice, to all sorts of ensembles (with some caveats!) as long as they are "reasonably" Gaussian-like and "more or less" independent.


Least Squares Channel Estimation - the Problem

- Transmit u(n)... m × 1 vectors, independent from time to time:
  E[u(n)] = 0,  E[u(n)u†(m)] = R δ(n − m)
- ...across channel w₀...
  outputs: d(n) = w₀†u(n) + v(n), where v(n) is noise of power σv²
- ...what's w₀? We don't know R.

Least Squares Solution

w(n) = R⁻¹(n) z(n),  where R(n) = ∑_{i=1}^n u(i)u†(i) + δI  and  z(n) = ∑_{i=1}^n u(i)d*(i)    (1)
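The estimator of Eq. (1) is a few lines of linear algebra. A minimal numpy sketch follows; it is real-valued for simplicity (so † is just a transpose), and m, n, δ, the noise level, and the channel are all illustrative choices:

```python
import numpy as np

# A minimal sketch of the least squares solution of Eq. (1),
# w(n) = R^{-1}(n) z(n). Real-valued for simplicity (so † is just a
# transpose); m, n, δ, the noise level, and the channel are illustrative.
rng = np.random.default_rng(0)
m, n, delta, sigma_v = 30, 200, 1e-3, 0.1

w0 = rng.standard_normal(m)                      # the unknown channel
U = rng.standard_normal((m, n))                  # columns are the inputs u(1)..u(n)
d = w0 @ U + sigma_v * rng.standard_normal(n)    # d(i) = w0^T u(i) + v(i)

R_n = U @ U.T + delta * np.eye(m)                # R(n) = sum u(i)u(i)^T + δI
z = U @ d                                        # z(n) = sum u(i)d(i)
w_hat = np.linalg.solve(R_n, z)                  # w(n), via solve rather than an explicit inverse

print(np.linalg.norm(w_hat - w0) / np.linalg.norm(w0))  # small relative error
```

With n clearly larger than m, as here, the relative estimation error is small; the interesting regime n ∼ m is the subject of the following slides.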


Least Squares Channel Estimation - the Challenge

- Idea: as n → ∞, the sample covariance matrix R(n)/n → R.
- Performance measure: the channel estimation error ε(n) = w₀ − w(n)
- If n → ∞ (practically, n ≫ m) is assumed:

  E[‖ε(n)‖₂²] ≈ (1/n) σv² tr{R⁻¹}    (2)

- Bad predictions for n ∼ m
  - Occurs in applications like underwater acoustic communication, where the channel varies quickly
  - Or when taking observations is expensive
- Random Matrix Theory allows better predictions
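A quick Monte Carlo sketch of the n ≫ m approximation in Eq. (2), assuming iid standard Gaussian inputs (so R = I and tr{R⁻¹} = m); the sizes, δ, and noise level are illustrative choices:

```python
import numpy as np

# Monte Carlo check of the n >> m approximation in Eq. (2), assuming iid
# standard Gaussian inputs (R = I, so tr{R^{-1}} = m). All sizes, the
# regularization δ, and the noise level are illustrative choices.
rng = np.random.default_rng(1)
m, n, delta, sigma_v, trials = 10, 1000, 1e-6, 0.3, 200

mse = 0.0
for _ in range(trials):
    w0 = rng.standard_normal(m)
    U = rng.standard_normal((m, n))                 # columns are u(1)..u(n)
    d = w0 @ U + sigma_v * rng.standard_normal(n)   # d(i) = w0^T u(i) + v(i)
    w_hat = np.linalg.solve(U @ U.T + delta * np.eye(m), U @ d)
    mse += np.sum((w_hat - w0) ** 2) / trials       # empirical E[||eps(n)||^2]

predicted = sigma_v**2 * m / n   # (1/n) σv² tr{R^{-1}} with R = I
print(mse, predicted)            # agree well, since n >> m here
```

Shrinking n toward m in this sketch makes the gap between the simulated error and the prediction grow, which is exactly the failure mode the RMT analysis addresses.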


The Random Matrix Theory Results

Let M_k(m,n) = (1/m) E[Tr(R⁻ᵏ(n))]. Then...

Channel Estimation Error:

E[‖ε(n)‖₂²] = m (σv² M₁(m,n) + δ² M₂(m,n) − δ σv² M₂(m,n))    (3)
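As a sanity check on Eq. (3): when δ is negligibly small, the δ terms drop and the formula reduces to m σv² M₁(m,n). The sketch below (illustrative sizes, real-valued iid Gaussian inputs so that R = I) compares this with simulation in the n ∼ m regime, alongside the naive n ≫ m value m σv²/n from Eq. (2):

```python
import numpy as np

# Sanity check of Eq. (3) in the n ~ m regime, with δ taken negligibly
# small so the formula reduces to m*σv²*M1(m,n). Inputs are iid standard
# Gaussian (R = I), so the n >> m formula of Eq. (2) gives m*σv²/n.
# All sizes and the noise level are illustrative choices.
rng = np.random.default_rng(3)
m, n, delta, sigma_v, trials = 30, 60, 1e-9, 0.2, 300

mse, M1 = 0.0, 0.0
for _ in range(trials):
    w0 = rng.standard_normal(m)
    U = rng.standard_normal((m, n))                  # columns are u(1)..u(n)
    d = w0 @ U + sigma_v * rng.standard_normal(n)    # d(i) = w0^T u(i) + v(i)
    R_n = U @ U.T + delta * np.eye(m)
    w_hat = np.linalg.solve(R_n, U @ d)
    mse += np.sum((w_hat - w0) ** 2) / trials        # empirical E[||eps||^2]
    M1 += np.trace(np.linalg.inv(R_n)) / m / trials  # empirical M1(m,n)

rmt_pred = m * sigma_v**2 * M1    # Eq. (3) with δ ≈ 0
naive_pred = m * sigma_v**2 / n   # Eq. (2) with R = I
print(mse, rmt_pred, naive_pred)  # RMT tracks the simulation; naive is low
```

Here M₁(m,n) comes out near 1/(n − m) rather than 1/n, so at n = 2m the RMT prediction is roughly twice the naive one, and it is the RMT value that matches the simulation.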


Moments: Independent Input Observations

Let A = (1/n) T^{1/2} X X† T^{1/2}.

Two key ideas to compute the moments:

- Assume m and n are large but c = m/n is constant. Then the eigenvalue density of R(n) can be approximated (Marcenko-Pastur law):

  µ_{R(n)}(x) ≈ µ_Φ(x) = (1/n) µ_A((x − δ)/n)    (4)

- Use the following to compute moments:

  lim_{m→∞} (1/m) E[Tr(Bᵏ)] = ∫ tᵏ µ_B(t) dt    (5)

Compute Moments Using:

M_k(m,n) ≈ ∫ t⁻ᵏ µ_Φ(t) dt    (6)
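The recipe of Eqs. (4)-(6) can be checked numerically. The sketch below assumes iid standard Gaussian inputs, i.e. T = I, so that A = (1/n) X X† follows the plain Marcenko-Pastur law; the sizes, δ, and the moment order k are illustrative choices:

```python
import numpy as np

# Numerical check of Eqs. (4)-(6), assuming iid standard Gaussian inputs
# (T = I, so A = (1/n) X X^† follows the plain Marcenko-Pastur law).
# Sizes, δ, and the moment order k are illustrative choices.
rng = np.random.default_rng(2)
m, n, delta, k, trials = 30, 120, 1e-3, 1, 200
c = m / n

# Empirical moment M_k(m,n) = (1/m) E[Tr R^{-k}(n)], averaged over trials
emp = 0.0
for _ in range(trials):
    U = rng.standard_normal((m, n))
    eig = np.linalg.eigvalsh(U @ U.T + delta * np.eye(m))
    emp += np.mean(eig ** (-k)) / trials

# Marcenko-Pastur density of A on [(1-sqrt(c))^2, (1+sqrt(c))^2];
# by Eq. (4), an eigenvalue x of R(n) corresponds to t = (x - delta)/n
a, b = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
t = np.linspace(a, b, 200001)[1:-1]          # interior points; density -> 0 at edges
mp = np.sqrt((b - t) * (t - a)) / (2 * np.pi * c * t)
theory = np.sum((n * t + delta) ** (-k) * mp) * (t[1] - t[0])  # Eq. (6)

print(emp, theory)   # both close to 1/(n - m)
```

For k = 1 both values land near 1/(n − m), consistent with the known mean 1/(1 − c) of the inverse Marcenko-Pastur eigenvalue after the 1/n rescaling.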


Predictions Made - Gaussian Input

[Plot: Gaussian input, Channel Length = 30, SNR = 40 dB; MSE in dB vs. number of observations; curves: simulations, theory, and the n ≫ m approximation]

Figure: Channel Estimation MSE vs Number of Observations

Predictions Made - Uniform ±1 Input

[Plot: Uniform +1/−1 input, Channel Length = 30, SNR = 5 dB; MSE in dB vs. number of observations; curves: simulations, theory, and the n ≫ m approximation]

Figure: Channel Estimation MSE vs Number of Observations

Conclusions

- RMT makes good predictions about signal processing systems running with a small number of observations
- Leads to identifying phenomena that were previously unknown
- Simple tools, but widely applicable
- More sophisticated tools are available... how do we use them?