Parametric Method of Power Spectrum Estimation


Parametric Method of Power Spectrum Estimation

Presented by: Duha Hassan

2 Power Spectrum of a Signal

The power spectrum of a signal gives the distribution of the signal power among the various frequencies. The power spectrum is the Fourier transform of the correlation function; it describes the characteristics of a time series in the frequency domain.

So the power spectrum represents variance, or power, as a function of frequency and tells us how the energy of the process is distributed.

3 Power Spectrum of a Signal

The task is to estimate the power spectrum given a set of data.

If the signal is random, then only an estimate of its spectrum can be obtained.

4 Why Do We Use Power Spectrum Estimation?

To estimate the spectral characteristics of signals characterized as random processes.

To estimate spectra in the frequency domain when the signals are random in nature.

The aim of a power spectral estimation method is to obtain an approximate estimate of the power spectral density of a given real random process.

5 Introduction to Estimation

Estimation theory is concerned with the determination of the best estimate of an unknown parameter vector from an observation signal, or the recovery of a clean signal degraded by noise and distortion.

6

Parametric Modeling

Parametric modeling techniques find the parameters of a mathematical model describing a signal, system, or process. These techniques use known information about the system to determine the model.

7

8 Model-Based Power Spectrum Estimation

Parametric, or model-based, methods are based on the use of models for the data.

Let us assume that the data are the output of a linear time-invariant system with frequency response H(f), driven by a white-noise input sequence.

The assumption that the input has a flat spectrum implies that the power spectrum of the model output is shaped entirely by the frequency response of the model.

9

The input–output relation of a generalised discrete linear time-invariant model is given below, where x(m) is the model output, e(m) is the input, and the a_k and b_k are the parameters of the model. This is known as an autoregressive moving-average (ARMA) model.

The system function H(z) of this discrete linear time-invariant model is also given below, where 1/A(z) and B(z) are the autoregressive and moving-average parts of H(z) respectively.
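In the usual notation, with p autoregressive and q moving-average coefficients matching the a_k and b_k above, these relations take the form

$$x(m) = \sum_{k=1}^{p} a_k\, x(m-k) + \sum_{k=0}^{q} b_k\, e(m-k)$$

$$H(z) = \frac{B(z)}{A(z)} = \frac{\sum_{k=0}^{q} b_k\, z^{-k}}{1 - \sum_{k=1}^{p} a_k\, z^{-k}}$$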

10

The power spectrum of the signal x(m) is given as the product of the power spectrum P_EE(f) of the input signal and the squared magnitude |H(f)|² of the frequency response of the model:
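$$P_{XX}(f) = P_{EE}(f)\,\left|H(f)\right|^{2}$$

For a white-noise input of variance σe², P_EE(f) = σe², and the shape of the output spectrum is determined entirely by |H(f)|².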

Model-Based Approach

In the model-based approach, the estimation procedure consists of two steps:

Step 1: estimate the parameters {ak} and {bk} of the model.

Step 2: from these estimates, compute the power spectrum estimate.
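As a minimal sketch of these two steps for an AR(p) model (assuming NumPy is available; the function and variable names here are illustrative, not from the slides), the parameters can be estimated from the sample autocorrelation via the Yule-Walker equations and then substituted into the parametric form of the spectrum:

```python
import numpy as np

def ar_spectrum_estimate(x, p, n_freq=512):
    """Two-step parametric (AR) power spectrum estimate.

    Step 1: estimate the AR parameters {a_k} by solving the Yule-Walker
            equations built from the sample autocorrelation of the data.
    Step 2: substitute the estimates into the parametric spectrum
            P(f) = sigma_e^2 / |1 - sum_k a_k exp(-j 2 pi f k)|^2.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    N = len(x)

    # Biased sample autocorrelation for lags 0..p
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(p + 1)])

    # Step 1: solve the Yule-Walker equations  R a = [r(1), ..., r(p)]^T
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:])          # predictor coefficients a_1..a_p
    sigma2 = r[0] - np.dot(a, r[1:])       # prediction-error variance

    # Step 2: evaluate the model spectrum on a grid of normalised frequencies
    f = np.linspace(0.0, 0.5, n_freq)
    k = np.arange(1, p + 1)
    A = 1.0 - np.exp(-2j * np.pi * np.outer(f, k)) @ a
    return f, sigma2 / np.abs(A) ** 2

# Example: AR(8) spectrum estimate of a noisy sinusoid (illustrative data)
rng = np.random.default_rng(0)
n = np.arange(1024)
x = np.sin(2 * np.pi * 0.1 * n) + 0.5 * rng.standard_normal(n.size)
f, P = ar_spectrum_estimate(x, p=8)
```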

12 Parametric Method

The first step is to select an appropriate model for the process. This selection is based upon:

• A priori knowledge about how the process is generated.
• Experimental results indicating that a particular model "works well".

The models used are:

• The autoregressive (AR) model
• The moving-average (MA) model
• The autoregressive moving-average (ARMA) model

13 Parametric Methods

Once the model is selected, the next step is to estimate the model parameters from the given data.

The final step is to estimate the power spectrum by incorporating the estimated parameters into the parametric form for the spectrum.

Example: for an ARMA(p, q) model with the coefficients ap(k) and bq(k) estimated, the spectrum estimate would be
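With the sign convention of the ARMA relation given earlier (denominator 1 − Σ a_p(k) e^{−j2πfk}; note that some texts use a plus sign here), the estimate has the form

$$\hat{P}_{XX}(f) = \hat{\sigma}_e^{2}\; \frac{\left|\sum_{k=0}^{q} \hat{b}_q(k)\, e^{-j 2\pi f k}\right|^{2}}{\left|1 - \sum_{k=1}^{p} \hat{a}_p(k)\, e^{-j 2\pi f k}\right|^{2}}$$

where σ̂e² is the estimated variance of the white-noise input.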

14 Power Spectrum Estimation (cont.)

If the model is correct:
• High-quality spectrum estimates
• Significantly less data required

If the model is wrong:
• The parametric method can give wrong or misleading estimates

15 Maximum-Entropy Spectral Estimation

Maximum-entropy spectral estimation is a method of spectral density estimation. The goal is to improve the spectral quality based on the principle of maximum entropy.

The maximum-entropy estimate is based on the principle that the estimate of the autocorrelation sequence must correspond to the most random signal whose correlation values in the range |m| ≤ P coincide with the measured values. The maximum-entropy principle is appealing because it assumes no more structure in the correlation sequence than that indicated by the measured data.

In maximum-entropy modeling, probability distributions are created on the basis of what is known, leading to a type of statistical inference about the missing information that is called the maximum-entropy estimate.

16

The power spectrum of a stationary signal is defined as the Fourier transform of the autocorrelation sequence. The maximum-entropy power spectrum estimate is obtained by maximizing the entropy of the power spectrum with respect to the unknown autocorrelation values; the resulting estimate is the power spectrum of an autoregressive (AR) model, and the known values of the autocorrelation function can be used to obtain the coefficients of that AR model. The maximum-entropy power spectrum may therefore be expressed in AR form, as shown below.
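Writing r_xx(m) for the autocorrelation sequence, the definition of the power spectrum is

$$P_{XX}(f) = \sum_{m=-\infty}^{\infty} r_{xx}(m)\, e^{-j 2\pi f m}$$

and the maximum-entropy (AR) power spectrum has the form

$$\hat{P}^{\,ME}_{XX}(f) = \frac{\sigma_e^{2}}{\left|1 - \sum_{k=1}^{P} a_k\, e^{-j 2\pi f k}\right|^{2}}$$

where the a_k and σe² are obtained from the known autocorrelation values in |m| ≤ P.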

17

What is an AR Model?

A model which depends only on the previous outputs of the system is called an autoregressive (AR) model.

Note that the AR model is based on frequency-domain analysis and the data should be windowed (a Hamming window is used here).

The AR model has only poles, while the MA model has only zeros.

19

20

Linear prediction models are often referred to as autoregressive (AR) processes.

How well a signal can be predicted from its past samples depends on the autocorrelation function, or equivalently the bandwidth and the power spectrum, of the signal.

The model parameters are found by solving a set of linear equations obtained by minimizing the mean squared prediction error.

A characteristic of this error is that it decreases as the order of the AR model is increased.

21 A linear predictor model forecasts the amplitude of a signal at time m, x(m), using a linearly weighted combination of P past samples [x(m−1), x(m−2), ..., x(m−P)], where m is the discrete time index, x̂(m) is the prediction of x(m), and the a_k are the predictor coefficients.
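In this notation the predictor is

$$\hat{x}(m) = \sum_{k=1}^{P} a_k\, x(m-k)$$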

22 A block-diagram implementation of the linear predictor equation.

23 The error e(m), defined as the difference between the actual sample value x(m) and its predicted value x̂(m), is given below, where u(m) is a zero-mean, unit-variance random signal and G, a gain term, is the square root of the variance of e(m).
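Explicitly,

$$e(m) = x(m) - \hat{x}(m) = x(m) - \sum_{k=1}^{P} a_k\, x(m-k)$$

so that the signal model can be written as

$$x(m) = \sum_{k=1}^{P} a_k\, x(m-k) + G\,u(m), \qquad G = \sqrt{\mathbb{E}\!\left[e^{2}(m)\right]}$$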

24

25 One of the most important considerations is the choice of the number of terms in the AR model; this is known as its order, P.

If the model order is too low, we obtain a highly smoothed spectrum.

If the model order is too high, there is a risk of introducing spurious low-level peaks in the spectrum.

The power spectrum of an autoregressive process is given by

where e(m) is a random signal of variance σe².

The relation between the autocorrelation values and the AR model parameters is obtained by multiplying both sides of the AR model equation by x(m−j) and taking the expectation:
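In the notation above, these are

$$P_{XX}(f) = \frac{\sigma_e^{2}}{\left|1 - \sum_{k=1}^{P} a_k\, e^{-j 2\pi f k}\right|^{2}}$$

and, for lags j ≥ 1,

$$r_{xx}(j) = \sum_{k=1}^{P} a_k\, r_{xx}(j-k)$$

while for j = 0, r_xx(0) = Σ_k a_k r_xx(k) + σe² (the Yule-Walker equations).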

27

Moving-Average Power Spectrum Estimation

A moving-average model is also known as an all-zero or finite impulse response (FIR) filter. A signal x(m) modelled as a moving-average process is described by the first equation below, where e(m) is a zero-mean random input and Q is the model order; the second equation gives the cross-correlation of the input and output of a moving-average process.
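In the standard form, using the symbols above:

$$x(m) = \sum_{k=0}^{Q} b_k\, e(m-k)$$

$$r_{xe}(j) = \mathbb{E}\!\left[x(m)\, e(m-j)\right] = \sigma_e^{2}\, b_j \quad \text{for } 0 \le j \le Q, \text{ and } 0 \text{ otherwise}$$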

28 The autocorrelation function of a moving-average process, and the power spectrum of a moving-average process obtained directly as the Fourier transform of that autocorrelation function, are given by
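$$r_{xx}(m) = \sigma_e^{2} \sum_{k=0}^{Q-|m|} b_k\, b_{k+|m|} \quad \text{for } |m| \le Q, \text{ and } 0 \text{ otherwise}$$

$$P_{XX}(f) = \sum_{m=-Q}^{Q} r_{xx}(m)\, e^{-j 2\pi f m} = \sigma_e^{2}\left|\sum_{k=0}^{Q} b_k\, e^{-j 2\pi f k}\right|^{2} = \sigma_e^{2}\,|B(f)|^{2}$$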

29

30 Example: determine the mean and the autocorrelation of the sequence x(n) generated by the MA(2) process described by the difference equation x(n) = w(n) − 2w(n−1) + w(n−2), where w(n) is a white-noise process with variance σw².
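Using the moving-average autocorrelation result above with coefficients b_0 = 1, b_1 = −2, b_2 = 1, the mean is E[x(n)] = (1 − 2 + 1)E[w(n)] = 0, and

r_xx(0) = (1 + 4 + 1) σw² = 6σw²
r_xx(±1) = (1·(−2) + (−2)·1) σw² = −4σw²
r_xx(±2) = (1·1) σw² = σw²
r_xx(m) = 0 for |m| > 2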

31

Autoregressive Moving-Average (ARMA)

The ARMA, or pole–zero, model consists of two parts: an autoregressive (AR) part and a moving-average (MA) part.

The relationship between the ARMA parameters and the autocorrelation sequence can be obtained by multiplying both sides of the model equation by x(m−j) and taking the expectation; this relationship can be used to obtain the coefficients a_k.
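For lags j greater than the MA order q, the input terms are uncorrelated with the past outputs, so the moving-average contribution vanishes and the relation reduces to

$$r_{xx}(j) = \sum_{k=1}^{p} a_k\, r_{xx}(j-k), \qquad j > q$$

which provides the equations from which the a_k can be obtained.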

32

An autoregressive moving-average (ARMA) process has a power spectrum of the form given below, where σe² is the variance of the input of the ARMA model. This process can be generated by filtering unit-variance white noise with a filter having both poles and zeros, and the spectral estimate follows by substituting the estimated filter parameters into this form.
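$$P_{XX}(f) = \sigma_e^{2}\, \frac{\left|\sum_{k=0}^{q} b_k\, e^{-j 2\pi f k}\right|^{2}}{\left|1 - \sum_{k=1}^{p} a_k\, e^{-j 2\pi f k}\right|^{2}} = \sigma_e^{2}\, \frac{|B(f)|^{2}}{|A(f)|^{2}}$$

As an illustrative sketch (assuming NumPy and SciPy are available; the model coefficients below are arbitrary examples, not taken from the slides), such a process can be simulated by filtering white noise through a pole–zero filter, and the model spectrum evaluated and compared with a non-parametric (Welch) estimate:

```python
import numpy as np
from scipy import signal

# Example ARMA(2, 2) model (hypothetical coefficients), with the convention
# A(z) = 1 - a_1 z^-1 - a_2 z^-2 used in the spectrum expression above.
b = np.array([1.0, 0.5, 0.2])        # MA (zero) coefficients b_0..b_q
a_coeffs = np.array([1.3, -0.8])     # AR coefficients a_1..a_p
A = np.concatenate(([1.0], -a_coeffs))
sigma_e = 1.0                        # unit-variance white-noise input

# Generate the ARMA process by filtering white noise through B(z)/A(z)
rng = np.random.default_rng(1)
e = sigma_e * rng.standard_normal(4096)
x = signal.lfilter(b, A, e)

# Parametric (model) power spectrum: sigma_e^2 |B(f)|^2 / |A(f)|^2
f, H = signal.freqz(b, A, worN=1024, fs=1.0)
P_model = sigma_e ** 2 * np.abs(H) ** 2

# Non-parametric Welch estimate of the same process, for comparison
f_w, P_welch = signal.welch(x, fs=1.0, nperseg=512)
```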

ARMA model

It is a tool for understanding and predicting future values in a series.

It is usually referred to as the ARMA(p,q) model where p is the order of the autoregressive part and q is the order of the moving average part.

ARMA model

It requires fewer model parameters for the spectrum estimation.

This model is appropriate when the signal has been corrupted by noise.

Calculation of model parameters

Consider a data sequence x(n) generated by an AR model.

Let the output be corrupted by additive white noise. The z-transform of the autocorrelation of the signal is:

Relationship between autocorrelation and model parameters for ARMA(p,q) process

Matrix representation

Matrix representation for m > p+q

It may be represented as:

On minimizing, the result is:

From the AR model parameters, A(z) can be estimated by:

This yields the sequence
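In outline, a standard formulation of this step (the specific matrices and symbols here are assumptions consistent with the earlier ARMA relations, not taken from the slides): for lags m > q the autocorrelation satisfies

$$r_{xx}(m) = \sum_{k=1}^{p} a_k\, r_{xx}(m-k)$$

Stacking these equations over a range of lags m > q gives an overdetermined linear system R a ≈ r in the AR coefficient vector a, and minimizing the squared error ‖R a − r‖² gives the least-squares solution â = (RᵀR)⁻¹ Rᵀ r. Filtering the data x(n) through the estimated Â(z) then yields an approximately moving-average sequence, from which the MA part of the model can be estimated.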

41 (cont.)

In an ARMA model, if A(z) = 1 then H(z) = B(z) and the model reduces to a moving-average (MA) process of order q.

In an ARMA model, if B(z) = 1 then H(z) = 1/A(z) and the model reduces to an autoregressive (AR) process of order p.

42 Spectral Estimation by Autoregressive Moving-Average (ARMA) (Parametric)

It has the following advantages:
• Suitable for short data lengths.
• Gives better frequency resolution.
• Avoids spectral leakage.

43 High-Resolution Spectral Estimation Based on Subspace Eigen-Analysis

Eigen-analysis is used to partition the eigenvectors and eigenvalues of the autocorrelation matrix of a noisy signal into two subspaces:

(a) the signal subspace, composed of the principal eigenvectors associated with the largest eigenvalues;
(b) the noise subspace, represented by the eigenvectors associated with the smallest eigenvalues.

The decomposition of a noisy signal into a signal subspace and a noise subspace forms the basis of the eigen-analysis methods.
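A minimal sketch of this partitioning (assuming NumPy; the matrix size M, the subspace dimension, and the test signal below are illustrative choices, not from the slides):

```python
import numpy as np

def subspace_split(x, M, n_signal):
    """Partition the eigen-structure of the M x M sample autocorrelation
    matrix of x into a signal subspace (principal eigenvectors, largest
    eigenvalues) and a noise subspace (smallest eigenvalues)."""
    x = np.asarray(x, dtype=float)
    N = len(x)

    # M x M sample autocorrelation (Toeplitz) matrix
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(M)])
    R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])

    # Eigen-decomposition; eigh returns eigenvalues in ascending order
    eigvals, eigvecs = np.linalg.eigh(R)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # sort descending

    signal_subspace = eigvecs[:, :n_signal]   # principal eigenvectors
    noise_subspace = eigvecs[:, n_signal:]    # remaining eigenvectors
    return eigvals, signal_subspace, noise_subspace

# Example: two real sinusoids in white noise occupy (roughly) a
# four-dimensional signal subspace of an 8 x 8 autocorrelation matrix.
rng = np.random.default_rng(2)
n = np.arange(2048)
x = (np.sin(2 * np.pi * 0.10 * n) + np.sin(2 * np.pi * 0.23 * n)
     + 0.3 * rng.standard_normal(n.size))
vals, E_signal, E_noise = subspace_split(x, M=8, n_signal=4)
```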

44 Applications for parametric modeling include speech and music synthesis, data compression, high-resolution spectral estimation, communications, manufacturing, and simulation.

45

Thank you ^_^