Adaptive Algorithms 1

Transcript of Adaptive Algorithms 1

ADAPTIVE ALGORITHMS IN DIGITAL SIGNAL PROCESSING
OVERVIEW, THEORY AND APPLICATIONS

Dr. George O. Glentis
Associate Professor
University of Peloponnese
Department of Telecommunications
E-mail: gglentis@uop.gr

Overview

1. Introduction
2. Model-based signal processing
3. Application examples
4. System identification setup
5. The Wiener filter
6. Adaptive filtering
7. Stochastic approximation
8. The LMS
9. The RLS
10. Examples

Bibliography

1. S. Haykin, Adaptive Filter Theory, Third Edition, Prentice Hall, 1996.
2. N. Kalouptsidis, Signal Processing Systems: Theory and Design, Wiley, 1997.
3. G. O. Glentis, K. Berberidis, and S. Theodoridis, "Efficient Least Squares Adaptive Algorithms for FIR Transversal Filtering," IEEE Signal Processing Magazine, pp. 13-41, July 1999.
4. K. Parhi, VLSI Digital Signal Processing Systems: Design and Implementation, Wiley, 1999.
5. H. Sorensen, J. Chen, A Digital Signal Processing Laboratory using the TMS320C30, Prentice-Hall, 1997.
6. S. Kuo, An Implementation of Adaptive Filters with the TMS320C25 or the TMS320C30, TI, SPRA116.

Signals and Systems

A signal expresses the variation of a variable with respect to another:

$$t \to x(t), \qquad n \to x(n)$$

A system can be thought of as a transformation between signals:

$$y(t) = S_t(x(t); \epsilon(t)), \qquad y(n) = S_n(x(n); \epsilon(n))$$

[Block diagram: the input x(n) and the noise ε(n) enter the SYSTEM, which produces the output y(n).]

Signal processing systems

Compression for transmission/storage
Modulation for efficient communication
Error control coding
Channel equalization
Filtering
Encryption
Control for modifying a plant
Classification and clustering
Prediction
Identification for modeling of a plant or a signal

Signal processing system design

Basic steps:
Signal representation and modeling
System representation and modeling
Signal acquisition and synthesis
Design formulation and optimization
Efficient software/hardware implementation

Critical factors:
Enabling technologies
Application area

Classical and model-based signal processing

Signal processing main tasks:
extract useful information and discard unwanted signal components
accentuate certain signal characteristics relevant to the useful information

'Classical' signal processing: filtering, smoothing, prediction.

'Model-based' signal processing: the signal is described as the output of a system excited by a known signal, and the system is in turn modelled so that its output resembles the original signal in an optimal way.

Example 1: Filtering of a speech signal

[Figure: time waveform and spectrogram of the utterance "mary has a little lamp", together with the frequency responses (dB) of a low-pass, a band-pass, and a high-pass filter.]

[Figure: waveforms and spectrograms of the "mary has a little lamp" signal after low-pass, band-pass, and high-pass filtering.]

System Identification

[Block diagram: the input x(n) drives both the SYSTEM, whose output y(n) is corrupted by the noise ε(n), and the MODEL, with output ŷ(n); the error e(n) is the difference between the two outputs.]

Identification is the procedure of specifying the unknown model in terms of the available experimental evidence, that is, a) a set of measurements of the input and desired response signals, and b) an appropriately chosen error cost function, which is optimized with respect to the unknown model parameters.

Model-based Filtering

[Block diagram: the input x(n) is passed through the FILTER to produce y(n); a design algorithm adjusts the filter using the error e(n) between the desired response z(n) and y(n).]

Model-based filtering aims to shape an input signal so that the corresponding output tracks a desired response signal.

Model-based filtering can be viewed as a special case of system identification.

Channel Equalization

[Block diagram: source symbols I(n) pass through the channel and noise ε(n) is added to form x(n); the equalizer (FILTER) maps x(n) to y(n), the detector produces Î(n) for the sink, and a design algorithm adapts the equalizer from the error between y(n) and a training sequence T(n).]

Acoustic Echo Cancellation

[Block diagram: the far-end speaker signal x(n) drives both the echo path (producing the echo z(n)) and the echo canceller (producing the estimate ẑ(n)); the microphone picks up z(n) plus the local speech s(n) and the local noise ε(n), giving y(n), and the residual e(n) = y(n) − ẑ(n) is sent back to the far end.]

System Identification Set Up

$x(n)$: input, $y(n)$: output, $\epsilon(n)$: noise

system model: $y(n) = S(x(n); \epsilon(n))$

predictor: $\hat{y}(n) = \hat{S}(y(n); x(n) \mid c)$

prediction error: $e(n) = y(n) - \hat{y}(n)$

cost function: $V_N(c) = \sum_{n=1}^{N} Q(e(n))$

optimum estimation: $\hat{c} = \arg\min_{c} V_N(c)$

The FIR system/filter

[Block diagram: a tapped delay line holding x(n), x(n−1), x(n−2), x(n−3), weighted by the coefficients c1, ..., c4 and summed to produce ŷ(n); an adaptive algorithm adjusts the coefficients from the error e(n).]

The system model is described by the difference equation

$$y(n) = \sum_{i=1}^{M} c_i^o \, x(n-i+1) + \epsilon(n)$$

The FIR estimator for the above system is defined as

$$\hat{y}(n) = \sum_{i=1}^{M} c_i \, x(n-i+1)$$
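To make the estimator concrete, here is a minimal NumPy sketch of the FIR prediction above; the filter length and the coefficient values are arbitrary choices for the example:

```python
import numpy as np

def fir_estimate(x, c):
    """FIR estimator: y_hat(n) = sum_{i=1}^{M} c_i x(n-i+1), written 0-based."""
    M, N = len(c), len(x)
    y_hat = np.zeros(N)
    for n in range(N):
        for i in range(M):
            if n - i >= 0:              # samples before n = 0 are taken as zero
                y_hat[n] += c[i] * x[n - i]
    return y_hat

x = np.random.randn(20)
print(fir_estimate(x, c=[0.5, -0.3, 0.1]))   # example coefficients, M = 3
```

The double loop is written for clarity; the same result is `np.convolve(x, c)[:len(x)]`.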

The Wiener filter

The model

$$y(n) = \sum_{i=1}^{M} c_i^o \, x(n-i+1) + \epsilon(n)$$

$$x(n) = [\,x(n)\;\; x(n-1)\;\; \ldots\;\; x(n-M+1)\,]^T$$

$$c^o = [\,c_1^o\;\; c_2^o\;\; \ldots\;\; c_M^o\,]^T$$

$$y(n) = x^T(n)\, c^o + \epsilon(n)$$

The estimator

$$\hat{y}(n) = x^T(n)\, c$$

The estimation error

$$e(n) = y(n) - \hat{y}(n) = y(n) - x^T(n)\, c$$

The cost function

$$V(c) = E\!\left[e^2(n)\right] = E\!\left[(y(n) - \hat{y}(n))^2\right]$$

The optimum solution

$$c = \arg\min_{c} V(c) = \arg\min_{c} E\!\left[(y(n) - x^T(n)\, c)^2\right]$$

Quadratic cost function

$$V(c) = E\!\left[y^2(n)\right] + c^T R\, c - 2\, d^T c$$

$$R = E\!\left[x(n)\, x^T(n)\right], \qquad d = E\!\left[x(n)\, y(n)\right]$$

$$\nabla V(c) = 0 \;\;\to\;\; R\,c - d = 0$$

The normal equations

$$R\, c = d$$

The minimum error attained

$$E_{\min} = E\!\left[y^2(n)\right] - d^T c$$

In practice, expectation is replaced by a finite horizon time averaging, i.e.,

$$E(\cdot) \;\to\; \frac{1}{N} \sum_{n=1}^{N} (\cdot)$$
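A minimal sketch of this sample-average Wiener design, assuming a short FIR model and demo values for the true coefficients and the noise level:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 2, 100
c_true = np.array([1.0, 1.0])

# data from the FIR model y(n) = x^T(n) c_o + eps(n)
x = rng.standard_normal(N + M - 1)
X = np.array([x[n:n + M][::-1] for n in range(N)])  # each row is a regressor [x(t), x(t-1)]
y = X @ c_true + 0.1 * rng.standard_normal(N)

# time-averaged estimates of R and d, then the normal equations R c = d
R = (X.T @ X) / N
d = (X.T @ y) / N
c = np.linalg.solve(R, d)
E_min = np.mean(y**2) - d @ c
print(c, E_min)
```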

Example 2: Wiener filtering design example

The model

$$y(n) = c_1^o\, x(n) + c_2^o\, x(n-1) + \epsilon(n), \qquad c_1^o = 1, \;\; c_2^o = 1$$

Experimental conditions: $x(n) \sim N(0, 1)$, $\epsilon(n) \sim N(0, 0.0097)$, SNR = 20 dB, $N = 100$ data.

$$R = \frac{1}{N} \sum_{n=1}^{100} x(n)\, x^T(n), \qquad d = \frac{1}{N} \sum_{n=1}^{100} x(n)\, y(n)$$

$$R = \begin{bmatrix} 1.0183 & -0.023 \\ -0.023 & 1.0183 \end{bmatrix}, \qquad \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} 0.9944 \\ 0.9977 \end{bmatrix}$$

[Figure: Wiener filtering error surface V(c) over the coefficient plane (c(1), c(2)).]

$$c_1 = 0.9992, \quad c_2 = 1.0023, \quad E_{\min} = 0.0102$$
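The plotted error surface is just the quadratic form $V(c) = E[y^2(n)] + c^T R\, c - 2\, d^T c$. A sketch that evaluates it on a grid, using the R and solution quoted above; since $E[y^2(n)]$ is not listed on the slide, it is reconstructed from $E_{\min} = E[y^2(n)] - d^T c$ as an assumption:

```python
import numpy as np

R = np.array([[1.0183, -0.023],
              [-0.023,  1.0183]])
c_opt = np.array([0.9944, 0.9977])
d = R @ c_opt                        # from the normal equations R c = d
Ey2 = 0.0102 + d @ c_opt             # assumed: E[y^2] = E_min + d^T c_opt

g = np.linspace(-10, 10, 101)        # grid limits mirror the plot axes
C1, C2 = np.meshgrid(g, g)
V = (Ey2 + R[0, 0]*C1**2 + 2*R[0, 1]*C1*C2 + R[1, 1]*C2**2
     - 2*(d[0]*C1 + d[1]*C2))
print(V.min())                       # the bowl bottoms out near E_min at c_opt
```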

Example 3: Design of optimum equalizer

the transmitted data $I(n) \in \{-1, 1\}$

the channel: $f_1 = 0.3$, $f_2 = 0.8$, $f_3 = 0.3$

$$x(n) = \sum_{i=1}^{p=3} f_i\, I(n+1-i) + \epsilon(n)$$

the equalizer

$$\hat{y}(n-7) = \sum_{i=1}^{q=11} c_i\, x(n-i)$$

$$\hat{I}(n-7) = \mathrm{sign}(y(n-7))$$
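A sketch of the corresponding least-squares design: build regressors from the channel output, use the delayed symbol $I(n-7)$ as the desired response, and solve the normal equations (noise level, data length, and seed are demo assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
f = np.array([0.3, 0.8, 0.3])            # channel taps f_1..f_3
q, delay, N = 11, 7, 5000

I = rng.choice([-1.0, 1.0], size=N)      # transmitted symbols
x = np.convolve(I, f)[:N] + 0.01 * rng.standard_normal(N)

rows = range(q + 1, N)
X = np.array([x[n - q:n][::-1] for n in rows])   # [x(n-1), ..., x(n-q)]
z = np.array([I[n - delay] for n in rows])       # desired response I(n-7)

c = np.linalg.solve(X.T @ X, X.T @ z)    # least-squares normal equations
I_hat = np.sign(X @ c)                   # detector
print("symbol errors:", int(np.sum(I_hat != z)))
```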

[Figure: channel frequency response and equalizer frequency response.]

[Figure: symbol errors before and after equalization; scatter diagrams (I(n), I(n−1)) before ISI, (x(n), x(n−1)) after ISI, (y(n), y(n−1)) after equalization, and (x(n), x(n−1)) after detection.]

Adaptive Identification and filtering

Adaptive identification and signal processing refers to a particular procedure where the model estimator is updated to incorporate the newly received information.

1. We learn about the model as each new pair of measurements is received, and we update our knowledge to incorporate the newly received information.

2. In a time-varying environment the model estimator should be able to follow the variations that occur, allowing past measurements to somehow be forgotten in favor of the most recent evidence.

The rapid advances in silicon technology, especially the advent of VLSI circuits, have made possible the implementation of algorithms for adaptive system identification and signal processing at commercially acceptable costs.

Structure of an adaptive algorithm

while x(n), y(n) available
    c(n) = F( c(n-1), x(n), y(n) )
end

where c(n) is the new parameter estimate, c(n−1) is the old parameter estimate, and x(n), y(n) carry the new information.

Performance issues:
Accuracy of the obtained solution
Complexity and memory requirements
Enhanced parallelism and modularity
Stability and numerical properties
Fast convergence and tracking characteristics

Adaptive Wiener filters

A first approach...

The normal equations $R\,c = d$.

Expectation is replaced by a finite horizon time averaging, i.e.,

$$E(\cdot) \;\to\; \frac{1}{n} \sum_{i=1}^{n} (\cdot)$$

Develop a recursive estimator for R and d:

$$R(n) = \frac{1}{n} \sum_{i=1}^{n} x(i)\, x^T(i) = \frac{n-1}{n}\, R(n-1) + \frac{1}{n}\, x(n)\, x^T(n) = R(n-1) + \frac{1}{n}\left[ x(n)\, x^T(n) - R(n-1) \right]$$

A recursive Wiener algorithm

While data x(n), y(n) are available
$$R(n) = R(n-1) + \frac{1}{n}\left[ x(n)\, x^T(n) - R(n-1) \right]$$
$$d(n) = d(n-1) + \frac{1}{n}\left[ y(n)\, x(n) - d(n-1) \right]$$
Solve $R(n)\, c(n) = d(n)$
End

Performance analysis:
Accuracy of the obtained solution: YES
Complexity and memory requirements: NO
Enhanced parallelism and modularity: NO
Stability and numerical properties: YES
Fast convergence and tracking characteristics: NO

An adaptive Wiener algorithm

Replace $\frac{1}{n} \to \lambda$, with $0 < \lambda < 1$.

While data x(n), y(n) are available
$$R(n) = R(n-1) + \lambda\left[ x(n)\, x^T(n) - R(n-1) \right]$$
$$d(n) = d(n-1) + \lambda\left[ y(n)\, x(n) - d(n-1) \right]$$
Solve $R(n)\, c(n) = d(n)$
End

Performance analysis:
Accuracy of the obtained solution: YES
Complexity and memory requirements: NO
Enhanced parallelism and modularity: NO
Stability and numerical properties: YES
Fast convergence and tracking characteristics: YES
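A minimal sketch covering both versions: the gain $\gamma(n) = 1/n$ gives the recursive Wiener algorithm, a constant $\gamma(n) = \lambda$ the adaptive one (the data model, the value of λ, and the tiny ridge in the solve are demo assumptions):

```python
import numpy as np

def recursive_wiener(X, y, lam=None):
    """gamma = 1/n if lam is None (recursive), else gamma = lam (adaptive)."""
    M = X.shape[1]
    R, d = np.zeros((M, M)), np.zeros(M)
    for n, (xn, yn) in enumerate(zip(X, y), start=1):
        g = 1.0 / n if lam is None else lam
        R += g * (np.outer(xn, xn) - R)
        d += g * (yn * xn - d)
        c = np.linalg.solve(R + 1e-9 * np.eye(M), d)  # ridge keeps early R invertible
    return c

rng = np.random.default_rng(2)
x = rng.standard_normal(1001)
X = np.column_stack([x[1:], x[:-1]])       # regressors [x(n), x(n-1)]
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.standard_normal(1000)
print(recursive_wiener(X, y))              # -> close to [1, 1]
print(recursive_wiener(X, y, lam=0.01))    # forgetting version, tracks changes
```

The per-sample linear solve is exactly the complexity and memory burden that the performance table marks with NO.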

Example 3: Adaptive mean and variance estimation

The mean value

$$m_x = E(x(k)) \;\to\; \hat{m}_x(n) = \frac{1}{n} \sum_{i=1}^{n} x(i)$$

The variance

$$v_x = E\!\left((x(k) - m_x)^2\right) = E(x^2(k)) - m_x^2 \;\to\; \hat{v}_x(n) = \frac{1}{n} \sum_{i=1}^{n} x^2(i) - \hat{m}_x^2(n)$$

Adaptive estimators:

While x(n) is available
$$m_x(n) = m_x(n-1) + \lambda\,(x(n) - m_x(n-1))$$
$$p_x(n) = p_x(n-1) + \lambda\,(x^2(n) - p_x(n-1))$$
$$v_x(n) = p_x(n) - m_x^2(n)$$
End
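The same recursions in code; the forgetting factor is an arbitrary demo choice:

```python
import numpy as np

def running_stats(x, lam=0.05):
    """Exponentially weighted running estimates of mean and variance."""
    m = p = 0.0
    hist = []
    for xn in x:
        m += lam * (xn - m)            # m_x(n)
        p += lam * (xn**2 - p)         # p_x(n), running estimate of E[x^2]
        hist.append((m, p - m**2))     # v_x(n) = p_x(n) - m_x(n)^2
    return hist

rng = np.random.default_rng(3)
x = 2.0 + 3.0 * rng.standard_normal(5000)  # true mean 2, true variance 9
print(running_stats(x)[-1])                # -> roughly (2, 9)
```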

Simulink Schematics

[Figure: Simulink schematics of the adaptive estimators.]

Example 4: Wiener filtering design example

The model

$$y(n) = c_1^o\, x(n) + c_2^o\, x(n-1) + \epsilon(n), \qquad c_1^o = 1, \;\; c_2^o = 1$$

Experimental conditions: $x(n) \sim N(0, 1)$, $\epsilon(n) \sim N(0, 0.0097)$, SNR = 20 dB, $N = 100$ data.

[Figure: Wiener filtering error surface V(c) over the coefficient plane (c(1), c(2)).]

A recursive Wiener algorithm

While data x(n), y(n) are available
$$R(n) = R(n-1) + \frac{1}{n}\left[ x(n)\, x^T(n) - R(n-1) \right]$$
$$d(n) = d(n-1) + \frac{1}{n}\left[ y(n)\, x(n) - d(n-1) \right]$$
Solve $R(n)\, c(n) = d(n)$
End

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

An adaptive Wiener algorithm

While data x(n), y(n) are available
$$R(n) = R(n-1) + \lambda\left[ x(n)\, x^T(n) - R(n-1) \right]$$
$$d(n) = d(n-1) + \lambda\left[ y(n)\, x(n) - d(n-1) \right]$$
Solve $R(n)\, c(n) = d(n)$
End

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

Abrupt variations

[Figure: error surface, coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples when the system coefficients change abruptly.]

Tracking ability

$$\begin{bmatrix} c_1^o(n) \\ c_2^o(n) \end{bmatrix} = \begin{bmatrix} 10 \\ 10 \end{bmatrix} + 4 \begin{bmatrix} \sin(f n) \\ \cos(f n) \end{bmatrix}$$

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples for the time-varying coefficients.]

Iterative optimization

The problem: minimize $V(c) = E\!\left[ Q(c; \epsilon) \right]$

Deterministic iterative optimization

$$V(c) = E\!\left[ y^2(n) \right] + c^T R\, c - 2\, d^T c$$

Descent methods

$$c_i = c_{i-1} + \mu_i\, v_i, \qquad \mu_i = \arg\min_{\mu} V(c_{i-1} + \mu\, v_i)$$

The steepest descent method

$$c_i = c_{i-1} - \mu_i\, \nabla V(c_{i-1})$$

The Newton-Raphson method

$$c_i = c_{i-1} - \mu_i \left[ \nabla^2 V(c_{i-1}) \right]^{-1} \nabla V(c_{i-1})$$

Quasi-Newton methods

$$c_i = c_{i-1} - \mu_i \left[ A_i \right]^{-1} \nabla V(c_{i-1})$$
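For the quadratic cost above the gradient is $\nabla V(c) = 2(Rc - d)$, so steepest descent can be sketched directly (R, d, and the fixed step size are demo values):

```python
import numpy as np

R = np.array([[1.0, 0.2], [0.2, 1.5]])   # example autocorrelation matrix
d = np.array([1.0, 2.0])                 # example cross-correlation vector
mu = 0.1                                 # fixed step size

c = np.zeros(2)
for _ in range(200):
    c = c - mu * 2 * (R @ c - d)         # steepest descent step
print(c, np.linalg.solve(R, d))          # converges to the Wiener solution
```

Newton-Raphson would instead multiply the gradient by the inverse Hessian $[2R]^{-1}$ and reach the minimum of this quadratic cost in a single step.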

Stochastic Approximation

Iterative deterministic optimization schemes require the knowledge either of

the cost function $V(c)$
the gradient $\nabla V(c)$
the Hessian matrix $\nabla^2 V(c)$

The stochastic approximation counterpart of a deterministic optimization algorithm is obtained if the above variables are replaced by unbiased estimates, i.e., $\widehat{V}(c)$, $\widehat{\nabla V}(c)$, $\widehat{\nabla^2 V}(c)$.

Expectation Approximation: Cost function

Instantaneous (memoryless): $\hat{E}_n[\,\cdot\,] = [\,\cdot\,]_{k=n} \;\to\; e^2(n)$

Sliding window of length L: $\hat{E}_n[\,\cdot\,] = \frac{1}{L} \sum_{k=n-L+1}^{n} [\,\cdot\,] \;\to\; \frac{1}{L} \sum_{k=n-L+1}^{n} e^2(k)$

Exponential forgetting: $\hat{E}_n[\,\cdot\,] = \sum_{k=0}^{n} \lambda^{n-k}\, [\,\cdot\,] \;\to\; \sum_{k=0}^{n} \lambda^{n-k}\, e^2(k)$

The recursive stochastic approximation scheme

$$c(n) = c(n-1) - \frac{1}{2}\,\mu(n)\, W(n)\, g(n)$$

$$g(n) = \nabla_{c(n-1)}\, \hat{E}_n\!\left[ e^2(n) \right]$$

Adaptive gradient algorithms

The basic recursion

$$c(n) = c(n-1) - \frac{1}{2}\,\mu(n)\, g(n)$$

A1. Memoryless approximation of the gradient

$$\hat{V}(c) = e^2(n), \qquad g(n) = -2\, x(n)\, e(n)$$

The adaptive algorithm

$$e(n) = y(n) - x^T(n)\, c(n-1)$$
$$c(n) = c(n-1) + \mu(n)\, x(n)\, e(n)$$

The LMS algorithm: $\mu(n) = \mu$ (a constant step size)

The normalized LMS algorithm:

$$\mu(n) = \frac{\bar{\mu}}{\alpha + x^T(n)\, x(n)}$$
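A sketch of both step-size rules on the FIR identification problem (the step sizes, α, and the data model are demo assumptions):

```python
import numpy as np

def lms(X, y, mu=0.05, normalized=False, alpha=1e-6):
    """LMS (constant mu) or normalized LMS (mu / (alpha + x^T x))."""
    c = np.zeros(X.shape[1])
    for xn, yn in zip(X, y):
        e = yn - xn @ c                              # a priori error e(n)
        step = mu / (alpha + xn @ xn) if normalized else mu
        c = c + step * xn * e                        # gradient update
    return c

rng = np.random.default_rng(4)
x = rng.standard_normal(2001)
X = np.column_stack([x[1:], x[:-1]])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.standard_normal(2000)
print(lms(X, y))                           # LMS
print(lms(X, y, mu=0.5, normalized=True))  # NLMS
```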

Properties of the LMS algorithm

Let $w(n) = c(n) - c^o$. Then

$$E(w(n)) = (I - \mu R)\, E(w(n-1))$$

which converges provided that $0 < \mu < 2/\lambda_{\max}$, where $\lambda_{\max}$ is the largest eigenvalue of R.

Deterministic Interpretation

The NLMS algorithm allows for a deterministic interpretation, as the filter that minimizes the error norm

$$c(n) = \arg\min_{c} \| c - c(n-1) \|^2$$

subject to the constraint imposed by the model

$$y(n) = x^T(n)\, c$$

In this context, the NLMS algorithm is also known as the projection algorithm.

The LMS algorithm

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

The LMS algorithm - abrupt changes

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

The LMS algorithm - tracking ability

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

LMS children

The sign-error LMS

$$e(n) = y(n) - x^T(n)\, c(n-1)$$
$$c(n) = c(n-1) + \mu\, x(n)\, \mathrm{sign}(e(n))$$

The sign-data LMS

$$e(n) = y(n) - x^T(n)\, c(n-1)$$
$$c(n) = c(n-1) + \mu\, \mathrm{sign}(x(n))\, e(n)$$

The sign-sign LMS

$$e(n) = y(n) - x^T(n)\, c(n-1)$$
$$c(n) = c(n-1) + \mu\, \mathrm{sign}(x(n))\, \mathrm{sign}(e(n))$$
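The three variants differ only in which factor of the LMS update is replaced by its sign, so one sketch covers all of them (the step size is a demo assumption):

```python
import numpy as np

def sign_lms(X, y, mu=0.01, sign_data=False, sign_error=False):
    """LMS with optional sign(x(n)) and/or sign(e(n)) in the update."""
    c = np.zeros(X.shape[1])
    for xn, yn in zip(X, y):
        e = yn - xn @ c
        u = np.sign(xn) if sign_data else xn
        v = np.sign(e) if sign_error else e
        c = c + mu * u * v
    return c

rng = np.random.default_rng(5)
x = rng.standard_normal(2001)
X = np.column_stack([x[1:], x[:-1]])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.standard_normal(2000)
print(sign_lms(X, y, sign_error=True))                   # sign-error LMS
print(sign_lms(X, y, sign_data=True))                    # sign-data LMS
print(sign_lms(X, y, sign_data=True, sign_error=True))   # sign-sign LMS
```

The sign operations trade some convergence speed and accuracy for much cheaper arithmetic, which is why these variants suit fixed-point DSP hardware.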

The sign-error LMS algorithm

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

abrupt changes

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

tracking ability

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

The sign-data LMS algorithm

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

abrupt changes

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

tracking ability

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

The sign-sign LMS algorithm

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

abrupt changes

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

tracking ability

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

Example 5: Design of an adaptive equalizer

the transmitted data $I(n) \in \{-1, 1\}$

the channel (time-varying, switching between two tap settings)

$$f(n): \quad f_1 = 0.3,\; f_2 = 0.8,\; f_3 = 0.3 \quad \text{or} \quad f_1 = 0.3,\; f_2 = -0.8,\; f_3 = 0.3$$

$$x(n) = \sum_{i=1}^{p=3} f_i(n)\, I(n+1-i) + \epsilon(n)$$

the equalizer

$$\hat{y}(n-7) = \sum_{i=1}^{q=11} c_i\, x(n-i)$$

$$\hat{I}(n-7) = \mathrm{sign}(y(n-7))$$
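A sketch of the adaptive version: train the equalizer sample by sample with the NLMS update against the delayed symbol $I(n-7)$, while the channel switches between the two tap settings halfway through (the noise level and step size are demo assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
N, q, delay = 4000, 11, 7
I = rng.choice([-1.0, 1.0], size=N)

x1 = np.convolve(I, np.array([0.3, 0.8, 0.3]))[:N]
x2 = np.convolve(I, np.array([0.3, -0.8, 0.3]))[:N]
x = np.where(np.arange(N) < N // 2, x1, x2) + 0.01 * rng.standard_normal(N)

c, errors = np.zeros(q), 0
for n in range(q + 1, N):
    xn = x[n - q:n][::-1]                     # [x(n-1), ..., x(n-q)]
    yhat = xn @ c
    e = I[n - delay] - yhat                   # training error against I(n-7)
    c += (0.5 / (1e-6 + xn @ xn)) * xn * e    # NLMS update
    errors += int(np.sign(yhat) != I[n - delay])
print("symbol errors during adaptation:", errors)
```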

The NLMS adaptive equalizer

[Figure: estimation error, learning curve, and errors after equalization over 4000 samples; scatter diagrams before ISI, after ISI, after equalization, and after detection.]

The sign-sign LMS adaptive equalizer

[Figure: estimation error, learning curve, and errors after equalization over 4000 samples; scatter diagrams before ISI, after ISI, after equalization, and after detection.]

Adaptive Gauss-Newton Algorithms

The main recursion

$$c(n) = c(n-1) - \frac{1}{2}\,\mu(n)\, W(n)\, g(n)$$

The expectation approximation

$$\hat{E}_n[\,\cdot\,] = \sum_{k=0}^{n} \lambda^{n-k}\,[\,\cdot\,], \qquad 0 < \lambda \le 1$$

The gradient estimation

$$g(n) = \nabla_{c(n-1)}\, \hat{E}_n\!\left[ e^2(n) \right]$$

Use the inverse Hessian as a weighting matrix, thus forcing the correction direction to point to the minimum.

$$W(n) = \left[ \nabla^2_{c(n-1)}\, \hat{E}_n\!\left[ e^2(n) \right] \right]^{-1}$$

The exponential forgetting window RLS

Initialization: $c(-1) = 0$, $R^{-1}(-1) = \delta I$, $\delta \gg 1$.

$$w(n) = \lambda^{-1} R^{-1}(n-1)\, x(n)$$
$$\alpha(n) = 1 + w^T(n)\, x(n)$$
$$e(n) = y(n) - x^T(n)\, c(n-1)$$
$$\tilde{e}(n) = e(n) / \alpha(n)$$
$$c(n) = c(n-1) + w(n)\, \tilde{e}(n)$$
$$R^{-1}(n) = \frac{1}{\lambda}\, R^{-1}(n-1) - \frac{w(n)\, w^T(n)}{\alpha(n)}$$
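The recursion transcribes almost line for line into NumPy; the values of λ and δ are demo choices:

```python
import numpy as np

def rls(X, y, lam=0.99, delta=1e3):
    """Exponential forgetting window RLS, following the recursion above."""
    M = X.shape[1]
    c = np.zeros(M)
    P = delta * np.eye(M)                  # P = R^{-1}(-1) = delta I
    for xn, yn in zip(X, y):
        w = P @ xn / lam                   # w(n)
        a = 1.0 + w @ xn                   # alpha(n)
        e = yn - xn @ c                    # a priori error e(n)
        c = c + w * (e / a)                # c(n) = c(n-1) + w(n) e(n)/alpha(n)
        P = P / lam - np.outer(w, w) / a   # R^{-1}(n) update
    return c

rng = np.random.default_rng(7)
x = rng.standard_normal(1001)
X = np.column_stack([x[1:], x[:-1]])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.standard_normal(1000)
print(rls(X, y))                           # -> close to [1, 1]
```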

Properties of the RLS algorithm

Let $w(n) = c(n) - c^o$. Then

$$E(w(n)) = \lambda^{n}\, R\, E(w(0))$$

The mean squared estimation error $E(e^2(n))$ convergence rate is independent of the eigenvalue spread of the autocorrelation matrix.

The complexity of the RLS is $O(M^2)$.

The memory needed for the RLS is $O(M^2)$.

Deterministic Interpretation

The RLS algorithm allows for a deterministic interpretation, as the filter that minimizes the total squared error

$$V_n(c) = \sum_{k=1}^{n} \lambda^{n-k}\, e^2(k)$$

Direct optimization leads to

$$R(n)\, c(n) = d(n)$$

where

$$R(n) = \sum_{k=1}^{n} \lambda^{n-k}\, x(k)\, x^T(k), \qquad d(n) = \sum_{k=1}^{n} \lambda^{n-k}\, x(k)\, y(k)$$

The RLS algorithm

[Figure: error surface contour with the coefficient trajectory, prediction error e(n), and learning curve V(n) over 1000 samples.]

The RLS algorithm - abrupt changes

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

The RLS algorithm - tracking ability

[Figure: coefficient-update trajectory, prediction error e(n), and learning curve V(n) over 2000 samples.]

The RLS adaptive equalizer

[Figure: estimation error, learning curve, and errors after equalization over 4000 samples; scatter diagrams before ISI, after ISI, after equalization, and after detection.]

Acoustic echo cancellation

[Block diagram: the far-end signal x(n) feeds both the echo path (producing the echo z(n)) and the echo canceller (producing ẑ(n)); the microphone adds the local speech s(n) and the local noise ε(n), and the residual e(n) is sent to the far end.]

Direct signal from the far-end speaker: $x(n)$
Echo signal: $z(n) = h(n) * x(n)$
Local speech signal $s(n)$ and noise $\epsilon(n)$
Signal at the microphone: $y(n) = z(n) + s(n) + \epsilon(n)$
Training mode: when $s(n) = 0$, estimate $\hat{z}(n)$.
Operation mode: after training, send $e(n) = y(n) - \hat{z}(n) \approx s(n) + \epsilon(n)$.
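A sketch of the training mode, identifying the echo path with an NLMS canceller while the local speech is silent; the toy echo path, filter length, and step size are assumptions for the demo:

```python
import numpy as np

rng = np.random.default_rng(8)
N, M = 20000, 64
h = rng.standard_normal(M) * np.exp(-np.arange(M) / 10.0)  # toy echo path

x = rng.standard_normal(N)                                 # far-end signal
y = np.convolve(x, h)[:N] + 1e-3 * rng.standard_normal(N)  # s(n) = 0: training mode

c = np.zeros(M)
for n in range(M - 1, N):
    xn = x[n - M + 1:n + 1][::-1]           # [x(n), ..., x(n-M+1)]
    e = y[n] - xn @ c                       # residual echo e(n)
    c += (0.5 / (1e-6 + xn @ xn)) * xn * e  # NLMS update of the canceller
print("echo-path misalignment:", np.linalg.norm(c - h) / np.linalg.norm(h))
```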

Experimental Conditions

[Figure: stationary speech signal x(n), original speech signal u(n), echo-path impulse response, and the condition number of the input autocorrelation matrix versus samples.]

The LMS adaptive canceller

[Figure: MSE (dB) versus number of samples (x100).]

The RLS adaptive canceller

[Figure: MSE (dB) versus number of samples (x100).]