asymptotic approximation

Transcript of asymptotic approximation


    Probability and Random Processes

    Sanjit Kaul


    Asymptotic Approximations of the Binomial Random Variable


    Why Approximations?

    For a RV X that is Binomial(n, p) we know that

    P[X = k] = (n choose k) p^k (1 − p)^(n−k),   k = 0, 1, …, n

    Note that the (n choose k) term increases very rapidly

    We want approximations that make computation easier and result in acceptable error
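
    A minimal numerical sketch of the issue (plain Python, standard library only; n = 1000, p = 0.5, k = 500 are arbitrary choices): the binomial coefficient by itself is an astronomically large integer, so the PMF is better evaluated in log space.

        from math import comb, lgamma, log, exp

        n, p, k = 1000, 0.5, 500

        # The raw binomial coefficient is a ~300-digit integer.
        print(len(str(comb(n, k))))        # number of decimal digits in (1000 choose 500)

        # Numerically stable PMF: log (n choose k) = lgamma(n+1) - lgamma(k+1) - lgamma(n-k+1)
        def binom_pmf(k, n, p):
            log_coeff = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            return exp(log_coeff + k * log(p) + (n - k) * log(1 - p))

        print(binom_pmf(500, 1000, 0.5))   # ~0.0252

    Working in logarithms (or using the approximations that follow) avoids the huge intermediate terms.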


    The Normal Approximation (DeMoivre-Laplace Theorem) Sec 4-5 PP

    Paraphrasing from Wiki: It is believed that de Moivre's attempt to approximate the coefficients of (a+b)^n is where the normal distribution first appeared.

    Also, Gauss first showed that the error in measurements can be described by a probability law, which is the normal law of errors.


    The Normal Approximation

    We can write (see Eq 4-96 in PP)

    (n choose k) p^k q^(n−k) ≈ (1 / √(2π n p q)) exp(−(k − np)² / (2npq))

    where q = 1 − p (and n p q is large).
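
    A quick numerical look at the approximation (a sketch in plain Python; n = 100, p = 0.4 and the values of k are arbitrary choices):

        from math import comb, sqrt, pi, exp

        n, p = 100, 0.4
        q = 1 - p

        def exact(k):
            return comb(n, k) * p**k * q**(n - k)

        def normal_approx(k):
            # De Moivre-Laplace: exp(-(k - np)^2 / (2npq)) / sqrt(2*pi*npq)
            return exp(-(k - n * p) ** 2 / (2 * n * p * q)) / sqrt(2 * pi * n * p * q)

        for k in (30, 40, 50):
            print(k, exact(k), normal_approx(k))

    Near the mean np the two values agree closely; the relative error grows out in the tails.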


    Animation From Wiki

    http://en.wikipedia.org/wiki/De_Moivre%E2%80%93Laplace_theorem

    (Also, see the proof there. It uses Stirling's approximation for n! when n is large.)
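
    Since the proof leans on Stirling's approximation, n! ≈ √(2πn)·(n/e)^n, here is a small numeric sketch of how good it is (plain Python; the values of n are arbitrary):

        from math import factorial, sqrt, pi, e

        def stirling(n):
            # Stirling's approximation to n!
            return sqrt(2 * pi * n) * (n / e) ** n

        for n in (5, 10, 20, 50):
            print(n, factorial(n), stirling(n), stirling(n) / factorial(n))

    The ratio tends to 1 as n grows (it behaves like 1 − 1/(12n)).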

    The Poisson Approximation (The Law of Rare Events)

    We are interested in the following case

    Suppose I ask you to stand beside a stall and count the number of customers that arrive at the stall.

    Or I ask you to count (manually?) the number of packets that arrive at the IIITD internet gateway.


    The Poisson Approximation

    Let's divide our observation interval T into smaller intervals of length Δ

    We can come up with a model where we perform a Bernoulli trial in every Δ

    A customer/packet arrives in a given Δ with probability p = λΔ/T

    As Δ becomes smaller, the number of trials n = T/Δ increases and p decreases, that is n → ∞ and p → 0.

    Under the condition that np = λ stays fixed,

    the probability that k arrivals take place in n trials can be approximated by a Poisson distribution

    λ is the average number of arrivals in T!
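
    A simulation sketch of this slotted model (plain Python; λ = 3, n = 1000 slots and the number of repetitions are illustrative choices): flip a coin with p = λ/n in every slot and look at the distribution of the total count.

        import random
        from math import exp, factorial
        from collections import Counter

        random.seed(0)
        lam = 3.0        # average number of arrivals in T (assumed value)
        n = 1000         # number of slots of width Delta, so p = lam / n per slot
        reps = 5000

        counts = Counter()
        for _ in range(reps):
            k = sum(random.random() < lam / n for _ in range(n))   # one Bernoulli trial per slot
            counts[k] += 1

        for k in range(8):
            empirical = counts[k] / reps
            poisson = exp(-lam) * lam ** k / factorial(k)
            print(k, round(empirical, 3), round(poisson, 3))

    Even though each slot is just a Bernoulli trial, the empirical frequencies of the total count already sit close to the Poisson(λ) probabilities.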


    The Poisson Approximation

    The Poisson approximation is useful for cases where p is very small and n is very large

    We have, for our Binomial RV K_n, with p = λ/n,

    P[K_n = k] = (n choose k) (λ/n)^k (1 − λ/n)^(n−k)


    The Poisson Approximation

    We have

    (n choose k) (λ/n)^k (1 − λ/n)^(n−k) = (λ^k / k!) · [n(n−1)···(n−k+1) / n^k] · (1 − λ/n)^n · (1 − λ/n)^(−k)

    Therefore, as n → ∞ with np = λ fixed,

    P[K_n = k] → e^(−λ) λ^k / k!,   k = 0, 1, 2, …

    which is the Poisson distribution.

    The law of rare events states that for a large number of trials (n) and a small probability of occurrence of the event during a trial, the number of event occurrences follows (approximately) a Poisson distribution.
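
    A direct numeric check of the limit (a sketch; λ = 4, k = 2 and the values of n are arbitrary): hold np = λ fixed and let n grow.

        from math import comb, exp, factorial

        lam, k = 4.0, 2

        def binom_pmf(n):
            p = lam / n                                    # keep n*p = lam fixed
            return comb(n, k) * p ** k * (1 - p) ** (n - k)

        for n in (10, 100, 1000, 10000):
            print(n, round(binom_pmf(n), 6))
        print("Poisson limit:", round(exp(-lam) * lam ** k / factorial(k), 6))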


    Inequalities!


    Markov's Inequality (See 5.5.1 RN)

    An inequality on the survivor function P[X ≥ t] of a RV X

    Expectation of a function h(X) of RV X is given by

    E[h(X)] = ∫_{−∞}^{∞} h(x) f_X(x) dx

    Assume h(z) ≥ 0 for all z and that h(z) is a non-decreasing function

    For any t we have … ?


    Markov's Inequality (See 5.5.1 RN)

    For any t we have

    E[h(X)] = ∫_{−∞}^{∞} h(x) f_X(x) dx ≥ ∫_{t}^{∞} h(x) f_X(x) dx

    Therefore, since h is non-decreasing and non-negative,

    E[h(X)] ≥ h(t) ∫_{t}^{∞} f_X(x) dx = h(t) P[X ≥ t]

    We get

    P[X ≥ t] ≤ E[h(X)] / h(t)   ← Markov's Inequality
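
    A Monte Carlo sanity check of the bound P[X ≥ t] ≤ E[h(X)] / h(t) (a sketch; the Exponential(1) distribution, t = 3 and h(x) = (x⁺)² are arbitrary choices that satisfy the assumptions on h):

        import random

        random.seed(1)

        def h(x):
            # non-negative and non-decreasing: the squared positive part
            return max(x, 0.0) ** 2

        t = 3.0
        xs = [random.expovariate(1.0) for _ in range(200_000)]     # X ~ Exponential(1)

        tail = sum(x >= t for x in xs) / len(xs)                   # estimate of P[X >= t]
        bound = (sum(h(x) for x in xs) / len(xs)) / h(t)           # estimate of E[h(X)] / h(t)

        print("P[X >= 3] ~", round(tail, 4))                       # exact value is exp(-3) ~ 0.0498
        print("Markov bound ~", round(bound, 4))                   # E[X^2] / 9 = 2/9 ~ 0.2222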


    Markov's Inequality

    An example of h(x) that is non-negative and non-decreasing is h(x) = x⁺, where

    x⁺ = x for x ≥ 0 and x⁺ = 0 for x < 0

    We have, for t > 0,

    P[X ≥ t] ≤ E[X⁺] / t

    The above is also called the Simple Markov Inequality


    Markov's Inequality

    The inequality is useful to make observations about the tail of a distribution

    Let RV X be the service time at a restaurant or the time it takes to load a webpage

    Clearly, X ≥ 0 and E[X⁺] = E[X]

    Using the inequality, since X ≥ 0, we get

    P[X ≥ t] ≤ E[X] / t,   t > 0

    If the expected time is 1 sec,

    P[X ≥ 10] ≤ 1/10 = 0.1
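
    For a concrete feel of how conservative this is (a sketch; assuming, purely for illustration, that the load time happens to be Exponential with mean 1 sec):

        from math import exp

        mean_load = 1.0                          # E[X] = 1 second
        t = 10.0

        markov_bound = mean_load / t             # <= 0.1, valid for any non-negative X with this mean
        exponential_tail = exp(-t / mean_load)   # actual P[X >= 10] if X ~ Exponential(mean 1): ~4.5e-5

        print(markov_bound, exponential_tail)

    The bound holds for every non-negative distribution with the same mean, which is exactly why it can sit far above the actual tail of any particular one.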


    Chebyshev's Inequality

    We will use the Markov Inequality

    Define Y = (X − E[X])²

    We have (using the Markov Inequality and since Y ≥ 0)

    P[Y ≥ t²] ≤ E[Y] / t² = Var[X] / t²

    Thus

    P[|X − E[X]| ≥ t] ≤ Var[X] / t²


    Chebyshev's Inequality

    We have

    P[|X − E[X]| ≥ t] ≤ Var[X] / t²

    Let t = c σ_X, where σ_X is the standard deviation of X

    Substituting,

    P[|X − E[X]| ≥ c σ_X] ≤ 1/c²

    For any RV X, the probability that the RV is more than c standard deviations away from the mean is at most 1/c²


    Chebyshev's Inequality

    We have

    P[|X − E[X]| ≥ c σ_X] ≤ 1/c²

    If Var[X] = 0, then the probability that X = E[X] is 1.

    The bound may not be a tight upper bound

    For X that is Gaussian, the tail probability can be computed exactly; for example, P[|X − E[X]| ≥ 3σ_X] ≈ 0.0027


    Chebyshev's Inequality

    The Chebyshev bound is 1/c², which for c = 3 is 1/9 ≈ 0.11

    Clearly a very loose bound in our example!

    Note that the calculation of the bound does not require knowledge of the distribution
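
    A numeric sketch of how loose the bound is for a Gaussian (plain Python; the exact two-sided tail P[|X − E[X]| ≥ c σ] equals erfc(c/√2)):

        from math import erfc, sqrt

        # Exact Gaussian tail vs. the distribution-free Chebyshev bound 1/c^2
        for c in (1, 2, 3, 4):
            gaussian_tail = erfc(c / sqrt(2))    # P[|X - E[X]| >= c*sigma] for Gaussian X
            chebyshev_bound = 1 / c ** 2
            print(c, round(gaussian_tail, 6), chebyshev_bound)

    At c = 3 the true Gaussian tail is about 0.0027 while Chebyshev only promises 0.11; that gap is the price of not assuming any distribution.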


    Lyapunov Inequality (Eq 5-92 PP)

    Define RV Y as

    Y = a |X|^((n−1)/2) + |X|^((n+1)/2),   a real, with β_n = E[|X|^n]

    We know that E[Y²] ≥ 0

    That is

    a² β_{n−1} + 2a β_n + β_{n+1} ≥ 0   for every real a


    Lyapunov Inequality (5-92 PP)

    The quadratic (in a)

    g(a) = a² β_{n−1} + 2a β_n + β_{n+1}

    is non-negative for every real a. Its discriminant must be non-positive

    We have

    (2 β_n)² − 4 β_{n−1} β_{n+1} ≤ 0

    That is

    β_n² ≤ β_{n−1} β_{n+1},   n = 1, 2, 3, …

    We get

    β_1² ≤ β_0 β_2,   β_2² ≤ β_1 β_3,   β_3² ≤ β_2 β_4,   …

    Via Wikipedia

    http://en.wikipedia.org/wiki/Quadratic_equation

    Lyapunov Inequality (5-92 PP)

    Note that β_0 = E[|X|⁰] = 1

    Substituting for β_0 in the first inequality, β_1 in the second inequality and so on, we get

    β_1 ≤ β_2^(1/2),   β_2^(1/2) ≤ β_3^(1/3),   β_3^(1/3) ≤ β_4^(1/4),   …

    Thus we have

    β_1 ≤ β_2^(1/2) ≤ β_3^(1/3) ≤ … ≤ β_n^(1/n) ≤ …
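
    A small Monte Carlo check that β_n^(1/n) = E[|X|^n]^(1/n) is non-decreasing in n (a sketch; the Exponential(1) distribution is an arbitrary choice, for which β_n = n!):

        import random

        random.seed(2)
        xs = [random.expovariate(1.0) for _ in range(200_000)]     # X ~ Exponential(1)

        for n in range(1, 6):
            beta_n = sum(abs(x) ** n for x in xs) / len(xs)        # estimate of E[|X|^n]
            print(n, round(beta_n ** (1 / n), 3))                  # roughly 1, 1.41, 1.82, 2.21, 2.61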


    Jensen's Inequality

    For any convex function f(·) and RV X

    f(E[X]) ≤ E[f(X)]
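
    A quick numeric check of f(E[X]) ≤ E[f(X)] (a sketch; f(x) = x² and X uniform on (0, 2) are arbitrary choices):

        import random

        random.seed(3)
        f = lambda x: x * x                                        # a convex function
        xs = [random.uniform(0.0, 2.0) for _ in range(100_000)]

        mean_x = sum(xs) / len(xs)                                 # E[X], close to 1
        mean_fx = sum(f(x) for x in xs) / len(xs)                  # E[X^2], close to 4/3

        print(round(f(mean_x), 3), round(mean_fx, 3))              # f(E[X]) <= E[f(X)]

    With f(x) = x² this is just the statement Var[X] = E[X²] − (E[X])² ≥ 0.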


    Calculating Moments


    Prove the identity

    ∫_{−∞}^{∞} exp(−α x²) dx = √(π/α),   α > 0

    Think Gaussian pdf

    Use the fact that the area under it is 1
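
    A crude numeric check of the identity (a sketch; α = 0.7 and the step size are arbitrary; a Riemann sum over a wide enough range stands in for the integral):

        from math import exp, sqrt, pi

        alpha, dx = 0.7, 0.001
        # Riemann sum of exp(-alpha * x^2) over [-20, 20]; the tails beyond are negligible
        integral = sum(exp(-alpha * (i * dx) ** 2) for i in range(-20_000, 20_001)) * dx

        print(round(integral, 5), round(sqrt(pi / alpha), 5))      # both ~ 2.11849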


    Calculating Moments of a Gaussian RV

    Let X be Gaussian with zero mean and variance σ²

    E[X^n] = 0 for odd n

    For even n: n = 2k, k = 0, 1, 2, …

    Start with

    ∫_{−∞}^{∞} exp(−α x²) dx = √(π/α)

    Differentiate k times with respect to α. We get

    ∫_{−∞}^{∞} x^(2k) exp(−α x²) dx = (1·3·5···(2k−1) / 2^k) √π α^(−(2k+1)/2)

    Set α = 1/(2σ²) and we can calculate the even moments E[X^(2k)] for k = 0, 1, 2, …

    E[X^(2k)] = 1·3·5···(2k−1) σ^(2k)
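
    A Monte Carlo check of the resulting even moments, E[X^(2k)] = 1·3···(2k−1) σ^(2k) (a sketch; σ = 2 and the sample size are arbitrary):

        import random

        random.seed(4)
        sigma = 2.0
        xs = [random.gauss(0.0, sigma) for _ in range(500_000)]    # zero-mean Gaussian samples

        def odd_double_factorial(k):
            # 1 * 3 * 5 * ... * (2k - 1); the empty product is 1
            out = 1
            for i in range(1, 2 * k, 2):
                out *= i
            return out

        for k in (1, 2, 3):
            mc = sum(x ** (2 * k) for x in xs) / len(xs)           # Monte Carlo estimate of E[X^(2k)]
            print(k, round(mc, 1), odd_double_factorial(k) * sigma ** (2 * k))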


    Moments of a Gaussian RV

    What about E[|X|^n]?

    Same as E[X^n] for even n

    For odd n = 2k+1, k = 0, 1, 2, …

    E[|X|^(2k+1)] = (2 / (σ√(2π))) ∫_{0}^{∞} x^(2k+1) exp(−x²/(2σ²)) dx

    Let y = x²/(2σ²); we get

    E[|X|^(2k+1)] = (2 / (σ√(2π))) σ² (2σ²)^k ∫_{0}^{∞} y^k e^(−y) dy


    Moments of a Gaussian RV

    Finally, note that, for integer k,

    ∫_{0}^{∞} y^k e^(−y) dy = k!

    so that E[|X|^(2k+1)] = 2^k k! σ^(2k+1) √(2/π)

    HW: A Rayleigh density is given by

    f_X(x) = (x/σ²) exp(−x²/(2σ²)),   x ≥ 0

    Find E[X^n]. Start with the definition and use the E[|Y|^n] we calculated for a Gaussian Y. You must get

    E[X^n] = 1·3···n σ^n √(π/2) for odd n, and E[X^n] = 2^k k! σ^(2k) for even n = 2k

    Problem 5-23 from PP

    X has a Rayleigh density

    Let Y = b + c X²

    Show that σ_Y² = 4 c² σ⁴
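
    A Monte Carlo sketch of the claim (b = 2, c = 0.5, σ = 1.2 are arbitrary; Rayleigh samples again via two independent N(0, σ²) draws):

        import random
        from math import sqrt

        random.seed(6)
        b, c, sigma = 2.0, 0.5, 1.2

        ys = []
        for _ in range(500_000):
            x = sqrt(random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2)   # Rayleigh(sigma)
            ys.append(b + c * x * x)

        mean_y = sum(ys) / len(ys)
        var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)

        print(round(var_y, 4), 4 * c ** 2 * sigma ** 4)            # both ~ 2.0736

    The point: X² for a Rayleigh X is exponential with mean 2σ² and variance 4σ⁴, which is where the 4c²σ⁴ comes from.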


    HW: Problem 4-21 PP

    The probability of heads of a random coin is a RV P, uniform in the interval (0,1). (a) Find P[0.3


    Some More Interesting Problems

    Problem 5-23 PP

    Problem 5-14 PP

    Problem 5-12 PP

    Problem 5-2 PP

    Problems 5-51 and 5-52 PP



    Summary

    For a RV X

    Moments of RV X

    When conditioning on an event E