
ST2334: SOME NOTES ON THE GEOMETRIC AND NEGATIVE BINOMIAL DISTRIBUTIONS AND MOMENT GENERATING FUNCTIONS

    Geometric Distribution

Consider a sequence of independent and identical Bernoulli trials with success probability p ∈ (0, 1). Define the random variable X as the number of trials until we see a success, where we include the successful trial (for example, if I flip a coin which shows heads (a success) with probability p, then X is the number of flips to obtain a head, including the successful flip). We know that:

X ∈ 𝒳 = {1, 2, . . .},

that is, X is a positive integer. Now, what is the probability that X takes the value x? Well, suppose X = 1; then we must have:

P(X = 1) = p.

This is because we have only one Bernoulli trial, and it is a success. Suppose now X = 2; then:

P(X = 2) = (1 - p)p.

This is because we have two Bernoulli trials, and the first is a failure and the second a success. Similarly,

P(X = 3) = (1 - p)^2 p.

Thus, it follows that:

P(X = x) = f(x) = (1 - p)^{x-1} p,   x ∈ 𝒳 = {1, 2, . . .}.

Any random variable with the above PMF is said to have a geometric distribution, and we write X ∼ Ge(p).
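As a quick numerical sanity check of this PMF (a minimal sketch, assuming SciPy is available; scipy.stats.geom uses the same number-of-trials convention, with support starting at 1):

    # Compare f(x) = (1 - p)^(x - 1) * p with scipy.stats.geom for a few values of x.
    from scipy.stats import geom

    p = 0.3
    for x in range(1, 6):
        manual = (1 - p) ** (x - 1) * p
        print(x, manual, geom.pmf(x, p))  # the two values agree for every x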

The distribution function, for x ∈ 𝒳, is:

F(x) = \sum_{y=1}^{x} (1 - p)^{y-1} p = p \sum_{y=1}^{x} (1 - p)^{y-1} = p \cdot \frac{1 - (1 - p)^x}{p} = 1 - (1 - p)^x.
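The closed form for F(x) can be checked in the same way, both against a direct sum of the PMF and against the library CDF (again a sketch, assuming SciPy):

    # Check F(x) = 1 - (1 - p)^x against a direct sum and scipy.stats.geom.cdf.
    from scipy.stats import geom

    p = 0.3
    for x in range(1, 6):
        closed = 1 - (1 - p) ** x
        direct = sum((1 - p) ** (y - 1) * p for y in range(1, x + 1))
        print(x, closed, direct, geom.cdf(x, p))  # all three values agree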


Calculating the expectation is somewhat tedious:

E[X] = \sum_{x=1}^{\infty} x (1 - p)^{x-1} p
     = -p \sum_{x=1}^{\infty} \frac{d}{dp} (1 - p)^x
     = -p \frac{d}{dp} \sum_{x=1}^{\infty} (1 - p)^x
     = -p \frac{d}{dp} \left( \frac{1 - p}{p} \right)
     = \frac{1}{p},

where the second equality uses x(1 - p)^{x-1} = -\frac{d}{dp}(1 - p)^x, and the last follows since \frac{d}{dp}\left(\frac{1}{p} - 1\right) = -\frac{1}{p^2}.
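The value 1/p can be confirmed numerically by truncating the defining sum at a large cut-off (a sketch, assuming NumPy; the cut-off is arbitrary but the tail terms are negligible):

    # Approximate E[X] = sum of x * f(x) by a truncated series and compare to 1/p.
    import numpy as np

    p = 0.3
    xs = np.arange(1, 5000)                 # tail terms beyond this are negligible
    approx = np.sum(xs * (1 - p) ** (xs - 1) * p)
    print(approx, 1 / p)                    # both are approximately 3.3333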

Perhaps an easier way (and this is the case for E[X^q], q ≥ 1) is via the MGF:

M(t) = E[e^{Xt}]
     = \sum_{x=1}^{\infty} e^{xt} (1 - p)^{x-1} p
     = \frac{p}{1 - p} \sum_{x=1}^{\infty} [(1 - p)e^t]^x
     = \frac{p}{1 - p} \cdot \frac{(1 - p)e^t}{1 - (1 - p)e^t}
     = \frac{pe^t}{1 - (1 - p)e^t},

where we have assumed that (1 - p)e^t < 1, that is, t < -log(1 - p), so that the geometric series converges.
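This closed form can be checked against a truncated version of the defining sum, with t chosen inside the region of convergence (a sketch, assuming NumPy):

    # Compare p e^t / (1 - (1 - p) e^t) with a truncated sum of e^(xt) f(x).
    import numpy as np

    p, t = 0.3, 0.1                         # note t < -log(1 - p), roughly 0.357
    ratio = (1 - p) * np.exp(t)             # common ratio of the series, < 1
    xs = np.arange(1, 5000)
    direct = p * np.exp(t) * np.sum(ratio ** (xs - 1))
    closed = p * np.exp(t) / (1 - ratio)
    print(direct, closed)                   # the two values agree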


Negative Binomial Distribution

Now let X be the number of trials until we obtain r successes (with r ≥ 1 a fixed integer), again including the successful trials. When r = 1 we recover the previous solution (that is, the Geometric distribution). Let us suppose that r = 2. Then we have, for x ∈ {2, 3, . . .},

P(X = x) = (x - 1)(1 - p)^{x-2} p^2.

The logic is as follows: we have to have two successes and hence x - 2 failures, which accounts for the (1 - p)^{x-2} p^2 part; then we know that the last successful trial is at x, so the first successful trial must lie in one of the first x - 1 trials; this is why we multiply by x - 1 (remember the trials are identical). Now suppose that r = 3. Then we have, for x ∈ {3, 4, . . .},

P(X = x) = \binom{x - 1}{2} (1 - p)^{x-3} p^3.

The logic is as follows: we have to have three successes and hence x - 3 failures, which accounts for the (1 - p)^{x-3} p^3 part; then we know that the last successful trial is at x, so the first and second successful trials must lie in one of the first x - 1 trials; this is why we multiply by \binom{x - 1}{2}, which is the number of ways of picking two out of x - 1 when the order does not matter. Then, following this reasoning, we have for any r ≥ 1,

P(X = x) = f(x) = \binom{x - 1}{r - 1} (1 - p)^{x-r} p^r,   x ∈ 𝒳 = {r, r + 1, . . .}.

A random variable with the above PMF is said to have a negative binomial distribution with parameters r and p, denoted X ∼ Ne(r, p).
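This PMF can also be checked against a library implementation (a sketch, assuming SciPy; note that scipy.stats.nbinom counts the number of failures before the r-th success, so our X, which counts trials, corresponds to the failure count shifted by r):

    # Compare f(x) = C(x - 1, r - 1) (1 - p)^(x - r) p^r with scipy.stats.nbinom.
    from math import comb
    from scipy.stats import nbinom

    r, p = 3, 0.3
    for x in range(r, r + 5):
        manual = comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r
        print(x, manual, nbinom.pmf(x - r, r, p))  # scipy counts failures, hence x - r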

The distribution function cannot typically be written down in terms of an analytic expression (i.e. without a summation), and computing the expectation from the definition is a very tedious and tricky exercise. We focus on calculating the moment generating function:

M(t) = E[e^{Xt}]
     = \sum_{x=r}^{\infty} \binom{x - 1}{r - 1} (1 - p)^{x-r} p^r e^{xt}
     = \sum_{x=r}^{\infty} \binom{x - 1}{r - 1} ((1 - p)e^t)^{x-r} (pe^t)^r
     = \left( \frac{pe^t}{1 - (1 - p)e^t} \right)^r \sum_{x=r}^{\infty} \binom{x - 1}{r - 1} ((1 - p)e^t)^{x-r} (1 - (1 - p)e^t)^r
     = \left( \frac{pe^t}{1 - (1 - p)e^t} \right)^r,

if t < -log(1 - p); the remaining sum equals 1 because it is the PMF of a Ne(r, 1 - (1 - p)e^t) random variable summed over its whole support.
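As with the geometric case, this closed form can be checked against a truncated version of the defining sum (a sketch, assuming NumPy):

    # Compare the closed-form negative binomial MGF with a truncated defining sum.
    import numpy as np
    from math import comb

    r, p, t = 3, 0.3, 0.1                   # note t < -log(1 - p), roughly 0.357
    ratio = (1 - p) * np.exp(t)
    closed = (p * np.exp(t) / (1 - ratio)) ** r
    direct = sum(comb(x - 1, r - 1) * ratio ** (x - r) * (p * np.exp(t)) ** r
                 for x in range(r, 5000))
    print(direct, closed)                   # the two values agree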


    Moment Generating Functions

We note:

M^{(1)}(t) = \frac{d}{dt} M(t) = \frac{d}{dt} E[e^{Xt}] = E\left[ \frac{d}{dt} e^{Xt} \right] = E[Xe^{Xt}].

Thus M^{(1)}(0) = E[X]. We are assuming that it is legitimate to swap the order of summation and differentiation, which holds for all cases in this course. Using a similar approach, one can show that M^{(2)}(0) = E[X^2].
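For the geometric MGF derived above, this differentiation can be carried out symbolically (a minimal sketch, assuming SymPy is available):

    # Differentiate the geometric MGF at t = 0 to recover E[X] and E[X^2].
    import sympy as sp

    t, p = sp.symbols('t p', positive=True)
    M = p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))   # geometric MGF
    EX = sp.simplify(sp.diff(M, t).subs(t, 0))      # M^(1)(0) = E[X]
    EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # M^(2)(0) = E[X^2]
    print(EX, EX2)                                  # 1/p and (2 - p)/p**2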