Probability & Statistics


Transcript of Probability & Statistics

Page 1: Probability & Statistics

Probability & Statistics

Rubayet Karim, Assistant Professor

Dept. of Industrial & Production Engineering, Jessore University of Science & Technology


Page 3: Probability & Statistics

Probability is a branch of mathematics that deals with calculating the likelihood of a given event's occurrence, which is expressed as a number between 0 and 1.

P(E) = n(E) / N

Where:
P(E): probability of occurrence of event E
n(E): number of outcomes in E
N: total number of outcomes
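As a quick illustration of the classical definition, a minimal Python sketch (using the die-rolling experiment discussed later in these slides) that counts favourable and total outcomes:

# Classical probability: P(E) = n(E) / N for equally likely outcomes.
sample_space = [1, 2, 3, 4, 5, 6]                      # rolling one fair die
event = [x for x in sample_space if x % 2 == 0]        # E = "an even number"
print(len(event) / len(sample_space))                  # 0.5, i.e. 3 outcomes out of 6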

Page 4: Probability & Statistics

P(E) = n / N

Where:
P(E): probability of occurrence of event E
n: number of outcomes in E (number of times E occurs)
N: total number of trials
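The relative-frequency idea can be checked by simulation; a small Python sketch with a hypothetical 100,000 trials, whose estimate approaches the classical value of 0.5 as the number of trials grows:

import random

# Empirical probability: P(E) ≈ n / N, where n counts the trials in which E occurred.
N = 100_000
n = sum(1 for _ in range(N) if random.randint(1, 6) % 2 == 0)
print(n / N)    # close to 0.5 for a fair die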

Page 8: Probability & Statistics

Example. Experiment: tossing 4 coins.

Trial: tossing each coin.

We can consider the act of tossing each coin as a trial and thus say that there are 4 trials in the experiment of tossing 4 coins.

In probability theory, an elementary event (also called an atomic event or simple event) is an event which contains only a single outcome in the sample space.
Example: die rolling
• The possible outcomes of this experiment are 1, 2, 3, 4, 5 and 6. Each is a single outcome, i.e. an elementary event.
• When the objective is to get an even number from this experiment, the possible outcomes are 2, 4 and 6. This is not a single outcome, so it is not an elementary event.
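A sketch that makes the distinction concrete for the 4-coin experiment above (the specific events are illustrative choices, not from the slides): each single outcome such as ('H', 'H', 'H', 'H') is an elementary event, whereas "exactly two heads" collects several outcomes and is therefore not elementary.

from itertools import product

# Sample space of tossing 4 coins: 2**4 = 16 equally likely outcomes.
sample_space = list(product("HT", repeat=4))
print(len(sample_space))                              # 16

elementary = [("H", "H", "H", "H")]                   # a single outcome -> elementary event
compound = [o for o in sample_space if o.count("H") == 2]
print(len(compound))                                  # 6 outcomes -> not an elementary event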

Page 15: Probability & Statistics

For the empty set { } (the impossible event): P(∅) = 0

Page 16: Probability & Statistics

For independent events: P(X | Y) = P(X) and P(Y | X) = P(Y)

Complement rule: P(A) + P(A') = 1

Types of Probability

There are four types:

Marginal probability P(X)
• The probability of X occurring

Page 17: Probability & Statistics

Union probability P(X ∪ Y)
• The probability of X or Y occurring

Joint probability P(X ∩ Y)
• The probability of X and Y occurring

Conditional probability P(X | Y)
• The probability of X occurring given that Y has occurred

General Law of Addition: P(X ∪ Y) = P(X) + P(Y) – P(X ∩ Y)
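A small numeric sketch of the four types together with the addition law; the probabilities 0.40, 0.50 and 0.20 are made-up values for illustration only:

# Hypothetical marginal and joint probabilities for two events X and Y.
p_x, p_y = 0.40, 0.50                 # marginal probabilities P(X), P(Y)
p_x_and_y = 0.20                      # joint probability P(X ∩ Y)

p_x_or_y = p_x + p_y - p_x_and_y      # union, by the general law of addition
p_x_given_y = p_x_and_y / p_y         # conditional probability P(X | Y)

print(p_x_or_y)                       # 0.7
print(p_x_given_y)                    # 0.4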

Page 19: Probability & Statistics

Special Law of Addition (for mutually exclusive events): P(X ∪ Y) = P(X) + P(Y)

Page 20: Probability & Statistics

• P(T ∪ C) = P(T) + P(C) = …

General Law of Multiplication: P(X ∩ Y) = P(X) P(Y | X) = P(Y) P(X | Y)
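A worked instance of the multiplication law (a standard card-drawing example, not one taken from the slides): the probability of drawing two aces in a row without replacement.

# P(ace on 1st and 2nd draw) = P(ace 1st) * P(ace 2nd | ace 1st)
p_ace1 = 4 / 52                       # first card is an ace
p_ace2_given_ace1 = 3 / 51            # second is an ace, given the first one was
print(p_ace1 * p_ace2_given_ace1)     # 1/221 ≈ 0.0045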

Page 21: Probability & Statistics

• P(S) = …, P(M) = …
• P(S | M) = 0.2
• P(S ∩ M) = P(M) P(S | M) = …

For independent events: P(X) = P(X | Y) and P(Y) = P(Y | X), so P(X ∩ Y) = P(X) P(Y)
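A quick numeric check of the independence conditions, using two independent fair coin flips as a made-up example:

# X = "first flip is heads", Y = "second flip is heads".
p_x, p_y = 0.5, 0.5
p_x_and_y = 0.25                      # joint probability of two heads

print(p_x_and_y == p_x * p_y)         # True: P(X ∩ Y) = P(X) P(Y)
print(p_x_and_y / p_y == p_x)         # True: P(X | Y) = P(X)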

Page 22: Probability & Statistics

P(X | Y) = P(X ∩ Y) / P(Y) = P(Y | X) P(X) / P(Y)

Page 25: Probability & Statistics

Bayes' theorem (general form), for mutually exclusive and exhaustive events X1, X2, …, Xn:

P(Xi | Y) = P(Y | Xi) P(Xi) / [ P(Y | X1) P(X1) + P(Y | X2) P(X2) + … + P(Y | Xn) P(Xn) ]
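A minimal Python sketch of this general form, assuming the partition is supplied as lists of priors P(Xi) and likelihoods P(Y | Xi); the function name and the three-event numbers are illustrative:

def posterior(priors, likelihoods, i):
    # P(Xi | Y) for a partition X1..Xn, given P(Xj) and P(Y | Xj) for every j.
    evidence = sum(p * l for p, l in zip(priors, likelihoods))   # P(Y) by total probability
    return likelihoods[i] * priors[i] / evidence

print(posterior([0.5, 0.3, 0.2], [0.1, 0.4, 0.7], 0))            # P(X1 | Y) ≈ 0.161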

Page 26: Probability & Statistics

Bayes' rule
• Events are mutually exclusive (i.e. they conflict with each other)
• Together they must form a sample space
• Most of the time we use reverse-time-order probability, i.e. P(cause | effect)
• When a conditional probability is stated in reverse time order, it is called a posterior probability
• P(Y | X): time order (cause → effect)
• P(X | Y): reverse time order (effect → cause)

Page 27: Probability & Statistics

P(X1 | Y) = P(Y | X1) P(X1) / [ P(Y | X1) P(X1) + P(Y | X2) P(X2) + P(Y | X3) P(X3) ]

Page 29: Probability & Statistics

Example: Marie is getting married tomorrow, at an outdoor ceremony in the desert. In recent years, it has rained only 5 days each year. Unfortunately, the weatherman has predicted rain for tomorrow. When it actually rains, the weatherman correctly forecasts rain 90% of the time. When it doesn't rain, he incorrectly forecasts rain 10% of the time. What is the probability that it will rain on the day of Marie's wedding?

Solution: The sample space is defined by two mutually exclusive events: it rains or it does not rain. Additionally, a third event occurs when the weatherman predicts rain. Notation for these events appears below.
Event A1: It rains on Marie's wedding.
Event A2: It does not rain on Marie's wedding.
Event B: The weatherman predicts rain.

Page 30: Probability & Statistics

In terms of probabilities, we know the following:
P(A1) = 5/365 = 0.0136985 [It rains 5 days out of the year.]
P(A2) = 360/365 = 0.9863014 [It does not rain 360 days out of the year.]
P(B | A1) = 0.9 [When it rains, the weatherman predicts rain 90% of the time.]
P(B | A2) = 0.1 [When it does not rain, the weatherman predicts rain 10% of the time.]

We want to know P(A1 | B), the probability it will rain on the day of Marie's wedding, given a forecast for rain by the weatherman. The answer can be determined from Bayes' theorem, as shown below.
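The worked calculation on the next slide did not survive as text, so here is a quick Python check of the Bayes step using the numbers above:

p_a1, p_a2 = 5 / 365, 360 / 365                  # priors: rain / no rain
p_b_given_a1, p_b_given_a2 = 0.9, 0.1            # forecast accuracy from the example

p_a1_given_b = (p_b_given_a1 * p_a1) / (p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2)
print(p_a1_given_b)                              # ≈ 0.111, about an 11% chance of rain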

Page 31: Probability & Statistics

Note the somewhat unintuitive result. Even when the weatherman predicts rain, it rains only about 11% of the time. Despite the weatherman's gloomy prediction, there is a good chance that Marie will not get rained on at her wedding.

Page 41: Probability & Statistics

c = constant

Page 58: Probability & Statistics

P(X ≥ 12.5) = … = 0.1666

P(X < 12.5) = 1 – P(X ≥ 12.5) = 1 – 0.1666 = 0.8333

Page 70: Probability & Statistics

Application

The exponential distribution occurs naturally when describing the lengths of the inter-arrival times in a homogeneous Poisson process.
• Queuing theory: the service times of agents in a system (e.g. how long it takes for a bank teller to serve a customer) are often modeled as exponentially distributed variables.
• Reliability theory: because of the memoryless property of this distribution, it is well suited to model the constant-hazard-rate portion of the bathtub curve.
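A simulation sketch of the memoryless property mentioned above; the rate and thresholds are arbitrary illustrative choices. For an exponential service time, P(X > s + t | X > s) = P(X > t).

import math
import random

rate = 0.5                                       # hypothetical service rate
s, t = 2.0, 3.0
samples = [random.expovariate(rate) for _ in range(200_000)]

survivors = [x for x in samples if x > s]        # condition on X > s
p_cond = sum(1 for x in survivors if x > s + t) / len(survivors)

print(p_cond)                                    # ≈ P(X > t)
print(math.exp(-rate * t))                       # exact value, 0.2231...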

Page 83: Probability & Statistics

In the following, μ'r(x) denotes the r-th raw moment of X, i.e. E[X^r].

1st moment of X: Mean, E[X] = μ'1(x)

1st & 2nd moments of X: Variance, E[X^2] – (E[X])^2 = μ'2(x) – [μ'1(x)]^2

1st, 2nd & 3rd moments of X: Skewness (the third central moment), E[X^3] – 3 E[X] E[X^2] + 2 (E[X])^3 = μ'3(x) – 3 μ'1(x) μ'2(x) + 2 [μ'1(x)]^3

1st, 2nd, 3rd & 4th moments of X: Kurtosis (the fourth central moment), E[X^4] – 4 E[X] E[X^3] + 6 (E[X])^2 E[X^2] – 3 (E[X])^4 = μ'4(x) – 4 μ'1(x) μ'3(x) + 6 [μ'1(x)]^2 μ'2(x) – 3 [μ'1(x)]^4
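These formulas can be evaluated directly from raw sample moments; a sketch on a small made-up data set, with the variance cross-checked against Python's statistics module:

import math
import statistics

data = [2, 3, 3, 5, 7, 10]                       # hypothetical observations

def m(r):
    # r-th raw sample moment, the sample analogue of E[X^r]
    return sum(x ** r for x in data) / len(data)

mean = m(1)
variance = m(2) - m(1) ** 2
third_central = m(3) - 3 * m(1) * m(2) + 2 * m(1) ** 3
fourth_central = m(4) - 4 * m(1) * m(3) + 6 * m(1) ** 2 * m(2) - 3 * m(1) ** 4

print(mean, variance, third_central, fourth_central)
print(math.isclose(variance, statistics.pvariance(data)))   # True: matches the library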


Page 84: Probability & Statistics

Example: The value of a piece of factory equipment after three years of use is 100(0.5)^X, where X is a random variable having moment generating function M_X(t) for t < … . Calculate the expected value of this piece of equipment after three years of use.

Solution: Let the value be Y = 100(0.5)^X.

So the expected value is
E[Y] = E[100(0.5)^X] = 100 E[(0.5)^X] = 100 E[e^(X ln 0.5)] = 100 M_X(ln 0.5) = 100 × … = 41.9060
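The explicit form of M_X(t) was not captured in this transcript. Purely for illustration, the sketch below assumes M_X(t) = 1/(1 – 2t) for t < 1/2 (i.e. X exponential with mean 2), one MGF that is consistent with the stated answer of 41.9060; treat that distributional choice as an assumption, not as the slide's given.

import math
import random

def mgf(t):
    # Assumed for illustration only: M_X(t) = 1 / (1 - 2t), valid for t < 1/2.
    return 1.0 / (1.0 - 2.0 * t)

print(100 * mgf(math.log(0.5)))                  # ≈ 41.906

# Monte Carlo cross-check of E[100 * 0.5**X] under the assumed mean-2 exponential.
samples = [random.expovariate(0.5) for _ in range(200_000)]
print(100 * sum(0.5 ** x for x in samples) / len(samples))   # also ≈ 41.9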

Page 85: Probability & Statistics

THE END