Page 1:

Independence and Dependence

Krishna V. Palem
Kenneth and Audrey Kennedy Professor of Computing
Department of Computer Science, Rice University

Page 2:

Content
- Solution to previous class problems
- Conditional Probability
- Monty Hall Problem
- Notion of distribution


Page 3:

Solutions to snakes and ladders


Using the model of snakes and ladders, calculate the following values:
- The probability of reaching square 4 from square 1. Ans: <TO BE ADDED>
- The probability of reaching square 5 OR square 6 from square 2. Ans: <TO BE ADDED>
- The probability of reaching square 3 AND square 5 from square 1. Ans: <TO BE ADDED>

Page 4:

Content
- Solution to previous class problems
- Conditional Probability
- Monty Hall Problem
- In-class Exercise 1
- Notion of distribution


Page 5:

Conditional Probability

Consider the same situation: an experiment whose outcome can be specified in terms of two different event spaces.

- First event space: Event A, Event B, Event C, with probabilities pA, pB, pC
- Second event space: Event 1, Event 2, Event 3, with probabilities p1, p2, p3

The refined probability of Event A when information about Event 1 is given is written as P(Event A | Event 1), read as "probability of Event A given Event 1".

Page 6:

The die example

A die is rolled. Through the knowledge of the Oracle, you know that the outcome is an even number.

What is the probability that the outcome is 2?

Because only even numbers are now possible, the event space of the die is reduced to Event 2, Event 4, Event 6.

As all these events are equally likely, the probability that Event 2 occurs is 1/3.


Page 7:

Conditional Probability

Thus conditional probability can be defined as follows.

If event A is dependent on another event B, then the probability of event A given knowledge about event B is

P(Event A | Event B) = P(Event A and Event B occurring) / P(Event B occurring)

For the die problem:

P(Die rolled a 2 | Die rolled an even number) = P(Die rolled 2 and Die rolled even) / P(Die rolled even) = (1/6) / (1/2) = 1/3!!

Note: P(Die rolled 2 and Die rolled even) is NOT equal to (1/6)*(1/2).
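To see this numerically, here is a minimal Python sketch (not part of the slides) that enumerates the six equally likely faces of a fair die and applies P(A | B) = P(A and B) / P(B) to the example above.

```python
# Minimal sketch (not from the slides): P(die rolled 2 | die rolled even)
# computed by enumerating the six equally likely faces of a fair die.
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)                                   # probability of each face

p_even = sum(p for x in outcomes if x % 2 == 0)      # P(B) = 1/2
p_two_and_even = sum(p for x in outcomes
                     if x == 2 and x % 2 == 0)       # P(A and B) = 1/6

print(p_two_and_even / p_even)                       # 1/3
```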

Page 8:

Intuition for conditional probability

Let us try to find an equation for conditional probability.

For example, let "Event A" and "Event 1" occur simultaneously. "Event A and Event 1 occurred simultaneously" is the same as "Event 1 occurred" and "Event A occurring given Event 1 occurred" (or vice versa).

P(Event A and Event 1 occurred) = P(Event 1 occurred) * P(Event A occurred | Event 1 occurred)

P(Event A occurred | Event 1 occurred) = P(Event A and Event 1 occurred) / P(Event 1 occurred)

Page 9:

Intuition for conditional probability

For example, for the dice game:

P(Die rolled 2 and die rolled even) = P(Die rolled 2) * P(Die rolled even | Die rolled 2)

P(Die rolled even | Die rolled 2) = 1, so P(Die rolled 2 and die rolled even) = P(Die rolled 2) = 1/6!


Page 10:

Exercise 6

Consider the following experiment. There are two players involved in the game:
- The first player (Player A) rolls a pair of dice.
- The second player (Player B) has to guess the two outcomes.
- Player A informs Player B of the sum.
- Player B guesses again.

Calculate the probability of a correct guess in both cases using conditional probability:
(i) Before knowing the sum
(ii) After knowing the sum


Page 11:

Data

Collect the data in the following form:

Roll | Guess before suggestion | Guess after suggestion | Correct outcome
1    |                         |                        |
2    |                         |                        |
3    |                         |                        |
…    |                         |                        |

Consolidate the data from the above table in the following form:

Case               | Total number of guesses | Correct guesses | Probability of correct guess*
Without suggestion |                         |                 |
With suggestion    |                         |                 |

* Use the frequentist definition
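If you want to sanity-check your table afterwards, here is a small Python simulation sketch (my own illustration, not part of the slides); the trial count and the rule that Player B guesses uniformly at random are assumptions made only for the example.

```python
# Minimal sketch (not from the slides): estimating the frequentist probabilities
# of a correct guess with and without Player A's suggestion about the sum.
import random

def estimate(trials=10_000):
    correct_without, correct_with = 0, 0
    for _ in range(trials):
        roll = (random.randint(1, 6), random.randint(1, 6))
        # Without the suggestion: guess uniformly among all 36 pairs.
        guess = (random.randint(1, 6), random.randint(1, 6))
        correct_without += (guess == roll)
        # With the suggestion: guess uniformly among the pairs with that sum.
        s = roll[0] + roll[1]
        candidates = [(a, s - a) for a in range(1, 7) if 1 <= s - a <= 6]
        correct_with += (random.choice(candidates) == roll)
    return correct_without / trials, correct_with / trials

print(estimate())   # first estimate is near 1/36; the second is clearly larger
```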

Page 12:

Using conditional probability

Calculate the probability of a correct guess in both cases using conditional probability:
(i) Before knowing the sum
(ii) After knowing the sum

Recall the formula:

P(Event A | Event B) = P(Event A and Event B occurring simultaneously) / P(Event B occurring)

Page 13:


First, let us try to solve the question in the conventional way, using the total number of events and the number of favorable events. Player A rolls a pair of dice and Player B guesses the numbers on them. Suppose Player A rolls the dice and sees (1,5).

Case 1: Probability that Player B guesses (1,5) without any additional information

Define the favorable event as Event 1: Player B guesses (1,5), which represents seeing a 1 on D1 and a 5 on D2.

Total number of events = 6*6 = 36
Number of favorable events = 1

Therefore, P(Event 1) = 1/36

Case 2: Probability that Player B guesses (1,5) when Player A says that the sum of the numbers is 6

The total number of events is less than 36 because of the knowledge of the sum: the event space is reduced to { (1,5), (2,4), (3,3), (4,2), (5,1) }.

P(Event 1 | Player A informs that the sum is 6) = 1/5

Page 14:


Now let us use conditional probability to arrive at the same answer.

Using the same example of sum = 6, let us say that Player B guessed (1,5). Let
- Event 1: Player B guesses (1,5) correctly
- Event 2: Player A notices that the sum is 6 and informs B

P(Event 1) = 1/36 = P(Event 1 AND Event 2)
(Event 1 already implies that the sum is 6, so Event 1 AND Event 2 is just Event 1.)

For sum = 6, the total number of events is 36 and the favorable events are (1,5), (2,4), (3,3), (4,2), (5,1), so
P(Event 2) = P(the two dice having sum 6) = 5/36

P(Event 1 | Event 2) = P(Event 1 and Event 2) / P(Event 2) = (1/36) / (5/36) = 1/5
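As a quick check, here is a minimal Python sketch (not part of the slides) that enumerates all 36 rolls of the pair of dice and recovers the same three numbers.

```python
# Minimal sketch (not from the slides): verifying 1/36, 5/36 and 1/5 by
# enumerating all 36 equally likely rolls (D1, D2) of a pair of fair dice.
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(rolls))                                   # 1/36 per roll

p_event1 = sum(p for r in rolls if r == (1, 5))               # 1/36
p_event2 = sum(p for r in rolls if r[0] + r[1] == 6)          # 5/36
p_both = sum(p for r in rolls if r == (1, 5) and r[0] + r[1] == 6)

print(p_event1, p_event2, p_both / p_event2)                  # 1/36 5/36 1/5
```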

Page 15:

Take Home Exercise 7

Solve the game of Player A and Player B for all cases. Player A rolls a pair of dice and Player B guesses the numbers on them; only Player A can see the outcome of the dice.
- Without any information
- With Player A telling the sum of the two numbers on the dice

Solve this using the following two methods:
- Frequentist approach, by listing all cases
- Conditional probability approach

Calculate the probability of a correct guess in both cases:
(i) Before knowing the sum
(ii) After knowing the sum


Page 16:

Independence and Conditional Probability


We introduced conditional probability to explain the magnitude of dependence of one random variable upon another.

What is the conditional probability of a random variable X given another random variable Y, if X and Y are independent?

Let us see…

If X and Y are independent, then the outcome of Y should not have any effect on the outcome of X.

Therefore, given the information about Y, the probability of X will not be affected.

Therefore, P(X=x | Y=y) = P(X=x)

Page 17:

Independence of two random experiments


Two random experiments can also be shown to be independent in another way.

Consider two random experiments with the following event spaces:
- Random variable X: Event A, Event B, Event C
- Random variable Y: Event 1, Event 2, Event 3

Page 18:


Recall that while discussing the method of intersection of events, we mentioned that for the rule to apply the events should be independent.

The method of intersection of events stated that:

"The probability of two independent events occurring simultaneously is equal to the product of the probabilities of the individual events."

But the most important condition for that to be true is that the two events should be independent.

Therefore, another way of checking independence of two experiments is:

Random variables X and Y are independent if and only if, for every x and y,

P(X=x, Y=y) = P(X=x) * P(Y=y)
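To make the check concrete, here is a small Python sketch (my own illustration, not from the slides); the function name is_independent and the two-fair-coin numbers are made up for the example. The same check can be applied to the table in the exercise that follows.

```python
# Minimal sketch (not from the slides): checking the product rule
# P(X=x and Y=y) == P(X=x) * P(Y=y) for every pair of values.
from fractions import Fraction

def is_independent(p_x, p_y, p_joint):
    """p_x, p_y: marginals as dicts; p_joint: dict keyed by (x, y)."""
    return all(p_joint.get((x, y), Fraction(0)) == px * py
               for x, px in p_x.items()
               for y, py in p_y.items())

# Hypothetical example: two fair coins whose joint outcomes are all 1/4.
p_x = {"H": Fraction(1, 2), "T": Fraction(1, 2)}
p_y = {"H": Fraction(1, 2), "T": Fraction(1, 2)}
p_joint = {(a, b): Fraction(1, 4) for a in "HT" for b in "HT"}

print(is_independent(p_x, p_y, p_joint))   # True
```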

Page 19:

Exercise 7


Experiment with coin CX:

Event | X | Probability
Head  | 1 | 0.5
Tail  | 0 | 0.5

Experiment with coin CY:

Event | Y | Probability
Head  | 1 | 0.5
Tail  | 0 | 0.5

Joint experiment with coins CX and CY:

Event       | Probability
Head – Head | 0.25
Head – Tail | 0.3
Tail – Head | 0.3
Tail – Tail | 0.15

Here all probabilities are computed after infinitely many trials. Can you check if the two random variables are independent using the formulation we just discussed?

Page 20:

Content
- Solution to previous class problems
- Conditional Probability
- Monty Hall Problem
- Notion of distribution


Page 21:

Exercise 8

The Monty Hall problem is a probability puzzle based on the American television game show Let's Make a Deal.

Suppose you're on a game show, and you're given the choice of three doors: behind one door is a car; behind the others, goats.

You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat.

He then says to you, "Do you want to pick door No. 2?"

Is it to your advantage to switch your choice?


Page 22:

Developing the intuition

Do you think the host opening the door is independent of your choice of the door?
Ans: NO!

Hint:
- If you choose a door with a goat, the host MUST open the other door with the goat.
- If you choose a door with a car, the host can pick either of the 2 doors with a goat.

So the host opening the door is DEPENDENT on your choice of the door!!

Now try to solve the problem!!


Page 23:

Intuition Continued

Imagine doing this experiment 99 times. Suppose you are asked to choose a door.

No. of times you will choose a door with a goat = (99*2)/3 = 66 times.
No. of times you will choose a door with a car = (99*1)/3 = 33 times.

Case 1: You don't switch

No. of times you win the car after the host opens a door with a goat = No. of times you chose a door with a car in the first place = 33 times.

Probability of winning without switching = 33/99 = 1/3.


Page 24:

Intuition Continued

Case 2: You switch

Important intuition:
- Case 2.1: If you choose a door with a goat, the host MUST open the other door with the goat. As a result, the third unopened door MUST have the car. Clearly, switching = winning the car.
- Case 2.2: If you choose a door with a car, clearly switching does not help at all!

Therefore, the number of times you will win if you switch = number of times you choose a door with a goat in the first place = 66 times!!! (2 x 33 times)

Result: Switching doubles your chance of winning!!
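If you would like to see this frequency argument play out, here is a small Python simulation sketch (not part of the slides); the trial count and the rule the host uses when you happened to pick the car are assumptions made for the example.

```python
# Minimal sketch (not from the slides): simulating the Monty Hall game to
# compare the stay and switch strategies.
import random

def play(switch, trials=99_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)                 # door hiding the car
        choice = random.randrange(3)              # contestant's first pick
        # Host opens a goat door that is not the contestant's pick.
        opened = next(d for d in range(3) if d != choice and d != car)
        if switch:                                # move to the remaining closed door
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

print("stay:  ", play(switch=False))              # close to 1/3
print("switch:", play(switch=True))               # close to 2/3
```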


Page 25:

Exercise 9

The game:
- You will be given three cups.
- There will be a marble under one of them; the other two will be empty.
- Divide yourselves into groups of 2. One of the two players will be the host for the first 10 rounds; then swap roles.
- First play 10 rounds each without changing your choice after the first cup is opened.
- Then play another 10 rounds, changing your choice.
- Compare the probabilities in both cases.

Page 26:

Data

Game | First choice | First cup opened | Changed choice (if any) | Second cup opened | Correct cup
1    |              |                  |                         |                   |
2    |              |                  |                         |                   |
3    |              |                  |                         |                   |

From the above data, calculate the probability of each case:
(i) No change in choice
(ii) Choice changed

Page 27:

Observations

What did you observe? Was the case where you changed your choice better or worse?

Can you mathematically explain the correct answer to the above question and also show by how much?

Hint: Use conditional probabilities.


Page 28:

Content
- Solution to previous class problems
- Conditional Probability
- Monty Hall Problem
- Notion of distribution


Page 29:

The notion of a distribution

A distribution is a function defined on the random variable that gives the probability of the random variable taking a particular value.

X | p(X)
1 | 1/6
2 | 1/6
3 | 1/6
4 | 1/6
5 | 1/6
6 | 1/6

Why do we need a distribution?

Compactness: let us take a look at the history of numbers. It clearly shows that, time and again, number representations have evolved into more and more compact forms, from tally marks to decimal representation.

Similarly, a distribution is a compact way to represent a random variable and all its possible outcomes.

Page 30:

The notion of a distribution

Let us look at the following dice game problem. For a fair die, the probability of seeing a 1, a 2, all the way up to a 6 is 1/6.

X | p(X)
1 | 1/6
2 | 1/6
3 | 1/6
4 | 1/6
5 | 1/6
6 | 1/6

- A compact representation of this would be an equation.
- P(X=i) = 1/6 for all i in {1,2,3,4,5,6} is a distribution.
- Such a distribution is called the "uniform distribution".
- If there is a fair die which has n faces, then P(X=i) = 1/n for all i in {1,2,...,n}.
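The compactness point can also be made in code; here is a tiny Python sketch (not from the slides) in which the whole table above collapses into a single rule parameterized by n.

```python
# Minimal sketch (not from the slides): the uniform distribution of a fair
# n-faced die written as one rule, P(X = i) = 1/n, instead of a table.
from fractions import Fraction

def uniform_die(n):
    return lambda i: Fraction(1, n) if 1 <= i <= n else Fraction(0)

p = uniform_die(6)
print([p(i) for i in range(1, 7)])    # six entries, each equal to 1/6
```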

Page 31:

The notion of a distribution

Suppose you throw a die until you see the number 1, after which you stop. Here we want to find the probability of each possible value of the number of times you throw the die.

X                | p(X)
1                | 1/6
Any other number | 5/6

P(just one throw) = P(getting a 1 on the first throw) = 1/6

P(just two throws) = P(getting a number other than 1 on the first throw) * P(getting a 1 on the second) = (5/6)*(1/6)

P(just N throws) = P(getting a number other than 1 on the first N-1 throws) * P(getting a 1 on the Nth throw) = (5/6)^(N-1) * (1/6)

This is a "geometric distribution".
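Here is a minimal Python sketch (not part of the slides) of this geometric distribution, with a quick numerical check that the probabilities add up towards 1.

```python
# Minimal sketch (not from the slides): P(first 1 appears on throw N) for a
# fair die, i.e. the geometric distribution (5/6)^(N-1) * (1/6).
from fractions import Fraction

def p_first_one_on_throw(n):
    return Fraction(5, 6) ** (n - 1) * Fraction(1, 6)

print(p_first_one_on_throw(1))   # 1/6
print(p_first_one_on_throw(2))   # 5/36
print(float(sum(p_first_one_on_throw(n) for n in range(1, 200))))  # near 1.0
```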

Page 32:

END
