Data Analysis – Workshop Decision Making and Risk Spring 2006 Partha Krishnamurthy.

Page 1

Data Analysis – Workshop

Decision Making and Risk
Spring 2006
Partha Krishnamurthy

Page 2

Outline

Introduction to Decision Analysis
- Review of decision trees
- Terminology / orientation

Introduction to DA software

Workshop (hands-on)
- Basic decision tree
- Sensitivity analysis
- Incorporating costs and utilities

Page 3

Basic Decision Tree

[Tree diagram: a decision node ("Decision to be made") branches into Choice 1 and Choice 2. Each choice leads to a chance node with two outcomes; each outcome carries a probability (p1–p4) and a value or utility (Value_1_or_U1 … Value_4_or_U4).]

Nodes: decision node, chance node, terminal node
Branches: connect nodes
Outcomes: labels, probabilities (fixed or variable), values/utilities

Page 4

Data Analysis Conventions

[Same tree diagram as before, annotated with directional terms: the decision-node end of the tree is upstream/proximal; the outcome end is distal/downstream.]

What is the relevance of this distinction?

Page 5

Analysis – Basic Principle

[Tree diagram: the same decision tree extended one level. Each first-level outcome j (probability pj, for j = 1…4) leads to a chance node with second-level outcomes: Outcome j,1 (probability pj1, utility Uj1) and Outcome j,2 (probability pj2, utility Uj2).]

Evaluation of trees typically proceeds from the terminal nodes to the decision nodes, i.e., upstream.

At each chance node, the expected value is calculated in the usual fashion: EV_j = Σ_{i=1..n} p_ij × U_ij.

The expected value serves as the “Utility” at the chance node as the analysis proceeds upstream.

This process is called average-out/fold-back.
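The average-out/fold-back procedure above can be sketched in plain Python. This is a minimal recursive rollback with illustrative node-type and field names, not the DATA software's implementation:

```python
# Minimal average-out/fold-back sketch. The node representation
# (dicts with "type", "value", "branches", "choices") is illustrative.

def rollback(node):
    """Return the expected value (utility) of a (sub)tree."""
    kind = node["type"]
    if kind == "terminal":
        return node["value"]                      # payoff / utility U
    if kind == "chance":
        # EV_j = sum_i p_ij * U_ij, where U_ij may itself be a
        # rolled-back subtree (this is the "average-out" step)
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # at a decision node, take the branch with the best expected
        # value (this is the "fold-back" step)
        return max(rollback(child) for child in node["choices"])
    raise ValueError(f"unknown node type: {kind}")

# A small two-choice tree: chance node (0.3*10 + 0.7*20 = 17) vs a sure 15.
tree = {"type": "decision", "choices": [
    {"type": "chance", "branches": [
        (0.3, {"type": "terminal", "value": 10}),
        (0.7, {"type": "terminal", "value": 20}),
    ]},
    {"type": "terminal", "value": 15},
]}
print(rollback(tree))  # expected value of the best choice: 17
```

Because evaluation recurses to the terminal nodes before combining values, the computation naturally proceeds from the terminal nodes upstream to the decision node, exactly as described above.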

Page 6

The Use of Variables

Both outcomes (payoffs) and probabilities can be specified as raw numbers or as variables.

Specifying them as variables facilitates sensitivity analysis.

Also, you can specify correlations among them.

Let us look at a simple decision tree.

Page 7

Market Intervention Example

Problem: declining sales.

Possibilities: the product is out of sync with the market, or nothing is wrong (seasonality; things will get better).

Interventions: launch a promotion, or do nothing.

Data:

Outcomes: $22.5m if the product succeeds, with or without the promotion; $12.5m if it fails without the promotion; $10m if it fails with the promotion.

Probabilities: p(out_of_sync) = 0.3; p(promo_effective | out_of_sync) = 0.86; p(failure | out_of_sync, no promo) = 0.90.

Page 8

Market Intervention Tree

[Tree diagram, decision node "Launch Promotion or Not":

Launch Promotion
  Product out of sync (0.3)
    Promotion succeeds (0.86): 22.5
    Promotion fails (0.14): 10
  Product not out of sync (0.7): 22.5

Do nothing
  Product out of sync (0.3)
    Product succeeds (0.10): 22.5
    Product fails (0.90): 12.5
  Product not out of sync (0.7): 22.5]

Page 9

Market Intervention Example

[Rolled-back tree (values rounded to whole dollars; P = path probability):

Launch Promotion: EV = $22
  Product out of sync (0.300): EV = $21
    Promotion succeeds (0.860): $23; P = 0.258
    Promotion fails (0.140): $10; P = 0.042
  Product not out of sync (0.700): $23; P = 0.700

Do nothing: EV = $20
  Product out of sync (0.300): EV = $14
    Product succeeds (0.100): $23
    Product fails (0.900): $13
  Product not out of sync (0.700): $23

Best choice at "Launch Promotion or Not": Launch Promotion, $22]
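The rolled-back numbers can be reproduced directly with plain arithmetic, independent of the DATA software:

```python
# Expected values for the market intervention tree (payoffs in $m).
p_out_of_sync = 0.3

# Launch Promotion: if out of sync, the promotion succeeds with p = 0.86
ev_promo_given_oos = 0.86 * 22.5 + 0.14 * 10.0               # 20.75, shown as $21
ev_promo = p_out_of_sync * ev_promo_given_oos + 0.7 * 22.5   # 21.975, shown as $22

# Do nothing: if out of sync, the product succeeds with only p = 0.10
ev_nothing_given_oos = 0.10 * 22.5 + 0.90 * 12.5             # 13.5, shown as $14
ev_nothing = p_out_of_sync * ev_nothing_given_oos + 0.7 * 22.5  # 19.8, shown as $20

best = max(("Launch Promotion", ev_promo), ("Do nothing", ev_nothing),
           key=lambda t: t[1])
print(f"{best[0]}: EV = {best[1]:.3f}")  # Launch Promotion: EV = 21.975
```

The small gap between $22 and $20 already hints that the recommendation may be sensitive to the assumed probabilities, which motivates the sensitivity analysis below.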

Page 10

Remainder of the Session

We will do the following with the castle decision problem: structure the tree, then perform sensitivity analysis. All of us walk through the same decision problem, step by step.

Page 11

Reading Break

Take 10 minutes and read the first three pages of the castle decision problem.

Page 12

Application Break

Take 10 minutes to answer the questions in the hand-out.

Page 13

Structuring the Tree in DATA

In this segment, we are going to follow the steps in the handout (pages 4 through 8).

Page 14

Sensitivity Analysis

What if our assumptions about the probabilities and/or payoffs are different? How would the decision change?

Conceptually, what does sensitivity analysis help us accomplish?

Page 15

Mechanics

Specify how many variables you want to analyze.

Specify the range to be analyzed, and the number of intervals.

DATA computes expected values at the interval points only, and linearly interpolates the expected values in between.

More intervals give a smoother curve, but also demand significantly more computation.
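A one-way sensitivity analysis over p(out_of_sync) from the market intervention example can be sketched as follows (plain Python; the variable chosen, the range, and the interval count are illustrative):

```python
# One-way sensitivity analysis: vary p(out_of_sync) over a range and
# recompute the expected value of each strategy at each interval point.

def ev_launch_promotion(p_oos):
    # rollback of the "Launch Promotion" branch for a given p(out_of_sync)
    return p_oos * (0.86 * 22.5 + 0.14 * 10.0) + (1 - p_oos) * 22.5

def ev_do_nothing(p_oos):
    # rollback of the "Do nothing" branch for a given p(out_of_sync)
    return p_oos * (0.10 * 22.5 + 0.90 * 12.5) + (1 - p_oos) * 22.5

low, high, intervals = 0.0, 1.0, 10
for i in range(intervals + 1):
    p = low + (high - low) * i / intervals
    a, b = ev_launch_promotion(p), ev_do_nothing(p)
    winner = "Launch Promotion" if a >= b else "Do nothing"
    print(f"p={p:.1f}  promo={a:.2f}  nothing={b:.2f}  -> {winner}")
```

In this particular example the promotion's EV falls much more slowly than doing nothing's as p(out_of_sync) rises, so the decision never flips; sensitivity analysis is what makes such dominance (or the lack of it) visible.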

Page 16

Performing Sensitivity Analysis

In this segment, we will go through the steps in the handout, session 2.

Page 17

Insights from Sensitivity Analysis

Can you generate some insights about the castle decision by performing different types of sensitivity analyses?

Page 18

Modeling Costs Separately

Previously, we modeled payoff for each outcome state as a single net revenue.

It is more reasonable to think of the payoffs as having two components: a revenue component and a cost component.

Your decision may be sensitive to your revenue assumptions as well as cost assumptions.

Modeling your decision as a cost-benefit tree allows you to gauge the importance of revenues and cost assumptions separately.

Page 19

Decision Context

Refer to the Castle product introduction decision.

The payoff for each outcome state has two components: a revenue component and a cost component. We modeled only the revenue component.

Our goal is to assess what happens to the decision if the cost of introduction (now and later) and the development cost if the product fails in 6 months are modeled explicitly.

Page 20

Modeling Strategy

First, change the calculation method to “Benefit-Cost”.

Second, specify the costs of introduction now and later, and the cost of development if the product fails in 6 months.

Initially set all costs to zero and recover the model's expected value at the root node (it should be the same as before).

Then use sensitivity analysis to find the impact of these three cost variables on the decision.
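The zero-cost check in the strategy above can be sketched as net value = benefit − cost at each terminal node: with all costs set to zero, the rollback must reproduce the revenue-only expected value. The numbers below reuse the market intervention example for illustration; the castle problem's actual figures are in the handout:

```python
# Benefit-cost rollback check: with all costs set to zero, the
# expected value must equal the revenue-only model's value.

def ev_chance(branches, cost=0.0):
    """branches: list of (probability, benefit); for simplicity, the same
    cost is charged on every branch of this chance node."""
    return sum(p * (benefit - cost) for p, benefit in branches)

# Market example numbers: (p, benefit) pairs at the Launch Promotion node.
branches = [(0.3, 20.75), (0.7, 22.5)]

# Zero-cost run recovers the revenue-only EV of 21.975.
assert abs(ev_chance(branches, cost=0.0) - 21.975) < 1e-6

# A cost shifts the EV down by its expected amount: 21.975 - 2 = 19.975.
print(ev_chance(branches, cost=2.0))
```

Separating the two components this way is what lets the later sensitivity analysis vary a cost assumption while holding the revenue assumptions fixed, and vice versa.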

Page 21

Application Break

Go back to the castle decision tree (if you have saved the tree, good; if not, let me know and I will give you the file).

Follow the steps in cost-benefit modeling from the hand-out, session 3.

Page 22

Generalized Multi-Attribute Models

What if every outcome state had payoffs that are not necessarily structured like costs and benefits? Notice that in the benefit-cost model the two components are traded off with a positive and a negative coefficient: equally important and opposite in effect.

This is where multi-attribute models come into play.

You can tell the DA program how to combine multiple payoffs, and how to evaluate them.

Page 23

Warrant for Multi-Attribute Models

Each outcome has more than one attribute, for example: revenue, market share, strategic fit, profit.

Decisions have to tackle multiple attributes at the same time.

Page 24

Market Segmentation Decision - Example

Segments under consideration:

Cash Cow: 10 on revenue, 5 on market share growth, 3 on strategic fit, 6 on profitability.

Star of the Future (Dog for now): 3 on revenue, 7 on market share growth, 8 on strategic fit, 2 on profitability.

Multi-segment: 5 on revenue, 4 on market share growth, 7 on strategic fit, 6 on profitability.

Page 25

Modeling Strategy

Create three branches off of the decision node, one for each segment.

Define each choice as a terminal node, and enter the four payoffs for each choice.

Change the model to generalized multi-attribute, and tell DATA how to combine the attributes.

Set the importance of each attribute: specify each attribute's importance coefficient as a variable, and set each value to, say, 0.25 (equally weighted).

Then perform sensitivity analyses to see how shifting the decision criteria changes the decision.
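With equal weights of 0.25, an additive combination of the four attributes works out as follows. This is a plain weighted-sum sketch of the segment comparison; the generalized multi-attribute model in DATA can also combine attributes in other ways:

```python
# Additive multi-attribute scores for the three segments.
# Attribute order: revenue, market share growth, strategic fit, profitability.
segments = {
    "Cash Cow":           [10, 5, 3, 6],
    "Star of the Future": [3, 7, 8, 2],
    "Multi-segment":      [5, 4, 7, 6],
}
weights = [0.25, 0.25, 0.25, 0.25]   # equally weighted, as in the slide

scores = {name: sum(w * a for w, a in zip(weights, attrs))
          for name, attrs in segments.items()}
best = max(scores, key=scores.get)
print(scores)  # {'Cash Cow': 6.0, 'Star of the Future': 5.0, 'Multi-segment': 5.5}
print(best)    # Cash Cow
```

Under equal weights Cash Cow wins, but shifting the importance coefficients (e.g., weighting strategic fit more heavily) can flip the best choice, which is exactly what the sensitivity analyses over the weight variables explore.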