
Selecting Better Portfolios Based on Uncertainty-Adjusted Performance Estimates
Ahti Salo, Juuso Liesiö and Eeva Vilkkumaa
Department of Mathematics and Systems Analysis, Aalto University School of Science and Technology, P.O. Box 11000, Aalto, FINLAND

Slide 2: Characteristics of project portfolio selection
- Large number of proposals: typically dozens or even hundreds of proposals
- Only a fraction can be selected with the available resources; resources other than money may matter as well (e.g., critical competences)
- Value may be measured with regard to several criteria: international collaboration, innovativeness, feasibility of plans
- Reliable information about value is hard to obtain: different experts may give different ratings, and the accuracy of evaluation information may vary from one project to the next

Slide 3: Logic behind the optimizer's curse
- Projects offer different amounts of value (e.g., NPV)
- Estimates about projects' values are inherently uncertain
- Yet decisions must be based on these uncertain estimates
- In reality, projects whose values have been overestimated have a higher chance of getting selected
- Thus the decision maker should expect to be disappointed with the performance of the selected portfolio

Slide 4: Example: choose 5 out of 12 projects

Slide 5: Value of information and optimality in DA
- Optimizer's curse: skepticism and postdecision surprise in decision analysis (Smith and Winkler, 2006): one alternative is chosen out of many; values and errors are normally distributed; positively correlated errors aggravate the curse
- Value of information in project portfolio selection (Keisler, 2004): for some selection rules the selected portfolio has a much higher value than for others, so it pays off to devote attention to the design of the selection process

Slides 6-7: Emphasis in the Priority-Setting Process (Salo, A. & J. Liesiö,
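The disappointment described on slide 3 is easy to check with a small Monte Carlo simulation. The sketch below plays out the slide 4 setting of choosing 5 out of 12 projects by highest estimate; the unit-variance values and errors are illustrative assumptions, not figures from the slides:

```python
import numpy as np

rng = np.random.default_rng(42)
n, k, runs = 12, 5, 20_000

gap = []  # estimated minus realized value of the selected portfolio
for _ in range(runs):
    v = rng.normal(0.0, 1.0, n)        # true project values (assumed N(0,1))
    est = v + rng.normal(0.0, 1.0, n)  # unbiased but noisy estimates
    chosen = np.argsort(est)[-k:]      # select the k highest estimates
    gap.append(est[chosen].sum() - v[chosen].sum())

print(f"mean disappointment: {np.mean(gap):.2f}")
```

Although every individual estimate is unbiased, the average gap is clearly positive: conditioning on being selected makes the estimation errors of the chosen projects positive on average.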
A Case Study in Participatory Priority-Setting for a Scandinavian Research Programme, International Journal of Information Technology and Decision Making 5/1 (2006))

Slide 8: Management questions
- High expectations may not necessarily be met; e.g., biotechnological research in Finland has not led to the emergence of a strong industrial sector
- Management questions relevant to funding agencies: Should projects with higher evaluation uncertainties be selected together with projects with lower evaluation uncertainties? Should the level of uncertainty be explicitly accounted for in project selection decisions?

Slide 9: What if evaluation uncertainties vary across projects?
- Projects whose value has been overestimated are more likely to be selected
- When the competition is strong, it is likely that more selections will be made from projects with high evaluation errors, so these projects become overrepresented
- Thus one should pay attention not only to the estimates but also to the uncertainty of the estimates
- How can such uncertainties be systematically accounted for?

Slide 10: Example: choose 5 projects out of 12
[Table: for each project, the true value v_i, the estimate V_i = v_i + e_i, the portfolio with the maximum estimates, and the optimal solution; projects with low and high evaluation uncertainties (columns E[V], Std[V], Std[e]) are marked separately. The selected portfolio's estimated value is 87.56, but its actual value is 70.19, against 73.53 for the optimal portfolio.]

Slide 11: Selection process
- Select k out of n projects with the aim of maximizing the sum of the projects' true values v_i, i = 1, ..., n
- The values v_i are generally unknown; decisions are made based on estimates V_i of v_i

Slide 12: Optimizer's curse in project portfolio selection
- Assume that the estimates are unbiased
- Overestimated projects are more likely to get selected, so the resulting portfolio value is less than what the estimates suggest (optimizer's curse; cf. Smith and Winkler, 2006): E[sum_{i in S} v_i] <= E[sum_{i in S} V_i], where S is the index set of the selected projects
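The overrepresentation effect of slide 9 can also be illustrated numerically. In the hypothetical sketch below, half of the proposals are evaluated precisely and half noisily; all distributional choices are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, runs = 100, 10, 2_000
# assumed split: 50 precisely evaluated projects, 50 noisily evaluated ones
sigma = np.where(np.arange(n) < 50, 0.1, 1.0)

noisy_share = []
for _ in range(runs):
    v = rng.normal(0.0, 1.0, n)       # true values, identical for both groups
    est = v + rng.normal(0.0, sigma)  # per-project error std taken from sigma
    chosen = np.argsort(est)[-k:]
    noisy_share.append(np.mean(chosen >= 50))  # fraction from the noisy half

print(f"share of high-error projects selected: {np.mean(noisy_share):.2f}")
```

Even though both halves have the same value distribution, well over half of the selected projects come from the noisy half: large positive errors push mediocre projects past the selection threshold.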
Slide 13: Optimizer's curse
- Choose 10 projects out of 100
- Values i.i.d. with v_i ~ N(0,1)
- Unbiased estimates V_i = v_i + e_i, e_i ~ N(0, sigma^2)

Slide 14: Optimal revision of the estimates
- The estimates V_i do not account for the uncertainties
- Use Bayesian revised estimates V_i* instead as a basis for project selection
- For v_i ~ N(m_i, tau_i^2) and e_i ~ N(0, sigma_i^2), the estimate V_i and the prior mean m_i are weighted according to their respective levels of uncertainty: V_i* = alpha_i V_i + (1 - alpha_i) m_i, where alpha_i = tau_i^2 / (tau_i^2 + sigma_i^2)

Slide 15: Selections based on revised estimates
[Table: for each project, the true value v_i, the estimate V_i = v_i + e_i, the revised estimate V_i*, and the portfolios selected by maximum estimates, by maximum revised estimates, and the optimum portfolio; projects with low and high evaluation uncertainties (columns E[V], Std[V], Std[e]) are marked separately. Estimated values: 87.56 (max estimates) and 69.27 (max revised estimates); true values: 70.19, 70.91 and 73.53, respectively.]
[Figure: estimates vs. values; the Finnish axis labels read "Arvio" (estimate) and "Arvo" (value).]

Slide 16: Elimination of the optimizer's curse
- With revised estimates, the optimizer's curse is eliminated, that is, E[sum_{i in S*} v_i] = E[sum_{i in S*} V_i*], where S* is the index set of the projects selected based on the revised estimates
- Previous example: choosing 10 projects out of 100; true values i.i.d. with v_i ~ N(0,1); unbiased estimates V_i = v_i + e_i, e_i ~ N(0, sigma^2)
[Figure: portfolio value as a function of the standard deviation of the estimation error]

Slide 17: Revised estimates and portfolio composition
- In this example, the projects' values were identically distributed and the estimation errors had equal variances
- In this case, the prioritization of the projects remains unchanged when using revised estimates, because the revision applies the same increasing transformation to every estimate
- In general, revised estimates may result in a different project prioritization than the initial estimates

Slide 18: Example on the revision of estimates
- Choose 3 projects out of 8; the true values are i.i.d.
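The revision rule of slide 14 and its effect on the curse (slide 16) can be verified in the 10-out-of-100 setting. The sketch below assumes m_i = 0, tau_i = 1 and sigma_i = 1, so the revised estimate reduces to V_i* = V_i / 2; these parameter values are assumptions chosen for simplicity:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, runs = 100, 10, 5_000
sigma = 1.0                     # estimation-error std (assumed)
alpha = 1.0 / (1.0 + sigma**2)  # weight tau^2 / (tau^2 + sigma^2) with tau = 1

raw_gap, rev_gap = [], []
for _ in range(runs):
    v = rng.normal(0.0, 1.0, n)         # prior v_i ~ N(0, 1)
    est = v + rng.normal(0.0, sigma, n)
    revised = alpha * est               # shrink towards the prior mean 0
    sel = np.argsort(est)[-k:]          # same ranking either way here
    raw_gap.append(est[sel].sum() - v[sel].sum())
    rev_gap.append(revised[sel].sum() - v[sel].sum())

print(f"raw gap: {np.mean(raw_gap):.2f}, revised gap: {np.mean(rev_gap):.2f}")
```

The raw estimates overstate the selected portfolio's value by a wide margin, while the revised estimates are on average spot on: in expectation the curse is eliminated, exactly as slide 16 states.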
with v_i ~ N(0,1)
- Left: all projects are equally difficult to evaluate, so the error variances are equal: V_i = v_i + e_i, e_i ~ N(0, 0.5^2)
- Right: four projects are harder to evaluate and have higher error variances, which gives steeper correction slopes: V_i = v_i + e_i, e_i ~ N(0, 1)

Slide 19: Revision of estimates and portfolio selections
- Left: for equal variances, all estimates are revised towards the mean in the same way
- Right: the more uncertain (dashed) estimates are revised towards the mean more strongly
- Thus, different portfolios of three projects would be selected depending on whether or not the estimates are revised

Slide 20: Portfolio for revised estimates
- Revised estimates tend to yield a higher portfolio value
- Example: select 10 out of 100 projects with values v_i ~ N(3, 1^2), drawn from two error sub-populations: 1) e_i ~ N(0, 0.1), small errors; 2) e_i ~ N(0, 1), large errors
[Figure: values of the optimal portfolio and of the portfolios selected on the basis of the estimates and the revised estimates]

Slide 21: Share of correct choices
- Using revised estimates increases the share of selected projects that belong to the optimal portfolio, i.e., |S ∩ K| / k, where K is the index set of the projects in the optimal portfolio
- There is a statistically significant difference between the portfolios (alpha = 0.05) when the share of projects with high evaluation uncertainties is between 25% and 55%
[Figure: share of correct choices [%] as a function of the share of projects with large error variance [%]]

Slide 22: Conclusions
- Selection based on unrevised estimates: the optimizer's curse means that the value of the portfolio will, on average, be lower than expected; if the proposals come from populations with different levels of estimation errors, the selected portfolio is likely to contain too many projects from the population with high uncertainties
- Improving the selection process: account for evaluation uncertainties by using revised estimates; build separate portfolios for sub-populations with different levels of evaluation errors (e.g., a separate budget for high-risk projects)
- But do we know how uncertain the evaluations are?
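The share-of-correct-choices comparison of slide 21 can be reproduced in outline. The sketch below puts 40% of the projects in a high-error sub-population, inside the 25-55% band where the slides report a significant difference; all distributional choices are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, runs = 100, 10, 3_000
sigma = np.where(np.arange(n) < 60, 0.1, 1.0)  # assumed: 40% high-error projects
alpha = 1.0 / (1.0 + sigma**2)                 # shrinkage weights, tau = 1, m = 0

raw_hits, rev_hits = [], []
for _ in range(runs):
    v = rng.normal(0.0, 1.0, n)
    est = v + rng.normal(0.0, sigma)
    optimal = set(np.argsort(v)[-k:])        # index set K of the optimal portfolio
    raw = set(np.argsort(est)[-k:])          # selection on raw estimates
    rev = set(np.argsort(alpha * est)[-k:])  # selection on revised estimates
    raw_hits.append(len(raw & optimal) / k)
    rev_hits.append(len(rev & optimal) / k)

print(f"correct choices: raw {np.mean(raw_hits):.2f}, revised {np.mean(rev_hits):.2f}")
```

Because revision shrinks the uncertain estimates more strongly, the high-error projects lose their artificial edge and the revised selection overlaps more with the optimal portfolio K.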