4th Impact Assessment Course, Michaela Saisana, JRC, Ispra, 9-10 December 2013

Multi-criteria Analysis for Impact Assessment: The Maximum Likelihood Approach

Michaela Saisana [email protected]

European Commission Joint Research Centre Econometrics and Applied Statistics Unit


Outline
• Applications of MCA and CBA
• Cost Benefit Analysis (+ limitations)
• Roots of MCA in Social Choice Theory
• 5 methods (Relative majority, Condorcet, Borda, Successive eliminations, Median ranking)
• Limitations of the Weighted Sum (most common approach)
• Weights as importance coefficients (BA and AHP)
• MCA: Maximum likelihood approach (steps, suitability)
• Sensitivity Analysis of MCA result
• Conclusion


Some Decision or Evaluation Problems
• Locating a new plant
• Human resources management
• Evaluating projects
• Selecting an investment strategy
• Electricity production planning
• Regional planning
• Evaluation of urban waste management systems
• Environmental applications
• Health risk prediction
• Systemic risk assessment (JRC collaboration with the European Systemic Risk Board - European Central Bank)



Basic steps of cost-benefit analysis (CBA)

1. Determine if CBA is worth doing
2. Identify objectives and policy alternatives
3. Determine stakeholders
4. Identify costs and benefits of each alternative
5. Sort into measurable and non-measurable costs and benefits
6. Estimate costs and benefits that can be measured in monetary terms
7. Conduct sensitivity analysis
8. Compare costs-benefits across alternatives
9. Adjust for non-measurable costs and benefits (?)
10. Make a decision


Cost-benefit guidelines

• UK HM Treasury, Appraisal and Evaluation in Central Government (The Green Book), London: 2002, http://www.hmtreasury.gov.uk/data_greenbook_index.htm
• NZ Treasury guidelines, www.treasury.govt.nz/publications/guidance/planning/costbenefitanalysis
• Australian Government, Office of Best Practice Regulation, http://www.finance.gov.au/obpr/cost-benefit-analysis.html (see especially the Handbook of Cost-Benefit Analysis and the Best Practice Regulation Handbook)
• Queensland Government, Department of Infrastructure and Planning, Cost Benefit Analysis, www.dip.qld.gov.au/resources/guideline/project-assurance-framework/pafcost-benefit-analysis.pdf
• Government of Western Australia, Department of Treasury and Finance, 2005, Project Evaluation Guidelines, www.dtf.wa.gov.au/cms/uploadedFiles/project_evaluation_guidelines_2002.pdf


Limitations of CBA

• Results are often highly sensitive to specific assumptions, such as the discount rate
• Difficult to balance non-quantifiable costs/benefits against quantifiable ones
• Anthropocentric in its underlying social vision
How much are life, education (literacy), welfare, health, ecological sustainability, and employment (business confidence) worth?
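The discount-rate sensitivity is easy to demonstrate numerically. The sketch below is not from the slides; the cash flows and the two rates are made-up numbers, chosen so that the project's net present value is positive at 3% but negative at 8% — the CBA verdict flips on this single assumption.

```python
# Illustrative sketch (hypothetical numbers): NPV of a project's cash flows
# under two discount rates, showing a sign flip.

def npv(rate, cashflows):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: -100 investment; years 1-10: +14 benefit per year.
flows = [-100] + [14] * 10

print(round(npv(0.03, flows), 1))  # positive at a 3% discount rate
print(round(npv(0.08, flows), 1))  # negative at an 8% discount rate
```

The same stream of benefits looks worthwhile or wasteful depending solely on the chosen rate, which is why step 7 (sensitivity analysis) belongs in the CBA checklist.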


Multi Criteria Analysis (MCA) - Definition

“Multi Criteria Analysis is a decision-making tool, developed for complex multi-criteria problems that include quantitative and/or qualitative aspects of the problem in the decision making process.”

(Center for International Forestry Research, CIFOR, 1999)


MCA - Steps

1. Establish the decision context
2. Identify the performance criteria and the options
3. Describe/rate the performance of each option against the criteria
4. Assign weights across criteria
5. Combine the information to obtain a ranking of the options
6. Examine the results and review
7. Conduct sensitivity analysis
8. Final decision


MCA - Performance matrix

Criteria should not be dependent on each other and not redundant (to avoid double counting)

           Criterion 1   Criterion 2   Criterion 3   Criterion 4   …
           (/20)         (rating)      (qual.)       (Y/N)
Action 1   20            135           G             Yes           …
Action 2   9             156           B             Yes           …
Action 3   15            129           VG            No            …
Action 4   9             146           VB            No            …
Action 5   7             121           G             Yes           …


MCA - Performance matrix

• Who decides the ratings? MCA is very flexible with respect to who gets a say in choosing the criteria or rating the options:

• Democratic decision-making - all members of the decision-making body, or each organizational branch/unit, independently rate the options
• Panel of experts asked to make judgments; a different panel can judge different criteria
• Consensus model - the decision-making body 'thrashes it out'
• Stakeholder inclusion
• Different groups can rate options on different criteria


MCA - Result

The outcome of MCA can be used to:
• Identify a single, most-preferred option
• Rank options
• Short-list a limited number of options for subsequent detailed appraisal through other methods such as CBA
• Distinguish acceptable from unacceptable options
• Combine different options based on relative strengths


Social Choice Theory

Problem:
• A group of voters has to select a candidate from a group of candidates (an election)
• Each voter has a personal ranking of the candidates according to his/her preferences
• Which candidate should be elected, in the best interest of society? What is the «best» voting procedure?

Analogy with multi-criteria analysis:
• Candidates → actions
• Voters → criteria


Social Choice Theory

Social choice theory methods would be ideally suited for assessing multiple options through multiple criteria … and were already available between the end of the 13th and the 15th century …


1. Ramon Llull (ca. 1232 - ca. 1315) first proposed what would later become known as the method of Condorcet.
2. Nicolas de Condorcet (1743-1794): his 'Sketch for a Historical Picture of the Progress of the Human Mind' (1795) can be considered an ideological foundation for evidence-based policy (modernity at its best!).
3. Nicholas of Kues (1401-1464), also referred to as Nicolaus Cusanus and Nicholas of Cusa, developed what would later be known as the method of Borda.
4. Jean-Charles, chevalier de Borda (1733-1799) developed the Borda count.


Five methods (among many others)

1. Relative majority 2. Condorcet 3. Borda 4. Successive eliminations 5. Median ranking


Method 1 : Relative majority

3 candidates: Adam, Brian, Carlos
30 voters:

Ranking   11 voters   10 voters   9 voters
1st       A           B           C
2nd       B           C           B
3rd       C           A           A

First-place votes: A 11, B 10, C 9

Adam is elected


Method 1 : Relative majority

3 candidates: Adam, Brian, Carlos
30 voters:

Ranking   11 voters   10 voters   9 voters
1st       A           B           C
2nd       B           C           B
3rd       C           A           A

Adam is elected

Problem: B and C are each preferred to A by a majority of voters (19 of 30)!
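The failure of relative majority can be checked mechanically. A minimal sketch (not from the slides; the helper names are mine) that counts first-place votes and pairwise preferences for the 30-voter profile:

```python
# Relative-majority example: 11 voters rank A>B>C, 10 rank B>C>A, 9 rank C>B>A.
from collections import Counter

# (ranking, number of voters) pairs
profile = [(["A", "B", "C"], 11), (["B", "C", "A"], 10), (["C", "B", "A"], 9)]

# Relative majority: count only first-place votes.
firsts = Counter()
for ranking, n in profile:
    firsts[ranking[0]] += n
print(firsts.most_common(1))  # [('A', 11)] -- Adam wins

# ...yet 19 of 30 voters rank B above A, and 19 rank C above A.
def prefer(x, y):
    """Number of voters ranking x above y."""
    return sum(n for r, n in profile if r.index(x) < r.index(y))

print(prefer("B", "A"), prefer("C", "A"))  # 19 19
```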


Method 2 : Condorcet

3 candidates: Adam, Brian, Carlos
30 voters:

Ranking   11 voters   10 voters   9 voters
1st       A           B           C
2nd       B           C           B
3rd       C           A           A

B preferred to A: 19 votes
B preferred to C: 21 votes
C preferred to A: 19 votes

Brian is elected

Method 2 : Condorcet

3 candidates: Adam, Brian, Carlos
9 voters:

Ranking   4 voters   3 voters   2 voters
1st       A          B          C
2nd       B          C          A
3rd       C          A          B

A preferred to B: 6 votes
B preferred to C: 7 votes
C preferred to A: 5 votes

Problem: nobody is elected! (cycle)

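The cycle can be verified programmatically. A minimal sketch (not from the slides) that searches for a Condorcet winner in the 9-voter profile:

```python
# Condorcet-paradox profile: 4 voters A>B>C, 3 voters B>C>A, 2 voters C>A>B.
profile = [(["A", "B", "C"], 4), (["B", "C", "A"], 3), (["C", "A", "B"], 2)]
candidates = ["A", "B", "C"]

def prefer(x, y):
    """Number of voters ranking x above y."""
    return sum(n for r, n in profile if r.index(x) < r.index(y))

# A Condorcet winner beats every other candidate head-to-head.
winners = [c for c in candidates
           if all(prefer(c, d) > prefer(d, c) for d in candidates if d != c)]
print(winners)  # [] -- A beats B (6-3), B beats C (7-2), C beats A (5-4): a cycle
```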

Method 3 : Borda

3 candidates: Adam, Brian, Carlos
81 voters:

Ranking   30 voters   29 voters   10 voters   10 voters   1 voter   1 voter   Points
1st       A           C           C           B           A         B         2
2nd       C           A           B           A           B         C         1
3rd       B           B           A           C           C         A         0

Scores: A = 31×2 + 39×1 = 101 | B = 11×2 + 11×1 = 33 | C = 39×2 + 31×1 = 109

Carlos is elected!

Method 3 : Borda

4 candidates: Adam, Brian, Carlos, David
7 voters:

Ranking   3 voters   2 voters   2 voters   Points
1st       C          B          A          3
2nd       B          A          D          2
3rd       A          D          C          1
4th       D          C          B          0

Scores: A 13, B 12, C 11, D 6 → Ranking: A, B, C, D

Adam is elected

Method 3 : Borda

Problem: fully dependent on irrelevant alternatives (easy to manipulate)

Same 7 voters, after David withdraws (3 candidates: Adam, Brian, Carlos):

Ranking   3 voters   2 voters   2 voters   Points
1st       C          B          A          2
2nd       B          A          C          1
3rd       A          C          B          0

Scores: A 6, B 7, C 8 → Ranking: C, B, A

Carlos is elected

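Both Borda examples can be reproduced in a few lines. A sketch (not from the slides; `borda` is my own helper) showing how dropping the "irrelevant" candidate D flips the winner from Adam to Carlos:

```python
# Borda count: on a ballot of k candidates, the top rank earns k-1 points.
def borda(profile):
    """profile: list of (ranking, n_voters); returns total score per candidate."""
    scores = {}
    for ranking, n in profile:
        for points, cand in enumerate(reversed(ranking)):
            scores[cand] = scores.get(cand, 0) + points * n
    return scores

with_d = [(["C", "B", "A", "D"], 3), (["B", "A", "D", "C"], 2), (["A", "D", "C", "B"], 2)]
print(borda(with_d))  # A 13, B 12, C 11, D 6 -> Adam wins

# Drop D from every ballot: the winner flips even though D was "irrelevant".
without_d = [([c for c in r if c != "D"], n) for r, n in with_d]
print(borda(without_d))  # A 6, B 7, C 8 -> Carlos wins
```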

Method 4 : Successive eliminations

3 candidates: Adam, Brian, Carlos
11 voters:

A round-based procedure: at each round, the weakest candidate (the one with the fewest first-place votes) is eliminated, until only one is left.

Ranking   6 voters   4 voters   1 voter
1st       A          C          C
2nd       C          A          B
3rd       B          B          A

B is eliminated first, then C (5 first-place votes vs 6 for A); final ranking: A, C, B - Adam is elected

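A sketch (not from the slides; the function name is mine) of successive eliminations as a round-based count, assuming the candidate with the fewest first-place votes is dropped each round:

```python
# Successive eliminations (exhaustive-ballot style) on the 11-voter profile.
from collections import Counter

def successive_eliminations(profile):
    """profile: list of (ranking, n_voters). Returns the final ranking, best first."""
    profile = [(list(r), n) for r, n in profile]
    eliminated = []
    while len(profile[0][0]) > 1:
        firsts = Counter()
        for ranking, n in profile:
            firsts[ranking[0]] += n
        # every remaining candidate must appear, even with zero first-place votes
        for c in profile[0][0]:
            firsts.setdefault(c, 0)
        loser = min(firsts, key=firsts.get)
        eliminated.append(loser)
        profile = [([c for c in r if c != loser], n) for r, n in profile]
    winner = profile[0][0][0]
    return [winner] + eliminated[::-1]

profile = [(["A", "C", "B"], 6), (["C", "A", "B"], 4), (["C", "B", "A"], 1)]
print(successive_eliminations(profile))  # ['A', 'C', 'B']
```

Ties are broken arbitrarily here; a production implementation would need an explicit tie-breaking rule.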

Method 5 : Median ranking

3 candidates: Adam, Brian, Carlos
11 voters:

• Rank the candidates for each voter
• Take the median rank of each candidate across the voters

Ranking   6 voters   4 voters   1 voter
1st       A          C          C
2nd       C          A          B
3rd       B          B          A

Sorted ranks: A: 1 1 1 1 1 1 2 2 2 2 3 (median 1) | C: 1 1 1 1 1 2 2 2 2 2 2 (median 2) | B: 2 3 3 3 3 3 3 3 3 3 3 (median 3)

Final ranking: A, C, B

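The median-ranking computation on the same 11-voter profile can be sketched as follows (not from the slides; rank 1 means first place):

```python
# Median ranking: each candidate gets the median of its ranks across voters.
from statistics import median

profile = [(["A", "C", "B"], 6), (["C", "A", "B"], 4), (["C", "B", "A"], 1)]
candidates = ["A", "B", "C"]

def median_rank(cand):
    ranks = []
    for ranking, n in profile:
        ranks += [ranking.index(cand) + 1] * n  # rank 1 = first place
    return median(ranks)

print({c: median_rank(c) for c in candidates})   # A: 1, B: 3, C: 2
print(sorted(candidates, key=median_rank))       # ['A', 'C', 'B'] -- A first
```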

5 candidates: Adam, Brian, Carlos, David, Edison
25 voters:

Ranking   8 voters   7 voters   4 voters   4 voters   2 voters
1st       A          B          E          D          C
2nd       C          D          C          E          E
3rd       D          C          D          B          D
4th       B          E          B          C          B
5th       E          A          A          A          A

Relative majority: Adam elected
Condorcet: Carlos elected
Borda: David elected
Successive eliminations: Edison elected
Median ranking: Carlos elected



Kenneth Arrow (Nobel Memorial Prize in Economics, 1972)
Impossibility theorem (1951):

With at least 2 voters and 3 candidates, it is impossible to build a voting procedure that simultaneously satisfies the 5 following properties:

• Non-dictatorship
• Universality
• Independence of irrelevant alternatives
• Monotonicity
• Non-imposition


Most common approach: Weighted sum

      I1 (50%)   I2 (50%)
a     90         10
b     10         90
c     50         50

V(a) = V(b) = V(c) = 50

Problems: 1) Fully compensatory (elimination of conflicts)


Most common approach: Weighted sum

      I1 (50%)   I2 (50%)
a     100        10
b     20         90
c     50         50

V(a) = V(b) = 55, V(c) = 50

Problems: 2) Does not encourage improvement in the weak dimensions


Most common approach: Weighted sum

Y = 0.5×X1 + 0.5×X2
R1² = 0.08, R2² = 0.83, corr(X1, X2) = −0.151, V(X1) = 116, V(X2) = 614, V(Y) = 162

Problems: 3) Weights are used as if they were importance coefficients, while they are trade-off coefficients
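A quick numeric check (not from the slides) of the figures above: with equal nominal weights, V(Y) = w1²·V(X1) + w2²·V(X2) + 2·w1·w2·cov(X1, X2) ≈ 162, and the X2 term contributes most of the variance of Y, which is why equal nominal weights do not mean equal importance:

```python
# Variance of a weighted sum, using the variances and correlation quoted above.
import math

w1 = w2 = 0.5
v1, v2, rho = 116.0, 614.0, -0.151

cov = rho * math.sqrt(v1 * v2)                       # about -40.3
vy = w1**2 * v1 + w2**2 * v2 + 2 * w1 * w2 * cov
print(round(vy))  # 162, matching V(Y) on the slide

# Share of V(Y) from each squared-weight term (covariance term ignored):
print(round(w1**2 * v1 / vy, 2), round(w2**2 * v2 / vy, 2))
```

The X2 term alone accounts for roughly 95% of V(Y): the "effective" weight of X2 dwarfs its nominal 50%.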


Most common approach: Weighted sum

• The weighted-sum approach is only appropriate under special circumstances (e.g. standardized variables, uniform covariance matrix, …)
• Hence we need to move away from weighted sums …

Effective weights are compared with nominal weights to ensure coherence between the two.

[Paolo Paruolo, Michaela Saisana, Andrea Saltelli, 2013, Ratings and rankings: Voodoo or Science?, J. R. Statist. Soc. A, 176 (3), 609-634]


MCA: Maximum likelihood Approach

Features:
• no impact of outliers;
• no need for data normalisation;
• no need for a uniform covariance matrix;
• no need to attach monetary values; both continuous and categorical variables can be used;
• no use of any linear or multiplicative formula;
• the weights attached to the indicators are used as importance coefficients;
• a compromise between conflicting opinions;
• reasonably resistant to manipulation;
• produces a ranking that is statistically optimal (anonymous, neutral, Pareto optimal, satisfies reinforcement and local independence of irrelevant alternatives)

[Kemeny (1959), Young and Levenglick (1978)]
Led to the Condorcet-Kemeny-Young-Levenglick (C-K-Y-L) ranking procedure

MCA - Performance matrix

• Criteria should not be dependent on each other and not redundant (to avoid double counting)

           Criterion 1 (20%)   Criterion 2 (30%)   Criterion 3 (20%)   Criterion 4 (30%)
Action 1   20                  135                 G                   Yes
Action 2   9                   156                 B                   Yes
Action 3   15                  129                 VG                  No
Action 4   9                   146                 VB                  No
Action 5   7                   121                 G                   Yes

Where do the weights come from? (…next couple of slides)

Weights based on Budget Allocation (42 experts)

In the 4 dimensions of poverty (Exposure & Resilience to Shocks, Farm Assets, Gender Equality, Education), the average expert weight is similar to equal weighting → tiredness in filling in the questionnaire on weights??

[Bar chart: average expert weights per indicator (Food Availability, Nutrition Quality, Healthcare Access, Education Quality, Waste Management, etc.), clustered between roughly 25 and 43]

Weights based on Analytic Hierarchy Process

Using pairwise comparisons, the relative importance of one criterion over another can be expressed on a 1-9 scale:
1 = equal, 3 = moderate, 5 = strong, 7 = very strong, 9 = extreme

Weights questionnaire: for each pair of indicators, which indicator do you feel is more important, and to what degree (1-9)? The weights are obtained by solving for the principal eigenvector of the pairwise comparison matrix:

                Patents  Royalties  Internet  Tech.Exports  Telephones  Electricity  Schooling  University St.
Patents         1        1/3        5         4             3           9            1/6        1/8
Royalties       3        1          3         1/4           5           9            1/3        1/4
Internet        1/5      1/3        1         1/6           2           2            1/7        1/6
Tech.Exports    1/4      4          6         1             5           9            1/4        1/5
Telephones      1/3      1/5        1/2       1/5           1           7            1/9        1/9
Electricity     1/9      1/9        1/2       1/9           1/7         1            1/9        1/9
Schooling       6        3          7         4             9           9            1          2
University St.  8        4          6         5             9           9            1/2        1

Resulting weights: Patents 0.109, Royalties 0.103, Internet hosts 0.029, Tech exports 0.117, Telephones 0.030, Electricity 0.014, Schooling 0.301, University st. 0.297

Inconsistency: 17.4%

Weights based on Analytic Hierarchy Process

Using pairwise comparisons, the relative importance of one criterion over another can be expressed on a 1-9 scale:
1 = equal, 3 = moderate, 5 = strong, 7 = very strong, 9 = extreme

Example of inconsistency in the matrix above:
• Patents vs Internet = 5 (P = 5I) and Royalties vs Internet = 3 (R = 3I), so we expect: P > R
• But the expert said: R > P (Royalties vs Patents = 3, i.e. R = 3P)
→ Inconsistency
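The eigenvector computation and the consistency check can be sketched as follows (not from the slides; for brevity it uses only the 3×3 sub-matrix for Patents, Royalties and Internet, and Saaty's consistency ratio as the inconsistency measure):

```python
# AHP weights via the power method, plus Saaty's consistency ratio.
def ahp_weights(matrix, iters=100):
    """Return (principal eigenvector normalised to sum 1, lambda_max)."""
    n = len(matrix)
    w = [1.0 / n] * n
    lam = float(n)
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam = sum(v)            # converges to lambda_max when sum(w) == 1
        w = [x / lam for x in v]
    return w, lam

# Patents vs Royalties = 1/3, Patents vs Internet = 5, Royalties vs Internet = 3
A = [[1, 1/3, 5],
     [3, 1, 3],
     [1/5, 1/3, 1]]

w, lam = ahp_weights(A)
ci = (lam - 3) / (3 - 1)   # consistency index
cr = ci / 0.58             # Saaty's random index for n = 3 is 0.58
print([round(x, 2) for x in w])
print(round(cr, 2), "-> inconsistent" if cr > 0.1 else "-> acceptable")
```

The consistency ratio comes out well above Saaty's 0.1 threshold, reproducing the P-vs-R inconsistency discussed above: Royalties ends up with the largest weight even though the Internet comparisons suggest Patents should dominate.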


MCA: Maximum likelihood Approach

Step 1 - Input matrix to the multicriteria analysis

Example: Three options need to be ranked according to five criteria. The importance of the criteria is reflected in the respective weights.

Performance matrix   Criterion 1   Criterion 2   Criterion 3   Criterion 4   Criterion 5
Weights              10%           20%           10%           30%           30%
Option A             50            0.6           400           0.6           4000
Option B             70            0.3           500           0.7           5000
Option C             90            0.4           600           0.4           3000


MCA: Maximum likelihood Approach

Step 2 – Options are compared pairwise

For each comparison, e.g. option A versus option B, all the weights corresponding to the criteria that favour A over B are added up (abbreviated as AB). In this case AB gets the weight of Criterion 2 only (= 0.2). The comparison BA gets the sum of the weights of the remaining criteria: 1, 3, 4, 5 (= 0.8). For n options, there are n(n−1) comparisons to be made. All the values from the pairwise comparisons are entered in a so-called outranking matrix.

Outranking matrix   Option A   Option B   Option C
Option A            0          0.2        0.8
Option B            0.8        0          0.6
Option C            0.2        0.4        0
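Steps 1 and 2 can be sketched in a few lines (not from the slides; it assumes higher values are better on every criterion, which is consistent with the AB = 0.2 example above):

```python
# Build the outranking matrix from the Step 1 performance matrix.
weights = [0.1, 0.2, 0.1, 0.3, 0.3]
perf = {"A": [50, 0.6, 400, 0.6, 4000],
        "B": [70, 0.3, 500, 0.7, 5000],
        "C": [90, 0.4, 600, 0.4, 3000]}

def outrank(x, y):
    """Sum of weights of the criteria on which option x beats option y."""
    return round(sum(w for w, a, b in zip(weights, perf[x], perf[y]) if a > b), 10)

options = ["A", "B", "C"]
E = {(x, y): outrank(x, y) for x in options for y in options if x != y}
print(E)  # {('A','B'): 0.2, ('A','C'): 0.8, ('B','A'): 0.8, ...}
```

The rounding step only tidies floating-point noise; ties on a criterion (a == b) contribute to neither side in this sketch.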


MCA: Maximum likelihood Approach

Step 3 – Calculate support for all permutations and select the maximum

• All 3! (= 6) permutations of the options are considered and the support score for each ranking is calculated.
• ABC has a support of 1.6 (= 0.2 + 0.8 + 0.6), which is the sum of the elements above the diagonal of the outranking matrix.
• Support scores for all six rankings: ABC = 1.6 | ACB = 1.4 | BAC = 2.2 | BCA = 1.6 | CAB = 0.8 | CBA = 1.4
• The ranking selected is the one with the maximum likelihood score: BAC

Outranking matrix   Option A   Option B   Option C
Option A            0          0.2        0.8
Option B            0.8        0          0.6
Option C            0.2        0.4        0
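Step 3 can be sketched as a brute-force search over permutations (not from the slides; `E` holds the outranking values from Step 2):

```python
# Pick the ranking with maximum support from the outranking matrix (C-K-Y-L style).
from itertools import permutations

E = {("A", "B"): 0.2, ("A", "C"): 0.8,
     ("B", "A"): 0.8, ("B", "C"): 0.6,
     ("C", "A"): 0.2, ("C", "B"): 0.4}

def support(ranking):
    """Sum of outranking values for every ordered pair implied by the ranking."""
    return round(sum(E[(ranking[i], ranking[j])]
                     for i in range(len(ranking))
                     for j in range(i + 1, len(ranking))), 10)

scores = {r: support(r) for r in permutations("ABC")}
best = max(scores, key=scores.get)
print(best, scores[best])  # ('B', 'A', 'C') 2.2
```

This exhaustive enumeration is exactly what becomes infeasible for large numbers of options, as the limitation slide below notes.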

MCA: Maximum likelihood Approach

It is important to assess the sensitivity of the results to the weights.

[Figures: 'How coupled stairs are shaken in most of the available literature' vs 'How to shake coupled stairs']


Frequency matrix – Sensitivity of the final ranking to the assumptions (e.g. weights)


MCA: Maximum Likelihood Approach

• The main limitation of this method is the difficulty of computing the ranking when the number of options grows (e.g. 100).

• For 10 options → 10! = 3,628,800 permutations … still trivial for today's PCs

• To solve this NP-hard problem when the number of options is very large, there are plenty of numerical algorithms (the JRC works on them!)


Concluding: How to use MCA in your work

1. Decide on the criteria that you want to use in your evaluation;
2. Identify appropriate indicators for each of the criteria (more than one indicator per criterion is OK);
3. Score the alternatives on each criterion based on their performance on that criterion;
4. Determine the weights of all the criteria (using, for instance, AHP);
5. Calculate the overall ranking of the alternatives using the maximum likelihood approach;
6. Examine the results: try to explain why some options turn out to be better than others;
7. Do a sensitivity analysis: what happens if you assign other weights to the criteria? Does it affect the overall results?
8. Make a final decision


"The more important issue is whether the (maximum likelihood) method is intuitively easy to grasp, and whether it improves on methods currently in use. On both these counts I think that the answer is affirmative, and I predict that the time will come when it is considered a standard tool for political and group decision making."

[Peyton Young, 1995, Optimal Voting Rules, Journal of Economic Perspectives 9(1): 51-64]
Peyton Young, Professor Emeritus, Research Professor in Economics, Johns Hopkins University


More reading at http://composite-indicators.jrc.ec.europa.eu
