
Strategic Planning and Performance Evaluation: Methodology, Systems and Management

    Professor Tarun Das1

    1. Introduction, Scope and Objectives

Eventually Alice in Wonderland realized that it matters a great deal to know where to go and how to get there. Similarly, it is important for any agency to know its vision, mission, basic goals and objectives, the overall scope of its activities in terms of exact outputs and outcomes in the medium term, and how to achieve these goals, objectives, outputs and outcomes in a time-bound manner and at least cost. The Strategic Planning and Performance Management Cycle is an interactive, ongoing process to facilitate a sound business and financial plan for any agency.

    2. Strategic Planning

A Strategic Business Plan (SBP) must focus on achieving a clear Mission, embedded in a realistic Vision, based on issues, objectives and strategies identified in collaboration with the major stakeholders. An SBP should not be over-ambitious, with an impressive plan but unrealistic targets. It should emphasize a concrete plan of action and a strict implementation schedule.

In the short run, strategies need to be tailored to take advantage of institutional strengths and to avoid weak institutions. But in the medium and long run, emphasis should be placed on strengthening, replacing or even eliminating weak institutions.

An SBP needs to recognize that the global business environment is complex and fast-changing, and that global public policy is an area of conflict and adversity. We need to understand the dynamics of both the internal and external environment and be prepared with an appropriate strategy to tackle any contingent liabilities.

An SBP also needs to recognize that policies and programs cannot succeed unless an agency is able to take the stakeholders along with it. Collaboration in an SBP is thus a deal that rewards all parties involved and creates win-win situations for all stakeholders.

1 The author is presently working as a Professor in the Institute for Integrated Learning in Management and as a Consultant in the World Bank Country Office for Uzbekistan at Tashkent. This paper was written when the author was working as the Asian Development Bank Strategic Planning Expert for the Government of Mongolia at Ulaanbaatar during June 2007 to July 2008.


"Would you tell me please, which way I ought to go from here?" asked Alice.

"That depends a great deal on where you want to get to," said the Cat.

"I don't much care where...," said Alice.

"Then it does not matter which way you go," said the Cat.

--- Extract from Alice in Wonderland, Lewis Carroll


SBPs have to be integrated fully with structural and governance reforms and capacity building. SBPs need to adopt a gradual, step-by-step, evolutionary and cumulative approach towards structural changes within an organization, and should avoid the temptation of adopting a Big Bang, shock-therapy, radical, fundamental or revolutionary approach, which may create islands of hostile stakeholders.

We do not advocate a small, shrunken and weak management; rather, we want management to be strong enough to guide the transition process, to carry good governance reforms to their logical ends, to strengthen the existing strong and efficient institutions, and to repair the weak ones. Efficient strategic planning requires firm and consistent leadership by the management over a long period of time.

As per international best practices, a typical or stylized Strategic Business Plan adopts a top-down approach and has the following structure:

Table-1: Typical Structure of an SBP (Top-Down Approach)

Goals: Long-term, widespread results of programs and policies, such as the achievement of the Millennium Development Goals by 2015.

Outcomes: Medium-term impact of programs and policies on the economy and user groups.

Outputs: Deliverables, i.e. products and services produced during the program period. Outputs are the immediate or end results of programs or activities, whereas outcomes are the impact on the markets and society even after completion of the project.

Activities: Tasks undertaken by the staff to transform inputs into outputs.

Inputs: Resources (personnel, financial, goods and services), which are the basic requirements for any output and for strategic planning.

    3. Integration of Strategic Plan with Program Budget

To accomplish its strategic objectives effectively, an Agency must link outcomes, strategy, budget and performance into an integrated management system. This management system, based on a model of continuous improvement, is shown in Flow Chart-1 below:


Flow Chart-1: Strategic Plan and Performance-Based Budgeting Cycles

The cycle comprises four stages:

Strategic Business Plan: Broadly defines the strategic goals, outcomes and outputs of an Agency and the methods to achieve them.

Program Budget and Performance Planning: Program budgets are funds allocated to specific programs to achieve the desired goals, outputs and outcomes at least cost; performance refers to specifically designed results, and value means achieving value for money. Long-term and annual targets are established for spending, performance and value.

Performance Monitoring: Tracks the progress, expenditure and value for money in achieving outcomes.

Performance Assessment and Improvement Plan: Compares actual performance to targets and benchmarks, and determines the changes that will produce the best value.

The process begins with an understanding of important national priorities and outcomes, which are then translated into broadly defined goals and intended results for the Agency. These become the Agency's strategic goals and strategic objectives. Outcomes and outputs related to these strategic goals and objectives are then articulated. Programs are developed to achieve these outcomes and outputs with the least resources, and performance measures and indicators are identified to provide the means to assess the progress made during the budget year and to suggest improvements for the next year's budget. Flow Chart-2 below explains the relationship between an Agency's Medium-Term Strategic Business Plan and its Annual Program Budget and performance evaluation.


4. Ideal Performance Evaluation Systems

4.1 Characteristics of an Ideal Performance Evaluation System

Paul G. Thomas (2005) has mentioned the following properties of an ideal performance measurement system:

    It has clearly defined purposes and uses.

    It focuses on outcomes, not just on inputs and outputs.

    It employs a limited, cost effective set of measures.

    It uses measures which are valid, reliable, consistent, and comparable.

It produces information which is relevant, meaningful, balanced and valued by the leaders/funders of the organisation.

It is integrated with the planning and budgetary processes.

    It is embedded in the Agency, stable and widely understood and supported.

Flow Chart-2: Integration of Strategic Business Plan, Program Budget and Performance Evaluation

[Flow Chart-2 links the Strategic Business Plan (vision, mission, strategy and objectives; strategic goals; strategic outcomes and outputs; performance indicators and measures) to the next year's budget (program outcomes and outputs; program budgets, activities and processes; inputs of staff, funds, goods, services and ICT; resources for the budget year), with performance monitoring and evaluation through PART and an improvement plan feeding back into the cycle.]


In somewhat less abstract terms, the Canadian Comprehensive Auditing Foundation (CCAF) has developed nine principles to provide direction for performance reporting in Canada. Box-1 presents these nine principles. The first five principles provide guidance about what governments should report, while the remaining four relate to how governments report. The principles start as ideals, the ceiling that reporting aspires to reach, but over time they become standards, the floor below which reporting should not sink. Taken as a set, the nine principles are meant to provide a framework for performance reporting.

    4.2 Various Approaches to Performance Measurement

Performance measurement has become so widespread that it is impossible to know all that is taking place within both the public and private sectors around the world. There is no single best approach to performance measurement. An agency needs to develop an approach that fits its constitutional/legal requirements, institutional arrangements, political ideology, size, administrative and institutional capabilities, and, not least important, what it can afford. The general tendency has been for agencies to apply a single, uniform approach to all divisions and programs. This across-the-board approach may have the apparent virtues of consistency, comparability and fairness, but it is not without problems.

It is instructive to see how the system evolved in Canada over the past decade. The federal and provincial governments in Canada developed two broad approaches to performance measurement. In Alberta, Nova Scotia, Ontario and Quebec, governments reported on the performance of the entire government in terms of the impacts of their programs on society. This social-indicator type of approach supports external and political accountability. However, the selection of indicators included in report cards to citizens was inherently arbitrary (Thomas 2006).

Other provinces and the Government of Canada began their performance measurement efforts by requiring individual departments to prepare business plans and performance reports. This business-line approach is more of a managerial tool than something normally liked by politicians and the public.


Box-1: Nine Principles of Better Performance Reporting

1. Focus on the few critical aspects of performance.
2. Look forward as well as back.
3. Explain key risk considerations.
4. Explain key capacity considerations.
5. Explain other factors critical to performance.
6. Integrate financial and non-financial information.
7. Provide comparative information.
8. Present credible information, fairly interpreted.
9. Disclose the basis for reporting.

Source: Canadian Comprehensive Auditing Foundation, Reporting Principles: Taking Public Performance Reporting to a New Level, Ottawa, 2002.


However, these two broad approaches, viz. the system-wide and the business-line, could be pursued simultaneously and complement one another. This has been the evolution of the performance reporting system in the Government of Canada, which began by publishing performance reports on a departmental basis; now more than 80 such documents for departmental and non-departmental bodies are tabled in Parliament on an annual basis.

Agencies have developed a number of frameworks to identify successful programs. Probably the most common framework involves the so-called three big Es: economy, efficiency and effectiveness, as described below.

Economy: Have inputs been acquired at least cost?

Efficiency: Are the inputs (people, money, supplies) being combined to produce the maximum volume of outputs (goods and services)?

Effectiveness: Are the goals of the organization/program being met, without undue unintended consequences?

These elements have been used by both public and private sector management over the past four or five decades. But the framework misses another important E, i.e. equity, which deals with the distributional impacts of performance. Omitting equity may have an adverse impact on another big E in government, i.e. electability.

In its earlier program evaluation scheme, the Government of Canada considered a program to be well performing when it was:

Relevant: Consistent with government and departmental priorities.

Successful: Achieves the intended outputs and outcomes.

Cost-effective: Involves the most efficient means to achieve goals.

This framework deals with the desirability of continuing a program, but does not address the question: does the organisation have the capability and capacity to deliver the desired results?

Organisational report cards represent another type of performance measurement and reporting. Box-2 presents one interpretation of the requirements for such report cards.

Box-2: Organisational Report Cards - Criteria for Design

(a) Validity - satisfies legal requirements
(b) Comprehensiveness - covers all aspects of a budget
(c) Comprehensibility - easy to understand
(d) Relevance - appropriate for strategic objectives
(e) Reasonableness - can be achieved within time and at reasonable cost
(f) Functionality - operational and realistic

Source: William T. Gormley and David L. Weimer, Organisational Report Cards, Cambridge, Mass.: Harvard University Press, 1999, pp. 36-37.


In 1987 the Canadian Comprehensive Auditing Foundation (CCAF) published a report on organisational effectiveness which mentioned the following attributes of effective organisation management:

Box-3: Attributes of Effective Organisation Management

(a) Relevance
(b) Appropriateness
(c) Achievement of purpose
(d) Acceptance
(e) Secondary impacts
(f) Costs and productivity
(g) Responsiveness
(h) Financial results
(i) Working environment
(j) Monitoring and reporting

Several Agencies have since applied this framework. But there may be conflicts in practice among the attributes; e.g., cost efficiency may reduce responsiveness. Besides, assigning weights to these attributes and constructing a weighted overall index is a challenging job.

The Office of the Auditor General of Canada (OAG) has recommended another performance framework with six components of performance:

1. Mission statements
2. Results statements
3. Performance indicators/measures
4. Performance expectations/targets/commitments
5. Strategies/activities
6. Performance accomplishments/achievements

This framework emphasises the desirability of integrating performance planning, budgeting, monitoring and reporting, and also stresses external accountability for results.

4.3 Choice of a Particular Performance Evaluation System

Regardless of the approach adopted, a sound performance measurement and evaluation system must have three qualities: it must be technically valid, it must be functional, and it must be legitimate. Table-2 presents one generic listing of the ideal qualities of such a system.


Table-2: Characteristics of an Ideal Performance Measurement System

Clarity: Performance indices should be simple, well defined and easily understood.

Consistency: Definitions of indicators should be consistent over time and across agencies.

Comparability: One should compare like with like.

Controllability: A manager's performance should only be measured for areas over which he/she has control.

Contingency: Performance is not independent of the environment within which decisions are made.

Comprehensiveness: Covers all aspects of a budget.

Boundedness: Considers a limited number of performance indices which provide the biggest pay-off.

Relevance: Performance indicators are relevant to the specific needs.

Feasibility: Targets are based on realistic expectations.

    Source: Peter M. Jackson, Measures for Success in the Public Sector.

    4.4 Types of Performance Indicators

Performance indicators measure what an Agency did in the fiscal year. There are many kinds of performance indicators, ranging from the quantity or value of goods and services produced in a given period (such as the number of crimes or traffic violations detected by the police) to more complex indicators such as the efficiency and effectiveness of service delivery. Nayyer-Stone (1999) mentioned four primary types of performance indicators: input, output, outcome and efficiency, which are described in Table-3.

Table-3: Performance Indicators

Input Indicator: Measure of resources employed. Examples: employees required; goods and services used; equipment needed.

Output Indicator: Quantity of goods and services provided. Examples: number of projects; number of outputs; number of people served.

Effectiveness/Outcome Indicator: The degree to which the intended objectives of the services are being met. Examples: increase in literacy rate; increase in employment; decrease in crime rate; reduction of poverty; reduction of maternal and child mortality rates.

Efficiency Indicator: Cost per unit of output. Examples: cost per litre of water delivered; cost of garbage collected; cost per student in schools.


    Source: Adapted from Hatry, Harry P. (1977).

4.5 Characteristics of Ideal Performance Indicators

Like any statistical measure, performance indicators must satisfy a number of criteria. In general, an ideal performance indicator should be S.M.A.R.T. (i.e. simple, measurable, achievable, relevant and timely) and C.R.E.A.M. (i.e. clear, relevant, economic, adequate and monitorable).

Table-4: Ideal Properties of Performance Indicators

S.M.A.R.T.
Simple: easily defined
Measurable: easily quantifiable
Achievable: can be achieved, not a wish list
Relevant: appropriate for the strategic objectives
Timely: can be achieved in time

C.R.E.A.M.
Clear: precise, unambiguous, tangible and quantifiable
Realistic: achievable at reasonable cost and in time
Economic: available at reasonable cost and in time
Adequate: provides a sufficient basis to assess performance
Monitorable: amenable to impartial/objective evaluation

    4.6 Use of Performance Measures

    Performance measures can be used in several ways, including the following:

a) Controlling costs: enabling agencies to identify costs which are much higher or lower than average and to determine why these differences exist.

b) Comparing processes: analyzing the performance of services provided with a particular technology, approach or procedure.

c) Maintaining standards: monitoring service performance against established performance targets or benchmarks.

d) Comparing sectors: comparing the costs of public sector delivery with the costs of private sector delivery of the same type of goods and services.

    5. Performance Evaluation Methodology

Performance evaluation involves the following four main steps:

1. Summary of Baseline Scenario
2. Diagnostic Scan and SWOT Analysis
3. Budget Compliance, Efficiency and Effectiveness Evaluation
4. Performance Evaluation

    5.1 Review of Strategic Plan and Baseline Profile


In undertaking a performance evaluation, it is necessary to start with a baseline and initial profile and to identify the key issues on which the performance evaluation is to be focused.

Table-5: Scope of Performance Review and the Initial Profile

Scope of Performance Review:
1. Strategic Business Plans
2. Scope of review
3. Review steps and key milestones
4. Preliminary assessment
5. Focus of review

Scope of Initial Profile (say, for the Budget year 2008):
1. Scope, governance, vision, mission and objectives
2. Main functions, programs and activities
3. Structure, staffing and time schedules
4. Program-wise budgeted funds
5. Output costs, benchmarks and performance parameters

This assessment allows an agency to take a detailed look at its current business activities and at how it intends to perform in the budget review year. The various profit centers under an Agency will be asked to provide a brief description of their Strategic Business Plans, with vision, mission, objectives and goals. They will also be asked to provide a summary of the program budgets under review, with budgeted resources, outputs and outcomes. Agencies will be required to provide details of workforce size, functions and skills, workload volume and contributions to strategic planning.

    5.2 Diagnostic Scan and SWOT Analysis

A diagnostic scan of an Agency is necessary before starting a performance review, because actual performance is influenced by constraints on resources, technical manpower and the ICT system. There are basically two types of review: strategic review and operational review.

Strategic Review: How well an Agency manages its external environment by delivering relevant and effective services.

Operational Review: How well an Agency manages its internal environment by using its resources efficiently and prudently.

    Both desktop and field scans are required to determine the following aspects:

    whether best practice techniques were attempted;

    whether the practice was documented; and

    whether it was widely applied within the agency.


The desktop scan involves checking the existing material on strategic plans and program/output budgets already submitted by the Agency to the Ministry of Finance, whereas field scans involve conducting surveys and interviewing key stakeholders (clients, community groups, staff and management) to obtain their views on how internal management tools are working in practice.

There are 8 possible strategic review areas and 8 operational review areas, as indicated in Table-6.

Table-6: Strategic Review and Operational Review Areas

Strategic review areas:
1. Strategies
2. Environment
3. Clients
4. Other stakeholders
5. Regulation
6. Policy regime
7. Service delivery
8. Reviews

Operational review areas:
9. Work culture
10. Communications
11. Organization structure
12. Reporting lines
13. Human resources
14. Processes and systems
15. Controls
16. Cost and asset management

For each of these 16 areas, it is necessary to test whether the agency has applied any typical best-practice management techniques. For example, when examining Clients, agencies would be asked whether they have applied any of the following types of management practices: client needs analysis, client segmentation, client satisfaction surveys, grievances and complaints handling, and so on. When examining Controls and Cost and Asset Management, agencies would be asked if they use practices such as a financial information system, a management information system, an asset management plan, corporate overhead costs analysis, etc.

Table-7: Typical Best Practices for Strategic Review

Strategic Review Area (number of sub-areas): Typical Best Management Practices

1. Strategies (2): Strategic Business Plan; Master Plan
2. Environment (2): Socio-Economic-Political Environment Analysis; SWOT Analysis
3. Clients (2): Clients' Needs and Satisfaction Surveys; Grievances and Complaints Handling
4. Other Stakeholders (2): Stakeholders' Consultation; Focus Groups
5. Regulation (2): Regulatory Review; Parliamentary Consultative Committee Review
6. Policy (2): Ministerial Review; Donors' Review


7. Service Delivery (2): Service Charter; Benchmarking
8. Review Plan (2): Performance Agreements; External Audits

Table-8: Typical Best Practices for Operational Review

Operational Review Area (number of sub-areas): Typical Best Management Practices

9. Work Culture (2): Code of Conduct; Regular Staff Meetings
10. Communications (2): Annual Report; Website for the Public
11. Organization Structure (2): Organization Chart; Job Descriptions
12. Reporting Lines (2): Delegation of Powers; Chinese Walls
13. Human Resources (2): HR Manual; Training and Development Programs
14. Processes and Systems (2): Rules and Procedure Manuals; ICT Development Plans
15. Controls (2): Financial Information System; Management Information System
16. Expenditure and Asset Management (2): Asset Management Plan; Agency Overheads Analysis
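The 16 review areas and their paired practices lend themselves to a simple checklist structure for recording the findings of the desktop and field scans. The sketch below is an illustrative assumption rather than part of the paper's methodology; the area and practice names follow Tables 7 and 8, while the data structure, variable names and the encoding of the three scan questions are hypothetical.

```python
# Hypothetical sketch: the 16 review areas of Tables 7-8, each with two typical
# best-practice sub-areas, stored so that a desktop/field scan can record
# whether each practice was attempted, documented and widely applied.

REVIEW_AREAS = {
    # Strategic review areas (Table-7)
    "Strategies": ["Strategic Business Plan", "Master Plan"],
    "Environment": ["Socio-Economic-Political Environment Analysis", "SWOT Analysis"],
    "Clients": ["Clients' Needs and Satisfaction Surveys", "Grievances and Complaints Handling"],
    "Other Stakeholders": ["Stakeholders' Consultation", "Focus Groups"],
    "Regulation": ["Regulatory Review", "Parliamentary Consultative Committee Review"],
    "Policy": ["Ministerial Review", "Donors' Review"],
    "Service Delivery": ["Service Charter", "Benchmarking"],
    "Review Plan": ["Performance Agreements", "External Audits"],
    # Operational review areas (Table-8)
    "Work Culture": ["Code of Conduct", "Regular Staff Meetings"],
    "Communications": ["Annual Report", "Website for the Public"],
    "Organization Structure": ["Organization Chart", "Job Descriptions"],
    "Reporting Lines": ["Delegation of Powers", "Chinese Walls"],
    "Human Resources": ["HR Manual", "Training and Development Programs"],
    "Processes and Systems": ["Rules and Procedure Manuals", "ICT Development Plans"],
    "Controls": ["Financial Information System", "Management Information System"],
    "Expenditure and Asset Management": ["Asset Management Plan", "Agency Overheads Analysis"],
}

# One scan record per practice: the three questions asked in the desktop/field scan.
scan = {practice: {"attempted": False, "documented": False, "widely_applied": False}
        for practices in REVIEW_AREAS.values() for practice in practices}

assert len(scan) == 32  # 16 areas x 2 sub-areas, the basis of the 160-mark maximum
```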

    5.3 SWOT Analysis

After the diagnostic scan, a SWOT analysis may be undertaken to determine the strengths, weaknesses, opportunities and threats of the Agency.

    5.4 Strategic and Operational Performance Evaluation

An agency's performance can be assessed in relation to the 16 performance factors listed in Tables 7 and 8. Each factor can be given a score on a scale of 0 to 5, using an approach adapted from the Australian Quality Council:

    Table-9: Scores for Strategic and Performance Evaluation

    0 Approach had not been considered or attempted or does not exist.

    1 Some evidence of individual initiative and unsystematic efforts.

    2 Approach is planned and introduced in some areas in a limited way.

3 Systematic approach has been implemented in some areas and results are under examination.

4 Approach has been implemented in some areas and results/outcomes have been used to improve planning and budgeting.

5 Approach used in most areas and results/outcomes have been used to improve planning and budgeting.


Then a Borda Index (i.e. the sum of the scores for all factors) can be estimated. This provides a composite index for rating the performance of agencies. There are 32 (= 16 x 2) sub-areas, so an Agency can score a maximum of 160 marks. The total score can be expressed as a percentage of 160 marks. Percentages can also be calculated separately for strategic performance and operational performance; the total marks for each category are then expressed as a percentage of 80 marks. It is most unlikely that an Agency will be able to score 100% of the marks. On the basis of the percentage of marks, the strategic performance, the operational performance, or the combined strategic and operational performance of an Agency can be rated as follows:

Table-10: Rating of an Agency on the Basis of Performance Scores

Rating of Agency: Range of Performance Scores (in percentage)

(a) Effective (EF): 85 - 100
(b) Moderately Effective (ME): 70 - 84
(c) Adequate (AD): 50 - 69
(d) Ineffective (IN): 0 - 49
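To make the scoring arithmetic concrete, the following sketch (an illustration only, not part of the original methodology) sums hypothetical 0-5 sub-area scores into the 160-mark composite, computes the separate strategic and operational percentages, and maps them to the rating bands of Table-10. All sub-area scores and function names are assumed.

```python
# Illustrative sketch: Borda-style composite index as a percentage of the
# 160-mark maximum, with the rating bands of Table-10. Scores are hypothetical.

def rate(percentage: float) -> str:
    """Map a percentage score to the rating bands of Table-10."""
    if percentage >= 85:
        return "Effective (EF)"
    if percentage >= 70:
        return "Moderately Effective (ME)"
    if percentage >= 50:
        return "Adequate (AD)"
    return "Ineffective (IN)"

def composite(scores: list[int]) -> float:
    """Sum of 0-5 scores expressed as a percentage of the maximum possible."""
    assert all(0 <= s <= 5 for s in scores)
    return 100.0 * sum(scores) / (5 * len(scores))

# 16 strategic and 16 operational sub-area scores (hypothetical values)
strategic = [3, 4, 2, 3, 5, 4, 3, 2, 4, 3, 3, 2, 4, 5, 3, 4]
operational = [2, 3, 3, 4, 2, 3, 4, 3, 2, 3, 4, 3, 2, 3, 4, 3]

strat_pct = composite(strategic)                   # out of 80 marks
oper_pct = composite(operational)                  # out of 80 marks
overall_pct = composite(strategic + operational)   # out of 160 marks

print(f"Strategic:   {strat_pct:.1f}% -> {rate(strat_pct)}")
print(f"Operational: {oper_pct:.1f}% -> {rate(oper_pct)}")
print(f"Combined:    {overall_pct:.1f}% -> {rate(overall_pct)}")
```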

5.5 Compliance, Efficiency and Effectiveness Evaluation

Under compliance evaluation, program- and sub-program-wise budgeted expenditure is compared with actual expenditure, and the following marks are assigned to each program:

Table-11: Marks for Budget Compliance Evaluation

0 If actual expenditure exceeds budgeted expenditure by more than 10 per cent.

1 If actual expenditure exceeds budgeted expenditure by more than 7.5 per cent but less than 10 per cent.

2 If actual expenditure exceeds budgeted expenditure by more than 5 per cent but less than 7.5 per cent.

3 If actual expenditure exceeds budgeted expenditure by more than 2.5 per cent but less than 5 per cent.

4 If actual expenditure exceeds budgeted expenditure by less than 2.5 per cent.

5 If actual expenditure is within the budgeted expenditure.

Under efficiency evaluation, budgeted outputs are compared with actual outputs, and the following marks are assigned to each program.


Table-12: Marks for Budget Efficiency Evaluation

0 If actual output falls short of budgeted output by more than 10 per cent.

1 If actual output falls short of budgeted output by more than 7.5 per cent but less than 10 per cent.

2 If actual output falls short of budgeted output by more than 5 per cent but less than 7.5 per cent.

3 If actual output falls short of budgeted output by more than 2.5 per cent but less than 5 per cent.

4 If actual output falls short of budgeted output by less than 2.5 per cent.

5 If actual output is at least equal to the budgeted output.

Under effectiveness evaluation, budgeted outcomes are compared with actual outcomes, and the following marks are assigned to each program. However, one has to wait a number of years before the outcome results become available; therefore, for the next three years, effectiveness evaluation may not be feasible.

Table-13: Marks for Budget Effectiveness Evaluation

0 If actual outcome falls short of budgeted outcome by more than 10 per cent.

1 If actual outcome falls short of budgeted outcome by more than 7.5 per cent but less than 10 per cent.

2 If actual outcome falls short of budgeted outcome by more than 5 per cent but less than 7.5 per cent.

3 If actual outcome falls short of budgeted outcome by more than 2.5 per cent but less than 5 per cent.

4 If actual outcome falls short of budgeted outcome by less than 2.5 per cent.

5 If actual outcome is at least equal to the budgeted outcome.

After assigning marks for all sub-programs, the actual marks obtained for all programs of an Agency are expressed as a percentage of the total possible marks.
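The mark schedules of Tables 11-13 share the same banding logic, differing only in whether the adverse deviation is an overspend or a shortfall. The sketch below is a hypothetical illustration of that banding; the function names and sample figures are assumptions, not from the paper.

```python
# Illustrative sketch: assigning the 0-5 marks of Tables 11-13 from percentage
# deviations. For compliance the deviation is the overspend relative to budget;
# for efficiency and effectiveness it is the shortfall of actual output/outcome
# relative to the budgeted level.

def marks_from_deviation(deviation_pct: float) -> int:
    """Return marks 0-5 for a given adverse deviation (overspend or shortfall)."""
    if deviation_pct <= 0:       # within budget / target met or exceeded
        return 5
    if deviation_pct < 2.5:
        return 4
    if deviation_pct < 5:
        return 3
    if deviation_pct < 7.5:
        return 2
    if deviation_pct < 10:
        return 1
    return 0

def compliance_marks(budgeted: float, actual: float) -> int:
    """Compliance: adverse deviation is expenditure above budget."""
    return marks_from_deviation(100.0 * (actual - budgeted) / budgeted)

def efficiency_marks(budgeted_output: float, actual_output: float) -> int:
    """Efficiency: adverse deviation is output below the budgeted level."""
    return marks_from_deviation(100.0 * (budgeted_output - actual_output) / budgeted_output)

# Example: a program overspends by 6% and delivers 97% of its budgeted output.
print(compliance_marks(100.0, 106.0))   # -> 2 marks (Table-11)
print(efficiency_marks(1000.0, 970.0))  # -> 3 marks (Table-12)
```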

5.6 Overall Assessment and Score

    Thus, we have the following three broad evaluations:

(1) Strategic Plan and Baseline Profile Evaluation
(2) Strategic and Operational Performance Evaluation
(3) Compliance and Effectiveness Evaluation

For the overall assessment, a weight of 30 per cent may be given to strategic plan and baseline evaluation, 20 per cent to strategic and operational performance evaluation, and 50 per cent to budget compliance and effectiveness evaluation.


Table-14: Weights for Various Types of Evaluation

Type of Evaluation: Weight

1-A Strategic Plan Evaluation: 10%
1-B Systems Development: 10%
1-C Human Resource Development: 10%
2-A Strategic Performance Evaluation: 10%
2-B Operational Performance Evaluation: 10%
3-A Program Budget Compliance: 25%
3-B Program Budget Effectiveness: 25%
Total: 100%

Translating Performance Scores into Ratings: Finally, the overall performance scores will be converted into qualitative ratings using the scoring bands given in the following table:

Table-15: Rating of an Agency on the Basis of Overall Scores

Rating of Agency: Range of Performance Scores (in percentage)

(a) Effective (EF): 85 - 100
(b) Moderately Effective (ME): 70 - 84
(c) Adequate (AD): 50 - 69
(d) Ineffective (IN): 0 - 49

There will be another category, called "Results Not Demonstrated", for cases where an Agency does not have performance measures agreed with the MOF either for baselines or for the assessment year.

    An Example:

To provide an example, let us assume that we are evaluating budget performance for three Agencies: A, B and C. The results are given in Tables 16-17.


Table-16: An Example of Performance Scores for Three Agencies

Type of Evaluation (Weight)                      Performance Scores (%)   Weighted Scores
                                                 A     B     C            A      B      C
1-A Strategic Plan Evaluation (0.10)             50    60    85           5.0    6.0    8.5
1-B Systems Development (0.10)                   50    50    75           5.0    5.0    7.5
1-C Human Resource Development (0.10)            60    50    70           6.0    5.0    7.0
2-A Strategic Performance Evaluation (0.10)      50    50    85           5.0    5.0    8.5
2-B Operational Performance Evaluation (0.10)    60    70    70           6.0    7.0    7.0
3-A Program Budget Compliance (0.25)             40    70    70           10.0   17.5   17.5
3-B Program Budget Effectiveness (0.25)          40    70    70           10.0   17.5   17.5
Total (weight 1.00)                                                       47.0   63.0   73.5

On the basis of these scores, the Agencies would be graded as given in Table-17:

Table-17: Estimation of Overall Rating for Three Agencies

Type of Evaluation (Weight)                      Performance Scores (%)   Rating
                                                 A     B     C            A     B     C
1-A Strategic Plan Evaluation (0.10)             50    60    85           AD    AD    EF
1-B Systems Development (0.10)                   50    50    75           AD    AD    ME
1-C Human Resource Development (0.10)            60    50    70           AD    AD    ME
2-A Strategic Performance Evaluation (0.10)      50    50    85           AD    AD    EF
2-B Operational Performance Evaluation (0.10)    60    70    70           AD    ME    ME
3-A Program Budget Compliance (0.25)             40    70    70           IN    ME    ME
3-B Program Budget Effectiveness (0.25)          40    70    70           IN    ME    ME
Total (weight 1.00)                              47    63    73.5         IN    AD    ME

Note: AD stands for Adequate, EF for Effective, IN for Ineffective and ME for Moderately Effective.
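As a check on the arithmetic of the example, the following sketch recomputes the weighted totals of Table-16 and the overall ratings of Table-17 from the component scores and weights. The figures come from the tables above; the code itself, including its function and variable names, is only an assumed illustration.

```python
# Illustrative sketch: reproducing the weighted totals of Table-16 and the
# overall ratings of Table-17 for the three hypothetical agencies.

weights = {
    "1-A Strategic Plan Evaluation": 0.10,
    "1-B Systems Development": 0.10,
    "1-C Human Resource Development": 0.10,
    "2-A Strategic Performance Evaluation": 0.10,
    "2-B Operational Performance Evaluation": 0.10,
    "3-A Program Budget Compliance": 0.25,
    "3-B Program Budget Effectiveness": 0.25,
}

# Performance scores (%) for Agencies A, B and C, as in Table-16.
scores = {
    "1-A Strategic Plan Evaluation": (50, 60, 85),
    "1-B Systems Development": (50, 50, 75),
    "1-C Human Resource Development": (60, 50, 70),
    "2-A Strategic Performance Evaluation": (50, 50, 85),
    "2-B Operational Performance Evaluation": (60, 70, 70),
    "3-A Program Budget Compliance": (40, 70, 70),
    "3-B Program Budget Effectiveness": (40, 70, 70),
}

def rate(pct: float) -> str:
    """Rating bands of Table-15."""
    return ("EF" if pct >= 85 else
            "ME" if pct >= 70 else
            "AD" if pct >= 50 else "IN")

for i, agency in enumerate(["A", "B", "C"]):
    total = sum(weights[k] * scores[k][i] for k in weights)
    print(f"Agency-{agency}: overall score {total:.1f}% -> {rate(total)}")

# Expected output: Agency-A 47.0% -> IN, Agency-B 63.0% -> AD, Agency-C 73.5% -> ME
```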


    Selected References

Bergin, Jeffrey (2004) Performance Based Budgeting, Performance Management Institute.

Canadian Comprehensive Auditing Foundation (2002) Reporting Principles: Taking Public Performance Reporting to a New Level, Ottawa, 2002.

Das, Tarun (2007a) Preparation of Strategic Business Plans - General Guidelines, Suggestions for Improvement, and Summary of Recommendations, Final Report, pp. 1-74, 30 Sept 2007.

Das, Tarun (2007b) Output Costing and Output Budgeting - Basic Concepts and Methodology, pp. 1-51, October 2007.

Das, Tarun (2007c) Accrual Accounting and Accrual Budgeting - Basic Concepts and Methodology, pp. 1-43, November 2007.

Das, Tarun (2007d) Transition from Cash to Accrual Accounting, pp. 1-26, Nov 2007.

Das, Tarun (2007e) Benchmarks Setting and Best Practices for Output Costing and Output Budgeting - Part-1: Basic Concepts, pp. 1-31, Dec 2007.

Das, Tarun (2007f) Benchmarks Setting and Best Practices for Output Costing and Output Budgeting - Part-2: Practical Applications for Mongolia, pp. 1-36, Dec 2007.

Das, Tarun (2007g) Terminal Report: Part-1, Major Conclusions and Recommendations, pp. 1-70, and Part-2 on Strategic Business Plans, Output Costing and Output Budgeting, Accrual Accounting and Accrual Budgeting, and Benchmarks Setting, pp. 71-157, ADB Capacity Building Project on Governance Reforms, prepared by Tarun Das; contains detailed guidelines on output costing.

Government of Australia, Council on the Cost and Quality of Government (2001) Annual Report 2001, November 2001.

Government of Mongolia (2002) Public Sector Management and Finance Act (PSMFA), 27 June 2002.

Government of USA (2005) Performance Management Process: Strategic Planning, Budget and Performance Management Cycle, General Services Administration (GSA), Office of the Chief Financial Officer, 31 January 2005.

Hatry, Harry P. (1977) How Effective Are Your Community Services?, The Urban Institute, Washington, D.C.


Kaplan, Robert S. and David P. Norton (1996) The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press.

Mercer, John: Website on GPRA and Performance Management: www.governmentperformance.info

Mercer, John (2003) CASCADE Performance Budgeting - A Guide to an Effective System of Integrating Budget and Performance Information and for Linking Long-Term Goals for Day-to-Day Activities, USA, May 2003, www.governmentperformance.info

Meyers, Roy T. (1996) Is There a Key to the Normative Budgeting Lock?, The World Bank, Washington, D.C.

Schick, Allen (1995) Federal Budget: Politics, Policy and Process, Brookings Institution.

Steiner, George (1997) Strategic Planning: What Every Manager Must Know, Simon and Schuster.

Thomas, Paul G. (2004) Performance Measurement, Reporting and Accountability: Recent Trends and Future Directions, Saskatchewan Institute of Public Policy, Paper No. 23, February 2004; http://www.uregina.ca/sipp/

Thomas, Paul G. (2005) Performance Measurement and Management in the Public Sector, Optimum Online: The Journal of Public Sector Management, Vol. 35, Issue 2, July 2005. http://www.optimumonline.ca/

Thomas, Paul G. (2006) Performance Measurement, Reporting, Obstacles and Accountability - Recent Trends and Future Directions, Research School of Social Sciences, The Australian National University, Canberra ACT 0200.

USA (1993) Government Performance and Results Act (GPRA) of 1993, Office of Management and Budget (OMB).

United States of America, Office of Management and Budget (OMB) Homepage: http://www.whitehouse.gov/omb/gils/gil-home.html
