Bureau of Engineering

Table of Contents

CHAPTER 1 - EXECUTIVE SUMMARY 1

A. Introduction 1

B. Study Methodology 2

C. Conclusions of Process Benchmarking: Best Management Practices 3

D. Conclusions of Performance Benchmarking 3

E. Lessons Learned 8

CHAPTER 2 - INTRODUCTION 11

A. Study Background 11

B. Study Objectives 12

C. Participants 12

D. Report Structure 13

CHAPTER 3 - PARTICIPATING AGENCIES 15

A. Introduction 15

B. Description of Participating Agencies 16

I. City of Long Beach 16

II. City of Los Angeles 18

III. City of Oakland 20

IV. City of Sacramento 22

V. City of San Diego 24

VI. City and County of San Francisco 26

VII. City of San Jose 28

C. Similarities and Differences 30


Annual Report Update 2003 California Multi-Agency CIP Benchmarking Study

CHAPTER 4 - PROCESS BENCHMARKING 31

A. Guiding Principles 31

B. Implementation of Best Management Practices 31

C. Additional Improvements 31

D. Discussion Items for the Improvement of Processes 36

E. Online Discussions and Communications 36

F. Special Studies 39

CHAPTER 5 - PERFORMANCE BENCHMARKING 41

A. Guiding Principles 41

B. Additional Data Collection and Database Improvements 41

C. Distribution of Projects 42

D. Performance Graph Development 43

E. Discussion 43

CHAPTER 6 - CONCLUSIONS AND RECOMMENDATIONS 51

A. Process Benchmarking – Recommended Best Management Practices 51

B. Performance Benchmarking 51

C. Study Qualifications and Characteristics 51

D. Next Steps 52

ACKNOWLEDGEMENTS I

APPENDICES

APPENDIX A - ADMINISTRATIVE INFORMATION A-1

APPENDIX B - PERFORMANCE BENCHMARKING B-1

Curves Group 1 B-3
Curves Group 2 B-17
Curves Group 3 B-33
Curves Group 4 B-49
Curves Group 5 B-65

APPENDIX C - OUTLIERS IDENTIFICATION C-1

APPENDIX D - MULTI-AGENCY BENCHMARKING DATABASE (UPDATE 2003) D-1


Chapter 1

Executive Summary


A. INTRODUCTION

Municipal agencies have become increasingly concerned that the condition of their infrastructure reflects the condition and quality of life within their respective communities. At the same time, the practice of funding public projects by borrowing money through the sale of municipal bonds provides an incentive to use the funds with extraordinary care and efficiency. To that end, the intent of this study continues to be: to improve the public project delivery process.

Over the next three years, seven of the largest cities in California will continue with their Capital Improvement Programs and will award nearly $6 billion in public works infrastructure construction contracts. These municipalities are building roads and transportation systems, sewer and water infrastructure, municipal facilities, libraries, parks and recreation facilities, animal shelters, fire stations, bridges, seismic retrofits, bikeways, storm drains, and other facilities.

Other significant costs - over and above the $6 billion for construction - are required to deliver these projects. The costs associated with the project delivery process - which involves planning, design, environmental documentation, value engineering, permits, construction management, inspection, testing, and startup - are influenced by many factors. These factors include project size and complexity, whether the work is new construction or rehabilitation, the internal organization of the agency, and the many tasks included in the process of delivering the project.

Update 2003 of the California Multi-Agency CIP Benchmarking Study gives government decision-makers a critically needed tool: a means of more accurately anticipating the true costs of public projects, based upon an in-depth study of over 500 projects. Additionally, Update 2003 provides insight into improving the delivery process through the use of Best Management Practices and continued agency collaboration.

Background

In October 2001, the City of Los Angeles, Department of Public Works, Bureau of Engineering initiated a benchmarking study with five other large cities in California. This unprecedented study required the cooperative effort of individuals responsible for the development and implementation of Capital Improvement Projects (CIP). These cities and their representatives joined together to form the California Multi-Agency CIP Benchmarking Study Team. After working together for two years, this team has shown that it is possible - and beneficial - for cities to collaborate, pool their knowledge and experience with the factors that influence project delivery costs, and then benchmark their project delivery processes to learn from each other's successes.

The objective of the California Multi-Agency CIP Benchmarking Study was to provide a general analysis of the efficiency of capital project delivery systems within various agencies in California. The analysis was based on the observed performance and the processes related to the development of public works projects implemented in the previous five years.


The first year (2002) of the California Multi-Agency CIP Benchmarking Study was the beginning of a planned cooperative and continuous effort that could eventually include other California agencies. The study initially involved six agencies, with a seventh (the City of Oakland) joining the team in 2003. The following agencies participated in the second year of the study:

City of Long Beach, Department of Public Works

City of Los Angeles, Department of Public Works/Bureau of Engineering

City of Oakland, Public Works Agency

City of Sacramento, Department of Public Works

City of San Diego, Engineering & Capital Projects

City & County of San Francisco, Department of Public Works / Bureau of Engineering / Bureau of Architecture / Bureau of Construction Management

City of San Jose, Department of Public Works

Process and Performance Benchmarking

This document, Update 2003, is the result of the first two years of collaboration among these seven agencies. The study examined process benchmarks, focusing on business processes (the approach to managing Capital Improvement Projects in the individual agencies).

Twenty-four recommended Best Management Practices were identified in the first edition of this study to deliver high-quality projects faster and at lower cost. The Benchmarking Study Team also identified 15 common Best Management Practices that are currently used by most participants in the study. Update 2003 documented the current and planned implementation of the Best Management Practices by each agency as the first step in actually linking improved processes to performance.

Update 2003 also examined performance benchmarking, which involved an analysis of cost and schedule data for projects from the participating agencies. The agencies provided data for 525 Capital Improvement Projects, of which 453 complied with the 2003 criteria and were used to update the performance models, with the objective of improving the reliability of the results.

The California Multi-Agency CIP Benchmarking Study is intended to be a continuing effort. Future updates are expected to refine and improve the conclusions and recommendations as additional project data are collected. Annual updates of this report are planned.

B. STUDY METHODOLOGY

Update 2003 was conducted in four stages:

1. One focus of the first stage of Update 2003 was to improve process benchmarking by reviewing the level at which agencies implemented the study's recommended Best Management Practices. Two new recommended Best Management Practices were added in Update 2003.

2. Performance benchmarking data collection criteria were updated in the second stage of this year's study. The Study Team reviewed and modified general criteria for performance data collection (project selection, categories, and performance curve format) based on the availability of information and the agencies' expectations.

3. The third stage of the study emphasized performance data enhancement, data compilation into the project database, and development and optimization of performance curves (graphs that relate the cost of construction to


the various costs of project delivery). Performance data on a total of 453 projects with a total construction value of over $830 million were used to develop the performance benchmarking curves (graphs) for 14 different classifications in four project types (municipal facilities, streets, pipe systems, and parks), showing design, construction management, overall project delivery, and change order costs as a percentage of total construction costs.

4. The fourth stage of Update 2003 consisted of a review and discussion of performance and process benchmarking outcomes.
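The performance curves described in stage 3 plot a delivery-cost component, expressed as a percentage of total construction cost, against project size. As a rough illustration of the technique - not the study's actual curve-fitting procedure, and with invented numbers - a power-law trend line can be fitted on log-log axes:

```python
# Sketch of fitting a performance curve: delivery cost (% of construction
# cost) versus total construction cost, with a power-law trend line fitted
# by least squares on log-log axes. All data below are illustrative only.
import math

# (total construction cost in $, delivery cost as % of construction cost)
projects = [
    (150_000, 45.0),
    (400_000, 38.0),
    (1_200_000, 30.0),
    (5_000_000, 24.0),
    (20_000_000, 18.0),
]

# Least-squares fit of log(pct) = b*log(cost) + log(a), i.e. pct = a * cost**b
xs = [math.log(cost) for cost, _ in projects]
ys = [math.log(pct) for _, pct in projects]
n = len(projects)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = math.exp(mean_y - b * mean_x)

def predicted_pct(construction_cost: float) -> float:
    """Delivery cost predicted by the fitted curve, as % of construction cost."""
    return a * construction_cost ** b
```

Because the fitted exponent b is negative for data like these, the curve reproduces the report's central observation: the delivery-cost percentage declines as project size grows.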

C. CONCLUSIONS OF PROCESS BENCHMARKING: BEST MANAGEMENT PRACTICES

In the original study, team members identified, discussed, and evaluated 98 processes associated with the effective delivery of capital projects. Update 2003 studied the practical implementation of these processes and introduced two new Best Management Practices. Table A indicates the implementation status and planning of the Best Management Practices so that implementation can eventually be tracked against anticipated performance improvements.

In addition, through their BMP and process improvement discussions, the Project Team identified certain improvements in contract documents which may be critical to process improvement:

Standardized Indemnification Language

Standardized indemnification language, in which risk is reasonably apportioned to the parties most able to control it, would let design professionals and their insurers know what to expect and would expedite agreements.

Standardized Document Ownership Language

The development of standard document ownership language and clauses would let design professionals know what to expect and would expedite agreements.

The implementation of these process improvements is outside the immediate control of the project delivery team. A plan for implementation of these and other process improvements will be developed as part of the next generation of this report.

D. CONCLUSIONS OF PERFORMANCE BENCHMARKING

The following performance benchmarking conclusions are based upon the Study Team's analysis of project data:

The percentage of design costs decreased with increasing project size. Design costs averaged 17% of the total construction cost for 453 representative projects completed after 1997, each with a total construction cost greater than $100,000.

The ratio of costs for construction management decreased as total construction costs increased. Construction management averaged 16% of the total construction cost for the 453 representative projects completed after 1997 with a total construction cost greater than $100,000.

Based on the performance data, total project delivery cost (total design cost plus construction management cost) for the 453 projects with a total construction cost greater than $100,000 averaged 33% of the total construction cost.
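The averages above can be illustrated with a small sketch. The figures and record layout below are invented for illustration; only the screening criteria (completed after 1997, total construction cost above $100,000) come from the text:

```python
# Minimal sketch of the averaging described above, using invented numbers.
# Keep only projects completed after 1997 with construction cost > $100,000,
# then average design and construction management (CM) costs as percentages
# of construction cost.
projects = [
    # (year completed, construction cost, design cost, CM cost) -- illustrative
    (1998, 250_000, 50_000, 45_000),
    (2000, 1_500_000, 230_000, 210_000),
    (1996, 300_000, 60_000, 55_000),   # excluded: completed before 1997
    (2002, 90_000, 20_000, 18_000),    # excluded: construction cost too small
]

included = [p for p in projects if p[0] > 1997 and p[1] > 100_000]

# Average of per-project ratios, expressed as percentages
design_pct = 100 * sum(d / c for _, c, d, _cm in included) / len(included)
cm_pct = 100 * sum(cm / c for _, c, _d, cm in included) / len(included)
```

With these invented inputs the two excluded records are dropped and the remaining ratios are averaged, mirroring how the study's 17% design and 16% construction management figures are defined.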


The relationship between change order costs and total construction cost was difficult to quantify. Future updates of the California Multi-Agency CIP Benchmarking Study will investigate the possibility of categorizing change orders in two groups: "Scope Changes" and "All Others". This categorization should lead to better correlations and more intuitive trends.
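The proposed two-way grouping could be sketched as a simple classifier. The category labels come from the text; the keyword rule itself is purely an assumption for illustration:

```python
# Hypothetical sketch of the proposed change order categorization.
# The two labels are from the study; the keyword-based rule is invented.
def categorize_change_order(reason: str) -> str:
    """Bucket a change order description as 'Scope Changes' or 'All Others'."""
    scope_keywords = ("scope", "added work", "owner request")
    if any(k in reason.lower() for k in scope_keywords):
        return "Scope Changes"
    return "All Others"
```

A real implementation would presumably rely on how each agency codes its change orders rather than on free-text matching, but the two-bucket structure is the point of the proposal.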

The additional project duration data provided for Update 2003 led to much better duration models. These improved models were significantly more realistic than those in last year's report and were used to develop a tabulation of duration data for Update 2003.

Table B shows various project delivery costs for Capital Improvement Projects with known construction values. This table was improved in Update 2003: in some areas it is a more reliable predictive tool due to additional representative data and improved correlations.

E. LESSONS LEARNED

Additional data collection is still warranted. In Update 2003, where significant additional data were provided, some of the statistical correlations improved significantly. In future benchmarking studies, more data collection in all categories should improve correlation coefficients and make the performance models more effective for prediction.

Critical review of the 2003 data indicated that the performance models would be significantly improved by eliminating "non-representative" projects. These are the projects that appear as anomalies in the performance models and are proven not to represent the agencies' standard project delivery procedures.

Implementing the recommended Best Management Practices is essential if agencies are to improve their processes. The CIP Benchmarking Team should monitor agencies' progress in implementing these practices and compare performance results to study the actual effectiveness of such practices.

Team presentations and online discussions proved to be effective tools to communicate and share experiences.
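The screening of "non-representative" anomalies mentioned in the lessons above can be illustrated with a common outlier test. This is an assumption for illustration, not the study's documented method: flag projects whose delivery-cost percentage lies more than two standard deviations from the mean.

```python
# Illustrative two-standard-deviation outlier screen (not the study's actual
# procedure). The 95.0 entry is an invented anomaly among otherwise typical
# delivery-cost percentages.
import statistics

delivery_pct = [22.0, 28.0, 31.0, 25.0, 27.0, 95.0, 30.0, 26.0]

mean = statistics.mean(delivery_pct)
stdev = statistics.pstdev(delivery_pct)

# Flag values more than two (population) standard deviations from the mean
outliers = [p for p in delivery_pct if abs(p - mean) > 2 * stdev]
```

In practice the study also required that a flagged project be shown not to represent an agency's standard delivery procedures before it was excluded; a purely statistical screen is only the first step.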


Chapter 2

Introduction


A. STUDY BACKGROUND

The City of Los Angeles, Department ofPublic Works/Bureau of Engineering ini-tiated the California Multi-Agency CIP

Benchmarking Study with five other larger Cali-fornia municipal agencies in October of 2001.These agencies recognized this event as an oppor-tunity to improve the efficiency of deliveringCapital Improvement Projects throughout Cali-fornia. They took the unprecedented step of shar-ing costs and procedures related to the delivery ofpublic works projects implemented during theprevious five years. Their CIP Benchmarking Studywas widely recognized for the innovative workthat it represents. (See below)

The 2002 California Multi-Agency CIP Benchmarking Study was presented at the American Public Works National Congress (Conference) in Kansas City, Missouri, on September 24, 2002.

The City of Los Angeles Quality and Productivity Commission recognized the Bureau of Engineering's efforts on this study with a Productivity Award in May 2003.

The study received an Honorable Mention from the League of California Cities as a candidate for the prestigious Helen Putnam Award for Excellence program, recognizing its innovative approach to reducing costs through multi-agency collaboration.

In June 2003, the City of Los Angeles presented the study to a group of Washington agencies, including the King County Department of Natural Resources, Wastewater Treatment Division; the King County Department of Transportation, Metro Transit Division; and the City of Seattle.

The City of San Diego used the trend line for design costs on Municipal Facilities. The line appeared to be high for the City of San Diego. In its research using the 2002 Benchmarking report data, the City determined that all libraries had significantly longer durations, which inflated the soft costs of project delivery. San Diego found that the longer durations were caused by lags in State grants and private donations, and/or incremental CIP funding from other fund sources. Hard cost escalation was also a factor, given the longer duration to get the projects out to bid. San Diego is now considering a $312 million financing plan that will pay for a massive overhaul of the city's library system and provide for more predictable project funding and delivery.

The City of San Jose is using the performance models from the 2002 report as performance targets for project delivery. They are also committed to enhancing the benchmarking database so that it becomes a more credible predictive tool.

Update 2003 began with a meeting of the Project Team on October 2, 2002, in Long Beach, California. The City of Oakland joined the six original study participants in 2003. The Project Team now represents seven of the largest communities in California.

The Project Team met quarterly (February, April, and July) to continue the second year of the CIP Benchmarking Study. During this year's efforts, the Project Team:


Reviewed and evaluated the implementation of 98 Best Management Practices.

Enhanced the performance benchmarking database with data from 286 new projects. The database currently contains information on 525 Capital Improvement Projects, of which 453 fit the 2003 performance guidelines and are included in Update 2003 (valued at $830 million). Twenty-five of the 525 projects were smaller than $100,000; 13 were completed before 1997; and 34 were identified as non-representative projects and were excluded from the 2003 analysis.

Prepared Update 2003 to report the findings of the second year of the CIP Benchmarking Study.
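The project counts in the bullets above are internally consistent, as a quick arithmetic check shows:

```python
# Screening arithmetic from the text: 525 projects collected, minus the
# three exclusion categories, should leave the 453 projects used in 2003.
total_projects = 525
below_100k = 25          # construction cost under $100,000
pre_1997 = 13            # completed before 1997
non_representative = 34  # anomalies excluded from the 2003 analysis

included = total_projects - below_100k - pre_1997 - non_representative
print(included)  # prints 453
```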

B. STUDY OBJECTIVES

The purpose of the first year of the study was to analyze and benchmark how capital improvement projects are delivered within California. The analysis included the study of both the processes used to deliver the projects and the actual performance results.

The primary objective of Update 2003 focused on project delivery process improvement and emphasized Best Management Practices implementation. Specifically, the Project Team began a series of presentations to share BMP information.

Performance objectives were identified for Update 2003, as follows:

Collect additional data to increase the number of projects included in the database to improve the statistical credibility of the results.

Improve the models by identifying and correcting missing data and errors.

Develop an improved “Predictive Tool” using the new, larger pool of project data that also excluded non-representative projects.

Collect data on additional project types and classifications.

Begin the process of linking processes to performance.

Conduct a comparison between in-house delivery and the usage of consultants.

Improve the user interface with the database to facilitate reporting.

C. PARTICIPANTS

The City of Los Angeles, Department of Public Works, Bureau of Engineering facilitated the Update 2003 benchmarking study and sponsored a Study Team of expert staff that was responsible for logistics, management, and execution of the 2003 study. The following agencies contributed to the study:

City of Long Beach, Department of Public Works

City of Los Angeles, Department of Public Works/Bureau of Engineering

City of Oakland, Public Works Agency

City of Sacramento, Department of Public Works

City of San Diego, Engineering & Capital Projects

City and County of San Francisco, Department of Public Works / Bureau of Engineering / Bureau of Architecture / Bureau of Construction Management

City of San Jose, Department of Public Works

The Project Team has now worked together for almost two years to plan and implement the benchmarking study.


D. REPORT STRUCTURE

Update 2003 is organized as follows:

This introductory chapter (Chapter 2) provides a brief explanation of the project history, objectives, and project participants.

Chapter 3 provides a profile of each of the participating agencies, including descriptions of their city and agency structure and their capital improvement programs for FY 2002-03 through FY 2004-05.

Chapter 4 identifies common and recommended Best Management Practices, based on process benchmarking, and discusses process study findings.

Chapter 5 describes performance benchmarking and explains the basis for project selection and data definition. Performance graphs that have been generated from the project database (Appendix B) are also reviewed and discussed within Chapter 5.

Chapter 6 contains the conclusions and recommendations based on the process benchmarking results in Chapter 4 and the performance benchmarking graphs that are presented in Appendix B and discussed in Chapter 5.

Chapter 3

Participating Agencies


A. INTRODUCTION

This section of the report updates information on the six agencies that participated in the California Multi-Agency CIP Benchmarking Study - 2002 and provides information on the seventh agency (City of Oakland) that joined the study for Update 2003.

Each agency's summary includes a City Description, an Agency Description, and CIP project information that has been updated for the next three years (FY 02-03 through FY 04-05). The City of Oakland has also provided information for fiscal year 2001-2002 for comparison with the other agencies.

"In-house Project Delivery Services" (Table C) was updated to include the City of Oakland. The Fact Sheet (Table D) was updated for FY 02-03 through FY 04-05.

The agencies' operations and approaches to project delivery are very similar. They generally have a "strong project management" approach, with a project manager responsible for budgets, schedules, and quality management from the beginning of a project to the end. The City of Long Beach had initiated this "strong project management" approach during the 2002 study, and is now implementing it successfully.

Collectively, the seven participating cities expect to award nearly $6 billion in public works capital improvement project contracts over the next three years.


B. DESCRIPTION OF PARTICIPATING AGENCIES

I. City of Long Beach

POPULATION 461,522

AREA 50 square miles

WEBSITE ADDRESS http://www.long-beach.gov

FORM OF GOVERNMENT

Long Beach has a Council-Manager form of government as provided by the Charter effective July 5, 1921. The current City Charter became effective in 1980. The Mayor, City Auditor, City Prosecutor, and City Attorney are elected by the people every four years. Nine City Council members representing nine districts are elected by the people to four-year terms. Members of commissions are appointed by the Mayor, subject to the approval of the City Council. Most other officials and employees of the City are subject to the civil service provisions of the Charter.

DEPARTMENT OF PUBLIC WORKS - Bureau of Engineering

The Bureau of Engineering is responsible for the design and construction of all public facilities, streets, sewers, and storm drains. The Bureau is also responsible for the engineering features and standards of all privately developed subdivisions, tracts, and construction of public improvements in the City's right-of-way. The head of the Bureau is the City Engineer.

Bureau personnel work on the expansion and modernization of over 860 miles of streets, the design and construction of marinas, airport facilities, parking structures, bridge rehabilitations, and other public works projects. Recent past projects include the Emergency Communications and Operations Center, the Lakewood Boulevard Widening, and the seismic retrofit of the historic Rancho Los Cerritos.

The Bureau has over 90 employees in many different disciplines, including engineering, architecture, surveying, drafting, and construction management.

PROJECT MANAGEMENT/DELIVERY

The Bureau uses a strong project management delivery system in which projects are assigned to a project manager who is responsible for the budget and schedule from planning through project closeout. Project funding is usually generated from a variety of funding sources. The three groups/divisions within the Bureau have a philosophical approach to a design-bid-build project delivery system, with the objective of using a mix of in-house and consultant contracts to provide design and construction management.


CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENT PROGRAM

Construction contracts to be awarded for Fiscal Year 2002-03 through Fiscal Year 2004-05:

Program Total Projects Total Cost

Airport 12 $24,000,000

Community Development 2 $9,000,000

Parks, Recreation and Marine 32 $46,000,000

Public Facilities 29 $81,000,000

Public Thoroughfares 22 $63,000,000

Storm Drains 1 $4,000,000

Tidelands 16 $8,000,000

Total 114 $235,000,000


II. City of Los Angeles

POPULATION 3,694,820

AREA 469 square miles

WEBSITE ADDRESS http://eng.lacity.org

FORM OF GOVERNMENT

Los Angeles has a Mayor-Council-Commission form of government as provided by the Freeholders' Charter effective July 1, 1925. The current City Charter became effective on July 1, 2000. The people elect the Mayor, City Controller, and City Attorney every four years. Fifteen City Council members representing fifteen districts are elected to four-year terms. Members of commissions are appointed by the Mayor, subject to the approval of the City Council. With few exceptions, all other officials and employees of the City are subject to the civil service provisions of the Charter.

DEPARTMENT OF PUBLIC WORKS - Bureau of Engineering

The Bureau of Engineering is responsible for the design and construction of all public facilities, streets, sewers, and storm drains. The Bureau is also responsible for the engineering features and standards of all privately developed subdivisions, tracts, and construction of public improvements in the City's right-of-way. The head of the Bureau is the City Engineer.

Bureau personnel work on the expansion and modernization of over 7,400 miles of streets, 1,000 miles of storm drains, and 6,500 miles of sewer lines, and on the design and construction of police and fire stations, libraries, parking structures, wastewater treatment plants, bridges, and other public works projects. Recent past projects include the Convention Center Expansion, the renovation of the Central Library, and the seismic retrofit of City Hall.

The Bureau has over 1,000 employees in many different disciplines, including engineering, architecture, surveying, drafting, real estate, environmental, and construction management.

PROJECT MANAGEMENT/DELIVERY

The Bureau uses a strong project management delivery system in which projects are assigned to a project manager who is responsible for the budget and schedule from planning through project closeout. Project funding is usually generated from special funds, including bonds, user fees, and grants. The 33 groups/divisions within the Bureau use a design-bid-construct project delivery system, with the objective of using in-house resources to provide design and construction management. Consultants are used to supplement in-house resources when necessary.


CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENT PROGRAM

Construction contracts to be awarded for Fiscal Year 2002-03 through Fiscal Year 2004-05 include:

Program Total Projects Total Cost

Animal Bond 8 $84,200,000

Bridge Improvement Program 28 $66,600,000

Fire Bond 21 $183,800,000

Library Bond 10 $38,200,000

Municipal Facilities 20 $44,700,000

Recreation Facilities (Prop. K) 44 $56,100,000

Seismic Bond 2 $16,800,000

Storm-water Program 40 $18,600,000

Street Program 44 $127,800,000

Wastewater Program 133 $224,800,000

Proposition Q 6 $124,200,000

Total 356 $985,800,000


III. City of Oakland

POPULATION 399,484

AREA 66.25 square miles

WEBSITE ADDRESS www.oaklandpw.com

FORM OF GOVERNMENT

The City of Oakland was founded May 4, 1852. In 1930, the adoption of Charter amendments divided Oakland into seven Council districts and provided for a Council-Manager form of government. The Council Members, Mayor, City Auditor, and City Attorney are elected by the people every four years. Most other officials and employees of the City are subject to the Civil Service provisions of the Charter.

PUBLIC WORKS AGENCY

The Public Works Agency's mission is to provide for the design, construction, management, and maintenance of the City's infrastructure, including streets, sidewalks, and pathways; creeks, sewers, and storm drains; buildings and structures; vehicles and equipment; and street lights and traffic signals. In addition, the Agency is responsible for related activities such as community clean-up (especially along public rights of way), graffiti abatement, and facilitating environmental compliance. The goal of the Public Works Agency is to provide top-quality, professional, effective, and timely delivery of services to residents, businesses, and City departments to assure utmost customer satisfaction.

The Agency is structured as follows: Director's Office, Administration, Design & Construction Services Department, and Maintenance Services Department.

PROJECT MANAGEMENT/DELIVERY

Project delivery is the responsibility of the Design and Construction Services Department, which comprises the following Divisions: Administrative Services; Environmental Services; Project Management; Engineering Design; Transportation Services; Electrical Services; and Construction & Field Services.

The Project Management Division provides comprehensive project management services for projects originating in other City Departments, including budgeting, scheduling, defining scope, coordinating design and construction activities, community engagement, and coordination with the client and other stakeholders. Clients are typically departments and agencies external to PWA. The types of projects include building renovations (libraries, fire stations, etc.), seismic retrofits, ADA improvements, tenant improvements, and streetscape and landscape improvements.

The Engineering Design Division manages several programs within Public Works, including the Sanitary Sewer Rehabilitation Program, Street Improvement Program, and Storm Drainage Program. In addition, the Division provides professional engineering services and project management services on a wide variety of technical projects for public infrastructure and in support of the Community Economic Development Agency, City Manager, City Attorney, Council, and others.

The Transportation Services Division manages traffic engineering, transportation planning, parking, and funding programs for the City. Staff works closely with other city agencies, council offices,

residents, business owners, and developers to address traffic-related matters and assure timely implementation of worthy transportation projects, increasing mobility, enhancing vehicular and pedestrian safety, and improving air quality. The staff advocates the City's interests on transportation matters before regional, state, and federal agencies and assures that transportation projects benefiting Oakland are funded and implemented in a timely manner.

The Electrical Services Division designs street lighting improvements; provides maintenance for 35,200 street lights; manages and installs street lights for the Utility Undergrounding Projects; and provides design review and coordination for Private Jobs (P-Jobs), Port of Oakland projects, Caltrans projects, and Alameda County projects. In addition, the Electrical Division designs new traffic signal improvements; provides maintenance for 591 signalized intersections; provides 24-hour trouble-call response for traffic signals; and provides support and construction for major maintenance programs and projects.

The Construction & Field Services Division provides the following services. The Capital Projects Section provides construction management and inspection services for capital projects such as public street, sanitary sewer, and storm drain improvements; renovation and new construction of city-owned buildings, including earthquake-damaged buildings, fire stations, libraries, parking facilities, and park and recreation facilities; and new traffic signals and street lights. The Field Surveying Section provides initial surveys, construction stakeout, and boundary surveys of city-owned facilities and easements, and maintains the network of vertical and horizontal benchmarks and monuments. The Testing Laboratory provides quality control and testing of materials utilized in capital improvement projects; the testing lab also supports the street-paving program of the Maintenance Department.

CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENT PROGRAM

Capital Improvement Program for Fiscal Year 2003-04 through Fiscal Year 2004-05:

Funding Summary by Source:

IV. City of Sacramento

POPULATION 418,700

AREA 98 square miles

WEBSITE ADDRESS http://www.pw.cityofsacramento.com

FORM OF GOVERNMENT

Sacramento's City Council-City Manager form of government was adopted in 1920. The City Charter was also adopted in 1920. The City Council consists of a Mayor elected by the people and Council members elected to represent the eight separate Council districts in the City.

Elected members serve four-year terms, and elections are staggered every two years in even-numbered years. Members of Boards and Commissions are appointed by the Mayor, subject to the approval of the City Council. The City Manager, City Treasurer, City Attorney, and City Clerk are appointed by the City Council, with all other exempt managers appointed by the City Manager. All other officials and employees of the City are subject to the civil service provisions of the Charter.

DEPARTMENT OF PUBLIC WORKS - Project Delivery Division

The Project Delivery Division is responsible for the design and construction of public buildings, facilities, and transportation projects. The division is managed by the Project Delivery Manager, who reports to the Director of Public Works.

Division personnel work on the expansion and modernization of 1,290 miles of streets and the design and construction of police and fire stations, libraries, parking structures, community centers, bridges, freeway interchanges, and other public works projects. Recent past and current projects include the Joe Serna, Jr. Environmental Protection Agency Headquarters Building, the South Natomas Community Center and Library, the extension of Seventh Street, and the Arena Boulevard Interchange at Interstate 5.

The Division has about 100 employees in many different disciplines, including civil engineering, electrical engineering, mechanical engineering, architecture, surveying, drafting, and construction management. Accounting and administrative staff provide support.

PROJECT MANAGEMENT/DELIVERY

The Division uses a strong project management delivery system in which projects are assigned to a project manager who is responsible for the budget and schedule. Projects are managed by the Funding & Priorities Section during the planning phase. When projects have been fully scoped and funded, other project managers are assigned who are responsible from design through construction and project closeout. Funding for projects is usually generated from transportation funds, grants, fees, bonds, redevelopment funds, and the City's General Fund. The Division uses private consultants to supplement in-house resources in providing design and construction management services.

CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENT PROGRAM

Construction contracts to be awarded for Fiscal Years 2002-03 through 2004-05:

Program Total Projects Total Cost

Public Facilities 130 $190,000,000

Transportation 89 $54,000,000

Total 219 $244,000,000

V. City of San Diego

POPULATION 1,277,168

AREA 342 square miles

WEBSITE ADDRESS http://www.sandiego.gov

FORM OF GOVERNMENT

The City of San Diego, the second largest city in the state and the seventh largest city in the nation, was incorporated on March 27, 1850. In 1931, the Charter drafted by the Board of Freeholders was adopted by the voters and, although it has undergone many modifications, is still in effect today. The City utilizes a Mayor-Council-Manager form of government, with only the Mayor and City Attorney elected city-wide by the people every four years. Eight City Council members are elected by the people in their respective districts to serve four-year terms. The Council selects a City Manager who is responsible for the administration of most City departments. Officials and employees of the City are subject to the civil service provisions of the Charter, with the exception of unclassified management and a few unrepresented employee classifications.

ENGINEERING AND CAPITAL PROJECTS DEPARTMENT

The Engineering & Capital Projects (E&CP) Department provides capital improvement project (CIP) services for the various operating departments throughout the City, including the Transportation Department, Fire, Park & Recreation, and others. In this role, the E&CP Department is responsible for the design, project management, and construction management of the vast majority of public facility capital improvement projects. This work includes such projects as streets, bridges, bikeways, storm drains, and municipal buildings, as well as the replacement of water and sewer mains throughout the City.

The Department is split into four divisions: three project management/design divisions (Transportation Engineering, Water & Sewer Design, and Architectural Engineering and Contract Services) and one support division (Field Engineering). The Director of the E&CP Department is the City Engineer. Under this structure, the E&CP Department employs over 470 people in many different disciplines, including engineering, architecture, surveying, drafting, environmental, materials testing, and construction management.

The E&CP Department staff, on behalf of the client departments, is responsible for the expansion and modernization of over 3,820 miles of streets and alleys, 769 miles of storm drains and channels, approximately 2,900 miles of sewer mains, and 3,139 miles of water mains, as well as all of the fire, library, and park facilities. Recent major projects include the Convention Center Expansion, the expansion of Qualcomm Stadium, the construction of State Route 56, and the new downtown Ballpark.

PROJECT MANAGEMENT/DELIVERY

E&CP uses a "central point of contact" project delivery system in which projects are assigned to a project manager within a design division who is then responsible for the management, budget, and schedule from the beginning of the design phase (in some cases planning) through project closeout.

Engineering is performed either by in-house staff from within the project manager's division or through the use of outside consultants, depending on complexity and the availability of resources.

Most projects make use of in-house resources for design services. The project manager also utilizes the resources of the supporting divisions' staff for such services as surveys, contract procurement, construction management, and inspection. Funding is initially identified for a project by the client department during the planning process and is generated from a variety of sources, from tax revenue to special funds including bonds, user fees, and grants. The three project management/design divisions within the department most commonly use the design-bid-build project delivery system but are beginning to utilize alternative forms of project delivery, including design-build methods and task order contracts.

CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENTS PROGRAM

Capital Improvements Program for Fiscal Years 2002-03 through 2004-05:

VI. City and County of San Francisco

POPULATION 801,377

AREA 46.7 square miles

WEBSITE ADDRESS http://www.sfdpw.com

FORM OF GOVERNMENT

The City and County of San Francisco is a consolidated city and county with boundaries that are prescribed by the laws of the State of California and the City Charter. The first City Charter was established on April 15, 1850. The current City Charter was adopted November 6, 2001. The local government consists of a legislative branch, an 11-member Board of Supervisors, and an executive branch, the Mayor. Each member of the Board is elected by district and serves a four-year term, but may not serve for more than two successive terms. The Mayor, the chief executive officer and official representative of the City and County, is elected at a general election and serves a four-year term, but may not serve for more than two successive terms. Voters elect the City Attorney every four years. The Controller and City Administrator are appointed by the Mayor every ten and five years, respectively. Commissions and department heads are generally appointed by the Mayor and confirmed by the Board of Supervisors. With few exceptions, all other officials and employees of San Francisco are subject to the civil service provisions of the Charter.

DEPARTMENT OF PUBLIC WORKS

The Deputy Director for Engineering, who also holds the title of City Engineer, is in charge of four bureaus in the Department of Public Works: the Bureau of Engineering, Bureau of Architecture, Bureau of Construction Management, and Bureau of Street Use and Mapping. The first three bureaus, referred to as the Tri-bureaus, work on capital projects, while the Bureau of Street Use and Mapping regulates the use of city streets and private development of infrastructure.

The Tri-bureaus are responsible for the planning, design, and construction of public streets and infrastructure. These services are provided for client departments that do not have technical capabilities or contracting authority, including the Police, Fire, Health, and Recreation and Park departments as well as many other City agencies.

Tri-bureau personnel work on street renovation, sewer replacement and enlargement, traffic signals, parks and playgrounds, libraries, police and fire stations, health facilities, treatment plants and pump stations, and other public works projects.

The Tri-bureaus have 431 authorized positions, of which over 350 are filled. These positions cover many different disciplines, including engineering, architecture, surveying, drafting, environmental, and construction management.

PROJECT MANAGEMENT/DELIVERY

The Tri-bureaus use a strong project management delivery system in which projects are assigned to a project manager who is responsible for the budget and schedule from planning through project closeout. Project funding is usually generated from special funds, including general obligation and revenue bonds, sales tax revenues, and grants. The Tri-bureaus take a philosophical approach to design-bid-construct project delivery with the objective of using in-house resources whenever possible to provide design and construction management. Consultants are used to supplement in-house resources when necessary.

CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENT PROGRAM

Capital Improvements Program for Fiscal Year 2002-03 through Fiscal Year 2004-05:

Program Total Cost

Criminal Justice $184,000,000

Health Care Facilities $271,000,000

Libraries $57,000,000

Parks and Recreational $330,000,000

Sewers and Water $60,000,000

Streets and Bridges $142,000,000

Other Major Bond Programs $320,000,000

Total $1,364,000,000

VII. City of San Jose

POPULATION 918,800

AREA 177 square miles

WEBSITE ADDRESS http://www.ci.san-jose.ca.us/

FORM OF GOVERNMENT

San Jose has a Mayor-Council-City Manager form of government as provided by the City Charter. The current City Charter became effective in May 1965. The Mayor is elected by the people every four years. The people elect 10 City Council members representing 10 districts for four-year overlapping terms. The City Charter prohibits the Mayor and Council members from serving more than two consecutive terms. The City Attorney, Redevelopment Director, City Auditor, City Clerk, and Independent Police Auditor are appointed by the Mayor and Council. Department directors and assistant and deputy directors serve at-will. Other employees of the City are subject to the civil service provisions of the Charter.

DEPARTMENT OF PUBLIC WORKS

MISSION: Plan, design, and construct public facilities and infrastructure systems to enhance the quality of life for the residents of San Jose.

The Public Works Department has primary responsibility for delivering facilities and infrastructure that meet the needs of the residents of San Jose and that comply with the standards and requirements established in the engineering guidelines and the City's Master Plans. The Department achieves its goals through the planning, design, and construction of the City's capital projects, and also through the plan review and permit process, which regulates and facilitates private development projects. The Director of Public Works/City Engineer manages the Department.

Department personnel work on the expansion and modernization of over 2,434 miles of streets, 926 miles of storm drains, 2,169 miles of sewer lines, and 3,500 acres of parks, and on the design and construction of recreation facilities, police and fire stations, libraries, municipal buildings, bridges, and other public works projects. Recently completed projects include the Runway 30L Reconstruction and the Federal Inspection Facility at the Norman Y. Mineta SJIA, the West Valley Branch Library, the 4th and San Fernando Parking Garage, and various park improvement projects.

The Department employs over 380 people in many different disciplines, including engineering, architecture, landscape architecture, surveying, drafting, real estate, and construction management. Major projects currently underway include a new Civic Center, the Alameda and Berryessa Branch Libraries, and the Bailey Avenue extension to SR 101. Major programs include the $228 million Parks Bond, the $211 million Branch Library Facilities Bond, the $159 million Fire, Police Stations and Facilities Bond, and the $691 million Airport Security and Traffic Relief Act Bond.

PROJECT MANAGEMENT/DELIVERY

The Department has a focus of "on time, on budget" and reports performance measures in the categories of timeliness, cost, quality, and customer satisfaction in the annual Operating Budget. Project management is a team effort in which projects are assigned to a client partner and a DPW project manager. The client provides scope and funding, and the project manager is responsible for the budget and schedule. The project manager is involved with design and problem resolution but passes construction management responsibilities to a construction manager from the same division.

Project funding is the responsibility of the client department and may be generated from special taxes, bonds, in-lieu fees, and grants. The seven divisions (not including Administration) within the Department use a design-bid-build system for project delivery. Design has shifted from mostly in-house to over 70% consultant design, often using master agreements for multiple projects. This shift took place in order to meet a large increase in workload from approximately $1.291 billion in bond funds and the new Civic Center. However, since reduced revenues are forecast for state and local governments, more projects will be designed in-house over the next year. Construction management remains largely in-house, augmented with consultants for special assignments.

CONSTRUCTION CONTRACTS/CAPITAL IMPROVEMENT PROGRAM

Major construction contracts scheduled for award in Fiscal Year 2002-03 through Fiscal Year 2004-05 (design and construction management included; rounded to the nearest $1,000,000):

Program Total Projects Total Cost

Public Safety Bond 20 $116,000,000

Library Bond 21 $104,000,000

Parks/Recreation Facilities 62 $140,000,000

Airport Master Plan 54 $521,000,000

Civic Center 1 $210,000,000

Wastewater Program 26 $91,000,000

Storm Drainage 5 $4,000,000

Traffic 92 $195,000,000

Total 281 $1,381,000,000
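As with the other agency tables, the San Jose totals can be verified, and a rough average contract value per program derived, from the published rows alone. The snippet below is an illustrative sketch based only on the figures in the table above; it is not part of the study's methodology.

```python
# City of San Jose programs from the table above: (projects, cost in dollars).
programs = {
    "Public Safety Bond": (20, 116_000_000),
    "Library Bond": (21, 104_000_000),
    "Parks/Recreation Facilities": (62, 140_000_000),
    "Airport Master Plan": (54, 521_000_000),
    "Civic Center": (1, 210_000_000),
    "Wastewater Program": (26, 91_000_000),
    "Storm Drainage": (5, 4_000_000),
    "Traffic": (92, 195_000_000),
}

# Verify the table's "Total" row.
total_projects = sum(p for p, _ in programs.values())
total_cost = sum(c for _, c in programs.values())
assert total_projects == 281 and total_cost == 1_381_000_000

# Average contract value per project, by program (rounded to the nearest $1,000).
for name, (projects, cost) in programs.items():
    print(f"{name}: ${round(cost / projects, -3):,.0f} per project")
```

The per-program averages make the spread in project scale visible: the single Civic Center contract dwarfs the typical storm drainage job, which is consistent with the study's interest in how project size affects delivery cost.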

C. SIMILARITIES AND DIFFERENCES

I. Agency Summaries

Update 2003 agency information is consistent with the first year of the study. All seven agencies are similar in structure. The diversity of project types and sizes planned for the next three years provides opportunities to continue this benchmarking study and gain valuable information in future years.

II. Available In-House Project Delivery Services

Table C summarizes the agencies' project delivery services that are available in-house. There has been no change in the available in-house project delivery services compared to last year. Note that information on the new participating agency (the City of Oakland) is provided. The City of Los Angeles is still the only agency that provides in-house geotechnical services.

III. Fact Sheets

Table D shows the updated Fact Sheet that covers the three-year period of Fiscal Year 2002-2003 through 2004-2005.

CHAPTER 4 - PROCESS BENCHMARKING

A. GUIDING PRINCIPLES

Executive-level technical staff from seven major cities (the Project Team) shared and openly discussed the effectiveness of their capital project delivery methods during the study. The Project Team identified, discussed, and evaluated 98 processes associated with the effective delivery of capital projects.

The participating agencies then met to identify "Recommended Best Management Practices." They used their collective experience, as well as benchmarking outcomes, to identify the processes that they believed would improve the delivery of capital projects. The agencies used an intuitive consensus approach to determine which processes should be recommended as Best Management Practices (Table E).

B. IMPLEMENTATION OF BEST MANAGEMENT PRACTICES

While the 2002 report identified 15 Common Best Management Practices and 24 Recommended Best Management Practices, the second year of the study began tracking the actual implementation of these Best Management Practices.

An important focus of Update 2003 was to improve the project delivery process by committing to the implementation of the Best Management Practices identified in the 2002 report. After several months of process improvement information sharing, a survey of the participating agencies was undertaken. The survey, shown in Table E, indicates the present status and intent of each agency to implement these Best Management Practices.

Two additional Best Management Practices were identified through Update 2003, as follows:

Make Bid Documents Available Online

The cost of reproduction and distribution of bid documents is increasing and may be reduced by making the documents available online.

Create In-House Project Management Team for Small Projects

The 2002 study performance data suggested that project delivery costs were disproportionately high for projects with values of less than $250,000. It is envisioned that utilizing a specialized in-house team to implement these projects would reduce costs.

The implementation of these additional Best Management Practices is also indicated in Table E.

C. ADDITIONAL IMPROVEMENTS

In addition to the specific Best Management Practices directly associated with project delivery, the Project Team identified certain improvements in contract documents that may improve the project delivery process:

Standardized Indemnification Clauses

Standardized indemnification language, in which risk is reasonably apportioned to the parties most able to control it, would let design professionals and their insurers know what to expect and would expedite agreements.

Standardized Document Ownership Language

The development of standard document ownership language would let design professionals know what to expect and would expedite agreements.

The implementation of these improvements is outside the direct control of the Project Team. A plan for the implementation of these and other process improvements will be developed as part of the next update of this report.

D. DISCUSSION ITEMS FOR THE IMPROVEMENT OF PROCESSES

The Project Team's approach to improving project delivery processes and emphasizing Best Management Practice implementation relied on the collective wisdom and experience of project delivery executives from the seven participating agencies. Specifically, members of the Project Team shared processes that have been effective in their communities, as follows:

Strategic Project Prioritization. The City of Sacramento, Department of Public Works presented its Transportation Programming Guide, which provides an objective approach to prioritizing projects based on need.

Asset Management. The City of Los Angeles, Bureau of Engineering presented an approach to developing condition assessments of major facilities.

Estimating Procedures. The City of San Jose, Department of Public Works presented a cost estimating approach that spans from the concept phase to the completion of the final construction documents.

Front-End Documents. The City and County of San Francisco, Department of Public Works presented an approach to standardizing electronically formatted "front end" bid documents.

Pre-Qualification. The City of Los Angeles, Bureau of Engineering presented an approach to pre-qualifying contractors for design-bid-build projects.

In addition, the Project Team discussed streetscape design, sustainable design standards, electronic bid-document distribution, management of liquidated damages, standard indemnification clauses, and design document ownership.

E. ONLINE DISCUSSIONS AND COMMUNICATIONS

Update 2003 also provided a basis for online discussions and correspondence among Project Team members. Agency members posted their questions on an email distribution list, and other members responded based on experience with their own agency practices. This was acknowledged as an efficient and effective communication method and continues to play an important role in the development of "a continuous forum for communication to enable agency representatives to network with one another" (2002 report, Page 9).

The following questions initiated interesting online discussions that were elaborated on in Project Team meetings:

Construction Contract Bid & Award process and schedule, and contractors' insurance requirements. San Diego, 10/09/02: This discussion was initiated by the City of San Diego, noting the particular problem contractors appeared to have procuring the insurance required by the contract documents between the Notice of Award and the intended Notice to Proceed.

Whether or not client departments' costs were included in the delivery costs provided for this benchmarking study. San Jose, 10/15/02: This discussion was initiated by San Jose to question the accuracy of the project

delivery cost data. Specifically, the concern was determining whether the pre-construction planning and other costs expended by the user agency were captured, and what means were used to capture the effort or costs expended. The conclusion from four of the agencies was that owner department costs were not being captured in project delivery costs. Most owner departments have separate operating budgets.

ADA Compliance for conversion of streets to pedestrian malls. San Francisco, 01/30/03: This discussion was initiated by San Francisco in order to prompt an exchange of ideas regarding design standards, costs, and the extent of the implementation of the ADA in pedestrian mall conversions.

Organization of Public Works departments. Sacramento, 02/03: This request was initiated by the City of Sacramento in order to assist in evaluating the structure and composition of its Public Works Department.

Review of agencies' indemnification clauses for consultants' services. Oakland, 02/05/03: This conversation was initiated by Oakland and arose out of the delays incurred in getting design contracts fully executed, due to objections from consultants to language allocating apparently excessive responsibility to the consultant. The Project Team agreed that standard, less onerous indemnification language among the agencies could alleviate the problem. Development of suggested language may be taken on as a task in a future update of the study.

Mandating LEED Certification and Green Design adherence, and implementation of Green Design guidelines. San Francisco, 02/10/03: This discussion revolved around the concept of avoiding the excessive delays and costs that agencies had experienced in the pursuit of independent outside certification of projects prior to going to bid. One concept discussed was in-house certification. Discussion among the Project Team is continuing.

Issuance of construction project plans and specifications online, as well as or instead of paper sets. Los Angeles, 02/13/03: None of the agencies currently distributes bid documents online, even though the practice would result in a substantial reduction of document reproduction costs. During the discussions it was found that the City of Seattle does distribute bid documents online. This discussion will continue.

Agencies' typical cost per mile for sewer construction, and review of similarities and differences. City of Oakland, 02/21/03: The unit price of sewer construction was discussed in depth, as well as the factors contributing to the wide variations between agencies. It was agreed that variations were to be expected and that developing comparative data might be done outside the scope of this team, because the variations were related more to construction circumstances than to project delivery processes.

Recruitment of geotechnical engineers, their minimum qualifications, salaries, and job descriptions in various agencies. San Francisco, 03/11/03: It was determined that Los Angeles was the only one of the seven agencies that had in-house geotechnical resources. Discussion on this topic continues.

Monetary limits for award of competitively bid projects by Bureau of Engineering, Public Works Board, or City Council. San Jose, 03/26/03: This topic was discussed in detail and two tables were developed as follows (Tables F-1 and F-2):


Annual Report Update 2003 California Multi-Agency CIP Benchmarking Study

Removal of utilities markings after street construction completion. Long Beach, 03/31/03: The marks remaining on City property after construction is completed are an eyesore. Other communities had adopted ordinances or contract specifications requiring the cleanup and removal of the utility markings.

DBE participation for Design-Build projects. San Diego, 04/16/03: This discussion related to how DBE participation might be required on design-build projects as compared to projects bid using the conventional design-bid-build method. Some agencies had no formal requirements when design-build was utilized; others had the same requirements as for design-bid-build.


Review of the City's Infrastructure Report Card. Los Angeles, 05/01/03.

Policies and procedures to employ MBE/DBE/WBE. San Diego, 05/01/03.

Environmental documents and permits: costs, evaluation time, responsible party, mechanism to assure effectiveness and adherence to schedule, sequencing with design, etc. San Diego, 06/04/03: The City of San Diego initiated this discussion to get feedback on what other agencies were experiencing in costs for CEQA compliance activities and what their processes were. Some agencies use consultants exclusively; others have the environmental permit process handled outside of the division responsible for project delivery; and yet others handle the environmental permits within the delivery team.

Standards and design guidelines for public safety capital programs. San Jose, 06/10/03: Agencies discussed their experiences acquiring design and PM/CM services to construct fire, police, and 911 facilities. The group received fire design standards from Los Angeles and architectural program requirements from San Francisco. San Jose is developing its own standards for its public safety bond program.

Federal wage rates versus California wage rates for Federally funded projects. San Francisco, 06/10/03: The discussion was about the lengthy administrative process to comply with Federal wage rate requirements. It centered on the burdensome requirement of attaching the latest Federal and California wage rates in the bid package. All responding agencies do this; however, San Jose does not include the California wage rates but instead indicates to bidders where they can obtain a copy of that information.

In addition to providing an online forum for sharing information, the study also created networking opportunities whereby off-line, one-on-one conversations took place between agencies on issues of mutual interest.

F. SPECIAL STUDIES

The Project Team decided that a special study would be conducted every year, consistent with the objective of improving processes. The special study for this year was to review the use of consultants and to compare that use to the overall project delivery costs for all agencies, to identify areas of improvement. Table G summarizes the use of consultants compared with project delivery percentages for all agencies. This report is a part of the project database and is updated instantly as additional project data are compiled.

The Project Team proposed the following special studies for future years of the study:

Investigate projects with delivery costs greater than 50% of construction cost and relate the findings to process improvement.

Categorize Change Orders into "Scope Changes" and "All Others."

Review and benchmark project permits and fees.

Benchmark projects' unit costs (e.g., cost per mile of street widening projects).

Identify and include Design-Build projects in the study.

Identify and review projects with contingency allowances.


CHAPTER 5 - PERFORMANCE BENCHMARKING

A. GUIDING PRINCIPLES

Performance benchmarking consisted of collecting documented project costs and comparing the actual project delivery costs with total construction costs.

Update 2003 began with an improvement to the project performance questionnaire, which was modified to collect completion dates as well as project delivery costs. The questionnaires were uploaded into the project database using Visual Basic code, as in the previous study year. In addition to the 2002 report guidelines, the following applied to Update 2003 performance benchmarking:

Costs. All projects included in this study have a total construction cost exceeding $100,000. (Projects less than $100,000 in value are included in the database, but not in the study.)

Completion Date. Projects included in the study were completed after January 1997. Projects with earlier completion dates were excluded from the analysis, but still maintained in the database. The database software allows projects to be sorted and/or filtered by completion date for specific analyses.

Representative Projects. All of the selected projects are "representative of the agencies' processes." The Project Team identified, reviewed, and corrected or eliminated all projects that had the potential to be outliers in the regression analysis.

Project Delivery Method. All selected projects were delivered through the traditional Design-Bid-Build delivery method. Projects delivered using Design-Build and delivery methods other than Design-Bid-Build are categorically different and are not included in this study (or the database) at this time.
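The four screening criteria above can be sketched as a simple filter. This is a minimal illustration only: the field names are hypothetical, not the schema of the study's actual Microsoft Access database.

```python
from datetime import date

# Hypothetical project records; field names are illustrative, not the study's schema.
projects = [
    {"id": 1, "construction_cost": 250_000, "completed": date(1999, 6, 1), "method": "Design-Bid-Build"},
    {"id": 2, "construction_cost": 80_000, "completed": date(2000, 3, 1), "method": "Design-Bid-Build"},
    {"id": 3, "construction_cost": 1_200_000, "completed": date(1996, 5, 1), "method": "Design-Bid-Build"},
    {"id": 4, "construction_cost": 500_000, "completed": date(2001, 9, 1), "method": "Design-Build"},
]

def in_study(p):
    """Apply the Update 2003 inclusion criteria described above."""
    return (p["construction_cost"] > 100_000          # Costs: over $100,000
            and p["completed"] >= date(1997, 1, 1)    # Completion: January 1997 or later
            and p["method"] == "Design-Bid-Build")    # Delivery: traditional method only

included = [p["id"] for p in projects if in_study(p)]
print(included)  # → [1] (the others fail on cost, date, or delivery method)
```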

B. ADDITIONAL DATA COLLECTION AND DATABASE IMPROVEMENTS

Participating agencies provided project information by responding to the performance questionnaire. The Study Team compiled the data into the project database to develop new performance models. The curves were developed from data collected over the two study years, from projects completed between 1997 and 2002.

However, as noted in the 2002 report, the performance models become more credible as the database grows and improves. Therefore, the Project Team set the Update 2003 objectives for improving the database as follows:

Collect additional data to increase the number of projects included in the database. Through an intense effort, the seven agencies were able to increase the total number of projects included in the study from 239 to 453, excluding non-representative projects. (The total value of the projects included in Update 2003 is over $830 million.)

Improve the data by correcting errors and providing missing data. The Project Team reviewed the data included in the 2002 report and found that some of the inspection costs had not been captured in the construction management delivery analysis. These costs were captured and included in Update 2003.


Previously the database did not distinguish between projects that experienced no change orders and projects for which change order data were not available. In both instances a change order value of zero was simply included in the database for these projects. The costs for these projects have since been reviewed in detail, and corrections were made to distinguish between projects with "zero" change orders and projects for which change order data were not available.

The duration data for the design and construction management phases were reviewed and corrected. Further, the Project Team found that a tabulated analysis was more useful than regression curves.

Outlier Identification. An outlier identification model was developed and implemented. The Project Team reviewed the flagged projects for abnormal attributes. The project data were corrected or eliminated from the database, as appropriate. Appendix C provides details of the outlier identification technique.

Collect data on additional project types and classifications. The Project Team expanded the project "types" to include Parks, with subclassifications of Playgrounds, Sport Fields, and Restrooms.

Begin the process of linking processes to performance. In order to facilitate future linking of implemented Best Management Practices to performance data, the Project Team perfected all project completion dates and developed a sorting function within the software. The team also agreed that the five-year performance data source window would roll forward each year, with the intent of eventually being able to measure the effect of the implementation of the BMPs on delivery performance.

Conduct a comparison between in-house delivery and usage of consultants. The Study Team initiated an analysis that compared delivery costs on projects where consultant usage amounted to more than 50% of the costs of project delivery. A study of in-house project management costs, where a consultant is the primary deliverer of the project, will also be included.

Improvements to the database. New database features include categorization by date and update year, inclusion/exclusion of mutually approved outliers, and additional instant reports and tables developed in Update 2003 (Access reports, a consolidated Word graphs report, and an Excel tabulation of R² values and a predictive table).

C. DISTRIBUTION OF PROJECTS

Total Number of Records in the Project Database. Table H summarizes the total number of projects included in the database. While the database contains 525 projects, 453 fit the study criteria. As a result, column (d) of Table H was the basis for the performance graphs.


Identification of Non-Representative Projects. The Study Team utilized an outlier identification process to enhance the credibility of the final performance trends' capacity to serve as a predictive tool. A three-step process was followed:

1. A statistical model was developed that reviewed all project data and identified anomalies. A list of outliers (data outside of a defined interval) was identified that contained 75 projects.

2. The projects (data) identified as anomalies in Step 1 were reviewed in detail by the contributing agency for necessary corrections or for exclusion from the performance model database.

3. As a result of Step 2, it was found that only 34 of the 75 statistical outliers were non-representative projects. These were generally the projects whose total delivery costs were greater than 50% or less than 15% of total construction cost. The 34 non-representative projects were excluded from this study.

In addition, all 25 projects smaller than $100,000 and 13 projects outside of the five-year window were excluded from this analysis.
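The three-step screen above can be sketched as follows. The interval test in Step 1 is an assumption for illustration (the study's actual statistical model is documented in its Appendix C); the 15%-50% band in Step 3 comes directly from the text.

```python
import statistics

def statistical_outliers(ratios, k=1.5):
    """Step 1 sketch: flag delivery-cost ratios more than k sample standard
    deviations from the mean. The k = 1.5 interval is an assumption; the
    study's actual interval definition may differ."""
    mu = statistics.mean(ratios)
    sd = statistics.stdev(ratios)
    return [i for i, r in enumerate(ratios) if abs(r - mu) > k * sd]

def is_non_representative(delivery_cost, construction_cost):
    """Step 3 criterion from the text: total delivery cost greater than 50%
    or less than 15% of total construction cost."""
    r = delivery_cost / construction_cost
    return r > 0.50 or r < 0.15

# Delivery-cost ratios for a handful of hypothetical projects
ratios = [0.30, 0.28, 0.33, 0.31, 0.95]
print(statistical_outliers(ratios))            # → [4] (the 0.95 project is flagged)
print(is_non_representative(95_000, 100_000))  # → True (ratio 0.95 exceeds 50%)
```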

Updated Project Distribution Matrix. Table I summarizes the final project distribution (453 projects). The table shows large diversity in the number of projects, which are widely distributed throughout the different classifications.

D. PERFORMANCE GRAPH DEVELOPMENT

Project performance data are available in the project database as a "Project Listing" report. Performance data were compiled into a Microsoft Access database. A Visual Basic program exchanges performance data with Microsoft Excel to develop and review performance curves (Appendix B-II, Pages B-31 to B-104).

Study participants used the database to review and evaluate numerous benchmarking models and lessons learned from the data trends.

The following are some examples of the models available in the database:

Change order costs versus design cost, construction management cost, or total delivery cost.

Project delivery costs versus consultant usage.

Effects of consultant usage on total change order costs.

Construction management cost versus design cost.

The database is designed to facilitate additional data collection and instant development of the performance graphs.
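The curve development described above (a regression of a cost ratio against construction cost, scored with a correlation coefficient R²) can be approximated outside the study's Access/Excel toolchain. The logarithmic functional form below is an assumption for illustration, not the study's published model, and the data points are invented.

```python
import math

def fit_performance_curve(costs, ratios):
    """Least-squares fit of ratio = a + b * ln(construction cost), with R².
    The functional form is an illustrative assumption; the study produced its
    curves in Microsoft Excel via a Visual Basic program."""
    x = [math.log(c) for c in costs]
    n = len(x)
    mx, my = sum(x) / n, sum(ratios) / n
    # Slope and intercept from the normal equations for simple regression
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ratios))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    # R² = 1 - (residual sum of squares / total sum of squares)
    pred = [a + b * xi for xi in x]
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(ratios, pred))
    ss_tot = sum((yi - my) ** 2 for yi in ratios)
    return a, b, 1 - ss_res / ss_tot

# Invented data showing the typical pattern: the delivery-cost ratio
# declines as project size increases.
costs = [200_000, 500_000, 1_000_000, 2_000_000, 5_000_000]
ratios = [0.45, 0.38, 0.33, 0.30, 0.25]
a, b, r2 = fit_performance_curve(costs, ratios)
print(f"slope={b:.4f}, R²={r2:.3f}")  # negative slope, high R² for this data
```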

E. DISCUSSION

Table J summarizes the correlation coefficients for all performance models. In consideration of the large amount of data, the team decided to study performance models at the "Project Classification" level; therefore, performance models were not developed at the "Project Type" level. The table is also enhanced over last year's (see Page 60 of the 2002 report) and shows R² values for the project delivery, change order, and duration models at the classification level.


While the number of projects has nearly doubled compared to the 2002 report, in some cases correlation coefficients have not improved significantly. Limited improvement is noted in "Police/Fire Stations," "Renovation/Resurfacing," and "Signals" for the design models, and in "Widening/New/Grade Separation," "Bridges," and "Signals" for the construction management models.

The decrease in R² values for most of the models is an indication of poor correlation, and the Study Team is not recommending these models for use as "Predictive Tools." This outcome sheds light on areas of data improvement for future updates, as discussed below.

Appendix B provides all performance models. The following discussion summarizes the results of analyzing these performance models and comparing them with their equivalents in the 2002 report (Pages D-31 to D-74):

Curve Group I - Design Cost / Construction Cost Versus Construction Cost:

Design models, in general, have intuitively acceptable trends. The exceptions are bike/pedestrian/curb ramps and all parks classifications (playgrounds, restrooms, and sports fields).

Significant improvement is observed for the renovation/resurfacing model, compared to the 2002 report. This improvement also corresponds with the improved correlation coefficient (R² = 0.0955). While this is still a very small correlation coefficient, it has increased 330% compared to last year.

In general, design models still require significant additional data collection for consideration as predictive tools. The outlier identification technique that was applied to the project delivery models could be a useful tool to identify anomalies in the design models as well. This was, however, considered a premature analysis at this time, considering the small number and non-uniform distribution of project data in the various categories. It is envisioned that an appropriate number of projects will be included in future updates (at least seven projects per agency, per classification) to facilitate a more reliable analysis of statistical outliers.

Curve Group II - Construction Management Cost / Construction Cost Versus Construction Cost:

Observations of the construction management models were generally similar to the design models. The changes, compared to the 2002 report, were less significant than for the design models, with the exception of community buildings. A significant improvement was observed in the widening/new/grade separation classification. While the number of projects increased by only one (after eliminating non-representative projects), the correlation coefficient improved from near zero to 0.2081. This significant improvement was due in part to additional data and in part to eliminating six non-representative projects (three in the 2002 report and three in this study).

The observed improvement of the construction management model for the widening/new/grade separation classification emphasizes the importance of additional data collection and, specifically, the significance of eliminating non-representative projects. It is hoped that in future updates similar improvements will be observed for all performance models.

Curve Group III - Project Delivery Cost / Construction Cost Versus Construction Cost:

Project delivery models, which represent the consolidation of the design and construction management models, exhibited behaviors similar to the other two curve groups. Since these models were the basis of the outlier identification technique, they generally possessed higher correlation coefficients.


Generally, in all three performance models, the bike/pedestrian/curb ramps classification and all Parks classifications exhibited very low correlations and non-intuitive trends. The low correlation for the Parks project type is due to the very small number of projects, because Parks is a new project type in this year's study. For the bike/pedestrian/curb ramps classification, no conclusions can be made at this time. If additional data and a more critical review of the current data reveal a similar trend in future studies, it would imply that the cost of delivering such projects is independent of their total construction cost.

Table K provides a tabulated summary of the project delivery performance models. This table is for reference only and is not yet useful as a predictive tool. Areas with values shown in "unshaded" type are the most reliable parts of this predictive tool and may be cautiously used as a guideline.

Curve Group IV - Change Order Cost / Construction Cost Versus Construction Cost:

Similar to the 2002 report, no conclusions can be made based on the Change Order models due to a lack of correlation. Successful application of change order models depends upon appropriately categorizing change orders. The Project Team concluded that change orders should be categorized as "Scope Changes" and "All Others." This is a task to be performed in future benchmarking studies.

Curve Group V - Total Project Duration Versus Construction Cost:

Performance models for project duration have improved, compared to the 2002 report. This improvement is partially due to additional data collection and compilation. It is also due, in part, to the review of the 2002 report data and the incorporation of necessary modifications.

The general trend is intuitive for all project classifications: as the size of the project increases, the total duration also increases, and the change in duration becomes less significant for larger projects. Streets graphs generally present better correlation coefficients than other project classifications. The association of these high R² values with the large number of data points is an indication of the models' reliability and usefulness as comparative tools. With a little improvement (additional data and/or eliminating outliers), the duration models for the Streets classifications can become useful predictive as well as comparative tools. The duration performance models for the other project classifications also appear useful and could be even more promising with additional data.

In general, the duration performance models have good trends and correlations. Exceptions, with low correlations, are Municipal Facilities - Libraries, Streets - Renovation/Resurfacing, Pipe Systems - Pressure Systems, and Pipe Systems - Pump Stations. The last two classifications also exhibited counter-intuitive trends, indicating the need for additional data collection. For the Parks project type, the models' reliability can be significantly improved by additional data collection. A tabulation of the duration model outcomes is provided in Table L. With additional data and other improvements, this table may be used as a predictive tool in the future.
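Used as a predictive tool, a tabulation like Table L reduces to a bracket lookup: find the construction-cost bracket, read off the expected duration. The brackets and durations below are invented placeholders, not figures from the study's table:

```python
import bisect

# Hypothetical duration table: upper bounds of construction-cost brackets (in $)
# mapped to expected total project duration in months. These values are
# placeholders for illustration, NOT figures from Table L.
BRACKET_UPPER = [500_000, 1_000_000, 5_000_000]
DURATION_MONTHS = [18, 24, 36, 48]  # last entry covers projects above $5M

def expected_duration(construction_cost):
    """Look up the duration bracket for a given construction cost."""
    i = bisect.bisect_left(BRACKET_UPPER, construction_cost)
    return DURATION_MONTHS[i]

print(expected_duration(750_000))  # → 24 (falls in the $500K-$1M bracket)
```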


CHAPTER 6 - CONCLUSIONS AND RECOMMENDATIONS

A. PROCESS BENCHMARKING - RECOMMENDED BEST MANAGEMENT PRACTICES

Process benchmarking concentrated on four areas:

The agencies' level of implementation of recommended Best Management Practices was a priority. With few exceptions, Best Management Practices have been or are planned to be implemented by next year. The agencies' belief in the importance of these processes and their intention to implement them is a significant indication of the agencies' belief in collaboration and commitment to the principles put forth in the 2002 report.

Two new recommended Best Management Practices were identified as a result of team discussions and added to the implementation plan.

Online discussions were found to be valuable during this benchmarking study. Sixteen different process-related topics were discussed among the Project Team members, demonstrating the agencies' belief that sharing their experiences can improve project delivery processes.

Special studies, conducted as a part of current and future updates, are an important aspect of successful benchmarking. These studies will assist the team to better understand benchmarking outcomes by expanding the search for influences on the data and performance models. The special study contained in Update 2003 involved a review of the role and use of consultants in the delivery of capital improvement projects. Six special studies are identified for consideration in future benchmarking efforts.

B. PERFORMANCE BENCHMARKING

Most performance models exhibited very similar trends compared to the models contained in the 2002 report. Some of the 2002 report performance models that did not demonstrate intuitive trends had significantly better trends after the inclusion of additional data. However, correlation coefficients did not appear to improve in many models. This could be attributed to the diversity of the agencies' processes. It could also be related to the application of different criteria in the data collection process. A critical review of the agencies' data collection processes is the first priority of the next year of study. This will hopefully identify areas of discrepancy and improve the performance models.

C. STUDY QUALIFICATIONS AND CHARACTERISTICS

The 2002 California Multi-Agency CIP Benchmarking Study developed a solid and beneficial foundation for process and performance benchmarking. The second year of study builds on this effort. Update 2003 shows that additional project data made significant improvements to some of the models as compared to the 2002 study.

The statistical analysis included in the 2002 report had recommended that project data from the project classification level should not be bundled and analyzed at the project type level.


This recommendation was implemented in Update 2003 after significant enhancement of the project data pool. All performance models were developed at the project classification level in this study.

The Study Team observed that additional data collection and review of existing data can significantly improve study outcomes. Areas of anomaly and discrepancy were quickly identified using the database tools. Necessary corrections were incorporated, and data quality and accuracy were significantly improved.

In conclusion, the above discussion suggests the following:

Individual agencies can benefit from using the current performance curves as comparative rather than predictive tools. The best use of these curves, with the current data, is to compare an agency's performance to industry trends.

Additional project data will improve the results of this study and its ability to predict the resource requirements needed to deliver a Capital Improvement Project. The current performance curves are an improvement compared to the 2002 report graphs. However, additional data collection can further improve the credibility of these models and aid progress towards the objective of having a predictive tool.

Similar to the 2002 study, the Project Team did not categorize change orders based on their source (Unforeseen and Changed Conditions, Design Changes, Owner-Initiated, Commissioning/Optimization, Miscellaneous). In this study, the Project Team discussed the possibility of categorizing change orders by "Scope Changes" and "All Others." This categorization may provide more realistic change order models.

As was the case in the initial year of the study, Update 2003 found that the agencies' multipliers remained similar and that it was still reasonable to use "Costs" as the comparative basis, instead of "Hours."

D. NEXT STEPS

Identify agencies' goals for process improvement. All agencies showed an interest in implementing some or all of the Best Management Practices in the near future. An essential step in future studies is to review agencies' progress in this area and their approach to improving their processes based on the lessons learned in this benchmarking study.

Review of agencies' data collection processes. Figures 1a through 1c analyze changes in the design performance model for Libraries. In this example, individual agencies were performing much more similarly to one another in the 2002 study (Figure 1a - the regression curves of Agencies A and D are very close to the global regression curve). In this year's study, however, the agencies' models show more differences in the cost of designing a CIP project (Figure 1b - Agencies D and G, the main contributors to this study, have trends significantly different from each other). Even though individual agencies' models have improved (compare the R² values between Figures 1a and 1c), the combined model exhibits lower correlation due to agencies' differences (Figure 1c - the Global curve).

Additional data collection for performance benchmarking. Performance benchmarking results will be improved with the collection of additional project data and the exclusion of non-representative projects, as was the case in Update 2003. An improvement in the correlation coefficient of an individual agency's model indicates the agency's consistency in project delivery processes. An improvement of the global model correlation coefficient would better verify agencies' similarities and/or differences.

¹ California Multi-Agency CIP Benchmarking Study, 2002 Report, Page 11.

Update 2003 data indicate that projects were not delivered at similar costs. Answers to the following three questions would clarify the reasons for these differences:

1. Are agencies' project delivery processes significantly different? The Multi-Agency Benchmarking Study of 2002 found that "the agencies' operations and approaches to project delivery are strikingly similar"¹ and the agencies' processes are not known to have changed significantly since that report. Therefore, the differences in costs must be explained otherwise.

2. Are agencies collecting project data differently? The Project Team plans to review agencies' data collection procedures to ensure consistency. The Project Team considers this a high priority for the continued study.

3. Are all agencies providing data on similar projects? The fresher data included in Update 2003 may reflect information on substantially different CIP projects between the participating agencies (e.g., one city may be doing mostly retrofits while others are doing mostly new projects). Implementation of project indices and development of the associated models as the study continues will determine the influence of project differences.
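The project-index idea in question 3 can be illustrated with a small sketch. The field names and figures below are invented; the study's actual indices have not yet been defined.

```python
# Hypothetical sketch: tag each project with a type index (retrofit vs. new)
# and compare delivery-cost ratios by type. If per-type means differ more
# than per-agency means, an apparent cost gap between agencies may simply
# reflect a different project mix. All data below are invented.
from collections import defaultdict

projects = [
    {"agency": "A", "type": "retrofit", "design_ratio": 0.22},
    {"agency": "A", "type": "retrofit", "design_ratio": 0.24},
    {"agency": "B", "type": "new",      "design_ratio": 0.12},
    {"agency": "B", "type": "new",      "design_ratio": 0.14},
]

# Group delivery ratios by project type rather than by agency alone.
by_type = defaultdict(list)
for p in projects:
    by_type[p["type"]].append(p["design_ratio"])

means = {t: round(sum(v) / len(v), 3) for t, v in by_type.items()}
print(means)  # → {'retrofit': 0.23, 'new': 0.13}
```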

Continue forum discussions and agency presentations. The Project Team will continue to share experiences and questions through online discussions and presentations. This has been found to be an effective method to improve processes and facilitate efficient delivery of Capital Improvement Projects.


Figure 1a

Figure 1b

Figure 1c

Acknowledgements

Gratefully acknowledged are the participation and contribution of the following individuals to the second phase of the California Multi-Agency CIP Benchmarking Study. This work would not have been possible without the contributions made by these people:

Study Team:

Gary Lee Moore, P.E., Deputy City Engineer
City of Los Angeles, Department of Public Works, Bureau of Engineering
650 South Spring Street, Suite 200
Los Angeles, CA 90014
(213) 847-8764 (213) 847-9602 (Fax)
[email protected]

Joseph Wojslaw, P.E., EED Consultant Manager
Montgomery Watson Harza, Inc.
12000 Vista Del Mar, Pregerson Building, Suite 200
Playa Del Rey, CA 90293
(310) 648-6102 (310) 648-6155 (Fax)
[email protected]

Bill Lacher, CCM
Vanir Construction Management, Inc.
3435 Wilshire Boulevard
Los Angeles, CA 90010-2006
(213) 487-1145 (213) 487-1051 (Fax)
[email protected]

Ali Nowroozi, Ph.D., Consultant Project Controls Engineer
Tajann Engineering and Construction, Inc.
4225 Via Arbolada, Suite 521
Los Angeles, CA 90042
(323) 481-4671 (323) 227-0693 (Fax)
[email protected]


Project Team:

Mark Christoffels, City Engineer
City of Long Beach, Department of Public Works
333 W. Ocean Blvd., 9th Floor
Long Beach, CA 90802
(562) 570-6771 (562) 570-6012 (Fax)
[email protected]

Edward Villanueva, Administrative Analyst
City of Long Beach, Department of Public Works
333 W. Ocean Blvd., 9th Floor
Long Beach, CA 90802
(562) 570-6771 (562) 570-6012 (Fax)
[email protected]

Roger Beaman, Administrative Analyst
City of Long Beach, Department of Public Works
333 W. Ocean Blvd., 10th Floor
Long Beach, CA 90802
(562) 570-6771 (562) 570-6012 (Fax)

Alex J. Vidaurrazaga, Principal Civil Engineer, S.E.
City of Los Angeles, Department of Public Works, Bureau of Engineering
Structural Engineering Division
650 South Spring Street, Suite 400
Los Angeles, CA 90014
(213) 847-8773 (213) 847-8999 (Fax)
[email protected]

Shailesh "Sunny" Patel, Senior Structural Engineer, S.E.
City of Los Angeles, Department of Public Works, Bureau of Engineering
Structural Engineering Division
650 South Spring Street, Suite 400
Los Angeles, CA 90014
(213) 847-8774 (213) 847-8999 (Fax)
[email protected]

Claudette R. Ford, P.E., Director
City of Oakland, Public Works Agency
250 Frank H. Ogawa Plaza, Suite 4314
Oakland, CA 94612
(510) 238-3961 (510) 238-2233 (Fax)
[email protected]


Raul Godinez, P.E., Assistant Director
City of Oakland, Public Works Agency
250 Frank H. Ogawa Plaza, Suite 4314
Oakland, CA 94612
(510) 238-4420 (510) 238-6412 (Fax)
[email protected]

Michael Neary, P.E., Engineering Division Manager
City of Oakland, Public Works Agency
250 Frank H. Ogawa Plaza, Suite 4314
Oakland, CA 94612
(510) 238-6659 (510) 238-7227 (Fax)
[email protected]

David Lau, P.E., Project Delivery Manager
City of Oakland, Public Works Agency
250 Frank H. Ogawa Plaza, Suite 4344
Oakland, CA 94612
(510) 238-7131 (510) 238-2085 (Fax)
[email protected]

Jose Martinez, Project Manager
City of Oakland, Public Works Agency
250 Frank H. Ogawa Plaza, Suite 4314
Oakland, CA 94612
(510) 238-6864 (510) 238-6412 (Fax)
[email protected]

Fran Halbakken, Project Delivery Manager
City of Sacramento, Department of Public Works
927 10th Street, 1st Floor
Sacramento, CA 95814
(916) 264-7194 (916) 264-8281 (Fax)
[email protected]

Shirley Wong, Analyst
City of Sacramento, Department of Public Works
927 10th Street
Sacramento, CA 95814
(916)
[email protected]


Robin Borre, Analyst
City of Sacramento, Department of Public Works
927 10th Street
Sacramento, CA 95814
(916)
[email protected]

Frank Belock, P.E., Director
City of San Diego
Engineering & Capital Projects Department
202 "C" Street, MS 9B
San Diego, CA 92101
(619) 236-6274 (619) 533-4736 (Fax)
[email protected]

Patti Boekamp, P.E., Chief Deputy Director
City of San Diego
Engineering & Capital Projects Department, Transportation Engineering Division
1010 2nd Avenue, Suite 1200, MS 612
San Diego, CA 92101
(619) 533-3173 (619) 533-3071 (Fax)
[email protected]

Darren Greenhalgh, P.E., Senior Civil Engineer
City of San Diego
Engineering & Capital Projects Department, Architectural Engineering and Contract Services Division
1010 2nd Avenue, Suite 1400, MS 614
San Diego, CA 92101
(619) 533-3104 (619) 533-3112 (Fax)
[email protected]

Richard Leja, P.E., Senior Civil Engineer
City of San Diego
Engineering & Capital Projects Department, Transportation Engineering Division
1010 2nd Avenue, Suite 1200, MS 611
San Diego, CA 92101
(619) 533-3764 (619) 533-3071 (Fax)
[email protected]


George Qsar, P.E., Senior Civil Engineer
City of San Diego
Engineering & Capital Projects Department, Field Engineering Division
9485 Aero Drive
San Diego, CA 92123
(858) 627-3240 (858)
[email protected]

Jeffrey A. Shoaf, P.E., Senior Civil Engineer
City of San Diego
Engineering & Capital Projects Department, Water & Sewer Design Division
600 B Street, Suite 800, MS 908A
San Diego, CA 92101
(619) 533-5109 (619) 533-5176 (Fax)
[email protected]

Edwin M. Lee, Director of Public Works
City and County of San Francisco, Dept. of Public Works
City Hall, Room 348
1 Dr. Carlton B. Goodlett Place
San Francisco, CA 94102
(415) 554-6920 (415) 554-6944 (Fax)
[email protected]

Robert Beck, Acting Deputy Director of Public Works
City and County of San Francisco, Dept. of Public Works
City Hall, Room 348
1 Dr. Carlton B. Goodlett Place
San Francisco, CA 94102
(415) 554-6940 (415) 554-6944 (Fax)
[email protected]

Nelson Wong, Bureau Manager
City and County of San Francisco, Dept. of Public Works, Bureau of Engineering
30 Van Ness Avenue, 5th Floor
San Francisco, CA 94102
(415) 558-4517 (415) 558-4519 (Fax)
[email protected]

Steven T. Lee, Electrical Engineer
City and County of San Francisco, Dept. of Public Works, Bureau of Engineering
30 Van Ness Avenue, 5th Floor
San Francisco, CA 94102
(415) 558-5226 (415) 558-4590 (Fax)
[email protected]


Katy Allen, Director
City of San Jose, Department of Public Works
801 N. First Street, 320
San Jose, CA 95110
(408) 277-4339 (408) 277-3156 (Fax)
[email protected]

David Clarke, P.E., Deputy Director
City of San Jose, Department of Public Works
801 N. First Street, 320
San Jose, CA 95110
(408) 277-3833 (408) 277-3156 (Fax)
[email protected]

Kevin Briggs, P.E., Senior Civil Engineer
City of San Jose, Department of Public Works
801 N. First Street, 340
San Jose, CA 95110
(408) 277-2972 (408) 277-3869 (Fax)
[email protected]

Lisa Cheung, CIP Specialist
City of San Jose, Office of the City Manager
801 N. First Street, 340
San Jose, CA 95110
(408) 794-1471 (408)
[email protected]

Dale Schmidt, Associate Civil Engineer
City of San Jose
801 N. First Street, 300
San Jose, CA 95110
(408) 277-3693 (408) 277-3668 (Fax)
[email protected]

Alfredo Iraheta, Analyst
City of San Jose
801 N. First Street, 300
San Jose, CA 95110
(408) 277-2496 (408) 277-3668 (Fax)
[email protected]


Appendix A

Administrative Information


Appendix B

Performance Benchmarking


CURVES GROUP 1

Design Cost / Construction Cost versus Total Construction Cost
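As a reader's aid (the curve figures themselves are not reproduced here), a minimal sketch of the quantity plotted in this curve group: design cost as a fraction of total construction cost, one point per project. The project names and dollar figures below are invented for illustration.

```python
# Each project contributes one point: x = total construction cost,
# y = design cost / construction cost. The study fits regression curves
# through these points per agency and globally. Data below are invented.
projects = [
    ("Library X", 210_000, 1_500_000),   # (name, design cost, construction cost)
    ("Library Y", 480_000, 4_000_000),
]

points = [(cc, dc / cc) for _, dc, cc in projects]
for cc, ratio in points:
    print(f"${cc:,}: design/construction = {ratio:.3f}")
```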


* 2 bike projects had zero design costs and are excluded from this model.


CURVES GROUP 2

Construction Management Cost / Construction Cost versus Total Construction Cost


CURVES GROUP 3

Delivery Cost / Construction Cost versus Total Construction Cost


CURVES GROUP 4

Change Order Cost / Construction Cost versus Total Construction Cost


Page B-63

Page 135: Table of - Bureau of Engineering · Table of TOC Contents Page TOC-1 CHAPTER 1 -EXECUTIVE SUMMARY 1 A. Introduction 1 B. Study Methodology 2 C. Conclusions of Process Benchmarking:

Page B-64

Page 136: Table of - Bureau of Engineering · Table of TOC Contents Page TOC-1 CHAPTER 1 -EXECUTIVE SUMMARY 1 A. Introduction 1 B. Study Methodology 2 C. Conclusions of Process Benchmarking:

Appendix B - Performance Benchmarking

CURVES GROUP 5

Change Order Cost / Construction Cost Versus Total Construction Cost

[Curves Group 5 figures, pages B-65 through B-80, not reproduced]

APPENDIX C - Outliers Identification

Outliers, in a regression model, are the data points that lie "too far" from the regression curve. The classic criteria for identifying statistical outliers are based on a standard rule of thumb².

Alternatively, a Confidence Interval (CI) may be used to improve the correlation coefficient more effectively. In this technique, a confidence interval around the regression curve is defined, and all points outside this range are considered outliers (see Figure C-1).

Selection of the confidence interval is based on the trade-off between the number of data points that can be set aside and the improvement that can be achieved. The more data points that are excluded, the more the R2 will improve. However, the model becomes unrealistic if too many data points are excluded.

It is noteworthy that, in this study, outlier identification was merely a tool to identify projects whose project delivery costs were unusually high or low compared to the general trend. This provides a means of distinguishing projects with the potential to be abnormal (non-representative). Under no circumstances should statistical outliers be used as the basis for eliminating a project without other justification.

Consequently, selection of the CI is somewhat arbitrary and is defined by the acceptability of the maximum and minimum project delivery costs. For example, practical experience has shown that a project with a project delivery cost of more than 50% has the potential to be non-representative; therefore, the upper-bound curve should not go beyond 50% in Figure C-1. The team reviewed all performance curves and found that, in general, a 51% confidence interval (m ± 0.75s) identifies the acceptable range in all graphs. In other words, projects beyond [m - 0.75s, m + 0.75s] are worth reviewing for possible abnormal behavior (i.e., a non-standard delivery process).

A computer program was developed to apply this outlier identification technique to all total project delivery performance models. A list of all outliers (75 projects) was shared with the Project Team, and 34 of them were found to be abnormal. The abnormal (non-representative) projects were all projects with total project delivery costs higher than 50% or lower than 15%.

² See Appendix B-II of the Phase I report for details of this technique.
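This confidence-interval screen can be sketched in a few lines of code. The sketch below assumes a logarithmic trend line of the form y = a·ln(x) + b (the curve family shown in the report's charts); the function and variable names are illustrative, not the study's actual program, and only the m ± 0.75s band is taken from the text.

```python
import math

def flag_outliers(costs, delivery_pcts, k=0.75):
    """Flag points whose residual from a fitted logarithmic trend line
    y = a*ln(x) + b falls outside mean +/- k*stdev of the residuals.

    Illustrative sketch only; k = 0.75 mirrors the [m - 0.75s, m + 0.75s]
    band described in Appendix C.
    """
    xs = [math.log(c) for c in costs]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(delivery_pcts) / n
    # Ordinary least squares on (ln cost, delivery %)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, delivery_pcts)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    residuals = [y - (a * x + b) for x, y in zip(xs, delivery_pcts)]
    m = sum(residuals) / n
    s = (sum((r - m) ** 2 for r in residuals) / n) ** 0.5
    # True = outside the band, i.e. worth reviewing as a possible outlier
    return [not (m - k * s <= r <= m + k * s) for r in residuals]
```

Points flagged this way are candidates for review only; as the text stresses, they should not be dropped without other justification.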

[Figure C-1 not reproduced]

Annual Report Update 2003 California Multi-Agency CIP Benchmarking Study


APPENDIX D - Multi-Agency Benchmarking Database (Update 2003)

A benchmarking database was originally designed for the 2002 California Multi-Agency Benchmarking Study. This database (CALBM) was modified and improved for the specific purposes of the Update 2003 study.

Upon execution of the database program, the Multi-Agency logo appears on the monitor, as shown in Figure D-1.

The user has the option of opening a form for data entry and review, developing benchmarking models, or opening a report:

The Project Data form, shown in Figure D-2, can be used to review all project data provided by the various agencies and to add new project data.

The Curves form is a tool for developing instantaneous performance models based on criteria selected from the form options. For example, Figure D-4 is the performance model that was developed from the options selected in Figure D-3. This form is the most useful feature of the database and was the main tool for performing performance benchmarking in both the 2002 and 2003 benchmarking studies.
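The filter-then-model behavior of the Curves form can be illustrated with a small query. The actual CALBM database is a Microsoft Access application; the table name, column names, and sample values below are hypothetical, and SQLite is used only to keep the sketch self-contained.

```python
import sqlite3

# Hypothetical stand-in for the CALBM project table; the real schema
# is not reproduced in this report.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE projects ("
    "agency TEXT, category TEXT, construction_cost REAL, design_pct REAL)"
)
conn.executemany(
    "INSERT INTO projects VALUES (?, ?, ?, ?)",
    [
        ("A", "Libraries", 2.0, 0.21),
        ("B", "Libraries", 5.5, 0.15),
        ("A", "Streets", 1.2, 0.30),
    ],
)

# Choosing a project category on the form reduces to a filtered query;
# the resulting (cost, performance) pairs feed the curve fit.
rows = conn.execute(
    "SELECT construction_cost, design_pct FROM projects "
    "WHERE category = ? ORDER BY construction_cost",
    ("Libraries",),
).fetchall()
print(rows)  # -> [(2.0, 0.21), (5.5, 0.15)]
```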

Figure D-1. CALBM Database View


Figure D-2. Project Data Form

Figure D-3. Curves Form


Numerous reports are available, as shown in Figure D-5. These reports are updated instantly as additional data are added to the database. Many of the CALBM reports are used in both the 2002 and 2003 study reports. Figure D-6 shows the consultant usage summary report as an example.

A detailed explanation of the CALBM database is given in the project manual, which is provided as a "README" file in the enclosed copy of the database.

Figure D-4. Performance Model: Output of Figure D-3

[Figure D-4 chart: Municipal Facilities - Libraries; Design Percentage versus Total Construction Cost ($Million), x-axis 0-10, y-axis 0%-60%; R2 = 0.4915, N = 25; data points for Agencies A, B, C, D, F, and G with trend lines Log. (Global) and Log. (Global-UB); chart not reproduced]


Figure D-5. Various Reports Available in the CALBM Database

[Figure D-6 not reproduced]