
Automated Pavement Condition Surveys

Monday, October 21, 2019, 2:00-3:30 PM ET

TRANSPORTATION RESEARCH BOARD

The Transportation Research Board has met the standards and requirements of the Registered Continuing Education Providers Program. Credit earned on completion of this program will be reported to RCEP. A certificate of completion will be issued to participants that have registered and attended the entire session. As such, it does not include content that may be deemed or construed to be an approval or endorsement by RCEP.

Purpose

Provide a summary of the findings from the National Cooperative Highway Research Program (NCHRP) Synthesis 531, Automated Pavement Condition Surveys

Learning Objectives

At the end of this webinar, you will be able to:
• Describe how agencies conduct and ensure data quality for automated pavement condition surveys

Transportation Research Board Webinar
October 21, 2019

Automated Pavement Condition Surveys

NCHRP Project 20-05, Topic 49-15
NCHRP Synthesis 531

Linda Pierce
NCE

Spokane, Washington

Purpose

Summarize highway agency practice with semi- and fully automated pavement condition surveys and data quality management plans

Outline

• Introduction & scope • Literature review summary• Survey results summary• Summary of agency data

quality management plans• Concluding remarks• Questions

Learning Objective

Better understanding of how highway agencies conduct & ensure data quality for automated pavement condition surveys

Introduction

Jo Allen Gause
Senior Program Manager
NCHRP

Linda Pierce
Principal Investigator

Nick Weitzel
Staff Engineer

Project Panel
• Bouzid Choubane, FLDOT
• Dulce Rufino Feldman, Caltrans
• Tom Kazmierowski, Golder & Associates
• Michael Mariotti, NYSDOT
• Magdy Mikhail, AgileAssets
• John Senger, ILDOT
• James Tsai, Georgia Tech
• Andy Mergenmeier, FHWA
• Larry Wiser, FHWA

Scope

• Document agency practices, challenges, & successes related to:
- Condition type
- Technologies
- Data processing
- Data quality management
- Data utilization

Literature Review

Agency Survey

Case Examples

Synthesis

Literature Review Results

Previous Synthesis Studies

Percent of agencies collecting each condition type, by synthesis year:

Year | IRI | Rutting | Faulting | Distress
1986 | 5% | 0% | 0% | 23%
1991 | 82% | 23% | 0% | 33%
2004 | 91% | 82% | 41% | 59%

Survey Types

• 2D
- Area scan (lighting requirements)
- Line scan (w/o lighting influence)
- Human rater or analysis software
• 3D
- 2D intensity (reflected light, e.g., stripping, cracking, aggregate)
- 3D range (elevation, e.g., cracking, potholes, spalling; a simplified detection sketch follows this list)
- Analysis software
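Because the 3D range channel is an elevation map, one way to picture how candidate cracks, potholes, and spalls are found is to look for points that sit measurably below the surrounding surface. The following is only a minimal sketch of that idea in Python (NumPy/SciPy); it is not any vendor's algorithm, and the 2 mm threshold and filter window are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def flag_depressions(range_mm, drop_mm=2.0, window=25):
    """Flag pixels sitting more than drop_mm below the local pavement surface.

    range_mm: 2D NumPy array of surface elevations (mm) from a 3D range sensor.
    Returns a boolean mask of candidate crack/pothole/spall pixels.
    """
    # A wide median filter estimates the intact surface because it is
    # largely insensitive to narrow cracks and small potholes.
    surface = median_filter(range_mm, size=window)
    return (surface - range_mm) > drop_mm
```

Production analysis software would add noise handling, grouping of flagged pixels into individual cracks, width and length measurement, and classification into the agency's distress types.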

Data Quality Management Plans (23 CFR Part 490)

• Equipment calibration & certification
- IRI
- Cracking percent
- Rutting
- Faulting
• Certification process for persons performing manual data collection
• Quality control before & periodically during data collection
• Data sampling, review & checking processes
• Error resolution procedures & data acceptance criteria (an illustrative acceptance check follows this list)
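To make the data acceptance criteria element concrete, here is a minimal, hypothetical range-and-completeness check of the kind a DQMP might specify for each reported section. The field names and limits are illustrative assumptions, not values taken from 23 CFR Part 490.

```python
# Hypothetical per-section acceptance checks; thresholds are placeholders.
PLAUSIBLE_RANGES = {
    "iri_in_per_mi": (20.0, 700.0),    # roughness (IRI)
    "rut_depth_in": (0.0, 3.0),        # average rut depth
    "faulting_in": (0.0, 1.5),         # joint faulting
    "percent_cracking": (0.0, 100.0),  # HPMS cracking percent
}

def check_record(record):
    """Return a list of error messages for one pavement section record."""
    errors = []
    for field, (low, high) in PLAUSIBLE_RANGES.items():
        value = record.get(field)
        if value is None:
            errors.append(f"{field}: missing")
        elif not (low <= value <= high):
            errors.append(f"{field}: {value} outside [{low}, {high}]")
    return errors
```

Records failing such checks would then be routed through the error resolution procedures discussed later in the webinar.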

Agency Survey Results

(not all results are presented)

Data Collection Methods

• Fully automated (16)
• Fully & semi-automated (21)
• Manual & automated (6)
• Manual (6)
• Nearly 90% use automated pavement condition surveys

Total responses = 57

Full vs Semi

Number of agencies, by pavement type:

Pavement Type | Fully Automated | Semi- and Fully Automated
Asphalt (50 responses) | 26 | 24
JPCP (41 responses) | 18 | 23
CRCP (19 responses) | 8 | 11

Experience

• > 10 years (22)• 5 to 10 years (16)• 1 to 4 years (9)

Total responses = 47

Asphalt

Condition | Fully Automated | Semi-Automated | Manual | Total No. of Responses
IRI | 55 | 0 | 0 | 55
Rutting | 53 | 0 | 3 | 56
Longitudinal cracking | 33 | 9 | 9 | 51
Transverse cracking | 32 | 13 | 10 | 55
Cross slope | 30 | 0 | 1 | 31
Alligator cracking | 29 | 15 | 10 | 54
Edge cracking | 19 | 10 | 4 | 33
Texture | 19 | 1 | 2 | 22
Block cracking | 16 | 11 | 7 | 34
Reflection cracking | 16 | 7 | 4 | 27
Potholes | 14 | 13 | 9 | 36
Raveling | 14 | 11 | 10 | 35
Patching | 10 | 15 | 11 | 36
Bleeding | 10 | 9 | 9 | 28

Total responses = 57

Jointed Plain Concrete (JPC)

Condition | Fully Automated | Semi-Automated | Manual | Total No. of Responses
IRI | 44 | 0 | 0 | 44
Faulting | 37 | 3 | 2 | 42
Cross slope | 20 | 1 | 1 | 22
Longitudinal cracking | 20 | 13 | 7 | 40
Transverse cracking | 16 | 17 | 6 | 39
Texture | 12 | 1 | 2 | 15
Patching | 8 | 14 | 7 | 29
Corner cracking | 7 | 16 | 7 | 30
Spalling | 7 | 15 | 8 | 30
Joint seal damage | 6 | 7 | 7 | 20
Lane/shoulder drop off | 6 | 4 | 5 | 15
Durability | 4 | 9 | 6 | 19
Map cracking | 4 | 7 | 2 | 13
Blowups | 2 | 6 | 3 | 11

Total responses = 44

Continuously Reinforced Concrete

Condition | Fully Automated | Semi-Automated | Manual | Total No. of Responses
IRI | 19 | 0 | 0 | 19
Cross slope | 9 | 0 | 0 | 9
Longitudinal cracking | 8 | 7 | 2 | 17
Transverse cracking | 6 | 6 | 1 | 13
Texture | 6 | 0 | 1 | 7
Punchout | 5 | 8 | 1 | 14
Lane/shoulder drop off | 5 | 1 | 2 | 8
Spalling | 3 | 4 | 1 | 8
Patching | 3 | 7 | 2 | 12
Durability | 3 | 3 | 2 | 8
Scaling | 1 | 1 | 1 | 3
Map cracking | 1 | 3 | 0 | 4
Polished aggregate | 0 | 2 | 1 | 3
Blowups | 0 | 4 | 2 | 6

Total responses = 19

Frequency of Collection

Number of agencies, by network and collection frequency:

Interstate every 2 years | 1
Non-NHS every 4 years | 1
Off Highway System NHS every 3 or more years | 1
Canadian provincial highways every 2 years | 5
Canadian provincial highways annually | 6
NHS every 2 years | 9
Off Highway System NHS every 2 years | 10
Off Highway System NHS annually | 13
Non-NHS every 2 years | 22
Non-NHS annually | 23
NHS annually | 40
Interstate annually | 44

Total responses = 56

Quality Management Plans

[U.S. map: states from which a data quality plan was received]

• Alberta
• British Columbia
• Saskatchewan
• Quebec

Total responses = 29 US, 4 CA

Data Quality Process

Source: Flintsch and McGhee 2009, adapted from Shekharan et al. 2007

Standards & Protocols

Category | Standard / Protocol | No. of Agencies
Condition manual | HPMS Field Manual | 24
Condition manual | Agency Manuals | 14
Condition manual | LTPP | 6
Profile equipment | AASHTO R 56 | 22
Profile equipment | AASHTO M 328 | 18
Profile equipment | AASHTO R 57 | 17
Faulting | AASHTO R 36 | 18
Roughness | AASHTO R 43 | 17
Roughness | ASTM E1926 | 4
Roughness | AASHTO PP 37 | 2
Roughness | ASTM E1489 | 1
Measuring profile | AASHTO PP 70 | 16
Measuring profile | ASTM E950 | 15
Measuring profile | ASTM E1656 | 4
Rutting | AASHTO R 48 | 12
Rutting | ASTM E1703 | 3
Rutting | AASHTO PP 38 | 2
Rutting | AASHTO PP 69 | 13
Asphalt cracking | AASHTO R 55 | 8
Asphalt cracking | AASHTO PP 67 | 6
Images | AASHTO PP 68 | 6

Total responses = 57

DQMP Distress Types

Asphalt (25 responses)

Distress Type | No. of Agencies
Longitudinal cracking | 19
Transverse cracking | 19
Alligator cracking | 18
Percent cracking (HPMS) | 15
Patching | 12
Block cracking | 9
Pothole | 8
Raveling | 8
Bleeding | 8
Miscellaneous cracking | 5
Edge cracking | 5
Longitudinal joint cracking | 5

JPCP (17 responses)

Distress Type | No. of Agencies
Cracked slabs (HPMS) | 11
Transverse cracking | 11
Longitudinal cracking | 11
Patching | 10
Joint spalling | 9
Corner cracking | 8
Multiple cracking | 8

CRCP (6 responses)

Distress Type | No. of Agencies
Longitudinal cracking | 5
Punchouts | 5
Patching | 5

Rater Training

• California
- 1 week training
- Minimize discrepancies
• New Hampshire
- 15 certification sections
- Personnel required to rate to a satisfactory level
• Pennsylvania
- Vendor trains all raters
- Evaluate 6 calibration sites
- Meet agency accuracy & repeatability requirements
• Texas
- Surface distress rating class
- Written test certification

Quality Control

• Activities conducted by data collection team, for example (an illustrative check follows this list):
- Data completeness
- Location information
- Linear reference system
- Speed
- Data
- Geometry
- Condition & distress

22 agency requirements summarized in Synthesis
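As an illustration of the completeness, speed, and location checks listed above, the sketch below summarizes one day of collection. It assumes hypothetical record fields (length_mi, avg_speed_mph, begin_milepost) and illustrative speed limits rather than any agency's actual requirements.

```python
# Illustrative daily quality-control summary for the collection crew.
def qc_daily(sections, planned_miles, min_speed_mph=15.0, max_speed_mph=65.0):
    """Summarize completeness, collection speed, and location-data problems."""
    collected_miles = sum(s["length_mi"] for s in sections)
    bad_speed = [s["section_id"] for s in sections
                 if not (min_speed_mph <= s["avg_speed_mph"] <= max_speed_mph)]
    missing_location = [s["section_id"] for s in sections
                        if s.get("begin_milepost") is None]
    return {
        "percent_complete": 100.0 * collected_miles / planned_miles,
        "out_of_speed_range": bad_speed,
        "missing_location": missing_location,
    }
```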

Control, Verification, Blind Site Requirements

• Monitor & ensure data (& image) quality prior to & during data collection

• Conducted to:- Certify, calibrate, &

verify equipment- Establish reference values

Control• Agency testing• Typically same sites

year-to-year• Establish reference valuesVerification• Agency testing• Spread across network• Typically not used to

establish reference values• Location typically known

by rating crew• May be traversed multiple

timesBlind• Same as verification,

except location unknown to rating crew

23 agency requirements summarized in Synthesis
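Because control sites carry agency-established reference values and may be traversed multiple times, a basic check compares the mean of the repeat runs to the reference value (bias) and the spread between runs (repeatability). The sketch below uses illustrative tolerances only; agencies set their own.

```python
import statistics

def control_site_check(runs_iri, reference_iri, bias_tol=5.0, repeat_tol=3.0):
    """Compare repeat IRI runs on a control site to the agency reference value.

    runs_iri: IRI results from two or more passes over the site (e.g., in/mi).
    bias_tol / repeat_tol: illustrative tolerances in the same units.
    """
    mean_iri = statistics.mean(runs_iri)
    return {
        "bias": mean_iri - reference_iri,
        "bias_ok": abs(mean_iri - reference_iri) <= bias_tol,
        "repeatability_ok": statistics.stdev(runs_iri) <= repeat_tol,
    }
```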

Acceptance

• Activities performed to assess the quality of the submitted condition data & images
- Data requirements: completeness, acceptable range, etc.
- Images: clarity, stitching, missing images, etc.
- Condition data: meet requirements, repeatability, etc.
• Based on random sample (typical; a sampling sketch follows this list)

24 agency requirements summarized in Synthesis
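Since acceptance is typically based on a random sample of the delivered sections and images, a minimal sampling sketch follows. The 5% rate and section identifiers are illustrative assumptions, not values reported in the synthesis.

```python
import random

def acceptance_sample(section_ids, rate=0.05, seed=None):
    """Return a random subset of delivered sections for acceptance review."""
    rng = random.Random(seed)
    k = max(1, round(rate * len(section_ids)))
    return rng.sample(section_ids, k)

# Example: pick roughly 5% of 1,200 delivered sections for image/data review.
sample = acceptance_sample([f"SEC-{i:04d}" for i in range(1200)], rate=0.05)
```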

Error Resolution

• Activities conducted in the event collected data & images do not meet agency requirements
- Notification to proceed
- Reprocess data
- Recollect data (& images)
- Recalibrate equipment and recollect

3 agency requirements summarized in Synthesis

Independent Review

• Third-party review of collected data & images

• Reviews based on agency acceptance requirements

2 agency practices discussed in Synthesis

Integration

• Linking data sets from different sources (an LRS matching sketch follows this list)
- Spatial file provided to data collection crew
• Challenges (13)
- Matching locations (4)
- Software formats & systems (2)
- Comparison of manual & automated cracking, IT support, data consistency, new algorithms & verification protocols, changing technologies (1 each)
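One common integration step is attaching each condition record to the agency's linear referencing system (LRS) by route and milepost, which is where the "matching locations" challenge typically shows up. The sketch below is a simplified illustration with hypothetical field names; real integrations must also handle route renaming, overlapping segments, and milepost equations.

```python
def match_to_lrs(condition_records, lrs_segments):
    """Attach an LRS segment ID to each condition record whose midpoint
    falls within a segment on the same route."""
    matched = []
    for rec in condition_records:
        midpoint = (rec["begin_mp"] + rec["end_mp"]) / 2.0
        for seg in lrs_segments:
            if (seg["route"] == rec["route"]
                    and seg["begin_mp"] <= midpoint < seg["end_mp"]):
                matched.append({**rec, "lrs_segment_id": seg["segment_id"]})
                break
    return matched
```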

Storage

• Majority of agencies store (16):
- Images (16)
- Raw data (14)
- Condition index (10)
• Typical format (16):
- Database (7)
- Database and spreadsheet (3)
- JPEG (6)
- Vendor-hosted image site (3)

Retention Schedule

• Type of information and length of time information is kept (a simple retention-rule sketch follows this list)
• Agency practice (15)
- Data & images indefinitely (5)
- Data only indefinitely (3)
- Data & images for 4 years (2)
- Data & images 10+ years (2)
- Data only 10 years (1)
- Data & images 20 years + indices indefinitely (1)
- Data 30 or more years (1)
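A retention schedule of this kind can be expressed as a simple lookup from information type to retention period. The periods below are examples drawn from the range of agency practices above, not a recommendation.

```python
# Illustrative retention-schedule table; None means keep indefinitely.
RETENTION_YEARS = {
    "raw_data": None,
    "pavement_images": 10,
    "condition_indices": None,
}

def is_expired(item_type, age_years):
    """True if an item of this type is older than its retention period."""
    limit = RETENTION_YEARS.get(item_type)
    return limit is not None and age_years > limit
```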

Survey Costs

WARNING: not everything is as it seems

Survey Costs

• Agency
- Hours may not be tracked by specific task
- May or may not include all costs
• Vendor
- Lump sum versus line item
- Economy of scale adjustment
- Includes all costs
• Other
- Time frame for deliverables
- Analysis methods
- Distress types collected & analyzed

Survey Costs

In general, larger networks result in lower per-mile costs

Survey Type | Network | Collects / Analyzes | Cost / mi (km)
Semi-automated | Small | Agency | $159 ($99)
Semi-automated | Large | Vendor / Agency | $82 ($51)
Semi-automated | X-large | Vendor / Agency | $34 and $50 ($21 and $31)
Semi-automated | X-large | Vendor | $76 and $101 ($47 and $63)
Fully automated | Medium | Agency | $199 ($165)
Fully automated | Medium | Vendor | $43 ($27)
Semi- and fully automated | Small | Vendor | $75 and $115 ($47 and $71)
Semi- and fully automated | Medium | Vendor | $65 ($40)
Semi- and fully automated | Large | Vendor | $28 ($17)
Semi- and fully automated | X-large | Vendor / Agency | $58 ($36)
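As a reading aid for the table, the per-kilometer figures follow from dividing the per-mile figures by 1.609 km/mi; for example, $82 per mile / 1.609 ≈ $51 per kilometer.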

Accomplishments

• Safer, faster, efficient, & consistent
• Satisfaction with automated crack detection
• Data and images used by other offices
• Great tool for assisting with project selection

Challenges

• Determining data quality tolerances
• Ground truth testing
• Standardized methods
• Consistent rut depth measurements
• Protocols & algorithms for new datasets and metrics
• Resources for collection to delivery
• Generating reports, trends, & project assessment
• Distress ratings year-to-year and vendor-to-vendor

Case Examples

British Columbia
North Dakota
Pennsylvania

British Columbia

Survey Type | Collects / Analyzes | Comments
Fully automated | Vendor | 5,590 mi (9,000 km); asphalt pavement only

Category | Condition or Distress Type | Protocol
Cracking | Alligator, longitudinal joint, longitudinal wheel path, meandering longitudinal, pavement edge, and transverse | Agency Rating Manual
Defects | Bleeding and potholes | Agency Rating Manual
Rutting | Rut depth | AASHTO PP 70
Roughness | IRI | ASTM E950, ASTM E1926, AASHTO M 328
Images | Pavement surface and ROW | Data collection contract

North Dakota

Survey Type | Collects / Analyzes | Comments
Semi- & fully automated | Agency | 8,500 mi (13,679 km)

Category | Condition or Distress Type | Protocol
Distress | Alligator, block, corner, durability, longitudinal, reflection, spalling, transverse | AASHTO PP 67, PP 68 & R 55; ASTM E1656; LTPP & agency manual
Defects | Lane/shoulder drop-off, patching, blowups | Agency manual
Faulting | Faulting | AASHTO R 36
Rutting | Rut depth | AASHTO PP 69, PP 70 & R 48
Profile | IRI, cross slope | ASTM E1926; AASHTO M 328, R 43 & R 57
Images | Signs legible, exposure, color balance, 0.08 in (2 mm) crack visible and detectable

Pennsylvania

Survey Type | Collects / Analyzes | Comments
Semi- & fully automated | Vendor | 28,000 mi (45,061 km) state owned; 2,000 mi (3,219 km) local Federal Aid

Category | Condition or Distress Type | Protocol
Cracking | Broken slabs, transverse, fatigue, miscellaneous, edge, edge joint | AASHTO PP 67, PP 68, R 55, & agency manual
Defects | Joint spalling, patching | Agency manual
Faulting | Faulting | AASHTO R 36 & R 57
Rutting | Rut depth | AASHTO R 48 & PP 70
Roughness | IRI | AASHTO M 328, R 43, R 56, R 57
Images | 2.5% random sample; agency rates images and compares to vendor; used for acceptance

Concluding Remarks

• Decades of condition assessment

• Importance of data quality

Manual | Walking | Pen & paper
2D (semi) | Slower speed | View on-screen
3D (fully) | Posted speed | Minimal user

Today ~90% of agencies surveyed use fully automated surveys

Suggestions for Future Research

• Standardized method for evaluating distress algorithms
• Accuracy of crack detection on high macrotexture surfaces
• Effort needed to establish certification facilities
• IRI from 3D profile measurements
• Impact of changing equipment or service provider

Q&A

Linda Pierce
(505) 603-7993

lpierce@ncenet.com

Today’s Participants

• Linda Pierce, NCE, LPierce@ncenet.com

• Magdy Mikhail, Agile Assets, MMikhail@agileassets.com

Panelists' Presentations

http://onlinepubs.trb.org/onlinepubs/webinars/191021.pdf

After the webinar, you will receive a follow-up email containing a link to the recording

Get Involved with TRB
• Getting involved is free!
• Join a Standing Committee (http://bit.ly/2jYRrF6)
• Become a Friend of a Committee (http://bit.ly/TRBcommittees)
- Networking opportunities
- May provide a path to become a Standing Committee member
• For more information: www.mytrb.org
- Create your account
- Update your profile