Pam Heinselman NOAA National Severe Storms Laboratory


Transcript of Pam Heinselman NOAA National Severe Storms Laboratory

Page 1

Exploring the Impact of Rapid-scan Radar Data on NWS Warnings

Pam Heinselman, NOAA National Severe Storms Laboratory
Daphne LaDue, OU CAPS
Heather Lazrus, NCAR

Warn on Forecast Workshop, 8 Feb 2012

Page 2

Motivation

Stakeholders' needs: Faster Updates (62%)

Source: Radar Operations Center

Page 3

Motivation

Page 4

Motivation

"To enhance the ability to monitor rapid tornadogenesis, the NWS should develop and implement additional Volume Coverage Pattern strategies that allow for more continuous sampling near the surface (e.g., 1-min lowest elevation sampling)."

Page 5

Motivation: Benefit Assessment

JDOP (Burgess et al. 1979; Whiton et al. 1998): performance improvement resulting from adding Doppler capability

JPOLE (Scharfenberg et al. 2005): case examples illustrating how polarization aided operations, including understanding of storm severity, warning decisions, wording in follow-up statements, and confidence

Page 6

Objective

Explore how improvements in the depiction of storm development from rapid sampling may benefit forecasters' decision-making process.

Page 7

Background: NWS Decision Making

Andra et al. (2002): description of five warning decision factors exemplified during the 3 May 1999 tornado outbreak

1. Scientifically based conceptual models
2. Two primary data sets: Doppler radar & ground truth
3. Workstations and software
4. Strategy
5. Expertise

Page 8

Background: NWS Decision Making

Morss and Ralph (2007): participant observation & structured interviews during PACJET and CALJET

Assessed operational benefit of forecaster use of offshore, gap-filling observations provided during CALJET

Use of additional data appeared to help forecasters:
- In some cases, data improved specificity of the forecast
- When the initial forecast was fairly accurate, use of the data increased forecaster confidence

Page 9

Temporal Resolution Experiment

12 forecasters, 12-30 April 2010

Tuesday Afternoon: Introduction to PAR & WDSS-II training
Tuesday Evening and Wednesday: Gain experience interrogating PAR data and issuing warnings using WDSS-II WARNGEN
Thursday

Page 10

Temporal Resolution Experiment

Paired forecasters w/ similar radar analysis skills

Worked a tropical supercell event that produced an EF1 tornado (unwarned)

Pair 1: 43-s updates
Pair 2: 4.5-min updates

Page 11

[Figure: 43-s updates vs. 4.5-min updates, 19 Aug 2007 case]

Page 12

Data We Collected

What they did:
- Audio of the teams working through situation awareness and the case
- Video of computer screens
- Products issued

What we saw:
- Two observers took notes in each room

Page 13

Data We Collected

What they thought they did:
- Teams debriefed individually
- Joint debrief to compare across teams
- Each individual ranked factors in their warning decision
- Each individual completed a confidence continuum

Page 14

Coding and Thematic Analysis

[Diagram labels: Experiment Design & Software; Data used; Understanding decision process; Cognitive Actions Emotions]

Page 15

Example Analysis: 43-s Team Decision Process

PAR volume times (UTC): 01:37:09, 01:38:35, 01:40:01, 01:40:44, 01:41:17, 01:42:10, 01:42:53, 01:43:36, 01:44:19, 01:45:02, 01:45:45, 01:46:28, 01:47:11, ~01:47:28

- Interrogate SRM to "make sure there is nothing as pertinent as far as rotation there" before issuing an SVS on the south storm's warning; identify circulation on the south storm with strong inbound but weak outbound velocity (Video 22:29-22:53)
- Examine shape of reflectivity (Video 22:54-23:04)
- Issue SVS; no changes are made to the warning (Video 23:05-24:08)
- Notice increase in reflectivity magnitude in the north storm (0.5°) (Video 24:09-24:27)
- Interrogate base velocity and detect mesos in both storms; velocity in the south storm stronger aloft (50 kts @ 15 kft) (Video 24:28-25:41)
- Interrogate top height and vertical reflectivity structure of both storms (Video 25:52-27:58)
- Interrogate base velocity of the north storm: "We've got that persistent meso. I've got 34 kts inbound now on this, 25 outbound" (Video 28:01-28:10)
- Interrogate reflectivity for a notch signature (Video 28:11-29:06)
- Interrogate base velocity: "usually at 35 kts you need to start considering tor." (Video 29:07-29:17)
- Compare location of velocity couplet with location of reflectivity notch (Video 29:23-29:44)
- Interrogate updated SRM and find "41 inbound, 64 out…I don't think I can ignore that." (Video 29:45-30:19)
- Open WarnGen (Video 30:26)
- Issue TOR warning (Video 32:38)

Page 16

43-s Team Warnings (EF1 tornado, N. Storm)

[Timelines of 0.5° LLSD Azimuthal Shear (s-1) vs. Time (UTC), 01:13:29-01:56:13, for Comparisons A, B, and C, with each team's SVR and TOR warnings on the south and north storms overlaid. Tornado-warning lead times shown: +18.6 min, +11.5 min, +6 min, and -3.2 min.]

Page 17

4.5-min Team Warnings (EF1 tornado, N. Storm)

[Timelines of 0.5° LLSD Azimuthal Shear (s-1) vs. Time (UTC), 01:13:29-01:56:13, for Comparisons A, B, and C, with each team's TOR warnings on the north storm overlaid. Tornado-warning lead times shown: +4.6 min, -0.7 min, and -1.6 min.]
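The lead times annotated on these timelines follow the usual convention: warning lead time is tornado start time minus warning issuance time, positive when the warning comes first. Below is a minimal Python sketch of that arithmetic; the helper name and the clock times are hypothetical, chosen only to reproduce two of the magnitudes shown, not the case's actual issuance or touchdown times.

```python
from datetime import datetime

def lead_time_minutes(warning_issued: str, tornado_start: str) -> float:
    """Warning lead time in minutes: positive if the warning preceded the tornado."""
    fmt = "%H:%M:%S"
    issued = datetime.strptime(warning_issued, fmt)
    start = datetime.strptime(tornado_start, fmt)
    return (start - issued).total_seconds() / 60.0

# Hypothetical UTC times within the 01:13:29-01:56:13 case window:
print(round(lead_time_minutes("01:25:31", "01:44:07"), 1))  # 18.6 (warning issued first)
print(round(lead_time_minutes("01:47:19", "01:44:07"), 1))  # -3.2 (warning issued after touchdown)
```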

Page 18

What we've learned

6 teams interrogated similar radar signatures
Came to different conclusions about whether and when to warn

Decision (Confounding) Factors

(Hahn et al. 2003; Hoffman et al. 2006; Pliske et al. 1997)

Page 19

Forecasters’ Conceptual Model During Case

Factor                          43-s Team   4.5-min Team
Weaker Couplet Strength         66%         83%
Trend in Circulation Strength   100%        100%
Update Time Detrimental         0%          100%
Environment                     66%         66%
Reflectivity Notch              100%        100%

Page 20

[Confidence continuum results: individual forecasters from Comparisons A, B, and C marked along a Less Confident / Usual Confidence / More Confident scale for two questions: Understanding of Supercell in Tropical Environment and Understanding of NWRT PAR Data]

Page 21

What we’ve learned

Warning Lead Times (0-20 min scale)

*43-s Teams: 18.6 min, 11.5 min, 6 min
4.5-min Teams: 4.6 min

*Issued 50% more warnings: 3 hits, 1 miss, 2 false alarms

• Data analysis was time intensive; exploring other methods• Warning decision process is complex• Some decision factors were similar across groups, others were not• Update time likely had a positive impact on warning lead time