Development and Calibration of Ensemble Based Hazardous Weather Products at the Storm Prediction Center
1
Development and Calibration of Ensemble Based Hazardous Weather Products at the Storm Prediction Center
David Bright
Gregg Grosshans, Jack Kain, Jason Levit, Russ Schneider, Dave Stensrud, Matt Wandishin, Steve Weiss
October 11, 2005
NCEP Predictability Discussion Group
Where America's Climate and Weather Services Begin
2
STORM PREDICTION CENTER
MISSION STATEMENT
The Storm Prediction Center (SPC) exists
solely to protect life and property of the American people
through the issuance of timely, accurate watch and forecast products
dealing with tornadoes, wildfires and other hazardous mesoscale weather
phenomena.
3
• Hail, Wind, Tornadoes
• Excessive rainfall
• Fire weather
• Winter weather
STORM PREDICTION CENTER HAZARDOUS PHENOMENA
4
• TORNADO & SEVERE THUNDERSTORM WATCHES
• WATCH STATUS MESSAGE
• CONVECTIVE OUTLOOK
  – Day 1; Day 2; Day 3; Days 4-8
• MESOSCALE DISCUSSION
  – Severe Thunderstorm Potential/Outlook Upgrade
  – Thunderstorms not expected to become severe
  – Hazardous Winter Weather
  – Heavy Rainfall
• FIRE WEATHER OUTLOOK
  – Day 1; Day 2; Days 3-8
• OPERATIONAL FORECASTS ARE BOTH DETERMINISTIC AND PROBABILISTIC
SPC Forecast Products
75% of all SPC products are valid for a period of < 24 h
5
Tornadoes
  Probability of 2 or more tornadoes: Low (10%)
  Probability of 1 or more strong (F2-F4) tornadoes: Low (<5%)
Wind
  Probability of 10 or more severe wind events: Mod (60%)
  Probability of 1 or more wind events > 65 knots: Low (10%)
Hail
  Probability of 10 or more severe hail events: Low (10%)
  Probability of 1 or more hailstones > 2 inches: Low (<5%)
Combined Severe Hail/Wind
  Probability of 6 or more combined severe wind/hail events: Mod (60%)
Severe Thunderstorm Watch 688 Probability Table
EXPERIMENTAL WATCH PROBABILITIES
6
CONVECTIVE OUTLOOKS
Operational through Day 3
7
Thunderstorm Outlooks:
24h General Thunderstorm
12h Enhanced Thunderstorm (Tonight)
12h Enhanced Thunderstorm (Today)
24h Period (> 10%)
12h Periods (> 10%; 40%; 70%)
8
• Operational emphasis on…
  – Observational data
  – Short-term, high-resolution NWP guidance
  – Specific information predicting hazardous mesoscale phenomena
• NWP needs range from the very short range to the medium range
  – Very short-range: Hourly RUC; 4.5 km WRF-NMM
  – Short-range: NAM, GFS, SREF
  – Medium-range: GFS, ECMWF, MREF
• Today’s focus: SREF
  – Overview of the ensemble product suite
  – Specific ensemble calibrated guidance
Product Guidance at the SPC
9
Overview of Ensemble Guidance
Objective: Provide a wide range of ensemble guidance covering all of
the SPC program areas
10
Sample of Ensemble Products Available…
http://www.spc.noaa.gov/exper/sref/
MEAN & SD: 500 mb HGHT
MEAN: PMSL, DZ, 10M WIND
MEAN: MUCAPE, 0-6 SHR, 0-3 HLCY
SPAGHETTI: SFC LOW
11
http://www.spc.noaa.gov/exper/sref/
PROB: DENDRITIC GROWTH
  (Omega < -3; -17 < T < -11; RH > 80%)
PROB: SIG TOR PARAM > 3
MEDIAN, UNION, INTERSECTION: SIG TOR PARAM
MAX OR MIN: MAX FOSBERG INDEX
STP = F(mlCAPE, mlLCL, SRH, Shear); Thompson et al. (2003)
Sample of Ensemble Products Available…
12
F63 SREF POSTAGE STAMP VIEW: PMSL, HURRICANE FRANCES
Red = EtaBMJ
Yellow = EtaKF
Blue = RSM
White = OpEta
SREF Member
13
Combined Probability
• Probability surface CAPE >= 1000 J/kg is relatively low
• Ensemble mean is < 1000 J/kg (no gold dashed line)
CAPE (J/kg)
Green solid = Percent of members >= 1000 J/kg; Shading >= 50%
Gold dashed = Ensemble mean (1000 J/kg)
F036: Valid 21 UTC 28 May 2003
14
• Probability deep-layer shear >= 30 kts
• Strong mid-level jet through Iowa
10 m – 6 km Shear (kts)
Green solid = Percent of members >= 30 kts; Shading >= 50%
Gold dashed = Ensemble mean (30 kts)
F036: Valid 21 UTC 28 May 2003
Combined Probability
15
• Convection is likely over WI/IL/IN
• Will the convection become severe?
3-Hour Convective Precipitation >= 0.01 in
Green solid = Percent of members >= 0.01 in; Shading >= 50%
Gold dashed = Ensemble mean (0.01 in)
F036: Valid 21 UTC 28 May 2003
Combined Probability
16
Prob CAPE >= 1000 x Prob Shear >= 30 kts x Prob Conv Pcpn >= .01”
F036: Valid 21 UTC 28 May 2003
Combined Probability
• A quick way to determine the juxtaposition of key parameters
• Fosters an ingredients-based approach
• Not a “true” probability
  – Dependence
  – Different members contribute
17
Severe Reports: Red = Tor; Blue = Wind; Green = Hail
Prob CAPE >= 1000 x Prob Shear >= 30 kts x Prob Conv Pcpn >= .01”
F036: Valid 21 UTC 28 May 2003
Combined Probability
• A quick way to determine the juxtaposition of key parameters
• Fosters an ingredients-based approach
• Not a “true” probability
  – Dependence
  – Different members contribute
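The ingredient-overlap product shown on the two slides above can be sketched in a few lines. This is a minimal illustration, assuming each field is simply a list of per-member SREF values at one grid point; all member values below are made up:

```python
# Sketch of the SPC "combined probability" product: at each grid point,
# multiply the fraction of ensemble members exceeding each ingredient
# threshold. All member values below are hypothetical.

def exceedance_prob(members, threshold):
    """Fraction of ensemble members at or above the threshold."""
    return sum(1 for v in members if v >= threshold) / len(members)

def combined_probability(cape, shear, conv_pcpn):
    """P(CAPE >= 1000) x P(shear >= 30 kt) x P(conv pcpn >= 0.01 in).

    Treating the three exceedance probabilities as independent makes
    this an ingredients-overlap measure, not a true joint probability:
    the ingredients are physically dependent, and different members may
    satisfy different ingredients.
    """
    return (exceedance_prob(cape, 1000.0)
            * exceedance_prob(shear, 30.0)
            * exceedance_prob(conv_pcpn, 0.01))

# Hypothetical 10-member values at a single grid point:
cape  = [1200, 950, 1500, 800, 1100, 1300, 700, 1050, 900, 1250]  # J/kg
shear = [35, 28, 40, 32, 31, 25, 38, 33, 29, 36]                  # kts
pcpn  = [0.02, 0.0, 0.05, 0.01, 0.03, 0.0, 0.04, 0.02, 0.0, 0.01] # in

print(combined_probability(cape, shear, pcpn))  # 0.6 * 0.7 * 0.7 = 0.294
```

Multiplying per-ingredient probabilities is deliberately simple; the slides' caveat (dependence, different contributing members) is why the product is read as a juxtaposition signal rather than a calibrated probability.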
18
F15 SREF PROBABILITY: TPCP x RH x WIND x TMPF
(< .01” x < 10% x > 30 mph x > 60 F)
Ingredients for extreme fire weather conditions over the Great Basin
Combined Probability
19
Objective: Develop calibrated probabilistic guidance for CG lightning
Calibrated Thunderstorm Guidance
20
Combine Lightning Ingredients into a Single Parameter
• Three first-order ingredients (readily available from NWP models):
  – Lifting condensation level temperature > -10°C
  – Sufficient CAPE in the 0°C to -20°C layer
  – Equilibrium level temperature < -20°C
• Cloud Physics Thunder Parameter (CPTP):
  CPTP = (-19°C – Tel)(CAPE-20 – K) / K
  where K = 100 J kg-1 and CAPE-20 is the MUCAPE in the 0°C to -20°C layer
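As a sketch, the CPTP expression above can be coded directly. The function name and the sample sounding values are illustrative, not from the SPC implementation:

```python
def cptp(t_el_c, cape_m20):
    """Cloud Physics Thunder Parameter (per the slide's formula).

    t_el_c   : equilibrium-level temperature (deg C)
    cape_m20 : MUCAPE in the 0 to -20 deg C layer (J/kg)

    CPTP > 0 requires both an EL colder than -19 C and more than
    K = 100 J/kg of CAPE in the mixed-phase (0 to -20 C) layer.
    """
    K = 100.0  # J/kg
    return (-19.0 - t_el_c) * (cape_m20 - K) / K

# A hypothetical sounding: EL at -45 C, 400 J/kg in the 0 to -20 C layer.
print(cptp(-45.0, 400.0))  # (-19 + 45) * (400 - 100) / 100 = 78.0
```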
21
Example CPTP: One Member
18h Eta Forecast Valid 03 UTC 4 June 2003
Plan view chart showing where grid point soundings support lightning (given a convective updraft)
22
SREF Probability CPTP > 1
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncalibrated probability: Solid/Filled; Mean CPTP = 1 (Thick dashed)
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
23
SREF Probability Precip > .01”
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncalibrated probability: Solid/Filled; Mean precip = 0.01” (Thick dashed)
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
24
Joint Probability (Assuming Independence)
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncalibrated probability: Solid/Filled
P(CPTP > 1) x P(Precip > .01”)
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
25
Uncalibrated Reliability (5 Aug to 5 Nov 2004)
P(CPTP > 1) x P(P03I > .01”)
[Reliability diagram: forecast frequency binned 0%, 5%, …, 100%, with perfect-forecast, no-skill, and climatology reference lines]
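For context, a reliability curve like the one summarized above is built by binning forecast probabilities and tallying how often the event actually verified in each bin. A minimal sketch, with made-up forecast/observation pairs:

```python
# Sketch of reliability-curve construction: bin forecast probabilities
# (nearest 5% here, matching the 0%, 5%, ..., 100% axis) and compare
# each bin's mean forecast with the observed event frequency.
# The forecast/observation pairs below are illustrative only.

def reliability_curve(forecasts, observed, n_bins=20):
    """Return a list of (mean forecast, observed frequency, count) per bin."""
    bins = [[] for _ in range(n_bins + 1)]
    for f, o in zip(forecasts, observed):
        idx = min(int(f * n_bins + 0.5), n_bins)  # nearest 5% bin
        bins[idx].append((f, o))
    curve = []
    for pairs in bins:
        if pairs:
            mean_f = sum(f for f, _ in pairs) / len(pairs)
            obs_freq = sum(o for _, o in pairs) / len(pairs)
            curve.append((mean_f, obs_freq, len(pairs)))
    return curve

forecasts = [0.05, 0.05, 0.10, 0.50, 0.50, 0.90, 0.95]
observed  = [0,    0,    1,    0,    1,    1,    1]
for mean_f, obs_freq, n in reliability_curve(forecasts, observed):
    print(f"forecast {mean_f:.2f} -> observed {obs_freq:.2f} (n={n})")
```

A perfectly reliable forecast puts every point on the diagonal (forecast probability equals observed frequency), which is exactly the deviation the uncalibrated joint product exhibits.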
26
Adjusting Probabilities
• Calibrate ensemble thunderstorm guidance based on the observed frequency of occurrence
27
Ensemble Thunder Calibration
1) Bin separately P(CPTP > 1) and P(P03M > 0.01”) into 11 bins (0-5%; 5-15%; …; 85-95%; 95-100%)
2) Combine the two binned probabilistic forecasts into one of 121 possible combinations: (0%,0%); (0%,10%); … (100%,100%)
3) Use NLDN CG data over the previous 366 days to calculate the frequency of occurrence of CG strikes for each of the 121 binned combinations
   • Construct for each grid point using 1/r weighting
4) Bin ensemble forecasts as described in steps 1 and 2 and assign the observed CG frequency (step 3) as the calibrated probability of a CG strike
5) Calibration is performed for each forecast cycle (09 and 21 UTC) and each forecast hour; the domain is the entire U.S. on a 40 km grid (CG strike within ~12 miles)
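Steps 1-4 amount to an 11 x 11 lookup table of observed frequencies. A minimal sketch, ignoring the 1/r spatial weighting and the per-cycle/per-hour stratification, with hypothetical training data:

```python
# Sketch of the frequency-of-occurrence calibration: accumulate binned
# forecast pairs and CG yes/no outcomes into an 11x11 table, then assign
# new forecasts the observed frequency of their bin combination.
# Training data below are hypothetical.

def bin_index(p):
    """Map a probability to one of 11 bins: 0-5%, 5-15%, ..., 95-100%."""
    return min(int(p * 10 + 0.5), 10)

def build_table(training):
    """training: iterable of (p_cptp, p_pcpn, cg_observed 0/1) triples."""
    hits = [[0] * 11 for _ in range(11)]
    total = [[0] * 11 for _ in range(11)]
    for p1, p2, obs in training:
        i, j = bin_index(p1), bin_index(p2)
        hits[i][j] += obs
        total[i][j] += 1
    return hits, total

def calibrated_prob(hits, total, p1, p2):
    """Observed CG frequency for this bin combination (None if unseen)."""
    i, j = bin_index(p1), bin_index(p2)
    return hits[i][j] / total[i][j] if total[i][j] else None

# Hypothetical training triples for one grid point:
training = [(0.6, 0.4, 1), (0.6, 0.4, 0), (0.6, 0.4, 1), (0.1, 0.1, 0)]
hits, total = build_table(training)
print(calibrated_prob(hits, total, 0.62, 0.38))  # 2 of 3 cases verified
```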
28
Before Calibration
29
Joint Probability (Assumed Independence)
P(CPTP > 1) x P(Precip > .01”)
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncorrected probability: Solid/Filled
30
After Calibration
31
Calibrated Ensemble Thunder Probability
15h Forecast Ending: 00 UTC 01 Sept 2004
Calibrated probability: Solid/Filled
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
32
Calibrated Ensemble Thunder Probability
15h Forecast Ending: 00 UTC 01 Sept 2004
Calibrated probability: Solid/Filled; NLDN CG Strikes (Yellow +)
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
33
Calibrated Reliability (5 Aug to 5 Nov 2004)
Calibrated Thunder Probability
[Reliability diagram: forecast frequency binned 0%, 5%, …, 100%, with perfect-forecast, no-skill, and climatology reference lines]
34
3h probability of > 1 CG lightning strike within ~12 mi
09Z and 21Z SREF valid at F003 through F063, May 15 – Sept 15 2005
Reliability; Economic Potential Value
35
12h probability of > 1 CG lightning strike within ~12 mi
09Z SREF valid at F012 through F063, May 15 – Sept 15 2005
Reliability; Economic Potential Value
36
Objective: Develop calibrated probabilistic guidance of the occurrence of severe convective weather
(Available for 3h, 12h, and 24h periods; calibration not described today)
Calibrated Severe Thunderstorm Guidance
37
24h probability of > 1 severe thunderstorm within ~25 mi
SREF: 2005051109; Valid 12 UTC May 11, 2005 to 12 UTC May 12, 2005
SVR WX ACTIVITY: 12Z 11 May to 12Z 12 May, 2005
a = Hail; w = Wind; t = Tornado
38
Hail > .75”; Wind > 50 kts; Tornado
24h probability of > 1 severe thunderstorm within ~25 mi
21Z SREF valid at F039 through F039 (i.e., Day 1 Outlook), May 15 – Sept 15 2005
Reliability; Economic Potential Value
39
Objective: Develop calibrated probabilistic guidance of snow accumulation on road surfaces
Experimental Calibrated Snow Accumulation Guidance
40
• Use the frequency-of-occurrence technique, similar to the calibrated probability of CG lightning
• Produce 8 calibrated joint probability tables
• Take the power mean (RMS average) of all 8 tables for the 3h probability of snow accumulating on roads in the grid cell
• Calibration period is Oct. 1, 2004 through Apr. 30, 2005
• MADIS “road-state” sensor information is truth (SREF is interpolated to MADIS road sensors)
Ensemble Snow Calibration
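The power-mean (RMS) combination of the eight calibrated tables can be sketched as follows; the eight per-layer probabilities here are illustrative:

```python
# Sketch of the RMS (power mean with exponent 2) combination of the
# 8 calibrated joint-probability tables: at each grid cell the final
# 3h road-snow probability is the root-mean-square of the 8 lookups.
# The eight values below are hypothetical.

def rms(values):
    """Root-mean-square: a power mean with exponent 2."""
    return (sum(v * v for v in values) / len(values)) ** 0.5

layer_probs = [0.30, 0.25, 0.10, 0.35, 0.28, 0.12, 0.40, 0.33]
print(round(rms(layer_probs), 3))  # 0.284
```

Because the RMS is never smaller than the arithmetic mean, this choice weights the layers that produce higher probabilities more heavily (here the plain mean of the same eight values is about 0.27).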
41
• SREF probability predictors
  (1) Two precipitation-type algorithms
    • Baldwin algorithm in NCEP post (Pr[Sn, ZR, IP])
    • Czys algorithm applied in SPC SREF post-processing (Pr[Sn, ZR, IP])
  (2) Two parameters sensitive to lower tropospheric and ground temperature
    • Snowmelt parameterization (RSAE): evaluates fluxes to determine if 3” of snow melts over a 3h period; if yes, the parameter is assigned 273.15 – TG (Pr[>1; >2; >4])
    • Simple algorithm (RSAP): F(Tpbl, TG, Qsfc net rad. flux), where values > 1 indicate the surface is cold enough for snow to accumulate (Pr[>1])
Goal: Examine the parameter space around the lower PBL temperature, ground temperature, and precip type, and calibrate using road sensor data.
42
Frequency Calibration Tables

LAYER  SREF INGREDIENT 1  SREF INGREDIENT 2
1      Prob(RSAE > 1)     Prob(Baldwin Snow, ZR, or IP)
2      Prob(RSAE > 2)     Prob(Baldwin Snow, ZR, or IP)
3      Prob(RSAE > 4)     Prob(Baldwin Snow, ZR, or IP)
4      Prob(RSAE > 1)     Prob(Czys Snow, ZR, or IP)
5      Prob(RSAE > 2)     Prob(Czys Snow, ZR, or IP)
6      Prob(RSAE > 4)     Prob(Czys Snow, ZR, or IP)
7      Prob(RSAP > 1)     Prob(Baldwin Snow, ZR, or IP)
8      Prob(RSAP > 1)     Prob(Czys Snow, ZR, or IP)
43
SREF 32F Isotherm (2 meter air temp)
  Mean (dash)
  Union (at least one SREF member at or below 32 F: dots)
  Intersection (all members at or below 32 F: solid)
3h probability of freezing or frozen pcpn (Baldwin algorithm; uncalibrated)
Example: New England Blizzard (F42: 23 January 2005 03Z)
SREF 32F Isotherm (Ground Temp)
  Mean (dash)
  Union (at least one SREF member at or below 32 F: dots)
  Intersection (all members at or below 32 F: solid)
3h calibrated probability of snow accumulating on roads
44
SREF 32F Isotherm (2 meter air temp)
Mean (dash)
Union (dots)
Intersection (solid)
3h probability of freezing or frozen pcpn (Baldwin algorithm; uncalibrated)
Example: Washington, DC Area (F21: 28 February 2005 18Z)
SREF 32F Isotherm (Ground Temp)
Mean (dash)
Union (dots)
Intersection (solid)
3h calibrated probability of snow accumulating on roads
45
6h Prob Snow Accum on Roads: Oct 15, 2005 (F006 v15 UTC)
3h Prob Snow Accum on Roads: Oct 15, 2005 (F006 v15 UTC)
46
Blind Test
• Calibration period: Oct 1, 2004 through April 30, 2005
• 5 days randomly selected from each month in the sample => 35 days in the test
• Test days withheld from the monthly calibration tables (i.e., cross validation used)
• The SREF forecasts were reprocessed for the 35 days and verified against the MADIS surface-state observations (F03 – F63)
47
Verification
Reliability Diagram: All 3 h forecasts (F00 – F63); 35 days (Oct 1 – Apr 30)
Reliability; Economic Potential Value
48
Test Results
• 3 h forecast results (F00 – F63)
  – Forecasts are reliable
  – Brier score is a 21% improvement over sample climatology
  – ROC area = .919
  – Ave probability where new snow detected: 23%
  – Ave probability where new snow not detected: 4%
  – Economic value for a wide range of users, peaking over 0.7
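The "21% improvement over sample climatology" is a Brier skill score: skill measured against a constant forecast of the sample event frequency. A minimal sketch with illustrative forecast/observation pairs:

```python
# Sketch of the Brier score and Brier skill score: the reference
# forecast is the sample climatology (mean observed frequency).
# Forecast/observation pairs below are illustrative only.

def brier_score(forecasts, observed):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, observed)) / len(forecasts)

def brier_skill_score(forecasts, observed):
    """1 - BS/BS_climatology; positive values beat climatology."""
    clim = sum(observed) / len(observed)  # sample climatology
    bs = brier_score(forecasts, observed)
    bs_ref = brier_score([clim] * len(observed), observed)
    return 1.0 - bs / bs_ref

forecasts = [0.9, 0.8, 0.1, 0.2, 0.7, 0.05]
observed  = [1,   1,   0,   0,   1,   0]
print(round(brier_skill_score(forecasts, observed), 2))  # 0.87
```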
49
Road-Snow: Summary
• Method appears reliable, although 3h probabilities rarely exceed 50%
• Highlights the importance of ground temp predictions from SREF and deterministic models
• Possible improvements:
  – Bias correction to 2m and ground temps from SREF*
  – Statistical post-processing of 2m and ground temps* prior to road-state calibration
  – Addition of an asphalt tile to the LSM of SREF members
* See the next slide for temp correction information
50
Raw 2m Temp: underdispersive SREF 2m temp forecast (F15) with a cold bias
Bias-adjusted 2m Temp: F15 cold bias in 2m temp removed, but the forecast remains underdispersive
Recalibrated 2m Temp: uniform VOR after statistical adjustment to the SREF
Bias adjustment and recalibration, with the addition of an asphalt-type ground temp tile in the LSM, might be very useful for snow accumulation from SREF
F15 SREF 2m Temp; Verf Period: ~August, 2005