Transcript of: Global Wind Forecasts from Improved Wind Analysis via the FSU-Superensemble, suggesting possible impacts from Wind-LIDAR
Adam J. O’Shay and T. N. Krishnamurti, FSU Meteorology, 8 February 2001

Page 1:

Global Wind Forecasts from Improved Wind Analysis via the FSU-Superensemble

suggesting possible impacts from Wind-LIDAR

Adam J. O’Shay and T. N. Krishnamurti

FSU Meteorology

8 February 2001

Page 2:

Agenda

Impetus and motivation
Superensemble (SE) methodology
Data selection and data-assimilation issues
Results of forecasts
Summary

Page 3:

Motivation

Space-borne LIDAR could provide global modelers with wind datasets useful for analysis in models

Provided with an additional wind analysis, the SE could greatly improve its global wind forecasts

from http://www-lite.larc.nasa.gov/

Page 4:

Motivation - Cont’d

To show the benefit of improved wind analyses, the FSU SE was run using various multi-models’ observed data as training data, with the ECMWF analysis as ground truth.

Typically the FSU SE uses the FSU (ECMWF-based) analyses as the training dataset, since it has been shown to be a superior initial state among global NWP models. Instances where ‘superior initial states’ do not yield improved forecasts do occur, although they are rare.

Page 5:

[Schematic (along a TIME axis): multi-model forecasts (ensemble mean, breeding modes, singular-vector-based perturbations) and the control forecast are combined with the observed state (‘analysis’ fields); the resulting statistics produce the superensemble forecasts.]

Page 6:

FSU Superensemble Methodology

The time length is divided into two periods:

Training period: multi-model variables are regressed against the observed data for each model. Multiple linear regression provides weights for the individual models over the training period.

Forecast period (test phase): the weights are applied to the forecasts, yielding the SE forecast for each day/time period (Krishnamurti et al. 2000)
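The training/forecast split described above can be sketched with a minimal numpy example. This is an illustrative toy with synthetic data, not the operational FSU code; `train_weights` and `superensemble_forecast` are hypothetical helper names.

```python
import numpy as np

def train_weights(forecast_anoms, obs_anoms):
    """Multiple-linear-regression (least-squares) weights, one per model.
    forecast_anoms: (n_times, n_models) model anomalies over the training period.
    obs_anoms: (n_times,) observed ('analysis') anomalies."""
    w, *_ = np.linalg.lstsq(forecast_anoms, obs_anoms, rcond=None)
    return w

def superensemble_forecast(obs_mean, forecast_anoms, weights):
    """SE forecast = observed mean plus the weighted sum of model anomalies."""
    return obs_mean + forecast_anoms @ weights

# Synthetic 'training period': two models, one skilful but biased, one poor.
rng = np.random.default_rng(0)
truth = rng.normal(size=200)
models = np.column_stack([
    truth + 0.5 + 0.1 * rng.normal(size=200),   # biased high, small error
    -0.2 * truth + rng.normal(size=200),        # largely noise
])

obs_mean = truth.mean()
F = models - models.mean(axis=0)              # model anomalies
w = train_weights(F, truth - obs_mean)        # training phase
se = superensemble_forecast(obs_mean, F, w)   # test phase (here, in-sample)
```

The regression downweights the noisy model automatically, so the SE error comes out well below that of a plain ensemble mean on the same data.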

Page 7:

Superensemble Methodology - Cont’d

This is a multiple-linear-regression-based ‘least-squares minimization’ procedure, in which the algorithm is based on collective bias removal

Individual model bias removal assigns a weight of 1.0 to every (bias-removed) model, which forces the inclusion of poor models; collective bias removal yields statistically better forecasts
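The contrast between the two strategies can be made concrete with a small synthetic example. This is illustrative only; the model data are invented and this is not the FSU implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = rng.normal(size=500)
good = truth + 1.0 + 0.1 * rng.normal(size=500)   # skilful model with a +1 bias
poor = rng.normal(size=500)                        # model with no skill
models = np.column_stack([good, poor])

# Individual bias removal: subtract each model's mean error, then weight
# every bias-removed model equally (weight 1.0 each, i.e. a plain average).
debiased = models - (models - truth[:, None]).mean(axis=0)
individual = debiased.mean(axis=1)

# Collective bias removal: multiple-linear-regression weights, as in the SE,
# which downweight the poor model instead of including it at full weight.
F = models - models.mean(axis=0)
w, *_ = np.linalg.lstsq(F, truth - truth.mean(), rcond=None)
collective = truth.mean() + F @ w

def rmse(f):
    return np.sqrt(np.mean((f - truth) ** 2))
```

With the no-skill model forced in at full weight, the individually debiased average carries roughly half of that model's error, while the regression drives its weight toward zero.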

Page 8:

Superensemble Methodology - Cont’d

An SE forecast was computed at each forecast lead (48 hr and 72 hr) over the global domain for each of the different training runs

Page 9:

Data Selection

Time period: 1 January 2000 - 9 April 2000; 80-day training period with 20 days of forecasts

U- and V-winds at 200 & 850 mb were examined; results at 200 mb are presented here in the interest of time

Several global models, in addition to the ECMWF analysis, were chosen based on continuity of the data record (interpolated to 1º x 1º where necessary)

Error calculations were performed over the global domain
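One common convention for a global-domain error statistic on a latitude-longitude grid is an RMS weighted by cos(latitude), so the densely packed polar gridpoints do not dominate. The sketch below assumes that convention; the slide does not state which weighting was actually used, and `global_rms` is a hypothetical helper.

```python
import numpy as np

def global_rms(forecast, analysis, lats_deg):
    """Area-weighted RMS error over a regular lat-lon grid.
    forecast, analysis: (n_lat, n_lon) fields (e.g. V-wind at 200 mb, m/s).
    lats_deg: (n_lat,) gridpoint latitudes in degrees."""
    w = np.cos(np.deg2rad(lats_deg))[:, None] * np.ones_like(forecast)
    err2 = (forecast - analysis) ** 2
    return float(np.sqrt(np.sum(w * err2) / np.sum(w)))
```

For a uniform 2 m/s error on a 1º x 1º grid the weighted RMS is exactly 2 m/s, which is a quick sanity check on the weighting.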

Page 10:

Data Assimilation point

Not all global NWP models use the same data-assimilation scheme. The multi-models examined here use the following data-assimilation schemes:

Perhaps the variation in each model’s data assimilation accounts largely for the variation in the forecasts that the SE produces when a model other than ECMWF supplies the training analysis

* This is NOT to say it is the only reason!

Page 11:

Data Assimilation - Cont’d

Rabier et al. (2000) and Swanson et al. (2000) found that 4-D VAR shows overall improvement in global modeling, particularly when coupled with a high-resolution model (e.g. ECMWF, T200+)

Given the above, models using data-assimilation schemes OTHER than 4-D VAR were used as ‘training’ for the FSU SE

Page 12:

RMS ERRORS 48HR

[Three charts: RMS error (m/s) of the V-200 mb 48-hr forecasts plotted against day (1-97); numerical values not recoverable from the transcript.]

Page 13:

RMS ERRORS 72HR

[Three charts: RMS error (m/s) of the V-200 mb 72-hr forecasts plotted against day (1-97); numerical values not recoverable from the transcript.]

Page 14:

R(OBS,SE) = 0.70

Page 15:

R(OBS,SE) = 0.78

Page 16:

R(Obs,ETSE) ~ 0.5

Page 17:

R(Obs,ETSE) ~ 0.85
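The R values quoted on these slides (R(OBS,SE), R(Obs,ETSE)) are correlation coefficients between the observed analysis and the superensemble forecast. A plain Pearson correlation over a field can be sketched as follows; the slides do not specify whether any area weighting was applied, and `pearson_r` is a hypothetical helper.

```python
import numpy as np

def pearson_r(obs, fcst):
    """Pearson correlation between two fields, flattened to 1-D."""
    o = np.ravel(obs) - np.mean(obs)
    f = np.ravel(fcst) - np.mean(fcst)
    return float(np.sum(o * f) / np.sqrt(np.sum(o ** 2) * np.sum(f ** 2)))
```

A value of 1.0 means the forecast pattern matches the analysis perfectly up to a linear rescaling, so R measures pattern agreement rather than absolute error.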

Page 18:

What is going on here?

Each of the multi-models uses a different data-assimilation scheme; training the SE with each and obtaining the final SE product yields forecasts that differ from those obtained when the ECMWF-based analysis is used as the training data

Page 19:

Courtesy of: http://sgi62.wwb.noaa.gov:8080/STATS/STATS.html

RMS Errors of Various Models

Page 20:

Summary

The SE shows a degradation in skill when the training dataset originates from a model with a poorer initial state (i.e., multi-models with various data-assimilation schemes)

The heart of the SE is the training dataset; the introduction of LIDAR winds into the data-assimilation scheme at FSU may lead to skill in global wind forecasts far exceeding that of the present day