
WRF Outline

• Overview and Status

• WRF Q&A

www.wrf-model.org

John Michalakes
Mesoscale and Microscale Meteorology Division
National Center for Atmospheric Research
[email protected]


Weather Research and Forecast (WRF) Model

• Large, collaborative effort; pool resources and talents

• Develop an advanced community mesoscale and data-assimilation system:
  – Focus on 1-10 km; accurate, efficient, and scalable over a broad range of scales
  – Advanced physics, data assimilation, and nesting
  – Flexible, modular, performance-portable, single-source code

• Principal Partners:
  – NCAR Mesoscale and Microscale Meteorology Division
  – NOAA National Centers for Environmental Prediction
  – NOAA Forecast Systems Laboratory
  – OU Center for Analysis and Prediction of Storms
  – U.S. Air Force Weather Agency
  – U.S. DoD HPCMO
  – NRL Marine Meteorology Division
  – Federal Aviation Administration

• Additional Collaborators:
  – NOAA Geophysical Fluid Dynamics Laboratory
  – NASA GSFC Atmospheric Sciences Division
  – NOAA National Severe Storms Laboratory
  – EPA Atmospheric Modeling Division
  – University community

• WRF Model Applications:
  – Limited-area NWP
  – MCS triggering, severe-weather research
  – Cloud modeling, large-eddy simulation
  – Synoptic-scale research (baroclinic waves, fronts)
  – Topographic flows
  – Chemistry and air-quality research and prediction
  – Regional climate . . .



WRF Software Architecture

• Driver: I/O, communication, multi-nests, state data

• Model layer: computational routines, tile-callable and thread-safe

• Mediation layer: interface between model and driver; also dereferences driver-layer objects into the simple data structures the model layer accepts (see the sketch below)

• Interfaces to external packages

[Architecture diagram: the driver layer (driver, config module, config inquiry, I/O API, DM comm, threads) sits above the solve/mediation layer, which sits above the model layer of WRF tile-callable subroutines. A package-independent / package-dependent boundary separates WRF from external packages for message passing, data formats, and parallel I/O; OpenMP supplies threading.]
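A minimal sketch of the mediation layer's dereferencing role, assuming a hypothetical derived type named domain that carries a state array and the three sets of dimensions; the names below are illustrative, not the actual WRF driver-layer types or routines.

MODULE sketch_domain
   IMPLICIT NONE
   TYPE domain                                  ! hypothetical driver-layer object
      REAL, ALLOCATABLE :: u(:,:,:)             ! a state array
      INTEGER :: ids, ide, jds, jde, kds, kde   ! domain dims
      INTEGER :: ims, ime, jms, jme, kms, kme   ! memory dims
      INTEGER :: its, ite, jts, jte, kts, kte   ! tile dims (one tile here)
   END TYPE domain
END MODULE sketch_domain

! The mediation-layer solve routine unpacks the driver-layer object and
! passes simple arrays and scalars down; no derived types cross the
! model-layer interface.
SUBROUTINE solve ( grid )
   USE sketch_domain
   IMPLICIT NONE
   TYPE(domain), INTENT(INOUT) :: grid
   CALL some_model_routine ( grid%u,                                 &
                             grid%ids, grid%ide, grid%jds, grid%jde, &
                             grid%kds, grid%kde,                     &
                             grid%ims, grid%ime, grid%jms, grid%jme, &
                             grid%kms, grid%kme,                     &
                             grid%its, grid%ite, grid%jts, grid%jte, &
                             grid%kts, grid%kte )
END SUBROUTINE solve

Here some_model_routine would follow the model-layer template shown on the next slide.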


WRF Model Layer Interface

• Interface between driver and model layers
  – No COMMON blocks; state arrays pass through argument lists
    • Only simple data types are allowed to pass through the interface
    • No non-contiguity issues
  – Domain, memory, and run dimensions are passed unambiguously (see the caller sketch after the template below)
  – Spatial scope of a call: one "tile"
  – Temporal scope of a call: limited by coherency

• Benefits
  – Ensures that a model-layer subroutine is tile-callable and therefore portable to any decomposition supported by the driver
  – Provides a common interface that:
    • Allows use of other conforming driver layers (this positions the code for new developments such as ESMF)
    • Lets the WRF driver layer be used, as it already is, to parallelize other models

Template for a model-layer subroutine:

SUBROUTINE model ( arg1, arg2, arg3, ..., argn,   &
                   ids, ide, jds, jde, kds, kde,  & ! Domain dims
                   ims, ime, jms, jme, kms, kme,  & ! Memory dims
                   its, ite, jts, jte, kts, kte )   ! Tile dims
   IMPLICIT NONE

   ! Define arguments (S and I1 data)
   REAL, DIMENSION (kms:kme, ims:ime, jms:jme) :: arg1, . . .
   REAL, DIMENSION (ims:ime, jms:jme)          :: arg7, . . .
   . . .
   ! Define local data (I2)
   REAL, DIMENSION (kts:kte, its:ite, jts:jte) :: loc1, . . .
   . . .
   ! Executable code; loops run over tile dimensions
   DO j = jts, jte
      DO i = its, ite
         DO k = kts, kte
            IF ( i > ids .AND. i < ide ) THEN
               loc1(k,i,j) = arg1(k,i,j) + ...
            END IF
         END DO
      END DO
   END DO
END SUBROUTINE model
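A self-contained, hedged illustration (not actual WRF code) of how the three index triples relate at the call site: memory dimensions size the allocation (including halo), domain dimensions describe the full undecomposed grid, and tile dimensions bound the loops of a single call. The template is reduced here to one state array.

PROGRAM call_model_sketch
   IMPLICIT NONE
   ! Illustrative sizes: a 10 x 10 x 5 domain with a one-point halo.
   INTEGER, PARAMETER :: ids=1, ide=10, jds=1, jde=10, kds=1, kde=5
   INTEGER, PARAMETER :: ims=ids-1, ime=ide+1, jms=jds-1, jme=jde+1
   INTEGER, PARAMETER :: kms=kds, kme=kde
   REAL :: arg1(kms:kme, ims:ime, jms:jme)   ! allocated over memory dims

   arg1 = 1.0
   ! One tile spanning the whole domain; a threaded or distributed-memory
   ! driver would pass smaller tile (and memory) bounds per call.
   CALL model ( arg1,                         &
                ids, ide, jds, jde, kds, kde, &
                ims, ime, jms, jme, kms, kme, &
                ids, ide, jds, jde, kds, kde )
CONTAINS
   SUBROUTINE model ( arg1, ids, ide, jds, jde, kds, kde, &
                      ims, ime, jms, jme, kms, kme,       &
                      its, ite, jts, jte, kts, kte )
      INTEGER, INTENT(IN) :: ids, ide, jds, jde, kds, kde
      INTEGER, INTENT(IN) :: ims, ime, jms, jme, kms, kme
      INTEGER, INTENT(IN) :: its, ite, jts, jte, kts, kte
      REAL, DIMENSION(kms:kme, ims:ime, jms:jme) :: arg1   ! memory dims
      REAL, DIMENSION(kts:kte, its:ite, jts:jte) :: loc1   ! tile-sized local
      INTEGER :: i, j, k
      DO j = jts, jte
         DO i = its, ite
            DO k = kts, kte
               IF ( i > ids .AND. i < ide ) THEN   ! skip domain lateral boundaries
                  loc1(k,i,j) = arg1(k,i,j) + 1.0
               END IF
            END DO
         END DO
      END DO
   END SUBROUTINE model
END PROGRAM call_model_sketch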



WRF Irons in Fire

• New Dynamical Cores
  – NCEP: NMM core (June 2003)
  – NCEP: semi-implicit semi-Lagrangian core (?)
  – NRL: COAMPS integration (?)
  – China Meteorological Administration: GRAPES integration (June 2004)

• Data Assimilation
  – WRF 3DVAR (late 2003)
  – WRF 4DVAR (2005-6)

• Development Initiatives
  – WRF Developmental Testbed Center (summer 2003 and ongoing)
  – Hurricane WRF (2006)
  – NOAA air-quality initiative (WRF-Chem) (2003-04)
  – NCSA: WRF/ROMS coupling using MCT (MEAD) (2003 and ongoing)
  – DoD: WRF/HFSOLE coupling using MCEL (PET) (2003)

• WRF Software Development
  – WRF nesting and research release (late 2003)
  – Vector performance: Earth Simulator, Cray X-1 (?)
  – NASA ESMF integration (2004): start with time manager, proof-of-concept dry dynamics
  – NCSA: MEAD and HDF5 I/O (?)


WRF Q&A

1. What system requirements do you have? (e.g., Unix/Linux, CORBA, MPI, Windows, ...)

UNIX/Linux, MPI, OpenMP (optional). Happiest with more than 0.5 GB of memory per distributed-memory process.

2. Can components spawn other components? What sorts of relationships are allowed (directory-structure model, parent-child process model, flat model, peer-to-peer model, client-server, etc.)?

WRF can spawn nested domains within this component; no other spawning. Applications are basically peer-to-peer, though the underlying coupling infrastructure may be implemented as client-server or other models.

3. What programming-language restrictions do you have currently?

Using Fortran90 and C, but no restrictions per se.

4. Are you designing to be programming-language neutral?

Yes. We enjoin against passing derived data types through the model-layer interface for this reason. (This has been violated by 3DVAR as implemented in the WRF ASF, however.)

5. What sort of component abstraction do you present to an end user? An atmosphere component, an ocean component, an analysis component, a generic component, etc.?

Generic. The person putting the components together is required to know what each needs from the others. Models see coupling as I/O and read and write coupling data through the WRF I/O/coupling API, in the style sketched below.
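The fragment below illustrates the coupling-as-I/O idea only: coupling data moves through an open/write/close sequence in the style of the WRF I/O API. The ext_cpl_* names and argument lists are hypothetical, invented for this sketch; they are not the actual API.

! Hypothetical sketch only: a model-side write of a coupling field
! through an I/O-style interface.  Names and signatures are invented.
INTEGER :: dh, status                   ! data handle, error status
REAL    :: sst(100,100)                 ! a coupling field
CALL ext_cpl_open_for_write ( 'ocean_coupler', dh, status )
CALL ext_cpl_write_field    ( dh, 'SST', sst, status )
CALL ext_cpl_ioclose        ( dh, status )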


WRF Q&A

9. Does data always have to be copied between components, or are there facilities for sharing references to common structures across component boundaries? When, how, and why is this needed?

All data is exchanged through the WRF I/O/coupling API; how this is implemented is up to the API. However, the API does not presently have semantics for specifying structures that are common across component boundaries.

13. What levels and "styles" of parallelism/concurrency or serialization are permitted or excluded? For example, can components be internally parallel, can multiple components run concurrently, can components run in parallel?

WRF runs distributed-memory, shared-memory, or hybrid. The WRF I/O/coupling API is assumed to be non-thread-safe. There is no restriction on concurrency or parallelism with other components.

14. Do components always have certain specific functions/methods?

WRF always produces a time integration of the atmosphere; however, there are several dynamics options and numerous physics options. Note: the time loop is in the driver layer, and it is straightforward to run over framework-specified intervals (nested domains do this under the control of their parent domains), as sketched below.
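A minimal sketch of integration over a framework-specified interval, reusing the hypothetical domain type from the earlier mediation-layer sketch; the names here are illustrative, not the actual WRF driver entry points.

! Hypothetical driver fragment: a framework (or, for a nest, the parent
! domain) asks the component to integrate a fixed number of steps.
SUBROUTINE integrate_n_steps ( grid, n_steps )
   USE sketch_domain                  ! hypothetical module from earlier sketch
   IMPLICIT NONE
   TYPE(domain), INTENT(INOUT) :: grid
   INTEGER, INTENT(IN) :: n_steps
   INTEGER :: step
   DO step = 1, n_steps
      CALL solve ( grid )             ! one model time step (mediation layer)
   END DO
END SUBROUTINE integrate_n_steps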


WRF Q&A

15. What, if any, virtualization of process, thread, and physical CPU do you use?

The unit of work is a tile; model-layer subroutines are designed to be "tile-callable". Distributed memory: the mapping of distributed-memory processes to OS-level processes and physical processors is up to the underlying communication layer. Shared memory: up to the underlying thread package (we currently use OpenMP), as sketched below.
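A hedged sketch of shared-memory parallelism over tiles; num_tiles, i_start, i_end, j_start, and j_end are illustrative names for the tile bookkeeping a driver layer might hold.

! Each tile is an independent unit of work, so tiles can be handed to
! OpenMP threads; this is why model-layer routines must be thread-safe.
!$OMP PARALLEL DO PRIVATE ( ij )
DO ij = 1, num_tiles
   CALL some_model_routine ( u,                            &
                             ids, ide, jds, jde, kds, kde, &
                             ims, ime, jms, jme, kms, kme, &
                             i_start(ij), i_end(ij),       &
                             j_start(ij), j_end(ij),       &
                             kds, kde )
END DO
!$OMP END PARALLEL DO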

16. Can components come and go arbitrarily throughout execution?

WRF nested domains (which are part of this component) can be dynamically instantiated and uninstantiated. WRF as a component has relatively high start-up overhead, so it would not be ideal as a transient component.

17. Can compute resources be acquired/released throughout execution?

Definitely not at the application layer at this time; the decomposition is basically static, and the number of distributed-memory processes cannot change over the course of a run. We intend to allow migration of work between processes for load balancing. Whole processes might someday migrate, but WRF would have to include support for migration of state (which it does not currently have). Using a smaller or larger number of shared-memory threads would be up to the thread package.


WRF Q&A

18. Does your system have an initialization phase?

Yes, a big one. Initial I/O is relatively costly.

19. Is the high-level control syntax the same in different circumstances? For example, serial component to serial component, versus parallel M component to parallel N component?

Not strictly applicable, since we are talking about a single component, WRF. However, WRF can easily be subroutine-ized, as sketched below.
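A hedged sketch of WRF driven as a set of calls; this follows the shape of the WRF top-level program in later releases, but the exact routine names may differ by version.

PROGRAM wrf_as_component
   IMPLICIT NONE
   CALL wrf_init       ! read configuration, allocate and initialize domains
   CALL wrf_run        ! integrate over the configured simulation period
   CALL wrf_finalize   ! final output, shut down I/O and communications
END PROGRAM wrf_as_component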

20. What standards must components adhere to (languages, standard functions/APIs, standard semantics, etc.)?

Standards in WRF apply internally, between layers in the software hierarchy, and in the APIs to external packages. The APIs are WRF-specific, allowing flexibility over a range of implementations. We plan to merge the WRF I/O/coupling API with the ESMF API specification, provided it gives similar functionality and interfaces with WRF in the same way.

23. What is your approach to saving and restoring system state?

Write and read restart data sets at user-specified intervals, as in the namelist fragment below.
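A hedged example of configuring restart behavior through the WRF namelist; the variable names follow later WRF releases and are shown here for illustration.

&time_control
   restart          = .false.,   ! set .true. when starting from a restart file
   restart_interval = 360,      ! write a restart data set every 360 minutes
/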

26. Who are your target component authors?

Physics developers, dynamical-core developers, and the WRF development team.


WRF Benchmarks

November 2002 rankings, "TOP500 Supercomputing Sites" (LINPACK benchmark); Rmax and Rpeak in Gflop/s. WRF is running on, or being ported to, these systems.

Rank  Installation Site                                Computer / Processors                                          Rmax      Rpeak
  33  NCAR (National Center for Atmospheric Research)  IBM SP Power3 375 MHz 16-way / 1260                         1272.00    1890.00
  11  Naval Oceanographic Office (NAVOCEANO)           IBM pSeries 690 Turbo 1.3 GHz / 1184                        3160.00    6156.00
  10  NCAR (National Center for Atmospheric Research)  IBM pSeries 690 Turbo 1.3 GHz / 1216                        3164.00    6323.00
   9  HPCx                                             IBM pSeries 690 Turbo 1.3 GHz / 1280                        3241.00    6656.00
   8  Forecast Systems Laboratory, NOAA                HPTi Aspen Systems, dual Xeon 2.2 GHz, Myrinet2000 / 1536   3337.00    6758.00
   7  Commissariat a l'Energie Atomique (CEA)          Hewlett-Packard AlphaServer SC ES45 1 GHz / 2560            3980.00    5120.00
   6  Pittsburgh Supercomputing Center                 Hewlett-Packard AlphaServer SC ES45 1 GHz / 3016            4463.00    6032.00
   5  Lawrence Livermore National Laboratory           Linux NetworX MCR Cluster, Xeon 2.4 GHz, Quadrics / 2304    5694.00   11060.00
   4  Lawrence Livermore National Laboratory           IBM ASCI White, SP Power3 375 MHz / 8192                    7226.00   12288.00
   3  Los Alamos National Laboratory                   Hewlett-Packard ASCI Q, AlphaServer SC ES45 1.25 GHz / 4096 7727.00   10240.00
   2  Los Alamos National Laboratory                   Hewlett-Packard ASCI Q, AlphaServer SC ES45 1.25 GHz / 4096 7727.00   10240.00
   1  Earth Simulator Center                           NEC Earth-Simulator / 5120                                 35860.00   40960.00


[Figure: WRF performance. WRF EM core, 425x300x35, DX = 12 km, DT = 72 s. Mflop/s (0 to 170,000) versus number of processors (0 to 1100) for TCS (2 rails), IBM Regatta, IBM Winterhawk II, and iJet.]


[Figure: WRF performance. WRF EM core, 425x300x35, DX = 12 km, DT = 72 s. 32-bit Mflop/s per processor (0 to 450) versus number of processors (0 to 1100) for TCS (EV68), IBM Regatta, IBM Winterhawk II, and iJet (Xeon).]