Earth System Modeling Framework Overview
Transcript of Earth System Modeling Framework Overview
NSF NCAR / NASA GSFC / DOE LANL ANL / NOAA NCEP GFDL / MIT / U MICH
Chris Hill, MIT [email protected]
TOMS, 2003, Boulder
Earth System Modeling Framework Overview
NSIPP Seasonal Forecast
NCAR/LANL CCSM
NCEP Forecast
GFDL FMS Suite
MITgcm
NASA GSFC PSAS
Talk Outline
• Project Overview
• Architecture and Current Status
  – Superstructure layer: Design, Adoption
  – Infrastructure layer: What is it for, What it contains
• Next steps…
• Open Discussion
Technological Trends
In climate research and NWP... an increased emphasis on detailed representation of individual physical processes, which requires many teams of specialists to contribute components to an overall modeling system.
In computing technology... increasing hardware and software complexity in high-performance computing, as we shift toward scalable computing architectures.
Time-mean air-sea CO2 flux. MITgcm constrained by observations + ocean carbon (MIT, Scripps, JPL).
Community Response
• Modernization of modeling software
  – Abstraction of the underlying hardware to provide a uniform programming model that runs efficiently across vector, single-microprocessor, and multiple-microprocessor architectures.
  – A distributed software development model characterized by many contributing authors; use of high-level language features for abstraction, to facilitate the development process and software sharing.
  – Modular design for interchangeable dynamical cores and physical parameterizations; development of community-wide standards for components.
• Development of prototype infrastructures: GFDL (FMS), NASA/GSFC (GEMS), NCAR/NCEP (WRF), NCAR/DOE (MCT), MIT (Wrapper), ROMS/TOMS, etc.
ESMF aims to unify and extend these efforts.
ESMF Goals and Products
STATED GOAL: To increase software reuse, interoperability, ease of use, and performance portability in climate, weather, and data assimilation applications; this implies unified "standards".
PRODUCTS:
• Coupling superstructure and utility infrastructure software
• Synthetic code suite for validation and demonstration
• Set of 15 ESMF-compliant applications (including CCSM, WRF, and GFDL models; MITgcm; NCEP and NASA data assimilation systems)
• Set of 8 interoperability experiments
Interoperability
ESMF Interoperability Demonstrations
COUPLED CONFIGURATION: NEW SCIENCE ENABLED
• GFDL B-grid atm / MITgcm ocn: Global biogeochemistry (CO2, O2), seasonal-to-interannual (SI) timescales.
• GFDL MOM4 / NCEP forecast: NCEP seasonal forecasting system.
• NSIPP ocean / LANL CICE: Sea ice model for extension of SI system to centennial time scales.
• NSIPP atm / DAO analysis: Assimilated initial state for SI.
• DAO analysis / NCEP model: Intercomparison of systems for the NASA/NOAA joint center for satellite data assimilation.
• DAO CAM-fv / NCEP analysis: Intercomparison of systems for the NASA/NOAA joint center for satellite data assimilation.
• NCAR CAM Eul / MITgcm ocn: Improved climate predictive capability: climate sensitivity to large component interchange, optimized initial conditions.
• NCEP WRF / GFDL MOM4: Development of hurricane prediction capability.
Talk Outline
• Project Overview
• Architecture and Current Status
  – Component based approach
  – Superstructure layer: Design, Adoption
  – Infrastructure layer: What is it for, What it contains
• Next steps…
• Open Discussion
ESMF Overall Structure
• ESMF uses a component-based approach.
• The framework provides an upper "superstructure" layer and a lower "infrastructure" layer.
• User-written code (simulation algorithms, DA algorithms, ...) is sandwiched between the two layers:
  – User code provides standard interfaces that are called from the superstructure layer.
  – User code uses facilities in the infrastructure layer for parallelism, I/O, and interpolation.
ESMF Programming Model
1. ESMF provides an environment for assembling components.
2. ESMF provides a toolkit that components use to (i) ensure interoperability and (ii) abstract common services.
3. Gridded components, coupler components, and application components are user written.
SUPERSTRUCTURE LAYER: Application Component; Gridded Components; Coupler Components. Component: run(), checkpoint().
INFRASTRUCTURE LAYER: Field: halo(), import(), export() + I/O. Grid: regrid(), transpose() + Metrics. Layout, PEList, Machine Model.
Superstructure Layer: Assembles and Connects Components
Since each ESMF application is also a component, entire ESMF applications may be treated as Gridded Components and nested within larger applications.
Example: an atmospheric application (atm_comp, containing atm_phys, atm_dyn, and a phys2dyn_coupler) coupled to ocn_comp through ocn2atm_coupler, all nested within a larger climate application (climate_comp).
Superstructure Layer: Controlling Subcomponents
Components must provide a single externally visible entry point, which registers the other entry points with the framework. Components can:
• Register one or more Initialization, Run, Finalize, and Checkpoint entry points.
• Register a private data block that can contain all data associated with this instantiation of the Component; this is particularly useful when running ensembles.
Example: a higher-level component calls the public cmp_register(), which registers the private cmp_init(), cmp_run(), and cmp_final() with the ESMF framework services.
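The registration pattern above can be sketched in a few lines of Python. This is a hypothetical illustration, not the ESMF API: the `Framework`, `ocn_register`, and the lambda entry points are invented names; the point is that one public routine registers the other phases, and a private data block keeps ensemble members independent.

```python
# Hypothetical sketch (not the ESMF API): a component exposes one public
# registration routine that tells the framework about its other entry points,
# plus a private data block so several instances (e.g. ensemble members) can
# coexist without shared state.

class Framework:
    def __init__(self):
        self._entry_points = {}   # (component_name, phase) -> callable
        self._private_data = {}   # component_name -> per-instance state

    def register(self, name, phase, fn):
        self._entry_points[(name, phase)] = fn

    def set_private(self, name, data):
        self._private_data[name] = data

    def call(self, name, phase):
        return self._entry_points[(name, phase)](self._private_data.get(name))

def ocn_register(fw, name):
    """The single externally visible entry point: registers the rest."""
    fw.register(name, "init", lambda st: st.update(step=0) or "initialized")
    fw.register(name, "run", lambda st: st.update(step=st["step"] + 1) or st["step"])
    fw.register(name, "finalize", lambda st: "finalized")

fw = Framework()
for member in ("ocn_a", "ocn_b"):     # two ensemble members, private state each
    fw.set_private(member, {})
    ocn_register(fw, member)
    fw.call(member, "init")

fw.call("ocn_a", "run"); fw.call("ocn_a", "run")
print(fw.call("ocn_a", "run"), fw.call("ocn_b", "run"))   # 3 1
```

Because each instance's state lives in its own private block, running `ocn_a` three steps ahead of `ocn_b` requires no changes to the component code.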
Superstructure Layer: Passing Data Between Components I
Gridded Components do not have access to the internals of other Gridded Components. They have two options for exchanging data with other Components; the first is to receive Import and Export States as arguments:

  ocn_component:  subroutine ocn_run(comp, ImportState, ExportState, Clock, rc)
  atm_component:  subroutine atm_run(comp, ImportState, ExportState, Clock, rc)

States contain flags for "is required", "is valid", "is ready", etc.
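A minimal sketch of this exchange, with invented names (`State`, `mark_ready`, `take`; none of this is the ESMF API): components only see the states handed to them, and the "is ready" flag makes a premature read fail loudly instead of silently using stale data.

```python
# Hypothetical sketch (not the ESMF API): components exchange data only through
# Import/Export State objects passed as arguments; each field in a state
# carries a readiness flag checked at access time.

class State(dict):
    def mark_ready(self, name, data):
        self[name] = {"data": data, "ready": True}

    def take(self, name):
        entry = self.get(name)
        if entry is None or not entry["ready"]:
            raise RuntimeError(f"field {name!r} not ready in import state")
        return entry["data"]

def ocn_run(import_state, export_state):
    sst = [20.0, 21.5, 19.8]             # pretend model output
    export_state.mark_ready("sst", sst)

def atm_run(import_state, export_state):
    sst = import_state.take("sst")       # fails loudly if the coupler ran early
    return max(sst)

ocn_export, atm_import = State(), State()
ocn_run(State(), ocn_export)
atm_import.mark_ready("sst", ocn_export.take("sst"))   # coupler copies the field
print(atm_run(atm_import, State()))   # 21.5
```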
Superstructure Layer: Passing Data Between Components II
Gridded Components using Transforms do not have to return control to a higher-level component to send or receive State data from another component. They can receive function pointers, which are methods that can be called on the States:

  coupler:        call ESMF_CompRun(ocn, xform)
                  call ESMF_CompRun(atm, xform)
  ocn_component:  call ESMF_StateXform(ex_state, xform)
  atm_component:  call ESMF_StateXform(xform, im_state)
Superstructure Layer: Parallel Communication
All inter-component communication within ESMF is local. This means:
• Coupler Components must be defined on the union of the PEs of all the components that they couple.
• In this example (climate_comp containing ocn_comp, atm_comp, and atm2ocn_coupler), in order to send data from the ocean component to the atmosphere, the Coupler mediates the send.
Superstructure Layer: Summary
• Provides the means to connect components together; components can be connected in a hierarchy.
• Provides a general-purpose mechanism for passing data between components; data is self-describing.
• Provides a general-purpose mechanism for "parent" components to control "child" components: stepping forward, stepping backward, initializing state, etc.
Infrastructure Layer
1. A standard software platform for enabling interoperability (developing couplers, ensuring performance portability).
2. A set of reusable software for Earth science applications; streamlined development for researchers.
Example: an NCAR atmosphere, GFDL ocean, and NSIPP land can be combined, and a new component ("My Sea Ice") slotted in alongside them.
Infrastructure Layer Scope
Support for:
• Physical Grids
• Regridding
• Decomposition/composition
• Communication
• Calendar and Time
• I/O
• Logging and Profiling
The ESMF Infrastructure sits below the User Code, which in turn sits below the ESMF Superstructure.
Field and Grid

  grid = ESMF_GridCreate(..., layout, ...)
  field_u = ESMF_FieldCreate(grid, array)

1. Create a field distributed over a set of decomposition elements (DEs).
2. The domain decomposition is determined by the DELayout, layout.
3. Each object (grid and field_u) has an internal representation (an ESMF_Field holds metadata, an ESMF_Array, and an ESMF_Grid).
4. Other parts of the infrastructure layer use the internal representation, e.g.:
• Regrid() – interpolation/extrapolation + redistribution over DEs
• Redistribution() – general data rearrangement over DEs
• Halo() – specialized redistribution
Regrid
• A function mapping a field's array to a different physical and distributed grid; schematically, {A, P, D} → {A′, P′, D′} (array, physical grid, distributed grid).
• RegridCreate() creates a Regrid structure to be used and re-used:

  regrid = ESMF_RegridCreate(src_field, dst_field, method, [name], [rc])

• The source and destination fields can be empty of field data (RegridCreate() uses grid metrics):
  – PhysGrid and DistGrid info are used for setting up the regrid.
  – The resulting regrid can be used for other fields sharing the same Grid.
• Method specifies the interpolation algorithm, for example bilinear, b-spline, etc.
Regrid Interface (cont.)
• RegridRun() performs the actual regridding (src_field is interpolated onto dst_field):

  call ESMF_RegridRun(src_field, dst_field, regrid, [rc])

  Communication and interpolation are handled transparently.
• RegridDestroy() frees up memory:

  call ESMF_RegridDestroy(regrid, [rc])
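The create/run/destroy pattern above can be sketched with a toy 1-D linear interpolator. This is a hypothetical illustration, not the ESMF implementation; `regrid_create` and `regrid_run` are invented stand-ins showing why the expensive weight computation is separated from the cheap, repeatable application step.

```python
# Hypothetical sketch of the create/run pattern: the expensive part (computing
# interpolation weights from the two grids) happens once, and the resulting
# object is applied cheaply to any field on the same source grid.

def regrid_create(src_x, dst_x):
    """Precompute 1-D linear-interpolation weights (a stand-in for RegridCreate)."""
    weights = []
    for x in dst_x:
        i = max(0, min(len(src_x) - 2, sum(1 for s in src_x if s <= x) - 1))
        w = (x - src_x[i]) / (src_x[i + 1] - src_x[i])
        weights.append((i, w))
    return weights

def regrid_run(field, weights):
    """Apply precomputed weights to a field (a stand-in for RegridRun)."""
    return [field[i] * (1 - w) + field[i + 1] * w for i, w in weights]

src_x = [0.0, 1.0, 2.0, 3.0]
dst_x = [0.5, 1.5, 2.5]
rg = regrid_create(src_x, dst_x)               # build once...
u = regrid_run([0.0, 10.0, 20.0, 30.0], rg)    # ...reuse for u, v, T, ...
v = regrid_run([5.0, 5.0, 5.0, 5.0], rg)
print(u, v)   # [5.0, 15.0, 25.0] [5.0, 5.0, 5.0]
```

Reusing `rg` for every field that shares the source grid mirrors the slide's point that the resulting regrid "can be used for other fields sharing the same Grid".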
Redistribution
• No interpolation or extrapolation.
• The maps between the field's array and the physical grid are the same; only the distributed grid changes, i.e. {A, P, D} → {A, P, D′}.
• Example: layout() created two distributed grids, one decomposed in X and one decomposed in Y.
  – The Redistribution() function maps array data between the two distributions (a transpose/corner turn from src_field to dst_field).
  – Communication is handled transparently.
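The transpose/corner-turn idea can be sketched as follows. This is an illustrative serial analogue, not ESMF code: `split_rows` and `redistribute` are invented helpers, and a real implementation would move only the off-DE blocks over the network rather than reassembling the global array.

```python
# Hypothetical sketch: a "corner turn" moving a 2-D array from a row-decomposed
# layout to a column-decomposed one, with no interpolation; the global data and
# physical grid are unchanged, only the distribution moves.

def split_rows(a, n):
    """Decompose in X: each DE owns a contiguous block of rows."""
    k = len(a) // n
    return [a[i * k:(i + 1) * k] for i in range(n)]

def redistribute(row_blocks, n):
    """Reassemble and re-split by columns (each DE now owns a block of columns)."""
    full = [row for block in row_blocks for row in block]
    k = len(full[0]) // n
    return [[row[i * k:(i + 1) * k] for row in full] for i in range(n)]

a = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
by_rows = split_rows(a, 2)            # DE0: rows 0-1, DE1: rows 2-3
by_cols = redistribute(by_rows, 2)    # DE0: cols 0-1, DE1: cols 2-3
print(by_cols[0])   # [[1, 2], [5, 6], [9, 10], [13, 14]]
```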
Halo
• Fields have distributed index spaces comprising exclusive {E}, compute {C}, and local {L} regions, where {E} ⊆ {C} ⊆ {L}.
• Halo() fills points not in {E} or {C} from a remote {E}, e.g. between neighbouring DEs (DE3 and DE4 in the figure):

  call ESMF_FieldHalo(field_foo, status)
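The halo fill can be sketched on a 1-D decomposition. This is a hypothetical serial analogue (no message passing): `halo_update` is an invented helper, and each DE's local array is modeled as its owned points plus one ghost cell on each side.

```python
# Hypothetical sketch of a halo update on a 1-D decomposition: each DE's local
# array is its owned (exclusive) points plus one ghost point on each side, and
# the update fills the ghosts from the neighbouring DE's owned edge points.

def halo_update(locals_):
    """Fill each DE's ghost cells from its neighbours' interior edges."""
    n = len(locals_)
    for de in range(n):
        if de > 0:
            locals_[de][0] = locals_[de - 1][-2]    # left ghost <- left DE's last owned
        if de < n - 1:
            locals_[de][-1] = locals_[de + 1][1]    # right ghost <- right DE's first owned
    return locals_

# Two DEs; index 0 and -1 of each local array are ghost cells (initially None).
de_arrays = [[None, 1.0, 2.0, None], [None, 3.0, 4.0, None]]
halo_update(de_arrays)
print(de_arrays)   # [[None, 1.0, 2.0, 3.0], [2.0, 3.0, 4.0, None]]
```

The outermost ghosts stay `None` because this sketch has no periodic boundary; a real halo routine would apply the field's boundary condition there.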
More functions are described in the reference manual.
Bundle and Location Stream
• ESMF_Bundle: a collection of fields on the same grid.
• ESMF_LocationStream: like a field, but with an unstructured index space and an associated physical grid space; useful for observations, e.g. radiosondes, floats.
• Functions for create(), regrid(), redistribute(), halo(), etc.
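The idea of regridding an unstructured stream of observations onto a structured grid can be sketched with nearest-neighbour averaging. This is an invented illustration (`regrid_obs` is not an ESMF routine), using (latitude, value) pairs as a stand-in for radiosonde or float records.

```python
# Hypothetical sketch: a "location stream" as an unstructured list of
# (lat, value) observations, regridded to a regular 1-D latitude grid by
# nearest-neighbour averaging -- the flavour of operation a LocationStream's
# regrid() would provide for scattered observations.

def regrid_obs(obs, grid_lats):
    sums = [0.0] * len(grid_lats)
    counts = [0] * len(grid_lats)
    for lat, value in obs:
        i = min(range(len(grid_lats)), key=lambda j: abs(grid_lats[j] - lat))
        sums[i] += value
        counts[i] += 1
    # grid points with no nearby observations stay unfilled (None)
    return [s / c if c else None for s, c in zip(sums, counts)]

obs = [(-41.0, 10.0), (-39.0, 12.0), (2.0, 25.0)]   # (lat, value) pairs
grid = [-40.0, 0.0, 40.0]
print(regrid_obs(obs, grid))   # [11.0, 25.0, None]
```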
ESMF Infrastructure Utilities
• Clock, Alarm, Calendar: ensure consistent time between components.
• I/O: provides field-level I/O in standard forms (netCDF, binary, HDF, GRIB, BUFR).
• Logging and Profiling: consistent monitoring and messaging.
• Attribute: consistent parameter handling.
• Machine model and comms: hardware and system software hiding; platform customizable.
Time
• A standard type for any component.
• Calendar (support for a range of calendars):

  call ESMF_CalendarInit(gregorianCalendar, ESMF_CAL_GREGORIAN, rc)

  ! initialize stop time to 13 May 2003, 2:00 pm
  call ESMF_TimeInit(inject_stop_time, &
       YR=int(2003,kind=ESMF_IKIND_I8), &
       MM=off_month, DD=off_day, H=off_hour, M=off_min, &
       S=int(0,kind=ESMF_IKIND_I8), &
       cal=gregorianCalendar, rc=rc)

  do while (currTime .le. inject_stop_time)
     :
     call ESMF_ClockAdvance(localclock, rc=rc)
     call ESMF_ClockGetCurrTime(localclock, currtime, rc)
  end do
Time Representations

  type(ESMF_Calendar) :: calendar1
  call ESMF_CalendarInit(calendar1, ESMF_CAL_GREGORIAN, rc)

- ESMF_CAL_GREGORIAN (3/1/-4800 to 10/29/292,277,019,914)
- ESMF_CAL_JULIAN (+/- 106,751,991,167,300 days)
- ESMF_CAL_NOLEAP
- ESMF_CAL_360DAY
- ESMF_CAL_USER
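Why the calendar choice matters can be shown with a tiny day-of-year computation. This is a hypothetical sketch, not ESMF code: `NOLEAP` and `DAY360` are invented month-length tables mimicking the no-leap and 360-day calendars above.

```python
# Hypothetical sketch: the same (month, day) date gives a different day-of-year
# depending on the calendar in use -- e.g. a model 360-day calendar (12 x 30)
# versus a no-leap calendar with real month lengths.

NOLEAP = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
DAY360 = [30] * 12

def day_of_year(month, day, month_lengths):
    """Day-of-year for a given calendar's month-length table."""
    return sum(month_lengths[:month - 1]) + day

# March 1st: day 60 in a no-leap year, day 61 in a 360-day model year.
print(day_of_year(3, 1, NOLEAP), day_of_year(3, 1, DAY360))   # 60 61
```

Keeping the calendar inside the time type, as ESMF does, prevents components that assume different calendars from silently drifting apart.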
I/O
• Field-level binary, netCDF, HDF, GRIB, BUFR; extensible.
• Currently I/O is piped through one PE:

  call ESMF_FieldAllGather(field_u, outarray, status)
  if (de_id .eq. 0) then
     write(filename, 20) "U_velocity", file_no
     call ESMF_ArrayWrite(outarray, filename=filename, rc=status)
  endif
  call ESMF_ArrayDestroy(outarray, status)
Internal Classes
• Machine model: captures system attributes (CPU, memory, connectivity graph); useful for defining decomposition, load balance, and performance predictions.
• Comms: communication driver; allows bindings to MPI, shared memory, and vendor system libraries.
Comms Performance Test
The right mix (green in the figure) on a Compaq system gives a factor-of-2 improvement in realized bandwidth (in the large-message limit).
Talk Outline
• Project Overview
• Architecture and Current Status
  – Superstructure layer: Design, Adoption
  – Infrastructure layer: What is it for, What it contains
• Next steps…
• Open Discussion
Timeline
May 2002 Draft Developer’s Guide and Requirements Document completed
Community Requirements Meeting and review held in D.C.
July 2002 ESMF VAlidation (EVA) suite assembled
August 2002 Architecture Document: major classes and their relationships
Implementation Report: language strategy and programming model
Software Build and Test Plan: sequencing and validation
May 12-15 2003 First API and software release, Community Meeting at GFDL
July 2003 First 3 interoperability experiments completed
April 2004 Second API and software release, Community Meeting
July 2004 All interoperability experiments completed
May 2003 Release
The focus for the May 2003 ESMF release was on developing sufficient infrastructure and superstructure to achieve the initial set of interoperability experiments. These are:
– FMS B-grid atmosphere coupled to MITgcm ocean
– CAM atmosphere coupled to NCEP analysis
– NSIPP atmosphere coupled to DAO analysis
Regrid Next Steps
• Support for all ESMF Grids
• Support for regridding methods including:
  – Bilinear
  – Bicubic
  – 1st-order Conservative
  – 2nd-order Conservative
  – Rasterized Conservative
  – Nearest-neighbor distance-weighted average
  – Spectral transforms
  – 1-d interpolations (splines)
  – Index-space (shifts, stencils)
  – Adjoints of many of the above
Distributed Grid Next Steps
• Regular 2-d decomposition already supported
• Next steps:
  – Generalized 1-d decomposition
  – Extended support for 2-d and quasi-regular decomposition
  – Spectral grid decompositions
  – Composite grids
Physical Grid Next Steps
• Larger set of metrics and grids
• High-level routines for rapid definition of common grids
I/O Next Steps
• Broaden the format set: binary, netCDF, HDF, GRIB, BUFR
• Improve parallelization
Time/Logging/Profiling Next Steps
• Full support for alarms
• Broader functionality
Summary
• ESMF Current Status
  – Comprehensive class structure available in version 1.0.
  – Over the coming year, significant extension of functionality will take place.
  – Feedback and comments on version 1.0 are welcome: http://www.esmf.ucar.edu
Talk Outline
• Project Overview
• Architecture and Current Status
  – Superstructure layer: Design, Adoption
  – Infrastructure layer: What is it for, What it contains
• Next steps…
• Open Discussion
Questions from Hernan
Last but not least, an interesting potential benefit of "component" based approaches:
Component-based approaches could provide a foundation for driving high-end applications from "desktop" productivity environments. For example, driving a parallel ensemble GCM run from Matlab becomes conceivable!
To learn more, visit us at MIT!
ESMF Infrastructure Utility detailed example
Earl Schwab, ESMF Core Team, NCAR
Time Manager
What is Time Manager?
• Clock for time simulation
• Time representation
• Time calculator
• Time comparisons
• Time queries
• F90 API, C++ implementation
Clock for time simulation
type(ESMF_Clock) :: clock
call ESMF_ClockInit(clock, timeStep, startTime, stopTime, rc=rc)
do while (.not.ESMF_ClockIsStopTime(clock, rc))
! Do application work
.
.
.
call ESMF_ClockAdvance(clock, rc=rc)
end do
Clock (cont.)
• Clock queries/commands:

  call ESMF_ClockGetCurrTime(clock, currTime, rc)
  call ESMF_ClockSetCurrTime(clock, currTime, rc)
  call ESMF_ClockGetTimeStep(clock, timeStep, rc)
  call ESMF_ClockSetTimeStep(clock, timeStep, rc)
  call ESMF_ClockGetAdvanceCount(clock, advanceCount, rc)
  call ESMF_ClockGetStartTime(clock, startTime, rc)
  call ESMF_ClockGetStopTime(clock, stopTime, rc)
  call ESMF_ClockGetPrevTime(clock, prevTime, rc)
  call ESMF_ClockSyncToWallClock(clock, rc)
Common Component Architecture
CCSM Coupled System,NASA GEMS, GFDL FMS,
MITgcm, etc…
Single Earth systemmodel or modeling system
Earth System Modeling Framework
Climate, Weather, Data Assimilation
All HPC Applications
ESMF within the broader computational hierarchy
Time Representation
type(ESMF_Calendar) :: calendar1
call ESMF_CalendarInit(calendar1, ESMF_CAL_GREGORIAN, rc)
- ESMF_CAL_GREGORIAN (3/1/-4800 to 10/29/292,277,019,914)
- ESMF_CAL_JULIAN (+/- 106,751,991,167,300 days)
- ESMF_CAL_NOLEAP
- ESMF_CAL_360DAY
- ESMF_CAL_USER
Attributes
• Flexible parameter specs and configuration, e.g. a file of parameters read by the code.
Time Representation (cont.)

  type(ESMF_Time) :: time1
  call ESMF_TimeInit(time1, YR=int(2003,kind=ESMF_IKIND_I8), &
       MM=5, DD=15, H=15, cal=calendar1, rc=rc)

- YR, MM, DD, H, M, S are F90 optional arguments

  type(ESMF_TimeInterval) :: timeInterval1
  call ESMF_TimeIntervalInit(timeInterval1, &
       D=int(90,kind=ESMF_IKIND_I8), rc=rc)

- D, H, M, S are F90 optional arguments
Time Calculator
• Time differencing
• Time increment/decrement by a time interval
• Time interval arithmetic (+, -, *, /)
Time Calculator (cont.)

  call ESMF_TimeInit(time1, YR=int(2003,kind=ESMF_IKIND_I8), &
       MM=5, DD=15, cal=gregorianCalendar, rc=rc)
  call ESMF_TimeInit(time2, YR=int(2003,kind=ESMF_IKIND_I8), &
       MM=3, DD=26, cal=gregorianCalendar, rc=rc)
  call ESMF_TimeIntervalInit(timeInterval1, &
       D=int(90,kind=ESMF_IKIND_I8), rc=rc)

  timeInterval2 = time2 - time1
  time1 = time1 + timeInterval1        ! uses F90 overloaded operators
  timeInterval3 = timeInterval1 * 2

  double precision :: ratio
  ratio = timeInterval1 / timeInterval2
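The overloaded-operator style above can be mimicked in a few lines of Python. This is a hypothetical sketch using an invented `TimeInterval` class, not the ESMF types: the point is that +, -, * and / on intervals behave like the F90 overloads, with interval / interval yielding a plain ratio.

```python
# Hypothetical sketch of interval arithmetic via operator overloading,
# mirroring the F90 overloaded operators shown above.

class TimeInterval:
    def __init__(self, seconds):
        self.seconds = seconds

    def __add__(self, other):
        return TimeInterval(self.seconds + other.seconds)

    def __sub__(self, other):
        return TimeInterval(self.seconds - other.seconds)

    def __mul__(self, k):
        return TimeInterval(self.seconds * k)

    def __truediv__(self, other):
        return self.seconds / other.seconds   # interval / interval -> ratio

day = TimeInterval(86400)
t1 = day * 90                 # a 90-day interval
t2 = day * 45
print((t1 - t2).seconds, t1 / t2)   # 3888000 2.0
```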
Time Comparisons
• >, <, >=, <=, ==, /= : F90 overloaded operators between any two times or time intervals

  if (time1 < time2) then
     ...
  end if

  if (timeInterval1 .ge. timeInterval2) then
     ...
  end if
Time Queries

  call ESMF_TimeGet(time1, YR=yr, MM=mm, DD=dd, H=h, M=m, S=s, rc=rc)
  call ESMF_TimeIntervalGet(timeInterval1, D=d, H=h, M=m, S=s)

  call ESMF_TimeGetDayOfYear(time1, dayOfYear, rc)   ! double or integer
  call ESMF_TimeGetDayOfMonth(time1, dayOfMonth, rc)
  call ESMF_TimeGetDayOfWeek(time1, dayOfWeek, rc)
  call ESMF_TimeGetMidMonth(time1, midMonth, rc)

  call ESMF_TimeGetString(time1, string, rc)                  ! 2003-05-14T12:20:19 (ISO 8601)
  call ESMF_TimeIntervalGetString(timeInterval1, string, rc)  ! P1DT12H0M0S (ISO 8601)

  call ESMF_TimeGetRealTime(time1, rc)
More Information
• ESMF User's Guide
• ESMF Reference Manual
• ESMF Requirements Document
• Time Manager F90 API & examples source code:
  esmf_1_0_0_r/src/Infrastructure/TimeMgr/interface
  esmf_1_0_0_r/src/Infrastructure/TimeMgr/examples
Location Within ESMF
1. ESMF provides an environment for assembling components (Application Component, Gridded Components, Coupler Components).
2. ESMF provides a toolkit that components use to (i) ensure interoperability and (ii) abstract common services.
The Time Manager utilities sit in the INFRASTRUCTURE LAYER, alongside Field (halo(), import(), export() + I/O), Grid (regrid(), transpose() + Metrics), and Layout, PEList, Machine Model.
ESMF Infrastructure Fields and Grids
• Field: contains array, grid, array-to-grid mapping, and metadata
• Array: holds actual data, e.g. temperature, wind speed, etc.
• Grid: contains physical and distributed grid
• Physical Grid: grid metrics, e.g. lat-lon coords, spacings, areas, volumes, etc.
• Distributed Grid: parallel decomposition information
Creating an ESMF Field
1. Start from a Fortran array:

  real, dimension(100,100) :: arr

2. Attach the array to an ESMF_Array:

  esArr = ESMF_ArrayCreate(arr, ...)

3. Attach the ESMF_Array to an ESMF_Field (which also holds metadata and an ESMF_Grid):

  ESMF_FieldAttachArray(esFld, esArr, ...)
Creating an ESMF Grid
An ESMF_Grid combines a PhysGrid and a DistGrid:

  call ESMF_PhysGridCreate...(...)
  call ESMF_DistGridCreate...(...)
Infrastructure Internal Organization
Two tiers: an F90 upper tier over a C++ lower tier, with a class hierarchy for data and communications:
• Bundle → Field → Array, with Field using Grid, which contains PhysGrid and DistGrid over a DELayout and Machine Model.
• Regrid and Comm (with Route) provide data communications.
• Time, Alarm, Calendar, LogErr, I/O, and Attributes are utility classes.