Creating & Running a Case
Cécile Hannay and Dani Coleman National Center for Atmospheric Research (NCAR)
Reference: This tutorial borrows material presented in: “CESM1.0.3 Tutorial” by CESM Software Engineering Group http://www.cesm.ucar.edu/models/cesm1.0/cesm/cesm1_tutorial.pdf
The Community Earth System Model (CESM)
• The Community Earth System Model (CESM) is a fully-coupled, global climate model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states.
• Take a tour of the CESM webpage: http://www.cesm.ucar.edu/models/cesm1.0/ and please register as a CESM user
Registration
• Go to the CESM1.0 home page: http://www.cesm.ucar.edu/models/cesm1.0/
• The right-hand column has a link to the registration page; click on it
• Register -- you will be emailed a username and password
• It is helpful for the CESM development team to know who is using the model, so please do this now
Overview of directories to run CESM
• Typically, the files necessary to run CESM live in two root directories:
(1) CESM code directory Contains all the code required to run the model
(2) Inputdata directory Contains all input data required to run the model
• Ideally these directories are shared by a group of users to save disk space. This is especially important for the inputdata directory.
• For this tutorial:
CESM code directory: /home/s07hsu00/cesm_collection/cesm1_0_3/
Inputdata directory: /work/s07hsu00/cesm_inputdata
Overview of directories to run CESM

[Diagram: CESM code tree -- $CCSMROOT contains models/ (with atm, lnd, ocn, ice, glc, csm_share, utils, and drv subdirectories) and scripts/]
(1) CESM code directory /home/s07hsu00/cesm_collection/cesm1_0_3
The CESM code directory contains two subdirectories:
models: contains the code for every model component (atmosphere, land, ocean, ice, …)
scripts: contains the scripts to run CESM
(2) Inputdata directory /work/s07hsu00/cesm_inputdata
The inputdata directory contains subdirectories for every model component (atm, lnd, ocn, ice, …)
Please go (cd) to these directories and explore their structure.
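As you explore, a tiny helper can list the component subdirectories under either root. This is a sketch of our own (the `list_components` name is not part of CESM), using only `ls`-style directory iteration:

```shell
# List the top-level subdirectories of a CESM tree.
# Minimal sketch: prints each directory name under the path given,
# e.g. the models/ tree or the inputdata root.
list_components() {
    for d in "$1"/*/; do
        basename "$d"
    done
}

# e.g. list_components /home/s07hsu00/cesm_collection/cesm1_0_3/models
# e.g. list_components /work/s07hsu00/cesm_inputdata
```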
[Diagram: INPUTDATA directory -- $DIN_LOC_ROOT_CSMDATA contains atm, lnd, ocn, ice, glc, and cpl subdirectories]
In the CESM scripts, DIN_LOC_ROOT_CSMDATA refers to the inputdata directory and CCSMROOT refers to the CESM code directory.
Basic Work Flow
• Setting up and running an “out of the box”* CESM experiment (case) requires four steps:
(1) Create a New Case
(2) Configure the Case
(3) Build the Executable
(4) Run the Model
• We are going to study each step in detail
*“out of the box” means without any customization
Step (1) in the Basic Work Flow
(1) Create a New Case (2) Configure the Case (3) Build the Executable
(4) Run the Model
(1) Create a new case
• The scripts directory $CCSMROOT/scripts contains key scripts (see slide 3: CESM directories).
The scripts are a combination of csh, perl, sh, and xml
• Go to the scripts directory: $CCSMROOT/scripts
create_newcase is the script that generates a new case
This is the first step in setting up a model
CESM1_0/scripts>ls -l
total 400
-rw-r--r--  1 userx ncar 18596 May 12 11:33 ChangeLog
-rw-r--r--  1 userx ncar   168 May 12 11:33 README
-rw-r--r--  1 userx ncar   103 May 12 11:33 SVN_EXTERNAL_DIRECTORIES
drwxr-xr-x 10 userx ncar  8192 May 12 11:33 ccsm_utils
-rwxr-xr-x  1 userx ncar 19039 May 12 11:33 create_clone
-rwxr-xr-x  1 userx ncar 52338 May 12 11:33 create_newcase
-rwxr-xr-x  1 userx ncar 18253 May 12 11:33 create_test
-rwxr-xr-x  1 userx ncar  9643 May 12 11:33 create_test_suite
drwxr-xr-x  3 userx ncar  8192 May 12 11:33 doc
-rwxr-xr-x  1 userx ncar  1255 May 12 11:33 link_dirtree
-rw-r--r--  1 userx ncar   295 May 12 11:33 sample_compset_file.xml
-rw-r--r--  1 userx ncar   851 May 12 11:33 sample_pes_file.xml
create_newcase
Note: this listing was made on another machine; it will not look exactly the same on alps.
(1) About create_newcase
• create_newcase has many command line options -- most are rarely used
• create_newcase -help lists all the available options
• Most often only four options are used: case, compset, res, and mach
CESM1_0/scripts>./create_newcase -help
SYNOPSIS
     create_newcase [options]
OPTIONS
     User supplied values are denoted in angle brackets (<>). Any value that contains
     white-space must be quoted. Long option names may be supplied with either single
     or double leading dashes. A consequence of this is that single letter options may
     NOT be bundled.

     -case <name>           Specifies the case name (required).
     -compset <name>        Specify a CESM compset (required).
     -res <name>            Specify a CCSM grid resolution (required).
     -mach <name>           Specify a CESM machine (required).
     -pecount <name>        Value of S,M,L,X1,X2 (optional). (default is M).
     -pes_file <name>       Full pathname of pes setup file to use (will overwrite default settings)
     -compset_file <name>   Full pathname of compset setup file to use. (optional)
     -help [or -h]          Print usage to STDOUT (optional).
     -list                  Only list valid values for compset, grid settings and machines (optional).
     -silent [or -s]        Turns on silent mode - only fatal messages issued (optional).
     -verbose [or -v]       Turn on verbose echoing of settings made by create_newcase (optional).
     -xmlmode <name>        Sets format of xml files; normal or expert (optional). (default is normal)

     The following arguments are required for a generic machine. Otherwise, they will be ignored.

     -scratchroot <name>            ccsm executable directory (EXEROOT will be scratchroot/CASE) (char)
     -din_loc_root_csmdata <name>   cesm input data root directory (char)
     -max_tasks_per_node <value>    maximum mpi tasks per machine node (integer)

     The following two arguments turn on single point mode.
     If one is given -- both MUST be given.

     -pts_lat <value>   Latitude of single point to operate on (optional)
     -pts_lon <value>   Longitude of single point to operate on (optional)
(1) create_newcase -- Argument overview

Four arguments are required:
./create_newcase -case ~/cesm_case/case01 -res T31_g37 -compset B_1850 -mach alps

• “case” is the name and location of the case being created
   • ~/cesm_case/case01
• “res” specifies the model resolution (or grid)
   • Format is [atm/lnd grid]_[ocn/ice grid]; e.g., T31_g37 is the T31 atm/lnd grid + the gx3v7 ocn/ice grid
   • Most often the atm & lnd share the same grid, and the ice & ocn share the same grid
   • Equivalent short and long names (T31_g37 == T31_gx3v7)
• “compset” specifies the “component set”
   • A component set specifies the component models, forcing scenarios, and physics options for those models
   • Equivalent short and long names (B1850CN == B_1850_CN)
• “mach” specifies the machine that will be used
   • For the tutorial, the machine is ‘alps’
• To find out the valid options:
   • create_newcase -list shows all the valid choices for these command line options (see next slide)
• Arguments given are “locked down” in the case directory
   • The file env_case.xml records all variables “locked down” when create_newcase was run
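The [atm/lnd grid]_[ocn/ice grid] naming convention can be taken apart with ordinary shell parameter expansion. A small sketch for two-part names only (the `split_res` helper is ours, not part of CESM):

```shell
# Split a CESM resolution name of the form <atm/lnd grid>_<ocn/ice grid>.
# Hypothetical helper for illustration; handles two-part names only
# (e.g. not ne30_f19_g16).
split_res() {
    atmlnd=${1%%_*}     # everything before the first underscore
    ocnice=${1#*_}      # everything after it
    echo "atm/lnd grid: $atmlnd, ocn/ice grid: $ocnice"
}

split_res T31_g37    # -> atm/lnd grid: T31, ocn/ice grid: g37
```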
(1) create_newcase arguments: res and mach
CESM1_0/scripts>./create_newcase -list

RESOLUTIONS: name (shortname)
   0.9x1.25_0.9x1.25 (f09_f09)
   0.9x1.25_gx1v6 (f09_g16)
   1.9x2.5_1.9x2.5 (f19_f19)
   1.9x2.5_gx1v6 (f19_g16)
   4x5_gx3v7 (f45_g37)
   T31_gx3v7 (T31_g37)
   ne30np4_1.9x2.5_gx1v6 (ne30_f19_g16)

COMPSETS: name (shortname): description (status)
   A_PRESENT_DAY (A)
     Description: All data model
   B_2000 (B)
     Description: All active components, present day
   B_1850 (B1850)
     Description: All active components, pre-industrial
   B_1850_CN (B1850CN)
     Description: all active components, pre-industrial, with CN (Carbon Nitrogen) in CLM
   F_AMIP (FAMIP)
     Description: Default resolution independent AMIP is INVALID
   F_2000_CN (FCN)
     Description: Stand-alone cam default, prescribed ocn/ice with CN
   G_NORMAL_YEAR (G)
     Description: Coupled ocean ice with COREv2 normal year forcing
   I_2000 (I)
     Description: Active land model with QIAN atm input data for 2003 and Satellite phenology (SP), CO2 level and Aerosol deposition for 2000
   I_1850 (I1850)
     Description: Active land model with QIAN atm input data for 1948 to 1972 and Satellite phenology (SP), CO2 level and Aerosol deposition for 1850

MACHINES: name (description)
   bluefire (NCAR IBM p6, os is AIX, 32 pes/node, batch system is LSF)
   franklin (NERSC XT4, os is CNL, 4 pes/node, batch system is PBS)
   intrepid (ANL IBM BG/P, os is BGP, 4 pes/node, batch system is cobalt)
   jaguar (ORNL XT4, os is CNL, 4 pes/node, batch system is PBS)
   jaguarpf (ORNL XT5, os is CNL, 12 pes/node, batch system is PBS)
   prototype_ranger (TACC Linux Cluster, Linux (pgi), 1 pes/node, batch system is SGE)
   generic_linux_pgi (generic linux (pgi), os is Linux, batch system is PBS, user-defined)
   generic_linux_intel (generic linux (intel), os is Linux, batch system is PBS, user-defined)
CESM component sets
• Components and component models are basic elements throughout CESM
• Plug and play of components (e.g. atm) with different component models (e.g. cam, datm, etc.)
• Done at case configuration time
• Each component model has its own subdirectory tree under the model root
[Diagram: example component sets, each coupled through cpl --
 B_ compset: CAM, CLM, POP2, CICE, SGLC
 I_ compset: DATM, CLM, SOCN, SICE, SGLC
 G_ compset: DATM, DLND, POP2, CICE, SGLC
 F_ compset: CAM, CLM, DOCN, CICE (prescribed), SGLC]
(1) create_newcase arguments: compset

CESM component summary (full descriptions in the documentation: http://www.cesm.ucar.edu/models/cesm1.0/cesm/cesm_doc_1_0_4/x42.html)

Model Type   Model Name   Component Name   Type     Description
atmosphere   atm          cam              active   Community Atm Model
atmosphere   atm          datm             data     atmosphere forcing data
land         lnd          clm              active   Community Land Model
land         lnd          dlnd             data     data run-off or data land
ocean        ocn          pop              active   Parallel Ocean Program
ocean        ocn          docn             data     data or slab ocean
sea-ice      ice          cice             active   CICE (prognostic or prescribed)
sea-ice      ice          dice             data     CICE prescribed
land-ice     glc          cism             active   CISM = extension of Glimmer
coupler      cpl          cpl              active   coupler using MCT library
COMPSETS = Component Sets
name (shortname): description (status)
   A_PRESENT_DAY (A)      All data model
   B_2000 (B)             All active components, present day
   B_1850 (B1850)         All active components, pre-industrial
   B_1850_CN (B1850CN)    All active components, pre-industrial, with CN (Carbon Nitrogen) in CLM
   F_AMIP (FAMIP)         Default resolution independent AMIP is INVALID
   F_2000_CN (FCN)        Stand-alone cam default, prescribed ocn/ice with CN
   G_NORMAL_YEAR (G)      Coupled ocean ice with COREv2 normal year forcing
   I_2000 (I)             Active land model with QIAN atm input data for 2003 and Satellite phenology (SP), CO2 level and Aerosol deposition for 2000
   I_1850 (I1850)         Active land model with QIAN atm input data for 1948 to 1972 and Satellite phenology (SP), CO2 level and Aerosol deposition for 1850
(1) Create your first case on alps
• You are now ready to create your first case on alps.
• As we are going to create many cases over the next few days, we will first create a directory where we will keep all our cases. You want this directory to be in space that is backed up (like your home directory). Please create the directory:
mkdir ~/cesm_case
• Using what you have learned about create_newcase, create a case called case01
in the directory ~/cesm_case
using the resolution T31_gx3v7
and the compset B_1850
Try to do this on your own!
If you get stuck, the solution is on the next slide (line in red)
# go to CESM root directory
cd /home/$LOGNAME/cesm_collection/cesm1_0_3

# go into scripts subdir
cd scripts

# (1) create a new case in your home dir
./create_newcase -case ~/cesm_case/case01 -res T31_g37 -compset B_1850 -mach alps
Basic work flow step 1) create_newcase
(1) Result of Running create_newcase
For both a quick start as well as a detailed summary of creating and running
a CESM model case, see the CESM1.0 User's Guide at
http://www.cesm.ucar.edu/models/cesm1.0

IMPORTANT INFORMATION ABOUT SCIENTIFIC VALIDATION

CESM1.0 has the flexibility to configure cases with many different
combinations of component models, grids, and model settings, but this
version of CESM has only been validated scientifically for the following
fully active configurations:

   1.9x2.5_gx1v6 B_1850_CN
   1.9x2.5_gx1v6 B_1850_RAMPCO2_CN
   1.9x2.5_gx1v6 B_1850-2000_CN
   1.9x2.5_gx1v6 B_1850_CAM5
   ...

please refer to the individual component web pages at
http://www.cesm.ucar.edu/models/cesm1.0

***********************************************************
Component set : B_1850 (B)
Desc          : All active components, pre-industrial
***********************************************************

Creating ~/cesm_case/case01

Locking file ~/cesm_case/case01/env_case.xml
Successfully created the case for alps
warning message
success
case location
(1) Overview of Directories (after create_newcase)
[Diagram: after create_newcase --
 CASE directory ($CASEROOT = case01): configure, SourceMods (src.cam, src.pop2, src.share, …), Tools
 CESM download ($CCSMROOT): models (atm, lnd, ocn, ice, glc, csm_share, utils, drv) and scripts (create_newcase)
 INPUTDATA directory ($DIN_LOC_ROOT_CSMDATA)]
$CCSMROOT = /home/s07hsu00/cesm_collection/cesm1_0_3/ $DIN_LOC_ROOT_CSMDATA = /work/s07hsu00/cesm_inputdata $CASEROOT = ~/cesm_case/case01
When you issue the command create_newcase, you create a case directory $CASEROOT with the structure shown here.
Please go into the case directory you just created and look at its contents.
(1) Case Directory After Running create_newcase
• SourceMods is a directory where case-specific code modifications can be placed (we will study this later this week)
• configure is the script used in the next step, step (2)
• env_*.xml files contain environment variables associated with the case (more on this later)
• xmlchange is a script that changes env variable values through a command line interface
CESM1_0/scripts> cd ~/cesm_case/case01
cases/case01>ls -l
total 64
drwxr-xr-x 2 userx ncar  8192 May 13 14:32 LockedFiles
-rw-r--r-- 1 userx ncar 10687 May 13 14:32 Macros.alps
drwxr-xr-x 2 userx ncar  8192 May 13 14:32 README
-rw-r--r-- 1 userx ncar    66 May 13 14:32 README.case
drwxr-xr-x 9 userx ncar  8192 May 13 14:32 SourceMods
drwxr-xr-x 4 userx ncar  8192 May 13 14:32 Tools
-rwxr-xr-x 1 userx ncar  9330 May 12 11:33 check_input_data
-rwxr-xr-x 1 userx ncar 10092 May 12 11:33 configure
-rwxr-xr-x 1 userx ncar  3085 May 12 11:33 create_production_test
-rw-r--r-- 1 userx ncar  4433 May 13 14:32 env_build.xml
-rw-r--r-- 1 userx ncar  5635 May 13 14:32 env_case.xml
-rw-r--r-- 1 userx ncar  7029 May 13 14:32 env_conf.xml
-rw-r--r-- 1 userx ncar  5915 May 13 14:32 env_mach_pes.xml
-rwxr-xr-x 1 userx ncar  2199 May 13 14:32 env_mach_specific
-rw-r--r-- 1 userx ncar 10466 May 13 14:32 env_run.xml
-rwxr-xr-x 1 userx ncar 10388 May 12 11:33 xmlchange
env files
configure
xmlchange
SourceMods
About env_*.xml Files: Format & Variables
• Contain variables used by scripts -- some may be changed by the user
• Here’s a snippet of the env_run.xml file:

<!--"sets the run length in conjunction with STOP_N and STOP_DATE, valid values: none,never,nsteps,nstep,nseconds,nsecond,nminutes,nminute,nhours,nhour,ndays,nday,nmonths,nmonth,nyears,nyear,date,ifdays0,end (char)" -->
<entry id="STOP_OPTION" value="ndays" />

<!--"sets the run length in conjunction with STOP_OPTION and STOP_DATE (integer)" -->
<entry id="STOP_N" value="5" />

<!--"logical to turn on short term archiving, valid values: TRUE,FALSE (logical)" -->
<entry id="DOUT_S" value="TRUE" />

<!--"local short term archiving root directory (char)" -->
<entry id="DOUT_S_ROOT" value="/ptmp/$CCSMUSER/cesm_archive/$CASE" />
• “id” is the variable’s name
• “value” is the variable’s setting
• <!-- text --> is a comment in xml (most variables have a description above the entry)
• “type” is the type of the variable (float, integer, logical, char, …)
• “valid values” indicates the full set of allowable settings
   • the script will let you know if you try to set a variable to an invalid value
   • many variables do not have valid values defined; that means there are no constraints
• To modify a variable in an xml file, use xmlchange to modify env variable settings:
   > xmlchange -file filename.xml -id name -val value
   > xmlchange -file env_run.xml -id STOP_N -val 20      (sets STOP_N = 20 in env_run.xml)
   For help: > xmlchange -help
• Instead of using xmlchange, you may edit env_*.xml files manually -- but be careful about introducing formatting errors
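If you do edit an env_*.xml file by hand, a crude sanity check can catch the most common slip (an <entry> element that lost its closing "/>"). A sketch of our own, not a real XML validator and not part of CESM:

```shell
# Crude check for hand-edited env_*.xml files: every line that opens
# an <entry ...> element should also self-close it with "/>".
# A sketch only -- not a substitute for a real XML parser.
check_env_xml() {
    entries=$(grep -c '<entry' "$1")
    closed=$(grep -c '<entry[^>]*/>' "$1")
    if [ "$entries" -eq "$closed" ]; then
        echo "$1: entries look well-formed"
    else
        echo "$1: possible formatting error -- re-check your edits"
    fi
}

# e.g. check_env_xml env_run.xml
```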
About env_*.xml Files: How They Change the Build and Run
• Defaults are generally reasonable
• env_case.xml is set by create_newcase and cannot be modified
• env_conf.xml variables specify various component information
   • Most often this file should not be modified
   • RUN_TYPE, RUN_STARTDATE, RUN_REFCASE, RUN_REFDATE define initial conditions
   • Can change the physics of a model -- be very careful about this
• env_mach_pes.xml variables specify the layout of components on hardware processors
   • Used to tune the performance of the model -- scientific results do not depend on component/processor layout
   • NTASKS_* -- number of mpi tasks assigned to the component
   • NTHRDS_* -- number of openmp threads per mpi task for the component
   • ROOTPE_* -- global mpi task rank of the component root mpi task
• env_build.xml variables specify some build information
   • Most often this file should not be modified
• Macros.* specifies the compilation variables used in the Makefile
   • Most often this file should not be modified
• env_mach_specific
   • Sets modules and paths to libraries (e.g. MPI)
   • Can change compiler options, libraries, etc.
   • Part of porting is to set variables here
• env_run.xml variables specify run time information
   • Most often this file will be modified
   • STOP_OPTION, STOP_N, REST_OPTION, REST_N
<entry id="NTASKS_ATM" value="64" />
<entry id="NTHRDS_ATM" value="1" />
<entry id="ROOTPE_ATM" value="0" />

<entry id="NTASKS_LND" value="64" />
<entry id="NTHRDS_LND" value="1" />
<entry id="ROOTPE_LND" value="0" />

<entry id="NTASKS_ICE" value="50" />
<entry id="NTHRDS_ICE" value="1" />
<entry id="ROOTPE_ICE" value="0" />

<entry id="NTASKS_OCN" value="60" />
<entry id="NTHRDS_OCN" value="1" />
<entry id="ROOTPE_OCN" value="0" />

<entry id="NTASKS_CPL" value="64" />
<entry id="NTHRDS_CPL" value="1" />
<entry id="ROOTPE_CPL" value="0" />
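To see the task layout at a glance, the NTASKS_* settings can be pulled out of env_mach_pes.xml with grep and sed. A sketch (the `pe_summary` helper is ours), assuming the one-entry-per-line layout that these files use:

```shell
# Print "<COMPONENT> <ntasks>" for each NTASKS_* entry in an
# env_mach_pes.xml-style file. Assumes one <entry .../> per line.
pe_summary() {
    grep -o 'id="NTASKS_[A-Z]*" value="[0-9]*"' "$1" |
        sed 's/id="NTASKS_\([A-Z]*\)" value="\([0-9]*\)"/\1 \2/'
}

# e.g. pe_summary env_mach_pes.xml
```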
Step (2) In the Basic Work Flow
(1) Create a New Case (2) Configure the Case (3) Build the Executable
(4) Run the Model
(2) Configure the case
The command in red configures the case on alps. You can do this now.
# go to root directory of CESM code
cd /home/s07hsu00/cesm_collection/cesm1_0_3

# go into scripts subdir
cd scripts

# (1) create a new case in your home dir
./create_newcase -case ~/cesm_case/case01 -res T31_g37 -compset B_1850 -mach alps

# go into the case you just created in the last step
cd ~/cesm_case/case01/

# (2) configure the case
./configure -case
(2) Configure the Case
• To configure, use the command: configure -case
• If necessary, you can modify env_conf.xml and env_mach_pes.xml before running configure, but not after (unless you invoke configure with a clean option)
• Most often there is no need to modify env_conf.xml or env_mach_pes.xml
• The command configure -case generates
   • the Buildconf/ directory with buildnml, buildexe, and input_data_list files
   • the case *.build and *.run scripts
• It locks env_conf.xml and env_mach_pes.xml
CESM1_0/scripts> cd ~/cesm_case/case01
cases/case01>ls -l
total 64
drwxr-xr-x 2 userx ncar  8192 May 13 14:32 LockedFiles
-rw-r--r-- 1 userx ncar 10687 May 13 14:32 Macros.alps
drwxr-xr-x 2 userx ncar  8192 May 13 14:32 README
-rw-r--r-- 1 userx ncar    66 May 13 14:32 README.case
drwxr-xr-x 9 userx ncar  8192 May 13 14:32 SourceMods
drwxr-xr-x 4 userx ncar  8192 May 13 14:32 Tools
-rwxr-xr-x 1 userx ncar  9330 May 12 11:33 check_input_data
-rwxr-xr-x 1 userx ncar 10092 May 12 11:33 configure
-rwxr-xr-x 1 userx ncar  3085 May 12 11:33 create_production_test
-rw-r--r-- 1 userx ncar  4433 May 13 14:32 env_build.xml
-rw-r--r-- 1 userx ncar  5635 May 13 14:32 env_case.xml
-rw-r--r-- 1 userx ncar  7029 May 13 14:32 env_conf.xml
-rw-r--r-- 1 userx ncar  5915 May 13 14:32 env_mach_pes.xml
-rwxr-xr-x 1 userx ncar  2199 May 13 14:32 env_mach_specific
-rw-r--r-- 1 userx ncar 10466 May 13 14:32 env_run.xml
-rwxr-xr-x 1 userx ncar 10388 May 12 11:33 xmlchange
env_conf.xml env_mach_pes.xml
configure
(2) About configure
• configure -case
   • Generates the Buildconf/ directory and buildnml, buildexe, and input_data_list files
   • Generates the case .build and .run scripts
   • Locks env_conf.xml and env_mach_pes.xml
• configure -cleanall
   • Unlocks env_conf.xml and env_mach_pes.xml
   • “Backs up” Buildconf/ and the run scripts
   • Modify env_conf.xml and env_mach_pes.xml, then type configure -case again
• configure -cleanmach
   • Unlocks only env_mach_pes.xml
   • “Backs up” the run scripts
   • Modify env_mach_pes.xml, then type configure -case again
> configure -help

NAME
   configure - configures the model for a given resolution, component set
   and machine.

SYNOPSIS
   configure [-case] [-cleannamelist] [-cleanmach] [-cleanall]
(2) Output from configure
cases/case01>./configure -case
Generating resolved namelist, prestage, and build scripts
adding use_case 1850_control defaults for var sim_year with val 1850
adding use_case 1850_control defaults for var sim_year_range with val constant
adding use_case 1850_control defaults for var use_case_desc with val Conditions to simulate 1850 land-use
configure done.
Successfully generated resolved namelist, prestage, and build scripts
Locking file env_conf.xml
Generating clean_build script
Generating build script
Generating run script
Locking file env_mach_pes.xml
Successfully configured the case for alps
If an old build exists for this case, you might want to
run the *.clean_build script before building
Generated Buildconf files
Generated build and run scripts
Success
(2) Overview of Directories (after configure)

[Diagram: the CASE directory ($CASEROOT = case01) now also contains the Buildconf/ directory (*.buildnml.csh, *.buildexe.csh, *.input_data_list) and the case01.alps.build and case01.alps.run scripts, alongside configure, SourceMods, and Tools; the CESM download ($CCSMROOT) and the INPUTDATA directory ($DIN_LOC_ROOT_CSMDATA) are unchanged]
$CCSMROOT = /home/s07hsu00/cesm_collection/cesm1_0_3/ $DIN_LOC_ROOT_CSMDATA = /work/s07hsu00/cesm_inputdata $CASEROOT = ~/cesm_case/case01
When you issue the command configure -case, you generate a set of files (shown in green here).
Please go into the case directory and check that the files are there.
(2) Case directory ($CASEROOT) after running configure
• configure adds the Buildconf/ directory and populates it
• configure generates build, clean_build, run, and archive scripts
cases/case01>ls -l
total 432
drwxr-xr-x 6 userx ncar  8192 May 13 17:12 Buildconf
drwxr-xr-x 2 userx ncar  8192 May 13 17:12 LockedFiles
-rw-r--r-- 1 userx ncar 10687 May 13 14:32 Macros.alps
drwxr-xr-x 2 userx ncar  8192 May 13 14:32 README
-rw-r--r-- 1 userx ncar    66 May 13 14:32 README.case
drwxr-xr-x 9 userx ncar  8192 May 13 14:32 SourceMods
drwxr-xr-x 4 userx ncar  8192 May 13 14:32 Tools
-rwxr-xr-x 1 userx ncar  9330 May 12 11:33 check_input_data
-rwxr-xr-x 1 userx ncar 10092 May 12 11:33 configure
-rwxr-xr-x 1 userx ncar  3085 May 12 11:33 create_production_test
-rw-r--r-- 1 userx ncar  4454 May 13 17:12 env_build.xml
-rw-r--r-- 1 userx ncar  5635 May 13 14:32 env_case.xml
-rw-r--r-- 1 userx ncar  7029 May 13 14:32 env_conf.xml
-rw-r--r-- 1 userx ncar   614 May 13 17:12 env_derived
-rw-r--r-- 1 userx ncar  5916 May 13 17:12 env_mach_pes.xml
-rwxr-xr-x 1 userx ncar  2199 May 13 14:32 env_mach_specific
-rw-r--r-- 1 userx ncar 10466 May 13 14:32 env_run.xml
-rwxrwxr-x 1 userx ncar   574 May 13 17:12 case01.alps.build
-rwxrwxr-x 1 userx ncar   836 May 13 17:12 case01.alps.clean_build
-rwxrwxr-x 1 userx ncar   802 May 13 17:12 case01.alps.l_archive
-rwxrwxr-x 1 userx ncar  3938 May 13 17:12 case01.alps.run
-rwxr-xr-x 1 userx ncar 10388 May 12 11:33 xmlchange
new scripts
Buildconf
(2) Files in the Buildconf/ Directory (Created by configure)
• The configure script fills the Buildconf/ directory, which contains
   • Component buildnml.csh scripts
   • Component buildexe.csh scripts
   • Component input_data_list files
cases/case01>ls -l Buildconf/
total 448
-rwxr-xr-x 1 userx ncar   850 May 13 17:12 cam.buildexe.csh
-rwxr-xr-x 1 userx ncar  3625 May 13 17:12 cam.buildnml.csh
-rwxr-xr-x 1 userx ncar  1508 May 13 17:12 cam.input_data_list
drwxr-xr-x 2 userx ncar  8192 May 13 17:12 camconf
-rwxr-xr-x 1 userx ncar   480 May 13 17:12 ccsm.buildexe.csh
-rwxr-xr-x 1 userx ncar  1414 May 13 17:12 cice.buildexe.csh
-rwxr-xr-x 1 userx ncar  3292 May 13 17:12 cice.buildnml.csh
-rwxr-xr-x 1 userx ncar   379 May 13 17:12 cice.input_data_list
drwxr-xr-x 2 userx ncar  8192 May 13 17:12 ciceconf
-rwxr-xr-x 1 userx ncar  1174 May 13 17:12 clm.buildexe.csh
-rwxr-xr-x 1 userx ncar  2269 May 13 17:12 clm.buildnml.csh
-rwxr-xr-x 1 userx ncar   702 May 13 17:12 clm.input_data_list
drwxr-xr-x 2 userx ncar  8192 May 13 17:12 clmconf
-rwxr-xr-x 1 userx ncar    42 May 13 17:12 cpl.buildexe.csh
-rwxr-xr-x 1 userx ncar 10507 May 13 17:12 cpl.buildnml.csh
-rwxr-xr-x 1 userx ncar  1665 May 13 17:12 csm_share.buildlib
-rwxr-xr-x 1 userx ncar  1965 May 13 17:12 mct.buildlib
-rwxr-xr-x 1 userx ncar  2412 May 13 17:12 pio.buildlib
-rwxr-xr-x 1 userx ncar  5546 May 13 17:12 pop2.buildexe.csh
-rwxr-xr-x 1 userx ncar 29056 May 13 17:12 pop2.buildnml.csh
-rwxr-xr-x 1 userx ncar  1012 May 13 17:12 pop2.input_data_list
drwxr-xr-x 2 userx ncar  8192 May 13 17:12 pop2doc
-rwxr-xr-x 1 userx ncar   588 May 13 17:12 sglc.buildexe.csh
-rwxr-xr-x 1 userx ncar    78 May 13 17:12 sglc.buildnml.csh
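One quick way to see which components configure generated namelist scripts for is to strip the .buildnml.csh suffix from the file names. A small sketch (the helper name is ours, not part of CESM):

```shell
# List the components that have a generated buildnml script in a
# Buildconf/ directory (cam, cice, clm, cpl, pop2, sglc, ...).
buildconf_components() {
    for f in "$1"/*.buildnml.csh; do
        basename "$f" .buildnml.csh
    done
}

# e.g. buildconf_components ~/cesm_case/case01/Buildconf
```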
Step (3) in the Basic Work Flow
(1) Create a New Case (2) Configure the Case (3) Build the Executable
(4) Run the Model
(3) Build the executable on alps
The command in red builds the executable on "alps". Please do this now.
# go to root directory of CESM code
cd /home/s07hsu00/cesm_collection/cesm1_0_3

# go into scripts subdir
cd scripts

# (1) create a new case in your home dir
./create_newcase -case ~/cesm_case/case01 -res T31_g37 -compset B_1850 -mach alps

# go into the case you just created in the last step
cd ~/cesm_case/case01/

# (2) configure the case
./configure -case

# (3) build the executable
./case01.alps.build
(3) Build the Model
• Modifications before the build
   • You can change values in env_build.xml before running *.build, but not after
   • Usually there is no need to edit env_build.xml
   • You may want to introduce modified source code before building (see: Wednesday)
• The *.build script
   • checks for missing input data by running check_input_data -check
   • creates the build/run directory that will contain the executable and model namelist files
   • locks env_build.xml
   • compiles the individual component libraries by running the *.buildexe.csh scripts for each component
   • creates the final, single executable
• If any inputdata is missing…
   • The build will abort, but it will provide a list of missing files
   • You must then run the script check_input_data -export to acquire the missing data
   • svn will be used to put the required data in the inputdata directory
   • You must re-run the build script after running check_input_data -export

NB: For the tutorial, you don’t have permissions to put data in $DIN_LOC_ROOT_CSMDATA.
However, all data required for the tutorial should already be in $DIN_LOC_ROOT_CSMDATA.
(3) The *.build scripts

cases/case01>ls -l
total 432
drwxr-xr-x 6 userx ncar  8192 May 13 17:12 Buildconf
drwxr-xr-x 2 userx ncar  8192 May 13 17:12 LockedFiles
-rw-r--r-- 1 userx ncar 10687 May 13 14:32 Macros.alps
drwxr-xr-x 2 userx ncar  8192 May 13 14:32 README
-rw-r--r-- 1 userx ncar    66 May 13 14:32 README.case
drwxr-xr-x 9 userx ncar  8192 May 13 14:32 SourceMods
drwxr-xr-x 4 userx ncar  8192 May 13 14:32 Tools
-rwxr-xr-x 1 userx ncar  9330 May 12 11:33 check_input_data
-rwxr-xr-x 1 userx ncar 10092 May 12 11:33 configure
-rwxr-xr-x 1 userx ncar  3085 May 12 11:33 create_production_test
-rw-r--r-- 1 userx ncar  4454 May 13 17:12 env_build.xml
-rw-r--r-- 1 userx ncar  5635 May 13 14:32 env_case.xml
-rw-r--r-- 1 userx ncar  7029 May 13 14:32 env_conf.xml
-rw-r--r-- 1 userx ncar   614 May 13 17:12 env_derived
-rw-r--r-- 1 userx ncar  5916 May 13 17:12 env_mach_pes.xml
-rwxr-xr-x 1 userx ncar  2199 May 13 14:32 env_mach_specific
-rw-r--r-- 1 userx ncar 10466 May 13 14:32 env_run.xml
-rwxrwxr-x 1 userx ncar   574 May 13 17:12 case01.alps.build
-rwxrwxr-x 1 userx ncar   836 May 13 17:12 case01.alps.clean_build
-rwxrwxr-x 1 userx ncar   802 May 13 17:12 case01.alps.l_archive
-rwxrwxr-x 1 userx ncar  3938 May 13 17:12 case01.alps.run
-rwxr-xr-x 1 userx ncar 10388 May 12 11:33 xmlchange
.build script
env_build.xml
check_input_data
(3) Output from case01.alps.build Script
cases/case01>./case01.alps.build
-------------------------------------------------------------------------
 CESM BUILDNML SCRIPT STARTING
 - To prestage restarts, untar a restart.tar file into /ptmp/userx/case01/run
 - Create modelio namelist input files
 CESM BUILDNML SCRIPT HAS FINISHED SUCCESSFULLY
-------------------------------------------------------------------------
 CESM PRESTAGE SCRIPT STARTING
 - CESM input data directory, DIN_LOC_ROOT_CSMDATA, is /fis/cgd/cseg/csm/inputdata
 - Case input data directory, DIN_LOC_ROOT, is /fis/cgd/cseg/csm/inputdata
 - Checking the existence of input datasets in DIN_LOC_ROOT
 CESM PRESTAGE SCRIPT HAS FINISHED SUCCESSFULLY
-------------------------------------------------------------------------
 CESM BUILDEXE SCRIPT STARTING
 - Build Libraries: mct pio csm_share
Tue May 18 18:06:34 MDT 2010 /ptmp/userx/case01/mct/mct.bldlog.100518-180630
Tue May 18 18:07:21 MDT 2010 /ptmp/userx/case01/pio/pio.bldlog.100518-180630
Tue May 18 18:08:16 MDT 2010 /ptmp/userx/case01/csm_share/csm_share.bldlog.100518-180630
Tue May 18 18:08:59 MDT 2010 /ptmp/userx/case01/run/cpl.bldlog.100518-180630
Tue May 18 18:08:59 MDT 2010 /ptmp/userx/case01/run/atm.bldlog.100518-180630
Tue May 18 18:10:25 MDT 2010 /ptmp/userx/case01/run/lnd.bldlog.100518-180630
Tue May 18 18:11:43 MDT 2010 /ptmp/userx/case01/run/ice.bldlog.100518-180630
Tue May 18 18:12:43 MDT 2010 /ptmp/userx/case01/run/ocn.bldlog.100518-180630
Tue May 18 18:14:52 MDT 2010 /ptmp/userx/case01/run/glc.bldlog.100518-180630
Tue May 18 18:14:53 MDT 2010 /ptmp/userx/case01/run/ccsm.bldlog.100518-180630
 - Locking file env_build.xml
 - Locking file Macros.alps
 CESM BUILDEXE SCRIPT HAS FINISHED SUCCESSFULLY

Success
Namelist Generation
Inputdata Verification
Model Build
(3) Your $RUNDIR after running .build
> ls -al $RUNDIR     where RUNDIR is /work/$LOGNAME/cesm_run/case01/run

cases/case01>ls -l $RUNDIR
total 167552
-rw-r--r-- 1 userx ncar     9960 May 18 18:10 atm.bldlog.100518-180630.gz
-rw-r--r-- 1 userx ncar     2867 May 18 18:06 atm_in
-rw-r--r-- 1 userx ncar      133 May 18 18:06 atm_modelio.nml
-rw-r--r-- 1 userx ncar     1398 May 18 18:15 ccsm.bldlog.100518-180630.gz
-rwxr-xr-x 1 userx ncar 84463482 May 18 18:15 ccsm.exe
-rw-r--r-- 1 userx ncar      120 May 18 18:08 cpl.bldlog.100518-180630.gz
-rw-r--r-- 1 userx ncar      133 May 18 18:06 cpl_modelio.nml
-rw-r--r-- 1 userx ncar       50 May 18 18:06 drv_flds_in
-rw-r--r-- 1 userx ncar     2545 May 18 18:06 drv_in
-rw-r--r-- 1 userx ncar      589 May 18 18:14 glc.bldlog.100518-180630.gz
-rw-r--r-- 1 userx ncar      133 May 18 18:06 glc_modelio.nml
-rw-r--r-- 1 userx ncar     2569 May 18 18:12 ice.bldlog.100518-180630.gz
-rw-r--r-- 1 userx ncar     3279 May 18 18:06 ice_in
-rw-r--r-- 1 userx ncar      133 May 18 18:06 ice_modelio.nml
-rw-r--r-- 1 userx ncar     4591 May 18 18:11 lnd.bldlog.100518-180630.gz
-rw-r--r-- 1 userx ncar     1918 May 18 18:06 lnd_in
-rw-r--r-- 1 userx ncar      133 May 18 18:06 lnd_modelio.nml
-rw-r--r-- 1 userx ncar     3668 May 18 18:14 ocn.bldlog.100518-180630.gz
-rw-r--r-- 1 userx ncar      133 May 18 18:06 ocn_modelio.nml
-rw-r--r-- 1 userx ncar    14976 May 18 18:06 pop2_in
-rw-r--r-- 1 userx ncar     1882 May 18 18:06 seq_maps.rc
executable
.bld.log files
namelist files
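If a build fails, the compressed bldlog files in $RUNDIR are the place to look. A sketch for skimming the tail of each (plain gunzip/tail; the `tail_bldlogs` helper name is ours, and the example path is this tutorial's):

```shell
# Print the last few lines of every compressed build log under the
# given run directory -- usually enough to spot which component's
# compile failed and why.
tail_bldlogs() {
    for log in "$1"/*.bldlog.*.gz; do
        echo "== $log =="
        gunzip -c "$log" | tail -n 5
    done
}

# e.g. tail_bldlogs /work/$LOGNAME/cesm_run/case01/run
```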
(3) Overview of Directories during build
run $RUNDIR
$EXEROOT
atm
lnd
ocn
ice
glc
ccsm
Build/Run Directory
models scripts create_newcase
~/ccsm4_0 $CCSMROOT
atm
lnd
ocn
ice
glc
csm_share
utils
drv
CCSM Download
/fs/cgd/csm/inputdata $DIN_LOC_ROOT
atm lnd ocn ice glc cpl
cice dice7
/ $DIN_LOC_ROOT_CSMDATA
INPUTDATA Directory
case01 $CASEROOT
configure case01.alps.build case01.alps.run
Buildconf *.buildnml.csh *.buildexe.csh
*.input_data.list
SourceMods
src.cam
src.pop2
src.share
CASE Directory
Tools
$CCSMROOT = /home/s07hsu00/cesm_collection/cesm1_0_3/
$DIN_LOC_ROOT_CSMDATA = /work/s07hsu00/cesm_inputdata
$CASEROOT = ~/cesm_case/case01
$EXEROOT = /work/$LOGNAME/cesm_run/case01
When you build, you generate the build/run directory ($EXEROOT, set in env_build.xml)
The script builds each individual component and, finally, the coupled model executable.
The run directory ($RUNDIR) contains the component namelists and the executable.
Step (4a) In the Basic Work Flow
(1) Create a New Case (2) Configure the Case (3) Build the Executable
(4) Run the Model
4a: Initial run
4b: Restart run
37
Work Flow: Super Quick Start
You are ready to submit the run script to the batch queue. Execute the commands in red
(the first commands are only needed if you moved out of your case directory):
# go to root directory of CESM code
cd /home/s07hsu00/cesm_collection/cesm1_0_3

# go into scripts subdir
cd scripts

# (1) create a new case in your home dir
create_newcase -case ~/cesm_case/case01 -res T31_g37 -compset B_1850 -mach alps

# go into the case you just created in the last step
cd ~/cesm_case/case01/

# (2) configure the case
configure -case

# (3) build the executable
./case01.alps.build

# (4) submit your run to the batch queue
bsub < ./case01.alps.run

# check status of job and output files
bjobs    # check the queue
ls -lFt /work/$LOGNAME/cesm_run/case01/run
38
Change back to CASEDIR if necessary
(4a) Running the Model: An Initial Run
• The run script
• Generates the namelist files in $RUNDIR (again)
• Verifies the existence of input datasets (again)
• DOES NOT build (or re-build) the executable
• Initial runs are usually short (the default is 5 days) and are used to verify that the model is running correctly
• You will usually want to edit the env_run.xml file before running, e.g. to specify the length of the run (we will do this later in the tutorial)
• You may want to modify namelist settings before running (we will do this later in the tutorial as well)
• Via env_run.xml variables
• Directly in the Buildconf/*.buildnml.csh files
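The env_run.xml variables are normally edited with the xmlchange tool shown later in this tutorial. As a sketch of what that edit effectively does, here is the equivalent change performed with sed on a toy three-line stand-in for env_run.xml (STOP_N, STOP_OPTION and CONTINUE_RUN are real CESM variables; the file contents and the sed approach are illustrative only, not how xmlchange is implemented):

```shell
# Toy stand-in for $CASEROOT/env_run.xml (the real file has many more entries)
cat > env_run.toy.xml <<'EOF'
<entry id="STOP_OPTION" value="ndays" />
<entry id="STOP_N" value="5" />
<entry id="CONTINUE_RUN" value="FALSE" />
EOF

# Equivalent of: xmlchange -file env_run.xml -id STOP_N -val 20
sed -i 's/id="STOP_N" value="[^"]*"/id="STOP_N" value="20"/' env_run.toy.xml

# Confirm the new value took effect
grep 'STOP_N' env_run.toy.xml
```

In a real case directory, prefer xmlchange over hand edits: it validates the variable name and keeps the XML well-formed.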
39
(4) While the job is running: bjobs and bkill

While a job is running, the queuing system gives some information.

• Check status in the run queue:
> bjobs
JOBID USER STAT QUEUE FROM_HOST EXEC_HOST JOB_NAME SUBMIT_TIME
989663 s07hsu0 PEND 48cpu alps6           case01   Mar 20 06:17
It is waiting in the queue if EXEC_HOST is blank.
It is running if EXEC_HOST has a value:
> bjobs
JOBID USER STAT QUEUE FROM_HOST EXEC_HOST JOB_NAME SUBMIT_TIME
989664 s07hsu0 RUN 48cpu alps6 48*group2  case01   Mar 20 06:20
It is finished if it doesn't appear in the list (this doesn't imply success, though!):
> bjobs
No unfinished job found
One more useful queuing system command is bkill, to stop a job. The argument is the JOBID number shown in the bjobs output.
> bkill 989664
40
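When scripting around the queue, bjobs output can be parsed field by field. The snippet below saves the sample line from the slide above to a file so it runs standalone; on alps you would pipe bjobs directly into awk instead:

```shell
# Sample bjobs output (copied from the slide); a real script would run bjobs itself
cat > bjobs.sample <<'EOF'
JOBID USER STAT QUEUE FROM_HOST EXEC_HOST JOB_NAME SUBMIT_TIME
989664 s07hsu0 RUN 48cpu alps6 48*group2 case01 Mar 20 06:20
EOF

# Print JOBID and STAT for each job, skipping the header line
awk 'NR > 1 {print $1, $3}' bjobs.sample
```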
(4) While the job is running: Output in Your RUNDIR
/work/s07hsu00/cesm_run/case01/run> ls -ltr
-rw------- 1 s07hsu00 s07hsu00 130 2013-03-20 04:28 cpl.bldlog.130320-042504.gz
-rw------- 1 s07hsu00 s07hsu00 10726 2013-03-20 04:31 atm.bldlog.130320-042504.gz
-rw------- 1 s07hsu00 s07hsu00 2872 2013-03-20 04:32 lnd.bldlog.130320-042504.gz
-rw------- 1 s07hsu00 s07hsu00 1807 2013-03-20 04:33 ice.bldlog.130320-042504.gz
-rw------- 1 s07hsu00 s07hsu00 2543 2013-03-20 04:35 ocn.bldlog.130320-042504.gz
-rw------- 1 s07hsu00 s07hsu00 530 2013-03-20 04:35 glc.bldlog.130320-042504.gz
-rw------- 1 s07hsu00 s07hsu00 1121 2013-03-20 04:36 ccsm.bldlog.130320-042504.gz
-rwx------ 1 s07hsu00 s07hsu00 200239191 2013-03-20 04:36 ccsm.exe
-rw------- 1 s07hsu00 s07hsu00 1852 2013-03-20 06:17 seq_maps.rc
-rw------- 1 s07hsu00 s07hsu00 157 2013-03-20 06:17 ocn_modelio.nml
-rw------- 1 s07hsu00 s07hsu00 157 2013-03-20 06:17 lnd_modelio.nml
-rw------- 1 s07hsu00 s07hsu00 1929 2013-03-20 06:17 lnd_in
-rw------- 1 s07hsu00 s07hsu00 157 2013-03-20 06:17 ice_modelio.nml
-rw------- 1 s07hsu00 s07hsu00 2248 2013-03-20 06:17 ice_in
-rw------- 1 s07hsu00 s07hsu00 157 2013-03-20 06:17 glc_modelio.nml
-rw------- 1 s07hsu00 s07hsu00 3608 2013-03-20 06:17 drv_in
-rw------- 1 s07hsu00 s07hsu00 50 2013-03-20 06:17 drv_flds_in
-rw------- 1 s07hsu00 s07hsu00 157 2013-03-20 06:17 cpl_modelio.nml
-rw------- 1 s07hsu00 s07hsu00 157 2013-03-20 06:17 atm_modelio.nml
-rw------- 1 s07hsu00 s07hsu00 3850 2013-03-20 06:17 atm_in
-rw------- 1 s07hsu00 s07hsu00 15066 2013-03-20 06:17 pop2_in
drwx------ 3 s07hsu00 s07hsu00 4096 2013-03-20 06:20 timing
-rw------- 1 s07hsu00 s07hsu00 672752 2013-03-20 06:20 case01.cam2.h1.0001-01-01-00000.nc
-rw------- 1 s07hsu00 s07hsu00 15447260 2013-03-20 06:20 case01.cpl.r.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 28095420 2013-03-20 06:20 case01.clm2.rh0.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 41743192 2013-03-20 06:20 case01.clm2.r.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 14200964 2013-03-20 06:20 case01.cice.r.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 2364620 2013-03-20 06:20 case01.cam2.rs.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 19756492 2013-03-20 06:20 case01.cam2.rh0.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 104246272 2013-03-20 06:20 case01.cam2.r.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 36 2013-03-20 06:20 rpointer.ocn.tavg
-rw------- 1 s07hsu00 s07hsu00 48 2013-03-20 06:20 rpointer.ocn.restart
-rw------- 1 s07hsu00 s07hsu00 33 2013-03-20 06:20 rpointer.ocn.ovf
-rw------- 1 s07hsu00 s07hsu00 257 2013-03-20 06:20 rpointer.lnd
-rw------- 1 s07hsu00 s07hsu00 257 2013-03-20 06:20 rpointer.ice
-rw------- 1 s07hsu00 s07hsu00 257 2013-03-20 06:20 rpointer.drv
-rw------- 1 s07hsu00 s07hsu00 339 2013-03-20 06:20 rpointer.atm
-rw------- 1 s07hsu00 s07hsu00 54227 2013-03-20 06:20 case01.pop.ro.0001-01-06-00000
-rw------- 1 s07hsu00 s07hsu00 119454828 2013-03-20 06:20 case01.pop.rh.0001-01-06-00000.nc
-rw------- 1 s07hsu00 s07hsu00 12886 2013-03-20 06:20 case01.pop.r.0001-01-06-00000.hdr
-rw------- 1 s07hsu00 s07hsu00 56979200 2013-03-20 06:20 case01.pop.r.0001-01-06-00000
Timing Files (dir)
Single executable
Namelist Input Files
41
The files in your RUNDIR can also tell you what is happening.
Recall: RUNDIR = /work/$LOGNAME/cesm_run/case01/run
Log files from builds
Restart files, e.g. case01.cam2.r*…, and text 'rpointer' files that record the latest restart files (restarts are not written until the end of the month, year, or run)
History files, e.g. case01.cam2.h*… Model output is the best sign that the model is running!
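The rpointer files are plain text; reading one tells you which restart file a continuation run will start from. A minimal sketch (the file is created here so the example runs anywhere; in practice rpointer.atm already exists in $RUNDIR, and may carry more than one line):

```shell
# A real rpointer.atm in $RUNDIR names the latest restart file, e.g.:
echo "case01.cam2.r.0001-01-06-00000.nc" > rpointer.atm

# Read it to see which restart the model will pick up from
cat rpointer.atm
```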
(4) While the job is running: Output in Your RUNDIR: logfiles
/work/s07hsu00/cesm_run/case01/run>
-rw------- 1 s07hsu00 s07hsu00 64111 2013-03-21 14:28 ccsm.log.130321-141417
-rw------- 1 s07hsu00 s07hsu00 218159 2013-03-21 14:29 ice.log.130321-141417
-rw------- 1 s07hsu00 s07hsu00 44541 2013-03-21 14:29 cpl.log.130321-141417
-rw------- 1 s07hsu00 s07hsu00 91742 2013-03-21 14:29 ocn.log.130321-141417
-rw------- 1 s07hsu00 s07hsu00 119637 2013-03-21 14:29 lnd.log.130321-141417
-rw------- 1 s07hsu00 s07hsu00 879808 2013-03-21 14:29 atm.log.130321-141417
> tail atm.log.130321-141417
 nstep, te 1885 0.33068338593459501E+10 0.33068331657009416E+10 -0.19234104944156602E-04 0.98453390855408943E+05
 NSTEP = 1885 8.839688154582877E-05 6.465300370062509E-06 250.495 9.84534E+04 2.359821342113403E+01 0.82 0.28
 nstep, te 1886 0.33068395201603765E+10 0.33068367183476424E+10 -0.77691542135344713E-04 0.98453408632295410E+05
 NSTEP = 1886 8.839730808138550E-05 6.465033776064601E-06 250.497 9.84534E+04 2.360126565292244E+01 0.81 0.28
> tail cpl.log.130321-141417
 memory_write: model date = 10205 0 memory = 160.32 MB (highwater) 394.55 MB (usage) (pe= 0 comps= cpl ocn atm lnd ice glc)
 tStamp_write: model date = 10206 0 wall clock = 2013-03-21 14:28:41 avg dt = 22.87 dt = 22.24
 memory_write: model date = 10206 0 memory = 160.34 MB (highwater) 394.55 MB (usage) (pe= 0 comps= cpl ocn atm lnd ice glc)
 tStamp_write: model date = 10207 0 wall clock = 2013-03-21 14:29:03 avg dt = 22.86 dt = 22.23
 memory_write: model date = 10207 0 memory = 160.34 MB (highwater) 394.55 MB (usage) (pe= 0 comps= cpl ocn atm lnd ice glc)
Log files for each component
42
The log files in the RUNDIR are valuable for seeing what timestep the model is on, or for diagnosing what went wrong if the model fails.
Note that the log files are moved from RUNDIR to CASEDIR/logs when the run completes, and they are gzipped. To unzip: gzip -d file.gz
Recall:
RUNDIR = /work/$LOGNAME/cesm_run/case01/run
CASEDIR = /home/$LOGNAME/cesm_case/case01
The atm model is on timestep 1885
The coupler shows the model date (YYYYMMDD)
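A quick way to see how far the model has advanced is to pull the last tStamp_write line from the coupler log. The sample lines below are copied from the cpl.log output above so the snippet runs standalone; on the real system you would grep the cpl.log file in $RUNDIR instead:

```shell
# Two sample coupler-log lines (copied from the slide above)
cat > cpl.log.sample <<'EOF'
tStamp_write: model date = 10206 0 wall clock = 2013-03-21 14:28:41 avg dt = 22.87 dt = 22.24
tStamp_write: model date = 10207 0 wall clock = 2013-03-21 14:29:03 avg dt = 22.86 dt = 22.23
EOF

# Latest model date (YYYYMMDD); field 5 holds the date in these lines
grep 'tStamp_write' cpl.log.sample | tail -n 1 | awk '{print $5}'
```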
(4) Output in Your CASE Directory: Success?
The files in your CASEDIR can tell you what is happening. • When a job completes, many files are moved from RUNDIR to CASEDIR. • When a job completes successfully, $CASEDIR/logs/cpl.log will end with
“SUCCESSFUL TERMINATION OF CPL7-CCSM”
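Because the archived coupler log is gzipped, the success check is easy to script with gzip -dc. This sketch builds a one-line toy log first so it runs anywhere; the success string is the real one quoted above, but the file name and contents are stand-ins for $CASEDIR/logs/cpl.log.<timestamp>.gz:

```shell
# Toy stand-in for the gzipped coupler log in $CASEDIR/logs
echo "SUCCESSFUL TERMINATION OF CPL7-CCSM" > cpl.log.sample
gzip -f cpl.log.sample

# Decompress to stdout and look for the success string
if gzip -dc cpl.log.sample.gz | grep -q 'SUCCESSFUL TERMINATION'; then
    echo "run completed successfully"
fi
```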
43
Copies of the Current Namelist Input Files
~/cesm_case/case01>ls -l
total 512
drwxr-xr-x 6 userx ncar 8192 May 18 18:32 Buildconf
drwxr-xr-x 2 userx ncar 8192 May 18 18:06 CaseDocs
drwxr-xr-x 2 userx ncar 8192 May 18 18:15 LockedFiles
-rw-r--r-- 1 userx ncar 10687 May 13 14:32 Macros.alps
drwxr-xr-x 2 userx ncar 8192 May 13 14:32 README
-rw-r--r-- 1 userx ncar 66 May 13 14:32 README.case
drwxr-xr-x 9 userx ncar 8192 May 13 14:32 SourceMods
drwxr-xr-x 4 userx ncar 8192 May 13 14:32 Tools
-rwxr-xr-x 1 userx ncar 9330 May 12 11:33 check_input_data
-rwxr-xr-x 1 userx ncar 10092 May 12 11:33 configure
-rwxr-xr-x 1 userx ncar 3085 May 12 11:33 create_production_test
-rw-r--r-- 1 userx ncar 4475 May 18 18:32 env_build.xml
-rw-r--r-- 1 userx ncar 5635 May 13 14:32 env_case.xml
-rw-r--r-- 1 userx ncar 7029 May 13 14:32 env_conf.xml
-rw-r--r-- 1 userx ncar 614 May 18 18:37 env_derived
-rw-r--r-- 1 userx ncar 5916 May 13 17:12 env_mach_pes.xml
-rwxr-xr-x 1 userx ncar 2199 May 13 14:32 env_mach_specific
-rw-r--r-- 1 userx ncar 10466 May 13 14:32 env_run.xml
drwxr-xr-x 3 userx ncar 8192 May 18 18:37 logs
-rw-r--r-- 1 userx ncar 270 May 18 18:37 poe.stderr.40597
-rw-r--r-- 1 userx ncar 2013 May 18 18:37 poe.stdout.40597
-rwxrwxr-x 1 userx ncar 574 May 13 17:12 case01.alps.build
-rwxrwxr-x 1 userx ncar 836 May 13 17:12 case01.alps.clean_build
-rwxrwxr-x 1 userx ncar 802 May 13 17:12 case01.alps.l_archive
-rwxrwxr-x 1 userx ncar 3938 May 13 17:12 case01.alps.run
drwxr-xr-x 2 userx ncar 8192 May 18 18:37 timing
-rwxr-xr-x 1 userx ncar 10388 May 12 11:33 xmlchange
stdout/err
Timing Files

cases/case01>ls -l timing
total 32
-rw-r--r-- 1 userx ncar 6204 May 18 18:37 ccsm_timing.case01.100518-183212
-rw-r--r-- 1 userx ncar 3711 May 18 18:37 ccsm_timing_summary.100518-183212.gz
cases/case01>ls -l logs
total 272
-rw-r--r-- 1 userx ncar 29882 May 18 18:37 atm.log.100518-183212.gz
drwxr-xr-x 2 userx ncar 8192 May 18 18:15 bld
-rw-r--r-- 1 userx ncar 19115 May 18 18:37 ccsm.log.100518-183212.gz
-rw-r--r-- 1 userx ncar 4998 May 18 18:37 cpl.log.100518-183212.gz
-rw-r--r-- 1 userx ncar 18732 May 18 18:37 ice.log.100518-183212.gz
-rw-r--r-- 1 userx ncar 9384 May 18 18:37 lnd.log.100518-183212.gz
-rw-r--r-- 1 userx ncar 18534 May 18 18:37 ocn.log.100518-183212.gz
Log Files
(4) Short Term Archiving Directory
• Output data is originally created in $RUNDIR
• When the run ends, output data is moved into a short term archiving directory,
  $DOUT_S_ROOT = /work/$LOGNAME/cesm_archive/case01
• Why?
• Cleans up the $RUNDIR directory
• Migrates output data away from a possibly volatile $RUNDIR
• Gathers data for the long term archive script, which can then save the data to a permanent long-term storage area (e.g. HPSS)
cases/case01>ls -l $DOUT_S_ROOT
total 1024
drwxr-xr-x 4 userx ncar 65536 May 18 18:37 atm
drwxr-xr-x 4 userx ncar 65536 May 18 18:37 cpl
drwxr-xr-x 4 userx ncar 65536 May 18 18:37 dart
drwxr-xr-x 3 userx ncar 65536 May 18 18:37 glc
drwxr-xr-x 4 userx ncar 65536 May 18 18:37 ice
drwxr-xr-x 4 userx ncar 65536 May 18 18:37 lnd
drwxr-xr-x 4 userx ncar 65536 May 18 18:37 ocn
drwxr-xr-x 3 userx ncar 65536 May 18 18:37 rest

cases/case01>ls -l $DOUT_S_ROOT/cpl
total 256
drwxr-xr-x 2 userx ncar 65536 May 18 18:37 hist
drwxr-xr-x 2 userx ncar 65536 May 18 18:37 logs

cases/case01>ls -l $DOUT_S_ROOT/cpl/logs/
total 256
-rw-r--r-- 1 userx ncar 19115 May 18 18:37 ccsm.log.100518-183212.gz
-rw-r--r-- 1 userx ncar 4998 May 18 18:37 cpl.log.100518-183212.gz
44
models scripts create_newcase
~/CESM1_0 $CCSMROOT
atm
lnd
ocn
ice
glc
csm_share
utils
drv
CCSM Download
(4) Overview of Directories after run and short term archive
/ $DOUT_S_ROOT
atm
lnd
ocn
ice
glc
rest
cpl
logs
hist
Short Term Archive
$DIN_LOC_ROOT_CSMDATA
atm lnd ocn ice glc cpl
cice dice7
INPUTDATA Directory
case01 $CASEROOT
configure case01.alps.build case01.alps.run
Buildconf *.buildnml.csh *.buildexe.csh
*.input_data.list
SourceMods
CASE Directory
Tools
logs
timing
CaseDocs
run $RUNDIR
$EXEROOT
atm
lnd
ocn
ice
glc
ccsm
Build/Run Directory
45
$CCSMROOT = /home/s07hsu00/cesm_collection/cesm1_0_3/
$DIN_LOC_ROOT_CSMDATA = /work/s07hsu00/cesm_inputdata
$CASEROOT = ~/cesm_case/case01
$EXEROOT = /work/$LOGNAME/cesm_run/case01
$DOUT_S_ROOT = /work/$LOGNAME/cesm_archive/case01
Step (4b) In the Basic Work Flow
• Creating & Running a Case
(1) Create a New Case
(2) Configure the Case
(3) Build the Executable
(4) Run the Model
4a: Initial Run
4b: Continuation Run
46
(4b) Running the Model: Continuation Runs
• You should start with a short initial run as described in step (4a)
• Carefully examine the output to verify that the run is doing what you expected; you might rerun the initial run several times to fix problems
• If the initial run looks good, the next step is a continuation run
• Set CONTINUE_RUN to TRUE in env_run.xml • Usually change STOP_OPTION to run the model longer
• May want to turn on the auto-resubmit option in env_run.xml (RESUBMIT)
• May want to turn on “long term archiving” in env_run.xml (DOUT_L_MS). NB: this applies only on supported machines; on alps, long-term archiving is not available for this tutorial.
47
(4b) Continuation run on alps
Wait for your initial run to complete.

Now continue the run case01 for 2 years.

You will need to edit variables in env_run.xml:
• CONTINUE_RUN
• STOP_N
• STOP_OPTION

The solution is on the following page, but please try first to do this on your own.
48
Work Flow: Super Quick Start
Use xmlchange to edit env_run.xml for a CONTINUE_RUN (restart).
When your initial run is completed, execute the commands in red.

# go to root directory of CESM code
cd /home/s07hsu00/cesm_collection/cesm1_0_3
# go into scripts subdir
cd scripts

# (1) create a new case in your home dir
create_newcase -case ~/cesm_case/case01 -res T31_g37 -compset B_1850 -mach alps

# go into the case you just created in the last step
cd ~/cesm_case/case01/

# (2) configure the case
configure -case

# (3) build the executable
./case01.alps.build

# (4) submit an initial run to the batch queue
bsub < ./case01.alps.run

# when the initial run finishes, change to a continuation run
xmlchange -file env_run.xml -id CONTINUE_RUN -val TRUE
xmlchange -file env_run.xml -id STOP_N -val 24
xmlchange -file env_run.xml -id STOP_OPTION -val nmonths

# (4b) submit a continuation run to the batch queue
bsub < ./case01.alps.run
49
You can only execute these commands after the first run is completed!
Change back to CASEDIR if necessary
(5) Long Term Archiving
• Why?
• Migrates output data away from a possibly volatile $DOUT_S_ROOT into a permanent long-term storage area
• The long term archiving script moves data conveniently and in parallel
• CESM has this capability; however, the scripts are not functional for this tutorial, so we will not be doing it today
50
models scripts create_newcase
~/CESM1_0 $CCSMROOT
atm
lnd
ocn
ice
glc
csm_share
utils
drv
CESM Download
/ $DIN_LOC_ROOT
atm lnd ocn ice glc cpl
cice dice7
INPUTDATA Directory
Buildconf *.buildnml.csh *.buildexe.csh
*.input_data.list
SourceMods
CASE Directory
Tools
logs
timing
CaseDocs
case01 $CASEROOT
configure case01.alps.build case01.alps.run
case01.alps.l_archive
/ $DOUT_S_ROOT
atm
lnd
ocn
ice
glc
rest
cpl
logs
hist
Short Term Archive
HPSS
(5) Overview of Directories (+ long term archive)
run $RUNDIR
/ $EXEROOT
atm
lnd
ocn
ice
glc
ccsm
Build/Run Directory
51
Long term Archive
More Information/Getting Help

• Model User Guides (please provide feedback)
• http://www.cesm.ucar.edu/models/cesm1.0/
• CESM Users Guide and web-browsable code reference
• CAM, CLM, POP2, CICE, Data Model, and CPL7 Users Guides
• CESM Bulletin Board
• http://bb.cgd.ucar.edu/
• Facilitates communication among the community
• Ask questions, look for answers
• Many different topics
• CESM Release Page Notes
• http://www.ccsm.ucar.edu/models/cesm1.0/tags/
• Notes significant bugs or issues as they are identified
• Model output is available on the Earth System Grid
• http://www.earthsystemgrid.org
• Getting Help - email
[email protected]
• Questions will be answered as resources are available
52