Practical introduction to Hirlam
Kalle Eerola, 2002, updated 2005.
Where to find information
hirlam.fmi.fi/hirlam_intro
- contains collected information
- contains the official Hirlam pages; a password is needed to enter the "Members only" area
PARTS OF HIRLAM
This does not contain every piece that is in the miniSMS version; the per-cycle task order is summarized in a schematic after this list. The miniSMS definition file is created in $HL_DATA when an experiment is started.
1. Compile
- Compile all libraries
- Run checkoptions
- Link the analysis and forecast model
- Note that all other programs are linked in the running phase (Boot)
2. Constant Files (ConstantFiles)
- Analysis constant files for 3DVAR (VARinput)
- Create the BUFR tables from ASCII files (MakeBUFRtabs)
3. Prepare constant files depending on the date (MakeCycleInput)
Repeat for every date:
- Check whether the climate files of this month need to be created (Climate)
- Repeat for every cycle in a day:
  - Create the strategy to define which boundaries to use (MakeStrategy)
  - Create the boundaries: horizontal and vertical interpolation (Boundaries)
  - Create the background field for the first cycle
  - Prepare the observations (Prepob and MakeCMA)
The next steps are repeated for every cycle.
Analysis
4. Define the background field (Fg)
5. Surface analysis (Span)
6. Input for the upper level analysis and the analysis error (VARinput)
7. 3DVAR analysis
- perform the 3DVAR analysis (VARan)
- this is a parallel program
8. Postprocessing of the analysis (Postpp)
9. Diagnostic file from the analysis
- contains information on the observation usage and the analysis increments
- creates a separate file (CMAstat)
Forecast
10. Define the input for the forecast model (FCinput)
- Define the start and boundary fields
- Create the namelist for the forecast model
11. Run the forecast
- a parallel application (Prog)
- contains the forecast including postprocessing of the forecast fields
The next steps are also repeated for every cycle.
Postprocessing
12. Verification against observations
- verification is done against the EWGLAM stations (Verify.pl)
13. Archiving of the results
- the results are archived in zipped tar files (Archive.pl)
14. Cleaning of the data directories
- in the reference system very aggressive cleaning is done (SaniDisk.pl)
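As a summary of the list above, the per-cycle task order looks schematically like this (this is only an illustration of the order described here, not the actual miniSMS definition file):

    Fg -> Span -> VARinput -> VARan -> Postpp -> CMAstat
       -> FCinput -> Prog
       -> Verify.pl -> Archive.pl -> SaniDisk.pl

The date-dependent preparation (MakeCycleInput with Climate, MakeStrategy, Boundaries, Prepob and MakeCMA) has been done before the cycles of that date start.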
Some important features
1. Identify your experiment (EXP)
- normally a three-character identifier for your experiment
- you can run many different experiments at the same time
- within one experiment only one run can be going on at a time
2. Create a Hirlam environment for yourself
- run: hir512 EXP
  - EXP is the Hirlam experiment identification
- this script:
  - runs the script defining the Hirlam system (Env_system)
  - creates a new (t-)shell for you
  - goes to the Hirlam working directory $HL_WD
- hir512 and Env_system are defined when Hirlam is installed on that platform
  - this is normally done by the local Hirlam system manager, at the moment mainly KE
- there can be several Hirlam versions installed at the same time on any platform
  - the environments are created by hirXYZ, where XYZ defines the version: hirXYZ corresponds to Hirlam-X.Y.Z (see the example after this list)
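For illustration, with the naming scheme above the commands could look like this (the experiment names AAA and BBB are only examples):

    hir512 AAA    # Hirlam 5.1.2: runs Env_system and opens a new (t-)shell in $HL_WD
    hir511 BBB    # the same for Hirlam 5.1.1, if that version is installed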
3. Some important (pseudo)variables
3.1 Specific to one platform; normally the user need not modify these and has only read and execute rights to them
- HL_REF_FS - full path of the Hirlam reference system
- HL_REF_CP - contains a copy of the Hirlam reference system
- HL_SCR - contains the start-up scripts of the Hirlam reference system (only a few scripts)
- HL_UTIL - utilities like "mandtg" etc.
- HL_RESOURCES - resources for the Hirlam system
3.2 Specific to one user
- HL_ARC
  - archive directory of the user
  - normally products from experiment EXP are archived in $HL_ARC/$EXP
- hl_hs
  - points to your own modified scripts, which are common to all your experiments
  - these override the reference scripts, but not those specific to one experiment
3.3 Specific to one experiment
- HL_WD
  - "home" directory of one experiment, normally $HOME/$OS/hlxyz
  - $OS is the system (IBMSP, SGI, T3E); hlxyz is for instance hl511 for Hirlam version 5.1.1
  - the subdirectories of HL_WD contain the modifications to scripts and programs specific to this experiment
  - for every library a corresponding subdirectory here contains modifications to programs in that library
  - the subdirectory scripts contains modified scripts specific to this experiment
- HL_DATA (HL_DATA_HOSTx)
  - contains data (forecasts, boundaries, etc.) specific to this experiment
  - the data is normally removed from here when it is no longer needed
  - for a specific cycle the data is in the subdirectory "yyyymmdd_hh", where yyyy is the year, mm the month, dd the day and hh the hour
- HL_LIB
  - directory containing libraries and scripts
  - more permanent than HL_DATA
  - by default equal to HL_DATA
- HL_EXP
  - normally set to $HL_ARC/$EXP
  - the archiving directory of one experiment
- HL_DATA and HL_EXP contain a lot of data, so they may need a lot of disk space (example values are sketched after this list)
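As a purely hypothetical illustration of how these variables fit together, for an experiment AAA on an IBMSP system with Hirlam 5.1.1 the values could look like this (only the relations to $HOME, $HL_DATA and $HL_ARC come from this text; the absolute paths are invented placeholders, the real values are set by Env_system):

    HL_WD=$HOME/IBMSP/hl511             # "home" of the experiment, $HOME/$OS/hlxyz
    HL_DATA=/scratch/myuser/hl511/AAA   # invented placeholder; per-cycle data in e.g. 20011210_00
    HL_LIB=$HL_DATA                     # the default
    HL_ARC=/archive/myuser              # invented placeholder for the user archive
    HL_EXP=$HL_ARC/AAA                  # i.e. $HL_ARC/$EXP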
4. How this hierarchy is achieved
- The file Env_system defines the Hirlam environment (a sketch follows this list)
- The default Env_system is normally under the user id hirlam, and many users can use it unless they want to change the default values
- You must belong to the group hirlam to be able to run Hirlam, because many files are common to all who run Hirlam and have read and execute rights for the group
- Env_system is run "inline" when the Hirlam environment is created or a Hirlam run is started; it
  - sets the variables described above (and many others)
  - modifies the PATH:
    PATH=$PATH_ORIG:$HL_WD/scripts:$hl_hs:$HL_LIB/scripts:$HL_UTIL:$HL_SCR
  - PATH_ORIG is the default path, normally set from PATH while logging in
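A minimal sketch of what an Env_system fragment might look like, assuming Bourne-shell syntax; only the PATH line is taken from this text, the directory values are invented placeholders:

    # platform-specific part, read-only for normal users
    HL_REF_FS=/path/to/hirlam/reference    # invented placeholder
    HL_UTIL=/path/to/hirlam/util           # invented placeholder
    # ... HL_REF_CP, HL_SCR, HL_RESOURCES, HL_WD, HL_DATA, HL_LIB, HL_ARC, HL_EXP ...
    # the PATH modification described above
    PATH=$PATH_ORIG:$HL_WD/scripts:$hl_hs:$HL_LIB/scripts:$HL_UTIL:$HL_SCR
    export PATH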
5. How you define, modify and run an experiment
- Suppose you want to run an experiment called "AAA"
- Everything has a reasonable default, but you want to run your own experiment
5.1 Run: hir512 AAA and you are in $HL_WD
5.2 Define your experiment and perhaps modify some subroutine (a complete example session is sketched after this list)
- mkdir scripts; cd scripts
  - you define a directory for the scripts of this experiment
- Hirlam co scripts Env_input Env_domain Env_expdesc Env_qsub
  - you take the Hirlam default scripts, which define the experiment
  - if you have good defaults in $hl_hs, you can copy them instead
  - don't forget to give execution rights to your modified scripts: chmod u+rwx *
- edit the scripts you want
  - in the same way you can modify any Hirlam script
- Suppose you want to modify the Fortran subroutine GEMINI.f in the library grdy
  - cd $HL_WD; mkdir grdy; cd grdy
    - you create a subdirectory grdy in $HL_WD
  - Hirlam co grdy GEMINI.f; chmod u+rwx GEMINI.f; emacs GEMINI.f
- Now you can start the experiment:
  Hirlam start DTG=2001121000 DTGEND=2001121018
- Hirlam is run as background processes or as separate batch jobs depending on the platform
  - versions 5.0.0 and earlier are run as a single job
  - versions 5.1.0 and later use miniSMS to control the job
    - several parts of Hirlam are run at the same time
    - they are controlled by miniSMS
    - the syntax of the miniSMS definition file is similar to that of SMS
- The logfile(s) can be found in $HL_DATA
  - a single file for a non-SMS run
  - several files in HTML format for SMS runs
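Putting the pieces of this section together, a complete session for experiment AAA could look like this (only commands that appear above are used, with the example dates from this text):

    hir512 AAA                       # create the environment; you end up in $HL_WD
    mkdir scripts; cd scripts
    Hirlam co scripts Env_input Env_domain Env_expdesc Env_qsub
    chmod u+rwx *                    # give execution rights to the copied scripts
    # ... edit the Env_ files ...
    cd $HL_WD; mkdir grdy; cd grdy
    Hirlam co grdy GEMINI.f; chmod u+rwx GEMINI.f; emacs GEMINI.f
    cd $HL_WD
    Hirlam start DTG=2001121000 DTGEND=2001121018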
Running Hirlam
Preparations needed when running Hirlam 5.1.2 on Metis for the first time:
- Run the script /home/met/hirlam/bin/hir_create
  - it creates the script ~/bin/hir512
  - it creates the directory $hl_hs, which contains the locally modified scripts common to all FMI users
- Note that hir512 writes into $HL_WD a file hl_vn containing 5.1.2. This is needed to run the beta version 5.1.2; otherwise the latest reference 5.1.0 would be run
Now you can run experiments:
hir512 ZZZ
Hirlam start DTG=2001121000 DTGEND=2001121018 LL=48
Examples of a few typical Env_ files are in the directory Examples