


This page is a guide for building and using OpenIFS at model version 48r1.1.

We describe how to build the model and carry out a few basic tests, how to run an example forecast experiment for which initial data is provided, and how to visualise the model output by creating plots.

Worked Example:

We explain the model installation and the process of running a forecast experiment on the ECMWF Atos Sequana XH2000 HPC facility (hpc2020).

  • All data files and configuration scripts that are required for the worked example experiment described in this guide can be found on the ECMWF file system here:  /perm/openifs/oifs_data/48r1/example
  • For users without access to the ECMWF HPC facility we provide URLs for downloading all required data files and configuration scripts.
  • The model will be installed into $HOME/openifs-48r1.1/
  • The example forecast experiment will be set up in directory $OIFS_EXPT/ab2a/

It is important to note that the installation process on hpc2020 will not directly translate to alternative systems. To address this, we present details of a Docker installation in the last section of this guide.

Access requirements

OpenIFS is licensed software, and using the model requires that you are affiliated with an institute which has signed the ECMWF OpenIFS software licence agreement.

In order to access the model sources and other required data packages, you need a personal ECMWF account, which you can create on the ECMWF web site.

If you are a new model user you should first contact OpenIFS support (by emailing openifs-support@ecmwf.int) to request access to OpenIFS, providing the following information: your full name, your affiliation and related institutional email address, and your ECMWF account user name. OpenIFS staff will then activate the OpenIFS user policies for your personal ECMWF account. In some cases we may need to request additional verification of your affiliation with your licensed institute before giving you access.

Extract the OpenIFS package

Create your local installation of OpenIFS 48R1.1 by downloading and extracting the tarball containing the model source package.

The tarball can be downloaded from this web site: https://sites.ecmwf.int/openifs/openifs-data/src/48r1/openifs-48r1.tar.gz. Please note that access to this web site is restricted to registered OpenIFS users, i.e. your personal ECMWF account must have been added to the OpenIFS user policy.

In the example below we assume that the OpenIFS model will be installed in the user's HOME directory, and that the wget utility is available for convenient download:

cd $HOME
wget https://sites.ecmwf.int/openifs/openifs-data/src/48r1/openifs-48r1.tar.gz
tar -xvzf openifs-48r1.tar.gz
  • You will need approximately 4 GB of disk space for the model sources, the bundle packages, and the built model binaries. 
  • Please note that in future access to the OpenIFS source package will be provided through a git repository. Therefore the provision of the package as a tarball is only a temporary arrangement. 

Set up the platform configuration file

The OpenIFS model requires a small number of Linux environment variables to be set for both installation and runs. These are referred to as global environment variables.

The most important environment variable is OIFS_HOME, which is required by all scripts used by the model. 

OIFS_HOME describes the location of the OpenIFS model installation and is generally the path where the source package was extracted (or the git repository was cloned). For example, if you place the model in your $HOME directory then you should set:

export OIFS_HOME=$HOME/openifs-48r1.1/

The other model environment variables are

  • OIFS_CYCLE - describes the model cycle (e.g. 48r1) for which this configuration file can be used.
  • OIFS_DATA_DIR - describes the location of climatological input files that are required to run OpenIFS. These have been installed on the ECMWF hpc2020 in a central and accessible location under /perm/openifs/oifs_data and the information is organised by model cycle. 

If you do not have access to the ECMWF hpc2020 file system, or if you wish to install the climatological input files in a local directory of your choice, then you can download the required data from this site: https://sites.ecmwf.int/openifs/openifs-data/ifsdata/48r1/

As a minimum you will require the packages ifsdata_rtables_48r1.tgz, ifsdata_climatology_48r1.tgz, and the files required for your selected horizontal grid resolution, which in the case of our example is ifsdata_48r1_climate.v020_255.tgz.

Download and extract these files to a location of your choice, then set the variable OIFS_DATA_DIR to that path.
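For a local installation outside the ECMWF file system, the download and extraction might look like the following sketch; the target directory is only an example, and it assumes the packages are available directly under the URL given above:

# example: install the climatology packages in a local directory (path is illustrative)
mkdir -p $HOME/oifs_data/48r1 && cd $HOME/oifs_data/48r1
for f in ifsdata_rtables_48r1.tgz ifsdata_climatology_48r1.tgz ifsdata_48r1_climate.v020_255.tgz; do
    wget https://sites.ecmwf.int/openifs/openifs-data/ifsdata/48r1/$f
    tar -xvzf $f
done
# then point OIFS_DATA_DIR at this location in the platform configuration file
export OIFS_DATA_DIR=$HOME/oifs_data/48r1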

The required global environment variables, described above, are defined in the Platform configuration file, which needs to be modified for your local OpenIFS installation. 

  • This file can be located anywhere on your file system; however, the recommended default location is $OIFS_HOME.
  • We provide a template for this configuration file in $OIFS_HOME/oifs-config.edit_me.sh.
  • You should edit this file and update the path set in variable OIFS_HOME with your installation's path.

Once edited, the platform configuration file is loaded using the following command:

source /path/to/file/location/oifs-config.edit_me.sh
# using our installation example:
# source $HOME/openifs-48r1.1/oifs-config.edit_me.sh

We recommend including this command in your Linux shell startup configuration (e.g. in .bashrc).
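For example, to have the configuration loaded automatically in new shells, the source command can be appended to .bashrc (the path shown is the worked-example installation path; adjust it to your own):

echo 'source $HOME/openifs-48r1.1/oifs-config.edit_me.sh' >> ~/.bashrc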

The above command should also be included in any batch job scripts that are intended to run OpenIFS (described in Section 3).

Previous versions of the OpenIFS model also used a platform configuration file. Since OpenIFS 48r1 the number of variables set inside this file has been reduced to a bare minimum. At present, only the variables OIFS_HOME, OIFS_CYCLE, and OIFS_DATA_DIR need to be set in this file.
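As an illustration, an edited platform configuration file for the worked example might contain little more than the following exports; treat the paths as placeholders for your own system, and note that the optional OIFS_EXPT line is an assumption based on how the experiment root is used later in this guide:

# oifs-config.edit_me.sh (illustrative excerpt)
export OIFS_HOME=$HOME/openifs-48r1.1          # root of the OpenIFS installation
export OIFS_CYCLE=48r1                         # model cycle this configuration applies to
export OIFS_DATA_DIR=/perm/openifs/oifs_data   # climatological input files (organised by model cycle)
# optional: root directory for forecast experiments, referenced later in this guide
# export OIFS_EXPT=$HOME/oifs_experiments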

Build OpenIFS

In the next step the model binary executable (and other helper programs) will be built.

OpenIFS build system

In contrast to earlier model versions, the build of OpenIFS 48r1 is no longer based on the FCM configuration manager but now uses ecbuild, the ECMWF build system that is also used for the ECMWF IFS model and which uses CMake at its core.

OpenIFS 48r1 is also distributed with a software bundle which automatically installs many required software packages during the build process, for instance ecbuild, ecCodes, and metkit. A separate installation of these libraries is therefore no longer required, as they are now part of the OpenIFS distribution.

Starting the build process

The  $OIFS_HOME/scripts/openifs-test.sh  script can be used to build the model and run initial tests. 

  • The script requires key environment variables, such as $OIFS_HOME, to be assigned. Make sure you have sourced the platform configuration script first.
  • The usage of the script is shown with the command:  openifs-test.sh -h
  • The option -e defines the compiler environment (intel or gnu). The default is intel. 

Run the build process and the tests using the following command:

cd $OIFS_HOME
./scripts/openifs-test.sh -cbt

where

  • -c cleans up the directory, i.e. removes any existing build, source, and ecbundle directories
  • -b builds
    • the OpenIFS double and single precision master executables (ifsMASTER.DP and ifsMASTER.SP, respectively), which are used to run 3-D OpenIFS. The executables are located in $OIFS_HOME/build/bin.
    • the double and single precision Single Column Model (SCM) executables (MASTER_scm.DP and MASTER_scm.SP, respectively), which are used to run the SCM derived from OpenIFS. The executables are located in $OIFS_HOME/build/bin.
  • -t will run the ifs-test t21 tests, which comprise
    • 21 3-D OpenIFS forecast-only tests with and without chemistry
    • 1 SCM test (based on TWP-ICE)

By default on hpc2020, OpenIFS will be built using the Intel compiler (Intel 2021.2). OpenIFS will also build with GNU; if this is required, the user should execute the following command:

./scripts/openifs-test.sh -cbt -e gnu 

This will load the compiler environment for GNU GCC 11.2.

If everything has worked correctly then all tests should have passed and the script returns the following output:

[INFO]: Good news - ctest has passed
        openifs is ready for experiment and SCM testing
----------------------------------------------------------------
END ifstest on OpenIFS build

Set up a forecast experiment

An example forecast experiment has been prepared for OpenIFS 48r1. The experiment ID is ab2a.

Extract the example forecast experiment ab2a.tar.gz into a folder in a location suitable for model experiments. This folder will be your experiment directory and its root path should correspond with the variable OIFS_EXPT in the platform configuration file. 

Example: 
On the ECMWF hpc2020 our model installation $OIFS_HOME is in $HOME/openifs-48r1.1, and for the experiment we extract the ab2a package to $OIFS_EXPT. The experiment directory is therefore $OIFS_EXPT/ab2a/2016092500.

wget -P $OIFS_EXPT https://sites.ecmwf.int/openifs/openifs-data/case_studies/48r1/karl/ab2a.tar.gz
cd $OIFS_EXPT
tar -xvzf ab2a.tar.gz

The experiment directory would ideally be in a different location from the model installation path $OIFS_HOME. In general, you will need more disk space for experiments, depending on the model grid resolution, the duration of the forecast experiment and the output frequency of model results.

Ensure the namelist files for the atmospheric model (fort.4) and for the wave model (wam_namelist) are found in the experiment directory.  If they are not already there then you can find them in a subfolder (called ecmwf) inside the experiment directory.

cd $OIFS_EXPT/ab2a/2016092500
cp ./ecmwf/fort.4 .
cp ./ecmwf/wam_namelist .

You will need to copy three further scripts from the OpenIFS package into your experiment directory:

  • oifs-run: this is a generic run script which executes the binary model program file.
  • exp-config.h: this is the experiment configuration file that determines settings for your experiment. It will be read by oifs-run. 
  • run.ecmwf-hpc2020.job: this is the wrapper script to submit non-interactive jobs on hpc2020.

Copy these files from $OIFS_HOME/scripts/exp_3d into your experiment directory:

cd $OIFS_EXPT/ab2a/2016092500
cp $OIFS_HOME/scripts/exp_3d/oifs-run .
cp $OIFS_HOME/scripts/exp_3d/exp-config.h .
cp $OIFS_HOME/scripts/exp_3d/run.ecmwf-hpc2020.job .

Determine experiment parameters

Namelist:

  • You can edit the atmospheric model namelist file fort.4. It contains Fortran namelists which control model settings and switches.
  • An important switch to edit is the variable CSTOP in namelist NAMRIP. Set this to the desired length of the forecast experiment.
  • Experiment ab2a can be run for up to 144 hours (6 days) by setting CSTOP='h144'.
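For example, the forecast length can be changed in an editor or with a one-line substitution, as in the sketch below (it assumes CSTOP is the only entry on its line in fort.4; verify the result before running):

cd $OIFS_EXPT/ab2a/2016092500
# set the forecast length in namelist NAMRIP to 144 hours (6 days)
sed -i "s/CSTOP=.*/CSTOP='h144',/" fort.4
grep CSTOP fort.4    # verify the change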

Experiment configuration file: 

  • You can edit the exp-config.h file which determines settings for this experiment.
  • The oifs-run script will read the settings from this file.
  • Alternatively, the settings can be passed to the oifs-run script via command line parameters, which take precedence over the exp-config.h settings.

You should always set up an exp-config.h for each experiment. If no exp-config.h file is found in the experiment directory, and no command line parameters are provided when calling oifs-run, then oifs-run will revert to its own default values, which may not be appropriate for your experiment. In any case you should either edit the exp-config.h file appropriately or provide the correct command line parameters.

The exp-config.h file contains the following settings:

exp-config.h
#--- required variables for this experiment:

OIFS_EXPID="ab2a"       # your experiment ID
OIFS_RES="255"          # the spectral grid resolution (here: T255)
OIFS_GRIDTYPE="l"       # the grid type, either 'l' for linear reduced grid, or 'o' for the cubic octahedral grid
OIFS_NPROC=8            # the number of MPI tasks
OIFS_NTHREAD=4          # the number of OpenMP threads
OIFS_PPROC=true         # enable postprocessing of model output after the model run
OUTPUT_ROOT=$(pwd)      # folder where pproc output is created (only used if OIFS_PPROC=true). In this case an output folder is created in the experiment directory.
LFORCE=true             # overwrite existing symbolic links in the experiment directory
LAUNCH=""               # the platform specific run command for the MPI environment (e.g. "mpirun", "srun", etc).

#--- optional variables that can be set for this experiment:

#OIFS_NAMELIST='my-fort.4'               # custom atmospheric model namelist file
#OIFS_EXEC="<custom-path>/ifsMASTER.DP"  # model exec to be used for this experiment

Order of precedence for how OpenIFS evaluates variables:


  1. exp-config.h:  These variables have the highest precedence and are used for the experiment (best practice is to use this file).
    Example: Here you are setting the experiment ID, parameters for the model grid and for parallel execution of this specific experiment. Each experiment directory should contain its own exp-config.h file. 

  2. oifs-run:  If no exp-config.h is found, and if no command-line parameters are provided when calling oifs-run, then the default settings found inside oifs-run are used instead (This is not recommended! Use an exp-config.h file instead). 
    Example: For some variables the defaults are usually fine. For instance, you do not need to specify the namelist file 'fort.4' in exp-config.h, because oifs-run will use this file name as a default value.

  3. oifs-config.edit_me.sh:  This file contains global configuration settings required for the correct functioning of OpenIFS, and therefore it always needs to be sourced first. However, it does not contain variables that are specific to a forecast experiment, and any variables declared in either exp-config.h or oifs-run will overwrite their settings from this global configuration file.
    Example: In your global configuration file you may have set the double precision variable as your standard model executable. If you wish to use single precision for a specific experiment, then you can set OIFS_EXEC in exp-config.h to the SP binary executable which will overwrite the global setting for this experiment.
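For example, to run this experiment in single precision while keeping the global default untouched, exp-config.h could set the executable explicitly; the path below follows the build location described earlier and should be adjusted to your installation:

# in exp-config.h: use the single-precision executable for this experiment only
OIFS_EXEC="$OIFS_HOME/build/bin/ifsMASTER.SP"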


Running the experiment

After all optional edits to the namelists (fort.4) and to the experiment configuration file (exp-config.h) have been completed, the model run can be started.

Depending on the available hardware, experiments can be run either interactively or as a batch job.

Running a batch job

This method is the preferred way to run OpenIFS, as it is more efficient and it allows more flexibility in using the available hardware resources. 

  • A job wrapper script that is suitable for the locally available batch scheduler needs to be used to call oifs-run.
  • We include an example job wrapper script run.ecmwf-hpc2020.job in $OIFS_HOME/scripts, which is suitable for the ECMWF hpc2020 Atos HPC. This uses the SLURM batch job scheduler.
    • In Section 3 this script was copied to the experiment directory, where it needs to be located in order to run an experiment.
  • run.ecmwf-hpc2020.job needs to be edited with the following essential and optional changes:
    • Initially, run.ecmwf-hpc2020.job sets the PLATFORM_CFG variable as follows:
# set OpenIFS platform environment:
PLATFORM_CFG="/path/to/your/config/oifs-config.edit_me.sh"
    • It is important to change "/path/to/your/config/oifs-config.edit_me.sh" to the actual path for the oifs-config.edit_me.sh, e.g., "$HOME/openifs-48r1.1/oifs-config.edit_me.sh"
    • The default resources requested in run.ecmwf-hpc2020.job are 8 nodes on the ECMWF hpc2020 machine, with a total of 256 MPI tasks and 4 OpenMP threads. This can be changed as required (an illustrative header sketch is shown after this list).
    • For information, the LAUNCH command for batch job submission is set to "srun" without any further options, because all required parallel environment settings are provided through the SLURM script headers.
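The resource-related SLURM directives at the top of run.ecmwf-hpc2020.job might look roughly like the following sketch, matching the default 8 nodes, 256 MPI tasks and 4 OpenMP threads; the directives in the provided script may differ, so treat this as illustrative only:

# illustrative SLURM header sketch (check the actual script before editing)
#SBATCH --job-name=oifs-ab2a
#SBATCH --nodes=8
#SBATCH --ntasks-per-node=32   # 8 nodes x 32 tasks = 256 MPI tasks in total
#SBATCH --cpus-per-task=4      # 4 OpenMP threads per MPI task
#SBATCH --time=06:00:00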

Once you have made the appropriate changes to run.ecmwf-hpc2020.job, you can submit it, and hence run the experiment, with the following commands:

# run as batch job:
cd $OIFS_EXPT/ab2a/2016092500
sbatch ./run.ecmwf-hpc2020.job

The job wrapper script will read the exp-config.h file and adopt the selected values. The exceptions are LAUNCH, which is set to "srun" for batch jobs, and OIFS_NPROC & OIFS_NTHREAD for which values from the batch job headers are used. The job wrapper script modifies the exp-config.h file accordingly prior to calling the oifs-run script.
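While the job is running, progress can be checked with standard SLURM commands and, assuming the model writes its ifs.stat log in the experiment directory during the run, by following that log:

squeue -u $USER                               # check the batch job status
tail -f $OIFS_EXPT/ab2a/2016092500/ifs.stat   # follow the per-timestep model log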

Running interactively

On the ECMWF hpc2020, running the model script interactively should be fine for lower grid resolutions up to T255L91. 

  • In order to run the experiment interactively, execute the oifs-run script from the command line in your terminal.
  • If no command line parameters are provided with the oifs-run command, then the values from the exp-config.h will be used.
  • In exp-config.h set OIFS_NPROC=8 and OIFS_NTHREAD=4.
  • In exp-config.h the LAUNCH variable should remain empty, i.e. LAUNCH="" and no --runcmd parameter should be provided in the command line.

The oifs-run script will in this case use its default launch parameters:  srun -c${OIFS_NPROC} --mem=64GB --time=60  which will work fine with OIFS_NPROC=8 for experiment ab2a. 

# run interactively:
cd $OIFS_EXPT/ab2a/2016092500
./oifs-run

Postprocessing

If in the exp-config.h file the OIFS_PPROC variable has been set to true (or if the --pproc command line parameter was used) then the model output in the experiment directory is further processed after completing the model run.

  • In this case the script will generate a folder called output_YYYYMMDD_HHMMSS, with YYYYMMDD being the current date and HHMMSS the current time.
  • This avoids accidental modification or overwriting of any previous results when the model experiment is repeated.
  • The variable OUTPUT_ROOT in exp-config.h determines where this output folder will be created. The default location is inside the experiment directory, but by assigning another path to OUTPUT_ROOT it can be created elsewhere.

The postprocessing groups all model output fields and diagnostics into individual GRIB files with ascending forecast time step. Also, a copy of the atmospheric model namelist file fort.4, as well as the ifs.stat and NODE.01_001 log files are moved into the output folder.
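A quick way to inspect the postprocessed results is to list the output folder; the exact folder name depends on the date and time of the run:

cd $OIFS_EXPT/ab2a/2016092500
ls output_*/    # grouped GRIB files plus fort.4, ifs.stat and NODE.01_001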

This postprocessing is required if the Metview Python script is to be used later to visualise the model output.

Plotting of model output

Here we briefly describe how a small number of plots can be generated from the model results. This permits a first-order sanity check of whether the model results look sensible.

For this we use the Metview graphics package developed at ECMWF. 

This requires the use of Jupyter Notebooks in a conda environment that provides the Metview and Metview-Python libraries.

On the ECMWF hpc2020 a Jupyterlab session can be started using the command   ecinteractive -j 

Step 1:  Copy the Metview processing code to your $PERM location:

cp /perm/openifs/oifs_data/48r1/example/mv.tgz $PERM
cd $PERM
tar -xvzf mv.tgz
cd mv

In the following steps we will process the OpenIFS model output into a dataset format that can be easily interpreted by Metview using a simplified plotting procedure.

Step 2:  Edit the file oifs_to_mv.sh and change the path variable:

  • in_dir:  This needs to point to the output_YYYYMMDD... folder where the postprocessed OpenIFS model experiment (from the previous section) is found. Note that an absolute file path is required for this variable.
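A hedged example of the edited line (the path below is a placeholder; substitute the absolute path of your own output folder):

# in oifs_to_mv.sh
in_dir=/path/to/experiments/ab2a/2016092500/output_YYYYMMDD_HHMMSS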

Step 3:  Execute the script by running the command: 

cd $PERM/mv
./oifs_to_mv.sh
  • This data processing may take a couple of minutes to complete. 
  • Occasionally the message "ERROR:  input file does not exist!" may occur; this can be safely ignored. It happens when the script attempts to convert model output which was not generated by OpenIFS. The script will not fail but simply carry on looking for the next file.
  • After successfully completing the conversion process, "Done." should appear on the terminal.
  • As a result of this processing, regular gridded and compressed GRIB files are generated in $PERM/mv/data/ab2a, which can be visualised by running the enclosed Jupyter Notebook single.ipynb.

Step 4:  Now proceed with the following steps to visualise the processed data:

  • On the ECMWF Virtual Desktop Interface (VDI) open a terminal, log into the hpc2020 with command:  ssh hpc-login
  • In the terminal start the Jupyter session on an interactive node, using the command:  ecinteractive -j
  • After the interactive node has started you will be given a weblink to connect to the Jupyterlab session ("To manually re-attach go to <weblink>").
  • Open a web browser (e.g. Chrome) inside the VDI and paste the weblink into the browser's URL address field; this will connect to the Jupyter session.
  • In the file explorer, on the left side of the Jupyter window, navigate to the folder $PERM/mv/ipynb/ and select Notebook  single.ipynb
  • Open this Notebook by double-clicking in the explorer window.
  • Once it has opened, run all its cells in sequence (e.g. use the command "Run All Cells" in menu "Run").
  • This will generate a series of plots from the model output which are displayed inside the Notebook. 
  • Optional:  After completing the Jupyter session it is good practice to release the reserved interactive node using this command in the terminal window:  ecinteractive -p hpc -k   and confirm cancellation of the job; if this is not done the interactive job will time out after 12 hours.

Requirements

This section provides further details about software requirements for OpenIFS.

This is not needed for the ECMWF hpc2020.

