WRF module available on Atos


There is a WRF model already installed on Atos. The module is available only for the Intel programming environment and Mellanox Open MPI, because this combination has given the best performance. Consequently, the module is only available once the following pre-requisite modules are loaded:

Code Block
languagebash
titlepre-requisite modules
$> module load prgenv/intel hpcx-openmpi
$> module load wrf

To see all pre-requisite conditions for WRF, one can use the module spider command:

Code Block
languagebash
titlemodule spider
$>module spider wrf


If you need any additional versions installed, or a build under another programming environment, please contact us via the Support Portal.

WRF Example


To set up a working directory for running WPS and WRF from the public install, use the WRF utility script called build_wrf_workdir. This script creates the appropriate folder structure for running WPS and WRF under your $PERM folder:

Code Block
languagebash
$> module load wrf
$> build_wrf_workdir
$> cd $PERM/wrf
$> ls
run_IFS_4km

Loading the module makes both WPS and WRF available.

A sample submission script, together with the model configuration for a simple case, is available in $PERM/wrf/run_IFS_4km/. run_wrf.sh is a self-contained script that runs the test case for April 2019 using IFS boundary conditions. In addition, the directory contains the following files (a sketch of one of the batch scripts follows the list):

namelist.wps.in       WPS namelist template
namelist.input.in     WRF namelist template
run_wps.sbatch.in     script for submitting ungrib, geogrid, and metgrid using 32 tasks
run_real.sbatch.in    script for submitting real.exe using 128 tasks
run_wrf.sbatch.in     script for submitting wrf.exe using 256 tasks
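
For illustration, a minimal sketch of what a wrf.exe batch script such as run_wrf.sbatch.in might contain is shown below. The QoS name, wall-clock limit and output file name are assumptions; the template shipped in $PERM/wrf/run_IFS_4km/ is the authoritative version, and run_wrf.sh fills in its placeholders for you.

Code Block
languagebash
titlesketch of a wrf.exe batch script (illustrative)
#!/bin/bash
# Illustrative only: QoS, wall time and file names are assumptions,
# not the contents of the actual run_wrf.sbatch.in template.
#SBATCH --job-name=wrf
#SBATCH --qos=np
#SBATCH --ntasks=256
#SBATCH --time=01:00:00
#SBATCH --output=wrf.%j.out

module load prgenv/intel hpcx-openmpi wrf

srun ./wrf.exe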


To run the sample:

Code Block
languagebash
$> cd $PERM/wrf/run_IFS_4km/
$> ./run_wrf.sh

This script creates a model run directory in $SCRATCH/wrf/run/, changes into it, copies all required input data, creates the necessary links, and executes all model components one by one.
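
Conceptually, the script performs a sequence of steps similar to the sketch below. The exact commands, file names and boundary-condition paths are assumptions; run_wrf.sh itself is the reference.

Code Block
languagebash
titlesketch of the steps performed by run_wrf.sh (illustrative)
# Illustrative only: the actual commands and file names live in run_wrf.sh.
mkdir -p $SCRATCH/wrf/run && cd $SCRATCH/wrf/run

# copy namelists and batch scripts, link the boundary condition data
cp $PERM/wrf/run_IFS_4km/namelist.* .
cp $PERM/wrf/run_IFS_4km/*.sbatch* .
ln -sf /path/to/IFS/boundary/data .      # placeholder path

# run the model components one by one
sbatch --wait run_wps.sbatch     # ungrib, geogrid, metgrid
sbatch --wait run_real.sbatch    # real.exe
sbatch --wait run_wrf.sbatch     # wrf.exe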

Please note: the build_wrf_workdir script should be executed only the first time you load a specific version of the module, to create the WRF structure in your $PERM directory. Every time you execute it, it will overwrite the existing $PERM/wrf/run_IFS_4km/ structure and move the existing directory:

From:           To:
run_IFS_4km/    run_IFS_4km_old/

After it has been executed once, you only need to load the module to run the model.

Geogrid Data


Geogrid data for various resolutions (30", 2', 5', and 10') is currently available in /ec/res4/hpcperm/usbk/geog.
The 'geog_data_path' variable in WPS's namelist.wps has already been configured to use this geogrid data. Please note that this location could be unavailable for 3-4 hours per year during system sessions. Consequently, any operational or "Time Critical" work should not rely on it.
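
For reference, the corresponding entry in the &geogrid section of namelist.wps points at this location (other &geogrid settings are omitted here):

Code Block
languagebash
titlenamelist.wps (fragment)
&geogrid
 geog_data_path = '/ec/res4/hpcperm/usbk/geog'
/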

Boundary Conditions


Boundary conditions from IFS HiRes are provided for 30/04/2019 at 00 UTC plus 6 forecast hours, for a domain covering most of Europe. These BC files are already linked inside the run_wrf.sh script. For other geographical regions, BCs can be downloaded from the publicly available FTP server:

ftp://ftp.ecmwf.int/pub/wrf

For more information, see: ECMWF WRF Test Data
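
As a sketch, the data can be retrieved from the FTP server with a standard client; the subdirectory and file names below are placeholders, not actual paths on the server.

Code Block
languagebash
titledownloading boundary conditions (illustrative)
# List the contents of the public FTP area (curl prints the directory listing)
curl ftp://ftp.ecmwf.int/pub/wrf/

# Download a specific file; <subdir>/<file> are placeholders - check the listing first
wget "ftp://ftp.ecmwf.int/pub/wrf/<subdir>/<file>"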

How to install your own version of WRF on Atos

Intel compilers & Open MPI

To install WRF with Intel compilers, the following modules need to be pre-loaded:

Code Block
languagebash
titlemodules
$>module load prgenv/intel netcdf4 hpcx-openmpi jasper/2.0.14
$>module list

Currently Loaded Modules:
  1) intel/19.1.2   2) prgenv/intel   3) netcdf4/4.7.4   4) hpcx-openmpi   5) jasper/2.0.14
 

WRF needs to be pointed to the NetCDF installation manually via the NETCDF environment variable:

Code Block
languagebash
export NETCDF=$NETCDF4_DIR

In general, on Atos we use the -rpath option to link shared libraries. However, this is difficult to use with WRF because of the structure of its installation scripts, which use the NETCDF variable to link to the NetCDF libraries. Consequently, in the run script we need to export the NetCDF library path:

Code Block
languagebash
titleWRF running script
module load netcdf4
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$NETCDF4_DIR/lib


Compilation should be submitted as a batch job; an example is given in the WRF module:

Code Block
languagebash
titleexample
$WRFPATH/WRF/compile.sh
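
For orientation, a batch job for compiling WRF could look roughly like the sketch below. The QoS, CPU count, wall-clock limit and build directory are assumptions; the script at $WRFPATH/WRF/compile.sh is the reference.

Code Block
languagebash
titlesketch of a WRF compilation batch job (illustrative)
#!/bin/bash
# Illustrative only: queue, resources and paths are assumptions,
# not the contents of $WRFPATH/WRF/compile.sh.
#SBATCH --job-name=compile_wrf
#SBATCH --qos=nf
#SBATCH --cpus-per-task=8
#SBATCH --time=03:00:00
#SBATCH --output=compile_wrf.%j.out

module load prgenv/intel netcdf4 hpcx-openmpi jasper/2.0.14
export NETCDF=$NETCDF4_DIR

cd $PERM/WRF                     # your own WRF source tree
# assumes ./configure has already been run interactively
./compile -j 4 em_real &> compile.log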


WPS: 

In modules such as netcdf4 on Atos, libraries are linked using environment variables such as:

Code Block
languagebash
setenv("NETCDF4_LIB","-L/usr/local/apps/netcdf4/4.7.4/INTEL/19.1/lib -Wl,-rpath,/usr/local/apps/netcdf4/4.7.4/INTEL/19.1/lib -lnetcdff -lnetcdf_c++ -lnetcdf")

...

Code Block
languagebash
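# configure.wps fragment: the hard-coded NetCDF and compression paths are
# commented out and replaced with the variables provided by the modules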
#                       -I$(NETCDF)/include
                        $(NETCDF4_INCLUDE)

#                       -L$(NETCDF)/lib -lnetcdff -lnetcdf
                        $(NETCDF4_LIB)

#COMPRESSION_LIBS    = -L/glade/u/home/wrfhelp/UNGRIB_LIBRARIES/lib -ljasper -lpng -lz
#COMPRESSION_INC     = -I/glade/u/home/wrfhelp/UNGRIB_LIBRARIES/include

COMPRESSION_LIBS    = $(JASPER_LIB) \
                      -L/usr/lib64 -lpng -lz
COMPRESSION_INC     = $(JASPER_INCLUDE) \
                      -I/usr/include

...

An example compilation job is also available in the WRF module:

Code Block
languagebash
$WRFPATH/WPS-master/compile.sh

Everything else should be done following the official WRF user guide.

Intel compilers & Intel MPI

There are only a few differences in the compilation process with Intel MPI:

...

Code Block
languagebash
titlemodules
$>module list

Currently Loaded Modules:
  1) intel/19.1.2   2) prgenv/intel   3) netcdf4/4.7.4   4) jasper/2.0.14   5) intel-mpi/19.1.2


  • In configure.wrf and configure.wps, make the following settings:

...