
WRF modules available on CCA/CCB

There are a few WRF versions available on CC{A/B}. Use module avail wrf to see the available versions. If you need any additional versions installed, please email servicedesk@ecmwf.int.

> module avail wrf 

-------------------------------------- /usr/local/apps/modulefiles/tools_and_libraries/utilities -------------------------------------- 
wrf/3.9.1.1          wrf/4.0.3(default)  wrf/4.1.5

The listed versions are available for the Intel Programming Environment only. Since PrgEnv-cray is loaded by default at login, you have to switch to Intel in order to load the wrf module:

prgenvswitchto intel
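
For example, to switch the programming environment and load one of the versions listed above in a single session (the version number below is only an illustration, pick any of those shown by module avail wrf):

$> prgenvswitchto intel
$> module load wrf/4.1.5
$> module list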

WRF Example

To set up a working directory for running WPS, WRF and UPP from the public install, use the WRF utility script called build_wrf_workdir. This script creates the appropriate folder structure for running WPS and WRF under your $PERM folder:

$> module load wrf
$> build_wrf_workdir
$> cd $PERM/WRF
$> ls
run_IFS_4km

This will load WPS, WRF including chemistry, and UPP.

A sample submission script, together with the model configuration for a simple case, is available in $PERM/WRF/run_IFS_4km/. run_wrf.sh is a self-contained script that runs the test case for April 2019 using IFS boundary conditions. Apart from it, the directory also contains:

namelist.wps.in       WPS namelist template
namelist.input.in     WRF namelist template
run_wps.qsub.in       script for submitting ungrib, geogrid, and metgrid using 1 node
run_real.qsub.in      script for submitting real.exe using 1 node
run_wrf.qsub.in       script for submitting wrf.exe using 7 nodes


To run the sample:

$> cd $PERM/WRF/run_IFS_4km/
$> ./run_wrf.sh

This script creates the model run directory in $SCRATCH/WRF/run/, changes into it, copies all required input data, creates the links, and executes all model components one by one.
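
While the script is running, the progress of wrf.exe can be followed in the standard WRF log file written by MPI rank 0 to the run directory (the path below assumes the $SCRATCH/WRF/run/ location mentioned above):

$> cd $SCRATCH/WRF/run/
$> tail -f rsl.out.0000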

Please note: the build_wrf_workdir script should be executed only the first time you load a specific version of the module, to create the WRF structure in your $PERM directory. Every time you execute it, it will overwrite the existing $PERM/WRF/run_IFS_4km/ structure by moving the existing directory aside:

From:            To:
run_IFS_4km/     run_IFS_4km_old/

After it has been executed once, you only need to load the module to run the model.
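
In other words, a later rerun of the sample case only needs:

$> module load wrf
$> cd $PERM/WRF/run_IFS_4km/
$> ./run_wrf.sh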

Geogrid Data

Geogrid data for several resolutions (30", 2', 5', and 10') is available in /fws2/wrf_geog/geog/. The geog_data_path variable in WPS's namelist.wps has already been configured to use this geogrid data. Please note that this location could be unavailable for 3-4 hours per year during system sessions. Consequently, any operational or "Time Critical" work should not be based on it.
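
For reference, this corresponds to the following entry in the &geogrid section of namelist.wps (a minimal excerpt; the remaining &geogrid settings are omitted):

&geogrid
 geog_data_path = '/fws2/wrf_geog/geog/'
/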

Boundary Conditions

Boundary conditions from IFS HiRes are provided for 30/04/2019 at 00 UTC, up to +6 hours ahead, for a domain covering most of Europe. These BC files are already linked inside the run_wrf.sh script. For other geographical regions, BCs can be downloaded from the publicly available FTP server:

ftp://ftp.ecmwf.int/pub/wrf
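
The test data can be fetched with any standard FTP client or with wget; the file name below is only a placeholder, check the server listing for the actual files:

$> wget "ftp://ftp.ecmwf.int/pub/wrf/<file>"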

For more information see: ECMWF WRF Test Data

How to install your own version of WRF on CCA[B]

Several versions of the WRF model have been installed and tested on CCA[B] using the Cray and Intel compilers. Model efficiency is very similar with the two options, while compilation is somewhat faster with Intel.

  • To install it using the Cray compiler (default), follow the directions in the official WRF user guide. Prior to installation, the NETCDF variable has to be set:

    $> module load cray-netcdf
    $> export NETCDF=$NETCDF_DIR

    Configuration option "Cray XE and XC CLE/Linux x86_64, Cray CCE compiler  (dmpar)should be selected.

  • To install it using the Intel compiler, the programming environment first has to be changed from Cray (default) to Intel, and the NETCDF variable has to be set:

    $> prgenvswitchto intel
    $> module load cray-netcdf
    $> export NETCDF=$NETCDF_DIR

    Configuration option "Linux x86_64, ifort compiler with icc (dmpar)" should be selected.

After that, the following changes should be made to configure.wps:

Before               After
DM_FC = mpif90       DM_FC = ftn
DM_CC = mpicc        DM_CC = cc
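
If you prefer to apply these edits non-interactively, a minimal sketch (assuming the DM_FC and DM_CC assignments start at the beginning of their lines in configure.wps) is:

$> sed -i 's/^DM_FC.*/DM_FC = ftn/' configure.wps
$> sed -i 's/^DM_CC.*/DM_CC = cc/' configure.wps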

Everything else should be done following the official WRF user guide.
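
For reference, the build itself then follows the standard steps from the WRF user guide, run inside the WRF source directory (em_real is the target for real-data cases; pick the configuration option described above when configure prompts for it):

$> ./configure
$> ./compile em_real >& compile.log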


Computational cost


Domain size (nx * ny)   Model resolution [km]   Time step [s]   Vertical layers   Nodes   Time for +24h forecast [min]   ECMWF System Billing Units [SBU]
300 * 300               10                      60              50                2       9.8                            189
300 * 300               4                       24              50                5       9.7                            469
300 * 300               3                       18              50                8       9.3                            720
300 * 300               1                       6               50                25      13.2                           3197

Please note that this is an approximate computational cost and the actual cost of running WRF depends on the physics selected. Consequently, your numbers may vary from those given in the table.

As guidance, doubling "nx" or "ny" makes the computational cost roughly 2 times higher. Doubling the model resolution while keeping the same physical domain size makes it roughly 2*2*2 = 8 times more expensive, since both "nx" and "ny" double and the time step has to be halved.
