
How is the CAMS reanalysis daily data organised in MARS?

In general, it is organised as a huge tree, with the indentation below showing the different levels down that tree:
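
(The sketch below is an illustration only, reconstructed from the MARS keywords used in the requests further down; the exact ordering of the lower levels is indicative. The important point is that, roughly, one month of data for a given type and level type sits together in one tape file.)

class (mc)
    dataset (cams_reanalysis)
        expver (eac4)
            stream (oper)
                type (an / fc)
                    year
                        month                          <-- roughly one tape file per month and level type
                            levtype (sfc / pl / ml)
                                date
                                    time
                                        step (fc only)
                                            param
                                                levelist (pl / ml only)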

What would be the natural way to group requests?

The idea is to request as much data as possible from the same tape file. The natural way to group requests would be:
all parameters, all levels, all time-steps for all dates of a month

Note: 'all' means all that the user wants. It does not have to be all available parameters.
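
As a minimal sketch of such a grouped request (the keyword values here are placeholders taken from the full example in the next section, so adapt them to your needs), a single retrieval covers all dates of one month together with all the required times, levels and parameters:

from ecmwfapi import ECMWFDataServer

server = ECMWFDataServer()
server.retrieve({
    "class": "mc",
    "dataset": "cams_reanalysis",
    "expver": "eac4",
    "stream": "oper",
    "type": "an",
    "levtype": "pl",
    "date": "20030101/TO/20030131",        # all dates of one month in a single request
    "time": "00/06/12/18",                 # all times you need
    "levelist": "500/850/1000",            # all levels you need
    "param": "4.210/157.128",              # all parameters you need
    "target": "cams-reanalysis_daily_200301.grb",
})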

Web-API examples:

Example 1: A request for CAMS Reanalysis daily data, pressure levels

  • The objective of this example is to demonstrate how to iterate efficiently over some years, for a particular CAMS Reanalysis daily, pressure level request.
  • At this point you may wish to have a look at the CAMS Reanalysis daily data availability.
  • The request below can be used as a starting point; however, keep in mind that you have to adapt it to your needs, e.g. set the keyword values according to your requirements (add or remove 'param', 'levtype', 'step', etc.).
  • In this way you can extend this example to download a longer period or even the whole CAMS Reanalysis dataset.
  • The sample script creates one output file per requested month.
#!/usr/bin/env python
import calendar
from ecmwfapi import ECMWFDataServer
server = ECMWFDataServer()

def retrieve_cams_reanalysis():
    """       
       A function to demonstrate how to iterate efficiently over several years and months etc     
       for a particular CAMS Reanalysis request.      
       Change the variables below to adapt the iteration to your needs. 
       You can use the variable 'target' to organise the requested data in files as you wish.
       In the example below the data are organised in files per month. (eg "cams-reanalysis_daily_200310.grb")
    """
    yearStart = 2003                                    # As of 2017/11, only 2003 CAMS reanalysis data is available.
    yearEnd = 2004
    monthStart = 1
    monthEnd = 12
    for year in range(yearStart, yearEnd + 1):
        for month in range(monthStart, monthEnd + 1):
            startDate = '%04d%02d%02d' % (year, month, 1)
            numberOfDays = calendar.monthrange(year, month)[1]
            lastDate = '%04d%02d%02d' % (year, month, numberOfDays)
            target = "cams-reanalysis_daily_%04d%02d.grb" % (year, month)
            requestDates = (startDate + "/TO/" + lastDate) 
            cams_reanalysis_request(requestDates, target)

def cams_reanalysis_request(requestDates, target):
    """       
        A CAMS Reanalysis request for analysis pressure level data.
        Change the keywords below to adapt it to your needs.
        (eg to add or to remove  levels, parameters, times etc)
    """
    server.retrieve({
        "class": "mc",                                  # do not change
        "dataset": "cams_reanalysis",                   # do not change
        "expver": "eac4",                               # do not change
        "stream": "oper",                               # do not change
        "type": "an",                                   # analysis (versus forecast, fc)
        "date": requestDates,                           # dates, set automatically from above
        "levtype": "pl",                                # pressure level data (versus surface, sfc, and model level, ml)
        "levelist": "100/500/700/850/925/1000",         # levels, required only with levtype:pl and levtype:ml 
        "param": "4.210/157.128",                       # here: Dust Aerosol mixing ration and Relative humidity (r); see http://apps.ecmwf.int/codes/grib/param-db
        "target": target,                               # output file name, set automatically from above
        "time": "00/06/12/18",                          # times of analysis (with type:an), or initialization time of forecast (with type:fc)
        "grid": "0.7/0.7",		             			# Optional. The horizontal resolution in decimal degrees. If not set, the archived grid as specified in the data documentation is used.
        "area": "75/-20/10/60",		            		# Optional. Subset (clip) to an area. Specify as N/W/S/E in Geographic lat/long degrees. Southern latitudes and western longitudes must be
                                                        # given as negative numbers. Requires "grid" to be set to a regular grid, e.g. "0.7/0.7".
        "format": "netcdf",		            			# Optional. Output in NetCDF format. Requires that you also specify 'grid'. If not set, data is delivered in GRIB format, as archived.
    })
if __name__ == '__main__':
    retrieve_cams_reanalysis()
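
To run the script, save it to a file (any name will do, e.g. 'retrieve_cams_reanalysis.py') and execute it with Python. It assumes that the ecmwf-api-client package is installed and that your ECMWF Web API key is configured (normally in $HOME/.ecmwfapirc), and it writes one file per requested month into the current directory.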

Merging files

The example script above creates one output file per month of data. If you want a single output file:

For GRIB files, you can simply concatenate the files in Linux:

cat file1 file2 file3 > file4 

For NetCDF files (not supported by ECMWF), see https://code.mpimet.mpg.de/boards/1/topics/4446
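
As a sketch only (again, not supported by ECMWF), the monthly files can also be merged along the time dimension in Python, assuming they were retrieved with 'format': 'netcdf', that the 'target' names end in '.nc', and that the xarray and netCDF4 packages are installed:

import glob
import xarray as xr

# Open the monthly files in chronological order (the zero-padded YYYYMM names sort correctly).
files = sorted(glob.glob("cams-reanalysis_daily_*.nc"))
datasets = [xr.open_dataset(f) for f in files]

# Concatenate along the time dimension ('time' is the usual name in ECMWF NetCDF output).
merged = xr.concat(datasets, dim="time")

# Write the merged dataset to a single NetCDF file.
merged.to_netcdf("cams-reanalysis_daily_merged.nc")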
