Model Data Assimilation - overview

Data assimilation is an analysis technique in which observed information is incorporated into the model state.

Approaches to Data Assimilation

There are three basic approaches to data assimilation:

  • sequential assimilation – considers only observations made up to the time of the analysis (real-time assimilation systems).
  • non-sequential assimilation – considers observations made before and after the nominal time of the analysis (e.g. 4D-Var).
  • retrospective assimilation – observations from the future can be used (e.g. reanalysis).
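The distinction between the three approaches comes down to which observation times are eligible for a given analysis time. A minimal sketch, using purely illustrative observation times and window length:

```python
from datetime import datetime, timedelta

# Hypothetical analysis time, assimilation window and observation times
# (illustrative values only, not operational settings).
analysis_time = datetime(2024, 1, 1, 12, 0)
window = timedelta(hours=3)
obs_times = [analysis_time + timedelta(hours=h) for h in (-6, -2, -1, 1, 4)]

# Sequential: only observations made up to the analysis time.
sequential = [t for t in obs_times if t <= analysis_time]

# Non-sequential (e.g. 4D-Var): observations within a window around the analysis time.
non_sequential = [t for t in obs_times if abs(t - analysis_time) <= window]

# Retrospective (e.g. reanalysis): all observations, including future ones.
retrospective = list(obs_times)

print(len(sequential), len(non_sequential), len(retrospective))  # 3 3 5
```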

Another distinction can be made between methods that are intermittent or continuous in time:

  • Intermittent method – observations can be processed in small batches.  The correction to the analysed state tends to be abrupt and physically less realistic.
  • Continuous method – observation batches over longer periods are considered.  The correction to the analysed state is smoother in time and physically more realistic (e.g. Long Window Data Assimilation).

The four basic types of assimilation are depicted schematically in Fig2.5.1.  Compromises between these approaches are possible.  The aim is to assimilate data in a manner which does not produce sudden jumps in analysed values, and some sort of continuous assimilation seems preferable.  However, this is expensive in computing time, and a compromise has been adopted with 4D-Var assimilating data observed at various times over several hours.  This avoids sudden jumps in the analyses used for the forecasts (e.g. 00UTC, 12UTC).  Continuous assimilation is used during the 4D-Var analysis process for the early cut-off analysis.
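As a rough illustration of how a variational scheme fits a single analysis to observations spread over a window, the sketch below minimises a toy scalar cost function with a background term and an observation term. The numerical values, the persistence model M(x0) = x0 and the closed-form minimiser are illustrative simplifications, not the operational 4D-Var formulation:

```python
# Toy variational analysis: background value xb with error variance B,
# observations ys over the window with error variance R, and (for
# simplicity) a persistence model M(x0) = x0.  All values are hypothetical.
xb, B = 10.0, 4.0
ys, R = [11.0, 12.0, 11.5], 1.0

def cost(x0):
    # J(x0) = (x0 - xb)^2 / B  +  sum_i (M(x0) - y_i)^2 / R
    return (x0 - xb) ** 2 / B + sum((x0 - y) ** 2 / R for y in ys)

# For this quadratic cost the minimiser is available in closed form:
# dJ/dx0 = 0  =>  x0 * (1/B + n/R) = xb/B + sum(ys)/R
xa = (xb / B + sum(ys) / R) / (1.0 / B + len(ys) / R)
print(round(xa, 3))  # 11.385 -- the analysis sits between background and observations
```

The analysis is pulled towards the observations but anchored by the background, with the balance set by the error variances B and R.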



Fig2.5.1: Representation of four basic strategies for data assimilation as a function of time.  Observations are made at different times and arrive irregularly.  Intermittent assimilation of observations induces step changes in the model analysis (red line), while continuous assimilation of observations gives smoother changes in the model analyses.


Quality Control

Data are quality controlled using constraints of:

  • consistency of information from the observing platform,
  • assessment of whether changes with time are realistic when compared with model expectations,
  • consideration of the physical properties, both actual and implied.

Before assimilation, the observations pass through several quality control stages:

  1. Data Extraction
    • Thinning (to reduce over-emphasis of values from a small area)
    • Removal of duplicate reports (to avoid over-emphasis of the observation)
    • Ship tracks check (to ensure observations were made at a plausible location)
  2. Hydrostatic check
    • Some data is not used to avoid over-sampling and correlated errors.
    • Departures and flags are still calculated for further assessment.
  3. Blocklisting
    • Data are skipped due to systematically poor performance or for other reasons (e.g. data assessed as unreliable, inconsistent or misleading).
  4. Model/4D-Var dependent quality control
    • First guess based rejections
    • 4D-Var quality control rejections
  5. The Analysis
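A first guess based rejection of the kind listed in step 4 can be sketched as a simple departure check against the combined observation and background error. The threshold factor k and the error values here are hypothetical, not the operational settings:

```python
import math

# Hypothetical observation/background error standard deviations and
# rejection threshold factor (illustrative values only).
sigma_o, sigma_b, k = 1.0, 1.5, 3.0
limit = k * math.sqrt(sigma_o ** 2 + sigma_b ** 2)

def first_guess_check(obs, first_guess):
    """Return (departure, accepted) for a single observation."""
    departure = obs - first_guess
    return departure, abs(departure) <= limit

print(first_guess_check(12.0, 11.0))   # small departure -> accepted
print(first_guess_check(25.0, 11.0))   # large departure -> rejected
```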

...

Outline of the analysis process

The analysis process seeks to represent realistically in the model the actual state of the atmosphere.  However, inconsistencies in time and space among the observations mean that this aim can never be fully attained, and the best that can be done is to approximate (hopefully closely) the actual state of the atmosphere while maintaining the stability (in the numerical sense) of the model atmosphere both horizontally and vertically.  The assimilation process is carried out by 4D-Var (see below).  In simple terms, the previous analysis step has used model processes (e.g. dynamics, radiation, etc.) to reach a forecast first guess value at a given location.  This will usually differ from an observation at that location and time; the difference between them is the "departure".  The analysis scheme then adjusts the value at the location towards the observed value while retaining stability in the model atmosphere.  This adjustment is the "Analysis Increment".  A large analysis increment suggests the model is not capturing the state of the atmosphere well (e.g. a jet is displaced, a trough is sharper, large-scale active convection has not been completely captured).  However, a large Analysis Increment may also indicate poor data.  Analysis Increment charts are a powerful tool for identifying areas of uncertainty which might propagate downstream.
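The departure/increment relationship described above can be sketched with a scalar, optimal-interpolation style update, in which the weight given to the departure depends on the background and observation error variances. The values and the scalar weighting are illustrative simplifications of the operational scheme:

```python
# Hypothetical background and observation error variances (illustrative only).
sigma_b2, sigma_o2 = 2.0, 1.0
first_guess, obs = 10.0, 13.0

departure = obs - first_guess        # observation minus first guess
weight = sigma_b2 / (sigma_b2 + sigma_o2)
increment = weight * departure       # the "Analysis Increment"
analysis = first_guess + increment   # becomes the first guess for the next cycle

print(departure, round(increment, 3), round(analysis, 3))  # 3.0 2.0 12.0
```

A small observation error relative to the background error gives the departure a large weight, pulling the analysis close to the observation; the reverse leaves the analysis close to the first guess.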



Fig2.5.2: Schematic of the data assimilation process (from a diagnostic perspective).  All the model forecast parameters (dynamics, radiation, etc.) are used in the model forecast to deliver the first guess forecast (red).  The observations (grey), however, will normally differ to a greater or lesser extent from the first guess forecast (red), and the analysis increment (yellow) is evaluated to bring the evolution more into line with the observations while maintaining model stability.  The resultant value (blue) becomes the first guess for the next analysis.

Model Data Assimilation, 4D-Var

...