...

At each grid point the set of observed data is processed using a set of random initial weighting functions for each parameter.  Initially the forecast values will not agree with the values observed at the verifying time of the forecast.  The error, as measured by a loss function, is fed backwards through the process (back propagation).  In response, the influence of some types of observation (say wind, or 50hPa temperature) may be reduced while that of others (say surface temperature) is increased.  This process is repeated many times with the aim of progressively minimising the loss (see Fig2.1.6.1-1).

The process incrementally improves the relationship between the set of initial observations and the forecast values of a single variable at the later time.  In this way a relatively simple relationship between initial data and forecast data six hours ahead is gradually built up.  It consists of the learned influence of each meteorological parameter, in the form of a weighting for each input data type.  Taken together, the weighting functions form the algorithm used during the AI forecasts, as in the sketch below.
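To make the cycle concrete, the sketch below implements the same loop (random initial weights, forecast, compare with verifying data, back-propagate, adjust) in PyTorch.  Every name and size is an illustrative assumption: this is a toy stand-in, not the actual AIFS code, which uses a far larger model.

```python
# Toy sketch of the training cycle described above (not the actual AIFS code).
import torch

n_inputs, n_outputs = 8, 1        # e.g. several observed parameters -> one forecast variable
model = torch.nn.Linear(n_inputs, n_outputs)   # weighting functions start at random values

x = torch.randn(1024, n_inputs)   # stand-in for observed data at grid points
y = torch.randn(1024, n_outputs)  # stand-in for verifying data six hours later

optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()      # squared-error loss (RMS error is its square root)

for step in range(1000):          # repeated many times
    optimiser.zero_grad()
    forecast = model(x)           # forecast from the current weights
    loss = loss_fn(forecast, y)   # compare forecast with verifying data
    loss.backward()               # back propagation of the error
    optimiser.step()              # raise or lower the influence of each input
# The final set of weights constitutes the algorithm used for forecasting.
```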

...

The error metric (loss function) is the root-mean-square (RMS) error.
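Written out (the standard definition, with f_i the forecast and o_i the verifying value at each of N points):

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - o_i\right)^2}
```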

Fig2.1.6.1-1: Machine learning training process in deriving an algorithm for use in AIFS single.  A range of observed data is processed using random weighting functions for each parameter.  The forecast results are then compared with verifying data and the difference between them (the loss, or error) is fed back to the processor (back propagation).  This induces modification of the weighting functions for each parameter, and the resulting forecast is compared with the verifying data, giving a new loss/error value.  Iteration continues until the loss/error is minimised and the set of weights for each parameter becomes the algorithm for the forecasting process.

...

The algorithms for the ensemble are developed rather differently from those used by AIFS single.   A small training ensemble (ECMWF uses a group of four) gives information on the variability of these independent results and introduces a measure of model uncertainty.   At each grid point the set of observed data is processed using four different sets of random weighting functions for each parameter.  The four forecast results are then compared with verifying data.  The CRPS (Continuous Ranked Probability Score), which measures the quality of probabilistic forecasts, is then evaluated for the results of the four forecasts.  The CRPS influences what is fed back to the ML processors (back propagation).  This induces modification of the weighting functions for each parameter, and the resulting forecasts are compared with the verifying data, giving new CRPS values.  Iteration continues until the CRPS is minimised and the set of weights for each parameter forms the algorithm for the ensemble forecasting process (see Fig2.1.6.1-2).  Using CRPS as the loss function accounts for the limitations of using a finite number of ensemble members, and encourages an accurate and well-calibrated distribution.   Model uncertainty is incorporated as a learnt aspect through the insertion of white noise.


Fig2.1.6.1-2: Machine learning training process in deriving an algorithm for use in AIFS-ENS.  A range of observed data is processed using four different sets of random weighting functions for each parameter.   White noise is introduced at each iteration to emulate model uncertainty.  The four forecast results are then compared with verifying data.  The CRPS, which measures the quality of probabilistic forecasts, is evaluated for the results of the four forecasts.  The CRPS influences what is fed back to the processor (back propagation). This induces modification of the weighting functions for each parameter, and the resulting forecasts are compared with the verifying data, giving a new CRPS value.  Iteration continues until the CRPS is minimised and the set of weights for each parameter becomes the algorithm for the forecasting process.
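As a hedged sketch of how such a loop can be written, the code below trains a toy four-member ensemble with a "fair" kernel-form CRPS estimator, which de-biases the score for the finite number of members.  The model, the way the white noise enters, and all sizes are illustrative assumptions, not the AIFS-ENS design.

```python
# Toy sketch of CRPS-based ensemble training (illustrative, not AIFS-ENS code).
import torch

M, n_inputs, n_points = 4, 8, 1024          # four-member training ensemble

class NoisyForecaster(torch.nn.Module):
    """Toy model whose input is perturbed by white noise on every call."""
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(n_inputs, 1)

    def forward(self, x):
        noise = torch.randn_like(x)          # white noise -> learnt model uncertainty
        return self.linear(x + 0.1 * noise).squeeze(-1)

def fair_crps(members, obs):
    """'Fair' kernel-form CRPS estimator for an M-member ensemble.

    members: (M, N) member forecasts; obs: (N,) verifying values.
    CRPS = E|X - y| - 0.5 E|X - X'|, with the spread term averaged
    over the M*(M-1) distinct member pairs to remove finite-size bias.
    """
    skill = (members - obs).abs().mean(dim=0)
    pair_diffs = (members.unsqueeze(0) - members.unsqueeze(1)).abs()
    spread = pair_diffs.sum(dim=(0, 1)) / (M * (M - 1))
    return (skill - 0.5 * spread).mean()

model = NoisyForecaster()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(n_points, n_inputs)          # stand-in observed data
y = torch.randn(n_points)                    # stand-in verifying data

for step in range(1000):
    optimiser.zero_grad()
    members = torch.stack([model(x) for _ in range(M)])  # four perturbed forecasts
    loss = fair_crps(members, y)             # score the whole 4-member ensemble
    loss.backward()                          # back propagation of the CRPS
    optimiser.step()
# Minimising CRPS rewards both accuracy and a well-calibrated spread.
```

Because the pairwise differences between members enter the loss directly, the optimiser is rewarded for spread as well as accuracy, which is why white noise injected into each member can become a learnt representation of model uncertainty.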

...