Overview
The AI Weather Quest leaderboards page provides a dynamic view of how teams and models are performing throughout the competition.
The AI Weather Quest is a collaborative forecasting challenge hosted by ECMWF, designed to explore the potential of AI and machine learning in medium-range weather prediction. Participating teams are challenged to provide quintile-based probabilistic forecasts for three key variables: near-surface air temperature (tas), mean sea level pressure (mslp), and accumulated precipitation (pr). Forecasts target lead times of days 19 to 25 (week 3) and days 26 to 32 (week 4).
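A quintile-based probabilistic forecast assigns a probability to each of five climatological quintile bins, with the probabilities summing to 1. The sketch below illustrates the idea only; the variable names and array layout are assumptions, not the competition's actual submission format.

```python
import numpy as np

# Illustrative quintile forecast for one variable (e.g. tas) at one grid point:
# P(observation falls in quintile i), i = 1..5, ordered coldest to warmest.
# The name `tas_forecast` is hypothetical, not part of the submission format.
tas_forecast = np.array([0.10, 0.15, 0.25, 0.30, 0.20])

# Probabilities over the five bins must sum to 1.
assert np.isclose(tas_forecast.sum(), 1.0)

# A climatological reference forecast is flat: 0.2 per quintile by construction.
climatology = np.full(5, 0.2)
```

A forecast identical to climatology carries no information; skill scores such as RPSS measure how much a submission improves on that flat reference.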
Starting 19 September 2025, the Leaderboards page will display:
- Ranked Probability Skill Scores (RPSS) for submitted forecasts
- Tables showing team and model rankings
- Evolution graphs tracking performance over time
Evaluation results become available only after the evaluation date (day 37 of each competition week) has passed.
This guide explains how to use the leaderboards page effectively and track the evolution of team and model performance over time.
Using the filters
Filters apply to all elements on the page, including RPSS tables and evolution graphs. By default, the page displays the latest evaluated period and week, the first forecast window (Days 19–25), and variable-averaged scores.
To customize your view, use the following options:
1. Competitive period and week
- Dropdown 1: select a competitive period (e.g. SON 2025)
- Dropdown 2: select a competition week (e.g. Competition Week 1 (Thu 14-Aug-2025))
2. Forecast window
- Days 19–25
- Days 26–32
3. Variable
- Variable-averaged (tas, mslp, pr)
- tas (2m temperature)
- mslp (mean sea level pressure)
- pr (precipitation)
Understanding the RPSS tables
All rankings are based on the RPSS of each team's best-performing model. However, individual model ranks and scores are also visualised, allowing users to compare performance across multiple submissions from the same team.
Click any RPSS score in the table to view the corresponding forecast in the ECMWF-hosted sub-seasonal AI forecasting portal.
There are two main types of RPSS tables:
1. Period-aggregated RPSS table
- Shows RPSS scores aggregated over a selected competitive period (e.g. SON 2025)
- Useful for assessing overall performance across multiple weeks
2. Weekly RPSS table
- Shows RPSS scores for a specific competition week
- Useful for tracking short-term performance
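Both tables rank entries by RPSS. As a rough guide to how that score behaves, the sketch below uses the standard textbook definition, RPSS = 1 − RPS_forecast / RPS_reference, where the Ranked Probability Score (RPS) compares cumulative probabilities over the ordered quintile categories. This is an assumption for illustration; the competition's actual evaluation code and aggregation may differ.

```python
import numpy as np

def rps(forecast_probs, obs_category):
    """Ranked Probability Score for one forecast.

    forecast_probs: probabilities over ordered categories (summing to 1).
    obs_category: zero-based index of the category the observation fell in.
    """
    fc_cum = np.cumsum(forecast_probs)
    obs = np.zeros_like(forecast_probs)
    obs[obs_category] = 1.0
    obs_cum = np.cumsum(obs)
    return np.sum((fc_cum - obs_cum) ** 2)

def rpss(forecast_probs, obs_category, reference_probs=None):
    """RPSS relative to a reference; defaults to flat climatology (0.2/quintile)."""
    if reference_probs is None:
        reference_probs = np.full(len(forecast_probs), 0.2)
    return 1.0 - rps(forecast_probs, obs_category) / rps(reference_probs, obs_category)

# Example: a forecast leaning toward the upper quintiles, observation in quintile 4.
score = rpss(np.array([0.05, 0.10, 0.20, 0.35, 0.30]), obs_category=3)
```

Positive RPSS means the forecast beat climatology; a climatological forecast scores exactly 0, and a perfect deterministic forecast scores 1.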
Evolution graphs
Each RPSS table is accompanied by two evolution graphs:
1. Team rankings over time
Shows how team rankings evolve week by week:
- X-axis: competition week numbers
- Y-axis: team rankings based on RPSS of their best-performing model
- Default view: top 5 teams for selected filters
- Up to 10 teams can be selected at once
2. Model RPSS scores over time
Tracks performance trends of models across weeks:
- X-axis: competition week numbers
- Y-axis: RPSS scores of models
- Default view: top 5 models for selected filters
- Up to 10 models can be selected at once
Exploring team profiles
Click on any team name in the leaderboard to view:
- Team members (if public)
- Model descriptions
- Participation history (forecast submissions by week, window, and variable)
- Submitted forecast data from previous periods (available once the first competitive period has ended)