
Experimental service

This is an experimental service currently available to ECMWF staff only. Feedback regarding the service would be greatly appreciated so that we can improve it and make it available to a wider audience. Please provide any feedback via the table found here.

Please be aware that the service described here is subject to change based on real usage of the system.

Based on user and developer experience of the Climate Data Store (CDS) Toolbox, the CDS Toolbox will be replaced by a JupyterHub service providing the online computing environment, with earthkit providing the quality-assured software. JupyterHub sessions will provide very fast access to data available on the various Data Stores and will allow users to perform post-processing and visualisation of this data (see the sketch below). The sessions are small (see provision details below) and are not designed for very large computations. For larger computation tasks, users should consider other JupyterHub resources, for example WEkEO.
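As an illustration of the kind of lightweight workflow these sessions are intended for, the sketch below retrieves a small sample of data from the CDS using the cdsapi package and opens it for inspection. The dataset name and request values are examples only, and the availability of cdsapi and xarray in the session environment is assumed.

```python
# Minimal sketch of a small Data Store workflow inside a JupyterHub session.
# The dataset and request below are illustrative; adapt them to your needs.
import cdsapi
import xarray as xr

client = cdsapi.Client()  # uses the CDS API credentials configured for your account

client.retrieve(
    "reanalysis-era5-single-levels",  # example dataset
    {
        "product_type": "reanalysis",
        "variable": "2m_temperature",
        "year": "2023",
        "month": "01",
        "day": "01",
        "time": "12:00",
        "format": "netcdf",
    },
    "era5_t2m.nc",  # local target file in the session
)

ds = xr.open_dataset("era5_t2m.nc")  # inspect the downloaded data
print(ds)
```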

Time-limited singleton sessions

All JupyterLab sessions running on this service are time limited. When the time is up, the instance will be killed automatically along with any active processing that may be taking place.

You can only have one JupyterLab instance running at a time. If you have left one running, JupyterHub will connect you straight back into it.

How to access

The DSS JupyterHub will be available from the ECMWF JupyterHub launcher page, linked from the CDS, ADS and EWDS websites (maybe not ADS/EWDS, maybe other data-stores too, update when decided). Access requires ECMWF login credentials, including two-factor authentication.

Once logged in, users are given a choice of environment to use for their Jupyter session from a dropdown menu, with several additional options depending on which environment has been selected. Please note that by launching a JupyterHub session you are agreeing to the terms and conditions of use (provide link to TnCs).

ECMWF JupyterHub launcher page

ECMWF sessions

This is the general ECMWF JupyterHub launcher, so it is possible that you have access to more than the two Data Store options described here.

Environments available to DSS users

DSS users will be able to spawn sessions with one of the environments summarised in the table below. This can be selected from the "Select an Environment" dropdown selector on the JupyterHub Launcher, and depending on the environment, additional options may be configurable.

| Name | Use case | RAM (GB) | CPU (cores) | Duration |
| --- | --- | --- | --- | --- |
| ECMWF Cloud - Copernicus CDS - SMALL | Downloading, inspecting and plotting data products found on the Data Stores | 2 | 1 | 12 hours |
| ECMWF Cloud - Copernicus CDS - MEDIUM | Some small data processing, e.g. data averaging of small files | 4 | 2 | 3 hours |


Session priorities

In the first deployment, smaller (and shorter) sessions will be prioritised to ensure fair usage of the platform. These priorities will be monitored closely and will evolve as the project develops.

Pre-installed software

The default Python environment is created from an environment.yml file using packages from the conda-forge channel.


You can install additional packages from the (open-source) conda-forge channel (`conda install PACKAGE-NAME`) or from PyPI (`pip install PACKAGE-NAME`). These packages will be installed in your local storage and will be available the next time you create a session.
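From a notebook cell, installation might look like the sketch below; the package names are examples only, and a standard IPython/JupyterLab setup with the conda and pip magics is assumed.

```python
# Install additional packages from within a notebook cell.
# The package names are examples only.
%conda install -y cfgrib        # from the conda-forge channel
%pip install cads-api-client    # from PyPI

# List what is currently installed in the session environment.
from importlib.metadata import distributions

for dist in sorted(distributions(), key=lambda d: d.metadata["Name"].lower()):
    print(dist.metadata["Name"], dist.version)
```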

User storage

The DSS will offer two forms of storage for use in the JupyterHub. Please be aware that both of these options, and the way they have been configured, are subject to change as the project develops.

Private storage

Each user will have a 1 GB "home" storage allocation. If this storage is not touched for a period of 31 days, it will be removed. This storage is only accessible to you.

Scratch storage

Each user will have a 100 GB maximum quota on the temporary scratch disk. If you exceed the 100 GB quota, a clean-up script will remove your largest files. Any attempt to circumvent this behaviour is considered malicious and will lead to your access to JupyterHub being revoked.

The scratch disk is a shared resource and is cleaned regularly. When the combined usage of all users exceeds the maximum quota, the files accessed least recently will be removed. This means that files stored here should not be considered permanent: they will exist for your current session and may still be there when you return. The lifetime of these files will depend on the general usage of the service, and at this stage it is not possible to provide an expected lifetime for such files.
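Because files on the scratch disk can be removed between sessions, it is worth checking what is there before relying on it. The sketch below assumes a hypothetical scratch mount point ("/scratch"); the actual path in your session may differ.

```python
# Report usage of the scratch area and list the largest files, i.e. those
# most likely to be removed first if the quota is exceeded.
from pathlib import Path

SCRATCH = Path("/scratch")  # hypothetical path, replace with your scratch location

files = [p for p in SCRATCH.rglob("*") if p.is_file()]
total_bytes = sum(p.stat().st_size for p in files)
print(f"{len(files)} files, {total_bytes / 1e9:.2f} GB used")

for p in sorted(files, key=lambda f: f.stat().st_size, reverse=True)[:5]:
    print(f"{p.stat().st_size / 1e9:6.2f} GB  {p}")
```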

Shared resources

There will be a shared resources directory available from the home directory. This will be read-only and will contain resources provided by Copernicus and ECMWF, for example the notebooks found in the C3S training material.

External network access

SSH connections are disabled

The Jupyter sessions do not allow SSH connectivity for security reasons. Therefore, you must use the HTTPS address for any git repositories that you want to clone.
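For example, cloning a repository over HTTPS from within a session might look like the sketch below; the repository URL is a placeholder, and the availability of the git command in the session image is assumed.

```python
# Clone a repository using its HTTPS address; SSH URLs (git@github.com:...)
# will not work because SSH connectivity is disabled.
import subprocess

repo_url = "https://github.com/<organisation>/<repository>.git"  # placeholder
subprocess.run(["git", "clone", repo_url], check=True)
```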

Right to suspend service

This service is provided according to the terms and conditions (LINK???). We reserve the right to suspend the service for users if we detect that the terms and conditions are infringed. Suspension may be triggered automatically, and access may only be reinstated once we have investigated the specific use case.
