When using standard, non-GPU ecinteractive or JupyterHub sessions, you can use fast local SSD storage that persists across sessions, in addition to the usual filesystems. It is made available as $LOCALSSD, and its contents are automatically archived into your $HPCPERM when your interactive session finishes, so you can recover them in your next session and continue where you left off. This feature is primarily aimed at speeding up development workflows that involve builds.

How does it work?

Before starting your standard, non-GPU interactive session, make sure you request the amount of local disk space you will need. See How much LOCALSSD space can I use? below for the actual limits.

  • With ecinteractive, use the -s option. For example, to request 30 GB:
    ecinteractive -s 30
  • With JupyterHub, select the desired value from the "Temporary Storage" dropdown, which is available on the "ECMWF Atos HPC" or "ECMWF Atos ECS" profiles.

When your session starts, you will have a space on the local disk, pointed to by the $LOCALSSD environment variable, that you can use throughout your session.
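Once the session is up, a quick sanity check confirms the scratch space is there. This is a sketch: $LOCALSSD is set by ecinteractive/JupyterHub, so outside such a session the snippet falls back to /tmp.

```shell
# Inspect the local SSD scratch area (falls back to /tmp outside a session)
SCRATCH="${LOCALSSD:-/tmp}"
echo "Local SSD scratch: $SCRATCH"
df -h "$SCRATCH"    # the available size should match what you requested with -s
```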

Shared space with TMPDIR

For interactive sessions, both $LOCALSSD and $TMPDIR live in the same space on the local disk and share the same quota.

At the end of the session, whether because you kill it or because it reaches its wall time, the contents of $LOCALSSD will be archived automatically into a special directory in your $HPCPERM under:

$HPCPERM/.DO_NOT_DELETE_LOCALSSD_ARCHIVE/
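You can inspect the archives kept in that directory at any time. A sketch, assuming the archive directory already exists; outside the HPC, $HPCPERM is unset, so the snippet falls back to $HOME and tolerates a missing directory:

```shell
# List saved LOCALSSD archives (tolerates a missing directory)
ARCHIVE_DIR="${HPCPERM:-$HOME}/.DO_NOT_DELETE_LOCALSSD_ARCHIVE"
ls -lh "$ARCHIVE_DIR" 2>/dev/null || echo "no LOCALSSD archives found"
```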

Note that:

  • If $LOCALSSD is empty, no archive is created.
  • When a new archive of LOCALSSD is created, a copy of the previous one is kept as a backup in case there are any problems. Only the two most recent copies are kept.
  • Logging out of the session does not terminate it, so no archival happens at that point: while the session remains active, you can reconnect to it from a separate terminal or browser tab, or even later.

How do I restore my LOCALSSD contents?

Your LOCALSSD contents are not restored automatically. The next time you open a new interactive session, you can restore the contents of the local storage from your previous one by running:

ec_restore_local_ssd -r

Always remember to restore

If you do not restore after starting a new session and you then use $LOCALSSD in that session, a new archive will be created when the job ends, so the latest archive will no longer contain your previous data. You may still recover the last-but-one archive with:

ec_restore_local_ssd -p

Note that if you don't use your $LOCALSSD, your last saved archive will not be overwritten at the end of the job.
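Putting the steps above together, a typical start-of-session sequence might look as follows. This is a sketch: ec_restore_local_ssd is the site tool described above and only exists on the HPC, so its invocations are shown as comments, and the final check falls back to a temporary directory outside a session.

```shell
# Typical start-of-session sequence (site tool shown as comments):
#   ec_restore_local_ssd -r    # restore the .latest archive, or
#   ec_restore_local_ssd -p    # fall back to the .prev archive
# After restoring, verify that the contents arrived:
SCRATCH="${LOCALSSD:-$(mktemp -d)}"
ls -la "$SCRATCH"
```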

Beyond this basic usage, the ec_restore_local_ssd command has some additional options that allow you to customise your restore.

You may run the command with no arguments to see your saved SSD backups and all the available options:

$ ec_restore_local_ssd 

        These are your available archives:

-rw-r--r-- 1 root root 10119290880 Dec 13 23:36 /hpcperm/user/.DO_NOT_DELETE_LOCALSSD_ARCHIVE/user-ecinteractive.LOCALSSD.tar.latest
-rw-r--r-- 1 root root    45649920 Oct 31 01:07 /hpcperm/user/.DO_NOT_DELETE_LOCALSSD_ARCHIVE/user-ecinteractive.LOCALSSD.tar.prev

/usr/local/bin/ec_restore_local_ssd [OPTIONS]

OPTIONS:
-h:  usage 
-r:  Restore the .latest (most recent) archive
-p:  Restore the .prev version of the archive
-d:  Delete any data currently in your /etc/ecmwf/ssd/ssd1/ecinteractive/user-ecinteractive for fresh start

__        ___    ____  _   _ ___ _   _  ____ 
\ \      / / \  |  _ \| \ | |_ _| \ | |/ ___|
 \ \ /\ / / _ \ | |_) |  \| || ||  \| | |  _ 
  \ V  V / ___ \|  _ <| |\  || || |\  | |_| |
   \_/\_/_/   \_\_| \_\_| \_|___|_| \_|\____|
                                             
The following flags for FORCE restoring is AT YOUR OWN RISK. 
Consider using the delete flag first to cleanup, otherwise you are just adding to/overwriting the current data

-R:  Force the restore of /hpcperm/user/.DO_NOT_DELETE_LOCALSSD_ARCHIVE/user-ecinteractive.LOCALSSD.tar.latest file into a populated /etc/ecmwf/ssd/ssd1/ecinteractive/user-ecinteractive
-P:  Force the restore of /hpcperm/user/.DO_NOT_DELETE_LOCALSSD_ARCHIVE/user-ecinteractive.LOCALSSD.tar.prev file into a populated /etc/ecmwf/ssd/ssd1/ecinteractive/user-ecinteractive

How much LOCALSSD space can I use?

You must request how much space you need when you start your interactive session as described above. You will not be able to exceed that limit while in that interactive session. 
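To avoid hitting that limit mid-build, you can track how much of your requested space you are using. A sketch; outside a session, where $LOCALSSD is unset, the snippet falls back to a fresh temporary directory:

```shell
# Check current usage of the local SSD scratch area
SCRATCH="${LOCALSSD:-$(mktemp -d)}"
du -sh "$SCRATCH"    # space used so far, against the size you requested
```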


                           Default size    Maximum size
HPC interactive session    3 GB            100 GB
ECS interactive session    3 GB            20 GB
