
Durham University





Welcome to the COSmology MAchine (COSMA)! Here, you will find some information about the COSMA5, COSMA6 and COSMA7 HPC facilities.


COSMA has been in existence since July 2001 and is now in its 7th generation.

COSMA5 is a Durham system for ICC users and collaborators.

COSMA6 and COSMA7 are the current DiRAC facilities hosted by Durham. COSMA6 was introduced on 1st April 2017 as the DiRAC-2.5 Data Centric service. COSMA7 was introduced as the DiRAC-2.5x system in May 2018, and expanded by DiRAC-2.5y in October 2018.

DiRAC is the UK's integrated supercomputing facility for theoretical modelling and HPC-based research in particle physics, astronomy and cosmology. For more information about DiRAC, please visit the DiRAC web pages.

The current load on the DiRAC systems (sampled every 15 minutes) can be seen on the SAFE pages.

The COSMA systems run CentOS 7.4, with a 3.10 Linux kernel.
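To confirm the distribution and kernel on a login node, standard Linux commands (nothing COSMA-specific) suffice:

```shell
# Distribution details (CentOS 7.4 per the text above)
cat /etc/os-release
# Running kernel version (a 3.10 kernel per the text above)
uname -r
```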

The login details are:
cosma5: round-robin across the cosma-a and cosma-b login nodes
cosma6: a single login node, cosma-i
cosma7: round-robin across the cosma-m and cosma-n login nodes

All systems use a global Slurm batch system. Jobs can be submitted to any queue (cosma, cosma6 and cosma7) from any login node.
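As a minimal sketch of a Slurm submission (the resource requests and file names here are illustrative, not site policy; only the partition names come from the text above), a job script might look like:

```shell
#!/bin/bash
#SBATCH --partition=cosma7       # or cosma / cosma6, per the queues above
#SBATCH --job-name=example       # illustrative name
#SBATCH --ntasks=1               # illustrative resource request
#SBATCH --time=00:10:00          # illustrative time limit
#SBATCH --output=example.%j.out  # %j expands to the job ID

# Report which compute node the job landed on
hostname
```

Such a script would be submitted with `sbatch example.sh` from any login node, and the queue inspected with `squeue -u $USER`.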

COSMA5 is no longer a DiRAC facility; DiRAC users should therefore use COSMA6 or COSMA7.

Contact Details

Institute of Computational Cosmology,
Ogden Centre for Fundamental Physics - West,
Department of Physics,
Durham University,
South Road,
Durham DH1 3LE


22/3/19: 152 new COSMA nodes added to COSMA7 queues.

20/3/19: COSMA7 storage outage due to a failed PDU. Now operational again, without data loss.

4/3/19: 152 new COSMA nodes functional and undergoing tests.

12/2/19: COSMA downtime from 18th-22nd February for installation of new hardware.

1/2/19: COSMA awarded funding to replace the ageing COSMA6 storage.

4/1/19: COSMA7 now expanded to 300 nodes, 230TB RAM.

Older news here!