The HPC cluster at the CERN Data Centre; credit: CERN.
During its operation, the LHC provides its experiments with 40 million proton-proton bunch collisions every second. The experiments use hugely complex hardware and software triggering mechanisms and algorithms to select only the most interesting collision events; even so, the result is a data stream of tens or even hundreds of gigabytes per second (GB/s) that must be stored for later analysis. In addition, HEP analyses require huge numbers of Monte Carlo (MC) simulations of various physical processes. These are later compared to the experimental data in a search for anomalies that could lead to new discoveries.
The immense computing tasks outlined above are performed by the Worldwide LHC Computing Grid (WLCG). The grid comprises nearly one million computing cores at more than 170 sites around the world. It handles approximately 2 million computing tasks every day, with a non-stop global data-transfer rate of over 60 GB/s. The grid is organised hierarchically into tiers, Tier-0 to Tier-2[1]. Tier-2 is formed by the high-performance computing (HPC) clusters of universities and research institutes around the world. In 2019, RTU and the University of Latvia (UL) undertook a joint pilot project aimed at establishing a Tier-2 site in Latvia.
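To put the sustained transfer rate into perspective, a quick back-of-envelope calculation using the figure quoted above (60 GB/s, running around the clock) gives the daily volume moved across the grid:

```python
# Back-of-envelope estimate of the WLCG's daily data volume,
# based solely on the sustained rate quoted in the text.
rate_gb_per_s = 60                 # global transfer rate (GB/s)
seconds_per_day = 24 * 60 * 60     # 86,400 seconds

daily_gb = rate_gb_per_s * seconds_per_day
daily_pb = daily_gb / 1_000_000    # 1 PB = 10^6 GB (decimal units)
print(daily_pb)                    # prints 5.184, i.e. roughly 5 PB per day
```

In other words, at this rate the grid moves on the order of five petabytes of data every day.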
Development of a Tier-2 Centre in Latvia
The development of a Tier-2 data centre is one of Latvia’s strategic CERN-related projects. In 2019, RTU and UL undertook a collaborative effort to unite their computing resources at the RTU HPC Centre and the UL Institute of Numerical Modelling into a single network, with the aim of using the unified HPC cluster as a single Tier-2 site. A state-financed pilot project validated the feasibility of the overall scheme and showed positive results.
This project is seen as a natural continuation of BalticGrid, a grid of computing resources of the three Baltic states that was operational in the 2000s. As such, the success of this pilot project should be viewed as an opportunity for other institutes with available HPC resources, both in Latvia and in the other Baltic states, to join the next stages of the project.
The development of the Tier-2 site in Latvia is supported by experts from the Estonian Tier-2 site hosted by the National Institute of Chemical Physics and Biophysics (NICPB), as well as expert researchers from CERN and CMS.
The management system of this Tier-2 site is based on the OpenStack cloud platform and the Ceph software-defined storage system. A single compute-element management system has been implemented, which is expected to receive computing tasks from CERN. These tasks are then distributed across the available HPC resources of the federated system, where the Slurm resource manager autonomously schedules and executes them on the available CPUs. These tools, recommended by CERN, enabled a successful proof-of-concept pilot project and provide a solid basis for developing a fully functional Tier-2 site.
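The flow described above, in which a single compute element receives tasks and a resource manager spreads them across the federated clusters, can be sketched conceptually. The toy scheduler below is an illustration only, not the actual compute-element or Slurm API; the node names and the least-loaded heuristic are assumptions made for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A simplified compute node in the federated cluster."""
    name: str
    free_cpus: int
    tasks: list = field(default_factory=list)

def dispatch(tasks, nodes):
    """Greedy least-loaded dispatch, loosely mimicking how a resource
    manager such as Slurm assigns queued tasks to free CPUs."""
    for task in tasks:
        # pick the node with the most free CPUs
        node = max(nodes, key=lambda n: n.free_cpus)
        if node.free_cpus == 0:
            raise RuntimeError("no free CPUs left in the federation")
        node.tasks.append(task)
        node.free_cpus -= 1
    return {n.name: n.tasks for n in nodes}

# Hypothetical federation of the two partner clusters:
nodes = [Node("rtu-hpc", free_cpus=2), Node("ul-inm", free_cpus=1)]
print(dispatch(["t1", "t2", "t3"], nodes))
# prints {'rtu-hpc': ['t1', 't2'], 'ul-inm': ['t3']}
```

In the real system, of course, Slurm handles queueing, priorities, and multi-CPU allocations far beyond this sketch; the point is only that tasks arriving at one entry point end up executing wherever free capacity exists in the federation.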
A series of technical upgrades of the management system has been carried out to meet the performance requirements of the planned Tier-2 system, and a dedicated domain, t2cms.hpc-net.lv, has been registered. The next steps are linking up the computing resources of the federation partners and registering Latvia’s Tier-2 data centre with CERN; the subsequent tests and full-scale implementation will follow in due course.
[1] Technically, a further tier, Tier-3, also exists. It is the collective of the personal computers of the end users: the individual physicists working tirelessly to analyse the data provided by the LHC and its experiments.
credit: RTU HPC Centre
RTU’s role in the project: HPC Centre, HEP Centre
Implementation stage: pilot project implemented
Project partners: Ministry of Education and Science of Latvia, RTU, UL, Dati Group
Total project expenses: 24 000 EUR
Project implementation period: December 2019-March 2020 (4 months)
The project team of RTU:
Prof. Toms Torims
Dr. Lauris Cikovskis