Applications

These real-world HPC applications play a crucial role in our research, as their hardware and system software/API requirements drive the co-design process: the selected applications are continuously optimized for the Cluster/Booster and the Modular Supercomputing Architecture, respectively. This results in systems that fit the needs of the applications, and in showcases that demonstrate the efficiency and performance of the DEEP projects' prototypes.

Understanding the superconducting mechanisms in high-critical-temperature (HTc) materials (such as copper oxides) remains one of the most challenging topics in condensed matter physics.

Simulations of space plasmas are moving towards the exascale using the DEEP-ER architecture.

Lattice QCD (Quantum Chromodynamics) studies the fundamental theory of the strong interactions numerically and has been at the forefront of HPC since the 1980s.

Full Waveform Inversion is a technique that makes it possible to probe the physical properties of a subsurface area with high spatial resolution using seismic data.

This research field aims at understanding the physical processes behind large-scale geological events (e.g. earthquakes) and their secondary effects (e.g. tsunamis).

The Square Kilometre Array (SKA), the next-generation radio telescope, has exascale compute requirements.

Assessing the possible health effects of human exposure to electromagnetic fields emitted by wireless systems is of particular interest for well-being in modern society.

GROMACS is one of the most widely used free, open-source software codes in chemistry, used primarily for dynamical simulations of biomolecules.

The continuously improving resolution of remote sensors on Earth observation platforms generates large quantities of hyperspectral data for mapping and monitoring natural and man-made land covers.

NEST is a widely used, publicly available simulation software for spiking neural network models, scaling up to the full size of petascale computers.

The LHC (Large Hadron Collider) experiments at CERN collect enormous amounts of data, which need to be pre-processed, treated and then analysed to extract the scientific information that physicists are looking for. This makes the codes developed for the LHC prime examples of HPDA (High Performance Data Analytics) applications.

Seismic imaging aims at providing accurate and detailed 3D maps of the Earth's subsurface from recordings of acoustic wave propagation at the surface.

Brain simulation is making giant leaps towards a better understanding of the inner workings of the human brain.

AVBP is a parallel Computational Fluid Dynamics (CFD) code that studies the combustion process in gas turbines, targeting its optimisation to improve stability and reduce pollution.

Understanding the evolution of and changes in the global climate is of utmost importance in the 21st century. The complexity of climate simulation is reflected in the structure of the codes in this field.