HPC at the ICP
==============
Why do we need HPC at the ICP?
------------------------------
The ICP is largely built around performing large-scale simulations with a focus
on soft matter physics, energy materials, active matter and fluid dynamics.
Multiscale modeling is often necessary to adequately capture physical phenomena
that span different time and length scales. This type of modeling couples
particle-based and lattice-based algorithms to resolve the different scales.
Some algorithms can leverage GPU accelerators for lattice-based problems
and machine learning, or large-memory compute nodes for large-scale data
analysis or simulations involving billions of particles.
Which HPC expertise do we have?
-------------------------------
The ICP manages its own cluster, :doc:`Ant `,
for high-performance parallel computing [#grant_Ant_cluster]_.
It is also used for benchmarking and improving the scalability of parallel algorithms,
together with a fleet of GPU-equipped servers exclusively dedicated to software testing.
The ICP has access to the :doc:`SimTech cluster `
and bwHPC resources (:doc:`bwForCluster `,
:doc:`bwUniCluster 2.0 `).
Through the :doc:`PRACE program `,
the ICP can apply for computing time at any European HPC facility,
and currently has development access to the petascale
:doc:`Vega ` supercomputer.
The University of Stuttgart is an active player in the HPC field:
it is a shareholder of the `bwHPC `__ initiative
[#bwHPC_LNA_BW]_ and hosts the `HLRS `__ supercomputing
center and the `IPVS `__ institute.
The ICP is a member of the `Center of Excellence MultiXscale
`__ [#grant_CoE_MultiXscale]_ [#grant_BMBF_MultiXscale]_.
All ICP PIs are project members of the `Cluster of Excellence SimTech
`__ [#grant_EXC_SimTech]_
and Christian Holm was a member of the `SFB 716
`__ [#grant_SFB_716]_.
The ICP, University of Stuttgart and SimTech have agreements to cover
their staff members' and students' participation fees and travel costs
for :doc:`EuroHPC training events `,
in an effort to foster continuous learning in HPC.
The ICP employs an HPC-:abbr:`RSE (Research Software Engineer)`
(`Jean-Noël Grad `__),
the `SFB 1313 `__ employs a
:abbr:`FAIR (Findability, Accessibility, Interoperability,
and Reusability of research output and research software)`-RSE
(`Hamza Oukili `__),
SimTech employs an :abbr:`RDM (Research Data Management)`-RSE
(`Sarbani Roy `__),
and `IntCDC `__
employs an :abbr:`RDM (Research Data Management)`-RSE
(`Matthias Braun `__).
Their role is to assist domain scientists in leveraging highly parallel
computing environments, writing quality-assured and future-proof software,
libraries and scripts, and making simulation data archivable, findable and
reusable in compliance with the requirements of funding agencies and
academic institutions.
HPC-driven research poses unique challenges in terms of software engineering,
energy efficiency, software quality assurance, scientific reproducibility and
data management. We actively participate in these discussions and disseminate
the outcomes to domain scientists of the University of Stuttgart through
regular meetings and seminars. In addition, SimTech organizes the `SIGDIUS Seminars
`__,
a monthly event to discuss policies, infrastructure and tools for software
engineers, data stewards and domain scientists.
`IntCDC `__ organizes a Software Carpentry workshop
every semester [#stuttgart_carpentries]_.
The `IPVS `__ offers an RSE course
`Simulation Software Engineering `__
every winter semester.
The HLRS, SimTech and University of Stuttgart are founding members of the
`str-RSE `__ chapter of the
`German Research Software Engineers `__ association,
and manage a rich portfolio of highly extensible `research software
`__
that is funded by software engineering grants [#grant_ESPResSo]_ [#grant_ESPResSo2]_
[#grant_PreDem]_ [#grant_DuMux]_ [#grant_MegaMol]_ [#grant_Librepa]_
[#grant_CoE_MultiXscale_ESPResSo]_.
What HPC facilities do we have access to?
-----------------------------------------

* University clusters

  * :doc:`Ant cluster ` at the ICP
  * :doc:`Bee cluster ` at the ICP
  * :doc:`Ehlers cluster ` at SimTech
  * :doc:`Vulcan cluster ` at the HLRS
  * :doc:`bwForCluster ` at the bwHPC
  * :doc:`bwUniCluster ` at the bwHPC

* HPC centers

  * :doc:`Hawk supercomputer `
  * :doc:`Vega supercomputer `
  * :doc:`EuroHPC `
  * :doc:`HLRS `

----

.. [#grant_Ant_cluster] DFG grant :dfg-gepris:`492175459` for Ant.
.. [#bwHPC_LNA_BW] bwHPC `User Steering Committee (LNA-BW) `__.
.. [#grant_CoE_MultiXscale] EuroHPC-JU grant number `101093169 `__:
   Centre of Excellence in exascale-oriented application co-design and delivery
   for multiscale simulations.
.. [#grant_BMBF_MultiXscale] BMBF grant number `16HPC095 `__:
   Verbundprojekt MultiXscale: HPC-Exzellenzzentrum für Multi-Skalen-Simulationen
   auf Höchstleistungsrechnern (collaborative project MultiXscale: HPC centre of
   excellence for multiscale simulations on high-performance computers).
.. [#grant_EXC_SimTech] DFG grant :dfg-gepris:`390740016`:
   Data-Integrated Simulation Science (SimTech, EXC 2075).
.. [#grant_SFB_716] DFG grant :dfg-gepris:`17546514`:
   Dynamic simulation of systems with large particle numbers (SFB 716).
.. [#grant_ESPResSo] DFG grant :dfg-gepris:`391126171`: Fostering an international
   community to sustain the development of the ESPResSo software package.
.. [#grant_ESPResSo2] DFG grant :dfg-gepris:`528726435`: Strengthening the quality
   and user base of the research software ESPResSo for particle-based simulations.
.. [#grant_PreDem] DFG grant :dfg-gepris:`391150578`: PreDem -- Democratization
   of the coupling library preCICE.
.. [#grant_DuMux] DFG grant :dfg-gepris:`391049448`: Sustainable infrastructure
   for the improved usability and archivability of research software on the
   example of the porous-media simulator DuMux.
.. [#grant_MegaMol] DFG grant :dfg-gepris:`391302154`: Research software
   sustainability for the open-source particle visualization framework MegaMol.
.. [#grant_Librepa] DFG grant :dfg-gepris:`265686075`:
   Load-balancing for scalable simulations with large particle numbers.
.. [#grant_CoE_MultiXscale_ESPResSo] CoE MultiXscale WP1+WP2: ESPResSo performance,
   productivity and portability (subprojects of EuroHPC-JU grant number
   `101093169 `__).
.. [#stuttgart_carpentries] Software Carpentries are announced on the University
   `Events feed `__.
   A list of past and future events can be found on the `IntCDC GitHub page
   `__.