
MultiXscale experts Khush Bakhat Rana, Petra Papež and Helena Vela highlighted in the special issue “HPC Unites: Celebrating Women’s History Month at SC26”

The SC Women’s History Month Profiles project started three years ago, and since then it has highlighted more than 200 women whose work is shaping the high-performance computing (HPC) community across research, engineering, education, and leadership. This year’s theme, “HPC Unites”, reflects the collaborative nature of the field, bringing together people, disciplines, and institutions to solve complex challenges and drive innovation. Discover more about their profiles here.

New Podcast Episode: EESSI – Simplifying software installation in HPC systems

The new episode of “Supercomputing in Europe. HPC in Europe network” is out! This time the show is joined by MultiXscale experts Caspar van Leeuwen, Machine Learning Consultant at SURF, and Helena Vela Beltran, Computational Scientist at HPCNow! and Do IT Now Spain. Both are actively contributing to the MultiXscale CoE and the EESSI project, the European Environment for Scientific Software Installations: a common stack of scientific software installations for HPC systems and beyond, including PCs and cloud infrastructure. The interview and mixing were carried out by Apostolos Vasileiadis (ENCCS, Mimer AI Factory).

Listen to the full interview here:

Spotify: https://open.spotify.com/episode/0eYCo5VLyuujIIlrfAFP9o?si=HVN3qxdpTEagbaCSyB051w
Apple Podcasts: https://podcasts.apple.com/se/podcast/eessi-simplifying-software-installation-in-hpc-systems/id1768782069?i=1000758537611
Add the RSS feed to your favourite podcast app: https://anchor.fm/s/f01f82a4/podcast/rss

Status Update on EESSI for the EuroHPC Federation Platform

Since the blog post in February 2025 announcing that EESSI will be integrated into the EuroHPC Federation Platform (EFP), we have been working hard on making this a reality. The Federated Software Catalogue (FSC) component of the EFP will use EESSI as a base. As a result, EESSI will soon be available on all EuroHPC supercomputers; in fact, it is already available on the majority of them today! Read all the details in the EESSI blog post here.

ESPResSo 5.0 released!

By Jean-Noël Grad

We are pleased to announce the release of ESPResSo 5.0. ESPResSo is an open-source simulation package for particle- and lattice-based modelling, used across soft matter, statistical physics, biophysics, and process engineering, from polyelectrolytes, gels, and colloids to bacterial motion and supercapacitors.

A highlight of this release is the redesign of the lattice-Boltzmann and electrokinetics methods on CPU and GPU using the HPC-ready waLBerla framework. This work was carried out in the context of the MultiXscale EuroHPC CoE. In addition to more fine-grained control of boundary conditions, it brings multi-GPU support for lattice-Boltzmann hydrodynamics coupled to molecular dynamics. ESPResSo 5.0 also introduces support for shared-memory parallelism, delivering improved performance on hybrid CPU/GPU systems.

New features and algorithms were introduced to tackle challenging multiphysics problems: a magnetodynamics solver, hydrodynamics under shear, Andersen and MTK barostats, per-particle selection of equations of motion, and integration with the Atomic Simulation Environment (ASE).

ESPResSo blends scalable algorithms with a Python interface to enable flexible simulation workflows and provide seamless integration with other scientific software. The source code can be found at https://github.com/espressomd/espresso. Moreover, as part of the MultiXscale EuroHPC Centre of Excellence, ESPResSo is available on the EESSI software stack to simplify installations on workstations, compute clusters, and cloud environments.

Highlights of the first Stuttgart Research Software Day

By Jean-Noël Grad

The first Stuttgart Research Software Day (SRSD1) was organised as a satellite event of the 6th Conference for Research Software Engineering in Germany (deRSE26) and explored Stuttgart’s contribution to open-source research software through 56 posters. Several contributions featured multiscale and multiphysics software, such as ESPResSo, DuMux, preCICE, OpenDiHu, and FANS. Contributions on IT infrastructure included the EESSI software stack and GitHub Action from MultiXscale, the large-scale research data storage and sharing platform bwSFS-2 for Tier-3 HPC clusters, and a CPU operating-map test bench to visualize how varying CPU frequencies and workloads impact datacenter energy efficiency. Posters are available online in the SRSD1 Zenodo community.

Explore ESPResSo pre-releases via dev.eessi.io

By Jean-Noël Grad

We are happy to announce that ESPResSo pre-releases are available as binaries on the development stack of EESSI. Several commits of the python branch of ESPResSo can now be used on HPC clusters that have adopted EESSI, as well as in GitHub CI/CD. Continuous integration (CI) and continuous deployment (CD) are important to ensure the reproducibility of software and executable papers. Python projects like pyMBE and SwarmRL are using pre-release versions of ESPResSo to test their software against the most recent ESPResSo features, without having to wait for the next official release. Simulation scripts and Jupyter notebooks can now be shared and executed on GitHub runners without the need to build ESPResSo from source.

To help you get started with this new service, we wrote a tutorial with reusable GitHub workflows: https://www.eessi.io/docs/blog/2025/12/04/gh-ci-workflow-for-EESSI

Further information is available here.
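As a rough illustration of the kind of CI setup this enables, the sketch below shows a GitHub Actions workflow that mounts the EESSI stack on a runner and then runs a simulation script against it. The action inputs, module name, version path, and the script name `my_simulation.py` are assumptions for illustration; please follow the tutorial linked above for the exact, current syntax.

```yaml
# Hypothetical workflow sketch: run an ESPResSo script on a GitHub runner
# using the EESSI software stack (names and versions are illustrative).
name: test-with-espresso
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Mount the EESSI CVMFS repositories on the runner
      - name: Set up EESSI
        uses: eessi/github-action@v1
      - name: Run script against an ESPResSo installation
        shell: bash
        run: |
          # Initialize the EESSI environment (version directory is an example)
          source /cvmfs/software.eessi.io/versions/2023.06/init/bash
          module load ESPResSo   # pre-release module names on dev.eessi.io may differ
          python3 my_simulation.py
```

The point of this pattern is that the runner never compiles ESPResSo itself: the binaries are streamed from EESSI, so the workflow stays fast and identical across projects.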

Upcoming webinar on 25 February: EESSI integration in the EuroHPC Federation Platform (EFP)

Curious about how EESSI is being integrated into the EuroHPC Federation Platform (EFP) as the base for the Federated Software Catalogue? Don’t miss the next EFP webinar on Wednesday 25 February at 14h, given by our MultiXscale expert Kenneth Hoste (Ghent University).

Abstract: In this webinar, we will introduce the Federated Software Catalogue (FSC) component of the EuroHPC Federation Platform (EFP), which will be based on the European Environment for Scientific Software Installations (EESSI, pronounced as “easy”). EESSI provides a consistent set of software installations that are optimized for a broad range of specific CPU microarchitectures (Intel, AMD, Arm), including those featured in EuroHPC JU supercomputers. These software installations are built such that they are independent of the host operating system (OS), and only rely on software provided by the host OS where necessary (like GPU drivers). Where desirable, specific custom software installations can be “plugged in”, for example a system-tuned MPI library to enhance interconnect performance. A select set of software that supports different generations of NVIDIA GPUs is also included in EESSI, while support for AMD GPUs is a work in progress. EESSI is already available on various EuroHPC JU supercomputers today, and is currently being made available on the others. In addition, EESSI will be integrated into the EFP in various ways.

We will present a short introduction to EESSI and its current status, outline how it is being integrated into the EFP as the Federated Software Catalogue component, demonstrate how it can already be used on EuroHPC JU supercomputers, and provide an outlook on upcoming improvements and enhancements.

More information and registration available here.
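For readers who have not tried it yet, using EESSI on a system where it is mounted typically looks like the session sketched below. The version directory and module name are examples taken as assumptions; the EESSI documentation lists the current paths, and EESSI selects binaries optimized for the host CPU microarchitecture automatically.

```shell
# Initialize the EESSI environment for the current shell
# (version directory shown is an example; see the EESSI docs)
source /cvmfs/software.eessi.io/versions/2023.06/init/bash

# Browse the software installations provided by EESSI
module avail

# Load an application; EESSI serves a build matching the host CPU
module load GROMACS
```

No installation or compilation is needed on the user’s side; the same commands work on any EuroHPC system (or workstation, or cloud VM) where the EESSI CVMFS repository is available.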

MultiXscale in the NCC/CoE Success Stories Booklet 2025

The MultiXscale success story “Automating the deployment of large software stacks for analysing huge radio astronomy data streams” is now available in the NCC/CoE Success Stories Booklet 2025 (pages 164-165). In this issue, you can discover inspiring examples of how National Competence Centres (NCCs) and Centres of Excellence (CoEs) across Europe are leveraging supercomputing to tackle complex challenges and drive innovation. From SMEs to research institutions, these stories show how collaboration and advanced computing drive success across borders. Explore the document here.

The “Organizing software community workshops: Experiences from three independent simulation software projects” article has been published in the Electronic Communications of the EASST

The article “Organizing software community workshops: Experiences from three independent simulation software projects” was published in the Electronic Communications of the EASST on December 15, 2025. It is available through this link: https://doi.org/10.14279/eceasst.v85.2700

The ESPResSo summer school is an annual CECAM training event for the soft matter community. It plays a critical role in the dissemination of MultiXscale’s portfolio of tools and workflows for multi-scale simulations, and relies on a unique blend of scientific lectures and live coding sessions for rapid onboarding of new software users. Each iteration of the school emphasises a different aspect of soft matter physics and allocates time to presenting other software relevant to multi-scale simulations, such as waLBerla, GROMACS, and SwarmRL. On the last day, research talks illustrate how these simulation packages are applied to solve real-world problems, and discuss new trends and recent developments in soft matter research.

In the paper, together with collaborators from other simulation packages, we share lessons learned about event formats that maximise participant engagement and community growth, and distill “good practices” for event planning that complement the literature on this topic by documenting the specific challenges of formats with live coding sessions.
