
The poster abstract submission deadline for CECAM Flagship School “Simulating soft matter across scales” has been extended to Sept. 27th

The poster session of the ESPResSo summer school is a great opportunity to present your research and engage in meaningful discussions with soft matter experts and ESPResSo/waLBerla/pyMBE developers. It serves not only as a platform to present your work to your peers but also as a chance to network, gather feedback, and foster collaborations that can extend well beyond the duration of the event. Everyone bringing a poster is invited to present it in a 1-minute lightning talk during the poster session. The poster boards will remain up for the entire duration of the school.

The school will focus on coarse-grained and lattice-based simulation methods to model soft matter systems at mesoscopic length and time scales. We will simulate coarse-grained ionic liquids in electrolytic capacitors, coarse-grained liquids with machine-learned effective potentials, polymer diffusion, hydrodynamic interactions via the lattice-Boltzmann method, and electrokinetics and catalysis with diffusion-advection-reaction solvers. Lectures will provide an introduction to the physics and to simulation model building, as well as an overview of the simulation algorithms needed to resolve physical processes at different time scales. During the afternoons, students will practice running their own simulations in hands-on sessions using ESPResSo and waLBerla. Time will also be dedicated to research talks and poster sessions.

We invite all interested to attend the ESPResSo summer school “Simulating soft matter across scales” on October 7–11, 2024, at the University of Stuttgart, Germany. Attendance at the summer school is free of charge. To register, go to https://www.cecam.org/workshop-details/1324 and submit a short motivation and CV. You can submit a poster abstract until September 27th, 2024.

Sprint Training Event: “Introduction to EESSI” on 4 October 2024

EuroCC Austria and EuroCC Slovenia, together with the CoE MultiXscale, are organizing a Sprint Training Event, “Introduction to EESSI – European Environment for Scientific Software Installations”, on 4 October 2024. The European Environment for Scientific Software Installations (EESSI – pronounced “easy”) is a common stack of scientific software for HPC systems and beyond, including laptops, personal workstations, and cloud infrastructure. In many ways it works like a streaming service for scientific software, instantly giving you the software you need, when you need it, compiled to work efficiently for the architecture you have access to.

In this 2-hour online workshop, we’ll explain what EESSI is, how it is being designed, how to get access to it, and how to use it. We’ll give a number of demonstrations, and you can try EESSI out yourself.

Course event page: https://events.vsc.ac.at/e/EESSI-2024-10
Registration: https://events.vsc.ac.at/event/141/registrations/137/
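To give a taste of what the hands-on part covers: on a machine where the CernVM-FS client is already configured for EESSI (an assumption; the workshop explains how to set this up), activating the stack is a one-line affair. The module name below is illustrative of software shipped in the 2023.06 production stack:

```shell
# Activate the EESSI software stack (requires a configured CernVM-FS client);
# this puts CPU-optimized builds for the host architecture on the module path
source /cvmfs/software.eessi.io/versions/2023.06/init/bash

# Browse and load software as with any Lmod-based module system
module avail GROMACS   # list the builds available for this CPU
module load GROMACS    # load the default version
```

After the `module load`, the application binaries are streamed on demand from the EESSI repository rather than installed locally.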

Leveraging EESSI for SKA Radio Astronomy Data on Global SRCnet Infrastructure

In collaboration with the SKA project, we demonstrated the successful use of the European Environment for Scientific Software Installations (EESSI) to run radio astronomy analyses on the globally distributed SRCnet infrastructure. The SKA project faces an immense challenge: it must process and analyze an estimated 700 PB of data each year while operating across a globally distributed infrastructure. To handle this massive amount of data effectively, it is crucial that the right software is delivered to the right locations with optimal performance.

By deploying software across multiple SKA regional centres, including those in the Netherlands, Japan, Korea, and Canada, we showcased how EESSI enables seamless and efficient data processing. This proof of concept highlighted the flexibility of EESSI across a variety of systems, such as HPC, Cloud, and Kubernetes, meeting the complex requirements of the SKA’s high-performance data analysis needs. Concretely, we deployed through EESSI various pieces of software that are normally used as part of a radio astronomy analysis pipeline (AOFlagger, Casacore, IDG, EveryBeam, DP3 and WSClean). This allowed the SKA regional centres to run this pipeline on any node of their distributed infrastructure without first having to download complete containers.

EESSI’s capability to optimize software for various CPU models and to reduce network traffic and startup latency proved invaluable; for certain use cases, it has been shown to deliver performance improvements of up to 30%. While EESSI may not be a one-size-fits-all solution for SKA, its key technologies can play an important role in helping to meet these demands. By adopting and integrating select components of EESSI, SKA can improve the efficiency and performance of its software stack.

Extrae available in EESSI

Thanks to work carried out under the MultiXscale CoE, we are proud to announce that as of 22 July 2024, Extrae v4.2.0 is available in the EESSI production repository software.eessi.io, optimized for the 8 CPU targets that are fully supported by version 2023.06 of EESSI. This makes it possible to use Extrae effortlessly on the EuroHPC systems where EESSI is already available, such as Vega and Karolina. It is worth noting that, as of the same date, Extrae is also available in the EESSI RISC-V repository riscv.eessi.io.

Extrae is a package developed at BSC devoted to generating Paraver trace files for post-mortem analysis of application performance. It uses different interposition mechanisms to inject probes into the target application so as to gather information about the application’s performance, and it is one of the tools used in the POP3 CoE.

The work to incorporate Extrae into EESSI started in early May. It took quite some time and effort, but it resulted in a number of updates, improvements and bug fixes for Extrae. For full details of the work, see the extended EESSI blog post.
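For readers who want to try it, a minimal sketch of tracing an MPI application with the Extrae module from EESSI might look as follows. The application binary, the configuration file path, and the exact module version suffix are placeholders (EESSI module names carry a toolchain suffix); consult the Extrae manual for the XML configuration format:

```shell
# Activate the EESSI software stack and load Extrae
source /cvmfs/software.eessi.io/versions/2023.06/init/bash
module load Extrae

# Point Extrae at its XML configuration file (placeholder path)
export EXTRAE_CONFIG_FILE=./extrae.xml

# Preload the MPI interposition library so probes are injected into the
# target application; ./my_app is a placeholder for your MPI binary.
# $EBROOTEXTRAE is set by the module and points at the Extrae installation.
mpirun -n 4 env LD_PRELOAD="$EBROOTEXTRAE/lib/libmpitrace.so" ./my_app
```

The resulting trace files can then be inspected post-mortem with Paraver.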

Using EESSI in GitHub Action workflows

By Jean-Noël Grad

GitHub continuous integration and continuous delivery (CI/CD) pipelines can leverage EESSI [1] to download pre-built scientific software using the EESSI GitHub Action [2]. GitHub workflows are routinely used to execute test suites, generate and deploy software documentation, and run executable papers.

As a real-world example, we will explore pyMBE [3], a molecular builder that simplifies and automates the creation of complex molecular models in the molecular dynamics engine ESPResSo. As part of pyMBE’s software quality assurance, every code contribution is automatically tested against stable ESPResSo releases. This is achieved by a workflow called testsuite.yml [4], which loads ESPResSo 4.2.1 and installs the subset of Python dependencies not already provided by EESSI. In a subsequent stage, the test suite is executed to check that the software behavior meets our specifications and reproduces published results. The software user guide is generated to verify compliance with the Sphinx specifications, and uploaded as an artifact that can be downloaded by human reviewers to confirm that any new feature is properly documented. After a contribution is merged to the main branch, and upon successful completion of the test suite on the main branch, another workflow called deploy.yml [5] automatically reads and uploads the documentation artifact to the pyMBE online user guide [6], which is hosted on GitHub Pages.

References:
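The pattern described above can be sketched as a minimal workflow. This is an illustrative fragment, not pyMBE’s actual testsuite.yml: the job and step names, the module version suffix, and the `make tests` target are assumptions, while the `eessi/github-action-eessi` action is the one referenced in the text:

```yaml
name: testsuite
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Mount the EESSI software stack via CernVM-FS
      - uses: eessi/github-action-eessi@v3
        with:
          eessi_stack_version: "2023.06"
      # Load ESPResSo from EESSI, add the missing Python dependencies,
      # then run the test suite (module suffix and targets are illustrative)
      - name: Run test suite
        run: |
          module load ESPResSo/4.2.1-foss-2023a
          pip install --user -r requirements.txt
          make tests
```

Because EESSI streams the pre-built ESPResSo binaries on demand, the job avoids compiling the engine or pulling a large container image on every run.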

pyMBE: The Python-based molecule builder for ESPResSo

By Jean-Noël Grad

We are happy to announce the first release of pyMBE, an open-source Python package designed to facilitate the design of custom coarse-grained models of polyelectrolytes, peptides and proteins in ESPResSo (https://doi.org/10.5281/zenodo.12102635). pyMBE extends the ESPResSo API with methods to automate repetitive and error-prone tasks, such as setting up chemical bonds, non-bonded interactions and reaction methods. pyMBE is maintained by an active community of soft matter researchers with a shared interest in the modeling of weak polyelectrolytes and biomacromolecules. We welcome new users and developers to join the project and contribute new features! Learn more about pyMBE in our recent publication in The Journal of Chemical Physics (https://doi.org/10.1063/5.0216389), where we outline the main features of pyMBE and show how it can be leveraged in computational soft matter research.

HPC Knowledge Meeting – HPCKP Barcelona, May 2024

The recording and slides of the talk “Streaming scientific software has never been so EESSI”, given by Alan O’Cais at HPCKP’24 in Barcelona, are now available online here.

Abstract: Have you ever wished that all the scientific software you use was available on all the resources you had access to, without having to go through the pain of getting it installed the way you want or need? The European Environment for Scientific Software Installations (EESSI – pronounced “easy”) is a common stack of scientific software installations for HPC systems and beyond, including laptops, personal workstations and cloud infrastructure. In many ways it works like a streaming service for scientific software, instantly giving you the software you need, when you need it, compiled to work efficiently for the architecture you have access to. In this talk, we’ll explain what EESSI is, how it is being designed, how to get access to it, and how to use it. We’ll include a number of demonstrations and review significant developments of the last 12 months, including support for NVIDIA GPUs and active development for RISC-V systems.

Download PDF

MultiXscale at InPEx Workshop 2024

The International Post-Exascale Project (InPEx) is a pioneering initiative bringing together the brightest minds in the field of high-performance computing, from researchers and engineers to policy organizations and funding bodies. Its workshops, accompanied by workgroups of experts dedicated to critical topics for Exascale, are designed to foster international collaboration and co-design, essential on the journey towards and beyond Exascale computing. The technical manager of MultiXscale (Alan O’Cais) was invited to participate in the 2024 InPEx Workshop, and in particular contributed to the “Software production and management” topic. Lively discussions were held as part of that topic, with general agreement that creating a software installation description format could be very useful to encourage collaboration between the various software installation tools in use “in the wild” (at this particular workshop, EasyBuild/EESSI, Spack and Guix were represented).

First portable test run on two systems with different architectures

One of the milestones in MultiXscale is to be able to run the EESSI test suite on at least two different architectures. In the context of EuroHPC, that means running on different partitions of the available EuroHPC supercomputers. Our initial effort focused on making the test suite portable between two different supercomputers: Karolina and Vega (the CPU partitions of both use the Zen2 architecture). More recently, we have spent time getting the same test suite working on a more “exotic” architecture, the Arm A64FX architecture of Deucalion (currently in pre-production). This brought some additional complications for us, as CernVM-FS is not yet natively available there.

The performance/scalability plots we measured for the ESPResSo application of MultiXscale are shown below. For full technical details of how we carried this out (and how you can repeat it for yourself!), please take a look at the EESSI blog post on this milestone. We look forward to reporting increased performance for ESPResSo in the future as we implement some of the ideas suggested in our recent deliverable on this topic.

New Paper available at Faraday Discussions Journal

A new paper, “Investigating the effect of particle size distribution and complex exchange dynamics on NMR spectra of ions diffusing in disordered porous carbons through a mesoscopic model”, is available in the Faraday Discussions journal. The document is accessible for download here. The corresponding poster was recently presented at the CECAM workshop “Electrochemical Interfaces in Energy Storage: Advances in Simulations, Methods and Models”, held at CECAM-HQ-EPFL, Lausanne (Switzerland), from 18 to 21 June 2024.
