- An EU collaboration aims to more accurately simulate multiple physical systems at the same time
- Custom code “glues” existing physics solvers together based on their input and output
- Glue is needed because solvers must communicate to solve a multi-physics problem
All around the world, scientists are racing to create the first supercomputer capable of exascale computing. The European Union has even invested €1 billion of public funding to boost HPC efforts in this area.
Though we’re still several years away from a computer that can handle one quintillion (10^18) calculations per second, scientists in the German DFG priority program Software for Exascale Computing (SPPEXA) are creating architectures that could one day run on these kinds of machines.
One of the multiple projects in SPPEXA is ExaFSA, involving researchers at the University of Stuttgart, the Technical University of Munich, Delft University of Technology, TU Darmstadt, and the University of Siegen.
Dr. Benjamin Uekermann, a researcher at SPPEXA, tells Science Node that the idea behind this project is to more accurately simulate complex systems involving several different physical models.
This kind of research could explore the physics of the human body or how climate change works on a grand scale. Specifically, they want to simulate fluid-structure-acoustics interactions, where a very apt example is something we’ve discussed before: wind turbines.
When a wind turbine rotates, it both affects and is affected by the environment around it. The wind vibrates the blades, which creates noise. That makes this a multi-physics problem involving the physics of wind, vibration, and sound. On top of that, there’s a multi-scale problem: turbulent flow structures are measured in centimeters, while the noise they generate propagates over hundreds of meters.
“ExaFSA brings engineering applications to high-performance computing architectures,” says Dr. Miriam Mehl, a scientist at the University of Stuttgart who works on the project.
Stuck like glue
While existing tools can already model each property, these solvers don’t work together simultaneously. This often forces scientists to omit some physical property from their simulation, producing less accurate results. What’s more, the solvers are highly complex and often act like black boxes, with no details known beyond their input and output data.
Therefore, researchers had to come up with a way to combine these separate solvers into a single platform. To that end, scientists from the University of Stuttgart and the Technical University of Munich created preCICE. This open-source coupling library is a huge jump forward. Uekermann explains that preCICE allowed the team to work with highly complex legacy codes without having to make major alterations.
“We developed the open-source coupling library preCICE as a glue code to couple different existing simulation software components,” says Uekermann. “For example, we couple the flow solver FASTEST (TU Darmstadt) with the structure solver FEAP (developed at Berkeley) and the acoustics far-field solver Ateles (University of Siegen).”
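The glue-code idea can be sketched with a toy example: two “black-box” solvers that exchange only interface values, which the coupling layer iterates to a mutually consistent state. This is an illustration of the partitioned-coupling concept, not the actual preCICE API; the spring/load model and all constants are invented for the example.

```python
# Toy partitioned fluid-structure coupling: the "glue" layer only sees
# each solver's input and output, never its internals.

def structure_solver(force):
    """Black box 1: a linear spring returns displacement for a given load."""
    stiffness = 100.0
    return force / stiffness

def fluid_solver(displacement):
    """Black box 2: the aerodynamic load relaxes as the structure deflects."""
    base_load, sensitivity = 10.0, 20.0
    return base_load - sensitivity * displacement

def couple(omega=0.5, tol=1e-10, max_iters=100):
    """Fixed-point iteration, as a coupling library might orchestrate it."""
    displacement = 0.0
    for _ in range(max_iters):
        force = fluid_solver(displacement)
        new_displacement = structure_solver(force)
        if abs(new_displacement - displacement) < tol:
            return new_displacement
        # Under-relaxation damps the exchange so the iteration stays stable
        displacement += omega * (new_displacement - displacement)
    return displacement

print(couple())  # the consistent interface state both solvers agree on
```

Real coupling libraries do far more (parallel data mapping between non-matching meshes, quasi-Newton acceleration, time interpolation), but the core contract is the same: each solver is driven only through its inputs and outputs.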
Modeling a multi-physics problem demands enormous computing power, which is why the team relied on Leibniz Supercomputing Centre’s SuperMUC and Stuttgart’s Hazel Hen. SuperMUC utilizes 245,512 cores and has a peak performance of 6.92 petaFLOPS, while Hazel Hen relies on 185,088 cores and peaks at 7.42 petaFLOPS.
Both of these machines are powerful, but they are nowhere near exascale. However, the researchers working on ExaFSA aren’t letting a lack of technology stop them.
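How far off is that? A quick back-of-the-envelope comparison, taking Hazel Hen’s peak of roughly 7.42 petaFLOPS against the exascale target of 10^18 operations per second:

```python
# Rough gap between today's petascale machines and an exascale target.
# Both figures are peak ratings, not sustained application performance.
exascale = 1e18            # 10^18 floating-point operations per second
hazel_hen_peak = 7.42e15   # Hazel Hen's peak, about 7.42 petaFLOPS

gap = exascale / hazel_hen_peak
print(f"An exascale machine would be roughly {gap:.0f}x faster than Hazel Hen")
```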
That said, expectations for exascale might be putting the cart before the horse. Mehl says that current projections show that an exascale machine would require an entire power plant to function. However, this technology will be here before we know it, and researchers like Uekermann and Mehl are excited about the possibilities.
“Gluing together two or more codes in a way that works is really gainful,” says Mehl. “It brings together pieces that are already working great on their own, and you can bring them together into something which is much more than the sum of its parts.”
Exactly when or how exascale will come to fruition remains a mystery, but one thing we do know is that scientists like the ones working on ExaFSA are ready for it. Mehl tells Science Node that the team is confident their work on ExaFSA and preCICE will contribute to successfully exploiting the available compute power whenever exascale arrives.