- Dark matter theory seeks to account for unseen mass whose gravity shapes observed galaxy motion.
- The Lattice Strong Dynamics group used the Vulcan supercomputer to test a variant theory.
- Collider data may confirm or rule out the theory within a year, if supercomputers can keep up with the calculations.
Have you ever gazed up into space and been completely overwhelmed by the sheer vastness of the universe and all its mysteries? Does the magnitude of galaxies upon galaxies, expanding as far as 13 billion light-years away, completely boggle your mind? It should. Figures and models explaining intergalactic phenomena can be incomprehensible even to the brainiest of scientists.
A team of physicists, part of the Lattice Strong Dynamics (LSD) collaboration, realized that clarity calls for some supercomputing action. They recently performed a computation that possibly explains the existence of a yet-to-be-deciphered phenomenon known as dark matter. Since dark matter makes up around 83% of all matter in the universe, this is a very big deal.
Swiss astronomer Fritz Zwicky first observed evidence of dark matter's existence in the 1930s. Zwicky found that galaxies were not moving at the speed they should, given the mass and gravitational pull of the other galaxies in their vicinity. These discrepancies suggested that some unseen matter was influencing the galaxies' movements. This gave rise to a theory of invisible, dark matter that still has physicists scratching their heads: dark matter itself cannot be seen, but its gravitational effects can.
Protons and neutrons are composite particles: they are made up of even smaller particles called quarks, which are bound together by fundamental force carriers called gluons.
With the help of Vulcan, the IBM BlueGene/Q supercomputer at the Lawrence Livermore National Laboratory, physicists have been studying the properties of protons and neutrons that make up ordinary matter. Quantum chromodynamics (QCD) — a theory that also explains the strong nuclear force that holds the nucleus in all atoms, in every part of our visible world — describes these physical phenomena.
The LSD collaboration's 'stealth dark matter' theory, however, proposes that dark matter arises from a hidden, 'stealth' analogue of QCD. In their theory, a different species of quarks (dark quarks, which individually carry ordinary electric charge) and gluons (dark gluons) bind so tightly into a composite particle (a 'dark neutron') that, much like an ordinary neutron, the composite is electrically neutral overall and therefore effectively invisible.
In order to see the invisible, the LSD collaboration uses supercomputers to put theory into practice. “Theories that involve such strong interactions of elementary particles are not solvable with ‘pencil and paper’ calculations,” explains Pavlos Vranas, staff physicist and team lead of the LSD collaboration.
Instead, scientists invented a theoretical method called the lattice. To solve such theories, the team discretizes space-time into a four-dimensional grid of points (the lattice), assigns the various particles to the sites and links of the grid, and maps them onto Vulcan's compute nodes. Researchers then program the supercomputer to 'evolve' these particles according to the rules of the theory of stealth dark matter. From the evolved configurations they can calculate physical properties, such as the mass of the composite stealth dark matter particle.
Calculating dark matter is like trying to calculate the movement of a cloud in the sky when there is wind. The grid of points can be seen as the sky, and the cloud encompasses points that are ‘filled with water.’ The supercomputer applies the forces of wind, moving the cloud this way and that. At any given time, it’s possible to extract the value of a physical property, such as the new shape or location of the ‘cloud.’
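The workflow described above, laying a field on a grid, evolving it according to the theory's rules, and then extracting an observable, can be sketched in miniature. The snippet below is an illustrative toy, not the LSD collaboration's code: it evolves a simple scalar field on a small two-dimensional periodic grid with a Metropolis update (real stealth dark matter calculations use a four-dimensional lattice and far richer dynamics), then measures one stand-in 'physical property' of the result. All names, the action, and the parameters here are invented for the sketch.

```python
import math
import random

def metropolis_sweep(field, L, beta, msq, rng):
    """One Metropolis sweep over an L x L periodic lattice.

    Toy action: S = beta * sum over links (phi_x - phi_y)^2
                  + msq  * sum over sites phi_x^2
    Each site's value is nudged; the move is accepted with the
    standard Metropolis probability exp(-dS) when dS > 0.
    """
    for x in range(L):
        for y in range(L):
            old = field[x][y]
            new = old + rng.uniform(-0.5, 0.5)
            # Sum of nearest neighbours, with periodic boundary conditions.
            nbrs = (field[(x + 1) % L][y] + field[(x - 1) % L][y]
                    + field[x][(y + 1) % L] + field[x][(y - 1) % L])
            # Change in action from replacing old -> new at this site.
            dS = (beta * (4 * (new * new - old * old) - 2 * (new - old) * nbrs)
                  + msq * (new * new - old * old))
            if dS <= 0 or rng.random() < math.exp(-dS):
                field[x][y] = new

def average_field_squared(field):
    """A toy 'observable': the lattice average of phi^2, standing in for a
    physical property extracted from the evolved configuration."""
    L = len(field)
    return sum(v * v for row in field for v in row) / (L * L)

L, beta, msq = 8, 1.0, 0.5
rng = random.Random(42)
field = [[0.0] * L for _ in range(L)]     # the 'sky': a grid of points
for sweep in range(200):                  # 'evolve' per the theory's rules
    metropolis_sweep(field, L, beta, msq, rng)
obs = average_field_squared(field)
print(f"<phi^2> = {obs:.4f}")
```

In the cloud analogy, the grid is the sky, each sweep is a gust of wind rearranging the cloud, and `average_field_squared` is the moment where you pause the simulation and measure the cloud's new shape. Production lattice codes parallelize exactly this kind of local update across thousands of compute nodes, which is why machines like Vulcan are essential.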
Based on experimental findings at the Large Hadron Collider in Switzerland, the team hopes to have a sense of whether their stealth dark matter theory holds true within a year. However, the computational demands of their theories are unprecedented and can only be met by a massively parallel supercomputer such as Vulcan. While today’s fastest supercomputers run at speeds of petaFLOPS (1,000 trillion computations per second), the computational demands of these theories increase rapidly and will soon require exaFLOPS speeds (1 million trillion computations per second).
“The interplay of the theories of the smallest entities of our world with the development of the fastest computation machines is intriguing,” notes Vranas. Finding a way to make the invisible visible through the power of supercomputers is intriguing, indeed.
The LLNL lattice team authors are Evan Berkowitz, Michael Buchoff, Enrico Rinaldi, Christopher Schroeder, and Pavlos Vranas. The Laboratory Directed Research and Development, the LLNL Grand Challenge computation programs, the DOE Office of Science High Energy Theory, and the High Energy Physics Lattice SciDAC program supported this research. Other collaborators are physicists Thomas Appelquist and George Fleming of Yale University; Richard Brower, Claudio Rebbi, and Evan Weinberg of Boston University; Xiao-Yong Jin and James Osborn of Argonne Leadership Computing Facility; Joe Kiskis of the University of California, Davis; Graham Kribs of the University of Oregon; Ethan Neil of the University of Colorado and RIKEN-BNL Research Center at Brookhaven National Laboratory; David Schaich of Syracuse University; Sergey Syritsyn of RIKEN-BNL Research Center at Brookhaven National Laboratory; and Oliver Witzel of Boston University and the University of Edinburgh.