iSGTW Feature - Testing the limits of the Standard Model

This frame is from an animation illustrating the typical four-dimensional structure of gluon-field configurations used by Sugar and his research group.
Image courtesy of Derek Leinweber, CSSM, University of Adelaide.

Questions about the Standard Model of physics have stymied scientists for decades, but now, thanks to supercomputing grid technology, Robert Sugar believes we'll have answers in just five to ten years.

Sugar, a professor of physics at the University of California, U.S., is a TeraGrid user and a member of the MILC Collaboration, one of the largest users of open-science computing in the world. MILC first turned to computers for its calculations in the 1980s, believing only supercomputers could solve its otherwise intractable problems.

"There was a long period where people were developing the tools and algorithms, but not producing results that were accurate enough to have an impact on experiments," Sugar recalls.

"But that's no longer true," he continues. "Over the last five years, our calculations have become more and more accurate. We're beginning to do calculations that really have an impact on experiments."

Plotting four-dimensional space-time grids

To deduce the properties of subatomic particles, QCD researchers use lattice gauge methods, in which the quantum fields of quarks are plotted on a four-dimensional space-time grid divided into as many points as computational power will allow.
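To make the idea concrete, here is a minimal illustrative sketch (not actual lattice-QCD code, and not MILC's software): a field stored on a four-dimensional space-time grid. Real QCD simulations store far richer data at each site and link; a single complex number per site stands in here purely to show the data layout. The grid sizes are toy values chosen for the example.

```python
import numpy as np

# Toy spatial and temporal extents -- real lattices are much larger.
nx, nt = 8, 16

# One complex value per site of an nx^3 x nt space-time grid.
field = np.zeros((nx, nx, nx, nt), dtype=complex)

# Each site is addressed by four integer coordinates (x, y, z, t):
field[3, 1, 4, 7] = 1.0 + 0.5j

print(field.size)  # 8**3 * 16 = 8192 sites
```

Even at these toy sizes, the four-dimensional indexing makes clear why refining the grid in every direction multiplies the work so quickly.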

One time history contributing to the propagation of a meson in lattice QCD between the two points denoted by the gray dots. The blue path shows the trajectory of a quark-antiquark pair, the green path shows the excitation of a quark-antiquark pair from the vacuum, and the red lines indicate gluon fields.
Image by J. W. Negele and SciDAC Review

"The calculations get better and better as you make the grid spacing smaller and smaller," Sugar says. "But if you make the lattice spacing smaller, you have to include more grid points and that takes more computing power, which grows by the 4th power of the grid size! So you can see why we need larger computers to get greater accuracy."
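The 4th-power growth Sugar describes can be checked with a short sketch. This is illustrative arithmetic only (the `lattice_sites` helper is hypothetical, not a MILC routine): in a fixed physical volume, halving the lattice spacing doubles the number of points in each of the four space-time dimensions, so the total site count grows by 2⁴ = 16.

```python
def lattice_sites(points_per_dim, time_points):
    """Total sites in a 4D space-time lattice: N^3 spatial x Nt temporal."""
    return points_per_dim ** 3 * time_points

# The 64^3 x 144 grid mentioned later in the article:
print(lattice_sites(64, 144))  # 37,748,736 sites

# Doubling the resolution in all four dimensions multiplies the
# site count -- and, roughly, the memory and compute cost -- by 16:
print(lattice_sites(128, 288) // lattice_sites(64, 144))  # 16
```

This is the arithmetic behind the need for ever-larger machines: each halving of the spacing demands an order of magnitude more computing power before any gain in accuracy is realized.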

MILC-ing massively parallel resources

The MILC Collaboration is making heavy use of the TeraGrid supercomputing centers, consuming more than 19 million computing hours at TACC in 2008, and millions more on computers at the Pittsburgh Supercomputing Center, the National Center for Supercomputing Applications and the San Diego Supercomputer Center.

Processing in parallel across more than 4,000 computer cores on TACC's Ranger, the group is able to simulate 64³×144 lattice grids with the smallest spacings and quark masses to date.

Sugar believes users of the MILC lattices will soon be able to predict the masses and decay properties of strongly interacting particles to an accuracy of a few percent.
Ultimately, the goal is to define the "realm of applicability" for the Standard Model and show where the new physics begins.

- Aaron Dubrow, Texas Advanced Computing Center

This article appears in a longer form on the TACC Web site.