
Hunting for giants

A diagram showing how high-energy particle collisions produce a top quark (t) and its antimatter equivalent, the anti-top quark (t̄). Image courtesy Wikimedia Commons.

It's roughly thirty-five times heavier than the next-heaviest quark and about as heavy as the nucleus of a gold atom. The top quark is the elephant in the room of the standard model. Its lifetime is so short that in particle detectors we can only infer its existence by sifting through vast quantities of data produced by smashing subatomic particles together. "Without distributed computing, hunting for the top quark would be impossible," said Marcel Vreeswijk, associate professor of subatomic physics at the University of Amsterdam, who works on the ATLAS experiment at the Large Hadron Collider (LHC) at CERN, in Switzerland.

ATLAS produces new data every 25 nanoseconds (billionths of a second), helping scientists home in on an accurate mass for the top quark. To cope with this flood of data, much of which is random 'noise', it has to be filtered at the detector stage. Even so, processor farms must reconstruct collision events hundreds of times per second, after which the data is sent out to the grid for further processing. "There are many people in different countries looking at the results," Vreeswijk said.
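
To get a feel for the scale of that filtering, the back-of-the-envelope arithmetic can be sketched in a few lines of Python. The trigger output rate and event size used here are illustrative round numbers chosen for the example, not official ATLAS figures.

    # Rough, illustrative arithmetic for the data reduction described above.
    # All numbers below are assumptions for the sake of the example,
    # not official ATLAS figures.
    bunch_crossing_interval_s = 25e-9                      # one crossing every 25 ns
    crossings_per_second = 1 / bunch_crossing_interval_s   # 40 million per second
    events_kept_per_second = 300                           # assumed rate surviving the filters
    event_size_bytes = 1.5e6                               # assumed ~1.5 MB per reconstructed event

    reduction_factor = crossings_per_second / events_kept_per_second
    grid_rate_gb_per_s = events_kept_per_second * event_size_bytes / 1e9

    print(f"Crossings per second:     {crossings_per_second:,.0f}")
    print(f"Trigger reduction factor: ~1 in {reduction_factor:,.0f}")
    print(f"Data shipped to the grid: ~{grid_rate_gb_per_s:.2f} GB/s")

Even with these rough numbers, only about one crossing in every hundred thousand survives to be shipped out for further analysis, which is why the filtering has to happen before the grid ever sees the data.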

The mystery is in the mass

By simulating the high-energy conditions of the early universe, particle accelerators enable physicists to investigate the fundamental constituents of matter. Protons and neutrons in atoms are not the most basic building blocks, but are themselves made of a combination of 'up' and 'down' quarks - three in total. There are also four other, more exotic, types of quark besides 'up' and 'down' - including the top quark - that contribute to a picture of particle physics called the standard model. Top quarks were abundant in the early universe, but now, even in the world's most powerful accelerator at the LHC, they decay immediately.
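
Written out compactly - and this is standard textbook bookkeeping rather than anything particular to this article - the quark content mentioned above is:

\[
p = (u\,u\,d), \qquad n = (u\,d\,d),
\]
\[
\text{quark flavours:}\quad (u,\,d),\ (c,\,s),\ (t,\,b).
\]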

"In nature we don't see quarks on their own because they're unstable. In our detectors, we see a signature shower of particles, including electrons, protons, pions, and muons," Vreeswijk said. "The top quark decays immediately into a bottom quark and a W boson - and the W boson decays into other particles. From these decays we know the top quark mass more accurately than any other particle."

The top and the Higgs

A video about the top quark, the heaviest of the fundamental building blocks of the universe. Together with the Higgs boson, it is one of the elementary particles studied at the Large Hadron Collider (LHC), the world's largest and most complex scientific instrument. Video courtesy EGI.

Physicists think that particles, including the top quark, acquire their mass by interacting with the Higgs field. Knowing the mass of the top quark accurately is important because it can be used as a 'standard' in particle detector collisions, and will help point to the existence of the Higgs boson. Finding it, however, won't be the end for physicists like Vreeswijk. "If we find the Higgs particle, that won't be enough. We also have to test it against the standard model predictions," Vreeswijk said.

At high energies, inconsistencies could begin to appear, such as deviations of recorded masses from those predicted by theory, or experimental results not matching up with Monte Carlo simulations. "When you have inconsistencies like this, we know the standard model cannot be the model. There must be something else," Vreeswijk said.

This opens up the possibility of new physics. One theory is supersymmetry, where each of the fundamental particles is mirrored by a partner with heavier mass. "It would complete the standard model in an elegant way," said Vreeswijk.

There are even more exotic possibilities, such as extra dimensions, occupied by tiny black holes. "It sounds bizarre, but actually it's based on solid theory," Vreeswijk said.

Beyond the top quark

People are already working on the next generation of particle accelerators - such as CLIC at CERN - which will reach even higher energies. "They will produce even more data, so this brings us back to distributed computing," Vreeswijk said. Even now, there is not enough computing power to process everything from every 25-nanosecond collision window, so the next generation of faster computers will need to handle more data without the need for filtering at the detector stage. That would allow physicists to investigate rare processes that would otherwise be lost.

Vreeswijk also pointed out the importance of investing in computing infrastructure. "There is a secondary benefit to distributed computing for data analysis," Vreeswijk said. "Local scientific institutes in individual countries benefit from extra computer power, while avoiding overloading a central facility such as the one at CERN."
