
The continuing question of exascale

Photo courtesy Cristina Costa, Flickr.

Extreme sciences such as climatology, genomics, neuroscience, bioengineering, geosciences, and nuclear engineering create immense amounts of data. Add in multiple models, a few virtualized simulations here and there, and a pinch (make that 80 million) of extensive calculations, and you have a sure recipe for generating big data at unprecedented rates. As dataset sizes grow exponentially, so too does researchers' need to store, recall, and work with that data. These pressures are driving changes in supercomputing architecture and in how supercomputers achieve results.

As William Harrod, division director for the US Department of Energy (DOE) Advanced Scientific Computing Research program, noted at SC12, "Extreme-scale science is creating a data explosion, so the next generation of scientific breakthroughs will require major advances in computing technology." This hardly comes as a revelation, as the goals and major challenges of exascale computing have been well documented in past reports compiled by both the DOE and the US Defense Advanced Research Projects Agency (DARPA).

The supercomputing community accepts exascale computing (computer systems that can execute at least one exaFLOPS, or 1,000 times the performance of today's petascale systems) as the next step in advancing scientific discovery. However, advancement brings growing pains. Kirk Cameron, director of the SCAlable PErformance (SCAPE) Laboratory at Virginia Polytechnic Institute and State University (Virginia Tech) and an expert in high-performance, power-aware computing, reminded SC12 attendees of a notable DOE goal in this regard: to reach sustained exascale computing by 2019 while staying at or below a 20-megawatt power budget per exaFLOPS. (Harrod himself calls this limit audacious.)

Current petascale systems use between two and three megawatts per petaFLOPS. To put that in perspective, one megawatt can power close to 1,000 US homes. As supercomputing scales up to machines that sustain 10 to 20 petaFLOPS, the cost of the added power consumption erodes the return on the added processing power. Now consider exascale, where one exaFLOPS is 1,000 times one petaFLOPS: at today's efficiency, an exascale machine would draw on the order of 2,000 to 3,000 megawatts, a staggering power bill and roughly 100 times the DOE's 20-megawatt target.
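To make that gap concrete, here is a minimal back-of-envelope sketch in Python. It uses only the figures quoted above (roughly two to three megawatts per petaFLOPS, about 1,000 US homes per megawatt, and the DOE's 20-megawatt ceiling) and is an illustration of the arithmetic, not an official DOE projection.

```python
# Back-of-envelope scaling of today's petascale power efficiency to exascale,
# compared against the DOE's 20 MW target. Figures are those quoted in the
# article, not an official projection.

PFLOPS_PER_EXAFLOPS = 1_000          # 1 exaFLOPS = 1,000 petaFLOPS
MW_PER_PFLOPS_TODAY = (2.0, 3.0)     # current petascale range: 2-3 MW per petaFLOPS
DOE_TARGET_MW = 20                   # DOE exascale power ceiling
HOMES_PER_MW = 1_000                 # ~1,000 US homes per megawatt

for mw_per_pflops in MW_PER_PFLOPS_TODAY:
    naive_exascale_mw = mw_per_pflops * PFLOPS_PER_EXAFLOPS
    homes = naive_exascale_mw * HOMES_PER_MW
    gap = naive_exascale_mw / DOE_TARGET_MW
    print(f"At {mw_per_pflops} MW/petaFLOPS: {naive_exascale_mw:,.0f} MW "
          f"(~{homes:,.0f} homes), {gap:.0f}x the 20 MW target")
```

Running the sketch shows a naive exascale machine drawing 2,000 to 3,000 megawatts, which is why efficiency, not just raw speed, dominates the exascale conversation.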

This has driven Cameron and SCAPE to pursue ways of monitoring and reducing power consumption on petascale-system architectures. For example, their PowerPack 3.0 software for power measurement, analysis, and optimization on clusters is now available. While optimistic about this proactive approach, Cameron recognizes the HPC community's aversion to letting power-management software decide how, and where, power will be reduced. Despite those reservations, Cameron has a small but growing list of end users who are seeing direct benefits from PowerPack.
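For readers unfamiliar with what node-level power measurement involves, the sketch below samples a processor's energy counter and reports average power. It is a simplified, hypothetical illustration, not PowerPack's API, and it assumes a Linux node that exposes Intel RAPL counters under /sys/class/powercap.

```python
# Simplified illustration of node-level power sampling. This is NOT PowerPack's
# API; it assumes a Linux node exposing Intel RAPL energy counters under
# /sys/class/powercap (energy_uj reports cumulative microjoules).
import time

RAPL_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"  # CPU package 0

def read_energy_uj() -> int:
    """Read the cumulative package energy counter, in microjoules."""
    with open(RAPL_FILE) as f:
        return int(f.read())

def sample_power_watts(interval_s: float = 1.0) -> float:
    """Average package power over one sampling interval, in watts.
    Counter wraparound is ignored for simplicity."""
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    return (e1 - e0) / 1e6 / interval_s   # microjoules -> joules -> watts

if __name__ == "__main__":
    for _ in range(5):
        print(f"package power: {sample_power_watts():.1f} W")
```

Tools like PowerPack go further, correlating measurements like these with application phases so that power can be reduced where it costs the least performance.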

Cameron acknowledges just how high the power and exascale hurdles are, calling power consumption an "embedded problem that is prolific and endemic." With the aging US power grid already at capacity and scrambling to keep up with demand, drawing ever-larger amounts of power from the grid is not a sustainable solution. Cameron predicts that sustained exascale, a goal the DOE originally set for 2019 and, as Harrod confirms, has already pushed back to 2022, is still well over a decade away.

Calling exascale the grand challenge of 21st-century computing, Harrod points out that making the exascale leap could place current petascale technologies in the hands of many more scientists and researchers, potentially even on the desktop. While he appreciates these possibilities, he also recognizes the importance of innovations in power and technology. Such innovations will need to scale not only across supercomputing, but also across other areas of technology and the computer industry.

The DOE has yet to design or develop actual exascale systems, but Harrod remains confident that future investments could make them a reality. He is less sure of the systems' importance to Congress, as lawmakers have yet to decide on funding. That said, extreme science is not waiting. Scientists and researchers have, in some cases, reached the limits of what petascale can provide and stand at a crossroads, waiting for exascale and funding to come together. As Thomas Sterling, associate director of the Center for Research in Extreme Scale Technologies (CREST) at Indiana University, asked at the end of 2011, "Can the US influence exascale direction and maintain a strategic lead in its deployment?" We have arrived at the end of 2012, and the answer remains to be seen.
