- Aerosol particles contribute to the premature deaths of millions of people each year.
- Calculating the effect of aerosols on global climate change remains highly uncertain.
- Big Red II brings the computing power to enable higher resolution models and process massive atmospheric data sets.
Newcastle upon Tyne, England. Dating back to its founding in the second century CE, this bustling university city is home to impressive bridges, Victorian underground tunnels, picturesque castles and cathedrals, and of course Newcastle Brown Ale. Newcastle may not be your typical supercomputing setting, which is why Big Red II at Indiana University is answering Newcastle University's call for computationally intensive research.
In a study recently published in Atmospheric Chemistry and Physics, scientists from Newcastle University, Cornell University, and Indiana University have joined forces in an attempt to reduce uncertainty in concentrations of aerosol particles. With the help of Big Red II, the research team is running high performance calculations to discern how atmospheric aerosol particles vary in space and time and understand the fundamental processes that dictate their concentrations and impacts.
Atmospheric aerosol particles are liquid droplets or solid particles suspended in the air, lofted there, for instance, by a volcanic eruption. They are the particles that make up air pollution. Alongside desert dust and human-made aerosols (think oil and coal emissions), these atmospheric aerosols have significant cooling effects on the world's climate by interacting with sunlight and clouds. They are the largest source of uncertainty in climate change projections, and scientists blame them for the premature deaths of millions of people around the world each year (one in eight deaths results directly from air pollution exposure, according to a 2014 report released by the World Health Organization).
“In our research we investigate the processes responsible for particle formation and growth in the atmosphere by using state-of-the-art model simulations of meteorological and chemical conditions over eastern North America,” explains Paola Crippa, research associate at Newcastle University, and lead author of the study.
By comparing results with NASA satellite observations, the team can assess how well the models describe the spatio-temporal variability of particle properties. This is essential to quantifying aerosol particles' current and future impacts on the earth's climate and human health more accurately.
“Our work — seeking to improve understanding of aerosol populations and better model them — is a critical first step in developing policies to reduce the climate impact and human health toll from high aerosol concentrations,” says Sara Pryor, professor at Cornell University and co-author of the study. The team is conducting massive numerical experiments (millions of individual computations) that require massively powerful computers. That’s where Big Red II comes in.
Big Red II’s computational capacities have allowed the team to run simulations at unprecedented high spatial resolution (12 km/7.46 mi) in a relatively short period of time. “The technical support from principal engineer Abhinav Thota and the Scientific Applications and Performance Tuning group at Indiana University enabled us to use a state-of-the-art regional weather and forecasting model at very high resolution for the first time,” says Crippa. Her group is among the few research groups worldwide trained to run such simulations, which had never been run systematically on a Cray machine before this study.
Big Red II operates at a peak rate of one thousand trillion floating-point operations per second (one petaFLOPS). At that rate, a person performing one calculation per second, 24 hours a day, 7 days a week, 365 days a year, would need more than 31 million years to complete what Big Red II can do in just one second.
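The comparison rests on simple arithmetic that can be checked directly; a minimal sketch, using only the petaFLOPS figure quoted above:

```python
# Peak rate of Big Red II: one petaFLOPS = 10**15 operations per second.
PETAFLOPS = 10**15

# Seconds in a non-leap year: 60 s * 60 min * 24 h * 365 days.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Years one person would need, at one calculation per second,
# to match what the machine performs in a single second.
years = PETAFLOPS / SECONDS_PER_YEAR
print(f"{years / 1e6:.1f} million years")  # roughly 31.7 million years
```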
The sheer power of Big Red II allowed Crippa's team to complete year-long simulations in just ten days, using a fraction of the supercomputer’s power. “NASA generates over six TB of earth observational data per day, and when you think our simulations are for a whole calendar year, that’s a lot of data to store and analyze,” says Pryor.
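The storage challenge Pryor describes adds up quickly; a back-of-the-envelope estimate (a sketch, assuming the quoted 6 TB/day rate holds across a full calendar year):

```python
# NASA earth-observation data rate cited above (terabytes per day).
TB_PER_DAY = 6
DAYS_PER_YEAR = 365

total_tb = TB_PER_DAY * DAYS_PER_YEAR  # terabytes accumulated in a year
total_pb = total_tb / 1000             # expressed in decimal petabytes

print(f"{total_tb} TB/year, i.e. about {total_pb:.2f} PB")  # 2190 TB, ~2.19 PB
```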
Given the unique computational resources available, and a recently awarded NASA grant, the team will further investigate the sensitivity of model performance to different meteorological, chemical, and aerosol parameters and generate ensemble runs for climate studies. They hope to produce even more accurate predictions of atmospheric aerosol formation and its effects on the atmosphere, and ultimately on human health.