
Record-breaking US drought set to become the norm

Image: Drought-stressed corn. Supercomputers are key to predicting the likelihood of droughts like this. Image courtesy CraneStation, Flickr (CC BY 2.0).

With over half of all US counties designated primary disaster areas by the United States Department of Agriculture, the current US drought now officially constitutes the largest natural disaster area in the nation's history. And, according to climate models run on computing infrastructure provided by the Extreme Science and Engineering Discovery Environment (XSEDE), such levels of drought are likely to become the norm in the affected regions by the end of this century.

James Kinter, director of the Center for Ocean-Land-Atmosphere Studies at the Institute of Global Environment and Society, was granted access to systems provided by XSEDE, and its predecessor TeraGrid, to run climate models at high spatial resolutions, which require large amounts of processing power. Based on these simulations, Kinter and his colleagues calculated that the frequency of extreme drought in the US Great Plains and Florida is set to rise from an average of four years in every 50 today to 20 years in every 50 by the end of the 21st century. The research is published in the current issue of the Journal of Hydrometeorology.
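To put those frequencies in perspective, the short Python sketch below converts the reported four-in-50 and 20-in-50 figures into annual probabilities and into the chance of seeing at least one extreme-drought year in a given decade. It is illustrative arithmetic only, not the authors' published method, and the assumption that drought years occur independently is ours purely for the sake of the example.

```python
# Illustrative arithmetic only -- not the model used in the study.
# Converts the reported frequencies (4 extreme-drought years per 50 now,
# 20 per 50 projected) into annual probabilities and, assuming for
# illustration that years are independent, the chance of at least one
# extreme-drought year per decade.

def annual_probability(drought_years: int, window_years: int = 50) -> float:
    """Average fraction of years classed as extreme drought."""
    return drought_years / window_years

def prob_at_least_one(p_annual: float, horizon_years: int = 10) -> float:
    """P(>=1 extreme-drought year in the horizon), assuming independence."""
    return 1.0 - (1.0 - p_annual) ** horizon_years

for label, years in [("historical", 4), ("projected, late 21st century", 20)]:
    p = annual_probability(years)
    print(f"{label}: {p:.0%} of years in extreme drought; "
          f"chance of at least one such year per decade ~ {prob_at_least_one(p):.0%}")
```

Under these simplifying assumptions, the historical figures give roughly a 57% chance of at least one extreme-drought year per decade, while the projected figures push that above 99%.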

A global problem

The ongoing US drought, coupled with the fact that 2012 has so far been the hottest year on record for the lower 48 states, has left many farmers in the country facing severe economic woes. To help ease the burden, the US government is opening up 3.8 million acres of conservation land for ranchers to use for haying and grazing, and US crop insurers have agreed to give farmers facing cash-flow problems a penalty-free, 30-day grace period on their 2012 premiums.

However, the drought in the US is likely to have stark repercussions across the globe. As the world's largest exporter of corn, and one of the top exporters of both soybeans and wheat, the US supplies markets worldwide, and analysts predict that the extreme dry period currently affecting the country could drive up food prices in some of the world's poorest nations. According to researchers at the New England Complex Systems Institute, rising global food prices are likely to be compounded by commodity speculation, whereby investors bet on food prices and thereby amplify the drought's market signals. World cereal prices rose by 17% in the last month alone.

Consequently, given the scale and severity of the problems the current US drought is likely to cause, and the possibility that extreme weather events like this are related to climate change, it is of paramount importance that the models used to predict how the climate will evolve over the coming decades are as accurate as possible. This is where supercomputers come in.

Resolution revolution

Using the XSEDE infrastructure, Kinter and his colleagues were able to simulate US climate over long periods at the spatial resolution normally used for producing 10-day weather forecasts today. "The pattern of increasing probability of extreme drought in our simulations is quite similar to the summer 2012 drought severity map showing 63% of the lower 48 US states in moderate to severe drought," says Kinter. "Our results suggest that, while the 2012 event itself cannot be ascribed to human-induced climate change per se, the severe situation we are experiencing today is likely to become entirely too commonplace in the future."

However, not all climate researchers are as fortunate as Kinter and his colleagues in being granted dedicated access to some of the US National Science Foundation's finest high-performance computing equipment. With competition for supercomputer time fierce, climate scientists are often forced into pragmatic trade-offs, such as limiting the spatial resolution of their models to save computing time. After all, even at moderate resolutions, a global-scale model can require several months of supercomputing time to produce just a century's worth of climate data.
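The sketch below gives a rough sense of why resolution is so expensive. It rests on a common rule of thumb, not on figures from the article: refining the horizontal grid by a factor f multiplies the number of grid columns by f squared, and numerical stability typically forces the timestep to shrink by about f as well, so cost grows roughly as f cubed. The 125 km baseline and the three machine-months per simulated century are hypothetical numbers chosen only to illustrate the scaling.

```python
# Rough rule-of-thumb scaling (our assumption, not data from the article):
# cost ~ (refinement factor)**3, from f**2 more grid columns plus a
# timestep shortened by ~f for numerical stability (CFL condition).

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Approximate cost multiplier when refining from coarse_km to fine_km spacing."""
    f = coarse_km / fine_km
    return f ** 3

BASELINE_MONTHS = 3.0   # hypothetical: machine-months per simulated century at ~125 km

for target_km in (125, 50, 25, 7):
    cost = BASELINE_MONTHS * relative_cost(125, target_km)
    print(f"{target_km:>4} km grid: ~{cost:,.0f} machine-months per simulated century")
```

Under these assumptions, moving from a 125 km grid to the 7 km grid discussed below multiplies the cost by several thousand, which is why such runs have so far required dedicated access to top-end machines.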

Speaking at the recent XSEDE12 conference in Chicago, Kinter discussed the significant effects that changes in spatial resolution can have on the outcomes of climate models. Working with the Athena supercomputer in 2009-10, he and his team ran simulations of boreal summer climate at 7-kilometer resolution covering eight summers, whereas researchers had previously only been able to simulate a single week or month at that level of detail. In doing so, the team found that even small changes in spatial resolution could have large effects on model outcomes. Kinter cites research by collaborators showing that low-resolution models misplace the rainfall associated with the Gulf Stream off the east coast of the US, whereas high-resolution models delineate those rain bands accurately.

Of course, while ever-faster supercomputers mean that climate models can feasibly be run at increasingly high spatial resolutions, this doesn't mean that a solution to the difficulties of climate modeling is in sight. The research conducted by Kinter and his team was pioneering work and required many ad hoc fixes and workarounds to complete the simulations. For example, the team ran into problems with the memory available per node on Athena, and the output from their simulations was capped by bandwidth limits both between processors and disk and between disk and HPSS tape.

According to Kinter, the next step is to turn the ad hoc solutions he and his team devised for these issues into systematic, repeatable ones. However, some of the challenges his team faced were even harder to overcome: "We were in a unique situation, because we had dedicated access to Athena 24/7 for six months. That introduced unique challenges like how to keep the queue loaded with jobs so that the machine never went idle and how to manage the output data. We generated output at a rate that would fill Athena's disks every 6 days, so we had a fairly serious data management challenge." The project generated nearly 900 TB of data in total, which is now available to researchers around the world working to improve the next generation of climate models.
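For a rough feel of the data-management challenge Kinter describes, the sketch below turns the stated figures (roughly 900 TB of output over about six months of dedicated access) into an average output rate. The scratch-disk capacity is a hypothetical placeholder, since the article does not give Athena's actual disk size, and the average rate understates any bursts during the run.

```python
# Back-of-the-envelope data-management sketch. The 900 TB total and the
# ~six-month dedicated run come from the article; SCRATCH_TB is a
# hypothetical placeholder, not a documented spec of Athena.

TOTAL_OUTPUT_TB = 900
RUN_DAYS = 6 * 30            # ~six months of dedicated access
SCRATCH_TB = 30              # hypothetical scratch capacity, for illustration only

rate_tb_per_day = TOTAL_OUTPUT_TB / RUN_DAYS
days_to_fill = SCRATCH_TB / rate_tb_per_day
offload_mb_per_s = rate_tb_per_day * 1e6 / 86_400   # sustained offload needed to keep pace

print(f"Average output rate: {rate_tb_per_day:.1f} TB/day")
print(f"A {SCRATCH_TB} TB scratch file system fills in ~{days_to_fill:.0f} days")
print(f"Offload to tape must sustain ~{offload_mb_per_s:.0f} MB/s just to keep up")
```

Even at the averaged rate of about 5 TB per day, the output has to be moved off scratch continuously, which is why keeping the queue full and draining data to tape had to be managed as a single, ongoing workflow.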

The following people collaborated with Kinter on this research:

Benjamin Cash; Deepthi Achuthavarier; Jennifer Adams; Eric Altshuler; Paul Dirmeyer; Bohua Huang; Emilia Kyung Jin; Lawrence Marx; Julia Manganello; Cristiana Stan; Thomas Wakefield; Tim Palmer; Mats Hamrud; Thomas Jung; Martin Miller; Peter Towers; Nils Wedi; Masaki Satoh; Hirofumi Tomita; Chihiro Kodama; Tomoe Nasuno; Kazuyoshi Oouchi; Yohei Yamada; Hiroshi Taniguchi; Troy Baer; Christian Halloy; Dwayne John; Bruce Loftis; Rick Mohr; Kwai Wong.
