"It might sound like a cliché, but in many ways, we know less about the shape of the ocean floor than we know about the craters and rock formations on Mars," says Dimitris Menemenlis, lead researcher on the ECCO-IcES (Ocean-Ice Interactions in Earth System Models) project. For vast expanses of the ocean floor, we have little knowledge of the underwater topography (bathymetry). This is especially true of the ocean cavities beneath floating ice shelves, and Menemenlis has shown direct evidence for the importance of the underlying shape of these cavities. The dynamic heat exchanges and processes between ice sheets, sea ice and the ocean have also not yet been fully represented in climate models. A new project, ECCO-IcES, aims to address some of these limitations by integrating an innovative Ice Sheet System Model (ISSM) into an existing computer model, taking advantage of the microprocessor muscle of NASA's Pleiades supercomputer (number 14 in the Top 500 list).
In the late 1990s, the ECCO (Estimating the Circulation and Climate of the Ocean) project was started by researchers at the Massachusetts Institute of Technology (MIT) and NASA's Jet Propulsion Laboratory. The project's objective was to develop advanced computational tools to fully understand the dynamic forces that play out in the world's oceans by combining observational data from satellites and ocean monitoring probes with numerical simulations. This work resulted in the MIT general circulation model (MITgcm).
MITgcm is still used today for ocean circulation, but ice sheets are more complicated to model, and their change is mainly measured via mass balance (the net effect of growth and melting). Sea ice in the Arctic has become younger and thinner, and is more likely to drift, converge and form ridges. "There are many large-scale processes that govern ice mass loss such as atmospheric precipitation and seasonal melt, but there are also more subtle processes that are difficult to observe or model like grounding line dynamics [the junction between the grounded ice sheet, the floating ice shelf, and the ocean]," explains Menemenlis. The ECCO-IcES project aims to couple the MITgcm with the Ice Sheet System Model (ISSM), developed by JPL and the University of California, Irvine, in order to model some of these more subtle effects.
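Mass balance itself is simple bookkeeping: gains from accumulation minus losses from melt and discharge. A minimal sketch of that accounting (all figures here are hypothetical illustrations, not measurements):

```python
# Toy mass-balance bookkeeping for an ice sheet, in gigatonnes per year.
# All numbers are hypothetical illustrations, not observational values.

def net_mass_balance(accumulation_gt, surface_melt_gt, discharge_gt):
    """Net change in ice mass: gains minus losses (Gt/yr)."""
    return accumulation_gt - surface_melt_gt - discharge_gt

# A hypothetical year: 2000 Gt gained from snowfall, 700 Gt lost to
# surface melt, 1500 Gt lost to iceberg discharge at the margins.
balance = net_mass_balance(2000.0, 700.0, 1500.0)
print(balance)  # -200.0, i.e. a net mass loss of 200 Gt for the year
```

The difficulty Menemenlis points to is that the subtle terms feeding this balance, such as melt at the grounding line, are exactly the ones that are hard to observe or model.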
Menemenlis's goal is to make these two models talk to each other. MITgcm needs boundary conditions that ISSM can provide: in particular, the changing shape of the ocean cavity under the ice, which is important but cannot be modeled by MITgcm on its own. Large-scale global climate models also fail to resolve small regional processes. "Models often don't include features like katabatic winds which are driven by orography, that is, the shape of the ice-mountains in Antarctica, and are important for moving sea ice and water. Most atmospheric analyses are too coarse to represent these winds," according to Menemenlis.
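The coupling can be pictured as an exchange of fields at each step: the ice component hands the ocean component an updated cavity geometry, and the ocean component hands back a basal melt rate that reshapes the ice. A highly simplified, hypothetical sketch of that loop (the classes, geometry, and melt law here are invented for illustration and bear no relation to the real MITgcm/ISSM interfaces):

```python
# Hypothetical toy coupler: an "ice" component and an "ocean" component
# exchange boundary conditions once per coupling step.

class ToyIceSheet:
    def __init__(self, shelf_thickness_m):
        self.thickness = shelf_thickness_m  # floating ice shelf thickness (m)

    def cavity_height(self, water_column_m=800.0):
        # Toy geometry: the cavity under the shelf grows as the shelf thins.
        return water_column_m - self.thickness

    def apply_basal_melt(self, melt_m_per_step):
        self.thickness = max(0.0, self.thickness - melt_m_per_step)

class ToyOcean:
    def basal_melt(self, cavity_height_m):
        # Invented melt law: a taller cavity admits more warm water.
        return 0.001 * cavity_height_m  # metres of ice melted per step

ice, ocean = ToyIceSheet(shelf_thickness_m=400.0), ToyOcean()
for step in range(10):
    # Ice -> ocean: updated cavity shape; ocean -> ice: basal melt.
    melt = ocean.basal_melt(ice.cavity_height())
    ice.apply_basal_melt(melt)

print(round(ice.thickness, 2))
```

Even in this toy, the feedback is visible: melt thins the shelf, the thinner shelf enlarges the cavity, and the larger cavity raises the melt rate on the next step, which is why neither component can be run realistically in isolation.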
"Another major challenge is the disparity of temporal and spatial scales of climate processes. The time scale for deep ocean ventilation is thousands of years; ice sheet equilibration time scales are between 10,000 and 100,000 years. But ice sheet lubrication events can happen very quickly, in days or weeks," says Menemenlis. There are also spatial disparities: the sub-kilometre scales of oceanic vertical mixing or of ice sheet grounding line dynamics. To understand the intricate ocean-ice sheet interactions, the researchers fit their observational data to their numerical models and carry out an "estimation problem". "It's like solving a least squares line fitting problem, except that the model is non-linear and instead of solving for two parameters, we solve for billions of parameters using billions of observational constraints", says Menemenlis.
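The estimation problem Menemenlis compares to least-squares line fitting can be illustrated at toy scale. The real system adjusts billions of parameters against billions of constraints; the sketch below fits just two parameters of a made-up nonlinear model to a handful of synthetic "observations" using Gauss-Newton iteration (the model, data, and solver here are illustrative assumptions, not ECCO's actual machinery):

```python
import math

# Toy nonlinear least squares: fit y = a * exp(b * t) to observations by
# Gauss-Newton iteration. ECCO's real estimation problem has the same
# structure, but with billions of parameters and constraints.

def gauss_newton(ts, ys, a, b, iterations=30):
    for _ in range(iterations):
        # Residuals and Jacobian of the model at the current (a, b).
        r = [a * math.exp(b * t) - y for t, y in zip(ts, ys)]
        j1 = [math.exp(b * t) for t in ts]          # d(residual)/da
        j2 = [a * t * math.exp(b * t) for t in ts]  # d(residual)/db
        # Solve the 2x2 normal equations (J^T J) delta = -J^T r.
        s11 = sum(x * x for x in j1)
        s12 = sum(x * y for x, y in zip(j1, j2))
        s22 = sum(x * x for x in j2)
        g1 = sum(x * y for x, y in zip(j1, r))
        g2 = sum(x * y for x, y in zip(j2, r))
        det = s11 * s22 - s12 * s12
        a -= (s22 * g1 - s12 * g2) / det
        b -= (s11 * g2 - s12 * g1) / det
    return a, b

# Synthetic "observations" generated from a = 2.0, b = -0.5.
ts = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * math.exp(-0.5 * t) for t in ts]
a, b = gauss_newton(ts, ys, a=1.0, b=0.0)
print(round(a, 3), round(b, 3))  # recovers approximately 2.0 and -0.5
```

Here the 2x2 normal equations can be solved by hand; with billions of parameters, ECCO instead relies on adjoint techniques to obtain the gradients efficiently.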
ECCO-IcES will need more than 50 million processor hours annually on Pleiades, doubling to 100 million hours per year in 2015. Menemenlis also suggests that many of the machines that are faster than Pleiades would not be as well suited to the kinds of applications that the ECCO-IcES team plan to run because "it is not only Pleiades's speed [peak performance of 1.75 petaflops] that makes it special, but the overall balanced computing environment for science applications, the connectivity of the nodes, the balance between disk and memory and CPU cycle, the I/O throughput, and the real time scientific visualization capability [Hyperwall-2]".
The long-term data storage requirements are estimated to reach about 800 terabytes. In order to understand the dynamic processes of the ocean, myriad measurements have to be recorded using distributed instruments - sea surface height (via satellite observations), sea-ice concentration (via passive microwave radiometers), sea surface temperature (via infrared) and surface wind stress (via radar), as well as salinity, temperature and motion (via moorings). "With all these data sets, the data management and quality management control are huge", says Menemenlis.