
Earthquake risks

This visualization depicts a building during an earthquake; the data on which it is based was generated during one of Barbosa's simulations. Image courtesy of Andre Barbosa.

When engineers design buildings, they don't know what sorts of natural disasters the building will have to withstand. But using cutting-edge computer models, they can minimize their uncertainty.

"In general, the world is uncertain. There's really uncertainty in everything," said André Barbosa, a structural engineering researcher based at Oregon State University.

Barbosa is using the Open Science Grid to run simulations that test how a building design will stand up to an earthquake. In the process, he works with two different kinds of uncertainty: the uncertainty in nature, and the uncertainty of computational models.

Inconstant nature

Nature introduces a great deal of uncertainty into a structural engineer's calculations, even before a building design is selected. Not all sites are at the same risk of having big earthquakes; this risk is called 'seismic hazard.'

Assessing seismic hazard means taking many different factors into account, such as the distance from the building site to all nearby faults, the frequency and intensity of earthquakes at those faults, and the type of soil at the building site.

Even when you only look at earthquakes of equal intensity, each quake will shake to the irregular beat of a different drummer. To cover nature's many variations, Barbosa turned to the Pacific Earthquake Engineering Research Center's Ground Motion database. The database contains over 3,000 three-dimensional sensor recordings of 104 earthquakes, compiled from over 1,000 sensor stations.

"I want to get all the possible earthquake records that fall into the same category," Barbosa said. In the case of Barbosa's pilot experiment, that meant 180 records.

Building uncertainty

Buildings may seem like solid, concrete things. But to a structural engineer, nothing is set in stone. Completed buildings are highly unlikely to match their blueprints perfectly, thanks to factors ranging from human error to the limitations of measuring implements. And although characteristics of common building materials such as strength and stiffness are well-documented, they do vary from their expected values.

Yet blueprints and the documented characteristics of building materials are the foundation on which computational building models are built. To account for that uncertainty, Barbosa calculates the expected value (a sort of best guess) and the standard deviation (a measure of how much a quantity might vary from that expected value) for parameters such as the strength and stiffness of concrete and steel, as well as ground movement. He then uses that information to 'perturb' each parameter.
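To make the idea concrete, here is a minimal sketch of one-at-a-time perturbation: every case keeps all parameters at their expected values except one, which is shifted by a chosen number of standard deviations. The parameter names and values are assumptions for illustration, not Barbosa's model inputs.

```python
# Illustrative one-at-a-time perturbation of model parameters.
# name: (expected value, standard deviation) -- numbers are assumptions.
parameters = {
    "concrete_strength_MPa": (30.0, 4.5),
    "steel_yield_MPa": (420.0, 25.0),
    "steel_stiffness_GPa": (200.0, 10.0),
}

def perturbed_cases(params, offsets=(-2.0, -1.0, 1.0, 2.0)):
    """Yield (label, parameter set) pairs: a baseline case plus one case per
    parameter per offset, where the offset is measured in standard deviations."""
    baseline = {name: mean for name, (mean, _std) in params.items()}
    yield "baseline", baseline
    for name, (mean, std) in params.items():
        for k in offsets:
            case = dict(baseline)
            case[name] = mean + k * std
            yield f"{name}{k:+g}sd", case

for label, case in perturbed_cases(parameters):
    print(label, case)
```

Each of these parameter sets then becomes one full structural analysis, which is where the computational cost described below comes from.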

"Too many people use a deterministic kind of [approach]," Barbosa said. "Here, I'm looking at the uncertainty in the input model parameters… we are really trying to capture the model uncertainty."

Perturbing uncertainties

Although Barbosa's perturbations give a much more realistic picture of the best- and worst-case scenarios, each perturbed parameter set requires another full analysis, multiplying the model's computational cost many times over.

"On average, it takes about twelve hours for one analysis for one earthquake with a given set of parameters," Barbosa said. "If you perturb one parameter, it's another twelve hours… and so on…"

Without the perturbations, running the simulation for all 180 earthquakes would take 90 days. For his pilot, Barbosa planned to take into account six parameters with four perturbations each. That adds up to 49 simulations per earthquake, or a total of 8,820 simulations. That would take approximately 12 years, which meant it was time to turn to the OSG, where the jobs would take about a month to complete.
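The arithmetic behind those figures is straightforward to check, taking the quoted 12 hours per analysis at face value:

```python
# Back-of-the-envelope check of the run times quoted above.
HOURS_PER_RUN = 12           # quoted average time for one analysis
RECORDS = 180                # earthquake records in the pilot

# One unperturbed analysis per record, run serially:
print(RECORDS * HOURS_PER_RUN / 24)                  # 90.0 days

# The perturbed pilot study, at the quoted 49 simulations per earthquake:
RUNS_PER_RECORD = 49
total_runs = RUNS_PER_RECORD * RECORDS
print(total_runs)                                    # 8820 simulations
print(total_runs * HOURS_PER_RUN / 24 / 365)         # ~12.1 years serially
```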

To get his simulation running on the OSG, Barbosa partnered with OSG user support team members Gabriele Garzoglio and Marko Slyz of Fermi National Accelerator Laboratory, as well as RENCI system administrators. Slyz and Garzoglio assisted Barbosa with managing, transferring, and analyzing the large research data sets involved in his simulation. The OSG and RENCI teams also helped Barbosa select data compression algorithms, tune the Globus Online service for data transfers between the submit node and the user's local machine, and resolve other software and hardware issues related to processing large datasets.

To run his simulations, Barbosa used the Open System for Earthquake Engineering Simulation (OpenSees), a software framework developed at the Pacific Earthquake Engineering Research Center for simulating the seismic response of structural and geotechnical systems. A little over a year ago, the OSG user support team collaborated with NEES (the George E. Brown, Jr. Network for Earthquake Engineering Simulation) to integrate OpenSees into the OSG infrastructure. Thanks to that earlier work, Barbosa was able to run his simulation successfully on the OSG.
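To give a flavor of what a single OpenSees analysis involves, here is a minimal sketch written with OpenSeesPy, the Python port of OpenSees. The single-degree-of-freedom model, parameter values, and ground-motion history are illustrative assumptions; Barbosa's actual building models are far more detailed.

```python
# Minimal OpenSeesPy sketch: an elastic single-degree-of-freedom system under
# a hypothetical ground acceleration history. Not Barbosa's building model.
import openseespy.opensees as ops

ops.wipe()
ops.model('basic', '-ndm', 1, '-ndf', 1)

mass = 1.0e3        # kg, assumed
stiffness = 4.0e6   # N/m, assumed

ops.node(1, 0.0)
ops.node(2, 0.0)
ops.fix(1, 1)
ops.mass(2, mass)
ops.uniaxialMaterial('Elastic', 1, stiffness)
ops.element('zeroLength', 1, 1, 2, '-mat', 1, '-dir', 1)

# Ground acceleration history in units of g (made-up values).
dt = 0.01
accel = [0.0, 0.05, 0.12, 0.08, -0.10, -0.04, 0.0]
ops.timeSeries('Path', 1, '-dt', dt, '-values', *accel, '-factor', 9.81)
ops.pattern('UniformExcitation', 1, 1, '-accel', 1)

ops.constraints('Plain')
ops.numberer('Plain')
ops.system('BandGeneral')
ops.algorithm('Linear')
ops.integrator('Newmark', 0.5, 0.25)
ops.analysis('Transient')

peak = 0.0
for _ in range(len(accel)):
    ops.analyze(1, dt)
    peak = max(peak, abs(ops.nodeDisp(2, 1)))
print(f"peak displacement: {peak:.4e} m")
```

In the pilot, an analysis like this is repeated for every earthquake record and every perturbed parameter set, which is what makes the workload so well suited to the OSG's high-throughput model.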

Future steps

Although Barbosa's simulation executed successfully, it was only a pilot project designed to explore the concept of using probabilistic models for earthquake engineering. In order to be useful for practicing engineers, the model still needs some work.

"It's not fine-grained enough," Barbosa explained.

Barbosa is referring to the fact that his model used only four perturbations of six parameters. There are more parameters he could include, and four perturbations are not enough to achieve the accuracy he aspires to.

"I would say the minimum is around eight perturbations," Barbosa said.

While Barbosa makes plans to explore simulations with more parameters and perturbations, the OSG user support team members and the NEES collaboration are working together and individually towards a number of goals.

NEES leverages a well-known open source platform called HUBzero to create NEEShub, the collaboration's website and science gateway. Via NEEShub, users can collaborate, sharing educational materials, data sets, applications, and more. NEEShub users can also use the portal to analyze data or run simulations using OSG resources, via a glide-in workload management system interface.

"We worked last year on the integration between NEEShub and OSG," Garzoglio said. "The next step is to make sure that it can sustain the production levels of NEES."
