Los Angeles basin jiggles like a big bowl of jelly in cutting-edge simulations

These two Los Angeles area maps compare ground motions calculated by SCEC earthquake simulations to ground motions observed during the 2008 Chino Hills earthquake. On each map, the star marks the quake’s epicenter, and the dots show locations where ground motions were recorded for that earthquake. The color variations show how well the simulated ground motions match the observed ground motions, with darker colors (red to black) indicating areas with poorer matches and lighter colors (yellow to white) indicating areas with better matches. The two maps show results from simulations using different 3-D Earth structure models. The Earth model that produced the best overall match to observations (shown here on the left) was the model used as input for the team’s recent 1 hertz CyberShake seismic hazard simulation on the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA). Credit: Ricardo Taborda, University of Memphis, and Jacobo Bielak, Carnegie Mellon University

Earthquakes occur on a massive scale and often originate deep below the surface of the Earth, making them notoriously difficult to predict.

The Southern California Earthquake Center (SCEC) and its lead scientist, Thomas Jordan, use massive computing power made possible by the National Science Foundation (NSF) to improve our understanding of earthquakes. In doing so, SCEC is helping to provide long-term earthquake forecasts and more accurate hazard assessments.

One SCEC effort in particular, the PressOn project, aims to create more physically realistic, wave-based earthquake simulations using a simulation platform the team developed, called CyberShake, which calculates how earthquake waves ripple through a 3-D model of the ground.

The latest NSF-funded supercomputers, capable of performing quadrillions of calculations every second, make this more accurate approach to studying earthquakes possible.

The Earth’s outer shell is broken into plates that move slowly over the hotter, more pliable mantle beneath. Most earthquakes result from these plates moving relative to one another, a process called plate tectonics.

The edges of plates are rough. Those edges get stuck on one another while the rest of the plates keep moving, storing up energy—a process like stretching a rubber band. When the plate edges finally come unstuck (like letting go of one end of the rubber band), all the pent-up energy is released and the plate jerks into place. Aftershocks happen when the plate overshoots its equilibrium point and continues to readjust over the coming days to years.
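
That stick-slip cycle can be captured in a few lines of code. The toy "spring-slider" model below is a minimal sketch of the rubber-band analogy, with made-up parameter values chosen purely for illustration, not calibrated to any real fault:

```python
# Toy spring-slider ("stick-slip") model of the rubber-band analogy above.
# All parameter values are illustrative, not calibrated to any real fault.

k = 2.0          # spring stiffness (force per unit stretch)
v_plate = 1.0    # steady plate loading rate (stretch added per step)
f_static = 50.0  # force needed to unstick the fault patch
f_dynamic = 10.0 # residual force left after a slip event

stretch = 0.0
for step in range(200):
    stretch += v_plate                # plates keep moving; stress accumulates
    force = k * stretch
    if force >= f_static:             # edges come unstuck: an earthquake
        drop = (force - f_dynamic) / k
        stretch -= drop               # pent-up energy released as slip
        print(f"step {step:3d}: slip of {drop:.1f} units")
```

Run it and the slips recur in a regular cycle: stress builds slowly, then releases suddenly, just as the rubber-band picture suggests.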

An earthquake generates three distinct types of waves: primary (P) waves, secondary (S) waves and surface waves. Each has a unique behavior and a distinct signature. The characteristics, timing and damage pattern of these waves differ by distance from the origin of the earthquake and the type of rock or soil they move through.
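
That timing difference is also what lets seismologists estimate how far away a quake occurred: P waves outrun S waves, so the lag between their arrivals grows with distance. A minimal sketch, assuming typical crustal speeds of about 6 km/s for P waves and 3.5 km/s for S waves (real speeds vary with rock type):

```python
# Estimate distance to an earthquake from the lag between P- and S-wave
# arrivals. Speeds are typical crustal values; real velocities vary with rock.

V_P = 6.0   # P-wave speed, km/s
V_S = 3.5   # S-wave speed, km/s

def distance_km(sp_lag_seconds: float) -> float:
    """Distance at which the S wave falls sp_lag_seconds behind the P wave."""
    return sp_lag_seconds / (1.0 / V_S - 1.0 / V_P)

print(distance_km(10.0))  # a 10 s S-P lag -> roughly 84 km
```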

Given detailed information about the geological material in specific areas, physics-based 3-D wave propagation simulations are able to calculate how earthquake waves will move through the Earth and how strong the ground motions will be when the waves reach the surface.
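
At heart, such simulations divide the ground into a grid of cells, each with its own material properties, and step the wavefield forward in time. The 1-D finite-difference sketch below is a drastically simplified analogue of that idea, with illustrative values; production codes like those behind CyberShake solve the full 3-D elastic equations on far more detailed Earth models:

```python
import numpy as np

# Minimal 1-D analogue of physics-based wave propagation: a finite-difference
# solver for u_tt = c(x)^2 u_xx. Values are illustrative only.

nx, nt = 400, 800
dx, dt = 50.0, 0.004           # grid spacing (m) and time step (s)
c = np.full(nx, 3000.0)        # wave speed (m/s) in hard rock
c[150:250] = 1200.0            # a soft "basin" that slows and traps waves

u_prev = np.zeros(nx)
u = np.zeros(nx)
u[50] = 1.0                    # impulsive "source"

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]   # discrete second derivative
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next

print(f"peak motion inside the soft basin: {abs(u[150:250]).max():.3f}")
```

Even in this toy version, the slow "basin" cells hold onto wave energy longer than the surrounding hard rock, a 1-D caricature of the bowl-of-jelly effect described below.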

In 2014, the SCEC team investigated the earthquake potential of the Los Angeles Basin, where the Pacific and North American plates grind past each other along the San Andreas Fault. In this study, the simulation showed earthquake waves trapped, and reverberating, within the Los Angeles Basin. The resulting ground motions were much stronger than Jordan and his team expected.

“These basins act as essentially big bowls of jelly that shake during earthquakes and therefore very much affect the motion,” Jordan said.

SCEC’s simulations vary in terms of seismic wave cycles per second, or hertz. As that measurement increases, so does the potential for damage—and the complexity of the simulation. Structures such as buildings and bridges are most vulnerable to damage by seismic waves between 1 and 10 hertz.
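
One way to see why that band matters: a widely quoted engineering rule of thumb puts a building’s fundamental period near 0.1 seconds per story, so its natural frequency is roughly 10/N hertz for an N-story structure. A quick sketch of that heuristic (a rough approximation, not an engineering calculation):

```python
# Why the 1-10 hertz band matters for buildings: a common rule of thumb puts
# a building's fundamental period near 0.1 s per story, so its natural
# frequency is roughly 10/N hertz for an N-story structure.

for stories in (1, 2, 5, 10, 20):
    freq_hz = 10.0 / stories
    in_band = 1.0 <= freq_hz <= 10.0
    print(f"{stories:2d} stories -> ~{freq_hz:4.1f} Hz "
          f"({'inside' if in_band else 'outside'} the 1-10 Hz band)")
```

By this estimate, low- and mid-rise buildings land squarely in the band where seismic waves can drive them near resonance.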

The team first simulated individual earthquakes at 4 hertz. They then performed a simulation involving a large ensemble of earthquakes at 1 hertz (simulating many more quakes required lowering the maximum simulated frequency) to calculate a probabilistic seismic hazard model for the Los Angeles area. A seismic hazard model describes the probability that an earthquake will occur in a given geographic area, within a given window of time, and with ground motion intensity exceeding a given threshold.
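
Such probabilities are commonly computed under a Poisson (memoryless) assumption: if the simulations yield an annual rate of shaking above some threshold, the chance of at least one exceedance in t years is 1 minus e to the power of (rate times t). A short sketch of that standard conversion (the rate below is illustrative, not a CyberShake output):

```python
import math

# Turning an annual exceedance rate into the kind of probability a hazard
# model reports, under the common Poisson (memoryless) assumption.

def prob_exceedance(annual_rate: float, years: float) -> float:
    """P(at least one exceedance in `years`) = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

# The conventional "2% in 50 years" hazard level corresponds to an annual
# exceedance rate of about 1/2475 (a roughly 2,475-year return period):
print(prob_exceedance(1.0 / 2475.0, 50.0))  # ~0.02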

Starting in April 2015 and continuing over seven weeks, SCEC used the NSF-funded Blue Waters supercomputer at NCSA and the Titan supercomputer at the Oak Ridge Leadership Computing Facility to calculate the first 1 hertz CyberShake hazard model specific to the Los Angeles Basin. This simulation doubled the maximum simulated frequency of the previous year’s 0.5 hertz CyberShake hazard model, doubling the level of detail resolved in the ground motions.

Even though the number of calculations required climbs steeply as the maximum simulated frequency goes up, the tremendous computing power of Blue Waters and Titan reduced the time needed for these calculations from months to weeks.
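
A common back-of-envelope estimate shows just how steep that climb is: resolving twice the frequency means halving the grid spacing in all three dimensions and, for numerical stability, the time step as well, so the work grows roughly as the fourth power of frequency. A sketch of that scaling (the 0.5 hertz reference matches the previous year’s model; the fourth-power exponent is the usual rough estimate, not an exact figure):

```python
# Back-of-envelope cost scaling for 3-D wave simulations: doubling the
# maximum frequency halves the grid spacing in x, y and z and (for numerical
# stability) the time step, so work grows roughly as frequency**4.

def relative_cost(f_max_hz: float, f_ref_hz: float = 0.5) -> float:
    """Cost relative to a reference simulation at f_ref_hz."""
    return (f_max_hz / f_ref_hz) ** 4

for f in (0.5, 1.0, 4.0, 10.0):
    print(f"{f:4.1f} Hz -> ~{relative_cost(f):7.0f}x the 0.5 Hz cost")
```

By this estimate, a 10 hertz simulation costs on the order of 160,000 times the 0.5 hertz run, which is why the goal described below demands ever more powerful machines.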

Scientists believe seismic hazard analyses need to simulate earthquake frequencies above 10 hertz to realistically capture the full dynamics of a potential event. SCEC’s work is paving the way for those simulations. Physics-based 3-D earthquake simulations at 10 hertz, once a distant dream, are now on the horizon.

Note: The above post is reprinted from materials provided by the National Science Foundation.