The December 2004 Sumatra-Andaman earthquake is the third-largest earthquake ever recorded. Triggering a series of enormous tsunamis, the deadly quake ultimately caused the deaths of more than 200,000 people in 15 countries — making it one of the most devastating natural disasters of all time. To learn lessons from it, German researchers from the Technical University of Munich and Ludwig Maximilian University (LMU) of Munich recently used the SuperMUC supercomputer to reproduce the event, in what they claim is the biggest-ever multiphysics simulation of an earthquake and tsunami.
This immensely challenging task was the result of five years of preparation spent optimizing the earthquake simulation software.
“Reality is complex — and earthquakes are a multi-scale and multi-physics problem,” Dr. Alice-Agnes Gabriel, the lead researcher from the LMU side of the team, told Digital Trends. “To gain insight into the geophysical processes of the earthquake, we [needed] to simultaneously calculate the complicated fracture of several fault segments and the subsurface propagation of seismic waves, and we [needed] to consider modeling domains, spanning hundreds of kilometers, as well as the tip of earthquake fronts which is releasing tectonic stresses on, at most, meter scale.”
The simulation took around 14 hours to complete, using all 86,016 cores of the SuperMUC, which performed nearly 50 trillion operations in the process. The seismic wave propagation calculations alone required computing more than 3 million time steps. According to the researchers, the work is so cutting edge that, just two years earlier, the same simulation would have taken 15 times longer to run.
As desirable as it would be to do so, Gabriel said that there is “no realistic hope” of researchers being able to predict earthquakes like this anytime soon. Instead, simulations such as this one are crucial because they can hopefully be used to help mitigate earthquake-related damage to infrastructure, societies, and economies.
“Our analysis will help with the development of more reliable early warning systems,” she said. The work may also help researchers to understand why some earthquakes and the resulting tsunamis are so much bigger than others.
In recognition of the researchers’ achievement, a paper describing this work was nominated for the “best paper” award at SC17, one of the world’s premier supercomputing conferences, currently taking place in Denver.