Supercomputing’s Evolving Role in Natural Disaster Modeling

 


Two groups of scientists on opposite sides of the globe are conducting breakthrough research to help predict the occurrence and impact of natural disasters – and they are using supercomputing to mitigate many of the risks associated with them.

In California, the focus is on one of that state's most serious and common natural disasters – wildfires. Leading those efforts is Ilkay Altintas, Chief Data Science Officer at the San Diego Supercomputer Center (SDSC), University of California San Diego, where she is also the Founder and Director of the Workflows for Data Science Center of Excellence.

In Japan, Tsuyoshi Ichimura, Kohei Fujita, Takuma Yamaguchi and other researchers with the University of Tokyo's Computational Science and High Performance Computing Laboratory at the Earthquake Research Institute are conducting research that simulates entire phases of earthquake disasters. In 2014, 2015, and 2018, they were finalists for the prestigious Gordon Bell Prize, which recognizes outstanding achievement in high-performance computing.

Ilkay and the University of Tokyo researchers provided me with an update on their groundbreaking work and offered insights on the impact of supercomputing on natural disaster modeling.

 

Michela: Please describe the current state of research in the prediction and mitigation of natural disasters.

 

Tsuyoshi/Kohei/Takuma: Natural disaster phenomena such as earthquakes are very complex processes, so in the past it was impossible to use detailed computation for practical prediction or mitigation. Instead, it has been common to use empirical or statistical methods based on data from past earthquakes.

However, these methods are not very reliable when applied to places that lack past earthquake data or where the environment has changed after an earthquake. The research community is therefore expanding the use of physics-based computational approaches, combined with the large amounts of data acquired by sensors such as GPS and remote sensing, for earthquake prediction and mitigation.

Ilkay: In California, we're having bigger fires than ever before. And a good number of them are uncontrollable or exhibit fire behavior unlike anything seen in the past. That means we can't just look at what has occurred in the past in order to predict the future. We need to understand the event that's happening right now — as fast as possible. Integrating big data about the fire that is happening with various existing fire physics models gives us the opportunity to make adjustments in real time so we can predict what the fire will do next.

By constantly adjusting the model, we can improve our understanding of the fire as it is happening. In this scenario, the model moves as the fire does.
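
The kind of real-time adjustment Ilkay describes can be pictured as a simple loop: forecast, compare with the newest observations, correct, and forecast again. The sketch below is purely illustrative (a toy circular fire front, invented numbers and a made-up gain factor), not SDSC's actual modeling code:

```python
# Purely illustrative sketch of real-time model adjustment for fire spread.
# The circular-fire geometry, the gain value and all numbers are invented;
# this is not SDSC's actual fire-modeling code.

def predict_radius(radius_km: float, rate_kmh: float, hours: float) -> float:
    """Advance a simplified circular fire front by the current spread rate."""
    return radius_km + rate_kmh * hours

def correct_rate(rate_kmh: float, predicted_km: float, observed_km: float,
                 hours: float, gain: float = 0.5) -> float:
    """Nudge the spread-rate estimate toward the rate implied by the newest observation."""
    implied_rate = rate_kmh + (observed_km - predicted_km) / hours
    return (1.0 - gain) * rate_kmh + gain * implied_rate

radius_km, rate_kmh = 1.0, 0.8         # toy initial fire radius (km) and spread rate (km/h)
hourly_observations = [1.9, 3.1, 4.6]  # toy observed radii at hours 1, 2 and 3

for observed in hourly_observations:
    predicted = predict_radius(radius_km, rate_kmh, hours=1.0)
    rate_kmh = correct_rate(rate_kmh, predicted, observed, hours=1.0)
    radius_km = observed               # restart the next forecast from the observation
    print(f"updated spread-rate estimate: {rate_kmh:.2f} km/h")
```

Real systems assimilate far richer physics and data, but that basic loop (predict, compare with the latest perimeter, adjust) is what lets the model move with the fire.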

 

Michela: Can you provide a brief overview of the importance of HPC in the use of AI, modeling, simulation and/or data analytics that support research in this field?

 

Ilkay: Data-driven fire modeling is a relatively new field. Fifteen years ago, we didn’t have access to high-speed networks and advanced computing. But then the data revolution happened, and we have a lot more information and a lot more means to actually collect it, even during fire events.

Also, we now have more access to on-demand HPC and supercomputing capabilities. The ability to get real intelligence at the edge of these networks helps us inform our models in real time. Edge and real-time curation of data, combined with machine learning and computational physics, are enabling us to build sustainable models. Obviously, we need a lot of computing power to solve these problems.

Tsuyoshi/Kohei/Takuma: Physics-based simulation of earthquakes is very computationally expensive due to the large domain size and the required resolution, so HPC is essential for conducting these simulations. Furthermore, combining the physics-based simulation approach with AI and data analytics is expected to boost the capability of these simulations.

HPC helps us compute earthquake damage in urban settings at very high resolution. For example, a seismic simulation of a large city that we demonstrated has a 1 km x 1 km domain and 25 cm resolution. When the complex geometry and the buildings in the target domain are taken into account, the displacement response distribution becomes significantly different from the results of previous conventional simulations.
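
To get a feel for why that resolution demands HPC, here is a rough, back-of-the-envelope count of unknowns. The domain size and resolution come from the interview; the depth of the modeled volume and the degrees of freedom per node are my own assumptions for illustration:

```python
# Rough estimate of problem size for a 1 km x 1 km urban domain at 25 cm resolution.
# Domain and resolution are from the interview; depth and DOF per node are assumed.
domain_m = 1000.0      # 1 km per side
resolution_m = 0.25    # 25 cm mesh spacing
depth_m = 100.0        # assumed depth of the modeled ground volume
dof_per_node = 3       # assumed displacement components per mesh node

nodes_per_side = domain_m / resolution_m          # 4,000
nodes_per_layer = nodes_per_side ** 2             # 16 million
layers = depth_m / resolution_m                   # 400
unknowns = nodes_per_layer * layers * dof_per_node

print(f"approximate unknowns: {unknowns:.1e}")    # roughly 2e10
```

Tens of billions of unknowns, solved over many time steps, is what puts this class of simulation firmly in supercomputer territory.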

 

Michela: Can you provide an overview of a current project you’re working on?

 

Tsuyoshi/Kohei/Takuma: One of our projects is city response simulation, which we demonstrated at SC18. Urban density has been increasing in recent years, which leads to greater vulnerability to natural disasters such as earthquakes. So it's important to conduct seismic simulations of a large, densely constructed city, including the complicated geometry comprising the ground, underground structures and high-rise buildings.

Another project is simulation of the motion and deformation of the Earth's tectonic plates. With high-resolution crustal simulation, we might be able to gain better knowledge about the inner state of the earth, which is expected to be useful for improving long-term forecasting of earthquake activity. This simulation is an important task for earthquake disaster mitigation, and HPC is essential to conduct such simulations within a realistic timeframe. For example, in our SC19 research poster, we will show methods for conducting earthquake simulations using specialized hardware on HPC resources that was designed to accelerate deep learning (i.e., the Tensor Cores of V100 GPUs). The method attains over 1 ExaFLOPS for the core kernel and 400 PFLOPS for the entire solver on Summit at Oak Ridge National Laboratory, a significant speedup over past methods. It is expected to become a breakthrough in earthquake simulation.
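
The general trick of putting reduced-precision, deep-learning hardware to work inside a high-accuracy solver can be illustrated with classic mixed-precision iterative refinement. The sketch below is a generic textbook technique, not the University of Tokyo team's actual solver, and it uses float32 as a stand-in for FP16 Tensor Core arithmetic:

```python
# Generic mixed-precision iterative refinement: do the expensive solves in low
# precision (the role Tensor Cores play on GPUs), then recover accuracy with
# residual corrections computed in double precision. Illustrative only; not the
# University of Tokyo solver. float32 stands in for FP16 here.
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned test system
b = rng.standard_normal(n)

A_low = A.astype(np.float32)                      # low-precision copy of the matrix
x = np.linalg.solve(A_low, b.astype(np.float32)).astype(np.float64)

for _ in range(5):
    r = b - A @ x                                       # residual in double precision
    dx = np.linalg.solve(A_low, r.astype(np.float32))   # cheap low-precision correction
    x += dx.astype(np.float64)

print("final residual norm:", np.linalg.norm(b - A @ x))
```

Production solvers apply the same idea inside preconditioners and matrix kernels, which is where low-precision hardware can pay off without sacrificing the accuracy of the final answer.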

Ilkay: On September 1, we started a pilot program with the State of California that runs while fires are happening. We are able to get pertinent perimeter information about those fires and combine it with other data such as humidity, moisture of the land, the topography and weather forecasts. And we create simulations so we can send back predictions of what the fire will do next, such as how fast it will spread in the next couple of hours. The (firefighter) decision-makers have an interface to visualize the results and access the most relevant information.

We just used it on our first real fire last night, and everything worked smoothly. It's an exciting project; each fire has its own internal dynamics, and it keeps changing and evolving while firefighters are suppressing it.
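
For a sense of how inputs like weather, fuel moisture and terrain might be folded into a spread prediction, here is a deliberately toy formula. The structure and coefficients are invented for illustration and bear no relation to the pilot's actual models:

```python
# Toy rate-of-spread estimate combining the kinds of inputs mentioned above
# (weather, fuel moisture, terrain). The formula and coefficients are invented
# for illustration; operational fire models are far richer than this.
from dataclasses import dataclass

@dataclass
class FireInputs:
    wind_speed_kmh: float   # from weather forecasts
    fuel_moisture: float    # 0..1, from land and fuel data
    slope_percent: float    # from terrain data

def toy_spread_rate_kmh(x: FireInputs, base_rate_kmh: float = 0.5) -> float:
    """Illustrative only: faster with wind and slope, slower with moist fuels."""
    wind_factor = 1.0 + 0.08 * x.wind_speed_kmh
    slope_factor = 1.0 + 0.02 * x.slope_percent
    moisture_factor = max(0.1, 1.0 - 1.5 * x.fuel_moisture)
    return base_rate_kmh * wind_factor * slope_factor * moisture_factor

print(toy_spread_rate_kmh(FireInputs(wind_speed_kmh=30.0, fuel_moisture=0.08, slope_percent=10.0)))
```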

 

Michela: What’s next? How do you imagine HPC will significantly impact your field within 5 years?

 

Ilkay: I look at supercomputing now as a continuum of computing. There is definitely a big need for high-capacity systems to enable applications that need precision and capacity.

For these high-capacity models to work, you need an ecosystem around them, which I call composable systems. You're composing a solution by automating workflows and using all models of supercomputing. To me, that means supercomputing is becoming even more useful to our research. We are working on expanding a composable architecture that integrates external systems alongside high-capacity resources. Adding dynamic composition capability to high-capacity supercomputing will be the next phase of the evolution of our research.
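
One way to picture "composing a solution by automating workflows" is as a pipeline of interchangeable steps, where each step might run at the edge, in the cloud or on a supercomputer. The sketch below is a minimal, hypothetical illustration of that idea, not any specific SDSC workflow system:

```python
# Minimal, hypothetical sketch of a composable workflow: independent steps are
# chained into a pipeline, and each step could be dispatched to a different
# resource (edge device, cloud, HPC). Not any specific SDSC workflow system.
from typing import Callable, Dict, List

Step = Callable[[Dict], Dict]

def compose(steps: List[Step]) -> Step:
    """Chain steps so the output of one becomes the input of the next."""
    def pipeline(data: Dict) -> Dict:
        for step in steps:
            data = step(data)
        return data
    return pipeline

# Placeholder steps standing in for real ingestion, simulation and visualization.
def ingest(d: Dict) -> Dict:
    return {**d, "observations": "collected at the edge"}

def simulate(d: Dict) -> Dict:
    return {**d, "forecast": "computed on HPC"}

def publish(d: Dict) -> Dict:
    return {**d, "dashboard": "updated for decision-makers"}

run_workflow = compose([ingest, simulate, publish])
print(run_workflow({"event": "wildfire"}))
```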

Tsuyoshi/Kohei/Takuma: We expect that the convergence of AI and equation-based simulation will significantly accelerate physics-based modeling of earthquakes and begin to be used for practical decision-making by government and industry in the near future. For example, we are working closely with the Japanese government, which is extensively testing our physics-based large-scale analysis methods for practical use in earthquake and tsunami disaster estimation and mitigation.

 

–––

Michela Taufer, PhD, General Chair, SC19

Michela Taufer is the Dongarra Professor in the Min H. Kao Department of Electrical Engineering & Computer Science, Tickle College of Engineering, University of Tennessee, Knoxville.
