
Nowcasting Earthquakes With Stochastic Simulations: Information Entropy of Earthquake Catalogs.

Authors :
Rundle, John B.
Baughman, Ian
Zhang, Tianjian
Source :
Earth & Space Science. Jun2024, Vol. 11 Issue 6, p1-12. 12p.
Publication Year :
2024

Abstract

Earthquake nowcasting has been proposed as a means of tracking the change in large earthquake potential in a seismically active area. The method was developed using observable seismic data, in which probabilities of future large earthquakes can be computed using Receiver Operating Characteristic methods. Furthermore, analysis of the Shannon information content of earthquake catalogs has been used to show that there is information contained in the catalogs and that it can vary in time. An important question therefore remains: where does the information originate? In this paper, we examine this question using stochastic simulations of earthquake catalogs. Our catalog simulations are computed using an Earthquake Rescaled Aftershock Seismicity ("ERAS") stochastic model. This model is similar in many ways to other stochastic seismicity simulations, but has the advantage that it has only two free parameters to be set: one for the aftershock (Omori-Utsu) time decay, and one for the aftershock spatial migration away from the epicenter. Generating a simulation catalog and fitting the two parameters to an observed catalog, such as that of California, takes only a few minutes of wall-clock time. While clustering can arise from random, Poisson statistics, we show that significant information in the simulation catalogs arises from the "non-Poisson" power-law aftershock clustering, implying that the practice of de-clustering observed catalogs may remove information that would otherwise be useful in forecasting and nowcasting. We also show that the nowcasting method provides similar results with the ERAS model as it does with observed seismicity.

Plain Language Summary: Earthquake nowcasting was proposed as a means of tracking the change in the potential for large earthquakes in a seismically active area, using the record of small earthquakes. The method was developed using observed seismic data, in which probabilities of future large earthquakes can be computed using machine learning methods that were originally developed with the advent of radar in the 1940s. These methods are now being used in the development of machine learning and artificial intelligence models in a variety of applications. In recent times, methods to simulate earthquakes using the observed statistical laws of earthquake seismicity have been developed. One of the advantages of these stochastic models is that they can be used to analyze the various assumptions that are inherent in the analysis of seismic catalogs. In this paper, we analyze the importance of the space-time clustering that is often observed in earthquake seismicity. We find that this clustering is the origin of the information that makes the earthquake nowcasting methods possible. We also find that the common practice of "aftershock de-clustering", often used in the analysis of these catalogs, removes information about future large earthquakes.

Key Points:
Earthquake nowcasting tracks the change in the potential for large earthquakes, using information contained in seismic catalogs
We analyze the information contained in the space-time clustering that is observed in earthquake seismicity
We find that "aftershock de-clustering" of catalogs removes information about future large earthquakes that the nowcasting method uses

[ABSTRACT FROM AUTHOR]
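
The abstract's reference to the Shannon information content of a catalog can be illustrated with a minimal sketch. The paper's exact entropy estimator is not given in the abstract, so the windowing, binning, and function name below are illustrative assumptions only: small-event counts are tallied in fixed time windows and the Shannon entropy of the resulting count distribution is computed.

```python
import numpy as np

def catalog_entropy(event_times, window_days=30.0, n_bins=20):
    """Shannon entropy (bits) of the per-window event-count distribution.

    event_times: 1-D array of event origin times in days.
    NOTE: a generic estimator for illustration; the paper's exact
    definition of catalog information is not specified in the abstract.
    """
    t0, t1 = np.min(event_times), np.max(event_times)
    edges = np.arange(t0, t1 + window_days, window_days)
    counts, _ = np.histogram(event_times, bins=edges)

    # Empirical probability of each count value, then Shannon entropy.
    hist, _ = np.histogram(counts, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```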
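The abstract describes the ERAS model as having two free parameters, one controlling Omori-Utsu aftershock time decay and one controlling spatial migration of aftershocks away from the epicenter. The actual ERAS rescaling algorithm is not described in the abstract; the sketch below shows only those two generic ingredients, using inverse-transform sampling from truncated power-law kernels, with all parameter values and function names chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def omori_utsu_times(n, p=1.2, c=0.05, t_max=365.0):
    """Draw n aftershock delay times (days) with Omori-Utsu decay
    rate ~ (t + c)**(-p), truncated to [0, t_max]."""
    u = rng.uniform(size=n)
    if np.isclose(p, 1.0):
        return c * ((1.0 + t_max / c) ** u - 1.0)
    a = c ** (1.0 - p)
    b = (t_max + c) ** (1.0 - p)
    return (a + u * (b - a)) ** (1.0 / (1.0 - p)) - c

def aftershock_offsets(n, q=1.8, d=0.5, r_max=100.0):
    """Draw n epicentral distances (km) from a power-law kernel
    ~ (r + d)**(-q), the 'spatial migration' ingredient."""
    u = rng.uniform(size=n)
    a = d ** (1.0 - q)
    b = (r_max + d) ** (1.0 - q)
    return (a + u * (b - a)) ** (1.0 / (1.0 - q)) - d
```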
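Finally, the abstract notes that probabilities of future large earthquakes are evaluated with Receiver Operating Characteristic methods. A standard ROC computation is sketched below, assuming a per-window nowcast score (for example, a small-event count) and binary labels marking windows followed by a large event; this is a generic ROC/area-under-curve routine, not the authors' specific nowcasting pipeline.

```python
import numpy as np

def roc_curve(scores, labels):
    """False/true positive rates swept over all thresholds on the score.

    scores: nowcast values per window (higher = greater hazard)
    labels: 1 if a large earthquake follows the window, else 0
    """
    order = np.argsort(scores)[::-1]          # descending by score
    labels = np.asarray(labels)[order]
    tpr = np.cumsum(labels) / labels.sum()
    fpr = np.cumsum(1 - labels) / (1 - labels).sum()
    return fpr, tpr

def roc_auc(scores, labels):
    """Area under the ROC curve; 0.5 means no skill, 1.0 perfect skill."""
    fpr, tpr = roc_curve(scores, labels)
    return float(np.trapz(tpr, fpr))
```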

Details

Language :
English
ISSN :
23335084
Volume :
11
Issue :
6
Database :
Academic Search Index
Journal :
Earth & Space Science
Publication Type :
Academic Journal
Accession number :
178093137
Full Text :
https://doi.org/10.1029/2023EA003367