Editor: Jiancang Zhuang
Authors: Jiancang Zhuang, David Harte, Maximilian J. Werner, Sebastian Hainzl, and Shiyong Zhou
Abstract: In this and subsequent articles, we present an overview of some models of seismicity that have been developed to describe, analyze and forecast the probabilities of earthquake occurrences. The models that we focus on are not only instrumental in the understanding of seismicity patterns, but also important tools for time-independent and time-dependent seismic hazard analysis. We intend to provide a general and probabilistic framework for the occurrence of earthquakes. In this article, we begin with a survey of simple, one-dimensional temporal models such as the Poisson and renewal models. Despite their simplicity, they remain highly relevant to studies of the recurrence of large earthquakes on individual faults, to the debate about the existence of seismic gaps, and also to probabilistic seismic hazard analysis. We then continue with more general temporal occurrence models such as the stress-release model, the Omori-Utsu formula, and the ETAS (Epidemic Type Aftershock Sequence) model.
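For concreteness, two of the temporal models named above can be written down in their standard forms; the notation below follows common usage and is not necessarily that of the article itself:

```latex
% Omori-Utsu formula for the aftershock rate at time t after a mainshock;
% K, c and p are empirical parameters.
n(t) = \frac{K}{(t + c)^{p}}

% Temporal ETAS conditional intensity: a background rate \mu plus Omori-type
% contributions from every earlier event i with magnitude M_i above the
% cutoff magnitude M_c.
\lambda(t \mid \mathcal{H}_t) = \mu + \sum_{i:\, t_i < t}
    \frac{K_0 \, e^{\alpha (M_i - M_c)}}{(t - t_i + c)^{p}}
```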
Authors: Jiancang Zhuang, Maximilian J. Werner, Sebastian Hainzl, David Harte, and Shiyong Zhou
Abstract: In this article, we present a review of spatiotemporal point-process models, including the epidemic type aftershock sequence (ETAS) model, the EEPAS (Every Earthquake a Precursor According to Scale) model, the double branching model, and related techniques. Here we emphasize the ETAS model, because it has been well studied and is currently a standard model for testing hypotheses related to seismic activity.
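As a rough illustration of how such a model is evaluated in practice, the sketch below computes a space-time ETAS conditional intensity for a small catalog. It assumes a constant background rate, a normalized modified-Omori temporal kernel, and an isotropic Gaussian spatial kernel whose width scales with magnitude; the parameter names (mu, K0, alpha, c, p, d) are illustrative and need not match the article's parameterization.

```python
import numpy as np

def etas_intensity(t, x, y, catalog, mu, K0, alpha, c, p, d, m0):
    """Space-time ETAS conditional intensity at (t, x, y).

    catalog: array of shape (n, 4) with columns (t_i, x_i, y_i, m_i),
    coordinates in km.  Simplified, illustrative parameterization.
    """
    past = catalog[catalog[:, 0] < t]
    if past.size == 0:
        return mu
    dt = t - past[:, 0]
    r2 = (x - past[:, 1]) ** 2 + (y - past[:, 2]) ** 2
    dm = past[:, 3] - m0
    productivity = K0 * np.exp(alpha * dm)             # expected number of direct offspring
    temporal = (p - 1.0) / c * (1.0 + dt / c) ** (-p)  # normalized modified-Omori density
    sigma2 = d ** 2 * np.exp(alpha * dm)               # magnitude-dependent spatial scale
    spatial = np.exp(-r2 / (2.0 * sigma2)) / (2.0 * np.pi * sigma2)
    return mu + np.sum(productivity * temporal * spatial)
```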
Authors: Sebastian Hainzl, Sandy Steacy, and David Marsan
Abstract: Our fundamental physical understanding of earthquake generation is that stress build-up leads to earthquakes within the brittle crust, rupturing mainly pre-existing crustal faults. While absolute stresses are difficult to estimate, the stress changes induced by earthquakes can be calculated, and these have been shown to affect the location and timing of subsequent events. Furthermore, constitutive laws derived from laboratory experiments can be used to model earthquake nucleation on faults and the propagation of ruptures. Exploiting this physical knowledge, quantitative seismicity models have been built. In this article, we discuss the spatiotemporal seismicity model based on the rate-and-state dependent frictional response of fault populations introduced by Dieterich (1994). This model has been shown to explain a variety of observations, e.g. the Omori-Utsu law for aftershocks. We focus on the following issues: (i) necessary input information; (ii) model implementation; (iii) data-driven parameter estimation; and (iv) consideration of the epistemic and aleatoric uncertainties involved.
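The key analytic result of the Dieterich (1994) model is the seismicity-rate response of a fault population to a sudden stress step under an otherwise constant stressing rate. The sketch below encodes that standard closed-form response; it is only a minimal illustration and omits the time-dependent stressing histories and uncertainty treatment discussed in the article.

```python
import numpy as np

def dieterich_rate(t, delta_tau, r, A_sigma, tau_dot):
    """Seismicity rate R(t) after an instantaneous stress step delta_tau
    (Dieterich, 1994), for a fault population with background rate r under a
    constant tectonic stressing rate tau_dot.

    A_sigma : product of the constitutive parameter A and the normal stress.
    t_a = A_sigma / tau_dot is the characteristic aftershock duration; for
    t >> t_a the rate relaxes back to the background value r.
    """
    t = np.asarray(t, dtype=float)
    t_a = A_sigma / tau_dot
    gamma = (np.exp(-delta_tau / A_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
    return r / gamma

# Example: a 0.1 MPa stress step, A*sigma = 0.02 MPa, stressing rate 1e-4 MPa/day
rates = dieterich_rate(np.linspace(0.0, 1000.0, 5), 0.1, r=1.0, A_sigma=0.02, tau_dot=1e-4)
```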
Author: Takaki Iwata
Abstract: Earthquakes are often observed to be triggered by external oscillations of stress or strain, the typical causes of which are Earth tides and the seismic waves of a large earthquake. Because no clear physical models of this type of earthquake triggering have been developed, statistical approaches are used to detect and discuss the triggering effects. This article presents a review of the proposed physical processes, common statistical techniques, and recent developments related to this issue.
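One of the most widely used statistical techniques in this context is the Schuster test, which asks whether the phases of earthquakes with respect to the periodic loading are uniformly distributed. The sketch below is illustrative only and does not necessarily reflect the specific tests discussed in the article.

```python
import numpy as np

def schuster_test(phases):
    """Schuster test for periodic triggering.

    phases : phase angles (radians) of the earthquakes with respect to the
    periodic load (e.g. the tidal cycle).  Returns the p-value of the null
    hypothesis that the phases are uniformly distributed, i.e. that
    occurrence is not correlated with the oscillating stress.
    """
    phases = np.asarray(phases, dtype=float)
    n = len(phases)
    d2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
    return np.exp(-d2 / n)
```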
Authors: David Marsan and Max Wyss
Abstract: Earthquake time series can be characterized by the rate of occurrence, that is, the number of earthquakes per unit time. Occurrence rates generally evolve through time; for example, they increase strongly immediately after a large shock. Understanding and modeling this time evolution is a fundamental issue in seismology, particularly for prediction purposes.
Seismicity rate changes can be subtle, with a slow time evolution or a gradual onset long after the cause. Therefore, in many instances it has proved problematic to assess whether a change in rate is real, i.e., whether it is statistically significant. Here we review and describe existing methods for measuring seismicity rate changes and for testing the significance of these changes. Null hypotheses of 'no change', which depend on the context, are formulated; statistics are then defined to quantify the departure from these null hypotheses. We illustrate these methods with several examples.
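As an example of such a statistic, the beta statistic of Matthews and Reasenberg (1988) compares the number of events observed in a target window with the number expected under a constant (Poisson) rate estimated from a reference period. The sketch below illustrates this one measure only and is not a summary of all methods reviewed in the article.

```python
import numpy as np

def beta_statistic(n_window, n_total, t_window, t_total):
    """Beta statistic for a seismicity-rate change (Matthews & Reasenberg, 1988).

    n_window : number of events in the target window of duration t_window
    n_total  : number of events in the whole reference period of duration t_total
    Under the null hypothesis of a constant Poisson rate, beta is approximately
    standard normal, so |beta| > 2 suggests a significant rate change (~95% level).
    """
    p = t_window / t_total
    expected = n_total * p
    return (n_window - expected) / np.sqrt(n_total * p * (1.0 - p))
```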
Authors: Thomas van Stiphout, Jiancang Zhuang, and David Marsan
Abstract: Seismicity declustering, the process of separating a seismicity catalog into foreshocks, mainshocks, and aftershocks, is widely used in seismology, in particular for seismic hazard assessment and in earthquake prediction models. Several declustering algorithms have been proposed over the years. Up to now, most users have applied either the algorithm of Gardner and Knopoff (1974) or that of Reasenberg (1985), mainly because of the availability of the source codes and the simplicity of the algorithms. However, declustering algorithms are often applied blindly, without scrutinizing the parameter values or the result. In this article we present a broad range of algorithms, and we highlight the fundamentals of seismicity declustering and possible pitfalls. For most algorithms, the source code or information on how to access it is available on the CORSSA website.
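To give a flavor of the window-based approach, the sketch below implements a much-simplified variant of Gardner and Knopoff (1974) declustering, using analytic fits to their space-time windows that are commonly quoted in the literature. It is illustrative only; the source codes referenced on the CORSSA website should be preferred for real applications.

```python
import numpy as np

def gk_window(m):
    """Space-time window (distance in km, time in days) for a magnitude-m mainshock,
    using commonly quoted fits to the Gardner & Knopoff (1974) table."""
    dist_km = 10.0 ** (0.1238 * m + 0.983)
    time_days = 10.0 ** (0.032 * m + 2.7389) if m >= 6.5 else 10.0 ** (0.5409 * m - 0.547)
    return dist_km, time_days

def decluster_gk(times, x_km, y_km, mags):
    """Flag as aftershocks all events that fall inside the space-time window of a
    larger, earlier event (coordinates assumed already projected to km).
    Returns a boolean mask that is True for retained (mainshock/background) events."""
    times, x_km, y_km, mags = map(np.asarray, (times, x_km, y_km, mags))
    is_aftershock = np.zeros(len(times), dtype=bool)
    for i in np.argsort(mags)[::-1]:            # process the largest events first
        if is_aftershock[i]:
            continue
        d_km, t_days = gk_window(mags[i])
        dt = times - times[i]
        dr = np.hypot(x_km - x_km[i], y_km - y_km[i])
        is_aftershock |= (dt > 0) & (dt <= t_days) & (dr <= d_km) & (mags < mags[i])
    return ~is_aftershock
```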
Authors: Jiancang Zhuang and Sarah Touati
Abstract: Starting from basic simulation procedures for random variables, this article presents the theories and techniques related to the simulation of general point-process models that are specified by a conditional intensity. In particular, we focus on the simulation of point-process models for quantifying the characteristics of earthquake occurrence processes, including the Poisson model (homogeneous or nonhomogeneous), renewal (recurrence) models, the stress-release model, and the ETAS models.
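A central ingredient of such simulations is the thinning (Lewis-Shedler/Ogata) method, in which candidate times are drawn from a dominating homogeneous Poisson process and accepted with probability proportional to the target conditional intensity. The sketch below shows the idea for an intensity with a known global upper bound; for self-exciting models such as ETAS, the bound must be updated after each accepted event (Ogata's modified thinning), which this simple version does not do.

```python
import numpy as np

def simulate_by_thinning(intensity, t_end, lam_max, seed=None):
    """Simulate event times on [0, t_end] by thinning, given a conditional
    intensity function intensity(t, history) bounded above by lam_max."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)      # candidate from a rate-lam_max Poisson process
        if t > t_end:
            break
        if rng.random() < intensity(t, events) / lam_max:
            events.append(t)                     # accept with probability lambda(t) / lam_max
    return np.array(events)

# Example: a nonhomogeneous Poisson process with intensity 5 + 4*sin(t)
times = simulate_by_thinning(lambda t, h: 5.0 + 4.0 * np.sin(t), t_end=100.0, lam_max=9.0)
```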