General Earthquake Models:
Numerical Laboratories for Understanding
The Physics of Earthquakes
A Proposed Role Within the
Southern California Earthquake Center
and the Proposed
California Earthquake Research Center
Prepared by:
John Rundle
Colorado Center for Chaos & Complexity
University of Colorado, Boulder, CO 80309
J.-Bernard Minster
IGPP, SIO
University of California, San Diego 92093
A Consensus Proposal from the
General Earthquake Model Working Group
Members:
Claude Allegre, IPG & French Science Ministry, Paris, France
Yehuda Ben-Zion, University of Southern California
Jacobo Bialek, Carnegie Mellon University
William Bosl, Stanford University
David Bowman, University of Southern California
Charles Carrigan, Lawrence Livermore National Laboratory, Livermore, CA
James Crutchfield, Santa Fe Institute, Santa Fe, NM
Julian Cummings, ACL, Los Alamos National Laboratory, Los Alamos, NM
Steven Day, San Diego State University
Geoffrey Fox, NPAC, Syracuse University, Syracuse, NY
William Foxall, Lawrence Livermore National Laboratory, Livermore, CA
Roscoe Giles, Boston University, Boston, MA
Rajan Gupta, ACL, Los Alamos National Laboratory
Tom Henyey, SCEC and University of Southern California
Thomas H. Jordan, Massachusetts Institute of Technology, Cambridge, MA
Hiroo Kanamori, California Institute of Technology, Pasadena, CA
Steven Karmesin, ACL, Los Alamos National Laboratory, Los Alamos, NM
Charles Keller, IGPP, Los Alamos National Laboratory
William Klein, Boston University, Boston, MA
Karen Carter Krogh, EES, Los Alamos National Laboratory, NM
Shawn Larsen, Lawrence Livermore National Laboratory, Livermore, CA
Christopher J. Marone, Massachusetts Institute of Technology, Cambridge, MA
John McRaney, SCEC and University of Southern California
Paul Messina, CACR, California Institute of Technology, Pasadena, CA
J.-Bernard Minster, University of California, San Diego, CA
David O'Halloran, Carnegie Mellon University
Lawrence Hutchings, Lawrence Livermore National Laboratory, Livermore, CA
Jon Pelletier, California Institute of Technology
John Reynders, ACL, Los Alamos National Laboratory, Los Alamos, NM
John B. Rundle, University of Colorado, Boulder, CO
John Salmon, CACR, California Institute of Technology, Pasadena, CA
Charles Sammis, University of Southern California
Steven Shkoller, CNLS, Los Alamos National Laboratory
Stewart Smith, University of Washington
Ross Stein, United States Geological Survey, Menlo Park, CA
Leon Teng, University of Southern California
Donald Turcotte, Cornell University, Ithaca, NY
Michael Warren, ACL, Los Alamos National Laboratory, Los Alamos, NM
Andrew White, ACL, Los Alamos National Laboratory
Bryant York, Northeastern University, Boston, MA
Introduction:
A workshop on "General Earthquake Models" (GEMs) was held in Santa Fe, NM, during October 23-25, 1997. The primary objective was to explore the possibility of developing the computational capability to carry out large-scale numerical simulations of the physics of earthquakes in southern California and elsewhere. These simulations would be capable of producing detailed temporal and spatial patterns of earthquakes, surface deformation and gravity change, seismicity, and stress, as well as, in principle, other variables such as pore fluid and thermal changes, for comparison with field and laboratory data. To construct the simulations, a state-of-the-art problem-solving environment must at some point be developed that will facilitate: 1) construction of the numerical and computational algorithms and the specific environment(s) needed to carry out large-scale simulations of these nonlinear physical processes over a geographically distributed, heterogeneous computing network; and 2) development of a testbed for earthquake "forecasting" and "prediction" methodologies that uses modern object-oriented techniques and scalable systems, software, and algorithms that are efficient in terms of both human effort and computational execution time.
The consensus of the group at the workshop was that the GEM project would fit well both as an activity group within the existing Southern California Earthquake Center and as a candidate for a major research focus group within the proposed new State of California Earthquake Center. For GEM to be considered within either the existing or the proposed Center, it was determined that this short description and proposal should be submitted to the SCEC board of directors.
Lead PI:
John Rundle, University of Colorado, Boulder, CO
Initial Members of the GEM Steering Committee:
John Rundle, C4, University of Colorado at Boulder, Chairman
Yehuda Ben-Zion, University of Southern California
Jacobo Bialek, Carnegie Mellon University
William Bosl, Lawrence Livermore National Laboratory
Steven Day, San Diego State University
Geoffrey Fox, NPAC, Syracuse University
Roscoe Giles, Boston University
Thomas Jordan, Massachusetts Institute of Technology
Hiroo Kanamori, California Institute of Technology
William Klein, Boston University
J.-Bernard Minster, IGPP, University of California at San Diego
John Salmon, CACR, California Institute of Technology
Charles Sammis, University of Southern California
Steven Shkoller, CNLS, Los Alamos National Laboratory
Ross Stein, United States Geological Survey
Donald Turcotte, Cornell University
Michael Warren, ACL, Los Alamos National Laboratory
Andrew White, ACL, Los Alamos National Laboratory
Bryant York, Northeastern University
Other representatives should be nominated by the SCEC Board.
List of Co-PIs:
Name        Institution  Role
==========  ===========  =======================================
JB Rundle   Colorado     Lead Earth Science
GC Fox      Syracuse     Lead Computer Science
JB Minster  UCSD         Interface with SCEC and User Community
Description of the project:
Objectives: The primary objective is to develop the computational capability to carry out large-scale numerical simulations of the physics of earthquakes in California and elsewhere. To meet this objective, a state-of-the-art problem-solving environment is needed that will facilitate: 1) the construction of the numerical and computational algorithms and the specific environment(s) needed to carry out large-scale simulations of these nonlinear physical processes over a geographically distributed, heterogeneous computing network; and 2) the development of a testbed for earthquake "forecasting" and "prediction" methodologies that uses modern object-oriented techniques and scalable systems, software, and algorithms that are efficient in terms of both human effort and computational execution time.
Method: Work will be based on numerical simulation techniques initially developed by Stuart (1986), Rundle (1988), and Ward and Goes (1993), using these efforts as starting points for modeling the physics of earthquake fault systems in southern California. The problem-solving environment will be built using the best available parallel algorithms and software systems. It will leverage ASCI/DOE and other state-of-the-art national activities in the simulation of both cellular automata and large-scale particle systems.
Scientific and Computational Foci: We will focus on developing the capability to carry out large-scale simulations of complex, multiple, interacting fault systems using a software environment optimized for rapid prototyping of new phenomenological models. The software environment will require: 1) developing algorithms for solving computationally difficult nonlinear problems involving ("discontinuous") thresholds and nucleation events in a networked parallel (super)computing environment, adapting new "fast multipole" methods previously developed for general N-body problems; 2) evolving an object-oriented framework for problem solving, perhaps similar to the POOMA framework developed by the Los Alamos ACL; and 3) developing an environment that allows researchers to rapidly integrate simulation data with field and laboratory data, both visually and quantitatively. This task is becoming increasingly important with the exponential increase in seismic data, as well as space-based deformation data. The idea is to design a sufficiently flexible computational interface so that new physics can be added to the models easily, for example new friction laws, enhanced wave-propagation algorithms, or new inelastic bulk constitutive properties; a sketch of such an interface follows.
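As an illustration of the kind of pluggable interface envisioned, the following is a minimal sketch in Python. The class and method names (FrictionLaw, failure_stress, stress_drop) are hypothetical and chosen for exposition only; they are not part of any existing GEM code.

    # Minimal sketch of a pluggable friction-law interface (hypothetical names).
    # The simulation core asks each fault segment's friction law when the
    # segment fails and how much stress it sheds; new physics is added by
    # subclassing, leaving the core untouched.

    from abc import ABC, abstractmethod

    class FrictionLaw(ABC):
        @abstractmethod
        def failure_stress(self, segment_state):
            """Shear stress at which the segment fails."""

        @abstractmethod
        def stress_drop(self, segment_state):
            """Stress released when the segment slips."""

    class StaticKineticFriction(FrictionLaw):
        """Simple static/kinetic threshold law: fail at tau_s, drop to tau_k."""
        def __init__(self, tau_static, tau_kinetic):
            self.tau_s = tau_static
            self.tau_k = tau_kinetic

        def failure_stress(self, segment_state):
            return self.tau_s

        def stress_drop(self, segment_state):
            return self.tau_s - self.tau_k

    # Usage: the core works only through the abstract interface.
    law = StaticKineticFriction(tau_static=1.0, tau_kinetic=0.1)

A rate-dependent or slip-weakening law would subclass FrictionLaw in the same way, which is the sense in which the environment supports rapid prototyping of new phenomenology.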
Problem(s) in earthquake source physics:
Short-Term Focus: The group will focus on a problem similar to that described by Rundle (1988), which grew out of earlier work by Stuart (1986) and was followed by the work of Ward and Goes (1993). This effort will initially use quasistatic evolution of stress fields in California. The model will have a layered, linear viscoelastic rheology with fixed faults, and stress interactions mediated by Green's functions. The inputs to the model will be the fault geometry, an initial stress field resulting from past events, and the tectonic loading. Outputs will be synthetic earthquake space-time histories and stress fields. The relevance of the model includes: 1) developing an understanding of the fundamental physical processes by providing a means for testing models against data, as well as illuminating relationships between different data types; 2) the ability to compute ensembles of hazard models for ensemble-type predictions; and 3) providing families of detailed earthquake scenarios for planning purposes. Data to be used include GPS, trilateration, and other surface deformation data. Experience with simulations of this type shows that synthetic field data can be made as realistic as desired for comparison with natural, observed data. Major emphasis in the short term will be on: 1) constructing better and more versatile stress Green's functions, for example including poroelasticity, general dipping faults, and heterogeneous media; 2) a general friction-law interface so that arbitrary friction laws can be used; 3) computational efficiencies and enhancements so that a large number of fault segments, particularly those in northern California, can be modeled; and 4) construction of more general, user-friendly graphical interfaces so that general users will find the codes useful. A minimal sketch of the basic stress-evolution loop appears below.
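The following Python sketch conveys the structure of such a quasistatic model: segments load under tectonic stressing, fail at a threshold, and redistribute stress to other segments through an interaction matrix standing in for the Green's functions. The interaction matrix, loading rate, and threshold rule here are illustrative placeholders, not the actual Rundle (1988) formulation.

    # Schematic quasistatic fault-network model: N fault segments load under
    # tectonic stressing; a segment reaching its failure threshold slips, and
    # the slip redistributes stress to other segments through an interaction
    # matrix G (a stand-in for the stress Green's functions).  All numerical
    # values are illustrative.

    import numpy as np

    N = 100                               # number of fault segments
    rng = np.random.default_rng(0)
    G = 0.2 * rng.random((N, N)) / N      # weak positive stress transfer
    np.fill_diagonal(G, -1.0)             # slip unloads the failing segment
    tau = rng.random(N)                   # initial stress from past events
    tau_fail, tau_residual = 1.0, 0.1     # failure threshold / post-slip stress
    load_rate, dt = 0.01, 1.0             # tectonic loading

    catalog = []                          # synthetic space-time history
    for step in range(10000):
        tau += load_rate * dt             # uniform tectonic loading
        nfailed = 0
        while np.any(tau >= tau_fail):    # cascade of failures = one event
            failing = np.flatnonzero(tau >= tau_fail)
            slip = np.zeros(N)
            slip[failing] = tau[failing] - tau_residual   # stress drop -> slip
            tau += G @ slip               # redistribute stress via G
            nfailed += failing.size
        if nfailed:
            catalog.append((step, nfailed))

    print(f"{len(catalog)} events; largest involved "
          f"{max(n for _, n in catalog)} segments")

Because the off-diagonal transfer is smaller than the on-diagonal stress drop, every cascade terminates, and the resulting catalog of event times and sizes is the kind of synthetic history the real model would produce for comparison with data.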
As discussed in detail in the full GEM proposal, the objectives of the calculations include, for example: 1) catalogs of space-time patterns of activity at various spatial and temporal scales, for comparison to, and validation by, data; 2) identification of correlated activity involving several faults, so that enhanced probabilities of activity following a large event can be estimated; 3) methods for initializing the codes with historic activity, so that projections of future activity can be made with a view to forecasting future events; and 4) the roles of sub-grid-scale faults and physical processes in space-time pattern selection.
Intermediate-to-Long-Term Focus: The longer-term focus will be on several geophysical and computational issues that will allow us to develop much larger and more general simulations than previously possible. For example, in an N-body simulation, the phase-space density distribution is represented by a large collection of "particles" which evolve in time according to some physically motivated force law. Direct implementation of these systems of equations is a trivial programming exercise: it is simply a double loop (see the sketch below). It vectorizes well, and it parallelizes easily and efficiently. Unfortunately, it has an asymptotic time complexity of O(N^2): each of N left-hand sides is a sum of N-1 right-hand sides. The fact that the execution time scales as N^2 precludes a direct implementation for values of N larger than a few tens of thousands, even on the fastest parallel supercomputers. Special-purpose machines (such as GRAPE) have been used successfully, but these do not seem appropriate for an evolving field like GEM while we are still in the prototyping phase.
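For concreteness, here is the direct-sum double loop in Python for a gravitational-style pairwise interaction. The softening parameter eps is a standard regularization device; the specific force law is purely illustrative.

    # Direct O(N^2) N-body force evaluation: every particle sums the
    # contributions of the other N-1 particles.  Simple and easily
    # parallelized, but the cost grows quadratically with N.

    import numpy as np

    def direct_forces(pos, mass, eps=1e-3):
        """pos: (N, 3) positions; mass: (N,) masses; returns (N, 3) forces."""
        N = len(pos)
        forces = np.zeros_like(pos)
        for i in range(N):                # N left-hand sides ...
            for j in range(N):            # ... each a sum of N-1 terms
                if i == j:
                    continue
                r = pos[j] - pos[i]
                dist3 = (r @ r + eps**2) ** 1.5   # softened |r|^3
                forces[i] += mass[i] * mass[j] * r / dist3
        return forces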
Several methods have been introduced which allow N-body simulations to be performed on arbitrary collections of bodies in time much less than O(N^2), without the imposition of a lattice (Appel, 1985; Barnes and Hut, 1986; Greengard and Rokhlin, 1987; Anderson, 1992). They have in common the use of a truncated expansion to approximate the contribution of many bodies with a single interaction. The resulting complexity is usually O(N) or O(N log N), which allows computations using orders of magnitude more particles. These methods represent a system of N bodies hierarchically by means of a spatial tree data structure; aggregations of bodies at various levels of detail form the internal nodes of the tree (cells). Making a good choice of which cells interact and which do not is critical to the success of these algorithms (Salmon and Warren, 1994). N-body simulation algorithms which use adaptive tree data structures are referred to as "treecodes" or "fast multipole" methods. By their nature, treecodes are inherently adaptive and are most applicable to dynamic problems with large density contrasts, while fast multipole methods have mostly been non-adaptive and applied to fairly uniform problems. It is likely that this distinction will blur in the future, however. A simplified treecode sketch follows.
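To make the idea concrete, here is a heavily simplified Barnes-Hut-style treecode sketch in one dimension, in Python. Real treecodes work in three dimensions with multipole expansions beyond the monopole (total mass and center of mass) used here; all names are illustrative, and coincident body positions are not handled.

    # Simplified 1-D Barnes-Hut treecode sketch.  A cell summarizes its
    # bodies by total mass and center of mass (a monopole); if a cell is
    # well separated from the evaluation point (size/distance < theta),
    # its bodies are replaced by that single summary interaction.

    class Node:
        def __init__(self, bodies, lo, hi):          # bodies: [(x, mass), ...]
            self.lo, self.hi = lo, hi
            self.mass = sum(m for _, m in bodies)
            self.com = sum(x * m for x, m in bodies) / self.mass
            self.left = self.right = None
            if len(bodies) > 1:                      # split until one body/leaf
                mid = 0.5 * (lo + hi)
                L = [b for b in bodies if b[0] < mid]
                R = [b for b in bodies if b[0] >= mid]
                if L: self.left = Node(L, lo, mid)
                if R: self.right = Node(R, mid, hi)

    def accel(node, x, theta=0.5, eps=1e-3):
        if node is None:
            return 0.0
        d = node.com - x
        size = node.hi - node.lo
        # Opening criterion: accept the cell's monopole if well separated,
        # or if the node is a leaf; otherwise descend into the children.
        if (node.left is node.right is None) or size < theta * abs(d):
            return node.mass * d / (abs(d) ** 3 + eps)
        return (accel(node.left, x, theta, eps) +
                accel(node.right, x, theta, eps))

    # Usage: build the tree once, then evaluate the field at each body.
    bodies = [(0.1, 1.0), (0.35, 2.0), (0.5, 1.5), (0.9, 1.0)]
    root = Node(bodies, 0.0, 1.0)
    fields = [accel(root, x) for x, _ in bodies]

The opening parameter theta controls the accuracy/speed trade-off: smaller theta forces more of the tree to be opened and approaches the direct sum.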
Other foci will be on developing an understanding of: 1) the role of sub-grid-scale processes, such as small unmodeled faults, heterogeneous media properties, and boundaries; 2) how to incorporate wave and fully inertial dynamics into simulations having boundaries and complex three-dimensional structure, and evaluating the extent to which these are necessary to achieve a full understanding of the physics of earthquake rupture; and 3) the extent to which the physics of seismicity on small space-time scales is representative of the physics on larger spatial and longer temporal scales. We will also begin to investigate other approaches based on either scalar (Lyakovsky et al., 1997) or tensor damage mechanics.
Proposed interfaces with SCEC's existing science and outreach programs:
We anticipate that interfacing with SCEC's existing science and outreach programs will be straightforward. In fact, it is fortunate that such a vigorous effort already exists, because otherwise it would have to be developed for GEM. The GEM group will in fact be able to provide, at some scale, a user-friendly modeling, simulation, and data-interpretation capability that may turn out to be a useful product for the informed public. In particular, these kinds of computational efforts have proven invaluable in the atmospheric sciences community for communicating the excitement and promise of the science, not to mention the risks, to the public at large and to political groups such as city, state, and national government officials. One can only imagine how such communication efforts could be enhanced by the availability, following a moderate event, of calculations and maps detailing some of the future consequences of the event, including enhanced or depressed rates of fault creep on nearby faults, faster or slower deformation rates, and advanced or retarded seismic "clock" times.
Anticipated scientific/computational products:
The primary product will be a set of computational tools useful for simulating seismicity and earthquakes, crustal deformation and stress changes, pore pressure changes, and eventually changes in fault geometry and configuration, as well as other physical and chemical processes, in whatever spatial and temporal detail is desired, or at least to whatever detail is computationally feasible. These simulations could, with various obvious caveats, be used to "forecast" or "predict" future activity, as well as to analyze space-time patterns of events, understand the physics of earthquakes, and so forth.
As such, these products will be of considerable value, IF NOT MISUSED, to governmental and corporate entities such as insurance companies, earthquake engineering consulting firms, the PEER center, and others that need this kind of input to evaluate seismic risk.
Anticipated outreach products as they pertain to earthquake hazard reduction:
The major outreach products that will flow from the above include the computational tools in their fully operational form, or perhaps simplified versions of these tools that may be useful for educational or other purposes.
Proposed timelines for above:
For short-term focus models: 1-3 years, depending on level of effort
For long-term focus models: 5-10 years, depending on level of effort
Anticipated funding and computational resources external to SCEC:
Likely to be substantial, since it will now be possible to tap into sources such as:
DOE funding: on the order of $1 million/year from geosciences/ASCI/HPCC sources
NSF KDI/CISE funding: on the order of $3 million/year; we intend to submit a proposal to the new KDI initiative, due in March
NSF EAR: on the order of several hundred thousand dollars/year
Foundation funding: possibly $1+ million/year from sources such as Keck, Packard, etc.
Insurance industry: unknown; the industry might be willing to pay several million dollars/year or more for tools to forecast risk
Anticipated international collaboration:
At this time, it appears that the major international collaborators will be the ACES group and possibly the Japanese CAMP project. However, we have heard of interest from French, German, and Russian groups as well.
Location and organization of the GEM "activity center":
At the moment, the primary organizational/science activity center will be the Colorado Center for Chaos & Complexity. However, Caltech, Scripps, Syracuse, UCLA, USC, Boston University, Los Alamos, LLNL, and other institutions will play major roles. Major computational centers will be: NPAC, Syracuse; MARINER node at Boston University; NPACI, San Diego; ASCI facilities at Los Alamos and LLNL.
References:
Appel, A.W., An efficient program for many-body simulation, SIAM J. Sci. Stat. Comp., 6, 85, 1985.
Barnes, J.E. and P. Hut, A hierarchical O(N log N) force-calculation algorithm, Nature, 324, 446-449, 1986.
Greengard, L. and V. Rokhlin, A fast algorithm for particle simulations, J. Comp. Phys., 73, 325-348, 1987.
Lyakovsky, V., Y. Ben-Zion, and A. Agnon, J. Geophys. Res., in press, 1997.
Rundle, J.B., A physical model for earthquakes, 2, Application to southern California, J. Geophys. Res., 93, 6255-6274, 1988.
Stuart, W.D., Forecast model for recurring large and great earthquakes in southern California, J. Geophys. Res., 91, 13771-13786, 1986.
Ward, S.N. and S. Goes, How regularly do earthquakes recur? A synthetic seismicity model for the San Andreas fault, Geophys. Res. Lett., 20, 2131-2134, 1993.