John B. Rundle
Department of Physics
Colorado Center for Chaos and Complexity, and CIRES
University of Colorado, Boulder, CO 80309
rundle@hopfield.colorado.edu
Tom Henyey
Department of Earth Sciences and SCEC
University of Southern California, Los Angeles, CA 90089-0742
J.-Bernard Minster
IGPP, Scripps Institution of Oceanography
University of California at San Diego, La Jolla, CA 92093
Geoffrey Fox
Department of Physics
Syracuse University, Syracuse, NY 13244-4100
General Earthquake Models (GEM) are realistic, high performance computational simulations of earthquake processes on individual faults and on systems of faults within the earth's crust. Recent advances have been made in understanding specific aspects of earthquake processes on various spatial and temporal scales; however, these efforts have generally been carried out by individual scientists, leaving the field of earthquake science somewhat disjointed and poorly integrated. The next logical step will be to bridge the gaps between models of various scales, which will greatly advance our understanding of earthquake processes.
General Earthquake Models represent a coordinated effort within the United States' scientific community to develop highly realistic earthquake models for use in understanding the physics of earthquakes. GEM will develop a set of tools for use on high performance computers to simulate earthquake processes. By developing these tools so that the gaps between small and large scale models are bridged, we will effectively create a ``numerical laboratory''. Investigating the physics of earthquakes can then be approached as an ``experimental science'' that will complement existing observational and theoretical approaches. GEM will allow us to test for consistency between models and to explore the effects of friction, segmentation, loading due to plate motion, boundary conditions, random events, and the neglect of sub-grid-scale processes, all of which will result in more physically realistic models and further advance our understanding of earthquakes.
Data will be assimilated to validate the models, and through GEM, diverse data sets can be related to each other by a process known as ``Model-Based Inference''. Data collection and archiving will not be a part of GEM; however, we anticipate that GEM will identify data needs, which will help to drive data collection efforts in a variety of disciplines. We plan to use the GEM simulations to develop new space-time pattern recognition and analysis techniques. These may well lead to the development of forecasting and prediction methodologies for the reduction of earthquake hazard to lives and property.
Processes associated with earthquakes are known to occur over a wide variety of spatial and temporal scales. Figure 1 illustrates this point. For example, frictional processes, the primary nonlinearity associated with earthquakes, are known to be physically significant from the molecular scale, on time intervals of less than 10^-8 seconds and lengths of Angstroms, to plate motion scales, on time intervals of 10^6 years and lengths in excess of 1000 km. The physical processes directly associated with faulting and seismology, including nucleation and quasistatic crustal deformation, occur over time intervals of fractions of seconds to thousands of years, and lengths of meters to hundreds of km. By contrast, experiments on frictional sliding of rock samples are carried out over laboratory scales, extending over typical time intervals of 10^-3 seconds to days and over length scales of cm to m. Figure 1 indicates that, in terms of length scales, the processes of faulting, seismology and crustal deformation are as far removed from laboratory experiments on rock samples as these laboratory rock processes are from those on the molecular scale. There are no reliable means presently available to relate results observed on one set of scales to those on another. It is therefore of critical importance to develop an experimental numerical capability that can help to span these large ranges in scales, so that physical (and chemical) processes on one scale can be evaluated for their importance on other, distinctly different scales. An example is the need to determine whether empirical or theoretical friction laws developed to describe the nonlinearity of sliding on one temporal and spatial scale apply to sliding on other scales.
Within the region of spatial and temporal scales in Figure 1 describing earthquakes (denoted ``faulting and seismology''), there exist additional ``sub-scales'' that describe further hierarchies of processes. These include, but are not limited to, the following:
Time Scales: 10^-1 sec to 10^2 sec
Space Scales: 10^-1 km to 10^2 km
Time Scales: 10 sec to 10^8 sec
Space Scales: 10^-1 km to 10^3 km
Time Scales: 10^2 sec to 10^12 sec
Space Scales: 10^-1 km to 10^3 km
Time Scales: 10^-1 sec to 10^6 years
Space Scales: 10^-6 km to 1 km
The very large range of spatial and temporal scales that these problems involve clearly demonstrates that GEM is an HPC-class problem. Current evidence indicates that forecasting the damaging earthquakes of magnitude ~6 and greater almost certainly depends upon understanding the space-time patterns displayed by smaller events, e.g., the magnitude 3's, 4's and 5's. At least 40,000 km^2 of fault area in southern California are capable of participating in magnitude 6 and greater events. Hence, with a spatial resolution of about 100 m needed to eliminate grid-scale effects and to capture the physical processes of the magnitude 3 events, and a temporal resolution of ~100 sec, one arrives at the conclusion that as many as 10^6 grid sites will be necessary for a maximally realistic simulation. If grid sizes at the 10 m scale are used to capture the failure physics of the magnitude 3 events, then ~10^8 grid sites will be needed. Using current computational technology, run-time estimates of several months for this problem can be documented.
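The grid-site estimates above follow from simple arithmetic; the following is a minimal sketch of that calculation, assuming only the 40,000 km^2 fault area and the 100 m and 10 m resolutions quoted in the text.

```python
# Back-of-the-envelope check of the grid-site counts quoted above; the fault area
# and resolutions are the values stated in the text, everything else is illustrative.
fault_area_km2 = 40_000  # southern California fault area capable of M >= 6 events

def grid_sites(resolution_m):
    """Number of grid cells needed to tile the fault area at a given resolution."""
    cell_area_km2 = (resolution_m / 1000.0) ** 2
    return fault_area_km2 / cell_area_km2

print(f"100 m resolution: ~{grid_sites(100):.0e} sites")  # ~4e+06, of order 10^6
print(f" 10 m resolution: ~{grid_sites(10):.0e} sites")   # ~4e+08, of order 10^8
```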
A well-constructed simulation technology should hold the promise of an iterative solution to earthquake problems through a direct comparison of simulations with a wide variety of data. In this context, specific scientific objectives of our research include:
Objective 1
Cataloguing and understanding the nature and configurations of space-time patterns of earthquakes and examining whether these are scale-dependent or scale-invariant in space and time. Correlated patterns may indicate whether a given event is a candidate foreshock.
Objective 2
Understanding the importance of inertia and seismic waves in determining details of space-time patterns and slip evolution.
Objective 3
Understanding the physical justifications for space-time coarse-graining over smaller scales, the role of sub-grid scale processes, and whether these might be parameterized in terms of uncorrelated or correlated noise.
Objective 4
Developing and testing potential earthquake forecast algorithms, based primarily upon the use of space-time patterns and correlations in the fault system of interest.
Models
The first step in developing General Earthquake Models will be to identify existing codes and models ranging in scale from tenths to tens of kilometers. We expect that GEM will encompass the dynamics and physics of individual faults, all the way through systems of faults. This includes fault rupture, the causes of rupture growth into a large earthquake, fault interactions on short and long time scales, and the role of rheology, heterogeneity, randomness in material properties, and noise. A centerpiece of early development will be a newly constructed fault-interaction model, to be built by a world-class team of parallel computing physical modelers using modern object/component paradigms. The planned adaptation of the hugely successful astrophysical N-body algorithm (the ``Fast Multipole Method'') to the implementation of stress Green's functions on interacting fault systems will represent a major advance, which when completed will provide a unique capability for modeling complex crustal dynamics. The expected computer requirements as the model matures will exceed 1 teraflop performance, which will become available through a number of computing centers during this time. Our approach ensures that these resources are well spent, as the N-body algorithm is both accurate and efficient. In general, the friction models used for GEM will be based upon 1) laboratory experiments, such as the slip-weakening or rate-and-state models; 2) computationally simple models that capture in a simple way important aspects of frictional sliding, such as the classic Coulomb-Amontons law; or 3) statistical mechanics, in which the basic phenomenology of friction has been incorporated on coarse-grained space-time scales.
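To make the ingredients of such a fault-interaction model concrete, the sketch below (not the GEM code itself) shows a quasistatic model of interacting fault segments: uniform plate loading, stress transfer between segments through a Green's-function matrix (written here as a small dense matrix; at scale this matrix-vector product is what the Fast Multipole Method would accelerate), and a simple Coulomb-style failure threshold. All parameter values and variable names are illustrative assumptions.

```python
# A minimal sketch (not the GEM code) of a quasistatic interacting-fault-segment model:
# plate loading, stress transfer through a Green's-function matrix, and a Coulomb-style
# failure threshold. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200                                    # number of fault segments (illustrative)
loading_rate = 1e-3                        # stress accumulated per step by plate motion
sigma_fail = 1.0 + 0.1 * rng.random(n)     # heterogeneous static failure thresholds
stress = sigma_fail * rng.random(n)        # random initial stress state

# Toy stress-transfer Green's function: nearby segments receive most of a stress drop.
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) + 1.0
G = 0.2 / dist**2
np.fill_diagonal(G, 0.0)

events = []
for step in range(5000):
    stress += loading_rate                         # slow tectonic loading
    failed = stress >= sigma_fail
    slipped = np.zeros(n, dtype=bool)
    while failed.any():                            # cascade: redistribute stress drops
        drop = np.where(failed, stress, 0.0)
        stress[failed] = 0.0                       # complete stress drop on failed segments
        stress += G @ drop                         # stress transfer to other segments
        slipped |= failed
        failed = stress >= sigma_fail
    if slipped.any():
        events.append((step, int(slipped.sum())))  # event time and rupture size

print(f"{len(events)} synthetic events; largest rupture spans "
      f"{max(size for _, size in events)} segments")
```

Even this toy version produces a catalog of synthetic event times and rupture sizes of the kind that the pattern-analysis tools described below are intended to interrogate.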
Pattern Analysis
Anecdotal evidence accumulated over many years clearly indicates the existence of space-time patterns in seismicity data. The exact nature of these patterns, however, has so far eluded identification. As part of the project, computational tools will be developed to analyze the information content of both real and simulated earthquakes using new ideas about pattern analysis and pattern reconstruction in complex systems. These include methods based upon log-periodic behavior and the correlation dimension, as well as more recent techniques based upon the discrete Karhunen-Loeve expansion (``Pattern Dynamics'').
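As one illustration of the Karhunen-Loeve (``Pattern Dynamics'') approach mentioned above, the sketch below decomposes a space-time activity matrix into spatial eigenpatterns and their time-dependent amplitudes. The synthetic input and all names and sizes are illustrative stand-ins for binned real or simulated seismicity.

```python
# A minimal sketch of the Karhunen-Loeve (principal component) decomposition of a
# space-time activity matrix; the synthetic input is a stand-in for binned seismicity.
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_times = 50, 400

# Synthetic activity: one coherent space-time pattern buried in noise.
pattern = np.outer(np.sin(np.linspace(0.0, np.pi, n_sites)),
                   np.cos(np.linspace(0.0, 20.0 * np.pi, n_times)))
activity = pattern + 0.5 * rng.standard_normal((n_sites, n_times))

# Karhunen-Loeve expansion: eigenmodes of the spatial correlation operator,
# computed here via the SVD of the mean-removed data matrix.
demeaned = activity - activity.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(demeaned, full_matrices=False)

explained = s**2 / np.sum(s**2)
print("variance captured by the first three KL modes:", np.round(explained[:3], 3))
# U[:, k] is the k-th spatial eigenpattern and Vt[k] its time-dependent amplitude.
```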
Computations
The current environment is exemplified by isolated research groups that specialize in home-brewed software. To move to new, more fruitful collaborations marked by shared resources, we will take advantage of the emerging web-based object technology to create an environment in which computing platforms, software, data and researchers can easily communicate. A standard computational framework, consisting of a web-server brokerage system and web-browser based development, data-flow and visualization tools, will be agreed to and its components acquired by the collaborating institutions. Because this is a rapidly evolving field, we defer the choice of tools to the initial phase of the work, and expect it to evolve as the collaboration proceeds. Promising technologies are described in ``Building Distributed Systems for the Pragmatic Object Web'' (Fox et al., http://www.npac.syr.edu/users/shrideep/book/), and prototypical systems have been investigated at JPL and NPAC.
Data Assimilation and Model Calibration
Following, and in parallel with, the code development, GEM will build and test modules to compare observational data with the models. First will be tests for consistency between models; then data assimilation capabilities will be added so that the models can be calibrated and initialized using real data. Data assimilation techniques have been developed by the climate and ocean modeling community that start with a given model code, from which the ``adjoint model'' is calculated. The adjoint can be run backwards in time, beginning with current data, to infer the original initial and boundary conditions that led to the current conditions. This computational procedure allows a subset of model variables to be adjusted so that initial conditions, historical observations, and current data can all be optimally matched. Field data of interest include GPS, InSAR and broadband seismic (TERRASCOPE) data, together with archived and newly developed paleoseismic information in the SCEC database.
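The adjoint idea can be illustrated on a toy linear model. The sketch below is only a schematic of the approach described above (the adjoint codes used by the climate and ocean modeling community are generated from far more elaborate models); all dimensions, rates and names are illustrative assumptions.

```python
# A schematic of adjoint-based assimilation for a toy linear model x_{t+1} = A x_t:
# the misfit with the "observations" is carried back to t = 0 with the adjoint
# (here simply A^T) and the initial condition is adjusted by gradient descent.
import numpy as np

rng = np.random.default_rng(2)
n, n_steps = 8, 10
A = np.eye(n) + 0.01 * rng.standard_normal((n, n))  # toy forward "model code"

def forward(x0):
    """Run the toy model forward, returning the state at every time step."""
    states, x = [], x0
    for _ in range(n_steps):
        x = A @ x
        states.append(x)
    return states

x_true0 = rng.standard_normal(n)       # the initial condition we pretend not to know
observations = forward(x_true0)        # synthetic "data" at every time step

x0 = np.zeros(n)                       # first guess of the initial condition
for _ in range(300):
    residuals = [x - obs for x, obs in zip(forward(x0), observations)]
    # Adjoint sweep: accumulate the gradient of J = 0.5 * sum_t ||x_t - obs_t||^2
    # with respect to x0 by applying A^T backwards through time.
    adj = np.zeros(n)
    for r in reversed(residuals):
        adj = A.T @ (adj + r)
    x0 -= 0.05 * adj                   # gradient-descent update of the initial state

print("initial-condition error after assimilation:", np.linalg.norm(x0 - x_true0))
```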
Validation via Laboratory Experiments
While the collection of significant new laboratory data is outside the scope of the GEM project, data from laboratory experiments are needed to validate the GEM simulations. Validation can take many forms, but there would seem to be two primary functions: 1) the use of experimental data in verifying numerical techniques and the physics that is included in (or excluded from) the models; 2) the use of numerical models in connection with a concurrent active program of laboratory experiments to further identify the physical mechanisms that operate in laboratory experiments and earthquake fault zones. However, it must be recalled that laboratory experiments provide information only about a strongly limited set of space-time scales for given materials and conditions. How these results scale up or apply to field situations is unknown.
Project organization will include the following functions:
Planning and Coordination
Modeling and Analysis
Computations
Validation/Data Assimilation
Outreach and Information Dissemination
It would be best if close coordination could be maintained between ongoing projects and their various participants. The optimal means of establishing such coordination is probably through the management structures associated with the Southern California Earthquake Center and its successor, the California Earthquake Research Center.
Priorities will be determined by:
1) Intellectual maturity of the problem
2) Interests of potential funding agencies and programs.
While no detailed cost breakdowns have been formulated, it is possible to envision an extremely basic, ``bare bones'' GEM project funded at the level of $100,000/year. Funding for a self-contained GEM project would begin at the level of $1,000,000/year, a level that would allow an active GEM Team to be formed and to substantively address a number of the tasks described within a period of about 5 years. To make truly significant progress on all of the major tasks within 3 to 5 years would probably require funding in excess of $2,000,000/year.
The figures can be included as Encapsulated PostScript files.
Figure 1: Space-Time Scales of Physical Processes (from Minster, 1998)
Research upon which this paper is based has been funded by the US Department of Energy, the National Aeronautics and Space Administration, and the National Science Foundation.