--------------------------------

Notes, GEM Meeting, 12/5/98, Argent Hotel, San Francisco, CA

Notes taken by S. McGinnis, University of Colorado

---------------------------------

Roscoe Giles:

Computing and networking power is increasing at a very fast pace.

PACI focuses on communities of scientists interacting, rather than single centers doing everything.

The Big Idea: to develop "The Grid": a system of dependable, consistent, pervasive access to (high-end) computing resources. This would transform the Internet into a utility, giving access to computation in the same way that we have access to electricity, without the end-user needing to care about the details. The Grid would possess fractal structure, (highly) parallel components, and variable-strength nodes and links.

The Goal: to make a powerful computational grid available and *accessible* to scientific users.

 

---------------------------------

Geoffrey Fox:

The GEM problem can be broken up into six components:

1) user interface
2) fault-scale (non-local) equation solver (for Green's functions)
3) modules specifying local physics and friction
4) model evaluation, data analysis, and data visualization
5) data storage, indexing, and access
6) investigation of complexity & pattern dynamics

 

Because it is a young program with little legacy code, GEM should be able to take advantage of Distributed Object Web technologies. [Using the Web to link together ephemeral code objects, rather than persistent documents.]

GEM requires HPCC, on the order of 1-100 teraflops. GEM should also take advantage of tree codes, which are *much* more efficient than N^2 codes for large N; N is huge in geophysics problems.
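
To make the tree-code remark concrete, here is a minimal, hypothetical Barnes-Hut-style sketch in Python (not anything GEM specified; the kernel, THETA, and all names are invented for illustration). It approximates the O(N^2) pairwise sum s_i = sum_j w_j * K(x_i, x_j) for a smooth, distance-decaying kernel -- standing in for the long-range Green's-function interactions mentioned above -- by treating well-separated clusters of sources as single aggregate sources, giving roughly O(N log N) work.

    import numpy as np

    THETA = 0.5   # opening criterion: smaller = more accurate, more work
    EPS = 1e-3    # softening to avoid the self-interaction singularity

    class Node:
        """One node of a 1-D binary tree over the source points."""
        def __init__(self, x, w):
            order = np.argsort(x)
            self._build(x[order], w[order])

        def _build(self, x, w):
            self.lo, self.hi = x[0], x[-1]
            self.wsum = w.sum()                       # aggregate source strength
            self.center = (x * w).sum() / self.wsum   # weighted cluster center
            if len(x) <= 8:                           # small leaf: keep the points
                self.leaf, self.x, self.w = True, x, w
            else:
                self.leaf = False
                mid = len(x) // 2
                self.children = [Node(x[:mid], w[:mid]), Node(x[mid:], w[mid:])]

        def eval(self, xt):
            """Approximate sum_j w_j / (|xt - x_j| + EPS) at target xt."""
            size = self.hi - self.lo
            dist = abs(xt - self.center)
            if not self.leaf and dist > 0.0 and size / dist < THETA:
                return self.wsum / (dist + EPS)       # far cluster -> one source
            if self.leaf:
                return np.sum(self.w / (np.abs(xt - self.x) + EPS))
            return sum(c.eval(xt) for c in self.children)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, 2000)       # source positions
        w = rng.uniform(0.5, 1.5, 2000)       # source strengths ("slips")
        tree = Node(x, w)
        approx = np.array([tree.eval(xi) for xi in x])
        direct = np.array([np.sum(w / (np.abs(xi - x) + EPS)) for xi in x])  # O(N^2)
        print("max relative error:", np.max(np.abs(approx - direct) / direct))

The accuracy/cost trade-off is controlled by THETA; production tree codes add higher-order multipole terms instead of the bare aggregate used here.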

GEM's User Interface goal is to create seamless access to distributed resources.

GEM can also get a lot of leverage from inter-organization collaboration.

For the integration of GEM into a Problem-Solving Environment, we should begin creating an object web.
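
As a minimal illustration of the "object web" idea -- a piece of model code exposed as a remotely callable object rather than a document -- here is a hypothetical sketch using Python's standard-library XML-RPC (chosen only because it is small and self-contained; it is not the distributed-object technology the notes have in mind, and FaultModule and its method are invented):

    from xmlrpc.server import SimpleXMLRPCServer

    class FaultModule:
        """A stand-in 'model element' with one remotely callable method."""
        def stress_change(self, slip_m, depth_km):
            # Invented toy formula, purely so the remote call returns something.
            return 0.5 * slip_m / max(depth_km, 1.0)

    def serve():
        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_instance(FaultModule())
        server.serve_forever()

    # A client elsewhere on the network would then do:
    #   import xmlrpc.client
    #   fault = xmlrpc.client.ServerProxy("http://localhost:8000")
    #   print(fault.stress_change(2.0, 10.0))

    if __name__ == "__main__":
        serve()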

The Pacific Disaster Center (PDC) is a good opportunity for these approaches.

The goal is not necessarily to *do* these things now, but to be *aware* of them and to make decisions so as to take advantage of them in the future.

 

-----------------------------

Jay Parker

Improving the Earth Science / Computer Science (ES/CS) link

Note that Earth Scientists don't actually need to know much CS jargon.

New computing paradigms can lead to better collaboration.

Web-Distributed Objects (currently a "rapidly-evolving" technology) could affect: model elements, coding and integration, geometry and rheology specification, data analysis and visualization, how the work is done, and image and idea communication.

One dream:

Work can become a remote-access jigsaw puzzle. We could pool existing model codes into a toybox for all; to do this, you would need to:

--Extend the web browser
--Wrap (small) existing legacy codes in Java (see the wrapper sketch after this list)
--Add geometry specification and results-viewing modules
--Define I/O parameters in wrappers
--Implement security mechanisms
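
A hypothetical sketch of the wrap-a-legacy-code idea (Python here for brevity, where the notes propose Java; the executable name "legacy_solver", the parameter names, and the file formats are all invented). The wrapper only declares the I/O parameters, writes an input file, runs the unmodified executable, and hands back the results as ordinary data:

    import pathlib
    import subprocess
    import tempfile

    # Declared I/O parameters: name -> (type, default). This is the "contract" a
    # mix-and-match environment would read to know how modules can be connected.
    PARAMETERS = {
        "fault_length_km": (float, 100.0),
        "friction_coeff":  (float, 0.6),
        "n_elements":      (int, 256),
    }

    def run_legacy_solver(exe="./legacy_solver", **params):
        """Run the (hypothetical) legacy executable and return its output values."""
        # Fill in defaults and type-check against the declared contract.
        values = {name: ptype(params.get(name, default))
                  for name, (ptype, default) in PARAMETERS.items()}

        workdir = pathlib.Path(tempfile.mkdtemp())
        infile, outfile = workdir / "input.dat", workdir / "output.dat"
        infile.write_text("".join(f"{k} = {v}\n" for k, v in values.items()))

        # The legacy code itself is untouched; we only control its inputs/outputs.
        subprocess.run([exe, str(infile), str(outfile)], check=True)
        return [float(v) for v in outfile.read_text().split()]

    # Example use (assuming such an executable exists):
    #   slips = run_legacy_solver(fault_length_km=250.0, n_elements=512)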

One experience:

MIDAS, for radar simulation, allowed limited mix-and-match processing. A specialist was able to get the entire system running in 2 weeks (with all the applications in place through ftp, telnet, etc.). Construction was easier than expected, but it added layers to maintain that were not kept up.

GEM work would involve early core code and later flexible, modular plug-in physics.

Eventually, GEM should encompass: distribution of web tools, access to a "big model", and a remote collaborative environment.

Meanwhile, we should cooperate in the usual ways and share code, data, and papers via ftp and the web.

 

----------------------------------

Bill Klein

Crossover scientists (formerly in Earth Science, now in CS) are a valuable and needed resource for GEM.

[Much general discussion]

 

----------------------------------

Bernard Minster

We want to couple together many different models at many scales, but we must make sure that we maintain modularity of the models.

We should change our idea of scale, using *LOTS* of computer resources.

GEM must THINK BIG.

Earth Science is a very data-intensive field. We need to learn how to deal with that.

Useful Tools:

Mathematicians are working on new methods for stable simulation of complex systems: multi-symplectic integrators.
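
As a toy illustration of why symplectic methods matter for long, stable runs, here is plain leapfrog on a harmonic oscillator (a sketch only; this is the simplest ordinary symplectic integrator, not the multi-symplectic PDE methods referred to above). Forward Euler lets the energy drift without bound, while leapfrog keeps the energy error bounded over arbitrarily long integrations:

    def euler(q, p, dt, n):
        # Non-symplectic reference scheme for dq/dt = p, dp/dt = -q
        for _ in range(n):
            q, p = q + dt * p, p - dt * q
        return q, p

    def leapfrog(q, p, dt, n):
        # Symplectic kick-drift-kick (velocity Verlet) scheme
        for _ in range(n):
            p -= 0.5 * dt * q
            q += dt * p
            p -= 0.5 * dt * q
        return q, p

    def energy(q, p):
        return 0.5 * (p * p + q * q)

    if __name__ == "__main__":
        q0, p0, dt, n = 1.0, 0.0, 0.05, 20000   # roughly 160 oscillation periods
        for name, method in [("forward Euler", euler), ("leapfrog", leapfrog)]:
            q, p = method(q0, p0, dt, n)
            print(f"{name}: final energy {energy(q, p):.3e} (exact: {energy(q0, p0):.3e})")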

Code exists at Argonne National Laboratory that will take any modeling code and produce from it the *adjoint* code. The adjoint code runs the physical evolution of the system *backwards* in time, allowing data to be assimilated into the original code.
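
To show what an adjoint gives you (a sketch only; this is not the Argonne tool, which generates adjoints automatically from existing model code), take a hypothetical scalar decay model u[k+1] = (1 - a*dt)*u[k] + dt*f[k]. One forward sweep plus one backward (adjoint) sweep through the same time steps yields the gradient of the data misfit with respect to the initial state and the parameter a -- exactly what a variational data-assimilation scheme needs:

    import numpy as np

    def forward(u0, a, f, dt):
        u = np.empty(len(f) + 1)
        u[0] = u0
        for k in range(len(f)):
            u[k + 1] = (1.0 - a * dt) * u[k] + dt * f[k]
        return u

    def misfit_and_adjoint(u0, a, f, dt, obs):
        """J = 0.5 * sum_k (u[k] - obs[k])^2 and its gradient (dJ/du0, dJ/da)."""
        u = forward(u0, a, f, dt)
        r = u - obs
        J = 0.5 * np.sum(r ** 2)

        dJ_da = 0.0
        lam = r[-1]                            # adjoint variable at the final time
        for k in range(len(f) - 1, -1, -1):    # march BACKWARDS in time
            dJ_da += lam * (-dt * u[k])        # parameter sensitivity accumulates
            lam = r[k] + (1.0 - a * dt) * lam  # adjoint of the forward update
        return J, lam, dJ_da                   # lam is now dJ/du0

    if __name__ == "__main__":
        dt, nsteps = 0.1, 50
        f = np.sin(0.3 * np.arange(nsteps))    # forcing
        obs = forward(1.0, 0.4, f, dt)         # synthetic "observations"
        J, g_u0, g_a = misfit_and_adjoint(0.8, 0.5, f, dt, obs)
        # Check the adjoint gradient against a finite difference in a:
        eps = 1e-6
        Jp, _, _ = misfit_and_adjoint(0.8, 0.5 + eps, f, dt, obs)
        print("adjoint dJ/da:", g_a, "  finite difference:", (Jp - J) / eps)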

Minster also discussed the new NSF Earth System Collaboratory idea now in the planning stages: 420M$/yr x 5 yrs = 1.4B$.

THINK BIG!

 

---------------------------------

Charlie Sammis

[Discussion about the reasons why a large simulation is needed. Is GEM proposing a large tool, rather than a large science problem? Some arguments for science (not tool) include the use of GEM as a tool to analyze scientific assumptions about the problem, and the question of why the simple models are viewed as valid for such a complex process.]

 

---------------------------------------

Terry Tullis

What an earthquake fault looks like depends on the scale at which you view it. The question: as we scale up, what do we throw out? (Since we must throw out something.) How do we properly parameterize sub-gridscale processes at the next hierarchical level?

The answer may depend on the question being asked, and we should catalog the questions.

 

---------------------------

[Attempt by scribe to summarize the general mindset of the GEM group with regard to the question of what GEM attempts to study: Earthquakes are a complex system consisting of many interacting components, both literally (faults) and symbolically/mathematically (friction vs. geometry). What we do not understand is the relative importance of these interactions at different scales. GEM proposes generating a computational infrastructure that will allow comparison of the elements in order to understand what is important and whether a given interaction can be parameterized.]

 

---------------------------------

Lisa Grant

The US nuclear testing program (in the '80s) was very successful and has many similarities to the GEM problem. The process that program used was to loop on:

--prediction
--test/observation
--code calibration

The proposed observation/calibration component for the earthquake problem is to study the rupture potential of the San Andreas - Parkfield - Cholame - Carrizo faults. This is a well-constrained system

--single fault, simple geometry
--large data set
--data at various scales

with a potentially large payoff in science and hazard mitigation.

Testing and prediction will be most successful if we focus not on when an event happens, but on where and in what manner.

Geology can provide some boundary conditions, and slip distribution can also help to calibrate models.

 

-------------------------

Kerry Sieh

The critical question is: how regular are earthquakes?

A single fault is unlikely to have a G-R (Gutenberg-Richter) frequency-magnitude distribution. Characteristic events do exist for some faults. A possibility: displacement tends to be similar, but recurrence time is irregular.
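
For readers outside seismology, a small illustrative contrast (all numbers invented) between Gutenberg-Richter behavior, log10 N(>=M) = a - b*M, where small events vastly outnumber large ones, and "characteristic" behavior, where a single fault keeps producing events of roughly one size:

    import numpy as np

    a, b = 4.0, 1.0                      # illustrative regional G-R parameters
    for M in np.arange(5.0, 8.1, 0.5):
        n_gr = 10 ** (a - b * M)         # expected events/yr of magnitude >= M
        # A "characteristic" fault: roughly an M7.5 every ~150 years, little else.
        n_char = 1.0 / 150.0 if M <= 7.5 else 0.0
        print(f"M>={M:3.1f}   G-R: {n_gr:8.4f}/yr   characteristic: {n_char:.4f}/yr")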

Do we need more complicated and expensive models?

Modellers and geologists need to communicate more. Modellers should ask the geologists to:

1) establish the general behavior of faults and fault systems
2) determine quantitative histories for particular systems

 

----------------------------

Mike Watkins

Most of GEM is about improving forward models; we want to work towards doing inverse modelling.

A useful tool is the "data assimilation" method. Using variational objective analysis, you can tune model parameters. This method also allows you to determine whether or not parameters are significant.
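
A hypothetical sketch of "tune parameters against data, then ask whether they are significant" (the two-parameter forward model, the noise level, and the use of scipy's least_squares are all invented for illustration, not anything Watkins described). The variational step is a least-squares minimization of the model-data misfit; significance is judged crudely from the approximate covariance of the fitted parameters:

    import numpy as np
    from scipy.optimize import least_squares

    def forward_model(params, t):
        amp, rate = params
        return amp * np.exp(-rate * t)

    def residuals(params, t, data):
        return forward_model(params, t) - data

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 10.0, 60)
        data = forward_model([2.0, 0.3], t) + 0.05 * rng.standard_normal(t.size)

        fit = least_squares(residuals, x0=[1.0, 1.0], args=(t, data))

        # Approximate parameter covariance from the Jacobian at the solution:
        # cov ~ s^2 * (J^T J)^-1, with s^2 the residual variance.
        m, n = t.size, fit.x.size
        s2 = 2.0 * fit.cost / (m - n)
        cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
        sigma = np.sqrt(np.diag(cov))

        for name, value, err in zip(["amp", "rate"], fit.x, sigma):
            # A parameter whose uncertainty rivals its value is not
            # meaningfully constrained (significant) given these data.
            print(f"{name}: {value:.3f} +/- {err:.3f}")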

 

-------------------------------

Tom Jordan

If we integrate all these themes, we see that we are talking about the interaction between physics, hypotheses, models, and data. For this reason, we shouldn't even use terms like "model validation" and "model verification".

We need to decide the focus of GEM: a theoretical function (numerical laboratory) or an observational function (data reconciliation, assimilation, and extrapolation)?

Two proposed tasks/foci: establishing the scaling boundaries of earthquake processes as a geosystem, and data assimilation.

[Dissenting opinion: we should break the grand endeavor into multiple smaller (but still big) projects.]

We have, basically, a new way of doing science: model inference. (Note that the model need not have equations in it. Example: avalanche problems.)

The study of a driven, non-equilibrium, threshold system is applicable to many fields.

GEM should focus on a *mission goal*, which is one element of a strategic research plan (SRP) to "understand earthquakes"; an SRP will generate lots of successes, as it did in astrophysics.

The systems approach is the most attractive approach to the NSF.

 

-----------------------------------

End of Meeting