Subject: GEM Visualization
Resent-Date: Thu, 16 Dec 1999 21:57:05 -0500
Resent-From: Geoffrey Fox
Resent-To: p_gcf@npac.syr.edu
Date: Thu, 16 Dec 1999 17:00:03 -0700
From: Seth McGinnis / Beemer
To: gcf@npac.syr.edu

Geoffrey--

Here's my half-page. It's a much harder thing to write about than I had thought. Hope it is helpful.

--Seth

Visualization for the GEM Project

One of the aspects of the GEM community and its research that is interesting from a science-systems perspective (the meta-topic of "how to do science effectively") is its enormous heterogeneity. The kinds of data that earthquake modelers use vary enormously in density and size across many dimensions. For example: satellite SAR data samples the surface deformations of the landscape over a period of several days or weeks at incredible spatial resolution, while GPS receiver networks like the SCIGN array record the positional variations of a small number of discrete locations with very high temporal resolution. The challenge of visualization in the GEM context is to create methods for integrating these very different kinds of data.

To bring unity and cooperation to this field, we need to empower collaboration between researchers. The enormous diversity not only of the data types but also of the researchers themselves (and their computing capabilities) means that we must develop extensible, universal tools with a very low access cost that connect researchers with one another and with visualization systems, both established (GMT, for example) and new (a 3-D tensor stress-field display engine, for another).

These goals can be achieved through technologies like XML: establishing XML formats for geophysical datasets will allow universal interchange between collaborators from very different backgrounds. New style sheets can then be created and deployed to display the data, granting seamless access to a variety of visualization technologies. This design is both evolutionary and robust -- as new datatypes, collaborators, and display tools join the field, the system that connects them will grow to accommodate them. And because the interface technology has a low access cost, it remains usable across a range of support levels (by the modeler at his supercomputer and by the geologist in the field alike). Once established, the system could easily be extended for use by researchers in other fields entirely.

-----

BTW-- by "low access cost", I mean that there is little requirement in terms of software or hardware infrastructure. You'd just have portable translation filters to convert data into XML, then some style-sheet generators that invoke your favorite rendering engine. Got a new datatype? Just use the appropriate tool to write a new filter. Nobody uses your graph-making package? Just tell a different tool how to convert data for that engine. And so on. Does that make sense? The two sketches below illustrate what I mean.
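To make the translation-filter half concrete, here is a minimal Python sketch of what such a filter might look like for SCIGN-style GPS time series. Everything in it -- the element names, the record fields, the station code -- is invented for illustration; no established GEM schema is being described.

# A minimal sketch of a "translation filter": it converts tabular
# GPS records into a hypothetical XML interchange format.
# Every element name and field below is an illustrative assumption,
# not an established GEM schema.

import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class GPSRecord:
    station: str     # e.g. a SCIGN station code (hypothetical here)
    epoch: str       # observation time, ISO 8601
    east_mm: float   # displacement components in millimetres
    north_mm: float
    up_mm: float

def gps_to_xml(records):
    """Wrap a sequence of GPS records in a <dataset> element."""
    root = ET.Element("dataset", type="gps-timeseries", network="SCIGN")
    for r in records:
        obs = ET.SubElement(root, "observation",
                            station=r.station, epoch=r.epoch)
        for tag, value in (("east", r.east_mm),
                           ("north", r.north_mm),
                           ("up", r.up_mm)):
            ET.SubElement(obs, tag).text = str(value)
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(gps_to_xml([GPSRecord("LBC1", "1999-12-16T00:00:00Z",
                                1.3, -0.7, 2.1)]))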
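And a matching sketch of the style-sheet side: a small dispatcher that parses the interchange XML and hands it to whichever rendering engine has been registered. The engine names and the bare-bones GMT hand-off are assumptions about how such a layer might be organized, not a description of any existing tool.

# Sketch of the display side: one XML dataset, many rendering
# engines. Adding an engine means registering one function; the
# interchange format never changes. The "gmt-table" backend merely
# emits a plain table that a GMT script could plot -- the actual
# GMT invocation is deliberately left out.

import xml.etree.ElementTree as ET

RENDERERS = {}

def renderer(name):
    """Register a rendering backend under a short name."""
    def register(fn):
        RENDERERS[name] = fn
        return fn
    return register

@renderer("ascii")
def render_ascii(root):
    # Trivial text "engine": one line per observation.
    for obs in root.iter("observation"):
        parts = " ".join(f"{c.tag}={c.text}" for c in obs)
        print(obs.get("station"), obs.get("epoch"), parts)

@renderer("gmt-table")
def render_gmt_table(root):
    # Emit (epoch, vertical displacement) pairs for, say, psxy.
    for obs in root.iter("observation"):
        print(obs.get("epoch"), obs.findtext("up"))

def display(xml_text, engine="ascii"):
    """Parse the interchange XML and hand it to the chosen engine."""
    RENDERERS[engine](ET.fromstring(xml_text))

With both pieces in place, display(gps_to_xml(records), engine="gmt-table") takes you from raw GPS records to plottable output without either end knowing anything about the other -- which is the whole point of the low access cost.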