SUMMARY OF OUTREACH TO CEWES MSRC USERS

Since the great majority of users of the CEWES MSRC are off-site, the CEWES MSRC PET effort places emphasis on outreach to remote users as well as to users located on-site at CEWES. Table 3 lists the contacts made with CEWES MSRC users by the CEWES MSRC PET team during Year 2, and Table 2 lists all travel by the CEWES MSRC PET team in connection with the Year 2 effort. A major component of outreach to CEWES MSRC users is the training courses (described in Section VIII) conducted by the CEWES MSRC PET team, some of which are conducted at remote user sites and some of which are web-based. The CEWES MSRC PET website, accessible from the CEWES MSRC website, is also a major medium for outreach to CEWES MSRC users, and all material from the training courses is posted there. A CD-ROM of training material is also being prepared, as noted in Section VIII. Specific outreach activities conducted in Year 2 are described in this section, which is organized by the individual components of the CEWES MSRC PET effort. Two major workshops related to user outreach are also described: a Cross-MSRC Workshop with the DoD CTA Leaders and a Cross-CTA Gridding Workshop.

First, however, comes a summary of the CEWES MSRC user taxonomy, which is continually updated over the course of the CEWES MSRC PET effort in order to understand the makeup and potential needs of the CEWES MSRC user community.

------------------------------------------------------------------------------

CEWES MSRC User Taxonomy

The CEWES MSRC PET team first published a taxonomy of CEWES MSRC users in March 1997. At that time, user statistics were available only for the Cray YMP and C90. Since then, the YMP has been taken out of service, and utilization statistics have become available for the Cray T3E and IBM SP. A second taxonomy report is being prepared to analyze usage of the Cray C90 and T3E and the IBM SP. Some preliminary findings of that report are summarized here.

A comparison of Cray C90 utilization for July through December of 1996 with the same period in 1997 appears in the following table. The metric used in this comparison is Megaword-Hours, the product of memory words and CPU hours, and the Cray C90 utilization is presented in thousands of Megaword-Hours by CTA.

CTA          CSM    CFD    SIP   CCM   CWO   CEA   EQM   Other
----------------------------------------------------------------
Jul-Dec 96  1,700  1,560   545   372   110    68    62    110
Jul-Dec 97    234  1,449   0.7   120   108    17    54    101

This illustrates a dramatic shift of HPC work to the new parallel platforms during the first half of 1997, driven to a large extent by the migration of CSM work.

The previous taxonomy used Megaword-Hours as the measure of resource utilization, which is appropriate for a vector computer with a global memory. It is less appropriate as a measure of utilization for parallel platforms, however, where the amount of memory used is a function of the number of processors requested and may not be the amount of memory actually needed by the application. Thus, CPU-Hours is the utilization metric for the IBM SP and Cray T3E in the following table, which shows the utilization of these machines for September through November of 1997 by CTA.

CTA         CSM   CFD   SIP   CCM   CWO   CEA   EQM   CEN   FMS   Other
------------------------------------------------------------------------
T3E Usage   202    54   6.1    47   102   9.3    -    7.1    -     8.8
SP Usage     59    39   3.3    29    94    -    1.2    -     4     5.2

From these results it is seen that the CSM and CWO users have been the most active in utilizing these scalable parallel platforms.
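
For concreteness, the following minimal C sketch shows how the two utilization metrics used above are computed. All job figures are hypothetical illustrations, and CPU-Hours are assumed here to accumulate as processors times wall-clock hours.

    /* Minimal sketch contrasting the two utilization metrics above; all
       job figures are hypothetical illustrations, not CEWES MSRC data. */
    #include <stdio.h>

    int main(void) {
        /* Vector C90: Megaword-Hours = memory words used x CPU hours. */
        double mem_mwords = 50.0;  /* hypothetical: 50 million words resident */
        double cpu_hours  = 10.0;  /* hypothetical: 10 CPU hours consumed */
        double mw_hours   = mem_mwords * cpu_hours;    /* 500 Megaword-Hours */

        /* Parallel T3E/SP: charge CPU-Hours, assumed here to accumulate as
           processors x wall-clock hours, since memory charged scales with
           processors requested rather than memory actually needed. */
        int    pes        = 64;    /* hypothetical: 64 processing elements */
        double wall_hours = 2.5;   /* hypothetical: 2.5 wall-clock hours */
        double cpu_hrs    = pes * wall_hours;          /* 160 CPU-Hours */

        printf("C90 job: %.0f Megaword-Hours\n", mw_hours);
        printf("T3E job: %.0f CPU-Hours\n", cpu_hrs);
        return 0;
    }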

The first user taxonomy showed that a large percentage of CEWES MSRC users were not located on-site at CEWES. That continues to be true. The following table lists the Cray C90 user locations that used at least 5% of total C90 resources during July - December 1997, as indicated by account IDs. Account ID may not indicate a user's physical location in all cases, however, since most of the users with accounts through ARO and ONR are located at universities.

C90
Institution            % total MWh
----------------------------------
AFRL-WPAFB                   13.1
NRL-DC                       12.5
NSWC-Carderock               10.9
CEWES                         9.5
ONR-DC                        7.9
ARL-Aberdeen                  7.4
DSWA                          7.4
AFOSR                         6.7
AF Space & Missiles           5.6
AFRL-Kirtland                 5.3

Similar statistics on the locations of T3E and SP users are contained in the following two tables, which also give the average number of processing elements (PEs) used at each location. The results show that the users are taking advantage of the parallel processing capabilities of the T3E and SP.

T3E
Institution      Avg # PEs   % total CPU
----------------------------------------
CEWES                   75          48.3
NRL-Stennis             50          23.5
ONR-DC                  73          10.9
AFRL-Edwards            62          10.9

SP
Institution      Avg # PEs   % total CPU
----------------------------------------
NRL-Stennis             65          41.1
CEWES                   34          28.3
ONR-DC                  21          10.3
ARO                     51           7.9

------------------------------------------------------------------------------

CFD: Computational Fluid Dynamics CTA

Interactions with CEWES MSRC users have been initiated by a variety of means. Telephone calls, email, and personal visits have all resulted in opportunities for CEWES MSRC user support and for more specific collaborative efforts. Face-to-face visits have resulted from meeting DoD users at technical conferences such as the AIAA CFD meeting and the annual AIAA Aerospace Sciences meeting.

Specific CEWES MSRC user outreach efforts have also been made. For example, in August 1997 a trip was made to the AF Phillips Laboratory at Kirtland AFB, during which we met with members of the Satellite Assessment Center. This trip has resulted in an ongoing collaboration with CEWES MSRC user David Medina of Phillips Lab (see the CFD part of Section V). User outreach has also been accomplished through the Scalable Parallel Programming Workshops, wherein users are introduced to parallel programming within the context of their own codes. These workshops are a particularly effective venue for user outreach and training, since they give the on-site CTA lead a chance to meet and interact with users on an individual basis and to learn about their work within a semi-formal classroom environment. Training conducted at remote user sites at DoD labs presents excellent opportunities for interacting with other CEWES MSRC users within our CTA.

Finally, both Bova and Thompson visited Jay Boris, the DoD CTA Leader, on several occasions, and both communicated with him by phone and email.

------------------------------------------------------------------------------

CSM: Computational Structural Mechanics CTA

All projects in the CSM CTA are a direct result of collaboration with CEWES MSRC users at CEWES. The focus of on-site user outreach in Year 2 was identifying the key on-site CSM CEWES MSRC users and determining their short-term and long-term requirements. Several meetings were held with Raju Namburu of CEWES regarding his Challenge Project work. Discussions were held with Robert Hall regarding the requirements of his Structural Analysis group and possible areas of CEWES MSRC PET support. The on-site lead (Weed) participated in meetings with Leon Chandler and Dave Medina at AF Phillips Lab regarding the CEWES MSRC PET program. Meetings were held with Steve Akers of the CEWES Structures Lab to outline his work with the EPIC code and potential areas of CEWES MSRC PET support. Support was given to Larry Lynch of the pavement modeling group at CEWES to assist with his decision on funding university development of a parallel finite element solver.

LeRay Dandy and Bruce Loftis of NCSA visited Raju Namburu at CEWES to discuss monitoring large CTH simulations during execution on a high performance computer. A Focused Effort is currently under way to develop this system using CUMULVS and EnSight.

LeRay Dandy and Bruce Loftis met with Raju Namburu at CEWES to discuss linking CTH and Dyna3D simulations. A Focused Effort has been proposed for Year 3 to identify the most appropriate (robust) Eulerian-Lagrangian linking algorithm and to deliver a software design specification for the new system.

LeRay Dandy and Bruce Loftis visited Steve Akers at CEWES to discuss simplifying EPIC analysis input; a Dyna3D-to-EPIC Translator has been developed to accomplish this task. They also met with Akers to discuss improving EPIC performance, since EPIC runs on the Cray C90, which will be decommissioned during Year 3; a Focused Effort has been developed to accomplish this task. LeRay Dandy met with Jon Windham and Doug Strasburg of CEWES to discuss a new material model for CTH. Experimental work has been performed, and we have verified that current analytical models are inadequate.

----------------------------------------------------------------------------

CWO: Climate/Weather/Ocean Modeling CTA

OSU personnel Bedford, Sadayappan, Welsh, and Zhang visited CEWES MSRC to meet with CEWES user Bob Jensen and CWO on-site lead Cox. Discussions took place concerning the requirements for a seamless restart capability for the parallel WAM wave model. Further discussion concerned the requirements and strategies for the coupling of the WAM model, the CH3D marine circulation model, the COSED marine bottom boundary layer model, and the MM5 atmospheric circulation model.

Welsh and Jensen exchanged emails concerning how to add unsteady current and depth effects to WAM. It was concluded that the WAM pre-processor routines which pre-calculate the relevant arrays should be re-used in the main calculation module, called every time updated currents and depths are received from CH3D. Zhang and Billy Johnson at CEWES exchanged emails concerning how to generate the grid mesh for Lake Michigan and how to verify the computed temperature field with the current sequential version of the CH3D code.
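
A minimal sketch of this update strategy, with hypothetical routine names standing in for the actual WAM and CH3D code, might look as follows in C: the pre-processor routines are invoked once at startup and re-invoked inside the main time loop whenever CH3D delivers updated currents and depths.

    /* Hedged sketch of the WAM/CH3D update strategy described above; all
       names and values are hypothetical stand-ins for the real codes. */
    #include <stdio.h>

    static void wam_precompute_arrays(double current, double depth) {
        /* Stand-in for the WAM pre-processor: rebuild propagation arrays. */
        printf("rebuilding arrays for u = %.2f m/s, h = %.2f m\n",
               current, depth);
    }

    static void wam_step(double t) {
        printf("WAM spectral step at t = %.0f h\n", t);   /* stand-in */
    }

    static int ch3d_fields_updated(double t, double *u, double *h) {
        /* Stand-in for the coupler: pretend CH3D sends fields every 3 h. */
        if ((int)t % 3 == 0) { *u += 0.1; *h -= 0.05; return 1; }
        return 0;
    }

    int main(void) {
        double u = 0.5, h = 20.0;        /* hypothetical current and depth */
        wam_precompute_arrays(u, h);     /* stand-alone pre-processor pass */
        for (double t = 1.0; t <= 6.0; t += 1.0) {
            if (ch3d_fields_updated(t, &u, &h)) /* new CH3D fields arrived? */
                wam_precompute_arrays(u, h);    /* re-use the pre-processor */
            wam_step(t);
        }
        return 0;
    }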

OSU personnel Bedford, Zhang, and O'Neil attended the 5th International Conference on Estuarine and Coastal Modeling in Alexandria, Virginia. They talked with experts in the field of Lake Michigan hydrodynamic modeling (e.g., David Schwab and Dmitry Belestkey of NOAA GLERL). They also discussed the problems of applying CH3D to Lake Michigan with Harry Wang of CEWES.

Welsh and Jensen used email to discuss the physical meaning of "wave stress" and "total stress" in the Janssen air/water boundary layer model in WAM. It was concluded that the wave stress should be used to modify the surface drag coefficient used in CH3D.
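
A minimal sketch of this stress coupling, assuming the common bulk relation Cd = (u*/U10)^2 with u*^2 = tau_total/rho_air (an illustrative assumption, not necessarily the exact Janssen formulation in WAM), is:

    /* Hedged sketch of the stress coupling: the WAM wave stress augments
       the total surface stress from which CH3D's drag coefficient is
       derived.  The bulk relation below is an illustrative assumption,
       not the exact Janssen implementation; all values are hypothetical. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double rho_air = 1.225;  /* air density, kg/m^3 */
        double U10      = 15.0;   /* hypothetical 10-m wind speed, m/s */
        double tau_turb = 0.30;   /* hypothetical turbulent stress, N/m^2 */
        double tau_wave = 0.12;   /* hypothetical WAM wave stress, N/m^2 */

        double tau_total = tau_turb + tau_wave;       /* total surface stress */
        double u_star    = sqrt(tau_total / rho_air); /* friction velocity */
        double Cd = (u_star * u_star) / (U10 * U10);  /* drag coeff for CH3D */

        printf("Cd including wave stress: %.5f\n", Cd);
        return 0;
    }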

OSU personnel Bedford, Sadayappan, and Welsh met with Jensen and Christine Cuicchi of CEWES and with Beaty of Cray/SGI regarding the evaluation of the ongoing sequential and parallel WAM deployments on the various CEWES MSRC platforms. A Hurricane Luis simulation was selected for use, and detailed plans were agreed upon for deployment verification. The tests chosen focused on seamless restart and on measuring the differences between calculations made on the different platforms and with varying numbers of processors. Welsh met with Jensen to discuss the further verification of the pre-existing WAM current-induced propagation and refraction algorithms.

Welsh exchanged emails with Richard Strelitz, the CEWES MSRC PET SV on-site lead, concerning the use of OSU WAM and CH3D results for an ITL visualization presentation. Welsh and Zhang subsequently provided Strelitz with the requested data plus sample extraction and plotting codes, sample plots and animations, and documentation of all materials.

-------------------------------------------------------------------------------

EQM: Environmental Quality Modeling CTA

-------------------------------------------------------------------------------

FMS: Forces Modeling and Simulation/C4 CTA

As noted earlier, the natural HPCC user base in the DoD FMS community is not large. Consequently, the primary interface with the DoD FMS community has been through the DoD FMS CTA Lead, Robert Wasilausky of NRaD, and the various researchers involved in FMS CHSSI projects. In addition to routine contacts by email and at various meetings, this fall Syracuse University's FMS Lead, Dr. Wojtek Furmanski, was invited to attend an internal review of the FMS CHSSI program. This meeting, conducted at the Space and Naval Warfare Systems Command (SPAWAR) in San Diego, provided an opportunity for the CEWES MSRC PET team to keep abreast of CHSSI activities and for both groups to exchange views on the evolution of the field.

During a visit to the Army Research Lab in Aberdeen, Maryland, Dr. Furmanski also had the opportunity to talk with the heads of the Aberdeen Test Center and the Virtual Proving Ground project. Although these organizations are technically more closely allied with the IMT CTA area than with FMS, there is a fair amount of overlap between the two fields, especially in the opportunity for commodity distributed computing.

In conjunction with the CMS (Comprehensive Mine Simulator) parallelization planning effort, the Syracuse team has also had extensive contact with the code's "owner", Steve Bishop, and his group at the Army Night Vision Lab at Ft. Belvoir. This included a visit by Dr. Furmanski to Ft. Belvoir for a briefing and demonstration of the system.

----------------------------------------------------------------------------

SPPT: Scalable Parallel Programming Tools

The primary source of contact with CEWES MSRC users by the SPP Tools PET team members has been workshops and courses:

* "Parallel Tools and Libraries" workshop, held in April 1997 at Arnold Engineering Development Center (AEDC) by Christian Halloy (University of Tennessee), had 18 participants.

* "Parallel Tools and Libraries" workshop, held in July 1997 at CEWES MSRC by Susan Blackford (University of Tennessee) and Victor Eijkhout (UCLA), had 10 participants.

* "Performance Evaluation of Parallel Systems" workshop, held in July 1998 at CEWES MSRC by Erich Strohmaier (University of Tennessee), had 7 participants.

* "Workshop on Portable Parallel Performance Tools," held in January 1998 at CEWES MSRC by Shirley Browne (University of Tennessee) and Clay Breshears (CEWES MSRC, Rice University), had 8 participants.

* "Code Optimization for MPPs" workshop, held in February 1998 at CEWES MSRC by Phil Mucci and Kevin London (University of Tennessee), had 12 participants.

* "Parallel Programming Workshop" held at the Naval Research Lab, Washington DC in March 1998 by the NRC Code Migration Group (CMG) and Breshears had 3 participants. There were three participants that received instruction on methods of parallelizing codes and tools available to help with such efforts.

Clay Breshears, the on-site SPP Tools Lead, has consulted and collaborated with the other on-site CTA Leads (Steve Bova, Rick Weed, Carey Cox) on tools and computer science issues. Breshears has also worked with Phil Bording, Jay Cliburn, Henry Gabb, Dan Nagle, and Doug Strasburg of the NRC Computational Migration Group (CMG) at the CEWES MSRC on code migration projects and on the creation of CMG in-house conversion tools. The most notable collaboration has been the development and implementation of Fortran 90 bindings for POSIX threads on the SGI Origin 2000 at CEWES MSRC, in support of the parallelization of the MAGI code (David Medina of the AF Phillips Lab) by the CMG. The at-Rice group (Ehtesham Hayder, Chuck Koelbel, Gina Goff) met with CMG members Strasburg, Bording, and Gabb at CEWES MSRC to discuss the use of parallel tools in code migration related to the HELIX and MAGI codes. Hayder also contacted David Medina and Ted Carney (New Mexico Tech) about the MAGI code. A focused effort on loop optimizations in the HELIX and MAGI codes is proposed for the next year.
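
Such Fortran 90 bindings for POSIX threads are commonly implemented as thin C wrapper functions with Fortran-callable names that hide the pthread_t handle behind an opaque pointer. The sketch below illustrates this general technique only; the wrapper names and calling convention are illustrative assumptions, not the actual binding developed at CEWES MSRC.

    /* Hedged sketch of F90-callable wrappers around POSIX threads; names
       and conventions are illustrative, not the CEWES MSRC binding.
       Compile as C and link with -lpthread. */
    #include <pthread.h>
    #include <stdlib.h>

    /* From Fortran:  CALL f90_pthread_create(handle, routine)  */
    void f90_pthread_create_(void **handle, void *(*routine)(void *)) {
        pthread_t *tid = malloc(sizeof *tid);  /* hide pthread_t from F90 */
        pthread_create(tid, NULL, routine, NULL);
        *handle = tid;                         /* opaque handle for Fortran */
    }

    /* From Fortran:  CALL f90_pthread_join(handle)  */
    void f90_pthread_join_(void **handle) {
        pthread_t *tid = (pthread_t *)*handle;
        pthread_join(*tid, NULL);              /* wait for thread to finish */
        free(tid);
    }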

Shirley Browne of Tennessee and Breshears have worked with CMG members on the use of parallel performance analysis tools. These tools have given CMG members another method of approaching their code migration efforts and of measuring the efficiency of the codes that have been parallelized. Browne and Breshears also worked with members of the VPRF Challenge Project group from Kirtland AFB during the group's visit to CEWES MSRC in March 1998; using VAMPIR, they were able to improve the communication performance of the ICEPIC code.

Jack Dongarra, Clint Whaley, and Antoine Petitet corresponded with CEWES MSRC user Alan Wallcraft (NRL-Stennis) in June and July 1997 about the possibility of using ScaLAPACK for the ocean modeling Challenge Project. David O'Neal (PSC) visited CEWES MSRC and worked with Breshears and Cox on writing a matrix inversion code using ScaLAPACK routines on the SGI/Cray T3E at CEWES MSRC. This code is used as a preprocessing step for the Wallcraft ocean model code. Blackford and Whaley were consulted for advice on the correctness of the implementation.
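
A ScaLAPACK-based distributed matrix inversion of the kind described above typically pairs the LU factorization routine PDGETRF with the inversion routine PDGETRI on a BLACS process grid. The following C sketch illustrates that call sequence; the 2x2 grid, block size, and diagonally dominant test matrix are illustrative assumptions, not the actual preprocessing code.

    /* Hedged sketch: factor with PDGETRF, invert with PDGETRI.  Run with
       exactly 4 MPI processes (e.g. mpirun -np 4); link with ScaLAPACK,
       BLACS, and MPI.  Prototypes follow the standard ScaLAPACK/BLACS
       interfaces. */
    #include <stdio.h>
    #include <stdlib.h>

    extern void Cblacs_pinfo(int *rank, int *nprocs);
    extern void Cblacs_get(int ctxt, int what, int *val);
    extern void Cblacs_gridinit(int *ctxt, char *order, int nprow, int npcol);
    extern void Cblacs_gridinfo(int ctxt, int *nprow, int *npcol,
                                int *myrow, int *mycol);
    extern void Cblacs_gridexit(int ctxt);
    extern void Cblacs_exit(int cont);
    extern int  numroc_(int *n, int *nb, int *iproc, int *isrc, int *nprocs);
    extern void descinit_(int *desc, int *m, int *n, int *mb, int *nb,
                          int *irsrc, int *icsrc, int *ctxt, int *lld,
                          int *info);
    extern void pdgetrf_(int *m, int *n, double *a, int *ia, int *ja,
                         int *desca, int *ipiv, int *info);
    extern void pdgetri_(int *n, double *a, int *ia, int *ja, int *desca,
                         int *ipiv, double *work, int *lwork,
                         int *iwork, int *liwork, int *info);

    int main(void) {
        int rank, nprocs, ctxt, nprow = 2, npcol = 2, myrow, mycol;
        int n = 8, nb = 2, izero = 0, ione = 1, info, desc[9];

        Cblacs_pinfo(&rank, &nprocs);
        Cblacs_get(-1, 0, &ctxt);                    /* default context */
        Cblacs_gridinit(&ctxt, "Row", nprow, npcol); /* 2x2 process grid */
        Cblacs_gridinfo(ctxt, &nprow, &npcol, &myrow, &mycol);

        /* Size of this process's block-cyclic piece of the n x n matrix. */
        int mloc = numroc_(&n, &nb, &myrow, &izero, &nprow);
        int nloc = numroc_(&n, &nb, &mycol, &izero, &npcol);
        int lld  = mloc > 1 ? mloc : 1;
        descinit_(desc, &n, &n, &nb, &nb, &izero, &izero, &ctxt, &lld, &info);

        double *a   = malloc((size_t)mloc * nloc * sizeof *a);
        int    *ipiv = malloc((mloc + nb) * sizeof *ipiv);
        for (int j = 0; j < nloc; j++) {
            for (int i = 0; i < mloc; i++) {
                /* Global indices of local entry (i,j); fill a diagonally
                   dominant (hence nonsingular) test matrix. */
                int gi = ((i / nb) * nprow + myrow) * nb + i % nb;
                int gj = ((j / nb) * npcol + mycol) * nb + j % nb;
                a[(size_t)j * mloc + i] =
                    (gi == gj) ? (double)n : 1.0 / (1.0 + gi + gj);
            }
        }

        pdgetrf_(&n, &n, a, &ione, &ione, desc, ipiv, &info);  /* LU factor */

        /* Workspace query (lwork = liwork = -1), then the inversion. */
        double wq; int iwq, lwork = -1, liwork = -1;
        pdgetri_(&n, a, &ione, &ione, desc, ipiv, &wq, &lwork,
                 &iwq, &liwork, &info);
        lwork = (int)wq; liwork = iwq;
        double *work  = malloc((size_t)lwork * sizeof *work);
        int    *iwork = malloc((size_t)liwork * sizeof *iwork);
        pdgetri_(&n, a, &ione, &ione, desc, ipiv, work, &lwork,
                 iwork, &liwork, &info);

        if (rank == 0) printf("pdgetri_ info = %d\n", info);
        Cblacs_gridexit(ctxt);
        Cblacs_exit(0);
        return 0;
    }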

Breshears has worked with Brian Jean and Alan Stagg (CEWES application engineers) and Joe Schmidt (CEWES Coastal Hydraulics Laboratory) on the design and creation of SPLICE (Scalable Programming Library for Coupling Executables). This will be a high-level library that allows separate MPI codes (possibly running on separate HPC platforms) to exchange information with each other. SPLICE is intended to give researchers the ability to couple diverse codes without having to be concerned with the minute details of data layout and distribution within each code.
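
The SPLICE interface itself is not described in this report, but the kind of executable coupling it is intended to provide can be illustrated with the MPI-2 dynamic process facilities (MPI_Open_port, MPI_Comm_accept, MPI_Comm_connect). The hedged sketch below shows two separately started MPI codes exchanging a small field; the field contents and the command-line protocol are illustrative only.

    /* Hedged sketch of coupling two separately started MPI codes via
       MPI-2 dynamic process calls.  This is NOT the SPLICE API; it only
       illustrates the underlying exchange such a library would hide. */
    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        if (argc < 2) {
            fprintf(stderr, "usage: %s server | <port-name>\n", argv[0]);
            MPI_Finalize();
            return 1;
        }
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm inter;                   /* intercommunicator to peer code */
        double field[4] = {1.0, 2.0, 3.0, 4.0};  /* hypothetical field data */

        if (strcmp(argv[1], "server") == 0) {
            MPI_Open_port(MPI_INFO_NULL, port);
            printf("peer should connect to port: %s\n", port);
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
            MPI_Send(field, 4, MPI_DOUBLE, 0, 0, inter); /* hand field off */
            MPI_Close_port(port);
        } else {
            strncpy(port, argv[1], MPI_MAX_PORT_NAME - 1);
            port[MPI_MAX_PORT_NAME - 1] = '\0';
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
            MPI_Recv(field, 4, MPI_DOUBLE, 0, 0, inter, MPI_STATUS_IGNORE);
            printf("received field, first value %.1f\n", field[0]);
        }
        MPI_Comm_disconnect(&inter);
        MPI_Finalize();
        return 0;
    }

A library such as SPLICE would hide this connection handshake, and the mapping between each code's data layouts, behind higher-level calls.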

------------------------------------------------------------------------------

SV: Scientific Visualization

As part of a long-term collaboration, the CEWES MSRC PET Scientific Visualization team has had ongoing communication with Carl Cerco and Mark Noel of CEWES in relation to their Chesapeake Bay project and the visual analysis of output from the CEWES CEQUAL-IQM code. This is a continuation of the relationship begun in Year 1. During Year 2, we have worked with them on defining their requirements for desktop visualization support, prototyping solutions for those needs, and iterating on the design process to refine their specifications. We have provided them with an early version of a tool that they are currently using to view data from their 10- and 20-year production runs of the Chesapeake Bay model. This tool also supports a limited form of collaboration, which they are using to share their results with their project monitor at the Environmental Protection Agency.

We have also begun a significant collaboration with CEWES Structures Lab personnel Raju Namburu, Tommy Bevins, Byron Armstrong, and Photios Padados in relation to their DoD Challenge Project on the simulation of damaged structures. In support of their effort, we have provided a tool to view the results of both their CTH and Dyna3D simulations. This tool allowed them to verify the results of these runs, particularly the Dyna3D output. It has also allowed them to generate visualizations in the form of static images and MPEG movies to share with their colleagues over the Web. We also used this tool to highlight and explain their science at the SC97 national meeting.

------------------------------------------------------------------------------

C/C: Collaboration/Communication

There has been no NPAC contribution in this area to date.

NCSA hosted the first MSRC PET Webmasters Meeting on February 3-5, 1998. Attendees included both PET and MSRC webmasters from the CEWES, ASC, ARL, and NAVO MSRCs. The objectives of the meeting were 1) to build a sense of community among the MSRC/PET webmasters to facilitate communication and sharing, and 2) to identify mechanisms to improve usability and uniformity across the MSRC PET websites. The meeting resulted in a list of suggestions for presentation to the PET directors at the MSRCs for their approval. These suggestions have been incorporated into ongoing C/C core support plans.

------------------------------------------------------------------------------

Cross-MSRC Workshop with DoD CTA Leaders

In September 1997, the CEWES MSRC PET team participated in a cross-MSRC workshop between the DoD CTA Leaders and the leadership of the CTA support teams in the PET effort at all four MSRCs. Prior to this workshop, the DoD CTA Leaders had prepared White Papers for each CTA citing MSRC user needs that might be addressed in the PET effort at the MSRCs. After the workshop, the PET academic leadership prepared responses to these White Papers. Out of this workshop also came the impetus for the PET roadmaps and vision statements organizing future directions of the PET effort in five essential areas:

    Metasystems
    Programming Tools
    Application Tools
    Scientific Visualization
    Training & Collaboration

These documents are available from the CEWES MSRC PET website.

------------------------------------------------------------------------------

Cross-CTA Gridding Workshop

As a part of the CEWES MSRC PET Year 2 effort, a workshop on the utility of grid generation systems for MSRC users was held at the University of Texas at Austin in February 1998 (see the last part of Section V). This grid workshop was targeted specifically at the five "grid-related" CTAs: CFD, CSM, CWO, EQM, and CEA. The workshop served both to identify needs of CTA users that are not being met by currently available grid (mesh) generation systems and to broaden awareness within the CEWES MSRC user community of available grid generation resources (see Section VII).