As we move into Year 3 of the CEWES MSRC PET effort, the continuing vision in the eight technical support areas is as follows:
-----------------------------------------------------------------------------
CFD: Computational Fluid Dynamics CTA
The goal of the PET CFD team for the CEWES MSRC is to provide continual and as-required support for CFD users of CEWES MSRC, and to help develop and maintain state-of-the-art simulation capability in their application areas. Initial support projects are emphasizing efficient utilization of CEWES MSRC parallel computing capability but are not restricted to this area. We will continue to conduct collaborative research with CEWES MSRC users by
* Assisting with the efficient utilization of CEWES MSRC computing resources,
* Identifying and implementing advances in simulation capability (physics, numerics, computational platforms), and
* Disseminating results throughout the technical community, and training and educating CEWES MSRC users.
We have the following broad goals:
* Leverage military and civil applications of CFD.
* Make creative use of new solvers compatible with CEWES MSRC hardware.
* Focus on CEWES MSRC user outreach, training, and assistance to remote users.
* Assist CEWES MSRC users in solving complex physics in complex geometries with scalable algorithms.
* Guide the development of and disseminate tools for large-scale, distributed CFD visualization at CEWES MSRC.
The CEWES MSRC PET team will provide a core level of effort to support technology transfer, user outreach, training, and assessment of targeted codes and algorithms in CFD. Targeted codes include but are not limited to the CHSSI CFD codes. Technology of interest includes CFD solution algorithms, use of parallel tools in regard to CFD codes, computational design, and grid generation in CFD. CFD support will also be provided as appropriate for migration of vector codes to CEWES MSRC scalable platforms.
-----------------------------------------------------------------------------
CSM: Computational Structural Mechanics CTA
Continuing vision for Computational Structural Mechanics (CSM) tools for the CEWES MSRC is as follows:
1. CTH and Dyna3D On-Site Support
Rick Weed of ERC-MSU (on-site lead) will continue to support users of CTH and Dyna3D codes applied to the simulation of damaged structures, and NCSA will continue to provide visualization support of these same CEWES MSRC users.
2. EPIC Translator
Additional functionality will be added to the EPIC translator. This tool will evolve into a GUI-based job preparation and submission program, reducing the EPIC job preparation cycle to roughly 25% of the effort currently required to submit a job.
3. Parallelization of EPIC on Origin 2000
Future work will involve completion of the single-processor verifications and parallelization using OpenMP. In addition, NCSA will investigate introducing a new solver into the code.
4. Monitoring CTH
During Year 3, CUMULVS will be linked into CTH at the source code level and will be used to track variables during CTH simulations. The variables will be sent to a client workstation, where EnSight will be used to visualize them. This system will then be implemented in Dyna3D and EPIC.
------------------------------------------------------------------------------
CWO: Climate/Weather/Ocean Modeling CTA
The overall goal of the CEWES MSRC PET support of CWO, which in many respects is shared with the EQM and CFD efforts, is to create the algorithmic, physical and computational circumstances necessary to build fully coupled atmosphere, wave, circulation and sediment transport modeling systems. Such systems will be valuable prediction tools, especially when used in forecast mode, in defense-related activities including harbor access for naval operations, wind-wave hazard forecasts for fleet operations, coastal condition forecasts for amphibious landing craft activities, and sediment storm prediction for submarine tracking and retrieval of buried mines. The emphasis in these coupled models is on shallow and/or nearshore operations, which is complementary to the deep ocean integrated systems being implemented at the NAVO MSRC. This shallow water/coastal focus requires a nonlinear and fully 3D description of the physics in each code, and extremely high resolution grids with sufficient vertical resolution to predict density interfaces.
Our activities proceed in parallel with the EQM group in that their interest focuses on water quality - hydrodynamic coupling while ours concentrates on a series of couplings between physical processes. Both avenues must be pursued simultaneously as we begin to blend the coupling activities in CWO and EQM in Year 3.
To reach this goal, the following areas of development are envisioned for the next year and beyond:
1. Implementation of Robust Wave-Current Interaction Physics in the Models
To couple the wave and hydrodynamic codes with the sediment resuspension codes for operation at the intended scales it is extremely important that wave-current interaction physics be properly included in all three models. These interactions are equally important at both the surface and bottom boundaries of the water column in the circulation and sediment codes. We will build on the terms already examined and incorporated in this year's work and include the terms for interactions at the bottom of the water column in the circulation and sediment codes.
2. Coupling of Parallel Circulation and Wave Codes and Completion of Lake Michigan Tests
Since the individual CH3D-p and WAM-p codes have been parallelized during the past year, we can now complete their coupling and evaluation by application to data sets collected in Lake Michigan by the NSF-EEGLE project.
3. Parallelized CH3D-SED and COSED Models
Both the noncohesive (CH3D-SED) and cohesive (CH3D-COSED) sediment transport models must be parallelized before use in the coupled system. The OSU team is parallelizing SED, while COSED will be parallelized cooperatively with Puri Bangalore of Mississippi State / CFM. Our activities here differ from the EQM transport coupling in that CH3D-SED was developed on one unified grid with one unified (and very small) time step, so our coupling problem is not cross-grid reconciliation but rather incorporation of the physics necessary for fully coupled prediction.
4. WAM/CH3D/SED - Parallel Model Coupling and Application to Lake Michigan
With the CH3D-SED-p model scheduled for completion in mid-Year 3, and with our wave-current interaction coupling implemented experimentally, we expect to have a fully coupled parallel WAM, CH3D and SED system adapted and operational for Lake Michigan. Evaluation will be done using the data set mentioned in item 2 above.
5. Mesoscale Atmospheric Modeling System
Here we plan to take the first steps toward the final requirement for the fully coupled system: the implementation of a storm-resolving, fully three-dimensional, mesoscale atmospheric modeling system. Until COAMPS is available to us, we will use the MM5 model developed by Penn State University and NCAR.
------------------------------------------------------------------------------
EQM: Environmental Quality Modeling CTA
Continuing effort in support of EQM will focus on the following:
1. Coupling of Hydrodynamics and Water Quality Models
One of the major concerns of EQM CEWES MSRC users is the coupling of simulators. For example, at CEWES there are at least three different hydrodynamics codes that could in theory be coupled to the water quality model CE-QUAL-ICM. A long-term goal is to take the three-dimensional flow field from any hydrodynamic model and project it onto an arbitrary water quality model grid. Realizing such a goal is a major effort. One difficulty in such an arbitrary coupling is that the grids used for hydrodynamics and water quality can be very different: in particular, one may be triangle-based and the other quadrilateral-based. Furthermore, the velocities interpolated onto the water quality grid may not be "conservative", which would lead to severe mass balance errors in the water quality model.
As a start toward realizing the goal above, we plan to investigate the coupling of ADCIRC and CE-QUAL-ICM. The difficulties mentioned above are present here: ADCIRC is based on triangles, and ICM is based on quadrilaterals. Moreover, the velocities produced by ADCIRC are not conservative, i.e., they do not satisfy the primitive continuity equations element-by-element. Thus, the coupling between the two codes will involve converting ADCIRC geometry files into ICM input files, converting ADCIRC velocity output into ICM input format, and the use of a projection method for correcting the ADCIRC velocities so that they are mass-conservative. Such projection methods have already been developed at UT but not applied specifically to this problem.
During the coming year we plan to develop and test the coupling of these two codes in a 2D setting, in both serial and multi-processor modes. CEWES personnel will be involved in the testing and validation processes. If the 2D coupling is successful, we will then proceed to full 3D models (pending the testing of 3D capability in ADCIRC).
In order to make ICM functional on more arbitrary grids, the algorithms within ICM need to be extended to triangular (prisms in 3D) elements. Currently, the higher-order transport scheme employed in ICM works on rectangular grids, but cannot be directly extended to triangles. Therefore, another long-term goal is to examine the possibility of a higher-order scheme for ICM appropriate for triangular or other arbitrarily shaped elements.
2. Web-based Launching of Parallel Groundwater Simulations
The focus of this effort is to continue the development of web-based tools which will serve as prototypes for launching parallel simulations from remote environments. The parallel simulations of interest arise in subsurface flow and transport, but the tools to be developed will be of general use. ParSSim (Parallel Aquifer and Reservoir Simulator), a parallel, three-dimensional flow and transport simulator for modeling contamination and remediation of soils and aquifers, will be used in this demonstration. The code was developed at the University of Texas and contains many of the features of current state-of-the-art groundwater codes. It is fully parallelized using domain decomposition and MPI and is operational on the IBM SP2 and Cray T3E platforms.
A client Java applet with a GUI (graphical user interface) will be developed that will allow remote users to access the ParSSim code and data domains on CEWES MSRC servers. The results of the computation are then saved on CEWES MSRC local disks and also selectively sent back to the requesting Java applet. The Java applet can be instantiated from any Internet web browser. We will first develop appropriate tools for executing ParSSim in a single computing environment. If this proves successful, we will investigate more complex programming tools such as Globus, which provides the infrastructure needed to execute in a metacomputing environment.
3. Parallel Visualization Capability for Hydrodynamic Flow and Water Quality Models
Visualization capability which allows the user to view solutions as they are being generated on a parallel computing platform can greatly increase CEWES MSRC user productivity through ease of debugging, the ability to quickly modify input parameters, etc. VisTool, a client/server based parallel visualization tool developed by Chandra Bajaj of UT Austin, is publicly available and provides the necessary software infrastructure. The VisTool visualization libraries (isocontouring, volume rendering, streamlines, error-bounded mesh decimation, etc.) are callable from Fortran and C codes, and the server side has been demonstrated on Intel Paragon and Cray parallel machines. The client side relies on OpenGL and VTK library calls to render the application data generated by the simulation programs. Any OpenGL-capable machine (e.g., an SGI or a PC with OpenGL) can be used as the client machine.
Communication between client and server is performed through PVM-like communication calls. VisTool supports structured, unstructured and mixed grids, and allows both scalar and vector data to be visualized. 2D cutting planes, 1D line probes, streamlines, tracers, ribbons and surface tufts are available, and MPEG-format animations or PostScript files can be generated for presentations. We will develop the interface routines necessary to visualize flow and transport solution output generated, for example, by the ParSSim code.
------------------------------------------------------------------------------
FMS: Forces Modeling and Simulation/C4I CTA
Trends in both the military modeling and simulation (M&S) community and in the commodity distributed computing community point to the increasing convergence in the next few years of the DMSO-mandated base M&S technologies and commodity approaches involving Java, CORBA, and related tools. To highlight this convergence, NPAC (Syracuse) researchers are currently implementing HLA's Run-Time Infrastructure (RTI) component in Java and CORBA. This commodity-based "Object Web RTI" system will then be capable of supporting distributed execution of HLA-compliant simulation applications, while also offering the possibility of taking advantage of the capabilities of the rapidly advancing field of commodity web technologies. At the same time, the FMS support team is also investigating the Comprehensive Mine Simulator (CMS) from Steve Bishop's group at the US Army's Night Vision Directorate, Ft. Belvoir, Virginia. CMS is currently capable of handling 30,000-50,000 mines on a single-processor workstation, but clearly requires an HPC system to reach the target of 1,000,000 mines.
In the near future, there will be further convergence of M&S technologies with commodity distributed computing: there is already serious discussion in the field of turning HLA into a CORBA service, for example. With this convergence, and as more FMS applications move to the HLA standard, the two aspects of the CEWES MSRC PET program's approach to the FMS field will also converge: as applications such as CMS become HLA-compliant, they can be linked into larger distributed simulations, using Object Web RTI technology to connect multiple HPCC systems together.
-----------------------------------------------------------------------------
SPPT: Scalable Parallel Programming Tools
The SPP Tools team's plans for the future extend our current projects, always designed with an eye toward improving the overall computing environment at the CEWES MSRC. As before, these can be divided into four categories:
(a) Working directly with users.
(b) Supplying essential software.
(c) Training in the use of that software.
(d) Tracking and transferring technology.
A more detailed strategy for the Scalable Parallel Programming Tools technology area is available at http://www.crpc.rice.edu/DoDmod/Working-Papers/Tools.html. This document discusses our tactical approach.
Continuing effort in support of SPP Tools will focus on the following:
1. Working Directly with Users
The focus of our user interactions remains Clay Breshears, the Rice on-site SPP Tools lead. He is the first point of contact for any user (from DoD, contractors, or CEWES MSRC PET partners) with tools-related questions. Since most parallel programming relies heavily on tools (including libraries, runtime systems, and compilers), he is a natural point of contact for many general questions about migrating codes to or developing codes on parallel machines. Important sources of user contacts for Breshears include the CEWES MSRC "help" system, his collaborations with the NRC Code Migration Group (CMG), and his involvement in teaching courses. Breshears' work with users is augmented by visits from the other SPP Tools team members. Shirley Browne (University of Tennessee), Ehtesham Hayder (Rice University), and Charles Koelbel (Rice) have each visited CEWES MSRC several times in the past and plan to continue (and increase) visits in the future.
In addition to continuing the user collaborations mentioned above, we are planning a number of new activities with users in Year 3. Perhaps most notable among these is Hayder's work on the HELIX code. This is a turbulent flow code referred to the Rice team members by the CMG. Although parallelization was working fairly well, the code suffered from below-par single-processor performance. Hayder, in consultation with other Rice University researchers, is analyzing the code for memory hierarchy pathologies and other potential problems. The CMG reports that many other codes that they examine have similar inefficiencies; while it is much too early to speculate whether the causes are similar, it is clear that we will have many targets of opportunity for applying the compiler optimizations pioneered at Rice.
2. Supplying Essential Software
Although code migration projects are helpful to the individual users of those codes, real leverage to build up an HPCMP-wide programming environment comes from supplying more generally applicable software. We will continue working closely with CEWES MSRC staff to install and evaluate potentially useful new tools. We have targeted several tools for introduction at CEWES MSRC in the next year: MPI-IO, the first portable parallel I/O interface; CAPTools, a semi-automatic parallelization tool that could significantly aid code migration efforts; and OpenMP, an emerging standard in shared-memory programming. In addition, we are tracking upgrades to several existing packages.
One of the most exciting Tools projects proposed for Year 3 is the implementation of MPI-2 by a group led by Tony Skjellum at Mississippi State. MPI (Message Passing Interface) is an extremely successful standard for two-sided communication between processes, for example the executable programs on two nodes of a distributed-memory machine. In effect, it is the foundation for portable parallel programming on most machines. MPI-2 extends this with I/O operations, one-sided communication, and dynamic process management. The MSU group proposed an aggressive 2-year project to supply highly efficient implementations on two machines of interest to CEWES MSRC, with the resulting software to be phased in according to a strict schedule. This focused effort will be a great step forward for MPI users, both within the DoD and in the larger parallel computation community.
Although larger problems than ever before can be solved on today's scalable parallel computers, DoD users need to solve even larger problems and to couple independently developed portions of a problem running on different computer systems. This motivates an investigation of intercommunication and metacomputing technologies such as MPI-Connect, NetSolve, and SNIPE, being developed at UTK, as well as the Globus and Legion systems. Specific requests by application areas for MPI intercommunication between all MSRC platforms warrant the effort to port MPI-Connect to these platforms, as well as to develop additional capabilities such as parallel I/O for virtual file sharing.
We also plan to enlarge the populations of existing software repositories and to add new, domain-specific repositories. Browne is spearheading this effort. These repositories give users a single source from which to obtain high-quality code, rather than continually reinventing the wheel (or, worse yet, failing to reinvent it and struggling with square wheels). This advantage has led the National Computational Science Alliance to adopt Repository in a Box (RIB) for its deployment mission.
3. Training
We will continue to expand the training courses offered, both by repeating popular courses (e.g. Parallel Performance Tools) and developing new ones. Some specific training plans for the next year include:
(a) Involvement in the JSU Summer Institute (Koelbel, Breshears).
(b) Experiences Porting Scientific Codes (Hayder).
(c) OpenMP (Gina Goff).
Some of these are dependent on focused efforts that, as of this writing, have not been formally approved.
4. Tracking and Transferring Technology
We will continue our involvement in standards efforts in the tools area, including emerging ones such as the recently formed ParaDyn/DAIS group. Limitations of trace-based, post-mortem performance analysis tools have been demonstrated by attempts to use them "out-of-the-box" with large-scale CEWES MSRC applications. A common platform-independent infrastructure for runtime attachment and monitoring would benefit not only performance analysis but also interactive debugging and data visualization, and would ease the task of tool writers. Such an infrastructure is being developed by the ParaDyn research group at the University of Wisconsin and the University of Maryland and by an IBM parallel tools team led by Douglas Pase. The work consists of building a client-server system called the Dynamic Application Instrumentation System (DAIS) on top of the low-level dyninst instrumentation library used by ParaDyn. During the next year, we plan to continue to participate in the dyninst/DAIS standardization effort and to help focus that effort by driving it with end-user tools of importance to DoD users.
We will continue to be active in the national HPC community, both to bring new, promising technologies into CEWES MSRC and to present our progress to our peers. A key part of this, as mentioned above, is attendance at conferences and professional meetings, with extensive trip reports to update CEWES MSRC on the new technologies found there. Foremost among these is the PTOOLS Annual Meeting. PTOOLS is a consortium of users and developers who work together to build new, useful parallel tools; one of their projects was to start the High Performance Debugger Forum. We also plan trips to SC'98, SIAM Parallel Processing, and other parallel processing meetings during the year with a heavy software emphasis. Finally, Rice University is hosting the 1998 DoD High Performance Computing Users Group conference, where we will surely see many advances in the field.
---------------------------------------------------------------------------
SV: Scientific Visualization
In an effort to serve the overall user community in the long term, a strategic 5-year SV plan has been developed. This SV strategic plan provides a vision for a Visual High Performance Computing environment designed to support and enhance productivity for CEWES MSRC researchers. It also provides a framework in which to organize and prioritize activities within the CEWES MSRC PET visualization program.
The SV strategic plan examines three components:
(a) Anticipated changes in CEWES MSRC user needs over the 5-year lifetime of the CEWES MSRC PET effort.
(b) Expected evolution of certain technologies over this time period.
(c) A plan for CEWES MSRC PET efforts that will take advantage of emerging technologies to address changing CEWES MSRC user needs.
This 5-year SV strategic plan is available at
http://www.ncsa.uiuc.edu/Vis/PET/Strategy
Continuing effort in support of SV will focus on the following:
1. VisGen
The CEWES MSRC PET team's work on the VisGen tools will continue in Year 3. Extending this tool with functionality needed by Cerco's EQM group at CEWES, and bringing it to the maturity needed in a production-level tool, is a high priority. Adapting the tool to run on the CEWES MSRC ImmersaDesk as well will provide a cross-platform visual analysis tool, allowing the researcher to work on the platform best suited to the type of analysis needed. The ImmersaDesk version of the VisGen tool will also incorporate a speech interface and, eventually, audio output to augment or reinforce the visual representation.
2. Interactive Computation
The CEWES MSRC PET team for SV will also embark on an exploration of software designed to support interactive computation. Many researchers have expressed the need to monitor their simulations as they execute. This would allow a researcher to abort a run that appears to be on an unfruitful path, perhaps because of badly stated input conditions. In some cases, it might be appropriate for researchers to modify parameters of a running simulation. A variety of software, available from academia or government labs, exists to support these capabilities. The CEWES MSRC PET SV team will apply these strategies to various user codes in order to characterize the current support systems for computational monitoring and steering. In particular, we will apply these strategies to CTH and CE-QUAL-ICM, furthering our support for these research communities.
3. Visualization Workshop
Now that we are fully staffed, the CEWES MSRC PET SV team in Year 3 will extend its user contact activities. We are contacting a new group of users, in Vicksburg and elsewhere, to initiate a relationship and assess their needs. We plan to organize a Visualization workshop to share information about possible solutions with a significant number of CEWES MSRC users. Additionally, if areas are identified where existing solutions are inadequate, we can use that information to plan future CEWES MSRC PET activities.
4. Jackson State
The CEWES MSRC SV PET team will continue to work with our colleagues at Jackson State University, particularly Milti Leonard and Edgar Powell. In this relationship, we will continue to find ways to utilize their current skills and provide opportunities for skill building. Finally, we will continue our end-user training efforts, and will extend our offerings in web-based delivery of information on visualization tools.
------------------------------------------------------------------------------
C/C: Collaboration/Communication
Continuing C/C core support efforts at the CEWES MSRC PET will focus on increased user outreach and further development of the website infrastructure to improve information dissemination. Focused efforts are planned to provide web-based training support and an intranet environment to support team collaboration. Planned outreach activities include interaction with the CEWES MSRC user community through user surveys and face-to-face meetings. Information on C/C activities and technologies will be provided through updates to the C/C webpages and seminars on technologies as appropriate. A sense of community among the MSRC webmasters shall also be fostered through additional face-to-face meetings as well as online meetings.
Website development efforts will involve the redesign of the CEWES MSRC PET web site framework to improve the overall uniformity and usability of the sites across MSRCs. New web technologies will be implemented where appropriate to provide state-of-the-art capabilities to users and content providers. Guidance and training to PET web content providers will be developed to ensure consistency across sites. Also, as required, tools such as an MSRC PET graphics repository will be developed to assist in the development of PET webpages.
Training support will be provided by assisting CTA leads in selecting and utilizing new asynchronous training technologies to enable them to deliver web-based training courses to the HPC community. These self-paced courses can be taken on an any time, any place, any pace basis as determined by the user. This effort involves selection of state-of-the-art asynchronous training tools for use in developing multimedia short courses and assistance in developing courses utilizing the recommended tools. This assistance will include development of course materials to demonstrate capability of the tools, training on how to use the tools, and guidelines on course development using web-based training tools.
Collaboration support will be provided through the development of a fully integrated environment that will be incrementally implemented by providing increasing capability as collaborative tools meeting CEWES MSRC user needs are identified. Initial capabilities that will be provided are a fully functional calendar and/or scheduling capability, an improved threaded discussion capability that meets specific CEWES MSRC requirements, and meeting support to include whiteboard, chat, and audio conferencing. Long-term capabilities will include evaluation and improvement of the current videoconferencing capability at CEWES MSRC, implementation of a database backend to support user customization of the collaborative environment, and development of a web-based front end to support easy and intuitive access to all collaborative capabilities provided. Collaborative activities within the CEWES MSRC PET user community will be identified as test areas for deployment of these tools.