This plan addresses the current and projected visualization capabilities required by DoD researchers. It concentrates on issues at the heart of the academic component of the PET program, namely identifying emerging technologies that can enhance the computing abilities of the MSRC researcher and providing for technology transfer. Three timelines are provided, reflecting the interplay between new technologies, evolving user needs, and PET efforts to address DoD needs through new technology. MSRC infrastructure issues, staffing issues, and day-to-day operations related to well-understood visualization activities are not discussed here.
PET will reach this goal by drawing on the commercial sector, government laboratories, and academic efforts. Visualization personnel
| Year | Evolving user needs |
|---|---|
| Year 1 | Researchers request visualization of very large data sets, real-time monitoring and/or interactive steering (see the sketch following this table), and remote access to visualization capabilities. |
| Year 2 | Researcher interest in collaborative visualization increases. ImmersaDesks at the MSRCs promote researcher curiosity about VR for data analysis and about multimodal interfaces. |
| Year 3 | As MSRC vector machines fade and codes are reimplemented for new architectures, the need for interactive computing and debugging is likely to grow. Adaptive mesh techniques demand new visual representations. |
| Year 4 | DREN connectivity increases user interest in collaborative visual data analysis. Low-cost desktop graphics promotes researcher demand for Windows-based visualization tools and desktop VR. |
| Year 5 | Interoperable, multidisciplinary, and coupled codes heighten researcher needs for interactive steering and for effective visual representations. |
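The steering and monitoring requests above could be met in many ways; the fragment below is only a minimal sketch of the idea, not a description of any MSRC code or tool. It assumes a hypothetical simulation loop that, between timesteps, reads parameter overrides from a small file (here called `steer.json`) and prints periodic monitoring summaries; all names and parameter values are illustrative.

```python
import json
import os
import time

STEER_FILE = "steer.json"                     # hypothetical file the researcher edits
PARAMS = {"dt": 0.01, "viscosity": 1.0e-3}    # illustrative parameters only


def poll_steering_file(params):
    """Merge any researcher-supplied parameter overrides between timesteps."""
    if os.path.exists(STEER_FILE):
        with open(STEER_FILE) as f:
            overrides = json.load(f)
        params.update({k: v for k, v in overrides.items() if k in params})


def advance(state, params):
    """Stand-in for one timestep of a simulation code."""
    return state + params["dt"] * (1.0 - params["viscosity"] * state)


def run(steps=1000, monitor_every=100):
    state = 0.0
    for step in range(steps):
        poll_steering_file(PARAMS)        # interactive steering: pick up new values
        state = advance(state, PARAMS)
        if step % monitor_every == 0:     # real-time monitoring: emit a summary
            print(f"step={step} state={state:.4f} params={PARAMS}")
        time.sleep(0.01)                  # pacing so edits to steer.json can take effect


if __name__ == "__main__":
    run()
```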
| Year | Emerging technologies |
|---|---|
| Year 1 | Co-processing environments emerge (the co-processing idea is sketched after this table). ImmersaDesks become widely available. Collaboration middleware becomes available. Cross-platform visualization libraries become available. |
| Year 2 | MSRCs are connected by DREN. Lower-cost desktop graphics, including PC accelerator cards, become available. Force-feedback devices move from research labs to market. |
| Year 3 | Distributed Centers are connected by DREN. Integrated "visual supercomputing" architectures become available. Traditional graphics vendors all market low-cost, Windows-based machines. Speech-driven user interfaces become possible. |
| Year 4 | Automated grid generation improves. Adaptive mesh techniques gain popularity. Low-cost, Windows-based desktop VR becomes possible. Multidisciplinary codes are coupled. |
| Year 5 | Multidisciplinary, coupled codes see increasing use. |
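For readers unfamiliar with the term, "co-processing" here means running visualization alongside the simulation so that reduced images or summaries are produced during the run rather than by post-processing large dump files. The sketch below illustrates only that structure; it is not tied to any particular co-processing product, and the field, placeholder physics, and file names are assumptions made for the example.

```python
import numpy as np


def render_slice(field, step, out_prefix="covis"):
    """Co-processing hook: reduce the full 3-D field to a mid-plane slice and
    write it as a small greyscale PGM image instead of dumping the raw data."""
    plane = field[:, :, field.shape[2] // 2]
    lo, hi = float(plane.min()), float(plane.max())
    if hi > lo:
        pixels = ((plane - lo) / (hi - lo) * 255).astype(np.uint8)
    else:
        pixels = np.zeros(plane.shape, dtype=np.uint8)
    with open(f"{out_prefix}_{step:04d}.pgm", "wb") as f:
        f.write(f"P5 {pixels.shape[1]} {pixels.shape[0]} 255\n".encode("ascii"))
        f.write(pixels.tobytes())


def simulate(shape=(64, 64, 64), steps=50, covis_every=10):
    """Stand-in simulation loop that calls the visualization hook in-line."""
    rng = np.random.default_rng(0)
    field = rng.random(shape)
    for step in range(steps):
        field = 0.9 * field + 0.1 * rng.random(shape)   # placeholder "physics"
        if step % covis_every == 0:
            render_slice(field, step)   # visualization happens during the run


if __name__ == "__main__":
    simulate()
```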
| Year | PET visualization efforts |
|---|---|
| Year 1 | Educate users about current visualization packages (AVS, EnSight). "Pioneer" use of the ImmersaDesk for data analysis. Demonstrate collaborative visualization over DREN/vBNS. Build a Web-based training module on ImmersaDesk programming. Experiment with co-processing environments. |
| Year 2 | Develop cross-platform visualization tools. Identify strategies for supporting remote users. Prototype collaborative visualization software. Demonstrate a multimodal (force-feedback) interface. Train users in emerging visualization software. Track possibilities for PC-based visualization. Demonstrate an IMT visualization application. |
| Year 3 | Evaluate co-processing environments. Extend the visualization capabilities of co-processing environments. Identify strategies for data management of very large data sets. Define user requirements for collaborative visualization. Demonstrate multimodal (speech-driven) interfaces. |
| Year 4 | Utilize a "visual supercomputing" architecture for a mega-run simulation. Deploy a collaborative visualization environment. Deploy strategies for data management and interpretation of very large data sets (one such strategy is sketched after this table). |
| Year 5 | Enhance the collaborative visualization environment. |
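As one concrete, purely illustrative example of a data-management strategy for very large data sets, the sketch below scans a large binary field file slab by slab, accumulating global statistics and a coarse maximum-intensity projection for preview, so the full volume never has to reside in memory at once. The file layout, dimensions, and function name are assumptions made for the sake of the example, not a description of any deployed MSRC tool.

```python
import numpy as np


def summarize_large_field(path, nx, ny, nz, dtype=np.float32, slab_rows=32):
    """Scan a binary field of nx*ny*nz values (x varying slowest) in slabs of
    `slab_rows` x-planes, returning a coarse max-projection over z plus the
    global min and max, without loading the whole data set at once."""
    projection = np.empty((nx, ny), dtype=np.float64)   # max over z for each (x, y)
    global_min, global_max = np.inf, -np.inf
    with open(path, "rb") as f:
        for x0 in range(0, nx, slab_rows):
            rows = min(slab_rows, nx - x0)
            slab = np.fromfile(f, dtype=dtype, count=rows * ny * nz)
            slab = slab.reshape(rows, ny, nz)
            projection[x0:x0 + rows] = slab.max(axis=2)
            global_min = min(global_min, float(slab.min()))
            global_max = max(global_max, float(slab.max()))
    return projection, global_min, global_max
```

The reduced projection could then be imaged on a researcher's desktop with any standard plotting tool, rather than shipping the entire data set over the network.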