ITR Proposals

Earth Landscape -- needs to have 3 CS thrusts: algorithms (Yousuff), visualization (Banks), "data systems" (GCF). PI works with Haff at Duke.
Math/Security in Education -- math abstractions to support complex scientific metadata in distributed information systems.
These are small proposals in the CSIT thrust "Information Infrastructure for Scientific Research and Education".
HEP Data Analysis -- not ITR -- little focus. Needs to focus on data mining within the CSIT Information Infrastructure.
Erlebacher -- integrated debugging, visualization and computational steering for CFD.
Engineering Control Processes -- no CS yet. They need more homework.
MDO -- John Dennis.
Innovative Educational Objects -- ESCOT, CD-ROM making, NASA Andrea, shared portals.
Need a vision for the CSIT Information Infrastructure and how researchers and education types work within it, as opposed to taking resources and running.
Need a CSIL vision supporting capability computing and clustered systems.
Blackboard is a closed system; it still has not delivered the system linking to the back office.
Gallivan -- can't get mathematically inclined students; interested in "agents" and "components".
Andy White: Gil Weigand now running both parts of ASCI -- "fragile" as it is not clear who is in charge now that Vic Reis has left.
O'Brien has many graduates at Stennis -- not familiar with Bedford. Does not use MSRC resources as they are "badly managed".
Urs Heller using small Christ machine (50 gigaflops).
CPS courses go to Ted Baker. Issue of what I teach: Baker v. CSIT.
HEP proposal -- needs more data mining (Pulleyblank, Sanjay Ranka, NCSA, Hollebeek) and information management (Reagan Moore, DoE (Gannon, Andy White)).

---------------------------------------------------------------------------

GEM Computer Science NSF Letter of Intent

Due November 15.
Note "we believe" that the proposal must clearly have CS research as its major focus.
Goal is 50% CS, 50% application/outreach funding.
Environments for Distributed Collaborative Science applied to Earthquake Analysis.

Institutions
  Syracuse: Fox
  Colorado: Rundle
  Boston: Giles, Klein
  USC: Henyey (SCEC for outreach)
  ? Who else from academia: Minster, Tullis
  NPACI would strengthen the computer science
Unfunded collaborators
  JPL
  Los Alamos
  USGS ?
Industry
  Sun ?
International
  ACES

Motivation and Project Team

The importance of distributed scientific collaboration has been understood for some time, and tremendous progress has been made over the last few years. In particular, distributed object and web technology has enabled sharing of both data and simulations across time and distance. However, there are many fundamental issues to be studied from both the computer science point of view (how should we build collaborative scientific environments?) and the application science point of view (what changes in the scientific method, and what are the application requirements and impact?). The unsolved research issues are particularly acute for real-time interactions between people, computer simulations, instruments and other information resources. This proposal builds an interdisciplinary team where we focus on both the general computer science issues and one particular application area -- that of earthquake analysis and simulation. This area is important and needs both traditional scientific collaboration and the time-critical distributed collaboration required after a major event. Scientists around the world join both to help the crisis management teams and to gather together and understand the multi-faceted real-time information overload characteristic of a large earthquake.
The computer science research will address the needs of, and test its ideas in, other application areas using the existing collaborations and broad expertise of the proposal team. The earthquake area will focus on the needs of scientific collaboration, but the environments will be extensible to support the general needs of crisis teams, with distributed interactions between control rooms, field personnel and experts together with real-time data streams.

Components of the Research Program

1) Computer Science Research. We have abstracted lessons from early prototypes of collaborative systems and science problem solving environments to define a new approach to web-based collaborative portals, SPW (Shared Portal on the Web), which we will combine with requirements from both the earthquake application team and major teams at the NSF Partnerships in Advanced Computational Science. We will iterate short (around 6-month) prototyping efforts with test and evaluation. This modular construction approach fits today's rapid evolution of technology on "Internet time".
2) Application Effort. We have developed three typical scenarios linking distributed scientists, data and simulations, and these will be implemented as prototype collaborative environments using both existing and new application codes. With other partners (JPL, USC/SCEC, USGS) we will link the major earthquake sensor systems into the environments. We will include theoretical and observational scientific data analysis in the scenarios, in both real-time decision support and more asynchronous collaboration modes.
3) Outreach. We will leverage the existing broad and successful outreach program of USC/SCEC, which will link us both to the public (for education) and to the state and federal emergency services.

---------------------------------------------------------------------------

CSIT Meeting November 2

------ Larry Abele ---------------------------------
1853: Francis Epps (Jefferson's grandson) donated land for a coed school (4 years each of Latin, Greek, Math).
1905: Florida Female College -- Florida State College for Women.
1947: coed again for returning GIs, with a focus on arts and basic science.
1999: Florida decided to differentiate the state universities in mission; FSU gets a research focus.
1997: external and internal committees (Anita Jones headed the external committee).
1998: implementation committee, Bloch and Anita Jones.
1999: approved and funded April-May 99.
CSIT reports to the Provost.
30 new faculty; 200,000 square foot $50M building; 25 graduate student fellowships.
Hussaini is founding director.
Drug license funding gives $10M for named professorships in science.

------------- Peter Lax -----------------------
Invented the term computational science; his rigorous work underlies CFD today.
The Peter Lax committee recommended the Supercomputer Program.
List of Nobel Prizes using computing (Ken Wilson not mentioned).
1) Von Neumann was the founder of computational science, starting with the need to design atomic bombs.
2) 1954 Fermi-Pasta-Ulam: u_tt - c^2 u_xx = 0 with c = 1 + p(u).
3) Kruskal and Zabusky: solitons discovered; understood the importance of complete integrability. Newton's 2-body gravitating problem and the Ising model (Onsager) are also completely integrable.
4) Chaos: Feigenbaum, Lorenz (strange attractor), fluid turbulence, fractals (Mandelbrot). Period doubling x_{n+1} = T f(x_n): Feigenbaum found the sequence of parameter values ("temperatures") delineating the different periods and the chaotic regions (a small numerical sketch of this follows).
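A minimal numerical sketch of the period doubling just described, using the standard logistic map x_{n+1} = T x_n (1 - x_n) as the example f (the choice of map, the parameter values and the tolerances are assumptions for illustration, not from the talk):

# Period doubling in the logistic map x_{n+1} = T * x_n * (1 - x_n).
# For each parameter T we iterate past the transient and count the
# distinct values the orbit settles onto (the period), illustrating
# the Feigenbaum period-doubling cascade toward chaos.

def attractor_period(T, x0=0.5, transient=2000, sample=256, tol=1e-6):
    x = x0
    for _ in range(transient):          # discard transient iterations
        x = T * x * (1.0 - x)
    orbit = []
    for _ in range(sample):             # sample the long-time orbit
        x = T * x * (1.0 - x)
        orbit.append(x)
    distinct = []
    for v in orbit:
        if all(abs(v - d) > tol for d in distinct):
            distinct.append(v)
    return len(distinct)                # ~period (large value => chaotic)

if __name__ == "__main__":
    for T in (2.8, 3.2, 3.5, 3.55, 3.7):
        print(f"T = {T:4.2f}  approximate period = {attractor_period(T)}")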
5) Zeros of the Riemann zeta function.
6) Computer-assisted proofs: the four color problem (Haken and Appel used the computer to run through the cases); Lanford's proof of Feigenbaum's universality principle for the critical exponent. But computers will not use AI to prove theorems!
7) CFD: computational airfoil design -- Garabedian, Jameson and Marsha Berger ("design the whole aircraft"). Jameson's work is very important in industry. Jim Glimm's 1D breakthroughs.
8) Algorithms: shock capturing (Von Neumann); multigrid (Fedorenko, Brandt); the Fast Fourier Transform (Cooley-Tukey, but known earlier to ?Runge); fast matrix multiplication (Strassen, C n**2.807 with C about 5); cryptosystems (RSA); multiresolution -- wavelets.

------------ Warren Washington ----------------------
Wrote a popular book on climate modelling.
Equations are CFD: the 5 Navier-Stokes equations, conservation of water (vapour, liquid), detailed chemistry at times.
Best resolution (0.5 degree) cannot be used in long time simulations yet, but will be.
Need global ocean models as part of climate (one tenth of a degree in latitude and longitude).
Sea ice: 10 kilometer resolution; VERY IMPORTANT, and its retreat gives global warming (experimental measurements show unprecedented retreat).
Forcings: greenhouse gases, sulfate aerosols, stratospheric ozone, biomass burning.
Historical simulations to see if one can predict past climate (1870 to now); deduce emission strategies.
Some important observations: use ensembles to predict a family of possible futures; it is not clear one can predict a single future (changes in precipitation, regional warming and cooling, etc.). The 1930s were warm and the 1940s cool -- the model can reproduce TRENDS in the field.
More detailed interactions of land/vegetation/river runoff, ocean and sea ice; more distant collaboration; more parallelism.
PCM (Parallel Climate Model) -- his work. The "flux coupler" links components together. Uses the Los Alamos ocean model with no singularities at the pole; uses non-uniform grids.
Ensembles: start with 1870 to now, then go to the future with different assumptions of increasing greenhouse gases and aerosols, solar variability, volcanic activity. Global warming is on a decadal time scale.
PCM uses MPI -- it doesn't scale very well even from 32 to 64 nodes. Note the atmosphere is largest; ocean, sea ice and interpolation are all sizeable. The T3E scales better than the Origin 2000 at 64 processors. The Compaq is fastest and runs 300 years in one month -- better than the current IBM and SGI.
New NSF initiative in the climate area; DoE SSI included ACPI (Accelerated Climate Prediction Initiative).
Lists of CSIT opportunities: interested in object oriented techniques and more computer science collaboration; clumsy data archiving.

--------------- Andy White ----------------------
1979: joined Los Alamos. 1989: ACL set up.
Aid to decision making: consequences of error are serious; discontinuity in use of past experience.
MANIAC II was the first floating point arithmetic machine.
The current sweet spot is $500K.
Interesting display of computer power versus place: DoE and NCSA largest; climate large; Pixar; Celera (genomics); IBM and SGI are the major industry players.
Climate. Wildfire -- needs about 1 teraflop to simulate the worst case of multiple fires (15 in California). Influenza outbreaks. Electrical power: a $230B industry; deregulation will save some $20B but will decrease reliability. Transportation.
How do you know that simulations are to be trusted?
CSIT should, with its infrastructure, help to develop software with clusters.
Software component architectures (as in Warren Washington's flux coupler); see the coupler sketch below. Note that physics components are geometrically distinct while FMS components are geometrically intertwined.
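A minimal sketch of the component-coupler idea mentioned above (an illustrative Python skeleton, not the actual PCM flux coupler or any real framework; all class and method names here are invented for the example):

# Illustrative skeleton of a coupled-model component architecture:
# independent physics components (atmosphere, ocean, ...) expose a common
# step/export/import interface, and a coupler exchanges boundary fields
# between them each coupling interval.

class Component:
    """One model component; its state is just a dictionary of numbers."""
    def __init__(self, name, state):
        self.name = name
        self.state = dict(state)

    def step(self, dt):
        # Stand-in for the component's own time integration.
        pass

    def export_fields(self):
        # Fields this component provides to the coupler (e.g. SST, fluxes).
        return dict(self.state)

    def import_fields(self, fields):
        # Boundary data received from the coupler (possibly regridded).
        self.state.update(fields)


class FluxCoupler:
    """Moves (and in a real system, regrids and conserves) fields between components."""
    def __init__(self, components, exchanges):
        self.components = {c.name: c for c in components}
        self.exchanges = exchanges      # list of (source, target, field) tuples

    def couple(self, dt):
        for c in self.components.values():
            c.step(dt)
        for src, dst, field in self.exchanges:
            value = self.components[src].export_fields()[field]
            self.components[dst].import_fields({field: value})


if __name__ == "__main__":
    atmos = Component("atmosphere", {"heat_flux": 120.0})
    ocean = Component("ocean", {"sst": 288.0})
    coupler = FluxCoupler([atmos, ocean],
                          [("atmosphere", "ocean", "heat_flux"),
                           ("ocean", "atmosphere", "sst")])
    for _ in range(3):                  # three coupling intervals
        coupler.couple(dt=3600.0)
    print(atmos.state, ocean.state)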
Reliability is a problem, as large machines give errors.
The National Defense University trains decision makers using "war games".
New rules such as: graphs, not discretization; dependency, not locality; ...

------- Doug Dwoyer (Tom Zang substitutes) -------------------
NASA: 1-year predictions of weather.
Distributed Information System in the Sky.
Human exploration of space: hybrid robot and human missions; interplanetary Internet.
Collaborative Engineering Center at Langley -- needed because expertise is distributed among laboratories; PictureTel videoconferencing.
International Space Station immersive environment demonstrated on a Power Wall.

------------- Steve Orszag --------------------------------
(2000)^3 grid plus 10^10 degrees of freedom on multi-teraflop machines.
Moore's law should work until about the year 2015; current feature size is 0.18 micron. Claims memory chips scale like the square, and processors like the fourth power, of the inverse feature size.
New algorithms often come from new problem areas, as these suggest new ways to think.
Physical models are very important, as they reduce the number of degrees of freedom.
Reynolds number R >~ 10^9; 1/R measures the ratio of the size of the eddies to the system size.
How do we do it?
1) Direct Numerical Simulation (DNS): work goes like R^3, which is O(10^24) -- HOPELESS in general.
2) Pure theory.
3) Small eddy.
4) Large Eddy Simulation (LES): need to use "tensor" correlations.
5) Very Large Eddy Simulation (VLES): simulate the large anisotropic eddies, estimate the universal eddies. VLES is better than RANS (Reynolds Averaged Navier-Stokes).
Need a very accurate code -- simpler codes give either chaos or a stable wrong answer. Compares with experiments. Turbulence is always generated at boundaries.
Pitfalls -- four examples. In published CFD over the last 10 years, 5% of papers did good tests and 20% did sloppy tests; 20 years ago everybody tested.

------------------ Bill Pulleyblank -----------------------
Deep Computing.
May 97: Deep Blue 3.5, Kasparov 2.5. Kasparov got overconfident after winning the first game -- this was due to a software bug in evaluating a passed pawn!
Deep Blue: 200 million chess positions per second on a 30-node IBM SP system.
3 components of Deep Computing: 1) computational methods, 2) large datasets, 3) solving decision problems.
Weather forecast applications: Olympic Games, car races, hurricane evacuation, frozen orange trees, energy demand (this depends on time of day and weather).
Energy demand: calculate the expectation for a given state and all those it swaps energy with; predict who wants energy and who can sell it (a small expectation sketch follows).
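A minimal illustration of the expectation calculation just noted, weighting the demand in each weather scenario by its probability for a region and the regions it swaps energy with (the scenarios, probabilities and demand numbers are invented for the example):

# Expected energy demand over weather scenarios for a region and the
# regions it trades with.  All numbers are made up for illustration.

def expected_demand(scenarios):
    """scenarios: list of (probability, demand_MW) pairs; returns E[demand]."""
    total_p = sum(p for p, _ in scenarios)
    assert abs(total_p - 1.0) < 1e-9, "scenario probabilities must sum to 1"
    return sum(p * d for p, d in scenarios)

# Per-region weather scenarios: (probability, demand in MW).
regions = {
    "home":      [(0.6, 900.0), (0.3, 1200.0), (0.1, 1500.0)],  # mild / cold / very cold
    "neighbor1": [(0.5, 400.0), (0.5, 600.0)],
    "neighbor2": [(0.8, 300.0), (0.2, 450.0)],
}

if __name__ == "__main__":
    for name, scen in regions.items():
        print(f"{name}: expected demand = {expected_demand(scen):7.1f} MW")
    pool = sum(expected_demand(s) for s in regions.values())
    print(f"pooled expected demand = {pool:7.1f} MW")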
Intelligent e-mail response for customer messages.
Biometrics as an example of fuzzy matching -- other applications: fingerprints, text, molecular/protein (bioinformatics).

---------- Inverse Problem Tomography --------------
Tomography does not need a supercomputer; 3D convection models need supercomputers (Reynolds number ~10^-20).
High resolution tomography with 250,000 parameters gave misleading results, as that is too many parameters for the data.
Lots of pictures but rather little discussion of computing.
Will put seismometers in the ocean.
EarthScope USArray: deploy fixed crude-resolution and movable fine-resolution instruments; the movable ones will go to 5 sites where they will stay for two years each.

---------- Haxford -----------------------------------
Neutrino mass is real.

------------ McMasters, Boeing -------------------------
IUGREEE. Cold war mindset, but the world is changing; global transportation system.
Son is a webmaster with no qualifications and 75% of his salary; learns from colleagues/Internet.
Boeing likes a process-driven approach.
737-300 costs $35B; 747-400 costs $180 with no competition.
Aviation has existed for 100 years -- 4 generations of designers; need to grow the 5th generation.
Analogy with birds, Icarus, evolution ... man --> Wright Brothers --> Boeing.
Propellers (as in the Boeing 377) suck. Boeing spoilt by no competition.
Mature industry: theoretical limit, practical limit, actual achievement.
Jet fuel prediction impacted design and in fact inflated cost; current jet fuel is 14%.
Gene Wong of NSF likes scalability.
How to design the next aircraft -- several possible scenarios: 1) flying watermelon, 2) pregnant seagull, 3) Klingon battlecruiser.
Need to stop laying people off in recessions.
University relations driven by technology development: hire students, continuing education.
After 5 years there is no correlation between success and where you went to school; NO CORRELATION between undergraduate GPA and performance in 450 Boeing engineers.
Advocates more collaboration between industry and academia, starting before graduation and continuing after graduation.
A liberal arts philosophy is good, as it prepares the student well for later learning.

------------------------------- Panel ---------------------------------------------
Chuck Koelbel: NSF really believes in Information Technology.
NOAA: 10-40 teraflops sustained needed for climate; NOAA seasonal forecasts such as El Nino require 6 teraflops by 2003. Need collaboration between climate scientists and computational science.
Tom Zang: establish a software quality policy; students do not write good software today; "Course in Modular Scientific Programming".
Interdisciplinary: decide what you mean! Multiple science/engineering fields together versus Science--Math--CS.
NSF Atmospheric Sciences: NCAR is the computational science flagship.
12 commandments: think outside the box; find a community; focus your funds; honor your grandparents; beware steering committees; honor charismatic leadership; no shotgun weddings; bottom up; innovative partnerships; tolerate lollipop ideas outside the mainstream; free flow of ideas; fly by the seat of your pants.

------------ John Rice: Organization --------------------
Must have a leader with a broad view across departments.
What is the problem anyway? Originally the cost was hardware; now the problem is software.
Software re-use is more or less a failure.

------------------ Abele -------------------------------
Use deep computing to support e-mail response to online questions (see the sketch below).
Online credits cost 25% more than regular ones.
What is CSIT's value added in education?
Internetics: an Internet resource for students. What's in it for the student?
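A minimal sketch of the automated e-mail response idea noted by both Pulleyblank and Abele above; this is only keyword-based routing of incoming questions to canned answers or a human queue (the categories, keywords and replies are invented for the example; a real system would use statistical text classification):

# Toy router for incoming questions: match against keyword lists and
# either return a canned reply or escalate to a human responder.

CANNED = {
    "registration": (["register", "enroll", "add course", "drop a course"],
                     "Registration questions: see the registrar's online form."),
    "computing":    (["password", "login", "account", "email"],
                     "Computing questions: contact the help desk."),
}

def route(message):
    text = message.lower()
    for topic, (keywords, reply) in CANNED.items():
        if any(k in text for k in keywords):
            return topic, reply
    return "human", "Forwarded to a staff member for a personal reply."

if __name__ == "__main__":
    for msg in ["How do I drop a course after the deadline?",
                "My login does not work on the lab machines.",
                "Can I get credit for an internship?"]:
        topic, reply = route(msg)
        print(f"[{topic}] {msg}\n    -> {reply}")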
Training in team activities is critical.
Teach Mathematica instead of algebra!
Interdisciplinarity replaces reductionism (focus on a single deep field); need liberal science.
CSIT shouldn't just be another discipline; it must focus on teams.
==> Need a core course on teaming, software building and information use. Data-mining is a natural fit here.
==> Abele: must scale to K-12 and to growing minorities.
Health care is 17% of GNP, but there are no computer skills among health care professionals.

------- Juris Hartmanis --------------------
Another factor of 1000 to 10000 is guaranteed by Moore's law.
Larry Smarr: the X-MP of '85 is in today's game MIPS chips.
KDI started the initiative -- his explanation described KDI rather differently from the common perception.
Bipartisan support: 7.4 million jobs -- wages 60% higher than the national average.
PITAC report: http://www.ccic.gov/ac/report/index.html. Also the NRC 1999 report on computer science funding.
Computer science can be viewed as the engineering of intellectual processes.
Computing and simulation is the ultimate success of reductionism; likewise complex systems and emergent behavior.
Digitization of information; connectivity and complexity of computational grids; sensing, computing and activation on a chip.
In engineering, mass and energy are being replaced by information and computing.
How should universities respond to the Information Technology revolution? Tools to aid intellectual activities are being built at the greatest rate ever.
"Second wave of computer science" -- the first was in the early 1960s.
CSIT is an interesting experiment (in the sense that it is not known how to do it).
Cornell: www.cs.cornell.edu, initial task force report. Recommends a "faculty" which is not a center or a department; a Dean for (not of) Computer and Information Science; one department, with CS taken from engineering.
FCI (Faculty in Computing and Information) will not participate in tenure. Significant increase in funding to fund part-time appointments; term appointments, i.e. catalyst funding initially.

---------------- Biology -----------------------------
Study biological functions that are operationally important; these will have simple laws.
Channel proteins: 90% of theories leave out ions and currents -- the current is critical.
Millisecond time scales are non-trivial, as the basic time scale of proteins is 10**-16 seconds.
A simple electrostatic model describes the current flow and runs on a Pentium.
Then develop a model for selectivity based on volume effects and salt concentration.
Solve an irregular Poisson problem (see the sketch at the end of these notes).
Large scale molecular dynamics is typically wrong as it ignores the ions; they place proteins in distilled water. Need 10^7 to 10^12 molecules to get it right -- impossible.

------ Visualization, JHU and SGI ----------------------
New approaches to image understanding -- shapes, as in faces.
Everybody will have Visible Human-like images, so we can see disease etc. and track it.
Computational scientists are the future Michael Jordans!
Appearance visualization, as in paint on cars.
A 2D plot has low visual density -- the eye can do more than this!
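Relating to the "solve irregular Poisson" step in the biology section above: a minimal sketch of a finite-difference Poisson solve with Jacobi iteration on a small regular grid (the grid, boundary conditions and source term are invented for illustration; a genuinely irregular protein/channel geometry would need an irregular mesh or embedded-boundary treatment):

# Jacobi iteration for the 2D Poisson equation  -(u_xx + u_yy) = f
# on the unit square with u = 0 on the boundary.  Pure Python, tiny grid,
# purely illustrative of the kind of electrostatics solve noted above.

def solve_poisson(n=32, iterations=2000):
    h = 1.0 / (n - 1)
    # Source term: a single "charge" near the center of the domain.
    f = [[0.0] * n for _ in range(n)]
    f[n // 2][n // 2] = 1.0 / (h * h)

    u = [[0.0] * n for _ in range(n)]          # initial guess, zero everywhere
    for _ in range(iterations):
        new = [row[:] for row in u]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j] +
                                    u[i][j - 1] + u[i][j + 1] +
                                    h * h * f[i][j])
        u = new
    return u

if __name__ == "__main__":
    u = solve_poisson()
    print("potential at the charge location:", u[16][16])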