1997 Electronic Training Workshop

ASC MSRC WPAFB

Dayton, OH

March 6-7, 1997


Central State University

and

ASC MSRC PET Program

 

TABLE OF CONTENTS

1.0 INTRODUCTION

2.0 OVERVIEW

2.1 WORKSHOP AGENDA

2.1.1 DAY ONE

2.1.2 DAY TWO

2.2 WORKSHOP PARTICIPANTS

3.0 WORKING GROUP RESULTS

3.1 AUTHORING AND PRODUCTION TOOLS

3.1.1 Discussion of Needs and Requirements

3.1.2 Recommendations

3.2 DELIVERY HARDWARE, SOFTWARE, AND TOOLS

3.2.1 Discussion of Needs and Requirements

3.2.2 Recommendations

3.3 SERVICE QUALITY AND OTHER NETWORK ISSUES

3.3.1 Discussion of Needs and Requirements

3.3.2 Recommendations

3.4 TRAINING EVALUATION MECHANISMS

3.4.1 Discussion of Needs and Requirements

3.4.2 Recommendations

3.5 COURSE CONTENT AND QUALITY

3.5.1 QUALITY OF COURSE CONTENT

3.5.1.1 Discussion of Needs and Requirements

3.5.1.2 Recommendations

3.5.2 DELIVERY MODE REQUIREMENTS

3.5.2.1 Discussion of Needs and Requirements

3.5.2.2 Recommendations

3.5.3 USER PROFILES

3.5.3.1 Discussion of Needs and Requirements

3.5.3.2 Recommendations

3.5.4 TYPES OF TRAINING MATERIALS

3.5.4.1 Discussion of Needs and Requirements

3.5.4.2 Recommendations

 

1.0 INTRODUCTION

This report summarizes the results of the Electronic Training Workshop

held on March 6-7, 1997, at Wright Patterson Air Force Base (WPAFB) in

Dayton, Ohio. The Workshop was a gathering of knowledgeable

representatives from academia, government, and industry, to discuss and

develop ideas and strategies for Network Based Education and Training

(NBET). The Workshop was sponsored and supported by the Programming

Environment and Training (PET) component of the High Performance

Computing Modernization Program (HPCMP) at the Aeronautical Systems

Center (ASC) Major Shared Resource Center (MSRC).

NBET is a strategic goal of the HPCMP at ASC, and the Workshop was

conducted as part of the overall plan to achieve that goal. Therefore,

the recommendations from this Workshop are incorporated into the ASC

MSRC PET Training Plan (draft), which is currently under review by the

government, and the results are also an input to the Tactical Plan for

the Training and Collaboration Interface Technology Center (ITC) at ASC.

Workshop attendance was open to representatives from all four of the

DoD MSRCs and also from academic institutions not directly connected to

the HPCMP, including the Dayton Area Graduate Studies Institute (DAGSI).

 

2.0 OVERVIEW

The Electronic Training Workshop included two days of technical

presentations on NBET-related topics, open discussions, working

sessions, and report writing. The first day focused on the presentation

of available technologies and current activity in NBET; the second day

was reserved for evaluation of the available technology alternatives and

recommendations for a practical implementation.

2.1 WORKSHOP AGENDA

2.1.1 DAY ONE:

The first day began with welcomes and introductions by Mr. John Blair,

MSRC Deputy Director, Mr. Jeff Graham, MSRC PET Director and Senior

Engineer, and Mr. Fletcher Kurtz, HPCMP Program Manager from the Nichols

Research Corporation, the prime contractor at the ASC MSRC. Mr. Dana

Hoffman, PET Academic Coordinator from the Ohio Supercomputer Center

(OSC) at The Ohio State University, then presented an overview of the

PET plans and approach for Web-based training development and

deployment, establishing the overall context in which the Workshop

should consider the available alternatives and make recommendations.

Mr. Gordon Nelson, NBET Integrated Project Team (IPT) Leader, presented

a kickoff briefing including an overview of the HPCMP and PET program

background, current ASC PET training activities, near term requirements,

long range vision, and the Workshop goals. The NBET long range vision

includes the following components:

a. Strong video and audio features with interactive capability;

b. Smooth transition from live classroom to availability on the Web;

c. Continuous evaluation and evolution with competition encouraged; and

d. Course development on the Web in accordance with established

guidelines.

Mr. Tracey Smith, Nichols, presented a briefing on the capabilities of

the new MSRC Training Classroom to support local PET training and NBET.

The classroom was under construction at the time of the Workshop and is

expected to be ready for PET use during May 1997.

Ms. Leslie Southern, PET Training and Collaboration ITC Senior Lead

from OSC, presented a summary of the "Colloquium on High Performance

Computing (HPC) Collaborative Methods and Tools" that was held at the

Army Corps of Engineers Waterways Experiment Station (CEWES) MSRC in

Vicksburg, Mississippi, on February 27-28, 1997.

The remainder of Day One was reserved for the NBET-related technical

presentations by the invited speakers. There were four presentations as

follows:

a. "Network Based Learning and netWorkPlace," Dr. John Ziebarth and Mr.

Frank Baker, National Center for Supercomputing Applications (NCSA),

University of Illinois at Urbana-Champaign.

b. "The Web and Education: The Good, The Bad, and The Ugly," Dr. Rubin

Landau, Department of Physics, Oregon State University.

c. "The Tango Collaboratory," Dr. Geoffrey Fox and Mr. Gang Cheng,

Northeast Parallel Architectures Center (NPAC), Syracuse University.

d. "The Regional Training Center for Parallel Processing," Dr. Mladen

Vouk and Mr. Rick Klevans, Department of Computer Science, North

Carolina State University.

2.1.2 DAY TWO:

The second day began with a determination of the topics and

participants for the Working Groups, and these are listed in Section

3.0. The Groups investigated their respective topics and then returned

to general session for an oral presentation of the interim results by

the Group Leaders. The Groups then reassembled to revise and write

their findings and recommendations in reports that were compiled in

electronic form to be consolidated into this overall Workshop Report.

2.2 WORKSHOP PARTICIPANTS

William L. Asbury Nichols asburywl@email.wpafb.af.mil

Frank Baker NCSA fbaker@ncsa.uiuc.edu

Stephen Brewster CSU sbrewster@cesvxa.ces.edu

Douglas L. Campbell AFIT campbell@afit.af.mil

Gang Cheng NPAC gcheng@npac.syr.edu

Charlotte D. Coleman ASC MSRC colemacd@88cg.wpafb.af.mil

Kamyar Dezhgosha CSU kamyar@cesvxa.ces.edu

Ken Flurchick OSC kenf@osc.edu

Geoffrey C. Fox NPAC gcf@npac.syr.edu

Frank Gilfeather Maui HPC Center gilfeath@arl.unm.edu

Jeff Graham ASC MSRC grahamje@email.wpafb.af.mil

Frankie Harris OSC frankie@osc.edu

Ann L. Hernandez Nichols asnalh01@asnmail.asc.edu

Dana Hoffman OSC hoffman@osc.edu

Jeffrey C. Huskamp OSC huskamp@osc.edu

Richard L. Klevans NCSU rlklevan@eos.ncsu.edu

Fletcher Kurtz Nichols kurtzf@88cg.wpafb.af.mil

Jan Labanowski OSC jkl@osc.edu

Rubin Landau Oregon St. U. rubin@physics.orst.edu

Dianne Love CSU dlove@cesvxa.ces.edu

Cheryl Lucas Nichols lucasca@email.wpafb.af.mil

Robert L. Marcus CSU marcus@cesvxa.ces.edu

Mike Natale Nichols natalem@email.wpafb.af.mil

Gordon R. Nelson OSC Team nelsongr@88cg.wpafb.af.mil

Olu "Tido" Olatidoye CAU tido@visidel.cau.edu

Ruth Pachter Wright Laboratory pachterr@ml.wpafb.af.mil

Lawrence E. Porter OSC Team LPorterLPA@aol.com

Butch Rappe CEWES brappe@newton.wes.army.mil

Douglass Robertson CSU drobertson@cesvxa.ces.edu

Peter Schartz pjs@csar.com

Tracey Smith Nichols smithtl@email.wpafb.af.mil

Leslie Southern OSC leslie@osc.edu

Mary Stuessy OSC marys@osc.edu

Kate Treyens OSC kct@osc.edu

Mladen A. Vouk NCSU vouk@csc.ncsu.edu

Jerome J. Walker CSU walker@cesvxa.ces.edu

Theresa Windus OSC windustl@msrc.wpafb.af.mil

Stacy Wood OSC swood@osc.edu

John Ziebarth NCSA ziebarth@ncsa.uiuc.edu

3.0 WORKING GROUP RESULTS

The Workshop participants formed four Working Groups to discuss

functional areas related to the development and deployment of NBET.

Group One covered "Authoring and Production Tools;" Group Two covered

two areas, "Delivery Hardware, Software, and Tools" and "Service Quality

and Other Network Issues;" Group Three covered "Training Evaluation

Mechanisms;" and Group Four covered four topics in the area of "Course

Content and Quality." The Working Group findings and recommendations

for these five functional areas are presented in the following five

sections:

3.1 AUTHORING AND PRODUCTION TOOLS

The focus of Group One was to recommend methodologies and tools to

enable the creation of NBET materials ready for the Web. The scope of

this topic includes the origination of the material by the author, the

preparation of the material for classroom presentation by the author or

another instructor, the capture of the material if presented live in the

classroom, and the post-presentation processing of the material to make

it suitable for Web access. Group One was led by Dr. Olu (Tido)

Olatidoye from Clark Atlanta University, Atlanta, Georgia. The

participants were Rubin Landau, Rick Klevans, Cheryl Lucas, and Ann

Hernandez.

3.1.1 DISCUSSION OF NEEDS AND REQUIREMENTS:

The issues associated with authoring and production tools include the

following:

a. Platform compatibility. The training materials migrate through

several platforms from creation to final delivery. The NBET products

must be compatible with the hardware and software used by the authors,

instructors, NBET providers (e.g., the MSRC), and the students.

b. Interactive lessons have different levels of interactivity.

Students interact with the materials (e.g., they can submit numbers and

get graphs back; a minimal sketch of this round trip follows this

list), students can interact with the instructor or other

students, and the students could run remote lab equipment. Java can be

used reliably for easy interactions. Other tools are more limiting.

The spectrum of interaction should range from HTML as the minimum to

interactive video as the maximum. Video is not realistic over the Web

yet but may be for future DoD intranets.

c. Author input may be platform specific, but Web output should be

platform-independent as much as possible (there may be exceptions).

d. Software packages differ significantly. For example, preparing

multimedia materials for CD is not the same as for the Web; some

packages will transmit big plug-ins for materials, and these may be

platform specific.

e. Different production tools will be needed for different training

products.

f. Video production and post-production editing are usually not done by

the author or instructor. Some local technical assistance is needed.

g. The user system requirements (e.g., RealAudio, RealVideo, sound)

should be stated up front to facilitate the capture and processing of

the materials.
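
As an illustration of the simplest interaction level in item b above,

the following is a minimal sketch, in modern Python, of a lesson server

that accepts a submitted number and returns a computed result; the

port, query parameter, and formula are illustrative assumptions, not a

Workshop specification.

    # Minimal sketch of "submit numbers, get results back" interactivity.
    # The port, the "x" query parameter, and the square function are
    # illustrative placeholders for a real lesson model.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    class LessonHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            query = parse_qs(urlparse(self.path).query)
            x = float(query.get("x", ["0"])[0])
            body = "f(%g) = %g\n" % (x, x * x)  # stand-in lesson model
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(body.encode())

    # A student visiting http://localhost:8000/?x=3 gets "f(3) = 9" back.
    HTTPServer(("", 8000), LessonHandler).serve_forever()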

The following is a list of available graphics tools:

a. lview (?) for Windows;

b. xv (?) for UNIX; and

c. VRML (?) compatible tools.

The following is a list of available converters:

a. latex2html (?);

b. gif converter;

c. graphic converter;

d. xfig (?);

e. xview (?);

f. rtf2html (?);

g. MPEG (?);

h. Quicktime; and

i. jpeg (?) animation.

The following is a list of available audio tools:

a. RealAudio;

b. SoundEdit for Macintosh; and

c. SoundForge for Windows.

The following is a list of available programming tools (these now work

fairly universally):

a. Java; and

b. JavaScript.

3.1.2 RECOMMENDATIONS:

Provide authoring guidelines and a list of supported tools (especially

for format and directory structures) so that the authors can supply

their materials in original text (not Postscript) using supported tools

like Microsoft Word or PowerPoint. The latest versions of these tools

have HTML output options (Internet Assistant). Other examples are NACSE

(?), uces (?), and ncsa (?).

Let authors use their favorite tools as long as a conversion can be

done. The authors should submit source materials that can be kept in

original format for future corrections, updates, and adaptations to new

technology. The originals can then be converted to HTML, if not already

in that preferred output format.
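
The following is a minimal sketch, in Python, of this keep-the-source,

convert-in-house workflow; the directory names and the converter table

(latex2html and rtf2html, as listed in Section 3.1.1) are illustrative

assumptions about the local installation.

    import os
    import shutil
    import subprocess

    SUBMISSIONS = "submissions"  # author originals, kept for updates
    WEB_OUTPUT = "web"           # converted HTML published to the server

    CONVERTERS = {
        ".tex": ["latex2html"],  # LaTeX sources
        ".rtf": ["rtf2html"],    # word-processor materials saved as RTF
        ".html": None,           # already in the preferred output format
    }

    for name in sorted(os.listdir(SUBMISSIONS)):
        ext = os.path.splitext(name)[1].lower()
        src = os.path.join(SUBMISSIONS, name)
        if ext not in CONVERTERS:
            print("unsupported format; ask the author for source:", name)
        elif CONVERTERS[ext] is None:
            shutil.copy(src, os.path.join(WEB_OUTPUT, name))
        else:
            subprocess.call(CONVERTERS[ext] + [src])  # original untouched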

Provide different production tools for different products, for example,

regular class, tutorial, or complete video.

Establish local support and do format conversion in-house for all

approved platforms. Use the same resources to determine supported

products. Is the tool a standard or within established guidelines? Is

tool output convertible to a useful NBET format?

Local technical people should also do the video production and

post-production.

Allow the authors to submit requests for supported tools; this will

facilitate data exchange and video production.

Use Optical Character Recognition (OCR) to convert hard copy using a

commercial application like OmniPage Pro 7.0 (for the Mac) or a suitable

alternative.

Keep an archive in-house for reusable material such as templates,

guidelines, animations, codes, complete courses, and sample courses.

The NBET program should include on-going evaluation of new tools as the

need arises. A local team of experts can characterize available tools

based on feedback from authors, instructors, and students who give

recommendations for tools that may facilitate Web delivery.

3.2 DELIVERY HARDWARE, SOFTWARE, AND TOOLS

Group Two focused on two areas. The first (covered in this section)

was to recommend methodologies and tools to enable the delivery of NBET

materials over the World Wide Web. The scope of this topic includes

synchronous and asynchronous training, delivery hardware and software,

and remote student system requirements. Group Two was led by Dr.

Stephen Brewster from Central State University, Wilberforce, Ohio. The

participants were Bill Asbury, Gang Cheng, Ken Flurchick, Abayomi

Ajayi-Majebi, Mike Natale, Marek Porndorg (by teleconference from

Syracuse University), Douglass Robertson, Tracey Smith, and John

Ziebarth.

3.2.1 DISCUSSION OF NEEDS AND REQUIREMENTS:

The issues associated with delivery hardware, software, and tools include the

following:

a. There needs to be sufficient disk access for users at the MSRC to

support the expected training load while providing reasonable access

times.

b. Users must be given a sufficiently fast response time and a robust

network connection.

c. Cost needs to be considered because not all recommended or commonly

used software is free for government use.

d. Access will include non-DREN (Defense Research and Engineering

Network) users, for example, academic to government and government to

academic.

e. The NBET implementation needs to accommodate the limitations of 28.8

Kbps access from modems or congested networks.

f. The system could be used not only for training but also for

scientific collaboration, and if it is, the requirements will be

different.

g. There is a need for electronic office hours including chat rooms or

video teleconferences.

h. Software and hardware selection impacts performance. The PET team

will want to measure the performance (server, network, and client) of

the system, and the results from any evaluation phase can be used to

fine tune the final server configuration needs. The PET team will need

feedback from users to see what problems they are encountering and their

impression of the system performance. This will be different for

different types of training methods. The team will need to track

network response time, delay time, down time, interactive time for the

user, time of usage, access patterns, and content hits.

i. The MSRC will support three different training models: asynchronous,

synchronous, and mixed Web-based training environments for users.

j. The client machines may need a minimum functionality of browser

access, two-way audio and video, and a network connection of at least

28.8 Kbps.

k. Access for persons with disabilities, especially vision, is

important.

l. Software with scientific equation capability is needed.

m. The Web-based training system needs to be scalable, and a process

for identifying new and emerging capabilities needs to be built into the

plan.

The following is a list of available software products to consider:

a. Microsoft Internet Explorer 3.x;

b. Netscape Navigator 3.x;

c. Mosaic;

d. Web audio;

e. Web video;

f. Software to provide CD capability;

g. Microsoft Office automation applications;

h. VDO (?), VOD (?) (Syracuse, video and audio);

i. VOSAIC (?) (streamed video);

j. UNIX or NT server software;

k. HTML;

l. Java: there are security and performance issues to consider;

m. JavaScript;

n. Viewers;

o. Plug-ins;

p. Chat and threaded discussion software (e.g., MS Chat, and

Hypernews);

q. Netscape Communicator;

r. Tango;

s. netWorkPlace (NCSA);

t. netLearningPlace (NCSA);

u. Joule (NCSA);

v. Habanero (NCSA);

w. NovaNet (Plato); and

x. CyberProf (may require up to 250 hours of development time for each

hour of presentation time).

The following is a list of server hardware considerations:

a. For the best performance, separate servers may be required for each

component of the NBET products: Video, Audio, HTTP, CGI, and Database;

b. For practical development and delivery, separate servers may be

required for Evaluation and Production machines; the production server

specifications will depend on the capacity required to satisfy peak user

demand;

c. This may imply different servers with multiple disks and a total

capacity of 20 GB for each server, for example, five 4 GB disks each,

with at least 128 MB of memory;

d. The Indy at ASC is currently a test machine but needs an upgrade,

for example to two 4 GB disks and 128 MB memory;

e. Archive and backup capabilities are needed;

f. Network access requirements will grow over time (10 Mbps, 100 Mbps,

FDDI, and possibly ATM); and

g. An ability to monitor the network and measure loads on the servers is

needed.

The following is a list of student client machine capabilities to

consider for all possible training modes, asynchronous and

synchronous:

a. Browser;

b. Sound (special sound systems may be needed eventually);

c. Two-way video;

d. Camera and microphone are needed on both server and client machines;

e. Color;

f. MPEG (?) card;

g. CD;

h. Minimum of 28.8 Kbps modem or direct access to the network;

i. Capability to run current software applications;

j. Pentium 90 performance level minimum.

3.2.2 RECOMMENDATIONS:

Initially use the existing SGI Indy workstation as an evaluation server

after upgrading its performance in accordance with the discussion in

Section 3.2.1. Based on experience with this server, and after a thorough

evaluation of requirements, up to five servers (Video, Audio, HTTP, CGI,

and Database) may be needed for the best performance.

Obtain software to measure server and network performance, and collect

statistics in order to evaluate the number of servers and bandwidth

required. The PET team should determine which capabilities (Video,

Audio, HTTP, CGI, and Database) can be combined on a single CPU.

Tell the users up front what their system requirements are.

Require a minimum network connection of 28.8 Kbps.

Obtain and/or develop software to measure student response and access

times.
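
A minimal sketch of such measurement software, in Python, is shown

below: it probes a course page repeatedly and reports response-time

statistics. The URL and sample count are illustrative placeholders.

    import time
    import urllib.request

    SERVER_URL = "http://nbet-server.example/lesson1.html"  # hypothetical
    SAMPLES = 20

    times = []
    for _ in range(SAMPLES):
        start = time.time()
        urllib.request.urlopen(SERVER_URL).read()
        times.append(time.time() - start)

    times.sort()
    print("min %.3fs  mean %.3fs  max %.3fs"
          % (times[0], sum(times) / len(times), times[-1]))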

Survey the students to obtain student perception of performance.

Evaluate server capabilities for different training methods.

Incorporate a process for identifying and evaluating emerging NBET

capabilities.

3.3 SERVICE QUALITY AND OTHER NETWORK ISSUES

Group Two also focused on this second area to make recommendations

concerning the quality of NBET service over the World Wide Web and to

identify other network issues. This discussion was closely related to

the previous topic, but the Group reported the results separately. The

participants were Bill Asbury, Stephen Brewster, Gang Cheng, Abayomi

Ajayi-Majebi, Mike Natale, Marek Porndorg (by teleconference from

Syracuse University), Douglass Robertson, Tracey Smith, and John

Ziebarth.

3.3.1 DISCUSSION OF NEEDS AND REQUIREMENTS:

The issues of security, password protection, and authentication need to

be considered.

High quality is needed throughout the system.

The students will expect CD quality presentations.

The students will expect a stable network connection.

Bandwidth requirements will continue to grow (e.g., toward ATM).

Consider what response time delays the students will tolerate.

3.3.2 RECOMMENDATIONS:

The NBET implementation should provide no more than a 250 millisecond

response time (frame rate) for 95% of each one-hour session.
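
As a concrete reading of this target, the following Python sketch takes

a list of per-frame response times from a session and reports whether

the limit was met; how the samples are collected is outside the sketch,

and the example data is hypothetical.

    def meets_target(latencies, limit=0.250, fraction=0.95):
        """True if at least `fraction` of the samples are within `limit`."""
        within = sum(1 for t in latencies if t <= limit)
        return within >= fraction * len(latencies)

    # Example: 3 of 100 frames exceed 250 ms, so the session passes.
    print(meets_target([0.120] * 97 + [0.400] * 3))  # True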

For synchronous training the "up time" needs to be close to 100%.

Consider whether a troubleshooting capability should be provided to users

for network issues.

3.4 TRAINING EVALUATION MECHANISMS

The focus of Group Three was to recommend methodologies and tools to

enable the evaluation of NBET materials, infrastructure, and results.

The scope of this topic includes synchronous and asynchronous delivery,

knowledge transfer, student and instructor profiles, evaluation

technologies, and Quality of Service (QoS). Group Three was led by Mr.

Robert Marcus from Central State University, Wilberforce, Ohio. The

participants were Frank Baker, Douglas Campbell, Leslie Southern, and

Mladen Vouk.

3.4.1 DISCUSSION OF NEEDS AND REQUIREMENTS:

An NBET system must allow assessment and validation of both the

educational/training impacts and also the technological impacts of its

implementations using scientific methods and metrics. To achieve this,

there needs to be support for the measurement and assessment of:

a. Knowledge and training transfer,

b. Student and instructor profiles, and

c. Supporting technologies.

Functionally, therefore, the system should have facilities for data

collection, quantitative evaluation, and tracking of:

a. Student and system performance,

b. Knowledge acquisition and retention rates,

c. Course and lesson management issues,

d. Educational Quality of Service (QoS) needs, and

e. System usability including the computer-human interface.

The efficiency and effectiveness of NBET must be measured through

knowledge transfer and retention rates and other related metrics (e.g.,

usefulness of the training to specific user categories, increase of

productivity, etc.). Without benchmarks and measurement of both student

and system successes and failures, it is impossible to make any

meaningful and scientific evaluation of either the technology or the

educational paradigms.
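
One common way to turn pre- and post-test scores into a transfer metric

is the normalized gain sketched below in Python; the Workshop named no

specific formula, so this choice is an assumption for illustration.

    def normalized_gain(pre, post, max_score=100.0):
        """Fraction of the possible improvement actually achieved."""
        if pre >= max_score:
            return 0.0  # nothing left to gain
        return (post - pre) / (max_score - pre)

    # A student moving from 40 to 70 realizes half of the possible gain.
    print(normalized_gain(40.0, 70.0))  # 0.5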

Evaluation of author, instructor, and student profiles will allow mutual

matching of user workflow with computer and network-assisted support for

optimized knowledge/skill transfer and minimized resource expenditures.

Similarly, assessment of technological issues should include evaluation

of the material for suitability for synchronous delivery (in-class, and

remote class), and asynchronous delivery over networks.

The PET program should recognize at least two levels at which it should

collect related data:

1. The network/system level. This level is concerned with capabilities

of the underlying infrastructure to actually sustain user workflow

requirements (e.g., cell or packet loss probability (mean and

consecutive), cell or packet delay (maximum, mean, and variation),

bandwidth guarantees (minimum and mean), etc.). A measurement sketch

for these network-level metrics appears after item 2 below.

2. User level. In the context of end-user-oriented workflow, the

network/system level definition of QoS measures may be expanded to

include measurable end-user quality characteristics such as system

reliability and availability, interface usability, performance,

algorithmic scalability, effectiveness, quality of lessons, quality of

user-system interactions, semantic interoperability, and quality of the

delivered material.
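
The following Python sketch computes the network-level measures from

item 1 over a packet trace in which None marks a lost packet; the trace

format and the simple delay-variation proxy are illustrative

assumptions.

    def qos_summary(samples):
        # samples: per-packet one-way delays in seconds; None marks a loss.
        delays = [s for s in samples if s is not None]
        lost = len(samples) - len(delays)
        longest = run = 0
        for s in samples:  # longest run of consecutive losses
            run = run + 1 if s is None else 0
            longest = max(longest, run)
        return {
            "loss_probability": float(lost) / len(samples),
            "max_consecutive_loss": longest,
            "delay_max": max(delays),
            "delay_mean": sum(delays) / len(delays),
            "delay_variation": max(delays) - min(delays),
        }

    print(qos_summary([0.040, None, None, 0.055, 0.048, 0.120]))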

Concerning the suitability and criteria for the use of NBET courses on

the Web, not every classroom presentation will be suitable for direct

transfer to Web presentation. For example, a highly interactive class

with an instructor leading the class through an interactive session with

each student at a computer might be very successful and receive high

evaluation marks. A video recording of that same course viewed across

the Web could be very ineffective. A version of that course could

almost certainly be created for the Web, but the delivery methodology

may need to be completely different.

Therefore, any classroom-to-Web transfer process must include the

question, "Will this course move gracefully to the Web?" This is

particularly applicable if PET will be posting synchronized audio and

slides to the Web as the default to make a class available quickly and

easily. Some courses would only provide embarrassment if posted in this

manner. Such courses should never be provided without appropriate

adaptation because they would waste student time, bandwidth, and the

time required for PET staff to respond to frustrated and/or irritated

students.

3.4.2 RECOMMENDATIONS:

For full evaluation of NBET courses, the PET program should consider all

of the following four components:

a. Measure knowledge/training transfer and usefulness of material to

students' productivity. Ideally this requires administering several

types of assessment instruments: pre-training, interim, post-training,

and a follow-up assessment six weeks to six months later. Non-intrusive

tracking (e.g., logging responses to embedded questions) should also be

used to capture students' in-class learning.

b. Measure management issues related to the course suitability,

transferability, presentation style, and form (technological issues).

State the training goals and objectives for each course up front in the

catalog or initial announcement, and then test against these goals.

Students should be given a pre-test for self-assessment to determine

whether prerequisites have been satisfied. Concerning the instructors,

determine the style of teaching, expertise, level at which teaching is

done, and technology they use in order to determine whether the

instructors' approaches are suitable to the Web and whether or not PET

can adapt or reuse their materials.

c. Measure issues related to student and instructor profiles that will

assist in addressing student needs; for this, PET needs to tailor each

course and then assess both profiles to make the delivery/learning

models conform to the requirements. Some methods to measure student and

instructor profiles include: (1) develop user, instructor, and

supervisor pre/post evaluation forms; (2) use class registration forms

and instructor resumes to gather profile information; (3) develop charts

showing the relative class status for each student.

d. Develop a risk assessment grid using the results of these

evaluations to capture an overall probability of success rating for the

instructor-course-student combinations. The goal is to adjust the

combinations when the risk rating is too high.
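
The Python sketch below shows one way such a grid could be tabulated

from component scores; the component names, weights, and threshold are

illustrative assumptions rather than Workshop values.

    WEIGHTS = {"knowledge_transfer": 0.4,  # recommendation a
               "course_suitability": 0.3,  # recommendation b
               "profile_match": 0.3}       # recommendation c
    RISK_THRESHOLD = 0.35  # hypothetical cutoff for "too high"

    def risk(scores):
        # Risk is one minus a weighted probability-of-success estimate.
        success = sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)
        return 1.0 - success

    combo = {"knowledge_transfer": 0.8, "course_suitability": 0.9,
             "profile_match": 0.4}
    r = risk(combo)
    print("risk %.2f -> %s" % (r, "adjust" if r > RISK_THRESHOLD else "ok"))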

A Web-based course may be delivered synchronously or asynchronously.

Therefore, use different evaluation forms for different delivery

methods.

Capture live classroom questions and answers as supplemental written

material in a suitable database for NBET voice-over-charts

presentations. This material can also serve as a repository of data to

use in the evaluation of Web courses.

Develop a method for evaluating any linked information used in lessons.

A number of recommended approaches to NBET have examples on the

Internet, many of them linked from NCSU's Department of Computer

Science. Establish a regular process for evaluating ideas from these

sources for incorporation into the PET NBET program.

3.5 COURSE CONTENT AND QUALITY

The focus of Group Four was to make recommendations for ensuring the

quality of NBET course content, to identify delivery mode requirements,

to discuss the need for user profiles, and to identify the need for

different types of NBET materials to reflect user needs and the training

versus education split. The scope of these topics includes the needs of

the MSRC users including the makeup of the user community, the delivery

applications, the split of education versus training for the courses

offered, and the preservation of legacy course investment for at least

the short term. The scope of these discussions overlapped the topics

considered by other Groups in some cases, but Group Four provided

additional insights in those areas. Group Four was led by Dr. Kamyar

Dezhgosha from Central State University, Wilberforce, Ohio. The

participants were Geoffrey Fox, Jeffrey Huskamp, Dianne Love, and

Theresa Windus.

3.5.1 QUALITY OF COURSE CONTENT:

3.5.1.1 Discussion of Needs and Requirements:

Course length is an issue for ASC researchers because many, especially

at the Ph.D. level, do not have time away from the laboratory to attend

a three or four day class at the MSRC. Therefore the NBET should

include courses suitable for this level of education or training that

can be completed at the student's pace.

There need to be mechanisms available for course feedback, e.g.,

discussions with the instructor through on-line office hours.

Instructor profiles are needed to gauge the level of technical expertise

in Web course development. This will facilitate a better match of

instructors to courses.

It is important to customize the courses to the users to maximize course

relevance to their jobs and to motivate researchers to sign up for the

course offerings.

The training environments at the universities and the MSRC are probably

different, and PET needs to account for that difference in the materials

and instructors.

PET needs to determine the level of help/guidance needed by the on-site

student versus the level needed over the network during an NBET

presentation.

Asynchronous courses may need facilitators at the delivery point.

3.5.1.2 Recommendations:

Tailor courses to user time constraints.

Create instructor profiles and carefully consider the assignment of

instructors to courses to ensure the best match.

Customize courses to DoD user profiles to maximize job relevance and to

provide incentives.

Generate a training module for the course developers and instructors.

Teach the instructors to employ the presentation development and

delivery mechanisms effectively.

Add hyperlinks from course material to other CTA-related information.

Provide an internal review mechanism for course materials.

Collect inputs from students to help maintain and improve course

quality.

Establish instructor on-line office hours with an interaction protocol.

To maintain quality, provide administrative and technical facilitators

for the courses. The facilitators may need to be assigned by time zone.

3.5.2 DELIVERY MODE REQUIREMENTS:

3.5.2.1 Discussion of Needs and Requirements:

PET needs to track more than browser technology because browser

technology is aiming for the mass market with 28.8 Kbps modems. This

may not be sufficient for ASC applications.

Authoring systems may not be viable for a large course investment

because these tools change frequently. For example, CEWES initially

required Postscript, but that is not a good choice since it is difficult

for authors to generate and manipulate slides using Postscript. Every

slide must be less than 30 Kbytes and be a separate file. ASC has not

focused on MBONE, so this MSRC is not requiring Postscript.

MBONE should be compatible with supported technologies. Delivery

technology should not drive course content, particularly if the result

is of lower quality as in the example of Postscript.

A large investment is needed in infrastructure to adequately support the

NBET evaluation server hardware and software. PET must be prepared to

make this investment.

Delivery mode selections and changes should be made to allow the

preservation of legacy courses.

Delivery technologies should be chosen that allow the conversion of

training materials by the MSRC staff.

Delivery systems need to support existing delivery methodologies as a

baseline.

Existing course presentations include:

a. Handwritten slides (not acceptable);

b. Persuasion;

c. Microsoft PowerPoint and Word;

d. FrameMaker;

e. LaTeX (?); and

f. HTML files (may not be a good choice since the fonts are too small).

3.5.2.2 Recommendations:

Monitor closely where Web technology is going. This should be a special

emphasis area for the Information and Communication ITC tracking

function.

MBONE is required by the government, but Postscript is very difficult

for the Web, so it is not recommended for general PET NBET use. Find a

display technology for MBONE other than Postscript.

Solve the HTML small font size problem before using HTML files.

Previous courses have been given from slides. Investigate how

to put these on the Web directly.

Experiment with emerging Web technologies such as the following to

enhance presentations:

a. Java applets;

b. Automatic glossary support;

c. Two-way video; and

d. RealAudio.

3.5.3 USER PROFILES:

3.5.3.1 Discussion of Needs and Requirements:

An initial user profile questionnaire should be filled out when a user

signs up for a course and also when a user seeks information. This will

yield up-to-date information on current users.

Useful profile inputs would include the number of users that each course

attracts and the number of inquiries each course announcement generates.

The survey results can be weighted for the persons profiled according to

the amount of their computer allocation and usage.
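
A minimal Python sketch of this weighting follows; the field names and

the allocation scale are illustrative assumptions.

    responses = [
        {"allocation_hours": 5000, "course_interest": 4},  # heavy user
        {"allocation_hours": 200, "course_interest": 2},   # light user
    ]

    def weighted_mean(rows, value_key, weight_key):
        total = sum(r[weight_key] for r in rows)
        return sum(r[value_key] * r[weight_key] for r in rows) / total

    # Heavy users dominate the result, reflecting actual MSRC usage.
    print(weighted_mean(responses, "course_interest", "allocation_hours"))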

Compare the results for the on-site versus off-site students. What is

the ASC percentage of off-site users? What does this mean for training?

Profiles from the distributed centers are being obtained now.

User capabilities will help determine how training is delivered. What

hardware (better than 386s?) and software are available? What is the

education, prior training, and experience level of the users? What

applications need to be factored in?

CEWES today may approximate the user profile of all MSRCs in five years

since CEWES was the largest center prior to the MSRC program.

There is currently almost no material on the Web on algorithms and

education. Almost all the Web-based material is training on how to use

HPC hardware.

3.5.3.2 Recommendations:

Develop a comprehensive user profile for the entire user base to drive

the types of presentations and the courses offered.

Concentrate on user needs.

Increase the ratio of education to training courses based on user

profiles.

Emphasize training in algorithms, and add the word "algorithm" to the

title of appropriate courses.

To maximize utilization of MSRC resources, consider increasing the

number of education courses (or course modules) addressing parallel

algorithms relative to training courses.

Use the user profiles to connect students with the CTA on-site leads for

algorithm and application support.

Continue to connect users to the Dayton Area Graduate Studies Institute

(DAGSI) for additional educational needs (versus training).

3.5.4 TYPES OF TRAINING MATERIALS:

3.5.4.1 Discussion of Needs and Requirements:

"Training" courses are generally available on the Web for hardware

platforms and computing subjects, but "education" courses are not

generally available. Perhaps more effort should be put into education

courses. There is an attitude on the part of some that only training

courses are needed, and that education courses in the CTA areas are not

needed since the DoD employs experts in these fields. But what are the

real needs of the users? ASC is not set up to offer quarter-long

courses, so if there is a demonstrated need for longer education

courses, some of this functionality can be provided through self-paced

Web courses.

There is a need to demonstrate the applicability of course material to

the researchers' applications. For example, generating an example of

the course material's application to real DoD problems would be a

stimulating way to attract students.

Courses need to be dynamic and easily updated.

3.5.4.2 Recommendations:

The split of education offerings versus training offerings needs to be

evaluated. NBET courses should be evaluated for expansion into longer

formats that could assist the education component.

Some courses longer than three days should be considered.

Education must be couched in terms of near-term deliverables and the

production-oriented mindset that prevails at ASC; intellectual curiosity

not driven by a deliverable is not the norm.

Parallel algorithms should be emphasized due to the ASC hardware and

the direction of CTAs and the Common HPC Software Support Initiative

(CHSSI).