The history of hypertext begins in July of 1945, when Dr. Vannevar Bush, President Roosevelt's science advisor during World War II, proposed the Memex in an article titled As We May Think, published in The Atlantic Monthly. In the article, Bush outlines his ideas for a machine that could store textual and graphical information in such a way that any piece of information could be arbitrarily linked to any other piece. In his own words:
[...] He [the user] can add marginal notes and comments, taking advantage of one possible type of dry photography, and it could even be arranged so that he can do this by a stylus scheme, such as is now employed in the telautograph seen in railroad waiting rooms, just as though he had the physical page before him. All this is conventional, except for the projection forward of present-day mechanisms and gadgetry. It affords an immediate step, however, to associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex. The process of tying two items together is the important thing.
Moreover, the Memex would give the user the ability to create an information trail of traversed links that could later be retrieved. The following excerpts from the article further outline Dr. Bush's vision.
The real heart of the matter of selection, however, goes deeper than a lag in the adoption of mechanisms by libraries, or a lack of development of devices for their use. Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and re-enter on a new path.
The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.
Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than by indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.
Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and to coin one at random, "memex" will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.
It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk.
In one end is the stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part of the interior of the memex is devoted to storage, the rest to mechanism. Yet if the user inserted 5000 pages of material a day it would take him hundreds of years to fill the repository, so he can be profligate and enter material freely.
In 1965, Ted Nelson coined the terms "hypertext" and "hypermedia" in a paper presented to the ACM 20th National Conference. [Nielsen, DeBra] In his book Literary Machines, Nelson explained:
[...] By 'hypertext' I mean nonsequential writing - text that branches and allows choices to the reader, best read at an interactive screen.
The first hypertext-based system was developed in 1967 by a team of researchers led by Dr. Andries van Dam at Brown University. The research was funded by IBM, and the first hypertext implementation, the Hypertext Editing System, ran on an IBM System/360 mainframe. IBM later sold the system to the Houston Manned Spacecraft Center, which reportedly used it for the Apollo space program documentation. A year later, in 1968, van Dam developed FRESS, the File Retrieval and Editing System, an improved version of his original Hypertext Editing System that was used commercially by Philips. [SU]
Doug Engelbart of the Stanford Research Institute, inventor of the mouse, was also inspired by the hypertext idea. In 1968 he introduced NLS, the oN-Line System, which held over 100,000 papers, reports, memos, and cross-references in a "shared journal." [W3C, SU]
In 1972, researchers at Carnegie-Mellon University began development of ZOG (the name doesn't stand for anything). ZOG was a large database designed for a multiuser environment. The ZOG database consisted of frames which, in turn, consisted of a title, a description, a line with standard ZOG commands, and a set of menu items (called selections) leading to other frames. The ZOG database was text-only and originally ran on an IBM mainframe. A PERQ workstation implementation of ZOG was used on the nuclear-powered aircraft carrier USS Carl Vinson. Two of the original developers of ZOG, Donald McCracken and Robert Akscyn, later developed KMS (Knowledge Management System), an improved version of ZOG. KMS ran on Sun and HP Apollo workstations with much improved performance. Though KMS included a GUI, it remained a text-based system. It was intended to be a collaborative tool: users could modify the contents of a frame, and the changes would be immediately visible to others through dynamically updated links. [DeBra, SU, W3C]
In 1978, Andrew Lippman of the MIT Architecture Machine Group led a team of researchers that developed what is argued to be the first true hypermedia system, the Aspen Movie Map. This application was a virtual ride simulation through the city of Aspen, Colorado. Four cameras, pointing in different directions, were mounted on a truck which was driven through the streets of Aspen. The cameras took pictures at regular intervals, and all the pictures were compiled onto videodisks. The images were linked in such a way that the user could start at a given point and move forward, back, left, or right. Once a route through the city was chosen, the system could display the images in rapid succession, creating movie-like motion. The system also included images of the interiors of several landmark Aspen buildings, so the user could take a virtual tour of these buildings. Another interesting feature of the system was a navigation map displayed alongside the movie window: the user could jump directly to a point on the city map instead of finding the way through the city streets to that destination. The Aspen Movie Map was a landmark in hypermedia development in that, through a sophisticated application, it demonstrated what could be achieved with the technology available at the time. [DeBra, SU, W3C]
In
my limited reading on the history of computing, I have not encountered
any subject as passionately discussed as Xanadu. Its followers believe
in it with almost religious zeal and its skeptics bash it with equal conviction.
Theodor Holm Nelson, a writer, film-maker, and software designer, conceived
the idea of Xanadu in 1981. In his own words, "explaining it quickly:"
[Gromov]
Andrew Pam, in his article Where World Wide Web Went Wrong, explains transclusion as follows:
"Transclusion" is a term introduced by Ted Nelson to define virtual inclusion, the process of including something by reference rather than by copying. This is fundamental to the Xanadu designs; originally transclusions were implemented using hyperlinks, but it was later discovered that in fact hyperlinks could be implemented using transclusions! Transclusions permit storage efficiency for multiple reasonably similar documents, such as those generated by versions and alternates as discussed above.
In the Xanadu scheme, a universal document database (the docuverse) would allow addressing of any substring of any document from any other document. "This requires an even stronger addressing scheme than the Universal Resource Locators used in the World-Wide Web." [DeBra] Additionally, Xanadu would permanently keep every version of every document, thereby eliminating the possibility of a broken link and the ever-so-familiar 404 Document Not Found error. Only the current version of a document, however, would be maintained in its entirety; previous versions would be dynamically reconstructed from the current version through a sophisticated versioning system that keeps track of the modifications made to each generation of the document. In Samuel Taylor Coleridge's poem Kubla Khan, Xanadu is a "magic place of literary memory" where nothing is ever forgotten. [Gromov, Zeltser]
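One way to read that versioning idea is as a reverse-delta scheme: keep only the newest text in full and record, for each edit, enough information to undo it, so any earlier generation can be rebuilt on demand. The sketch below is a minimal illustration of that interpretation, not Xanadu's actual implementation; the VersionedDocument class and its method names are assumptions made for the example.

# Minimal reverse-delta versioning sketch: only the current text is
# stored in full, and each edit logs how to undo itself.

class VersionedDocument:
    def __init__(self, text=""):
        self.current = text
        self.undo_log = []                     # one "reverse delta" per edit

    def replace(self, start, end, new_text):
        removed = self.current[start:end]
        # Remember where the new text sits and what it displaced,
        # which is exactly what is needed to undo the edit later.
        self.undo_log.append((start, start + len(new_text), removed))
        self.current = self.current[:start] + new_text + self.current[end:]

    def version(self, n):
        """Reconstruct generation n (0 = the original text)."""
        text = self.current
        for start, end, removed in reversed(self.undo_log[n:]):
            text = text[:start] + removed + text[end:]
        return text


doc = VersionedDocument("Xanadu keeps every version.")
doc.replace(0, 6, "The docuverse")                                  # generation 1
doc.replace(len(doc.current) - 1, len(doc.current), " forever.")    # generation 2
print(doc.current)                             # newest text, stored in full
print(doc.version(0))                          # original text, rebuilt from deltas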
Xanadu was never implemented. In his article in Wired Magazine, The Curse of Xanadu, Gary Wolf writes:
Xanadu, a global hypertext publishing system, is the longest-running vaporware story in the history of the computer industry. It has been in development for more than 30 years. This long gestation period may not put it in the same category as the Great Wall of China, which was under construction for most of the 16th century and still failed to foil invaders, but, given the relative youth of commercial computing, Xanadu has set a record of futility that will be difficult for other companies to surpass.
Despite these harsh words, Wolf later writes:
Nelson's writing and presentations inspired some of the most visionary computer programmers, managers, and executives - including Autodesk Inc. founder John Walker - to pour millions of dollars and years of effort into the project. Xanadu was meant to be a universal library, a worldwide hypertext publishing tool, a system to resolve copyright disputes, and a meritocratic forum for discussion and debate. By putting all information within reach of all people, Xanadu was meant to eliminate scientific ignorance and cure political misunderstandings.
Wolfsbane, published by the Xanadu project, is Nelson's retort to Gary Wolf's 'Curse of Xanadu' article in Wired Magazine.
After years of frustration, Ted Nelson accepted an invitation from Japan in 1994 and founded the Sapporo HyperLab, where he continued his Xanadu research. He is currently a Professor of Environmental Information at the Shonan Fujisawa Campus of Keio University.
Other landmarks in the history of hypertext include Janet Walker's 1985 Symbolics Document Examiner, the first hypertext-based system to gain widespread acceptance and usage. The system provided the manual for Symbolics computers in hypertext format, as opposed to the 8,000-page printed version. The application was significant in that it was generic enough to be used for general purposes, a change from other hypertext applications of the time, which were written for specific needs. It also gave users the option to bookmark nodes within the document database. [DeBra]
Also in 1985, Xerox released NoteCards, a LISP-based hypertext system. NoteCards' unique features included scrolling windows for each notecard, pre-formatted specialized notecards, and a separate browser/navigator window. Another hypertext application released in 1985 was Brown University's Intermedia for the Macintosh A/UX system. [DeBra]
In 1986 Office Workstations Ltd (OWL) introduced OWL-Guide, a hypertext system developed for the Macintosh. The original version of Guide, developed in 1982, was a PERQ workstation hypertext system based on the work of Peter Brown of the University of Kent at Canterbury. OWL-Guide was later ported to the IBM-PC platform and became the first multi-platform hypertext system. The application gained widespread acceptance due to the popularity of the Macintosh platform. [DeBra]
In 1987, Bill Atkinson of Apple Computer introduced HyperCard. Apple bundled the application free with all Macintosh machines. HyperCard soon became the most widely used hypertext system, and many HyperCard-based applications were developed. Many believe HyperCard to be the application that contributed the most to the popularization of the hypertext model. ACM held the first Conference on Hypertext later that year. [DeBra]
In 1989, the World Wide Web came along...
Shahrooz Feizabadi <shahrooz@vt.edu>