October 1, 1998
4:00 - 6:00 p.m.
MIT Media Lab
20 Ames Street
This forum will feature video clips from a 1995 symposium in honor of Vannevar
Bush, which included Ted Nelson as a speaker, to illustrate
how hypertext evolved from conceptualizations rooted in older
media towards the reality of today's World Wide Web. A discussion
following the screening will focus on factors that have constrained
current implementations of hypertext.
Bernstein is Founder and Chief Scientist at Eastgate
Systems, Inc. He is a familiar figure in the national and international
hypertext community who has worked on a variety of successful
projects, including the HyperGate hypertext authoring system;
Link Apprentice, a research tool received with interest by the hypertext
community; and Storyspace, for which he was the primary developer.
In addition, he served as program co-chair for the ACM conferences
Hypertext '96 (Washington, D.C.) and Hypertext '97 (Southampton, UK).
Hopper is a Postdoctoral Associate in Comparative
Media Studies at the Massachusetts Institute of Technology and
Managing Editor of the Media in Transition Project WWW site.
Her past activities include a comparative study of successful
educational computing projects at Purdue University (ESCAPE),
Brown University (Intermedia/Context32) and the Massachusetts
Institute of Technology (Athena/AthenaMuse).
Hopper: I designed this event to address a key goal of the Media in Transition
project, which is to encourage a historical perspective on the
relationships between digital media and older media. My personal
interpretation of this is that perhaps the best path into the
future is through the past. Without a historical perspective,
we are running blind into the future.
The past is at least one valuable tool we can use to chart
a more thoughtful course for the future.
Our focus today is particularly on hypertext, the cornerstone of the
World Wide Web (WWW) and one of the most "hyped" aspects of
the new digital media. This forum is designed to provide a more
in-depth picture of what it's about. It seems ironic how little
discussion surrounds such a key component of our ongoing period
of media transformation. Even popular books for novices, such
as Internet for Dummies, mention Vannevar Bush and Ted
Nelson, but serious books about the Internet and WWW seldom
go further than a cursory history. You will almost always find
a paragraph on Nelson saying he invented the term hypertext.
You might find another paragraph on Bush saying he created
the Memex. Everything else is almost always left out unless
you start digging into the serious hypertext literature.
The first thing I am going to do is present some video clips that have
very seldom been seen. The clips are from a symposium called As
We May Think: A Celebration of Vannevar Bush's 1945 Vision.
This symposium provided a unique opportunity to view hypertext
from a special historical perspective. The clips from that event
show how hypertext, as viewed by Bush and Nelson, evolved from
conceptualizations rooted in older media towards the reality
of Berners-Lee's WWW.
The guests who appear in the video clips are Paul Kahn and Ted Nelson. Paul
Kahn is a scholar who has written about Bush's Memex, and he was
invited to speak at the symposium about his work. Rather than
presenting the clips linearly, I have decided to take advantage
of hypertext by creating juxtaposed segments from different
presentations that are related. People who were actually at
the symposium saw one complete presentation by Nelson and one
presentation by Kahn, and they never had the opportunity to
break it all down and analyze it.
Before I go on to discuss the backgrounds of Bush and Nelson, I would like
to mention the key initial publications that they both wrote
about their technologies, which are often referenced. Bush's
article about Memex
first appeared in the Atlantic Monthly (1945), while
Nelson's ideas that formed the foundation for Xanadu
were first printed twenty years later in the ACM Proceedings
(1965). Both later published follow-ups
about their technologies many years after their first publications,
and Nelson is continuing to write about his concepts today.
One interesting point to note is that not only was Nelson definitely
influenced by Bush, but he in fact reprinted Bush's entire Memex
article. Both Bush
and Nelson were highly connected to traditional media, and these
interests played a role in the particular forms of technology
they chose to explore. Bush
was most famous for overseeing the Manhattan Project and establishing
the National Science Foundation (NSF). He held many titles,
the most well known of which was President Roosevelt's science
advisor. He also held the title of Vice President of MIT, which is
why the Bush Symposium was at MIT and all his archives are here.
Almost always, when Memex is discussed, you get the sense it
is a digital tool like a computer, but it wasn't. All of Bush's
technologies, including the Memex, were analog devices that
were purely mechanical. His earliest invention was the profile
tracer, which he patented while a graduate student at Tufts. It is seldom
mentioned that he also invented a justifying typewriter.
His most significant invention was the differential analyzer,
which was used to calculate ballistics tables during World War II.
Nelson's background was quite different. He focused primarily on the
humanities, having earned a B.A. in Philosophy from Swarthmore
College in 1959 and a Master's in Sociology from Harvard in 1963.
His interests began with an orientation towards traditional
media like cinema.
However, he was equally focused on the theatrical
aspects of media.
While these gentlemen were quite different in both their focus and
careers, they both shared some broader concerns about what they
were doing, and the particular technologies they developed were
intended to address these broader concerns. Both of them wanted
ways to improve people's ability to store, process and communicate
their ideas. That is the commonality that brings them together.
Bush said that the role
of the computer
in the future would be to "supplement a man's thinking methods."
For him to say this was particularly significant when you consider
his role in history. To say he founded the NSF and a few other
things doesn't really capture the point. He designed both the
military-industrial complex in this country and the role of
information systems within that framework. It is easy to argue
that his vision underlies what eventually became the Internet. Nelson
also has been quite concerned about the nature of the human
mind and its relationship to computers. However, his perspective
is somewhat different. While Nelson's emphasis has also been
on the computer's role in improving the human mind's ability
to comprehend complex ideas,
he frames the task of designing such systems
as an art rather than a science or engineering.
Let us now
turn to the technologies behind Memex and Xanadu.
Memex was designed as an analog device, like Bush's other inventions,
although it was never actually built. On the other hand, a different
machine that was Memex's predecessor, called the microfilm rapid selector,
was built in the late 1930s here at MIT. That technology was
elaborated over time, until it became quite sophisticated in
the hands of the CIA. However, that technology never quite achieved
the level of sophistication that Bush described in his article.
To give a better sense of what the Memex that Bush proposed
would look like to the participants at the Bush Symposium, Kahn
provided an animated demonstration
of the "Turkish Bow Scenario" that Bush described in his article
and a point-and-click interface to explore the technology behind it. Nelson is
credited with coining the term hypertext, but most people think
that just means linking, like we see on the WWW. In fact, his
concepts are more complex than that. In order to give a better
sense of how Xanadu was intended to function, Nelson presented
his notion of how a computer
should look, and then demonstrated a piece of software called
the Zip Editor to illustrate the key components of his Xanadu concept. In contrast
to Bush's "Turkish Bow Scenario", he used the Zip Editor to
show how his ideas could be applied to a historical writing
example. He spent a considerable amount of time explaining his
model of transpublication,
which would allow the virtual republishing of documents within
a new copyright framework. Near the "grand finale" of his presentation,
he summarized the core concepts
that are still missing from our current ideas of hypertext:
- things you look at together whose specific connections are
- virtual instance across a boundary with original identity
maintained and original content available.
- allow pointing across window boundaries explicitly.
- see from one transcluded instance to another.
- virtual republishing by distribution of pointers, where materials are
obtained from the originator or their agent.
- permission doctrine.
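The transclusion idea in the list above can be made concrete with a minimal sketch: a "document" holds pointers into source texts rather than copies, so the original identity of each span is maintained and the original content stays available. The names and structures below are purely illustrative assumptions, not Xanadu's actual design.

```python
# Illustrative sketch of transclusion: a document is a list of spans,
# each pointing into an original source rather than copying its text.
# All names here are hypothetical, not Xanadu's actual design.

sources = {
    "bush1945": "Consider a future device in which an individual stores books",
    "nelson1965": "Let me introduce the word hypertext to mean a body of material",
}

class Span:
    """A pointer into a source document: (source id, start, end)."""
    def __init__(self, source_id, start, end):
        self.source_id, self.start, self.end = source_id, start, end

    def resolve(self):
        # Original identity is maintained: we always know where the
        # text came from, and the original content remains available.
        return sources[self.source_id][self.start:self.end]

# A "new" document built entirely from virtual instances of others.
doc = [Span("bush1945", 0, 8), Span("nelson1965", 0, 6)]
print(" ".join(span.resolve() for span in doc))  # → Consider Let me
```

Because each span records its origin, "seeing from one transcluded instance to another" amounts to following the pointer back to the source rather than comparing copies.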
In addition to sharing similar goals, both Bush's Memex and Nelson's
Xanadu faced questions regarding their technical feasibility.
Kahn began his exposition on the technology of Memex by providing
an account of how the diagrams
that the public has seen weren't created by Bush. He also explained
that not only did the article describe technologies that never did
or could exist in the form described, but also that the earlier
machine upon which they were based, the rapid selector,
was later abandoned because of its mechanical shortcomings.
His conclusion was that Memex was more a successful exercise in imagination
than a technical feat.
Since Nelson was working with twenty more years of technological development
at his disposal, the question of technical feasibility has been
a more serious issue, one that has haunted his career. A complete
system like the one he described has never been built, and there has
always been controversy surrounding why. Speculation has attributed
the cause to a variety of reasons:
- The technical
concepts were too far-fetched to function.
- The technology
simply wasn't advanced enough to support it yet.
- He wasn't
able to communicate it well enough.
- He didn't
have the people skills needed to run a project of that scale.
Nelson addressed these issues
and attributed the problem to a combination of factors that
included mismatches
in both his personality and his timing. However, he also explains that
he believes his ideas are implementable on the WWW within the
browser model. In fact, he is still in Japan working on it.
While Bush's Memex was clearly more vision than reality, you
can see that at least Nelson has the Zip Editor he shows in
the clips. Recently, he has also released another piece of software
called ZigZag, available on the WWW for free. His attributing
the entire problem to himself, while containing some truth,
may also be somewhat of a disservice.
There is great recognition that the complexity required for building
large software systems is an incredible problem faced by anyone
trying to do it. While he didn't make great progress, he is
not the only one. In another presentation at the Bush Symposium
in 1995, Tim Berners-Lee said that it was a trivial problem
to make the WWW interactive. That was almost five years ago.
Anybody that develops for the WWW today will still tell you
that it is not terribly easy to edit the WWW or to get other
people to interact with each other in the same workspace. That
is a really major theme right now. There are grounds to suggest
that, while the WWW has been a wild success in some ways, some
of Berners-Lee's most valuable larger goals were also never
realized. It is easily
argued that Bush's vision of Memex has been realized in spirit,
while much of Nelson's vision remains unattained by him or others.
Nelson eloquently described the limitations of some of our favorite
interfaces. In a wonderful series of clips, he attacks the Macintosh
for being a paper simulator with inadequate metaphors like cut
and paste or the garbage can.
Nelson continues to remind us that we still need more elegant
tools to help us better express the multidimensional
nature of our ideas.
As Nelson suggests, let us consider the possibility that there
are many things that we could potentially do electronically
which we have never achieved. Hypertext, like computing in general,
has a long history of valuable features
that have been implemented in demonstration systems, but have
not made their way into well known commercial systems for the
general public. HyperCard was definitely considered a bastardization
of the hypertext concept when it became popular, because there
were much more sophisticated hypertext systems available long
before it was released. Many of the features from early systems,
as well as many newer concepts, are still not available in any
widely available systems -- including the WWW. We should wonder
whether there is a large percentage of the possibilities of
hypertext, or computers in general, which are not available
to the average person yet.
The point is that there may be vast potentials in electronic media that we
still haven't implemented, and we may be in danger of missing
some of the greatest potential in electronic media because we
mistake some situational limitations in technical development
or project management for impossibility. This is the grounds
upon which I suggest that the past is of more than just historical
interest. The past, and the pioneers who shaped the past, may
hold many suggestions as to paths it might be valuable to explore
in the future, and advice on the most productive ways to go down them.
Bernstein: First, let me begin by mentioning that I wasn't around for the
early history of hypertext. I started building systems in 1982,
almost twenty years after Nelson. I did predate HyperCard, but
I came in later than Nelson and his early Xanadu.
I have opinions
about Bush based on insubstantial things. If Bush's essay had
been forgotten, as it effectively was until Nelson reprinted
it, would anything have been very different? People will say
things like, "Bush created Memex." But, of course, he didn't,
because it never got created. It's not even a failure of management
or implementation. As far as I've been able to discover there
was never a serious effort to create it. Memex doesn't really
anticipate Xanadu in a very meaningful way. Their interests
are very distinct and different.
For many years, there was a tendency to believe that Nelson was a starry-eyed
leftist idealist whose ideas were simply impractical. That climate
is now very hard to recover -- especially if you entered the world
post-WWW. To counteract this, Nelson adopted a very business-oriented
rhetoric in the early 70s. This is why he fell into
branding every term. Notice that every single concept had to
be "trans" something. This led to Nelson getting into a content-free
but no-win bind in his speaking and rhetoric in the 80s.
On the one hand, he is too businesslike, because he keeps branding
everything. On the other hand, he is not businesslike enough.
So he could never raise enough money to build it all.
As late as Hypertext '91, we had a stringently refereed paper in a terribly
competitive conference. It was a paper analyzing the transcopyright
model, and it said, "in conclusion, most uncertain are the adequacy
and financial incentives for authors to put their most valuable
copyrighted works into the Xanadu system." This seemed like
a completely reasonable assertion as late as 1991. Of course,
by 1993, people were lining up in droves to pay people to put
their copyrighted works on the WWW, which for all practical
purposes was the Xanadu vision.
There is also the political issue that is important to remember, but
wasn't discussed much. The founding document by Ted Nelson that
everyone has in mind, especially those who came onto the scene
after 1987, is a book called Computer Lib (87). One of the two
front covers of that book is a woodcut with a clenched fist
surrounded by the slogan "computers for the people, you can
and must understand computers now."
This is the decade in which computer liberation actually becomes
significant as a political force. They have stopped being things
that we treat with reverence like they were a decade earlier
when we were told we could hand them offerings through the window.
I'm beginning to see that what Nelson was getting at was the
incredible problem of trying to build software to accommodate
human thought. What is the next step?
The next step is always going on in lots of places. Look at
the systems that many people are working on.
What are some of the biggest things that have been left out
of current systems?
Everyone knows that the worst thing about the WWW is latency.
Even The New York Times knows it is a problem. How many
hours are lost? Today, we seem to be falling into using the
WWW as a synonym for hypertext. That's probably a mistake, a
blip in our consciousness. A lot of hypertext happens off the
WWW, and it will continue to do so. We are also going to see
some interesting hybrids. For example, I expect that, in a year
or two, the usual way to buy an Eastgate hypertext will be through
the wire. It will mean buying a chunk of stuff that will show
up on your harddisk and stay there. There are a lot of things
about the browser that are dictated by latency or just wrong.
Another blunder is that links are underlined text. This was only one
of a whole host of proposals for representing what links should
look like. Some people used typographic indicators, colors or
symbols. Almost anything would be better than colored and underlined,
because colored and underlined reads as emphasis. There are
lots of things you want to do with typographic emphasis that
have nothing to do with saying there is a link here. A classic
problem in footnote writing is that they always call undue attention
to themselves. The footnote that says "I stole this from somewhere"
is indistinguishable from "the foregoing passage has been shown
to be completely wrong, and if you build a bridge based on it,
it will fall down."
Copyright is also a big problem. What we have now is an improvised cronyism
from the seventeenth century that has worked well over the years,
but that doesn't mean it was handed down on stone tablets, and
we can't revise it. I have had good fortune, but it can be an
insurmountable problem. I know lots of people have things they
badly want to do, where they are facing viable copyrights that
are being defended by people who badly want to keep the work out of circulation.
Chislenko: If you want to improve on existing systems, as
in other areas of engineering, there are two ways to do it.
One is to figure out what the good system should be and build
it from scratch. The other is to take advantage of the existing
conditions and build on them. This would mean taking into account
the huge existing WWW and adding elements that would be something
more to our liking. I see some elements of this going on, like
semantic markup in the form of, let's say, XML, and I wonder if
you think the existing WWW can be mended.
Bernstein: Remember, I am not Ted Nelson. I think the WWW is great! In
fact, so does he. He has actually written an article that I
am hoping to publish called "Two Cheers for the Web". Things
that would be nice on the WWW would be browsers that would work
if they are reconfigured. HTML is a great markup language, but
now we know all the things that it messes up, such as the confusion
between visual and content markup.
The important distinction is not all the things that HTML did wrong, but that
there was a perfectly good markup language before it called
SGML. Lots of people thought it was going to be the future,
but it was an engineering disaster. This was because, when they
built the standard, they apparently had representatives from
every international agency you could ask, but zero
computer scientists. So, they wrote a language that wasn't parsable,
and that's no fun. On the other hand, HTML can be parsed by
a second year undergraduate on a bad day, and some of that code
is still in Netscape. That's the big answer. Lots of the things
that people want to do could be added with tags, tag extensions
or fancy servers. It's easy to sit and say Microsoft or Netscape
should do this, but remember this all just happened yesterday,
and it happened overnight, and it can change.
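Bernstein's contrast between unparsable SGML and easily parsed HTML can be illustrated with a few lines using Python's standard-library parser. This is only a sketch of how simple HTML tokenizing is; it is not a claim about how Netscape or any real browser was implemented, and the sample page is invented for the example.

```python
# A minimal illustration of how easily HTML can be parsed, using only
# Python's standard library. The page content below is invented.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag -- the underlined links
    discussed above."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as (name, value) pairs already split for us.
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

page = '<p>See <a href="memex.html">Bush</a> and <a href="xanadu.html">Nelson</a>.</p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['memex.html', 'xanadu.html']
```

The whole "parser" fits in a dozen lines, which is roughly the point being made: HTML's simplicity, whatever its other sins, is what let so many tools and servers be built for it so quickly.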