
The Freedom to Associate

23 Sep

In 1854, an Austrian priest and physics teacher named Gregor Mendel sought and received permission from his abbot to plant a two-acre garden of pea plants on the grounds of the monastery at which he lived. Over the course of the next seven years, he bred together thousands upon thousands of the plants under carefully controlled circumstances, recording in a journal the appearance of every single offspring that resulted, as defined by seven characteristics: plant height, pod shape and color, seed shape and color, and flower position and color. In the end, he collected enough data to formulate the basis of the modern science of genetics, in the form of a theory of dominant and recessive traits passed down in pairs from generation to generation. He presented his paper on the subject, “Experiments on Plant Hybridization,” before the Natural History Society of Brünn in 1865, and saw it published in a poorly circulated scientific journal the following year.

And then came… nothing. For various reasons — perhaps due partly to the paper’s unassuming title, perhaps due partly to the fact that Mendel was hardly a known figure in the world of biology, undoubtedly due largely to the poor circulation of the journal in which it was published — few noticed it at all, and those who did dismissed it seemingly without grasping its import. Most notably, Charles Darwin, whose On the Origin of Species had been published while Mendel was in the midst of his own experiments, seems never to have been aware of the paper at all, thereby missing this key gear in the mechanism of evolution. Mendel was promoted to abbot of his monastery shortly after the publication of his paper, and the increased responsibilities of his new post ended his career as a scientist. He died in 1884, remembered as a quiet man of religion who had for a time been a gentleman dabbler in the science of botany.

But then, at the turn of the century, the German botanist Carl Correns stumbled upon Mendel’s work while conducting his own investigations into floral genetics, becoming in the process the first to grasp its true significance. To his huge credit, he advanced Mendel’s name as the real originator of the set of theories which he, along with one or two other scientists working independently, was beginning to rediscover. Correns effectively shamed those other scientists as well into acknowledging that Mendel had figured it all out decades before any of them even came close. It was truly a selfless act; today the name of Carl Correns is unknown except in esoteric scientific circles, while Gregor Mendel’s has been done the ultimate honor of becoming an adjective (“Mendelian”) and a noun (“Mendelism”) locatable in any good dictionary.

Vannevar Bush

So, all’s well that ends well, right? Well, maybe, but maybe not. Some 30 years after the rediscovery of Mendel’s work, an American named Vannevar Bush, dean of MIT’s School of Engineering, came to see the 35 years that had passed between the publication of Mendel’s theory and the affirmation of its importance as a troubling symptom of the modern condition. Once upon a time, all knowledge had been regarded as of a piece, and it had been possible for a great mind to hold within itself huge swathes of this collective knowledge of humanity, everything informing everything else. Think of that classic example of a Renaissance man, Leonardo da Vinci, who was simultaneously a musician, a physicist, a mathematician, an anatomist, a botanist, a geologist, a cartographer, an alchemist, an astronomer, an engineer, and an inventor. Most of all, of course, he was a great visual artist, but he used everything else he was carrying around in that giant brain of his to create paintings and drawings as technically meticulous as they were artistically sublime.

By Bush’s time, however, the world had long since entered the Age of the Specialist. As the sheer quantity of information in every field exploded, those who wished to do worthwhile work in any given field — even those people gifted with giant brains — were increasingly being forced to dedicate their intellectual lives entirely to that field and only that field, just to keep up. The intellectual elite were in danger of becoming a race of mole people, closeted one-dimensionals fixated always on the details of their ever more specialized trades, never on the bigger picture. And even then, the amount of information surrounding them was so vast, and existing systems for indexing and keeping track of it all so feeble, that they could miss really important stuff within their own specialties; witness the way the biologists of the late nineteenth century had missed Gregor Mendel’s work, and the 35-year head start it had cost the new science of genetics. “Mendel’s work was lost,” Bush would later write, “because of the crudity with which information is transmitted between men.” How many other major scientific advances were lying lost in the flood of articles being published every year, a flood that had increased by an order of magnitude just since Mendel’s time? “In this are thoughts,” wrote Bush, “certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over.” “This sort of catastrophe is undoubtedly being repeated all around us,” he believed, “as truly significant attainments become lost in the sea of the inconsequential.”

Bush’s musings were swept aside for a time by the rush of historical events. As the prospect of another world war loomed, he became President Franklin Delano Roosevelt’s foremost advisor on matters involving science and engineering. During the war, he shepherded through countless major advances in the technologies of attack and defense, culminating in the most fearsome weapon the world had ever known: the atomic bomb. It was actually this last that caused Bush to return to the seemingly unrelated topic of information management, a problem he now saw in a more urgent light than ever. Clearly the world was entering a new era, one with far less tolerance for the human folly, born of so much context-less mole-person ideology, that had spawned the current war.

Practical man that he was, Bush decided there was nothing for it but to roll up his sleeves and make a concrete proposal describing how humanity could solve the needle-in-a-haystack problem of the modern information explosion. Doing so must entail grappling with something as fundamental as “how creative men think, and what can be done to help them think. It is a problem of how the great mass of material shall be handled so that the individual can draw from it what he needs — instantly, correctly, and with utter freedom.”

As revolutionary manifestos go, Vannevar Bush’s “As We May Think” is very unusual in terms of both the man who wrote it and the audience that read it. Bush was no Karl Marx, toiling away in discontented obscurity and poverty. On the contrary, he was a wealthy upper-class patrician who was, as a member of the White House inner circle, about as fabulously well-connected as it was possible for a man to be. His article appeared first in the July 1945 edition of the Atlantic Monthly, hardly a bastion of radical thought. Soon after, it was republished in somewhat abridged form by Life, the most popular magazine on the planet. Thereby did this visionary document reach literally millions of readers.

With the atomic bomb still a state secret, Bush couldn’t refer directly to his real reasons for wanting so urgently to write down his ideas now. Yet the dawning of the atomic age nevertheless haunts his article.

It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy, have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best.

Seen in one light, Bush’s essay is similar to many of those that would follow from other Manhattan Project alumni during the uncertain interstitial period between the end of World War II and the onset of the Cold War. Bush was like many of his colleagues in feeling the need to advance a utopian agenda to counter the apocalyptic potential of the weapon they had wrought, in needing to see the ultimate evil that was the atomic bomb in almost paradoxical terms as a potential force for good that would finally shake the world awake.

Bush was true to his engineer’s heart, however, in basing his utopian vision on technology rather than politics. The world was drowning in information, making the act of information synthesis — intradisciplinary and interdisciplinary alike — ever more difficult.

The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present-day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.

Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and reenter on a new path.

The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve it, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

Bush was not among the vanishingly small number of people who were working in the nascent field of digital computing in 1945. His “memex,” the invention he proposed to let an individual free-associate all of the information in her personal library, was more steampunk than cyberpunk, all whirring gears, snickering levers, and whooshing microfilm strips. But really, those things are just details; he got all of the important stuff right. I want to quote some more from “As We May Think,” and somewhat at length at that, because… well, because its vision of the future is just that important. This is how the memex should work:

When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word. Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item.

Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest. The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.

The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only on the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world’s record, but for his disciples the entire scaffolding by which they were erected.

Ted Nelson

There is no record of what all those millions of Atlantic Monthly and Life readers made of Bush’s ideas in 1945 — or, for that matter, whether they made anything of them at all. In the decades that followed, however, the article became a touchstone of the burgeoning semi-underground world of creative computing. Among its discoverers was Ted Nelson, who is, depending on whom you talk to, either one of the greatest visionaries in the history of computing or one of the greatest crackpots — or, quite possibly, both. Born in 1937 to a Hollywood director and his actress wife, then raised by his wealthy and indulgent grandparents following the inevitable Hollywood divorce, Nelson would live a life largely defined by, as Gary Wolf put it in his classic profile for Wired magazine, his “aversion to finishing.” As in, finishing anything at all, or just the concept of finishing in the abstract. Well into middle age, he would be diagnosed with attention-deficit disorder, an alleged malady he came to celebrate as his “hummingbird mind.” This condition perhaps explains why he was so eager to find a way of forging permanent, retraceable associations among all the information floating around inside and outside his brain.

Nelson coined the terms “hypertext” and “hypermedia” at some point during the early 1960s, when he was a graduate student at Harvard. (Typically, he got a score of Incomplete in the course for which he invented them, not to mention an Incomplete on his PhD as a whole.) While they’re widely used all but interchangeably today, in Nelson’s original formulation the former term was reserved for purely textual works, the latter for those incorporating other forms of media, like images and sound. But today we’ll just go with the modern flow, call them all hypertexts, and leave it at that. In his scheme, then, hypertexts were texts capable of being “zipped” together with other hypertexts, memex-like, wherever the reader or writer wished to preserve associations between them. He presented his new buzzwords to the world at a conference of the Association for Computing Machinery in 1965, to little impact. Nelson, possessed of a loudly declamatory style of discourse and all the rabble-rousing fervor of a street-corner anarchist, would never be taken all that seriously by the academic establishment.

Instead, it being the 1960s and all, he went underground, embracing computing’s burgeoning counterculture. His eventual testament, one of the few things he ever did manage to complete — after a fashion, at any rate — was a massive 1200-page tome called Computer Lib/Dream Machines, self-published in 1974, just in time for the heyday of the Altair and the Homebrew Computer Club, whose members embraced Nelson as something of a patron saint. As the name would indicate, Computer Lib/Dream Machines was actually two separate books, bound back to back. Theoretically, Computer Lib was the more grounded volume, full of practical advice about gaining access to and using computers, while Dream Machines was full of the really out-there ideas. In practice, though, they were often hard to distinguish. Indeed, it was hard to even find anything in the books, which were published as mimeographed facsimile copies filled with jotted marginalia and cartoons drafted in Nelson’s shaky hand, with no table of contents or page numbers and no discernible organizing principle beyond the stream of consciousness of Nelson’s hummingbird mind. (I trust that the irony of a book concerned with finding new organizing principles for information itself being such an impenetrable morass is too obvious to be worth belaboring further.) Nelson followed Computer Lib/Dream Machines with 1981’s Literary Machines, a text written in a similar style that dwelt, when it could be bothered, at even greater length on the idea of hypertext.

The most consistently central theme of Nelson’s books, to whatever extent one could be discerned, was an elaboration of the hypertext concept he called Xanadu, after the pleasure palace in Samuel Taylor Coleridge’s poem “Kubla Khan.” The product of an opium-fueled hallucination, the 54-line poem is a mere fragment of a much longer work Coleridge had intended to write. Problem was, in the course of writing down the first part of his waking dream he was interrupted; by the time he returned to his desk he had simply forgotten the rest.

So, Nelson’s Xanadu was intended to preserve information that would otherwise be lost, which goal it would achieve through associative linking on a global scale. Beyond that, it was almost impossible to say precisely what Xanadu was or wasn’t. Certainly it sounds much like the World Wide Web to modern ears, but Nelson insists adamantly that the web is a mere bad implementation of the merest shadow of his full idea. Xanadu has been under allegedly active development since the late 1960s, making it the most long-lived single project in the history of computer programming, and by far history’s most legendary piece of vaporware. As of this writing, the sum total of all those years of work is a set of web pages written in Nelson’s inimitable declamatory style, littered with angry screeds against the World Wide Web, along with some online samples that either don’t work quite right or are simply too paradigm-shattering for my poor mind to grasp.

In my own years on this planet, I’ve come to reserve my greatest respect for people who finish things, a judgment which perhaps makes me less than the ideal critic of Ted Nelson’s work. Nevertheless, even I can recognize that Nelson deserves huge credit for transporting Bush’s ideas to their natural habitat of digital computers, for inventing the term “hypertext,” for defining an approach to links (or “zips”) in a digital space, and, last but far from least, for making the crucial leap from Vannevar Bush’s concept of the single-user memex machine to an interconnected global network of hyperlinks.

But of course ideas, of which both Bush and Nelson had so many, are not finished implementations. During the 1960s, 1970s, and early 1980s, there were various efforts — in addition, that is, to the quixotic effort that was Xanadu — to wrestle at least some of the concepts put forward by these two visionaries into concrete existence. Yet it wouldn’t be until 1987 that a corporation with real financial resources and real commercial savvy would at last place a reasonably complete implementation of hypertext before the public. And it all started with a frustrated programmer looking for a project.

Steve Jobs and Bill Atkinson

Had he never had anything to do with hypertext, Bill Atkinson’s place in the history of computing would still be assured. Coming to Apple Computer in 1978, when the company was only about eighteen months removed from that famous Cupertino garage, Atkinson was instrumental in convincing Steve Jobs to visit the Xerox Palo Alto Research Center, thereby setting in motion the chain of events that would lead to the Macintosh. A brilliant programmer by anybody’s measure, he eventually wound up on the Lisa team. He wrote the routines to draw pixels onto the Lisa’s screen — routines on which, what with the Lisa being a fundamentally graphical machine whose every display was bitmapped, every other program depended. Jobs was so impressed by Atkinson’s work on what he named LisaGraf that he recruited him to port his routines over to the nascent Macintosh. Atkinson’s routines, now dubbed QuickDraw, would remain at the core of MacOS for the next fifteen years. But Atkinson’s contribution to the Mac went yet further: after QuickDraw, he proceeded to design and program MacPaint, one of the two applications included with the finished machine, and one that’s still justifiably regarded as a little marvel of intuitive user-interface design.

Atkinson’s work on the Mac was so essential to the machine’s success that shortly after its release he became just the fourth person to be named an Apple Fellow — an honor that carried with it, implicitly if not explicitly, a degree of autonomy for the recipient in the choosing of future projects. The first project that Atkinson chose for himself was something he called the Magic Slate, based on a gadget called the Dynabook that had been proposed years earlier by Xerox PARC alum (and Atkinson’s fellow Apple Fellow) Alan Kay: a small, thin, inexpensive handheld computer controlled via a touch screen. It was, as anyone who has ever seen an iPhone or iPad will attest, a prescient project indeed, but also one that simply wasn’t realizable using mid-1980s computer technology. Having been convinced of this at last by his skeptical managers after some months of flailing, Atkinson wondered if he might not be able to create the next best thing in the form of a sort of software version of the Magic Slate, running on the Macintosh desktop.

In a way, the Magic Slate had always had as much to do with the ideas of Bush and Nelson as it did with those of Kay. Atkinson had envisioned its interface as a network of “pages” which the user navigated among by tapping links therein — a hypertext in its own right. Now he transported the same concept to the Macintosh desktop, whilst making his metaphorical pages into metaphorical stacks of index cards. He called the end result, the product of many months of design and programming, “Wildcard.” Later, when the trademark “Wildcard” proved to be tied up by another company, it turned into “HyperCard” — a much better name anyway in my book.

By the time he had HyperCard in some sort of reasonably usable shape, Atkinson was all but convinced that he would have to either sell the thing to some outside software publisher or start his own company to market it. With Steve Jobs now long gone and with him much of the old Jobsian spirit of changing the world through better computing, Apple was heavily focused on turning the Macintosh into a practical business machine. The new, more sober mood in Cupertino — not to mention Apple’s more buttoned-down public image — would seem to indicate that they were hardly up for another wide-eyed “revolutionary” product. It was Alan Kay, still kicking around Cupertino puttering with this and that, who convinced Atkinson to give CEO John Sculley a chance before he took HyperCard elsewhere. Kay brokered a meeting between Sculley and Atkinson, in which the latter was able to personally demonstrate to the former what he’d been working on all these months. Much to Atkinson’s surprise, Sculley loved HyperCard. Apparently at least some of the old Jobsian fervor was still alive and well after all inside Apple’s executive suite.

At its most basic, a HyperCard stack to modern eyes resembles nothing so much as a PowerPoint presentation, albeit one which can be navigated non-linearly by tapping links on the slides themselves. Just as in PowerPoint, the HyperCard designer could drag and drop various forms of media onto a card. Taken even at this fairly superficial level, HyperCard was already a full-fledged hypertext-authoring (and hypertext-reading) tool — by no means the first specimen of its kind, but the first with the requisite combination of friendliness, practicality, and attractiveness to make it an appealing environment for the everyday computer user. One of Atkinson’s favorite early demo stacks had many cards with pictures of people wearing hats. If you clicked on a hat, you were sent to another card showing someone else wearing a hat. Ditto for other articles of fashion. It may sound banal, but this really was revolutionary, organization by association in action. Indeed, one might say that HyperCard was Vannevar Bush’s memex, fully realized at last.

But the system showed itself to have much, much more to offer when the author started to dig into HyperTalk, the included scripting language. All sorts of logic, simple or complex, could be accomplished by linking scripts to clicks on the surface of the cards. At this level, HyperCard became an almost magical tool for some types of game development, as we’ll see in future articles. It was also a natural fit for many other applications: information kiosks, interactive tutorials, educational software, expert systems, reference libraries, etc.
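To give a flavor of what those scripts looked like, here is a hypothetical HyperTalk handler of the sort an author might attach to a button on a card. The card and field names here are invented for illustration, not drawn from any real stack:

```
on mouseUp
  -- runs when the user clicks the button this script is attached to
  ask "Which city are you traveling to?"
  -- HyperTalk places the user's answer in the special variable "it"
  if it is not empty then
    put it into card field "Destination"
    -- jump to the card named after the user's answer
    go to card it
  end if
end mouseUp
```

Even a snippet this small shows the English-like syntax that made HyperTalk approachable to people who would never have called themselves programmers.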

HyperCard in action

John Sculley himself premiered HyperCard at the August 1987 MacWorld show. Showing unusual largess in his determination to get HyperCard into the hands of as many people as possible as quickly as possible, he announced that henceforward all new Macs would ship with a free copy of the system, while existing owners could buy copies for their machines for just $49. He called HyperCard the most important product Apple had released during his tenure there. Considering that Sculley had also been present for the launch of the original Macintosh, this was certainly saying something. And yet he wasn’t clearly in the wrong either. As important as the Macintosh, the realization in practical commercial form of the computer-interface paradigms pioneered at Xerox PARC during the 1970s, has been to our digital lives of today, the concept of associative indexing — hyperlinking — has proved at least as significant. But then, the two do go together like strawberries and cream, the point-and-click paradigm providing the perfect way to intuitively navigate through a labyrinth of hyperlinks. It was no coincidence that an enjoyable implementation of hypertext appeared first on the Macintosh; the latter almost seemed a prerequisite for the former.

The full revolutionary nature of the concept of hypertext was far from easy to get across in advertising copy, but Apple gave it a surprisingly good go, paying due homage to Vannevar Bush in the process.

In the wake of that MacWorld presentation, a towering tide of HyperCard hype rolled from one side of the computer industry to the other, out into the mainstream media, and then back again, over and over. Hypertext’s time had finally come. In 1985, it was an esoteric fringe concept known only to academics and a handful of hackers, being treated at real length and depth in print only in Ted Nelson’s own sprawling, well-nigh impenetrable tomes. Four years later, every bookstore in the land sported a shelf positively groaning with trendy paperbacks advertising hypertext this and hypertext that. By then the curmudgeons had also begun to come out in force, always a sure sign that an idea has truly reached critical mass. Presentations showed up in conference catalogs with snarky titles like “Hypertext: Will It Cook Me Breakfast Too?”

The curmudgeons had plenty of rabid enthusiasm to push back against. HyperCard, even more so than the Macintosh itself, had a way of turning the most sober-minded computing veterans into starry-eyed fanatics. Jan Lewis, a longtime business-computing analyst, declared that “HyperCard is going to revolutionize the way computing is done, and possibly the way human thought is done.” Throwing caution to the wind, she abandoned her post at InfoWorld to found HyperAge, the first magazine dedicated to the revolution. “There’s a tremendous demand,” she said. “If you look at the online services, the bulletin boards, the various ad hoc meetings, user groups — there is literally a HyperCulture developing, almost a cult.” To judge from her own impassioned statements, she should know. She recruited Ted Nelson himself — one of the HyperCard holy trinity of Bush, Nelson, and Atkinson — to write a monthly column.

HyperCard effectively amounted to an entirely new computing platform that just happened to run atop the older platform that was the Macintosh. As Lewis noted, user-created HyperCard stacks — this new platform’s word for “programs” or “software” — were soon being traded all over the telecommunications networks. The first commercial publisher to jump into the HyperCard game was, somewhat surprisingly, Mediagenic. (Mediagenic was known as Activision until mid-1988; to avoid confusion, I just stick with the name “Mediagenic” in this article.) Bruce Davis, Mediagenic’s CEO, has hardly gone down in history as a paragon of progressive thought in the realms of computer games and software in general, but he defied his modern reputation in this one area at least by pushing quickly and aggressively into “stackware.” One of the first examples of same that Mediagenic published was Focal Point, a collection of business and personal-productivity tools written by one Danny Goodman, who was soon to publish a massive bible called The Complete HyperCard Handbook, thus securing for himself the mantle of the new ecosystem’s go-to programming guru. Focal Point was a fine demonstration that just about any sort of software could be created by the sufficiently motivated HyperCard programmer. But it was another early Mediagenic release, City to City, that was more indicative of the system’s real potential. It was a travel guide to most major American cities — an effortlessly browsable and searchable guide to “the best food, lodgings, and other necessities” to be found in each of the metropolises in its database.

City to City

Other publishers — large, small, and just starting out — followed Mediagenic’s lead, releasing a bevy of fascinating products. The people behind The Whole Earth Catalog — themselves the inspiration for Ted Nelson’s efforts in self-publication — converted their current edition into a HyperCard stack filling a staggering 80 floppy disks. A tiny company called Voyager combined HyperCard with a laser-disc player — a very common combination among ambitious early HyperCard developers — to offer an interactive version of the National Gallery of Art which could be explored using such associative search terms as “Impressionist landscapes with boats.” Culture 1.0 let you explore its namesake through “3700 years of Western history — over 200 graphics, 2000 hypertext links, and 90 essays covering topics from the Black Plague to Impressionism,” all on just 7 floppy disks. Mission: The Moon, from the newly launched interactive arm of ABC News, gathered together details of every single Mercury, Gemini, and Apollo mission, including videos of each mission hosted on a companion laser disc. A professor of music converted his entire Music Appreciation 101 course into a stack. The American Heritage Dictionary appeared as stackware. And lots of what we might call “middlestackware” appeared to help budding programmers with their own creations: HyperComposer for writing music in HyperCard, Take One for adding animations to cards.

Just two factors were missing from HyperCard to allow hypertext to reach its full potential. One was a storage medium capable of holding lots of data, to allow for truly rich multimedia experiences, combining the lavish amounts of video, still pictures, music, sound, and of course text that the system clearly cried out for. Thankfully, that problem was about to be remedied via a new technology which we’ll be examining in my very next article.

The other problem was a little thornier, and would take a little longer to solve. For all its wonders, a HyperCard stack was still confined to the single Macintosh on which it ran; there was no provision for linking between stacks running on entirely separate computers. In other words, one might think of a HyperCard stack as equivalent to a single web site running locally off a single computer’s hard drive, without the ability to field external links alongside its internal links. Thus the really key component of Ted Nelson’s Xanadu dream, that of a networked hypertext environment potentially spanning the entire globe, remained unrealized. In 1990, Bill Nisen, the developer of a hypertext system called Guide that slightly predated HyperCard but wasn’t as practical or usable, stated the problem thus:

The one thing that is precluding the wide acceptance of hypertext and hypermedia is adequate broadcast mechanisms. We need to find ways in which we can broadcast the results of hypermedia authoring. We’re looking to in the future the ubiquitous availability of local-area networks and low-cost digital-transmission facilities. Once we can put the results of this authoring into the hands of more users, we’re going to see this industry really explode.

Already at the time Nisen made that statement, a British researcher named Tim Berners-Lee had started to experiment with something he called the Hypertext Transfer Protocol. The first real web site, the beginning of the World Wide Web, would go online in 1991. It would take a few more years even from that point, but a shared hypertextual space of a scope and scale the likes of which few could imagine was on the way. The world already had its memex in the form of HyperCard. Now — and although this equivalency would scandalize Ted Nelson — it was about to get its Xanadu.

Associative indexing permeates our lives so thoroughly today that, as with so many truly fundamental paradigm shifts, the full scope of the change it has wrought can be difficult to fully appreciate. A century ago, education was still largely an exercise in retention: names, dates, Latin verb cognates. Today’s educational institutions  — at least the more enlightened ones — recognize that it’s more important to teach their pupils how to think than it is to fill their heads with facts; facts, after all, are now cheap and easy to acquire when you need them. That such a revolution in the way we think about thought happened in just a couple of decades strikes me as incredible. That I happened to be present to witness it strikes me as amazing.

What I’ve witnessed has been a revolution in humanity’s relationship to information itself that’s every bit as significant as any political revolution in history. Some Singularity proponents will tell you that it marks the first step on the road to a vast worldwide consciousness. But even if you choose not to go that far, the ideas of Vannevar Bush and Ted Nelson are still with you every time you bring up Google. We live in a world in which much of the sum total of human knowledge is available over an electronic connection found in almost every modern home. This is wondrous. Yet what’s still more wondrous is the way that we can find almost any obscure fact, passage, opinion, or idea we like from within that mass, thanks to selection by association. Mama, we’re all cyborgs now.

(Sources: the books Hackers: Heroes of the Computer Revolution and Insanely Great: The Life and Times of the Macintosh, the Computer That Changed Everything by Steven Levy; Computer Lib/Dream Machines and Literary Machines by Ted Nelson; From Memex to Hypertext: Vannevar Bush and the Mind’s Machine, edited by James M. Nyce and Paul Kahn; The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort; Multimedia and Hypertext: The Internet and Beyond by Jakob Nielsen; The Making of the Atomic Bomb by Richard Rhodes. Also the June 1995 Wired magazine profile of Ted Nelson; Andy Hertzfeld’s website Folklore; and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.”)

Footnotes
1 Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article.
 

Posted on September 23, 2016 in Digital Antiquaria, Interactive Fiction

 


54 Responses to The Freedom to Associate

  1. Lisa H.

    September 23, 2016 at 8:14 pm

    it was about to gets its Xanadu.

    Get.

    Fascinating stuff! I’ll have to take a look at that Xanadu site later.

     
    • Jimmy Maher

      September 24, 2016 at 6:53 am

      Thanks as always!

       
  2. Brian Holdsworth

    September 23, 2016 at 8:43 pm

    Hi Jimmy.

    I enjoy all your articles, and this is one of your most interesting posts for me. Recalling the arrival of the Macintosh, HyperCard, and the World Wide Web all coming in such rapid succession, it really was like a sudden Dawn of a new Era. I was a university student through those few years, and it all seemed rather exciting at the time, but hardly the Dawn of something as big as we know it to be today.

    Frankly, I was so distracted with programming and gaming on my Amiga at the time that it was easy to dismiss HyperCard as something cool, but not really worth learning until there was a similar product for the Amiga. It didn’t even have color! Anyway, Commodore tried and failed to deliver on that with “Amiga Vision” – but instead just added to the long list of “graphical” programming environments that couldn’t quite pull it off. I guess the real successor was probably Director.

    Thanks for reawakening me on this fascinating period of computing history!

    Best,
    Brian

     
    • Jimmy Maher

      September 24, 2016 at 6:52 am

      HyperCard actually could do color. It’s just most of the Macs on which it ran couldn’t. ;)

      I think a lot of people would say that there never really has been a true successor to HyperCard. It’s hugely missed by just about everyone who used it back in the day. It really was a magnificent system, combining enough power to create commercial-quality multimedia applications with an ease of use that would allow a complete beginner to create something she could be proud of in just a few hours. And it was just plain *fun* to boot. The world of computers is poorer for its loss.

       
      • Alan Kay

        September 24, 2016 at 1:15 pm

        Actually, it couldn’t do color for almost all of its useful life

         
        • Jimmy Maher

          September 24, 2016 at 1:32 pm

          Really? I didn’t know that. I never had access to a color Mac back when HyperCard was a major thing. Thanks!

           
          • Sylvestrus

            September 24, 2016 at 8:15 pm

            HyperCard was natively black-and-white for its entire life. There were various third-party XCMDs that added limited color support, and Apple’s eventual color solution was to just ship a first-party XCMD with HyperCard 2.2. All of the normal HyperCard tools were still black-and-white, and you could enter a separate mode to paint color onto objects on the underlying black-and-white cards. Stacks using the color tools had scripts inserted that ran each time the card changed and painted the color on top after the HyperCard client finished its own drawing.

            True, native color support was supposed to ship in the never-released HyperCard 3.0.

             
          • Brian Bagnall

            September 25, 2016 at 1:44 pm

             
          • Jerry

            September 26, 2016 at 4:35 am

            The Apple IIgs version of HyperCard did color. But obviously, the IIgs being a sidelined platform, it didn’t make much of a dent in the world.

             
      • Sol_HSA

        September 24, 2016 at 3:38 pm

        Macromedia Director kind of? was a follow-up to HyperCard, with its “Lingo” language derived from HyperCard, and then evolved up to a point where designers (who it was originally aimed for) found the language too complex to understand and developers (who it fell to eventually) hated it for its absurd syntax, and in the end it was replaced with a JavaScript derivative.

         
    • FilfreFan

      September 25, 2016 at 2:57 am

      Thanks Jimmy — what a great article. As usual, it stirred a few memories.

      Brian, it turns out that the Hypercard WAS released for the Amiga, retailing for about $300.00.

      Once it became routine to use computer file systems to store and access role playing material, we wished for some kind of content-based file access system, although we had no idea that such a thing might already have a name. Our circle of friends sometimes discussed how this might be implemented, and imagined that this was a fairly obvious and widespread desire among computer users.

      I had begun programming “embedded file launchers” in C, and was just getting bogged down in the complexity of pointers to functions when a new Amiga product came to my attention.

      I don’t recall the magazine that reviewed Hypercard for the Amiga, nor do I remember any awareness of its Macintosh origins, but on reading the product review, I realized that this was what we had been looking for and what I had been struggling to create!

      I immediately rushed out and bought it, and worked with it for a few weeks or a month, but ran into a number of very frustrating challenges. It had a good sized manual which documented the scripting language, but the manual was simply inaccurate. Here’s an example (not an actual example, but a notional example muted by the passage of time).

      In C, if you want to understand how to print “Hello World,” you might find this example in your manual:

      #include <stdio.h>
      main()
      {
      printf("Hello World");
      }

      A decent manual would also explain how to compile and run this snippet to magically render a cheery “Hello World” on your standard in/out device.

      But what if your manual’s example omitted the quotes, or the semicolon, or the #include statement, or the parentheses? Although you typed in the examples EXACTLY as they appeared in the manual, what would you do when it bombed in compilation or execution? You might try to isolate the offending snippet and recompile it alone. If that failed, you’d probably look through other examples in the manual and try to guess what was missing. Then you’d begin the trial and error process to understand the failure. Was the string encapsulated improperly? Did it require quotes? Did it require parentheses? Was the manual’s example snippet improperly terminated? etc. Eventually through a variety of research and guessing, you’d get lucky and get that snippet to function. Then you’d move on to the next challenge.

      This happened over and over.

      Despite its absolutely amazing promise, that Amiga product was simply not ready for release. I went back and re-read that magazine review, and in retrospect it had subtly hinted that sometimes using it seemed like trying to guess how to rub the lamp to make the genie emerge. Somehow I’d missed the import of that one sentence buried in a few pages of glowing review.

      In the end, the guessing games were just too much, so a few days before the return interval had expired, I uninstalled it, packaged it all back up, and returned to the shop where I’d bought it. There I learned that opened software packages were not returnable. Fortunately for me, there was another customer standing at the check-out counter, ready to purchase a brand new Amiga Hypercard package! I explained the challenges I was facing and why I wouldn’t recommend it, but if he really, really wanted to buy it, I’d sell him mine for half the price — $150 vs the $300 it had cost me. He jumped on the opportunity! I never learned how it worked out for him, but imagine that, despite my full disclosure, ultimately each of us was out $150 apiece.

      This experience very typically and powerfully illustrates the challenges faced by Amiga adopters. Once we had expended our resources and hitched our cart to a horse, of course we wanted that horse to have a fairly decent run. The Amiga had so much promise, and although much of its potential was realized, ultimately its long-term potential was squandered. While the Hypercard experience was probably not a reflection on Commodore leadership and execution, it was very typical of the bumpy Amiga experience, ultimately culminating in the purchase of an IBM-compatible once the PC’s capabilities had so clearly surpassed those of the Amiga.

      So, here I am, a grumpy old curmudgeon, still reliving disappointments that most had moved beyond back when the dinosaurs still ruled the Earth. For some of us, the echoes of history continue to resonate long past any relevance.

       
      • Jimmy Maher

        September 25, 2016 at 6:31 am

        Interesting. Was the product you’re referring to perhaps UltraCard? It couldn’t have been *the* HyperCard; Apple wasn’t in the habit of releasing software on rival platforms, much less ones whose very existence they refused to officially acknowledge. Nor would they have released a manual in such atrocious shape. But that sort of thing was unfortunately typical of the Amiga market, which was full of tiny companies without the money or wherewithal to polish their products.

         
        • Brian Holdsworth

          September 26, 2016 at 3:25 pm

          I suspect the product for the Amiga in question was “Can-Do”.

           
          • Brian Holdsworth

            September 26, 2016 at 3:41 pm

            “Can-Do” was put out by a German company called “Inovatronics”, which I can find no information about online. An old Amiga World ad from December, 1992 describes Can-Do 2.0 as “a revolutionary, interactive software authoring environment that lets you take advantage of the Amiga’s sophisticated architecture without any technical knowledge”.

            “Amiga Vision” was a very polished package that actually shipped in box with the Amiga 3000, and possibly other models, at least for a time. According to the impressive, hard-bound, thick manual in front of me, Amiga Vision was developed for Commodore by IMSATT Corporation.

             
          • Bernie

            October 10, 2016 at 8:03 pm

            Brian, I was an avid Amiga user during 1990-1994 and got the OS 2.04 upgrade very early on. I agree with you that we are probably talking about “Can-Do” here, judging from FilfreFan’s description of the obtuse scripting involved, which Can-Do was noted for. It “could do” almost anything, but the scripting was very confusing.

            I used a lot of AmigaVision back then, and it was very solid and much easier than Can-Do. I don’t know if Can-Do was as powerful as HyperCard, but AmigaVision definitely wasn’t. AmigaVision was more of a precursor of modern PowerPoint, with its ability to call other programs from within a presentation. In those days, given how primitive PowerPoint and others still were, and how hard to use Can-Do was, this made AmigaVision seem like magic to Amiga and PC users, but it wasn’t in the same league as HyperCard.

             
  3. Keith Palmer

    September 23, 2016 at 8:58 pm

    I was somewhere between expecting and hoping your next “Macintosh in the late 1980s” post would deal with HyperCard, but this was somehow more impressive an introduction than I’d imagined. I managed to dabble in that program a bit myself when my father bought through work a Macintosh SE/30, a few years before his company standardized on Compaq; he could sometimes carry that “compact Mac” home. As I started learning about all the Infocom-and-otherwise adventure games I hadn’t managed to play and seemingly would never get the chance to, I began working on a sort of “Choose Your Own Adventure” stack where you could stumble into my impression of a wide variety of adventures. Looking back on it I would have to call it “cargo cult programming,” a crude imitation of the real thing as if to try and bring it back through sheer magic, and it only lasted until I really started learning what the term “combinatorial explosion” meant (afterwards, I just wrote some “fake transcripts”). When we also got a Macintosh LC II (not as fondly remembered by the Macintosh cognoscenti as the SE/30) for home, though, I did go through a good number of freeware stacks in the fading days of HyperCard. (A lot of “complete histories” do strike a tone somewhere between elegiac and vehement on how Apple never quite pushed to keep the program up at the cutting edge…)

     
  4. Baldur Björnsson

    September 24, 2016 at 11:46 am

    Aloha Jimmy!

    Super amazing site, I look forward every week to the new articles. As a man who grew up on BBC Micro -> Sinclair Spectrum (first game “the Hobbit”) -> Atari ST (my all-time favorite machine) this is a treasure trove of obscure fun!

    Small note: Rennassance should be Renaissance

    Keep on trucking!
    Blaldur.

     
    • Jimmy Maher

      September 24, 2016 at 12:33 pm

      Thanks!

       
  5. Rob

    September 24, 2016 at 12:47 pm

    Jimmy, I have been reading your blog for years. However, you have somehow managed to condense my thoughts and feelings about being born at the cusp of pre- and post-Internet humanity into one single post. It is truly an amazing time to be alive. Thanks!

     
  6. Alan Kay

    September 24, 2016 at 1:52 pm

    Hi Jimmy

    Very good account in almost all regards!

    A few more details. Part of being an Apple Fellow was to be given a pre-approved budget large enough to fund a small group to do something, and Bill used his to create a 4-5 person group to work on “WildCard” over several years.

    One of his realizations was that indexing should be completely automatic *and* that the author could also put in explicit hyperlinks. Another realization (that the Engelbartians also had realized and shown in the late 60s — you should have mentioned Doug Engelbart, he actually had more real impact by quite a bit than Ted Nelson) was that anything that is like “retrieval” results in a large number of hits that aren’t what you want. So you have to do retrieval very quickly and allow rejection for the next retrieval to be as fast as a human can move their fingers.

    On the way to the WildCard designs, Bill had done a desktop utility for the Mac — called “Rolodex” — that allowed typing free-form on a card, did automatic indexing, and was faster than the humans involved. This was my favorite Mac little app — it was essentially perfect. A real achievement in WildCard to HyperCard was to be able to get very close to the speed of Rolodex in a much more elaborate and useful system.

    Thus HyperCard had the *two* things that are needed, not just the hyperlinking for browsing, but the automatically indexed searching that is absolutely necessary when you have a data base of more than 100 or so items (i.e. browsing mainly works when you have a sense of where you are going, and you need search as the larger covering mechanism).
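
The two mechanisms described above — explicit links for browsing plus automatic indexing for search — can be illustrated with a tiny inverted index, the data structure that turns free-text lookup into a dictionary hit rather than a scan through every card. (This is a hypothetical Python sketch of the general technique, not HyperCard’s or Rolodex’s actual implementation.)

```python
# Minimal sketch of automatic indexing: every word on every "card"
# is indexed as the card is stored, so retrieval never scans the stack.
from collections import defaultdict

class Stack:
    def __init__(self):
        self.cards = []                  # card id -> card text
        self.index = defaultdict(set)    # word -> set of card ids

    def add_card(self, text):
        """Store a card and index every word on it automatically."""
        card_id = len(self.cards)
        self.cards.append(text)
        for word in text.lower().split():
            self.index[word].add(card_id)
        return card_id

    def find(self, *words):
        """Return ids of cards containing all the given words."""
        sets = [self.index[w.lower()] for w in words]
        return sorted(set.intersection(*sets)) if sets else []

stack = Stack()
stack.add_card("Impressionist landscape with boats")
stack.add_card("Baroque portrait")
stack.add_card("Impressionist seascape with boats at dusk")
print(stack.find("impressionist", "boats"))  # -> [0, 2]
```

Because every card is indexed on entry, each successive query costs only a few set intersections — which is what makes the rapid retrieve-reject-retrieve loop described above feasible at human finger speed.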

    When Apple Marketing rejected Bill’s entreaties to put out WildCard, he was going to quit Apple, and came to me — we had a kind of brotherly relationship. I went to John Sculley (who had become a friend as well as the CEO), explained HyperCard to him, and got him to let Bill demonstrate it. John saw right away what the marketing people couldn’t see, and decided to become the champion of HyperCard (via “papal fiats”). My observation was that Apple had gotten broken if the CEO had to override the rest of the company on a terrific product idea.

    John asked for my advice, and my main one was for him to require that HyperCard be made scriptable (WildCard was not; it was what became known as “level 4,” and scripting became “level 5”).

    John decided to go all out funding the packaging for HyperCard, and this involved many dozens of people — maybe as many as 100 — and at least 18 months work to not just polish what Bill and his team had done, but to create the scripting for it (a fabulous job by Dan Winkler and his team), and also to do the many supports for it, including the 30+ sample apps that showed how to make things, the wonderful “help stack” done by Carol Kaehler, commissioning the several “How To” books, and much more.

    The decision to put it out as a free “value added” facility for the Mac was also brilliant. The combination of factors led to enormous adoption and literally millions of scripted stacks by every kind of end-user.

    I was thrilled to see this system solve some of the most important end-user problems in a simplified but hugely useful system that did some of the things that the ARPA/Parc research community had been working on for years (and that wound up causing the Mac).

    However, Apple in a few years was no longer responding to John (or anyone). So the pleas of the former Xerox Parc people at Apple to e.g. put HyperCard on the Internet as a symmetric viewing and authoring medium fell on deaf ears (this was a significant bad turn, because the later WWW conception was long obsolete and quite wrong from the start). Another plea was for a complete redo that would add not just color, but to go to the next steps for objects and user defined objects, especially graphical ones. But: ditto.

    Eventually, this marvelous system was allowed to languish and die.

    It’s worth noting that there is nothing today as well done for general user authoring on the web or in “presentation systems” — in today’s technologies — so in many ways the entire computing milieu has taken a step (or more) backwards from the Hypercard of now almost 30 years ago.

     
    • Jimmy Maher

      September 24, 2016 at 2:37 pm

      Thanks so much for this! I made a small edit to correctly describe how you were able to help Atkinson get HyperCard in front of Sculley.

      I am aware of Doug Engelbart’s status as the “other” father of hypertext, but readability is a big priority for me as a writer, and it’s a little difficult to explain his contributions — important though they were — without getting farther down into the details of implementations than I really wanted to go. But certainly your comment can stand as a fine footnote to correct that omission, and as a springboard for those who want to delve still further. If it’s any consolation, I did try to give Engelbart his due in an earlier article on Xerox PARC: https://www.filfre.net/2013/01/xerox-parc/.

       
      • Alan Kay

        September 24, 2016 at 7:19 pm

        Not to slight or diminish Ted Nelson in any way, but if there is to be only one mention of “computer fathers of hyperlinking” it has to be Doug Engelbart, not Ted (and I think even Ted would agree).

        This is not just because of priority, but of scope. Engelbart’s whole vision was really large and important, and dealt with “what the human race needed” as well as what would be good for organizations, and then for individuals.

        The sorry state we are in today with regard to the tiny amount of usefulness in the www is due to the disinclination of most computer people to (a) deal with big ideas, and (b) to learn about the really big and important ideas in our not so distant past.

         
  7. Bill Maya

    September 24, 2016 at 2:03 pm

    Excellent article (as always), though, like Mr. Kay, I kept waiting for the mention of Douglas Engelbart that never came.

     
  8. AguyinaRPG

    September 24, 2016 at 4:08 pm

    It’s funny that this comes up now because one of my teachers who worked at Apple on Quicktime just talked about Bill Atkinson’s role in the Lisa and Mac (as a programming concept). Given Apple’s huge roster, it’s difficult to know who to credit sometimes (hi Mr. Kay!).

    I also now have some context as to when Hypercard came out because I keep getting confused in the timeline. That’s something I highly appreciate about your approach, Jimmy, you put things into perspective. Hypertext and GUIs seem so interlinked today, but it took a number of years for that real association. I’m looking forward to see how you’ll be relating these ideas to narrative experiences later down the line.

     
  9. Andrew Dalke

    September 24, 2016 at 6:03 pm

    I have been researching the early days of information retrieval. While I can’t speak of all the millions of Atlantic or Life readers, I can say that Bush’s Memex was well known by people in IR, and often cited.

    R. A. Fairthorne in “Automatic Retrieval of Recorded Information”, The Computer Journal (1958), leads with “In 1945 Vannevar Bush wrote a much referred-to paper, “As We May Think”…”. The book “Punched Cards” (1951) says “The “Memex” of Vannevar Bush may be closer than is generally realized”, in such a way that it’s clear the readership will know of the paper. It also refers to Memex in a couple of other places.

    The introduction to “Automatic indexing : a state-of-the-art report” by Mary Elizabeth Stevens (1965), has the line “By 1945, Bush had prophesied Memex”. Later it describes how “The interest aroused among some documentalists by the provocative idea of a “Memex” to record and display associations between ideas as proposed by Bush in 1945 led to specific attempts at Documentation, Inc. in the 1950’s to develop a device which would incorporate at least the associations between indexing terms assigned to documents and between documents with respect to their sharing of common indexing terms.”

    Or, look at the growth of the word “memex” in Google n-grams: https://books.google.com/ngrams/graph?content=memex%2BMemex&year_start=1930&year_end=2000&corpus=15&smoothing=3&share=&direct_url=t1%3B%2C%28memex%20%2B%20Memex%29%3B%2Cc0

    I think it’s fair to say that some of the readers did pay attention to what Bush wrote, and made something of it.

     
    • Alan Kay

      September 24, 2016 at 6:27 pm

      Yes, I agree. One of the factors that kept the mid-40s idea alive through the 50s was that the great sci-fi author Robert Heinlein used “memex” as a generic term in one of his stories, and many of his readers knew that they could often find something interesting if they took the trouble to go to the library to look up the term (that is how I found the Bush articles in the 50s: in the Queensborough Public Library).

      Doug Engelbart had seen the article (I think the Life magazine version) while in the Navy in WWII around age 20, and this got him started thinking about these ideas.

      And, once ARPA IPTO got started ca 1962 (the same time as Engelbart’s first proposal), there was enough general buzz around the East Coast to reach Bush — who had been an MIT professor before the war, and this is where he returned in the 60s — and wrote the essay “Memex Revisited” in 1965 (Google will find a pdf of this).

       
      • Andrew Dalke

        September 25, 2016 at 10:03 am

        Do you recall which Heinlein story that was? Despite being an avid Heinlein reader, nothing comes to mind. In “The Puppet Masters” (1951), the protagonists, ‘Sam’ and ‘Mary’ visit the Library of Congress and use a catalogue to create selector cards, which are then passed to library staff. The staff use mechanical means to deliver “spools” to a room. The protagonists also use spools to record notes, which is the closest thing I saw to a memex.

        However, I didn’t see the word “memex”, and I can’t tell if it’s specifically memex or some other aspect of using microfilm. Microfilm was a hot topic in the 1920s and 1930s. H. G. Wells promoted it in “World Brain” in the late 1930s. Heinlein uses microfilm as a document storage medium in the 1949 novella Gulf, without being coupled to a search mechanism. But there is no mention in Gulf or The Puppet Masters of leaving trails of information.

        I don’t believe the memex ideas faded from the information retrieval field in the 1950s. For example, the title of Taube’s paper on coordinate indexing in American Documentation (1955), “Storage and retrieval of information by means of the association of ideas” is a direct homage to the “association of ideas” from “As We May Think”. It starts by quoting from that essay. (Taube founded Documentation, Inc., which I mentioned above.)

        Garfield’s 1954 ACS Continuation Course, described at http://www.garfield.library.upenn.edu/papers/docorganizofsciinfoy1954.html , gives a good idea of what chemical documentation people were expected to know of the field at that time. This includes “Electronic Computers and Searching Machines: from Univac to Memex” and “Film searching machines” including Bush and Shaw’s Rapid Selector, and improvements like Samain’s Filmorex.

        Garfield started ISI and founded the field of bibliometrics. In “Memex Revisited”, Bush elaborated on the Mendel story and describes how “for thirty years Mendel’s work was lost because of the crudity with which information is transmitted between men. This situation is not improving.” But it *was* improving even in the mid-1960s. ISI was collecting and digitizing that information. In 1970 Garfield expressed his belief that ISI’s Science Citation Index had solved that transmission problem, though psychological reasons could still have prevented its uptake in the late 1800s. See http://garfield.library.upenn.edu/essays/V1p133y1962-73.pdf . I think his belief is correct.

        My specific interest is in Calvin Mooers, who coined the term “information retrieval”, and who is featured in “Dream Machines” as the inventor of TRAC language. He proposed his DOKEN system in 1951, and argued that his superimposed coding method would make it 5,000x faster than the Microfilm Rapid Selector. In his archived papers there’s a letter where he declines to send a copy to Bush because he considers Bush to be a competitor.
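
        Superimposed coding, incidentally, works much like what we now call a Bloom filter: each descriptor is hashed to a few notch positions on a card’s edge, all of a card’s descriptors are OR’d into one bit field, and a search merely checks that a term’s bits are all cut — compact and fast, at the cost of occasional false positives. A rough Python sketch of the general idea (all parameters invented for illustration, not Mooers’ actual code values):

```python
import hashlib

NOTCHES = 40       # notch positions on the card edge (illustrative)
BITS_PER_TERM = 4  # notches cut per descriptor (illustrative)

def code(term):
    """Map a descriptor to a fixed pseudo-random set of notch positions."""
    digest = hashlib.sha256(term.encode()).digest()
    pattern = 0
    for i in range(BITS_PER_TERM):
        pattern |= 1 << (digest[i] % NOTCHES)
    return pattern

def make_card(descriptors):
    """Superimpose (OR together) the codes of all of a card's descriptors."""
    field = 0
    for d in descriptors:
        field |= code(d)
    return field

def may_match(card, term):
    """True if every notch for the term is cut on the card.
    False positives are possible; false negatives are not."""
    c = code(term)
    return card & c == c

card = make_card(["genetics", "peas", "heredity"])
print(may_match(card, "peas"))      # True
print(may_match(card, "heredity"))  # True
```

        The one bit field per card, rather than one field per descriptor, is the source of the claimed speedup: a single pass of needle-sorting (or a single AND-and-compare) tests all of a card’s descriptors at once.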

        I believe his proposals for reactive typewriters – easy-to-use terminals for librarians and other ‘duffers’ – and TRAC were continuations of the scholar’s desk expressed by Bush. (Mooers doesn’t cite a direct influence from “As We May Think”. He would likely say he was thinking of similar ideas before 1945. I think that’s true. His father-in-law was Watson Davis, the documentation and microfilm advocate who founded the American Documentation Institute in the 1930s, and corresponded with Wells in 1937 about microfilm. Mooers started thinking about new library organization systems during the war. But the memex was a more concrete expression of those ideas than anything Mooers had thought of by 1945.)

        To get reactive typewriters for library use, he needed lower-case letters and interoperable character sets on cheap hardware, which is why he joined the X3.2 committee to develop ASCII.

        As another detail to show how things are connected, while Mooers invented the TRAC language, the teenage L Peter Deutsch implemented it.

        Mooers would later complain about computer scientists of the 1960s “inventing” concepts that had been worked out in the 1940s and 1950s for punched card information systems. I can’t help but think that he would complain should someone tell him that the memex idea needed help staying alive in the 1950s, or that little came from Bush’s publication until the 1960s.

        The influence of “As We May Think” on Engelbart is well known. My goal in pointing out these less illustrious people is to show that Engelbart and Nelson weren’t lone flames keeping Bush’s ideas alive, but that many people in IR knew of Bush’s ideas and were developing them further. One of those threads became hypertext, but there were many other threads under active development during the 1950s and 1960s.

         
        • Alan Kay

          September 26, 2016 at 12:02 pm

          Good question about Heinlein. I don’t recall the story (or novel) that “memex” appeared in. And I don’t remember exactly how old I was — older than 12, younger than 17 — when I lived in Queens and used that library.

          And Calvin Mooers was quite a character, to say the least! With some worthy ideas, but hugely protective (what’s a better word than “paranoid”?) of his IP.

           
          • Andrew Dalke

            September 26, 2016 at 2:42 pm

            I would say Mooers was ferociously territorial combined with extremely high self-regard. He thought people, from IBM on down, would come to his door begging to use his ideas, which he developed independently using raw brain power. That’s why he applied for a patent on superimposed coding in the 1940s, but that led to bad feelings among possible allies and customers, and a long and expensive patent opposition from McBee. It also took decades for the patent to be granted. That’s the backstory to the whole TRAC issue starting in the 1960s, which I’m sure you know about; our host might like to read a bit about it in the first issue of Dr. Dobb’s. Over and over again in his life, relationships soured when people started using ideas he developed first, without his permission or without paying him. The solution, he thought, was stronger IP, so that independent people like him had a chance to make a living developing ideas.

            (Apropos Heinlein, Mooers was not only an avid SF reader in the 1950s, but a member of SF fandom.)

            While I could go on for hours, I’m now several steps removed from computer entertainment or this essay, so I’ll stop. :)

             
      • Bernie

        October 10, 2016 at 3:40 pm

        This is really one of Jimmy’s best posts, and the overall quality of his work is evinced by its having motivated none other than Dr. Alan Kay to post several comments and get involved in this discussion.

        Since Jimmy has done all of us the great service of “attracting” someone of Dr. Kay’s stature within the industry, I’d like to take the opportunity to ask for Dr. Kay’s opinion on the following ideas:

        * Can a sort of “continuum” be established, starting with Dr. Bush’s proposals, built on by Dr. Engelbart in the ’60s, PARC in the ’70s, Apple in the ’80s, and reaching maturity with Google in the ’90s?

        * How can Steve Jobs’ role be described, considering that he sponsored both Atkinson and Sculley?

        Infinite thanks to Jimmy and Dr. Kay for such an enriching discussion.

         
  10. Nate

    September 25, 2016 at 1:42 am

    Jimmy, excellent intro to Hypercard. It’s a testament to your quality that someone like Alan Kay would take the time to comment.

    One thing (among many) that the Web lacks is the ability to annotate and build your own paths through pages. Sure, you can do bookmarks, but you can’t select text or an object on a web page, automatically see similar items pop up, and select one to add a bookmark. The Web is still mostly read-only.

    The part about Xanadu I miss most is true micropayments. The credit card has emerged as a means of paying for small things, maybe $10 minimum, but the underlying fees make it unusable for smaller payments. If only I could turn on automatic Patreon for browsing, approving 1 cent per view or something wherever I browse.

    Typo: “strikes me as a amazing”

     
    • Nate

      September 25, 2016 at 1:43 am

      I meant “add a link”, not “add a bookmark”

       
    • Jimmy Maher

      September 25, 2016 at 6:26 am

      Thanks!

      It does strike me that we’ve imported a lot of old-media approaches, like advertising impressions, to the Web instead of devising new ones like micro-payments that could make it a more financially rewarding place for creators.

       
      • Felix

        September 28, 2016 at 5:51 am

        We tried micropayments. People have been trying for decades now. They don’t work, for the same reason paying one cent at a tollbooth on every street corner, or else being barred from getting to the store, would be a terrible way to cover the maintenance costs for the pavement we walk on. You’re right that we’ve imported a lot of old-media approaches… to a medium more akin to the public domain. And it worked out about as well for society as the great enclosure movement of the 18th century.

        As for making the web read-write, um, we have, ever since 1995. It’s called a wiki. And maybe it’s not the kind of pie-in-the-sky, elegant solution academics like to dream of. Neither is the web itself, for that matter. But they *work*. *Right now.* As they have for the past quarter century. They have utterly transformed society despite being these cobbled-together solutions, while the Xanadus of the world are still vaporware, and have been for twice as long. We have search engines. We have news aggregators. There are even services that let you collect links and article summaries into virtual magazines, for future reference. And to the degree tools are still lacking… well, we can always make more — something that would be impossible with a closed-source, monolithic system.

        Welcome to the 21st century.

         
        • Jimmy Maher

          September 28, 2016 at 6:59 am

          Very well said. The practical is often preferable to the ideal.

           
    • Ben Galbraith

      October 1, 2016 at 2:48 pm

      Fwiw, the Brave browser has implemented micropayments this year using Bitcoin.

       
  11. FilfreFan

    September 25, 2016 at 1:52 pm

    Well, that makes perfect sense. HyperBook maybe? It seems that some 25 year old memories simply cannot be relied upon :)

     
  12. Bret

    September 25, 2016 at 7:10 pm

    Thanks, Jimmy. Another book you might be interested in is Belinda Barnet’s “Memory Machines”. It covers not only Engelbart’s and Nelson’s work, but also that of Andy van Dam (HES and FRESS), which is a story that doesn’t get told as much as it should. https://www.amazon.com/dp/0857280600

    Also, Matthias Müller-Prove’s thesis “Vision and Reality of Hypertext and Graphical User Interfaces” provides brief but wide-ranging coverage of a dozen or so influential hypertext systems, and is a good starting point for further study.
    http://mprove.de/diplom/

     
  13. Sam Garret

    September 26, 2016 at 2:42 am

    My Dad used to have a copy of Computer Lib/Dream Machines. I vaguely recall being fascinated by its cartoon covers and not understanding a damn thing in it at a very young age.

    typo watch:

    becoming an adjective (“Medelian”)

    …should probably be “Mendelian”

     
    • Jimmy Maher

      September 26, 2016 at 7:37 am

      Thanks!

       
  14. Jubal

    September 27, 2016 at 11:40 pm

    Having never used HyperCard, does anyone know how a modern system like Twine compares to it?

     
    • Jimmy Maher

      September 28, 2016 at 7:06 am

      Although I understand that both are described as “hypertext authoring tools,” the comparison strikes me as a bit apples and oranges. HyperCard was a full-fledged multimedia authoring environment as well, and as such was much, much bigger in scope, flexibility, and possibility. While it would be possible — perhaps not as easy, but possible — to implement a Twine game using HyperCard, I can’t imagine using Twine to recreate an ambitious HyperCard title like The Manhole (http://www.mobygames.com/game/manhole). Put another way, I can easily imagine someone writing Twine as an extension to run within HyperCard, if HyperCard was still an ongoing thing today.

       
  15. mrob

    September 28, 2016 at 12:20 pm

    “a researcher in Britain named Tim Berners-Lee had started to experiment with something he called the Hypertext Transfer Protocol”

    Tim Berners-Lee is British, but he did his work on HTTP outside of Britain, at CERN on the French/Swiss border.

     
    • Jimmy Maher

      September 28, 2016 at 12:42 pm

      For some reason, I wanted to believe that he was working remotely from Britain. Looking back, though, I have no idea why I thought that. Made a minor edit. Thanks!

       
  16. Dan

    September 29, 2016 at 10:10 pm

    “The product of an opium-fueled hallucination, the 54-line poem is a mere fragment of a much longer work Taylor had intended to write.”

    Taylor should be Coleridge.

     
    • Jimmy Maher

      September 30, 2016 at 7:00 am

      Thanks!

       
  17. Alexander Freeman

    October 7, 2016 at 4:07 pm

    In Xanadu did Kubla Khan
    A stately pleasure-dome decree…

    Sorry, just had to make that reference. Well, actually, this whole Xanadu thing reminds me of something called Lotus Agenda:
    http://homeoftheunderdogs.net/game.php?id=3807
    Apparently the best personal information manager ever made, it too uses associations to help you organize your data. Steep learning curve, though.

     
  18. Johan

    October 11, 2016 at 7:31 pm

    Tiny spelling error: “Atikinson”

     
    • Jimmy Maher

      October 12, 2016 at 7:11 am

      Thanks!

       
  19. ydy

    October 27, 2016 at 4:37 pm

    I think there might be a word missing near “same” in “One of the first examples of same that Mediagenic published was Focal Point”.

     
    • Jimmy Maher

      October 27, 2016 at 5:35 pm

      No, that’s actually acceptable usage and as intended.

       
  20. Wolfeye M.

    September 27, 2019 at 11:16 am

    Odd how things can domino like that. The internet, it could be said, had its seeds planted by a monk messing around with peas.

     
