For the Texas Library Association Annual Conference, closing luncheon,
George R. Brown Convention Center, Ballroom A, Noon, Friday, April 26, 1996
by John H. Lienhard
Mechanical Engineering Department
University of Houston
Houston, TX 77204-4792
jhl@uh.edu
You librarians and we engineers share a huge commonality in our work. We both deal with the constructed world. We are technologists, if I may use a word that takes a terrible beating. I deal in a world built of steam and steel. You deal in a world built of paper and electronics. And that's why we speak a common language.
You and I both know what too many people don't understand -- that we and our technologies drive one another. We form and shape each other. It's hard to find any line between what we are and the machines around us, because technology and culture are the same thing. And for that reason, you and I both live very close to cultural bedrock.
And you will surely understand when I say that, for a technology to succeed, it must have a seldom-talked-about quality. For a technology to become a part of our lives, it must also be a part of our metaphorical substrate. Look closely at our cultural metaphors and you'll find most are technological. Take clocks: The circular face of a sundial, with its shadow moving left to right, was copied straight onto the faces of water clocks. Water clocks used a float in a steadily draining tank to tell time. But that float drove gears, and the gears drove hands around a dial.
Then, around AD 1300, the tick-tock mechanical escapement radically improved clock accuracy. It made clocks smaller and cheaper. But, changed as they were, clocks still had dials, bells, and gears. Medieval writers had almost nothing to say about the new mechanism inside, so historians still aren't exactly sure when that change took place.
You see, the outward form, the clock face, could not change, because that's where the metaphor was expressed.
Around 1920, another radical change took place in clocks: This time, electrical timing elements used the steady oscillation of alternating current to replace the mechanical escapement. Accuracy took another leap forward. But clocks still looked the same.
Now quartz crystals confuse the issue again. My desk clock not only has the circular face of a sundial or a water clock; it also has a second hand that moves in little jumps -- as though it were controlled by an escapement mechanism. Designers understand, on a visceral level, that the meeting ground between user and machine should never change any more than it has to.
So what about digital clocks? They offer a more precise readout than analog clocks do. They're easier for children to read. Linear time -- time as a sequence of rising numbers. That's pure simplicity. Of course it's simplicity in the same way a tree is simpler than a forest.
A circular dial paints a picture of Earth's rotation. It models our own experience of passing time. It's a lovely analog of reality. In a digital display, night never falls. Time just advances, without features, minute after minute.
And so the competition between analog and digital readout might seem to be in balance. But! What do most of you wear on your wrists? The fascinating truth is, the digital clock has already lost in that competition. The digital clock simply can't compete with the metaphorical power and visual grace of the circling motion of an analog face.
Many technologies look good for a while; then they get left -- Betamax, slide rules, dirigibles, LP's, autogyros, and digital clocks as well. So what does survive, and why?
If you want to predict the death or survival of a technology, you certainly ask if it's functional. But that's never enough by itself. You have to ask if it's a metaphor for something more than function. Only after a technology has touched us in that deep visceral and emotional place will it find a way to persist from one generation to the next.
And so we come to another technology -- to the book. Its story began in Pergamon -- then one of the largest cities in the world. Now it's called Bergama. It's in western Turkey -- south of Istanbul and north of Izmir. It sits on a hill, 16 miles from the Aegean Sea.
Pergamon became capital of the Attalid dynasty after 280 BC. It was one of two great centers in the cosmopolitan world that formed after Alexander died. The other was Alexandria, in Egypt. The Attalids took their name from King Attalus, who reigned till just after 200 BC.
Attalus began an artistic renaissance in Pergamon. His son, Eumenes, continued it. Eumenes set out to build the greatest library in the world. He meant to outdo the famous library in Alexandria.
What followed was the stuff of black comedy. His soldiers ranged the land stealing books. Book lovers buried what they could in secret hiding places. Pergamon scribes forged manuscripts. The library grew to 200,000 volumes.
Egypt didn't take all that lying down. She quit supplying papyrus to Pergamon. That could've ended Pergamon's pretensions. But Pergamon scholars had an ace up their sleeve. They had a rich wool industry. They had plenty of sheep. They'd already begun writing on sheepskin, or vellum. They called the stuff Charta Pergamene. That meant paper of Pergamon. The words Charta Pergamene mutated into parchment.
It's harder to roll parchment into a scroll than it is papyrus. So someone thought of folding parchment into rectangular pages and sewing those gatherings together. Someone invented the codex -- the modern book.
Soon after that, both Pergamon and Egypt fell under Roman control. Then, in 48 BC, Roman soldiers in Egypt accidentally burned part of Alexandria's library. Antony, in his obsessive love for Cleopatra, did a remarkable thing. To repay the loss he gave her the Pergamon Library.
So we remember Alexandria and forget Pergamon. But their brief competition changed human history. Pergamon had given us the most efficient information storage technology ever known.
This was one of the few times a new user interface was good enough to change the technological metaphor. Bear in mind that the scroll still survives, even to this day, as its own technological metaphor. But the book -- the codex -- became metaphor unto itself. It well may be the most powerful technological metaphor of them all.
Once a technology finds that place of metaphor in our psyche, its outward form will survive. The user interface will not be given up.
Remember what happened when Gutenberg began printing with movable metal type. He made print look just like the work of scribes. He was counterfeiting manuscript books. It often takes a trained eye to tell an early printed book from a manuscript book. And books today still keep most of those features. We still fold pages into gatherings, sew gatherings together, and lace them between hard covers. Movable metal type made books cheap and abundant. Yet we still receive information the same way readers did in Pergamon, 2000 years ago.
When I work at my personal computer, I use what's clearly recognizable as a typewriter keyboard. That arrangement's over a century old. Once more, the place where I meet the machine, for all its imperfect arrangement, is old and well loved. It simply will not be abandoned.
Friends ask me, over and over, "How much change will we have to undergo?" The answer is that where the user meets the machine is the one place we will not tolerate change -- even though the machine itself is mutating into something so different as to redirect human history.
We do indeed bend ourselves to each new machine. But where the machine has become metaphor is where that process stops. Think about pianos: They evolved from harpsichord improvements. But they were soon something wholly different. Pianos are so different from harpsichords that you still need a harpsichord when you want to hear harpsichord music.
Many technologies survive their replacements that way. Live concerts have survived recordings. Pens survive word processors.
In an interview taped in 1948, H.L. Mencken answered a question that might not occur to us today. "What do you think of the way newspapers are getting into the new medium of television?"
In his answer, Mencken quoted Gresham's law, "Bad money drives out the good." A fine restaurant goes downhill when it tries to compete with the lucrative hot-dog stand across the street.
Newspapers, he says, do what they do very well. TV is a new technology, utterly different from newspapers. It won't replace them. Instead it'll go off in new directions. If newspaper people let themselves be suckered into the TV business, they'll surely lose the ability to put out good newspapers.
Mencken's disdains often led him astray, but not here. Nearly fifty years ago, we feared that TV would replace newspapers. And, of course, it never did. Newspapers may no longer fit into our lives quite the way they did in 1948. TV has certainly captured some of their old functions. But there's too much it cannot replace. The papers let you sift the details and re-read what you missed. They let you work a crossword puzzle.
People ask me about the electronic book when the computer has already leap-frogged that technology. Before a decent electronic book could come into existence, the World-Wide Web was already on its way to providing everything we might ever hope to get from one: Screen resolution and illustrations are improving; the supply of texts is rocketing upward. And now we have both sound and motion.
Electronic information, freed of the paper book's limitations, is already completely different in character from what a paper book holds. And that's exactly why the paper book will have to survive, after all. Paper books will keep right on doing what they've always done so well. They take you into the author's mind. You give yourself over to her story-telling rhythm.
Your own mind frames the pictures and plays the music. You feel organic cloth and paper against your fingers. What the computer offers has as much in common with the paper book as the horseless carriage has in common with the horse.
That's fine, many people tell me, but they still wonder what paper books have that computers won't soon have as well. If you fix the screen, fix the portability, find means for dog-earing your place, then what's left to fix?
Well, the answer lies in the metaphor. Not only has the book long since found its metaphorical place in our lives; the computer has already found its metaphorical place as well. The book is our metaphorical mentor. The computer is our metaphorical servant.
We all switch between the roles of parent and child. We need some control over things around us. But we also need to submit to other people's knowledge. In some things, we should play the parent. In others, we'd better know how to be a child.
And the child says, "Tell me a story." The story we choose might be a Gothic novel. It might be a math textbook. In either case we have to give ourselves over to the story-teller if we hope to profit from the story. We do that when we read a book, go to the theater, even listen to a concert.
Computer communications are quite another matter. Once we master a computer, it does our bidding. We say, "Go and do. Buy me an airplane ticket. Give me a stock quotation. Tell me if the library has a book. Pass this message to a friend." The computer dances to our tune. We are in control.
When you and I go to the computer for text material, it's to look things up. It's not to let words wash over us nor to touch and feel paper. The computer is far better than a book if you want to find things.
Insofar as paper books function as simple repositories of fact, they've already given way to computers. But the sort of book we submit ourselves to will have to remain written out and uncontrollable.
It's an important omen that the very first books to appear on computers were ones that offered readers control -- ones that let you dictate the course of the plot.
To learn, we become as children. We seek out our own ignorance. Now and then we follow the mind of someone who knows what we do not. We yield to the rhythm of the story-teller.
Printed books let us put control aside for a while. That's the wonderful gift books offer. But the metaphor of the computer has already been set. Whatever we can do with electronic media, we simply will not use them as mentors.
Now, the dark side of all this: The emergence of a new technological metaphor means revolution, and revolution means trouble. Each major communication revolution has brought a new metaphorical substratum into our lives. But it's also brought with it terrible upheavals.
Let me trace a couple of those revolutions so you can see what I'm talking about. First, try the technology of writing itself. Let me ask: Is language about words? You see, there's a vast gulf between speech and writing. Breaking speech into words doesn't become really useful until we write language down.
A librarian friend chides my attempts to pronounce French. "John," he says, "You have to understand there're no words in spoken French, only phrases." His subtle point is, the way we cast speech into words is pretty arbitrary.
When Mark Twain's Connecticut Yankee uttered a bogus magic spell at King Arthur's court, he used a gigantic German word: Konstantinopolitanischerdudelsackspfeifenmachersgesellschaft. That means an organization of bagpipe-makers from Constantinople. Now you tell me: is that seven words or just one?
Egyptians, who did the first hieroglyphic writing, credited its invention to the ibis-headed god, Thoth. They pictured him writing with a reed pen. The Hindu god Brahma supposedly based letters on the shape of seams in the human skull. But: By the time the Old Testament took form, we took writing for granted. The Bible no longer treated writing as a gift from God.
That's because the old hieroglyphic languages had mystic meanings that lay far from human speech. Pictures aren't the same as words. Early writing conveyed a sense of things, quite apart from speech.
Only gradually did we reduce speech directly to writing. To do that, we identified words as the least parts of speech with stand-alone meanings. The problem is, that doesn't work consistently.
For example: The word linger means to tarry. The preposition on means many things. If we say "The melody lingers on," we call out a small additional meaning. A person lingers, but a melody or an odor attaches itself to us. It lingers on. So: Is "lingers on" one word or two?
Signing for the deaf is a form of expression remarkable for the way it blends words into continuous action. If you've ever watched a dancer incorporate signing, you've seen, dramatically, how artificial it is to break speech into separate words.
And a great trap opens before us. The linearity of written language can cloud our minds to the multidimensionality of human thought. Many of us have a hard time thinking without recourse to words. Hamlet, asked what he read, replied hopelessly, "Words, words, words." Imagination is far too complex to be hogtied to anything so limited.
So the very act of writing started a powerful shift in the very nature of human consciousness, but the worst was yet to come. The real fun began with the invention of the alphabet. By the way, the Greek word for alphabet is stoicheia. That's where chemists get the word stoichiometry -- the science of combining chemical elements. For them, letters of the alphabet were the minimal elements of speech.
Early Sumerian cuneiform, in use 5000 years ago, had only some 300 characters. It lacked anything like the full expressivity of speech. Yet it evoked things that speech could not.
The invention of the alphabet was begun by Semitic scribes in the second millennium BC and brought to maturity by the Phoenicians in the 11th century BC. Alphabets now transcribed speech directly. All alphabets are phonetic. They reduce speech to its least divisible elements -- to its stoicheia -- to its atoms.
For 2000 years before the invention of the alphabet, writing gave us means for storing knowledge, but it stored it much as an etching or woodcut might. Then all that changed.
And the result was catastrophic. Psychologist Julian Jaynes has pointed out that it was just at this time -- before 1000 BC -- that humans developed analytical consciousness. In popular terms, we became very left-brain in our thinking. What followed was terrible social upheaval.
Without the older and more mystical means of dealing with human behavior, leaders instituted the systematic use of cruelty. They took up slavery. Knowledge was once mystery; now it became power. We struck new poses of masculine domination.
Once writing turned into canned speech, we had means for watching ourselves think. In the long run that led to mathematics, philosophy, and literature. Perhaps the first great literature it produced was the Book of Genesis, which begins by telling how we'd eaten the fruit of new and forbidden knowledge.
But don't for a moment forget the damage it did to us. A mid-19th-century philologist, Henry Humphreys, saw the impact of the shift long before Jaynes did. In 1853 he wrote,
From the invention of letters the machinations of the human heart began to operate; falsity and error increased; litigation and prisons had their beginnings, as [did] specious and artful language which causes so much confusion in the world.
Alphabets altered human consciousness in wonderful and terrible ways. And the new printing presses of the 15th century altered it again. By the 16th century they'd shifted our thinking to the external world. They'd also called up all that was evil in the old Classical world. We call that humanism. In fact it meant revivals of male dominance, of slavery, and of a kind of egocentricity that had been under control in the Medieval world.
Now computers are once more attacking the very metaphors for thought. They're providing us with inconceivable access to information. But what is the price we pay? I'll name just three domains of mischief: pointillism, memory, and spatial visualization.
First, pointillism: Computers do an odd thing with knowledge. Ask a question and, in a blink, they highlight the precise answer -- the citation, the definition. You're handed the answer with no context.
I've learned so much in the process of looking up something else -- adjacent pages in a dictionary, sidetracks in books. With the computer, context becomes an avoidable waste of time. And that means far greater loss than we first imagine.
Next, memory: When I used a slide rule, I had to do a lot of the calculation in my head. That meant memorizing decimal placements and roughing out the calculation as I went along. Now that dimension of thought is wholly gone. Once I had to memorize spelling; now the machine does that for me as well.
The problem is that creativity is recognizing a fact, an idea, out of context. Our dusty attic of randomly remembered stuff -- names, dates, lyrics and melody -- is what creativity feeds upon. Piece by piece, the computer is robbing us of that legacy.
And finally, spatial visualization: We've built the rules of perspective, geometry, and mathematical graphing into our computers. What we once did in our heads, the computer now does for us. We are simply handed the result on a two-dimensional screen.
Did you know that the builders of the great Gothic cathedrals didn't even have working drawings? They built in their minds and then rendered in stone. Their achievement took an enormous capacity for seeing in their mind's eye. That seems so impossible to us that we're hard-pressed to believe it ever really happened.
So the book will remain, but we users and keepers of books are being changed. For the metaphors we live by are being rewritten by this new technology. The electronic media are unthreading the culture we know. They are both serving and disrupting the human condition in ways we cannot yet conceive.
And you are the people who, more than anyone in our society, sit in the eye of this new hurricane. I am hard-pressed to conceive of a more exciting line of work to be in, as we enter the third millennium.
Rohr, R.R.J., Sundials: History, Theory, and Practice. Toronto: University of Toronto Press, undated.
Marshall, R.K., Sundials. New York: The Macmillan Company, 1963.
Gould, S.J., "The Panda's Thumb of Technology," Bully for Brontosaurus: Reflections in Natural History. New York: W.W. Norton & Company, 1991, Chapter 4.
Sarton, G., Galen of Pergamon. Lawrence, KS: University of Kansas Press, 1954, Chapter II.
Hansen, E.V., The Attalids of Pergamon. 2nd ed. Ithaca, NY: Cornell University Press, 1971.
Miller, G.A., The Science of Words. New York: Scientific American Library, 1991.
Ogg, O., The 26 Letters. New York: The Thomas Y. Crowell Company, 1948; revised 1961.
See also the Encyclopaedia Britannica article on writing.
H.L. Mencken Speaking. Caedmon (an imprint of Harper Audio), SWC 1082, 1957.