In October of 1962, Douglas Engelbart imagined a remarkable new technology for writing. Engelbart is a serial visionary; among other things, he’s famous for having invented the computer mouse. But in his report “Augmenting Human Intellect”, he envisioned a typing machine that was also equipped with a special sensing stylus. It would work like this: You’d type away at the machine, composing notes or a raw draft of a piece. But after you’d written a bunch of stuff, you could sit back, read it over, and if you found a passage you wanted to clip out and reproduce, you could just wave the stylus over the words and — presto — they’d be re-typed by your machine.
He was, in essence, imagining a machine that could electronically cut and paste.
Engelbart suspected cut-and-paste would have an enormous impact on the way we’d write. As he predicted:
This writing machine would permit you to use a new process of composing text. For instance, trial drafts could rapidly be composed from re-arranged excerpts of old drafts, together with new words or passages which you stop to type in. Your first draft could represent a free outpouring of thoughts in any order, with the inspection of foregoing thoughts continuously stimulating new considerations and ideas to be entered. If the tangle of thoughts represented by the draft became too complex, you would compile a reordered draft quickly. It would be practical for you to accommodate more complexity in the trails of thought you might build in search of the path that suits your needs.
Pretty amazing foresight, eh? He wrote that 50 years ago — when computers were still room-sized industrial tools — yet he nailed it: One of the biggest impacts of word processing has been the way it makes cutting and pasting a central part of how we organize our thoughts.
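The core operation Engelbart describes — lifting a passage out of a draft and dropping it somewhere else — can be sketched in a few lines of code. This is purely an illustrative toy (the function names and the idea of a draft as a list of passages are my own framing, not anything from Engelbart's report or any real editor):

```python
# A toy model of cut-and-paste: treat a draft as an ordered list of
# passages, and rearranging it as two simple list operations.

def cut(draft, index):
    """Remove the passage at `index` and return it (the 'clipboard')."""
    return draft.pop(index)

def paste(draft, index, passage):
    """Insert a previously cut passage at a new position."""
    draft.insert(index, passage)

draft = ["intro", "digression", "argument", "conclusion"]
clip = cut(draft, 1)      # lift out the digression
paste(draft, 3, clip)     # tack it on at the end instead
print(draft)              # ['intro', 'argument', 'conclusion', 'digression']
```

Trivial as it looks, this is the whole trick: once a draft is data rather than ink, reordering it costs nothing, which is exactly why Engelbart predicted it would change how drafts get composed.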
The funny thing is, cutting and pasting is now so routine that we often forget how strange it felt at first. I’m 42, old enough that I wrote my high-school essays — and even the essays of my first year of college — longhand on paper, then typed them up on a typewriter. The work of arranging and redacting my thoughts was done with a pencil and paper; the typewriter existed mostly just as a way to produce a good-looking final draft, though I’d occasionally buff or improve a sentence as I was typing it up. (Though I wouldn’t edit too much; if my attempts to tweak the sentence while typing made things worse, I’d have to laboriously white out my screwed-up text with Liquid Paper, a substance beyond foul.)
When I first got my hands on a word processor, it felt absolutely uncanny: The words! They’re … they’re moving around! THEY LOOK LIKE PRINTED WORDS BUT THEY’RE MOVING AROUND. But pretty quickly I grasped the new style of composition that was possible, and I loved it. Precisely as Engelbart envisioned, I could write longer, more discursive drafts, letting my thoughts wander into ever-more-creative-or-weirder nooks, and taking arguments to their logical endpoint just to see where they’d lead. I could give myself mental permission to do this because it was easy to redact the best parts into my final essay. Robert Frost talked about how he couldn’t tell what a poem was going to be about until he’d finished writing it. That’s what word processors did to my academic and journalistic writing: As the mechanical act of writing became easier, it became easier to write prodigiously as a way of sussing out my own thoughts.
It’s hard to remember now, but many people back in the 80s totally freaked out about word processing. I recall professors worrying that it would make students write more sloppily, and even think more sloppily. The fluidity of cutting and pasting seemed intellectually suspicious. I even remember one of my TAs arguing — in a lovely foreshadowing of today’s fears that “the Internet is making us stupid” — that cutting and pasting would render our generation unable to craft a coherent argument, because the sheer slipperiness of digital prose, its slithy rearrangeability, would render our ideas and prose rootless, nonsequential, and flighty.
Mind you, they weren’t entirely wrong. Cut-and-paste poses cognitive risks that plague me even today. Sometimes when I’m working on a story, I’ll cut and paste so many bloated passages from white papers and interviews into my “research file” that it eventually metastasizes to the length of Infinite Jest, and becomes completely useless. (Fascinatingly, some scientific studies (PDF link) have found that the most high-performing students resist this impulse: During research, they’re more judicious in their use of cut-and-paste than their lower-performing peers.) And I also find there are times when I need to step away from word processing. When I’m blocked on a piece of writing — particularly when I need to do big-picture structural thinking about the shape of a long article — I often reach for a pencil and huge piece of paper, so I can diagram the flow. (And hey: There are word-processing holdouts even more hard-core, like the excellent sci-fi novelist Joe Haldeman, who writes not only exclusively in longhand but by the light of oil lamps.)
It’s an interesting question either way. How has the word processor changed the way we think? How has it changed the way you think?
I'm Clive Thompson, a writer on science, technology, and culture. This blog collects bits of offbeat research I'm running into, and musings thereon.
Currently, I'm a contributing writer for the New York Times Magazine and a columnist for Wired magazine. I also write for Fast Company and Wired magazine's web site, among other places. Email or AOL IM me (pomeranian99) to say hi or send in something strange!