Prologue
I love style manuals. Ever since I was assigned Strunk and White’s The Elements of Style in an introductory psychology course, the writing guide has been among my favorite literary genres. It’s not just that I welcome advice on the lifelong challenge of perfecting the craft of writing. It’s also that credible guidance on writing must itself be well written, and the best of the manuals are paragons of their own advice. William Strunk’s course notes on writing, which his student E. B. White turned into their famous little book, were studded with gems of self-exemplification such as “Write with nouns and verbs,” “Put the emphatic words of a sentence at the end,” and best of all, his prime directive, “Omit needless words.” Many eminent stylists have applied their gifts to explaining the art, including Kingsley Amis, Jacques Barzun, Ambrose Bierce, Bill Bryson, Robert Graves, Tracy Kidder, Stephen King, Elmore Leonard, F. L. Lucas, George Orwell, William Safire, and of course White himself, the beloved author of Charlotte’s Web and Stuart Little. Here is the great essayist reminiscing about his teacher:
I like to read style manuals for another reason, the one that sends botanists to the garden and chemists to the kitchen: it’s a practical application of our science. I am a psycholinguist and a cognitive scientist, and what is style, after all, but the effective use of words to engage the human mind? It’s all the more captivating to someone who seeks to explain these fields to a wide readership. I think about how language works so that I can best explain how language works.
But my professional acquaintance with language has led me to read the traditional manuals with a growing sense of unease. Strunk and White, for all their intuitive feel for style, had a tenuous grasp of grammar.2 They misdefined terms such as phrase, participle, and relative clause, and in steering their readers away from passive verbs and toward active transitive ones they botched their examples of both. There were a great number of dead leaves lying on the ground, for instance, is not in the passive voice, nor does The cock’s crow came with dawn contain a transitive verb. Lacking the tools to analyze language, they often struggled when turning their intuitions into advice, vainly appealing to the writer’s “ear.” And they did not seem to realize that some of the advice contradicted itself: “Many a tame sentence . . . can be made lively and emphatic by substituting a transitive in the active voice” uses the passive voice to warn against the passive voice. George Orwell, in his vaunted “Politics and the English Language,” fell into the same trap when, without irony, he derided prose in which “the passive voice is wherever possible used in preference to the active.”3
Self-contradiction aside, we now know that telling writers to avoid the passive is bad advice. Linguistic research has shown that the passive construction has a number of indispensable functions because of the way it engages a reader’s attention and memory. A skilled writer should know what those functions are and push back against copy editors who, under the influence of grammatically naïve style guides, blue-pencil every passive construction they spot into an active one.
Style manuals that are innocent of linguistics are also crippled in dealing with the aspect of writing that evokes the most emotion: correct and incorrect usage. Many style manuals treat traditional rules of usage the way fundamentalists treat the Ten Commandments: as unerring laws chiseled in sapphire for mortals to obey or risk eternal damnation. But skeptics and freethinkers who probe the history of these rules have found that they belong to an oral tradition of folklore and myth. For many reasons, manuals that are credulous about the inerrancy of the traditional rules don’t serve writers well. Although some of the rules can make prose better, many of them make it worse, and writers are better off flouting them. The rules often mash together issues of grammatical correctness, logical coherence, formal style, and standard dialect, but a skilled writer needs to keep them straight. And the orthodox stylebooks are ill equipped to deal with an inescapable fact about language: it changes over time. Language is not a protocol legislated by an authority but rather a wiki that pools the contributions of millions of writers and speakers, who ceaselessly bend the language to their needs and who inexorably age, die, and get replaced by their children, who adapt the language in their turn.
Yet the authors of the classic manuals wrote as if the language they grew up with were immortal, and failed to cultivate an ear for ongoing change. Strunk and White, writing in the early and middle decades of the twentieth century, condemned then-new verbs like personalize, finalize, host, chair, and debut, and warned writers never to use fix for “repair” or claim for “declare.” Worse, they justified their peeves with cockamamie rationalizations. The verb contact, they argued, is “vague and self-important. Do not contact people; get in touch with them, look them up, phone them, find them, or meet them.” But of course the vagueness of to contact is exactly why it caught on: sometimes a writer doesn’t need to know how one person will get in touch with another, as long as he does so. Or consider this head-scratcher, concocted to explain why a writer should never use a number word with people, only with persons: “If of ‘six people’ five went away, how many people would be left? Answer: one people.” By the same logic, writers should avoid using numbers with irregular plurals such as men, children, and teeth (“If of ‘six children’ five went away . . .”).
In the last edition published in his lifetime, White did acknowledge some changes to the language, instigated by “youths” who “speak to other youths in a tongue of their own devising: they renovate the language with a wild vigor, as they would a basement apartment.” White’s condescension to these “youths” (now in their retirement years) led him to predict the passing of nerd, psyched, ripoff, dude, geek, and funky, all of which have become entrenched in the language.
The graybeard sensibilities of the style mavens come not just from an underappreciation of the fact of language change but from a lack of reflection on their own psychology. As people age, they confuse changes in themselves with changes in the world, and changes in the world with moral decline—the illusion of the good old days.4 And so every generation believes that the kids today are degrading the language and taking civilization down with it:5
The common language is disappearing. It is slowly being crushed to death under the weight of verbal conglomerate, a pseudospeech at once both pretentious and feeble, that is created daily by millions of blunders and inaccuracies in grammar, syntax, idiom, metaphor, logic, and common sense. . . . In the history of modern English there is no period in which such victory over thought-in-speech has been so widespread.—1978
Recent graduates, including those with university degrees, seem to have no mastery of the language at all. They cannot construct a simple declarative sentence, either orally or in writing. They cannot spell common, everyday words. Punctuation is apparently no longer taught. Grammar is a complete mystery to almost all recent graduates.—1961
From every college in the country goes up the cry, “Our freshmen can’t spell, can’t punctuate.” Every high school is in disrepair because its pupils are so ignorant of the merest rudiments.—1917
The vocabularies of the majority of high-school pupils are amazingly small. I always try to use simple English, and yet I have talked to classes when quite a minority of the pupils did not comprehend more than half of what I said.—1889
Unless the present progress of change [is] arrested . . . there can be no doubt that, in another century, the dialect of the Americans will become utterly unintelligible to an Englishman.—1833
Our language (I mean the English) is degenerating very fast. . . . I begin to fear that it will be impossible to check it.—1785
Complaints about the decline of language go at least as far back as the invention of the printing press. Soon after William Caxton set up the first one in England in 1478, he lamented, “And certaynly our langage now vsed veryeth ferre from that whiche was vsed and spoken when I was borne.” Indeed, moral panic about the decline of writing may be as old as writing itself:
[Cartoon: Non Sequitur © 2011 Wiley Ink, Inc. Dist. by Universal Uclick. Reprinted with permission. All rights reserved.]
The cartoon is not much of an exaggeration. According to the English scholar Richard Lloyd-Jones, some of the clay tablets deciphered from ancient Sumerian include complaints about the deteriorating writing skills of the young.6
My discomfort with the classic style manuals has convinced me that we need a writing guide for the twenty-first century. It’s not that I have the desire, to say nothing of the ability, to supplant The Elements of Style. Writers can profit by reading more than one style guide, and much of Strunk and White (as it is commonly called) is as timeless as it is charming. But much of it is not. Strunk was born in 1869, and today’s writers cannot base their craft exclusively on the advice of a man who developed his sense of style before the invention of the telephone (let alone the Internet), before the advent of modern linguistics and cognitive science, before the wave of informalization that swept the world in the second half of the twentieth century.
A manual for the new millennium cannot just perpetuate the diktats of earlier manuals. Today’s writers are infused by the spirit of scientific skepticism and the ethos of questioning authority. They should not be satisfied with “That’s the way it’s done” or “Because I said so,” and they deserve not to be patronized at any age. They rightly expect reasons for any advice that is foisted upon them.
Today we can provide the reasons. We have an understanding of grammatical phenomena which goes well beyond the traditional taxonomies based on crude analogies with Latin. We have a body of research on the mental dynamics of reading: the waxing and waning of memory load as readers comprehend a passage, the incrementing of their knowledge as they come to grasp its meaning, the blind alleys that can lead them astray. We have a body of history and criticism which can distinguish the rules that enhance clarity, grace, and emotional resonance from those that are based on myths and misunderstandings. By replacing dogma about usage with reason and evidence, I hope not just to avoid giving ham-fisted advice but to make the advice that I do give easier to remember than a list of dos and don’ts. Providing reasons should also allow writers and editors to apply the guidelines judiciously, mindful of what they are designed to accomplish, rather than robotically.
“The sense of style” has a double meaning. The word sense, as in “the sense of sight” and “a sense of humor,” can refer to a faculty of mind, in this case the faculties of comprehension that resonate to a well-crafted sentence. It can also refer to “good sense” as opposed to “nonsense,” in this case the ability to discriminate between the principles that improve the quality of prose and the superstitions, fetishes, shibboleths, and initiation ordeals that have been passed down in the traditions of usage.
The Sense of Style is not a reference manual in which you can find the answer to every question about hyphenation and capitalization. Nor is it a remedial guide for badly educated students who have yet to master the mechanics of a sentence. Like the classic guides, it is designed for people who know how to write and want to write better. This includes students who hope to improve the quality of their papers, aspiring critics and journalists who want to start a blog or column or series of reviews, and professionals who seek a cure for their academese, bureaucratese, corporatese, legalese, medicalese, or officialese. The book is also written for readers who seek no help in writing but are interested in letters and literature and curious about the ways in which the sciences of mind can illuminate how language works at its best.
My focus is on nonfiction, particularly genres that put a premium on clarity and coherence. But unlike the authors of the classic guides, I don’t equate these virtues with plain words, austere expression, and formal style.7 You can write with clarity and with flair, too. And though the emphasis is on nonfiction, the explanations should be useful to fiction writers as well, because many principles of style apply whether the world being written about is real or imaginary. I like to think they might also be helpful to poets, orators, and other creative wordsmiths, who need to know the canons of pedestrian prose to flout them for rhetorical effect.
People often ask me whether anyone today even cares about style. The English language, they say, faces a new threat in the rise of the Internet, with its texting and tweeting, its email and chatrooms. Surely the craft of written expression has declined since the days before smartphones and the Web. You remember those days, don’t you? Back in the 1980s, when teenagers spoke in fluent paragraphs, bureaucrats wrote in plain English, and every academic paper was a masterpiece in the art of the essay? (Or was it the 1970s?) The problem with the Internet-is-making-us-illiterate theory, of course, is that bad prose has burdened readers in every era. Professor Strunk tried to do something about it in 1918, when young Elwyn White was a student in his English class at Cornell.
What today’s doomsayers fail to notice is that the very trends they deplore consist in oral media—radio, telephones, and television—giving way to written ones. Not so long ago it was radio and television that were said to be ruining the language. More than ever before, the currency of our social and cultural lives is the written word. And no, not all of it is the semiliterate ranting of Internet trolls. A little surfing will show that many Internet users value language that is clear, grammatical, and competently spelled and punctuated, not just in printed books and legacy media but in e-zines, blogs, Wikipedia entries, consumer reviews, and even a fair proportion of email. Surveys have shown that college students are writing more than their counterparts in earlier generations did, and that they make no more errors per page of writing.8 And contrary to an urban legend, they do not sprinkle their papers with smileys and instant-messaging abbreviations like IMHO and L8TR, any more than previous generations forgot how to use prepositions and articles out of the habit of omitting them from their telegrams. Members of the Internet generation, like all language users, fit their phrasing to the setting and audience, and have a good sense of what is appropriate in formal writing.
Style still matters, for at least three reasons. First, it ensures that writers will get their messages across, sparing readers from squandering their precious moments on earth deciphering opaque prose. When the effort fails, the result can be calamitous—as Strunk and White put it, “death on the highway caused by a badly worded road sign, heartbreak among lovers caused by a misplaced phrase in a well-intentioned letter, anguish of a traveler expecting to be met at a railroad station and not being met because of a slipshod telegram.” Governments and corporations have found that small improvements in clarity can prevent vast amounts of error, frustration, and waste,9 and many countries have recently made clear language the law of the land.10
Second, style earns trust. If readers can see that a writer cares about consistency and accuracy in her prose, they will be reassured that the writer cares about those virtues in conduct they cannot see as easily. Here is how one technology executive explains why he rejects job applications filled with errors of grammar and punctuation: “If it takes someone more than 20 years to notice how to properly use it’s, then that’s not a learning curve I’m comfortable with.”11 And if that isn’t enough to get you to brush up your prose, consider the discovery of the dating site OkCupid that sloppy grammar and spelling in a profile are “huge turn-offs.” As one client said, “If you’re trying to date a woman, I don’t expect flowery Jane Austen prose. But aren’t you trying to put your best foot forward?”12
Style, not least, adds beauty to the world. To a literate reader, a crisp sentence, an arresting metaphor, a witty aside, an elegant turn of phrase are among life’s greatest pleasures. And as we shall see in the first chapter, this thoroughly impractical virtue of good writing is where the practical effort of mastering good writing must begin.
Chapter 1
GOOD WRITING
REVERSE-ENGINEERING GOOD PROSE AS THE KEY TO DEVELOPING A WRITERLY EAR
“Education is an admirable thing,” wrote Oscar Wilde, “but it is well to remember from time to time that nothing that is worth knowing can be taught.”1 In dark moments while writing this book, I sometimes feared that Wilde might be right. When I polled some accomplished writers about which style manuals they had consulted during their apprenticeships, the most common answer I got was “none.” Writing, they said, just came naturally to them.
I’d be the last to doubt that good writers are blessed with an innate dose of fluency with syntax and memory for words. But no one is born with skills in English composition per se. Those skills may not have come from stylebooks, but they must have come from somewhere.
That somewhere is the writing of other writers. Good writers are avid readers. They have absorbed a vast inventory of words, idioms, constructions, tropes, and rhetorical tricks, and with them a sensitivity to how they mesh and how they clash. This is the elusive “ear” of a skilled writer—the tacit sense of style which every honest stylebook, echoing Wilde, confesses cannot be explicitly taught. Biographers of great authors always try to track down the books their subjects read when they were young, because they know these sources hold the key to their development as writers.
I would not have written this book if I did not believe, contra Wilde, that many principles of style really can be taught. But the starting point for becoming a good writer is to be a good reader. Writers acquire their technique by spotting, savoring, and reverse-engineering examples of good prose. The goal of this chapter is to provide a glimpse of how that is done. I have picked four passages of twenty-first-century prose, diverse in style and content, and will think aloud as I try to understand what makes them work. My intent is not to honor these passages as if I were bestowing a prize, nor to hold them up as models for you to emulate. It’s to illustrate, via a peek into my stream of consciousness, the habit of lingering over good writing wherever you find it and reflecting on what makes it good.
Savoring good prose is not just a more effective way to develop a writerly ear than obeying a set of commandments; it’s a more inviting one. Much advice on style is stern and censorious. A recent bestseller advocated “zero tolerance” for errors and brandished the words horror, satanic, ghastly, and plummeting standards on its first page. The classic manuals, written by starchy Englishmen and rock-ribbed Yankees, try to take all the fun out of writing, grimly adjuring the writer to avoid offbeat words, figures of speech, and playful alliteration. A famous piece of advice from this school crosses the line from the grim to the infanticidal: “Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it—wholeheartedly—and delete it before sending your manuscript to press. Murder your darlings.”2
An aspiring writer could be forgiven for thinking that learning to write is like negotiating an obstacle course in boot camp, with a sergeant barking at you for every errant footfall. Why not think of it instead as a form of pleasurable mastery, like cooking or photography? Perfecting the craft is a lifelong calling, and mistakes are part of the game. Though the quest for improvement may be informed by lessons and honed by practice, it must first be kindled by a delight in the best work of the masters and a desire to approach their excellence.
In the opening lines of Richard Dawkins’s Unweaving the Rainbow, the uncompromising atheist and tireless advocate of science explains why his worldview does not, as the romantic and the religious fear, extinguish a sense of wonder or an appreciation of life.3
We are going to die, and that makes us the lucky ones. Good writing starts strong. Not with a cliché (“Since the dawn of time”), not with a banality (“Recently, scholars have been increasingly concerned with the question of . . .”), but with a contentful observation that provokes curiosity. The reader of Unweaving the Rainbow opens the book and is walloped with a reminder of the most dreadful fact we know, and on its heels a paradoxical elaboration. We’re lucky because we’ll die? Who wouldn’t want to find out how this mystery will be solved? The starkness of the paradox is reinforced by the diction and meter: short, simple words, a stressed monosyllable followed by six iambic feet.*
Most people are never going to die. The resolution to the paradox—that a bad thing, dying, implies a good thing, having lived—is explained with parallel constructions: never going to die . . . never going to be born. The next sentence restates the contrast, also in parallel language, but avoids the tedium of repeating words yet again by juxtaposing familiar idioms that have the same rhythm: been here in my place . . . see the light of day.
the sand grains of Arabia. A touch of the poetic, better suited to the grandeur that Dawkins seeks to invoke than a colorless adjective like massive or enormous. The expression is snatched from the brink of cliché by its variant wording (sand grains rather than sands) and by its vaguely exotic feel. The phrase sands of Arabia, though common in the early nineteenth century, has plunged in popularity ever since, and there is no longer even a place that is commonly called Arabia; we refer to it as Saudi Arabia or the Arabian Peninsula.4
unborn ghosts. A vivid image to convey the abstract notion of a mathematically possible combination of genes, and a wily repurposing of a supernatural concept to advance a naturalistic argument.
greater poets than Keats, scientists greater than Newton. Parallel wording is a powerful trope, but after dying and being born, being here in my place and seeing the light of day, enough is enough. To avoid monotony Dawkins inverts the structure of one of the lines in this couplet. The phrase subtly alludes to another meditation on unrealized genius, “Some mute inglorious Milton here may rest,” from Thomas Gray’s “Elegy Written in a Country Churchyard.”
In the teeth of these stupefying odds. The idiom brings to mind the menacing gape of a predator, reinforcing our gratitude for being alive: to come into existence we narrowly escaped a mortal threat, namely the high odds against it. How high? Every writer faces the challenge of finding a superlative in the English word-hoard that has not been inflated by hyperbole and overuse. In the teeth of these incredible odds? In the teeth of these awesome odds? Meh. Dawkins has found a superlative—to render into a stupor, to make stupid—that still has the power to impress.
Good writing can flip the way the world is perceived, like the silhouette in psychology textbooks which oscillates between a goblet and two faces. In six sentences Dawkins has flipped the way we think of death, and has stated a rationalist’s case for an appreciation of life in words so stirring that many humanists I know have asked that it be read at their funerals.
What is it that makes a person the very person that she is, herself alone and not another, an integrity of identity that persists over time, undergoing changes and yet still continuing to be—until she does not continue any longer, at least not unproblematically?
I stare at the picture of a small child at a summer’s picnic, clutching her big sister’s hand with one tiny hand while in the other she has a precarious hold on a big slice of watermelon that she appears to be struggling to have intersect with the small o of her mouth. That child is me. But why is she me? I have no memory at all of that summer’s day, no privileged knowledge of whether that child succeeded in getting the watermelon into her mouth. It’s true that a smooth series of contiguous physical events can be traced from her body to mine, so that we would want to say that her body is mine; and perhaps bodily identity is all that our personal identity consists in. But bodily persistence over time, too, presents philosophical dilemmas. The series of contiguous physical events has rendered the child’s body so different from the one I glance down on at this moment; the very atoms that composed her body no longer compose mine. And if our bodies are dissimilar, our points of view are even more so. Mine would be as inaccessible to her—just let her try to figure out [Spinoza’s] Ethics—as hers is now to me. Her thought processes, prelinguistic, would largely elude me.
Yet she is me, that tiny determined thing in the frilly white pinafore. She has continued to exist, survived her childhood illnesses, the near-drowning in a rip current on Rockaway Beach at the age of twelve, other dramas. There are presumably adventures that she—that is that I—can’t undergo and still continue to be herself. Would I then be someone else or would I just no longer be? Were I to lose all sense of myself—were schizophrenia or demonic possession, a coma or progressive dementia to remove me from myself—would it be I who would be undergoing those trials, or would I have quit the premises? Would there then be someone else, or would there be no one?
Is death one of those adventures from which I can’t emerge as myself? The sister whose hand I am clutching in the picture is dead. I wonder every day whether she still exists. A person whom one has loved seems altogether too significant a thing to simply vanish altogether from the world. A person whom one loves is a world, just as one knows oneself to be a world. How can worlds like these simply cease altogether? But if my sister does exist, then what is she, and what makes that thing that she now is identical with the beautiful girl laughing at her little sister on that forgotten day?
In this passage from Betraying Spinoza, the philosopher and novelist Rebecca Newberger Goldstein (to whom I am married) explains the philosophical puzzle of personal identity, one of the problems that engaged the Dutch-Jewish thinker who is the subject of her book.5 Like her fellow humanist Dawkins, Goldstein analyzes the vertiginous enigma of existence and death, but their styles could not be more different—a reminder of the diverse ways that the resources of language can be deployed to illuminate a topic. Dawkins’s could fairly be called masculine, with its confrontational opening, its cold abstractions, its aggressive imagery, its glorification of alpha males. Goldstein’s is personal, evocative, reflective, yet intellectually just as rigorous.
at least not unproblematically. The categories of grammar reflect the building blocks of thought—time, space, causality, matter—and a philosophical wordsmith can play with them to awaken her readers to metaphysical conundrums. Here we have an adverb, unproblematically, modifying the verb continue, an ellipsis for continue to be. Ordinarily to be is not the kind of verb that can be modified by an adverb. To be or not to be—it’s hard to see shades of gray there. The unexpected adverb puts an array of metaphysical, theological, and personal questions on the table before us.
a big slice of watermelon that she appears to be struggling to have intersect with the small o of her mouth. Good writing is understood with the mind’s eye.6 The unusual description of the familiar act of eating in terms of its geometry—a piece of fruit intersecting with an o—forces the reader to pause and conjure a mental image of the act rather than skating over a verbal summary. We find the little girl in the photograph endearing not because the author has stooped to telling us so with words like cute or adorable but because we can see her childlike mannerisms for ourselves—as the author herself is doing when pondering the little alien who somehow is her. We see the clumsiness of a small hand manipulating an adult-sized object; the determination to master a challenge we take for granted; the out-of-sync mouth anticipating the sweet, juicy reward. The geometric language also prepares us for the prelinguistic thinking that Goldstein introduces in the next paragraph: we regress to an age at which “to eat” and even “to put in your mouth” are abstractions, several levels removed from the physical challenge of making an object intersect with a body part.
That child is me. But why is she me? . . . [My point of view] would be as inaccessible to her . . . as hers is now to me. . . . There are presumably adventures that she—that is that I—can’t undergo and still continue to be herself. Would I then be someone else? Goldstein repeatedly juxtaposes nouns and pronouns in the first and third person: that child . . . me; she . . . I . . . herself; I . . . someone else. The syntactic confusion about which grammatical person belongs in which phrase reflects our intellectual confusion about the very meaning of the concept “person.” She also plays with to be, the quintessentially existential verb, to engage our existential puzzlement: Would I then be someone else or would I just no longer be? . . . Would there then be someone else, or would there be no one?
frilly white pinafore. The use of an old-fashioned word for an old-fashioned garment helps date the snapshot for us, without the cliché faded photograph.
The sister whose hand I am clutching in the picture is dead. After eighteen sentences that mix wistful nostalgia with abstract philosophizing, the reverie is punctured by a stark revelation. However painful it must have been to predicate the harsh word dead of a beloved sister, no euphemism—has passed away, is no longer with us—could have ended that sentence. The topic of the discussion is how we struggle to reconcile the indubitable fact of death with our incomprehension of the possibility that a person can no longer exist. Our linguistic ancestors parlayed that incomprehension into euphemisms like passed on in which death consists of a journey to a remote location. Had Goldstein settled for these weasel words, she would have undermined her analysis before it began.
I wonder every day whether she still exists. A person whom one has loved seems altogether too significant a thing to simply vanish altogether from the world. A person whom one loves is a world, just as one knows oneself to be a world. How can worlds like these simply cease altogether? This passage fills my eyes every time I read it, and not just because it is about a sister-in-law I will never meet. With a spare restatement of what philosophers call the hard problem of consciousness (A person . . . is a world, just as one knows oneself to be a world), Goldstein creates an effect that is richly emotional. The puzzlement in having to make sense of this abstract philosophical conundrum mingles with the poignancy of having to come to terms with the loss of someone we love. It is not just the selfish realization that we have been robbed of their third-person company, but the unselfish realization that they have been robbed of their first-person experience.
The passage also reminds us of the overlap in techniques for writing fiction and nonfiction. The interweaving of the personal and the philosophical in this excerpt is being used as an expository device, to help us understand the issues that Spinoza wrote about. But it is also a theme that runs through Goldstein’s fiction, namely that the obsessions of academic philosophy—personal identity, consciousness, truth, will, meaning, morality—are of a piece with the obsessions of human beings as they try to make sense of their lives.
MAURICE SENDAK, AUTHOR OF SPLENDID NIGHTMARES, DIES AT 83
Maurice Sendak, widely considered the most important children’s book artist of the 20th century, who wrenched the picture book out of the safe, sanitized world of the nursery and plunged it into the dark, terrifying, and hauntingly beautiful recesses of the human psyche, died on Tuesday in Danbury, Conn. . . .
Roundly praised, intermittently censored, and occasionally eaten, Mr. Sendak’s books were essential ingredients of childhood for the generation born after 1960 or thereabouts, and in turn for their children.
PAULINE PHILLIPS, FLINTY ADVISER TO MILLIONS AS DEAR ABBY, DIES AT 94
Dear Abby: My wife sleeps in the raw. Then she showers, brushes her teeth and fixes our breakfast—still in the buff. We’re newlyweds and there are just the two of us, so I suppose there’s really nothing wrong with it. What do you think?—Ed
Dear Ed: It’s O.K. with me. But tell her to put on an apron when she’s frying bacon.
Pauline Phillips, a California housewife who nearly 60 years ago, seeking something more meaningful than mah-jongg, transformed herself into the syndicated columnist Dear Abby—and in so doing became a trusted, tart-tongued adviser to tens of millions—died on Wednesday in Minneapolis. . . .
With her comic and flinty yet fundamentally sympathetic voice, Mrs. Phillips helped wrestle the advice column from its weepy Victorian past into a hard-nosed 20th-century present. . . .
Dear Abby: Our son married a girl when he was in the service. They were married in February and she had an 8 1/2-pound baby girl in August. She said the baby was premature. Can an 8 1/2-pound baby be this premature?—Wanting to Know
Dear Wanting: The baby was on time. The wedding was late. Forget it.
Mrs. Phillips began her life as the columnist Abigail Van Buren in 1956. She quickly became known for her astringent, often genteelly risqué, replies to queries that included the marital, the medical, and sometimes both at once.
HELEN GURLEY BROWN, WHO GAVE “SINGLE GIRL” A LIFE IN FULL, DIES AT 90
Helen Gurley Brown, who as the author of Sex and the Single Girl shocked early-1960s America with the news that unmarried women not only had sex but thoroughly enjoyed it—and who as the editor of Cosmopolitan magazine spent the next three decades telling those women precisely how to enjoy it even more—died on Monday in Manhattan. She was 90, though parts of her were considerably younger. . . .
As Cosmopolitan’s editor from 1965 until 1997, Ms. Brown was widely credited with being the first to introduce frank discussions of sex into magazines for women. The look of women’s magazines today—a sea of voluptuous models and titillating cover lines—is due in no small part to her influence.
My third selection, also related to death, showcases yet another tone and style, and stands as further proof that good writing does not fit into a single formula. With deadpan wit, an affection for eccentricity, and a deft use of the English lexicon, the linguist and journalist Margalit Fox has perfected the art of the obituary.7
plunged [the picture book] into the dark, terrifying, and hauntingly beautiful recesses of the human psyche; a trusted, tart-tongued adviser to tens of millions; a sea of voluptuous models and titillating cover lines. When you have to capture a life in just eight hundred words, you have to choose those words carefully. Fox has found some mots justes and packed them into readable phrases which put the lie to the lazy excuse that you can’t sum up a complex subject—in this case a life’s accomplishments—in just a few words.
Roundly praised, intermittently censored, and occasionally eaten. This is a zeugma: the intentional juxtaposition of different senses of a single word. In this list, the word books is being used in the sense of both their narrative content (which can be praised or censored) and their physical form (which can be eaten). Along with putting a smile on the reader’s face, the zeugma subtly teases the bluenoses who objected to the nudity in Sendak’s drawings by juxtaposing their censorship with the innocence of the books’ readership.
and in turn for their children. A simple phrase that tells a story—a generation of children grew up with such fond memories of Sendak’s books that they read them to their own children—and that serves as an understated tribute to the great artist.
Dear Abby: My wife sleeps in the raw. Beginning the obit with a bang, this sample column instantly brings a pang of nostalgia to the millions of readers who grew up reading Dear Abby, and graphically introduces her life’s work to those who did not. We see for ourselves, rather than having to be told about, the offbeat problems, the waggish replies, the (for her time) liberal sensibility.
Dear Abby: Our son married a girl when he was in the service. The deliberate use of surprising transitions—colons, dashes, block quotations—is one of the hallmarks of lively prose.8 A lesser writer might have introduced this with the plodding “Here is another example of a column by Mrs. Phillips,” but Fox interrupts her narration without warning to redirect our gaze to Phillips in her prime. A writer, like a cinematographer, manipulates the viewer’s perspective on an ongoing story, with the verbal equivalent of camera angles and quick cuts.
the marital, the medical, and sometimes both at once. Killjoy style manuals tell writers to avoid alliteration, but good prose is enlivened with moments of poetry, like this line with its pleasing meter and its impish pairing of marital and medical.
She was 90, though parts of her were considerably younger. A sly twist on the formulaic reporting and ponderous tone of conventional obituaries. We soon learn that Brown was a champion of women’s sexual self-definition, so we understand the innuendo about cosmetic surgery as good-natured rather than catty—as a joke that Brown herself would have enjoyed.
hauntingly, flinty, tart-tongued, weepy, hard-nosed, astringent, genteelly, risqué, voluptuous, titillating. In selecting these uncommon adjectives and adverbs, Fox defies two of the commonest advisories in the stylebooks: Write with nouns and verbs, not adjectives and adverbs, and Never use an uncommon, fancy word when a common, plain one will do.
But the rules are badly stated. It’s certainly true that a lot of turgid prose is stuffed with polysyllabic Latinisms (cessation for end, eventuate in for cause) and flabby adjectives (is contributive to instead of contributes to, is determinative of instead of determines). And showing off with fancy words you barely understand can make you look pompous and occasionally ridiculous. But a skilled writer can enliven and sometimes electrify her prose with the judicious insertion of a surprising word. According to studies of writing quality, a varied vocabulary and the use of unusual words are two of the features that distinguish sprightly prose from mush.9
The best words not only pinpoint an idea better than any alternative but echo it in their sound and articulation, a phenomenon called phonesthetics, the feeling of sound.10 It’s no coincidence that haunting means “haunting” and tart means “tart,” rather than the other way around; just listen to your voice and sense your muscles as you articulate them. Voluptuous has a voluptuous give-and-take between the lips and the tongue, and titillating also gives the tongue a workout while titillating the ear with a coincidental but unignorable overlap with a naughty word. These associations make a sea of voluptuous models and titillating cover lines more lively than a sea of sexy models and provocative cover lines. And a sea of pulchritudinous models would have served as a lesson on how not to choose words: the ugly pulchritude sounds like the opposite of what it means, and it is one of those words that no one ever uses unless they are trying to show off.
But sometimes even show-off words can work. In her obituary of the journalist Mike McGrady, who masterminded a 1969 literary hoax in which a deliberately awful bodice ripper became an international bestseller, Fox wrote, “Naked Came the Stranger was written by 25 Newsday journalists in an era when newsrooms were arguably more relaxed and inarguably more bibulous.”11 The playful bibulous, “tending to drink too much,” is related to beverage and imbibe and calls to mind babbling, bobbling, bubbling, and burbling. Readers who want to become writers should read with a dictionary at hand (several are available as smartphone apps), and writers should not hesitate to send their readers there if the word is dead-on in meaning, evocative in sound, and not so obscure that the reader will never see it again. (You can probably do without maieutic, propaedeutic, and subdoxastic.) I write with a thesaurus, mindful of the advice I once read in a bicycle repair manual on how to squeeze a dent out of a rim with Vise-Grip pliers: “Do not get carried away with the destructive potential of this tool.”
From the early years of the twentieth century to well past its middle age, nearly every black family in the American South, which meant nearly every black family in America, had a decision to make. There were sharecroppers losing at settlement. Typists wanting to work in an office. Yard boys scared that a single gesture near the planter’s wife could leave them hanging from an oak tree. They were all stuck in a caste system as hard and unyielding as the red Georgia clay, and they each had a decision before them. In this, they were not unlike anyone who ever longed to cross the Atlantic or the Rio Grande.
It was during the First World War that a silent pilgrimage took its first steps within the borders of this country. The fever rose without warning or notice or much in the way of understanding by those outside its reach. It would not end until the 1970s and would set into motion changes in the North and South that no one, not even the people doing the leaving, could have imagined at the start of it or dreamed would take a lifetime to play out.
Historians would come to call it the Great Migration. It would become perhaps the biggest underreported story of the twentieth century. . . .
The actions of the people in this book were both universal and distinctly American. Their migration was a response to an economic and social structure not of their making. They did what humans have done for centuries when life became untenable—what the pilgrims did under the tyranny of British rule, what the Scotch-Irish did in Oklahoma when the land turned to dust, what the Irish did when there was nothing to eat, what the European Jews did during the spread of Nazism, what the landless in Russia, Italy, China, and elsewhere did when something better across the ocean called to them. What binds these stories together was the back-against-the-wall, reluctant yet hopeful search for something better, any place but where they were. They did what human beings looking for freedom, throughout history, have often done.
They left.
In The Warmth of Other Suns, the journalist Isabel Wilkerson ensured that the story of the Great Migration would be underreported no longer.12 Calling it “great” is no exaggeration. The movement of millions of African Americans from the Deep South to Northern cities set off the civil rights movement, redrew the urban landscape, rewrote the agenda of American politics and education, and transformed American culture and, with it, world culture.
Wilkerson not only rectifies the world’s ignorance about the Great Migration, but with twelve hundred interviews and crystalline prose she makes us understand it in its full human reality. We live in an era of social science, and have become accustomed to understanding the social world in terms of “forces,” “pressures,” “processes,” and “developments.” It is easy to forget that those “forces” are statistical summaries of the deeds of millions of men and women who act on their beliefs in pursuit of their desires. The habit of submerging the individual into abstractions can lead not only to bad science (it’s not as if the “social forces” obeyed Newton’s laws) but to dehumanization. We are apt to think, “I (and my kind) choose to do things for reasons; he (and his kind) are part of a social process.” This was a moral of Orwell’s essay “Politics and the English Language,” which warned against dehumanizing abstraction: “Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers.” With an allergy to abstraction and a phobia of cliché, Wilkerson trains a magnifying glass on the historical blob called “the Great Migration” and reveals the humanity of the people who compose it.
From the early years of the twentieth century to well past its middle age. Not even the chronology is described in conventional language: the century is an aging person, a contemporary of the story’s protagonists.
Typists wanting to work in an office. Not “denial of economic opportunities.” By invoking a moderately skilled occupation from an earlier era, Wilkerson invites us to imagine the desperation of a woman who has acquired a proficiency that could lift her from the cotton fields to a professional office but who is denied the chance because of the color of her skin.
Yard boys scared that a single gesture near the planter’s wife could leave them hanging from an oak tree. Not “oppression,” not “the threat of violence,” not even “lynching,” but a horrific physical image. We even see what kind of tree it is.
as hard and unyielding as the red Georgia clay. Once again prose is brought to life with a snatch of poetry, as in this simile with its sensual image, its whiff of allusion (I think of Martin Luther King’s “red hills of Georgia”), and its lyrical anapest meter.
anyone who ever longed to cross the Atlantic or the Rio Grande. Not “immigrants from Europe or Mexico.” Once again the people are not sociological categories. The author forces us to visualize bodies in motion and to remember the motives that pulled them along.
what the pilgrims did . . . what the Scotch-Irish did . . . what the European Jews did . . . what the landless in Russia, Italy, China, and elsewhere did. Wilkerson begins the paragraph by stating that the actions of her protagonists are universal, but she does not rest with that generalization. She nominates the Great Migration for inclusion in a list of storied emigrations (expressed in pleasingly parallel syntax), whose descendants doubtless include many of her readers. Those readers are implicitly invited to apply their respect for their ancestors’ courage and sacrifice to the forgotten pilgrims of the Great Migration.
when the land turned to dust, not “the Dust Bowl”; when there was nothing to eat, not “the Potato Famine”; the landless, not “the peasants.” Wilkerson will not allow us to snooze through a recitation of familiar verbiage. Fresh wording and concrete images force us to keep updating the virtual reality display in our minds.
They left. Among the many dumb rules of paragraphing foisted on students in composition courses is the one that says that a paragraph may not consist of a single sentence. Wilkerson ends a richly descriptive introductory chapter with a paragraph composed of exactly two syllables. The abrupt ending and the expanse of blankness at the bottom of the page mirror the finality of the decision to move and the uncertainty of the life that lay ahead. Good writing finishes strong.
• • •
The authors of the four passages share a number of practices: an insistence on fresh wording and concrete imagery over familiar verbiage and abstract summary; an attention to the readers’ vantage point and the target of their gaze; the judicious placement of an uncommon word or idiom against a backdrop of simple nouns and verbs; the use of parallel syntax; the occasional planned surprise; the presentation of a telling detail that obviates an explicit pronouncement; the use of meter and sound that resonate with the meaning and mood.
The authors also share an attitude: they do not hide the passion and relish that drive them to tell us about their subjects. They write as if they have something important to say. But no, that doesn’t capture it. They write as if they have something important to show. And that, we shall see, is a key ingredient in the sense of style.
Chapter 2
A WINDOW ONTO THE WORLD
CLASSIC STYLE AS AN ANTIDOTE FOR ACADEMESE, BUREAUCRATESE, CORPORATESE, LEGALESE, OFFICIALESE, AND OTHER KINDS OF STUFFY PROSE
Writing is an unnatural act.1 As Charles Darwin observed, “Man has an instinctive tendency to speak, as we see in the babble of our young children, whereas no child has an instinctive tendency to bake, brew, or write.” The spoken word is older than our species, and the instinct for language allows children to engage in articulate conversation years before they enter a schoolhouse. But the written word is a recent invention that has left no trace in our genome and must be laboriously acquired throughout childhood and beyond.
Speech and writing differ in their mechanics, of course, and that is one reason children must struggle with writing: it takes practice to reproduce the sounds of language with a pencil or a keyboard. But they differ in another way, which makes the acquisition of writing a lifelong challenge even after the mechanics have been mastered. Speaking and writing involve very different kinds of human relationship, and only the one associated with speech comes naturally to us. Spoken conversation is instinctive because social interaction is instinctive: we speak to those with whom we are on speaking terms. When we engage our conversational partners, we have an inkling of what they know and what they might be interested in learning, and as we chat with them, we monitor their eyes, their face, and their posture. If they need clarification, or cannot swallow an assertion, or have something to add, they can break into the conversation or follow up in turn.
We enjoy none of this give-and-take when we cast our bread upon the waters by sending a written missive out into the world. The recipients are invisible and inscrutable, and we have to get through to them without knowing much about them or seeing their reactions. At the time that we write, the reader exists only in our imaginations. Writing is above all an act of pretense. We have to visualize ourselves in some kind of conversation, or correspondence, or oration, or soliloquy, and put words into the mouth of the little avatar who represents us in this simulated world.
The key to good style, far more than obeying any list of commandments, is to have a clear conception of the make-believe world in which you’re pretending to communicate. There are many possibilities. A person thumb-typing a text message can get away with acting as if he is taking part in a real conversation.* A college student who writes a term paper is pretending that he knows more about his subject than the reader and that his goal is to supply the reader with information she needs, whereas in reality his reader typically knows more about the subject than he does and has no need for the information, the actual goal of the exercise being to give the student practice for the real thing. An activist composing a manifesto, or a minister drafting a sermon, must write as if they are standing in front of a crowd and whipping up their emotions.
Which simulation should a writer immerse himself in when composing a piece for a more generic readership, such as an essay, an article, a review, an editorial, a newsletter, or a blog post? The literary scholars Francis-Noël Thomas and Mark Turner have singled out one model of prose as an aspiration for such writers today. They call it classic style, and explain it in a wonderful little book called Clear and Simple as the Truth.
The guiding metaphor of classic style is seeing the world. The writer can see something that the reader has not yet noticed, and he orients the reader’s gaze so that she can see it for herself. The purpose of writing is presentation, and its motive is disinterested truth. It succeeds when it aligns language with the truth, the proof of success being clarity and simplicity. The truth can be known, and is not the same as the language that reveals it; prose is a window onto the world. The writer knows the truth before putting it into words; he is not using the occasion of writing to sort out what he thinks. Nor does the writer of classic prose have to argue for the truth; he just needs to present it. That is because the reader is competent and can recognize the truth when she sees it, as long as she is given an unobstructed view. The writer and the reader are equals, and the process of directing the reader’s gaze takes the form of a conversation.
Self-contradiction aside, we now know that telling writers to avoid the passive is bad advice. Linguistic research has shown that the passive construction has a number of indispensable functions because of the way it engages a reader’s attention and memory. A skilled writer should know what those functions are and push back against copy editors who, under the influence of grammatically naïve style guides, blue-pencil every passive construction they spot into an active one.
Style manuals that are innocent of linguistics also are crippled in dealing with the aspect of writing that evokes the most emotion: correct and incorrect usage. Many style manuals treat traditional rules of usage the way fundamentalists treat the Ten Commandments: as unerring laws chiseled in sapphire for mortals to obey or risk eternal damnation. But skeptics and freethinkers who probe the history of these rules have found that they belong to an oral tradition of folklore and myth. For many reasons, manuals that are credulous about the inerrancy of the traditional rules don’t serve writers well. Although some of the rules can make prose better, many of them make it worse, and writers are better off flouting them. The rules often mash together issues of grammatical correctness, logical coherence, formal style, and standard dialect, but a skilled writer needs to keep them straight. And the orthodox stylebooks are ill equipped to deal with an inescapable fact about language: it changes over time. Language is not a protocol legislated by an authority but rather a wiki that pools the contributions of millions of writers and speakers, who ceaselessly bend the language to their needs and who inexorably age, die, and get replaced by their children, who adapt the language in their turn.
Yet the authors of the classic manuals wrote as if the language they grew up with were immortal, and failed to cultivate an ear for ongoing change. Strunk and White, writing in the early and middle decades of the twentieth century, condemned then-new verbs like personalize, finalize, host, chair, and debut, and warned writers never to use fix for “repair” or claim for “declare.” Worse, they justified their peeves with cockamamie rationalizations. The verb contact, they argued, is “vague and self-important. Do not contact people; get in touch with them, look them up, phone them, find them, or meet them.” But of course the vagueness of to contact is exactly why it caught on: sometimes a writer doesn’t need to know how one person will get in touch with another, as long as he does so. Or consider this head-scratcher, concocted to explain why a writer should never use a number word with people, only with persons: “If of ‘six people’ five went away, how many people would be left? Answer: one people.” By the same logic, writers should avoid using numbers with irregular plurals such as men, children, and teeth (“If of ‘six children’ five went away . . .”).
In the last edition published in his lifetime, White did acknowledge some changes to the language, instigated by “youths” who “speak to other youths in a tongue of their own devising: they renovate the language with a wild vigor, as they would a basement apartment.” White’s condescension to these “youths” (now in their retirement years) led him to predict the passing of nerd, psyched, ripoff, dude, geek, and funky, all of which have become entrenched in the language.
The graybeard sensibilities of the style mavens come not just from an underappreciation of the fact of language change but from a lack of reflection on their own psychology. As people age, they confuse changes in themselves with changes in the world, and changes in the world with moral decline—the illusion of the good old days.4 And so every generation believes that the kids today are degrading the language and taking civilization down with it:5
The common language is disappearing. It is slowly being crushed to death under the weight of verbal conglomerate, a pseudospeech at once both pretentious and feeble, that is created daily by millions of blunders and inaccuracies in grammar, syntax, idiom, metaphor, logic, and common sense. . . . In the history of modern English there is no period in which such victory over thought-in-speech has been so widespread.—1978
Recent graduates, including those with university degrees, seem to have no mastery of the language at all. They cannot construct a simple declarative sentence, either orally or in writing. They cannot spell common, everyday words. Punctuation is apparently no longer taught. Grammar is a complete mystery to almost all recent graduates.—1961
From every college in the country goes up the cry, “Our freshmen can’t spell, can’t punctuate.” Every high school is in disrepair because its pupils are so ignorant of the merest rudiments.—1917
The vocabularies of the majority of high-school pupils are amazingly small. I always try to use simple English, and yet I have talked to classes when quite a minority of the pupils did not comprehend more than half of what I said.—1889
Unless the present progress of change [is] arrested . . . there can be no doubt that, in another century, the dialect of the Americans will become utterly unintelligible to an Englishman.—1833
Our language (I mean the English) is degenerating very fast. . . . I begin to fear that it will be impossible to check it.—1785
Complaints about the decline of language go at least as far back as the invention of the printing press. Soon after William Caxton set up the first one in England in 1478, he lamented, “And certaynly our langage now vsed varyeth ferre from that whiche was vsed and spoken when I was borne.” Indeed, moral panic about the decline of writing may be as old as writing itself:
[Cartoon: Non Sequitur © 2011 Wiley Ink, Inc. Dist. by Universal Uclick. Reprinted with permission. All rights reserved.]
The cartoon is not much of an exaggeration. According to the English scholar Richard Lloyd-Jones, some of the clay tablets deciphered from ancient Sumerian include complaints about the deteriorating writing skills of the young.6
My discomfort with the classic style manuals has convinced me that we need a writing guide for the twenty-first century. It’s not that I have the desire, to say nothing of the ability, to supplant The Elements of Style. Writers can profit by reading more than one style guide, and much of Strunk and White (as it is commonly called) is as timeless as it is charming. But much of it is not. Strunk was born in 1869, and today’s writers cannot base their craft exclusively on the advice of a man who developed his sense of style before the invention of the telephone (let alone the Internet), before the advent of modern linguistics and cognitive science, before the wave of informalization that swept the world in the second half of the twentieth century.
A manual for the new millennium cannot just perpetuate the diktats of earlier manuals. Today’s writers are infused by the spirit of scientific skepticism and the ethos of questioning authority. They should not be satisfied with “That’s the way it’s done” or “Because I said so,” and they deserve not to be patronized at any age. They rightly expect reasons for any advice that is foisted upon them.
Today we can provide the reasons. We have an understanding of grammatical phenomena which goes well beyond the traditional taxonomies based on crude analogies with Latin. We have a body of research on the mental dynamics of reading: the waxing and waning of memory load as readers comprehend a passage, the incrementing of their knowledge as they come to grasp its meaning, the blind alleys that can lead them astray. We have a body of history and criticism which can distinguish the rules that enhance clarity, grace, and emotional resonance from those that are based on myths and misunderstandings. By replacing dogma about usage with reason and evidence, I hope not just to avoid giving ham-fisted advice but to make the advice that I do give easier to remember than a list of dos and don’ts. Providing reasons should also allow writers and editors to apply the guidelines judiciously, mindful of what they are designed to accomplish, rather than robotically.
“The sense of style” has a double meaning. The word sense, as in “the sense of sight” and “a sense of humor,” can refer to a faculty of mind, in this case the faculties of comprehension that resonate to a well-crafted sentence. It can also refer to “good sense” as opposed to “nonsense,” in this case the ability to discriminate between the principles that improve the quality of prose and the superstitions, fetishes, shibboleths, and initiation ordeals that have been passed down in the traditions of usage.
The Sense of Style is not a reference manual in which you can find the answer to every question about hyphenation and capitalization. Nor is it a remedial guide for badly educated students who have yet to master the mechanics of a sentence. Like the classic guides, it is designed for people who know how to write and want to write better. This includes students who hope to improve the quality of their papers, aspiring critics and journalists who want to start a blog or column or series of reviews, and professionals who seek a cure for their academese, bureaucratese, corporatese, legalese, medicalese, or officialese. The book is also written for readers who seek no help in writing but are interested in letters and literature and curious about the ways in which the sciences of mind can illuminate how language works at its best.
My focus is on nonfiction, particularly genres that put a premium on clarity and coherence. But unlike the authors of the classic guides, I don’t equate these virtues with plain words, austere expression, and formal style.7 You can write with clarity and with flair, too. And though the emphasis is on nonfiction, the explanations should be useful to fiction writers as well, because many principles of style apply whether the world being written about is real or imaginary. I like to think they might also be helpful to poets, orators, and other creative wordsmiths, who need to know the canons of pedestrian prose to flout them for rhetorical effect.
People often ask me whether anyone today even cares about style. The English language, they say, faces a new threat in the rise of the Internet, with its texting and tweeting, its email and chatrooms. Surely the craft of written expression has declined since the days before smartphones and the Web. You remember those days, don’t you? Back in the 1980s, when teenagers spoke in fluent paragraphs, bureaucrats wrote in plain English, and every academic paper was a masterpiece in the art of the essay? (Or was it the 1970s?) The problem with the Internet-is-making-us-illiterate theory, of course, is that bad prose has burdened readers in every era. Professor Strunk tried to do something about it in 1918, when young Elwyn White was a student in his English class at Cornell.
What today’s doomsayers fail to notice is that the very trends they deplore consist in oral media—radio, telephones, and television—giving way to written ones. Not so long ago it was radio and television that were said to be ruining the language. More than ever before, the currency of our social and cultural lives is the written word. And no, not all of it is the semiliterate ranting of Internet trolls. A little surfing will show that many Internet users value language that is clear, grammatical, and competently spelled and punctuated, not just in printed books and legacy media but in e-zines, blogs, Wikipedia entries, consumer reviews, and even a fair proportion of email. Surveys have shown that college students are writing more than their counterparts in earlier generations did, and that they make no more errors per page of writing.8 And contrary to an urban legend, they do not sprinkle their papers with smileys and instant-messaging abbreviations like IMHO and L8TR, any more than previous generations forgot how to use prepositions and articles out of the habit of omitting them from their telegrams. Members of the Internet generation, like all language users, fit their phrasing to the setting and audience, and have a good sense of what is appropriate in formal writing.
Style still matters, for at least three reasons. First, it ensures that writers will get their messages across, sparing readers from squandering their precious moments on earth deciphering opaque prose. When the effort fails, the result can be calamitous—as Strunk and White put it, “death on the highway caused by a badly worded road sign, heartbreak among lovers caused by a misplaced phrase in a well-intentioned letter, anguish of a traveler expecting to be met at a railroad station and not being met because of a slipshod telegram.” Governments and corporations have found that small improvements in clarity can prevent vast amounts of error, frustration, and waste,9 and many countries have recently made clear language the law of the land.10
Second, style earns trust. If readers can see that a writer cares about consistency and accuracy in her prose, they will be reassured that the writer cares about those virtues in conduct they cannot see as easily. Here is how one technology executive explains why he rejects job applications filled with errors of grammar and punctuation: “If it takes someone more than 20 years to notice how to properly use it’s, then that’s not a learning curve I’m comfortable with.”11 And if that isn’t enough to get you to brush up your prose, consider the discovery of the dating site OkCupid that sloppy grammar and spelling in a profile are “huge turn-offs.” As one client said, “If you’re trying to date a woman, I don’t expect flowery Jane Austen prose. But aren’t you trying to put your best foot forward?”12
Style, not least, adds beauty to the world. To a literate reader, a crisp sentence, an arresting metaphor, a witty aside, an elegant turn of phrase are among life’s greatest pleasures. And as we shall see in the first chapter, this thoroughly impractical virtue of good writing is where the practical effort of mastering good writing must begin.
Chapter 1
GOOD WRITING
REVERSE-ENGINEERING GOOD PROSE AS THE KEY TO DEVELOPING A WRITERLY EAR
Education is an admirable thing,” wrote Oscar Wilde, “but it is well to remember from time to time that nothing that is worth knowing can be taught.”1 In dark moments while writing this book, I sometimes feared that Wilde might be right. When I polled some accomplished writers about which style manuals they had consulted during their apprenticeships, the most common answer I got was “none.” Writing, they said, just came naturally to them.
I’d be the last to doubt that good writers are blessed with an innate dose of fluency with syntax and memory for words. But no one is born with skills in English composition per se. Those skills may not have come from stylebooks, but they must have come from somewhere.
That somewhere is the writing of other writers. Good writers are avid readers. They have absorbed a vast inventory of words, idioms, constructions, tropes, and rhetorical tricks, and with them a sensitivity to how they mesh and how they clash. This is the elusive “ear” of a skilled writer—the tacit sense of style which every honest stylebook, echoing Wilde, confesses cannot be explicitly taught. Biographers of great authors always try to track down the books their subjects read when they were young, because they know these sources hold the key to their development as writers.
I would not have written this book if I did not believe, contra Wilde, that many principles of style really can be taught. But the starting point for becoming a good writer is to be a good reader. Writers acquire their technique by spotting, savoring, and reverse-engineering examples of good prose. The goal of this chapter is to provide a glimpse of how that is done. I have picked four passages of twenty-first-century prose, diverse in style and content, and will think aloud as I try to understand what makes them work. My intent is not to honor these passages as if I were bestowing a prize, nor to hold them up as models for you to emulate. It’s to illustrate, via a peek into my stream of consciousness, the habit of lingering over good writing wherever you find it and reflecting on what makes it good.
Savoring good prose is not just a more effective way to develop a writerly ear than obeying a set of commandments; it’s a more inviting one. Much advice on style is stern and censorious. A recent bestseller advocated “zero tolerance” for errors and brandished the words horror, satanic, ghastly, and plummeting standards on its first page. The classic manuals, written by starchy Englishmen and rock-ribbed Yankees, try to take all the fun out of writing, grimly adjuring the writer to avoid offbeat words, figures of speech, and playful alliteration. A famous piece of advice from this school crosses the line from the grim to the infanticidal: “Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it—wholeheartedly—and delete it before sending your manuscript to press. Murder your darlings.”2
An aspiring writer could be forgiven for thinking that learning to write is like negotiating an obstacle course in boot camp, with a sergeant barking at you for every errant footfall. Why not think of it instead as a form of pleasurable mastery, like cooking or photography? Perfecting the craft is a lifelong calling, and mistakes are part of the game. Though the quest for improvement may be informed by lessons and honed by practice, it must first be kindled by a delight in the best work of the masters and a desire to approach their excellence.
In the opening lines of Richard Dawkins’s Unweaving the Rainbow, the uncompromising atheist and tireless advocate of science explains why his worldview does not, as the romantic and the religious fear, extinguish a sense of wonder or an appreciation of life.3
We are going to die, and that makes us the lucky ones. Good writing starts strong. Not with a cliché (“Since the dawn of time”), not with a banality (“Recently, scholars have been increasingly concerned with the question of . . .”), but with a contentful observation that provokes curiosity. The reader of Unweaving the Rainbow opens the book and is walloped with a reminder of the most dreadful fact we know, and on its heels a paradoxical elaboration. We’re lucky because we’ll die? Who wouldn’t want to find out how this mystery will be solved? The starkness of the paradox is reinforced by the diction and meter: short, simple words, a stressed monosyllable followed by six iambic feet.*
Most people are never going to die. The resolution to the paradox—that a bad thing, dying, implies a good thing, having lived—is explained with parallel constructions: never going to die . . . never going to be born. The next sentence restates the contrast, also in parallel language, but avoids the tedium of repeating words yet again by juxtaposing familiar idioms that have the same rhythm: been here in my place . . . see the light of day.
the sand grains of Arabia. A touch of the poetic, better suited to the grandeur that Dawkins seeks to invoke than a colorless adjective like massive or enormous. The expression is snatched from the brink of cliché by its variant wording (sand grains rather than sands) and by its vaguely exotic feel. The phrase sands of Arabia, though common in the early nineteenth century, has plunged in popularity ever since, and there is no longer even a place that is commonly called Arabia; we refer to it as Saudi Arabia or the Arabian Peninsula.4
unborn ghosts. A vivid image to convey the abstract notion of a mathematically possible combination of genes, and a wily repurposing of a supernatural concept to advance a naturalistic argument.
greater poets than Keats, scientists greater than Newton. Parallel wording is a powerful trope, but after dying and being born, being here in my place and seeing the light of day, enough is enough. To avoid monotony Dawkins inverts the structure of one of the lines in this couplet. The phrase subtly alludes to another meditation on unrealized genius, “Some mute inglorious Milton here may rest,” from Thomas Gray’s “Elegy Written in a Country Churchyard.”
In the teeth of these stupefying odds. The idiom brings to mind the menacing gape of a predator, reinforcing our gratitude for being alive: to come into existence we narrowly escaped a mortal threat, namely the high odds against it. How high? Every writer faces the challenge of finding a superlative in the English word-hoard that has not been inflated by hyperbole and overuse. In the teeth of these incredible odds? In the teeth of these awesome odds? Meh. Dawkins has found a superlative—to render into a stupor, to make stupid—that still has the power to impress.
Good writing can flip the way the world is perceived, like the silhouette in psychology textbooks which oscillates between a goblet and two faces. In six sentences Dawkins has flipped the way we think of death, and has stated a rationalist’s case for an appreciation of life in words so stirring that many humanists I know have asked that it be read at their funerals.
What is it that makes a person the very person that she is, herself alone and not another, an integrity of identity that persists over time, undergoing changes and yet still continuing to be—until she does not continue any longer, at least not unproblematically?
I stare at the picture of a small child at a summer’s picnic, clutching her big sister’s hand with one tiny hand while in the other she has a precarious hold on a big slice of watermelon that she appears to be struggling to have intersect with the small o of her mouth. That child is me. But why is she me? I have no memory at all of that summer’s day, no privileged knowledge of whether that child succeeded in getting the watermelon into her mouth. It’s true that a smooth series of contiguous physical events can be traced from her body to mine, so that we would want to say that her body is mine; and perhaps bodily identity is all that our personal identity consists in. But bodily persistence over time, too, presents philosophical dilemmas. The series of contiguous physical events has rendered the child’s body so different from the one I glance down on at this moment; the very atoms that composed her body no longer compose mine. And if our bodies are dissimilar, our points of view are even more so. Mine would be as inaccessible to her—just let her try to figure out [Spinoza’s] Ethics—as hers is now to me. Her thought processes, prelinguistic, would largely elude me.
Yet she is me, that tiny determined thing in the frilly white pinafore. She has continued to exist, survived her childhood illnesses, the near-drowning in a rip current on Rockaway Beach at the age of twelve, other dramas. There are presumably adventures that she—that is that I—can’t undergo and still continue to be herself. Would I then be someone else or would I just no longer be? Were I to lose all sense of myself—were schizophrenia or demonic possession, a coma or progressive dementia to remove me from myself—would it be I who would be undergoing those trials, or would I have quit the premises? Would there then be someone else, or would there be no one?
Is death one of those adventures from which I can’t emerge as myself? The sister whose hand I am clutching in the picture is dead. I wonder every day whether she still exists. A person whom one has loved seems altogether too significant a thing to simply vanish altogether from the world. A person whom one loves is a world, just as one knows oneself to be a world. How can worlds like these simply cease altogether? But if my sister does exist, then what is she, and what makes that thing that she now is identical with the beautiful girl laughing at her little sister on that forgotten day?
In this passage from Betraying Spinoza, the philosopher and novelist Rebecca Newberger Goldstein (to whom I am married) explains the philosophical puzzle of personal identity, one of the problems that engaged the Dutch-Jewish thinker who is the subject of her book.5 Like her fellow humanist Dawkins, Goldstein analyzes the vertiginous enigma of existence and death, but their styles could not be more different—a reminder of the diverse ways that the resources of language can be deployed to illuminate a topic. Dawkins’s could fairly be called masculine, with its confrontational opening, its cold abstractions, its aggressive imagery, its glorification of alpha males. Goldstein’s is personal, evocative, reflective, yet intellectually just as rigorous.
at least not unproblematically. The categories of grammar reflect the building blocks of thought—time, space, causality, matter—and a philosophical wordsmith can play with them to awaken her readers to metaphysical conundrums. Here we have an adverb, unproblematically, modifying the verb continue, an ellipsis for continue to be. Ordinarily to be is not the kind of verb that can be modified by an adverb. To be or not to be—it’s hard to see shades of gray there. The unexpected adverb puts an array of metaphysical, theological, and personal questions on the table before us.
a big slice of watermelon that she appears to be struggling to have intersect with the small o of her mouth. Good writing is understood with the mind’s eye.6 The unusual description of the familiar act of eating in terms of its geometry—a piece of fruit intersecting with an o—forces the reader to pause and conjure a mental image of the act rather than skating over a verbal summary. We find the little girl in the photograph endearing not because the author has stooped to telling us so with words like cute or adorable but because we can see her childlike mannerisms for ourselves—as the author herself is doing when pondering the little alien who somehow is her. We see the clumsiness of a small hand manipulating an adult-sized object; the determination to master a challenge we take for granted; the out-of-sync mouth anticipating the sweet, juicy reward. The geometric language also prepares us for the prelinguistic thinking that Goldstein introduces in the next paragraph: we regress to an age at which “to eat” and even “to put in your mouth” are abstractions, several levels removed from the physical challenge of making an object intersect with a body part.
That child is me. But why is she me? . . . [My point of view] would be as inaccessible to her . . . as hers is now to me. . . . There are presumably adventures that she—that is that I—can’t undergo and still continue to be herself. Would I then be someone else? Goldstein repeatedly juxtaposes nouns and pronouns in the first and third person: that child . . . me; she . . . I . . . herself; I . . . someone else. The syntactic confusion about which grammatical person belongs in which phrase reflects our intellectual confusion about the very meaning of the concept “person.” She also plays with to be, the quintessentially existential verb, to engage our existential puzzlement: Would I then be someone else or would I just no longer be? . . . Would there then be someone else, or would there be no one?
frilly white pinafore. The use of an old-fashioned word for an old-fashioned garment helps date the snapshot for us, without the cliché faded photograph.
The sister whose hand I am clutching in the picture is dead. After eighteen sentences that mix wistful nostalgia with abstract philosophizing, the reverie is punctured by a stark revelation. However painful it must have been to predicate the harsh word dead of a beloved sister, no euphemism—has passed away, is no longer with us—could have ended that sentence. The topic of the discussion is how we struggle to reconcile the indubitable fact of death with our incomprehension of the possibility that a person can no longer exist. Our linguistic ancestors parlayed that incomprehension into euphemisms like passed on in which death consists of a journey to a remote location. Had Goldstein settled for these weasel words, she would have undermined her analysis before it began.
I wonder every day whether she still exists. A person whom one has loved seems altogether too significant a thing to simply vanish altogether from the world. A person whom one loves is a world, just as one knows oneself to be a world. How can worlds like these simply cease altogether? This passage fills my eyes every time I read it, and not just because it is about a sister-in-law I will never meet. With a spare restatement of what philosophers call the hard problem of consciousness (A person . . . is a world, just as one knows oneself to be a world), Goldstein creates an effect that is richly emotional. The puzzlement in having to make sense of this abstract philosophical conundrum mingles with the poignancy of having to come to terms with the loss of someone we love. It is not just the selfish realization that we have been robbed of their third-person company, but the unselfish realization that they have been robbed of their first-person experience.
The passage also reminds us of the overlap in techniques for writing fiction and nonfiction. The interweaving of the personal and the philosophical in this excerpt is being used as an expository device, to help us understand the issues that Spinoza wrote about. But it is also a theme that runs through Goldstein’s fiction, namely that the obsessions of academic philosophy—personal identity, consciousness, truth, will, meaning, morality—are of a piece with the obsessions of human beings as they try to make sense of their lives.
MAURICE SENDAK, AUTHOR OF SPLENDID NIGHTMARES, DIES AT 83
Maurice Sendak, widely considered the most important children’s book artist of the 20th century, who wrenched the picture book out of the safe, sanitized world of the nursery and plunged it into the dark, terrifying, and hauntingly beautiful recesses of the human psyche, died on Tuesday in Danbury, Conn. . . .
Roundly praised, intermittently censored, and occasionally eaten, Mr. Sendak’s books were essential ingredients of childhood for the generation born after 1960 or thereabouts, and in turn for their children.
PAULINE PHILLIPS, FLINTY ADVISER TO MILLIONS AS DEAR ABBY, DIES AT 94
Dear Abby: My wife sleeps in the raw. Then she showers, brushes her teeth and fixes our breakfast—still in the buff. We’re newlyweds and there are just the two of us, so I suppose there’s really nothing wrong with it. What do you think?—Ed
Dear Ed: It’s O.K. with me. But tell her to put on an apron when she’s frying bacon.
Pauline Phillips, a California housewife who nearly 60 years ago, seeking something more meaningful than mah-jongg, transformed herself into the syndicated columnist Dear Abby—and in so doing became a trusted, tart-tongued adviser to tens of millions—died on Wednesday in Minneapolis. . . .
With her comic and flinty yet fundamentally sympathetic voice, Mrs. Phillips helped wrestle the advice column from its weepy Victorian past into a hard-nosed 20th-century present. . . .
Dear Abby: Our son married a girl when he was in the service. They were married in February and she had an 8 1/2-pound baby girl in August. She said the baby was premature. Can an 8 1/2-pound baby be this premature?—Wanting to Know
Dear Wanting: The baby was on time. The wedding was late. Forget it.
Mrs. Phillips began her life as the columnist Abigail Van Buren in 1956. She quickly became known for her astringent, often genteelly risqué, replies to queries that included the marital, the medical, and sometimes both at once.
HELEN GURLEY BROWN, WHO GAVE “SINGLE GIRL” A LIFE IN FULL, DIES AT 90
Helen Gurley Brown, who as the author of Sex and the Single Girl shocked early-1960s America with the news that unmarried women not only had sex but thoroughly enjoyed it—and who as the editor of Cosmopolitan magazine spent the next three decades telling those women precisely how to enjoy it even more—died on Monday in Manhattan. She was 90, though parts of her were considerably younger. . . .
As Cosmopolitan’s editor from 1965 until 1997, Ms. Brown was widely credited with being the first to introduce frank discussions of sex into magazines for women. The look of women’s magazines today—a sea of voluptuous models and titillating cover lines—is due in no small part to her influence.
My third selection, also related to death, showcases yet another tone and style, and stands as further proof that good writing does not fit into a single formula. With deadpan wit, an affection for eccentricity, and a deft use of the English lexicon, the linguist and journalist Margalit Fox has perfected the art of the obituary.7
plunged [the picture book] into the dark, terrifying, and hauntingly beautiful recesses of the human psyche; a trusted, tart-tongued adviser to tens of millions; a sea of voluptuous models and titillating cover lines. When you have to capture a life in just eight hundred words, you have to choose those words carefully. Fox has found some mots justes and packed them into readable phrases which put the lie to the lazy excuse that you can’t sum up a complex subject—in this case a life’s accomplishments—in just a few words.
Roundly praised, intermittently censored, and occasionally eaten. This is a zeugma: the intentional juxtaposition of different senses of a single word. In this list, the word books is being used in the sense of both their narrative content (which can be praised or censored) and their physical form (which can be eaten). Along with putting a smile on the reader’s face, the zeugma subtly teases the bluenoses who objected to the nudity in Sendak’s drawings by juxtaposing their censorship with the innocence of the books’ readership.
and in turn for their children. A simple phrase that tells a story—a generation of children grew up with such fond memories of Sendak’s books that they read them to their own children—and that serves as an understated tribute to the great artist.
Dear Abby: My wife sleeps in the raw. Beginning the obit with a bang, this sample column instantly brings a pang of nostalgia to the millions of readers who grew up reading Dear Abby, and graphically introduces her life’s work to those who did not. We see for ourselves, rather than having to be told about, the offbeat problems, the waggish replies, the (for her time) liberal sensibility.
Dear Abby: Our son married a girl when he was in the service. The deliberate use of surprising transitions—colons, dashes, block quotations—is one of the hallmarks of lively prose.8 A lesser writer might have introduced this with the plodding “Here is another example of a column by Mrs. Phillips,” but Fox interrupts her narration without warning to redirect our gaze to Phillips in her prime. A writer, like a cinematographer, manipulates the viewer’s perspective on an ongoing story, with the verbal equivalent of camera angles and quick cuts.
the marital, the medical, and sometimes both at once. Killjoy style manuals tell writers to avoid alliteration, but good prose is enlivened with moments of poetry, like this line with its pleasing meter and its impish pairing of marital and medical.
She was 90, though parts of her were considerably younger. A sly twist on the formulaic reporting and ponderous tone of conventional obituaries. We soon learn that Brown was a champion of women’s sexual self-definition, so we understand the innuendo about cosmetic surgery as good-natured rather than catty—as a joke that Brown herself would have enjoyed.
hauntingly, flinty, tart-tongued, weepy, hard-nosed, astringent, genteelly, risqué, voluptuous, titillating. In selecting these uncommon adjectives and adverbs, Fox defies two of the commonest advisories in the stylebooks: Write with nouns and verbs, not adjectives and adverbs, and Never use an uncommon, fancy word when a common, plain one will do.
But the rules are badly stated. It’s certainly true that a lot of turgid prose is stuffed with polysyllabic Latinisms (cessation for end, eventuate in for cause) and flabby adjectives (is contributive to instead of contributes to, is determinative of instead of determines). And showing off with fancy words you barely understand can make you look pompous and occasionally ridiculous. But a skilled writer can enliven and sometimes electrify her prose with the judicious insertion of a surprising word. According to studies of writing quality, a varied vocabulary and the use of unusual words are two of the features that distinguish sprightly prose from mush.9
The best words not only pinpoint an idea better than any alternative but echo it in their sound and articulation, a phenomenon called phonesthetics, the feeling of sound.10 It’s no coincidence that haunting means “haunting” and tart means “tart,” rather than the other way around; just listen to your voice and sense your muscles as you articulate them. Voluptuous has a voluptuous give-and-take between the lips and the tongue, and titillating also gives the tongue a workout while titillating the ear with a coincidental but unignorable overlap with a naughty word. These associations make a sea of voluptuous models and titillating cover lines more lively than a sea of sexy models and provocative cover lines. And a sea of pulchritudinous models would have served as a lesson on how not to choose words: the ugly pulchritude sounds like the opposite of what it means, and it is one of those words that no one ever uses unless they are trying to show off.
But sometimes even show-off words can work. In her obituary of the journalist Mike McGrady, who masterminded a 1969 literary hoax in which a deliberately awful bodice ripper became an international bestseller, Fox wrote, “Naked Came the Stranger was written by 25 Newsday journalists in an era when newsrooms were arguably more relaxed and inarguably more bibulous.”11 The playful bibulous, “tending to drink too much,” is related to beverage and imbibe and calls to mind babbling, bobbling, bubbling, and burbling. Readers who want to become writers should read with a dictionary at hand (several are available as smartphone apps), and writers should not hesitate to send their readers there if the word is dead-on in meaning, evocative in sound, and not so obscure that the reader will never see it again. (You can probably do without maieutic, propaedeutic, and subdoxastic.) I write with a thesaurus, mindful of the advice I once read in a bicycle repair manual on how to squeeze a dent out of a rim with Vise-Grip pliers: “Do not get carried away with the destructive potential of this tool.”
From the early years of the twentieth century to well past its middle age, nearly every black family in the American South, which meant nearly every black family in America, had a decision to make. There were sharecroppers losing at settlement. Typists wanting to work in an office. Yard boys scared that a single gesture near the planter’s wife could leave them hanging from an oak tree. They were all stuck in a caste system as hard and unyielding as the red Georgia clay, and they each had a decision before them. In this, they were not unlike anyone who ever longed to cross the Atlantic or the Rio Grande.
It was during the First World War that a silent pilgrimage took its first steps within the borders of this country. The fever rose without warning or notice or much in the way of understanding by those outside its reach. It would not end until the 1970s and would set into motion changes in the North and South that no one, not even the people doing the leaving, could have imagined at the start of it or dreamed would take a lifetime to play out.
Historians would come to call it the Great Migration. It would become perhaps the biggest underreported story of the twentieth century. . . .
The actions of the people in this book were both universal and distinctly American. Their migration was a response to an economic and social structure not of their making. They did what humans have done for centuries when life became untenable—what the pilgrims did under the tyranny of British rule, what the Scotch-Irish did in Oklahoma when the land turned to dust, what the Irish did when there was nothing to eat, what the European Jews did during the spread of Nazism, what the landless in Russia, Italy, China, and elsewhere did when something better across the ocean called to them. What binds these stories together was the back-against-the-wall, reluctant yet hopeful search for something better, any place but where they were. They did what human beings looking for freedom, throughout history, have often done.
They left.
In The Warmth of Other Suns, the journalist Isabel Wilkerson ensured that the story of the Great Migration would be underreported no longer.12 Calling it “great” is no exaggeration. The movement of millions of African Americans from the Deep South to Northern cities set off the civil rights movement, redrew the urban landscape, rewrote the agenda of American politics and education, and transformed American culture and, with it, world culture.
Wilkerson not only rectifies the world’s ignorance about the Great Migration, but with twelve hundred interviews and crystalline prose she makes us understand it in its full human reality. We live in an era of social science, and have become accustomed to understanding the social world in terms of “forces,” “pressures,” “processes,” and “developments.” It is easy to forget that those “forces” are statistical summaries of the deeds of millions of men and women who act on their beliefs in pursuit of their desires. The habit of submerging the individual into abstractions can lead not only to bad science (it’s not as if the “social forces” obeyed Newton’s laws) but to dehumanization. We are apt to think, “I (and my kind) choose to do things for reasons; he (and his kind) are part of a social process.” This was a moral of Orwell’s essay “Politics and the English Language,” which warned against dehumanizing abstraction: “Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers.” With an allergy to abstraction and a phobia of cliché, Wilkerson trains a magnifying glass on the historical blob called “the Great Migration” and reveals the humanity of the people who compose it.
From the early years of the twentieth century to well past its middle age. Not even the chronology is described in conventional language: the century is an aging person, a contemporary of the story’s protagonists.
Typists wanting to work in an office. Not “denial of economic opportunities.” By invoking a moderately skilled occupation from an earlier era, Wilkerson invites us to imagine the desperation of a woman who has acquired a proficiency that could lift her from the cotton fields to a professional office but who is denied the chance because of the color of her skin.
Yard boys scared that a single gesture near the planter’s wife could leave them hanging from an oak tree. Not “oppression,” not “the threat of violence,” not even “lynching,” but a horrific physical image. We even see what kind of tree it is.
as hard and unyielding as the red Georgia clay. Once again prose is brought to life with a snatch of poetry, as in this simile with its sensual image, its whiff of allusion (I think of Martin Luther King’s “red hills of Georgia”), and its lyrical anapest meter.
anyone who ever longed to cross the Atlantic or the Rio Grande. Not “immigrants from Europe or Mexico.” Once again the people are not sociological categories. The author forces us to visualize bodies in motion and to remember the motives that pulled them along.
what the pilgrims did . . . what the Scotch-Irish did . . . what the European Jews did . . . what the landless in Russia, Italy, China, and elsewhere did. Wilkerson begins the paragraph by stating that the actions of her protagonists are universal, but she does not rest with that generalization. She nominates the Great Migration for inclusion in a list of storied emigrations (expressed in pleasingly parallel syntax), whose descendants doubtless include many of her readers. Those readers are implicitly invited to apply their respect for their ancestors’ courage and sacrifice to the forgotten pilgrims of the Great Migration.
when the land turned to dust, not “the Dust Bowl”; when there was nothing to eat, not “the Potato Famine”; the landless, not “the peasants.” Wilkerson will not allow us to snooze through a recitation of familiar verbiage. Fresh wording and concrete images force us to keep updating the virtual reality display in our minds.
They left. Among the many dumb rules of paragraphing foisted on students in composition courses is the one that says that a paragraph may not consist of a single sentence. Wilkerson ends a richly descriptive introductory chapter with a paragraph composed of exactly two syllables. The abrupt ending and the expanse of blankness at the bottom of the page mirror the finality of the decision to move and the uncertainty of the life that lay ahead. Good writing finishes strong.
• • •
The authors of the four passages share a number of practices: an insistence on fresh wording and concrete imagery over familiar verbiage and abstract summary; an attention to the readers’ vantage point and the target of their gaze; the judicious placement of an uncommon word or idiom against a backdrop of simple nouns and verbs; the use of parallel syntax; the occasional planned surprise; the presentation of a telling detail that obviates an explicit pronouncement; the use of meter and sound that resonate with the meaning and mood.
The authors also share an attitude: they do not hide the passion and relish that drive them to tell us about their subjects. They write as if they have something important to say. But no, that doesn’t capture it. They write as if they have something important to show. And that, we shall see, is a key ingredient in the sense of style.
Chapter 2
A WINDOW ONTO THE WORLD
CLASSIC STYLE AS AN ANTIDOTE FOR ACADEMESE, BUREAUCRATESE, CORPORATESE, LEGALESE, OFFICIALESE, AND OTHER KINDS OF STUFFY PROSE
Writing is an unnatural act.1 As Charles Darwin observed, “Man has an instinctive tendency to speak, as we see in the babble of our young children, whereas no child has an instinctive tendency to bake, brew, or write.” The spoken word is older than our species, and the instinct for language allows children to engage in articulate conversation years before they enter a schoolhouse. But the written word is a recent invention that has left no trace in our genome and must be laboriously acquired throughout childhood and beyond.
Speech and writing differ in their mechanics, of course, and that is one reason children must struggle with writing: it takes practice to reproduce the sounds of language with a pencil or a keyboard. But they differ in another way, which makes the acquisition of writing a lifelong challenge even after the mechanics have been mastered. Speaking and writing involve very different kinds of human relationship, and only the one associated with speech comes naturally to us. Spoken conversation is instinctive because social interaction is instinctive: we speak to those with whom we are on speaking terms. When we engage our conversational partners, we have an inkling of what they know and what they might be interested in learning, and as we chat with them, we monitor their eyes, their face, and their posture. If they need clarification, or cannot swallow an assertion, or have something to add, they can break into the conversation or follow up in turn.
We enjoy none of this give-and-take when we cast our bread upon the waters by sending a written missive out into the world. The recipients are invisible and inscrutable, and we have to get through to them without knowing much about them or seeing their reactions. At the time that we write, the reader exists only in our imaginations. Writing is above all an act of pretense. We have to visualize ourselves in some kind of conversation, or correspondence, or oration, or soliloquy, and put words into the mouth of the little avatar who represents us in this simulated world.
The key to good style, far more than obeying any list of commandments, is to have a clear conception of the make-believe world in which you’re pretending to communicate. There are many possibilities. A person thumb-typing a text message can get away with acting as if he is taking part in a real conversation.* A college student who writes a term paper is pretending that he knows more about his subject than the reader and that his goal is to supply the reader with information she needs, whereas in reality his reader typically knows more about the subject than he does and has no need for the information, the actual goal of the exercise being to give the student practice for the real thing. An activist composing a manifesto, or a minister drafting a sermon, must write as if they are standing in front of a crowd and whipping up their emotions.
Which simulation should a writer immerse himself in when composing a piece for a more generic readership, such as an essay, an article, a review, an editorial, a newsletter, or a blog post? The literary scholars Francis-Noël Thomas and Mark Turner have singled out one model of prose as an aspiration for such writers today. They call it classic style, and explain it in a wonderful little book called Clear and Simple as the Truth.
The guiding metaphor of classic style is seeing the world. The writer can see something that the reader has not yet noticed, and he orients the reader’s gaze so that she can see it for herself. The purpose of writing is presentation, and its motive is disinterested truth. It succeeds when it aligns language with the truth, the proof of success being clarity and simplicity. The truth can be known, and is not the same as the language that reveals it; prose is a window onto the world. The writer knows the truth before putting it into words; he is not using the occasion of writing to sort out what he thinks. Nor does the writer of classic prose have to argue for the truth; he just needs to present it. That is because the reader is competent and can recognize the truth when she sees it, as long as she is given an unobstructed view. The writer and the reader are equals, and the process of directing the reader’s gaze takes the form of a conversation.