The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century
2%
George Orwell, in his vaunted “Politics and the English Language,” fell into the same trap when, without irony, he derided prose in which “the passive voice is wherever possible used in preference to the active.”
2%
Linguistic research has shown that the passive construction has a number of indispensable functions because of the way it engages a reader’s attention and memory.
2%
The rules often mash together issues of grammatical correctness, logical coherence, formal style, and standard dialect, but a skilled writer needs to keep them straight.
3%
As people age, they confuse changes in themselves with changes in the world, and changes in the world with moral decline—the illusion of the good old days.4 And so every generation believes that the kids today are degrading the language and taking civilization down with it:5
4%
style earns trust. If readers can see that a writer cares about consistency and accuracy in her prose, they will be reassured that the writer cares about those virtues in conduct they cannot see as easily.
10%
Classic writing, with its assumption of equality between writer and reader, makes the reader feel like a genius. Bad writing makes the reader feel like a dunce.
11%
Inexperienced writers often think they’re doing the reader a favor by guiding her through the rest of the text with a detailed preview. In reality, previews that read like a scrunched-up table of contents are there to help the writer, not the reader.
11%
One way to introduce a topic without metadiscourse is to open with a question:
13%
It’s not that good writers never hedge their claims. It’s that their hedging is a choice, not a tic.
13%
That’s the basis for the common advice (usually misattributed to Mark Twain) to “substitute damn every time you’re inclined to write very; your editor will delete it and the writing will be just as it should be”—though
13%
When a reader is forced to work through one stale idiom after another, she stops converting the language into mental images and slips back into just mouthing the words.13 Even worse, since a cliché-monger has turned off his own visual brain as he plonks down one dead idiom after another, he will inevitably mix his metaphors, and a reader who does keep her visual brain going will be distracted by the ludicrous imagery.
14%
Dickens describes a man “with such long legs that he looked like the afternoon shadow of somebody else”;
14%
Could you recognize a “level” or a “perspective” if you met one on the street? Could you point it out to someone else? What about an approach, an assumption, a concept, a condition, a context, a framework, an issue, a model, a process, a range, a role, a strategy, a tendency, or a variable? These are metaconcepts: concepts about concepts. They serve as a kind of packing material in which academics, bureaucrats, and corporate mouthpieces clad their subject matter. Only when the packaging is hacked away does the object come into view.
16%
there is nothing wrong with a news report that uses the passive voice to say, “Helicopters were flown in to put out the fires.”24 The reader does not need to be informed that a guy named Bob was flying one of the helicopters.
16%
the reader’s attention usually starts out on the entity named by the subject of the sentence. Actives and passives differ in which character gets to be the subject, and hence which starts out in the reader’s mental spotlight.
16%
I have long been skeptical of the bamboozlement theory, because in my experience it does not ring true.
16%
People often tell me that academics have no choice but to write badly because the gatekeepers of journals and university presses insist on ponderous language as proof of one’s seriousness. This has not been my experience, and it turns out to be a myth.
16%
Call it the Curse of Knowledge: a difficulty in imagining what it is like for someone else not to know something that you know. The term was invented by economists to help explain why people are not as shrewd in bargaining as they could be, in theory, when they possess information that their opposite number does not.
16%
The inability to set aside something that you know but that someone else does not know is such a pervasive affliction of the human mind that psychologists keep discovering related versions of it and giving it new names. There is egocentrism, the inability of children to imagine a simple scene, such as three toy mountains on a tabletop, from another person’s vantage point.4 There’s hindsight bias, the tendency of people to think that an outcome they happen to know, such as the confirmation of a disease diagnosis or the outcome of a war, should have been obvious to someone who had to make a […]
17%
The better you know something, the less you remember about how hard it was to learn.
17%
The curse of knowledge is the single best explanation I know of why good people write bad prose.
18%
Sometimes two examples are better than one, because they allow the reader to triangulate on which aspect of the example is relevant to the definition.
19%
An adult mind that is brimming with chunks is a powerful engine of reason, but it comes with a cost: a failure to communicate with other minds that have not mastered the same chunks.
19%
as we become familiar with something, we think about it more in terms of the use we put it to and less in terms of what it looks like and what it is made of. This transition, another staple of the cognitive psychology curriculum, is called functional fixity.
20%
Many experiments have shown that readers understand and remember material far better when it is expressed in concrete language that allows them to form visual images,
20%
What could be so hard about pretending to open your eyes and hold up your end of a conversation? The reason it’s harder than it sounds is that if you are enough of an expert in a topic to have something to say about it, you have probably come to think about it in abstract chunks and functional labels that are now second nature to you but still unfamiliar to your readers—and you are the last one to realize it.
21%
Social psychologists have found that we are overconfident, sometimes to the point of delusion, about our ability to infer what other people think, even the people who are closest to us.27 Only when we ask those people do we discover that what’s obvious to us isn’t obvious to them.
22%
We have learned to associate each thought with a little stretch of sound called a word, and can cause each other to think that thought by uttering the sound.
26%
As with any form of mental self-improvement, you must learn to turn your gaze inward, concentrate on processes that usually run automatically, and try to wrest control of them so that you can apply them more mindfully.
27%
So every time a writer adds a word to a sentence, he is imposing not one but two cognitive demands on the reader: understanding the word, and fitting it into the tree. This double demand is a major justification for the prime directive “Omit needless words.”
34%
Topic-then-comment and given-then-new orderings are major contributors to coherence, the feeling that one sentence flows into the next rather than jerking the reader around.
37%
reader must know the topic of a text in order to understand it.
41%
It’s always surprising to me to see how often scientists thoughtlessly use synonyms in comparisons, because the cardinal principle of experimental design is the Rule of One Variable.
43%
A failure to command coherence connectives turned out to be among the factors that most sharply differentiated the struggling students from their successful peers.
44%
More than three centuries ago, Baruch Spinoza pointed out that the human mind cannot suspend disbelief in the truth or falsity of a statement and leave it hanging in logical limbo awaiting a “true” or “false” tag to be hung on it.18 To hear or read a statement is to believe it, at least for a moment.
44%
a negative statement such as The king is not dead is harder on the reader than an affirmative one like The king is alive.20 Every negation requires mental homework, and when a sentence contains many of them the reader can be overwhelmed.
45%
Negative sentences are easy when the reader already has an affirmative in mind or can create one on short notice; all he has to do is pin a “false” tag onto it. But concocting a statement that you have trouble believing in the first place (such as “A herring is a mammal”), and then negating it, requires two bouts of cognitive heavy lifting rather than one.
45%
Or, to put it more positively, when a writer wants to negate an unfamiliar proposition, she should unveil the negation in two stages:

1. You might think . . .
2. But no.
48%
And after a lifetime of scholarship he was so laden with erudition that his ideas came avalanching down faster than he could organize them.
48%
The idea that there are exactly two approaches to usage—all the traditional rules must be followed, or else anything goes—is the sticklers’ founding myth.
50%
Phony rules, which proliferate like urban legends and are just as hard to eradicate, are responsible for vast amounts of ham-fisted copyediting and smarty-pants one-upmanship. Yet when language scholars try to debunk the spurious rules, the dichotomizing mindset imagines that they are trying to abolish all standards of good writing. It is as if anyone who proposed repealing a stupid law, like the one forbidding interracial marriage, must be a black-cloaked, bomb-clutching anarchist.
51%
if you insist that decimate be used only with its original meaning, “kill one in ten,” shouldn’t you also insist that December be used with its original meaning, “the tenth month in the calendar”?
51%
Many of the commonest usage errors are the result of writers thinking logically when they should be mindlessly conforming to convention.
51%
And this brings us to the reasons to obey some prescriptive rules (the ones accepted by good writers, as opposed to the phony ones that good writers have always ignored). One is to provide grounds for confidence that the writer has a history of reading edited English and has given it his full attention. Another is to enforce grammatical consistency: to implement rules, such as agreement, that everyone respects but that may be hard to keep track of when the sentence gets complicated (see chapter 4). The use of consistent grammar reassures a reader that the writer has exercised care in […]
52%
Using an informal style when a formal style is called for results in prose that seems breezy, chatty, casual, flippant. Using a formal style when an informal style is called for results in prose that seems stuffy, pompous, affected, haughty. Both kinds of mismatch are errors. Many prescriptive guides are oblivious to this distinction, and mistake informal style for incorrect grammar.
57%
The pseudo-rule was invented by John Dryden based on a silly analogy with Latin (where the equivalent to a preposition is attached to the noun and cannot be separated from it) in an effort to show that Ben Jonson was an inferior poet. As the linguist Mark Liberman remarked, “It’s a shame that Jonson had been dead for 35 years at the time, since he would otherwise have challenged Dryden to a duel, and saved subsequent generations a lot of grief.”
61%
Many spurious rules start out as helpful hints intended to rescue indecisive writers from paralysis when faced with a choice provided by the richness of English. These guides for the perplexed also make the lives of copy editors easier, so they may get incorporated into style sheets. Before you know it, a rule of thumb morphs into a rule of grammar, and a perfectly innocuous (albeit second-choice) construction is demonized as incorrect. Nowhere is this transition better documented than with the phony but ubiquitous rule on when to use which and when to use that.
65%
None has always been either singular or plural, depending on whether the writer is pondering the entire group at once or each member individually.
65%
This is part of a larger phenomenon called notional agreement, in which the grammatical number of a noun phrase can depend on whether the writer conceives of its referent as singular or plural, rather than on whether it is grammatically marked as singular or plural.
69%
If you know a word and then come across a similar one with a fancy prefix or suffix, resist the temptation to use it as a hoity-toity synonym.